Cut Nasuni and File Storage Costs with Azure Cold Storage

Jason Webster

Field CTO - Microsoft 365 & Azure

Nasuni is excellent for active file data, especially when you need global file sharing and edge performance. However, when dormant data piles up, archiving to Azure Blob Cold or Archive can cut storage costs dramatically if you plan for access, visibility, and rehydration workflows.


Nasuni is a great solution for distributed file storage, and the people I speak with who use it love it, particularly for its effectiveness in organizations that need global file sharing, edge caching, and predictable performance across many sites. It delivers a simple, effective solution that competitors have found tough to match. However, the storage-under-management bill carries a big price tag for organizations with growing data sets, and the long-term cost of storing that data exclusively in Nasuni’s backend object storage becomes a real discussion point. 

If your Nasuni footprint is expanding faster than your budget, Azure Blob Storage can be an appealing target for the data you hope you never need to access again.  The challenge I’ve seen comes down to two things: 

  • Deciding what data to archive off Nasuni.
  • The tradeoff between convenience and complexity that delivers the optimal cost. 

Migrating off platforms like Nasuni is relatively simple: you can use AzCopy to copy files from Nasuni to your target storage accounts and containers.  Finding the data once it’s there gets trickier unless all your users know how to use Azure Storage Explorer.  You have options to solve that second challenge, but each comes with tradeoffs that we will cover in this post. 
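Deciding what to archive (the first challenge above) can start as a simple age scan. Here is a minimal Python sketch that flags files untouched for a configurable number of days; the 180-day threshold is illustrative, and last-modified time is only a rough proxy for "dormant":

```python
import time
from pathlib import Path

def find_cold_files(root: str, max_age_days: int = 180):
    """Yield files whose last-modified time is older than max_age_days.

    Modified time is a rough proxy for "dormant"; if your platform
    tracks last-access times reliably, st_atime is a better signal.
    """
    cutoff = time.time() - max_age_days * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path
```

A list produced this way could feed `azcopy copy` via its `--list-of-files` flag as the first step of a copy, verify, delete loop.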

Why look to have two solutions? 

Nasuni delivers excellent value for active, frequently accessed data, particularly at the edge. However, cost becomes an issue once large amounts of data accumulate, especially when you know much of that data isn’t being used.  You may see: 

  • Rapid growth of backend object storage consumption. 
  • Expanding Nasuni licensing and support costs (files under management).
  • Cold data taking up expensive space. 
  • Difficulty justifying the cost year over year. 

I’d estimate from experience that up to 50% of a typical file share is dormant.  With enough data, it becomes advantageous to stage that portion off to cheaper archives.  It’s about weighing the following: 

  • Cost of staying the same. We must save enough to make the tradeoffs matter.
  • Cost of complexity. Retrieval of cold data requires a process to make it cost-efficient. 
  • Cost of implementation. Moving, securing, and managing a second storage solution has costs. 

That said, the economics work when you can see a roughly 90% reduction in storage costs for cold data you hope you never need to retrieve. 
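As a back-of-the-envelope sketch of that weighing exercise, the arithmetic is simple. All rates below are placeholders, not quotes; substitute your negotiated Nasuni rate and the published Azure Cold/Archive prices for your region:

```python
def annual_storage_cost(tb: float, price_per_gb_month: float) -> float:
    """Annual cost for tb terabytes at a flat $/GB/month rate."""
    return tb * 1024 * price_per_gb_month * 12

# Illustrative rates only: a 500 TB estate where half is dormant.
current    = annual_storage_cost(tb=500, price_per_gb_month=0.030)  # everything stays put
hot_after  = annual_storage_cost(tb=250, price_per_gb_month=0.030)  # active half stays
cold_after = annual_storage_cost(tb=250, price_per_gb_month=0.002)  # dormant half archived
savings = current - (hot_after + cold_after)
print(f"Estimated annual savings: ${savings:,.0f}")
```

Against those placeholder rates, the archived half drops from $92K/year to about $6K/year, which is the kind of gap that makes the complexity tradeoffs worth discussing.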

Basic Archival to Azure Blob Storage 

The simplest approach is to identify cold datasets and migrate them directly into Azure Blob Storage using the Cold tier, or even Archive, depending on how confident you are that you won’t need to retrieve the data.  You gain an immediately lower $/TB compared to Nasuni, and you effectively save on licensing costs as well. 

  • Copy, verify, and delete from Nasuni.
  • Perfect solution for long-term, secure, compliant retention. 
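If the archived blobs land in a dedicated container or prefix, Azure’s native lifecycle management can also demote data between tiers automatically rather than forcing you to pick Cold versus Archive up front. A sketch of a policy rule, with an illustrative prefix and day counts:

```json
{
  "rules": [
    {
      "name": "archive-dormant-blobs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "nasuni-archive/" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCold": { "daysAfterModificationGreaterThan": 90 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 365 }
          }
        }
      }
    }
  ]
}
```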

Challenges 

  • No direct access from Nasuni.
  • No direct access for users. Having them browse Storage Explorer isn’t ideal, and excessive LIST operations while browsing can even eat into the ROI.
  • Admins who need archived content must pull the data back through a request/restore workflow.
  • Azure Archive and Cold tiers charge retrieval fees to bring data back online, and excessive retrievals add up.
  • Users may not know what data is in there, which creates confusion and support needs (people cost). 

When should you consider this? 

  • You’re sure the data is truly cold.
  • Retrieval is rare.
  • Compliance or cost reduction matters more than convenience.
  • You want the fastest path to cost savings. 

This is a perfectly practical approach and one that many take.  Your use case and end-user needs just have to align with those expectations. 

Azure Blob Storage with Some Visibility 

If you want the savings and simplicity of the approach above without completely losing visibility into the archived data, you can take some steps that slightly increase cost but give end users the ability to search and navigate what is in storage. 

What it looks like 

  • Data is archived to Azure Blob Storage.
  • Enable blob inventory on the storage account to periodically export the metadata and any custom tags, in CSV or Parquet format, to a “hotter” tier of storage.  This is handled as a setting; there is no added complexity.
  • Leverage an Azure Data Explorer (ADX) cluster to write simple Kusto (KQL) queries that shape the inventory data into views. Consume ADX in Power BI and publish periodic reports that provide file metadata searchability and tree-like navigation of the data source.
  • You still need to execute recovery operations through a standard process like Storage Explorer, but you can identify the data to recover much faster. 
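For smaller estates, even the raw inventory export is usable without a full ADX cluster. As an illustrative sketch (assuming the default inventory schema’s Name and Content-Length columns, which you should confirm against your configured inventory rule), here is how a CSV export could roll up into per-folder totals for a quick visibility report:

```python
import csv
import io
from collections import defaultdict

def summarize_inventory(csv_text: str, depth: int = 1):
    """Roll an Azure blob-inventory CSV up into per-prefix totals.

    Assumes Name and Content-Length columns from the default
    inventory schema; adjust for the fields you actually export.
    """
    totals = defaultdict(lambda: {"files": 0, "bytes": 0})
    for row in csv.DictReader(io.StringIO(csv_text)):
        prefix = "/".join(row["Name"].split("/")[:depth]) or "(root)"
        totals[prefix]["files"] += 1
        totals[prefix]["bytes"] += int(row["Content-Length"])
    return dict(totals)

# Sample rows in the shape of an inventory export (paths are made up).
sample = """Name,Content-Length,Last-Modified
projects/alpha/spec.docx,52340,2021-03-02T10:00:00Z
projects/alpha/model.xlsx,140002,2020-11-12T09:30:00Z
legal/2019/contract.pdf,88000,2019-06-01T08:00:00Z
"""
print(summarize_inventory(sample))
```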

Pros 

  • Users can “find” their data through published Power BI reports that are updated on a cadence you prefer (or live).
  • Costs are predictable, and variance should be limited to data rehydration.
  • The cold storage stays cold. Activity is limited to a relatively small amount of metadata stored in a “hotter” tier.
  • A more efficient restore workflow, where you can build automation to rehydrate data when needed. 

Challenges 

  • Power BI is not Windows Explorer. Users looking for a more SMB-like browsing experience may not like the report experience as much.
  • Solution complexity. We now have multiple solutions in the chain that, while “set and forget,” may require maintenance from time to time. 

When is this “better”? 

  • You want to retain file visibility for legal, project, or discovery purposes. In other words, users are hesitant to archive unless they know what is in the archive.
  • Retrievals happen occasionally, and it’s the users, not the admins, who know where the data is.
  • The savings are meaningful enough to have a separate process. 

I like this approach because it relies on native capabilities. It stacks a few technologies, but it stays low-cost for cold data, and the goal all along was to hope we never need the retrieval process. That said, I recognize there is a fair amount involved here: light KQL expertise, Power BI reports, and so on.  

User Friendly and Still Cost-Effective? 

The third solution combines the same Azure Blob Storage with an intelligent layer that keeps the data discoverable to users, available to AI agents, and able to drive automated recovery workflows. 

This approach uses: 

  • The same low-cost Azure storage for inactive datasets
  • Deployment of Azure AI Search indexers to index the cold storage metadata, custom tags, and (optionally, at a cost) full content.
  • Development of a lightweight frontend, such as an Azure Logic App, that gives your users the ability to search or parse that index using your business logic.
  • Integration of that Logic App to a tightly controlled workflow that allows for efficient retrieval from cold storage back to production storage.
  • Flexibility over security and governance rules for when files are rehydrated or restored to chargeback/showback costs of recovery. 
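The chargeback/showback logic in that last bullet is straightforward to sketch. Everything below is an illustrative assumption, not a published Azure rate or a prescribed workflow: small pulls are auto-approved and charged back to a cost center, large ones escalate to an approver.

```python
from dataclasses import dataclass

# Illustrative policy thresholds; tune to your governance rules.
AUTO_APPROVE_LIMIT_GB = 50
RETRIEVAL_PRICE_PER_GB = 0.02  # placeholder rate, not an Azure price

@dataclass
class RestoreRequest:
    requester: str
    cost_center: str
    size_gb: float

def route_request(req: RestoreRequest) -> dict:
    """Route a rehydration request: auto-approve small pulls with a
    chargeback estimate, escalate large ones for human approval."""
    estimate = round(req.size_gb * RETRIEVAL_PRICE_PER_GB, 2)
    return {
        "approved": req.size_gb <= AUTO_APPROVE_LIMIT_GB,
        "needs_approver": req.size_gb > AUTO_APPROVE_LIMIT_GB,
        "chargeback_usd": estimate,
        "cost_center": req.cost_center,
    }
```

In a real build this decision would sit behind the Logic App, with the actual rehydration call and the chargeback record handled downstream.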

Pros 

  • Users (or admins) can search for cold files without needing to browse Blob directly.
  • Predictable costs. Azure AI Search + Blob + App workflows are predictable, and governance/security rules control or show the cost of recovering data.
  • Recall requests are standardized and trackable.
  • Great for large organizations or large data estates because it aligns with governance, compliance, and financial controls. 

Challenges 

  • It needs to be built and maintained while defining metadata structure, search indexing, and automation processes. 
  • It’s still not “Windows Explorer”.  Paying for Azure Files instead of Blob storage would deliver that outcome. 
  • Azure AI Search indexes can become expensive depending on how much you index and how many objects (count, not necessarily size) sit in cold storage. 

In my opinion, this is the right strategy when: 

  • You have hundreds of TBs or more in Nasuni that could be archived.
  • Storage growth is straining budget forecasts.
  • Searchability and governance are equally important as cost reduction.
  • You want to aggressively archive, but you may still need to recover more frequently.
  • You want your cold data usable by Agents and Copilot.
  • You already use Azure AI Search, Purview, or workflow automation tools. 

The result is Nasuni (or Azure Files) for active working data, Azure Blob for long-term data, and automation to bridge the gap. 

It’s not perfect, but it could be “Good Enough” and save a lot of budget. Here’s the TL;DR:

Cost Savings Come with Access Complexity and Visibility Challenges

You can save significant money by archiving to Azure cold storage, but those savings come with some loss of direct access and visibility from end users. Is this acceptable? 

Rehydration is far from FREE 

Azure Archive and Cold tiers offer meaningful $/TB savings, but also include: 

  • Retrieval fees
  • Early deletion fees
  • Slow recovery time
  • A focus on data security/recoverability 
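The early-deletion piece in particular is easy to underestimate. A small sketch of how the minimum-retention charge works; the day counts reflect Azure’s currently published minimums for Cool, Cold, and Archive, but confirm them for your region and agreement:

```python
# Minimum retention before early-deletion charges apply (published
# Azure values at time of writing; verify for your agreement).
MIN_DAYS = {"cool": 30, "cold": 90, "archive": 180}

def early_deletion_days_charged(tier: str, days_stored: int) -> int:
    """Days of storage you are still billed for if a blob is deleted
    (or moved to a hotter tier) before the tier's minimum period."""
    return max(MIN_DAYS[tier.lower()] - days_stored, 0)

print(early_deletion_days_charged("archive", 40))  # deleted early: billed 140 more days
print(early_deletion_days_charged("cold", 120))    # past the minimum: no penalty
```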

We must set user expectations 

Your data isn’t gone, but it’s no longer directly within reach.  If you want it back, we can get it from the archive, but that comes with a delay and some costs. 

Clear policy is needed 

A solid archival strategy must define: 

  • What qualifies as “archive”? 
  • How long does data stay in each tier? 
  • What can we pull back, and when should we? 
  • Who approves retrievals?  

We have the tools to do this efficiently; however, success is determined by policy and communication. 

What should you do now (Next Steps)? 

Nasuni is a great platform, as is Azure Files.  Nasuni specifically does an incredible job replicating hot/cold data between centralized and edge appliances to make sure it’s in the right place to perform for the user.  Azure Cold/Archive storage balances that high-cost, high-performance solution by offloading stale data to cheaper tiers and reducing your files-under-management costs. 

If you’re evaluating your next steps, it’s important to understand your data and users.  This model can help you choose the approach that fits your needs and budget. 

If you want help analyzing your Nasuni footprint, identifying cold data, and modeling the cost savings of an Azure archival strategy, our team can help you build a plan with an Azure Storage Assessment. We’ll show you where the savings are, what the risks look like, and what an actionable roadmap forward could be. 



