Cloud security gains overshadowed by soaring storage fees
Fees (e.g., for API calls, operations, and data access) account for 49% of the average user's cloud storage bill, with the remainder covering actual stored capacity, according to a study conducted by Vanson Bourne.
Nearly all organizations globally have experienced data security-related benefits from public cloud storage. The top security benefits are:
- Improved data security capabilities compared to the previous environment.
- Easier prevention/mitigation of unplanned data loss events.
Cloud storage costs continue to soar
- The fee-vs.-capacity billing split (49% fees, 51% capacity) is consistent with findings from the past two years, indicating little to no improvement in this billing mix across the market.
- More than half (56%) of organizations say they experience IT or business delays due to egress or other data access fees associated with moving their organization’s data out of a public cloud environment.
- Cloud storage budget overruns are worsening, with 62% of respondents saying they exceeded budgeted spending in 2024, compared to 53% in 2023.
- Difficulty forecasting actual storage usage and application migration patterns, along with higher-than-expected operations fees, are the top drivers of budget overruns: 42% of organizations say they migrated more applications and data than originally anticipated, and 39% incurred higher data operation fees than expected.
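As a rough illustration of why fee-driven costs are hard to forecast, the sketch below uses entirely hypothetical prices and volumes (not any vendor's actual rates) to show how per-operation and egress charges can rival the capacity charge itself:

```python
# Hypothetical illustration: a capacity-only forecast can undershoot the real
# bill once per-operation and egress fees are added. All prices and volumes
# below are made-up assumptions, not actual vendor pricing.

CAPACITY_PRICE_PER_TB = 20.0   # assumed $/TB-month for stored capacity
EGRESS_PRICE_PER_TB = 80.0     # assumed $/TB for data moved out of the cloud
FEE_PER_MILLION_OPS = 5.0      # assumed $ per million API operations

def monthly_bill(stored_tb, egress_tb, million_ops):
    """Split the bill into the capacity charge and access/operation fees."""
    capacity = stored_tb * CAPACITY_PRICE_PER_TB
    fees = egress_tb * EGRESS_PRICE_PER_TB + million_ops * FEE_PER_MILLION_OPS
    return capacity, fees

capacity, fees = monthly_bill(stored_tb=100, egress_tb=20, million_ops=80)
total = capacity + fees
print(f"capacity: ${capacity:.0f}, fees: ${fees:.0f} ({fees / total:.0%} of bill)")
```

With these assumed numbers, fees make up half the bill, in the neighborhood of the fee-heavy split the survey describes; the point is that the fee portion scales with access patterns, which are far harder to predict than stored capacity.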
“Unfortunately, cloud storage remains an unpredictable expense for many organizations, and fees associated with moving and accessing stored data only exacerbate the nature of this unpredictability, ultimately stalling business initiatives and slowing innovation,” said Andrew Smith, director of strategy and market intelligence at Wasabi Technologies. “It is imperative that organizations eliminate and minimize these fee structures whenever possible. Cloud object storage is only growing in capacity and usage, as organizations are demanding more and doing more with their data, often driven by new AI-based initiatives to explore use cases like GenAI or Agentic AI. Controlling costs associated with these new workloads will be critical to long-term business success, and legacy fee structures and billing models will only slow progress and punish innovators.”
Data security tops cloud storage decision-making checklist
- Cloud storage decision-makers pay particular attention to security fundamentals like quality/robustness of encryption (32%), ransomware protection capabilities (29%), and data durability and availability SLAs (28%) when choosing a service/provider.
- Almost all (99%) respondents indicate their use of public cloud storage has resulted in data security-related benefits for their organization – with improved data security capabilities compared to their previous environment, and improved ability to prevent and mitigate unplanned data loss cited as top benefits.
- Almost all (99%) of respondents say they recover data from their public cloud storage environment(s).
- However, surprisingly, only 47% of respondents say they use object lock (immutability) as part of their public cloud storage backup procedures today. Despite this low rate of current adoption, the majority of those not currently using it (49%) plan to introduce object lock within the next year or beyond.
- More than half (53%) of organizations recover data from their public cloud storage environment every 1-2 weeks for backup purposes, including testing. This is a strong indication of just how active secondary storage use cases like data backup/recovery can be.
Cold storage access remains a significant challenge for many organizations
Unfortunately, 98% of respondents using low-cost “cold” storage tiers say they contend with performance degradation and data access penalties. As a result, one in five organizations say their business operations are negatively impacted by the performance or data access delays of cold storage tiers.
Although many organizations assume they will never access data stored in ultra low-cost, deep archive tiers, the reality is that most organizations (84%) say they end up accessing data stored in cold tiers with performance degradation or access penalties on a weekly or monthly basis.
Why such frequent access? Researchers asked respondents about their more “active” archive use cases in the public cloud: 51% of organizations say they leverage active archive storage for analytics and data processing, while 49% use it for security analytics and forensics.
Finally, the analysis sheds light on an important detail: access requests for archive data are not always under the direct control of the organization itself, which complicates unplanned access. The top reasons cited for cold storage data access were regulatory and compliance needs and responding to security events like ransomware/malware. These factors can’t always be controlled or even planned for, which is part of the reason estimating cold storage data access patterns, and ultimately costs, can be difficult.