StorageNewsletter Vendors’ Facts 2025
Interesting past year with confirmation of trends and acceleration of adoption
By Philippe Nicolas | January 15, 2026 at 2:02 pm

StorageNewsletter recently asked vendors across the industry to share their perspective on the past year, and we collected 31 responses. Each company provided three key facts, which you’ll find presented below.
The companies, in alphabetical order, are: 9livesdata, Arcitecta, Cerabyte, Cohesity, CTERA, Databahn, Datadobi, DDN, ExaGrid, Grau Data, Hitachi Vantara, HYCU, HyperBunker, Infinidat, Leil Storage, Lucidity, Nakivo, Oxibox, Plakar, PoINT Software & Systems, Pure Storage, QStar Technologies, SIOS Technology, Spectra Logic, Starfish Storage, StorMagic, StorPool Storage, Toshiba Electronics, TrueNAS, Tuxera and VDURA.
Before that, we summarize the main insights drawn from their answers, highlighting emerging trends, confirmations of previous expectations, and developments announced last year—or even earlier. As always, this exercise proves insightful thanks to the industry’s collaboration.
We would like to thank all participants for the time and effort they dedicated to this initiative. In the coming days, we will also publish the industry’s Predictions for 2026.
Here is the synthesis:
1. **TCO is a primary driver of infrastructure decisions.** Organizations are reassessing storage and data architectures with a strong focus on long-term cost efficiency, not just upfront investment or raw performance.
2. **Next-generation archive storage has emerged.** Archives are evolving from passive, cold repositories into accessible, intelligent platforms that support analytics, AI, and operational use cases.
3. **AI has dramatically accelerated data growth.** AI workloads are producing and consuming data at unprecedented scale, putting pressure on storage capacity, cost, and sustainability.
4. **Cyber resilience has become a business imperative.** Enterprises now assume breaches will happen and prioritize recovery, continuity, and resilience over prevention alone.
5. **The myth of cloud invincibility has been dispelled.** Cloud-only strategies have revealed limitations around cost, performance, control, and security, leading organizations to rethink blind cloud adoption.
6. **AI delivers real value only when embedded in operational pipelines.** AI initiatives succeed when integrated into production workflows; isolated experiments and standalone models fail to generate measurable ROI.
7. **Unstructured data is a silent IT budget killer.** Poorly governed unstructured data drives hidden costs, operational complexity, and uncontrolled infrastructure growth.
8. **The “AI Factory” era has begun.** Infrastructure is transforming into intelligent, factory-like systems designed to continuously ingest, process, and optimize data for AI workloads.
9. **Artificial intelligence is no longer optional.** AI has moved from experimentation to necessity across enterprises, becoming a foundational capability rather than an emerging technology.
10. **Capture-at-creation enables AI readiness at petabyte scale.** Capturing, enriching, and structuring data at the moment it is created is essential to scaling AI efficiently and cost-effectively.
| Fact # | Storage | AI | Security | Cloud |
|---|---|---|---|---|
| 1. TCO drives decisions | ✅ | | | |
| 2. Next-gen archive | ✅ | | | |
| 3. AI accelerates data growth | ✅ | ✅ | | |
| 4. Cyber resilience imperative | | | ✅ | |
| 5. Cloud invincibility myth | | | | ✅ |
| 6. AI in operations | | ✅ | | |
| 7. Unstructured data cost | ✅ | | | |
| 8. AI Factory era | ✅ | ✅ | | |
| 9. AI no longer optional | | ✅ | | |
| 10. Capture-at-creation | ✅ | ✅ | | |
The table below is also accessible here as an XLS file printed to PDF.
| Company | Fact 1 | Fact 1 Detail | Fact 2 | Fact 2 Detail | Fact 3 | Fact 3 Detail |
|---|---|---|---|---|---|---|
| 9livesdata | TCO Reduction: A Key Driver of Change | One of the key factors influencing the evolution of storage infrastructure is the drive to reduce TCO. Enterprises are grappling with data growth and escalating hardware costs, which have made the expense of storage increasingly burdensome. In response, organisations are exploring new products and rethinking how existing storage solutions are deployed and utilized. Additionally, there is a renewed focus on reviewing retention policies and SLAs with an emphasis on cost containment. | Flash Storage Adoption: Slower Than Anticipated | While there were widespread predictions that flash storage would soon dominate the market, relegating hard disks to specialized niches, the reality has proved more complex. The current economics of storage, especially dramatic price increases, have slowed the expected rate of flash storage adoption. | | |
| Arcitecta | Next-Generation Archive Storage Has Emerged | As data volumes grow, more efficient and cost-effective archival storage solutions have become critical. Flash and disk-based storage options, while fast, come with high costs when scaling to large capacities. This has led to a resurgence in tape storage as a viable solution for modern needs, and the introduction of new, emerging technologies like storage on glass. Companies began to aggregate smaller units into larger configurations that combine the scalability of tape with the flexibility of cloud. | Real-Time Data Streaming and the Geo-Distributed Workforce | The rise of remote work and geographically distributed teams changed how businesses operate. Real-time data streaming allows organizations to record events and share live feeds globally, enabling employees to collaborate on continuous data streams without needing to be physically present. More companies adopted tools that facilitate seamless broadcasting and data distribution. By enabling real-time collaboration across a distributed workforce, businesses are reducing travel costs and increasing efficiency. | The Rise of Storage Virtualization and the Data Fabric | As organizations looked to optimize their storage strategies, the rise of storage virtualization has made it easier to interconnect various data storage technologies. Businesses are maximizing their existing investments and avoiding vendor lock-in by leveraging a data fabric—an architecture that unifies cloud, disk, tape, and flash storage into a single, logical namespace. This has allowed for a more flexible approach to data management, enabling businesses to mix and match technologies to meet their needs. |
| Cerabyte | AI Accelerated Data Growth | AI workloads are driving an unprecedented surge in data creation and storage needs. High-resolution images, complex video files, and other rich media used for diagnostics, training, and AI-driven decision-making continue to grow in size and volume. At the same time, AI systems generate vast amounts of metadata and audit logs that organizations must retain to document decisions and reduce potential liability. Together, these forces are fueling a rapid and sustained increase in data storage demand. | AI-Assisted Analysis of Datasets Changed Access Requirements | As organizations utilize AI for insight generation and potential new use cases, rapid access to very large datasets is required; capacity-storage needs are shifting toward solutions that combine cost efficiency with high performance and high data throughput. | Sustainability Moves Under the CFO | As sustainability shifts from ESG compliance toward the CFO’s remit, its emphasis is moving from ‘ecologically virtuous’ to ‘cost-efficient.’ Emerging technologies are addressing this convergence by delivering both cost savings and footprint reductions. |
| Cohesity | Enterprise cyber resilience imperative | Businesses increasingly find traditional backups insufficient against modern cyber threats. A growing focus is on holistic cyber resilience, integrating advanced ransomware protection and rapid data recovery. This strategic shift is vital for anticipating, withstanding, and quickly rebounding from disruptive cyber incidents, as 76% of organizations experienced a material cyberattack last year. | AI revolutionizing data intelligence | Artificial intelligence profoundly transforms data security and management. AI enhances threat detection, optimizes recovery, and improves data orchestration. This widespread application of AI not only fortifies defenses but also enables data to become a dynamic product, delivering insights and driving operational autonomy. | Cyber maturity is only as strong as your weakest link | Cyber resilience in 2026 hinges on the strength of the entire digital ecosystem, not just individual defenses. A single weak link can compromise everything. True resilience requires a collaborative, ecosystem-wide approach, prioritizing collective integrity and continuous improvement across all partners and systems. |
| CTERA | The Year the Cloud’s Invincibility Myth Died | 2025 was the year the myth of cloud infallibility was shattered by an autumn run of outages. These were not catastrophic events but routine operational errors with a global blast radius, proving that massive centralization created a single point of failure. The result was widespread financial and operational paralysis, as stalled production lines and frozen markets revealed the true cost of downtime. This shock changed the industry’s risk calculus, proving reliance on a single provider was no longer viable. | The Year “Shadow AI” Entered the Legal Lexicon | The theoretical risk of shadow AI became a tangible legal liability in 2025. A series of court motions forced vendors to produce millions of user conversations, turning employee chats from productivity aids into discoverable corporate records. This precedent was a wake-up call, proving that vast amounts of sensitive corporate data now resided on third-party servers outside internal governance. AI risk was no longer a future concern but a present, documented legal danger. | The Year AI Was Forced to Show a Digital Passport | The era of the anonymous, free-roaming AI agent ended in 2025. High-profile conflicts between platform giants and AI-powered browsers set a new precedent: AI agents must authenticate or declare their identity to access platform data. This was more than a technical block; it was a policy shift as platforms asserted the right to grant or deny access based on an AI’s origin and intent. The first demand for a digital “passport” signaled the open web’s shift toward controlled, permissioned interactions. |
| Databahn | AI Delivered Real Value Only When It Lived in the Pipeline | In 2025, teams learned that AI made the biggest difference when it worked inside the data pipeline. It caught schema drift before detections broke, enriched events upstream, and filtered noisy telemetry that inflated storage bills. This was the year organizations stopped chasing chatbot features and focused on AI that improved the structure and reliability of their data. | Vendor Lock-In Slowed Down Security and Drove Up Storage Costs | In 2025, enterprises discovered that tightly coupled platforms were holding them back. When analytics, storage, and routing sat inside one vendor’s ecosystem, even small changes became high-risk. The shift to a neutral control plane gave teams the ability to route data where it made sense and finally separate operational needs from vendor strategy. | Data Discipline Became the Real Differentiator for AI Accuracy | The biggest gains in AI accuracy in 2025 came from improving data hygiene, not adding new models. Organizations that invested in lineage, validation, and observability saw better outcomes and fewer surprises. AI drift became a daily operational concern, and teams realized that clean, reliable data was the real engine behind trustworthy automation. |
| Datadobi | Unstructured Data is the Silent IT Budget Killer | Unstructured data already represents up to 90% of enterprise information, and accumulation rates are accelerating on the back of AI-driven workloads. Many organizations still default to simply adding more storage, but this approach is unsustainable and leads to spiraling costs. At the same time, SaaS sprawl and shadow IT continue to fragment data across clouds and endpoints, meaning that leaders who lack visibility will see these costs escalate. | From Infrastructure to Intelligence: Elevating IT’s Role in Business Strategy | I&O teams are no longer just keeping the lights on; they’re shaping data for AI, analytics, and compliance. I&O leaders are shifting from reactive operations to proactive strategy, aligning data management with business goals and unlocking new value from existing assets. | Data Governance on a Budget: Doing More with Less | With fewer resources but growing regulatory complexity, CDOs need automation and precision. Teams need to enforce governance at scale, reducing risk while accelerating time-to-insight. It’s not just compliance; it’s competitive advantage. |
| DDN | The AI Factory Era: Turning Infrastructure into Intelligence | AI has entered a new phase—defined by scale, efficiency, and sovereignty. Leading enterprises and nations are building AI factories: intelligent systems that turn vast data into insights and advantage. The challenge now isn’t model creation, but data mastery. How organizations move, manage, and operationalize data will define innovation speed, efficiency, and control. The new edge lies in intelligent infrastructure. | AI Sovereignty and the Rise of the Intelligent Data Economy | Governments and enterprises now see AI sovereignty as digital sovereignty. Nations are investing in domestic AI infrastructure to secure data, ensure compliance, and control AI-driven intellectual capital—fueling a new intelligent data economy. For enterprises, the focus is on data control, portability, and transparency. The next AI leaders will balance performance and governance across clouds, on-premises, and sovereign platforms. | Performance as a New Sustainability Metric | As AI workloads grow, performance and sustainability are now intertwined. Data center energy demand is set to exceed 1,000 TWh by 2027, driven by AI. Every idle GPU or inefficient pipeline wastes energy and cost. The new metric of success is intelligence per watt—maximizing utilization, reducing redundancy, and building smarter data pipelines to accelerate insight, boost efficiency, and strengthen resilience. |
| ExaGrid | Artificial Intelligence is Here – AI | After 70 years of waiting, NVIDIA made a GPU that could actually run AI. Since that critical moment, AI has taken off and is starting to show up in all aspects of our lives. The cat is out of the bag and that is a fact. | Ransomware Attacks are on the Rise and are Here to Stay | It is a fact that ransomware attacks are on the rise. In 2026, all organizations will need to assess and make changes to guard against ransomware. It is not like a bad cold that will go away. | Heterogeneous or Hybrid is the Word of The Times | It is clear that customers are using a mix of on-premises, cloud and SaaS. Everything has gone heterogeneous or hybrid. The day of all-cloud or all-on-premises is dead. |
| Grau Data | Capture-at-Creation Enabled AI Readiness at Petabyte Scale | In 2025, large-scale research environments operating at hundreds of petabytes demonstrated that capturing insight and context at file creation materially improved AI and analytics readiness. By making data discoverable and usable without reopening files, organizations reduced operational friction, improved storage utilization, and enabled broader use of archival tiers without limiting downstream access. | AI Adoption Exposed Widespread Redundant Data Processing | As AI workloads scaled in 2025, enterprises surfaced a systemic inefficiency: the same unstructured data was repeatedly parsed, tokenized, embedded, and enriched across multiple workflows. This redundant preprocessing increased GPU utilization, cloud egress, and operational cost, shifting attention from storage capacity to compute inefficiency and data handling practices. | Early Context Capture Improved Legal Hold and Archive Efficiency | In 2025, organizations capturing metadata and contextual insight from machine-generated data at creation strengthened legal hold and governance while reducing storage requirements by up to 50%. Because discovery and audit needs could be met without repeated file access, data was archived more aggressively on low-cost storage while remaining searchable and accessible. |
| Hitachi Vantara | Managed Services Will Expand Across Hybrid and Multi-Vendor Environments | Customers will increasingly seek on-prem, hybrid and multi-vendor managed services rather than just employing hyperscalers for their infrastructure, compute and storage needs. Providers that can manage heterogeneous environments, including competitors’ hardware, will have a competitive edge. – Jeb Horton, SVP, Global Services | Infrastructure Security Shifts From Policy to Contractual Guarantees | Enterprises have long prioritized security but haven’t always embedded it as a core requirement in infrastructure contracts. That’s changing; security is becoming a formalized service guarantee. Vendors will respond with immutable-by-default architectures, fenced recovery environments, and rehearsed recovery time objective (RTO) commitments. Meanwhile, insurers will increasingly require telemetry to validate compliance and risk posture. – Sunitha Rao, SVP/GM for Hybrid Cloud Business | Enterprises Pivot From Stockpiling to Strategy | In recent months, many big companies stocked up on large amounts of GPUs in an attempt to gain an AI advantage. Now they’re sitting on a lot of computing infrastructure and trying to figure out what to do with it. In the year ahead, enterprises will focus less on capital expenditures like GPUs and shift into a show-me-the-money mindset. We’ll see a greater focus on software, applications and services that help these companies derive value from their fixed costs. – Simon Ninan, SVP, Business Strategy |
| HYCU, Inc. | HYCU 2025: Raising the Bar for SaaS Data Protection | Delivered the first and only data protection for iManage Cloud, the undisputed leader in legal document management. In addition, HYCU expanded support to more than 90 workloads, adding new hypervisors (Hyper-V, Azure Local, and XenServer), enhancing its existing Atlassian portfolio (Confluence and JSM), and extending cloud platform coverage including Box, M365 and Entra ID. Made all integrations available through the HYCU Marketplace, enabling customers to extend protection quickly and consistently. | HYCU 2025: New Cyber Resilience Fabric Built into HYCU R-Cloud | Introduced R-Shield, the industry’s first cyber resilience across the full data estate that finds and stops ransomware right at the source. The release of R-Shield introduced a cyber resilience fabric built to protect SaaS, cloud, and hybrid environments. R-Shield delivers secure, customer-controlled malware scanning, rapid recovery from supply chain attacks, and resilient restore options that help organizations reduce risk and recover with confidence. | HYCU 2025: Artificial Intelligence Driving Innovation in Data Protection | Delivered comprehensive patent-pending AI data protection. HYCU delivered patented, atomic protection for modern data platforms, including cloud data warehouses such as Google BigQuery, Azure Data Lake Storage, vector databases like Pinecone, and scalable, cost-effective protection for object storage workloads, ensuring critical data remains protected as enterprises modernize. |
| HyperBunker | HyperBunker introduces physically isolated recovery vault for ransomware scenarios | HyperBunker has developed a hardware-based data recovery vault that operates fully offline, using physical double air-gap isolation rather than logical air-gapping. The system is designed to preserve last-known-clean recovery copies even when production systems, credentials, and backups are compromised by ransomware. | HyperBunker targets regulated and critical infrastructure sectors | HyperBunker is positioned for organizations with strict resilience and recovery requirements, including financial services, energy, healthcare, transport, and other regulated environments. The solution is designed to support audit, compliance, and operational recovery use cases rather than primary backup or storage workloads. | HyperBunker deployed as subscription service with minimal operational overhead | HyperBunker is delivered as a subscription-based service that includes hardware, maintenance, and recovery support via local partners. Day-to-day operation is limited to monitoring and test restores, with no continuous network connectivity to production systems. |
| Infinidat | Significant Strides to Make Enterprise Storage More Power-Efficient and More Compact | In 2025, power efficiency emerged as a priority. With the enormous amounts of power that AI requires, the level of efficiency of storage systems took center stage. Infinidat addressed it by launching a new, more compact InfiniBox G4 system that delivers superior Green IT power efficiency − 31% smaller physical configuration for a more efficient power profile, 28% more capacity in a smaller footprint, and 45% reduction in power per petabyte, resulting in a greener, more sustainable IT solution. | Increasing Traction of Next-Generation Data Protection | The need for cyber storage resilience accelerated in 2025. Emerging from this trend was the momentum advancing next-generation data protection, which is an advancement over modern data protection and traditional data protection. The latest developments of innovative capabilities, such as Infinidat’s InfiniSafe Automated Cyber Protection and InfiniSafe Cyber Detection, part of Infinidat’s InfiniSafe solutions suite, deliver proactive cyber safeguards against cyberattacks. | Optimizing Enterprise Storage for AI Workloads and Applications | Retrieval-Augmented Generation (RAG) is the enterprise storage “killer app” that combines AI and storage in a powerful way. RAG combines the power of AI with an enterprise’s private data to make AI more accurate, performant, and relevant. With Infinidat’s AI RAG reference architecture, enterprises utilize Infinidat’s InfiniBox and InfiniBox SSA enterprise storage systems as the basis to optimize the output of AI models, without the need to purchase any specialized equipment. |
| Leil Storage | Shortage of capacity | Due to AI demand (mostly), there is an overwhelming shortage of capacity, and this boosts the importance of larger drives, be it for flash or HDDs. | Extreme data growth in general | Reflected very well in the stock exchange positions of all main vendors of flash and HDDs: they are reaping the benefits of the boom. No one knows how long it will last, but it is a very good time to be in the storage industry! | Widening adoption of SMR drives | A few announcements have been made on the topic. Some are merely smoke for now and simply manifest plans for the future. However, it is indisputable that the efficiencies associated with SMR are better and better understood by mass data users, and this drives adoption. |
| Lucidity | Most Organizations Use Only 30% of Their Cloud Block Storage | Based on more than 600 cloud storage assessments across industries and company sizes, most organizations continue to underutilize their cloud block storage by nearly 70%. The primary drivers are limited visibility into true storage utilization and the inability to automatically shrink over-provisioned volumes at scale. | Over 1.2M orphaned disks leading to over $12M in lost business value | Across hundreds of storage assessments in a single quarter, the data is strikingly consistent: waste is driven by structural underutilization, persistent idle volumes, and an alarming number of orphaned disks. Organizations collectively carry 32,699 orphaned disks, costing over $1M/month with zero business value. These trends appear across industries and CSPs, indicating a systemic issue—not isolated misconfigurations. The findings underscore the need for continuous and proactive optimization. | True capacity pressure is rare – over-provisioning is cultural | Only 4–5% of disks exceed 80% utilization, and 95% of provisioning decisions overshoot actual workload needs. |
| Nakivo | The expansion of edge storage | Edge storage and analytics usage expanded as IoT growth pushed organisations to process and store data closer to the source to cut latency and costs while enabling smarter hybrid architectures. 2025 emerged as a transformative milestone, establishing edge computing — frequently integrated with cloud-based hybrid architectures — as a key aspect of contemporary IT infrastructure. This advancement drove smarter, faster, and more dependable operations across diverse applications and industries. | AI-powered storage shifted to mainstream | In 2025, AI-powered storage moved from being a “nice to have” to mainstream in storage infrastructure. Machine learning (ML) has been embedded in storage arrays and software-defined platforms to automate tiering, detect anomalies, and optimise performance in real time. In short, AI redefined what “smart” storage meant. | Cloud storage became universal | Cloud storage was the default. Multi-cloud and hybrid architectures became the norm, especially for backup and disaster recovery (DR), where scalability and cost control remained paramount. The ubiquity of cloud also highlighted a challenge: the importance of operational visibility. Managing data across multiple environments without a unified dashboard introduced unnecessary complexity and increased the risk of security blind spots. |
| Toshiba Electronics | HDDs Remain Essential Amid Growing AI and Cloud Storage Demands | HDD demand remained strong in 2025, even in compute-intensive AI workloads, creating component shortages across the industry. As the scale and complexity of datasets continue to grow, HDDs remain the primary medium for large-scale storage, particularly in applications where capacity, reliability, and cost-effectiveness are critical. For Toshiba, 2025 marked a year of strategic innovation and expansion. Toshiba expanded its HDD Innovation Lab in Düsseldorf, Germany. | Toshiba Verifies 12-Disk Stacking for Future 40TB HDDs | In 2025, Toshiba became the first company in the storage industry to verify 12-disk stacking technology for high-capacity 3.5-inch HDDs, setting the stage for next-generation 40TB-class drives targeted for data centers in 2027. By adding two disks to the standard 10-disk stack used in nearline HDDs, Toshiba leveraged advanced design and analysis technologies, including new dedicated stack components and the replacement of the current aluminium substrate with durable glass. | Toshiba S300 AI HDD Supports Growing AI Surveillance Storage Needs | HDDs continue to play a critical role in managing the exponential growth of AI-generated video data. Purpose-built for AI-driven workloads, the S300 AI supports continuous recording, real-time analytics, and large-scale data retention across centralised surveillance systems, video archives, and multi-bay RAID setups. This launch highlights how HDD technology continued to evolve in 2025, supporting large-scale storage, real-time processing, and enterprise-grade reliability. |
| Oxibox | Backups are the #1 target for cyberattackers | Backups concentrate valuable data — account numbers, credentials, personal identities — easily resold on dark markets. Destroying backups eliminates recovery options, maximizing ransom pressure. Backup software also has privileged access across environments, making it ideal for supply chain attacks; compromising one vendor grants access to thousands of organizations. Weak backup security turns a recovery asset into a data goldmine and catastrophic vulnerability. | 3-2-1-1 rule is the gold standard for backup security | The 3-2-1-1 rule layers multiple defenses against different failure scenarios. Three copies ensure redundancy. Two media types protect against format-specific failures (e.g., drive corruption). One offsite copy guards against physical disasters like fires or floods. The final “1”—an immutable or air-gapped copy—defeats ransomware by ensuring attackers cannot encrypt or delete all backups even with full network access. This layered approach means no SPOF can eliminate recovery options (see the rule-check sketch after this table). | Automated restoration testing is a must-have | 30-40% of backups fail during restoration—yet most organizations only discover this during a crisis. Manual testing is time-consuming and rarely performed consistently. Automation validates backup integrity continuously, verifies data recoverability, and alerts teams to failures before disasters strike. It also documents compliance with recovery objectives (RTO/RPO) and regulations. Without automated testing, backups provide false confidence—you only truly have a backup if you’ve proven it restores. |
| Plakar | Backup targeting hit an all-time high, making air-gaps vital | More than ever, cyberattackers actively hunted and destroyed backup repositories first. In this hostile landscape, logical separation failed. 2025 proved that strong air-gapping, by removing storage media from the network or using strict one-way protocols, is the only way to survive. Resilience strategies are now redefined around a simple principle: if your backup is permanently online and writable, it is essentially compromised. | The end of multilateralism made data sovereignty a real issue | 2025 confirmed that sovereignty is not just about location, but about immunity from foreign laws. With the US CLOUD Act, China’s National Intelligence Law, Russia’s Yarovaya Law, and Australia’s TOLA Act, governments can compel vendors to hand over data. Consequently, organizations started prioritizing architecture over contracts, seeking to keep encryption keys and storage control strictly out of vendor reach to ensure immunity. | The cloud safety illusion continued to leave SMBs exposed | While large enterprises improved their posture in 2025, the shared responsibility gap widened for SMBs and scale-ups. Too many continued to mistake high availability for data resilience, assuming SaaS and cloud platforms natively secured their data. This persistent misconception left a massive segment of the economy scaling up without safety nets, operating totally unprotected against corruption, accidental deletion, or cyberattacks. |
| PoINT Software & Systems | Integration of Tape Storage in AI Workloads | Due to the immense energy requirements and exponential data growth in AI workloads, intelligent tiering of inactive data from primary object storage to tape libraries has become an effective solution for addressing data growth and energy issues. | Cloud Data Repatriation to on-prem Storage with Tape | Many companies have realized the drawbacks of storing their data with cloud providers, especially hyperscalers. In addition to high transaction, storage, and egress fees, there are technical and strategic reasons for “cloud repatriation” with the integration of on-prem S3-based tape storage. | Traditional Backup with S3-to-Tape | The importance of traditional backup using tape has increased in 2025 due to the ongoing rise in cybercrime threats. Backup applications now integrate tape via modern, object-based methods, such as S3-to-tape. |
| Pure Storage | Organisations need an Enterprise Data Cloud to focus on managing data, not storage | Ever-increasing organisational complexity means businesses need simplicity so they can focus on higher-value tasks. An Enterprise Data Cloud, an architectural approach to data and storage management, enables IT teams to centrally manage a virtualised cloud of data with unified control: on-premises, public cloud, and hybrid. Organisations can manage data at scale, reduce risk, and gain increased control and insight across all environments – ultimately focusing on business outcomes, not infrastructure. | Data sovereignty has become a critical business concern | Data sovereignty has moved beyond the strict concerns of compliance, becoming a much larger debate around competitiveness, innovation, and trust. A staggering 100% of respondents to a recent Pure Storage survey stated that data sovereignty concerns, including potential service disruptions, have made them reconsider where data is located, and 92% agreed that the current geopolitical environment has increased the risks of not dealing with data sovereignty. | Organisations need support to navigate modern virtualisation challenges | The virtualisation disruption around pricing and licensing model changes left many customers frustrated and reassessing options. Concerns over costs were the catalyst, but the underlying driver is strategic: aligning infrastructure with modern application architectures, AI readiness, cloud-native practices, and long-term agility. Organisations want the freedom to choose the virtualization platform that best aligns with their strategy while maintaining high performance, resilience, and data services. |
| QStar Technologies | QStar aligns modern archive software with AI-driven storage demands | As AI and HPC workloads accelerate data growth, QStar Technologies’ archive software bridges S3 object storage and tape systems, supporting multipart uploads and seamless data movement between disk, cloud, and tape. This approach enables organizations to scale archival capacity efficiently while maintaining high performance in multi-node HPC environments. | Tape and Private AI gain strategic relevance in archive architectures | The rising demand for tape storage—driven by AI training datasets and long-term HPC data retention—has reinforced tape’s role as a cost-effective and energy-efficient archive tier. QStar’s software complements this trend by enabling secure, S3-compatible tape archives that integrate with Private AI environments, ensuring data sovereignty, compliance, and protection against cyber threats through air-gap, replication, and media isolation. | Modern, hardware vendor-agnostic archive software enables cost-effective cloud repatriation | QStar Technologies delivers a modern, hardware vendor-agnostic archive architecture that simplifies large-scale data management while protecting existing investments. QStar Global ArchiveSpace provides a unified namespace across disk, object, cloud, and tape, acting as a replacement for aging legacy HSM solutions such as Oracle HSM. QStar Network Migrator enables policy-driven data movement and cloud repatriation, reducing public cloud costs while maintaining performance and data protection. |
| SIOS Technology | HA strategy shifted from recovery to prevention | In 2025, the smartest organizations stopped treating failure as inevitable and started engineering it out. High availability moved from reactive recovery toward proactive design: monitoring, automation, and intelligent failover minimizing outages before users noticed. The expectation changed: systems shouldn’t just recover quickly; they should be architected not to fail at all. HA became less about surviving chaos and more about eliminating it entirely. | Patch Management Became an HA Discipline | By 2025, downtime wasn’t just caused by hardware failure or traffic spikes — faulty patches and delayed updates were just as likely to bring down critical systems. Organizations began treating patching as a high-availability function, not routine maintenance. Validating updates, staging rollbacks, and automating recovery became standard practice, as one bad version could take an application offline. HA meant not just surviving failure, but avoiding it through controlled change. | Hybrid HA became the enterprise default | The industry proved that “cloud-only” was a myth; hybrid became reality. In 2025, enterprises ran workloads across on-prem, cloud, and edge environments and HA had to follow them everywhere. The most successful strategies weren’t about one platform winning, but about best-in-class architectures working together to avoid downtime and avoid dependency on any single vendor. High availability became a way to preserve operational freedom as much as uptime, giving IT teams leverage instead of lock-in. |
| Spectra Logic | Tape Momentum Accelerated as the Foundational Tier for Cold Data at Scale | Momentum around modern tape storage accelerated as organizations scaled cold data environments across AI, research, media and regulated industries. As enterprises confronted exabyte-scale datasets, it became clear that neither disk nor cloud archive could match tape’s combination of cost efficiency, durability, longevity and energy efficiency. Tape’s ability to store massive volumes of infrequently accessed data with near-zero power consumption at rest continued to make it the optimal choice. | Power Availability Emerged as a Primary Constraint on Data Center Growth | Power availability became one of the most significant constraints shaping data center expansion and infrastructure planning. The rapid deployment of GPU-intensive AI systems placed unprecedented demand on electrical grids, exposing limitations in power generation, transmission, and local capacity. Storage infrastructure, long considered secondary to compute in power planning, came under increased scrutiny as always-on disk and flash environments contributed to rising energy and cooling loads. | Source-of-Truth Archives Emerged as a Requirement for AI-Era Data Integrity | Organizations increasingly recognized the need for source-of-truth archives to preserve data integrity. As generative AI, synthetic data, and automated content creation proliferated, concerns around provenance, integrity, and reproducibility moved from theoretical to operational. Enterprises, research institutions, and public-sector organizations began treating original datasets, metadata, and model inputs as authoritative records that must remain immutable and verifiable over long time horizons. |
| Starfish Storage | Good data management is emerging as a vital enabler of successful AI and analytical research | Meaningful AI and analytical research results depend on data with known provenance, which is fully annotated and understood. Getting results in a timely manner means being able to locate and consolidate data relevant to the problem, regardless of where it resides or who created it. | Many research environments are still playing catch-up when it comes to understanding their data | Multiple funding sources, competing research initiatives, and the explosive growth of data have contributed to what Alex Woodie of HPCwire calls the “data disarray” gap. Most infrastructure and research leaders do not have a complete picture of the data they have under management—what is being stored, who owns it, and how much it is costing them. | Heterogeneous storage environments have not gone away | Rising storage costs are sustaining demand for tiered storage solutions. The need for archive storage, including tape, is not going away. High performance storage is needed for data handling within AI ecosystems, but the sheer volume of data generated by AI and iterative research processes is sustaining demand for lower cost, cooler storage solutions. |
| StorMagic | Hyperconverged infrastructure sees rapid growth | 2025 has seen the rapid growth of hyperconverged infrastructure (HCI). As the impact of Broadcom’s acquisition of VMware is still being felt across the market, many customers have turned to HCI due to its resilient and reliable design. In fact, the global HCI market is projected to grow from $11.98 billion in 2024 to $61.49 billion by 2032, a huge indication that HCI popularity is on the up. | Edge computing makes strides | 2025 has been quite the milestone for edge computing, as it is now the fastest-growing segment of enterprise IT. Affordable hardware and lightweight software make it practical to deploy everywhere, while high-availability storage and virtualisation stacks have guaranteed uptime outside the datacentre. These latest advancements have transformed edge IT into a reliable extension of the cloud. | Migration difficulties after VMware’s acquisition | After VMware’s acquisition by Broadcom, IT leaders expected a smooth transition to other hypervisors, but the reality has been far more complex. Edge environments have proved particularly tricky when migrating to datacentres and consequently, the inability of traditional datacentres to perform at the edge has represented a significant setback for IT leaders this year. |
| StorPool Storage | VMware Exodus Leads to Adoption of Cloud-Native Orchestration Platforms | With price hikes and solution option changes from Broadcom’s VMware, more businesses are moving to open-source, no vendor lock-in cloud management / orchestration platforms. This has led to a higher adoption of the leading solutions in this area such as CloudStack, Proxmox, OpenNebula, Oracle Virtualization, OpenStack, and Kubernetes – all of which are based on Linux KVM virtualization. Even other commercial or proprietary alternatives, like Red Hat OpenShift and Nutanix Acropolis, are KVM-based. | Carbon Footprint Becomes a Factor in Purchasing Decisions | More organizations report that carbon footprint, mostly based on energy consumption, has become a bigger factor influencing data center purchasing decisions. In some cases this is driven purely by an interest in green initiatives, while in others it’s driven out of a desire to avoid penalties, take advantage of tax breaks, or out of pure necessity. Data centers in the most populous cities around the world are reaching maximum load, making power draw as much of a limitation as space. | Backup and Disaster Recovery Market Grew Significantly | As more organizations become aware of threats to their business, including but not limited to ransomware, 2025 brought huge growth to the backup and disaster recovery market. The largest growth trend was in the adoption of disaster recovery as a service (DRaaS) solutions as more businesses run on either hybrid or multi-cloud infrastructure. The DRaaS market is predicted to reach $15.82 billion by the end of 2025, which represents anywhere from 24 to 36% growth over the 2024 market. |
| TrueNAS | Sharp rise in all-flash adoption | TrueNAS is seeing massive growth in all-flash systems. All-flash sales grew more than 500 percent this year. Enterprises are adopting flash for its performance, reliability, and predictable scaling across mission-critical workloads. | Seamless fit into existing workflows | TrueNAS fits easily into existing customer environments. Teams deploy it with minimal friction, then use those early wins to identify additional TrueNAS use cases for new projects. This ease of integration is accelerating expansion across accounts. | Strong Community to Enterprise adoption | Community Edition continues to serve as a fast prototyping path. Customers validate TrueNAS in their environment before moving to production with TrueNAS Enterprise. This creates a self-reinforcing cycle of adoption across both products. |
| Tuxera | Ethernet-based storage is rapidly closing the gap with Fibre Channel | For many years, Fibre Channel (FC) has been the standard for mission-critical SANs. Ethernet-based solutions are gaining traction due to their speed, intelligence, and cost-effectiveness. Modern Ethernet storage, with protocols like iSCSI, NVMe/TCP, and NVMe/RDMA, is closing the performance gap, particularly for latency-sensitive workloads. This year, Ethernet storage is experiencing significant growth, and customers are looking towards hybrid block and file-level access, increasing flexibility. | Flash is becoming the norm | This year, as SSD price per GB falls, we have seen a considerable increase in flash-first system design. NVMe boosts random I/O and latency-sensitive workloads, so vendors are shipping NVMe caches, QLC capacity flash, and NVMe-oF front ends as default. All-flash NAS and SAN arrays are replacing hybrid disk tiers, with HDDs mainly for cold archives. These modern deployments do, however, require all surrounding infrastructure and software to keep up. RDMA (SMB Direct) is particularly important. | Security defences increase | As a direct result of increased ransomware attacks, infrastructure-integrated, immutable snapshots have begun to appear, providing instant recovery by eliminating data movement. This approach, combined with global deduplication, avoids network bottlenecks and rebuild processes, significantly reducing recovery time from hours to seconds. The addition of air-gapped backup policies further strengthens the defences and protects valuable assets. |
| VDURA | Software-defined durability is unlocking supercomputer-class throughput | Organizations started to demonstrate that efficiency and performance can coexist by achieving breakthrough throughput on commodity hardware. Shared-nothing architectures are redefining durability as a software capability, enabling supercomputer-class performance without proprietary systems. This shift is proving that innovation lies not in expensive hardware, but in the intelligence of software-defined resilience and scale. | On the sheer volume of data powering AI workloads | AI is no longer held back by compute. It is held back by data. Modern accelerators can only reach peak performance when the storage system delivers massive throughput without interruption. Flash-first architectures with intelligent hybrid tiers are essential for keeping AI pipelines consistently fed and free of bottlenecks. | On memory shortages and efficiency | Large-scale AI training has exposed every weakness in memory and I/O. Software-defined durability is solving this by delivering supercomputer-level throughput on commodity hardware. With a true shared-nothing architecture, performance, efficiency, and resilience all rise together. You do not have to trade one for the other. |
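
To make Oxibox’s 3-2-1-1 fact above concrete, here is a minimal, illustrative sketch of a rule check over a backup inventory. This is not any vendor’s implementation: the `BackupCopy` fields, the function name, and the example inventory are hypothetical, chosen only to mirror the four conditions Oxibox lists (three copies, two media types, one offsite, one immutable or air-gapped copy).

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    media_type: str   # e.g. "disk", "tape", "object" (hypothetical labels)
    offsite: bool     # stored at a different physical site
    immutable: bool   # immutable or air-gapped copy

def satisfies_3_2_1_1(copies: list[BackupCopy]) -> dict[str, bool]:
    """Audit a backup inventory against the four 3-2-1-1 conditions."""
    return {
        "3_copies": len(copies) >= 3,
        "2_media_types": len({c.media_type for c in copies}) >= 2,
        "1_offsite": any(c.offsite for c in copies),
        "1_immutable_or_airgapped": any(c.immutable for c in copies),
    }

# Example: primary disk copy, offsite object copy, air-gapped tape copy.
inventory = [
    BackupCopy("disk", offsite=False, immutable=False),
    BackupCopy("object", offsite=True, immutable=False),
    BackupCopy("tape", offsite=True, immutable=True),
]
checks = satisfies_3_2_1_1(inventory)
assert all(checks.values()), f"3-2-1-1 gaps: {[k for k, v in checks.items() if not v]}"
```

In practice such a check would read real inventory from a backup catalog, and, as Oxibox’s third fact argues, it complements rather than replaces automated restoration testing.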






