Like every year, StorageNewsletter.com has asked vendors for their 2022 retrospective. We analyzed the responses and found the following major topics:
- This year, cloud in its multiple flavors (multi, hybrid, private, edge, SaaS, and as an operating model) ranks #1, appearing in more than 56% of all answers.
- Unsurprisingly, ransomware is #2 with 32%.
- Media in its various aspects (SSD, NVMe, tape, optics), data protection, and data management come in at #3 with 24%.
Arcitecta (Jason Lohrey, CTO)
Ransomware Went Mainstream
For as long as there have been computer systems, criminals have sought advantage by breaking into them. But something changed this year. Ransomware was no longer someone else’s problem – it became everyone’s problem. It affected organizations and it affected ordinary people with increasing frequency. People’s private data was disclosed, including medical records – some of which were highly personal. The war in Ukraine was a catalyst for increased cyber activity and awareness of the threats. Suddenly, no one was safe.
Into the Cloud, or Maybe Not
For the past few years, it’s been an accepted mantra that moving your data to the cloud is the best, or possibly only, option if you want to reduce storage costs. However, that’s not always the case. This year more companies and institutions were open to hybrid models where the cloud was one part of a folio of storage options aimed at balancing risk, resilience, and cost optimization – deployments that combine a mix of on-premises and cloud in a single global namespace.
We Realized There Is Too Much Data to Back Up
For years IT administrators have worked hard to back up an increasing tsunami of data, and with each passing year, that has become harder to manage. In some cases, backup has been abandoned altogether. That is a precarious place to be. Traditional backup, where systems are scanned to produce full and incremental copies to external storage, is no longer coping with the rate of growth ranging from petabytes to exabytes and beyond. Separate systems are no longer feasible – entities with large data holdings realized the current approaches need to change to one that is continuous and an integral part of the storage fabric.
Atempo (Ferhat Kaddour, VP sales and alliances)
With data security being a recurring concern for the past couple of years due to ransomware attacks, data immutability was a major expectation and 2022 was THE year of immutable backups.
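Immutability of the kind described above is typically achieved with WORM (write-once-read-many) object storage. As a hedged illustration, here is a minimal sketch of how a backup job might request that behavior with Amazon S3 Object Lock; the bucket name, key, and retention period are hypothetical, and Object Lock must have been enabled when the bucket was created:

```python
from datetime import datetime, timedelta, timezone

def immutable_put_args(bucket, key, retention_days):
    """Build the arguments for an S3 PutObject call that writes a
    WORM backup copy. In COMPLIANCE mode the object cannot be deleted
    or overwritten, even by the root account, until the retain-until
    date passes."""
    retain_until = datetime.now(timezone.utc) + timedelta(days=retention_days)
    return {
        "Bucket": bucket,
        "Key": key,
        "ObjectLockMode": "COMPLIANCE",
        "ObjectLockRetainUntilDate": retain_until,
    }

# A real backup job would pass this to boto3, e.g.:
#   s3 = boto3.client("s3")
#   s3.put_object(Body=backup_bytes,
#                 **immutable_put_args("backup-vault", "db-2022-12-31.bak", 30))
```

COMPLIANCE mode (rather than GOVERNANCE) is what makes such a backup a credible ransomware defense: even a fully compromised administrator account cannot shorten the retention window.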
Due to the continuous explosion of unstructured data volumes, keeping track of all data operations to ensure data compliance, security, confidentiality, and privacy is increasingly complex in open environments. This led IT teams to provide users with pre-configured self-service data movement portals.
We saw the early signs of converging structured and unstructured data in big data environments with a clear need for viewing and analyzing all data types from a global storage-agnostic data management platform.
Axcient (Rod Mathews, CEO)
Hybrid Cloud environments are now the norm, changing the strategy, economics and management of backup and disaster recovery
● The costs of the public cloud are rising, challenging traditional economic models, and prioritizing compute-side control of rogue workloads.
● Overlapping data protection solutions that cover use cases spanning both on-prem and cloud have created tech stack complexity, management inefficiency, and security risks.
Economic pressures on cost control grew but created opportunities for MSPs*
● In the channel, 93% of SMBs are concerned about the impact of the economic climate on their business, and over 40% of MSPs are concerned about rising costs.
● However, pressures on SMBs to lower fixed costs and reduce risk have created opportunities to outsource IT services to MSPs.
● The challenge for MSPs is to optimize their operations to scale efficiently and take on new opportunities profitably.
BCDR 2.0: Recovery is In, Backup is Out
● Backups consistently fail, and industry estimates are that up to 40% of data is unrecoverable. Businesses say this contributes to what they call a “protection gap” of up to 89%.
● Today, backup alone is insufficient. The conversation and priority have shifted to enabling business continuity and implementing a strategy that makes business recovery – not just data recovery – simple, rapid, and reliable.
* Source: Optimizing BCDR Can Help MSPs Fight Off Recession and Grow, Channelnomics and Axcient, October 2022
Burlywood (Tod Earhart, CTO and founder)
SSD Industry focus on Solutions to Latency Issues Intensifies
The demand for SSDs with improved latency and latency consistency continues to explode as application complexity and real-time demands intensify. Applications including virtual machines and containers, artificial intelligence and machine learning, low latency edge applications, big data, or fast data analytic workloads can no longer tolerate the inconsistent latency and large latency spikes exhibited by commodity data center SSDs. The industry continues to search for solutions. The problem is exemplified by the large-scale initiatives driven by hyperscalers requiring major architectural changes to SSDs and the storage software layers above them. These initiatives (such as Flexible Data Placement, Zoned Namespaces, and Software Enabled Flash) continue to gain momentum and are beginning to be adopted and refined by standards bodies. Unfortunately, these solutions are out of reach for most data center SSD users due to their significant impacts on the software stack.
Workload Aware SSDs™ Hit the Market
The prevalent use of legacy, HDD-based benchmarks as evaluation criteria and design targets for commodity data center SSDs is a key contributor to the serious latency issues the industry is dealing with. The standard benchmarking and qualification tools do not expose the performance variability seen under real production workloads. Workload aware SSDs are evaluated and optimized against real workloads and exhibit huge improvements in latency behavior, performance consistency, and endurance over life. In 2022, Swissbit began offering workload aware SSDs built with Burlywood’s patented FlashOS™ technology.
Multiple SSD Entrants into Data Center Market
Data Center demand for storage continues to grow in the face of an economic slowdown with an increasing need for data center SSDs. It is also becoming more apparent that the commodity SSD offerings from the major NAND manufacturers are not meeting all the requirements for SSDs under complex, demanding workloads. This has opened up opportunities for new Enterprise SSD manufacturers to enter the data center SSD market.
Catalogic Software (Ken Barth, CEO)
Supply chain disruptions from Covid-19 lockdowns continued to hamper business, but the big news was the geo-political environment in Europe that has impacted the energy markets and encouraged bad actors to increase their cyberattacks, especially on vulnerable IoT and IT systems.
Cyber resilience became the most important vendor requirement for enterprises to combat cyber-attacks across the entire IT environment, both on-premises and in cloud applications. Every vendor is working to harden their applications and solutions and add more self-monitoring and detection.
Data protection (backup and recovery) solutions have become the last line of defense against data loss from cyber-attacks. Backups on secondary storage need to be protected from being discovered and compromised by ransomware, and IT and backup admins need to be able to verify that a backup can be used for recovery when needed.
Ctera Networks (Aron Brand, CTO)
Economic turbulence drove organizations to look to technology to automate, reduce infrastructure spend and maintain the status quo.
Enterprises are embracing the cloud to provide the agility and flexibility that they need to compete. Cloud transformation is essential for these organizations if they are to survive and thrive. They can scale up or down as required, without having to make significant capital investments in new hardware, particularly important given today’s uncertain economic conditions.
Increased adoption of hybrid cloud storage. Users understand that with hybrid cloud storage, files are continuously sent to the cloud resulting in a short recovery window, and virtually no data loss. Hybrid cloud storage also provides instant DR, as the data is stored in the cloud and can be accessed immediately in the event of a cyberattack.
Data Dynamics (Piyush Mehta, CEO)
Unstructured Data continues explosive growth and sprawl, continuing to eat away at budgets
The cost of storing and managing this unstructured data can be significant. IBM estimates the yearly cost of poor-quality data in the US is $3.1 trillion, and this figure is expected to rise in the coming years if enterprises do not keep a close eye on their unstructured data growth and ensure that they have the necessary technology and processes in place to manage this valuable asset effectively.
Cloud Storage alone is not cost-effective; you need to be smart about the cloud
Moving to the cloud is easier said than done. 80-90% of cloud adoptions fail due to budget overspending, risk, and the biggest of all – unstructured data. Dealing with unstructured data is like a black hole of unknown possibilities and risks. Enterprises are unaware of what’s in there and what they must prepare for. Furthermore, they are entirely oblivious to sensitive information in the sprawl and have no way of securing it while moving to the cloud. They resort to the traditional lift-and-shift approach without data analysis and are prone to incurring more cost, time, and risk. Enterprise data must be analyzed, secured, and checked for compliance before it is migrated – not the other way around. The key is to break free from traditional lift-and-shifts and adopt a data-driven approach.
Data swamps are the biggest obstruction to data analytics
A data swamp is a collection of unstructured data that is so large and complex that it is difficult to process and extract value from it. This trend is only going to continue as we generate more and more data. Gartner coined the term in 2017, and the problem has only worsened. The amount of digital information created daily is increasing exponentially, showing no signs of slowing down anytime soon. We are quickly approaching the point where traditional storage and processing methods cannot keep up.
DataCore (Abhijit Dey, CPO)
Covid cloud boomerang
The world is getting back on its feet post-Covid; however, most businesses are still experiencing unstable hybrid workflows. There is more pressure on companies to gain better control over remote workforces due to the unpredictable cost of the cloud.
Significant increase in spend on IT, mostly due to the pandemic, has slowed or stopped
Companies are giving more serious consideration to where and how budgets are spent. One area they are still concerned with is IT losing control of physical security, as more individuals either work from home full time or adapt to hybrid scenarios. As a result, companies are looking at redirecting funds to this area: IT’s job has gotten much harder, and the need to invest in remote security has increased.
New breed of widely scalable cyberattacks was prevalent in 2022
Cyberattacks have become more commonplace. They have shifted from stories you read about in the news to attacks that hit closer to home – either through personal experience or through business.
Datadobi (Carl D’Halluin, CTO)
Continued unstructured data accumulation has forced many organizations to consider new approaches to managing the growth and lifecycle of data. Cost and risk considerations are two major contributors. Risk is a consideration because as data continues to age, people in the organization have less knowledge of what it contains and who owns it. There’s also financial risk, as they might be paying to store and protect irrelevant personal data.
Unstructured data storage on NAS systems is still prevalent since many applications have not been re-developed to leverage object storage. Many applications are built to read and write data over NAS protocols. Moving an application to the cloud and using object storage requires it to be rewritten against object storage APIs, so there’s still a lot of data stored on these on-premises systems.
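The gap between the two access models is easy to see in code. As a hedged sketch (the paths, bucket, and client object here are hypothetical stand-ins), the same byte-range read looks like this through a NAS mount versus an S3-style object API:

```python
# NAS-style access: the application opens a path through the kernel's
# file-system client (NFS/SMB); seeking and byte-range reads are natural.
def read_slice_nas(path, offset, length):
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

# Object-style access: the same read must be re-expressed as an HTTP GET
# against a bucket/key, with the byte range pushed into the request.
# (s3_client stands in for a boto3 S3 client.)
def read_slice_s3(s3_client, bucket, key, offset, length):
    rng = f"bytes={offset}-{offset + length - 1}"
    resp = s3_client.get_object(Bucket=bucket, Key=key, Range=rng)
    return resp["Body"].read()
```

Every such call site has to change, and in-place partial writes have no direct object-storage equivalent at all, which is why the rewrite the paragraph describes is rarely trivial.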
Major cloud service providers have increased the number of vendor-branded file services in order to provide organizations with a familiar platform for file storage – but with the benefit of using a managed service. The managed file services provide customers three main benefits: a simple “on-ramp” to migrate their legacy application to the cloud provider’s environment; investments made in training, documentation, processes, and administration; and, as a managed service, the cloud provider takes care of the underlying hardware and operating environment.
DDN (James Coomer, SVP of products)
The expansion of large AI systems, enterprise and national class AI supercomputers, further drove the need for parallel file systems across many markets for scalability and performance requirements. Manufacturing, life sciences, education and government verticals saw the increased provisioning and growth of parallel file systems in their environments.
IT departments are now demanding Opex for storage systems regardless of locality. Public cloud, hosted cloud or on premises systems are being sourced on a pay-per-use basis to avoid expensive upfront capital costs. Additionally, skills shortages have led to a sharp increase in demand for fully managed services on systems deployed in customer data centers. The storage market is maturing faster than other infrastructure areas and vendors must be able to provide flexible pricing, features and services to meet these requests.
Customers continue to demand flexibility in storage media options. The emergence of lower cost QLC storage systems has introduced an additional choice for customers looking to optimize performance and capacity, yet the continued capacity increases in disk drives keeps them as the primary choice for lower-cost cold storage. NVMe flash or storage class memory are now the de facto choices for primary workloads, but hybrid systems continue to see strong demand, especially in systems that use automation to remove the administrative overhead of managing data between tiers.
DH2i (Don Boxley, CEO and co-founder)
Back-to-office mandates – the struggle was real
For many CEOs, the question was to do, or not to do, or to do so in a hybrid fashion. Those that demanded their employees come back to the office oftentimes experienced quite a bit of pushback. In some cases, it was even a deal-breaker, helping to contribute to the recent “Great Resignation” phenomenon. Some of those who could not afford to quit stayed, but they weren’t happy about it; and no organization wants to employ a disgruntled workforce. So after many CEOs really put their foot down, many decided the smart thing to do was to gently walk it back a bit.
The software defined perimeter (SDP) moved from marketing hype to real-world proven
Over the past year, we learned that traditional approaches to data security were not up to snuff for the way we work today. Take for instance virtual private networks (VPNs). Even today’s most up-to-date VPNs rely upon complex, expensive and less-than-secure network-to-network approaches that create too large of an attack surface. Because of this, the Software Defined Perimeter (SDP) moved from the innovator into the early adopter phase, as SDP offers a much faster, easier to manage and dramatically more secure method for connecting people, and people to places. In other words, SDP moved from marketing hype to people using it and seeing the value in it.
Kubernetes (K8s) crossed the chasm
K8s is becoming the standard for container orchestration. According to the Cloud Native Computing Foundation (CNCF), “As of Q1 2021, 57% of backend developers had used containers in the last 12 months, but only 31% of developers used Kubernetes to orchestrate these: 5.6M developers in total. Their overall usage of Kubernetes has increased by 4 percentage points in the last 12 months. Kubernetes thus seems to exhibit a distinctive positive trend within the cloud native space, and there is arguably still room to grow.” This is because developers have figured out that they can develop code and deploy it much faster if they containerize. Likewise, developers are finding that nothing else compares with the speed at which Kubernetes can automate the deployment and management of containerized apps.
Folio Photonics (Steven Santamaria, CEO)
Shaky Earnings Unveiled the Storage Buyer Oligopoly
The past year saw several storage vendors underperform their investors’ expectations. This was largely due to the oligopoly of a few large hyperscalers/cloud service providers in the storage media market. When a handful of customers consume over half of any given product in an industry, it creates an extremely delicate ecosystem. Unfortunately, it was the storage media suppliers that were exploited by this system throughout the previous year. Hyperscalers over-procured media in 2021, then halted orders, leaving the media suppliers to miss their revenue projections and lower future earnings expectations.
Hardware-As-A-Service Models Piqued Interest
The storage industry has recently seen the effects that traditional hardware sales models can have with supply chain issues playing a part in falling revenues and decreased unit shipments. It was cause enough for several vendors to dabble with changing the way storage hardware is bought and sold. Several vendors began to test hardware-as-a-service, pay-per-use models. This allows for lower upfront capital expenditures from end-users and more predictable revenues year over year in the long run. This model saw an uptick in adoption rates over the past year while slowly intriguing more and more users across every industry.
Next-Gen Technologies Gained Momentum
The exit of Intel’s Optane business highlighted just how difficult it can be to successfully create, manufacture, market, and create adoption of a new technology. With this being said, there is a greater need than ever to develop storage technologies with new roadmaps as existing technologies continue to delay new generations and underdeliver on their prospective trajectories. This was shown through the increased interest in next-generation storage technologies in 2022. Across several headlines and conferences, there was a continued presence from DNA, glass, and multilayer fluorescent storage technologies all with the aim to increase storage capacity while driving down cost.
Fujifilm Recording Media (Rich Gadomski, head of tape evangelism)
Sustainability Became Priority
Many conferences throughout 2022 focused on sustainability as a priority in IT operations: Data Center World, Flash Memory Summit, Open Compute Summit, and SC22, to name a few. The most notable was the Open Compute Summit, which announced sustainability as its 5th tenet. Given the undeniable impacts of climate change and the energy crisis due to grid limitations and geo-political tensions, carbon reduction and energy conservation seriously influenced data storage strategies. Active archive solutions incorporating automated tape systems can help reduce energy consumption by 87% and CO2 emissions by 97% compared to HDD alternatives.
Cybersecurity Concerns Lingered
Ransomware-as-a-service multiplied and remained a top C-suite imperative in 2022. Getting cyber insurance as liability coverage in the event of a data breach or ransomware attack also became more popular but not easier. Underwriters require stringent cyber-security measures, including regular backups with one copy offsite and air-gapped. This requirement stoked renewed interest in modern tape as a fail-safe and an affordable air gap line of defense against bad actors.
Uncertain Economic Conditions Emerged
Declining shipments of flash and HDD indicated slack storage demand amidst uncertain economic conditions. Fortunately, tape deployments remained a long-term strategic solution to the data volume and growth challenges facing customers from hyperscale to HPC to traditional enterprises. With the rising value of data and a heightened awareness of data temperature, the TCO advantage of tape continued to be compelling.
Hammerspace (Molly Presley, SVP marketing)
IT supply chain challenges compelled new approaches to storage and data management
With supply chain disruptions caused by factors such as the Covid-19 pandemic, geopolitical events, transportation and labor disruptions, and more, organizations implemented new approaches to move their data and workloads to where the infrastructure is readily available. Organizations had to find new, innovative ways to orchestrate data to the compute resources or storage capacity available when procurements were slow or completely on hold.
Many organizations had to automate Burst to the Cloud to access sufficient compute resources for their large workloads
IT teams could no longer depend on Moore’s Law to accelerate compute performance at the same exponential pace enjoyed in the past. Data-driven innovators built workflow automation to burst to the appropriate cloud services and most cost-effective cloud regions to have the necessary compute power.
Software engineering talent needed to be accessed from anywhere in the world
Organizations needed to leverage the talent that resided all across the world. Many IT teams made data a global resource by using tools and processes to automatically and securely provide remote data access and efficient collaboration to the remote workforce.
Hitachi Vantara (Radhika Krishnan, CPO)
IT leaders increasingly understood the need to utilize new data management tools to remain competitive
Organizations continued to collect massive amounts of data from across their systems but realized very little relative value from that effort. In fact, 97% of data in the enterprise still goes unused, and less than 1% of it is analyzed. When the data foundation is well-constructed, a company can enhance both its agility and its ability to monetize data and apply it to the pursuit of business objectives.
Leadership changes signaled a shift to advanced data culture
Increased focus on data analytics and its success in extracting value left companies reconsidering the makeup of their organizations and leadership teams in 2022. Chief data officers are now at the forefront of implementing not just effective data strategy but helping to define and deliver wider business success and value.
Data analytics needed to demonstrate CSR progress
The companies that will be the most successful in achieving their sustainability goals will be those who are fastest to embrace data as foundational to their operations. Knowing and understanding the data across the enterprise landscape can help companies reach sustainability goals and become carbon footprint neutral.
Infinidat (Eric Herzog, CMO)
Launch of InfiniBox SSA II
In April 2022, Infinidat unveiled the InfiniBox SSA II, the second-gen solid state array in the company’s portfolio of storage and cyber-resilient solutions. Delivering 35μs of latency, it is the industry’s fastest all-flash storage array, with lower latency than comparable systems – according to several storage industry analysts. This enterprise storage solution is well-suited where the highest levels of performance, reliability and availability, extensive cyber storage resilience, and cost-effectiveness are required for the most demanding applications and workloads. It also features the combination of the cyber-resilient capabilities of InfiniSafe, performance enhancements well beyond the first-gen InfiniBox SSA, and expanded InfiniOps integration capabilities for a storage system that stood out in 2022.
Launch of InfiniSafe
The firm caught the attention of the storage market in 2022 with the introduction of InfiniSafe, which provides comprehensive cyber storage resilient capabilities. InfiniSafe combines immutable snapshots of data, logical air gapping (local and/or remote), a fenced forensic environment, and virtually instantaneous data recovery for the entire InfiniBox, InfiniBox SSA II, and InfiniGuard platforms. Logical air gapping creates a gap between the source storage and the immutable snapshots, while remote air gapping sends data to a remote system. The fenced forensic environment provides a safe location to conduct forensic analysis of InfiniSafe snapshots to identify a copy of the data that is free from malware or ransomware and can, then, be safely restored. Once a dataset without malware or ransomware has been identified, the data can be recovered, regardless of the size of the dataset, in minutes and made fully available to the backup software for restoration.
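The fenced-forensics workflow described above is a general pattern, not unique to any one vendor. As a hedged sketch (this is not Infinidat's implementation; the snapshot records and the scanner callback are hypothetical), the recovery logic amounts to walking immutable snapshots newest-first and restoring the first clean one:

```python
def pick_clean_snapshot(snapshots, scan_for_malware):
    """Generic fenced-forensics recovery pattern: walk immutable
    snapshots from newest to oldest, scan each in an isolated
    ("fenced") environment, and return the first one found to be
    free of malware or ransomware."""
    for snap in sorted(snapshots, key=lambda s: s["taken_at"], reverse=True):
        if scan_for_malware(snap):  # True means the copy is clean
            return snap
    return None  # no clean copy found: fall back to older archives
```

Starting from the newest snapshot minimizes data loss (the recovery point), while the fenced environment ensures the scan itself cannot re-trigger the malware on production systems.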
The rise of guaranteed SLAs
Infinidat’s big push with guaranteed SLAs in 2022 was a disruptive factor because it caught bigger storage incumbents off-guard and set a new precedent for storage solution vendors to guarantee the service level agreements they have with customers. In mid-2022, Infinidat made headlines with the industry’s first cyber storage guarantee for recovery on primary storage – the InfiniSafe Cyber Storage guarantee for the InfiniBox and InfiniBox SSA II primary storage platforms. It ensures that enterprises and service providers recover and restore their data at near-instantaneous speed in the wake of a cyberattack, using a guaranteed immutable snapshot dataset with a guaranteed recovery time of one minute or less. In October 2022, the company extended its cyber storage guarantees to the InfiniGuard platform. The InfiniGuard cyber storage resilience SLAs guarantee that InfiniSafe snapshots are immutable and that those snapshots can be recovered in 20 minutes or less on the InfiniGuard platform.
Komprise (Krishna Subramanian, COO/president and co-founder)
Global energy crisis and ongoing supply chain disruptions will add new requirements for storage managers
Beyond green data center strategies and purchasing energy-efficient hardware, IT will need to adopt sustainable data management practices to reduce data footprints on resource-intensive storage. This will entail getting detailed metrics on data usage and costs to make decisions for cold data tiering to object storage in the cloud (which also eliminates the need for multiple backup copies) or to identify data for deletion. These tactics will also help IT managers cope with supply chain shortages and months-long delays for procuring new hardware.
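The "detailed metrics on data usage" that drive such tiering decisions often start with nothing more exotic than file access times. As a hedged sketch (the 180-day threshold is an arbitrary assumption, and real tools track access patterns far more robustly than a one-off atime scan):

```python
import os
import time

COLD_AFTER_DAYS = 180  # assumption: files untouched for ~6 months are "cold"

def find_cold_files(root, now=None):
    """Walk a file tree and yield (path, size) for files whose last
    access time is older than the cold threshold -- candidates for
    tiering to object storage instead of expanding primary NAS."""
    now = time.time() if now is None else now
    cutoff = now - COLD_AFTER_DAYS * 86400
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            if st.st_atime < cutoff:
                yield path, st.st_size
```

Summing the yielded sizes gives the reclaimable capacity, which is the figure that turns a tiering proposal into a budget (and sustainability) argument.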
IT infrastructure costs are under the microscope, calling for extra focus on cloud cost optimization
Anecdotally, our customers are telling us that they are taking a cautious approach to moving workloads into the cloud, as cloud economics are not always better than on-premises. Global economic pressures will make the practice of FinOps more important than ever. IT and storage teams will need to work closely with their departments and use tools to understand data usage, so they can decide which data sets should move to which storage class in the cloud. Continuous data lifecycle management in the cloud will be critical.
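One concrete form such continuous lifecycle management takes is a storage-class lifecycle policy. As a hedged sketch (the prefix, day thresholds, and bucket name are hypothetical, and class names follow Amazon S3), data can be stepped down to cheaper classes automatically as it ages:

```python
def lifecycle_policy(prefix, ia_days=30, archive_days=180):
    """Build an S3 lifecycle configuration that transitions objects
    under a prefix to cheaper storage classes as they age, instead of
    leaving everything in the standard class indefinitely."""
    return {
        "Rules": [{
            "ID": f"tier-{prefix}",
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            "Transitions": [
                {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                {"Days": archive_days, "StorageClass": "GLACIER"},
            ],
        }]
    }

# A real deployment would apply this with boto3, e.g.:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="research-data",
#       LifecycleConfiguration=lifecycle_policy("projects/"))
```

The FinOps work is in choosing the thresholds: transition too early and retrieval fees dominate; too late and the standard-class storage bill does.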
Storage architect/engineering role adapts to cloud and talent shortages
As the IT talent war rages on, we’re seeing a shift in storage IT roles: experienced storage pros are moving to lucrative cloud architect/engineering roles. Meanwhile, IT generalists/junior cloud engineers are being hired to oversee storage among other areas. This is a conundrum, as IT organizations still require deep NAS expertise while growing the knowledge gap in cloud storage and hybrid cloud management. IT employees managing the storage function will need new skills beyond managing storage hardware to encompass data services – including facilitating secure access to data, adhering to governance requirements, and making data searchable/available to business stakeholders for applications such as cloud-based machine learning and data lakes.
MinIO (AB Periasamy, founder and CEO)
Object storage has become the dominant storage type. While object storage was always dominant in the public cloud, the accumulated evidence in 2022 proved that it has become the dominant storage in the private cloud as well. Snowflake and SQL Server both made object storage their primary options with the support of external tables, joining countless others that had already made the switch.
Multicloud is the dominant architecture and the economy is only strengthening the case for it. Cost of capital concerns have changed the calculus on the “all in” on the public cloud orthodoxy. Now enterprises look at the characteristics of the workload and make that determination – and the economics for predictable workloads favor private clouds/colos and thus the repatriation of those workloads.
Model9 (Gil Peleg, founder and CEO)
In search of cost-effective and innovative solutions, object storage is becoming an attractive target for mainframe secondary storage, thanks to better TCO than proprietary on-premises solutions and better performance.
Driven by the many cyber threats, the Russian invasion of Ukraine, and the vulnerabilities discovered, third data copies to protect against ransomware and other cyber events have become pervasive, especially in the financial services industry, with large corporations leveraging public cloud backups to perform standalone and/or clean-room recovery.
N-able (Chris Groot, GM, cove data protection)
2022 was the year the world accepted the requirement to back up SaaS applications on a mainstream basis. Of course, these services were used prior to 2021, but mainly by early adopters. Looking back on 2022, we saw a “crossing of the chasm”: the majority of buyers now accept SaaS backup as a requirement and are looking for the best way to manage it. The conversation has shifted from “why would I do that?” to “I must do this.”
Nakivo (Sergei Serdyuk, VP product management)
Continued adoption of hybrid cloud and rise of data protection and data management solutions
The past year saw the continued adoption of hybrid cloud, harnessing not only greater flexibility through the combination of on-prem and cloud environments, but also incurring increased complexity, as organizations were then tasked with addressing a whole new level of data security and management. Cloud integration also clearly demanded more comprehensive security measures to improve data protection and avoid potential breaches. However, the tenacity to maintain business continuity saw solution capabilities address some of these unique challenges, helping organizations to more safely integrate and orchestrate hybrid cloud.
Prioritization of data protection
2022 showed no signs of cyberattacks abating, with data breaches impacting millions of people and with ransomware still considered to be one of the biggest threats to organizations of all sizes. The unprecedented spate of attacks and how world events have unfolded over the past few years have undoubtedly affected how organizations look at data protection and recovery. As the fear of uncertainty impacts all organizations, large and small, the aim will be to continue to prioritize fast recovery in case of a disaster.
More stringent data protection regulations
As political and geopolitical factors continued to impact the industry in 2022, ensuring compliance has remained high on the agenda. New data protection and data privacy regulations continued to be introduced globally, impacting how businesses that collect and process data operate. The year saw further updates to the EU General Data Protection Regulation (GDPR) guidelines, imposing more onerous personal data breach notification obligations on organizations that are not established in the EU but are still subject to the extra-territorial provisions of the EU GDPR.
Nasuni (Russ Kennedy, CPO)
33% of ransomware attacks were targeted at manufacturing (20.7%) or retail (12.6%) enterprises (Source: Abnormal Security report)
Organizations in these industries are highly vulnerable to ransomware attacks due to the distributed nature of their operations and the need to share data across the organization. IT leaders in these industries must implement security policies and technologies to prevent these attacks from successfully penetrating their environment. These organizations should also look to deploy cloud-based solutions that will alert them to an attack in real time and streamline the process of recovering critical file data as quickly as possible to minimize downtime and return the organization to full productivity.
Growth of Microsoft Teams 2021 to 2022 145 million licenses to 270 million licenses (Source: Microsoft)
With the onset of the pandemic and the growth of remote and hybrid work, organizations are now required not only to protect sensitive corporate data inside their facilities but also to allow their knowledge workers to access that data seamlessly and securely from non-traditional office settings. IT leaders need to look for solutions that enable distributed collaboration among their workforce while, at the same time, protecting and securing critical file data without significantly increasing costs.
79% of organizations have experienced a ransomware attack with 47% experiencing attacks on a monthly or more frequent basis (Source: ESG)
Certain industries and businesses are particularly vulnerable to ransomware attacks, and if an organization decides to pay the ransom to get its data back, it is setting itself up for repeat attacks and vulnerabilities. IT leaders need to look for solutions that minimize the impact of an attack on their operation and enable rapid recovery of critical impacted data in order to return to full productivity as quickly as possible. Unfortunately, traditional data protection techniques are too vulnerable to sophisticated attacks and require extensive time to recover and return to full productivity.
Nyriad (Adam Roberts, field CTO)
It didn’t all come back, and it didn’t all go to the cloud
The theory that all data/workloads would move to the cloud and on-premises resources would be virtually eliminated did not hold in 2022. Instead, it became well understood that CIOs needed a hybrid approach to ensure data is stored in the technical environment best equipped to meet business, IT and budgetary requirements. This also gave rise to hybrid storage-as-a-service (hybrid STaaS). And for end users, the lines between consuming storage locally or in the cloud became increasingly blurred.
Industry began to overcome storage bloat
In a sense, the NAS and SAN providers dominated the market to such an extent that some of their solutions became bloated with unnecessary features and proprietary storage stacks. Inevitably, there was market pushback. The hyperscalers were the most vocal in this transition. They openly stated they would not buy expensive shared storage with unneeded features from large OEMs. And then others followed suit. Businesses were willing to accept the cost and complexity of DAS/SDS because it was still more economical than buying overpriced SAN/NAS solutions.
Data needs for M&E businesses continued to evolve
Studios and post-production facilities needed to store more and more data quickly and reliably to create high-quality content. At the same time, they needed to be able to access that data easily and quickly for editing and playback. As a result, M&E increasingly turned to storage solutions that had the ability to simultaneously capture content and playback in real time without a decrease in performance. This concurrency reduced time and improved efficiency – increasing the businesses’ profitability.
Panasas (Jeff Whitaker, VP marketing and products)
HPC industry is doubling down on massively scalable storage solutions
Panasas rolled out two new platforms in 2022, high-capacity ActiveStor Ultra XL and all-NVMe ActiveStor Flash, and customers have responded by doubling down on purchases of the ActiveStor Ultra XL platform. In a world where everyone talks about expensive all-flash storage, this trend showcases the growing need for more cost-effective scalable storage solutions to support massive-scale data environments and large HPC workloads.
Consolidated storage is becoming the norm in HPC and AI environments
Just as we saw in the enterprise space, HPC and AI environments are pursuing storage consolidation efforts and building out infrastructures that can support multiple workloads with diverse data types from a single storage solution. Data-driven organizations are showing us that data silos are becoming a thing of the past.
Testing HPC in cloud continues, but is stopping short of production migrations
The cloud providers continue to release more performant HPC compute solutions, and we are seeing customers test small applications in the cloud. What we aren’t seeing, however, is the movement of large-scale HPC storage environments to the cloud. In a couple of cases, sufficient performance can be obtained, but the steep cost to attain it outweighs the benefits. Large HPC workloads continue to be deployed in customer data centers.
Panzura (Jill Stelfox, CEO)
Rise of ransomware
In 2022, recovering from a ransomware attack cost seven times more than the actual ransom demand. Prevention is better than cure, so they say, but that’s plainly unrealistic in today’s threat environment. Ransomware is everywhere – you will get a ransom demand. Ideally, you would be able to mitigate the threat of ransomware with the ability to recover without losing data, and disarm it with a focus on improved monitoring. If a business can spot potential ransomware and deploy its recovery strategy within seconds of the attack, it drastically limits the damage ransomware can wreak. If you do experience an attack, you need to be able to restore pristine data without loss.
Data management is moving to the edge
We know that edge computing is beneficial, but we saw in 2022 how this distribution of files increased several risk factors and complicated quality-of-service monitoring. The first question asked when something disappears is: where was it last? That has become increasingly difficult to answer when files are encrypted and distributed. Monitoring for quality of service is quite hard when you can’t see what is in a file, yet it is necessary when data is stored remotely. All too often, this leads to data at the edge that isn’t as well protected.
Cloud spending in 2022 reached $490.3 billion, exceeding expectations. The majority of businesses now embrace a cloud-first approach to executing their digital transformation strategies. In fact, most network managers recently reported that their modernization to the cloud cost more than they expected, but was still worth it. However, speed, availability, and security continue to be ongoing challenges when migrating workloads – even more so the performance penalty suffered after migrating to the cloud, meaning files that used to open immediately can suffer a delay, which is why it is more important than ever to have a provider that overcomes this. Organizations need a high-performance, high-intelligence, cyberthreat-resilient data environment based on their specific, strategic business requirements. (Source: https://www.gartner.com/en/newsroom/press-releases/2022-10-31-gartner-forecasts-worldwide-public-cloud-end-user-spending-to-reach-nearly-600-billion-in-2023 and https://www.gartner.com/en/newsroom/press-releases/2021-11-10-gartner-says-cloud-will-be-the-centerpiece-of-new-digital-experiences)
Improve your digital immune system [per Gartner’s top strategic trends]
The most boring data management advice you’ll get in 2023? Focus on security hygiene. Against the backdrop of an increasingly difficult economic climate, businesses are going to be required to do more with less. The most cost-effective move for ensuring data protection is to prioritize getting the basics right. Patch management, for example, is not rocket science, but it takes consistent time and effort to understand and mitigate vulnerabilities.
Due to the rapid improvement of ChatGPT and other AI bots, cybercriminals will have a new weapon as they craft phishing attacks. We have become accustomed to identifying bad email with bad grammar. That is in the past since cybercriminals can now easily craft a realistic and believable email conversation. A new breed of anti-spam and data protection will respond to the increase in employees who repeatedly fall victim to phishing attacks.
2023 will show an increase of Snowden-like events in the corporate world where a disgruntled employee infiltrates large amounts of data in an attempt to blackmail or expose a company to scrutiny. Edge computing, growing storage requirements, unsecured hybrid cloud volumes and the work from anywhere trend can make data more exposed and stretched IT teams vulnerable.
Point Software and Systems (Thomas Thalmann, CEO)
S3 to Tape for backup of object storage
More and more companies have realized that data availability is not the same thing as data protection. Therefore, in addition to high data availability, backup of an object storage system has become mandatory. The backup is often performed with a tape-based system, which achieves both a media break and an “air gap”. An S3-to-tape approach is used as the backup software, as it stores objects in native format and allows direct access to the backup data.
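The native-format point above can be sketched as a hypothetical staging step: objects pulled from an S3 bucket (the paginator and transfer plumbing are elided) are written out under their original keys for the tape system to migrate, so individual objects stay directly addressable on restore. Names and paths here are illustrative assumptions, not any vendor's actual implementation:

```python
from pathlib import Path

def stage_for_tape(objects, staging_root):
    """Write object-store contents to a tape-staging directory in their
    native format, preserving key paths so each object can be restored
    individually. `objects` is an iterable of (key, bytes) pairs -- in
    practice these would come from an S3 list/get loop, elided here."""
    root = Path(staging_root)
    for key, body in objects:
        dest = root / key
        dest.parent.mkdir(parents=True, exist_ok=True)  # mirror the key hierarchy
        dest.write_bytes(body)                          # object stored as-is, no container format
        yield dest
```

Because no proprietary backup container is involved, a restore is just a read of the staged (or taped) file back into the bucket under the same key.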
Unstructured Data Management to handle data growth and archiving requirements
Exponential data growth is the biggest challenge in IT. As most of it is unstructured data that becomes inactive within a very short period, companies have implemented data management solutions to optimize the storage infrastructure while also fulfilling archiving requirements. These software solutions integrate multiple storage tiers, platforms, and locations (on-prem and off-prem).
Protocol Labs (Stefaan Vervaet, head of network growth)
In 2022, more large technology companies joined the Web3 bandwagon, establishing Web3 divisions and launching services that leverage the power of blockchain and open services. This year Google set up a new blockchain unit, bidding for web3 relevance, launched a blockchain node engine for Web3 developers, and introduced cloud-based blockchain node service for Ethereum. The increasing interest in blockchain from web2 giants indicates a growing number of applications for the technology and lends credibility to the space.
In 2022, we also saw more countries adopt Bitcoin as an official currency
This is especially important in the decentralized storage industry, where cryptocurrency is frequently used as an incentive layer for storage providers. As cryptocurrency becomes more widely accepted as a means of financial exchange, we can expect to see an increase in the number of storage providers and customers willing to use decentralized storage services in an open marketplace.
This year, IDC published a report on decentralized storage, predicting that it will eventually replace many public cloud-based storage services, lending additional credibility to the space
Pure Storage (Ajay Singh, CPO)
Rise of Sustainability
Sustainability made rapid strides up the corporate agenda. The perfect storm of geo-political developments (war, conflict between major powers) and macro-economic trends (inflation and the rising price of energy) meant that sustainability was thrust into the spotlight like never before. More and more businesses were asking the question ‘How sustainable is our IT infrastructure?’ This trend was particularly noticeable in Europe where sustainability was at the heart of almost all requests for proposals (RFPs). As a result, vendors whose technology infrastructure could facilitate more efficient space, cooling and power utilization were much in demand.
Evolution of Containers and Kubernetes
Containers and Kubernetes have become a driving force behind how the industry is reinventing the way we build and run applications, fueling enterprise IT efficiency. A recent survey by Pure Storage on the state of Kubernetes adoption highlighted that companies are running an average of 45% of their databases and data services on Kubernetes. In 2021, the use of Kubernetes increased as companies were forced to roll out new apps and services to adapt to remote work, market disruption, and financial pressures. The pace of adoption has accelerated for 79% of companies surveyed; however, smaller companies are embracing Kubernetes at a faster rate.
One of the main reasons why containers, deployed and managed via an orchestration platform like Kubernetes, have become so popular is because they provided a mechanism and format to package application code – with its dependencies – in a way that made it easy to run an application in different environments.
Increasingly modern applications are being backed by a growing number of SQL, NoSQL, search, streaming, and analytics data services. Managing so many disparate data services is a major challenge for developers. The next step in the evolution of containers is the rise of the managed database-as-a-service experience which will enable developers to focus on using data services, not managing them.
CIOs became savvy at identifying a true as-a-service model
Flexible storage consumption models have been at the top of the agenda for most CIOs and IT leaders for a few years now. But 2022 was the year when the discerning CIO was quick to spot the difference between a product on subscription and a true service. Customers recognized and firmly rejected legacy financial leasing models dressed up as STaaS.
A true STaaS offering is built around SLAs and service level obligations (SLOs) which are backed by data, observability, and telemetry of workloads to manage the tight timescales offered to customers. It offers a cloud experience; no business disruptions when migrating (or thereafter); easy installation; proactive management; flexible options for entry, expansion and exit; transparent pricing; flexibility of architecture, from cloud to on-prem to hybrid; automation capabilities, especially in upgrades and maintenance; and a seamless experience that does not vary whether the solution is delivered directly or through a partner. And none of these are part of the so-called as-a-service solutions that are currently flooding the market.
Retrospect (Brian Dunagan, VP engineering)
Ransomware aggressiveness and intelligence intensified
And, with the easy attainability of Ransomware-as-a-Service (RaaS), attacks became even more frequent and random. Organizations of all sizes became targets – from the most seemingly secure government agencies to the largest and most successful global businesses to small, independently owned stores.
Ransomware infiltrated organizations and attacked the backups first, before hitting the production data and demanding ransom
Organizations responded with multi-layered approaches to data protection and security. The first layer now includes data security software engineered to keep ransomware out. The next layer is focused on quickly detecting when ransomware or other malware has successfully breached these initial defenses. IT professionals found that this is best accomplished with a solution that includes anomaly detection, with customizable filtering and thresholds, which detects successful ransomware attacks in real time. The final step was deploying a backup solution – such as a 3-2-1 backup strategy, where you maintain at least three copies of your data, two local (on-site) but on different media, and at least one copy off-site. In this way, organizations were able to thwart the cybercriminals’ efforts, avoid paying ransom, and maintain uninterrupted operations.
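The anomaly-detection layer described above can be sketched as a simple change-rate check: a backup run that changes far more files than the recent baseline is a crude ransomware signal, since mass encryption rewrites everything at once. The window and threshold values are illustrative assumptions, not any vendor's defaults:

```python
def detect_anomaly(change_rates, window=7, threshold=3.0):
    """Flag backup runs whose change rate (fraction of files modified
    since the last run) exceeds `threshold` times the trailing
    `window`-run average. Returns the indices of suspicious runs."""
    alerts = []
    for i in range(window, len(change_rates)):
        baseline = sum(change_rates[i - window:i]) / window
        if baseline > 0 and change_rates[i] > threshold * baseline:
            alerts.append(i)  # change rate spiked vs. recent history
    return alerts
```

Real products layer filtering on top (by file type, entropy, rename patterns), but the principle is the same: compare each run against a learned baseline and alert on outliers.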
Immutable backups became another integral step in the backup strategy
Onsite and/or in a public cloud, an immutable copy of data is one that cannot be deleted or changed in any way. Immutable backups ensure that there is an untouched – and untouchable – version of data that is always recoverable and safe from ransomware or any kind of disaster.
ScaleFlux (JB Baker, VP product management and marketing)
Market has appetite for better, faster, stronger SSDs
The mainstream adoption of computational storage has begun. Two years ago, we believed that the market wanted the choice for something better than what commodity NVMe SSDs could deliver, and in 2022 our assumptions were validated. The market’s quick uptake of an NVMe-compatible drive powered by computational storage technology tells us the SSD industry is ready for a shakeup.
Technology critical to unlocking the promise of edge computing has finally arrived
We have heard about edge computing for a long time, but enabling the data collection and processing necessary to unlock its promise has been limited by a gap in technology designed to operate in edge environments. In 2022, we saw that begin to change. Infrastructure optimized to operate with limited power, minimal maintenance, and denser performance has become more widely available. Our customers who deploy systems at the edge have long asked for high-endurance, high-capacity, high-performance SSDs that also offload compute resources and add no complexity, and in 2022 we have finally been able to deliver that technology.
Fact: Sustainability requirements are now a thing
In 2022, we began to have customers asking how we could help them meet sustainability targets in their data center for power, waste, carbon footprint, and physical footprint. We know this is only the tip of the iceberg, no pun intended, and the industry has a lot to figure out. Still, storage is one of the obvious domains to scrutinize as these requirements get implemented and are already becoming a new way to win business.
Scality (Paul Speciale, CMO)
Continued rise of ransomware
Ransomware and malware threats continued to escalate in frequency and sophistication throughout the past year. Attacks now occur every few minutes, with each incident costing organizations millions of dollars and consuming countless IT resources. The price tag of such incidents reached record highs in 2022 – with the global average cost of a data breach now hovering around $4.35 million. In the face of these threats, backups are now one of the most mission-critical use cases for IT teams, especially those in high-risk industries such as healthcare and finance.
Public cloud grew this last year and will continue to grow, but there is backlash
Over the long term, storing petabytes of data entirely in the public cloud can be expensive and risky. Throughout the past year, we saw more companies moving data back from the cloud to limit costs and contain data in a more secure environment. With current rates of data growth, the cost meter is constantly running on S3 buckets in the cloud: consumption costs rise and monthly storage bills compound every year. Also, with data now being generated at the edge, far from the public cloud, a central on-prem hub to retain this data is becoming more critical. 451 Research found in 2022 that, over the past 12 months, 54% of survey respondents said their organizations had moved workloads or data away from the public cloud, citing information security, data locality, and sovereignty issues as the primary drivers. Interestingly, these organizations are moving significant portions of their workloads in the process. Of those that had moved workloads away from the public cloud in the previous 12 months, 27% relocated 26-50% of all their cloud-based deployments, and 26% had moved 11-25% of all deployments. Another 25% moved 51-75% of their deployments. Just 4% said they moved completely away from the public cloud.
Metadata growing at dramatic rates
Thanks to the expansive growth of unstructured data, metadata is exploding. The ratio of data to metadata used to be 1,000:1; today it’s closer to 10:1. It has become increasingly clear over the past year that the ability to harness this metadata to give organizations an advantage has become critical. IT teams must look at the foundational part of the storage software stack to make sure it can index and sort data. This is the only way to meet the demands of modern applications.
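The indexing requirement above can be illustrated with a toy inverted index over file metadata, so that queries like "all mp4 files owned by alice" avoid a full namespace scan. The field names and query shape are assumptions for the sketch, not any particular product's schema:

```python
from collections import defaultdict

def build_index(entries, fields=("owner", "ext")):
    """Build a per-field inverted index: field value -> set of paths.
    `entries` maps each path to its metadata dict."""
    index = {f: defaultdict(set) for f in fields}
    for path, meta in entries.items():
        for f in fields:
            index[f][meta[f]].add(path)
    return index

def query(index, **criteria):
    """Answer a conjunctive query by intersecting posting sets."""
    sets = [index[f][v] for f, v in criteria.items()]
    return set.intersection(*sets) if sets else set()
```

At petabyte scale the same idea is implemented with distributed key-value stores rather than in-memory dicts, but the structural point stands: the storage stack itself must maintain queryable metadata, not leave it to after-the-fact crawls.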
SIOS Technology (Cassius Rhue, VP customer experience)
We saw cloud becoming the de facto platform for ERP applications
As the use of advanced clustering software and added features in cloud computing have evolved, the cloud is becoming the de facto platform for ERP applications. Cloud computing is now making ERP systems accessible and feasible for smaller organizations in a wider range of industries.
We saw companies expand high availability measures as cloud migration and repatriation accelerate
Last year in the US, more than 80% of companies moved some public cloud workloads, apps, and data to local infrastructure, including private or hybrid cloud and sometimes a classic on-site data center. As companies move more of their stateful applications, such as SAP HANA, SQL Server, and Oracle, to the cloud, the need for cloud high availability has grown.
Increased demand for faster DR has become the foundation of enterprise data storage strategies
As the pandemic, climate change, and other massive threats to IT infrastructure continue to fill the headlines, DR has become one of the most important and highly demanded requirements of enterprises. This trend has made it crucial for C-suite and IT teams to partner with vendors that offer solutions that perfectly align with the organization’s company wide DR requirements.
Spectra Logic (David Feller, VP of product management and solutions engineering)
Ramping up of IT security defenses
With the continued onslaught of ransomware attacks last year, the focus on data protection and the security of IT infrastructures remained high on the agenda. As attack methods became more severe and sophisticated, new solutions and processes were put into place by organizations to safeguard data and ensure business continuity. On top of leveraging the use of physical and virtual air-gap, 2022 saw an uptick in defense capabilities, with the deployment of immutable storage, encryption, time-based snapshots, replication options, and multifactor authentication.
Discussion of on-prem vs. in the cloud workflows
2022 saw the increasing adoption of hybrid cloud, with organizations choosing to reap the rewards of flexibility and data accessibility that cloud offers, with the option to also keep data on-premises (either for legal requirements or where the cost of data repatriation from the cloud might be prohibitive). The shift to this approach meant organizations sought to understand which workloads to keep on-prem vs. in the cloud, and even how to manage multiple clouds, integrating that capability with on-prem storage for long-term protection and usage.
Explosion of “universal” data availability
The gradual adoption of more efficient hybrid and multi-cloud workflows in 2022 has provided organizations with unprecedented data access across multi-site and multi-cloud storage. With conversations focusing on how to integrate cloud, protect against cybercrime, and the long-term planning of data storage while optimizing budgets, 2022 saw a shift where data storage management solutions began to take center stage as a means of successfully orchestrating these disparate factors. Distributed multi-cloud data management tools have empowered organizations with the ability to have greater control of their digital assets, while saving costs and enhancing data usage and protection.
StorCentric (Surya Varanasi, CTO)
SMBs felt enterprise-class pain, but didn’t have enterprise-class budgets to combat it
While SMBs suffered from similar IT and business pain points as their enterprise counterparts, their budgets for acquiring innovative technology and the manpower available to manage it differed greatly. For example, data protection and security remained a top priority across SMBs and enterprises alike, especially in the face of an ever-escalating ransomware threat. What we saw over the past year was SMBs seeking solutions that offered enterprise capabilities, but could be delivered in smaller, more affordable and easier-to-manage form factors.
More and more vendors recognized one size does not fit all
As a result of the aforementioned trend, more and more vendors recognized and/or maintained the opinion that one size does not fit all, and instead offered enterprise-class features like in-line compression, data-at-rest encryption, intelligent data backup, cloud connectors and anomaly detection at size and price points ideal for SMBs (as well as enterprise edge use cases, which share many of the characteristics and resource constraints of SMBs).
Enterprise workloads requiring top performance increased
Real-time applications that collect, analyze and act on streaming data as it happens grew in number. As a result, innovative all-flash NVMe platforms entered the market, engineered from the ground up with the superior performance, capacity and flexibility necessary to address these apps’ speed requirements. In addition, with ransomware remaining top of mind, the prevailing NVMe platforms also offered the built-in features and functionality critical for ensuring data security, regulatory compliance and fast recovery from ransomware attacks.
StorMagic (Bruce Kornfeld, CMO)
Broadcom intends to buy VMware
The news has created massive confusion in the market. There are customers scrambling to find alternatives to VMware because of the anticipated increase in prices, and small channel partners are nervous that they will be ignored by Broadcom.
Supply-chain issues continue
The impact of Covid-19 continued throughout 2022 and impacted end users’ ability to deploy new solutions to help them modernize and improve their operations.
“Work from anywhere” culture
The pandemic caused all organizations around the world to adopt new technologies to keep business running with empty offices – and this approach is (arguably) here to stay.
Verge.IO (Yan Ness, CEO)
Acquisitions cause angst – VMware, potentially Nutanix, and others disrupt the industry. Customers remember what acquirers did to other acquired companies. This is a natural course of highly successful tech companies and has been happening forever, but when the acquisition is of core infrastructure it is much more disruptive to IT than an outlier utility.
Staffing gaps were exacerbated during Covid and there seems to be a new norm for the priority of work
● The continued demand and growth for all things IT and data is causing an increase in demand on staff. As IT infrastructure becomes more complex those demands are growing exponentially.
● We heard from many companies that they were understaffed and not staffed to plan through the entire year. This does seem to be changing as large tech is starting to downsize or freeze growth as they see storms on the horizon.
● There’s more distance between the IT professional and the infrastructure they use or manage than ever. Working from home and working from anywhere means remote management, administration and operations are a requirement. This remote IT worker will play out nicely as workloads continue to distribute (see predictions). It’s also helped a bit with the staffing challenges. You can now hire from anywhere – though so can your competitors.
The cloud is expensive at scale – customers need to focus on unit economics. The cloud became increasingly expensive as the era of cheap money came to an end in 2022
● Andreessen article, The Register.
● The cloud allows you to buy resources by the sip. You can rent infrastructure by the hour and scale it massively or use only a tiny bit. You can scale for a few hours a night (Netflix – a majority of their streaming happens at night in a given region) or for a few weeks (H&R Block needs lots of resources around tax time, none after). It’s friction-free adoption and scale. However, everything else is wildly complicated. Ever seen an AWS bill? Or ever seen how much an AWS expert (yes, it takes an expert) costs? It’s a lot. Ever discovered it’s friction-free but really easy to forget about? Many have.
● What’s odd to me is that the Cloud Craze has taken over sensibility. All that agility ends up being expensive.
● The analogy I use is housing
○ Hotel rooms are great for a night or weeks of nights. It’s extremely simple, agile (1 night or 2 weeks, anywhere), scalable (need 10 rooms or 1?), and outsources all services (cleaning, utilities etc.).
○ If you need something for a few months you’re better off leasing or renting a house or apartment. You pay the utilities and some maintenance but aren’t burdened with the Capex.
○ Staying some place for a year or so. Rent a house. You’ll assume even more of the operating but no capital expense. Your TCO per night will be dramatically lower than the above options.
○ How about 5+ years? You’d buy a house.
○ The hotel rooms are like the cloud to me. Agile, flexible, friction-free (no lease), use and go.
○ What you wouldn’t do is stay in a hotel for 4 years (unless you’re Howard Hughes).
○ So, why would you endure the high cost of cloud computing for more persistent workloads? I don’t think you should. I think CIOs, CFOs, and even IT managers and directors are becoming aware of this issue and are starting to slow their migrations, look elsewhere – at least for new workloads – and even repatriate some workloads.
● Accelerated distribution of productivity – work from home, from anywhere. The infrastructure barely passed the test. But will it in the future?
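The hotel-vs-house analogy above reduces to simple break-even arithmetic: rent (cloud) wins until the accumulated rent premium covers the purchase price (capex). All figures in the sketch are illustrative assumptions, not quoted prices:

```python
def breakeven_months(cloud_monthly, capex, onprem_monthly):
    """Months after which owning hardware becomes cheaper than renting
    equivalent cloud capacity. `onprem_monthly` covers power, space,
    and staff; returns None if cloud is already the cheaper run rate."""
    if cloud_monthly <= onprem_monthly:
        return None  # renting never costs more; keep the hotel room
    # capex is amortized by the monthly savings of owning vs. renting
    return capex / (cloud_monthly - onprem_monthly)
```

For example, a workload costing $10k/month in the cloud versus $120k of hardware plus $4k/month to run on-prem breaks even at 20 months; any persistent workload expected to live longer than that is a candidate for repatriation.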
Zadara (Noam Shendar, VP of WW solution architecture)
Storage commoditized at an accelerating rate compared to prior years, with storage-only niche players coming under increasing strain. The most successful players offer either a truly differentiated product or the “full stack” (compute and storage).
The hyperscalers have been relatively slow to deploy edge solutions in 2022. Start-ups have been able to gain momentum because of this.
MSPs received large injections of PE money, but the effect hasn’t yet been felt in the marketplace. Key question: what is the right software stack for enabling MSP growth?