
Vendors' Predictions for 2021

64 companies share their views and opinions for the year ahead.

Just a few days after the publication of a vendor compilation about the last storage decade, here are the opinions of 64 vendors for the coming year. They are particularly interesting as we continue to live through a stressful period under the Covid-19 threat.

Before you read the following players' inputs in detail, here is a summary of the key patterns we detected across all this information:

  • global data management, especially for unstructured data, with tiering, DR and smart policies,
  • no surprise with the edge, hybrid and multi-cloud globally,
  • container-based storage with Kubernetes, fueled by DevOps needs,
  • security aspects with ransomware protection, encryption, air gap and tape,
  • cloud storage and services,
  • object storage, S3 and storage APIs,
  • SaaS backup and archive, and
  • of course AI/ML, GPUs and analytics.

The 64 companies are by alphabetical order: Acronis, Arcserve, Atempo, Backblaze, Bamboo, Catalogic Software, Cloudian, Clumio, Cohesity, Ctera Networks, Data Dynamics, DataCore, Datadobi, Diamanti, ExaGrid, Excelero, FalconStor, Fujifilm, Fungible, Hammerspace, HYCU, IBM, Infinidat, Infrascale, Iron Mountain, iXsystems, Kaseya, Komprise, Lightbits Labs, Micron, MinIO, Model9, Nasuni, NetApp, Object Matrix, OwnBackup, Panasas, Pavilion Data Systems, Point Software and Systems, Pure Storage, QStar Technologies, Quantum, Qumulo, Robin.io, Rubrik, Scality, Seagate, SIOS Technology, SoftIron, Spectra Logic, StorageCraft, StorCentric, StorMagic, StorONE, StorPool, StrongBox, Toshiba Electronics Europe, Vast Data, Wasabi, Western Digital, WekaIO, XenData, Zadara and Zerto.

Acronis
Jan-Jaap Jager, SVP

Cyber-protected storage will get much more attention and become more important in 2021. With all the news around data storage hacks, ransomware attacks, immutable storage being erased, etc., storage solutions alone (software/hardware) are not enough anymore, especially for customers that are planning to move to new storage solutions (on-premises or in the cloud). Protection against cyber threats, integrated with the storage solution, will be an important decision criterion moving forward. Data processing units (DPUs) like BlueField-2 from Nvidia will play an increasingly important role in cloud storage architectures and will bring significant performance improvements to large-scale storage deployments.

SMBs and enterprises will look for local cloud storage providers in 2021. Global players like AWS, Google, and Microsoft have far greater region coverage and will surely grow massively on the general trend of companies moving to the cloud. However, we have seen a new trend emerging: partners would like to store data more locally in the cloud. And as most global cloud players don't have a data center in every country, there is a great opportunity for players that offer such a service.

More MSPs are going to offer simpler and cheaper storage alternatives to compete with the main cloud players (AWS, Google, Microsoft, etc.). Players like Backblaze and Wasabi that bring cheaper storage offerings to the market, but also Acronis Cyber Infrastructure storage-as-a-service offerings, already gained great traction in 2020 and will continue to grab market share in this space. Acronis continues to be at the forefront of storage technology, ensured by continuing investment in original research and development for cutting-edge new hardware.

Arcserve
Sam Roguine, director solutions marketing and enablement

Cyber criminals targeting critical infrastructure will become a major threat to business continuity.
Attacks on critical infrastructure and industrial control systems will continue to increase and become one of the biggest threats to society in 2021 as criminals intensify their attacks and make them more damaging. Because they have the potential to disrupt critical operations like oil and gas, these attacks will be more devastating and dangerous than attacks that solely compromise company data. In 2021, these companies will have to expand their data protection and security protocols to account for changes to their risk landscape.

Container data will need better protection
With the ever-wider adoption of containers, the role of data protection companies will also evolve. They will have to show companies how to manage containerized data and how to understand where it lives. In 2021, we will see more partnerships between data protection companies and those using containers to guide implementation and help develop a solid data backup and protection plan for container data.

Rethink what data needs to be protected and how
In 2020, Covid-19 forced many organizations to take a hard look at their data protection protocols. This created a greater need to protect workspaces that may have previously been deemed “not worth it,” including remote employee devices. Remote work isn’t going away anytime soon. Organizations will need to re-tier their data, systems and applications in line with new vulnerabilities. In 2021, more data will be stored in the cloud and companies will need to develop robust plans that aggregate and protect newly distributed data.

At least one Covid-19 vaccine distribution effort will be disrupted by a cyberattack
Over the past year, healthcare providers and hospitals have been prime targets for rampant cyberattacks. With several Covid-19 vaccines currently going through development, approval and distribution, the healthcare industry will become even more of a high-value target for ransomware. This is primarily due to the large sums that have been invested in developing and manufacturing these vaccines. The US federal government, which is just one of the parties involved in vaccine development, has already invested more than $9 billion, while the European Commission has unlocked €2.7 billion to help finance the efforts of major European laboratories. These are attractive figures for cybercriminals, who can feel confident in asking for hefty ransoms, as governments and other parties will prefer to pay rather than risk delays or disruptions. Many organizations will also be involved in the distribution phase, meaning there are greater security risks and an increased likelihood of attempted cyberattacks. It is therefore very likely that we will see at least one Covid-19 vaccine distribution effort disrupted by a ransomware attack, so the businesses involved must be prepared to protect against these threats.

Atempo
Ferhat Kaddour, VP sales and alliances

Increased data management complexity and threats will call for best practices assessments
2021 will see accelerated data management trends including the increasing use of active archiving in hybrid scenarios, a mix of on-premise and cloud infrastructure accelerated by the impacts of the Covid crisis. Many organizations are struggling to cope with new usages, trying to leverage their current archiving systems to embrace new workflows such as remote collaboration or content distribution.

In 2021, not only will the file based unstructured data trend continue to fuel usages such as IoT, genomics, simulation and so on, it will become even more predominant. This will require active archiving and data management solutions to not only prove they can scale in volume but be flexible enough to embrace multiple complex scenarios in heterogeneous environments across multiple sites.

However, too many organizations, when faced with rapid change, use rigid solutions or make choices based essentially on cost. Because of this cost-consciousness, security becomes a secondary matter.

The number of cyber-attacks targeting backup and archive data will only increase in 2021. Respecting best practices, including keeping multiple copies on different air-gapped technologies and destinations, calls for proper assessments of an organization's vulnerabilities. Increasing cyber threats have a critical impact on business continuity and long-term asset preservation, and this impact extends across an organization's workflows, from preparation and planning to the execution of archive protection tasks.

Industry experts are convinced this will be proven again in 2021.

Backblaze
Jeremy Milk, head of product marketing

Ransomware readiness will become the new normal. IT leaders will universally advocate for 3-2-1 backup not as a "best practice" but as a basic requirement for operations. Object Lock-enabled immutability will be expected of every backup solution. Workplace ransomware drills – featuring data restore tests and disaster recovery time trials – will become as common as fire drills. And CEOs will pay closer attention as they see ransomware readiness as part of their overall business health, not just a matter of interest to their CIO and/or technical leaders.
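To make the immutability expectation concrete, here is a minimal sketch of an Object Lock-protected backup using boto3; the bucket name, object key, and 30-day retention window are illustrative assumptions, and the same calls apply to any S3-compatible store that implements Object Lock.

```python
# Minimal sketch: an immutable backup copy via S3 Object Lock.
# Bucket/key names and the 30-day retention period are assumptions.
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

# Object Lock must be enabled when the bucket is created.
s3.create_bucket(Bucket="backups-immutable", ObjectLockEnabledForBucket=True)

# Upload a backup and lock it in COMPLIANCE mode: the object version
# cannot be deleted or overwritten until the retention date passes.
with open("db-2021-01-04.full.bak", "rb") as backup:
    s3.put_object(
        Bucket="backups-immutable",
        Key="db/2021-01-04/full.bak",
        Body=backup,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
    )
```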

The exodus from on-premises-only storage will accelerate. After the pandemic, businesses will not race back to on-premises-only infrastructure. Effective collaboration regardless of individuals' locations, budgeting for storage as a predictable operating expense instead of a wasteful and cash-intensive capital expense, extending the useful life of on-premises storage by incorporating it into a hybrid-cloud approach – businesses have received a crash course in the benefits that cloud solutions offer, one that also underlined why some of the classic arguments against cloud storage are far from accurate. We're not saying tape is dead in 2021, but those who've relied on it for reasons other than strict use-case necessity will have far less ground to stand on in the coming years.

Expect the rest of the workflow to go cloud, too. Beyond storage, software solutions that absolutely, positively had to sit in the office will move into the cloud, or enterprising customers will take them there via VMs. Hardware like workstations will increasingly transition toward "panes of glass" with views onto the cloud, as their connected apps live there instead of on local machines, further disrupting or enabling new flexibility in sectors like post-production that have historically been entirely centralized.

Developers will demand freedom of movement for data and services. Legacy cloud providers who punish developers for working with other cloud service businesses by charging punitive egress fees will see their market share in that segment decrease. Providers will be forced to abandon fees as developers demand the ability to work with best of breed services, rather than operating on monolithic platforms.

“Easy” will become table stakes for cloud storage offerings. As massive conglomerates that provide cloud services increasingly necessitate consulting agencies just to understand pricing, services, and tiers, the majority of IT decision makers increasingly want the opposite in cloud solutions: They seek something that’s easy to test, onboard, and administer, without the fear that a few months down the road they’ll be surprised by unanticipated complexity in functionality or huge surprises in their invoicing. The importance of data in everyday business operations is becoming broadly recognized from the Fortune 500 down to Main Street businesses. But the majority of potential future customers will need solutions they can onboard, use, and trust without the need for significant technical support.

Bamboo
Tony Craythorne, CEO

2020 was a year of forced change due to the global pandemic. As we begin 2021 with a new understanding of “anything is possible”, we see that the old ways are making way for the new at an accelerated pace. When we consider the infrastructure options of the data center, we see 2021 as the year that enterprises will truly begin to shift away from Intel and turn toward a new emphasis on the adoption of Arm technology.

There’s evidence from 2020 that this will be the case. Nvidia’s acquisition of Arm Holdings, and announcements from Apple, Microsoft, and AWS point to the momentum heading in the direction of Arm. Modern software design and data centers need high throughput, low power consumption, and high density computing platforms. Arm servers deliver this and are able to run on the same open source software that x86 servers utilize.

The time for Arm servers is here. And we predict they will gain in market share because of the obvious business benefits they offer compared to what dominates the market now. Arm’s power, sustainability and compute density at a low cost, will pave the way for a data center infrastructure revolution that begins in 2021.

Catalogic Software
Ken Barth, CEO

In 2020, Kubernetes emerged as the de facto standard for container orchestration and in 2021 we expect this trend to see tremendous growth and ubiquitous enterprise adoption.

Containers are a natural evolution from virtual machines to a more granular and portable application environment, designed to solve problems with rapidly developing and deploying cloud-native applications.

That’s just the first benefit as there are many advantages that container adoption can bring to a business. A Kubernetes-based container management platform provides a unified approach to achieve the benefits of development speed and deployment scale, while also addressing cloud portability and compliance.

In 2021, we predict that one of the big internal enterprise questions will be deciding which vendor's Kubernetes management platform is the go-to. In a world of newly developed serverless cloud applications, business and IT leaders need to ensure that their organization chooses the right one to meet corporate best practices for security, governance, monitoring, and business continuity. As with any new technology platform, IT leaders will need additional data management tools, such as a Kubernetes backup service, to provide data protection and disaster recovery for these new databases. We predict accelerated adoption of these data management tools in 2021 as well.
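As a hedged illustration of the kind of primitive such a Kubernetes backup service typically builds on, the sketch below creates a CSI VolumeSnapshot with the official Python client; the namespace, PVC, and snapshot-class names are placeholders, not a specific vendor's product.

```python
# Illustrative sketch: the CSI VolumeSnapshot primitive that Kubernetes
# backup services commonly build on. Namespace, PVC, and class names
# are placeholders.
from kubernetes import client, config

config.load_kube_config()
api = client.CustomObjectsApi()

snapshot = {
    "apiVersion": "snapshot.storage.k8s.io/v1",
    "kind": "VolumeSnapshot",
    "metadata": {"name": "orders-db-snap-20210104"},
    "spec": {
        "volumeSnapshotClassName": "csi-snapclass",           # assumed name
        "source": {"persistentVolumeClaimName": "orders-db-pvc"},
    },
}

# VolumeSnapshot is a custom resource, so it goes through the CRD API.
api.create_namespaced_custom_object(
    group="snapshot.storage.k8s.io",
    version="v1",
    namespace="production",
    plural="volumesnapshots",
    body=snapshot,
)
```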

Cloudian
Jon Toor, CMO

Public cloud and on-prem storage will merge

  • All public cloud providers now offer on-prem solutions, which positions public cloud and on-prem as environments that should work in combination, rather than being viewed as an either/or decision.
  • In addition, enterprise storage providers have upped their cloud game, building new solutions that work with the public cloud rather than competing with it.
  • As both sides move towards the center, the inevitable result is that organizations will come to view public cloud and on-prem as two sides of the enterprise storage coin.

Self-managing storage in data centers will become mainstream.

  • Automation will expand as a critical component of storage systems to replicate data for disaster recovery, manage immutable copies of data, monitor hardware for potential failures and proactively initiate replacement tickets.
  • Enterprises will increasingly rely on automation to reduce outages and disruptions with predictive maintenance – ultimately saving costs, enhancing security and adapting to evolving workload needs.

Object storage shatters the myth that it’s only used for archive.

  • Although object storage is best known as a backup and archive storage solution, three trends will expand that perception in 2021.
  • First, flash-based object storage will gain favor in data analytics workloads that also have high capacity requirements.
  • Second, S3-compatible storage will simplify Kubernetes deployments, making it a logical choice for modern applications.
  • Third, cloud-native applications will increasingly be deployed on-prem, driving the need for on-prem S3-compatible storage to enhance application portability. As a result, more organizations will use object storage to support compute-heavy use cases, such as AI, ML and data analytics, shattering the "cheap and deep" myth once and for all.
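A small sketch of why S3 compatibility aids portability: the same client code can target the public cloud or an on-prem object store, with only the endpoint changing. The endpoint URL and bucket/key names below are placeholders.

```python
# Illustrative sketch: identical S3 client code against public cloud or an
# on-prem S3-compatible store; only the endpoint differs. The endpoint URL
# and bucket/key names are placeholders.
import boto3

def make_client(endpoint_url=None):
    # endpoint_url=None -> AWS S3; otherwise an S3-compatible target.
    return boto3.client("s3", endpoint_url=endpoint_url)

cloud = make_client()                                     # public cloud
onprem = make_client("https://s3.dc1.example.internal")   # on-prem store

for s3 in (cloud, onprem):
    s3.put_object(Bucket="analytics", Key="features/part-0001.parquet",
                  Body=b"...")   # same application code, both locations
```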

Clumio
Poojan Kumar, CEO

AWS has been the big leader in the cloud realm – and 2021 will see Azure and GCP play “catch up” with additional partnerships. In fact, both will make partnering core to their competitive strategy. We saw Google announce a contract with Box last summer – and we’ll see more deals like this next year as partnerships become an integral part of their cloud innovation strategy. When Azure and GCP become more partner-friendly, this will force AWS to seek out more partnerships to remain competitive in 2021. As VCs put more of their focus on cloud-only companies, we’ll see all cloud vendors work to strike a balance between building and partnering – specifically when it comes to their API and platform strategy.

AWS has been the default standard when it comes to cloud infrastructure – and there are still a lot of companies built on AWS. Historically, companies have used AWS as a cloud starting point before expanding to cover other cloud environments. But now, with Azure and GCP focusing more on platform and API plays, we will see more startups launch from a non-AWS launching pad. The first batch of companies will launch from Azure – with GCP to follow. This first batch will experience growing pains, but we’ll see the process become easier for the second batch with consumable APIs; think of it like the “flywheel effect”.

Cohesity
Mohit Aron, CEO

Storage management is the past, the future is data management
The focus in 2021 will no longer be about managing storage; it is going to be about managing data. In 2021, businesses will focus more on how to easily access data and generate value from it. This includes data that is stored in the cloud. While hybrid cloud models will continue to grow in popularity and adoption in 2021, we will see more and more organizations embracing cloud-based services that enable companies to derive much more value from data, whether that's applying analytics and machine learning capabilities to the data, improving security or elevating compliance.

Vineet Abraham, chief product and development officer

With attacks on the rise, increased security and rapid recovery will be requirements
While ransomware attacks became much more prevalent in 2020 as the Covid-19 pandemic gripped nations around the world, in 2021 we will continue to see attacks grow at unprecedented levels – not only in terms of volume but in terms of sophistication. Verticals particularly at risk will continue to be those with large personal data sets, such as healthcare, financial services, federal, and retail. State-sponsored attacks will continue to disrupt transportation and supply chains. Organizations that can recover in minutes instead of days will be much more immune to ransomware extortion and have a tremendous advantage in their respective sectors. As a result, expect data management vendors to continue to focus heavily on security offerings that help organizations improve their security postures, including threat response and threat identification.

CTERA Networks
Liran Eshel, CEO

The rise of the edge – IT tends to swing like a pendulum between centralized and distributed topologies.
As we emerge from a decade focused on mega-clouds and centralization, 2021 will officially mark the rise of the edge and distributed clouds. Recent releases of "tethered clouds" such as AWS Outposts pave the way for cloud providers' infiltration into core datacenters. With Gartner predicting a staggering 30x increase in edge workloads in the next five years, we should see strong demand for edge-to-cloud solutions and for hyperconverged file services with edge data processing.

Global File Systems – For years the concept of one file system that can reach all outskirts of the enterprise was considered an ideal IT vision but impractical to achieve. Storage architects defaulted to using a mix of systems, ranging from legacy NAS at the core, servers for branches, and cloud file sharing for the desktops. Following major technology advances such as Ctera’s 7.0 release in late 2020, global file systems are now evolving into full-scale enterprise solutions with distinct advantages over mainstream NAS suppliers, including fast and secure access to corporate data from any location over a single namespace with consistent access control.

Home and Office Blur Together – The 2020 pandemic blurred the lines between corporate IT and the home network. With Zoom becoming a household name and virtual conferences becoming the new norm, the separation between home and office environments became practically irrelevant. This posed a whole set of new IT challenges around extending secure corporate data access without exposing the organization to data leakage and ransomware. In 2021, enterprises will shift weight from HQ campuses to smaller branches and home offices, and will seek to balance user demand for fast access to file assets without compromising security and performance.

Data Dynamics
Piyush Mehta, CEO

Data creators will participate in the commercial value of their data, as individuals will be paid by corporations for access to and use of their data. With data sovereignty laws garnering greater momentum globally, the "opt in" of sharing data will require corporations to share the commercial value gained by using an individual's data in order to get users to agree to share their information. This also creates a democratization of data, moving it from being simply in the hands of enterprise organizations toward a more federated model.

Intelligence-driven data mobility will be key to an effective core/edge strategy and vital to accessing data at the right time and place of choice. 5G is starting to propagate and will transform the use of and access to data as IoT creates a network of interconnected devices. This will require data to move from edge to core to edge as required, transforming it along the way and ensuring this is done with data context in mind.

Public cloud management capabilities will be provided on-prem to legacy enterprise data centers, in turn creating a single management layer across the hybrid cloud. The ease of use of public cloud interfaces for creating and accessing infrastructure at a moment's notice will be brought to legacy enterprise infrastructure management on-premises. This will create a single, simple user experience for accessing infrastructure regardless of location, in a hybrid utility model.

DataCore
Gerardo A. Dada, CMO

SDS becomes essential for IT infrastructure flexibility and cost efficiencies – critical factors in economic recovery

  • SDS will make storage smarter, more cost-effective, and easier to manage, while enabling IT to be future-ready. It has the power to manage, optimize and simplify primary, secondary and backup/archive storage – all under one unified platform that provides consistent services across different classes of storage. All storage technologies are supported consistently, regardless of vendor or architecture, and can be deployed on new or existing servers, turn-key appliances or in the cloud.
  • SDS has proven to be a powerful tool during the economic crisis by helping IT departments achieve cost savings and get maximum value from their existing investments. As IT spending begins to ramp back up in 2021, SDS will continue to function as a major cost saver for IT teams, helping them get more predictable performance as they scale. Furthermore, it can usually be acquired with term licenses, lowering the entry point for adoption and allowing IT to become adaptable to multiple situations.
  • As these benefits are increasingly realized, technologies that support flexibility and cost savings will continue to play a key role in the short and long term as the industry stabilizes.

The cloud is not a panacea; primary use cases for on-premises applications will exist for a long time

  • The cloud was originally touted as a less expensive storage option. However, that was quickly dispelled when IT started using more cloud resources and the associated costs showed the cloud was not always the most cost-effective infrastructure option.
  • Today, enterprises have moved a portion of infrastructure data back on-premises where they have more control and better economics. Yet, the cloud still offers formidable simplicity, agility, and yes, cost efficiency in many cases – one of which can be long-term secure data storage.
  • As the IT industry becomes increasingly smarter about what belongs in the cloud and what does not, modern data management tools and software-defined storage that spans multiple public clouds, private clouds, multi-cloud deployments, and on-premises infrastructure will help the industry reach a level of maturity. This will be made possible by smart software that understands the profile of data, has the ability to access data anywhere, and therefore can move it in an automated fashion based on business rules.

More mature data tiering is required for hybrid cloud

  • There are currently very few tools that make the movement to the cloud transparent and automatic, essentially creating an additional data silo in the cloud. An intelligent hybrid system should move data to or from the cloud, as needed, to optimize cost and performance based on business needs, automatically and effectively.
  • Ideally, data that has been moved to the cloud is always available, so it is not “shipped off” to a cloud but tiered intelligently while still being part of the same logical system. A software-defined system becomes the unifier across storage systems and the intelligent layer that controls data placement in an optimal way. Vendors who can span the public/private divide will have an advantage.
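As a rough sketch of what such automated, business-rule-driven tiering might look like (the paths, bucket name, and 90-day rule are assumptions, and a real system would use stubs or reparse points with recall-on-access rather than plain text files):

```python
# Illustrative sketch of a policy-driven tiering pass: files untouched for
# 90 days move to an object tier while the namespace stays intact.
# Paths, bucket name, and the 90-day rule are assumptions, not a product.
import os
import time
import boto3

COLD_AFTER_SECONDS = 90 * 24 * 3600
s3 = boto3.client("s3")

def tier_cold_files(root, bucket):
    now = time.time()
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if now - os.stat(path).st_atime < COLD_AFTER_SECONDS:
                continue  # still hot; keep on primary storage
            key = os.path.relpath(path, root)
            s3.upload_file(path, bucket, key)
            # Leave a small pointer so the file remains logically present;
            # production systems use stubs/symlinks and recall on access.
            with open(path, "w") as stub:
                stub.write(f"tiered-to: s3://{bucket}/{key}\n")

tier_cold_files("/mnt/projects", "cold-tier")
```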

Datadobi
Carl D’Halluin, CTO

Cloud consumption is going to accelerate, creating a wider need for organizations to embrace hybrid storage setups. Organizations will need software to manage their unstructured data in this decentralized storage world. Businesses in every industry are wondering about the practicality of their on-premise storage gear now that their employees, customers, and other stakeholders are all working from home. As cloud file systems continue to mature, cloud vendors will start seeing higher adoption rates as they add features and prove stability. As a result, at some point in the next year, customers will wonder if they should spend money on a fully duplicated on-premise file server infrastructure, or if they should redirect that money to the cloud to enable global data availability, data protection, and data archival.

As organizations move away from a single-vendor on-premise storage infrastructure and look to a fully comprehensive storage plan that likely includes an increase in cloud, they will begin to buy from more than one vendor for a comprehensive storage solution. In this multi-vendor world, customers will need software to manage their unstructured data – no matter the placement. In the new hybrid setups most organizations will have in 2021 and beyond, it is important that data sets are moved, copied, or archived to the optimal locations. By utilizing software created for unstructured data management, IT administrators can see an overview of their data and as a result set up policies, satisfy compliance regulation, protect against threats, and optimize costs for their storage.

Steve Leeper, office of the CTO

In 2021 visibility and management of massive datasets will be the name of the game. Enterprise IT professionals will be seeking solutions that give them valuable insights into the datasets for which they are responsible. Beyond getting insights into the data they will need innovative technology that allows them to take action and manage the data in a reliable and automated fashion.

The story does not end there, however, as these same professionals will want world-class service and support to ensure the solution meets all the needs of their organization. They will be looking to their trusted channel advisors (VARs, SIs, and distributors) for a relationship that doesn’t end at the loading dock. IT professionals will be seeking a true partner that is able to help them to navigate the entire data movement process, in a highly predictable manner – from identifying the right data to move, to helping them to move it as quickly, accurately, securely, and cost effectively as possible – all while maintaining critical capabilities, such as chain of custody tracking and integrity reporting.

Michael Jack, VP of global sales

In 2021, organizations that understand unstructured data management are going to be the winners. On a global scale, unstructured data is growing at such a rapid pace that the sheer amount of it is out of control. This makes it difficult for organizations to make informed decisions when staring into such a muddled abyss. Having a strategy to manage unstructured data is now essential and will put space between the leaders and those forced to play catch up. But, devising a strategy for such a complex environment isn’t innate to most enterprise-scale organizations – they’ll need help.

Vendors that understand management of unstructured data will be the differentiating factor that most organizations are looking for. Experienced vendors can provide knowledgeable insights to optimize unstructured storage environments. More and more, important business decisions will revolve around migrating data into the cloud and then about how to protect it, move it around and archive it. In the year ahead, businesses that have ballooning quantities of unstructured data will need guidance on how to make use of it, what insights can be gleaned from it, and how to make informed decisions based on it.

Diamanti
Gopal Sharma, CTO

The advancements in storage and networking – from 10GbE to 100GbE in network speed and to NVMe in storage – will become standard. As a result, we will see new technology choices emerge, such as the offloading of storage and networking functions and the rise of data processing units. All of these factors will lead to accelerated acceptance of networking and storage standards from all parts of the ecosystem.

Brian Waldon, VP of product

In hybrid- and multi-cloud environments, global visibility and governance of storage will be a major focus moving forward. Organizations must determine how to effectively provision and administer storage capacity in different locations, whether the location is a data center or the cloud. With those capabilities in place, companies can make smarter decisions about where they’re running applications. Storage platforms spanning multiple locations require global fault tolerance and persistence, and vendors that can enable that vision will succeed.

Boris Kurktchiev, field CTO

Organizations will start to give data stewardship and data flow in hybrid-cloud environments more consideration. Right now, there isn’t a single technology that can give companies a block-for-block copy of data between on-premises servers and hosted cloud instances. Many teams are working on this problem by way of developing a federated Kubernetes approach. Eventually, we’ll see a data plane and data guarantee for moving data between different cloud instances.

ExaGrid
Bill Andrews, CEO

The move to public clouds such as AWS and Azure will not take a big piece of the Fortune 2000 accounts. Larger organizations are running the economics and realizing that the cost of going to the public cloud is higher than running IT operations themselves. Therefore, the public cloud will be used to augment larger IT data centers for certain use cases such as archive data (data that has been unchanged for years), DevOps, and others. The core applications and storage will remain in the larger organizations' data centers.

The public clouds will continue to grow due to SMB growth, IoT, the augmentation of larger data centers and other use cases. The growth of the public cloud is separate and distinct from the growth of larger organizations' data centers. The net is that data is growing on all fronts and there is plenty to go around.

Backup and recovery become more important moving forward, not just for local file restores, longer-term legal discovery, financial audits, and disaster recovery (if a natural or man-made disaster hits the primary data center) but also as a way to avoid paying the ransom in ransomware attacks. Organizations need a second copy that is protected for recovery in the event of a ransomware attack. Ransomware is only in its infancy and will become a bigger threat as time moves on.

Excelero
Yaniv Romem, CEO

Deep learning (DL) will continue to drive innovation. DL requires the processing of massive volumes of annotated data in order to reach a level of accuracy close to 100%. The inability to move this data from the storage system into the GPUs at the speed required by DL continues to pose a challenge. We can expect to see continued innovation in this space in 2021.

AI, HPC, data analytics and database workloads will move to the cloud at an accelerated pace. Present cloud technology cannot readily deliver the storage required by AI, ML and DL. Without support for both low latency and high IO/s, applications perform less than acceptably, or costly GPUs are underutilized and infrastructure efficiency lags – just as is the case with on-premises infrastructure.

Enterprises moving such workloads to the cloud will continue to drive new hardware and service offerings. In order to avoid vendor lock-in, many will build best-of-breed solutions that combine readily available services, such as cloud-oriented elastic storage solutions.

More importance will be placed on the AI, ML and DL storage price/performance. Achieving the level of IO/s required by AI, ML and DL workloads can escalate storage costs beyond the cost of the GPU servers. From the customer’s perspective, the majority of the available budget should be spent on the compute power, i.e. the GPU servers. Expect further emphasis and market choices for AI, ML and DL storage in 2021.

FalconStor
Todd Brooks, CEO

While this may not be at the top of the industry's mind, based on what we see in our customers' environments in the work-from-home era, we believe that tape adoption will grow in the major public clouds while it continues to taper in on-premises data centers, as organizations export virtual tape to the public cloud as a means of optimizing their hybrid cloud approach to data backup and archival.

Fujifilm
Rich Gadomski, head of tape evangelism

The past decade saw the renaissance of data tape technology with dramatic improvements to capacity, reliability, performance, and TCO giving rise to new industry adoptions and functionality. This trend will only continue in 2021 as data storage and archival needs in the post-Covid digital economy demand exactly what tape has to offer. Below are 5 key contributions tape will make to the storage industry in 2021.

  • Containing the growing cost of storage
    One lingering effect of the pandemic will be the need for more cost containment in already budget-strapped IT operations. We are well into the "zettabyte age", and storing more data with tighter budgets will be more important than ever. Businesses will need to take an intelligent and data-centric approach to storage to make sure the right data is in the right place at the right time. This will mean storage optimization and tiering, where high-capacity, low-cost tape plays a critical role – especially in active archive environments.
  • Best practice in fighting ransomware
    One of many negative side effects of Covid has been the increasing activity of ransomware attacks, not only in the healthcare industry which is most vulnerable at this time, but across many industries, everywhere. Backup and DR vendors are no doubt adding sophisticated new anti-ransomware features to their software that can help mitigate the impact and expedite recovery. But as a last line of defense, removable tape media will increasingly provide air-gap protection in 2021, just in case the bad actors are one step ahead of the good guys.
  • Compatibility with object storage
    Object storage is rapidly growing thanks to its S3 compatibility, scalability, relatively low cost, and ease of search and access. But even object storage content eventually goes cold, so why keep that content on more expensive, energy-intensive HDD systems? This is where tape will play an increasing role in 2021, freeing up capacity on object storage systems by moving that content to a less expensive tape tier, all while maintaining the native object format on tape.
  • Low-energy tool for countering climate change
    Prior to Covid-19, climate change was a big issue. But like many issues, it was placed on the back burner during the pandemic. In the wake of recent natural disasters, including record storm activity and wildfires, look for climate change to regain focus in 2021. Enterprises worldwide have initiatives in place to address global warming, and IT operations are not excluded. Data centers consume a significant amount of energy and have been noted to contribute as much CO2 as the airline industry. Storage is a major consumer of energy within data center operations, and this is where tape systems have a big advantage over HDDs. Data tapes don't consume energy unless actively being used in a tape drive, unlike 24/7 spinning disks. According to a recent white paper from Brad Johns Consulting, tape consumes 87% less energy and produces 87% less CO2 than the equivalent amount of disk storage. Look for tape to take on more of the load for infrequently accessed data, with the benefit of cutting energy consumption and cost.
  • Affordable storage for video surveillance
    The video surveillance market is exploding. While prices for cameras come down and resolutions soar to 4K and 8K, content retention will become almost cost-prohibitive without a new breakthrough solution. LTO can save video surveillance operators 50% on their cost of content retention compared to expensive disk-only systems. But while LTO has become a de facto standard in the M&E industry, its ease of use was a historical hurdle for surveillance operators. Management software companies like Cozaint have come to the rescue, seamlessly integrating a tape tier behind disk to allow the luxury of longer retention periods previously not feasible due to cost constraints. Look for LTO tape to play another starring video role (beyond Hollywood) in 2021!

Fungible
Jai Menon, chief scientist

Block storage of the future will evolve to become universal storage that is scale-out and multi-tenant. It is clear that all data centers are evolving to become scale-out and cloud-like, whether on-prem or in the cloud. Similar to hyperscaler data centers, all data centers will evolve to use a single universal block storage solution for all their applications. If hyperscaler block storage, such as EBS (Elastic Block Store) from AWS, can support the tremendous diversity of workloads that run on AWS, so too will next-generation universal block storage for on-prem data centers.

Such universal storage will be scale-out, will allow access from bare-metal, virtualized and containerized apps, and will support different types of storage volumes with different performance, security, durability and cost attributes, all supported by the same underlying infrastructure. Customers will no longer need to buy vendor A's storage for one class of applications and vendor B's storage for a different class of applications.

These new universal block storage offerings:

  • will allow different volumes belonging to different customers or workloads to get guaranteed levels of performance, even when multiple workloads are running at the same time
  • will ensure a given set of volumes can only be accessed by the user or workload that created them through use of per-volume or per-tenant encryption keys, and other means
  • will allow each volume to choose its own level of durability (RAID, EC, mirroring)
  • will allow different cost objectives to be specified on a per-volume basis (spinning disk, cheap Flash, expensive Flash, persistent memory)
  • will allow access from bare-metal, virtualized, containerized and serverless applications.
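As a purely hypothetical sketch, per-volume attributes like these could be expressed as a declarative spec; every field name below is illustrative rather than any vendor's actual API.

```python
# Hypothetical sketch: per-volume service attributes a universal block
# store could expose. Field names are illustrative, not a real API.
from dataclasses import dataclass

@dataclass
class VolumeSpec:
    name: str
    size_gib: int
    tenant: str              # isolation domain with a per-tenant key
    iops_guarantee: int      # QoS floor held even under multi-tenancy
    durability: str          # "raid", "erasure-coding", or "mirror"
    media: str               # "hdd", "cheap-flash", "fast-flash", "pmem"
    encrypted: bool = True   # per-volume key; only the creator can access

volumes = [
    VolumeSpec("oltp-log", 512, "team-a", iops_guarantee=200_000,
               durability="mirror", media="fast-flash"),
    VolumeSpec("archive-01", 16_384, "team-b", iops_guarantee=1_000,
               durability="erasure-coding", media="hdd"),
]
```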

The next-gen protocol for block storage will be NVMe-oF, specifically NVMe over a transport on top of IP/Ethernet. While there will continue to be block storage products supporting older protocols such as FC or iSCSI, the future will increasingly be dominated by block storage that uses the NVMe-oF protocol. More specifically, we believe the future belongs to NVMe over a transport protocol that sits on top of IP on top of standard Ethernet, for reasons of ubiquity, low cost and industry momentum. For now, the dominant block storage protocol will be NVMe over TCP, but we expect other transport protocols (e.g. TrueFabric) carrying NVMe to emerge that will ride on standard IP and standard Ethernet. Such new transports will allow storage disaggregation at data center scale, not just at the scale of one or a few racks.

The corollary is that protocols such as NVMe over RoCE (which requires specialized lossless Ethernet and cannot scale to large numbers of racks), or NVMe over FC (which is not Ethernet-based), will be relegated to niche use cases. See [NVMe-oF] for a performance study that shows how much faster NVMe/TCP is compared to iSCSI. It is so fast that it can make remote storage appear as fast as local storage. When this level of performance is achievable using the pervasive and inexpensive TCP transport, the use of RoCE, which requires less pervasive Ethernet technologies such as DCB (data center bridging) and PFC (priority flow control), will prove unnecessary.

The need for local storage on servers will decrease over time and data centers will become hyperdisaggregated. In a hyperdisaggregated data center, local resources such as storage and GPUs will be removed from servers and made available remotely on the data center network so they may be shared by many servers. A confluence of many trends will drive hyperdisaggregation in the data center.

These trends are:

  • The high cost of expensive local resources like GPUs and SSDs will force data centers to disaggregate and pool these resources for more efficient resource utilization
  • The emergence of high-performance protocols such as TrueFabric and NVMe-oF that allow disaggregated remote resources to have the same performance as local resources, for many applications
  • The increasing use of DPUs (data-centric processing units) in data centers, in addition to CPUs and GPUs. DPUs run data-centric workloads at high speeds and will further enable expensive remote resources such as GPUs and SSDs to have the same performance as local resources

We predict the rise of hyperdisaggregated data centers using diskless and GPU-less servers. Hyperdisaggregation will enable a 2x-3x reduction in the TCO of future data centers.

Hammerspace
Douglas Fallstrom, SVP products

2021 will be the year that DevOps looks to storageless data to solve the gaps in data orchestration.
To truly become cloud-native, DevOps will need the ability to orchestrate compute directly with data, not storage. The limitations caused by data gravity are no longer tolerable but can only be overcome when we untether data from the underlying infrastructure. Much like serverless compute, storage management will be automated and defined by the ever-changing objectives of workloads, not the physical hardware.

There are three basic tenets any solution must meet to be "storageless":

  • It must be a universal data supplier. This means it will be software-defined to run anywhere, serving data everywhere while eliminating the need for specialized storage.
  • It must be a universal consumer of storage, using all available infrastructure to automate and optimize how data is managed. This spans protocols, clouds, geographies, and vendors.
  • It must use objective-based data orchestration to manage data. This is so data consumers do not need to understand the infrastructure, while IT keeps everything running smoothly. All of this must be fully automated and file-granular to be efficient and effective.
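As a purely hypothetical sketch of objective-based, file-granular orchestration, consumers might declare objectives per path while an orchestrator maps files onto whatever infrastructure currently satisfies them; every path, pattern, and field below is illustrative.

```python
# Hypothetical sketch: data consumers declare objectives per path; an
# orchestrator (not shown) continually places copies to satisfy them,
# without consumers knowing the infrastructure. Names are illustrative.
import fnmatch

OBJECTIVES = {
    "/projects/render/*":  {"placement": "nearest-compute",
                            "max_latency_ms": 2, "copies": 2},
    "/projects/archive/*": {"placement": "lowest-cost",
                            "copies": 3, "immutable_days": 365},
}

def objective_for(path):
    """File-granular lookup: which objective governs this path?"""
    for pattern, objective in OBJECTIVES.items():
        if fnmatch.fnmatch(path, pattern):
            return objective
    return {"placement": "default"}

print(objective_for("/projects/archive/2020-q4.tar"))  # lowest-cost policy
```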

HYCU
Simon Taylor, CEO

The 2021 predictions:

  • Cloud adoption will continue to move forward quickly. It’s not just on the rise, but it’s across the board, for data, applications and workloads. In 2020, especially in light of what happened with the coronavirus pandemic and move to remote working and social distancing, companies spent more time leveraging the cloud than ever before.
  • Multi-cloud is now not just a concept but is firmly entrenched and moving forward into 2021 and beyond. Both customers and partners continue to look at which cloud is best for which workload or application. Some have decided to move workloads from public cloud to on-premises. Others are moving more workloads from on-premises to public clouds.
  • There will be continued interest in and adoption of containers. Containerization is not in its infancy anymore, and while it may not be as widespread as some may say, it will be. HYCU is seeing it more with our Fortune 500 and Global 2000 customers. It's getting into much larger swaths of the data center and the data estate. More and larger applications will become mainstream on containers into 2021 and beyond.

IBM
Shawn O. Brume, global hypergrowth storage OM

Active archive growth will be an IT lever to invest in DevOps
2020 changed the focus of IT shops across the spectrum, but digital data never stopped growing. Only 33% of IT budgets shrank as a result of the 2020 challenges; most budgets aligned spending to mobile access and security enhancements. While some infrastructure changes are on hold, storage requirements continue to grow. Active archives are the most efficient method of reducing infrastructure cost without incurring penalties. The growth created in cloud storage during the pandemic will lead to increased spending on active archives by hyperscale and hyperscale-lite storage providers. At the same time, traditional data centers will continue to expand active archives as the only method to meet budgets while improving operational efficiency on production systems.

Infinidat
Moshe Yanai, chief technical evangelist

A look into the 2021 crystal ball: the future has never been more uncertain
It’s the time of year, where we all look into our proverbial crystal balls and try to predict what the next year will bring for the storage industry. As 2020 has clearly had its unique challenges, we head into 2021 with lessons learned, new insights and a better understanding of the evolution of the storage market toward a post-pandemic future. The following are three market trends we anticipate to gain momentum in 2021.

AI and deep learning misconceptions will be demystified
The term AI has been overused, abused and misused over the last few years. While some companies tout their automation capabilities as AI, at the end of the day that’s all it is – automation. What differentiates automation and AI is the intelligence behind it. True machine learning is a more valuable subset of AI, and there is an even more intelligent and sophisticated layer known as deep learning. Deep learning allows solutions to dynamically react in real-time while making proactive suggestions and decisions based on multiple sources of information – offering up a number of different options for one request, and then using the outcomes to make more informed decisions for the next iteration.

With resources cut due to the pandemic, there is an increased emphasis on efficiency and cost effectiveness. Tools that leverage deep learning will have a huge impact on the storage industry in areas like performance, latency and security – especially for next generation applications and digital transformation projects. In 2021, organizations will need to see past the myths and misconceptions around AI and look for tools rooted in deep learning to be prepared for the newer, more effective use cases of the future.

Investments in IT projects and priorities will shift to enable business continuity
To prepare for unprecedented situations and business environments in the future, organizations' priorities are shifting when it comes to making decisions about IT projects and technology investments. In 2021, there will be a focus on investing in the technologies that keep businesses running without human intervention if something were to go wrong. IT projects that do not have a tangible, short-term impact on the business will receive much more scrutiny than tactical, shorter-term investments. More strategic digital transformation projects will be delayed, especially ones that involve large-scale data analytics, and there will be an uptick in the need for preventative and remediation technologies, such as data storage backup, immutable snapshots, and/or active-active replication. Systems need to be protected from data unavailability as well as from increased cybersecurity risks. As a result, all IT infrastructure plans will need to be adjusted.

It will be more important than ever for organizations to shift to more cost-effective storage options with higher availability. Without knowing what the next disruption could be, companies will need a self-managed solution to meet growing capacity needs while budgets will continue being cut. There will also be much more scrutiny into purchasing decisions and higher pricing criteria for technology, since businesses have been economically challenged during the pandemic. Whereas vendors have historically focused on the total cost of ownership as opposed to the acquisition cost, priorities will switch moving forward, and the cost of acquisition will become a greater factor than the cost of ownership.

Need for data protection will lead to more industry consolidation
Company data storage policies will continue to evolve in order to be successful as the data privacy landscape changes and becomes more complex. As new regulations are passed or updated, businesses without plans in place to assure customer data privacy will find themselves becoming obsolete, and will look to forward-thinking businesses that already have a good data privacy policy and the infrastructure to support it in place.

As this continues, data encryption will become a key pillar for data storage services, as well as a differentiator. One trend we will continue to see in 2021 is the consolidation or partnerships between software companies and data storage vendors, as these vendors aim to add services like encryption to make their products more compelling and competitive in an increasingly security-focused business environment. Rather than build it themselves or reinvent the wheel, they’ll look to established vendor services. Expect to see more acquisitions of data and storage companies in 2021 than we saw in 2020.

The year 2020 was not what any of us expected it to be. It was also a very important cautionary tale that digital transformation journeys and new tech investments need to be well underway by now. Never has there been such a pressing need to have secure data solutions and cost-effective, flexible storage options. This year taught us that the next disruption is always just around the corner. Those that prepare now, if they haven’t already, will be ready to tackle all of 2021’s troubles – whatever they may be.

Infrascale
Russell P Reeder, CEO

According to a new Infrascale survey of more than 1,200 business executives, 53% say general data protection is currently their most-needed service – and many are looking to MSPs to help fill the gaps using their cloud services.

It’s no secret that adoption of cloud services skyrocketed in 2020 – public cloud will have a compound annual growth rate of 18.3% between 2020 and 2027, reaching $88.7 billion by 2027, according to Global Industry Analysts.

In fact, cloud backup services are so important that cloud backup is the most-needed service for 2021, with 59% of business executives citing this as their top need for the new year, according to Infrascale’s survey.

The survey also emphasizes the importance of security technology: Antivirus/malware (50%) and network security (like firewalls and VPN) (50%) are second to cloud backup as the most-needed service for 2021.

Iron Mountain
Tara Holt, senior product marketing manager

Ransomware attacks will continue to rise, increasing data security risks
According to the FBI, the severity of ransomware attacks increased 47% in 2020 and there has been a 100% spike in the number of attacks since 2019. With so many organizations adopting remote working as a response to the pandemic, cyber criminals are exploiting security vulnerabilities and this trend will likely continue throughout 2021. Organizations will benefit from taking another look at their data protection strategies to ensure all endpoint devices are secure, backed up and recoverable. Additionally, active and inactive data throughout an organization’s data ecosystem could be vulnerable to ransomware attacks, making it necessary to review how and where data is protected with an increased emphasis on storing an air-gapped, offline gold copy of data. Active archives will play an increasingly important role with the ability to easily export copies of data to a secure offsite location for safekeeping.

iXsystems
Morgan Littlewood, SVP product management and business development

The future of application storage is software-defined, open source, and hybrid cloud friendly.

  • SDS will be the basis for nearly all new storage products.
    The lines between block, file, and object storage are increasingly blurred. Users are benefiting from the agility, expandability, and cost structures of unified SDS that supports bare-metal, laptops, VMs, and Kubernetes.
  • Open Source storage will continue to slash the long-term storage costs of many organizations.
    Data is still growing, but SSD and HDD storage costs are not decreasing as quickly. Users will increase their deployments of OpenZFS, HDFS, Gluster, MinIO, Ceph, etc. on industry standard hardware without the vendor lock-in and costs associated with proprietary stacks. Enterprise-grade support of open storage software is the key enabler of this transition.
  • Hybrid Clouds will be the storage reality of every large organization.
    The performance of local storage and the long term data retention of geo-distributed clouds are necessary partners. Cloud services will replace tape as the third copy of data. Data security, lower storage costs, and migration flexibility are critical.

Kaseya
Fred Voccola, CEO

MSP sales and marketing budgets will pick up. As a result of the pandemic, competition is at an all-time high for MSPs. Sales and marketing – or in other words, an MSP’s go-to-market strategy – is more crucial to their success than ever before. Many MSPs focus on having the best technology, but it’s rarely the best tech alone that wins new business. They have to sell themselves better and help their customers stay a step ahead of ever evolving IT needs in order to thrive in this new marketplace.

Automation will be king. MSPs must prioritize automation in their service delivery framework in 2021. This will allow them to more easily scale based on their customers’ needs as the economy changes.

SMBs are investing more in technology and relying on MSPs to deliver their essential IT needs, meaning MSPs will flourish with strong 2021 growth.

Mike Puglia, chief strategy officer

One of the biggest trends that we'll see in 2021 is the increase in remote management, which also aligns with a rise in automation. With a hybrid environment of employees working from anywhere, we are seeing a rapid acceleration across the board in remote management tools to support, secure, back up and monitor employees who are out of the office. Being intermittently connected to a VPN alone does not provide the accessibility IT needs to support this "new" workforce. Recent research has also shown that around 60% of IT leaders plan to invest in IT automation in 2021. Though automation is not a new trend, it, along with remote management, will take on new urgency as businesses continue to adapt to a heavily remote work world, and IT must continue to contend with demands to do more with fewer resources.

Remote management and automation are not new to the enterprise. However, they’ve taken on heightened importance as distributed workforces continue to remain the new norm and organizations are pressed to prevent a growing number of cyberattacks. What was once easier to manage behind the corporate firewall through company issued devices is now like the lawless, wild west, with IT having to support personal device use while still maintaining security and compliance for the corporate network.

Most IT professionals saw their workloads significantly increase during the pandemic, facing new challenges posed by personal device use and enabling wide-scale access to the corporate network. By leveraging remote management and automation solutions, IT admins can proactively manage multiple facets of the IT environment including endpoint and network monitoring, backup, compliance and security. Automated workflows lessen the burden of manual oversight and empower IT to provide better service delivery to employees, who can then perform their jobs effectively. It’s a win-win down the workforce chain.

IT leaders can make the most of remote management by investing in integrated remote management solutions that allow IT professionals to manage both endpoints and networks from one console. This allows technicians to reclaim the “space between”, or the valuable time IT professionals waste moving between disparate applications and processes, and allows for a more efficient allocation of internal IT team resources. Integrated remote management solutions also allow processes to be automated more easily, which saves time and makes it easier for businesses to stay compliant and secure.

Komprise
Krishna Subramanian, COO and president

Cloud storage costs begin to overtake compute costs
As enterprises continue to shift to a multi-cloud strategy, they are moving more traditional file-based workloads to the cloud. This is creating huge volumes of enterprise data in the cloud and will cause a shift in cloud costs. Today, the majority of cloud spend is on compute, but as more data builds up in the cloud, storage costs will begin to rival compute costs in 2021, and in the longer run storage will overtake compute as the primary cloud cost.

Cloud replication replaces data center replication
In 2021, many companies will stop mirroring their data across datacenters and instead put a second copy of their data in the cloud. This cloud replication ensures that data is recoverable if a site goes down, if a company gets hit with ransomware, or if users need to spin up capacity in the cloud and want to access some of the data there. Cloud replication offers greater resiliency and less overhead than cross-site replication. It also creates an opportunity to diversify vendors, as the cloud replica can live on a different file and object store.
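To make this concrete, here is a minimal sketch, in Python with boto3, of pushing a second copy of a file to an S3-compatible bucket as a cloud replica. The bucket and paths are hypothetical, credentials are assumed to come from the environment, and a real replication pipeline would add scheduling, verification and versioning on top.

```python
# Minimal cloud-replication sketch: upload one file as an offsite second copy.
# Bucket name and paths are hypothetical.
import boto3

s3 = boto3.client("s3")  # credentials resolved from environment/instance role

def replicate_to_cloud(local_path: str, bucket: str, key: str) -> None:
    """Upload a local file to the cloud bucket; rerun to refresh the replica."""
    s3.upload_file(local_path, bucket, key)

replicate_to_cloud("/data/projects/report.dat",
                   "dr-replica-bucket", "site-a/projects/report.dat")
```

Because the replica is just an object in a bucket, it can land on a different vendor’s store than the primary, which is the diversification the paragraph describes.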

Cloud storage management transitions to enterprise IT from DevOps
Many companies currently have two different IT teams – a core enterprise IT team that deals primarily with hybrid cloud data, and a DevOps or Agile Enterprise team that deals with cloud native data. With core IT workloads going to the cloud, enterprise IT teams are increasingly managing cloud storage – this trend will accelerate in 2021, spurring demand for data management that gives a single pane of glass to manage data across on-premises and cloud storage.

Lightbits Labs
Eran Kirzner, CEO

Storage will continue to be a major factor within the next decade as consumers rapidly consume data and companies clamor for new ways to store and make sense of it.

Some upcoming 2021 trends include:

  • Rising adoption of 5G and the accompanying explosion of IoT devices will push even more data toward the farthest reaches of the networks, increasing the amount of high-performance, low-latency storage in edge clouds.
  • Nearly all major flash array vendors will add NVMe/TCP support to their product offerings early in the coming decade, paving the way for widespread adoption of NVMe/TCP and hastening the demise of FC (see the connection sketch after this list).
  • Hybrid clouds will make even more economic sense, so for enterprises and SaaS providers the pendulum will swing back from pure public cloud to a mix of public cloud and private cloud/on-premises infrastructure. On-prem will be used to reduce costs and provide better performance and security, while cloud bursting will be used to expand workloads to a public cloud on demand during spikes.
    • The ability to move data between public/private, public/public and private/private clouds will become critical both for DR/BC and for burst needs.
  • QLC SSDs will pick up momentum in the market with an anticipated cost reduction curve. The introduction of PLC (penta-level cell) NAND will push QLC prices much lower as QLC becomes the norm. And when prices get low enough, QLC will even compete with HDD from a TCO perspective.
  • The demand for low-latency storage will increase, with 500-microsecond response times becoming the new normal, much as 1 millisecond became the AFA standard for SCSI and FC. Real-time processing such as fraud detection, network intrusion detection and AI inference will drive this need for low-latency response times.
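As a feel for how lightweight NVMe/TCP can be on the host side (referenced in the NVMe/TCP bullet above), here is a sketch that drives the standard nvme-cli tool from Python. The target address and subsystem NQN are placeholders, and it assumes a Linux host with nvme-cli installed and the nvme-tcp module loaded.

```python
# Hypothetical sketch: attach an NVMe/TCP namespace by invoking nvme-cli.
# Address, port, and NQN below are placeholders, not real targets.
import subprocess

subprocess.run(
    ["nvme", "connect",
     "-t", "tcp",                                # transport: NVMe/TCP
     "-a", "192.0.2.10",                         # target IP (placeholder)
     "-s", "4420",                               # default NVMe/TCP port
     "-n", "nqn.2021-01.com.example:subsys1"],   # subsystem NQN (placeholder)
    check=True,                                  # raise if the connect fails
)
```

Because NVMe/TCP runs over ordinary Ethernet, no special HBAs or switch zoning are required, which is much of its appeal over FC.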

Micron
Raj Hazra, SVP of emerging products and corporate strategy

In 2021, the prevalence of remote work – even post-pandemic – will continue accelerating capabilities in the cloud. Companies will look to create preparedness for a new normal whether it be more IT solutions for a flexible workforce, larger data stores to fuel continued growth of online commerce, or resilient IT systems to address any future health care crises. This will drive unprecedented demand for agile IT infrastructure, multi-cloud solutions and pervasive connectivity to power edge-to-cloud use cases. While we see great opportunity for memory and storage to fuel increasingly data-centric cloud services, we will also see a rise in data center operators evaluating disaggregated, composable systems to better scale for coming enterprise demands and data growth.

Boundaries between memory and storage will blur: 2021 is going to see AI-as-a-service become mainstream, intelligence migrate to the edge, and 5G come to life. This is going to propel fundamental changes in the way server systems are architected. Memory will extend into multiple infrastructure pools – and will become a shared resource. And the lines between storage and memory will blur. You’ll no longer think “DRAM for memory and NAND for storage”. Instead, faster NAND will create the ability to use it as memory, and applications will grow in their sophistication to utilize resourcing in innovative ways. In 2021, we’ll also see enterprises seeking new kinds of solutions such as storage-class memory and memory virtualization to further unlock the value of AI and exploding volumes of data.

More pressure for an energy-efficient cloud: The move toward composable infrastructure will be critical in reducing over-provisioned resources, and thus, mitigating the rising environmental impact of IT. Information and communication technology is already predicted to use 20% of the world’s electricity by 2030. As companies look to incorporate sustainability into business strategy and reduce Opex for compute-intensive workloads such as AI and high-performance computing, we’ll see escalating demand for energy-efficient architectures, enabled by composable infrastructure.

AI will become more accurate and more ubiquitous, and we’ll start to see it filling in more gaps for simple tasks where people would traditionally say “Why would I ever use an AI algorithm for that?” For example, grocery stores might tap AI-enabled cameras that periodically check to see if a shelf is empty and, if so, alert a clerk to restock. In a post-Covid world, we’ll see more businesses adopting AI for use cases like these to create contactless experiences. We’ll also see AI moving into infrastructure such as data centers and telecom base stations as neural network algorithms become more adept at workload and system error correction and recovery.

The rise of edge data centers: Lots of startups are focused on building edge data centers – units that look like shipping containers and sit in metro areas – to bring content, like your Hulu videos, closer to the point of consumption. We’ll see the adoption of these edge data centers in the next few years, as enterprises and consumers look to tap massive amounts of data for insight and faster services closer to the source.

High-bandwidth solutions for high-compute at the edge are becoming a requirement. With fully autonomous solutions, the amount of compute performance needed for cars is reaching data center levels; in ADAS and autonomous driving, cars need hundreds of tera operations per second. These are some of the highest levels of performance in the industry today, rivaling what you will find in data centers.

Given this, in 2021, we can expect to see embedded players increasingly turning to creative options for low-power, high-bandwidth memory and storage. For instance, requirements are exceeding the capabilities of standard PC DRAM and low-power DRAM, driving the need for the capabilities of graphics memory like GDDR6 and HBM. We’ll see these increasingly adopted in cars, which need fast, high-performance memory.

In 2021, look for more usage of object stores for storing structured and unstructured data – files, blocks, objects – all in one repository. AI’s large data sets need proximity to where processing is happening. So, rather than being viewed as large cold stores, object stores are going to be able to serve AI-type workloads, which means large sets of data can be accessed with high bandwidth and low latency. As a result of the rise of data-intensive AI workloads, we’ll see the need for high-performance NVMe storage also increase, since this high-performing object store resides on flash-based storage, as opposed to traditional cold storage. This could lead to faster adoption of Gen4 NVMe SSDs to enable more powerful object stores for AI.

MinIO
AB Periasamy, CEO

The race to win the hybrid cloud becomes the race to win the cloud. Full stop. Amazon’s acknowledgement that the private cloud will remain a vital part of an enterprise’s architecture strategy is a signal to the rest of the industry that if you are not hybrid, you are not viable. Over the next year, as the race intensifies, the true leaders in the hybrid cloud will begin to emerge. They will all be Kubernetes native.

Speaking of Kubernetes, over the next twelve months it will solidify itself as the standard for infrastructure as a service. This will require storage to become containerized and to be managed by Kubernetes. The companies that are native to the cloud will make this transition seamlessly. The appliance vendors will not. It is why they don’t like talking about Kubernetes. It is their kryptonite.

Over the next year, we will also see the lines between DevOps and IT begin to blur. The primary driver is the scale at which dataops needs to operate. That scale will drive new requirements for automation and will force IT organizations to invest heavily in hiring software engineers. IT still needs to deliver against reliability, maintainability, serviceability, procurement, security and compliance – but automation becomes a key requirement to serve internal and external stakeholders.

Model9
Gil Peleg, CEO

Some mainframe storage observations heading into 2021

Mainframe cloud data storage will take off. They said what? Yep, Gartner is predicting that around 1/3 of mainframe backup and archival storage will move to the cloud by 2025, according to a report by analyst Jeff Vogel. This is especially true as orgs seek better access to historical mainframe data to leverage with data management tools like Snowflake, Cloudera and BigQuery for maximum insights.

There will be more innovation around mainframe from cloud storage providers. We saw some important mainframe modernization development in 2020, specifically, three big public cloud providers started mainframe modernization groups targeting mainframe clients while at the same time refining their on-premises offerings. Examples include Azure Stack and AWS Outposts. We should expect further developments along these lines in 2021.

Mainframe orgs will choose a hybrid storage model. Mainframe organizations have the kind of intense needs that a mainframe can best meet – but there are many functions, such as archival storage and analytics, that can be better handled in the cloud.

Costs and strategic advantage (better analytics) support this view. For the foreseeable future, most organizations will need both.

IBM will stay the same. Nothing’s going to change with IBM in 2021. Its focus still seems to be on keeping mainframe apps on the mainframe. While acquiring Red Hat can be seen as a positive for hybrid cloud, in fact, it does nothing to help mainframe functions take advantage of the cloud. In other words, IBM is firmly rooted in the on-prem world.

Nasuni
Andres Rodriguez, CTO

The IT leaders of the next decade will require a different set of skills.
In a world dominated by SaaS applications and cloud services, IT is going to change, and IT leaders are going to need different skills. IT used to be a kind of priesthood, composed of expert geeks who provided technology to their uninitiated colleagues. As an employee, you had to go through the priesthood for IT services. Now we’ve moved from highly trained people accessing rarefied technology, and everyone working through this priesthood, to a scenario in which everyone has access to tools and services – and IT has effectively moved up a level. The new IT experts are equal parts IT expertise and industry-specific knowledge of the other technologies relevant in their sector. They are a broader, more curious bunch. We’re seeing a more diverse mix in IT because the job requirements are changing. It’s a much more strategic role, one that’s less about the technological plumbing than it is about how to create new services for the business.

IT is going to be more programmatic in nature.
IT leaders will be working with APIs and writing code to negotiate the gulf between what the ecosystem of cloud services delivers and what their specific enterprise actually needs. It’s going to be a far more creative role. IT needs to be able to set up systems and get them running and keep them running, with minimal need for human intervention. The alerts and fixes have to happen through automated processes that take seconds, because IT will already be on to planning the next integrative, forward-looking project, and won’t really be in the business of fixing leaky pipes.

The nature of IT disasters is changing permanently.
This year also marked the rise of distributed disasters – a problem I don’t see going away anytime soon. A sophisticated ransomware breach can impact all your offices almost simultaneously. Relying on a traditional backup solution is no longer a sound strategy. Even if such a solution can recover a single site in a reasonable amount of time, many require sequential restores, so you’ll have to prioritize which sites to bring back up first. I met with one large multinational firm that had suffered from one of these new ransomware variants. The company was running best-in-class enterprise backup. They were following best practices. They’d educated their end-users, secured their firewalls and done everything by the book. None of this helped. When the company was hit, the malware immediately spread through its network, encrypting the file servers at hundreds of sites. Within two hours, IT had responded and shut everything down, but the damage was already done. We’re seeing these distributed disasters increase in frequency, and enterprises will need to find ways to prepare for them in the coming years, or incur devastating losses.

NetApp
Biren Fondekar, chief transformation officer

Four ways continued digital transformation will shape 2021

  • I think healthcare has endured the most rapid changes this year. The Covid shutdowns across the globe forced healthcare providers to immediately introduce telemedicine for most primary care scenarios. Even healthcare staff in roles such as in-patient pharmacists are now working remotely, with surgeons only setting foot in the hospital when it’s time to perform surgery. Anything that doesn’t require hands-on care is being done remotely. These types of changes will continue, accelerating the digitization of healthcare.
  • Industries that offer services doing things for people so they can stay home, or distance themselves from other people, will grow. Formerly called “convenience services”, offerings like Grub-Hub/Uber Eats/DoorDash have paved the way with this type of service model. This model has now been experienced, adopted and scaled by so many that we expect future business models will revolve around the idea of “what else can person 1 do for person 2 as a service so that person 2 doesn’t need to leave the comfort of their home?”
  • The acceleration of digital transformation. The realization that knowledge workers can be productive outside of a traditional office, big sales deals can be done in any industry remotely without fancy dinners and golf, and consumers can have nearly all needs met online. This greatly accelerates the need to digitize, and accelerates the need for managing more data.
  • Inter-personal and team-wide communications have improved greatly. In-person travel and meetings are not needed to market, sell, strategize, or execute successfully. This has allowed for greater diversity in our engagements and has decreased “localization bias”. It used to be that if you were local, hallway conversations were a great way to be informed, but they left too many individuals out and decreased valuable, diverse input. Now a quick chat and a Zoom room re-create a hallway-conversation atmosphere with more of the right people involved, anytime.

Object Matrix
Laura Light, marketing manager

Active archive to become unified and data-centric
When it comes to an active archive, some people talk about on-prem, some people talk about cloud or even multi-cloud, and some people talk about hybrid, but what everyone needs is a unified active archive platform. The user doesn’t care where the data is so long as it is quickly accessible when and where they want it. Unified deployment architectures are the antithesis of a piecemeal approach and will see far greater prominence once the pandemic “work from home” edict has died down and “work from anywhere (even from the office)” takes off.

Analytics are also likely to become increasingly important for managing active archives.
As it becomes increasingly difficult to manage disparate workforces and storage systems alike, managers will want to take back control in order to make informed decisions. Analytics can pull together clear overviews. In the case of an active archive: what is being stored where, and for how long? How long is data taking to transfer, and how much is that holding up production? Is data being kept on the wrong storage tier?

Analytics have never been more important than during this time and will only become more of a focus during 2021.

OwnBackup
Adrian Kunzle, head of product and strategy

2021 will be the year software-as-a-service comes of age
Last year, companies learned the hard way that disruption can happen at any time and the time for transformation was yesterday. 2020 marked an inflection point from which there is no turning back. SaaS is no longer just a way of buying business systems, it is the only way of buying business systems. Every digitally driven business will run in the cloud on names like Salesforce, Microsoft, Workday, Slack, and Zoom to guarantee their systems are readily available when and where their teams need them. However, as teams move faster and faster in the cloud, many will see a corresponding increase in faulty code, data corruption, data leakage, and data loss, and will look to better storage, backup, and recovery solutions to address those challenges.

CIOs will embrace their role in data security and adopt guardrails to ensure oversight
As SaaS continues to proliferate, concerns about the ability of cloud service providers to safeguard customer data have largely diminished, creating a false sense of security. Organizations fail to identify the greatest risk: vulnerabilities are inevitable whenever people, even those in their own trusted workforce, have access to data. For example, since most SaaS solutions are extensible through low-code, no-code, and pro-code tools, workflows and applications created by non-IT professionals greatly increase the chance of inadvertent data loss or corruption impacting mission-critical SaaS systems. To address this, CIOs and CISOs will recognize their own data governance responsibility to safeguard all company data stored in the cloud or on devices, and for SaaS account and access management.

Traditional business continuity practices will be reinvented
Most backup and recovery approaches are still stuck in the era of disaster recovery centered around infrastructure failures, making them ill-suited for the data management needs of an always-on, digital world. In 2021, companies will start to shift their focus from the “backup” portion of business continuity planning to the much more difficult and important topic of “recovery”. More modern, forward-thinking approaches to recovering from data loss and corruption will involve advances in system uptime, data monitoring, rogue change isolation, and valid data identification. In addition, organizations will adopt more efficient strategies for data restoration, such as frequent incremental backups, as well as more configurable retention policies that automate compliance. Finally, next-generation data storage techniques will include sophisticated data replication and backup validation.
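As a small illustration of the “frequent incremental backups” idea, the sketch below copies only files modified since the previous run. The source, destination and state file are hypothetical; real products layer cataloging, retention policies and backup validation on top of this basic pattern.

```python
# Minimal incremental-backup sketch: copy only files changed since last run.
# All paths are hypothetical placeholders.
import os, shutil, time

STATE_FILE = "/var/backup/last_run"      # holds the previous run's timestamp
SRC, DST = "/data", "/backup/incremental"

def incremental_backup() -> None:
    last_run = 0.0
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            last_run = float(f.read())
    started = time.time()
    for root, _dirs, files in os.walk(SRC):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getmtime(src) > last_run:        # changed since last run
                dst = os.path.join(DST, os.path.relpath(src, SRC))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)                  # copy with metadata
    with open(STATE_FILE, "w") as f:
        f.write(str(started))                           # record this run

incremental_backup()
```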

Panasas
Curtis Anderson, software architect

Enterprises will accelerate the move to HPC
Forward-looking enterprises are moving to adopt HPC compute and storage solutions because their legacy storage architectures cannot keep up with the extreme storage performance needed to manage the growing deluge of highly valuable data they’re accumulating, mostly driven by AI and ML applications.

New enterprise requirements call for a new generation of HPC storage
As enterprises evaluate their options for dealing with the large data volumes driven by AI and ML, the historic stability and cost issues of legacy HPC storage are being exposed. Conventional wisdom says that to get high-performance storage without paying high prices you should use “tiering”, which marries a small high-performance storage subsystem with a more cost-effective layer of storage where all the “cold” data lives. That’s where conventional wisdom is wrong. With AI and ML creating data that’s always “hot”, tiering becomes a performance-killing bottleneck. What’s needed is an infrastructure where all the storage hardware you buy contributes to the performance you need. A single-tier HPC-class storage solution using commodity platforms to manage data that is “all hot, all the time” meets that goal.

In a world driven by AI and ML all data should be mined for value
AI and ML are just the start of a trend in which all the data an enterprise has accumulated can be mined for value. That dispels the notion of “cold” vs. “hot” data and instead makes the case that all data is potentially valuable and therefore “hot”. A single-tier HPC-class storage solution using commodity platforms to manage data that is “all hot, all the time” creates an environment where enterprises have instant access to primary data and are able to explore the potentially high value of data that was previously labeled “cold” and relegated to slow, difficult-to-access archive storage tiers.

Pavilion Data Systems
Amy D. Love, CMO

Business insights – previously a combination of experience, select data, and educated guesses – will shift to a data-centric world as their source of clarity, and organizations will need to ingest and analyze huge volumes of data to derive actionable insights beyond what legacy systems can handle. Those that make this shift will become the leaders of tomorrow, while those that do not will be at a significant competitive disadvantage.

Organizations will rapidly adopt modern architectures, such as those that can accelerate GPU and DPU performance and that can tier to multiple storage and media types. Organizations will demand choice and control, and insist that these modern architectures be capable of supporting block, file and/or object to provide ultimate flexibility in deployment options.

Demands on IT to support remote workers will remain. As IT leaders support growing volumes of data, along with an increasingly distributed workforce, they will spend more time considering new technologies as a business imperative rather than proceeding with what has been.

Point Software and Systems
Thomas Thalmann, CEO

Data management and object storage software increasingly important for active archives

The continued growth of data volumes, especially of unstructured data, will remain the dominant challenge for data storage. Intelligent data and storage management software will meet this challenge by integrating different storage technologies (flash, disk, tape) and architectures (file systems, object storage) so that data is stored in the most appropriate storage class according to its use and purpose. As data becomes inactive at an ever-increasing rate, an economical, technology-independent active archive tier will be of central importance. In this context, software-based object storage, which supports HSM/ILM capabilities and mass storage technologies such as tape, will become increasingly important.

Pure Storage
Shawn Rosemarin, VP of global systems engineering

Customer experience becomes the differentiator in storage as-a-service market
As-a-service consumption is now table stakes – and enterprise customers are demanding more. In 2021, it will become clear that as-a-service models have to re-justify their value every single day, because it is so easy to sign up for a service and then discontinue it in favor of a different one if it doesn’t meet your needs. This means that customer service needs to formally extend beyond the point of purchase and become a more holistic customer experience. In the hierarchy of awesome customer experience: it’s good to be responsive, it’s better to be proactive and let a customer know that there’s a problem, and it’s even better to let a customer know there was a problem and you have already fixed it. The next year will bring greater clarity around the differences between “products on subscription” (i.e. lease) offerings and true “as-a-Service” solutions, which are about buying an outcome (i.e. SLAs) and having a third-party deliver it. With as-a-service, you should be able to: start small, grow over time, and have complete transparency over pricing and related KPIs. The customer should never feel like they’ve bought something and now they are on their own – or locked into a service that offers little benefit compared to a traditional capital purchase.

Containers and Kubernetes – the one-two punch of enterprise efficiency – are so mainstream we’ll stop talking about them in 2021
Containers and Kubernetes are the one-two punch of enterprise efficiency – reinventing how we build and run applications. Gartner projects that by 2025, 85% of global businesses will be running containers in production, up from 35% in 2019. But for digital leaders in the enterprise, these essential building blocks of microservices – which enable organizations to be more agile while still building highly reliable applications – are already mainstream. Agility and resilience are key benefits of microservice architectures, and digital-native powerhouses like Netflix understood the competitive advantages of microservices early on. In 2021, look for containers and Kubernetes to remain central to enterprises launching and expanding their long-planned digital transformation projects. In fact, containers will be so mainstream that it will no longer be the technology that is interesting – but instead the new applications and digital touchpoints that CIOs will be talking about. They’ll understand that their teams have a toolkit of solutions that will allow them to do things at speed and velocity that they could have only dreamed of five or 10 years ago – like leveraging streaming data to deliver real-time personalization to 10 million customers worldwide.

Object gets smarter: Despite the object renaissance, it’s not a panacea without file
Twenty years ago, object was treated as a dumb but extraordinarily scalable repository for storing data while all the intelligence, the metadata, and the annotations were held separately in some kind of database. This structure worked well, up to a point, but with the exponential growth of data volumes – with billions of blobs just sitting in an object store – the framework is no longer viable. Increasingly, organizations want to interrogate and analyze their data without the headache of having to keep these two systems aligned. As a result, there is an increasing demand for embedding the crucial metadata into the data objects so that it’s not just about performance, but about being intelligent.

Object storage is well suited to the growth of cloud platforms and the big data environment of the modern world. Ultimately, customers want a scalable, agile system that can handle the challenges of modern, unstructured data. While object storage might be having a renaissance, it is not on its own a panacea. It might be highly scalable, but it cannot mutate individual pieces, i.e., open up an object and overwrite a few bytes within it. To be successful, a full application workflow needs more than just an object store. While fast file storage is not a new concept, putting file and object together in the same platform is a creative way to avoid building two different silos and adding more complexity. Unified file and object is the future, and 2021 will see this category go mainstream.

QStar Technologies
Dave Thomson, SVP of sales and marketing

Active archive requirements to scale from terabytes to petabytes

Enterprises continue to create data at increasingly rapid rates due, in part, to the use of higher-resolution video and graphics, along with significant growth in applications using IoT and AI.

Much of this data is used extensively for relatively short periods of time – for example, in repeat runs as new methods become available, requiring data analysis to be updated. These sets of data can each comprise multiple petabytes of content. Scale-up, scale-out primary storage is now the norm for many, and QStar sees the same requirements in active archive – where massive single namespaces are available through SMB, NFS and/or S3 interfaces – leveraging libraries containing tens or hundreds of tape or optical drives, with optional replication to on-prem or remote cloud infrastructures.

Node-based architecture will allow performance and capacity to be scaled independently and, at the same time, increase disaster recovery options. QStar predicts a significant change in active archive requirements, surging from hundreds of terabytes on average just 5 years ago to tens or even hundreds of petabytes in the very near future.

Quantum
Noemi Greyzdorf, director of product marketing

Understanding data and improving storage management
Organizations will seek to better understand their data by adopting data management. There will be a big push to make data insights actionable and to leverage these insights to improve storage management, whether on premises or in the cloud.

Bridging compute and storage for analytics
Enterprises will seek new ways to bridge compute and storage where analytics run over the network or at the edge. Data will need to become mobile without affecting timeliness.

Easy-to-use interfaces
Ease of use will continue to be a priority, with improved interfaces on devices.

Enterprises will need to be able to manage their systems remotely, without complex professional services engagements.

Qumulo
Bill Richter, president and CEO

It will be mission-critical to make data more accessible. In 2020 it became even more obvious that access to data is vital for teams working on data-heavy projects like film production and Covid-19 research, and modern enterprises can’t afford to slow down their operations. IT teams will be challenged in 2021 to keep up with important workflows by making data easier to access.

Radical simplicity will be the only option for unstructured data. In 2020, unstructured data was a pivotal concern for IT leadership. Heading into next year, forward-thinking IT leaders will be drastically simplifying the way they manage file data. File systems that have intuitive interfaces, built-in data analytics, and a single solution to address both on-prem and cloud requirements will become the gold standard for unstructured data.

Every company needs to be an app company – or they risk falling behind. In 2020, companies like Peloton and Panera leveraged their apps as they pivoted to pressing consumer needs: in-home fitness and online ordering. In 2021, we’ll see more companies embracing app development to stay top-of-mind and relevant.

Robin.io
Narasimha Subban, senior director of software development

Kubernetes takes center stage in year ahead for storage
On the cusp of this new year, we’re reminded that forecasts can be folly. However, given what we know about the maturity and adoption of Kubernetes, hybrid cloud, and network transformation, we have a solid roadmap for what 2021 looks like for the storage space.

Kubernetes will drive the storage needs of emerging hybrid cloud strategies. Because Kubernetes is still the common platform across all the clouds, our prediction is that Kubernetes will become the centerpiece of this newfound hybrid cloud standard. Looking at recent trends, Amazon EKS has already come up with an on-prem Kubernetes distribution (EKS-D), and Google is pushing hard for Anthos and Anthos for bare metal. Seeing this, it is pretty clear to us that Kubernetes is going to be the starting point for hybrid cloud storage in 2021.

There will be a massive jump in stateful applications on Kubernetes. It’s reasonable to expect the number of organizations running stateful applications on Kubernetes to jump to more than 80% in 2021, simply due to the maturity of Kubernetes deployments. At its current stage, the capabilities of storage and data management solutions on Kubernetes make running stateful applications in production not only possible, but optimal.

Kubernetes will drive the network transformation for 5G. As 5G continues to strengthen and improve, the requirements for latency, uptime, and number of connected devices have similarly intensified. As a result, it is becoming very clear to network operators and telco service providers that they will have to make the shift from virtual machines to containers. When that happens, Kubernetes will become the standard for running network applications and supporting their storage requirements.

The industry – us included – is making headway on the important task of simplifying hybrid and multi-cloud portability for complex stateful applications on Kubernetes, and as mentioned in the first prediction, hybrid cloud is going to be the most popular choice. Consequently, there will be growth in cases where users must move stateful applications from one Kubernetes cluster to another. In 2020, that was a cumbersome process. In 2021, it’s going to become simpler out of necessity. Automating most of the operations that a storage admin would do on Kubernetes will be critical. This will free developers and DevOps engineers to take care of their stateful applications independently of each other.

It’s clear to us that Kubernetes is more than ready to take on the storage demands of hybrid cloud, stateful applications, and 5G network transformation. We can’t wait to watch it all unfold.

Rubrik
Nicolas Groh, field CTO EMEA

Reorganization of the cloud
After having massively adopted the public cloud, at times in a haphazard manner, this year enterprises and architects will be tackling the reorganization of their cloud infrastructures. This will be imperative, both to guarantee greater data security and to maximize their investments in this area.

Increased automation
The increased automation of the cloud will enable companies to achieve a higher level of efficiency while avoiding human error. Cloud automation will help companies remain profitable in the face of market uncertainty. We can anticipate that cloud automation will make the daily work of even the least technical employees easier.

Ransomware: companies will no longer pay
Throughout the pandemic, there has been a resurgence of cyber attacks, affecting not only businesses but also public services such as hospitals. This year will see more and more companies under siege simply refuse to pay. They will rely on more efficient tools to prevent and remediate ransomware attacks. We shall see if governments will also fight back.

Scality
Paul Speciale, CPO

In 2021, three major shifts in IT will impact data management and storage solutions:

  • Enterprise IT apps and infrastructure moving to the new cloud-native model
  • Data being generated everywhere, globally
  • Increasingly rapid evolution in platforms, APIs, and cloud services

Data storage has morphed significantly in the past decade to address the well-known avalanche of data that enterprises and consumers create and consume. The scale of data storage growth is truly astonishing. In the 1990s, 1GB of data seemed massive and a terabyte of data was a distant promise on the horizon. Today, we see companies of all sizes managing petabytes of data and beyond, even in industries where digital transformation arrived late, such as healthcare.

This explosion of data pressed the data storage industry to rethink the scale of systems and enabled solutions that we refer to as “scale out”, meaning the ability to grow across multiple physical resources. New ways of protecting data have also developed because traditional mechanisms (such as RAID) were simply outgrown and would have resulted in significant data loss if depended upon for petabyte-scale data.

Twenty years ago, only the wisest prognosticators could have foreseen video streaming services to personal mobile devices or the presence of ubiquitous and low-cost, high-speed connectivity. Storage providers have been forced to think about global 24×7 access to storage for thousands, even millions, of simultaneous users.

As global access continues, we see these three IT trends having similar influences on data storage:

  • Cloud-native computing will shape data storage deployment and access
    Every 10 years or so, the world of IT faces a significant change in infrastructure that affects applications and the underlying resources they depend upon. Twenty years ago this change was driven by server virtualization, and just over 10 years ago we saw the emergence of public cloud services greatly impact how businesses leveraged IT. Both of these trends had an impact on data storage, and we see a similar trend emerging now as IT moves to a new model called cloud-native.
    Cloud-native will cause a shift in enterprise IT applications and infrastructure, which will also transform data storage. The changes we foresee are in how storage will be deployed as well as in how it will be accessed and consumed by applications. Deployment in cloud-native computing environments relies on building blocks of software services called containers, which are managed in container orchestration environments such as Kubernetes. According to IDC, by 2023 – just two years from now – over 500 million digital apps and services will be developed and deployed using cloud-native approaches. That’s the same number of apps developed in the last 40 years.
    These applications will naturally consume storage over APIs such as the popular AWS S3 API for object storage (see the sketch after this list). Moreover, new storage automation methods for container environments are emerging, including the Container Storage Interface (CSI), the COSI project from the Cloud Native Computing Foundation, and the SODA Foundation project within the Linux Foundation. Both deployment and access to storage will, therefore, be shaped significantly by cloud-native computing.
  • The “data is created everywhere” dynamic will necessitate global data management
    Another significant industry trend we see is the increasing decentralization of data creation. We have seen this before when we went through several evolutions of centralization (mainframe), local distribution (client/server), and globalization of companies (multiple data centers and remote office access).
    Now, with enterprises having global data centers, using public clouds and an emerging edge-tier, it is clear that “data is created everywhere” is a dynamic that we must confront from a data management standpoint. This will drive new data management solutions that can view data globally through unified namespaces and can perform intelligent searches on data no matter where it is located.
  • Acceleration in new technology adoption will bridge data to the future
    Finally, we see a macro trend that has been occurring for decades but is accelerating: an increasingly rapid evolution of the underlying technology that businesses depend on to store and manage their data.
    Consider that just 10 years ago most servers had 40 to 60GB HDDs, making a server with 2TB of storage capacity “cutting edge”. Today we have 16TB HDDs creating over a petabyte of storage capacity in a single server. Now emerging are flash disks that can hold tens of terabytes of data, promising fast access at lower cost.
    The access protocols for storage have also rapidly changed from block to remote file system access (NFS and SMB) and now to object APIs (AWS S3). This rapid evolution will create a need for data storage and management solutions in software that can bridge across and transcend these technology changes, while preserving existing investments and, most important, preserving critical enterprise data well into the future.
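As referenced in the first bullet above, here is a minimal sketch of an application consuming storage through the S3 API with Python’s boto3. The endpoint, bucket and keys are hypothetical stand-ins for any S3-compatible object store.

```python
# Minimal sketch: write and read one object over the S3 API.
# Endpoint, bucket, and key are hypothetical.
import boto3

s3 = boto3.client("s3", endpoint_url="https://objects.example.com")

s3.put_object(Bucket="app-data",
              Key="events/2021/01/log.json",
              Body=b'{"event": "started"}')

obj = s3.get_object(Bucket="app-data", Key="events/2021/01/log.json")
print(obj["Body"].read())  # b'{"event": "started"}'
```

The same few calls work against AWS itself or against an on-premises store that speaks the S3 API, which is much of why the API has become the de facto access method for cloud-native applications.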

Seagate
John Morris, CTO

Increasing significance of hierarchical security (for data at rest and in flight)
Continued growth in the trend of hyperscale software ecosystems is underway, allowing applications to be developed and deployed on smaller atomic units for businesses and locations that may not have the connectivity infrastructure required. More and more cloud-native applications run in points of presence or colocation facilities around the world. With this asset partnership model becoming increasingly common, it is necessary to protect data at each step of the process. In flight and at rest are critical spheres of protecting user data in a more distributed deployment model.

Data-at-rest encryption is becoming increasingly mandatory in many industries as a way to counter both external and insider threats. Although it might not be mandatory in your particular industry today, it might well become so tomorrow. Therefore, Seagate recommends moving to encrypted disks as soon as possible to ensure there are no disruptions in the future if the requirement sneaks up on you.
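Self-encrypting drives handle this transparently in hardware, but the same at-rest principle can be illustrated at the application level. The sketch below, offered purely as an illustration, uses the Python cryptography package’s Fernet (authenticated symmetric encryption); a real deployment would keep the key in a key manager, never alongside the data.

```python
# Illustrative application-level at-rest encryption, as a stand-in for
# drive-level encryption. Key handling is deliberately simplified.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice: generated and held by a key manager
f = Fernet(key)

ciphertext = f.encrypt(b"customer record")    # this is what lands on disk
assert f.decrypt(ciphertext) == b"customer record"
```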

Broader adoption of object storage by enterprises
With the explosion of useful data, object store is becoming the standard for mass capacity and offers advantages over traditional file stores, including prescriptive metadata, scalability, and no hierarchical data structure. Systems benefit from greater intelligence incorporated in data sets, and object stores provide this intelligence. Storage types include block, file, and object. Block is critical for the many mission-critical applications that are performance sensitive. File has serviced legacy applications and provided robust architecture for years. Object storage is focused on new application development, in combination with block storage, to provide scale and performance in a symbiotic fashion. Many legacy file applications are also migrating to object storage infrastructure to take advantage of the economies of scale that object storage enables.

Object storage is quickly becoming the de facto standard for capacity storage, augmenting and displacing file storage due to improved economic efficiencies and scalability. Additionally, new programmers graduating today increasingly build workflows assuming object storage interfaces. Hire those people. If you haven’t yet added object storage to your data center, now is the time to do so.

Greater adoption of composability
While the idea of separating systems into independent units that can be combined with other independent units is not new, a broader, open source-reliant adoption of composability is underway. Kubernetes – the open-source system for automating deployment, scaling, and management of containerized applications – is at the core of this trend. Open source is the future of application development because it enables a much larger community to work on the problems that challenge many industries, and it also allows for domain-specific solutions leveraged from open architectures. Composing the hardware to optimally meet software or business needs is a natural migration.

Today’s data centers are moving toward composability because it provides easier deployment and redeployment of resources without requiring a priori configurations and statically configured ratios between compute, memory, and storage. Containers and Kubernetes are the core mechanisms of composability. It behooves all data centers to start embracing these technologies if they haven’t already.

SIOS Technology
Cassius Rhue, VP customer experience

Business continuity and disaster recovery will drive adoption of hybrid cloud and multi cloud configurations
As cloud adoption takes center stage in IT infrastructure configurations, companies will begin using more hybrid and multi-cloud configurations to solve long-standing challenges to business continuity and disaster recovery. Companies will increasingly use the cloud to enable geographically separated offsite replication or failover for disaster protection. They will look to extend failover clustering not only across cloud availability zones but across different cloud vendors. The expansion of private cloud usage will be driven by the increasing availability needs of the applications required for monitoring a new, broad class of IoT devices.

Companies will look to use backup and HA data for DevOps
Companies will look to get more value from replicated data “at rest” by using it for more than simply disaster recovery. Companies will tap stored data for a variety of testing, including DevOps testing as well as availability and failover testing.

Storage agnostic HA/DR protection will be required
Companies are no longer tethered to their SAN or NFS storage. All manner of storage solutions are being used in the cloud and on premises. Protecting all of that data and enabling the flexibility to implement hybrid cloud environments will mean a greater need for high availability solutions that work equally well with all types of storage.

SoftIron
Craig Chadwell, VP product

Three trends I see continuing to impact the enterprise data center in 2021 are the ongoing growth in edge infrastructure deployments, the rise and evolution of enterprise-class open source, and the increased scrutiny of, and subsequent impact on, IT supply chains.

  • The increasing focus on edge-to-core (and back again) data orchestration will be driven in part by the increased prevalence of data-generating and data-capitalizing devices outside traditional “company walls”. Subsequently, the drive to deliver richer user experiences and higher-quality inference via these devices – coupled with the increase in bandwidth to the edge enabled by 5G infrastructure growth – will see the growth of more local/regional data points of presence. And of course, Covid has contributed to the acceleration of this trend by changing where data is created and used, further pushing it to the edge.
  • At SoftIron, we’re absolutely witnessing an increasing appetite for, and adoption of, open-source infrastructure standards. This can be attributed to the perceived security vulnerabilities in high tech infrastructure supply chains, especially as those complex infrastructure solutions become embedded into industries with high sensitivity to either personal privacy (e.g. healthcare) or to disruptions (utilities supply). We’ve seen it time and again and it’s no secret that these threats are made possible due to supply chain infiltration by bad actors. The result is an increased urgency in the security arms race between open-source operators and bad actors that want to exploit them, leading to maturing ecosystems of open-source security tools, best practices, and advisory boards.
  • Closely related to the previous point, I foresee national policy shifts toward supporting more independent IT supply chains, such as those which have already begun in Australia and India. With strong, accessible incentives in place to decentralize global research and manufacturing facilities, we’ll start to see an influx of nationally and regionally oriented technology businesses – supported by their governments – competing with multinational corporations.

Spectra Logic
David Feller, VP product management and solutions engineering

2021 will bring a shift towards data storage planning for the long-term
The pandemic is causing self-reflection in the data industry. IT professionals are asking themselves these questions: How well is my data protected? How easily can I monetize archived data? How effortlessly can my data be accessed? What is a good 20-year plan for my data retention systems? The last of these questions continues to force organizations to take a long-term look at their architecture and at the systems that will grow, adapt and enable them to achieve future goals by investing in modern and flexible solutions today.

Greater realization of the long-term economic value of tape
We are seeing a trend with data centers where organizations are consolidating and deploying larger tape libraries in one or two locations. They then share that data (because of the improved bandwidth), with data management software overlaying it, making an on-premises setup simpler and easier. After the initial investment in the tape library, that cost dissolves and then it is just about the cost of physical tape media. Scalable, smaller tape libraries that enable users to increase capacity incrementally as their data repositories grow will continue to be deployed in midsize tape environments for economical long-term data retention as well as for air-gap protection against malware. In 2021 many organizations will realize this scenario proves more economical long-term than the year-on-year costs to get data back with a pure cloud-only strategy. With the LTO roadmap extending out to LTO-12 and 144TB of capacity per tape, and new demonstrations by IBM and Fujifilm of future tape capacities achieving 580TB per tape using strontium ferrite technology, data-hungry organizations will continue to depend on tape for its superior reliability, affordability and scalability for years to come.

Increasing control of data and ability to move between clouds
Organizations will recognize the importance of being discerning when it comes to committing to one cloud option (where prices can be raised over time or data can only be retrieved at great cost) and shutting down their data centers completely. Instead, organizations will embrace the greater independence and cost savings of avoiding cloud lock-in, with a local copy kept on-premises and one in the cloud for disaster recovery or cloud-specific workflows and compute. In 2021 more organizations will recognize the importance of the freedom to choose where to put their data without compromise, the benefits of controlling their own data, and the ability to move between clouds. 2021 marks the start of the commoditization of cloud storage and compute.

StorageCraft
Shridar Subramanian, CMO

Data storage will embrace zero trust
For years, the famous security maxim was “trust but verify”. But now organizations embrace a zero-trust approach to security: they entirely remove trust from the equation and assume that everything – including users, endpoints, networks, and resources – is untrusted and must be verified. A similar approach will soon be applied to data protection. A zero-trust approach to data backup and management will further protect enterprise data.

Zoom will have a big impact on storage capacity
With Zoom calls being recorded, shared, and stored, companies are generating more data than ever. Many organizations don’t realize, however, that video storage costs can run into millions of dollars annually. They will soon outgrow their existing storage space and face far greater data-storage requirements. Cloud storage costs that start at a few hundred dollars a month could rise to a few hundred thousand dollars annually within a few years.

Storage and backup will get smarter
Organizations are now collecting massive amounts of machine learning and IoT data. Most companies are thinking mainly about data analysis and much less about data backup or security. But as data increasingly moves from analysis to production environments, that’s when protection becomes critical. Cutting-edge storage tools increasingly rely on AI and machine learning to automate the data backup process. Given the exploding size of enterprise data, an efficient backup process will be crucial.

StorCentric
Surya Varanasi, CTO

Cybersecurity and backup: inseparable priorities in 2021
In 2020, increasingly aggressive and rampant ransomware, and other bad actors, continued to attack not only onsite production data but every possible copy, wherever data lived. In 2021, this will not only remain true but increase in severity. Therefore, it will become critical for organizations to step up their cybersecurity game with data security, protection, and unbreakable, impenetrable backup solutions.

Mobility, flexibility and agility
2020 saw IT beginning to overcome the traditional obstacles associated with data migration, data replication and data synchronization with new and innovative solutions entering the market. In 2021, IT and business leaders will further prioritize strategies and solutions that enable fast, flexible, safe and seamless movement of data across heterogeneous on-premise, remote and cloud environments in order to reap the strategic business value, IT benefits and budgetary advantages for which these operations are intended.

Migration of workloads from VMs to Kubernetes will accelerate
Covid-19 forced organizations to adapt to a distributed workforce and consequently forced a spotlight on strained digital infrastructures. The scalability and rapid uptime of Kubernetes deployments helped organizations bridge the gap between cloud and on-premise capabilities. Post Covid, the ability of Kubernetes to manage distributed large-scale workloads seamlessly will only accelerate the trend away from VMs.

StorMagic
Bruce Kornfeld, chief marketing and product officer

Two-node HCI deployments become the standard – HCI has traditionally required three servers per site, but with edge computing on the rise there is massive demand to reduce the IT footprint at these smaller locations.

Simplified edge computing management – With hundreds or thousands of locations, edge computing can be a nightmare to manage. Software providers will improve the tools available to manage all of these locations from a single pane of glass.

IoT and cloud migration further edge adoption – As various IoT implementations truly begin to take off in 2021, so will the need for edge computing to be able to process the sheer amount of data that the devices produce. Additionally, cloud providers will begin realizing the performance benefits of moving their data processing closer to the source, thus increasing edge adoption.

2020 Facts
Covid – The big deal here was that it forced organizations of all sizes around the world to quickly accelerate their digital transformation plans. Everyone had to find new ways to use technology to stay in business and make it through the pandemic.

The retail market gets turned on its head – This was the story of the haves and have-nots. Luxury retailers and high-end boutiques struggled to stay in business (many have not), but those closer to providing essential goods actually thrived (example: Neiman Marcus vs. Target).

Security – 2020 saw more threats, more “bad actors”, more ransomware and more data breaches. The security industry is on fire right now with technology providers innovating to continue to find ways to help organizations defend against all these threats.

Edge – Customers found new ways to deploy solutions to meet their customer needs and improve efficiencies with IoT. Cloud providers are all scrambling to take part in this massive growth trend.

Work from home impact – Almost every professional, regardless of industry, has had to move to a work from home environment due to COVID. This may or may not be permanent. Technologies that saw increased demand during this transition include security-related products, video conferencing, and VDI (virtual desktop infrastructure).

The commercial real estate market suffers – Many organizations are announcing 100 percent remote working strategies and shutting down offices in response. In many cases, this will be a permanent change or minimally will change the demand for office space globally. This will have a long-term impact on pricing of existing office space and developers’ plans to build more.

StorONE
Gal Naor, CEO

Storage hardware will continue to make significant strides, providing both dense performance and dense capacity, and only solution companies that can utilize 90% of drive potential will survive.

Only companies that deliver a multi-product, multi-use case storage solution rather than a single product will thrive.

Customers will demand a storage solution that works on-premises and in the cloud, with the capability to seamlessly transition between platforms.

StorPool
Boyan Ivanov, CEO

Everything NVMe: NVMe SSDs are continuing a slow overtake of the flash media market. More businesses will select a capacity flash solution over HDD for their workloads. Businesses using legacy Fibre Channel technology will continue migrating their infrastructure to NVMe-oF, though it remains to be seen what the final technology of choice will be – NVMe/FC, NVMe/RDMA, or NVMe/TCP.

Fully managed, proactive services will become the standard: as the hyperscale cloud services providers continue to hire scores of infrastructure experts in more and more new geographies worldwide, hiring the needed technical personnel is becoming an even bigger bottleneck to running a smaller public or private cloud environment. Making storage easier to use through UX/UI enhancements is no longer sufficient. Vendors need to step up their game and leverage their expertise to provide higher service levels, thereby lightening the human resource requirements for businesses and optimizing their overall cost structure.

High-density SSDs will become the norm. Storage software will need to draw out their full performance potential: Using larger drives has advantages (less rack space, less power consumption) and disadvantages (impact of drive/node failure on performance, rebuild times). The overall trend to work in a sustainable way will push businesses to adopt higher-capacity (8TB+) SSDs. The main roadblock to this is the performance per drive. Storage software that needs fewer NVMe SSDs to achieve the overall IO/s and latency levels required by customers will be an attractive option.
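
A quick back-of-the-envelope calculation shows why per-drive performance becomes the binding constraint as drives get larger; all figures below are hypothetical and for illustration only.

```python
# Hypothetical sizing exercise: drives needed for 64TB usable capacity
# and a 2M IO/s target, at three illustrative drive sizes.
target_iops = 2_000_000
capacity_tb = 64

for drive_tb, drive_iops in [(4, 180_000), (8, 160_000), (16, 140_000)]:
    by_capacity = -(-capacity_tb // drive_tb)      # ceiling division
    by_performance = -(-target_iops // drive_iops)
    print(f"{drive_tb:>2}TB drives: need {max(by_capacity, by_performance)} "
          f"(capacity says {by_capacity}, performance says {by_performance})")
```

Once drives are large enough, the IO/s target, not capacity, dictates how many drives are bought, so software that extracts more IO/s per drive directly reduces the bill.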

Updates using continuous integration / continuous deployment practices: As IT infrastructure is an integral part of the operation of modern businesses, storage systems that cannot keep up with shifting customer demands can become a blocker for business model innovation. Updating storage software once a year or less frequently will not be acceptable for businesses looking to adapt to Covid-19 aftershocks. Multiple updates per year, or even per quarter, will become the norm.

StrongBox Data Solutions
Floyd Christofferson, CEO

Data-centric file management will help contain costs in 2021
IT organizations are increasingly moving to a data-centric model to manage files across multi-vendor on-premises and cloud storage platforms, a trend that will grow significantly in 2021. The datacenter budget tightening in 2020, caused in part by the impact of Covid-19, has increased the need for IT organizations to leverage intelligent data management technologies to automate multi-tier storage architectures, defer expensive storage upgrades, and get more life out of existing primary storage. This trend is driven by the need to reduce the load on expensive primary storage types, but it is also possible now due to the emergence of technologies that enable automation based upon data intelligence and business processes, which can seamlessly manage data across any storage type.

Data-centric approaches leverage metadata from files and other workflow-driven triggers to automate data movement, migration, data protection, active archiving and other use cases across any on-prem or cloud storage type. As opposed to traditional storage-centric approaches, where data can often get stranded in a single vendor silo, data-centric automation for policy-based data placement gives IT planners more flexibility to contain costs by offloading primary storage without interrupting user access or adding complexity. This trend also gives IT organizations greater flexibility to defer primary storage upgrades or replacement by transparently shifting data to lower-cost on-premises and cloud storage choices.
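
As a minimal sketch of what file-level, policy-based data placement can look like, the loop below relocates files that have not been accessed within a threshold; the paths and threshold are hypothetical, and real data-centric platforms use far richer metadata and preserve user access via links or stubs.

```python
import os
import shutil
import time

# Hypothetical tiers and threshold; real platforms use richer metadata
# and leave links or stubs so user access paths are not interrupted.
PRIMARY = "/mnt/primary"
ARCHIVE = "/mnt/archive"
COLD_AFTER_DAYS = 180

now = time.time()
for dirpath, _, filenames in os.walk(PRIMARY):
    for name in filenames:
        src = os.path.join(dirpath, name)
        # Policy trigger: no access within the cold threshold.
        if now - os.stat(src).st_atime > COLD_AFTER_DAYS * 86400:
            dst = os.path.join(ARCHIVE, os.path.relpath(src, PRIMARY))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)
```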

Toshiba Electronics Europe GmbH
Rainer W. Kaese, senior manager for HDD business development

The company has a longstanding reputation as a leader in data storage technology, and its team of experienced professionals possesses a deep understanding of the fundamental dynamics that define the market. In the following text, Rainer W. Kaese of the company's storage products division gives his insight into what the year ahead holds.

Through a broad cross section of use cases, each with their own particular nuances, the expectations currently being placed on data storage technology are proving to be greater than ever. Our society’s data consumption is already way beyond what could have even been imagined in the past. Projections from IDC suggest that our annual data generation levels will have exceeded 175ZB by 2025. Furthermore, the expansive array of new applications that are now starting to emerge mean that the exponential growth rate we are already experiencing is only going to continue.

As we go into 2021, dramatically heightened data access activity will be witnessed at the edge as well as at the core. Thanks to the huge production volumes supported, coupled with characteristically attractive price points and ongoing innovations, HDDs are certain to continue to have an important role to play.

Although SSDs seem to get the vast majority of media attention, the value of HDDs should never be underestimated – especially as data storage demands are getting more and more intense. It must be acknowledged that even the most ambitious estimates about future SSD production output would still only allow this storage medium to constitute a mere fraction of the total capacity that will be needed.

Market developments driving demand
Changes to working culture over the last 9 months, with a much greater percentage of the population now working from home with all-digital connections, have accelerated the migration to cloud-based services. This is putting more strain on existing data center infrastructure. At the same time, the landscape supporting all this activity is changing too. Cloud-based IT, often located in co-location (colo) sites, is set to become increasingly commonplace, enabling the requirements of numerous customers to be attended to using shared resources. This sets new challenges for the storage technology that forms the foundation of data center operations – requiring optimized solutions that match the access pattern as well as performance and reliability requirements.

Alongside what is occurring in the data center sector, the roll-out of IoT is now starting to scale up considerably. Estimates on the number of connected nodes being put into operation over the course of the next few years vary, with Juniper Research even predicting that this figure could actually pass 83 billion by the middle of the decade. What is definitely assured is that, if IoT is to be truly prevalent, the costs involved need to be as low as possible – especially from a data storage perspective.

Closely interrelated with IoT roll-out, increased interest in Industry 4.0 will be an impetus for the deployment of greater storage capacity in the manufacturing arena. IoT will also be leveraged by utilities and municipal administrations to enable various smart city functions (combatting congestion, air pollution, etc.). As with Industry 4.0, this will result in huge quantities of data being generated by sensors. With only limited on-site storage reserves and processing capabilities available, this data will generally be sent back to cloud servers for subsequent analysis – where cost-effective data storage resources will once again be required. More widespread use of surveillance systems is also destined to have a major impact on data capacity requirements, as will the move towards greater vehicle autonomy in the years to come.

Cost Considerations
For all the use cases just discussed, a substantial ramping up of data capacity will be mandated, while still keeping the financial investment involved to a minimum. Admittedly, a single SSD may be able to outperform a single HDD. However, the applications we are talking about here don't deal in single discrete units – they need large-scale solutions. For such implementations, multiple HDDs configured together are able to achieve very high IO/s figures while still being extremely economically viable.

When deciding on the most suitable storage medium, price per gigabyte is usually the primary concern. Though the costs associated with SSDs have fallen, they remain close to an order of magnitude higher than their HDD equivalents. Moreover, advances in HDD design are translating into further cost savings. It should be noted that tape will have to play a role as well, as it is definitely the cheapest way to store data in terms of cost per capacity; however, tape is not directly competing with HDD and flash, as all the data storage mentioned so far is online, while tape is not an online medium.
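
To put these relative economics in rough numbers, here is a small back-of-the-envelope sketch; the prices, drive sizes, and IO/s figures below are hypothetical placeholders, not Toshiba data.

```python
# All prices and IO/s figures are illustrative placeholders.
hdd_price_per_tb = 20.0     # USD, nearline HDD
ssd_price_per_tb = 150.0    # USD, capacity SSD
hdd_random_iops = 200       # a single 7,200rpm drive

capacity_tb = 1_000         # a petabyte-scale deployment
drives = capacity_tb // 16  # built from 16TB HDDs

print(f"HDD media cost: ${capacity_tb * hdd_price_per_tb:,.0f}, "
      f"~{drives * hdd_random_iops:,} aggregate IO/s from {drives} drives")
print(f"SSD media cost: ${capacity_tb * ssd_price_per_tb:,.0f} "
      f"({ssd_price_per_tb / hdd_price_per_tb:.1f}x)")
```

The point of the arithmetic is simply that at petabyte scale the media cost multiple dominates, while aggregating many spindles still yields respectable random IO/s.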

From an engineering standpoint, continued progression is being made with regard to helium-filled drives. Next-generation technologies like HAMR and MAMR are also in the pipeline. Through these there is the prospect of storage capacities being boosted without calling for any cost premium. The gap between HDD and SSD implementation outlay will therefore remain sizable for a very long time yet, as will HDD's overall market dominance in terms of deployed online storage capacity.

Vast Data
Renen Hallak, CEO

The healthcare industry walks away from the HDD
Healthcare is the first industry to go all-in on flash. It learned a hard lesson this year as it raced to deliver the research, testing, manufacturing, and vaccine deployment answers that the world needed. The problems with tiered storage show up most prominently at scale with analytics. These critical data sets now have a value that is proportionate to their size, which throws the value of storage tiering out the window. 2020 proved that our front-line systems could not deal with the latency of mechanical media, and low-cost flash price points are now compelling enough that organizations no longer need to choose between performance and budget.

The cloud tax triggers the cloud tea party
Companies that began their cloud journeys 4-5 years ago are now refactoring their strategies. The multi-year agreements customers made years ago have now started to mature, and as organizations weigh their alternatives, 2021 will be a reconciliation against the tax that cloud vendors levy on customers vs. being able to build on-premises. Many cloud vendors have held the line on pricing during periods of extreme HW cost reductions (example: Amazon S3), such that the initial economic calculus no longer works. In 2021, customers who operate at scale will make bold moves back on-prem as they have now realized that shifting costs between Opex and Capex results in the same total spend. And, while time is money and cloud provides agility, money is also money. When the comparable costs are 15x-30x more than what customers pay on-prem, the economics can't be ignored.
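
A toy comparison makes the Opex-vs-Capex point concrete; every figure below is an illustrative placeholder rather than real pricing, but the shape of the arithmetic is what drives the at-scale decision.

```python
# Every figure below is an illustrative placeholder, not real pricing.
data_gb = 10_000_000     # 10PB
months = 36

cloud_per_gb_month = 0.021
cloud_total = data_gb * cloud_per_gb_month * months

onprem_capex_per_gb = 0.02        # amortized hardware
onprem_opex_per_gb_month = 0.001  # power, space, admin
onprem_total = data_gb * (onprem_capex_per_gb +
                          onprem_opex_per_gb_month * months)

print(f"Cloud, {months} months:   ${cloud_total:,.0f}")
print(f"On-prem, {months} months: ${onprem_total:,.0f} "
      f"(~{cloud_total / onprem_total:.0f}x apart)")
```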

Legacy NAS is dead for AI
With the introduction of PCIe Gen4, I/O rates have now completely broken away from CPU core evolutions. Legacy NFS providers are stuck with single-stream TCP that is rate-limited by the capability of a single CPU core on the application server. PCIe Gen4 will double the peak I/O performance of applications in 2021, while single-core CPU performance will come nowhere near doubling to keep pace. There is no greater concentration of single-host IO than in the AI market – for applications such as machine learning and deep learning. To resolve this, customers will seek solutions that support multi-threading, RDMA, and the ability to bypass CPUs altogether – as is the case with Nvidia's GPUDirect Storage. The demands to keep GPUs and AI processors fed and efficient will dramatically outstrip the I/O capabilities of legacy TCP-based NAS, leading customers to walk away from legacy NAS altogether in 2021.
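
The underlying idea, stripped of the RDMA and GPUDirect specifics, is simply that many concurrent streams escape the single-core, single-stream bottleneck. A minimal sketch with a hypothetical dataset path:

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Hypothetical dataset file; the point is many concurrent range reads
# instead of one stream bound to one CPU core.
PATH = "/mnt/dataset/shard-000.bin"
CHUNK = 64 * 1024 * 1024  # 64MiB ranges

def read_range(offset: int) -> int:
    with open(PATH, "rb") as f:
        f.seek(offset)
        return len(f.read(CHUNK))

size = os.path.getsize(PATH)
offsets = range(0, size, CHUNK)
with ThreadPoolExecutor(max_workers=16) as pool:
    total = sum(pool.map(read_range, offsets))
print(f"read {total:,} bytes in {len(offsets)} parallel ranges")
```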

Standards become even more important in 2021 for ML/AI
AI compute vendors will further push to homogenize standards across the market to make way for framework compatibility. In their quest to open up the TAM for accelerated computing, the need for standard programming environments and I/O stacks will only increase. The explosion of ML/AI-tuned silicon will force the industry to adopt a standardized storage presentation to a common hardware environment. Storage-wise, while NAS has an appeal as NFS is also a standard, standard NFS optimizations in the kernel such as RDMA and multi-path will be table stakes to marry the performance needs with the push to standardize.

Wasabi Technologies
David Friend, CEO

Increased diversification in cloud technology market
Moving into 2021, we will see increased diversification in the cloud technology market. Organisations will continue to be turned off by high costs, including hidden egress and ingress fees, rigid capacity, and vendor lock-in from the mainstream public cloud storage providers like Amazon Web Services, Google Cloud Platform and Microsoft Azure. And a lack of innovation and increasing prices from these hyperscalers will continue to drive organisations away. As a result, the cloud industry will continue to expand with new storage providers offering increasingly diverse IT service packages that are more flexible, cost-effective, and personalized. This will enable a greater shift to hybrid or multi-cloud storage options throughout the enterprise.

Developers become increasingly cloud-savvy
In 2021, we will see the role of the developer evolve as tech stacks shift towards a more cloud-native approach. Historically, with traditional on-premise solutions, developers' main concerns centered around selecting the right hardware for their applications. While this will still be the case for the foreseeable future, when it comes to the cloud, developers will also need to be experts on cloud providers, what they have to offer, and which are the best fit for their organization's needs. They will also need to be proficient in configuring hybrid solutions where different functions are performed in different clouds, if they aren't already. This will ultimately expand developer capabilities and help make their jobs easier.
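
In practice, much of that provider fluency reduces to knowing that S3-compatible services differ mainly in endpoint and pricing. A minimal sketch; the endpoint, credentials, and bucket are placeholders:

```python
import boto3

# The same S3 client code works against any S3-compatible provider;
# only the endpoint and credentials change. Values are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)
s3.upload_file("report.pdf", "my-bucket", "reports/report.pdf")
```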

Surveillance will move to cloud
I see the surveillance industry being the next major migrator to the cloud. Surveillance is one of the world’s biggest consumers of storage, but traditional local, on-premise storage has limitations around capacity and security. What the industry needs is a sustainable, flexible, and low-cost cloud solution to streamline data storage where it can keep its content safe from being destroyed or modified. With the cloud, surveillance organisations get increased safety and security, are able to save files for longer periods of time, and remain more compliant.

WekaIO
Liran Zvibel, CEO

Looking back while moving forward
One of the most interesting things about looking ahead is that to know where you are going, you must know where you have been. If we are going to have flying cars in the future, we must have a good understanding of both automobiles and aeronautics. If we are going to establish a base on Mars, we have to leverage the knowledge we've gained in getting to the Moon.

The best guesses for the future lie in understanding predictive analytics. And analytics rely on copious volumes of information to attempt to guess at what is coming next. For the upcoming year – and the ones soon after – data will remain king. Data-hungry applications will continue to proliferate. And data management strategies across information's entire lifecycle will be highly sought-after solutions.

For the year 2021, organizations will seek to adopt a modern storage architecture that solves today’s biggest data-intensive problems. They will look to high-performance computing solutions to satisfy their AI/ML workloads. They will seek to migrate data to the most-efficient platform. And they will move to a hybrid-cloud model because it has become a must-have vs. a nice-to-have option.

  • Migration away from NFS for HPC workloads
    Speeds and feeds have dominated the compute and storage landscape the past few years. It has become a virtual arms race to provide the most IOPS, the most horsepower, the fastest processing.
    With AI/ML, HPDA, and analytics workloads now dominating the enterprise, companies have come to realize that NFS cannot support these data-hungry applications.
    Companies looking for new advanced technologies to gain a competitive advantage and improve their business outcomes will continue to move to a modern storage architecture that can handle the most demanding I/O intensive workloads and latency-sensitive applications at petascale. They need solutions that allow them to maximize the full value of their high-powered IT investment – compute, networking, and storage. This is especially true of organizations in the fields of manufacturing, life sciences, energy exploration and extraction, financial services, government & defense, and scientific research – all of which have found that NFS is no longer cutting it.
  • Workload portability will be a requirement
    While speed remains an important metric in HPC environments, more end-user environments will want to run their workloads in the most cost-efficient platform without compromising performance. Data continues to be a strategic asset for business and lifecycle management has increasingly become key.
    Users will want to move their workloads easily and seamlessly with no user downtime to achieve the same application performance regardless of where the data resides.
    As data volume grows, so too does the need for a unified storage solution that can manage data in a single, unified namespace wherever in the pipeline that data is stored. Enterprises will continue to explore solutions that provide them with the utmost operational efficiency in managing, scaling and sharing data sets, with absolute operational agility, by eliminating storage silos across edge, core and cloud. A system that does this in conjunction with stateless containers allows data to be acted upon wherever it resides for ultimate portability, rather than requiring massive amounts of data to be ingested first.
  • Hybrid cloud is no longer optional, it is mandatory
    The cloud is one of past years' predictions that now serves as today's reality. While organizations understood the importance of implementing a cloud-based strategy into their architectural decisions, limitations or concerns left it as an optional upgrade to traditional storage networks. This is a nice-to-have option no longer.
    Cloud computing was already seeing massive acceptance, but COVID accelerated companies' cloud strategies. Hybrid cloud capability is no longer an option, it is mandatory. Customers must be able to move their workloads to the cloud for capacity planning and to satisfy bursty workloads.
    By leveraging a cloud-native storage solution in conjunction with a fast file system, organizations can deploy end-to-end stack solutions that seamlessly run on-premises, in the cloud, and burst between platforms. While clouds like AWS have become an ideal platform to provide an agile compute environment for workloads requiring HPC to accelerate discovery, there will be an increase in the need for fast, scalable file systems to ensure applications never have to wait for data.

If 2020 has proven anything, it is that nothing is predictable. But we have the ability to make informed decisions about the future. With a team of industry experts holding hundreds of years of combined experience, we at WekaIO have walked the path that has helped us become a leader in the storage industry today.

We recognize that the future of storage management lies in an innovative approach that moves organizations away from NFS to the cloud, with portability of workloads from the core to the edge, and we're here to help ready our customers for liftoff.

Western Digital
Scott Hamilton, senior director, data center platforms

New storage technologies enhancements will emerge to support cost-effective active archives
In 2021, we will see greater adoption of next-generation disk technologies and platforms that enable both better TCO and accessibility for active archive solutions. We estimate that data is growing at ~30% CAGR and is expected to reach 150ZB stored by 2025. This insatiable growth, driven by humans and machines, is creating an explosion in long-term data retention and archive challenges like never before. Where do we put all of this data? How do we cost-effectively store it and maintain long-term access with the lowest TCO? In an age where capturing, storing and extracting value from this influx of data is critical to success, new solutions that support active archive systems must emerge. With advancements in HDD technology, including new data placement technologies, higher areal densities, mechanical innovations, intelligent data storage, and new materials, HDD-based solutions will emerge that enable new capacity points and unprecedented economics and TCO at scale for active archive tiers. In 2021, we will see a new generation of storage devices based on host-managed SMR HDD technology, giving way to platforms specifically designed for colder storage tiers and making long-term data storage more economical and accessible for data at scale, whether on-premises or in the cloud, for decades to come.

XenData
Philip Storey, CEO

Increasing adoption of active archives based on tape libraries
It is hard to beat the cost per terabyte of data tape libraries for organizations that have large active archives, whether those organizations are public cloud providers or users with large volumes of data. This is due to the low cost per terabyte of the cartridges themselves and low system power requirements. It means that tape storage is often an important element, alongside disk and management software, in high-capacity active archive systems.

Many public cloud providers have introduced object storage with very low-cost archive tiers that take perhaps an hour or more to restore a file. Often these providers use data tape as part of the storage mix. We anticipate rapid growth in the use of low-cost archive tier cloud storage. However, many potential users of cloud archive tiers are put off by the very high cost of egress fees, not just for routine restores, but the potentially massive cost if they ever want to move their content to another provider.

On-premises tape-based active archives are typically not subject to egress fees and we will continue to see growing demand for this class of storage. On-premises archive solutions that offer an S3 interface, allowing the archive to be securely shared by remote users and other facilities, will be especially attractive.
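
One concrete mechanism by which an S3 interface enables secure remote sharing is the time-limited presigned URL. The sketch below assumes a hypothetical on-premises archive exposing an S3-compatible endpoint; the endpoint, bucket, and key are placeholders.

```python
import boto3

# Hypothetical on-premises archive exposing an S3-compatible endpoint.
s3 = boto3.client("s3", endpoint_url="https://archive.example.internal")

# Time-limited download link a remote user can open without credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "active-archive", "Key": "projects/master-2020.mxf"},
    ExpiresIn=3600,  # valid for one hour
)
print(url)
```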

Zadara
Nelson Nahum, CEO

Edge cloud will become a reality
Factors such as 5G adoption and the pandemic will cause many businesses that have workloads on-premises to move to the edge cloud. This will provide the best of both worlds: a cloud model that removes the owning and managing of data centers and CapEx – while at the same time moving to a close location for latency, bandwidth and compliance. Many workloads currently in traditional cloud environments – including those that suffer from negative user experiences due to latency issues – will move to the edge cloud as well. Businesses will have the ability to run their infrastructure in a common approach – no need to design for different architectures in different locations. This portability will drive new initiatives and the adoption of new technologies.

More sophisticated, multi-tiered cloud storage solutions will emerge
Storage tiers will always exist to accommodate different price/performance capabilities. Today, different storage systems exist in each one of the tiers, and the IT professional is responsible for placing and moving data from tier to tier. Once data is moved from traditional storage systems to a cloud model, expect to see more sophisticated cloud storage solutions emerge. From SMR drives to Optane NVMe, IT professionals will be able to intelligently move data to the right tier according to policies set by the user. Additionally, companies will benefit from applications and architectures that are "cloud agnostic", which will remove the lock-in and allow them to freely move applications to the best price/performance clouds.

Containerization will accelerate
From containerization in development environments to full-scale production orchestration managed by Kubernetes, the use of persistent production workloads and running business applications on containers will accelerate. Many companies will look to run their entire business in containerization, driven by the emergence of persistent storage for containers. A key factor driving this is data mobility. The need to move data from on-premises to cloud or between clouds is a challenge that can be overcome with containers – and having a common platform will enable containerization to go to new levels in 2021.
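
On the Kubernetes side, persistent storage for containers is requested declaratively through a PersistentVolumeClaim. Below is a minimal sketch using the official Kubernetes Python client; the claim name and storage class are hypothetical and would map to whatever the storage provider exposes.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a configured kubeconfig

# Request 100Gi of persistent storage; the claim name and storage
# class are hypothetical and would map to the provider's offering.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="orders-db-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="block-tier1",
        resources=client.V1ResourceRequirements(
            requests={"storage": "100Gi"}
        ),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

Because the claim names only a storage class and a size, the same manifest travels with the workload from one cluster or cloud to another, which is the data mobility point made above.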

Zerto
Andy Fernandez, product marketing manager

Now that we’ve passed the hype, the cloud has become a catalyst of digital transformation. Covid-19 stress tested our infrastructure globally, and it showed that the cloud can actually scale and support the surge in provisioning which confirmed it as a reliable source of infrastructure. This was made possible not only because of how readily available the cloud is but also how it allows people to scale quickly, spin up new resources, and accelerate application development.

Now organizations are realizing they can move their data protection and disaster recovery services to the cloud, resulting in an increase in speed, agility, and efficiency.

In 2021, modern organizations will move even more workloads to the cloud and continue to adopt cloud-native services, specifically containers and applications for DevOps. By the end of the decade, enterprises will run most of their production environments in the cloud. Companies will move away from building new sites or buying more hardware in favor of pursuing an operational model with the cloud. In order to achieve this, organizations will need data management, protection and mobility solutions that facilitate this move, not act as an impediment.
