QF2 delivers performance and capacity for file-based data on the cloud and on premises, while providing real-time visibility and control of the data footprint. The company offers a free tier for using QF2 on AWS up to 5TB, giving businesses the freedom to store, manage and share file-based data across on-premises data centers and the cloud.
Data-intensive industries need to take advantage of the elastic compute resources, operational agility and advanced services that the public cloud offers. In a recent Taneja Group study, businesses reported that file-based data is the leading data access method for the applications they plan to move to the public cloud. However, existing options for working with file-based workloads in the public cloud are limited in scale, performance and visibility into the data footprint. As businesses become more global and data sets approach billions of files and petabyte scale, they require a solution that supports file-based data mobility across on-premises and public cloud infrastructures.
“Businesses are looking for solutions that can help them move and share file-based workloads between the data center and the cloud,” said Peter Godman, founder and CTO, Qumulo. “We recognized an opportunity to fill this gap in the market with a more intelligent file storage system designed for the demands of the modern enterprise by scaling both performance and capacity on the cloud, with no hard limits. The only unified fabric to span on-premises and public cloud storage for customers at petabyte scale, QF2 is in an entirely new class of enterprise storage.”
QF2 answers the need of data-intensive industries, such as life sciences and media and entertainment, to work with globally distributed data sets across time zones and locations, and at up to billion-file scale. With QF2, these modern enterprises can now leverage the cloud for new economies of scale and access to such technologies as GPU arrays, machine learning, microservices and serverless computing.
“Our researchers in epidemiology who study how diseases spread have very large, file-based data sets that are best analyzed with GPU-based, massively parallel computing that runs on the cloud,” said Tyrone Grandison, CIO, Institute for Health Metrics and Evaluation, University of Washington. “Ramping up GPU infrastructure on premises in our data center is far too expensive and complex. QF2 allows us to move file-based data sets to a QF2 cluster on AWS, complete our analysis, and move the artifact back to our on-premises QF2 storage cluster, saving us time and money. The flexibility for us to move our file-based data where we need it to be is something that nobody else in the market can provide at scale.”
“We are building a fully orchestrated visual effects rendering solution that spans our on-premises data center and AWS,” said Jason Fotter, co-founder and CTO, FuseFX. “We now have a QF2 cluster on AWS and in our data center, creating a unified fabric that enables us to share file data between these two operating environments, maintain workflow consistency, and meet the high performance requirements for heavy compute workloads on the cloud.”