Kognitio Announces Memory-Based Pricing for Analytical Platform
Based on the amount of memory employed by the servers hosting the platform
This is a Press Release edited by StorageNewsletter.com on March 1, 2012, at 2:57 pm.

Kognitio is shifting the pricing of its Analytical Platform to be based solely on the amount of memory employed by the servers hosting the platform.
Data warehouse vendors have historically charged users for the amount of data held on disk. While this enables easy price-per-terabyte comparisons between software vendors, it doesn’t help organizations that seek insight from ever-increasing volumes of data in seconds. Kognitio, which does all of its analytical processing in-memory, has changed this paradigm, enabling the storage of petabytes of data at a fraction of the cost of other leading vendors.
The Analytical Platform answers queries quickly because it leverages the performance of data held in memory. While Kognitio also holds data on mechanical disk for persistence and failure recovery, it does not suffer the bottlenecks of disk-based systems, which fail to supply data quickly enough to the CPU cores for processing. Data held in memory, combined with true massive parallelism, ensures that every core, on every CPU, on every server is utilized for every query.
Since most systems store much more data than they access regularly, Kognitio lets hot data – that which is needed immediately – stay in memory, while warm data stays on standard mechanical hard disks. Even with a data set the size of the US Library of Congress, around 200TB, only about 10% of that data is regularly needed and would therefore reside in memory. With the new pricing, the Library would pay only for the 20TB of data it uses regularly, saving 86% over leading vendors.
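The arithmetic behind the Library of Congress example can be sketched as follows. The per-terabyte license prices below are hypothetical, chosen only so the figures reproduce the 86% saving quoted in the release; Kognitio does not publish these rates.

```python
# Illustrative comparison of disk-based vs. memory-based pricing.
# All per-TB prices are hypothetical assumptions for the sketch.

TOTAL_DATA_TB = 200        # full data set, per the Library of Congress example
HOT_FRACTION = 0.10        # share of data queried regularly, held in memory

DISK_PRICE_PER_TB = 10_000    # hypothetical disk-based license price per TB
MEMORY_PRICE_PER_TB = 14_000  # hypothetical in-memory license price per TB

# Disk-based vendors charge on everything stored (all 200 TB).
disk_based_cost = TOTAL_DATA_TB * DISK_PRICE_PER_TB

# Memory-based pricing charges only on the hot 10% (20 TB) held in memory.
memory_based_cost = TOTAL_DATA_TB * HOT_FRACTION * MEMORY_PRICE_PER_TB

saving = 1 - memory_based_cost / disk_based_cost
print(f"Disk-based license:   ${disk_based_cost:,}")    # $2,000,000
print(f"Memory-based license: ${memory_based_cost:,}")  # $280,000
print(f"Saving: {saving:.0%}")                          # 86%
```

With these assumed rates, even an in-memory price 40% higher per terabyte still yields the quoted 86% saving, because the charge applies to only a tenth of the data.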
A comparable Oracle Exadata solution is 700% more expensive. While prices are not published for direct comparisons with Oracle’s Exalytics or SAP HANA, indications are that those solutions are even more expensive. Internal testing suggests that Kognitio would be many times faster and more scalable than those systems.
"Data is growing exponentially every day. With traditional pricing models, users are locked in to pay ever-increasing license costs on data that they need to have available, but don’t need instantly," said Kognitio Chief Technology Officer, Roger Gaskell. "Now Kognitio clients can just pay for the value they get from high-speed in-memory analytics on their active data, not a ‘tax’ for data on disk."
Customers can now put as much data as they like on a Kognitio Analytical Platform and be charged only for what they actually load into memory. The Massively Parallel Processing (MPP) scale-out architecture allows platforms to be built with memory sizes ranging from half a terabyte to hundreds of terabytes. Customers are free to attach as much disk as they like to these systems – insulating them against rising software license costs as their data volumes grow.
"Paying for every drop of data you store just isn’t sustainable," said Steve Millard, Kognitio Chief Operating Officer. "It’s time we shed discussions about cost-per-TB of disk-based data. In-memory pricing aligns cost with value for our clients; no one else can deliver this kind of price/performance."