
Teradata Brings Out Extreme Data Platform 1700

SQL engine for big data at $2,000/TB compressed

Teradata US delivered its Extreme Data Platform 1700 with enhanced processing speed and storage capacity.


This platform also delivers a SQL engine on top of big data at a price from $2,000/TB of compressed data.

The amount of data created and stored in every industry has exploded from hundreds of terabytes to hundreds of petabytes. Large volumes of relational data are offloaded or sometimes even discarded. To drive analytics, organizations require fast access to all of the detailed historical data, not a sample. An emerging set of immature SQL engines has appeared in response to this need, but they lack performance, security, and feature functionality. In comparison, the Extreme Data Platform 1700 allows customers to take advantage of all their data by exploiting low-cost storage, without compromising the robustness of their analytics.

“Before the Teradata Extreme Data Platform 1700, customers searching for a solution under $5,000 per terabyte were forced to look at a file system or solutions with a reduced set of features,” said Scott Gnau, president, Teradata Labs. “Our new platform removes the economic barriers, while delivering our best-in-class SQL engine for analysis of large volumes of relational data.”

A Teradata customer ran the ‘TeraSort’ benchmark on both the Extreme Data Platform and a Hadoop cluster. The benchmark’s goal is to sort one terabyte of data as fast as possible. The Extreme Data Platform completed the sort in less than 20 seconds versus 60 seconds for the Hadoop cluster, running three times faster even though the Hadoop cluster had eight times more servers. The result demonstrates the platform’s performance advantage for data sorting.
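For context, TeraSort operates on 100-byte records with 10-byte keys. The toy, single-machine Python sketch below only illustrates what the benchmark measures (timing a sort over fixed-size records); it is scaled down by several orders of magnitude and is not the actual distributed test.

```python
import os
import time

# Toy illustration of a TeraSort-style workload: generate fixed-size
# records, then time how long sorting them takes. Scaled down to
# 1 million 100-byte records (~100MB) rather than the benchmark's 1TB.
NUM_RECORDS = 1_000_000
KEY_LEN = 10  # TeraSort records carry a 10-byte key and a 90-byte payload

records = [os.urandom(KEY_LEN) + b"x" * 90 for _ in range(NUM_RECORDS)]

start = time.perf_counter()
records.sort(key=lambda rec: rec[:KEY_LEN])  # sort on the 10-byte key
elapsed = time.perf_counter() - start

print(f"Sorted {NUM_RECORDS:,} records in {elapsed:.2f}s")
```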

In addition to performance, customers benefit from the database features required for analysis of all data: analytics, security, optimization, indexing, and workload management. When assessing the viability of solutions, this example reinforces the importance of considering the total cost of ownership (TCO) to deliver a result, rather than just the cost of cheap commodity servers.

In addition to performance for simple and complex tasks, the platform enables organizations to store more data in less space. Based on a customer’s deployment, it is fifteen times more data-space efficient than a commodity system, and it scales beyond 500PB of compressed data.
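Because the $2,000/TB price applies to compressed data, the effective cost per raw terabyte depends on the compression ratio achieved. A back-of-the-envelope sketch, with the 3:1 ratio below chosen purely as an assumed example rather than a figure quoted by Teradata:

```python
# Effective price per *uncompressed* TB at $2,000 per compressed TB.
# The 3:1 compression ratio is an illustrative assumption only.
PRICE_PER_COMPRESSED_TB = 2_000  # USD
COMPRESSION_RATIO = 3.0          # assumed raw-to-compressed ratio

price_per_raw_tb = PRICE_PER_COMPRESSED_TB / COMPRESSION_RATIO
print(f"Effective price: ${price_per_raw_tb:,.0f} per raw TB")  # ~$667
```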

It is compatible with existing analytics and business intelligence applications and designed for performance, handling big data analytic queries from business users. It is enterprise-ready with the Teradata Database, offering Intelligent Memory, integrated workload management, security, ACID compliance (atomicity, consistency, isolation, durability), and standard ANSI SQL (American National Standards Institute) compliance.
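Since the engine speaks standard ANSI SQL, existing tools and scripts can reach it through ordinary interfaces such as ODBC. A minimal sketch using the pyodbc library, where the DSN name `tdp1700` and the `clickstream` table are hypothetical placeholders:

```python
import pyodbc

# Connect through an ODBC data source; "tdp1700" is a hypothetical DSN
# that would be configured in the ODBC driver manager.
conn = pyodbc.connect("DSN=tdp1700;UID=analyst;PWD=secret")
cursor = conn.cursor()

# A plain ANSI SQL aggregate over a hypothetical click-stream table;
# no engine-specific dialect is required.
cursor.execute("""
    SELECT page_url, COUNT(*) AS hits
    FROM clickstream
    GROUP BY page_url
    ORDER BY hits DESC
""")
for page_url, hits in cursor.fetchall():
    print(page_url, hits)

conn.close()
```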

What’s Improved?
It offers three-terabyte drives, a more cost-effective storage option. System availability has been enhanced with additional optional hot standby nodes and hot spare drives, and data is better protected with an improved storage architecture. The platform leverages dual eight-core Intel Xeon (Sandy Bridge) processors at 2.6GHz. Like all Teradata platforms, it runs Novell SUSE Linux Enterprise Server 11.

Deep Dive Analytics
As information grows, the need for deep-dive analytics grows with it. The platform supports industries such as the airline industry, which requires the ability to store and analyze machine-generated data to better understand maintenance and repair requirements. For example:

  • A jet engine can generate 10 terabytes of information for every 30 minutes of flight.
  • A twin-engine plane traveling from New York to Los Angeles can generate up to 240TB of data. When that rate is multiplied by the nearly 30,000 flights in the sky at any one time, the amount of data to be stored and analyzed amasses quickly, as the quick calculation below illustrates.
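
The 240TB figure checks out arithmetically if a roughly six-hour transcontinental flight is assumed (the flight time is an assumption, not stated above):

```python
# Sanity-check the per-flight figure and extrapolate across the fleet.
# FLIGHT_HOURS is an assumed value; the other inputs come from the text.
TB_PER_ENGINE_PER_HALF_HOUR = 10
ENGINES = 2
FLIGHT_HOURS = 6          # assumed New York-to-Los Angeles flight time
FLIGHTS_IN_AIR = 30_000   # concurrent flights cited above

tb_per_flight = TB_PER_ENGINE_PER_HALF_HOUR * ENGINES * FLIGHT_HOURS * 2
print(f"Per flight: {tb_per_flight} TB")  # 240 TB

# Rough fleet-wide volume if every airborne flight produced that much data.
print(f"Fleet-wide: {tb_per_flight * FLIGHTS_IN_AIR / 1_000_000:.1f} EB")  # 7.2 EB
```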

Other examples of industries that require this storage and analytic capability include:

  • Web Analytics – Analysis of integrated click-stream data to drive more product sales
  • Regulatory and Compliance – Multi-year historical data to comply with regulations, such as legal retention and other governmental requirements.
  • Sensor Data – RFID product movement and monitoring
  • Communications Network – Data analysis will enable customers to assess consumer behavior as well as network, cell tower, and handset performance, and to fulfill legal compliance requiring retention of several years of history.
  • Actuarial Analysis – Insurance carriers can determine customer risk profiles by analyzing a variety of customer characteristics.

The platform is shipping now. The ability to ingest and analyze JSON data will be available in early 2014.

Unified Data Architecture (UDA)
Stored data is kept in the same format and schema, which makes it transferable to and from any Teradata integrated data warehouse. These characteristics allow the platform to fit within the UDA, a framework for organizations to analyze all types of data across multiple Teradata systems, leveraging the complementary value of the Teradata Database, the Teradata Aster Database, and Apache Hadoop.
