Interview of George Teixeira, CEO, DataCore

"Parallel processing software will be a game changer in 2017."

George Teixeira co-founded the company and has served as CEO and president of DataCore Software Corp. since 1998. Prior to that, he served in a number of executive management positions, including WW VP of marketing and GM of the product business group at Encore Computer Corporation, where he also played a major role as the team leader of OEM marketing and sales to Amdahl, IBM, and DEC. His work culminated in the $185 million sale of Encore’s storage control business to Sun Microsystems in 1997. He also held a number of senior management positions at the computer systems division of Gould Electronics. He is a native of Madeira, a Portuguese island off the coast of Africa. He earned an MBA at Florida Atlantic University, after joint B.S. degrees in Chemistry and Biology.

Question: What technology do you think will be a game changer in 2017?
George Teixeira: I believe that parallel processing software will be a game changer in 2017, largely due to its disruptive capability to dramatically increase productivity. There is still so much computing power sitting idle, despite all of the incredible technology advancements that have occurred. However, 2017 will be the year that parallel processing software goes mainstream and unleashes the incredible processing power of today’s multi-core systems, positively disrupting the economics and productivity of what computing can do and where it can be applied.

This will happen as parallel processing software reaches beyond the realm of specialized uses such as HPC and genomics (which have focused primarily on computation) and impacts the broader world of applications that require real-time responses and interactions. These will be the mainstream applications and storage workloads that drive business transactions, cloud computing, databases and data analytics, as well as the interactive worlds of machine learning and the Internet of Things (IoT).

To move from these specialized use cases to general application usage, the software has to become simple to use and non-disruptive to applications. When it does, the impact will be massive: application performance, enterprise workloads and consolidation densities on virtual platforms and in cloud computing, all of which have been stifled by the growing gap between compute and I/O, will no longer be held back.

The real driver of change is the economic and productivity disruption. Today, many new applications such as analytics are not practical because they require hundreds if not thousands of servers to get the job done; yet each server is becoming capable of supporting hundreds of multi-threaded computing cores, and all of these engines are available to drive workloads but until now have sat idle, waiting for work to do. 2017 will usher in an era where one server will do the work of 10, or even 100, servers of the past. This will be realized with new parallel I/O software technologies that are easy to use, require no changes to applications and are capable of fully leveraging the power of multi-core processors to dramatically increase productivity and overcome the I/O bottleneck. This will lead to a revolution in productivity and put a new world of applications within the reach of mainstream IT in 2017.
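As a rough illustration of the principle being described (a generic sketch, not DataCore’s actual technology), the following Python example contrasts servicing I/O requests one at a time with fanning them out across worker threads on a multi-core server; the request count and the simulated 5 ms device latency are assumed figures chosen only for the example:

# Minimal sketch: serialized I/O handling versus parallel handling across
# worker threads. All numbers here are illustrative assumptions, not
# measurements of any real product.
import time
from concurrent.futures import ThreadPoolExecutor

SIMULATED_DEVICE_LATENCY = 0.005  # assumed 5 ms per I/O request

def handle_io_request(request_id: int) -> int:
    """Stand-in for servicing one storage I/O request."""
    time.sleep(SIMULATED_DEVICE_LATENCY)  # pretend we waited on the device
    return request_id

requests = range(200)

# Serialized: each request waits for the previous one to finish.
start = time.perf_counter()
for r in requests:
    handle_io_request(r)
serial_time = time.perf_counter() - start

# Parallel: many requests are in flight at once across worker threads.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=16) as pool:
    list(pool.map(handle_io_request, requests))
parallel_time = time.perf_counter() - start

print(f"serialized: {serial_time:.2f}s, parallel: {parallel_time:.2f}s")

In this toy model the serialized loop takes roughly one second while the threaded version finishes in a fraction of that, which is the same "one server doing the work of many" effect the interview describes, scaled down to a few lines of code.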

Can you elaborate on how this will specifically impact the problems that continue to impede the performance (and promise) of applications such as real-time analytics and big data?
In a world that requires interactions and transactions to happen at a far faster pace, with much faster response times, the key is the ability to do more work by doing it in parallel and to react quickly. The combination of faster response times and the multiplying effect of parallelization on productivity will fuel ‘real-time’ analytics, big data and database performance.

The ability to leverage all of the computing power of multi-core processors will propel real-time analytics and big data performance into the forefront by making them practical and affordable. The implications for productivity and for business decisions based on insights from data, in areas such as financial services, banking, retail, fraud detection, healthcare and genomics, as well as machine learning and IoT-type applications, will be profound.
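To make the analytics point concrete in the simplest possible terms (again a generic sketch, not DataCore’s implementation), the following Python example splits a synthetic list of transaction amounts into chunks and scores each chunk on a separate CPU core; the data and the flagging threshold are invented for the illustration:

# Minimal sketch of parallelizing an analytics job across CPU cores.
# The transaction data and scoring rule are made up; the point is only that
# independent chunks of data can be scored on separate cores instead of
# one after another.
from multiprocessing import Pool, cpu_count

def score_chunk(chunk):
    """Toy scoring rule: flag transactions above an arbitrary threshold."""
    return sum(1 for amount in chunk if amount > 900)

def chunked(data, n_chunks):
    size = max(1, len(data) // n_chunks)
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    transactions = [i % 1000 for i in range(1_000_000)]  # synthetic amounts
    chunks = chunked(transactions, cpu_count())

    # Each chunk is scored on its own core; partial results are combined.
    with Pool() as pool:
        flagged = sum(pool.map(score_chunk, chunks))

    print(f"flagged transactions: {flagged}")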

What do you think the impact of Microsoft technologies such as Azure Stack, Hybrid Cloud, Windows and SQL Server 2016 will be in 2017?
Microsoft was one of the first to recognize that the landscape will continue to be a mix of on-premise and cloud. The success of Microsoft’s Azure Cloud is already evident; however, the real impact will come from the larger strategy of how Microsoft has worked to reconcile the world of on-premise and cloud computing. For example, Microsoft’s Azure Stack now makes it seamless to get the benefits of cloud-like computing whether in the public cloud or within a private cloud, and it has become the model for hybrid cloud computing. Likewise, Microsoft continues to further integrate its Windows and server solutions to work more seamlessly with cloud capabilities.

One of the most dramatic changes at Microsoft has been how it has reinvented and transformed its database offerings into a true big data and analytics platform for the future. SQL Server 2016 has become far more powerful and capable, and now deals with all types of data. As a platform, it is primed to work with Microsoft’s large ecosystem of marketplace partners, including DataCore with its parallel processing innovations, to redefine what is possible in the enterprise, the cloud, and with big data performance and real-time analytics use cases for traditional business applications, as well as new developing use cases in machine learning, cognitive computing and the Internet of Things.

In your opinion, how has storage transformed in recent years?
The industry is in the midst of an inevitable and growing trend in which servers are defining what storage is: essentially, storage has transformed into servers plus software-defined infrastructure. This is because traditional storage systems can no longer keep up; as a result, they are increasingly being replaced by commodity servers and software-defined infrastructure solutions that can leverage the servers’ power to solve the growing storage problem. The storage function and associated data services are being driven by software and becoming another ‘application workload’ running on these cost-efficient server platforms, and this wave of flexible server-based storage systems is having a disruptive industry impact.

Marketed as server-SANs, virtual SANs, web-scale, scale-out and hyper-converged systems, they are a collection of standard off-the-shelf servers, flash cards and disk drives – but it is the software that truly defines their value differentiation. As a result, parallel processing software and the ability to leverage multi-core server technology will be the major game-changer. In combination with software-defined infrastructure, it will lead to a productivity revolution and further solidify ‘servers as the new storage.’
 
For example, parallel I/O software technologies have been used to power off-the-shelf multi-core servers that drive the world’s fastest storage systems in terms of performance, with the lowest latencies and best price-performance. For additional information, see the following report: Server SAN Readies for Enterprise and Cloud Domination.

What’s Beyond Flash?
Flash used to be the next big thing, but now it’s here. The market wants to know how we go even faster and do more with less. The answer seems obvious: if flash is here, yet performance and productivity are still an issue for many enterprise applications, especially databases, then we need to parallelize the I/O processing. Many compute engines working in parallel multiply what can be done: they remove bottlenecks and queuing delays higher up in the stack, near the application, so as much device-level I/O as possible is avoided and performance and response times go far beyond what any single device-level optimization such as flash/SSD alone can deliver. Combining flash and parallel I/O lets users drive more applications faster, do more work, and open up applications and use cases that were previously impossible.
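A back-of-envelope calculation in Python shows why parallelizing the I/O path still matters even when the flash device itself is already fast; the per-request service time and the number of outstanding requests below are assumed values, and the idealized model ignores coordination overhead:

# Rough queuing arithmetic: with one I/O channel, requests wait behind each
# other; spreading the same work across several cores/channels drains the
# queue proportionally faster. All figures are assumptions for illustration.
service_time_ms = 0.2        # assumed time to process one I/O near the application
outstanding_requests = 1000  # assumed depth of the request queue

for channels in (1, 4, 16, 64):
    # Idealized: no coordination overhead, perfectly divisible work.
    drain_time_ms = outstanding_requests * service_time_ms / channels
    print(f"{channels:>3} parallel channels -> queue drains in ~{drain_time_ms:.1f} ms")

Under these assumptions the same fast flash device sits behind a 200 ms backlog when requests are processed serially, but behind only a few milliseconds of backlog once the work is spread across dozens of cores, which is the queuing-delay argument made above.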

Hyper-convergence has been another key growth area in the market. How will we go beyond that?
Hyper-converged systems, which today are in essence a server plus a software-defined infrastructure, are frequently restricted in terms of performance and use cases, and too often lack the needed flexibility and a path for integration within the larger IT environment (for instance, not supporting Fibre Channel, which is often key to enterprise and database connectivity). Hyper-converged software will continue to grow in popularity, but to cement its success, users will need to be able to take full advantage of the ultimate productivity promise of hyper-convergence. The real objective is to achieve the most productivity at the lowest cost: hyper-productivity.

Better utilization of one’s storage and servers to drive applications is critical. Parallel processing software will enable users to take full advantage of what their hardware and software can do. For instance, powerful software-defined storage technologies that can do parallel I/O provide a higher level of flexibility and leverage the power of multi-core servers, so fewer nodes are needed, making them more cost-effective. Likewise, the software can incorporate existing flash and disk storage without creating additional silos; migrate and manage data across the entire storage infrastructure; and effectively utilize data stored in the cloud.

Data infrastructures, including hyper-converged systems, can benefit from advances in parallel I/O software that dramatically increase their productivity by tapping into the unused power that lies within standard multi-core servers.

In closing, what do you see as the next giant leap forward for 2017?
The next giant leap forward will be leveraging the multiplier impact of parallel processing on productivity. The world continues to go ‘software-defined’ in order to cost-effectively utilize off-the-shelf computing. The combination of powerful software and servers discussed earlier will drive greater functionality, more automation, and comprehensive services to productively manage and store data across the entire data infrastructure. Parallel processing software will unleash the power of unused computing cores that are all around us, and lead to a new era where its benefits can be applied universally.

These advances (which are already before us) are critical to solving the problems caused by slow I/O and inadequate response times that have been responsible for holding back application workload performance and cost savings from consolidation. Therefore, the advances in multi-core processing, parallel processing software and software-defined infrastructure, collectively, will redefine what is possible while opening up many new applications and use cases by enabling faster performance, real-time responsiveness, massive consolidation, and smaller footprints.
