2014 IT Predictions From Virtual Instruments
Surrounding data center, storage, virtualization and IT industry
This is a Press Release edited by StorageNewsletter.com on December 16, 2013 at 3:00 pm. Here are 2014 IT predictions from John Gentry, VP of marketing, Virtual Instruments, Inc.
His predictions include the company’s take on emerging trends surrounding the data center, storage, virtualization and the overall state of the IT industry.
Leaving the Public Cloud So Soon?
Public cloud providers have been able to deliver cost-effective, easy access to the cloud for businesses ranging from start-ups to global corporations. Although the public cloud has served many companies well to date, the primary use case has been non-business-critical functions and, in some cases, unauthorized use by rogue business units. As companies start to consider moving more business-critical workloads to the cloud, 2014 will be the year that many organizations adopt the private cloud to ensure that they remain in control and operate at peak performance. Although the public cloud offers many cost and scalability benefits, providers have not yet put solutions in place that give organizations in-depth visibility into the performance of the cloud infrastructure. Most public cloud providers focus SLAs on availability, or uptime. Increasingly, businesses want SLAs tied to specific application and business requirements because they want assurances around application response times. As a result, the private cloud will be embraced in the New Year as the best way to ensure performance.
IT’s New Look
IT infrastructures are more complex and interdependent than ever before. Hiccups in the infrastructure put business operations at risk. When issues occur, IT needs to be able to identify and remediate problems quickly. Recent surveys have found that many organizations are breaking down historical silos to create new cross-domain teams that bring together subject matter experts from across IT and the business, including application ‘owners.’ As the IT department converges and becomes a more streamlined and efficient unit, so too do the teams charged with managing its processes. Organizations with these teams in place are better able to identify infrastructure problems, remediate them and proactively mitigate risks. 2014 will see more organizations aligning around these teams to drive improved infrastructure performance and thus business agility.
Outages: Time to Stop Playing the Blame Game
When outages occur – like the recent LivingSocial ‘scheduled maintenance’ – companies fly into a panic, tracking down their vendors in order to figure out what went wrong. Device-specific tools collect, measure, and view data from only one device-dependent point of view. They don’t tell the whole story, and customers are left with a room full of suits pointing fingers at each other. In 2014, I see companies doing something about this to ensure that they have an unbiased view across the end-to-end infrastructure. So, what is the solution? Independent, vendor-agnostic tools that aggregate and correlate information to provide an accurate view of the real-time performance of systems across the hypervisor, server, network and storage layers. In the event of an outage, these tools can help diagnose the problem so teams can get on with fixing the issue, no finger-pointing needed.
The False-Start of Software-Defined Everything
The dynamic, flexible solution presented by ‘software-defined everything’ is, in theory, great for the business overall. Yet constant changes in networks, storage infrastructures, and data centers as a result of shifting application and user demands add complexity, dramatically increasing an organization’s risk profile. So many moving parts increase the chances that a transient problem – or a ‘ghost in the machine’ – will be harder to find and will have a significant impact on the business. As it stands, the shift toward software-defined solutions represents an evolution, not a revolution. I don’t think that 2014 will be remembered as the year that software-defined everything became ubiquitous in the enterprise.
Data Center Consolidation
Data center consolidation is a key strategy as IT works to reduce costs and deliver real value to organizations in both the public and private sectors. It is about more than just shutting down or closing data centers: it’s about driving standards, modernizing the infrastructure, improving asset utilization and streamlining business activities. I believe that throughout 2014, businesses will begin to take a more thorough look at data center activity in order to optimize business operations.