January 05, 2016
The Adoption of Software Defined Storage and Hyperconverged Infrastructure in 2016
This post is part of a series based on the forthcoming book, Building a Modern Data Center, written by Scott D. Lowe, David M. Davis and James Green of ActualTech Media in partnership with Atlantis Computing.
IT budgets are shrinking. Demands on IT are increasing. Data center technology has become a quagmire of complexity. Traditional storage has struggled to keep pace with workload demands. With these challenges, CIOs, technical decision makers, and IT staff members are looking for ways to continue meeting critical business needs with solutions that stabilize data center costs while also being simpler to manage. Perhaps the biggest challenges facing the data center today revolve around storage. It’s expensive. It’s complex. And, until flash became more common, it suffered a great deal with regard to performance.
Both software defined storage and hyperconverged infrastructure have emerged as solutions intended to solve the storage problem, and both have entered the market mainstream as serious options for consideration. Each brings heretofore unheard-of levels of simplicity while also helping to turn the data center economic picture on its head. Rather than buying three to five years’ worth of storage up front, data center administrators can take more of a “just in time” approach, thanks to the easy scalability opportunities these architectures present.
With great interest in these technologies, we (ActualTech Media) sought to understand what businesses think of each. To that end, we surveyed more than 1,200 IT professionals and business decision makers to get their thoughts around these technologies and how adopters are using them. What you will find below is a summary of some of the most interesting highlights from the survey.
HCIS and SDS are Coming Quickly
Deployment of software defined storage and hyperconverged integrated systems (HCIS) is happening in waves, most likely driven by existing hardware replacement cycles. Figure 1 shows that 17% of respondents say they will undertake deployments over the next year or so; over the next two years, that figure jumps to a cumulative 62%. Another 27% of respondents say they are uncertain about their deployment plans, which could mean either that they have not yet decided whether they will deploy at all or that they truly don’t know when a deployment might happen.
The Emergence of SDDC
As the data center continues to evolve, there's an emerging need for flexibility, agility, and control. "Web scale" brings challenges that aren't found in legacy or smaller infrastructures and that require new ways of approaching the data center. The current approach to addressing these issues is the "software-defined" approach, which refers to abstracting a physical data center resource from the underlying hardware and managing it with software. The example most IT professionals are familiar with is the virtualization of compute resources. Physical servers are no longer the container for data center systems; instead, their resources are provided and manipulated by software, and that is the new normal. The ability to create a new "server" with a few clicks, or to migrate a running workload between physical servers, is the essence of the software-defined approach.
The software-defined approach took hold with compute, but it is now starting to encompass all areas of the data center, which has led to the term software-defined data center (SDDC). The SDDC isn't any one thing specifically; rather, it describes a data center in which as many pieces as possible are abstracted into software. The SDDC is characterized by automation, orchestration, and the abstraction of resources into software and code. Code is more consistent than manual human processes, which means that, compared to a legacy data center, the SDDC can be more secure, more agile, and faster moving. A consequence of abstracting physical resources across the data center is that, all of a sudden, the hardware becomes substantially less important to the big picture.
Figure 1 - HCIS and SDS Deployment Timeframe
Say Goodbye to Disk
When we asked respondents about their storage plans two to three years out, the responses paint a bleak future for disk-based storage. A full 19% of respondents – almost 1 in 5 – say that they will fully decommission their disk-based storage systems over the next two to three years. Figure 2 shows that the primary gainers over the same timeframe will be all-flash arrays and hybrid storage arrays, but 35% also say that they will expand their use of software defined storage and hyperconverged infrastructure.
None of this comes as a major surprise. Flash storage has been plummeting in price for quite some time and is expected to hit price parity with disk within the next few years. Once raw price parity is achieved, expect to see spinning disk quickly fall off in terms of usage. Flash simply carries with it far too much performance potential when compared to disk.
Figure 2 - The Future of Storage in the Data Center
Sign up for your free eBook today: Building the Modern Data Center: Principles & Strategies of Design
Who Cares About Form Factor?
Hyperconverged infrastructure and software defined storage solutions are available either as software-only deployments or as hardware appliances that include the software, with different solutions available depending on customer needs. Software-only solutions provide more hardware flexibility, since the customer can size individual nodes specifically. Preconfigured hardware appliances offer a bit less resource flexibility but a simpler deployment experience. As you can see in Figure 3 below, among those who have an opinion, most prefer appliance-based solutions, but not by a wide margin. 57% of respondents are keeping their options open and considering both kinds of solutions.
Figure 3 - Respondent Thoughts on Form Factor
Data – Lots and Lots of Data
Perhaps one of the most serious technology challenges facing organizations is keeping up with the sheer growth of data. Figure 4 shows that most organizations are seeing a 10% to 30% annual data growth rate. However, a number of companies see much higher rates, even 50% or 100%. For these respondents, finding a storage solution that can scale easily and inexpensively is absolutely critical to maintaining a reasonable level of expense and application availability.
Figure 4 - Data Growth Rate
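To see why even "moderate" growth rates demand scalable storage, it helps to compound them over a few years. The short sketch below is purely illustrative: the 100 TB starting footprint is a hypothetical figure, not a survey data point, and only the growth rates come from the ranges reported above.

```python
# Illustrative only: compound the annual data growth rates reported in the
# survey to show how quickly capacity needs escalate over a few years.
# The 100 TB starting capacity is a hypothetical example.

def projected_capacity(start_tb: float, annual_growth: float, years: int) -> float:
    """Return projected storage capacity (TB) after compounding annual growth."""
    return start_tb * (1 + annual_growth) ** years

if __name__ == "__main__":
    start = 100.0  # assume a 100 TB starting footprint
    for rate in (0.10, 0.30, 0.50, 1.00):  # 10%, 30%, 50%, 100% annual growth
        after_3 = projected_capacity(start, rate, 3)
        print(f"{rate:>4.0%} annual growth: {after_3:.1f} TB after 3 years")
```

Even at the low end of the survey's common range, a 30% annual growth rate more than doubles capacity needs in three years, while 100% growth multiplies them eightfold.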
In the four charts below (Figure 5), we can see the data growth patterns for the four primary verticals under scrutiny in our survey. As you can tell, the data growth patterns are broadly similar: most organizations, regardless of vertical, fall primarily into the 10% to 30% data growth range and show some kind of peak around the 50% growth rate. Finance, though, is something of an outlier, with its “secondary peak” coming in at around 45% and a smaller third peak at 65%.
Figure 5 - Growth Rate by Vertical
We have drawn a few conclusions from our review of the survey results. The first is that, as you’ve seen, widespread deployment of hyperconverged infrastructure and software defined storage is coming very soon: in the next two years, 62% of our survey respondents plan to undertake deployments of one or both technologies. The second conclusion is that disk is dying; almost 1 in 5 survey respondents plan to fully decommission their disk-based storage systems in the next two to three years. Third, a majority of buyers are open to either software-only or appliance-based hyperconvergence; they will choose whichever solution best fits their needs, regardless of form factor. And finally, we concluded that the ability to scale data storage quickly and affordably is going to be critical. On average, organizations expect a 10% to 30% increase in storage needs annually, and some see much higher rates than that. It is imperative that storage in those data centers scales well enough to accommodate this rapid growth.
What you’ve seen here is only a subset of the full report. To read the report in its entirety, visit http://www2.atlantiscomputing.com/Survey2016.html
SCOTT D. LOWE – Contributor
Scott is Co-Founder of ActualTech Media and serves as Senior Content Editor and Strategist. Scott is an enterprise IT veteran with close to twenty years’ experience in senior and CIO roles across multiple large organizations. A 2015 VMware vExpert Award winner, Scott is also a micro-analyst for Wikibon and an InformationWeek Analytics contributor.