Hyperconverged Infrastructure

The State of HCI Today

Hyperconverged Infrastructure (HCI) has emerged as a rapidly growing, hotly contested portion of the IT infrastructure market. Although some would say that it has become a mature market, the intense focus on innovation suggests that the market is likely to continue to change at a rapid pace. A large number of players, including long-established suppliers and a host of newcomers, are pushing the technology forward and finding many areas in which to innovate.

Although there are many areas of focus for enhancement, the primary ones are improved use of functional intelligence, better integration of HCI into established computing environments, improved use of software-defined X and, of course, the continued improvement of processing, memory, storage and networking performance.

Early in their history, HCI solutions forced the central processing function to be heavily involved in every support function, including managing storage and networking. Intelligent networking and storage functions are increasingly available in HCI environments. This means that the central processing function only has to orchestrate the use of these other functions, rather than having to be involved in every aspect of them. This increasing reliance on functional intelligence means that HCI solutions are offering improved performance and better scaling.

Better Integration

Integration of HCI solutions has been improving across a number of dimensions, including bringing them into broad enterprise management frameworks, vendor-specific virtual computing environments, and even DevOps development environments. Each step in this integration process means that HCI can easily fit into established enterprise computing environments. Not only does this trend indicate that HCI can be used to support new workloads, it also means that organizations can more easily migrate established workloads onto these platforms when it’s time to refresh the underlying hardware.

Improved Use of Software-Defined X

Many vendors have been marketing their virtual environments under the banner "software-defined." At first, these vendors would claim that they were supporting a software-defined function after simply placing the function into an artificial/virtual environment. Since many of the higher-level functions of a software-defined environment had not yet been developed, these claims tended to dilute the concept in the minds of decision makers.

The next step was to make it possible for those artificial/virtual computing environments to be provisioned and administered programmatically. Once the suppliers allowed their technology to be monitored and controlled using an API, it became possible for those environments to operate within guidelines, policies and company-defined constraints.
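To make the idea of policy-governed, programmatic provisioning concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the policy limits, the `VMRequest` shape and the `provision` function stand in for whatever a given HCI platform's management API actually exposes; a real deployment would replace the print statements with calls to that vendor's REST endpoints.

```python
from dataclasses import dataclass

# Hypothetical, company-defined constraints. Real HCI platforms expose
# similar guardrails (quotas, allowed storage tiers) through their
# management APIs; the specific names and limits here are illustrative.
MAX_VCPUS = 16
MAX_MEMORY_GB = 128
ALLOWED_TIERS = {"standard", "flash"}

@dataclass
class VMRequest:
    """A simplified stand-in for a provisioning request payload."""
    name: str
    vcpus: int
    memory_gb: int
    storage_tier: str

def policy_violations(req: VMRequest) -> list[str]:
    """Return a list of policy violations; an empty list means allowed."""
    problems = []
    if req.vcpus > MAX_VCPUS:
        problems.append(f"{req.name}: {req.vcpus} vCPUs exceeds limit of {MAX_VCPUS}")
    if req.memory_gb > MAX_MEMORY_GB:
        problems.append(f"{req.name}: {req.memory_gb} GB exceeds limit of {MAX_MEMORY_GB}")
    if req.storage_tier not in ALLOWED_TIERS:
        problems.append(f"{req.name}: storage tier '{req.storage_tier}' is not permitted")
    return problems

def provision(req: VMRequest) -> bool:
    """Gate the provisioning call behind the policy check.

    In a real environment the success path would issue an API call
    (e.g. an HTTP POST to the platform's VM-creation endpoint); here
    we only report the decision.
    """
    problems = policy_violations(req)
    if problems:
        for p in problems:
            print("rejected:", p)
        return False
    print(f"provisioned: {req.name}")
    return True
```

The point of the sketch is the shape of the workflow, not the specific checks: because the environment is controllable through an API, the organization's guidelines can be enforced in code before any resource is created, rather than by manual review.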

HCI solutions increasingly rely upon the use of software-defined storage and networking functions. They also increasingly have been made to fit within the broader software-defined data center tools offered by many suppliers.

Improved Performance

As the industry as a whole gets faster, HCI follows along. Companies are adopting faster processors, larger and faster internal memory systems, faster internal networking structures and faster and more intelligent storage systems.

Each new product announcement brings with it improved performance, improvements in scalability and better cost-of-ownership numbers.

So, if we sort through all of the claims made by HCI suppliers, simplification and cost savings appear to be at the top of their lists. This usually means that if an enterprise chooses to host its workloads on the vendor's products, the overall cost of hardware, development and administration will be lower.

Increased scalability is a claim presented by many of these suppliers as well, although what configurations are being compared isn’t always immediately clear.

Improved performance is another claim made by these suppliers. Usually, their claims are supported through the vendor’s use of better internal networks, faster internal memory and flash-based storage.

Ease of management is proudly proclaimed by nearly all suppliers. It isn’t clear, in some cases, whether this claim is based on an environment that only includes the vendor’s systems, or if a broader and more complex environment is also included.

In spite of these gains, problems remain. Most suppliers don't discuss potential drawbacks such as interoperability with established environments, or the fact that they may be a startup without established purchasing and support agreements. They may also gloss over the fact that their solutions require their own proprietary systems software, processor cards, memory, storage and networking adapters.

These potential drawbacks have been diminishing rapidly, however.

HCI Is Now Mainstream

Analyst firms continue to note the strength of HCI offerings in the market. IDC’s Q1 2018 report on the market indicated revenue growth for HCI vendors of 78.3%. Gartner’s report, “Data Centers: Global Industry Outlook & Forecast 2018-2023 – Adoption of Hyperconverged Infrastructure to Have a High Impact on Growth” said that “The adoption of hyperconverged infrastructure will have a higher impact on the growth of global data center market because it enables operating software defined data center (SDDC) environments.”

These and other reports from research firms support the positive view of HCI and its potential.

All this indicates that HCI has become a mainstream part of IT infrastructure. If vendors continue to deliver on these benefits, continued increases in adoption are expected. It's clear, however, that enterprises are going to demand that hardware and software lock-in be addressed if this vision of the future is to actually take place.

In the near term, we can expect better integration with enterprise computing environments; development tools and applications that are better able to utilize these solutions; and lower cost structures through the use of management tools that incorporate machine learning and predictive analysis.