Back in February 2010, then-US government CIO Vivek Kundra proposed to close hundreds of federal data centers, giving an early boost to a developing trend in the IT industry. Data center consolidation in the private sector had already started by then, driven primarily by the advent of server virtualization.
Since then, many companies have followed suit, looking to save energy and real estate costs by shrinking their data center footprint and moving workloads to the cloud. Today, some experts say there are potentially millions of data centers still to be closed, while others think the trend has peaked.
Data center consolidation “still has a long way to go,” said Mike Leone, senior analyst at Enterprise Strategy Group.
According to IDC, the number of data centers worldwide climbed until 2015, peaking at 8.55 million. In a forecast published last year, the firm predicted the total would fall to 8.4 million in 2017 and to 7.2 million by 2021.
In the US government, the work is not yet complete. Eight years after Kundra launched the effort, many agencies still have not achieved the closure goals set by former President Barack Obama’s Office of Management and Budget.
In August 2016, OMB issued a new policy, shifting its focus to data center optimization in addition to consolidation. Still, agencies have fallen short of closure goals, shuttering 4,388 data centers out of a targeted 5,597, according to a US Government Accountability Office report from last May.
The Second Wave
The initial wave of consolidation virtualized existing infrastructure, which helped to reduce organizations’ costs, Leone said. But a second wave of consolidation is now focusing on “not only reducing the hardware footprint, but accelerating virtualization adoption to deliver a cloud-like experience on-premises,” he added. This second wave is creating converged and hyper-converged IT infrastructures, making “it easier to virtualize, easier to manage, easier to scale, and do it all cost-effectively.”
In the past, many organizations consolidated data centers they acquired through mergers or geographic expansion, added Richard Villars, VP for data centers and the cloud at IDC. “They’d tend to close smaller facilities and build a larger, more centralized one,” he said.
But now, with companies embracing virtualization, solid state storage, and cloud technologies, a new cycle of data center consolidation is happening, Villars said. “The technologies, along with the move to SaaS, are highly deflationary when it comes to need for IT hardware — and therefore space — in existing data centers,” he said.
Companies embracing the cloud are eliminating the pressure from new workloads to build new data centers, he added.
“Rather than consolidating lots of smaller data centers into bigger ones, now companies can consolidate small and large data centers into smaller ones, or, as many are doing, opt out of owning data center space altogether … and just rent space in a colocation facility that has newer” infrastructure, Villars said.
Enterprise companies have a lot of room to move their computing workloads from on-premises facilities, added Matthew Brisse, research VP at Gartner.
This year, more than 60 percent of enterprise workloads are still running on-premises, with that share projected to fall to about one-third by 2021, he said. Another third will run in the cloud, with the remainder running off-premises in colocation data centers and managed hosting services, Gartner predicted. On-premises enterprise workloads should fall below 20 percent by 2024, the market research firm says.
The cost of upgrading aging data centers will also drive businesses to consider other options, said Chris Bihary, CEO and co-founder of Garland Technology, a network access hardware provider. Many of the data center upgrade costs “will simply be too much for many small to mid-size enterprises,” he said.
Many businesses will consider hosted or cloud services as their data centers become too expensive to upgrade, Bihary said, with those costs “perhaps forcing them to turn to larger service providers or mega data centers that can handle multiple petabytes of data storage.”
Density Takes Center Stage
While data center closures will continue, many organizations beyond the federal government are focused more on optimization, said Gartner’s Brisse. At most enterprises, only a “quarter to a half” of the server racks in their data centers are full or close to full, he said.
“Your efficiency that way is degraded,” Brisse said. “There’s a new metric in town – it’s density.”
Some data center experts believe that change in focus will slow down consolidation. With “the name of the game” now optimization or density, data center consolidation may have already peaked, said Mark Gaydos, chief marketing officer for Nlyte Software, a data center infrastructure management vendor.
“Most organizations have gotten all they can out of consolidation and are trying to maximize the use and minimize the cost of their remaining resources,” he said. “This is all happening while the demand for compute capacity continues to grow.”
For some organizations, focusing on optimization makes the most sense, while others would benefit from cloud computing. One size does not fit all.
In some cases, maintaining existing data centers is the best option, said IDC’s Villars.
“When an enterprise decides to keep, update, or add a data center in a specific location, it’s likely to have a very good reason,” he said. “Decisions by enterprises to maintain their own data centers come down to the critical issue of location … tied to concerns about latency, availability, or specific regulatory requirements.”
Many in-house data centers will run “next-generation workloads” related to the Internet of Things, robotics, virtual reality, or machine learning, he added. These specialized facilities are often denser and managed using artificial intelligence.
Some industries like retail and transportation are starting to embrace these on-premises smart data rooms, but the trend is still “lost in the noise” of the larger data center market, Villars added.