particles - where science matters
  • Why we need a sustainable solution to cool data centers

    By Kelly Hall, 3M Storyteller

    Inside a data center

    • Starting the moment we wake up, many of us feel the need to plug in.

      In the U.S., we spend more than five hours a day on our smartphones – and 80 percent of smartphone users check their phones within 15 minutes of waking up. In Poland, almost three-quarters of people put their phones next to them when they go to sleep.

      And that’s just our phones. Our increasing use of data from all our internet-connected devices is a global trend.

      With this constant use of data, it’s no surprise that data centers are working hard to keep up. “This phenomenon of more and more data comes with great opportunity, but also challenges,” says Laura Nereng, Electronics and Energy Business Group sustainability leader at 3M.

    • Pie chart illustrating the breakdown of energy demand in data centers
      Source: https://www.clarke-energy.com/natural-gas/data-centre-chp-trigeneration/

      One of these challenges? The massive amount of energy that data centers consume in order to operate. In fact, the biggest environmental impact of data centers is their energy use. The Department of Energy’s Lawrence Berkeley National Laboratory estimates that data centers account for about 2 percent of U.S. electricity use, which is up from 0.8 percent in 2000. “If you think about it, 2 percent sounds like a small number, but that’s more energy than most states use,” says Dale Sartor, scientist at the Lawrence Berkeley National Laboratory.

      Here’s why.

      “Data centers are always on. They are always consuming energy,” says Lucas Beran, senior research analyst with IHS Markit.

      Studies estimate that by 2020, U.S. data centers will consume about 73 billion kilowatt-hours a year – equivalent to the energy used by 7 million households.
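
      As a rough check on that comparison, here's a quick sketch assuming a typical U.S. household uses around 10,500 kilowatt-hours per year (an illustrative average, not a figure from the article):

      ```python
      # Rough sanity check: 73 billion kWh spread across typical households.
      # The ~10,500 kWh/year per household is an assumed U.S. average.
      data_center_kwh = 73e9     # projected annual data center use, kWh
      household_kwh = 10_500     # assumed annual use of one household, kWh

      households = data_center_kwh / household_kwh
      print(f"Equivalent households: {households / 1e6:.1f} million")  # ~7.0
      ```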

      This is a worldwide concern. “The energy consumption of data centers in Europe was about equivalent to that of the entire country of Portugal,” says Zahl Limbuwala, founder of Romonet, a company that provides predictive analytics software services to data centers.

      This energy consumption is only getting larger as the data center industry sees a significant increase in power density, or watts per rack of servers. “That power density is heat,” explains Bruce Taylor, executive vice president of Data Center Dynamics, North America. “If you go back 10 years, the average enterprise data center ran well under five kilowatts per rack of server computers,” he says. “Today’s racks are frequently 15 kilowatts per rack and higher. There is a desire to go even higher – to get to 100 kilowatts per rack.”

      That density equates to more computing power in a smaller footprint.
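
      To see what that trend means in floor space, here is a minimal sketch. The facility load and per-rack footprint below are illustrative assumptions; only the kilowatt-per-rack figures come from the article:

      ```python
      # Illustrative only: how higher rack density shrinks the footprint
      # for a fixed total IT load. The 3 MW load and 25 sq ft per rack
      # (including aisle space) are assumptions, not article figures.
      # Essentially every kilowatt a rack draws is dissipated as heat.
      total_it_load_kw = 3_000
      rack_footprint_sqft = 25

      for density_kw in (5, 15, 100):   # the densities quoted above
          racks = total_it_load_kw / density_kw
          floor = racks * rack_footprint_sqft
          print(f"{density_kw:>3} kW/rack -> {racks:>5.0f} racks, {floor:>6.0f} sq ft")
      ```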

    • Cooling data centers

      When it comes to data center operations, cooling the electronics is the largest part of energy costs. “Cooling is the biggest chunk of the electricity needed in the operation,” explains Laura. “About 38 percent of the electricity needed to run the operation is just to cool the electronics,” she says. “That’s the thing we want to address. That’s the system inefficiency in this case.”
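
      To make that figure concrete, a quick sketch – the 10-megawatt facility load is an assumed example; only the 38 percent share comes from the article:

      ```python
      # Sketch of the 38% cooling share quoted above. The 10 MW facility
      # load is an assumed example, not an article figure.
      facility_load_mw = 10.0
      cooling_fraction = 0.38

      cooling_mw = facility_load_mw * cooling_fraction
      print(f"Cooling: {cooling_mw:.1f} MW of a {facility_load_mw:.0f} MW facility")
      # -> 3.8 MW spent just on cooling; this is the inefficiency to attack.
      ```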

      So, how do you solve for this when cooling is necessary for data centers to function?

      “The chips within supercomputers generate a tremendous amount of heat, and without efficient cooling, the temperatures within the devices rise and they become less efficient,” says Phil Tuma, advanced application development specialist who works with heat transfer fluids for data centers at 3M. “At some point, they simply won’t operate without cooling.”

    • “We’re coming to a crossroads where we’re going to need something new and different to help maintain and reduce that energy footprint.” - Lucas Beran, senior research analyst, IHS Markit

      Cooling innovations

      That’s why companies across the globe are looking for innovative ways to cool data centers. “Owners and operators of data centers are becoming more concerned with how much the energy costs and what can be done to save this energy,” says Berkeley Lab’s Dale.

      “Everything we’re doing to minimize the energy footprint of data centers is great, and we’re continuing to make strides – but we’re coming to a crossroads where we’re going to need something new and different to help maintain and reduce that energy footprint,” adds Lucas, of IHS Markit.

      From burying data centers underground to siting them near the edge of the Arctic Circle, innovators are getting creative in their search for ways to cool data centers.

      “It’s interesting to watch the lengths to which we’re going to try to cool data centers efficiently,” says 3M’s Laura. “These ideas are symptoms of the fact that a cooling technology is needed.”

      While some of these solutions are unique, Laura says they won’t fully address the needs of the future. “Knowing where the data will need to be located to serve real-time, low-latency applications, data centers can’t be underground somewhere in the middle of nowhere,” she says. “As the applications develop more and more, there will be a need for data centers to be in the cities.” When commuting in an autonomous vehicle, for instance, you won’t want the vehicle to be controlled by a data center that’s far away. The information that allows the vehicle to function will need to be received rapidly from a nearby source.

    • An IT specialist reaches into the floor of a data center, where fans are stored

      Air cooling

      Most industry leaders are utilizing air cooling to cool data centers. “Air cooling is the predominant method of cooling in data centers,” says Phil. “Air flows through the servers. It heats up and is typically recirculated and cooled by refrigeration.”

      “The majority of data centers use what we call ‘CRAC units’ – computer room air conditioners. These are direct expansion, refrigerant- and compressor-based air conditioners that extract the heat out of a data center and pump cold air into the underfloor. Then, that cold air is distributed to the IT equipment,” adds Dale, of Berkeley Lab.

      Air cooling may serve smaller data centers and supercomputers well, but for facilities of a million square feet or more, it may not be the most efficient solution.

      “There is no question that there is a limit to what you can do with air cooling,” says Bruce. “While there is a lot of trust built up in the engineering around air cooling, it is still by nature inefficient.”

    • Inside a data center

      Water cooling

      Data center industry experts and scientists are working together to advance other solutions. Some are utilizing liquid cooling in their designs – which includes a few options, like water cooling. “Water cooling can still be highly effective. It’s how we cooled mainframe computers in 1970,” says Bruce.

      Water is pumped through plumbing networks into the servers, where it reaches the heat-generating devices. “The water heats up and is pumped outside the server, cooled off and recirculated,” explains Phil, of 3M.

      Just like air cooling, water cooling has its drawbacks. “There are limitations to the power density that can be achieved with water cooling,” says Phil. “There can be a risk of leakage and water spilling out into the equipment, perhaps causing short circuits.”

      Leakage isn’t the only issue. Water is often evaporated to provide cooling. “Water is very efficient with cooling data centers, but there is concern with the amount of water that data centers are using,” says Lucas.

      Studies show that U.S. data centers use an estimated 165 billion gallons of water per year. That’s enough water to fill up eight million average-sized swimming pools.
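
      A quick check on the swimming-pool comparison, assuming an average pool holds roughly 20,000 gallons (an illustrative figure, not one from the article):

      ```python
      # Sanity check on the pool comparison. Pool volume is an assumed
      # typical value for a backyard pool.
      annual_water_gal = 165e9   # estimated U.S. data center water use
      pool_gal = 20_000          # assumed volume of an average pool

      pools = annual_water_gal / pool_gal
      print(f"Pools per year: {pools / 1e6:.1f} million")  # ~8 million
      ```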

    • Immersion cooling

      Immersion cooling

      This is why the data center industry is exploring other cooling solutions – like liquid immersion cooling with fluids other than water.

      “When it’s perfected, fluid in theory should be far more resilient than air,” says Bruce, of Data Center Dynamics. “It has fewer mechanical parts and fewer electronics. Everything about it is at a different scale.”

      We know what you’re thinking – fluids and servers shouldn’t mix – but 3M scientists utilize immersion cooling with non-conductive fluids. These fluids can touch the IT equipment and cool it directly. “You can’t get any closer in terms of touch than an immersion cooling system,” explains Dale.

      The liquid is moved through the server not by a fan, but by a pump. As it warms up, it exits the server and is recirculated and cooled. Then, it flows back to the server. “This is what we would call ‘single-phase immersion,’” says Phil. “The liquid does not actually change state. It remains a liquid through the entire process.”

      The benefits? “It’s really ideal technology for today’s power densities,” says Phil. “If you’re going to be adapting hardware that was designed to be cooled by air, it typically takes a lot of liquid to submerge it. Single-phase immersion is very efficient for cooling that type of hardware.”

      Phil says that as the data center industry looks to much higher power densities, two-phase immersion cooling becomes attractive. “There is a density at which single-phase immersion is no longer practical, because the engineering and pump power required to ensure that the liquid goes where it needs to go becomes too great,” says Phil. “So, passive immersion by two-phase methods becomes much more attractive at that density.”
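
      A back-of-envelope sketch of why pumping becomes the bottleneck: in a single-phase loop, the required coolant flow follows Q = ṁ · cp · ΔT, so flow – and pump work with it – grows in step with rack power. The fluid properties below are illustrative assumptions for a dielectric coolant, not published specifications:

      ```python
      # Sketch: coolant flow needed to carry rack heat in a single-phase
      # loop, from Q = m_dot * cp * dT. Property values are illustrative
      # assumptions for a dielectric coolant, not published 3M data.
      cp_kj_per_kg_k = 1.1   # assumed specific heat, kJ/(kg*K)
      delta_t_k = 10.0       # assumed allowable coolant temperature rise, K

      for rack_kw in (5, 15, 100):
          m_dot = rack_kw / (cp_kj_per_kg_k * delta_t_k)   # kg/s
          print(f"{rack_kw:>3} kW rack -> {m_dot:5.2f} kg/s of coolant")
      # Flow, and with it pump work, grows in step with power density.
      ```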

      With two-phase immersion, a phase change occurs. The electronics are submerged into a bath of liquid. The liquid boils off of the heat-generating devices. “The liquid is changing state from liquid to vapor, and that process typically takes place passively. The vapor that’s generated is condensed by a heat exchanger and naturally falls back into the bath,” explains Phil.

      This phase change is more energy efficient than a single-phase process.
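
      The reason comes down to latent versus sensible heat: boiling absorbs far more heat per kilogram of fluid than simply warming a liquid does. A minimal comparison, using illustrative property values in the general range of engineered dielectric fluids (not published 3M figures):

      ```python
      # Why boiling moves more heat per kilogram than warming a liquid.
      # Property values are illustrative assumptions for a dielectric
      # two-phase fluid, not published 3M specifications.
      cp = 1.1         # specific heat, kJ/(kg*K)
      delta_t = 10.0   # sensible temperature rise in a single-phase loop, K
      h_fg = 100.0     # assumed latent heat of vaporization, kJ/kg

      sensible = cp * delta_t   # heat absorbed per kg without a phase change
      latent = h_fg             # heat absorbed per kg by boiling alone

      print(f"Single-phase: {sensible:.0f} kJ/kg; two-phase: {latent:.0f} kJ/kg")
      print(f"Roughly {latent / sensible:.0f}x more heat per kg of fluid")
      # The vapor then condenses on a heat exchanger and falls back into
      # the bath passively, with no pump work required.
      ```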

      “An immersion-cooled system using 3M fluids is designed to take 100 percent of the space and put it in 10 percent of the space. Instead of spreading out the electronics so you can air cool them, you pack the electronics together, because you can cool them with this 3M fluid. It can reduce the electricity usage for cooling by up to 95 percent,” explains Laura. “It’s a practical solution that can be used in any geography.”
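
      Combining that claim with the 38 percent cooling share cited earlier gives a rough upper bound on the facility-level effect – a back-of-envelope sketch, not a 3M figure:

      ```python
      # Back-of-envelope: facility-level effect if cooling electricity
      # (about 38% of the total, per the article) is cut by up to 95%.
      cooling_share = 0.38
      cooling_reduction = 0.95

      facility_savings = cooling_share * cooling_reduction
      print(f"Total facility electricity saved: ~{facility_savings:.0%}")  # ~36%
      ```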

    • The future of liquid immersion cooling

      Phil believes that liquid immersion cooling is catching on because it provides benefits that other technologies cannot.

      “You have best-in-class power density coupled with best-in-class energy efficiency and dramatically simplified design,” says Phil. “Those things should bode well for the technology in the future.”

      Where will this lead us? You may see immersion cooling making its mark in high-performance computing.

      “We’re already working with partners who have the capability to build very high-density graphics processing unit (GPU) clusters,” says Phil. Because GPUs can be densely packaged today, these clusters offer a glimpse of future deployments in machine learning, artificial intelligence, blockchain and autonomous vehicles. “Immersion cooling will enable those types of technologies to deploy in high-density environments where energy and real estate costs are high,” says Phil. “Those are the applications where we believe immersion cooling is going to debut next.”

      Note: The views and opinions expressed do not necessarily state or reflect those of the United States Government, the Department of Energy, The Regents of the University of California or the Lawrence Berkeley National Laboratory, and shall not be used for advertising or product endorsement purposes.