Making data centres more energy efficient could markedly reduce global electricity consumption and thereby contribute to sustainable development. The work is complex and a slew of factors must be taken into account, but there is much to gain from letting artificial intelligence control the operation of data centres. At the ICE data centre in Luleå, RISE is conducting research aimed at identifying new, energy-efficient ways of running data centres.
With a global electricity consumption of 2,000 TWh – approximately 10 per cent of the world’s electricity production – the IT sector is one of the largest electricity consumers on the planet*. The trend is accelerating: that share is projected to rise to just over 20 per cent by 2030, largely because electricity consumption in data centres is expected to quadruple over the same period.
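A quick back-of-envelope calculation shows how these figures hang together. This is purely illustrative arithmetic based on the numbers quoted above; the 2030 share is a projection, and the sketch assumes a roughly similar level of world electricity production:

```python
# Back-of-envelope arithmetic from the figures quoted in the article
# (illustrative only; not an official forecast).
it_consumption_twh = 2000      # IT sector consumption today, TWh/year
it_share_today = 0.10          # ~10 % of world electricity production

# Implied world electricity production:
world_production_twh = it_consumption_twh / it_share_today   # ~20,000 TWh

# If the IT share rises to just over 20 % of a similar production level,
# IT consumption would roughly double:
projected_share_2030 = 0.20
projected_it_twh = world_production_twh * projected_share_2030  # ~4,000 TWh

print(f"Implied world production: ~{world_production_twh:.0f} TWh")
print(f"Projected IT consumption by 2030: ~{projected_it_twh:.0f} TWh")
```
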
“There’s a lot of talk about the impact of aviation on the climate and its associated carbon emissions,” says Jonas Gustafsson, researcher at RISE. “But if we look at the IT sector and its environmental impact, we see that it’s on the same level.”
In other words, there is major potential for substantial gains by making data centres more energy efficient, but the process is complex and involves an array of diverse parameters and operators that must be taken into account. Machine learning and artificial intelligence have opened up opportunities here, making it possible to control not only the data centre itself but also the power-hungry computing processes and server loads.
“We’ve looked at the fans controlling building cooling and the fans inside individual servers in order to determine how we can get them to interoperate,” says Gustafsson. “We believe that data centre owners need to take a combined view of IT and building cooling.”
“In the next stage, we want to incorporate weather forecasts, electricity prices, and expected loads,” adds fellow RISE researcher Rickard Brännvall.
Unofficial world record
Research into energy efficiency at ICE has been successful, to say the least. In October 2019, the team set an unofficial world record in power usage effectiveness (PUE), achieving a value of 1.007 – just 0.7 per cent above the ideal result of 1.0.
“PUE describes how much of the electricity purchased goes to running the servers rather than to cooling and other overheads,” explains Gustafsson. “We were assisted by Fraunhofer in artificially loading the servers, but the results are impressive nonetheless. By comparison, Facebook’s data centre in Luleå has a PUE of about 1.1, and a Supermicro study published in December 2018 showed an average of 1.89 among respondents.”
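In other words, PUE is the ratio of total facility energy to the energy consumed by the IT equipment alone, so an ideal data centre scores exactly 1.0. A minimal sketch of the calculation (the kWh figures are invented for illustration, not measurements from the record run):

```python
# Power usage effectiveness: total facility energy / IT equipment energy.
# A PUE of 1.007 means only 0.7 % of purchased electricity went to
# cooling and other overheads rather than to the servers themselves.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total energy drawn by the facility / energy used by IT gear."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical numbers: servers draw 100.0 kWh while the whole facility
# purchases 100.7 kWh.
print(round(pue(100.7, 100.0), 3))   # 1.007
# For comparison, the 2018 survey average of 1.89 means nearly half of
# the purchased electricity never reached the IT equipment.
```
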
Data centres closer to users
Demand for computational power will increase as data-driven services become more ubiquitous. Meeting this demand requires new types of data centres located closer to users – in big cities, for example – where the control of energy consumption and computational resources will be even more important.
“We have developed what we refer to as an ‘edge model’: a smaller data centre with access to both solar cells and batteries,” says Brännvall. “For these data centres to be used in cities with major diurnal variations in data traffic, computational needs and electricity consumption, we are developing models that enable data-driven control of cooling, for instance.”
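The article does not detail RISE's edge model, but the idea of scheduling cooling against a diurnal load pattern can be illustrated with a toy sketch. Everything here – the load curve, the peak hour, the coefficient of performance – is a hypothetical stand-in, not the researchers' actual model:

```python
import math

# Toy diurnal load model for a small "edge" data centre: traffic peaks in
# the afternoon (hour 15) and bottoms out at night. All parameters are
# invented for illustration.

def expected_load(hour: float, base_kw: float = 20.0, swing_kw: float = 10.0) -> float:
    """IT load in kW as a simple sinusoid over the 24-hour day."""
    return base_kw + swing_kw * math.sin((hour - 9) / 24 * 2 * math.pi)

def cooling_setpoint_kw(hour: float, cop: float = 4.0) -> float:
    """Cooling power to schedule, assuming all IT power becomes heat
    and the cooling system has the given coefficient of performance."""
    return expected_load(hour) / cop
```

A data-driven version would replace the fixed sinusoid with a model fitted to measured traffic, and could shift battery charging and pre-cooling toward hours when solar production is high.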
Artificial intelligence can thus be a powerful tool for energy saving in future data centres. However, the technology itself requires an abundance of computational power, which in turn requires energy and additional data centre capacity. According to Gustafsson, this development makes energy efficiency increasingly important:
“AI consumes energy, so we really need to understand how to build and operate future data centres.”