The Challenges Generative AI Poses for Data Centres

Artificial Intelligence (AI) has been described as a transformative technology with the potential to optimise operations, boost productivity, support medical advancements, and address some of the major challenges faced by Australian industries today, including energy efficiency and sustainability.

Many businesses in Australia are already investing in AI models and chatbots to automate customer service, analyse data, and handle repetitive tasks. However, implementing AI is not without its challenges. Some large-scale AI projects have been deemed costly failures, with results falling short of expectations. This highlights the need for businesses to set clear goals and fully understand the problems AI is capable of solving before committing to extensive development.

Although AI can perform a wide range of tasks, and large language models (LLMs) can produce impressive content, the technology has limitations. AI models use data and algorithms to make predictions based on patterns, but incorrect assumptions or incomplete training data can lead to plausible-sounding yet false outputs, often referred to as hallucinations. As a result, human oversight remains critical, especially when automating vital business functions.
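
As a simple illustration of what that oversight can look like in practice, the sketch below places a human-review gate in front of an automated reply. It is a minimal Python sketch with assumed names and figures: generate_reply, the confidence score, and the 0.8 threshold are illustrative placeholders rather than references to any particular product.

```python
# Minimal sketch of a human-review gate in front of an automated reply.
# generate_reply is a hypothetical stand-in for an LLM call; the
# confidence score and the 0.8 threshold are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class DraftReply:
    text: str
    confidence: float  # heuristic or model-derived score in [0, 1]


def generate_reply(query: str) -> DraftReply:
    # Placeholder: in practice this would call an LLM or chatbot service.
    return DraftReply(text=f"Automated answer to: {query}", confidence=0.62)


def handle_query(query: str, threshold: float = 0.8) -> str:
    draft = generate_reply(query)
    if draft.confidence >= threshold:
        return draft.text  # confident enough to send automatically
    return f"[QUEUED FOR HUMAN REVIEW] {draft.text}"  # escalate to a person


if __name__ == "__main__":
    print(handle_query("When will my order arrive?"))
```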

Successful AI deployment depends heavily on the capacity of data centres to support the training of these models, which is driving advancements in data centre infrastructure across Australia.

Managing AI Training Demands in Australian Data Centres

Training AI models requires vast amounts of data and significant processing power. Generative AI models, for instance, use neural networks with billions of parameters and rely heavily on Graphics Processing Units (GPUs) for training. GPUs are notorious for their high energy consumption, and the rapid expansion of AI workloads is driving projections of significantly increased energy demand in data centres.
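
To put that energy demand in perspective, here is a minimal back-of-envelope sketch in Python. The GPU count, per-GPU power draw, training duration, and PUE figure are illustrative assumptions only, not measurements from any particular facility or model.

```python
# Back-of-envelope estimate of the electrical energy consumed by a GPU
# training run. All input figures are illustrative assumptions.

def training_energy_mwh(num_gpus: int,
                        gpu_power_kw: float,
                        days: float,
                        pue: float = 1.5) -> float:
    """Estimate facility-level energy in MWh for a training run.

    pue (power usage effectiveness) scales the IT load up to account for
    cooling and other overheads in the data centre.
    """
    it_energy_kwh = num_gpus * gpu_power_kw * days * 24
    return it_energy_kwh * pue / 1000  # convert kWh to MWh


if __name__ == "__main__":
    # Example: 1,000 GPUs at ~0.7 kW each, running for 30 days, PUE 1.5
    print(f"{training_energy_mwh(1000, 0.7, 30):,.0f} MWh")
```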

Data centres must also contend with the rising thermal loads generated by the intensive processing involved in AI training. As electrical components shrink and rack densities increase, heat is concentrated in an ever smaller space, and removing it effectively is essential to avoid damaging sensitive electronic components.

For high-density data centres, where rack loads exceed 40-50 kW, liquid cooling systems are increasingly being deployed to deliver targeted cooling. These systems are energy efficient and help reduce a data centre’s carbon footprint, making them a valuable solution for sustainability-focused facilities. Additionally, they offer scalability to accommodate future expansion.
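
A rough heat-balance calculation helps explain why air cooling struggles at these rack densities. The sketch below applies the standard relation Q = ṁ·cp·ΔT to compare the air and water flow needed to remove a given rack load; the fluid properties and assumed temperature rises are typical illustrative values, so the figures are indicative only.

```python
# Rough comparison of the air vs. water flow needed to remove a given
# rack heat load, using Q = m_dot * cp * delta_T. Fluid properties and
# the assumed temperature rises are typical values, chosen for
# illustration only.

AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1.005           # kJ/(kg*K)
WATER_DENSITY = 998.0    # kg/m^3
WATER_CP = 4.18          # kJ/(kg*K)


def air_flow_m3_per_h(load_kw: float, delta_t: float = 12.0) -> float:
    """Volumetric airflow (m^3/h) to remove load_kw with a delta_t rise."""
    mass_flow = load_kw / (AIR_CP * delta_t)          # kg/s
    return mass_flow / AIR_DENSITY * 3600             # m^3/h


def water_flow_l_per_min(load_kw: float, delta_t: float = 10.0) -> float:
    """Water flow (litres/min) to remove load_kw with a delta_t rise."""
    mass_flow = load_kw / (WATER_CP * delta_t)        # kg/s
    return mass_flow / WATER_DENSITY * 1000 * 60      # L/min


if __name__ == "__main__":
    for load in (10, 50, 200):  # kW per rack
        print(f"{load:>3} kW rack: "
              f"~{air_flow_m3_per_h(load):>6,.0f} m^3/h of air  vs  "
              f"~{water_flow_l_per_min(load):>5,.0f} L/min of water")
```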

At extreme heat loads exceeding 200 kW, air cooling becomes insufficient and liquid cooling is essential. Two main liquid cooling methods are gaining traction in advanced data centres:

  1. Immersion Cooling: Servers are submerged in tanks of dielectric fluid, providing highly efficient cooling. However, this approach raises concerns regarding warranty validity and requires robust structural flooring to support the heavy tanks.
  2. On-Chip Cooling: Liquid coolant is pumped directly to the heatsinks on chips, absorbing heat where it’s generated. This method is more space-efficient than immersion cooling and offers significant benefits, including lower energy consumption, increased processing capacity, and improved system uptime.

For data centres with lower power requirements, air cooling may still be a viable option, but liquid cooling offers the greatest benefits in terms of efficiency and scalability.

Edge Data Centres: A New Demand

Once AI models are trained and integrated into business operations, real-time data processing becomes essential. Low-latency data processing, often required in settings such as factory floors, necessitates placing IT equipment close to data sources. This can introduce challenges, as delicate electronic components may need to operate in less-than-ideal environments, such as hot, dusty, or humid industrial settings.
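
To see why physical distance matters for these workloads, the sketch below estimates a best-case network round-trip time from fibre propagation delay alone. The distances, hop counts, and per-hop allowance are illustrative assumptions; real-world latency will be higher once congestion, processing, and protocol overheads are included.

```python
# Best-case round-trip time estimate from fibre propagation delay alone.
# Light travels through optical fibre at roughly two-thirds the speed of
# light in vacuum; distances and the per-hop allowance are illustrative.

SPEED_IN_FIBRE_KM_PER_MS = 200.0   # ~2/3 of c, expressed per millisecond


def round_trip_ms(distance_km: float, hops: int = 0, per_hop_ms: float = 0.1) -> float:
    """Lower-bound RTT: two-way propagation plus a small per-hop allowance."""
    propagation = 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS
    return propagation + hops * per_hop_ms


if __name__ == "__main__":
    print(f"On-site edge node (1 km):        ~{round_trip_ms(1, hops=2):.2f} ms")
    print(f"Metro data centre (50 km):       ~{round_trip_ms(50, hops=6):.2f} ms")
    print(f"Interstate data centre (900 km): ~{round_trip_ms(900, hops=12):.2f} ms")
```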

To address these issues, there has been a rise in the deployment of edge data centres within Australian companies. These centres use pre-configured modules with built-in cooling, power supply, and IT racks, making them easy to install and providing businesses with scalability options.

AI holds enormous potential to unlock innovation and efficiency across Australian industries. However, the successful deployment of AI models relies on a balanced approach that acknowledges both the limitations of AI technology and the critical role of human oversight. Furthermore, businesses must not overlook the growing demands that AI training places on data centres, particularly in terms of energy consumption and cooling.

Explore Rittal Micro Data Centres

Looking for a solution to manage AI demands in your data centre? Rittal’s Micro Data Centres offer efficient, scalable, and secure solutions tailored for the challenges of AI and data-heavy applications. Discover how Rittal’s Micro Data Centres can help your business meet its AI infrastructure needs today. 

Information from Rittal UK