According to a recent survey conducted by Equinix, 42% of respondents say the demands of AI technology are outpacing the capabilities of their current IT infrastructure. As adoption of artificial intelligence (AI) accelerates, IT organizations worry that their existing infrastructure lacks the power to keep pace with growing demands.
Equinix’s 2023 Global Tech Trends Survey reveals that AI hardware, especially training hardware, is becoming increasingly power-hungry. While traditional data center racks typically draw between 5 kW and 10 kW per rack, newer GPU-based racks are pushing power consumption to over 30 kW per rack, with some reaching as high as 72 kW per rack. Kaladhar Voruganti, senior technologist at Equinix, emphasizes that hosting such infrastructure in private data centers is exceedingly challenging.
According to Equinix’s research, 85% of the 2,900 IT decision-makers polled are already using AI or planning to implement it across various key functions. The most common areas for current or planned AI implementation are IT operations (85%), cybersecurity (81%), and customer experience (79%).
Despite the widespread adoption, 42% of IT leaders believe that their current IT infrastructure is not adequately equipped to meet the demands of AI technology.
Insufficient power capacity is not the only concern; cooling systems are also strained by the escalating power requirements of AI. Traditional air cooling, using heatsinks and fans, is only effective up to approximately 30 kW per rack. Beyond that point, fans can no longer keep up, necessitating a move to liquid cooling. However, most data centers are not designed for liquid cooling and would require costly retrofitting.
In addition to hardware limitations, many companies are grappling with a shortage of skilled IT professionals. The shortage of individuals with analytics skills, for instance, has been exacerbated by the surge in AI’s popularity and usage.
AI as a Service

Organizations lacking the necessary hardware for AI training face two options: invest heavily in hardware, or turn to AI-as-a-service, an offering increasingly available from major cloud service providers. Instead of making significant hardware investments, enterprises can upload their data for processing by the provider, which handles the heavy lifting. Once processing is complete, the enterprise retrieves the trained models.
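The upload-train-retrieve workflow described above can be sketched roughly as follows. Everything here is hypothetical: the class, method names, and the trivial "training" step are illustrative stand-ins, not any specific vendor's API.

```python
# Hypothetical sketch of an AI-as-a-service workflow: the enterprise
# uploads data, the provider trains a model, and the enterprise
# retrieves the trained artifact. All names are illustrative only.

class MockAIServiceProvider:
    """Stand-in for a cloud AI-as-a-service provider (not a real API)."""

    def __init__(self):
        self._datasets = {}
        self._models = {}

    def upload_dataset(self, name, records):
        # Step 1: the enterprise uploads its data to the provider.
        self._datasets[name] = list(records)
        return name

    def train(self, dataset_name):
        # Step 2: the provider does the heavy lifting (simulated here
        # as a simple average over numeric records).
        data = self._datasets[dataset_name]
        model = {"dataset": dataset_name, "mean": sum(data) / len(data)}
        model_id = f"model-{dataset_name}"
        self._models[model_id] = model
        return model_id

    def download_model(self, model_id):
        # Step 3: the enterprise retrieves the trained model artifact.
        return self._models[model_id]


provider = MockAIServiceProvider()
provider.upload_dataset("sales", [10, 20, 30])
model_id = provider.train("sales")
model = provider.download_model(model_id)
print(model)  # {'dataset': 'sales', 'mean': 20.0}
```

The point of the pattern is that the enterprise never provisions GPU hardware itself; it only pays for the provider's compute, which is where the variable cloud costs discussed below come from.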
Customers often opt for end-to-end solutions from AI vendors in the cloud, particularly in the early stages, because they reduce the process to a single button, as Voruganti notes. However, variable cloud costs, such as charges incurred on each read or write to cloud-based data or on data extraction, may prompt IT teams to reconsider this approach.
Voruganti notes that companies are increasingly selecting different cloud service providers to host foundation models based on their respective areas of expertise. For example, one provider may excel in vision-based models, while another may specialize in general or large language models.
“There is an increasing desire for people to leverage these models from different clouds,” Voruganti explains. “And if they can maintain their own control over their data, like a neutral location, then they can bring the model to where the data is and customize it accordingly.”