Hyperscale datacenter capacity set to triple because of AI demand

Total capacity of hyperscale datacenters is set to grow almost threefold over the next six years on the back of AI demand, substantially increasing the amount of power required by those facilities.

With the generative AI hype cycle in perpetual motion, datacenter operators are planning ahead to anticipate the need for higher density, higher capacity infrastructure to meet the processing requirements.

A recent report from analyst IDC, for example, forecasts that enterprises worldwide are set to spend about $16 billion on generative AI in 2023. This spending, which includes software as well as related infrastructure hardware and IT/business services, is estimated to reach $143 billion in 2027.
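
For context, those two figures imply a very steep growth curve. Here is a minimal sketch of the implied compound annual growth rate, using only the rounded figures quoted above (IDC's own unrounded numbers may differ slightly):

```python
# Implied compound annual growth rate (CAGR) from the rounded IDC figures above:
# roughly $16 billion of generative AI spending in 2023, rising to $143 billion in 2027.
spend_2023 = 16.0    # $ billion (2023)
spend_2027 = 143.0   # $ billion (2027 forecast)
years = 2027 - 2023

cagr = (spend_2027 / spend_2023) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 73% per year
```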

The upshot of this, according to Synergy Research Group, is that the average capacity of any hyperscale datacenter opening over the next several years will be more than double that of current facilities.

There will also be some retrofitting of existing datacenters to boost their capacity, and the average IT load of individual bit barns continues to grow, with the result that Synergy predicts the total capacity of all hyperscale datacenters will almost triple in the next six years.

Synergy based this analysis on the operations of 19 of the world's biggest cloud and internet service firms. This includes the providers of SaaS, IaaS, PaaS, search, social networking, e-commerce and gaming.

As of 2023, those hyperscalers had a total of 926 massive bit barns in operation around the world, and Synergy said it already knows of a further 427 facilities that are in the pipeline.

Synergy says the worldwide total number of datacenters has already doubled over the past five years. It predicts these will continue to grow by well over a hundred per year.
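
Those figures make the near-tripling forecast easy to sanity-check. The sketch below is a back-of-envelope model, not Synergy's methodology: it normalizes today's average facility to one unit of capacity, assumes roughly 110 new facilities a year (the "well over a hundred" above) at a bit more than double the current average, and assumes a modest annual uplift from retrofits and rising IT load at existing sites.

```python
# Back-of-envelope check of the "capacity almost triples in six years" claim.
# Illustrative assumptions only -- not Synergy's actual model.
existing_sites = 926       # hyperscale datacenters in operation as of 2023, per Synergy
new_per_year = 110         # "well over a hundred" new facilities per year (assumed value)
new_site_capacity = 2.2    # new builds average more than double today's capacity (assumed)
retrofit_growth = 0.03     # assumed annual capacity growth of existing sites (retrofits, IT load)

base_capacity = existing_sites * 1.0   # normalize today's average facility to 1 unit
capacity = base_capacity
for year in range(6):
    capacity *= (1 + retrofit_growth)             # retrofits and rising load per bit barn
    capacity += new_per_year * new_site_capacity  # newly opened, higher-capacity facilities

print(f"Capacity multiple after six years: {capacity / base_capacity:.1f}x")
# ~2.9x with these assumptions, i.e. close to Synergy's "almost triple"
```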

However, the recent advances in generative AI will not necessarily speed up the construction of datacenters, but will instead "substantially increase" the amount of power required to operate those facilities, thanks to the burgeoning number of high-wattage GPU accelerators being crammed into server nodes.

This was noted by another research outfit, Omdia, which found that demand for servers fitted with eight GPUs for AI processing work has also had the effect of pushing up average prices for datacenter systems.

Synergy is coy about how much it reckons the amount of power required will "substantially increase".

However, a recent research paper calculated that integrating generative AI into every Google search could potentially consume the same amount of power as a country the size of Ireland.
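
The article does not reproduce that paper's arithmetic, but the shape of the comparison is easy to sketch. The figures below are illustrative assumptions in that spirit (a fleet of roughly half a million 8-GPU servers running around the clock), not numbers taken from the article or the paper:

```python
# Rough sketch of the "country the size of Ireland" comparison.
# All inputs are illustrative assumptions, not figures from the article.
servers = 500_000            # assumed fleet of 8-GPU AI servers serving every search
power_per_server_kw = 6.5    # assumed average draw of one such server, in kW
hours_per_year = 24 * 365

annual_twh = servers * power_per_server_kw * hours_per_year / 1e9  # kWh -> TWh
print(f"Estimated annual consumption: {annual_twh:.1f} TWh")
# roughly 28 TWh per year -- broadly comparable to Ireland's annual electricity
# consumption, which is of the order of 30 TWh
```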

IDC senior research director for Europe Andrew Buss agreed that AI is driving demand for higher capacity datacenter infrastructure.

"We do spot a immense magnitude of accelerated compute capacity being installed," he told us. "We spot hyperscalers buying a important magnitude of nan wide AI accelerators that are coming onto nan marketplace to support nan ample generative and transformer models crossed B2C and B2B customers, arsenic good arsenic galore organizations trying to get immoderate proviso arsenic well."

This is increasing the power density of the servers and creating a lot of power supply and cooling issues, Buss said. "Many datacenters are built with a power budget of between 7.5 and 15kW per rack, but now a single Nvidia DGX can use up to 10kW, meaning the entire power budget is used by a single 10U box," he explained.
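
Buss's point comes down to simple arithmetic: one AI node can consume what used to be an entire rack's power allotment. A minimal sketch using the figures in his quote; the 42U rack height is a typical value assumed here, not something from the article:

```python
# How many ~10kW, 10U DGX-class boxes fit within a typical rack power budget?
# Power budgets and the DGX figures come from the quote above; the 42U rack
# height is an assumed, typical value.
dgx_power_kw = 10.0   # a single Nvidia DGX can draw roughly 10kW
dgx_height_u = 10     # and occupies about 10U of rack space
rack_height_u = 42    # assumed standard rack height

for rack_budget_kw in (7.5, 15.0):
    by_power = int(rack_budget_kw // dgx_power_kw)  # boxes the power budget allows
    by_space = rack_height_u // dgx_height_u        # boxes that physically fit
    print(f"{rack_budget_kw:>4} kW budget: fits {min(by_power, by_space)} box(es) "
          f"(power allows {by_power}, space allows {by_space})")
```

Even at the top of that range the power budget, not the rack space, is the binding constraint.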

Synergy chief analyst John Dinsdale told us that power concerns are causing hyperscale operators to rethink some of their datacenter architecture and deployment plans to improve the layout and enable much higher power density per rack, and perhaps even review the location of their datacenters.

"It's not conscionable astir powerfulness readiness and cost," Dinsdale said. "Many AI workloads are not arsenic latency delicate arsenic different workloads, truthful tin let nan usability to spot datacenters successful much distant and little costly locations. For example, we were already seeing hyperscale datacenter maturation successful nan US Midwest outpacing maturation successful different regions specified arsenic Northern Virginia and Silicon Valley. We afloat expect that inclination to continue," he added.

  • If you're brave enough to move fully loaded datacenter racks, here's the robot for you
  • Generative AI slashes cloud migration hassles, says McKinsey partner
  • Google says that YouTube vid can wait if it saves on energy
  • Facing a 30% price rise to park servers in a colo? Blame AI

Just this week, Nvidia and Taiwanese electronics maker Foxconn announced plans to team up and build what they call "AI factories", meaning datacenters dedicated to AI processing.

"A caller type of manufacturing has emerged - nan accumulation of intelligence. And nan datacenters that nutrient them are AI factories," Nvidia CEO Jensen Huang said successful a statement, adding that Foxconn has nan expertise and standard to build those AI factories globally.

Foxconn will use Nvidia's tech to build new datacenters for generative AI services covering a range of applications, including industrial robots and self-driving cars. Foxconn is expected to build a large number of systems based on Nvidia's CPUs, GPUs and networking for its global customer base, many of which are seeking to create and operate their own AI factories, Nvidia claimed. ®