Officials from both Google parent Alphabet and Microsoft said they expect to increase spending on cloud computing data centers in order to support new artificial intelligence (AI) technologies like ChatGPT.
That's a noteworthy development for the world's telecom network operators considering they're in the midst of investing in their own AI-based services. Meanwhile, they're also shifting more of their own IT operations and network functions into cloud data centers.
At the same time, both hyperscale cloud companies and telecom network operators are shedding thousands of employees in order to cut costs and buoy profits.
"Our recent checks indicate hyperscale [data center] leasing and forward demand pipelines remain robust," wrote the financial analysts at TD Cowen in a recent note to investors. That demand in data center computing capacity is likely welcome news to data center operators like Equinix and DigitalBridge, as well as networking component suppliers including Nokia, Juniper Networks, Coherent, Ciena and others.
Moreover, the TD Cowen analysts speculated that growing demand for generative large language model (LLM) technologies like ChatGPT could push hyperscalers' data center investments even higher.
"Unlike traditional data center deployments, AI workloads run at much higher power densities, with operators highlighting deals within their pipelines which run at 30-50kW per cabinet, and in some cases running as high as +100kW per cabinet," the analysts wrote.
"To support these deployments, liquid cooling to the cabinet or the chip is required. However, given data centers have historically leveraged air cooling, we would expect AI deployments to live in new data centers with liquid cooling capabilities. While it is possible to retrofit older data centers to support higher power densities and liquid cooling, it is a sub-optimal solution in our view given the requisite capex needed to upgrade these data centers. Interestingly, operators we spoke with highlighted that demand is coming from a combination of the existing hyperscalers as well as new upstart companies. Given AI deals are just starting to hit demand pipelines, we expect AI leasing to begin to pick up in earnest in 2H23 and accelerate into 2024," they added.
Officials from Alphabet made clear their intention to make whatever investments are necessary to support AI computing.
"We do now expect that total capex [capital expenses] for the year for 2023 will be modestly higher than in 2022," Alphabet CFO Ruth Porat said on the company's earnings call this week, according to a Seeking Alpha transcript. "And as we discussed last quarter, AI is a key component. It underlies everything that we do, and we're continuing to invest in support of AI."
Microsoft officials made similar comments.
"We expect capital expenditures to have a material sequential increase on a dollar basis, driven by investments in Azure AI infrastructure," Microsoft CFO Amy Hood said on the company's earnings call this week, according to a Seeking Alpha transcript.
Such comments are important to telecom network operators, and other companies in the telecom industry, for several reasons.
First, data centers are increasingly housing critical networking components. For example, AT&T and Dish Network are placing some of their core networking software in the cloud – thus potentially driving up demand for the data centers where that cloud runs. And a wide range of other operators are expected to make similar moves into the cloud in order to cut costs and remain flexible.
At the same time, telecom operators are increasingly eyeing AI technologies like ChatGPT in order to assess how they might incorporate such offerings into their businesses.
"Telco execs need to try GenAI [generative artificial intelligence] out for themselves," wrote public cloud evangelist Danielle Royston on her TelcoDR site.
Indeed, consulting firm PwC US this week announced plans to invest $1 billion over the next three years to expand its AI capabilities for clients.
Already, analyst Doug Dawson with CCG Consulting sees a future where big telecom operators make extensive use of those AI technologies. "It's not hard to envision some of the giant ISPs fully automating the backoffice function to eliminate many customer service, accounting, and other office workers," he wrote on his blog. "If this new software meets only a fraction of the early claimed benefits, we're going to see huge changes across the economy. Whatever is coming is going to be massively disruptive, and working in telecom or any other industry will never be the same."
Ultimately, though, it would seem that a good portion of whatever AI investments telecom players and others make will flow to the hyperscalers offering those technologies. "The excitement around AI is creating new opportunities," agreed Microsoft CEO Satya Nadella during his company's earnings call.
Source: https://www.lightreading.com/aiautomation/ai-to-drive-data-center-investments/d/d-id/784564?