AI in 2025 – Infrastructure, investment & bottlenecks with Dylan Patel

51:13
 

Dylan Patel, founder of SemiAnalysis and one of my go-to experts on semiconductors and data center infrastructure, joins me to discuss AI in 2025. Several key themes emerged about where the field might be headed:

1/ Big Tech’s accelerating CapEx and market adjustments
The hyperscalers are racing ahead in capital expenditure, with Microsoft’s annual outlay likely to surpass $80 billion (up from around $15 billion just five years ago). By mid-decade, total annual investments in AI-driven data centers could climb from around $150–200 billion today to $400–500 billion. While these expansions power more advanced models and services, such rapid spending raises questions for investors. Are shareholders ready for ongoing, multi-fold increases in data center build-outs?
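
To put those growth rates in context, here is a minimal back-of-envelope sketch in Python. The dollar ranges are the ones quoted above; the two-year horizon and the pairing of low and high cases are illustrative assumptions, not figures from the episode.

# Implied growth from the spend figures quoted above. The two-year horizon
# is an assumption for illustration, not a SemiAnalysis projection.
cases = {
    "low case": (150e9, 400e9),   # current vs projected annual spend, USD
    "high case": (200e9, 500e9),
}
YEARS = 2

for label, (today, future) in cases.items():
    multiple = future / today
    cagr = multiple ** (1 / YEARS) - 1   # implied compound annual growth rate
    print(f"{label}: {multiple:.1f}x total, ~{cagr:.0%} per year over {YEARS} years")

# Microsoft alone: roughly $15B five years ago to $80B+ now.
msft_cagr = (80e9 / 15e9) ** (1 / 5) - 1
print(f"Microsoft capex: ~{msft_cagr:.0%} compound annual growth over five years")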

2/ The competitive landscape and new infrastructure players
The expected explosion in AI workloads is drawing in a wave of new specialized GPU cloud providers—names like CoreWeave, Nebius and Crusoe—each gunning to become the next vital utility layer of AI compute. Unlike the hyperscalers, these players tap different pools of capital, including real-estate-style financing and private credit, enabling them to ramp up aggressively. This dynamic threatens the established order and could squeeze margins as competition heats up. The market is only starting to recognize this.

3/ The semiconductor supply chain isn’t the only bottleneck
We often talk about GPU shortages, but the real sticking point is broader infrastructural complexity. Yes, Nvidia and TSMC can ramp up chip supply. But even if you have enough high-end silicon, you still need power infrastructure and grid connectivity. Building multi-gigawatt data centers in the US—each the size of a utility-scale power plant—is now firmly on the agenda. In some states, data centers already consume 30% of the grid’s electricity. By 2027, AI data centers alone could account for 10% or more of total US electricity consumption, straining America’s aging infrastructure.
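
As a rough sanity check on the power claim, the sketch below converts a 10% share of US electricity into continuous gigawatts of data-center load. The ~4,000 TWh figure for annual US consumption and the per-campus sizes are approximate assumptions, not numbers from the episode.

# What "10% of US electricity" means in data-center terms.
# Assumption: the US consumes roughly 4,000 TWh of electricity per year.
US_ANNUAL_TWH = 4000
HOURS_PER_YEAR = 8760
AI_SHARE = 0.10                 # the ~10% share suggested for 2027

ai_twh = US_ANNUAL_TWH * AI_SHARE
avg_gw = ai_twh * 1000 / HOURS_PER_YEAR   # TWh/year -> GWh/year -> average GW of load
print(f"~{ai_twh:.0f} TWh/year, i.e. ~{avg_gw:.0f} GW of continuous load")

# How many multi-gigawatt campuses that implies, assuming 1-2 GW apiece.
for campus_gw in (1, 2):
    print(f"at {campus_gw} GW per campus: ~{avg_gw / campus_gw:.0f} campuses")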

4/ Commoditization of models and margin pressure
A year ago, advanced language models were scarce and expensive. Today, open-source variants like Llama 3.1 are driving commoditization at speed, slicing away the profit margins of plain-vanilla model-serving. If your model doesn’t outperform the best open source, you’re forced to compete on price—and that’s a race to the bottom. Currently, only a handful of players (OpenAI and Anthropic among them) enjoy meaningful margins. As models proliferate, value will increasingly flow to those offering distinctive tools, integrating closely into enterprise workflows and locking in switching costs.
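
To make the race-to-the-bottom point concrete, here is a hypothetical serving-economics sketch. The GPU rental price, throughput and per-token prices are illustrative assumptions only, not actual provider figures.

# Hypothetical economics of serving a commoditized open-weight model.
# All numbers are illustrative assumptions, not real provider pricing.
gpu_hour_cost = 2.50          # assumed GPU rental cost, USD per hour
tokens_per_second = 2000      # assumed aggregate throughput per GPU
tokens_per_hour = tokens_per_second * 3600

cost_per_million = gpu_hour_cost / tokens_per_hour * 1e6
print(f"serving cost: ~${cost_per_million:.2f} per million tokens")

# Gross margin at two hypothetical market prices per million tokens: as prices
# fall toward the open-weight serving cost, margins collapse.
for price_per_million in (2.00, 0.50):
    margin = 1 - cost_per_million / price_per_million
    print(f"at ${price_per_million:.2f}/M tokens: ~{margin:.0%} gross margin")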

5/ Into 2025: exponential curves and new market norms
Despite these challenges—soaring costs, stalled infrastructure build-outs, margin erosion—Dylan is confident that exponential scaling will continue. The sector’s appetite for GPUs, specialized chips and next-gen data centers appears insatiable. We could easily see record-breaking fundraising rounds north of $10 billion for private AI ventures—funded by sovereign wealth funds and other capital pools that have barely scratched the surface of their capacity to invest in AI infrastructure. There’s also a very tangible productivity angle. AI coding assistants continue to reduce the cost of software development. Some software companies could be looking at 20–30% staff reductions in their technical teams as high-level coding becomes automated. This shift, still in its early days, will have profound downstream effects on the entire software ecosystem.

Find us:


Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
