
For over a decade, data centres have been the backbone of the digital economy. Now, one AI founder is questioning whether that entire model could be headed for disruption. Aravind Srinivas, founder and CEO of Perplexity AI, has issued a contrarian warning: if artificial intelligence can run directly on personal devices, the world’s sprawling, multi-trillion-dollar data centre buildout may no longer make economic sense.
What exactly is Aravind Srinivas warning about?
Srinivas argues that centralized data centres could become redundant if AI workloads shift from cloud-based server farms to chips embedded in personal devices such as laptops, tablets, and smartphones. If AI can process, learn, and act locally, the dependence on massive remote infrastructure diminishes dramatically.
Why does this challenge today’s AI investment boom?
Tech giants including Microsoft, Google, Amazon, and Meta are collectively spending hundreds of billions of dollars annually on AI infrastructure. Srinivas questions the logic of committing between $500 billion and $5 trillion globally if consumer devices can eventually handle most AI tasks themselves. In that scenario, centralized capacity risks being underutilized.
What is the core threat to the data centre business model?
The threat lies in where computation happens. Srinivas believes the biggest disruption comes when AI runs directly on device chips, removing the need to constantly send data to distant servers. This would upend the assumption that large-scale inference must always occur in centralized facilities.
How does on-device AI differ from cloud-based AI?
On-device AI keeps data entirely local, offering strong privacy benefits and eliminating the security risks tied to constant data transmission and authentication. It also reduces latency, enabling faster, more responsive experiences while allowing AI systems to learn individual user behaviour through test-time training.
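To make the distinction concrete, here is a minimal sketch of on-device inference, assuming the open-source Hugging Face transformers library and a small stand-in model (distilgpt2); a real assistant would use a larger, quantised model, but the pattern is the same: once the weights are cached, prompts are processed locally with no round trip to a remote server.

```python
# Minimal on-device inference sketch. Assumes the `transformers` library
# is installed; distilgpt2 is an illustrative stand-in for a capable
# local model, not a recommendation.
from transformers import pipeline

# Weights are downloaded once and cached locally; after that, every
# request is served on the device and no prompt data leaves it.
generate = pipeline("text-generation", model="distilgpt2")

reply = generate(
    "Draft a short reply confirming Tuesday's meeting:",
    max_new_tokens=40,
    do_sample=False,
)
print(reply[0]["generated_text"])
```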
What kinds of tasks would benefit most from local AI processing?
Everyday, repeatable tasks such as email handling, web research, app navigation, scheduling, and workflow automation are best suited for device-based AI. These use cases prioritise speed, customisation, and privacy over raw computational scale.
Which companies are best positioned for this shift?
Apple stands out due to its energy-efficient silicon, such as the M-series chips, and tight integration between hardware and software. Chipmakers like Qualcomm, along with PC and device manufacturers including Samsung, Lenovo, and HP, could also benefit by shipping AI-ready devices at scale.
What are the biggest technical obstacles today?
Current mobile and laptop processors still struggle to run advanced AI models without draining batteries or generating excessive heat. Memory bandwidth and thermal constraints remain major bottlenecks. Srinivas acknowledges that no existing model can yet handle complex tasks reliably and efficiently on-device.
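A rough back-of-envelope calculation shows why memory bandwidth dominates: generating each token requires streaming roughly the full set of model weights through the processor. The model size, quantisation level, and speed target below are illustrative assumptions, not measurements.

```python
# Illustrative estimate of the memory bandwidth an on-device model needs.
# All figures are assumptions chosen for the example.
params = 7e9            # assumed 7-billion-parameter model
bytes_per_weight = 0.5  # assumed 4-bit quantisation
tokens_per_second = 20  # target interactive generation speed

# Each generated token streams (roughly) all weights through the chip once.
bandwidth_needed = params * bytes_per_weight * tokens_per_second
print(f"~{bandwidth_needed / 1e9:.0f} GB/s of memory bandwidth needed")
# ~70 GB/s, within reach of Apple's M-series unified memory but a
# stretch for many phone-class memory systems, before heat limits bite.
```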
How big is the infrastructure at risk?
The global data centre industry is expanding toward nearly 100 gigawatts of AI capacity. A single one-gigawatt facility can cost around $80 billion to build and equip. Countries like Saudi Arabia have announced $100 billion AI programs built around centralized infrastructure, bets that assume cloud dominance will persist.
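Using only the figures quoted above, a quick check of the implied capital at stake follows; the multiplication is illustrative rather than a forecast, since per-gigawatt costs vary widely.

```python
# Rough figure-check using the numbers cited in this article.
planned_capacity_gw = 100   # industry trajectory: nearly 100 GW of AI capacity
cost_per_gw_usd = 80e9      # ~$80 billion for a one-gigawatt facility

implied_capex = planned_capacity_gw * cost_per_gw_usd
print(f"Implied build-out cost: ${implied_capex / 1e12:.0f} trillion")
# Several trillion dollars of capital committed on the assumption that
# centralized inference demand will persist.
```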
What does this mean for India’s technology ecosystem?
India’s massive mobile user base produces enormous data volumes. Srinivas argues that India must build indigenous AI capabilities to retain data sovereignty. However, widespread on-device AI could reduce the need for domestic data centres by limiting how much data ever leaves user devices.
Should companies abandon data centres altogether?
Not yet. Training large models and running highly compute-intensive workloads will still require centralized facilities. Experts suggest a hybrid future, where cloud and edge computing coexist, with companies carefully deciding which workloads truly need centralization.
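One way to picture that hybrid split is a simple routing rule that keeps small or privacy-sensitive jobs on the device and reserves the cloud for heavy workloads. The sketch below is purely illustrative; the thresholds, field names, and labels are assumptions, not any vendor's actual API.

```python
# Illustrative hybrid cloud/edge routing sketch. Thresholds and names
# are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Task:
    prompt: str
    estimated_tokens: int
    contains_personal_data: bool

def route(task: Task) -> str:
    """Decide where a task should run under the hybrid model."""
    if task.contains_personal_data:
        return "on-device"        # keep private data local
    if task.estimated_tokens < 2_000:
        return "on-device"        # small jobs: lower latency, no network hop
    return "cloud"                # large, compute-heavy jobs stay centralized

if __name__ == "__main__":
    print(route(Task("Summarise my inbox", 500, True)))                # on-device
    print(route(Task("Analyse this 300-page report", 50_000, False)))  # cloud
```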
Will data centres disappear or evolve?
Srinivas believes data centres won’t vanish overnight but could lose dominance. The real shift will be toward local, private, and instant AI experiences. If that transition accelerates, companies that overbuilt centralized infrastructure may face long-term underutilization and weaker returns.
(yMedia is the content partner for this story)