TelcoNews India - Telecommunications news for ICT decision-makers

Is OpenAI planning an AI cloud to rival AWS, Azure & Google?

Mon, 10th Nov 2025

Sam Altman, OpenAI's chief executive, has hinted that the company plans to offer its own cloud computing service for artificial intelligence. In a recent post on X (formerly Twitter), Altman wrote that OpenAI is exploring ways to "more directly sell compute capacity to other companies (and people)." He predicted that "the world is going to need a lot of 'AI cloud'" and said OpenAI is excited to provide it. This was the first public indication that OpenAI could become a cloud provider in its own right, moving beyond simply consuming computing power to selling it. If realized, the plan would put OpenAI in direct competition with established cloud giants like Amazon Web Services, Microsoft Azure and Google Cloud.

Compute costs

Behind OpenAI's interest in an "AI cloud" offering is the immense cost of running advanced AI models. The company's popular services – from the ChatGPT chatbot to large-scale AI systems for enterprises – require staggering computational resources. OpenAI has committed to spend over USD $1 trillion in the coming years on data centres and hardware to support its ambitions. That includes deals securing hundreds of thousands of cutting-edge chips. Such eye-watering infrastructure investments far outstrip OpenAI's current revenues, which are expected to reach roughly USD $20 billion on an annualized basis by the end of 2025. The company remains unprofitable as it pours money into growth. This raises a pressing question: how will OpenAI pay for all that compute? One answer may be to start renting out its computing capacity. Cloud services, once scaled up, can generate steady cash flow by selling excess capacity to a broad range of customers. Industry leaders like Amazon, Microsoft and Google have turned massive capital outlays on servers and data centres into highly profitable cloud businesses. By following a similar path, OpenAI could recoup its huge investments more quickly. It would not have to rely solely on AI software products to bring in revenue.

Cloud deals

OpenAI's push toward independence in computing has accelerated in recent months. In October, the firm underwent a major restructuring that loosened its ties with key partner Microsoft. Previously, Microsoft – which has invested USD $13 billion in OpenAI – held a contractual first right to supply OpenAI's cloud needs. That exclusivity has now been lifted, freeing OpenAI to tap other providers. Since then, OpenAI has moved swiftly to secure capacity wherever it can. Earlier this month it signed a seven-year deal worth USD $38 billion to use Amazon Web Services, gaining access to vast clusters of Nvidia processors hosted by Amazon. That move followed a reported agreement with Oracle to buy around USD $300 billion worth of cloud infrastructure over five years. OpenAI also agreed to spend a further USD $250 billion on Microsoft's Azure as part of the new arrangement. The startup has even explored Google's cloud offerings and partnered with smaller specialist providers like CoreWeave. In effect, OpenAI has become one of the world's most sought-after cloud customers – and now it appears eager to become a cloud vendor itself. Selling its own cloud service would mark a striking shift in strategy. OpenAI would go from relying on big tech platforms for compute to directly competing with those same partners for enterprise clients who need AI processing power.

Data centres

The idea of OpenAI running its own data centres has been brewing internally for some time. OpenAI's Chief Financial Officer, Sarah Friar, warned in September that the big cloud companies have been "learning on our dime" – benefiting from OpenAI's heavy usage of their infrastructure. Speaking at an investor conference, Friar outlined how OpenAI aims to do more of its computing in-house. She described a progression from buying capacity off the shelf, to co-designing systems with partners, and eventually to doing "first-party" builds of AI data centres. OpenAI has already embarked on a massive infrastructure project dubbed "Stargate" to construct its own AI supercomputing facilities. The company has begun building new server farms in partnership with investors like SoftBank and tech partners such as Oracle and Nvidia. It plans to spend hundreds of billions of dollars on these facilities, including a flagship campus in Texas. Over time, this could give OpenAI direct control over a significant share of the hardware it needs. Friar acknowledged that constructing and operating data centres at this scale will take years – constrained by the availability of chips, energy and engineering talent – but the direction is clear. Once that infrastructure is in place, OpenAI can evolve from being just a client of other cloud platforms into a full-fledged provider of AI computing.

Investor view

The prospect of OpenAI launching a cloud service has major implications for the tech industry and for its investors. On one hand, it addresses growing concerns about how AI startups will justify their astronomical valuations. OpenAI, valued at around USD $500 billion after its latest funding moves, has drawn comparisons to a potential tech bubble because of its massive spending and as-yet-unproven business model. Traditional cloud providers soothe investor fears by generating profit from their infrastructure. By contrast, AI-focused firms like OpenAI and Meta have been spending relentlessly on AI with far less obvious short-term returns. Indeed, Meta's heavy investment in AI hardware without a cloud business to monetise it has contributed to recent market scepticism. OpenAI's answer is to build a revenue stream from its infrastructure. If successful, an "AI cloud" offering could not only help finance OpenAI's own growth but also position the company as a competitor in one of tech's most lucrative markets. There are already signs of booming demand for AI-tailored cloud services. For instance, GPU cloud startup CoreWeave went public in 2025 and saw its valuation soar, reflecting the appetite for specialised compute. By leveraging its expertise and the cutting-edge systems it is deploying, OpenAI could attract businesses that need AI computing power but lack the resources to develop it themselves. Still, entering the cloud market pits OpenAI against formidable rivals with deep experience and entrenched client relationships. Even with its strong brand and technology, OpenAI will face a steep learning curve to become an enterprise cloud provider. Yet the company's leadership appears willing to take on that challenge as the next step in its evolution.

OpenAI's foray into the cloud arena represents a strategic turning point for the company. What began as a research lab pushing the boundaries of artificial intelligence is rapidly morphing into a broad-based tech enterprise. By creating its own AI-centric cloud platform, OpenAI would gain a dual identity – both as a pioneer of AI applications and as a backbone provider powering other organisations' AI initiatives.

Such a move could redefine competitive lines in the industry: allies like Microsoft might become direct competitors, and the balance of power in AI computing could shift. For now, Altman's bold hint of an OpenAI cloud service signals that the company sees opportunity in becoming more self-sufficient. If OpenAI can execute on this vision, it won't just be training AI models – it will be building the digital infrastructure on which the next generation of AI runs.
