Meantime, OpenAI itself has invested in its fair share of startups, including humanoid robot companies Figure AI and 1X Technologies. OpenAI CEO Sam Altman has also put money into several AI health startups, like Thrive AI Health.
And OpenAI has a notable partnership with Apple to bring ChatGPT to some Apple products, though the financial terms of the deal haven't been disclosed.
It's no secret that companies are spending big on AI; investments are estimated to hit $1 trillion in the coming years. But while Big Tech companies have publicized their investments in other firms, they've remained opaque about what they're spending to scale their own AI offerings, from buying chips to hiring experts to building data centers.
Here's a list of the Big Tech companies and what we know about their AI investments.
Google
Google executives remained fairly cryptic about the company's AI spending on its second-quarter earnings call last week.
Executives haven't provided any hard numbers on the company's AI investments, but they continued to emphasize that Google's AI ambitions will produce long-term returns. For now, what we do know is that Google spent $3 billion to build and expand its data centers and $60 million to train AI on Reddit posts.
OpenAI
OpenAI's operating costs may be as high as $8.5 billion this year, according to The Information. That might be more than the startup can afford, raising concerns about the company's profitability — especially as its competitors offer free versions of their chatbots.
The costs include about $4 billion to rent server capacity from Microsoft, $3 billion to train the AI models on new data, and $1.5 billion in labor costs, the report said.
Meta
Meta CEO Mark Zuckerberg said he plans to purchase 350,000 Nvidia GPUs by the end of 2024, bringing Meta's total GPU stockpile to roughly 600,000. Analysts have estimated the giant could spend about $18 billion by the end of 2024.
A JPMorgan analyst said the company's costs, fueled by AI spending, could hit $50 billion by 2025, according to a report from Quartz.
The tech giant is expected to share more about its AI expenditures during its quarterly earnings call on Wednesday.
Apple
On a first-quarter earnings call, Apple CFO Luca Maestri said the company had spent about $100 billion on research and development over the last five years, according to a report from MarketWatch, but he didn't break out how much of that went to AI. Maestri said the company would continue to invest in some areas and work with third parties in others.
"We obviously are pushing very hard on innovation on every front, and we've been doing that for many, many years," the CFO said, according to the report.
Microsoft
On its third-quarter earnings call in April, Microsoft also said it would continue to invest in AI and cloud services. The giant's CFO, Amy Hood, said on the call that capital expenditures would increase "materially."
The company is expected to give more details about its AI spending on its fourth-quarter earnings call later today.
Amazon
Amazon is working on an AI chatbot called Metis to compete directly with OpenAI's ChatGPT, and it has been trying to make its own AI chips to compete with Nvidia, albeit with limited success.
Meantime, it plans to spend almost $150 billion on data centers over the coming 15 years, Bloomberg reported.
Amazon is also planning to invest up to $230 million in startups building generative AI-powered applications.
About $80 million of the investment will fund Amazon's second AWS Generative AI Accelerator program, which is designed to position Amazon as a go-to cloud service for startups building generative AI products.
Much of the new investment comes in the form of compute credits for AWS infrastructure, which means it can't be transferred to other cloud service providers like Google and Microsoft.
xAI
Elon Musk said earlier this month that the latest version of xAI's chatbot Grok would be trained on 100,000 Nvidia H100s. The H100 GPUs handle much of the data processing for large language models, and the chips are a key component in scaling AI.
The GPUs are estimated to cost between $30,000 and $40,000 each, meaning xAI would be spending between $3 billion and $4 billion on AI chips.
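That range is a simple back-of-the-envelope calculation. Here's a minimal sketch of the math, assuming xAI were to buy all 100,000 chips outright at the reported per-unit prices (which, as noted below, isn't confirmed):

```python
# Back-of-the-envelope estimate of xAI's potential GPU spend, assuming an
# outright purchase of 100,000 H100s at the reported per-unit price range.
gpu_count = 100_000
price_low_usd, price_high_usd = 30_000, 40_000  # estimated cost per H100

low_estimate = gpu_count * price_low_usd    # $3.0 billion
high_estimate = gpu_count * price_high_usd  # $4.0 billion

print(f"Estimated chip spend: ${low_estimate / 1e9:.1f}B to ${high_estimate / 1e9:.1f}B")
```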
It's worth noting that it's not clear whether Musk's company purchased those chips outright. It's possible to rent GPU compute from cloud service providers, and The Information reported in May that xAI was in talks with Oracle about a roughly $10 billion deal to rent cloud servers.