When I think ‘Nvidia,’ I think ‘games,’ but that’s far from the full picture. If we’re talking just cold hard cash, then AI is becoming an increasingly massive source of revenue for the company, bringing in $30 billion during Q2 and far overshadowing the share of revenue brought in by games, which totalled $2.88 billion during the same period. Yeah, okay, neither of those figures is small potatoes; Nvidia is a heavy-hitter in this arena for sure, but there’s about to be a rumble in the AI jungle.
Amazon has not been shy about its commitment to AI (and we’re not just talking about its deal-spitting robot Rufus). Besides investing in a nuclear future to power its electric dreams, the company is also eyeing up the throne when it comes to AI chips (via Financial Times).
In a move geared towards reducing its reliance on Nvidia, Amazon is looking to make good on the millions it has already spent on semiconductor investments and pool all of its chips towards, well, making its own. The hope is that Amazon-made chips will boost the efficiency of Amazon-owned data centres, thereby bringing down the cost of running them. That’s good news for the company’s own pockets, but customers of Amazon Web Services may reap the benefits too.
So, where is Amazon flinging money specifically? In two words: Annapurna Labs, a start-up Amazon scooped up way back in 2015 for $350 million. Together they’ve been noodling away on the ‘Trainium 2’ AI chip for at least the last year, and it looks like the wait to get my grubby little, deeply AI-sceptical mitts on them is not long now.
The abysmal ‘Trainium’ name refers to the fact that these chips are designed to train the latest AI models, and as such they’re already being put through their paces by Anthropic. A competitor to OpenAI, Anthropic is another start-up that’s enjoyed generous financial backing from Amazon (to the tune of $4 billion), among other investors.
Amazon is also cooking up another line of AI chips named ‘Inferentia,’ which the company claims is already proving to be 40% more cost-effective at generating AI responses. Meaning ‘inference’ in Latin, it’s definitely a better name than ‘Trainium’, but I do hope they don’t follow it up with chip lines called Suspiria, Tenebrae, or Lacrimosa (though an AI-themed return from horror director Dario Argento could be interesting).
Giallo daydreams aside, Amazon’s latest AI bid is not unique. Both Microsoft and Meta also want to dethrone Nvidia by making their own chips to better meet the demands of AI in their respective businesses. That’s already making for big motions in tech’s ocean, and other major players are likely to follow with splashes of their own.
As I’ve already said, AI has been a huge money-maker for Nvidia, so it’s no surprise that lots of other companies want to muscle in on that market. These companies are likely hoping for continued growth in the sector, but perhaps that’s blue-sky thinking. OpenAI’s co-founder reckons large language model learning is approaching a plateau, so this bubble may pop sooner than we think.