From PC World: You can’t find a new Nvidia graphics card for love or money. Between pent-up demand from PC gamers and Nvidia selling every GPU it can to the bubbling AI industry, new models are going out of stock in a matter of minutes — and it looks like the situation isn’t going to improve any time soon, as the biggest AI company around wants even more hardware.
OpenAI CEO Sam Altman took to the social network formerly known as Twitter (spotted by Tom’s Hardware) to say that OpenAI’s GPT-4.5 is ready to go… but desperately in need of more hardware. The “giant, expensive model” requires even more data center capacity than older versions, and to launch it with enough access for paid users, the company is gobbling up GPUs at an even faster rate.
The CEO claims that OpenAI is adding “tens of thousands of GPUs next week” for the planned rollout, with hundreds of thousands more following soon after. Even then, he expects the system to be taxed to maximum capacity.
Now, it’s not as if OpenAI, Microsoft, Meta, et al. are shopping at Best Buy, yanking retail graphics cards out of the hands of crying PC gamers. (No, it’s scalpers doing that.) It’s certainly possible to build AI data center hardware out of consumer-grade electronics, but these companies are generally placing industrial orders directly with Nvidia, AMD, and others. In other words, this isn’t an exact one-to-one comparison with the GPU shortage surrounding the cryptocurrency boom a few years ago.
View: Full Article