Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3
From TechCrunch: Meta, which develops Llama, one of the largest open-source foundation large language models, believes it will need significantly more computing power to train its models in the future.
Mark Zuckerberg said on Meta’s second-quarter earnings call on Tuesday that to train Llama 4 the...