Meta has published an update on how its Llama large language models are performing, and they’re apparently doing so well that they’re now “approaching 350 million downloads to date.” That’s 10 times the downloads the models had accumulated by the same time last year. Approximately 20 million of those downloads took place in the last month alone, after the company released Llama 3.1, its latest LLM, which it says can now rival models from OpenAI and Anthropic.
Monthly usage of Llama grew tenfold from January to July this year across some of Meta’s largest cloud service providers, the company said. From May to
→ Continue reading at Engadget