“We’re in an LLM bubble,” Hugging Face CEO says—but not an AI one

https://arstechnica.com/ai/2025/11/were-in-an-llm-bubble-hugging-face-ceo-says-but-not-an-ai-one/

Samuel Axon Nov 19, 2025 · 2 mins read

There’s been a lot of talk of an AI bubble lately, especially with regard to circular funding involving companies like OpenAI and Anthropic. But Clem Delangue, CEO of machine learning resources hub Hugging Face, has made the case that the bubble is specific to large language models, which are just one application of AI.

“I think we’re in an LLM bubble, and I think the LLM bubble might be bursting next year,” he said at an Axios event this week, as quoted in a TechCrunch article. “But ‘LLM’ is just a subset of AI when it comes to applying AI to biology, chemistry, image, audio, [and] video. I think we’re at the beginning of it, and we’ll see much more in the next few years.”

At Ars, we’ve written at length in recent days about the fears around AI investment. But to Delangue’s point, almost all of those discussions are about companies whose chief product is large language models, or the data centers meant to power them. Specifically, they concern general-purpose chatbots that are meant to be everything for everybody.

That’s exactly the sort of application Delangue is bearish on. “I think all the attention, all the focus, all the money, is concentrated into this idea that you can build one model through a bunch of compute and that is going to solve all problems for all companies and all people,” he said.

Instead, he imagines the eventual outcome to be “a multiplicity of models that are more customized, specialized, and that are going to solve different problems.”

It’s of course important to note that his company is focused on being a GitHub-like repository for exactly those sorts of specialized models: big models released by companies like OpenAI and Meta (gpt-oss and Llama 3.2, for example), fine-tuned variants that developers have adapted to specific needs, and smaller models developed by researchers. That’s essentially what Hugging Face is about.

So yes, it’s natural that Delangue would say that. However, he’s not alone. In one example, research firm Gartner predicted in April that “the variety of tasks in business workflows and the need for greater accuracy are driving the shift towards specialized models fine-tuned on specific functions or domain data.”

Regardless of which way LLM-based applications go, investment in other applications of AI-by-the-current-definition is only just getting started. Earlier this week, it was revealed that former Amazon CEO Jeff Bezos will be co-CEO of a new AI startup focused on applications of machine learning in engineering and manufacturing—and that startup has launched with over $6 billion in funding.

That, too, could be a bubble. But even if some of Delangue’s statements on the AI bubble discourse are clearly meant to prop up Hugging Face, there’s a helpful reminder in there: The overbroad term “AI” covers a lot more than just large language models, and we’re still in the early days of seeing where these methodologies will lead us.