
Is the AI bubble really bursting...?

I've just read an article in Die Welt about $005930. I'll just take note of it here for now, but will gladly leave it to the community for discussion.


"Then the bubble bursts" - the delicate consequences of the Samsung discovery


A new AI model from Samsung gets by with ten thousand times less computing power than current rival model families such as GPT or Gemini. This discovery, which has now been published, threatens the hundreds of billions invested by Nvidia, Microsoft, Google & Co. and could prove to be a gigantic AI trap.

Wild deals in the AI sector have dominated stock market news in recent weeks: in early September, Nvidia sold supercomputers worth 100 billion dollars to ChatGPT startup OpenAI and received shares in OpenAI in return. AMD struck a similar shares-for-chips deal at the beginning of October, and its share price promptly shot up as well. Whoever announces the largest data-center supercomputer deal wins.

The stock market rewards AI companies for investing hundreds of billions of dollars in capital goods with a finite shelf life. But what if the investments in supercomputers were not necessary at all - if the same artificial intelligence could be developed with far less computing power?

This is the question posed by Alexia Jolicoeur-Martineau, senior researcher for artificial intelligence at the Samsung Advanced Institute of Technology in Montreal, Canada. This week, under the heading "Less is more", she published a self-developed AI model that gets by with just seven million parameters and is therefore ten thousand times smaller than the competing models from OpenAI, Google and others. According to its inventor, this mini AI nevertheless outperforms state-of-the-art language models, including Google's current Gemini 2.5 and OpenAI's o3-mini, on some of the most difficult reasoning benchmarks in AI research.

The Samsung model breaks new ground: it first drafts an approximate solution to a logic task, then refines that solution in repeated loops - and in the process needs significantly less computing power than the competition. Samsung has released the program as open-source software, so AI researchers at rival labs can verify for themselves that it works.
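
For readers who want a feel for the draft-then-refine loop described above, here is a minimal PyTorch sketch. It illustrates the general pattern only - the class name, layer sizes and loop count are assumptions for illustration, not Samsung's published architecture.

```python
import torch
import torch.nn as nn

# Minimal sketch of the draft-then-refine pattern described above.
# TinyRefiner, the layer sizes and n_loops are illustrative assumptions,
# not the actual Samsung model.
class TinyRefiner(nn.Module):
    def __init__(self, dim: int = 64):
        super().__init__()
        self.draft = nn.Linear(dim, dim)  # proposes a rough first answer
        self.refine = nn.Sequential(      # reused to improve the answer in a loop
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def forward(self, x: torch.Tensor, n_loops: int = 8) -> torch.Tensor:
        y = self.draft(x)                 # approximate solution
        for _ in range(n_loops):          # repeated refinement loops
            y = y + self.refine(torch.cat([x, y], dim=-1))  # residual update
        return y

model = TinyRefiner()
print(sum(p.numel() for p in model.parameters()), "parameters")  # tiny by LLM standards
```

The trick in this family of models is that the same small network is applied over and over, trading extra inference loops for sheer parameter count.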

"The idea that you have to rely on massive base models trained by large companies for millions of dollars to solve difficult tasks is a trap," wrote Jolicoeur-Martineau on the social network X. "Currently, there is too much focus on using large language models instead of developing and expanding new directions

Whether the Samsung approach really amounts to a revolution in research on today's reasoning models, and whether it holds up beyond a relatively narrow set of benchmark tests, has not yet been verified by independent experts.

However, this does not change the fact that researcher Jolicoeur-Martineau is challenging the current AI boom with a fundamental question: What if a technological breakthrough in AI software were to suddenly devalue the billions invested in hardware?

What if the accumulation of ever more power-hungry AI chips is actually a mistake - if it is not the large LLMs but combinations of much smaller, specialized models that win the race for general artificial intelligence? Then the billion-dollar data centers full of Nvidia chips would suddenly be worth significantly less - and the most important companies of the AI boom might have to be revalued.
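
As a toy illustration of the "combinations of small, specialized models" scenario, the sketch below routes each query to one of several tiny specialists. The keyword router and the specialist functions are hypothetical stand-ins, not any real system.

```python
# Toy sketch of the "many small specialists" scenario. The keyword router
# and the specialist functions are hypothetical stand-ins, not a real system.
def math_specialist(question: str) -> str:
    return "answer from a small math model"

def code_specialist(question: str) -> str:
    return "answer from a small coding model"

SPECIALISTS = {"math": math_specialist, "code": code_specialist}

def route(question: str) -> str:
    # A trivial keyword router; a real system would use a learned classifier.
    key = "math" if any(w in question.lower() for w in ("solve", "sum", "equation")) else "code"
    return SPECIALISTS[key](question)

print(route("Please solve this equation"))  # -> answer from a small math model
```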

The performance of OpenAI's GPT-5, released a few weeks ago, already shows that the marginal benefit of additional training compute is shrinking significantly: the large language models keep getting better, but the era of big leaps in performance from simply throwing ever more computing power at training is over for now.

The next big advance has failed to materialize. If "less is more" really is the way forward, the billions earmarked for investment in the coming years would be called into question, and the data centers built so far could have to be written off much faster than their builders planned and hoped.

Unlike in previous capital-intensive technological revolutions - the fiber-optic boom of the early 2000s, say, or railroad construction from the 1840s onward - the investments now being made are only of a fleeting nature. Fiber optics in the ground remain valuable. Data centers without supercomputers, by contrast, are little more than well-cooled industrial halls with very large power connections.

The supercomputers themselves, however, which account for around 60 percent of construction costs, have a half-life of just three years and are currently replaced after five to six years, because their high power consumption makes continued operation unprofitable. They have to recoup their cost within a very short window before they become worthless. The "Summit" supercomputer of the US Department of Energy, for example, was the fastest computer in the world when it was commissioned in June 2018, thanks to its Nvidia chips. It was operated only until November 2024, by which point it had become obsolete.
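
A back-of-the-envelope calculation makes this recoup pressure concrete. Only the 60 percent chip share and the roughly three-year useful life come from the article; the build-out cost below is a made-up round number.

```python
# Illustrative recoup math for a data-center build-out. The capex figure is
# invented; the ~60% chip share and ~3-year useful life are from the article.
capex = 10e9             # hypothetical total construction cost in dollars
chip_share = 0.60        # chips: around 60 percent of construction costs
useful_years = 3         # roughly the half-life cited for the chips

chip_cost = capex * chip_share
needed_per_year = chip_cost / useful_years
print(f"chips cost ${chip_cost / 1e9:.1f}B -> must earn back ~${needed_per_year / 1e9:.1f}B per year")
```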

With their development, the Samsung researchers are not only questioning the competition's current large models. They are indirectly questioning the entire current AI boom on Wall Street, which rests primarily on chip deals and the bartering of supercomputers for computing time. If the technology keeps advancing as fast as it has so far, many investors, infrastructure companies and data-center operators could be left sitting on billions of dollars in stranded investments within a short space of time.

"In order to refinance the announced investments, ten new AI companies of the size and with the turnover of Google would have to grow up within a short period of time. I think that's unrealistic," says Damian Borth, AI researcher at the University of St. Gallen. Part of the basic work on small models, which the Samsung researchers are now building on, was carried out at his institute.

Borth has been observing for several months that "the neural scaling laws are becoming saturated" - in other words, that the rule that ever more computing power yields ever more intelligent models no longer holds.
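
What "saturating scaling laws" means can be seen in a toy calculation. Assuming the commonly used power-law form L(C) = L_inf + a * C^(-b) for loss versus training compute - with all constants invented for illustration - each tenfold increase in compute buys a smaller absolute improvement:

```python
# Toy saturation curve: under a power law L(C) = L_inf + a * C**(-b),
# every 10x increase in compute C yields a smaller absolute loss improvement.
# All constants here are invented for illustration.
L_inf, a, b = 1.0, 10.0, 0.3

prev = None
for compute in (1e3, 1e4, 1e5, 1e6):
    loss = L_inf + a * compute ** (-b)
    gain = "" if prev is None else f"  (improvement {prev - loss:.3f})"
    print(f"compute {compute:.0e}: loss {loss:.3f}{gain}")
    prev = loss
```

In this toy setting each tenfold compute step only halves the remaining gap, so the absolute gains keep shrinking - the saturation Borth describes.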

"Light weight models, i.e. AI models that require significantly fewer parameters, are currently the most important trend in the industry," explains Borth. The trend was partly born out of technical requirements - models should run on smartphones alone - and partly out of necessity: Only relatively few researchers have access to enough computing time on supercomputers to train large models. This current research direction is now determining progress towards general artificial intelligence, the holy grail of AI research.

Borth therefore sees a discrepancy between market expectations of ever-increasing expenditure on AI infrastructure and current research: "The question is: do we even need such large models, so many parameters, to solve the majority of AI tasks? If not, then we don't need so many supercomputers."

However, Borth believes that as long as the big AI companies keep up the vision of general artificial intelligence emerging from supercomputers in the near future, the data-center boom will continue to be financed. "This is a matter of national security alone - the US cannot afford to lose this race." But the moment faith in supercomputer-built AGI wavers is also the moment when AI stocks will have to be harshly revalued. "Then the bubble bursts," says Borth. The AI model from Montreal could be the first pinprick in the bubble, and more are likely to follow.

4 Comments

Almost burst today. Overate for once. And then my AI bladder almost burst because I barely made it to the loo.

=> Bursting averted for the time being
@DonkeyInvestor almost had that happen too - it makes you stronger 💪
10000x smaller than conventional models! Wow! If that's true, it would be another breakthrough.

But why should that be the end of the AI supercomputer halls? The Samsung model would simply be run on the AI chips, and then they would be 10,000x more effective.

What's the problem? Please help me out here! 🤔
I am not an AI expert or technician, but a banker in the late fall of his professional career, and I have therefore already seen quite a bit. Whether there is a bubble, I don't know. As with so many new disruptive forces, we cannot yet see where AI will end up. AI is still in the process of forming and finding itself. There will be outcomes which, as always, will produce winners and losers. We should not underestimate national security interests and the market power of individual companies, which will steer developments in their direction. So I think you should watch the market, invest selectively and perhaps hedge to the downside.