💢💢💢

🚀 OpenAI CEO Sam Altman has dropped a key signal: Transformer may just be a transitional phase.

When many people discuss #AI these days, there's a default assumption:
The Transformer architecture is all but the endgame of computational paradigms.

After all, from GPT onward, nearly all mainstream large models are built on Transformer at their core.
This has gradually hardened into an industry-wide consensus: compute scale + Transformer is the sole path for AI's future.

But a recent remark Sam Altman made during an exchange with students directly challenges this consensus.

He believes Transformer is not the endpoint.

"I bet there will be a new architecture, and it will be better."

The significance of this statement far exceeds a mere technical prediction.

Because today, the investment logic of the entire AI industry chain largely revolves around Transformer:
Larger GPU clusters, longer context windows, more model parameters, higher training compute.

From #NVIDIA GPUs, to cloud computing, to model training infrastructure, almost all resources are accelerating along the same path.

But if a new computational architecture emerges that significantly surpasses Transformer in efficiency, cost, or capability, the entire AI competitive landscape could undergo something like a "paradigm shift."
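
To make the "efficiency" point concrete, here is a minimal back-of-envelope sketch (my illustration, not from Altman's remarks): self-attention's compute grows quadratically with context length, which is precisely the opening that linear-time alternatives such as state-space models aim to exploit. The layer width and cost formulas below are rough assumptions, not measurements.

```python
# Illustrative sketch: why "efficiency" is the opening for a
# post-Transformer architecture. Self-attention's per-layer cost grows
# quadratically with context length n; a linear-time sequence layer
# (e.g., a state-space model) grows linearly in n.

def attention_flops(n: int, d: int) -> int:
    """Rough FLOPs for one self-attention layer: building the n x n
    score matrix QK^T costs ~n*n*d, and applying it to V costs the same."""
    return 2 * n * n * d

def linear_seq_flops(n: int, d: int) -> int:
    """Rough FLOPs for a linear-time sequence layer, assuming a
    constant per-token cost on the order of d*d (an assumption)."""
    return n * d * d

d = 4096  # hidden width; a typical large-model scale, assumed here
for n in (4_096, 32_768, 262_144):  # context lengths in tokens
    ratio = attention_flops(n, d) / linear_seq_flops(n, d)
    print(f"n={n:>7,}: attention vs. linear-time cost ratio ~ {ratio:.0f}x")
```

On these toy numbers the ratio is simply 2n/d: attention costs only about 2x more at a 4k-token context, but roughly 128x more at 262k tokens. A gap that widens with context length is exactly the kind of wedge a successor architecture could use.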

This has actually happened many times in history.

Neural networks were once deemed impractical, until deep learning emerged.
RNNs were once the mainstream sequence model, later replaced by Transformer.
Each architectural change redefines the winners.

Sam Altman's words are more like a reminder of one thing:
The hottest technology today is often just a bridge to the next stage.

Transformer is extremely successful, but success itself does not mean it is the final form.

Truly great opportunities often appear precisely when consensus is at its most stable.

When the entire market assumes a certain technology is the endgame, the innovation that truly changes the game is more likely to emerge on a different path.

If a new architecture emerges in the future that is more efficient, lower-cost, or even closer to human cognitive structure than Transformer, then the competitive advantages built around compute and model scale today could all be reshuffled.

At that exchange, many students asked the same question:
"Are there still new opportunities in the AI era?"

Altman's answer was quite direct:

If everyone believes the rules of the game are set, that usually means the rules are not truly set yet.

Do you lean more towards believing that Transformer will dominate AI models long-term, or that it will be replaced by a new architecture?
