Market Cap: $2.6844T 0.700%
Volume(24h): $104.1076B 9.910%
  • bitcoin: $82951.790245 USD (-0.70%)
  • ethereum: $1791.465527 USD (-1.83%)
  • tether: $0.999717 USD (-0.01%)
  • xrp: $2.055970 USD (0.14%)
  • bnb: $593.238692 USD (-1.32%)
  • usd-coin: $1.000032 USD (0.02%)
  • solana: $115.381354 USD (-4.13%)
  • dogecoin: $0.161732 USD (-2.67%)
  • cardano: $0.649656 USD (-0.44%)
  • tron: $0.239261 USD (1.04%)
  • unus-sed-leo: $9.561241 USD (1.74%)
  • toncoin: $3.530703 USD (-6.73%)
  • chainlink: $12.739766 USD (-3.87%)
  • stellar: $0.259841 USD (-2.48%)
  • avalanche: $18.093210 USD (-3.52%)
Cryptocurrency News Articles

Meta AI Chief Yann LeCun Has Lost Interest in LLMs, Looking to Next-Gen Architectures

Mar 20, 2025 at 12:06 am

Meta AI Chief Yann LeCun has said that he's no longer interested in LLMs, and is looking to next-generation AI architectures that'll be able to better model the real world.


Speaking at the NVIDIA GTC 2025 event, LeCun said that these new AI architectures should enable AI to think more like humans, with persistent memory and the ability to think through complex problems.

“I am not interested anymore in LLMs,” LeCun said. “They are just token generators and those are limited because tokens are in discrete space. I am more interested in next-gen model architectures, that should be able to do 4 things: understand physical world, have persistent memory and ultimately be more capable to plan and reason.”

This isn’t the first time that LeCun has spoken about the limitations of LLMs. The models largely learn through text data, and he’s previously said that this is insufficient for achieving human-level AI.

“A typical large language model is trained with something on the order of 20 trillion tokens or words,” LeCun had said. “That’s about 10 to the power 14 bytes; one with 14 zeros behind it. It’s an enormous amount of information.”

“But then you compare this with the amount of information that gets to our brains through the visual system in the first four years of life, and it’s about the same amount. In four years, a young child has been awake a total of about 16,000 hours. The amount of information getting to the brain through the optic nerve is about 2 megabytes per second. Do the calculation and that’s about 10 to the power 14 bytes. It’s about the same. In four years a young child has seen as much information or data as the biggest LLMs.”
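LeCun's back-of-the-envelope comparison can be checked directly. The sketch below reproduces the arithmetic from his quotes; the bytes-per-token figure is an assumption (he says 20 trillion tokens comes to roughly 10^14 bytes, which implies about 5 bytes per token), not a number he states explicitly.

```python
# LLM side: ~20 trillion training tokens (from the quote).
tokens = 20e12
bytes_per_token = 5  # assumed average, implied by "20 trillion tokens ~ 1e14 bytes"
llm_bytes = tokens * bytes_per_token  # = 1e14 bytes

# Visual side: ~16,000 waking hours in the first four years,
# at ~2 MB/s through the optic nerve (both figures from the quote).
seconds_awake = 16_000 * 3600
visual_bytes = seconds_awake * 2e6  # = 1.152e14 bytes

print(f"LLM training data:      {llm_bytes:.2e} bytes")
print(f"Visual input (4 years): {visual_bytes:.2e} bytes")
```

Both totals land on the order of 10^14 bytes, which is the "about the same amount" LeCun is pointing to.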

This, he adds, means that we’re never going to get to human-level AI by just training on text. We’re going to have to get systems to understand the real world.

LeCun has now said that he’s no longer interested in LLMs. Indeed, Meta itself hasn’t released a new version of Llama in a while, and DeepSeek and other companies have taken up the mantle of releasing the top open-source models.

LeCun’s approach stands in contrast to OpenAI’s: OpenAI has consistently maintained that there’s no wall in scaling LLM capabilities, and has instead layered additions onto LLM training, like test-time compute, to increase their capabilities. It remains to be seen which approach wins out, but Meta appears bearish on LLMs, and seems to believe it will take a new technological breakthrough to dramatically improve the performance of current AI systems.

Disclaimer:info@kdj.com

The information provided is not trading advice. kdj.com does not assume any responsibility for any investments made based on the information provided in this article. Cryptocurrencies are highly volatile and it is highly recommended that you invest with caution after thorough research!

If you believe that the content used on this website infringes your copyright, please contact us immediately (info@kdj.com) and we will delete it promptly.
