Large Concept Models (LCMs) Offer Some Exciting Prospects
Jan 07, 2025 at 11:13 am
Large concept models (LCMs) offer some exciting prospects. In today's column, I explore an intriguing new advancement for generative AI and large language models (LLMs): moving beyond contemporary word-based approaches to sentence-oriented approaches.
The extraordinary deal is this. You might be vaguely aware that most LLMs currently focus on words and accordingly generate responses on a word-at-a-time basis. Suppose that instead of looking at the world via individual words, we could use sentences as a core element. Whole sentences come into AI, and complete sentences are generated out of AI.
To do this, the twist is that sentences are reducible to underlying concepts, and those computationally ferreted-out concepts become the coinage of the realm for this architectural upheaval of conventional generative AI and LLMs. The radical new angle is that we then design, build, and field so-called large concept models (LCMs) in lieu of old-fashioned large language models.
Let’s talk about it.
This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI including identifying and explaining various impactful AI complexities (see the link here). For my coverage of the top-of-the-line OpenAI ChatGPT o1 and o3 models and their advanced reasoning functionality, see the link here and the link here.
There is an ongoing concern in the AI community that perhaps AI researchers and AI developers are treading too much of the same ground right now. We seem to have landed on an impressive architectural formula for how to shape generative AI and LLMs, and few want to depart from the success attained so far.
If it isn’t broken, don’t fix it.
The problem is that not everyone concurs that the prevailing architecture isn't actually broken. By broken, to quickly clarify, I mean limitations and constraints rather than something inherently wrong. A strong and vocal viewpoint is that we are hitting the topmost thresholds of what contemporary LLMs can accomplish. There isn't much left in the gas tank, and we are soon to hit a veritable wall.
As such, there are brave souls who are seeking alternative architectural avenues. Exciting, but a gamble at the same time. They might hit the jackpot and discover the next level of AI. Fame and fortune await. On the other hand, they might waste time on a complete dead end. Smarmy cynics will call them foolish for their ambitions. The effort could harm their AI careers and knock them out of that sweet, freewheeling high-tech AI job they have been eyeing for the longest time.
I continue to give airtime to those who are heads-down seriously aiming to upset the apple cart. For example, my analysis of the clever chain-of-continuous thought approach for LLMs merits dutiful consideration, see the link here. Another exciting possibility is the neuro-symbolic or hybrid AI approach that marries artificial neural networks (ANNs) with rules-based reasoning, see my discussion at the link here.
There is no doubt in my mind that a better mousetrap is still to be found, and all legitimate new-world explorers should keep sailing the winds of change. May your voyage be fruitful.
The approach I’ll be identifying this time around has to do with the existing preoccupation with words.
Actually, it might be more appropriate to say a preoccupation with tokens. When you enter words into a prompt, those words are converted into numeric values referred to as tokens. The rest of the AI processing computationally crunches on those numeric values or tokens; see my detailed description of how this works at the link here. Ultimately, the AI-generated response is in token format and must be converted back into text so that you get a readable answer.
In a sense, you give words to AI, and the AI gives you words in return (albeit via the means of tokenization).
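To make that words-in, tokens-crunched, words-out round trip concrete, here is a minimal Python sketch. It assumes the tiktoken tokenizer library as an illustrative choice; the particular encoding name and the exact token IDs produced are incidental to the point.

```python
import tiktoken

# Load a byte-pair-encoding tokenizer (assumption: the "cl100k_base"
# encoding used by several OpenAI models; any tokenizer illustrates the idea).
enc = tiktoken.get_encoding("cl100k_base")

prompt = "Large concept models are an intriguing idea."

# Words in, numbers out: the model never sees raw text, only these integer IDs,
# roughly one per word or word fragment.
token_ids = enc.encode(prompt)
print(token_ids)

# The generated answer comes back as token IDs too, and must be decoded
# into readable text before you ever see it.
round_trip = enc.decode(token_ids)
print(round_trip)  # "Large concept models are an intriguing idea."
```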
Do we have to do things that way?
No, there doesn’t seem to be a fundamental irrefutable law of nature that says we must confine ourselves to a word-at-a-time focus. Feel free to consider alternatives. Let your wild thoughts flow.
Here is an idea. Imagine that whole sentences were the unit of interest. Rather than parsing and aiming at single words, we conceive of the sentence as our primary unit of measure. A sentence is admittedly a collection of words. No disagreement there. The gist is that the sentence is treated as a unit in its own right, whereas right now a sentence happens to be treated as merely a string of words.
Give the AI a sentence, and you get back a generated sentence in return.
Boom, drop the mic.
Making sense of sentences is a bit of a head-scratcher. How do you look at an entire sentence and identify what the meaning or significance of the sentence is?
Aha, let's assume that sentences are representative of concepts. Each sentence will be taken as expressing an underlying concept, and it is those concepts, rather than individual words, that the AI works with.
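As a rough illustration of treating whole sentences as concept-like units, here is a hedged Python sketch using the sentence-transformers library and a small pretrained model; both are my illustrative assumptions rather than anything prescribed by LCM research. Each whole sentence is mapped to a single fixed-length vector, and sentences that express similar concepts land near each other in that vector space.

```python
from sentence_transformers import SentenceTransformer, util

# Assumption: sentence-transformers is installed and this small pretrained
# model is available; any sentence encoder would illustrate the same idea.
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "The cat sat on the mat.",
    "A feline was resting on the rug.",
    "Interest rates were raised by the central bank.",
]

# Each whole sentence becomes one fixed-length vector, a stand-in for
# the "concept" the sentence expresses, rather than a string of word tokens.
embeddings = model.encode(sentences)

# Sentences expressing similar concepts end up close together in the space.
print(util.cos_sim(embeddings[0], embeddings[1]))  # high similarity
print(util.cos_sim(embeddings[0], embeddings[2]))  # low similarity
```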