
Demystifying Bittensor: How Does the Decentralized AI Network Work?

Jan 22, 2025 at 06:35 pm

Bittensor is a decentralized network aimed at forming an intelligent marketplace where high-quality AI models can be developed in a decentralized manner.


Title: Demystifying Bittensor: How Does the Decentralized AI Network Work?

Authors: Ming Ruan, Wenshuang Guo, Animoca Brands Research

Compiled by: Scof, ChainCatcher

Overview: Demand for Decentralized AI

Rapid advancements in artificial intelligence (AI) technology are undeniable, but this progress is not without its challenges. Currently, centralized data training models dominate the field, primarily controlled by tech giants like OpenAI, Google, and X (formerly Twitter).

Despite significant achievements in recent years, centralized AI training has clear limitations. First, the data-training process raises issues such as unauthorized use of private information, data censorship that distorts training outcomes, and a lack of traceability in data sources. On the algorithmic side, centralized models depend heavily on data quality and struggle to perform the real-time evaluations needed for iterative improvement.

Decentralized AI training presents an alternative, but it faces enormous challenges, particularly due to resource shortages. Currently, the cost of training large models exceeds $100 million, making it nearly impossible for community-driven projects to compete. Decentralized efforts rely on voluntary contributions of computational power, data, and talent, but these resources are insufficient to support projects of similar scale. Therefore, the potential of decentralized AI remains limited and cannot fully compete with centralized AI in terms of scale and impact.

Source: Statista

Overview of Bittensor

Bittensor is a decentralized network that aims to form an intelligence marketplace where high-quality AI models can be developed in a decentralized manner. By rewarding participants for contributing computational resources, expertise, and innovation, Bittensor has established an open-source AI-capability ecosystem in which the native token TAO serves both as a reward and as a credential for accessing the network.

The core components of Bittensor, including its Yuma consensus, subnets, and the TAO token, first launched in November 2021 with the "Satoshi" release, built as a parachain on Polkadot. In 2023 the network migrated to its own Layer 1 chain built on Polkadot's Substrate framework, while the TAO issuance schedule remained unchanged.

Bittensor's creator and operating entity, the Opentensor Foundation, was co-founded by former Google engineer Jacob Steeves and machine-learning scholar Ala Shaabana. The foundation currently has about 30 employees, almost all in engineering roles, with no dedicated staff for B2B market expansion, business development, partnerships, or developer relations.

Fundamentals: How Does Bittensor Work?

Bittensor has built an innovative network on a dynamic incentive consensus framework that lets participants contribute the resources needed to produce machine intelligence. Each subnet operates as a model for a specific task, with its own independent performance-evaluation criteria, and incentives are distributed through Bittensor's network-wide Yuma consensus.

Let's illustrate how a subnet operates through an analogy. A subnet can be likened to a magazine publisher that runs a writing competition every month. Each month, an editor publishes a theme, and writers compete for a $10,000 reward pool under the criterion "the work that best embodies the spirit of web3." Writers submit their articles, and every editor evaluates every submission. The aggregated evaluations determine the final rankings: the highest-ranked article is published and receives the largest share of the rewards, while lower-ranked articles may receive smaller rewards. All submitted articles and their scores are shared with the participating writers and editors for feedback and learning. Through this incentive structure, writers keep participating and contributing, the standards of writers and editors gradually converge, and the magazine ends up publishing high-quality articles that best "embody the spirit of web3."

In this analogy, the magazine publisher represents the subnet, the writers represent the miners, and the editors represent the validators. The process of editors aggregating evaluations of the articles is the Yuma consensus mechanism. In actual subnets, miners will receive TAO tokens instead of dollars, and these tokens are allocated by the root subnet (subnet 0); validators are also incentivized to align their standards with the aggregated scores to earn more rewards.
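The reward mechanics of this analogy can be sketched in a few lines of code. This is a deliberately simplified illustration, not Bittensor's actual implementation: the real Yuma consensus uses stake-weighted medians, clipping, and other safeguards, whereas the sketch below uses a plain stake-weighted average and a hypothetical alignment bonus for validators.

```python
# Simplified sketch of a Yuma-style scoring round (illustrative only;
# Bittensor's real consensus is more sophisticated and attack-resistant).

def run_round(scores, stakes, reward_pool):
    """scores[v][m] = validator v's score for miner m; stakes[v] = validator v's stake."""
    total_stake = sum(stakes)
    n_validators = len(stakes)
    n_miners = len(scores[0])

    # 1. Aggregate each miner's score across validators, weighted by stake.
    consensus = [
        sum(scores[v][m] * stakes[v] for v in range(n_validators)) / total_stake
        for m in range(n_miners)
    ]

    # 2. Pay miners in proportion to their consensus score.
    score_sum = sum(consensus) or 1.0
    miner_rewards = [reward_pool * s / score_sum for s in consensus]

    # 3. Reward validators for agreeing with the consensus (smaller deviation = larger share).
    alignment = [
        1.0 / (1.0 + sum(abs(scores[v][m] - consensus[m]) for m in range(n_miners)))
        for v in range(n_validators)
    ]
    align_sum = sum(alignment)
    validator_rewards = [reward_pool * a / align_sum for a in alignment]

    return miner_rewards, validator_rewards
```

In a round with two validators (stakes 2.0 and 1.0) scoring three miners, `run_round([[0.9, 0.5, 0.1], [0.8, 0.6, 0.2]], [2.0, 1.0], 100.0)` pays the best-scored miner the largest share and gives the validator closest to consensus the larger validator reward, mirroring the writers-and-editors incentive in the analogy.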

Within this framework, subnet owners train and acquire intelligent capabilities from miners through validators, building AI modules with specific functionalities. In addition to subnets, Bittensor also has other layers that support the overall functionality of the network:

a. Application Layer

Users can interact with Bittensor through various applications that connect to subnets or act as subnets. Users submit service requests, such as language translation or data analysis, and the applications route the requests to the subnet via the validator API. The best miner answers are selected by validator consensus and returned to the users.
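The request flow above can be sketched as follows. Note that this is a hypothetical illustration of the routing pattern described in the text; the function names (`handle_request`) and the callable miner/validator interfaces are invented for this sketch and are not the Bittensor SDK's actual API.

```python
# Hypothetical sketch of routing a user request through a subnet:
# miners produce candidate answers, validators score them, and the
# highest-consensus answer is returned to the user.

def handle_request(prompt, miners, validators):
    # Each miner produces a candidate response to the user's request.
    candidates = [miner(prompt) for miner in miners]

    # Each validator independently scores every candidate.
    scores = [[validator(prompt, c) for c in candidates] for validator in validators]

    # Average validator scores per candidate and return the best answer.
    avg = [sum(col) / len(validators) for col in zip(*scores)]
    best = max(range(len(candidates)), key=avg.__getitem__)
    return candidates[best]
```

With toy miners that return fixed strings and toy validators that score, say, by answer length, the function returns whichever candidate the validators collectively rank highest, which is the selection-by-validator-consensus step the paragraph describes.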

b. Execution Layer

This layer consists of a group of subnets, all of which use Yuma consensus to train and utilize miners.
