Ethereum Co-Founder Vitalik Buterin Weighs in on AI Regulation, Proposes to Halt Globally Accessible Computational Resources

Jan 06, 2025 at 07:17 pm

Citing the danger posed by superintelligent AI, Ethereum co-founder Vitalik Buterin has proposed temporarily halting globally accessible computational resources in order to “buy more time for humanity”.

Highlighting the pressing danger posed by superintelligent AI, Ethereum co-founder Vitalik Buterin has proposed drastic measures to curb its development. In a recent blog post, Buterin argues that we may have only five years before superintelligent AI arrives, and there is no assurance that its impact will be beneficial.

To address this critical juncture, Buterin proposes a "soft pause" on industrial-scale computer hardware, presenting it as an alternative to halting AI development entirely. He suggests reducing global computing power by 99% for one or two years to "buy more time for humanity to prepare."

By his own account, his earlier post introducing "defensive accelerationism" (d/acc) offered only "vague appeals to avoid building risky forms of superintelligence." Now, he aims to present more concrete thoughts on handling scenarios "where AI risk is high."

Buterin also weighs in on regulating the AI industry, saying he would advocate a hardware "soft pause" only if convinced that liability rules alone are insufficient. Under such rules, individuals or organizations using, deploying, or developing AI could be held legally accountable for damages caused by their models.

He highlights proposals for monitoring AI development, such as identifying the physical locations of AI chips and mandating their registration. To control industrial-scale AI hardware, Buterin suggests integrating chips that require weekly authorization from three major international bodies.

"The signatures would be device-independent (if desired, we could even require a zero-knowledge proof that they were published on a blockchain), so it would be all-or-nothing. There would be no practical way to authorize one device to keep running without authorizing all other devices," Buterin explains.

The d/acc concept, which Buterin introduced, promotes a cautious and measured approach to technological development, in contrast to effective accelerationism (e/acc), which advocates rapid and unrestricted technological advancement.

Disclaimer: info@kdj.com

The information provided is not trading advice. kdj.com assumes no responsibility for any investments made based on the information in this article. Cryptocurrencies are highly volatile, so invest with caution and only after thorough research.

If you believe that content used on this website infringes your copyright, please contact us immediately (info@kdj.com) and we will delete it promptly.
