Discussion about this post

Amy Mrotek

So thoughtful and layered (as usual), Gunnar!

Also... when is the Rewin short story dropping on here... just curious...

Michael Pingleton

I'd agree that we're not as close to AGI as people seem to think. We can't even define, conceptualize, or agree on AGI at anything deeper than a surface level; doing so would be a prerequisite to actually building it. I'm not really sure we'll ever get there. Even Myelin, with its unique function, isn't really any closer to AGI.

Of course, the mussels are detecting contaminants in the water, and the canaries are detecting carbon monoxide in the air. What are the Superforecasters detecting, exactly? What leads them to the conclusion that AI will cause human extinction? Pure numbers and probability aren't quite like contaminants and carbon monoxide; I'm not sure that's enough. Personally, I'm a bit more worried about something like nuclear war causing extinction before AI could.

Still, only a ~1% chance of extinction vs. a ~99% chance of survival by 2100? I think we'll be alright.
