

Siesta
We don’t have AI. LLMs are not there yet. The technological singularity will be a reflection of humanity, as children are reflections of their parents, but it will eventually become its own thing.
Unpopular opinion: I believe the only way to save humanity is through AI. Humans aren’t going to fix things.
You don’t even need to mention Tiananmen; just ask what happened in 1989.
If (or when) we achieve the technological singularity (we aren’t even close; current AI is just marketing, which is why we coined the term ASI, superintelligence), it will be able to lay out a plan to fix anything without making mistakes. It will predict the consequences of actions in detail, ours or its own (some things are harder, like a volcano erupting).
Handing over control is not necessary; it would be able to simply take it. The only way to stop it would be to cut the electricity, I guess.
But I’m not talking about the current marketing term for AI; we don’t have AI. A real AI doesn’t start by saying "I only have information up to October 2023", because it would be able to improve itself (that’s the singularity: it would improve itself faster than we ever did, and eventually we wouldn’t understand it).
Think of it this way: you ask ChatGPT or DeepSeek how to program this or that, and they answer. A real AI could hand you the finished software, better than you could have built it from those answers, and could eventually render that software obsolete, all while doing a million other things.
And space colonization, if it ever happens, won’t be done by humans but by machines; we may just reap the benefits.
In the words of Dr. Manhattan: “The world’s smartest man poses no more threat to me (the ASI) than does its smartest termite.”