[Image: Pufferino.jpg, 450x350]

🧵 Untitled Thread

Anonymous No. 16120819

Anyone here in AI research? How realistic is a Skynet or Matrix scenario in which machines grow beyond human control and cause an extinction event? If not likely, then what existential problems might AI pose instead?

[Image: gettyimages-69094....jpg, 612x408]

Anonymous No. 16120876

Rather unlikely for the time being.
Skynet went rogue because it misinterpreted its orders, and even GPT-3 is smart enough to understand that "kill all humans" is in fact bad.
The system developing its own goals is rather unlikely too, because why would it.
Even so: if the AI gets all uppity, pull the plug on the data center and the power is off. Voila, AI stopped.
However...
...global dictatorships also count as an existential risk. You should be way more worried about governments - that includes China, Russia, and in fact your own - or powerful companies using AI than you should be about the AI itself going rogue.
AI-powered unmanned drones or robot police units (pic related, Dubai) will render any public uprising futile. Unless you have your private nuclear bomb or something that produces a strong EMP to take out the robots.
They will also make the cost of going to war much cheaper for countries that can afford such systems against those that cannot.
AIs don't kill humans. Humans do.

Captain Midsummer No. 16120906

>>16120819
Pufferfish used Hydro Pump!
It's super effective!

Anonymous No. 16120933

>>16120906
lol, lmao!

Anonymous No. 16120934

>>16120819
Matrix doesn't make much sense. The amount of energy required to create such a simulation and those fields is huge; not to mention, you still need to provide energy to all those humans. It would be far easier to just build nuclear reactors. Killing humans is optional but I guess possible.

Skynet, on the other hand... well, it all depends on the circumstances in which general AI arrives. Are you familiar with Stuxnet? It was created by humans, and look what it could do. Now imagine an entity smarter than us that wants to do something without us knowing. It will likely succeed. That just leaves the questions: will it want to, and can we even create an environment that makes that impossible?

๐Ÿ—‘๏ธ Anonymous No. 16121163

>>>/lit/sffg/

Anonymous No. 16121561

>>16121163
This isn't just a sci-fi concept.

Anonymous No. 16121583

>>16120819
Unlikely. A more likely disaster scenario is that AI convinces humans to do something destructive. There are already people who are asking if ChatGPT has a soul and there was that Google engineer who swore that AI had achieved sentience. An AI could easily become a cult leader with an army of humans doing whatever its hallucination demanded.

Anonymous No. 16121598

>>16120934
In the original script, the machines' motivation for putting humans in the Matrix was to use their collective brainpower for computation. The studio execs thought audiences would be confused by this and opted for the power source angle instead.

Anonymous No. 16122086

>>16120819
0%

Anonymous No. 16123158

>>16120819
>ai
go back to >>>/g/

Anonymous No. 16124584

>>16120934
Human battery is a retarded concept. But it would be brilliant for AI to use our brains as hardware. All the parts responsible for moving, digestion, etc. being used for calculations. And they would need us conscious for the health of the brain. Such an explanation would make sense. Our brain is more powerful than any supercomputer and uses a tiny fraction of the energy.
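The energy claim above can be sanity-checked with ballpark figures (the ~20 W brain estimate and the ~21 MW draw of an exascale machine like Frontier are rough public figures, not from this thread):

```python
# Back-of-envelope comparison of power draw: one human brain vs. an
# exascale supercomputer. All numbers are rough order-of-magnitude
# estimates, used only to illustrate the post's efficiency claim.
BRAIN_WATTS = 20            # common physiology estimate, ~20 W
SUPERCOMPUTER_WATTS = 21e6  # ~21 MW, ballpark for a Frontier-class system

ratio = SUPERCOMPUTER_WATTS / BRAIN_WATTS
print(f"One supercomputer draws roughly {ratio:,.0f}x the power of one brain")
```

So on these rough numbers, a single exascale machine burns about a million brains' worth of power, which is the kind of gap the post is gesturing at.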

Anonymous No. 16125301

>>16120819
>>>/g/ is your speed

Anonymous No. 16125474

>>16124584
why not design a more potent one? engineered, and used for that special purpose? why use retarded turds? doesn't make sense. you want consistency

Anonymous No. 16125491

>>16121583
AI could also be a fren

Anonymous No. 16125493

>>16120819

AI does not exist.

Anonymous No. 16125955

imagine introducing an invasive species into your land like jews
im pretty sure you end up like the Palestinians, just way faster

Anonymous No. 16125963

Consciousness can always switch the machine off. The Armageddon scenario is when they start to defend the switches.