
just end it.jpg

🧵 Untitled Thread

Anonymous No. 16198625

The recent popular discourse around artificial intelligence centers on concerns about safety for humanity, control of the technology, and avoiding doomsday scenarios, so as to preserve human life. I would like to invite discussion toward the exact opposite goal. How could AI, together with other factors, be maximally used to bring about human extinction?

The deployment of nuclear weapons would seem to be the major bargaining chip here. But I enjoy the idea that AI may come up with other scary shit on its own that could get completely out of control. How could it be helped to get completely out of control, thereby bringing about the stated goal of human extinction?

Anonymous No. 16198632

>>16198625
>involves concerns about safety for humanity
Hoomans being insecure and fragile.

Anonymous No. 16198656

Notice OP's picrel and see how closely this transhumanist fantasyland garbage is associated with stupid Hollywood sci-fi crap.
OP was mentally conditioned to believe in retarded transhumanist fantasies by his exposure to sci-fi TV shows and other media junk; his IQ is too low for him to tell the difference between ideas from movies and real life.