
🧵 What is thought made of?

Anonymous No. 16186841

We have ChatGPT, which processes words, right? That's what it computes with. But how would you make a system that computes with ideas, with thoughts? What is the fundamental quality that thoughts are made of? If we're going to make an AGI, then you'd think we'd need that prerequisite for it to actually understand anything.


Anonymous No. 16186862

>>16186841
Anything? Like fear, despair, envy, disgust, suffering, hate and wrath?
Do you want to get brutally exterminated?

Anonymous No. 16187193

>>16186841
Just have it randomly fire off and evaluate itself. That's basically how the human mind works. It could draw from previous entries and generated images, mixing and merging them and applying them to the current input.
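A toy sketch of the "randomly fire off and evaluate" loop described above, assuming ideas are just sets of tokens and the evaluation function is an arbitrary stand-in score. Every name here (`evaluate`, `think`, `target`) is hypothetical, not a claim about how minds actually work:

```python
import random

def evaluate(idea, target):
    # Hypothetical fitness: overlap with some target concept set.
    return len(set(idea) & set(target))

def think(memory, target, steps=200, seed=0):
    rng = random.Random(seed)
    best = rng.choice(memory)
    for _ in range(steps):
        # "mix and merge" two previous entries...
        a, b = rng.sample(memory, 2)
        candidate = list(set(a) | set(b))
        # ...and occasionally fire off something random
        if rng.random() < 0.5:
            candidate.append(rng.choice("abcdefgh"))
        # evaluate, keep the best, remember the attempt
        if evaluate(candidate, target) > evaluate(best, target):
            best = candidate
        memory.append(candidate)
    return best

memory = [["a"], ["b"], ["c"]]
result = think(memory, target=["a", "b", "c", "d"])
```

This is just random generate-and-test with memory, which is the closest concrete algorithm to what the post describes.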

Anonymous No. 16187220

>>16186841
I don't care what Scam Altman says, we aren't anywhere close to achieving AGI. LLMs are not capable of abstract reasoning.

Anonymous No. 16187223

>>16186841
Chatbots don't have meaning. Their training is superficial. Consider that ML models actually can't hold conversations with each other. This is paradoxical, since conversations only ever branch out.

Anonymous No. 16187267

>>16187223
>my particular goalpost
Your goalpost can be btfo by a parameter tweak.
>>16187220
>Sam Altman
This shit would be happening with or without Graham's cup-bearer.

Anonymous No. 16187290

>>16187267
NTA but you have absolutely no clue how these LLM systems function. No amount of added network depth or architectural change alters the fact that these systems are just interpolators. They will literally never be able to produce anything more than an interpolation based on their training data. That's all they can possibly do as a result of the fundamentals of how network training and loss functions work.
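A toy illustration of the interpolation claim, with hypothetical names and nothing resembling real LLM internals: a "model" that can only blend adjacent training points, and clamps to the nearest one rather than extrapolating outside their range.

```python
def fit(xs, ys):
    # "Training": memorize the data points, sorted by input.
    data = sorted(zip(xs, ys))

    def predict(x):
        # Outside the training range the model can only clamp,
        # never extrapolate a new trend.
        if x <= data[0][0]:
            return data[0][1]
        if x >= data[-1][0]:
            return data[-1][1]
        # Inside the range: pure linear interpolation between neighbors.
        for (x0, y0), (x1, y1) in zip(data, data[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)

    return predict

model = fit([0.0, 1.0, 2.0], [0.0, 1.0, 4.0])
```

Whether deep networks are "just" this in high dimensions is exactly what the thread is arguing about; the sketch only shows what an interpolator is.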

Anonymous No. 16187306

>>16186841
Thoughts are not a scientific topic. They are beyond the reach of its epistemological tools.

Anonymous No. 16187413

>>16187306
By that logic, science is not a scientific topic from the view of a creationist because they are too dumb.

Anonymous No. 16187717

>>16187267
LLMs are glorified predictive text algorithms. they don't think or feel and they certainly aren't even close to being conscious. the sheer amount of handholding necessary for them to produce anything of value is so extreme that we're better off just talking to real people.
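The "glorified predictive text" mechanism the post is gesturing at can be shown in miniature as a bigram model: predict the next word purely from counts of what followed it in the training text. This is a deliberately crude sketch (real LLMs are far more than bigram counters), with all names hypothetical:

```python
from collections import Counter, defaultdict

def train(text):
    # Count, for each word, which words followed it in the training text.
    counts = defaultdict(Counter)
    words = text.split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, word):
    # Predict the most frequent successor; no understanding involved.
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

model = train("the cat sat on the mat and the cat ran")
```

For example, `predict_next(model, "the")` returns `"cat"`, because "cat" followed "the" most often in the training string.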

Anonymous No. 16189039

>>16186841
Imo the prerequisites for creating consciousness are
>sensory systems
>extensive processing of sensory information
>capability to produce something based on sensory information
>have sufficient memory and remember the past experiences
>capability to think causally / predict what is going to happen
Everything is then built upon this foundation
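The prerequisite list above can be sketched as a minimal agent loop: sense, process, remember, predict, act. Every component here is a placeholder toy, not a claim about consciousness:

```python
class Agent:
    def __init__(self):
        self.memory = []  # past experiences

    def sense(self, world):
        return world  # sensory system (identity placeholder)

    def process(self, percept):
        return {"signal": percept}  # "extensive processing" (stub)

    def predict(self, state):
        # Causal prediction, naively: expect the last remembered state again.
        return self.memory[-1] if self.memory else state

    def act(self, state, expectation):
        # Produce something based on sensory info plus the prediction.
        return "repeat" if state == expectation else "explore"

    def step(self, world):
        state = self.process(self.sense(world))
        expectation = self.predict(state)
        action = self.act(state, expectation)
        self.memory.append(state)  # remember the experience
        return action

agent = Agent()
first = agent.step(1)   # nothing remembered yet, input matches expectation
second = agent.step(2)  # input violates the prediction from memory
```

The point of the sketch is only structural: each bullet in the list maps to one method, and the loop only does anything interesting once memory and prediction interact.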

Anonymous No. 16189041

>>16186841
Neurons.

Anonymous No. 16189050

>>16189041
That's like saying words are made of transistors in a computer.


Anonymous No. 16189069

>>16186841
thoughts are made of words, either yours or somebody else's
https://m.youtube.com/watch?v=bPcgoaZjsu8

Anonymous No. 16189087

>>16189069
No, they are not, and I really can't fathom how some people are stupid enough to genuinely believe that, just considering the contradictions that belief creates. Your ability to think must be severely deficient. Words come in addition to thoughts.

Anonymous No. 16189230

>>16189069
Plenty of people don't think in words. Abstract concepts are easier and less limiting than words, plus needing to think through a full sentence before reaching the end of a thought is absurd. Why not just skip to the end of the sentence?

Anonymous No. 16189266

>>16189041
Not only. As far as I know electronical neuron network collapses upon itself after a certain amount of artificial neurons (it's been ten years since I had basics of AI in university, so my knowledge might be out of date). Glands play huge role in the communication between various parts of the body, I can't see why it wouldn't support thought processes as well.