🧵 What is thought made of?
Anonymous at Tue, 21 May 2024 21:55:51 UTC No. 16186841
We have ChatGPT, which processes words, right? That's what it computes with. But how would you make a system that computes with ideas, with thoughts? What is the fundamental quality that thoughts are made of? If we're going to make an AGI, then you'd think we'd need that prerequisite for it to actually understand anything.
Anonymous at Tue, 21 May 2024 22:08:49 UTC No. 16186862
>>16186841
Anything? Like fear, despair, envy, disgust, suffering, hate and wrath?
Do you want to get brutally exterminated?
Anonymous at Wed, 22 May 2024 02:09:56 UTC No. 16187193
>>16186841
Just have it randomly fire off and evaluate itself. That's basically how the human mind works. It could draw from previous entries and generated images, mix and merge them, and apply them to the current input.
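A minimal sketch of what that generate-and-evaluate loop could look like (everything here is hypothetical; the scoring function is a toy stand-in for whatever "evaluate itself" would really mean):

```python
import random

# Hypothetical sketch: propose random combinations of stored fragments
# ("mix and merge previous entries"), score each against the current
# input, and keep the best candidate.
def propose(memory):
    # pick two previous entries at random and merge them
    a, b = random.sample(memory, 2)
    return a + "+" + b

def evaluate(candidate, current_input):
    # toy relevance score: characters shared with the current input
    return len(set(candidate) & set(current_input))

def think(memory, current_input, rounds=100):
    best, best_score = None, -1
    for _ in range(rounds):
        cand = propose(memory)
        score = evaluate(cand, current_input)
        if score > best_score:
            best, best_score = cand, score
    return best

# given the input "category", the fragments "cat" and "dog" score highest
print(think(["cat", "dog", "sun"], "category"))
```

The point of the sketch is only the structure: random generation plus self-evaluation, with memory feeding the generator.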
Anonymous at Wed, 22 May 2024 02:27:23 UTC No. 16187220
>>16186841
I don't care what Scam Altman says, we aren't anywhere close to achieving AGI. LLMs are not capable of abstract reasoning.
Anonymous at Wed, 22 May 2024 02:29:07 UTC No. 16187223
>>16186841
Chatbots don't have meaning. Their training is superficial. Consider that ML models actually can't have conversations with each other. That's paradoxical, as conversations only ever branch out.
Anonymous at Wed, 22 May 2024 03:17:06 UTC No. 16187267
>>16187223
>my particular goalpost
Your goalpost can be btfo by a parameter tweak.
>>16187220
>Sam Altman
This shit would be happening with or without Graham's cup-bearer.
Anonymous at Wed, 22 May 2024 03:33:30 UTC No. 16187290
>>16187267
NTA, but you have absolutely no clue how these LLM systems function. No amount of added network depth or architectural change alters the fact that these systems are just interpolators. They will literally never be able to produce anything more than an interpolation of their training data. That's all they can possibly do, as a result of how network training and loss functions fundamentally work.
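Whether that characterization actually applies to LLMs is exactly what's being argued here, but as a loose analogy, a toy 1-D interpolator shows the claimed limitation: inside the training range it blends neighboring points, and outside it can only repeat what it saw:

```python
# Toy illustration of the "just an interpolator" claim: a 1-D model
# that can only blend its training points. Inside the training range
# it looks competent; past the boundary it just repeats the endpoint.
def interpolate(train_x, train_y, x):
    pairs = sorted(zip(train_x, train_y))
    # outside the data, clamp to the nearest endpoint
    if x <= pairs[0][0]:
        return pairs[0][1]
    if x >= pairs[-1][0]:
        return pairs[-1][1]
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

# "trained" on y = x^2 at a few points
xs, ys = [0, 1, 2, 3], [0, 1, 4, 9]
print(interpolate(xs, ys, 1.5))  # 2.5, a blend of neighbors (true value: 2.25)
print(interpolate(xs, ys, 10))   # 9, stuck at the training boundary (true value: 100)
```

This is an analogy for the argument, not a claim about transformer internals.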
Anonymous at Wed, 22 May 2024 03:40:43 UTC No. 16187306
>>16186841
Thoughts are not a scientific topic. They are beyond the reach of its epistemological tools.
Anonymous at Wed, 22 May 2024 06:23:33 UTC No. 16187413
>>16187306
By that logic, science is not a scientific topic from the view of a creationist because they are too dumb.
Anonymous at Wed, 22 May 2024 11:47:12 UTC No. 16187717
>>16187267
LLMs are glorified predictive-text algorithms. They don't think or feel, and they certainly aren't anywhere close to being conscious. The sheer amount of handholding necessary for them to produce anything of value is so extreme that we're better off just talking to real people.
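For illustration, here is an actual predictive-text algorithm in miniature: a bigram model that predicts the next word purely from counts in its training corpus (a toy sketch of the "predictive text" idea, not of how an LLM works internally):

```python
from collections import Counter, defaultdict

# Toy "predictive text": count which word follows which in training,
# then always predict the most frequent continuation.
def train(corpus):
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word):
    # most frequent continuation seen in training; no understanding involved
    followers = model.get(word)
    return followers.most_common(1)[0][0] if followers else None

model = train("the cat sat on the mat and the cat slept")
print(predict(model, "the"))  # "cat" (seen twice after "the", vs "mat" once)
```

Real LLMs replace the count table with a learned neural distribution over tokens, but the output interface is the same: a guess at what comes next.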
Anonymous at Thu, 23 May 2024 06:17:11 UTC No. 16189039
>>16186841
Imo the prerequisites for creating consciousness are
>sensory systems
>extensive processing of sensory information
>the capability to produce something based on sensory information
>sufficient memory to retain past experiences
>the capability to think causally / predict what is going to happen
Everything is then built upon this foundation.
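A minimal sketch of the loop those prerequisites imply (all names hypothetical): sense, predict from remembered experience, act on the prediction, then store the outcome:

```python
# Hypothetical sketch of the prerequisite loop:
# sense -> process -> predict causally -> remember.
class Agent:
    def __init__(self):
        self.memory = []  # past experiences as (observation, outcome) pairs

    def sense(self, environment):
        return environment  # trivial sensory system for the sketch

    def predict(self, observation):
        # causal prediction: recall the outcome of the most recent
        # matching past experience, if any
        for past_obs, past_outcome in reversed(self.memory):
            if past_obs == observation:
                return past_outcome
        return None

    def step(self, environment, actual_outcome):
        observation = self.sense(environment)
        guess = self.predict(observation)
        self.memory.append((observation, actual_outcome))  # remember
        return guess

agent = Agent()
agent.step("dark clouds", "rain")         # first encounter: no prediction yet
print(agent.step("dark clouds", "rain"))  # prints "rain": predicted from memory
```

The sketch only shows the scaffolding the list describes; nothing in it argues that this scaffolding is sufficient for consciousness.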
Anonymous at Thu, 23 May 2024 06:18:46 UTC No. 16189041
>>16186841
Neurons.
Anonymous at Thu, 23 May 2024 06:26:06 UTC No. 16189050
>>16189041
That's like saying words are made of transistors in a computer.
Anonymous at Thu, 23 May 2024 06:48:58 UTC No. 16189069
>>16186841
thoughts are made of words, either yours or somebody else's
https://m.youtube.com/watch?v=bPcgo
Anonymous at Thu, 23 May 2024 07:01:09 UTC No. 16189087
>>16189069
No, they are not, and I really can't fathom how some people are stupid enough to actually believe that, just in terms of the contradictions that belief creates. Your ability to think must be severely deficient. Words come in addition to thoughts.
Anonymous at Thu, 23 May 2024 09:39:21 UTC No. 16189230
>>16189069
Plenty of people don't think in words. Abstract concepts are easier and less limiting than words, and needing to think through a full sentence before being able to get to the end of a thought is absurd. Why not just skip to the end of the sentence?
Anonymous at Thu, 23 May 2024 10:19:29 UTC No. 16189266
>>16189041
Not only. As far as I know, artificial neural networks collapse upon themselves past a certain number of artificial neurons (it's been ten years since I took the basics of AI at university, so my knowledge might be out of date). Glands play a huge role in communication between various parts of the body; I can't see why they wouldn't support thought processes as well.