🧵 Untitled Thread
Anonymous at Tue, 25 Jun 2024 20:32:18 UTC No. 16253425
Ok, so, there are 86 billion neurons in the human brain. To fully simulate a single neuron at the biophysical model level is high-balled at around 10GB.
Okay, how optimized is the human brain, REALLY? Because with numbers like that, you could conservatively build a stacked LLM supercluster able to fully simulate a functional human brain itself, for less than a billion dollars.
Microsoft and OpenAI are building a $100 billion AI data center.
Doing any kind of rough math on the hardware you get for that budget, they will have the capacity to simulate the neural activity of at least a whole human brain, no diff.
Dollars for dollars, silicon to neuron.
There are enough transistors in that theoretical data center to simulate vastly more interconnections than the number of known electrochemical synaptic/neuronal inputs and outputs in the typical human brain.
On top of all that, this year we have just started using the most powerful MRI machine ever (it cost $70 million to build); it can take the most detailed scans of the human brain we've ever been able to take. (How hard will it be for GPT6 to construct a human brain for simulation if it gets trained on a few petabytes of scans from that fancy MRI machine? Better question: is GPT4 able to generate biologically sound replica brain scans? Because you see, in real life we have GPT4 and that MRI machine.
Just a few thousand volunteers, and we can optimize for details like intelligence and pacifistic tendencies.) cont-
Anonymous at Tue, 25 Jun 2024 20:33:08 UTC No. 16253427
>>16253425
>Cont-
Now that could seem pie in the sky, but consider where we are with brain-machine interfacing. Neuralink works; we have hard evidence of that.
The lines between brain and binary are going to cross soon, in a way they never have before.
Why should I not be concerned here?
Someone is going to start simulating things that are so much, drastically so much, more intelligent than a human, that we should probably be afraid of being marginalized, right?
I don't know what to do except switch into a hands-on trade like welding, because robots that can do it reliably will take a little longer to arrive than the lowest-hanging fruit?
Wow, future seems really strange at the moment.
Anonymous at Tue, 25 Jun 2024 20:48:22 UTC No. 16253482
>>16253425
>To fully simulate a single neuron at the biophysical model level is high-balled at around 10GB.
whoever wrote that was a retard
Anonymous at Tue, 25 Jun 2024 21:10:36 UTC No. 16253555
>10gb
Lmao, imagine modelling all 86 billion neurons with 10GB.
With all the computational power in the world, we cannot even simulate the ~300 neurons of C. elegans. See the OpenWorm project
Anonymous at Tue, 25 Jun 2024 21:20:39 UTC No. 16253584
>>16253555
Nvm I'm a retard. Disregard this
Anonymous at Tue, 25 Jun 2024 21:46:07 UTC No. 16253670
>>16253425
>To fully simulate a single neuron at the biophysical model level is high-balled at around 10GB.
Lmao it goes far deeper than simulating a neuron. It is the cohesion of all the various types of neurons in the brain that is thought to generate consciousness. Trying to get 2 or 3 of these simulated neurons to perform in a holistic and cohesive way sounds difficult. To get 100 million working in cooperation sounds nightmarish and impossible, and even if it was done, I seriously doubt we would be able to communicate with it in a meaningful way without destroying the system.
Anonymous at Tue, 25 Jun 2024 23:41:22 UTC No. 16253888
>>16253555
>simulate a single neuron
>10GB
that's what was typed
Anonymous at Wed, 26 Jun 2024 00:42:11 UTC No. 16254009
>>16253425
>Equating current computational frameworks with biological models
>With wrong numbers to boot
midwit moment
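run the numbers yourself. a rough sketch taking OP's 10GB-per-neuron figure at face value (the $/TB prices here are ballpark guesses, not real quotes):

# OP's own figures, taken at face value
NEURONS = 86e9            # neurons in a human brain
GB_PER_NEURON = 10        # OP's high-ball per-neuron estimate
USD_PER_TB_DISK = 15      # assumed bulk HDD price, ballpark
USD_PER_TB_DRAM = 3000    # assumed server DRAM price, ballpark

total_tb = NEURONS * GB_PER_NEURON / 1000          # GB -> TB
print(f"state to hold: {total_tb:.2e} TB (~{total_tb / 1e6:.0f} exabytes)")
print(f"just parking it on disk: ${total_tb * USD_PER_TB_DISK:,.0f}")
print(f"holding it all in RAM:   ${total_tb * USD_PER_TB_DRAM:,.0f}")

that's hundreds of exabytes of state before a single timestep gets computed, so "less than a billion dollars" doesn't survive its own assumptions.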
🗑️ Anonymous at Wed, 26 Jun 2024 03:42:56 UTC No. 16254184
>>16253425
>To fully simulate a single neuron at the biophysical model level is high-balled at around 10GB.
Not at all, it has been estimated to be worth about 6 layers with 250 neurons each, or some such amount. That is much less than 10GB.
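Quick sanity check on the size of a net like that (the 1000 synaptic inputs and single output are assumptions I'm making up for the estimate; the 6x250 shape is the claim above):

N_INPUTS = 1000                 # assumed number of synaptic inputs
HIDDEN = [250] * 6              # the "6 layers of 250 neurons" claim
N_OUTPUTS = 1                   # e.g. one somatic output

sizes = [N_INPUTS] + HIDDEN + [N_OUTPUTS]
n_params = sum(a * b + b for a, b in zip(sizes[:-1], sizes[1:]))  # weights + biases per layer

print(f"parameters: {n_params:,}")                  # ~560k
print(f"as float32: {n_params * 4 / 1e6:.1f} MB")   # a couple of MB, nowhere near 10GB

Even if the input count is off by an order of magnitude or two, it stays in the megabytes-to-low-hundreds-of-MB range.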
Anonymous at Wed, 26 Jun 2024 19:13:34 UTC No. 16255174
>>16253425
>>16253427
https://youtu.be/Pelrr__9qx8
Anonymous at Wed, 26 Jun 2024 22:06:36 UTC No. 16255511
>LLM fanboy midwits once again assume a 1:1 relationship between simulations and FNNs that doesn't exist and assume the existence of a complete model of neuron function that doesn't actually exist yet
modern FNNs trained on actual simulation data are interpolating inputs in ways constrained by the training results; they can point you to possible results/configurations that you may have missed, and do so very quickly compared to comprehensive simulation of the input domain, but nothing guarantees their search is comprehensive and you still have to do actual simulations to confirm it. perceptrons are not emulating the problem solving process, they are matching the distributions of the input-output relations they were trained on - this severely limits their ability to produce accurate "results" outside the scope of training inputs (i.e. to discover something new). you must still do the granular simulations or experiments unless all of this is true:
a) you understand the system well enough to design and simulate an approximated model of it (assuming it CAN be approximated)
b) the approximating simulation can itself be approximated by output matching from its past results (you still have to run the simulations first)
c) FNN output matching is more efficient than simply simulating the model again
popsci midwits seem to have this idea that if you fed a bunch of Navier-Stokes or three(or more)-body problem solutions into a FNN, you'd have an efficient solver for the problems - what you actually have is a probabilistic solution regurgitator. useless? no. mathematically very different from an actual solver? yes.
simulation of a system you don't understand well enough to approximate requires actual solvers. we don't understand neurons well enough to approximate them (or know how well they can be approximated), period. in "computing power" terms, until we have that understanding, any figures for "neuron simulation complexity" are guesswork at best.
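toy version of the interpolation point, with a known function standing in for "the simulation" (uses scikit-learn; the architecture and ranges are arbitrary choices, not anything neuron-specific):

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# "simulation results" we trained on: sin(x) sampled only on [0, 2*pi]
x_train = rng.uniform(0.0, 2 * np.pi, size=(2000, 1))
y_train = np.sin(x_train).ravel()

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
net.fit(x_train, y_train)

for x in (1.0, 5.0, 9.0, 12.0):   # the last two lie outside the training range
    pred = net.predict(np.array([[x]]))[0]
    print(f"x = {x:4.1f}   sin(x) = {np.sin(x):+.3f}   net says {pred:+.3f}")

inside the training range it matches fine; outside it just extends whatever shape it learned and gives no warning that it left its distribution. you only catch it here because the real function is available to check against - which is exactly what you don't have for a system you can't already simulate.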
Anonymous at Wed, 26 Jun 2024 23:15:10 UTC No. 16255613
>>16253425
Consciousness motherfucker, you can "simulate" a physical brain all you want but it won't work without an operator.
Anonymous at Thu, 27 Jun 2024 18:38:18 UTC No. 16256998
>>16253425
10gb of what?
Anonymous at Thu, 27 Jun 2024 18:45:52 UTC No. 16257023
The problem is analog-to-digital conversion. The brain is analog; it does not work in absolutes. You're trying to simulate a highly advanced analog computer in a digital one. And it's the same the other way around: a human brain can't do mathematical operations like a computer can.
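A small illustration of the conversion problem (the equation and constants are a stand-in for any smooth analog quantity, not a real neuron model): the continuous thing has to be chopped into discrete steps, and the answer depends on how finely you chop.

import math

TAU, I, T_END = 10.0, 1.0, 50.0          # made-up constants for dV/dt = -V/TAU + I

def final_value(dt):
    v, t = 0.0, 0.0
    while t < T_END:
        v += dt * (-v / TAU + I)          # one discrete (forward-Euler) step
        t += dt
    return v

exact = I * TAU * (1.0 - math.exp(-T_END / TAU))   # the continuous, "analog" answer

for dt in (5.0, 1.0, 0.1, 0.01):
    v = final_value(dt)
    print(f"dt = {dt:>5}: V = {v:.4f}   (off by {abs(v - exact):.4f})")

Here there's a closed-form answer to compare against, so you can watch the discretization error shrink as the step gets smaller. A real brain has no closed form, so you never know exactly how much the chopping itself changed the result.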
Anonymous at Thu, 27 Jun 2024 18:46:29 UTC No. 16257026
>>16255613
>thing the brain produces within itself
>somehow necessary to create outside of the brain
??
Anonymous at Fri, 28 Jun 2024 11:47:54 UTC No. 16257998
>>16253425
>you could conservatively build a stacked LLM supercluster able to fully simulate a functional human brain itself
No, you couldn't. You have no idea what you're even talking about.
Anonymous at Sat, 29 Jun 2024 07:36:49 UTC No. 16259442
This thread is very confusing
Anonymous at Sat, 29 Jun 2024 09:16:28 UTC No. 16259505
>>16256998
dedotated wam