🧵 Untitled Thread
Anonymous at Sun, 30 Jun 2024 22:18:23 UTC No. 16262037
> be me
> looking into AGI startup
> ask the developer if their AGI is real or just a fancy wrapper for a chat model
> they don't understand
> pull out illustrated diagram explaining what is AGI and what is wrapper
> they laugh and say "it's a real AGI, sir"
> run evals
> it's just a fancy wrapper for a chat model
Anonymous at Sun, 30 Jun 2024 22:36:19 UTC No. 16262058
>>16262037
Real AI is centuries away
Anonymous at Sun, 30 Jun 2024 23:37:30 UTC No. 16262121
Frog thread bumped
Anonymous at Mon, 1 Jul 2024 00:08:58 UTC No. 16262146
I wish people would actually research pre-chat-model, pre-RLHF base models; they are so much better. Unfortunately using them requires talent, so the skill floor is higher.
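here's roughly what using one looks like (gpt2 as a stand-in checkpoint, swap in whatever base model you want). no chat template to lean on, so all the work is in framing the task as a document whose natural continuation is the answer:

[code]
# minimal sketch of prompting a base (non-RLHF) model; assumes the
# transformers library is installed, gpt2 is just a stand-in checkpoint
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# no chat template, no instruction scaffolding - you steer the model
# purely by writing a prefix it will want to continue
prompt = "Q: why do base models need careful prompting?\nA:"
ids = tok(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=60, do_sample=True, top_p=0.9)
print(tok.decode(out[0], skip_special_tokens=True))
[/code]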
Anonymous at Mon, 1 Jul 2024 00:14:09 UTC No. 16262152
>>16262037
how about you go back to the 'ddit
Anonymous at Mon, 1 Jul 2024 00:36:23 UTC No. 16262170
Man what a shitty boring stale meme. Did you steal this from twitter?
Anonymous at Mon, 1 Jul 2024 03:36:57 UTC No. 16262379
>>16262146
got any good links or resources? i crave AI research that isn't just "perceptron but large"
Anonymous at Mon, 1 Jul 2024 03:54:22 UTC No. 16262403
>>16262379
no, anon, that's exactly what it still is. RLHF just modifies the training procedure: it's more of a customization of the resulting output probability distributions than just feeding data in, but the model is still regurgitating a probability function, now with "reinforcement learning from human feedback" adjustments applied to it. perceptrons are basically the only architecture we can run or train efficiently, and have been for over half a century.
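toy numbers to make that concrete (made-up logits, not from any real model) - RLHF just nudges the same next-token distribution around, the sampling mechanism is untouched:

[code]
# toy illustration: generation is sampling from a probability
# distribution over tokens, RLHF or not (all numbers invented)
import numpy as np

logits = np.array([2.0, 1.0, 0.5, -1.0])       # base model scores for 4 tokens
rlhf_shift = np.array([0.0, 0.8, -0.5, -2.0])  # hypothetical fine-tuning nudge

def sample(scores):
    p = np.exp(scores - scores.max())
    p /= p.sum()                               # softmax -> probability function
    return np.random.choice(len(p), p=p)

next_token_base = sample(logits)               # same mechanism...
next_token_rlhf = sample(logits + rlhf_shift)  # ...just a reshaped distribution
[/code]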
we actually know biological brains don't work like that, assuming the neuron/synapse architecture is the "source" of intelligence. real brains have feedback paths and cyclical structures that perceptrons lack: between each pair of perceptron layers sits a complete bipartite digraph that only moves in one direction, so information has nowhere to go but toward the outputs. (no, backpropagation isn't the same thing - that's a process outside the perceptron that adjusts the weights during training.) we legitimately have no idea how to train artificial neural nets with cyclic subgraphs as efficiently as we train perceptrons, and we also have no idea how to train even perceptrons in real time. we've been stuck there for half a century.
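if the graph argument isn't obvious, here's the entire forward pass of a two-layer perceptron in numpy. notice there is literally nowhere for a value to flow except toward the output, and any weight update has to happen in a separate loop outside this function:

[code]
# minimal numpy sketch: activations only ever flow layer k -> layer k+1
# (complete bipartite, one direction); the graph is acyclic by construction
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # layer 1 -> layer 2 weights
W2 = rng.normal(size=(1, 8))   # layer 2 -> output weights

def forward(x):
    h = np.tanh(W1 @ x)        # no edges back to x
    return W2 @ h              # no edges back to h

y = forward(rng.normal(size=4))  # information moved forward, and only forward
[/code]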