
AGI Thread


               
2025 Apr 22, 1:05pm   76 views  0 comments

by MolotovCocktail

Few specialists understand that expecting an LLM to achieve superintelligence is no different from expecting an ant or a cockroach to achieve superintelligence. You can make a bigger cockroach, but complex, abstract intelligence requires more sauce than merely additional computational power. LLMs do not understand what words represent in the abstract; they merely predict what the next word in a sentence is likely to be. This is why LLMs have so much trouble adding numbers (as described in the video): they do not understand numbers as concepts, only as sentence fragments, and they attempt to guess the next number the same way they guess the next word.
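You can see the "numbers as sentence fragments" point for yourself. Below is a minimal sketch using OpenAI's tiktoken library (my choice of tokenizer, not something from the linked post; other models tokenize differently) that shows how a long number gets carved into arbitrary multi-digit chunks instead of being treated as one quantity:

# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by GPT-3.5/GPT-4-era models.
enc = tiktoken.get_encoding("cl100k_base")

for text in ["7", "77", "7777777777", "123456789 + 987654321 ="]:
    ids = enc.encode(text)
    pieces = [enc.decode([i]) for i in ids]
    print(f"{text!r} -> {pieces}")

# A long number comes back as several multi-digit chunks, not one token,
# so the model "sees" text fragments rather than a numeric value and has
# to guess the digits of a sum the same way it guesses the next word.

Run it and a ten-digit number comes back as chunks of up to three digits at a time, which is consistent with why exact arithmetic on long numbers trips these models up.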

https://open.substack.com/pub/alwaysthehorizon/p/how-to-build-an-agi-and-why-an-llm


no comments found
