From reading all the comments from the community, it’s amazing (yet not surprising) that so many managers have fallen for the marketing of these LLMs. These LLMs have gotten people from all levels of society to accept the marketing without ever considering the actual results for their own use cases. It’s almost like the sycophantic nature of LLMs has blinded people to reason, just because the thing is shiny and it spoke to them in a way no one has in years.
On the surface, LLMs are cool, no doubt; they do have some uses. But past that, everyone needs to accept their limitations. LLMs by nature cannot operate the same way a human brain does. AGI is such a long shot because of this, and marketing LLMs as AGI is a scam. How can we hope to recreate the human brain as AGI when we are nowhere close to mapping out how our brains work in a way that could be translated into code, let alone the simpler brains elsewhere in the animal kingdom?
One of the best-written comments I’ve seen on this. LLMs are cool for what they can do, but anyone comparing them to AGI is just shilling, trying to make a fortune selling pickaxes in a gold rush where the only gold is fool’s gold.
> From reading all the comments from the community, it’s amazing (yet not surprising) that so many managers have fallen for the marketing of these LLMs.

This is probably related to automation bias and wishful thinking.
I don’t think LLMs will become AGI, but… planes don’t fly by flapping their wings. We don’t necessarily need to know how animal brains work to achieve AGI, and it doesn’t necessarily have to work anything like animal brains. It’s quite possible if/when AGI is achieved, it will be completely alien.
Aircraft wings operate on pretty much the same principle as bird wings do. We just used a technology we had already developed (fans, essentially) to create the forward movement needed to generate airflow over the wings for lift. We know how to do it the bird way too, but limits in materials science at scale make the fan method far easier and less error-prone.
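(A quick back-of-the-envelope sketch, not from the thread, just to make the airflow point concrete: the standard lift equation says lift scales with the square of the airspeed over the wing, and it doesn’t care whether that airflow comes from flapping or from a fan pushing the craft forward. Every number below is an illustrative assumption.)

```python
# Standard lift equation: L = 0.5 * rho * v^2 * S * C_L
# All values are illustrative assumptions, not data from the thread.

rho = 1.225    # air density at sea level, kg/m^3
v = 70.0       # airspeed over the wing, m/s (rough airliner takeoff speed)
S = 120.0      # wing area, m^2 (assumed)
C_L = 1.2      # lift coefficient in takeoff configuration (assumed)

lift = 0.5 * rho * v**2 * S * C_L  # lift force in newtons
print(f"lift ≈ {lift / 1000:.0f} kN, supports ≈ {lift / 9.81 / 1000:.0f} tonnes")
```

With these assumed numbers that works out to roughly 432 kN, enough to hold up about 44 tonnes; the source of the 70 m/s airflow never enters the equation, which is the commenter’s point.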