
>If you have a single system that can solve any problem any human can, I'd call that ASI

I don't think that's the popular definition.

AGI = solve any problem any human can. In this case, we've not reached AGI since it can't solve most FrontierMath problems.

ASI = intelligence far surpasses even the smartest humans.

If the definition of AGI is that it's more intelligent than the average human, you can argue that we already have AGI today. But no one thinks we have AGI today. Therefore, AGI is not Claude 3.5.

Hence, I think the most acceptable definition for AGI is that it can solve any problem any human can.



>I don't think that's the popular definition.

People have all sorts of definitions for AGI. Some are more popular than others, but at this point there is no one true definition. Even OpenAI's definition is different from what you have just said. They define it as "highly autonomous systems that outperform humans at most economically valuable work".

>AGI = solve any problem any human can.

That's a definition some people use, yes, but a machine that can solve any problem any human can is by definition super-intelligent and super-capable, because there exists no single human who can solve any problem any human can.

>If the definition of AGI has is that it's more intelligent than the average human, you can argue that we already have AGI today. But no one thinks we have AGI today.

There are certainly people who do, some of whom are pretty well respected in the community, like Norvig.

https://www.noemamag.com/artificial-general-intelligence-is-...


>That's a definition some people use yes but a machine that can solve any problem any human can is by definition super-intelligent and super-capable because there exists no human that can solve any problem any human can.

We don't need every human in the world to learn complex topology like Terence Tao. Some need to be farmers. Some need to be engineers. Some need to be kindergarten teachers. When we need someone to solve those problems, we can call Terence Tao.

When AI needs to solve those problems, it can't do it without humans in 2024. Period.

That's the whole point of this discussion.

The definition of ASI historically is that it's an intelligence that far surpasses humans - not at the level of the best humans.


>We don't need every human in the world to learn complex topology math like Terence Tao. Some need to be farmers. Some need to be engineers. Some need to be kindergarten teachers.

It doesn't have much to do with need. Not every human can be as capable, regardless of how much need or time you allocate for them. And some humans are head and shoulders above their peers in one field but fall a bit short in a closely related one they've sunk a lot of time into.

Like I said, arguing about a one true definition is pointless. It doesn't exist.

>The definition of ASI historically is that it's an intelligence that far surpasses humans - not at the level of the best humans.

A machine that is expert-level in every single field would likely far surpass the output of any human very quickly. Yes, there might exist intelligences that are significantly more 'super', but that is irrelevant. Competence, like generality, is a spectrum. You can have two super-human intelligences with a competence gap between them.


There is no one true definition, but your definition is definitely the less popular one.



