
Doing any job for more than an hour without completely forgetting its goals and tasks


How long do you expect LLMs/agents to be unable to do this?


Good question. I'm working on exactly this; I suppose you could call it a replacement for RAG.

It's actually not easy to achieve. I could give a very long-winded answer (don't tempt me), but suffice it to say it's a resolution problem.

All AI models have a fixed resolution at creation. A long-running task narrows its focus to a very particular space at each step, so a task of unbounded length would require unbounded resolution.

No number of 9s of reliability will ever fix this.

Funnily enough, small animals do this with ease, so I strongly disagree with the idea that our AI outcompetes even small mammals in every way.


Personally, I think that phenomenon (along with "hallucinations") is fundamentally baked into LLMs writ large.

I think LLMs are a dead end on the path to AGI.


I think hallucinations are actually the sign that LLMs are far closer to a real brain than we realize.

I think hallucinations are a major unexplored gateway to AGI.


I agree. Whenever people complain about LLM hallucinations, they act as if they've never seen one in a human.

Not only do humans hallucinate all the time, they also have persistent hallucinations, as is evident from the opposing beliefs held across various slices of society.



