
LLMs mimic intelligence, but they aren’t intelligent.

They aren’t just intelligence mimics; they are people mimics, and they’re getting better at it with every generation.

Whether they are intelligent or not, whether they are people or not, it ultimately does not matter when it comes to what they can actually do, what they can actually automate. If they mimic a particular scenario or human task well enough that the job gets done, they can replace intelligence even if they are “not intelligent”.

If by now someone still isn’t convinced that LLMs can indeed automate some of those intelligence tasks, then I would argue they are not open to being convinced.



They can mimic well-documented behavior. Applying an LLM to a novel task is where the model breaks down. This obviously has huge implications for automation. For example, most businesses do not have unique ways of handling accounting transactions, yet each company has a litany of AR and AP specialists who create seemingly unique SOPs. LLMs can easily automate that work, since those specialists are doing, at best, a slight variation on a very well documented system.
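
To make the AR/AP point concrete, here is a rough sketch (mine, not the parent's) of what "have the LLM follow the documented SOP" could look like. The SOP text, the model name, and the OpenAI SDK call are illustrative assumptions, not a claim about how any real AP system is built:

    # Hypothetical sketch: ask an LLM to apply a documented AP SOP to one invoice.
    # Assumes the OpenAI Python SDK and OPENAI_API_KEY in the environment;
    # the SOP, model name, and invoice text are placeholders.
    from openai import OpenAI

    client = OpenAI()

    SOP = (
        "Three-way match: the invoice must reference an approved PO, "
        "quantities must match the goods receipt, and amounts over "
        "$10,000 require controller approval before posting."
    )

    invoice_text = "..."  # extracted text of the incoming invoice (left elided)

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "You are an AP clerk. Follow this SOP exactly:\n" + SOP},
            {"role": "user",
             "content": "Decide whether to post, hold, or escalate this invoice, "
                        "and cite the SOP rule you applied.\n\n" + invoice_text},
        ],
    )
    print(resp.choices[0].message.content)

The point being: the hard part isn't the model call, it's that the SOP is already written down.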

Asking an LLM to take all this knowledge and apply it to a new domain? That will take a whole new paradigm.


Absolutely agreed, but I suspect that a whole lot of what humans do every day can be reduced to pattern-following.

If/when LLMs or other AIs can create novel work / discover new knowledge, they will be "genius" in the literal sense of the word.

More genius would be great! (Probably.) But genius is not required for the vast majority of tasks.


> Applying an LLM to a novel task is where the model breaks down

I mean, don't most people break down in this case too? I think this needs to be more precise. What is the specific task that can reliably distinguish an LLM's capability in this sense from what a human can typically manage?

That is, in the sense of [1], what is the result we're looking to use to differentiate the two?

[1] https://news.ycombinator.com/item?id=44913498



