
hallucinations decrease with scale and reasoning; the model just gets better and stops making stuff up.


o1 still hallucinates badly though.


False. Rare facts may be seen only once in training, and one mis-step in reasoning and your CoT is derailed.
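
Back-of-envelope version of the derailment claim (a toy sketch, assuming each step is independently correct with probability p, which real traces aren't):

    # Toy model: a k-step chain of thought survives only if every
    # step is correct; with independent per-step accuracy p that's p**k.
    def chain_success(p: float, k: int) -> float:
        return p ** k

    # Even at 95% per-step accuracy, long chains fail more often than not:
    for k in (1, 5, 10, 20, 50):
        print(f"{k:2d} steps: {chain_success(0.95, k):.2%}")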


> one mis-step in reasoning and your CoT is derailed.

tell me you've never seen reasoning traces without telling me. real traces backtrack and self-correct after mis-steps all the time; one wrong step doesn't derail the whole chain.



