Hacker News

Human beings regularly hallucinate details that aren’t real when asked to recall an event, and often don’t realize they’re doing it at all. So while AI is definitely lacking in the “can assess fact versus fiction” department, that’s a problem that overlaps with “invents things that aren’t actually real”. Today it can hallucinate both accurate and inaccurate information, but it can’t determine validity at all, so it’s sometimes wrong even when it isn’t hallucinating.

