
I think ChatGPT was doing that too, at least to some extent, even a couple of years ago.

Around the same time as my successful "people sleeping in puddles of ketchup" prompt, I tried similar tricks with uh.... other substances, suggestive of various sexual bodily fluids. Milk, for instance. It was actually really resistant to that. Usually.

I haven't tried it in a few versions. Honestly, I use it pretty heavily as a coding assistant, and I'm (maybe pointlessly) worried I'll get my account flagged or banned or something.

But imagine how this plays out. What if I honestly, literally, want pictures involving pools of ketchup? Or splattered milk? I dunno. It's a game we've seen a million times in history: we screw up legitimate use cases by overcorrecting.
