
From what I read (on /r/gamedesign, /r/proceduralgeneration and Gamasutra) it's about control.

The AAA studios leave nothing to chance when it comes to gameplay. Once "AI" speech reaches the point where an NPC will only answer with pre-sanctioned content, they might jump on the bandwagon ... but seeing that it is very much possible to generate porn with "censored" Stable Diffusion ... we won't see "free" AIs in games any time soon.



I've worked in game development (AAA and large indie games) for almost eight years and I would generally take the narratives on game subreddits with a healthy helping of salt. They're largely speculative and uninformed. Gamasutra is better, but often game journalism is full of simplifications and rough analogies.

The main problem with this type of content is that it needs to be solid enough that it adds to the experience. If you're allowed to have a conversation with any NPC you need to keep track of what they know, their responses need to be consistent with the world, and the time to generate the response needs to be reasonable.
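The bookkeeping described above can be sketched roughly as follows. This is a hypothetical illustration, not code from any shipped engine: each NPC tracks the set of facts they could plausibly know, and the dialogue layer checks against it before letting a fact surface in conversation.

```python
from dataclasses import dataclass, field

@dataclass
class NPCKnowledge:
    """Per-NPC record of what this character has witnessed or been told."""
    name: str
    known_facts: set = field(default_factory=set)

    def learns(self, fact: str) -> None:
        self.known_facts.add(fact)

    def can_discuss(self, fact: str) -> bool:
        # Consistency gate: an NPC must never reference a fact
        # they have no in-world way of knowing.
        return fact in self.known_facts

# Example: a blacksmith who has heard one rumour.
smith = NPCKnowledge("blacksmith")
smith.learns("dragon_sighted_at_mill")
```

A real system would also need decay, gossip propagation between NPCs, and per-fact timestamps, which is exactly why this gets hard at scale.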

Old RPGs such as The Elder Scrolls: Daggerfall used to allow you to go up to any of thousands of NPCs and ask them about anything! And practically all the results were uninspiring cookie-cutter replies, either saying they don't know or relaying some vague common knowledge on the subject. For me it lost its novelty very quickly. In Dwarf Fortress you can play Adventure Mode and go to any randomly generated NPC to talk about anything! Those NPCs are highly complex, with intricate personalities, relationships, and emotional states. Their responses are... mainly the same type of response you'd get from Daggerfall, with a good helping of completely inconsistent replies. I once asked someone about another NPC; they said it was their husband. So I asked where that NPC was and they replied "I do not know who that is". Trying to talk to characters in AI Dungeon is similarly frustrating, with the added hilarity of the AI sometimes getting mixed up about who is who between you and them.

There have been a lot of experiments with more ambitious use of AI in games. The best-known publicly disclosed failure is Oblivion's Radiant AI, which was meant to be far more impressive than it turned out; it had to be seriously scaled down due to random irrational behaviour.


Do you have any sources on the development process of the Radiant AI that I could read up on?

I’ve only heard the marketing speak (“amazing!!!!”) followed by the actual product (“get X for Y at Z, 5 times”). I’ve never seen anything that explained the disconnect, aside from Todd just being a shyster.


I remember some anecdotes, including how they couldn't stop NPCs from randomly travelling from town to town and stealing or buying everything in every shop, so there was never anything left for the player. I believe this was mentioned in the making-of documentary that came with the collector's edition of Oblivion.


"AI" with pre-sanctioned content just sounds like Eliza-like chatbots [1] (a couple thousand pairs of regex-match to fitting answer). Those actually work great to create the impression of a conversation.

I think those were even used in a couple of games ages ago. But they are not very good at steering the player and keeping them on track, which is how dialogue systems are usually used in game design. They might be useful in a truly open world where the player discovers the story themselves (maybe like Fallout 1/2), but the AAA gaming industry has largely abandoned that idea in favor of "linear scripted stories in an open world".

[1] https://en.wikipedia.org/wiki/ELIZA
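The regex-to-answer scheme described above can be sketched in a few lines. The patterns and responses here are made up for illustration; real ELIZA-style systems had thousands of such pairs, with more elaborate reassembly rules.

```python
import re

# Ordered (pattern, response template) pairs; first match wins.
# Captured groups are substituted into the response.
RULES = [
    (re.compile(r"\bwhere is (\w+)\b", re.I),
     "I hear {0} was last seen near the market."),
    (re.compile(r"\bwho is (\w+)\b", re.I),
     "{0}? Just another face in this town."),
    (re.compile(r"\brumou?rs?\b", re.I),
     "They say strange things happen at the old keep."),
]

FALLBACK = "I do not know anything about that."

def reply(utterance: str) -> str:
    """Return the first matching canned response, else a fallback."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return FALLBACK
```

The first-match-wins loop and the generic fallback are what give these systems their Daggerfall-like "I don't know" feel once the player strays off the scripted paths.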


And imagine if plot leaks happened through an NPC... it would ruin the game!

ultimately we care more about our character than the NPC being "realistic".


This reminds me of the captors in Iron Man exposing plot spoilers, but only if you happen to understand Urdu.


Wouldn't you need to train it on what you want it to say before it can say it?


If you slap an M rating on it why would that matter?


Because an NPC will inevitably be made to say something racist, which will be screenshotted and posted on Twitter.


I believe no company in the world would want to put out a game where an NPC could be coaxed into presenting a sexy story featuring an infant Adolf Hitler and Jesus Christ.

It's just a headache that I wouldn't want, I guess.


Or maybe we just need to evolve and realize that people are going to use tools for what they want them to be used for, not what we want them to be used for.

People have apparently already found workarounds for Midjourney's nudity filter.

Stability AI was smart enough to realize people will get around that, and allowed the objectionable-content filter to be removed by commenting out a few lines of code.

It's our sensibilities that need to change, not our tools.


There are filters on the most common LLMs already, so you could presumably tune outputs to stay within some acceptable range of possible outcomes.
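One crude version of constraining outputs to a range, sketched here as an assumption about how a game might do it rather than how any vendor's filter actually works, is a post-hoc check that swaps a generated NPC line for a safe fallback when it trips a blocklist:

```python
import re

# Hypothetical blocklist of terms the game never wants an NPC to utter
# (spoiler tokens, slurs, etc. would go here in a real deployment).
BLOCKLIST = re.compile(r"\b(secret_ending|final_boss_weakness)\b", re.I)

def sanitize(generated_line: str,
             fallback: str = "Hmm, I'd rather not say.") -> str:
    """Replace a generated line with a canned fallback if it hits the blocklist."""
    if BLOCKLIST.search(generated_line):
        return fallback
    return generated_line
```

The obvious weakness, as the Stable Diffusion example above suggests, is that output-side filters are easy to probe and route around with rephrasings the blocklist never anticipated.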



