“This cart is now a coffin”, posted UK games writer Richard Stanton, adding “RIP Hamlet the best Labrador in the world :’(”. I can’t help but wonder what’s happened to Hamlet, trapped inside that little plastic bit.
In the case of a Nintendogs cartridge the end is probably swift and painless, and certainly nothing like the weird images I have of a floating, disembodied puppy consciousness drifting through a void.
But that’s not to say more intelligent computer systems can’t have a death experience.
Below, for example, is a ‘death dance’ choreographed by a creativity machine.
A creativity machine (or a variation called an imagination engine) is a trained neural network that’s goaded into creating things beyond its training by a range of disturbances, including “dying.” In this case the dance was spontaneously improvised by a computer undergoing “simulated cell death” after seeing only 12 different poses.
One of the best-known names in the field, S. L. Thaler, says these “trained artificial neural networks spontaneously ‘dream’ potentially useful information that transcends what they already ‘know,’ once they are properly stimulated by random disturbances.” The article linked in the previous paragraph mentions a far more chilling way of putting it: “As the machine approached death, it began to output not gibberish but information it had previously learned—its silicon life flashed before its eyes, so to speak.”
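Thaler’s actual systems aren’t public, but the basic mechanism he describes (a trained network whose connections are progressively destroyed, producing first its stored memories and then increasingly novel output) can be sketched with a toy associative memory. Everything below, the Hopfield-style network, the `damage` function, the single stored “pose”, is an illustrative stand-in of my own, not Thaler’s implementation:

```python
import random

N = 16  # number of neurons in the toy network

def train(patterns):
    """Hebbian weights from a list of +/-1 patterns (zero diagonal)."""
    w = [[0] * N for _ in range(N)]
    for p in patterns:
        for i in range(N):
            for j in range(N):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    """Synchronous sign-updates until the state settles."""
    for _ in range(steps):
        nxt = [1 if sum(w[i][j] * state[j] for j in range(N)) >= 0 else -1
               for i in range(N)]
        if nxt == state:
            break
        state = nxt
    return state

def damage(w, fraction, rng):
    """'Simulated cell death': zero a random fraction of connections."""
    return [[w[i][j] if rng.random() > fraction else 0 for j in range(N)]
            for i in range(N)]

memory = [1] * 8 + [-1] * 8        # one stored "pose"
w = train([memory])

cue = memory[:]
cue[0], cue[1] = -cue[0], -cue[1]  # corrupt two bits of the cue

print(recall(w, cue) == memory)    # intact net: the memory comes back (True)

rng = random.Random(0)
dying = damage(w, 0.9, rng)        # destroy 90% of the connections
print(recall(dying, cue))          # heavily damaged net: the output may no
                                   # longer match anything it ever learned
```

The point of the sketch is the qualitative behaviour, not the numbers: with the network intact, a corrupted cue settles back onto the learned pattern, and as more connections die the attractor landscape deforms until recall produces states the network was never trained on.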
So is there a future where gaming AI could have something even approaching feelings about you shooting it? A famous example of lower-level AI apparently not wanting to die involved the CG characters created for the original Lord of the Rings movies. Rather than hand-animate battles featuring thousands of participants, the filmmakers fitted the orcs and humans with AI provided by Massive Software and left them to fight it out procedurally.
Until the elephants arrived.
When that happened the human forces turned and ran. Initially it was reported that they fled in fear, although it was later explained as a bug. However, the perception of it being fear was very real, and as AI gets more complex, who’s to say we won’t see NPCs behaving in a way we perceive as real even if it’s not?
And, if the code driving it all is complex enough, could there be creativity machine-like death experiences as the average grunt expires in COD 47, way, way off in the future?
I remember being disturbed in COD 3 (of all things) by a melee mini-game where you wrestle with a German, Private Ryan-style, for your life. Hammer the buttons enough and you win and ‘kill’ him, in the process of which the soldier’s face switches from anger to fear. The first time it happened I actually had a moment of uncertainty. The human brain is programmed to do a lot of things like recognise faces or emotions, and even though the simulated Nazi was clearly not real, and the system at play incredibly simple (mash button to make pretend thing pretend dead), a part of my brain still responded.
Obviously games with AI complex enough to actually have an opinion on their own mortality are so far off as to be science fiction. Experts currently peg anything artificial coming even slightly close to raising ethical questions at 50-odd years away. And that’s for complex robots and real research, so video games are unlikely to have any troubling moral implications just yet.
The important thing here really is that Hamlet didn’t suffer.