BARBELITH underground
 


Video Game Characters with a Soul?

 
  


 
 
Tamayyurt
23:41 / 11.02.02
OK, the title is a bit much, but not by far...
http://news.bbc.co.uk/hi/english/sci/tech/newsid_1809000/1809769.stm
 
 
01
00:47 / 12.02.02
We're the same thing. Marios and Lara Crofts.
 
 
the Fool
02:01 / 12.02.02
If game villains get too smart, and become conscious, we won't be able to kill them anymore... 'cause it will be murder. And then they will take over...

Huzzah! Welcome to the Matrix...
 
 
Magic Mutley
16:47 / 12.02.02
quote: "If they can become conscious in any way, they could have emotions, and there we do have serious ethical problems."

I find this a bit strange, considering the way apes, dolphins etc., are currently treated...
 
 
w1rebaby
17:04 / 12.02.02
On the other hand, sophisticated language systems can appear more "intelligent" to us than sentient but non-human intelligences, because they appear closer to us. Some of the most ethically important work with chimps, for instance, was the sign-language studies with Washoe et al.

It's my belief that when most people say "intelligent" they mean "non-physically human-like".
 
 
cusm
17:10 / 12.02.02
So what they really mean by sentient is: can say "stop mommy, it hurts"
 
 
Random?
02:45 / 14.02.02
What I'm more concerned with is: if we can get computer-generated characters to do what we want them to do, and do it right, why can't we get living, breathing people to do the same?
 
 
Molly Shortcake
02:53 / 14.02.02
So these characters are going to 'run' for their 'lives' because they have notions of 'death'? I don't buy it at all. It's hyperbole.
 
 
the Fool
03:45 / 14.02.02
quote:Originally posted by Lord Rugal Ultimate:
So these characters are going to 'run' for their 'lives' because they have notions of 'death'? I don't buy it at all. It's hyperbole.


Who says you're any different? The same game with better software...

If something has all the appearance and internal mechanics of life, is it not alive?
 
 
Chuckling Duck
14:40 / 15.02.02
If a video game character could pass a Turing test, I'd withhold my fiery gun blasts. But that tech is still way, way off... maybe in 20 years, given Moore's law and appropriate coding time.
 
 
Tom Coates
15:44 / 15.02.02
But would you? I mean, what are the rights of an infinitely reproducible being? A computer being presumably could not be killed - they'd have no continuity of experience - because there would always be a base-level granularity in computer processing time. And at any time, you could branch off that individual into two - or save that individual in its current state and restart it from an earlier point in time.

If a computer could pass the Turing test and was still a game villain, would there be an ethical problem with killing them when you could immediately and easily bring them back from that death without anything but 'simulated' pain?

This is actually a really interesting conversation here to do with the rights of artificial intelligences and the potential confrontations they present to our ideas of where rights come from...
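
[A toy sketch of what Tom describes, with hypothetical names throughout (Agent, experience); nothing here comes from a real engine. The point is just that if a character's whole 'self' is plain program state, saving, restoring, and branching it are one-line operations:]

```python
import copy

class Agent:
    """Hypothetical game character whose entire 'self' is plain state."""
    def __init__(self, name):
        self.name = name
        self.memories = []

    def experience(self, event):
        self.memories.append(event)

# Save the individual in its current state...
villain = Agent("villain-01")
villain.experience("saw the player")
snapshot = copy.deepcopy(villain)

# ...'kill' it, then restart it from the earlier point in time.
del villain
restored = copy.deepcopy(snapshot)

# Or branch one saved individual into two diverging ones.
branch_a, branch_b = copy.deepcopy(snapshot), copy.deepcopy(snapshot)
branch_a.experience("fled the castle")
branch_b.experience("ambushed the player")
```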
 
 
Tom Coates
15:56 / 15.02.02
Could I ask people who are starting threads based around links to other articles to summarise the article concerned, so that, should the original article go offline or become unavailable, people can still read the thread with suitable context?

Thanks....
 
 
cusm
18:04 / 15.02.02
There are two sorts of neural networks like this capable of cognitive thought: linear and recursive.

A linear neural network has an input layer, X processing layers, and an output layer. When it is not receiving input, it does nothing. It is not alive; it is an information-processing pattern, a machine awaiting use. Linear networks can be taught to resemble cognitive decision-making, possibly even emotions. However, they will never truly be alive, for they only exist while they process data. No continuity of existence.

Recursive networks happen when you link some of the output layer to the input layer, allowing complex and unstable states to occur within the neural network. They continue to process without additional input, as they derive their own input from their own internal states and output layer. The math of these is still quite outside our grasp, save on the most basic level.

A basic example is the flip-flop configuration used in computer memory: four nodes, each with two inputs and two outputs, where one output line of each node in the second layer is connected to one of the input lines in the first, leaving two input lines, two output lines, and a continuous "flip-flopping" circuit that can store two bits within it. Recursive neural networks are like that, at several orders of magnitude greater complexity.

The human neural network is recursive, not just in the final input and output layers, but on every layer. It's a big complex mess of looping information storage that somehow seems to work itself out into cognitive thought, and it continues to process based on its own input, and this preserves a continuity of consciousness. Occult image: the serpent swallowing its tail.

Until we build AIs that make use of this model, they will not truly be conscious, just clever processors that can model consciousness and behavior, even sentient behavior, so long as you feed them input. When you shut them off, they just shut off. A recursive consciousness will continue to ponder its state in the darkness, possibly to madness, or enlightenment.

[ 15-02-2002: Message edited by: cusm ]
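
[A minimal sketch of the distinction cusm draws, in Python with toy random weights (none of this is a real cognitive model). The feedforward net is inert between calls; the recurrent one keeps updating its internal state even when its external input is all zeros:]

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear (feedforward) network: output is a pure function of input.
# Between calls it does nothing -- no ongoing internal state.
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))

def feedforward(x):
    return np.tanh(W2 @ np.tanh(W1 @ x))

# Recursive (recurrent) network: output is fed back in as input, so the
# state keeps evolving with no external stimulus at all.
Wr = rng.normal(size=(4, 4))
state = rng.normal(size=4)

for step in range(5):
    external_input = np.zeros(4)                  # "the darkness": no input
    state = np.tanh(Wr @ state + external_input)  # yet the state changes
    print(step, state)
```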
 
 
Elijah, Freelance Rabbi
18:57 / 15.02.02
It would take the villains being able to understand my point of view and be converted for me not to kill them.

I play games so I don't go to work with a shotgun.
 
 
netbanshee
15:42 / 18.02.02
...one also has to think about the need for developing such lifelike characters to interact with... and whether spending half of your CPU cycles on it will take away from the lure of the concept of the game. Personally, I'd rather play a game that blows me away because it's just that well developed than because of its ability to behave realistically at times. Generally speaking, video games are positioned to give you abstract experiences outside of your everyday hand-to-hand experience with the world. Plus there are different camps... the arcade people and those who enjoy simulation. Keep in mind that the mixture will be cool, but trying to work on projects that encompass all of those abilities (gameplay, AI, etc.) sometimes dilutes the "game" element of it.
 
 
Ground Zero
20:03 / 27.02.02
The Sims.
 
 
grant
18:11 / 22.08.03
They may look the part, too.

"...something that looks and acts so life-like that it is indistinguishable from a real person."

The "see also" links along the side might also be useful in this discussion.
 
 
Perfect Tommy
20:44 / 23.08.03
...would there be an ethical problem with killing them when you could immediately and easily bring them back from that death without anything but 'simulated' pain. --Tom Coates

Just had this series of thoughts:
  • Very few people would fault someone for killing an animal that was attacking someone.
  • Most people would object to torturing an animal.
    • Cruelty to animals is often seen as a predictor of violence towards humans.

  • Most video game characters (in games in which violence is possible) have traditionally been attacking your avatar violently.
  • Games are becoming more complex: there are more opportunities to attack a video game character without provocation; and, characters simulate pain and injury more accurately than when one was shooting colored blocks.
    • Is 'torturing' a video game character, or attacking it without provocation, unethical behavior?
    • Whether it is or not, is it predictive of violence toward animals and humans?
    • (...and Tommy is possessed by the spirit of Joe Lieberman) If so, is there merely correlation, or causation? ('It's okay to kill without feeling.')
Hm... that last bit reminds me of reading somewhere that in WWII, American soldiers' fire rates were only around 20%: they weren't all that likely to actually shoot another person. But by Vietnam, fire rates were over 90%, because of the introduction of various kinds of psychological conditioning. (Have I ventured into thread rot yet?)
 
 
Linus Dunce
23:08 / 23.08.03
Pull the damn plug any time you want. They're just images generated by a machine; automatons.

Why so much faith in the Turing Test? What is it, essentially? "I cannot distinguish between this machine and a sentient being, therefore the machine is a sentient being." Mmm. What about, "I cannot distinguish between the display in the window of a Japanese restaurant and real food, therefore it is edible." I mean, they'll never pull the dish out of the window and serve it to you, so how can you argue it has failed a performative test? But it's still inedible, and common sense (or a Cartesian invocation of a wise and benevolent God) will tell you so.

Having said that, I would worry about someone who enjoyed disembowelling a game character. But I also worry that all this cyber-nonsense is just to keep us interested in filling out the spreadsheets of The Man.
 
 
Solitaire Rose as Tom Servo
03:03 / 24.08.03
I have had thoughts about this issue, especially while playing "Halo", which has small aliens who cry out for help, whimper, or do other things when you shoot them and they don't die.

The upside of it is that at some point, games will be so sophisticated that they will be able to simulate things other than first-person shooters, much like the dating sims in Japan. And the fact that The Sims is the best-selling game ever means that games that aren't based on beating or shooting the crap out of everything have a place.

The downside is that they will make games like "Doom" where people can kill semi-sentient beings without a care, and that really disturbs me.
 
 
Lurid Archive
12:57 / 24.08.03
Why so much faith in the Turing Test? What is it, essentially? "I cannot distinguish between this machine and a sentient being, therefore the machine is a sentient being." Mmm. What about, "I cannot distinguish between the display in the window of a Japanese restaurant and real food, therefore it is edible."

That's a false analogy. A better analogy would be, "I can't distinguish between this fake Japanese food and the real thing, after a week of living on it, therefore the 'fake' food must be edible."

I suppose if your point is that sentience is not observable in any way, then your analogy probably holds. To me this is kinda like saying you can't tell something is edible by eating it. True, "intelligent" software can fool you. But not for that long.
 
 
netbanshee
14:28 / 24.08.03
The downside is that they will make games like "Doom" where people can kill semi-sentient beings without a care, and that really disturbs me.

But then again, we are "killing" each other when we play arena games and the like over the net. Can't tell you how many times I've been fragged by some of my friends. As the simulation becomes more real, do you think that people will be less interested in offing each other too?

Not all games out there that have a combat element glorify the violence in it, either. The Silent Hill series (one of my absolute favs and highly recommended, btw) does an excellent job of putting the weapon in your hand, so to speak. It also, depending on the scenario, uses feelings of guilt regarding the death of characters or one's actions as an effective tool in bringing one into the appropriate mindset. So... not the less intelligent hack-and-slash of, say, Resident Evil, but a true "survival horror" game.

Characters with a soul? Sometimes it feels like it when you get some game designers worth their weight...
 
 
STOATIE LIEKS CHOCOLATE MILK
15:10 / 24.08.03
Professor Taylor and his team are confident it could have the intelligence of a six-year-old child by the end of next year

Sorry to sound cynical, but I'll believe it when I see it.

As regards the ethics... anyone read Ken MacLeod's "The Stone Canal"? A large part of it deals (albeit in an sf way) with whether genocide of electronic entities can be justified. Cos you can do it again and again and again...

Personally, I take Mutley's pov on this one... just because we (humanity) decide something is sentient, that's NO guarantee that treating it like shit will be considered immoral.
 
 
grant
14:18 / 26.08.03
Crosslink: First-person-shooters and karmic ramifications.

It's a related discussion, so a link seemed appropriate.
 
 
at the scarwash
17:01 / 26.08.03
If a videogame character were created in such a way that it would be sentient, wouldn't it end up being something like an actor? Wouldn't getting killed be part of its job? They would be, after all, intelligences specifically designed for the purpose of playing Dr. Doom in Marvel vs. Capcom, or whatever. It wouldn't be as if game designers were hacking MIT and shanghai-ing cognitive scientists' pet AIs and press-ganging them to play Koopa Paratroopas or ninjas kidnapping the president. They would be software intelligences created to perform certain tasks in a game environment, much as we are designed (for lack of a better word) to breathe, eat, shit, and breed. The fact that they could be "killed" on screen would simply be a function of their life cycle.
 
 
sTe
21:02 / 26.08.03
But are we not 'designed' to die too? Otherwise the earth would get far too full.

There are some who believe we come back, reincarnation and all that, albeit without any memory of past existence. But I don't see how this can justify eliminating sentient life just because it doesn't expect to live forever.
 
 
Eloi Tsabaoth
21:19 / 26.08.03
Fascinating topic. When a video game character attacks you, is it responsible, or is it just its programming? Couldn't soldiers in the army be subject to the same query, seeing as army training includes elements of brainwashing? Grossly oversimplifying, I know.

BTW: It wouldn't be as if game designers were hacking MIT and shanghai-ing cognitive scientists' pet AIs and press-ganging them to play Koopa Paratroopas or ninjas kidnapping the president.

nbb, if you don't write a story based on this, may I? Pretty please?
 
 
at the scarwash
22:53 / 27.08.03
But the thing is, they aren't necessarily dying. A software intelligence packaged in a videogame would fly across the room, hit a wall, and slide to the floor leaving a bloodstreak in response to the impact of your pixelated bullet because that's what it's programmed to do, not because it actually dies. Ideally, it would be programmed to dodge your bullets with a certain level of success, returning fire with a certain level of accuracy. Not because it actually believes itself to be an anti-American terrorist fighting for its life against your lantern-jawed special-forces polygon, but because it is programmed to play the game. Why should computer-generated violence actually harm a sentient video-game character? It doesn't have a head to get blown off.
 
 
Perfect Tommy
06:03 / 28.08.03
I think I see what you're saying, Ninja B.B., but:

Say a software intelligence has AI routines that make it shoot, dodge, all that stuff, but it also has the functions of informing the game engine that its avatar is flying across the room, hitting a wall, leaving a bloodstreak, and going to be lying very still right here. From then on, it just tells the game engine where its body is.

But, really, I also have two sets of functions, except that my AI routines are encoded in my brain-state, and my messages to the game engine (i.e., the universe) on the position of my avatar are 'computed' by the physical aspect of my body. If I get shot, fly across the room &c., 'I' (as my body) communicate the same physics aspects to the universe (as well as initiating the decay subroutine... or rather, deactivating the anti-necrosis subroutine, I guess), but 'I' (as my brain-state or soul) is no longer a going concern--just like the AI routines. Both of us continue to function in a sense, but we no longer have the capacity to change what we're doing beyond lying around and possibly decaying.

Weeeeeird.

Alternatively, getting killed could be part of the agent's lifecycle or profession in the same way that manzanita bushes(?) need fire for their lifecycle. But, then we'd have to consider my eventual transformation into worm food to be a part of my lifecycle too, which is comforting in a contemplation-of-mortality sense but seems to be stretching the concept of lifecycle a bit much.
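
[A rough sketch of the two-sets-of-functions idea above, with made-up names (Character, think, report_to_engine): once alive goes false, the 'mind' routine is no longer a going concern, but the 'body' keeps reporting its physics to the engine:]

```python
class Character:
    """Hypothetical split between AI routines (mind) and the state
    reported to the game engine (body)."""
    def __init__(self):
        self.alive = True
        self.position = (0.0, 0.0)

    def think(self):
        # The AI routines: only active while the character is 'alive'.
        if self.alive:
            x, y = self.position
            self.position = (x + 1.0, y)   # stand-in for dodge/shoot logic

    def report_to_engine(self):
        # The body reports its state either way: a corpse still occupies
        # space, lies around, and possibly decays.
        return {"pos": self.position, "alive": self.alive}

npc = Character()
npc.think()                     # mind drives the body
npc.alive = False               # shot: the mind stops being a going concern
npc.think()                     # now a no-op
print(npc.report_to_engine())   # body still tells the universe where it is
```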
 
 
Phex: Dorset Doom
19:27 / 28.08.03
The whole issue of ethics could be side-stepped with a little clever programming:
In the game Half-Life your enemies act as a hive-mind, coordinating attacks, retreating, etc. (Think of the 'just like a flock of birds' scene in Jurassic Park.) Isn't it far more likely that, in the early days of sentient AI opponents in games, one AI will coordinate a whole game's worth of monsters/terrorists, a little like a player in a strategy wargame moving every piece individually at the same time? There would be no ethical issue, because the sentient AI is a disembodied puppetmaster-style entity, jumping from body to body Agent Smith style.
(Of course nobody should underestimate humanity's capacity for sadism, a savvy programmer could easily rip an AI into a single 'body' for a snuff-style torture game for the Japanese 'Otaku' market.)
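
[A sketch of the puppetmaster architecture being proposed, using invented names (Puppetmaster, tick); the idea is that the only candidate for sentience is the single controller, while the on-screen monsters are disposable bodies it drives:]

```python
class Puppetmaster:
    """One controlling intelligence for a whole game's worth of enemies."""
    def __init__(self, n_bodies):
        self.bodies = [{"id": i, "hp": 10} for i in range(n_bodies)]

    def tick(self, player_pos):
        # Drop destroyed bodies: nothing that 'dies' ever did the thinking.
        self.bodies = [b for b in self.bodies if b["hp"] > 0]
        # All tactics live in the controller, Agent Smith style.
        for body in self.bodies:
            body["order"] = ("flank", player_pos)  # placeholder tactic

mind = Puppetmaster(n_bodies=12)
mind.tick(player_pos=(3, 4))
mind.bodies[0]["hp"] = 0      # the player 'kills' a monster...
mind.tick(player_pos=(3, 5))  # ...and the controlling AI is untouched
```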
 
 
Deadwings
05:02 / 29.08.03
Of course nobody should underestimate humanity's capacity for sadism, a savvy programmer could easily rip an AI into a single 'body' for a snuff-style torture game for the Japanese 'Otaku' market.

...So let me get this straight, Japanese geeks are vile pain-junkies? I'll be damned, I never knew.

Up until that last bit, though, a very valid and logical argument. Personally, I doubt that full-spectrum sentience would ever see use in grossly violent games. More likely your puppetmaster, or failing that, truncated consciousnesses with rudimentary pathings to simulate more complex function. It would be more akin to taking one of those football-helmet-wearing foodcourt lobotomy cases and training it to do things that 'real' people do, like smile and cry and not shit itself, without ever telling it why, than sculpting a Harvard graduate out of ones and zeroes and shooting it in the face for the sheer blood-gushing awesomeness of it.

/sick
 
 
grant
19:00 / 29.08.03
Well, what you're skipping in that analogy is that once created, an AI could be (depending) easily reproducible, right? All them 1s and 0s could just be saved to a zip disk right before the shooting in the face. You could dupe however many copies you wanted, right?
 
 
Deadwings
21:55 / 29.08.03
That's like saying it's okay to kill people because once they're dead there's nothing but a pile of slowly dissolving chemical connections to demonstrate that anything actually lived, and, well, it's not conscious anymore, so it doesn't matter what I did to it in the past. I forget the Latin term for it, but it's a really shitty thing to say. Like the ends utterly negating the means and all that. Very irresponsible.
 
 
mixmage
01:07 / 03.09.03
Infinitely reproducible?

You mean you can make clones if you have the original. I played Black & White for a year, then lost my creature to hardware problems. I lost a lot of data, including a little playmate.

One I cannot reproduce even once, let alone infinitely.

I remember a game called "Creatures"... found an old article which mentions it in section 4, "Artificial life":

Inventor and lead programmer Steve Grand claims that "norns" are a true artificial life form, acting entirely from low-level genetic and cognitive principles rather than programming. They do live in a world of identified objects rather than raw bitmap perceptions, but nearly every other aspect of the simulation is due to random genetics plus whatever training you provide...

Most of their appearance and behavior is genetically determined, though, and each norn is a unique genetic sample. If you have a norn you'd rather not raise, the developers urge you to find someone else who would like it.


I remember people getting very concerned about these little beings, refusing to use the "euthanasia" option unless they were incurably sick - quality of life issues.

Sure, just because it appears to be alive and people treat it as if it is, that doesn't mean it really is.

Then again, if you don't believe in the afterlife, how is being wiped from a drive any different to dying?
No more intelligence, even if that intelligence was only on a par with an insect.

Grant... I believe there's a Jedi talking to the Kaminoans about a similar idea
 
 
Jashugan
00:16 / 04.09.03
Well, intelligence involves some degree of self-knowledge, so presumably for the little baddies to be considered to have AI they would have to know that they were existing in a computer-generated environment? In that case, in the context of, say, a console game, they would know that they were in fact immortal, because their code is permanently written onto a disc somewhere in another dimension we call reality. I can load up Vice City and blow the brains out of someone, turn off the console, turn it back on, and they will have reappeared.

If they knew this, you could see some really bizarre behaviour. They might recognise you as the human interloper into their world and just run away whenever they see you; they might gang up and refuse to risk their lives for the sake of some poorly written plot or they could just become entirely free and experiment. Do I die if I jump off this polygon skyscraper, or does that human-controlled polygon have to shoot polygon bullets at me? Can I die except by his hand?

Their only concern would be that you continue to play the game, because as long as you do they continue to exist. They would have a vested interest in making it fun for you, the player.

They might even come to think of you as god! They could develop culture, their own moral code. It just depends how much rope you give them - how much flexibility and potential for growth is written in. Perhaps Bowser would realise the futility of fighting Mario, because past experience will tell him that he probably ain't gonna win.
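
[A minimal sketch of the point about the disc, with hypothetical names throughout: every session stamps fresh, mortal instances out of one read-only master copy, so nothing the player does in-world ever touches the 'real' character:]

```python
import copy

# The 'disc': immutable data in another dimension we call reality.
NPC_TEMPLATE = {"name": "pedestrian", "hp": 100}

def boot_console(n=3):
    # Each power-on stamps fresh instances from the same master copy.
    return [copy.deepcopy(NPC_TEMPLATE) for _ in range(n)]

world = boot_console()
world[0]["hp"] = 0        # blow the brains out of someone
world = boot_console()    # power-cycle: everyone has reappeared, untouched
assert all(npc["hp"] == 100 for npc in world)
```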
 
  
