Apologies for the long-winded title; it’s actually quite hard to find a subject line that gets right to the point. This isn’t about triggering a particular emotion in gamers — not directly, at least. It’s also not about how ‘emotional’ gaming can be — we already know that playing games can be an intense experience that can evoke a massive gamut of emotions.
This entry’s about your avatar — your character, the model that represents you — and the emotions that it can, or as the case may be, cannot display.
Emotions have long played a vital role in communication and human interaction. We smile and raise our shoulders a little when we’re happy; we frown and slump when we’re sad. These emotional cues are a form of communication in their own right: body language!
Beyond subtle muscle shifts we also have emotive reactions that we’re less aware of: we blush when we’re embarrassed or caught lying; we raise our voice in anger or petulance. Most important, though, are the muscle groups of the face: the flaring or contraction of the lips and eyes, the furrowing or raising of the brow. Each of these actions, or reactions, is ‘programmed in’ genetically and almost impossible to alter. It’s these same minute movements that we’re (often unconsciously) reading in the face of whoever we’re talking to. It’s these tiny twitches in someone else’s face or body language that can trigger our own involuntary responses: that momentary curl of the lip might be all the indication you need to run away quickly.
This ‘hunt for emotion’ as we communicate with other people is so ingrained that online communication has always felt a little… distant. Internet veterans are cautious, aware that without body language their words can easily be misconstrued. Newbies often blunder, forgetting that no one can see the ironic smile on their face. There’s a reason emoticons :-), *asterisks*, CAPSLOCK and _underscores_ exist: to convey emotion! It’s clunky and slow compared to body language or facial expressions but it’s the best that we have.
Why, twenty years after the first text-based world, are we still communicating with such basic tools? Some early games like LegendMUD had ways to inflect mood into your conversation through expansion of the verb sets (‘say alts’) but since then… nothing. In graphical virtual worlds a couple of games have tried to incorporate moods (notably Star Wars: Galaxies and EverQuest 2), but they were still primarily low-tech, text-only executions: toggles such as /angry, /sad and /afraid, or simple parsing of exclamations and queries.
Why are we still running around in virtual worlds with emotionless, gormless avatars? In single-player games this is apparently the state of the art, the bleeding edge! ‘More realistic than ever before!’ the developers cry. What actually makes games feel more realistic? Interaction with the game world: physics and believable NPCs, or in the case of virtual worlds, other player avatars. You only need to look at the success of LittleBigPlanet, a very simple platformer with oodles of delicious detail, bucketloads of charm, and a surprisingly diverse emotion system.
For a market segment that generates almost all of its appeal (and revenue) from the immersive quality of virtual worlds, it’s amazing that there isn’t yet a virtual world with the power to model emotions through facial expressions and body poses. You could even go a step beyond the toggle system and parse complex emotions like sadness, apprehension and lust out of chat. Then there’s the character state itself: in battle your avatar would grimace upon being hit; a healer would smile upon saving a party member.
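To make the toggle-plus-parsing idea concrete, here is a minimal sketch in Python. Everything in it is illustrative: the command names, keyword cues and expression labels are assumptions for the sake of the example, not taken from any shipped game, and real emotion detection would need far more sophisticated natural language processing.

```python
# Toy sketch: map explicit /mood toggles and crude textual cues in a chat
# line to an avatar expression. All names here are hypothetical.
import re

TOGGLES = {"/angry": "angry", "/sad": "sad", "/afraid": "afraid", "/happy": "happy"}

def parse_mood(message: str) -> tuple[str, str]:
    """Return (mood, cleaned_text) for one line of chat."""
    words = message.split()
    # An explicit toggle wins, e.g. "/angry Get out of my camp!"
    if words and words[0] in TOGGLES:
        return TOGGLES[words[0]], " ".join(words[1:])
    # Otherwise fall back on crude punctuation and emoticon cues.
    if re.search(r"!{2,}|[A-Z]{4,}", message):
        return "angry", message          # shouting: !! or CAPSLOCK
    if ":-)" in message or ":)" in message:
        return "happy", message          # classic emoticon
    if message.rstrip().endswith("?"):
        return "puzzled", message        # a query raises the brow
    return "neutral", message

mood, text = parse_mood("/angry Get out of my camp!")
# mood == "angry", text == "Get out of my camp!"
```

The avatar engine would then feed the resulting mood into whatever expression or pose system exists; the hard part, as the rest of this post argues, is that no mainstream virtual world has such a system to feed.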
Are we simply being held back by World of Warcraft’s ancient graphics engine? Surely it’s time for realistic, immersive emotions in virtual worlds.
Further Reading
- Some work into avatar puppeteering in Second Life which, unlike WoW, actually has some built-in facial expressions
- Bob Moore’s presentation from the Austin GDC 2008 takes a more scientific, sociological look at it under the header ‘Challenges in Avatar-Mediated Interaction’
Brian 'Psychochild' Green
Jun 8, 2009
The reason seems obvious because you say it yourself: our emotional reactions are programmed in and we often don’t even think about them. With an avatar, we do have to think about them, and that’s not something we’re used to. Some of us (like me) have gotten used to doing things like adding smilies to indicate our mood.
Some games let you set your character’s mood, but that isn’t always accurate. If someone is having a good day and sets the character’s mood as happy, the first thing on their mind when someone calls their mother a whore isn’t necessarily to change their mood setting first, even though their mood will probably have changed.
The alternative is to have the computer in charge of figuring out moods and emotions and handle them automatically. I don’t think that’s going to be any more accurate, though, in most cases.
You also have the issue that subtle expressions do get lost in a game world. If I have my camera zoomed out to get a better tactical view of the battlefield, then I might not see facial expressions too well. Even if I have an amazingly detailed graphics engine, if a face is smaller than the size of a keyboard key on my screen, detail is going to be lost. This is definitely an area where the old text games did better than the new graphical games do.
I’m not sure what the solution is here, though.
sebastian
Jun 8, 2009
That’s the same conclusion I came to, after thinking on it a little more.
But I will say that when I’m having a ‘good chat’ with someone, in an RPG (like WoW, or even in a single-player game) I will often zoom in until I’m almost first-person, or at least so that I can see the other avatar clearly. I don’t know if I’m alone in that, though!
It’s definitely hard work having two places to look at: the chat and the game world. Text worlds were much better in that sense. Thing is, chat bubbles have been in for a while, but still emotes remain kind of… tacked on.
I don’t think it would be so hard to ‘figure out’ emotions accurately, given the huge body of data available for testing and parsing. Natural language processing is a very big field with a lot of research going into it.
But first, we have to get people zoomed in to see the facial expressions — that’s the important thing.
Raph’s Website » Avatar body language
Jun 8, 2009
[...] blog reader mrseb has a blog post up on emotional avatars in virtual worlds inspired by this NYTimes.com article (it’s behind a reg [...]
Emily Jane
Apr 20, 2010
I can’t wait for WoW to get proper facial expressions, and body language too.