We’re seeing some really nice developments that matter for the use of virtual worlds as a place for collaboration and other social functions: non-verbal communication like facial expressions (a smile, surprise, scorn) or other body language like a nod of your head has been missing. Now sl.vr-wear.com offers a beta viewer for Second Life that uses a camera to track your head and expressions and acts them out with your avatar.
To date, people have adopted various new forms of social behaviour in immersive 3D worlds like Second Life, but the point is that body language and sudden emotions on your face are unconscious behaviour, and while typing "lol" is second nature to most of us by now, it’s still different if you’re suddenly appalled or delighted.
sl.vr-wear.com supposedly shows these kinds of emotions immediately, and therefore genuinely, on your avatar’s face as well. All you need is a webcam and their special SL viewer, available for Windows and macOS.
UPDATE: I couldn’t get it to work on my MacBook Pro, and I would like to know if anybody else has had more luck on a Mac. But I found this Seesmic video showing how simply it’ll work (once it works):