From the Times Of India
Computers get emotions
A decade ago, when automaker BMW recalled the sophisticated navigation system in its 5 Series cars, it wasn't because of a flaw in the technology. It was because the system had a female voice. "The service desk had received calls from agitated men all over Germany who had the same basic complaint: They couldn't trust a woman to give them directions," writes Clifford Nass, a professor at Stanford University and the author of The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships.
Now while this may sound like an example of gender stereotyping, there is something much deeper at play here, the professor says.
"The BMW episode is a classic instance of how people interact with devices," he claims. "People have begun to attribute human personality traits to smart gadgets. And increasingly these products are becoming intelligent enough to respond accordingly."
Indeed, companies around the world are working towards creating electronics that interact with people in a human-like way. Already, scientists have added intelligence to machines so they can listen, speak, see, reason and even learn, just as people do.
Ivon Arroyo and Beverly Woolf, both research scientists at the University of Massachusetts, have forged ahead in this field by lending emotional intelligence to artificial teaching software.
"These days, auto-tutors have become fairly common," says Arroyo. "There are programs that teach everything from math to corporate HR policies."
And studies have shown that children who use an auto-tutor progress much faster than those who learn in regular classrooms. But the biggest problem with these auto-tutors is that they're unable to understand and respond to students' emotions, Arroyo says. She and Woolf are therefore trying to change this with their Intelligent Tutors project.
"Our first step when creating intelligent tutors was to give the software a human-like face. It was then programmed to spot frustration, fidgeting and other such behaviour so that it could intervene immediately," Woolf discloses.
To create an emotionally intelligent response, the computer first needs to understand and label these emotions, says Arroyo. "It might be easy to feed statistical data and make computers play chess or answer Jeopardy! questions, but emotions are a whole new area."
The project currently employs myriad physical and algorithmic emotion-sensing technologies: a webcam detects expressions and follows a student's gaze to see if he/she is attentive. For the rest, the duo gathers emotional feedback from physical sensors installed around the child.
"We have tried to keep the physical emotion sensors as invisible as possible," says Arroyo.
The first layer, of course, is the camera; the second is composed of sensors in the chairs that detect posture and weight shifts; the third - sensors that measure galvanic skin response, a gauge of psychological or physiological arousal - is meant to be worn like a wristband.
"The software gets constant feedback from the sensors. This data is then run through algorithms that tell the digital avatar to show a certain kind of emotion in response," Arroyo says.
She gives the example of a math tutor that sits at the bottom of the screen and taps into the student's emotional response. So, for instance, if the child has been shifting his weight or rushing through the lessons faster than he should, the animated tutor will ask if the student is bored. "And just like a human teacher, the intelligent tutor too will try to resolve the issue by giving an easier problem to get the student's attention back on track."
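The sensors-to-response pipeline Arroyo describes can be pictured as a simple rule-based loop. The sketch below is purely illustrative: the thresholds, field names and response labels are hypothetical, not taken from the actual Intelligent Tutors software, which uses far richer statistical models.

```python
# Hypothetical sketch: map raw sensor readings to a coarse mood label,
# then let the animated tutor choose an intervention, as the article
# describes. All values and names here are illustrative assumptions.

def infer_mood(readings):
    """Classify the student's state from sensor signals."""
    if readings["skin_response"] > 0.8:          # high arousal
        return "frustrated"
    if readings["posture_shifts"] > 5 or readings["answer_speed"] > 2.0:
        return "bored"                            # fidgeting or rushing
    if readings["gaze_on_screen"] < 0.5:          # looking away often
        return "distracted"
    return "engaged"

def tutor_response(mood):
    """Pick how the avatar intervenes, like a human teacher would."""
    return {
        "frustrated": "offer_hint",
        "bored": "give_easier_problem",
        "distracted": "ask_if_bored",
        "engaged": "continue_lesson",
    }[mood]

readings = {"skin_response": 0.3, "posture_shifts": 7,
            "answer_speed": 1.2, "gaze_on_screen": 0.9}
print(tutor_response(infer_mood(readings)))  # -> give_easier_problem
```

A student who has been shifting his weight repeatedly is flagged as bored, and the tutor responds with an easier problem, mirroring the behaviour Arroyo describes.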
Pilot projects are currently under way in schools across the US, and the software is available worldwide through the project's website (cadmium.cs.umass.edu/wayang2/flash), which is free for anyone to try. The software has been more than 80 per cent accurate in sensing students' moods, Arroyo claims.
"These systems will be great for people living in remote parts of the world who might not have access to good schools and colleges," she says. "Rural areas, especially in countries like India, can greatly benefit from our system." Arroyo is currently in the process of getting funding to extend their studies in such areas.
Eric Horvitz, a scientist with Microsoft Research, USA, has added emotional intelligence to his 3D virtual secretary. His assistant is a female avatar packed with emotion-sensing technology, including speech understanding, machine learning and image recognition.
"The mindlessness of gadgets follows us everywhere," Horvitz says. "At work, just when you have zeroed in on a problem, ding goes the email, distracting you. Or when you are driving and your phone rings with a telemarketing call. Wouldn't it be nice if computers had the same emotional common sense as human beings?" The researcher therefore designed a digital assistant, plugged into all kinds of databases, so that she can make complex (almost-human) decisions for him.
She taps into Horvitz's calendars of meetings and appointments going back years; she also analyses his work patterns and his interactions with different people to compute mathematical predictions of future interactions and the time he will spend on each one. "She gathers data on my phone calls - their length, the people I talk to, and the time of day - and uses this to make future appointments," he says.
So when someone approaches his assistant to set up a meeting, she quickly reviews that data looking for patterns - for example, how long have calls to this person lasted in the past? Is Horvitz multitasking while talking on the phone? And within minutes she processes different scenarios to give an accurate estimate of when her boss should be free.
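The kind of pattern-matching the assistant performs can be boiled down to a toy example. Horvitz's actual system applies machine learning over many features; the sketch below merely averages past call durations per contact, with an invented call log and a hypothetical `expected_duration` helper.

```python
# Illustrative sketch only: estimate how long a meeting with a given
# person will run by averaging past call lengths with that contact.
# The data and function are hypothetical, not Horvitz's real system.
from statistics import mean

call_log = [
    ("alice", 12), ("bob", 45), ("alice", 18),
    ("bob", 50), ("alice", 15),
]  # (contact, minutes) pairs drawn from past phone records

def expected_duration(contact, log):
    """Average past call length with this contact, in minutes."""
    durations = [mins for who, mins in log if who == contact]
    return mean(durations) if durations else None

print(expected_duration("alice", call_log))  # -> 15
```

From three past calls of 12, 18 and 15 minutes, the assistant would budget about 15 minutes for the next conversation with that person, the sort of estimate the article says she produces "within minutes".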
Horvitz, of course, has sets of rules that he continuously feeds into her system. For instance, he has rules on the kinds of meetings she can interrupt. So if a colleague drops by for a quick word with him and the assistant finds that he is currently talking to his research assistant, she will quickly compute and say: "It's okay to interrupt him right now, just go in."
To hone the conversation skills of his virtual assistant, Horvitz has added routines for general work-related chat. So she entertains waiting colleagues, just like a human secretary would, by talking about Horvitz's day, saying things like: "Eric's been busy in meetings all afternoon. But he's looking forward to seeing you."
Horvitz boasts that she can even detect emotions by recognising specific words or phrases, or other attributes of a conversation. She does this mainly by tracking a dozen indicators, including breathing, conversation pace and tone, to gauge the mood a visitor might be in.
RAGE AGAINST THE MACHINE
Professor Nass has similar projects in which emotion-sensing software is used in automobiles to analyse driving patterns and avoid accidents. He cites a 2002 study in which a Japanese car company developed a system that analysed drivers' performance and warned them when improvements were needed. "The cars would caution a driver when he exceeded the speed limit, either by chiding him or by trying to cheer him up in an enthusiastic voice."
This, however, at times had the reverse effect: the driver became annoyed and sometimes even started driving more recklessly. "The biggest mistake in the system," Nass claims, "was the lack of emotion sensing." He has therefore tried to change this by creating a vehicle that analyses driving patterns to recognise when the driver is giving in to road rage. If the driver is angry, the computer reasons that he will make more mistakes, and so it tries to compensate by drawing extra attention to potential collision risks, Nass says.
Granted, most of these projects are still in the testing phase, but experts say it might not be long before we see them enter the commercial market. A start-up called Affectiva from the Massachusetts Institute of Technology has already unveiled a prototype product that will enable computer systems on the internet to detect the mood of surfers. The technology utilises the user's web camera to scan for facial expressions, eye movements and gestures that could provide clues to emotional reactions to anything from news content to videos and even advertisements. Content that is being 'pushed' to these users can then be altered accordingly.
Similarly, Professor Peter Robinson from Cambridge University has been working on a prototype of an 'emotionally aware computer' that uses a webcam to capture images of the user's face and infer the user's mood. He believes it will revolutionise online marketing. "Computers will process our image and respond with adverts that connect to how we're feeling," he says.