Sunday 13 July 2014

Eye-guided wheelchairs: forward-looking technology; Pepper the ‘moody’ robot

Scientists in London have developed a system that enables wheelchair users to move around by simply looking in the direction they want to go.
Researchers at Imperial College London say the technology is simple and cheap, and could transform the lives of people unable to use their limbs.
Two cameras focused on the eyes track their movements, and the information is passed to a laptop, which calculates the direction the person is looking and how far away. It all happens in milliseconds.
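The Imperial team's control code is not published here, but as a rough illustration of the idea, a gaze-to-drive mapping of the kind described might look like the Python sketch below. Every name, weight and threshold in it is an assumption made for the example, not the actual system.

```python
import math
from dataclasses import dataclass


@dataclass
class DriveCommand:
    linear: float   # forward speed, m/s
    angular: float  # turn rate, rad/s


# Illustrative limits; a real chair would use clinically tuned values.
MAX_LINEAR = 1.0   # m/s
MAX_ANGULAR = 1.2  # rad/s


def gaze_to_command(gaze_angle_rad: float, gaze_distance_m: float) -> DriveCommand:
    """Map an estimated gaze direction and fixation distance to a drive command.

    gaze_angle_rad: horizontal angle of the gaze point relative to the chair's
                    heading (0 = straight ahead, positive = to the right).
    gaze_distance_m: estimated distance to the fixated point.
    """
    # Turn toward the gaze point; simple proportional control on the angle.
    angular = max(-MAX_ANGULAR, min(MAX_ANGULAR, 1.5 * gaze_angle_rad))

    # A very sharp off-axis glance is probably not a navigation intent
    # (e.g. glancing at a person), so turn in place rather than drive.
    if abs(gaze_angle_rad) > math.radians(60):
        return DriveCommand(0.0, angular)

    # Drive faster toward distant fixations, slower when looking nearby.
    linear = min(MAX_LINEAR, 0.3 * gaze_distance_m) * math.cos(gaze_angle_rad)
    return DriveCommand(linear, angular)


if __name__ == "__main__":
    # Looking slightly to the left at a point 3 m away.
    print(gaze_to_command(math.radians(-10), 3.0))
```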
Project leader Dr Aldo Faisal explained: 'Our eyes are not only a window into our soul, they’re also a window to our intentions, so if you want to go somewhere… I will look there and I will look there in a specific manner, and we can build a computer system that can decode our eye movements, and so we observe eye movements with an eye tracker and we then try to make sense of them.' The computer then interprets these commands and drives the wheelchair accordingly.
The team found that current technology using brainwaves to control wheelchairs was problematic because of the time it takes to train someone to use it: a user has to concentrate very hard to get the chair to move.
Kirubin Pillay, a research student at Imperial College London, said the eye tracker system is much simpler: 'Current tracking software often uses a screen-based system where you have a screen open and you look at locations on the screen. The problem with that is that it’s very simplistic and also diverts the users’ attention from the outside world and therefore there’s more risk of not noticing obstacles or other things in the way.'
In tests, people without physical disabilities could steer a chair through a crowded building faster and with fewer mistakes than with other eye-tracking technologies.
‘Pepper’ the robot to suit your mood
Manufactured by French company Aldebaran, ‘Pepper’ is a humanoid robot designed to live with human beings.
‘Pepper’ can read a person’s mood using a knowledge of universal emotions (joy, surprise, anger, doubt and sadness). It analyses facial expressions, body language and words, and then adapts to your needs.
Bruno Maisonnier of developer Aldebaran explained: 'The robots are able to understand if there is a positive or a negative emotion in front of them, we already have this basic function. They know if the person in front of them is happy or unhappy. But they are not able to discern if the person is depressed or nervous, they just know if the person has a positive or negative feeling, which is already not bad, and it opens up a lot of possibilities. Little by little over time, on our own or with the scientific community, we can broaden that out to a whole range of emotions.'
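Aldebaran’s actual perception stack is proprietary, but purely as an illustration of the coarse positive/negative fusion Maisonnier describes, a sketch might combine per-channel cues like this. The channels, weights and thresholds are all invented for the example.

```python
# Scores from separate perception channels are assumed to arrive in [-1, 1],
# where -1 is strongly negative and +1 strongly positive.
def estimate_valence(face: float, posture: float, speech: float) -> str:
    """Fuse per-channel valence scores into a coarse positive/negative label."""
    # Simple weighted average; facial expression is weighted highest here,
    # on the assumption it is the strongest single cue (weights are guesses).
    fused = 0.5 * face + 0.2 * posture + 0.3 * speech
    if fused > 0.15:
        return "positive"
    if fused < -0.15:
        return "negative"
    return "neutral"


if __name__ == "__main__":
    # A clear smile, slightly slumped posture, mildly upbeat words -> positive.
    print(estimate_valence(face=0.8, posture=-0.2, speech=0.3))
```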
Maisonnier sees a future where millions of robots can become humans’ best friends. He explained that the robots are made to be friendly and cute in order to ease humans into an emotional relationship with their new companions.
He said it poses all kinds of questions about emotional attachments: 'A real connection is made, but with objects. Can we have feelings for objects, can we love objects? That’s the question behind this, a robot is an object, a specific sort of object, it is an artificial creature, but it is still an object. So can we love objects? We just have to take a look at children with their cuddly toy, their teddy bear, of course it is possible to love an object and have emotional feelings about them, but only if there is an emotional history, a common emotional background.'
Aldebaran has already opened an app store where programmers can develop new possibilities for ‘Pepper.’
BY: MSN
