A raised eyebrow, a quizzical look or a nod of the head are just a few of the expressions computers could soon be using to read people's minds.
An "emotionally aware" computer being developed by British and American scientists will be able to read an individual's thoughts by analysing a combination of facial movements that represent underlying feelings.
Peter Robinson of Cambridge University in England said: "The system we have developed allows a wide range of mental states to be identified just by pointing a video camera at someone."
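The article does not describe the classifier itself, but the idea Robinson outlines — inferring an underlying mental state from a combination of observed facial movements — can be sketched as a simple scoring scheme. All movement names and state "signatures" below are hypothetical illustrations, not the actual Cambridge/MIT model:

```python
# Hypothetical sketch: infer a mental state from a combination of
# observed facial movements. The movement names and signatures are
# illustrative only, not the researchers' actual model.

# Each candidate mental state is characterised by a set of movements.
STATE_SIGNATURES = {
    "interested": {"raised_eyebrow", "head_nod", "eye_contact"},
    "confused":   {"furrowed_brow", "head_tilt", "quizzical_look"},
    "bored":      {"gaze_away", "slow_blink", "head_droop"},
}

def infer_mental_state(observed_movements):
    """Return the state whose signature best overlaps the observations."""
    observed = set(observed_movements)
    # Score each state by the fraction of its signature that was seen.
    scores = {
        state: len(observed & signature) / len(signature)
        for state, signature in STATE_SIGNATURES.items()
    }
    return max(scores, key=scores.get)

print(infer_mental_state(["raised_eyebrow", "head_nod"]))  # interested
```

A real system would of course extract such movements from video frames and use a trained statistical model rather than fixed signatures, but the overlap-scoring idea conveys why a *combination* of movements, rather than any single expression, identifies the mental state.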
He and his collaborators believe the mind-reading computer's applications could range from improving people's driving skills to helping companies tailor advertising to people's moods.
"Imagine a computer that could pick the right emotional moment to try to sell you something, a future where mobile phones, cars and websites could read our mind and react to our moods," he added.
The technology is already programmed to recognise 24 facial expressions generated by actors.
When the system is unveiled at a science exhibition in London on Monday, July 3, Robinson hopes to gather more data from the public so it can determine whether someone is bored, interested or confused, or whether they agree or disagree.
People visiting the four-day exhibition organised by the Royal Society, Britain's academy of leading scientists, will be invited to take part in a study to sharpen the programme's abilities.
The scientists, who are developing the technology in collaboration with researchers at the Massachusetts Institute of Technology (MIT) in the US, also hope to extend it to accept other inputs such as posture and gesture.
"Our research could enable websites to tailor advertising or products to your mood," Robinson said.
"For example, a webcam linked with our software could process your image, encode the correct emotional state and transmit information to a website."
It could also be useful in online teaching, showing whether a student understands what is being explained, and in road safety, determining whether a driver is confused, bored or tired.
"We are working with a big car company and they envision this being employed in cars within five years," Robinson said, adding that a camera could be built into the dashboard.
Anyone who does not want to give away too much information about what they are feeling, he said, can just cover up the camera.