DIGITAL SENSES


Recently, IBM released a list of innovations that have the potential to change the way people work, live and interact during the next five years. The list, based on market and societal trends as well as emerging technologies from IBM’s R&D labs, says touch, sight, hearing, taste and smell will be the next big things in computing.

“We have already witnessed the benefits of cognitive systems for advancing numerous aspects of the human experience - from agriculture and healthcare to utility management and weather forecasting. We envision a day when computers make sense of the world around them just as the human brain does, by interacting with the world through multiple senses,” said Ramesh Gopinath, Director of the India Research Lab and Chief Technology Officer, IBM India/South Asia.

We take a look at what IBM and others are doing to bring the senses to your computer or phone. Interestingly, almost all of these technologies are at advanced stages of testing.

So while IBM says they will be available to consumers by 2017, don’t be surprised if some of these technologies take a shortcut and arrive in devices that go on sale within a couple of years.

Here is how the senses are finding their way into our digital lives:

SIGHT

By the end of this decade, computers will not only be able to look at and recognise the contents of images and visual data, they will also start making sense of the pixels much as a human views and interprets a photograph. Future computers will know that a red light means stop and will be able to interpret signage on a road. A precursor to this can be seen in the Google Goggles app, which recognises products from photographs and returns information about them. But IBM says that in five years these capabilities will be put to work in healthcare, making sense of massive volumes of medical information. For instance, computers will be able to differentiate healthy from diseased tissue. Another use could be cameras as body scanners that tell which outfit will be a perfect fit for a person. The apparel industry is already experimenting with this technology, with an eye on how it could bring in more online buyers.
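To make the idea of “making sense of pixels” a little more concrete, here is a minimal Python sketch of one way software could separate healthy from diseased tissue: reduce each image patch to simple colour statistics and compare them against labelled examples. The patch data, labels and colour values are hypothetical toy inputs, not IBM’s method.

```python
# Minimal sketch: classify image patches by comparing colour statistics
# against labelled examples. All data below is synthetic and illustrative.

import numpy as np

def colour_features(patch):
    """Reduce an RGB patch (H x W x 3 array) to per-channel mean and std."""
    return np.concatenate([patch.mean(axis=(0, 1)), patch.std(axis=(0, 1))])

def nearest_centroid(train_patches, train_labels):
    """Build one feature centroid per label."""
    feats = np.array([colour_features(p) for p in train_patches])
    return {
        label: feats[[i for i, l in enumerate(train_labels) if l == label]].mean(axis=0)
        for label in sorted(set(train_labels))
    }

def classify(patch, centroids):
    """Return the label whose centroid is closest to this patch's features."""
    f = colour_features(patch)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

# Hypothetical toy data: "diseased" patches skew darker and redder.
rng = np.random.default_rng(0)
healthy = [rng.normal([180, 140, 140], 10, (16, 16, 3)) for _ in range(20)]
diseased = [rng.normal([120, 60, 60], 10, (16, 16, 3)) for _ in range(20)]
centroids = nearest_centroid(healthy + diseased, ["healthy"] * 20 + ["diseased"] * 20)

test_patch = rng.normal([125, 65, 65], 10, (16, 16, 3))
print(classify(test_patch, centroids))  # expected: "diseased"
```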

TOUCH

Scientists have for decades been trying to bring touch and feel to mechanical devices. Now this dream is becoming a reality: we could have mobile devices that let you touch and feel products, redefining the retail business across the world. IBM says its scientists are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric – imagine a shopper touching the screen to feel the texture of a fabric she wants to buy. This technology will use the vibration capabilities of the phone, assigning each article a unique set of vibration patterns that recreates its texture.
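As an illustration of that last idea, here is a minimal Python sketch that assigns each fabric a hypothetical vibration pattern and expands it into a waveform of pulses and pauses. The fabric names and pattern values are made up; a real app would hand such a waveform to the phone’s haptics API, which is not shown here.

```python
# Minimal sketch: map fabric textures to vibration waveforms.
# Pattern parameters are hypothetical placeholders.

FABRIC_PATTERNS = {
    # fabric: (pulse duration in ms, gap in ms, amplitude 0-255)
    "silk":     (10, 40, 60),   # short, soft pulses for a smooth feel
    "denim":    (30, 20, 180),  # longer, stronger pulses for a coarse weave
    "corduroy": (20, 10, 220),  # dense pulses for a ridged surface
}

def vibration_waveform(fabric, total_ms=500):
    """Expand a fabric's pattern into (duration_ms, amplitude) steps."""
    pulse, gap, amp = FABRIC_PATTERNS[fabric]
    steps, elapsed = [], 0
    while elapsed < total_ms:
        steps.append((pulse, amp))  # vibrate
        steps.append((gap, 0))      # pause
        elapsed += pulse + gap
    return steps

print(vibration_waveform("denim")[:4])
```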

SMELL

In the next five years, tiny sensors embedded in your computer or cell phone will detect whether you’re coming down with a cold or another illness. By analysing odours, biomarkers and thousands of molecules in someone’s breath, and detecting which odours are normal and which are not, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy. IBM scientists are already sensing environmental conditions and gases to preserve works of art. Companies like DigiScents and TriSenx are developing devices that will be able to recreate smells when connected to a computer. Imagine holding your nose as you watch a video of a fish market. DigiScents, for instance, has indexed thousands of smells based on their chemical structure and their place on the scent spectrum, then coded and digitised them into a small file that can be embedded in web content. Meanwhile, IBM technology will “smell” surfaces for disinfectants to determine whether rooms have been sanitised. Using novel wireless “mesh” networks, sensors will gather and measure data on various chemicals, continuously learning and adapting to new smells over time.
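To show roughly what digitising a smell into a small, embeddable file could look like, here is a sketch that encodes a scent as a compact profile of component intensities. The component names, intensity scale and file format are illustrative assumptions, not DigiScents’ actual encoding.

```python
# Minimal sketch: serialise a scent profile to a small blob that could be
# embedded in web content. Component names and intensities are hypothetical.

import json

def encode_scent(name, components):
    """Serialise a scent profile (component -> intensity 0-1) to compact JSON bytes."""
    profile = {"name": name, "components": components}
    return json.dumps(profile, separators=(",", ":")).encode("utf-8")

def decode_scent(blob):
    """Recover the scent profile from its serialised form."""
    return json.loads(blob.decode("utf-8"))

fish_market = encode_scent("fish market",
                           {"trimethylamine": 0.9, "brine": 0.6, "seaweed": 0.4})
print(len(fish_market), "bytes ->", decode_scent(fish_market)["components"])
```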

TASTE

IBM researchers are developing a computing system that experiences flavour. It works by breaking ingredients down to the molecular level and blending the chemistry of food compounds with the psychology behind the flavours and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavour combinations that pair well. At the University of Tsukuba in Japan, researchers are working on a food simulator that can mimic the taste and “mouthfeel” of food. Most of these systems use algorithms to determine the precise chemical structure of food and why people like certain tastes. The algorithms examine how chemicals interact with each other, the molecular complexity of flavour compounds and their bonding structure, and use that information, together with models of perception, to predict taste appeal.
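One common way to think about such pairing algorithms is to score ingredient pairs by how many flavour compounds they share. The sketch below illustrates that idea with made-up ingredient and compound lists; it is not IBM’s system or real chemistry data.

```python
# Minimal sketch: rank ingredient pairs by shared flavour compounds.
# Ingredient/compound sets are illustrative only.

from itertools import combinations

COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "chocolate":  {"furaneol", "pyrazine", "vanillin"},
    "basil":      {"linalool", "eugenol"},
    "parmesan":   {"butyric acid", "pyrazine"},
}

def pairing_score(a, b):
    """Number of flavour compounds two ingredients have in common."""
    return len(COMPOUNDS[a] & COMPOUNDS[b])

ranked = sorted(combinations(COMPOUNDS, 2),
                key=lambda pair: pairing_score(*pair), reverse=True)
for a, b in ranked:
    print(f"{a} + {b}: {pairing_score(a, b)} shared compound(s)")
```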

SOUND

When computers start hearing, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies, and interpret them to predict when trees will fall in a forest or when a landslide is imminent. Such a system could also gauge the mood of a speaker, or analyse whether he is lying. These systems will pinpoint aspects of a conversation and analyse pitch, tone and hesitancy to help us have more productive dialogues, which could improve customer call-centre interactions or allow us to interact seamlessly across cultures. Scientists are now studying underwater noise levels to understand the impact of wave-energy conversion machines on sea life.
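As a small illustration of the sensing step, the sketch below estimates the dominant frequency of an audio sample with an FFT and flags it when it drifts from a baseline. The signals, sample rate and alert threshold are synthetic stand-ins for real sensor data.

```python
# Minimal sketch: find the dominant frequency of a signal and flag anomalies.
# Signals and threshold are synthetic placeholders.

import numpy as np

def dominant_frequency(samples, sample_rate):
    """Return the strongest frequency component of the signal via an FFT."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

sample_rate = 8000
t = np.arange(0, 1.0, 1.0 / sample_rate)
baseline = np.sin(2 * np.pi * 50 * t)   # normal low-frequency rumble
anomaly = np.sin(2 * np.pi * 440 * t)   # a new, higher-pitched component

for name, signal in [("baseline", baseline), ("anomaly", anomaly)]:
    f = dominant_frequency(signal, sample_rate)
    print(f"{name}: dominant {f:.0f} Hz", "-> alert" if f > 100 else "")
```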
