As the year nears its close, IBM has pulled out the crystal ball, as it has every year since 2006, and given us its predictions of five innovations it believes will impact our lives in the next five years. For this year’s “5-in-5” list, IBM has taken a slightly different approach, with each entry relating to one of our senses. The company believes cognitive computing – whereby computers learn rather than relying passively on programming – will be at the core of these innovations, enabling systems that enhance and augment each of our five senses.
In the past five years touch screens have become a part of everyday life for most of us, but IBM believes the technology will develop to include haptic feedback, so that we will also be able to feel the texture of objects being displayed. This will be enabled by technology such as the REVEL system developed at Disney Research, which uses reverse electrovibration.
While the basic technology is already there, IBM says the development of a “Product Information Management” (PIM) database system that acts as a dictionary matching vibration patterns to physical objects will allow texture information to be easily paired with specific items. This will benefit not only online clothing retailers, which are expected to be the first major users of the technology, but also farmers – who will be able to determine the health of their crops by comparing their texture to that of a healthy plant – and doctors – who can literally get a feel for an injury to help with a diagnosis.
According to Pingdom, in 2011 an average of 4.5 million photos were uploaded to Flickr every day, contributing to some 6 billion photos hosted on the site, while Facebook hosted an estimated 100 billion photos and 60 photos were uploaded to Instagram every second. While it is digital cameras that are responsible for this explosion in online photographic content, digital technology is still pretty “dumb” when it comes to analyzing images. This means sorting through them generally relies on user-defined tags and text descriptions, which are time-consuming to set up and not always accurate.
IBM says that in the next five years, cognitive computing technology will allow computers to examine thousands of images and recognize patterns and distinct features to determine their content. For example, in beach scenes the computer might recognize certain color distributions that are common to such images, while for a downtown cityscape it might learn that certain distributions of edges set such scenes apart. Then, once it has the general location down, it could be taught about the activities likely to be carried out there.
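As a rough illustration of the kind of low-level pattern recognition described above – not IBM's actual system – the sketch below labels an "image" (here just a list of RGB pixels) by comparing its coarse color histogram against invented class examples:

```python
# Toy sketch: classifying a scene by its color distribution.
# An "image" here is a list of (r, g, b) pixel tuples, and the
# class data is made up purely for illustration.

def color_histogram(pixels, bins=4):
    """Normalized histogram over coarse (r, g, b) color bins."""
    step = 256 // bins
    hist = [0.0] * (bins ** 3)
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1
    return [h / len(pixels) for h in hist]

def classify(pixels, centroids):
    """Return the label whose example histogram is nearest (L1 distance)."""
    hist = color_histogram(pixels)
    return min(centroids, key=lambda label: sum(
        abs(a - b) for a, b in zip(hist, centroids[label])))

# Hypothetical "learned" classes: beaches dominated by sand and sky
# tones, cityscapes by concrete grays.
centroids = {
    "beach": color_histogram([(230, 210, 160)] * 60 + [(120, 180, 240)] * 40),
    "city": color_histogram([(90, 90, 95)] * 70 + [(40, 40, 50)] * 30),
}

# A new, mostly sand-and-sky image lands closer to the beach example.
sample = [(225, 205, 155)] * 50 + [(110, 170, 235)] * 50
print(classify(sample, centroids))  # -> beach
```

A real system would of course learn from thousands of labeled photos and combine many feature types (edges, textures, shapes), but the principle of matching statistical distributions is the same.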
While such technology would make image searches on the Web easier, IBM says cognitive visual computing could also be used to recognize tumors, blood clots and other problems at their early stages – something that is already happening for the early detection of potentially deadly melanoma. The technology could also help in the real-time monitoring of disaster areas by analyzing images uploaded to social networking sites, or keep an eye out for potential security issues by monitoring security camera footage.
The baby translator created by Homer’s brother, Herb, in The Simpsons will be a reality within five years, according to IBM. Cognitive systems with the ability to not only hear, but also understand sounds, will be able to translate a baby’s cries into spoken phrases in real time. The technology could also be used to monitor trees for sounds of stress that would indicate a potential collapse, or for more accurate noise cancellation technology in mobile phones.
IBM Research is also aiming to give us superhuman hearing by translating ultrasonic frequencies into audio that we can hear. This could potentially give humans the ability to "talk" to animals, such as dolphins and dogs.
Such a device could also work in reverse, converting audio that we can hear into ultrasonic frequencies that are transmitted to a targeted recipient, where the signal is translated back into normal audio. This could be used to privately converse with someone across a crowded room, to call children in from playing in the neighborhood for dinner, or by police to warn pedestrians of danger over traffic noise.
They say there’s no accounting for taste, but IBM says that within five years computers will be doing just that by adding another dimension to cognitive computing – creativity. A team at IBM Research is working on a system that analyzes food down to its atomic structure and combines this information with psychophysical data and models on which chemicals produce “perceptions of pleasantness, familiarity and enjoyment.”
IBM says such technology won’t just create meals that tickle our taste buds, but also ones that are healthy and meet nutritional requirements. Such a system could create nutritious school cafeteria lunches that students actually want to eat, or allow those with limited ingredients, such as people in the developing world, to create meals that optimize flavor.
We already have electronic devices that can “smell” – the most obvious example is the breathalyzer that detects alcohol from a breath sample. But IBM says electronic noses are set to become much more widespread and will provide a valuable tool for doctors. By examining the molecular biomarkers present in our breath, tiny sensors that are small enough to be integrated into mobile phones or other mobile devices will be able to provide valuable diagnostic information about our physical health.
Similar technology already exists, such as an "artificial nose" that can sniff out bacterial infections, and another that can detect narcotics and explosives. IBM says it has already demonstrated the ability to measure biomarkers down to a single molecule using relatively simple sensing systems, and believes it won’t be long before the technology is sniffing out ailments such as liver and kidney disorders, diabetes and tuberculosis.
So that’s the list of five things that IBM thinks will change our world in the next five years by leveraging the power of cognitive computing to mimic our five senses. The company is also asking readers to vote for which of the five senses they think cognitive systems will first be able to understand and learn the way we do. Cast your vote via the source link below, and let us know your thoughts on IBM's predictions in the comments.
Brief videos of each of the five sense-related technologies can be viewed below.