
Interactive intent modeling gives SciNet the edge over other search engines

SciNet combines a conventional search interface with a radar-like interactive cloud of related keywords that can be combined or moved nearer or farther from the center to improve result relevance

Google may be dominant in the battle of the search engines, but its ever-evolving PageRank algorithm and straightforward list of results don't always get you the information you want – especially when you're not sure precisely what keywords to use. Now researchers at the Helsinki Institute for Information Technology (HIIT) have developed a new alternative called SciNet that uses information visualization to help you dig through related terms as you narrow down a search. Its creators claim that it outperforms conventional search user interfaces in finding information in an academic database.

SciNet's big selling point is a user interface called IntentRadar, which is meant to make search a more exploratory experience. Results appear with a radar-like cloud of keywords and terms on the left and the traditional ranked list on the right. Keywords nearer the center of the radar are deemed more closely related to the search topic than those farther away, and related topics are shown alongside them. You can move keywords around the radar to indicate which information is most useful to you.
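The article doesn't spell out how estimated relevance translates into a position on the radar, but a simple way to picture it is to place each keyword at a radius that shrinks as its estimated relevance grows. The keywords and relevance scores below are invented for illustration; this is a minimal sketch, not SciNet's actual layout algorithm.

```python
import math

# Hypothetical intent estimates: keyword -> estimated relevance in [0, 1].
intent = {
    "gesture recognition": 0.9,
    "interaction": 0.6,
    "virtual reality": 0.5,
    "video games": 0.35,
    "hidden Markov models": 0.3,
}

def radar_layout(intent, max_radius=1.0):
    """Place each keyword on the radar: the more relevant, the closer to the center."""
    positions = {}
    n = len(intent)
    for i, (keyword, relevance) in enumerate(sorted(intent.items())):
        radius = max_radius * (1.0 - relevance)   # high relevance -> small radius
        angle = 2 * math.pi * i / n               # spread keywords evenly around the circle
        positions[keyword] = (radius * math.cos(angle), radius * math.sin(angle))
    return positions

for keyword, (x, y) in radar_layout(intent).items():
    print(f"{keyword:>22}: x={x:+.2f}, y={y:+.2f}")
```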

The idea is to interactively model your intent so that your query evolves naturally through feedback loops as you dig deeper. The researchers give the example of searching for "3D gestures." The intent model suggests gesture recognition as a highly relevant, potentially interesting intent, with other intents including video games, interaction, virtual reality, and hidden Markov models (a type of statistical model). If you drag "gesture recognition" close to the center, the radar reconfigures itself and visualizes new estimates of your intent. Conversely, if you drag "video games" farther away, keywords related to that topic will be downplayed in the results.
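SciNet's real intent model is almost certainly more sophisticated than this, but the feedback loop the paragraph describes can be sketched very simply: read the radius a keyword is dragged to back as a relevance weight, then re-rank the results against the updated weights. The documents, keywords, and scoring rule here are all made up for illustration.

```python
# Toy corpus: ids mapped to lowercase text. Everything here is invented.
documents = {
    "doc1": "hidden markov models for gesture recognition in interaction studies",
    "doc2": "video games and virtual reality controllers",
    "doc3": "3d gestures for natural user interfaces",
}

# Start with neutral weights for the keywords currently on the radar.
weights = {"gesture recognition": 0.5, "video games": 0.5, "hidden markov models": 0.5}

def apply_drag(weights, keyword, radius, max_radius=1.0):
    """Interpret a drag: the closer to the center (smaller radius), the more relevant."""
    weights[keyword] = 1.0 - min(radius, max_radius) / max_radius

def rank(documents, weights):
    """Order documents by the summed weights of the keywords they mention."""
    def score(text):
        return sum(w for kw, w in weights.items() if kw in text)
    return sorted(documents, key=lambda d: score(documents[d]), reverse=True)

# The user pulls "gesture recognition" toward the center and pushes "video games" away.
apply_drag(weights, "gesture recognition", radius=0.1)
apply_drag(weights, "video games", radius=0.9)
print(rank(documents, weights))  # the gesture-recognition paper now ranks first
```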

Relevant keywords can be combined with a click, too. Selecting "gesture recognition" and then "hidden Markov models" prompts the system to suggest results that apply hidden Markov models to gesture recognition. Keywords at the periphery can also be enlarged by a fisheye-lens effect that follows the mouse cursor around the radar. You can see a video demonstration of the system below.
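One plausible way to model the "combine" action is as a conjunctive filter: only results that mention every selected keyword survive. The helper below is a toy illustration along those lines, not SciNet's actual query logic, and the example documents are invented.

```python
def combine(documents, selected):
    """Keep only documents whose text mentions every selected keyword."""
    return [doc_id for doc_id, text in documents.items()
            if all(kw in text for kw in selected)]

# Invented example documents.
documents = {
    "doc1": "hidden markov models for gesture recognition",
    "doc2": "gesture recognition with convolutional networks",
}

# Selecting both keywords narrows the list to work applying HMMs to gesture recognition.
print(combine(documents, {"gesture recognition", "hidden markov models"}))  # ['doc1']
```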

The basic theory behind all of this is that people often aren't sure precisely what they're searching for – what is known as the vocabulary mismatch problem. It's not a big issue when your query is simple ("Thai restaurants near me" or "best websites about emerging technology"), but it becomes problematic when you're searching for more complicated information and don't know the appropriate jargon. By exploring relevant keywords, you can quickly adjust your search and home in on the abstract target in your head.

This sort of exploratory search is possible with the likes of Google and Bing, but it requires visiting pages in the results and skimming through them to find more appropriate terms, a process that can be lengthy and laborious if your search is particularly complex.

The researchers claim that their system offers dramatic real-world improvements in complex information retrieval. They tested an earlier version on 20 people in a 2013 study in which participants were given 30 minutes to solve research tasks using information retrieval systems on a database of more than 50 million scholarly articles. Interactive intent modeling significantly improved performance over a conventional system of typed queries and ranked lists. Users of the new system also iterated on their queries nearly twice as often as users of the conventional one, suggesting that it's easier to steer a search with an interactive IntentRadar.

Intent-modeled search may soon extend into wearables and augmented reality, too, the researchers suggest. Head-mounted displays like Google Glass could suggest information based on a poster at a conference, for instance, and the team has even had some success mining information relevance directly from the human mind via electroencephalography (EEG).

A company called Etsimo Ltd has been set up to further develop and commercialize the SciNet search engine, which is described in an article published in the journal Communications of the ACM.

Source: Helsinki Institute for Information Technology

2 comments
Nairda
The radar interface in the demo looks really natural. Hope we see this concept in more search engines and interfaces
MurdoMcSponge
Tried to login to SciNet through FireFox and Explorer and neither could find the site.