Mood Map isn't the first project to draw on Twitter data, and it works in a somewhat similar (though far simpler) fashion to the Hedonometer we reported on previously. The project hooks into Twitter's API to search for, and analyze, specific strings of Korean characters that E/B Office reckons depict moods or feelings such as pride, joy, anger, and pity.
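The keyword-matching side of that process can be sketched in a few lines. This is only an illustration of the general idea, not E/B Office's actual software: the Korean keywords below are common dictionary words for those four moods, standing in for whatever strings the installation really monitors, and the function names are our own.

```python
from collections import Counter

# Illustrative stand-ins, not the actual strings E/B Office searches for:
# 기쁨/행복 (joy/happiness), 분노 (anger), 자랑 (pride), 연민 (pity).
MOOD_KEYWORDS = {
    "joy": ["기쁨", "행복"],
    "anger": ["분노"],
    "pride": ["자랑"],
    "pity": ["연민"],
}

def classify(tweet_text):
    """Return the set of moods whose keywords appear in a tweet."""
    return {mood for mood, words in MOOD_KEYWORDS.items()
            if any(word in tweet_text for word in words)}

def tally(tweets):
    """Count mood occurrences across a batch of collated tweets."""
    counts = Counter()
    for text in tweets:
        counts.update(classify(text))
    return counts
```

In practice the tweets themselves would come from a search call against Twitter's API, with the tallies handed off to whatever drives the lights.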
Once the relevant Twitter data is collated, custom software built with Arduino and the Processing programming language runs it through one of three fiber-optic light sequences.
The first sequence visualizes tweets in real time, while the second interprets the two moods analyzed over the previous hour. Finally, the third displays the collective data for a single mood gathered over an entire day. All of which sounds more than a little vague, but judging by the photos, it produces a stunning visual effect.
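The three modes amount to slicing the same mood-tagged stream over three time windows. Here's a minimal sketch of that dispatch logic, assuming a data model of (timestamp, mood) pairs; the window sizes and function names are our assumptions, not details from E/B Office.

```python
from collections import Counter
from datetime import datetime, timedelta

def moods_for_sequence(sequence, tagged, now):
    """Pick the moods to display for one of the three light sequences.

    tagged: list of (timestamp, mood) pairs from the analyzed tweets.
    """
    if sequence == 1:
        # Real time: whatever arrived in the last few seconds.
        cutoff = now - timedelta(seconds=5)
        return [mood for ts, mood in tagged if ts >= cutoff]
    if sequence == 2:
        # The two most common moods from the previous hour.
        cutoff = now - timedelta(hours=1)
        hourly = Counter(mood for ts, mood in tagged if ts >= cutoff)
        return [mood for mood, _ in hourly.most_common(2)]
    if sequence == 3:
        # The single dominant mood over the whole day.
        cutoff = now - timedelta(days=1)
        daily = Counter(mood for ts, mood in tagged if ts >= cutoff)
        return [daily.most_common(1)[0][0]] if daily else []
    raise ValueError("sequence must be 1, 2 or 3")
```

Each returned list would then map onto a different fiber-optic color or animation pattern in the installation.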
According to E/B Office, the installation is intended to "express a flux of mood, feeling, intensity and time transmitted to a spatial 3D body." Whatever, let's just be grateful that somebody's finally found a good use for all those LOLs and retweeted photos of cats.