When it comes to foreign-language films and TV programs, purists usually argue that subtitles preserve the vocal performance of the original actors. But I have to admit to a general preference for dubbing, mainly because I don’t like taking my eyes off the actors for extended periods (but maybe that’s just because I’m a slow reader). Researchers at BBC Research & Development could sway me to the other camp with a new system that frees subtitles from the shackles that have traditionally kept them at the bottom of the screen.
The prototype system from BBC R&D involves monitoring a group of people as they watch a program, using a Tobii eye tracker to record which areas of the screen they look at and when. Using this information, the researchers then position the subtitles near where most subjects' gaze fell.
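The BBC hasn't published how it turns that gaze data into a subtitle position, but the basic idea is easy to picture. The following is a minimal sketch, in Python, of one way it could work under my own assumptions: pool the gaze samples recorded while a line of dialogue plays, take a robust center of the cluster, and park the subtitle just below it without letting it run off screen. The screen and subtitle dimensions are placeholders, not anything from the BBC system.

```python
# A minimal sketch (not the BBC's actual code) of positioning a subtitle
# near the spot most viewers were looking at during one subtitle cue.
from statistics import median

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display size
SUB_W, SUB_H = 800, 80            # assumed size of the rendered subtitle block

def place_subtitle(gaze_samples):
    """gaze_samples: list of (x, y) points recorded from an eye tracker
    (e.g. a Tobii device) for all viewers during one cue.
    Returns the top-left corner for the subtitle block."""
    # Use the median so a few stray glances don't drag the position around.
    cx = median(x for x, _ in gaze_samples)
    cy = median(y for _, y in gaze_samples)

    # Centre the block horizontally on the gaze cluster and sit it just below,
    # so the text is close to the action without covering it.
    left = cx - SUB_W / 2
    top = cy + 60

    # Clamp to the screen so the subtitle never runs off the edge.
    left = max(0, min(left, SCREEN_W - SUB_W))
    top = max(0, min(top, SCREEN_H - SUB_H))
    return int(left), int(top)

# Example: most gaze points cluster around a speaker on the left of frame.
print(place_subtitle([(480, 400), (500, 390), (470, 420), (520, 405)]))
```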
Though the placement in the prototype was done manually, the researchers are looking at ways to automate parts of the process, such as using facial tracking and identifying low-contrast or out-of-focus areas of the frame. A human editor could then do a final pass to check the positioning. The team is also looking at ways to convey tone in the text, through the use of larger letters and the timing and position of the subtitles.
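The BBC hasn't described how that detection would work, but as a rough illustration, here is a sketch of scoring regions of a frame by how much detail they contain and treating the flattest one as a candidate subtitle spot. Everything in it, from the grid split to using the variance of the Laplacian as a focus/contrast measure, is my own assumption rather than the team's method, and it relies on OpenCV.

```python
# A rough sketch of finding a "quiet" part of the frame: low-contrast or
# out-of-focus areas are less likely to contain faces or important action.
import cv2

def quietest_region(frame_bgr, grid=(3, 3)):
    """Split the frame into a grid and return the (row, col) of the cell
    with the least local detail, measured by the variance of the Laplacian."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    rows, cols = grid
    best, best_score = (0, 0), float("inf")
    for r in range(rows):
        for c in range(cols):
            cell = gray[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            # Low Laplacian variance => flat or out-of-focus area.
            score = cv2.Laplacian(cell, cv2.CV_64F).var()
            if score < best_score:
                best, best_score = (r, c), score
    return best

frame = cv2.imread("frame.png")  # hypothetical still from the programme
if frame is not None:
    print(quietest_region(frame))
```

In practice the gaze data, face tracking and a measure like this would presumably be combined, with the human check catching anything the heuristics get wrong.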
Senior R&D Engineer Matthew Brooks told us that tests to date have produced mixed results: overall, test subjects took slightly longer to find the subtitles when they didn't appear at the bottom of the screen as expected, but less time to read them once found. He points out that the quality of the subtitle positioning ranged from good to terrible, and he hopes that further analysis of the test results will show where subtitles are best positioned in different situations.
Brooks says feedback has also been mixed, with those who commonly use subtitles preferring the traditional placement and those with little subtitle experience tending to prefer the gaze-positioned ones, suggesting the new placement may take some getting used to.
Since such a system won’t be for everyone, Brooks hopes a new subtitling standard could be developed that would transmit position metadata along with the text, allowing viewers to switch between the traditional bottom-of-the-screen placement and the new gaze-based placement.
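As a purely speculative sketch of what such a standard might carry, each cue could keep its text plus optional gaze-derived position metadata, with the player deciding which placement to honour. The field names and fallback position below are my own invention, not anything proposed by the BBC.

```python
# Speculative sketch: a subtitle cue carrying optional position metadata,
# with the player falling back to the traditional placement when asked to.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SubtitleCue:
    start: float                                          # seconds
    end: float                                            # seconds
    text: str
    gaze_position: Optional[Tuple[float, float]] = None   # (x, y) as 0-1 fractions

def render_position(cue: SubtitleCue, use_gaze_placement: bool):
    """Return the placement to use, falling back to the traditional
    bottom-centre position when gaze metadata is absent or switched off."""
    if use_gaze_placement and cue.gaze_position is not None:
        return cue.gaze_position
    return (0.5, 0.9)  # bottom centre of the screen

cue = SubtitleCue(12.0, 15.5, "Where did you hide it?", gaze_position=(0.3, 0.4))
print(render_position(cue, use_gaze_placement=True))   # (0.3, 0.4)
print(render_position(cue, use_gaze_placement=False))  # (0.5, 0.9)
```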
I did spend a few minutes watching a demonstration of the prototype system and have to admit preferring it to the traditional subtitle placement. Although I don’t generally watch a lot of subtitled programs, I felt that my eyes weren’t drawn away from the onscreen action as much and that it was easier to keep one eye on the actors while still taking in the subtitles.
Source: BBC R&D