A see-through screen, digital 3D objects manipulated by hand, perspective adjustments according to the user's viewing angle - these are the core features of a prototype computer desktop user interface created by Microsoft's Applied Sciences Group. The prototype uses a "unique" Samsung transparent OLED display through which the user can see their own hands to manipulate 3D objects that appear to sit behind the screen.
A demo video appears to show a working prototype of a computer markedly different from those we use today. Yes, it includes a familiar keyboard and trackpad - but these are placed behind the OLED display. The user simply lifts their hands from these input devices to manipulate on-screen (or, more accurately, behind-screen) objects, such as selecting a file or window.
The video shows the interface in action with a series of program windows stacked one behind another; the user selects the desired program by hand, using the depth of the workspace.
Similar actions are shown to manipulate 3D objects - an exciting prospect not only for gamers, but perhaps also for architects, inventors and engineers working on 3D models. The cherry on the muffin in this respect is the inclusion of head-tracking technology - step to the side to shift your angle of view and your view of the 3D objects on screen will be altered accordingly.
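For the curious, the head-tracking trick described above is commonly achieved with an off-axis (asymmetric) projection: the rendering frustum is skewed toward the viewer's head position so objects "behind" the glass exhibit realistic parallax. The sketch below is a minimal illustration of that idea in Python - the function name, screen dimensions and head coordinates are all illustrative assumptions, not details of Microsoft's prototype.

```python
# Hypothetical sketch of head-coupled perspective. We compute the
# edges of an asymmetric view frustum at the near plane, given the
# viewer's head position relative to the screen centre. All units
# are metres; values are illustrative only.

def off_axis_frustum(head, screen_w, screen_h, near):
    """Return (left, right, bottom, top) frustum edges at the near
    plane for a head at (x, y, z), where z is the distance from the
    eye to the screen plane."""
    x, y, z = head
    scale = near / z  # project the screen edges onto the near plane
    left   = (-screen_w / 2 - x) * scale
    right  = ( screen_w / 2 - x) * scale
    bottom = (-screen_h / 2 - y) * scale
    top    = ( screen_h / 2 - y) * scale
    return left, right, bottom, top

# Head centred in front of the screen: the frustum is symmetric.
l, r, b, t = off_axis_frustum((0.0, 0.0, 0.6), 0.5, 0.3, 0.1)

# Head moved 10 cm to the right: the frustum skews, shifting the
# apparent position of objects behind the screen - the parallax
# effect seen when the user steps to the side in the demo video.
l2, r2, b2, t2 = off_axis_frustum((0.1, 0.0, 0.6), 0.5, 0.3, 0.1)
```

The resulting edge values would typically be fed to a projection call such as OpenGL's `glFrustum` each frame, recomputed as the tracked head moves.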
The video certainly raises questions about the future of human-computer interaction - not all of them necessarily intentional. Non-touch typists may balk at the idea of a keyboard positioned behind a busy screen (the display may be transparent, but that's not to say the information it displays is). Similarly, objects arranged behind one another will necessarily impede the view.
But to quibble at the details is rather to miss the point. "This project advances research in current display technologies hoping to provide a more [natural] interaction with everyday desktop computing of the future," said Cati Boulanger, a researcher at the Applied Sciences Group. Even with a working prototype, this is a technological "what if," not a "thou shalt."
See below for a demo video of the prototype in use.
Via Wired UK
I love debbie downers like you. No foresight and only the desire to put down innovation.
Do you think, maybe, that this would not be used by people typing all day and would, instead, be deployed by graphic designers, 3D design engineers, physicists and chemists exploring chemical structures, and a host of other people looking for visualization aids?
Seriously, after seeing this, the first and most important thing that came to your mind was, gee, this sucks because IT people who type all day won't use it because of ergonomic restraints?
So sad.
Yes, "graphic designers, 3D design engineers, physicists and chemists exploring chemical structures, and a host of other people looking for visualization aids" would use this system. However, even they would be susceptible to the effects of holding their arms out all day. For short-term use, it would probably be of some benefit.
But, as Slowburn commented, it would probably be way more efficient, and less tiring, if the screen were a visor display instead.
I just look forward to the tablet version of these screens - you could use it as a HUD it seems:) Now add the sensors so you can wave a hand to get more info etc and it is a winner:)
I also want to point out that research and design is all about creating something (a prototype) and testing it to identify problems like the ones being pointed out here. Maybe we shouldn't develop any new technology unless it is absolutely perfect from the very beginning?