Those old enough to remember the command line interfaces of yesteryear are only too aware of what a godsend today's graphical user interfaces (GUIs) are. However, the human-computer interface (HCI) developed in the 1970s at Xerox PARC, combining a desktop-metaphor GUI with a mouse controller, has remained largely unchanged ever since. Now R. Clayton Miller proposes the next step in the evolution of HCIs with his 10/GUI concept, which harnesses the power of multi-touch by removing the touch surface from the screen.
Current GUIs rely on the mouse, which confines a user's hand to a single pair of coordinates, even though the human hand is capable of multiple intricate manipulations. Recent multi-touch interfaces popularized on mobile gadgets, such as Apple's iPhone, have recognized this and proved their worth on smaller handheld devices - so much so that computer makers are now extending multi-touch capabilities to desktop computers, but without the same level of success.
Repositioning the touch surface
Using a traditionally placed desktop screen as a touch interface, even for short periods, places too much strain on a user's arm. Touchscreens used in a drafting-table configuration lessen the strain on the arms, but increase the chances of neck strain, as users are forced to look down at the display. And in either setup there is also the problem of the user's hands obstructing the display. Miller's 10/GUI overcomes these problems by splitting the touch surface from the screen, using a surface similar to a drawing tablet that is large enough to accommodate all ten fingers. A high-resolution capacitive array, able to sense the position of each finger and to detect individual finger presses, would allow ten circular crosshair cursors to appear onscreen at once, instead of a single mouse pointer. Since this surface would sit on the desk, users could combine the ergonomic advantages of using a mouse with the benefits of multi-touch.
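To make the idea concrete, here is a minimal sketch (in Python, with names of my own choosing rather than anything from Miller's design) of the kind of state such a surface would have to report: up to ten simultaneous touch points, each with a position and a pressed flag, mapped onto the screen as individual crosshair cursors.

```python
from dataclasses import dataclass

# Illustrative only: up to ten tracked touch points, each with a position on
# the pad and a pressed/hovering flag, rendered as ten on-screen cursors.

@dataclass
class TouchPoint:
    finger_id: int      # stable id while the finger stays on the surface
    x: float            # normalized pad coordinates, 0.0 - 1.0
    y: float
    pressed: bool       # True when the finger is pressing, not just resting

def to_screen_cursors(points, screen_w, screen_h):
    """Map normalized pad coordinates to screen pixels, one cursor per finger."""
    return [
        (p.finger_id, int(p.x * screen_w), int(p.y * screen_h), p.pressed)
        for p in points[:10]  # the surface tracks at most ten fingers
    ]

# Example: two fingers resting on the pad, one of them pressing.
touches = [TouchPoint(0, 0.25, 0.5, False), TouchPoint(1, 0.60, 0.45, True)]
print(to_screen_cursors(touches, 1920, 1080))
```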
CON10UUM
And the 10/GUI wouldn't just offer familiar multi-touch benefits, such as simpler zooming or rotating of images. Miller also proposes a new way to deal with the clutter of multiple windows on a desktop. To overcome the problem of multiple, arbitrarily placed windows that can be difficult to sort through, he has come up with a system he calls CON10UUM, which organizes windows linearly. Newly opened windows would appear on the right side of the screen and take up the entire height of the display. Each successive window would slide in from the right, pushing the existing open windows to the left. When managing windows using multi-touch, the greater the number of fingers used, the higher the level at which they take effect. For example, one finger manipulates objects inside applications, while two fingers scroll or pinch-zoom within applications.
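As a rough illustration of that linear arrangement, the sketch below models the window strip as an ordered list in which newly opened windows join on the right and push the existing ones leftward; the class and method names are assumptions for illustration, not taken from Miller's design.

```python
# Sketch of CON10UUM's linear window arrangement as described above.

class WindowStrip:
    def __init__(self):
        self.windows = []          # leftmost first, rightmost last
        self.viewport_start = 0    # index of the leftmost visible window

    def open_window(self, title):
        # A new window slides in from the right edge of the screen...
        self.windows.append(title)
        # ...which effectively pushes the existing windows to the left.
        self.viewport_start = max(0, len(self.windows) - 1)

    def visible(self, columns=1):
        """Full-height windows currently on screen (one or more columns)."""
        return self.windows[self.viewport_start:self.viewport_start + columns]

strip = WindowStrip()
for app in ["Browser", "Mail", "Editor"]:
    strip.open_window(app)
print(strip.visible())   # ['Editor'] -- the most recently opened window
```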
Now, this is where the 10/GUI interface starts to show the advantages of using more than two fingers. Three fingers let the user move applications around the desktop, while pinching with three fingers resizes the application. Four fingers scroll left or right through the open applications, and pinching with four fingers zooms out across them, making it easy to find the one you want.
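The finger-count hierarchy described in the last two paragraphs boils down to a simple mapping from the number of fingers on the surface to the scope of the gesture. The sketch below captures that idea; it is my interpretation of the concept, not code from the 10/GUI project.

```python
# Illustrative dispatch table for the finger-count hierarchy described above.

GESTURE_LEVELS = {
    1: "point/manipulate objects inside the focused application",
    2: "scroll or pinch-zoom content inside the focused application",
    3: "move or pinch-resize the application window itself",
    4: "scroll or pinch-zoom across all open applications",
}

def dispatch(finger_count, pinch=False):
    """Return a description of what a gesture with this many fingers does."""
    action = GESTURE_LEVELS.get(min(finger_count, 4), "no-op")
    return f"{'pinch: ' if pinch else ''}{action}"

print(dispatch(2, pinch=True))   # zoom inside the current application
print(dispatch(4, pinch=True))   # zoom out over all open applications
```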
Two hands can even be used at once to zoom out with one hand and move applications around with the other. But even the CON10UUM system can become a chore to scroll through with enough open windows, so continuing to zoom out will provide an annotated thumbnail view of the open windows separated by application.
CON10UUM would also see the left and right edges of the touch surface acting as specialized areas for the 10/GUI interface. A subtle ridge would delineate the strips and allow them to be located by the sense of touch. Touching the right edge area would activate global menu options, such as opening applications and shutting the computer down, while touching the left edge area would trigger local menus, such as the current application menu.
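One way to picture those edge strips is as fixed regions of the pad's coordinate space, so a touch can be routed to the local menu, the global menu, or the main multi-touch area purely by where it lands. The thresholds and function name below are assumptions for illustration only.

```python
# Sketch: classify a touch by which region of the surface it lands in.

LEFT_STRIP = 0.05    # leftmost 5% of the pad: local (application) menu
RIGHT_STRIP = 0.95   # rightmost 5% of the pad: global (system) menu

def classify_touch(x_normalized):
    """Return which region of the touch surface a finger landed in."""
    if x_normalized <= LEFT_STRIP:
        return "local menu (current application)"
    if x_normalized >= RIGHT_STRIP:
        return "global menu (open applications, shut down, ...)"
    return "main multi-touch area"

print(classify_touch(0.02))   # local menu (current application)
print(classify_touch(0.98))   # global menu (open applications, shut down, ...)
print(classify_touch(0.50))   # main multi-touch area
```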
In the video Miller has created detailing his concept, which can be seen below, no mention is made of text entry using the system. But at the very end of the video is a rendering of a keyboard with an integrated touch surface along the bottom, similar to a laptop keyboard/touchpad setup. Miller has obviously spent some time devising his 10/GUI concept and, given the increasing popularity of touch-based interfaces, it looks like a viable direction for HCIs to head in.
Miller admits, "relentless prototyping, user testing, and iteration, combined with exacting control over the software and hardware in concert, would be key to transforming these principles into something usable, versatile, and marketable." And while that's definitely true, Miller has already succeeded in his aim to "inform, inspire, and start discussions."
Watch the video and let us know what you think of Miller's 10/GUI concept. Can it work? Do you have some ideas to make it better? We'd love to hear your thoughts.