Multi-touch hardware and software company Ideum is exploring a potential future for the workplace in which traditional desks give way to projected capacitive touch (PCT) tables operated with both hand gestures and tangible objects. The project is called the Dynamic Desktop, and it's an idea that creative director and CEO Jim Spadaccini believes will work on any PCT screen.
"Tangible interfaces are compelling, since we all interact with objects of all kinds every day," Spadaccini tells Gizmag. "We like our physical keyboards and notepads, and the sensation of moving and interacting with real objects, for instance, but we're increasingly mired in a flat world of touchscreen convenience wherein the virtual beats the physical because it syncs up to the cloud and is more easily shared around the globe."
Ideum's Dynamic Desktop is embracing that convenience while also attempting to preserve and extend the tangibility of the physical object. You can put your phone down on the screen, for example, and drag out a few photos to share on social networks or some documents you want to work with on your desktop. Or you can plop a keyboard down and get right to work on a new document while a dozen other interfaces dot the edges of your desk.
You can get a greater sense of how it works in the video below.
The finer details are still being considered, but the idea is to make the desk itself interactive without losing the advantages of familiarity – the way we come to associate objects with particular tasks and workflows, as a kind of shorthand that reduces cognitive load. (I have a camera within arm's reach of my desk, for instance, and it would be much simpler to place the camera on the desk and bring up an interface for accessing my photos than to think about which app to open and whether I need to plug the camera into my computer to transfer any new photos across.)
"Theoretically, the system should be able to identify hundreds if not thousands of objects," Spadaccini says. But for now his team at Ideum and GestureWorks is hard-coding them in. Not just any old object will work, unfortunately. The system requires objects with capacitive qualities (i.e. conductive materials like copper or special plastics), which the team has been building with 3D printers. And objects must have unique conductive patterns so that the GestureWorks software can differentiate between them.
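To make the idea concrete, here is a minimal sketch of how software might tell one conductive pattern from another. This is purely illustrative and not Ideum's or GestureWorks' actual code: it assumes each tangible object presents a unique arrangement of contact points to the touchscreen, and identifies an object by the sorted pairwise distances between those points (a signature that doesn't change when the object is moved or rotated). The object names and measurements in the registry are invented for the example.

```python
import itertools
import math

# Hypothetical registry: each tangible is identified by the sorted pairwise
# distances (in mm) between its conductive contact points. These example
# signatures are made up for illustration.
OBJECT_SIGNATURES = {
    "phone": [30.0, 40.0, 50.0],                        # 3 points, right triangle
    "keyboard": [25.0, 25.0, 60.0, 60.0, 65.0, 65.0],   # 4 points, 25 x 60 rectangle
}

def signature(points):
    """Sorted pairwise distances between contact points.

    Sorting makes the signature invariant to where the object sits on
    the table and how it is rotated.
    """
    return sorted(math.dist(a, b) for a, b in itertools.combinations(points, 2))

def identify(points, tolerance=2.0):
    """Return the name of the registered object matching these touch points,
    or None if no signature matches within the given tolerance (mm)."""
    sig = signature(points)
    for name, ref in OBJECT_SIGNATURES.items():
        if len(ref) == len(sig) and all(
            abs(a - b) <= tolerance for a, b in zip(sig, ref)
        ):
            return name
    return None
```

In practice a production system would also have to cluster the object's contact points apart from ordinary finger touches, but the core requirement the article describes – that every object's conductive pattern be unique – is what makes this kind of lookup possible at all.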
This is one disadvantage the system has when compared to the older vision-based touch technology that Microsoft used in PixelSense (formerly known as Surface, before the other Microsoft Surface came along). The use of cameras and optics in PixelSense made it possible to recognize any type of object without the need for custom conductive patterns, or any capacitive properties whatsoever.
But vision and camera-based touchscreen systems are big; they tend to be several inches or more thick, and they can succumb to light interference or calibration issues. PCT screens, by comparison, are impervious to light interference and are already down to thicknesses of just an inch or two at these large sizes. Ideum's own Platform 55 multitouch table is now down to 0.75-in (19-mm) thick at its edges, with a 2.5-in (63.5-mm) wide pedestal that houses an integrated Intel Core i7 computer system.
Ideum started out making camera-based multi-touch tables that had no trouble handling fiducial (point of reference) markers like the ones they're prototyping now, so Spadaccini admits to being pleased that he can revisit the concept of tangible interfaces. "We think there is a real future for this technology," he says.
The Dynamic Desktop is for now just a proof of concept, but Spadaccini says it will eventually be tied into the GestureWorks SDK. Ideum hopes to have it ready next year, but won't commit to a firm release date.