
Use your hands

Posted in Design, Multimodal, Planet GNOME, Planet Igalia on November 9th, 2011 by femorandeira

I have just read Bret Victor’s excellent post on the future of interaction. He accurately describes the current mainstream interaction paradigms (and even many futuristic visions!) as “Pictures Under Glass” and argues that they do not really take advantage of most of our natural capabilities, chief among them the huge sensitivity and precision of human hands. He mentions some lines of research that might help pave the way for future interactive technologies that are better aligned with our natural capabilities. I am just going to add pointers to a few of them, plus a couple of comments.

Haptic technology focuses on providing tactile feedback. A trivial example is vibration in cell phones, but far cooler things are being researched, such as physical buttons that change dynamically, textile-based haptic displays that can be wrapped around surfaces, phones that move their centre of gravity and even touchable holograms.
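To make that trivial example a bit more concrete, here is a minimal sketch of how an application on an Android phone of that era might trigger a short vibration pattern; the class name is made up for illustration and the snippet assumes the app has been granted the VIBRATE permission.

// Hypothetical Android activity: fire a short vibration pattern as basic
// haptic feedback. Assumes the VIBRATE permission is declared in the manifest.
import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.os.Vibrator;

public class HapticFeedbackDemo extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        Vibrator vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
        // Pattern: wait 0 ms, vibrate 50 ms, pause 100 ms, vibrate 50 ms.
        long[] pattern = {0, 50, 100, 50};
        vibrator.vibrate(pattern, -1); // -1 = do not repeat the pattern
    }
}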

Touchable haptic display.

Another aspect is proprioception, the sense that informs us of the positions of our limbs and body; for instance, our use of a computer mouse depends on our ability to know the position of our hand and arm in space. Of great interest is the concept of extended physiological proprioception (original paper from the 1970s): the information obtained through a tool (e.g. the point of a pencil, or the prosthetic limbs of the original research) is actually perceived in much the same way as if the tool were part of the body. The current thinking is that this capability might have evolved as our species began to use tools, many thousands of years ago.

For instance, this means that we can grab a pen and use it to touch and successfully identify the characteristics of physical objects; you can even try it right now. Haptic pens, such as those created by SensAble, exploit this idea by providing a device that can be moved in three dimensions, with small motors that simulate the feedback you would get if you were using the pen to touch and interact with a physical object. This technology has long been very expensive, reserved for fields such as advanced engineering and medicine; as prices come down, we might begin to see more and more applications.

SensAble haptic device in use.
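For the curious, the usual trick behind this kind of feedback is penalty-based rendering: on every cycle of the device’s control loop, the position of the pen tip is measured and, if it has penetrated a virtual object, a spring-like force pushes it back out along the surface normal. The sketch below is only meant to illustrate the idea, with a made-up virtual sphere and stiffness value; real SDKs such as SensAble’s expose their own, more elaborate APIs.

// Minimal sketch of penalty-based haptic rendering: when the pen tip
// penetrates a virtual sphere, respond with a spring force proportional
// to the penetration depth, directed outward along the surface normal.
public class PenaltyForceDemo {

    // Virtual sphere centred at the origin; radius in metres, stiffness in N/m.
    static final double RADIUS = 0.05;
    static final double STIFFNESS = 800.0;

    /** Returns the force (x, y, z) to command for a pen tip at position p. */
    static double[] computeForce(double[] p) {
        double dist = Math.sqrt(p[0] * p[0] + p[1] * p[1] + p[2] * p[2]);
        if (dist >= RADIUS || dist == 0.0) {
            return new double[] {0, 0, 0}; // outside the object: no feedback
        }
        double penetration = RADIUS - dist;
        double scale = STIFFNESS * penetration / dist; // push outward along the normal
        return new double[] {p[0] * scale, p[1] * scale, p[2] * scale};
    }

    public static void main(String[] args) {
        // Pen tip 1 cm inside the surface along x: expect an outward force on x.
        double[] force = computeForce(new double[] {0.04, 0.0, 0.0});
        System.out.printf("F = (%.2f, %.2f, %.2f) N%n", force[0], force[1], force[2]);
    }
}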

Tangible interfaces allow the user to interact with digital information by manipulating their physical environment. Many different directions are being explored at the moment; a favourite for the nerd in me is this implementation of a D&D role-playing game on Microsoft Surface.

Dungeons & Dragons on a Microsoft Surface.

Tangible and haptic interfaces seem a good fit for dealing with computerised representations of physical objects. However, much of the work that we do on computers has to do with manipulating abstract information, and there might not be an immediate way to translate and represent it in a physical interface. Different metaphors will need to be developed and evaluated.

This is linked to multimodal interaction, a branch of HCI research that tries to provide information to different human senses and to gather input from varied sources. The spread of mobile devices requires forms of interaction that remain usable when a keyboard and screen are not readily available. Generally speaking, this is also a more human approach, in that it has the potential to make better use of our innate abilities, and a more humane one, since not all of us have the same senses available, or with the same precision.

(Sources for the images: one, two, three)

http://www.sensable.com/products-haptic-devices.htm