alexandtheweb / blog

On-Body Interaction: at one with the interface

In preparation for a class presentation, my study group at Sussex immersed itself for a few short weeks in the rather vast discipline of Ubiquitous Computing. When we came up for air, we were left with the feeling that we’d only just scratched the surface. I personally felt we didn’t spend enough time focusing our attention on some of the new interaction paradigms which will make UbiComp viable and, well, ubiquitous. The work of Chris Harrison at Carnegie Mellon’s HCII is arguably at the forefront of these new and exciting interactions and boggles the mind in its ambition and scope.

Armura - on-body projection

Chris’ interests are in novel interaction models for small devices – see, for example, his PocketTouch project, which allows access to basic functionality of mobile devices without the need to retrieve them from one’s bag or pocket. But he’s more recently been in the news with Armura, a new system which combines gestural input with on-body projection. I loved the project’s flurry of practical use cases, from summoning directions while lost in a museum to virtual tattoos to innovative on-body menu interactions. Chris admits there are challenges here, from the limits of dexterity to fatigue. As he bluntly puts it in his overview of the Armura project: “our bodies have no API”.