ABSTRACT: FRAMING EMBODIMENT IN GENERAL-PURPOSE COMPUTING
The last thirty years have presented us with technology that has had a profound impact on how we produce, socialize, and consume culture. Today most of these activities are tied to a computational setup involving a screen that represents our options in two dimensions and a hand-operated controller for manipulating the on-screen environment, a hardware configuration that has not changed considerably in the last fifty years. The dominant interface for personal computers, the graphical user interface, is highly ocularcentric: only parts of the bodily apparatus (the eyes and hands) are addressed directly by the interface. As an increasing amount of information, life experience, and human contact is channeled through it, the desktop computer system becomes increasingly inadequate to represent these activities fully. Any prosthesis added to or used in conjunction with the body, and any part of the sensory apparatus that is neglected, will define our interaction with information. Information gathered by the haptic senses, touch and proprioception, constitutes a significant component of how we form hypotheses about what an object is and how it can be manipulated. By addressing the haptic senses in computer interfaces, we can achieve richer and more intuitive interactive experiences.
This thesis aims to identify the key components of a general-purpose computational environment that foregrounds multimodal interaction by:
1) investigating the significant qualities of the haptic senses from a phenomenological and neurophysiological point of view,
2) pointing to successful principles of human-computer interaction (coupling) and to tools for designing embodied interactions (physical metaphors, interface agents, affordances, and visual and haptic feedback), and
3) evaluating the components of current mobile phone technology, surface computing, responsive environments, and wearable computing.