audio-proprioceptive user interfaces (in progress)

Gesture-based user interfaces that exploit proprioception still require feedback to be delivered through other sensory channels. Sound is an always-available, fast-acting, and versatile medium for this feedback: it does not demand the user's explicit attention, and it is computationally and materially cheap to produce.

Artifacts that manifest solely through the combination of hearing and proprioception have only become possible in recent years, with the advent of commodity markerless motion-sensing devices (e.g. Kinect, Leap Motion). As a result, compared to other interaction modalities, the psychology of interacting with such artifacts is not well studied.
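As a concrete illustration only (not the setup used in our experiments), the sketch below sonifies a tracked hand position by mapping its height to the pitch of a short, continuously replayed tone. get_hand_height() is a hypothetical stand-in for whatever a motion sensor's SDK would report, and numpy/sounddevice are assumed merely for simple tone synthesis and playback.

```python
# Minimal sketch of audio-proprioceptive feedback: the user "hears" where
# their hand is without looking at a screen. Hypothetical tracker input.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100

def get_hand_height():
    """Placeholder for a markerless tracker reading (e.g. from a Kinect or
    Leap Motion SDK); returns a normalized hand height in [0, 1]."""
    return 0.5  # hypothetical constant reading, for illustration only

def tone_for_height(height, duration=0.1):
    """Map normalized hand height to pitch over a two-octave range
    (220 Hz at the bottom, 880 Hz at the top) and synthesize a short sine tone."""
    freq = 220.0 * (2.0 ** (2.0 * height))
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return 0.2 * np.sin(2 * np.pi * freq * t).astype(np.float32)

if __name__ == "__main__":
    # Continuously re-sample the hand position and play the corresponding tone,
    # closing the gesture -> proprioception + audio feedback loop.
    while True:
        sd.play(tone_for_height(get_hand_height()), SAMPLE_RATE, blocking=True)
```

In practice the mapping from body position to sound parameters (pitch, loudness, timbre, spatialization) is itself a design variable, which is part of what the experiments below examine.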

We are conducting experiments to characterize the experience of interacting with audio-proprioceptive artifacts and to identify design considerations for user interfaces that exploit this modality.

image by mtytel / CC BY 3.0