Touch to see

Use the hands to interact directly

Replace remote gestures with direct manipulation, leveraging the neuroscience of interaction—namely, the brain’s naturally heightened understanding of objects near the hands.

Do: Use direct hand manipulation to use tools and interact with content.
Don't: Use remote gestures and abstract pointers.



UI design suggestions

1. Touch content to act on it
Tools, actions, and the content on which they operate should never be physically or spatially separated. Encourage direct contact with volumetric tools that bear affordances (physical features that suggest their use and purpose). Do not encourage the user to manipulate content from a distance or through abstract middlemen like remote gestures, buttons, and menus.

2. Mirror real-world movements when grabbing, pushing, or expanding
The user’s physical movements should reflect their task, such as grabbing, pushing, or expanding. For example, to expand a 3D model and reveal its interior, the user should grab both sides and pull in opposing directions, or depending on its material, reach in and push out from the inside. This harnesses the user’s experiences in the real world and reduces the learning curve by rewarding their assumptions. Do not associate such actions with arbitrary gestures the user must first learn and memorize (such as expanding the model by opening their fist or waving their hand).
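For teams wiring up such a two-handed expand gesture, here is a minimal sketch of the underlying math. It assumes hypothetical hand-tracking input (left and right palm positions as 3D tuples in metres) and simply scales the model by the ratio of the current hand separation to the separation when both grabs began; the names and structure are illustrative, not part of any particular SDK.

```python
# Minimal sketch of a two-handed "grab both sides and pull apart" scale gesture.
# Hand positions are assumed to arrive from a hand-tracking system as (x, y, z)
# tuples in metres; names and structure are illustrative, not any particular SDK.

import math

class TwoHandedScale:
    def __init__(self, model_scale=1.0):
        self.model_scale = model_scale   # scale currently applied to the 3D model
        self.start_separation = None     # hand separation captured when both grabs began
        self.start_scale = model_scale

    def begin(self, left_hand_pos, right_hand_pos):
        """Call once both hands have grabbed opposite sides of the model."""
        self.start_separation = math.dist(left_hand_pos, right_hand_pos)
        self.start_scale = self.model_scale

    def update(self, left_hand_pos, right_hand_pos):
        """Call every frame while both grabs are held; returns the new scale."""
        if self.start_separation is None:
            return self.model_scale
        separation = math.dist(left_hand_pos, right_hand_pos)
        # Pulling the hands apart grows the model; bringing them together shrinks it.
        self.model_scale = self.start_scale * (separation / self.start_separation)
        return self.model_scale

    def end(self):
        """Call when either grab is released; the current scale persists."""
        self.start_separation = None
```

Because the scale follows the hands directly, the mapping never has to be taught: the model simply behaves the way a stretchable object would in the real world.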

3. Avoid gestures that require specialized knowledge or memorization
While “power-user” shortcuts are enticing ways to speed up interaction with an interface, they leave behind most of humanity (the non-power users). Ultimately, our goal is to establish an inclusive dictionary of universally intuitive interactions that can apply to all humans equally, irrespective of their cultural, symbolic or linguistic backgrounds. Instead of discrete gestures, use affordances to encourage the user to interact with objects in more natural ways.

4. Persistence and the use of tools
Tools intended for persistent use (such as a paint brush, a conductor’s wand, or a flashlight) should respond to the user’s manipulation realistically, instead of requiring a discrete hand gesture to activate or deactivate. For instance, the brush should draw when pressed against a surface, rather than in response to a separate trigger like a gesture or button press. Detached triggers are difficult to remember and don’t match the user’s mental model of work in the physical world. Furthermore, avoid abstract controllers such as cursors, as they put space between the user and their task, rather than connecting them.
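One way such a contact-activated brush could be realized is sketched below: instead of listening for a trigger gesture, the brush lays down paint only while its tip penetrates the canvas surface, with penetration depth standing in for pressure. The plane representation, thresholds, and names are assumptions made for the example, not part of any particular SDK.

```python
# Sketch of a contact-activated paint brush: it draws only while the brush tip
# touches the canvas surface, with stroke width driven by penetration depth.
# The plane representation and thresholds are illustrative assumptions.

def signed_distance_to_plane(point, plane_point, plane_normal):
    """Signed distance from `point` to the plane given by a point and a unit normal."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

class ContactBrush:
    MAX_DEPTH = 0.01  # metres of penetration mapped to full stroke width

    def __init__(self, canvas_point, canvas_normal):
        self.canvas_point = canvas_point    # any point on the canvas plane
        self.canvas_normal = canvas_normal  # unit normal pointing toward the user
        self.stroke = []                    # (position, width) samples of the current stroke

    def update(self, brush_tip_pos):
        """Call every frame with the tracked brush-tip position."""
        depth = -signed_distance_to_plane(brush_tip_pos, self.canvas_point, self.canvas_normal)
        if depth <= 0.0:
            # Tip is off the surface: the brush does nothing; no separate trigger exists.
            if self.stroke:
                self.commit_stroke()
            return
        # Deeper contact -> wider stroke, approximating pressure without haptics.
        width = min(depth / self.MAX_DEPTH, 1.0)
        self.stroke.append((brush_tip_pos, width))

    def commit_stroke(self):
        # Hand the finished stroke off to the drawing layer (placeholder) and reset.
        self.stroke = []
```

The activation condition lives entirely in the geometry of the tool and the surface, so there is nothing for the user to memorize.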

5. Proximity feedback is your friend
Provide proximity feedback as the user approaches or interacts with an object or content. Glows and subtle audio cues are a good way to accent a collision or movement and to compensate for the lack of haptics. For instance, Meta uses a subtle sound effect when tools are removed from the toolbelt.
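As an illustration of the idea, the sketch below fades a glow in as the hand approaches an object and plays a soft cue on first contact. The radii, easing curve, and object interface are arbitrary choices for the example rather than values from Meta's tooling.

```python
# Sketch of proximity feedback: an object's glow ramps up as the hand approaches,
# compensating for the lack of haptics. Radii and easing are arbitrary choices.

import math

GLOW_RADIUS = 0.15     # metres: distance at which the glow begins to fade in
CONTACT_RADIUS = 0.01  # metres: distance treated as touching the object

def glow_intensity(hand_pos, object_pos):
    """Return a glow intensity in [0, 1] based on hand-to-object distance."""
    dist = math.dist(hand_pos, object_pos)
    if dist >= GLOW_RADIUS:
        return 0.0
    t = 1.0 - dist / GLOW_RADIUS
    return t * t * (3.0 - 2.0 * t)   # smoothstep so the glow eases in rather than pops

def on_frame(hand_pos, obj):
    """Per-frame feedback: glow tracks proximity; a soft cue accents first contact."""
    obj.set_glow(glow_intensity(hand_pos, obj.position))   # hypothetical render call
    touching = math.dist(hand_pos, obj.position) <= CONTACT_RADIUS
    if touching and not obj.touched:
        obj.play_contact_sound()                           # hypothetical subtle audio cue
    obj.touched = touching
```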

Technical consideration: adapting movements to hand tracking

The limitations of hand tracking technology might mean the ideal movement for a given action is too subtle or complex to implement. In such cases, Meta advises using the closest supported gesture in its place. Later in 2017, Meta will support hand pose estimation, which is a requirement for the full range of natural movements. As an intermediate measure until then, we support grabbing (which user studies have shown to be the most versatile single gesture), as well as physics and collisions between hands and objects. These two features can be combined as a stop-gap measure until greater articulation is available.
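In the spirit of that stop-gap, a minimal grab detector might look like the sketch below: the hand counts as grabbing once thumb and index tips close within one threshold, and releases only past a slightly larger one, the hysteresis keeping tracking noise from causing flicker. The fingertip input format and thresholds are assumptions for illustration, not values from any SDK.

```python
# Sketch of a simple grab detector with hysteresis, usable as a stop-gap until
# fuller hand-pose estimation is available. Fingertip input format and the
# thresholds are assumptions for illustration only.

import math

GRAB_THRESHOLD = 0.03     # metres: pinch distance below which a grab begins
RELEASE_THRESHOLD = 0.05  # metres: distance above which the grab ends (hysteresis)

class GrabDetector:
    def __init__(self):
        self.grabbing = False

    def update(self, thumb_tip, index_tip):
        """Feed per-frame fingertip positions; returns True while a grab is held."""
        pinch = math.dist(thumb_tip, index_tip)
        if self.grabbing and pinch > RELEASE_THRESHOLD:
            self.grabbing = False
        elif not self.grabbing and pinch < GRAB_THRESHOLD:
            self.grabbing = True
        return self.grabbing
```

Combined with physics and collisions between the hands and objects, a detector like this covers most pick-up, move, and release tasks without any learned gestures.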

Before you object! Ergonomic fatigue and extending reach with tools

AR adaptations of real-world tools such as a grabber or lasso tool are a good way to mitigate the exhaustion a user may incur while extending their arms in constant physical interaction.³ We recommend such ergonomic aids be designed along the lines of their real-world counterparts, in that they directly extend from the hand like a physical tool and obey similar physics (although some may exaggerate physical laws for enhanced functionality). Note that such tools are not abstractions, however, and preserve the user’s intuitive sense of reality within the interface, just as they would in the physical world.
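To make the idea concrete, the sketch below extends the hand's effective reach by projecting a grab point a fixed distance along the palm's forward direction, much like a physical grabber. The reach length, pickup radius, and object interface are illustrative assumptions rather than part of any shipping tool.

```python
# Sketch of a reach-extending grabber: it projects a grab point a fixed distance
# along the palm's forward direction so content just beyond arm's reach can be
# picked up without constant full arm extension. All names and distances are
# illustrative assumptions.

import math

REACH = 0.4  # metres of extra reach added beyond the hand

def grab_point(palm_pos, palm_forward):
    """World-space point of the grabber tip, given palm position and unit forward vector."""
    return tuple(p + REACH * f for p, f in zip(palm_pos, palm_forward))

def try_grab(palm_pos, palm_forward, objects, radius=0.05):
    """Return the nearest object within `radius` of the grabber tip, or None."""
    tip = grab_point(palm_pos, palm_forward)
    candidates = [obj for obj in objects if math.dist(tip, obj.position) <= radius]
    if not candidates:
        return None
    # The caller keeps the grabbed object attached to the tip while the grab is held.
    return min(candidates, key=lambda obj: math.dist(tip, obj.position))
```

Because the tip rigidly follows the hand, the tool behaves like a physical extension of the arm rather than an abstract cursor.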


The neuroscience behind it

What do babies do when they see something that interests them? They reach for it. Tools and content alike should be placed within the user's reach and accessed by touching them directly.

Meta advises interfaces that favor directly manipulating content with the hands instead of performing gestures from a distance; direct manipulation leads to a deeper understanding of the content than remote gestures can provide.

A significant portion of the brain is dedicated to tracking peripersonal space, or the placement of the hands and the space around them. Harnessing this natural instinct reduces learning curves and increases the user's connection to the task at hand. In fact, studies have shown that our visual acuity for tasks performed near the hands is measurably boosted (see Further Study, below). This advantage is lost in interfaces that physically separate the user from the target of their interaction, such as those that rely on peripherals.


The boost of peripersonal awareness can also be lost in AR interfaces that reduce input to abstract gestures, as these separate the hands from content in a physical sense while also adding the overhead of abstraction in the gestures themselves, which require training to memorize and additional cognitive effort to execute.

Further Study

Bonifazi S., Farnè A., Rinaldesi L., Làdavas E. (2007) Dynamic size-change of peri-hand space through tool-use: spatial extension or shift of the multi-sensory area. Journal of Neuropsychology. 1(1):101-14.

Fogassi L., Gallese V., Fadiga L., Luppino G., Matelli M., Rizzolatti G. (1996) Coding of peripersonal space in inferior premotor cortex (area F4). Journal of Neurophysiology. 76(1):141-57.

Makin T.R., Holmes N.P., Zohary E. (2007) Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. Journal of Neuroscience. 27(4):731-40.

³ Though “Cardiovascular User Interfaces” that require the user to get up out of their chair and interact with the UI directly are preferable, as they involve less of a learning curve and help us avoid a WALL-E future!

