Oculus / Rift Home

Why

Led design for the new Oculus Home 2.0 on Rift, which takes full advantage of the 6DoF Touch controllers. Together with engineering, shaped the product conversation for the initial beta launch. Shipped multiplayer, 3D user-generated imports, platform integration and more.

How

Product direction
3D and 2D design
6DoF design patterns
Art direction

When

Out in the wild!


Beta Launch 

Home 2.0 came to life with a more “game/immersive art”-centric development approach. We spent about three months in pre-production exploring different concepts and hypotheses.

The interaction mechanics, art direction and more all came together in a vertical slice for internal and external play-testing. We learnt that, despite their more cohesive aesthetics, highly specific theme packs reduced users’ sense of ownership over the space.

Single-hand operable & ambidextrous controls

Because Home 2.0 is a core part of the OS, the bar for accessibility is higher than for a typical application. The core interaction framework we used to guide our decisions is:

  1. Ambidextrous (all controls are mirrored across both hands, e.g. the A and X face buttons behave the same)

  2. Single-hand operable (being able to use only one Touch controller does not limit the range of features)

  3. Minimise the use of face buttons (too much context switching, combined with not being able to see the buttons themselves, makes it hard to build muscle memory)

Aligning to this framework requires input mapping to be an early consideration: the button set is effectively halved (being ambidextrous), and every feature has to be designed to be operable with just one controller.
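
To make that constraint concrete, here is a minimal sketch (hypothetical names, not the shipped Home 2.0 code) of an action map where every logical action binds to the same physical slot on both Touch controllers, so either hand alone covers the full feature set:

```cpp
#include <cstdio>

// Hypothetical ambidextrous action map: every logical action is bound to the
// same physical slot on both Touch controllers, so either hand alone can
// reach the full feature set.
enum class Hand { Left, Right };

// Mirrored physical slots: "Primary" resolves to A on the right hand and X on
// the left, "Secondary" to B / Y, and so on.
enum class ButtonSlot { Primary, Secondary, Trigger, Grip };

enum class Action { Select, Back, Grab, ToggleEditMode };

// One table serves both hands; the slot is resolved per hand at poll time.
struct Binding { Action action; ButtonSlot slot; };

constexpr Binding kBindings[] = {
    {Action::Select,         ButtonSlot::Trigger},
    {Action::Grab,           ButtonSlot::Grip},
    {Action::Back,           ButtonSlot::Secondary},
    {Action::ToggleEditMode, ButtonSlot::Primary},
};

// Resolve a slot to the face-button label the user actually sees on that hand.
const char* LabelFor(Hand hand, ButtonSlot slot) {
    switch (slot) {
        case ButtonSlot::Primary:   return hand == Hand::Right ? "A" : "X";
        case ButtonSlot::Secondary: return hand == Hand::Right ? "B" : "Y";
        case ButtonSlot::Trigger:   return "Trigger";
        case ButtonSlot::Grip:      return "Grip";
    }
    return "?";
}

int main() {
    // Either hand alone exposes every action, which is the single-hand rule.
    for (Hand hand : {Hand::Left, Hand::Right}) {
        std::printf("%s hand:\n", hand == Hand::Left ? "Left" : "Right");
        for (const Binding& b : kBindings) {
            std::printf("  action %d -> %s\n",
                        static_cast<int>(b.action), LabelFor(hand, b.slot));
        }
    }
    return 0;
}
```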

 

Gradated user state management

VR simulates your visual and spatial senses much like the real world, so users find binary state changes (common in abstract UI) jarring.

We favoured seamless, gradated transitions between states, like Alex Dracott’s (VFX) architectural sketch lines that ease one environment into another.

More on gradated modality in the editing tools, based on Jef Raskin’s quasimodes.
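
As an illustration only (the structure and names below are assumptions, not the actual implementation), a gradated transition can be modelled as a continuous blend value that eases toward its target while a quasimodal control is held, instead of a boolean that flips:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative sketch: instead of a binary "edit mode on/off" flag, the edit
// state is a continuous blend in [0, 1] that eases toward its target each
// frame. Visuals (sketch lines, grids, UI opacity) read the blend directly,
// so the environment transitions gradually, in the spirit of a quasimode
// that is only active while the user holds the control.
struct EditState {
    float blend = 0.0f;  // 0 = fully in the normal environment, 1 = fully in edit mode

    void Update(bool quasimodeHeld, float dt) {
        const float target = quasimodeHeld ? 1.0f : 0.0f;
        const float speed = 4.0f;  // higher = snappier easing
        // Exponential ease toward the target; never a hard jump.
        blend += (target - blend) * std::min(1.0f, speed * dt);
    }
};

int main() {
    EditState state;
    // Simulate holding the edit control for half a second, then releasing it.
    for (int frame = 0; frame < 60; ++frame) {
        const bool held = frame < 30;
        state.Update(held, 1.0f / 60.0f);
        if (frame % 10 == 0)
            std::printf("frame %2d  held=%d  blend=%.2f\n", frame, held, state.blend);
    }
    return 0;
}
```

Because every visual reads the same blend value, release and re-grab of the control simply reverses the easing mid-flight, which is what keeps the state change from feeling binary.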


Precision control in 3D space


For a space to feel expressive and personal, users need to feel in control (both in macro movements and fine-tuned adjustments).

A first step is to assist users by snapping objects to planes. We identified the affordances below to drive visual explorations:

  • bounding box

  • pivot point of the object

  • when snapping to the surface is activated (based on proximity)

  • the surface being snapped to

Together with Panya Inversin (eng), we brought the most physical representation (the magnet) into production, after resolving some depth and performance issues.
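
A stripped-down version of the proximity rule, with invented names and thresholds, might look like the sketch below: while the held object’s pivot is within an activation radius of a surface plane, it is projected onto that plane; otherwise it follows the hand freely.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical sketch of proximity-based snapping: when the held object comes
// within a threshold distance of a surface plane, its position is projected
// onto that plane; otherwise it follows the hand freely.
struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Plane defined by a point on the surface and a unit normal.
struct Plane { Vec3 point; Vec3 normal; };

// Signed distance from a position to the plane (positive on the normal side).
static float DistanceToPlane(const Vec3& p, const Plane& plane) {
    const Vec3 d{p.x - plane.point.x, p.y - plane.point.y, p.z - plane.point.z};
    return Dot(d, plane.normal);
}

// Returns the object's resolved position: snapped to the plane if inside the
// activation radius, otherwise unchanged.
static Vec3 ResolveSnap(const Vec3& objectPivot, const Plane& surface,
                        float activationRadius, bool* snapped) {
    const float dist = DistanceToPlane(objectPivot, surface);
    *snapped = std::fabs(dist) < activationRadius;
    if (!*snapped) return objectPivot;
    // Project the pivot onto the surface along the normal.
    return Vec3{objectPivot.x - surface.normal.x * dist,
                objectPivot.y - surface.normal.y * dist,
                objectPivot.z - surface.normal.z * dist};
}

int main() {
    const Plane tableTop{{0.0f, 0.8f, 0.0f}, {0.0f, 1.0f, 0.0f}};  // horizontal surface at y = 0.8
    bool snapped = false;
    // Object hovering 5 cm above the table: within a 10 cm activation radius, so it snaps.
    const Vec3 result = ResolveSnap({0.2f, 0.85f, 0.1f}, tableTop, 0.10f, &snapped);
    std::printf("snapped=%d  y=%.2f\n", snapped, result.y);
    return 0;
}
```

The `snapped` flag is what the bounding box, pivot and magnet visuals key off, so activation reads clearly before the object actually lands on the surface.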


Exit Beta

From a product perspective, our goal is to set up the pillars for the Rift Spatial Platform:

  1. Multiplayer

  2. User-generated content (platform integration with the 1st-party app, Medium)

  3. Broadcasting 2D content from Dash onto 3D objects (e.g. a projector in Home)

To do so, design needs to consolidate 2D information and 3D object interactions under a new tool metaphor and a redefined input mapping framework.

Multiplayer

 

To support multiplayer, the layers of complexity compound, as we now need to deal with the following (a minimal object-authority sketch follows this list):

  • Real time synchronisation issues

  • Locomotion and visual continuity of group movements

  • Gesture control conflicts

  • Abusive behaviours

  • Backwards compatibility with existing features, e.g. physics, editing, dynamically streamed UGC and broadcasting
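
One narrow slice of the synchronisation problem, sketched below with invented names (not the actual networking code): only the client that currently holds authority over an object may edit it, so two people cannot grab or modify the same thing at once.

```cpp
#include <cstdint>
#include <cstdio>
#include <unordered_map>

// Hypothetical per-object authority table: before a client can grab or edit
// an object in the shared Home, it must acquire ownership; conflicting
// requests from other clients are rejected until the owner releases it.
using UserId = std::uint64_t;
using ObjectId = std::uint64_t;

class AuthorityTable {
public:
    // Returns true if the user acquired (or already held) authority.
    bool TryAcquire(ObjectId object, UserId user) {
        auto [it, inserted] = owners_.try_emplace(object, user);
        return inserted || it->second == user;
    }

    void Release(ObjectId object, UserId user) {
        auto it = owners_.find(object);
        if (it != owners_.end() && it->second == user) owners_.erase(it);
    }

    bool CanEdit(ObjectId object, UserId user) const {
        auto it = owners_.find(object);
        return it != owners_.end() && it->second == user;
    }

private:
    std::unordered_map<ObjectId, UserId> owners_;
};

int main() {
    AuthorityTable table;
    const UserId alice = 1, bob = 2;
    const ObjectId lamp = 42;

    std::printf("alice grabs lamp: %d\n", table.TryAcquire(lamp, alice));  // 1: acquired
    std::printf("bob grabs lamp:   %d\n", table.TryAcquire(lamp, bob));    // 0: rejected, alice owns it
    table.Release(lamp, alice);
    std::printf("bob grabs lamp:   %d\n", table.TryAcquire(lamp, bob));    // 1: acquired after release
    return 0;
}
```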


User generated content

To foster a greater sense of ownership over the space and reduce dependency on long art pipelines, we invested in letting users import their own 3D content into Home. This can be done either via an all-VR workflow with Medium (as a first-party app test case) or from a local desktop drive.

Design-wise, we now need to consider the following (a minimal state sketch follows the list):

  • Server and network states (loading, error, etc.)

  • 2D desktop flow into VR

  • Abuse and report integration with existing social APIs
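
As a rough illustration (the states and names are placeholders, not the shipped design), the import flow can be modelled as an explicit state machine, so loading, failure, and reported content each get a designed representation in the space instead of failing silently:

```cpp
#include <cstdio>

// Hypothetical sketch: each imported 3D object carries an explicit import
// state, so the space can show a designed placeholder while loading, a clear
// error treatment on failure, and a reported/abuse state wired to the
// existing social tooling.
enum class ImportState {
    Queued,       // upload request accepted, waiting on the server
    Downloading,  // streaming the asset into the Home instance
    Ready,        // asset placed and interactive
    Failed,       // network or decode error; show retry affordance
    Reported,     // flagged via the abuse/report flow; hidden pending review
};

const char* Describe(ImportState s) {
    switch (s) {
        case ImportState::Queued:      return "queued on server";
        case ImportState::Downloading: return "downloading";
        case ImportState::Ready:       return "ready";
        case ImportState::Failed:      return "failed, offer retry";
        case ImportState::Reported:    return "reported, hidden pending review";
    }
    return "unknown";
}

int main() {
    // Walk a typical import through its states.
    for (ImportState s : {ImportState::Queued, ImportState::Downloading, ImportState::Ready}) {
        std::printf("%s\n", Describe(s));
    }
    return 0;
}
```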