Oculus / 6DoF Object-control framework

Why

Lead the design of platform components focused specifically on 6DoF interaction and spatial interfaces.

How

3D and 2D design
Platform OS design
Art direction


Input mapping

Users have trouble delineating when to use the trigger button versus the grab button. Our platform also has a growing need to support more complex interactions as we bring more user-generated content online.

If we draw a parallel to the mouse, left click is often associated with direct manipulation, whilst right click enables meta control such as bringing up a context menu.

Developing a more coherent mapping system helps extend these interaction metaphors to hand tracking.

 

Meta control with the trigger

Such as "select" and "context menu"

Direct manipulation with the grab

Such as drag & drop and position changes
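
To make the mapping concrete, here is a minimal sketch of how such a resolution layer might look. All names below (Channel, PhysicalInput, channelFor, the 400 ms threshold) are hypothetical illustrations, not Oculus SDK APIs:

```typescript
// Hypothetical sketch: one table routes both controller controls and
// hand-tracking gestures onto the same two semantic channels, mirroring
// the left-click / right-click split on the mouse.

type Channel = "meta" | "manipulate"; // meta: select, context menu; manipulate: drag, move

type PhysicalInput =
  | { device: "controller"; control: "trigger" | "grip" }
  | { device: "hand"; gesture: "pinch" | "grasp" };

function channelFor(input: PhysicalInput): Channel {
  if (input.device === "controller") {
    return input.control === "trigger" ? "meta" : "manipulate";
  }
  // Hand tracking reuses the same metaphors: pinch ~ trigger, grasp ~ grip.
  return input.gesture === "pinch" ? "meta" : "manipulate";
}

// Within the meta channel, hold duration can disambiguate select vs. context menu.
function metaAction(holdMs: number): "select" | "contextMenu" {
  return holdMs < 400 ? "select" : "contextMenu"; // 400 ms threshold is illustrative
}

console.log(channelFor({ device: "hand", gesture: "pinch" })); // "meta"
console.log(metaAction(650)); // "contextMenu"
```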


Tools to manipulate 3D objects

Throughout playtesting we noticed that, despite our efforts to provide strong sensory feedback (VFX, SFX, and hand representations), users struggle with modality in our experience.

This is a particular problem when we want to segment features based on play vs. edit (similar to The Sims).

 

Hypothesis

The immersive quality closely resembles our physical reality, where modality is blurred. So instead of locking users into a mode, we can introduce:

  • Quasimode controls (the user remains constantly aware of the state, but there are possible ergonomic issues with use over long periods of time)

  • Reversible states (an undo stack built from states instead of user actions, to take social and physics determinism into account; see the sketch below)
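
To make the second point concrete, here is a minimal sketch of a state-based undo stack, assuming hypothetical names (ObjectState, ReversibleObject) rather than any shipped API. Because each entry snapshots the object itself instead of replaying a user action, a change caused by physics or by another participant is just as reversible as a deliberate edit:

```typescript
// Hypothetical sketch: undo via object-state snapshots instead of recorded
// user actions, so changes from physics or other users stay reversible.

interface ObjectState {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion (x, y, z, w)
}

class ReversibleObject {
  private history: ObjectState[] = [];

  constructor(private state: ObjectState) {}

  // Snapshot before any mutation, regardless of its cause:
  // a user grab, a physics settle, or a remote participant's edit.
  commit(next: ObjectState): void {
    this.history.push(structuredClone(this.state));
    this.state = next;
  }

  undo(): void {
    const prev = this.history.pop();
    if (prev) this.state = prev;
  }

  current(): ObjectState {
    return this.state;
  }
}

// A physics push and a user drag are undone identically.
const cube = new ReversibleObject({ position: [0, 0, 0], rotation: [0, 0, 0, 1] });
cube.commit({ position: [0, 1, 0], rotation: [0, 0, 0, 1] }); // physics push
cube.commit({ position: [2, 1, 0], rotation: [0, 0, 0, 1] }); // user drag
cube.undo();
console.log(cube.current().position); // [0, 1, 0]
```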

 

Parameters

  • Single-hand operable

  • Ambidextrous

  • Support a wide range of platform features and expose third-party developer entry points

  • Can mix and match tools across two hands to create depth (see the sketch after this list)
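
As a hedged illustration of how these parameters might constrain a tool API (all names here are invented for the sketch, not a shipped interface):

```typescript
// Hypothetical sketch of a tool interface shaped by the parameters above.

type Hand = "left" | "right";

interface Tool {
  name: string;
  // Single-hand operable and ambidextrous: a tool is driven by exactly one
  // hand, and its behavior must not depend on which hand that is.
  update(hand: Hand, gripStrength: number): void;
}

// The kind of entry point a third-party developer could implement.
const scaleTool: Tool = {
  name: "scale",
  update(hand, gripStrength) {
    console.log(`${hand} hand scaling with grip ${gripStrength.toFixed(2)}`);
  },
};

// Mix and match: depth comes from holding a different tool in each hand.
function updatePair(left: Tool, right: Tool, leftGrip: number, rightGrip: number): void {
  left.update("left", leftGrip);
  right.update("right", rightGrip);
}

updatePair(scaleTool, scaleTool, 0.8, 0.3);
```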

 

Multi-stage use cases