Challenge
Stakeholders needed to understand a complex event space before fabrication, but the only available hardware was the Samsung Gear VR, a phone-in-headset device with limited input and compute. We needed a navigation model that felt intuitive using gaze alone.
Approach
Engineered a pipeline to ingest the designer’s 3D spatial layouts and built a gaze-driven locomotion system on top of them. Using the floor plan as a minimap, we let users jump between predefined nodes by fixating on a destination and dwelling briefly, a model that fits the Gear VR’s gaze-only input. Collaborated closely with the event designer to preserve the layout’s fidelity while keeping performance smooth on mobile hardware.
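The dwell-to-teleport loop boils down to: cast a ray along the headset’s forward vector each frame, check whether it hits a navigation node, accumulate dwell time while the same node stays under the gaze, and teleport once a threshold is crossed. The sketch below illustrates that logic in TypeScript; the node structure, dwell threshold, and teleport callback are hypothetical stand-ins, not the original Gear VR implementation.

```typescript
// Minimal sketch of gaze-dwell teleportation between predefined nodes.
// All names (NavNode, DWELL_SECONDS, GazeDwellTeleporter) are illustrative.

type Vec3 = { x: number; y: number; z: number };

interface NavNode {
  id: string;       // node on the floor-plan minimap
  position: Vec3;   // where the camera rig lands when this node is chosen
  radius: number;   // world-space hit radius for the gaze ray
}

const DWELL_SECONDS = 1.5; // how long the gaze must rest on a node to trigger a jump

class GazeDwellTeleporter {
  private focusedNodeId: string | null = null;
  private dwellTime = 0;

  constructor(
    private nodes: NavNode[],
    private teleport: (target: NavNode) => void,
  ) {}

  /** Call once per frame with the head pose (gazeDirection assumed normalized). */
  update(headPosition: Vec3, gazeDirection: Vec3, deltaSeconds: number): void {
    const hit = this.pickNode(headPosition, gazeDirection);

    if (hit && hit.id === this.focusedNodeId) {
      // Same node still under the gaze: accumulate dwell time.
      this.dwellTime += deltaSeconds;
      if (this.dwellTime >= DWELL_SECONDS) {
        this.teleport(hit);
        this.reset();
      }
    } else {
      // Gaze moved to a different node (or empty space): restart the timer.
      this.focusedNodeId = hit ? hit.id : null;
      this.dwellTime = 0;
    }
  }

  private reset(): void {
    this.focusedNodeId = null;
    this.dwellTime = 0;
  }

  /** Ray-sphere test against each node; returns the nearest node the gaze ray intersects. */
  private pickNode(origin: Vec3, dir: Vec3): NavNode | null {
    let best: NavNode | null = null;
    let bestT = Infinity;
    for (const node of this.nodes) {
      const toNode = {
        x: node.position.x - origin.x,
        y: node.position.y - origin.y,
        z: node.position.z - origin.z,
      };
      // Project the node center onto the gaze ray.
      const t = toNode.x * dir.x + toNode.y * dir.y + toNode.z * dir.z;
      if (t < 0) continue; // node is behind the viewer
      const closest = {
        x: origin.x + dir.x * t,
        y: origin.y + dir.y * t,
        z: origin.z + dir.z * t,
      };
      const dx = closest.x - node.position.x;
      const dy = closest.y - node.position.y;
      const dz = closest.z - node.position.z;
      if (dx * dx + dy * dy + dz * dz <= node.radius * node.radius && t < bestT) {
        best = node;
        bestT = t;
      }
    }
    return best;
  }
}
```

Resetting the timer whenever the gaze leaves a node keeps the interaction forgiving: looking around freely never triggers a jump, which matters on gaze-only hardware with no confirm button.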
Outcome
Delivered a VR walkthrough that aligned stakeholders on layout and flow, cutting iteration cycles before fabrication. The experience built confidence in the design and demonstrated how mobile VR can accelerate spatial decision-making.
Media