NASA - Virtual Reality

Redesigning virtual reality user interfaces for astronaut training

UI/UX

Creative Technology

User Research

3D Modeling - Blender

Summer 2018

full render 2.jpg

The summer of 2018 was spent interning in the Habitability Design Center at NASA's Johnson Space Center. The primary project addressed the existing user interface used to interact with deep-space habitats in virtual reality. The result is an entirely redesigned UI and an accompanying graphic standards manual that allow for continued development and new applications, including living-space evaluation by subject matter experts, design testing, and, potentially, astronaut training.

Virtual reality provides a new and unique design space for interfaces: fully physical, fully digital, and yet fully a combination of the two. This interface worked within the programmable capabilities of the HTC Vive VR system and within the framework of Unreal Engine.

headset.png

The previous user interface was limited to a floating screen that housed unintuitive hand tools.

wireframes-01.png

Wireframes

The interface was broken down into three primary workflows:

1. Controller tools for interacting with the environment

2. Filter wheel to edit the environment

3. Evaluation panel to rate the environment

The movement tool allows users to navigate the space without walking. It is housed in the left controller's trackpad (green).

wireframes-06.png
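The trackpad-housed movement tool follows a common VR teleport pattern: press to aim, release to move. A minimal sketch of that flow in Python, assuming a press-aim-release interaction; all names are hypothetical and this is illustrative only, not the actual Unreal Engine implementation:

```python
# Hypothetical sketch of trackpad-based teleport movement.
# Names and the press/aim/release flow are assumptions for illustration.

class TeleportTool:
    def __init__(self):
        self.aiming = False
        self.target = None

    def on_trackpad_press(self):
        # Pressing the left trackpad begins aiming at a destination.
        self.aiming = True

    def on_update(self, arc_hit_point):
        # While aiming, the pointer's intersection with the floor is the
        # candidate destination.
        if self.aiming:
            self.target = arc_hit_point

    def on_trackpad_release(self, user):
        # Releasing the trackpad moves the user to the target without walking.
        if self.aiming and self.target is not None:
            user.position = self.target
        self.aiming = False
        self.target = None
```

Separating the aim phase from the move phase lets the user preview and cancel a destination before committing, which reduces disorientation in VR.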

The evaluation panel hosts an interactive feedback session to rate aspects of a scene. It is stowed and deployed by pressing the top menu button on the right controller.

wireframes-09.png

The evaluation panel's interactions are based on a simple point-and-click chain of actions. The left controller's trigger houses the clicking capability.

wireframes-02.png

When the filter wheel is deployed, the right controller's trigger selects and toggles the visibility of a filter or scene.

wireframes-03.png
tool change animation FINAL.gif
eval panel animation FINAL.gif

There are three interaction tools: measurement, clearance, and audio flag. They are all housed in the right controller and accessed and interchanged by pressing the grip buttons (green).

wireframes-07.png
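The grip-button tool switching above can be sketched as a simple cycle through the three tools. A hypothetical illustration in Python; the wrap-around cycling order is an assumption, and none of these names come from the actual Unreal Engine implementation:

```python
# Illustrative sketch of grip-button tool cycling on the right controller.
# Cycling order is an assumption; names are hypothetical.

TOOLS = ["measurement", "clearance", "audio_flag"]

class RightController:
    def __init__(self):
        self.tool_index = 0  # assume the measurement tool is active first

    @property
    def active_tool(self):
        return TOOLS[self.tool_index]

    def on_grip_press(self):
        # Each grip press advances to the next tool, wrapping around
        # so repeated presses loop through all three.
        self.tool_index = (self.tool_index + 1) % len(TOOLS)
```

A single-button cycle keeps the controller mapping minimal: one grip gesture reaches every tool without a separate menu.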

A flashlight tool emits light from each controller for greater visual detail. It is toggled on and off by pressing the left controller's grip buttons.

wireframes-08.png
wireframes-05.png

The filter wheel houses all visual elements of the scene and can toggle their individual visibility. It is stowed and deployed by pressing the top menu button on the right controller.

Filters and scenes are organized radially around the wheel. Scrolling between these labels is housed in the right controller's trackpad, which is divided into top and bottom sections in this orientation.

wireframes-04.png
teleport animation FINAL.gif
filter wheel animation FINAL.gif
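The filter wheel's behavior, scrolling between radial labels with the split trackpad and toggling the highlighted label with the trigger, can be sketched as a small state machine. A hypothetical Python illustration; the scroll directions and example label names are assumptions, not the actual Unreal Engine implementation:

```python
# Hypothetical sketch of the filter wheel: radial labels, split-trackpad
# scrolling, and trigger-toggled visibility. All names are illustrative.

class FilterWheel:
    def __init__(self, labels):
        # Filters/scenes sit radially around the wheel; one is highlighted.
        self.labels = list(labels)
        self.visible = {label: True for label in self.labels}
        self.selected = 0

    def on_trackpad(self, region):
        # The right trackpad is split into top and bottom halves; assume the
        # top half scrolls forward and the bottom half scrolls backward.
        step = 1 if region == "top" else -1
        self.selected = (self.selected + step) % len(self.labels)

    def on_trigger(self):
        # The right trigger toggles the highlighted label's visibility.
        label = self.labels[self.selected]
        self.visible[label] = not self.visible[label]
```

Keeping visibility as a per-label flag means filters stay independent: hiding one layer never disturbs the state of the others.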

All work was documented and compiled into a design manual that outlines best practices and contains reference material for continued development and implementation. While prototypes were tested in virtual reality, the interface's functionality will continue to grow as it is deployed across multiple NASA platforms.

manualrender.jpg
20416-NSNNRM.jpg
full render 2.jpg
full render 1.jpg

This project was supported by a number of individuals at NASA, including Canaan Martin, Mark Cramer, Brett Montoya, Taylor Philips-Hungerford, Molly Harwood, Robert Howard, and Francisco Jung.