NASA - Virtual Reality
Redesigning virtual reality user interfaces for astronaut training
3D Modeling - Blender
The summer of 2018 was spent interning in the Habitability Design Center at NASA's Johnson Space Center. The primary project addressed the existing user interface used to interact with deep space habitats in virtual reality. The result is an entirely redesigned UI and an accompanying graphic standards manual to allow for continued development and applications, including living space evaluation by subject matter experts, design testing, and, potentially, astronaut training.
Virtual reality provides a new and unique design space for interfaces: fully physical, fully digital, and yet fully a combination of the two. This interface worked within the programmable capabilities of the HTC Vive VR system and within the framework of Unreal Engine.
The previous user interface was limited to a floating screen that housed a set of unintuitive hand tools.
The interface was broken down into three primary workflows:
1. Controller tools for interacting with the environment
2. Filter wheel to edit the environment
3. Evaluation panel to rate the environment
The movement tool allows users to navigate the space without walking. It is housed in the left controller's trackpad (green).
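As an illustration of how a trackpad press could drive this kind of walking-free navigation, here is a minimal, hypothetical C++ sketch; the function name, step length, and 2D position model are assumptions for illustration, not the project's actual implementation:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of a movement tool: a press on the left
// controller's trackpad nudges the user through the habitat
// without physical walking.
struct Position { float x, y; };

// Treat the trackpad press location (padX, padY) as a direction and
// move one fixed-length step that way. stepLength is an assumed
// tuning parameter.
Position stepFromTrackpad(Position p, float padX, float padY, float stepLength) {
    float len = std::sqrt(padX * padX + padY * padY);
    if (len < 1e-6f) return p;  // press at the centre: no movement
    p.x += stepLength * padX / len;
    p.y += stepLength * padY / len;
    return p;
}
```

Normalizing the press location keeps the step length constant regardless of how far from the trackpad's centre the user presses.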
The evaluation panel hosts an interactive feedback session for rating aspects of a scene. It is stowed and deployed by pressing the top menu button on the right controller.
The evaluation panel's interactions are based on a basic point-and-click chain of actions. The left controller's trigger houses the clicking capability.
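A point-and-click chain like this can be modeled as a small press/release tracker; the following C++ sketch is illustrative only (the struct and its element-id scheme are assumptions, not the project's code):

```cpp
#include <cassert>

// Hypothetical sketch of a point-and-click interaction chain:
// a click fires only when the trigger is pressed and released
// while the pointer stays over the same panel element.
struct ClickTracker {
    int hovered = -1;  // element id under the pointer, -1 = none
    int pressed = -1;  // element id at the moment the trigger went down

    void hover(int elementId) { hovered = elementId; }
    void triggerDown()        { pressed = hovered; }

    // Returns the clicked element id, or -1 if the release
    // did not complete a click.
    int triggerUp() {
        int clicked = (pressed != -1 && pressed == hovered) ? pressed : -1;
        pressed = -1;
        return clicked;
    }
};
```

Requiring press and release on the same element is a common guard against accidental clicks while the pointer sweeps across a panel.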
When the filter wheel is deployed, the right controller's trigger selects and toggles the visibility of a filter or scene.
There are three interaction tools: measurement, clearance, and audio flag. They are all housed in the right controller and accessed and interchanged by pressing the grip buttons (green).
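Interchanging the three tools with a single grip-button press could be sketched as a simple cycle; this C++ fragment is illustrative, assuming a fixed cyclic order that the source does not specify:

```cpp
#include <cassert>

// Hypothetical sketch: the three interaction tools in an assumed
// cycle order.
enum class Tool { Measurement = 0, Clearance = 1, AudioFlag = 2 };

// Each grip-button press advances to the next tool, wrapping around
// so the cycle never dead-ends.
Tool nextTool(Tool current) {
    return static_cast<Tool>((static_cast<int>(current) + 1) % 3);
}
```

A wrapping cycle keeps the mapping learnable: one button, one gesture, and every tool is at most two presses away.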
A flashlight tool emits a light from each of the controllers for greater visual detail. It is toggled on and off by pressing the left controller's grip buttons.
The filter wheel houses all visual elements of the scene and can toggle their individual visibility. It is stowed and deployed by pressing the top menu button on the right controller.
Filters and scenes are organized radially around the wheel. Scrolling between these labels is housed in the right controller's trackpad, which is divided into a top and bottom section in this orientation.
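The scroll-and-toggle behavior of the wheel can be sketched in a few lines of C++; this is a minimal illustration under assumed names and data layout, not the project's Unreal Engine implementation:

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch of the filter wheel: labels arranged radially,
// the trackpad's top/bottom halves scroll the selection, and the
// trigger toggles the selected label's visibility.
struct FilterWheel {
    std::vector<bool> visible;  // visibility flag per filter/scene label
    int selected = 0;           // index of the currently highlighted label

    explicit FilterWheel(int count) : visible(count, true) {}

    // trackpadY > 0 means the top half was pressed (scroll forward);
    // trackpadY < 0 means the bottom half (scroll backward).
    void scroll(float trackpadY) {
        int n = static_cast<int>(visible.size());
        int step = (trackpadY > 0.0f) ? 1 : -1;
        selected = (selected + step + n) % n;
    }

    // The right controller's trigger toggles the selected label.
    void triggerPressed() { visible[selected] = !visible[selected]; }
};
```

The modular arithmetic mirrors the radial layout: scrolling past the last label wraps back to the first, just as the labels wrap around the wheel.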
All work was documented and compiled into a design manual that outlines best practices and contains reference material for continued development and instrumentation. While prototypes were tested in virtual reality, the interface will continue to gain functionality as it is deployed across multiple NASA platforms.
This project was supported by a number of individuals at NASA, including Canaan Martin, Mark Cramer, Brett Montoya, Taylor Philips-Hungerford, Molly Harwood, Robert Howard, and Francisco Jung.