Andrew Lilja • VirtuTrace After Action Review
VirtuTrace After Action Review


VirtuTrace is my lab's flagship simulator software. It's built on top of VR Juggler, which lets us run it on everything from a mobile phone to a CAVE [1]. One of its key features is the ability to combine physiological data with decision-tracing data to determine how users in a simulation make decisions and what's influencing them. This research is two-fold: developing a better understanding of how decisions are made, and figuring out how we can help improve those decisions. A valuable tool for that job is the after-action review (AAR), where participants can replay their performance in the simulation. Typical AARs are based on notes from a trainer and, if you're lucky, some static video — here, we can provide fully immersive playback from any angle, speed, and location.


Building the AAR interface is challenging for two reasons: first, the relative complexity of the actions the user can take, and second, the lack of a graphical user interface (GUI) with which the user can directly interact. Unlike most software interfaces, which let the user manipulate UI elements with a mouse or finger, the display in virtual reality is an infinite, empty 3D space surrounding the user. There is no screen, only empty space to project an interface onto. I briefly considered trying to make a 3D interface, but decided it would be too clumsy and unintuitive to navigate [2], especially if the user is also trying to navigate the world itself.


It turns out that rendering a UI in VR is harder than it sounds [3]. Based on my research, I decided to limit heads-up display (HUD) information as much as possible, preferring to leave the user's field of vision empty. Instead of giving the user a series of buttons floating in space, I thought it would be easier to interact with the system via the gamepad, providing clear information about the current state of the system and making it very easy to switch between states. This way, the user can press a button on the gamepad, see immediately what state they're in, and, if they made a mistake, rapidly switch to what they intended to do. This means the interface has to be highly responsive, but it must also never lock the user into a course of action — if they accidentally enter free-camera mode, it should be just as easy to leave it without going through a complicated exit routine.


Because we had just a few camera modes (free camera, locked-on, first person, and overhead), I decided to map each button on the lower-left d-pad to one of them. There's no need to cycle through the modes or memorize a key command — just press a button and you switch. With regular use this quickly becomes muscle memory, but it isn't fair to require the user to memorize each direction's corresponding camera mode. In keeping with the need to limit on-screen GUI, a small icon showing the d-pad, the camera modes, and the currently selected mode briefly appears each time a button is pressed. This gives the user enough information when they need it, but isn't distracting.
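As a sketch of how that direct mapping and transient icon might work (the specific direction assignments and the two-second display time are my assumptions for illustration, not the actual VirtuTrace values):

```python
from enum import Enum

class CameraMode(Enum):
    FREE = "free camera"
    LOCKED_ON = "locked-on"
    FIRST_PERSON = "first person"
    OVERHEAD = "overhead"

# Hypothetical one-to-one mapping: a single press always selects a
# mode directly, with no cycling and nothing to memorize up front.
DPAD_TO_MODE = {
    "up": CameraMode.OVERHEAD,
    "down": CameraMode.FIRST_PERSON,
    "left": CameraMode.FREE,
    "right": CameraMode.LOCKED_ON,
}

ICON_DISPLAY_SECONDS = 2.0  # assumed HUD icon display time

class CameraController:
    def __init__(self):
        self.mode = CameraMode.FREE
        self.icon_timer = 0.0  # counts down while the mode icon is visible

    def on_dpad_press(self, direction):
        """Switch camera modes immediately and flash the HUD icon."""
        self.mode = DPAD_TO_MODE[direction]
        self.icon_timer = ICON_DISPLAY_SECONDS

    def update(self, dt):
        """Called once per frame; fades the icon out after a moment."""
        self.icon_timer = max(0.0, self.icon_timer - dt)

    @property
    def icon_visible(self):
        return self.icon_timer > 0.0
```

Because a wrong press just selects a different mode, recovering from a mistake is a single additional press, which matches the goal of never trapping the user in a state.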


The full control scheme.


The system also allows users to move through time: fast-forward, rewind, play/pause, and jumping between bookmarks. Play/pause is assigned to the start button. Only one part of the gamepad contains fully mirrored controls: the triggers and bumpers on top, where the index fingers rest. While the bumpers are simple digital switches that can only be pressed or released, the triggers are analog switches that measure how hard they are pressed. For this reason, I decided to use them to control the fast-forward and rewind functions. A light press scrubs slowly forward or backward in time, with stronger presses producing faster speeds. This lets the user quickly find the general part of the timeline they're interested in, then ease off the trigger to dial in the precise spot they want [4]. If this is too imprecise, the user can jump between bookmarks using the bumpers. Bookmarks are displayed in the HUD while the user is moving through time, and a list of existing bookmarks can be displayed for the user to jump to.
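A minimal sketch of that pressure-to-speed mapping, assuming triggers that report 0.0 (released) to 1.0 (fully pressed); the squared response curve, maximum speed, and dead-zone values are illustrative guesses, not VirtuTrace's actual tuning:

```python
def playback_rate(right_trigger, left_trigger, max_speed=8.0, dead_zone=0.05):
    """Map analog trigger pressure to a signed playback rate.

    The right trigger fast-forwards, the left rewinds; pressure scales
    the scrubbing speed. All constants are assumptions for this sketch.
    """
    if right_trigger < dead_zone and left_trigger < dead_zone:
        return 0.0  # no scrubbing: normal play/pause state applies
    # Net pressure decides direction; squaring the magnitude gives
    # fine control at light presses and fast speeds near full depression.
    net = right_trigger - left_trigger
    return max_speed * (net ** 2) * (1 if net >= 0 else -1)
```

The nonlinear curve is the design point: most of the trigger's travel is devoted to slow, precise speeds, with the fastest scrubbing reserved for near-full presses.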


The body navigation method of simulator movement.


Navigating in 3D space works the same way as in the simulator, using the same "body navigation" technique. The center of the floor acts as a dead zone, and stepping outside of it moves the simulation in that direction. However, because the AAR is intended to give users more freedom of movement, the controller can be used as a secondary input method. A standard two-stick setup is used: the left analog stick controls the user's movement, and the right stick controls the camera. To allow flight, the left stick moves in the direction the camera is looking: e.g., if the user is looking up at a 45° angle, pushing the left stick forward moves the user up and forward at a 45° angle.
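The camera-relative movement can be sketched as follows; the axis conventions (y up, z forward), the horizontal strafe vector, and the speed constant are assumptions for illustration:

```python
import math

def stick_to_velocity(stick_forward, stick_right, yaw_deg, pitch_deg, speed=2.0):
    """Convert left-stick input into a world-space velocity that follows
    the camera's look direction, so pushing forward while looking up at
    45 degrees moves the user up and forward at 45 degrees."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Forward vector of the camera, including its vertical tilt,
    # which is what allows flight.
    fwd = (math.cos(pitch) * math.sin(yaw),
           math.sin(pitch),
           math.cos(pitch) * math.cos(yaw))
    # Right vector stays horizontal so strafing never changes altitude.
    right = (math.cos(yaw), 0.0, -math.sin(yaw))
    return tuple(speed * (stick_forward * f + stick_right * r)
                 for f, r in zip(fwd, right))
```

Scaling the full camera forward vector, pitch included, is what distinguishes this scheme from typical ground-based controls, where the vertical component would be projected away.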


This dual-control model can be confusing if implemented poorly. First, if the user is outside the dead zone but pushing the movement stick in the opposite direction, they may stand still without understanding why. For this reason, when the movement stick is used, body navigation is disabled until the user takes a step in any direction. A more challenging problem is the mismatch between where the user is looking in the simulation and how they use the camera-control stick [5]. Valve also had this problem [6], and their insights were invaluable. I wound up adopting their "input mode one" solution: the sticks control your body, and the head tracker controls your head.
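The arbitration between body navigation and the stick might look like the following sketch; the step threshold and stick dead zone are guesses, since the real tuning isn't given above:

```python
STEP_THRESHOLD = 0.15   # meters of tracked movement counted as "a step" (assumed)
STICK_DEAD_ZONE = 0.1   # assumed analog stick dead zone

class NavigationArbiter:
    """Decide whether body navigation or the analog stick drives movement.

    Once the stick is used, body navigation is suppressed until the
    tracked user physically steps, so standing off-center can never
    silently cancel out stick input. Thresholds are illustrative.
    """
    def __init__(self):
        self.body_nav_enabled = True
        self._last_pos = None  # head position when the stick was last used

    def update(self, stick_magnitude, head_position):
        if stick_magnitude > STICK_DEAD_ZONE:
            self.body_nav_enabled = False       # stick takes over
            self._last_pos = head_position
        elif not self.body_nav_enabled and self._last_pos is not None:
            dx = head_position[0] - self._last_pos[0]
            dz = head_position[1] - self._last_pos[1]
            if (dx * dx + dz * dz) ** 0.5 > STEP_THRESHOLD:
                self.body_nav_enabled = True    # user stepped: re-enable
        return self.body_nav_enabled
```

The key property is that neither input can silently fight the other: stick use wins immediately, and a deliberate physical step hands control back to body navigation.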


This system is still in the testing phase, but it already appears to be an improvement over the previous (lack of) interface. Ideally, I'll have the opportunity to run some controlled tests with it, but I may not have the option before I leave in June.
