By Karl Y. Snyder, MFA
How do you empower hospital staff to handle a hostile shooter event, arming them with knowledge that improves their odds of survival? Active shooter training has traditionally been a large-scale physical event involving multiple organizations (police, fire, emergency services, and actors), with costs escalating into the hundreds of thousands of dollars. Because of the cost and time involved, the training is infrequent, and its scenarios cannot be repeated. It can also be psychologically damaging, with some participants experiencing PTSD. Houston Methodist Hospital sought to leverage new technology to turn the training into something repeatable that could be practiced within a live hospital without causing staff psychological harm. I undertook the mission of taking the hospital's directive and building a first-of-its-kind VR (virtual reality) training experience.
- Project Manager
- Instructional Designer
- Product Designer
- UI/UX
- Writer
- Jon Lindgren, Producer
- Carlos Puerta, Lead Unity Programmer
- Corey Smith, Jr. Programmer
- Nathan Nguyen, Jr. Programmer
- Phillip Vo, Jr. Programmer
- Angel Muniz, 3D Art
- 4 months
- Monday
- Discord
- Adobe Creative Suite (Photoshop, Illustrator, XD)
- Unity
- Recolude
- VR Interaction Framework
- Blender
- HTC Vive Headset
360-degree Matterport scans were taken at three different Houston Methodist locations, with a focus on the ER departments since they have the highest likelihood of a hostile shooter event. The scans gave me the information I needed to prepare the training scenarios, develop an asset list of the most common objects in the rooms and the environment, and give the coding team a visual reference for how doors were accessed (slide or turn to open). The 3D art team also used the Matterport scans to measure various assets and match their scale and size.
Beyond wireframing the UI that would appear when a user engaged with an object, I also mapped out additional simulation features that fit the budget:
- The shooter was programmed with automatic enemy detection, so he hunts for the player on his own.
- Objects in the rooms and hallways are repositioned automatically each time the player runs through a scenario, so no two play-throughs are identical.
- Every scenario features a timer.
- Data is saved locally, since the budget did not allow for a shared cloud account across all three hospitals.
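The actual simulation was built in Unity, but the per-play repositioning idea can be sketched language-agnostically. The snippet below is a minimal illustration (not the project's code); the function name, object ids, and spawn points are all hypothetical:

```python
import random

def reposition_objects(object_ids, spawn_points, rng=None):
    """Assign each object a fresh spawn point for this run.

    Returns a dict mapping object id -> spawn point. Sampling
    without replacement guarantees no two objects share a spot,
    so every scenario run presents a different room layout.
    """
    rng = rng or random.Random()
    if len(spawn_points) < len(object_ids):
        raise ValueError("need at least one spawn point per object")
    chosen = rng.sample(spawn_points, len(object_ids))
    return dict(zip(object_ids, chosen))

# Hypothetical example: each run draws a new layout.
layout = reposition_objects(
    ["crash_cart", "iv_pole", "wheelchair"],
    [(0, 0), (2, 1), (4, 3), (1, 5)],
)
```

Calling this once at the start of each scenario run is what keeps the training repeatable without becoming memorizable.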
I developed the simulation scenarios to match the learning objectives of hostile shooter training, which cover the actions of an individual choosing to Run/Avoid, Hide/Deny, or Fight/Defend.
A Run/Avoid scenario involves the player choosing to race to the nearest exit without being spotted by the shooter.
The Hide/Deny scenario involves the player choosing either to hide under a desk or to go into a patient room, close the door, move a hospital bed in front of the door to block it, and then turn out the light. Hand gestures in VR are not easy to control, so to provide the best experience I had the programming team leverage VR Interaction Framework, whose pre-built interactive objects saved the time we would otherwise have spent coding the light-switch triggers and door behavior from scratch.
The Fight/Defend scenario involves the player picking up certain objects and throwing them at the shooter to stun him. A "fight" icon was displayed over every throwable object so the player knew which objects could be picked up and used.
After launching the training, over 99% of staff expressed that they prefer the VR training experience over the physical event, primarily because they can redo scenarios repeatedly (Evaluations, 2022).
Learning self-efficacy reports reveal a 600% increase in recall of key scenario activities four months after a 15-minute round of VR training (Evaluations, 2022).