
VR Brain Jam: Games for Change with a neuroscientist!

I gathered a team of seven Becker faculty, staff, and alums for the Neuroscience VR Jam, held in NYC as a unique precursor to the Games for Change conference. We applied for and received a coveted spot in the event, which gave us the opportunity to collaborate with neuroscientist Dr. Joe Snider of UC San Diego's Poizner Lab. Our project was a two-player VR game focused on proximity and immersion. In short, we created a spatially limited arena in which a game mechanic could produce data on participant performance under various controlled conditions. In particular, we were looking at how a risk/reward situation might be affected by empathy, even in a virtual environment, so the component of immersion factored into the equation rather heavily.

The Scenario: Two players each ride an elevator they have secured in a facility taken over by a nefarious tech force, attempting to destroy the infiltrating drones they encounter on the various floors of the lab. Points measure how successful you are at destroying the enemy forces and securing the safety and confidentiality of the facility: you are rewarded not only for destroying the drones, but for doing so with as few shots as possible.

The Risk/Reward: Initially, the player comes to a stop on the training floor, and when the elevator doors open, they see a drone across the hall in an opposing elevator. Each drone presents three distinct target areas, representing increasingly difficult shots worth correspondingly greater points. This continues for a while to establish a baseline of the player's ability to target and shoot accurately, since we are studying the delta of each shot from the center point [the most valuable area], rather than just raw distance.
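
To make the scoring concrete, here is a minimal sketch of that zone-based logic in Python. Our actual implementation lived inside the game engine, and the zone radii and point values below are hypothetical, chosen only to illustrate the idea of scoring by distance from center:

```python
import math

# Hypothetical zone radii (meters) and point values, innermost zone first.
# In the baseline test, tighter shots score higher.
ZONES = [(0.05, 100), (0.15, 50), (0.30, 10)]

def score_shot(hit_x, hit_y, center_x, center_y):
    """Return (delta, points): delta is the shot's distance from the
    drone's center point, the quantity the study actually tracks."""
    delta = math.hypot(hit_x - center_x, hit_y - center_y)
    for radius, points in ZONES:
        if delta <= radius:
            return delta, points
    return delta, 0  # outside all target areas: a miss
```

The key design choice is that the delta is recorded regardless of which zone was hit, so the analysis can compare accuracy across conditions independently of the points awarded.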

After the training level, the next time the elevator doors open onto a new floor, there is still a drone, but now the other player stands directly below it, much like the apple in the William Tell legend. During this actual testing level, the drones appear above or to the sides of the other player's head.

The Second Test: The exact same scenario with the same two levels [training and test], but with the scoring inverted: during the training level, points are lost for firing into the two most central target areas, and points are gained only for hitting the outermost target area on the drone. On the test level, not only does the player lose points for missing the outermost drone target area, but the other player is 'injured' with a visible blood splash when hit.
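
Continuing the earlier sketch, the second condition amounts to flipping the sign of the zone values. Again, the specific numbers here are hypothetical, only meant to show how the inversion might look:

```python
# Second test: hypothetical inverted values. The two central zones now
# cost points; only the outermost ring pays out.
INVERTED_ZONES = [(0.05, -100), (0.15, -50), (0.30, 25)]

def score_shot_inverted(delta):
    """Score a shot by its distance from center under the inverted rules."""
    for radius, points in INVERTED_ZONES:
        if delta <= radius:
            return points
    return 0  # miss; on the test level a bystander hit would also
              # trigger the blood-splash feedback
```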

Technical Challenges: Multiplayer, Inverse Kinematic Body Movement
We spent the jam designing the test scenario with Dr. Snider, creating a database to hold the backend variable data captured during runs of the simulation, integrating the multiplayer VR SDK, and incorporating an inverse kinematic body rig to allow for appropriate arm/body placement, so that players were more convincingly represented in a multiplayer format (particularly since we needed to rely on empathy). The overall effect was successful: the other player would let out audible cues and often duck or try to move out of the way when they realized they were being fired upon, which let the firing player recognize the reaction as coming from the other person in the physical room with them, aka NOT an AI.
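
The backend essentially boiled down to logging one record per shot for later analysis. Here is a minimal sketch of what such a record might contain; the field names and CSV storage are illustrative assumptions, not our actual schema:

```python
import csv
import os
import time
from dataclasses import dataclass, asdict

@dataclass
class ShotRecord:
    participant_id: str
    condition: str            # "baseline" or "inverted"
    floor: int
    bystander_present: bool
    delta_from_center: float  # meters
    points_awarded: int
    timestamp: float

def log_shot(record: ShotRecord, path: str = "shots.csv") -> None:
    """Append one shot record to a CSV file, writing a header if new."""
    fields = list(asdict(record))
    write_header = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        if write_header:
            writer.writeheader()
        writer.writerow(asdict(record))

# Example: one training-floor shot landing 4 cm from center
log_shot(ShotRecord("P01", "baseline", 1, False, 0.04, 100, time.time()))
```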

Findings: The vast majority of the jam was spent designing and building the experience, with only a couple of hours at the end for others to try it out, so while we were able to establish a proof of concept and felt very satisfied with our efforts, we were not able to collect enough data to provide any true findings. But it did make our team pause before shooting at those drones when an innocent bystander was in the line of fire!

Team: Ulm, Anthony Botelho, David Gates, Matthew Hopkins, Stephen Read, Matthew Sylvia, Amanda Theinert, and Dr. Joe Snider