Role: Creative Technologist
Tasks: Video production, camera stitching, video editing, audio editing
Technology: GoPro 360 rig, Ricoh Theta S, ambisonic audio, Autopano, Premiere
In grad school in 2015, we spent a semester experimenting with what was then a very new technology: live-action 360 video. We shot with Ricoh Thetas and six-camera GoPro rigs, stitched the footage in Autopano, and handled video and audio post-production in Premiere. Working with fantastic group members, I produced three pieces that show some of what can be achieved with live-action VR.
I worked with Nick Hubbard and Natalia Cabrera to shoot footage at Chelsea Piers Skatepark in Manhattan using the six-camera GoPro rig. We experimented with changing playback speeds in VR. Time lapse (for sped-up footage) does not work in VR, because the stitching software loses the frame-to-frame consistency it needs to align the cameras. Slow motion, however, looked fantastic, provided we brought each camera's footage into Premiere, rendered it out in slow motion, and only then stitched the results together.
I also worked with Natalia Cabrera and Michael Weber to explore another use case for live-action VR: live concerts. Could we get on the stage and mimic what it feels like to perform there? We worked with well-known New York comedy act The Skivies and recorded a holiday show at Joe's Pub in Manhattan. The result is an immersive experience with spatially mapped audio that feels more like being on stage than hearing the show from the audience. When we created this video, YouTube did not yet support the ambisonic audio spec (it does now, but this video was mixed in a different format).