VR Video FAQ
Q. Which camera do you recommend?
If you are an editor or filmmaker who wants to explore VR video and needs footage that is good enough to learn from and to use on low-end projects: the Kodak SP360 4K Dual Pro Pack. It is a pair of very compact fisheye-lens cameras (each with a 235° field of view) that comes with a rig positioning them correctly to capture 360° video with enough overlap for stitching.
Q. Which application do you use for stitching?
Most people eventually settle on Kolor Autopano Video (in conjunction with Kolor Autopano Giga) on a PC. Although it runs on the Mac, all professional stitchers (stitching is now a whole new job) run it on PCs configured with as many high-end GPU cards as they can afford. I've stitched footage from different kinds of camera rigs on the Mac, but the rendering speed is very slow: 2–6 fps on a 2013 MacBook Pro.
While learning, if you use the Kodak SP360 4K Dual Pro Pack (or the more affordable Ricoh Theta S), you'll only be dealing with a very simple stitching boundary: the seam between the front camera and the back camera. In this case you can rely on the stitching done by the supplied app (Kodak) or in the cloud (Ricoh). The Dashwood 360VR Express tools include a plugin for stitching 1–2 lens rigs in Final Cut Pro X and Adobe Premiere (on the Mac).
I expect that, as a tradeoff for quick production and easy sharing on social media, most viewers will learn to tune out the thin stitching line from these low-end rigs and become immersed in the content of VR videos.
Q. Is Dashwood 360VR Toolbox the best solution for Final Cut Pro X editors?
Yes. Since Tim Dashwood was hired by Apple in early 2017, he has made all his plugins for Final Cut Pro X, Motion 5, Adobe After Effects and Adobe Premiere (on the Mac) free. The 360VR Toolbox is used by high-end VR video producers to make 3D VR video: videos that send different images to the left and right eyes to create a feeling of depth inside the sphere of video.
Dashwood tools are available through the free FxFactory post-production app store system.
Q. How do I get YouTube to recognise that my video is a VR video? How do I tell YouTube that my audio is directional?
You need to inject some special metadata using Google's free ‘360 Video Metadata’ application (since renamed the Spatial Media Metadata Injector).
Q. How can I play back VR videos that I make on other people’s computers or phones?
The GoPro VR Player can show a VR video stored on a (Mac/PC/Linux) computer it is running on. For phone playback on iOS or Android, FreeVRPlayer from GingerMonkey Games will play locally stored VR videos, recognising 3D and ambisonic soundtracks.
Q. Which playback solutions support audio that changes to follow the direction you're looking?
No free desktop players yet, but the iOS and Android YouTube app recently added ‘3D audio’ ambisonic audio playback. That means you can share unlisted YouTube videos with clients for playback on their phone.
Q. Why isn't this FAQ called ‘360° Video FAQ?’
Although the vast majority of VR video today is about capturing, modifying and experiencing a full sphere of video, it is likely that there will be many uses of fractions of video spheres.
For now, most people experiencing VR video initially look all around the sphere: up, down, behind. After that initial exploration, they spend the vast majority of their time looking ahead and hardly ever look behind.
Single fisheye-lens cameras can now capture as much as 235° of the video sphere, which means the whole stitching process can be avoided. You don't have to leave the uncaptured 125° (or 180°, with a narrower lens) of the video sphere dark. You can use post-production tools to composite stills or motion graphics that add to the experience: “to get more information, look behind you/down/up.”
Q. What is 180° video? What is 180×101 video?
A way of avoiding the problems of stitching is to use rigs where each view comes from a single lens: one camera for monoscopic (2D) video, or two cameras for stereoscopic (3D) video. These produce videos that allow the viewer to look around a view that is 180° wide: from looking straight left around to straight right. When you look at the 2:1 equirectangular projection of the VR video, only a central square contains moving video.
180° video is shot with a fisheye lens wide enough to cover everything from looking straight down to looking straight up. It results in half a sphere of video.
180×101 video is shot with cameras that can't support a lens covering the full vertical field of view. Instead of 180° vertically, the resulting video covers 101° vertically, so ‘straight down’ and ‘straight up’ aren't captured.
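The sizes involved can be sketched in a few lines of Python. The 3840×1920 frame size below is an illustrative assumption, not a standard:

```python
# Sketch: which part of a 2:1 equirectangular frame contains
# moving video for 180x180 and 180x101 captures.
# The 3840x1920 output frame is an assumption for illustration.

FRAME_W, FRAME_H = 3840, 1920  # 2:1 equirectangular canvas

def active_region(h_fov_deg, v_fov_deg, w=FRAME_W, h=FRAME_H):
    """Return (width_px, height_px) of the centred region that
    contains captured video; the rest of the frame stays blank."""
    # The full canvas spans 360 deg horizontally and 180 deg
    # vertically, so pixel coverage scales linearly with field of view.
    return round(w * h_fov_deg / 360), round(h * v_fov_deg / 180)

print(active_region(180, 180))  # (1920, 1920): a central square
print(active_region(180, 101))  # (1920, 1077): top and bottom blank
```

For 180×180 the active region is exactly square, because half the width of a 2:1 frame equals its full height; that is why only a central square of the equirectangular projection contains moving video.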
Q. What is 6 Degrees of Freedom/6DoF/Positional VR/Volumetric Video?
The advantage of normal VR video is that it isn't too difficult for the playback software to take equirectangular footage and show the viewer what they would see if they looked in any given direction. Many VR camera rigs can capture much more information than playback systems on phones or on websites can show. Some record depth information to aid stereoscopic (3D) video post-production. 6 Degrees of Freedom adds the viewer's small translational head movements (left/right, up/down, forwards/backwards) to the three rotational degrees that normal VR video already responds to, so those movements also influence what is seen in a headset. This allows people to slightly ‘look around’ objects that obscure their view in a certain direction.
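The direction-to-footage lookup that playback software performs can be sketched for the rotational case. This is a minimal illustration, not any player's actual code; the frame size and the `gaze_to_pixel` helper are assumptions:

```python
# Sketch: map a viewing direction to the pixel at the centre of the
# viewer's gaze in an equirectangular frame. A playback engine does
# a lookup like this (per screen pixel, with a projection) to render
# the view. The 3840x1920 frame size is an assumption.

FRAME_W, FRAME_H = 3840, 1920

def gaze_to_pixel(yaw_deg, pitch_deg, w=FRAME_W, h=FRAME_H):
    """yaw: 0 = frame centre, positive = turning right (wraps at 360).
    pitch: 0 = horizon, +90 = straight up, -90 = straight down."""
    u = (yaw_deg / 360.0 + 0.5) * w      # longitude -> column
    v = (0.5 - pitch_deg / 180.0) * h    # latitude  -> row
    return round(u) % w, min(max(round(v), 0), h - 1)

print(gaze_to_pixel(0, 0))    # (1920, 960): straight ahead -> centre
print(gaze_to_pixel(180, 0))  # (0, 960): behind -> wraps to the edge
print(gaze_to_pixel(0, 90))   # (1920, 0): straight up -> top row
```

The simplicity of this mapping is what makes ordinary (rotation-only) VR video cheap to play back; 6DoF has no such shortcut, because translation changes what is visible, not just where it sits in the frame.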
The more money you spend, the further you can move along the three translational axes. Intel have a technical demo that allows viewers to move at room scale; this requires large amounts of source camera resolution and computing power to create the 3D environment needed for 6DoF to work. For now, positional VR will be based on post-production tools that recognise video footage with depth information, so that overlays, focus and lighting effects can be applied to areas of a 360° shot based on their distance from the camera.
Facebook plans to include 6DoF in their VR video player. In April 2017 they demonstrated test videos using depth maps produced in conjunction with alpha software for Adobe After Effects and Adobe Premiere from Mettle.