Now, in a joint effort between YouTube and Daydream, we're adding new ways to make 360 and VR videos look even more realistic.
360 videos need a large number of pixels per video frame to achieve a compelling immersive experience. Ideally, we would match human visual acuity, which is 60 pixels per degree of immersive content. However, we are limited by users' internet connection speeds and device capabilities. One way to bridge the gap between these limitations and human visual acuity is to use better projection methods.
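As a rough illustration of the scale involved, matching 60 pixels per degree across a full sphere implies an enormous frame (the acuity figure is from above; the arithmetic is a back-of-the-envelope sketch):

```python
# Target: human visual acuity of ~60 pixels per degree.
PX_PER_DEG = 60

# A full sphere covers 360 degrees horizontally and 180 vertically.
width = 360 * PX_PER_DEG    # 21600
height = 180 * PX_PER_DEG   # 10800
total = width * height      # 233,280,000 pixels per frame

# For comparison, a 4K UHD frame is 3840 x 2160 ~= 8.3 million pixels,
# roughly 28x fewer than the ideal immersive frame.
print(width, height, total)  # 21600 10800 233280000
```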
A projection is the mapping used to fit a 360-degree world view onto a rectangular video surface. The world map is a good example: a spherical Earth projected onto a rectangular piece of paper. A commonly used projection is the equirectangular projection. We initially chose it when we launched 360 videos because it is easy for camera software to produce and easy to edit.
However, equirectangular projection has some drawbacks:
- It has high quality at the poles (the top and bottom of the image), where people don’t look as much – typically, the sky overhead and the ground below are not that interesting to look at.
- It has lower quality at the equator or horizon, where there is typically more interesting content.
- It has fewer vertical pixels for 3D content.
- Straight-line motion in the real world does not result in straight-line motion in equirectangular projection, which makes videos harder to compress.
Drawbacks of equirectangular (EQ) projection
These drawbacks made us look for better projection types for 360-degree videos. To compare different projection types we used saturation maps. A saturation map shows the ratio of video pixel density to display pixel density. The color coding goes from red (low) to orange, yellow, green and finally blue (high). Green indicates optimal pixel density of near 1:1. Yellow and orange indicate insufficient density (too few video pixels for the available display pixels) and blue indicates wasted resources (too many video pixels for the available display pixels). The ideal projection would lead to a saturation map that is uniform in color. At sufficient video resolution it would be uniformly green.
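As a rough sketch of what a saturation map measures for equirectangular projection (the video width and display density below are illustrative assumptions, not measurements from our tooling):

```python
import math

VIDEO_WIDTH = 3840          # assumed equirectangular video width (illustrative)
DISPLAY_PX_PER_DEG = 15     # assumed headset display density (illustrative)

def equirect_px_per_degree(latitude_deg):
    """Horizontal video pixels per degree of *visual* angle at a latitude.

    Every equirectangular row has VIDEO_WIDTH pixels covering 360 degrees of
    longitude, but one degree of longitude spans only cos(latitude) degrees
    of visual angle, so pixel density grows toward the poles.
    """
    return (VIDEO_WIDTH / 360.0) / math.cos(math.radians(latitude_deg))

def saturation(latitude_deg):
    """Ratio of video pixel density to display pixel density (the map's color)."""
    return equirect_px_per_degree(latitude_deg) / DISPLAY_PX_PER_DEG

print(round(saturation(0), 2))   # equator: ~0.71 -> too few pixels (orange)
print(round(saturation(80), 2))  # near a pole: ~4.1 -> wasted pixels (blue)
```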
We investigated cubemaps as a potential candidate. Cubemaps have been used by computer games for a long time to display the skybox and other special effects.
Equirectangular projection saturation map
Cubemap projection saturation map
In the equirectangular saturation map the poles are blue, indicating wasted pixels. The equator (horizon) is orange, indicating an insufficient number of pixels. In contrast, the cubemap has green (good) regions nearer to the equator, and the wasteful blue regions at the poles are gone entirely. However, the cubemap results in large orange regions (not good) at the equator because a cubemap samples more pixels at the corners than at the center of the faces.
We achieved a substantial improvement using an approach we call Equi-angular Cubemap or EAC. The EAC projection’s saturation is significantly more uniform than the previous two, while further improving quality at the equator:
Equi-angular Cubemap - EAC
As opposed to a traditional cubemap, which allocates equal numbers of pixels to equal distances on the cube surface, an equi-angular cubemap allocates equal numbers of pixels to equal angular change.
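This difference can be sketched in a few lines. A cube face spans 90 degrees, so face position x and viewing angle θ are related by x = tan(θ); the function names below are illustrative, and angles are in radians:

```python
import math

# Texture coordinate q runs in [-1, 1] across one cube face, which spans
# viewing angles theta in [-pi/4, pi/4].

def cubemap_angle(q):
    """Traditional cubemap: equal steps in q are equal distances on the face,
    so the viewing angle is atan(q)."""
    return math.atan(q)

def eac_angle(q):
    """EAC: equal steps in q are equal angular steps."""
    return q * math.pi / 4

def px_angular_size(angle_fn, q, dq=1e-6):
    """Viewing angle covered by a tiny step dq at texture position q."""
    return angle_fn(q + dq) - angle_fn(q)

# Traditional cubemap: a pixel at the face center covers about twice the
# angle of a pixel at the face edge -- edges and corners are oversampled.
ratio = px_angular_size(cubemap_angle, 0) / px_angular_size(cubemap_angle, 1 - 1e-6)
print(round(ratio, 2))  # ~2.0

# EAC: every pixel covers the same angle, so the ratio is 1.
ratio_eac = px_angular_size(eac_angle, 0) / px_angular_size(eac_angle, 1 - 1e-6)
print(round(ratio_eac, 2))  # 1.0
```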
The saturation maps seemed promising, but we wanted to see if people could tell the difference. So we asked people to rate the quality of each without telling them which projection they were viewing. People generally rated EAC as higher quality compared to other projections. Here is an example comparison:
EAC vs EQ
Creating Industry Standards
We’re just beginning to see innovative new projections for 360 video. We’ve worked with equirectangular projection and cubemaps, and now EAC. We think a standardized way to represent arbitrary projections will help everyone innovate, so we’ve developed a Projection Independent Mesh.
A Projection Independent Mesh describes the projection by including a 3D mesh along with its texture mapping in the video container. The video rendering software simply renders this mesh as per the specified texture mapping and does not need to understand the details of the projection used. This gives us infinite possibilities. We published our mesh format draft standard on GitHub, inviting industry experts to comment, and hope to turn this into a widely agreed-upon industry standard.
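A minimal sketch of the idea, with illustrative field names rather than the actual draft format: each vertex pairs a 3D position with the texture coordinates that say where to sample the video frame, and the renderer just draws the textured triangles.

```python
from dataclasses import dataclass, field

@dataclass
class Vertex:
    # Position on the viewing geometry (sphere, cube, anything).
    x: float
    y: float
    z: float
    # Where to sample in the video frame, in [0, 1].
    u: float
    v: float

@dataclass
class ProjectionMesh:
    vertices: list = field(default_factory=list)
    indices: list = field(default_factory=list)  # triangle list into `vertices`

# One quad (two triangles) of a +Z cube face, textured from a rectangle of
# the frame. A real mesh would tessellate each face much more finely.
quad = ProjectionMesh(
    vertices=[
        Vertex(-1, -1, 1, 0.0, 0.0),
        Vertex( 1, -1, 1, 1.0, 0.0),
        Vertex( 1,  1, 1, 1.0, 1.0),
        Vertex(-1,  1, 1, 0.0, 1.0),
    ],
    indices=[0, 1, 2, 0, 2, 3],
)
print(len(quad.vertices), len(quad.indices) // 3)  # 4 vertices, 2 triangles
```

The renderer never needs projection-specific math: swapping equirectangular for EAC only changes the mesh data, not the drawing code.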
Some 360-degree cameras do not capture the entire field of view. For example, they may not have a lens to capture the top and bottom, or may only capture a 180-degree scene. Our proposal supports these cameras and allows replacing the uncaptured portions of the field of view with a static geometry and image. Our proposal also allows compressing the mesh using deflate or other compression. We designed the mesh format with compression efficiency in mind and were able to fit the EAC projection within a 4 KB payload.
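As an illustration of why a regular mesh compresses well under deflate (the vertex layout below is hypothetical, not the actual draft format):

```python
import struct
import zlib

# Hypothetical serialization: pack the vertices of a 16x16-quad cube face
# grid as little-endian floats (x, y, z, u, v per vertex), then deflate.
n = 16
verts = []
for j in range(n + 1):
    for i in range(n + 1):
        u, v = i / n, j / n
        verts += [2 * u - 1, 2 * v - 1, 1.0, u, v]

raw = struct.pack(f"<{len(verts)}f", *verts)
compressed = zlib.compress(raw, level=9)

# The grid is highly regular, so deflate shrinks it substantially.
print(len(raw), len(compressed))

# Round-trip check: inflate restores the exact payload.
assert zlib.decompress(compressed) == raw
```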
The projection independent mesh allows us to continue improving on projections and deploy them with ease since our renderer is now projection independent.
Spherical video playback on Android now benefits from EAC projection streamed using a projection independent mesh, and it will soon be available on iOS and desktop. Our ingestion format continues to be based on equirectangular projection, as mentioned in our upload recommendations.
Anjali Wheeler, Software Engineer, recently watched "Disturbed - The Sound Of Silence."