Sharpening and denoising are important parts of the post production workflow, but immersive videos call for special consideration.
Read Time: 5 Minutes
Sharpening your finished video can really help to bring out details and is often a mandatory step in the post production pipeline. While many cameras have built-in sharpening settings, a better result can usually be achieved in post, where the effect can be assessed and adjusted as required, typically towards the end of the pipeline. For 360 videos, however, traditional sharpening tools should be avoided because they are not aware of the equirectangular 360 projection, which has a seam at the left and right sides of the frame. If you use a traditional sharpening tool on equirectangular 360 video, the result will be a visible seam line at the ‘back’ of the footage, where the left and right edges wrap around and meet.
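To illustrate why the seam appears, here is a minimal sketch in Python with NumPy and SciPy (not the implementation of any particular plugin): a naive unsharp mask treats the left and right edges of the frame as hard boundaries, while a seam-aware variant pads the frame horizontally with wrapped pixels so the blur sees continuous content across the seam.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, sigma=2.0, amount=1.0):
    """Naive unsharp mask: treats the frame edges as hard boundaries,
    so the left and right columns are sharpened against reflected
    (i.e. wrong) neighbours -- this is what produces the seam line."""
    blurred = gaussian_filter(img, sigma=sigma)
    return img + amount * (img - blurred)

def unsharp_mask_360(img, sigma=2.0, amount=1.0, pad=16):
    """Seam-aware variant: pad horizontally with wrapped pixels so the
    blur sees the true neighbouring content across the 360 seam, then
    crop back to the original width. `pad` must exceed the blur radius."""
    padded = np.pad(img, ((0, 0), (pad, pad)), mode="wrap")
    sharpened = unsharp_mask(padded, sigma, amount)
    return sharpened[:, pad:-pad]
```

A quick way to check seam awareness: horizontally rotating an equirectangular frame, sharpening, and rotating back should give the same result as sharpening directly. The seam-aware version passes that test; the naive one does not, and the mismatch is concentrated at the seam.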
Seam Line in VR Image Image: Light Sail VR
One way to avoid this is to use the VR Sharpen effect in Adobe Premiere Pro, which is immersive-aware and can be found in the Effects panel under Video Effects > Immersive Video. As long as you have set up your clip or sequence with the correct field of view and stereoscopic arrangement, VR Sharpen will sharpen continuously across the seam. Be careful, however, not to over-sharpen the footage, as this can lead to unsightly and distracting halo effects. As usual, previewing in a headset is a must.
Applying VR Sharpen in Premiere Image: Light Sail VR
VR-aware sharpening and denoising tools can produce very good results as long as the footage isn’t over-sharpened. A second technique gives editors access to more powerful tools than the limited VR-aware versions: process the original source videos individually, before stitching. The stitcher then works from the sharpened and denoised sources, so the final stitched output is already processed. A further advantage of working before stitching is that it enables newer, AI-based sharpening and denoising tools such as those from Topaz Labs. These tools are not VR-aware, but applied to the unstitched sources they can address soft focus, crushed shadows, general sharpness, and low-light sensor noise.
Note that these traditional tools can also be used effectively on 180 and 3D-180 footage, which do not have seams at the edges.
Workflow tip: Many professionals prefer to apply post-production steps that might not be 360-aware before stitching, including sharpening, denoising and color grading.
Sharpening and denoising the individual source videos prior to stitching adds significant time to a production but is likely to improve overall image quality. An efficient compromise is to process only the frame ranges required by the final edit. Those ranges should exactly match the frame counts in the edit so that the sharpened, denoised and stitched clips can be dropped back onto the timeline as one-to-one replacements.
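As a sketch of that bookkeeping (function names are hypothetical; it assumes non-drop-frame timecodes and that ffmpeg is available on the system), the helper below converts edit in/out points to absolute frame numbers and builds an ffmpeg command that trims exactly the matching source range for external processing:

```python
def timecode_to_frame(tc, fps):
    """Convert a non-drop-frame HH:MM:SS:FF timecode to an absolute frame index."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def trim_command(src, out, start_tc, end_tc, fps):
    """Build an ffmpeg command that extracts exactly the frames used in the
    edit (end frame exclusive, matching ffmpeg's trim filter semantics),
    ready to hand off for sharpening/denoising before stitching."""
    start = timecode_to_frame(start_tc, fps)
    end = timecode_to_frame(end_tc, fps)
    return (
        f"ffmpeg -i {src} "
        f"-vf trim=start_frame={start}:end_frame={end},setpts=PTS-STARTPTS "
        f"-an {out}"
    )
```

For example, an edit using 00:01:00:00 to 00:01:01:00 of a 30 fps source would trim frames 1800 through 1829, giving exactly the 30-frame count the timeline replacement needs.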
If your non-linear editor of choice doesn’t have a VR-compatible sharpening tool, suites of VR tools are available as plugins, for example Boris Continuum. These are designed to work in Adobe After Effects and Premiere Pro, Avid Media Composer, Blackmagic Design DaVinci Resolve, Vegas Pro and Foundry Nuke. A popular tool for denoising and deflickering is Neat Video.