Virtual reality presents the next big challenge in content creation. At The Foundry, we are focused on developing tools to help artists create immersive live-action content. Building on NUKE’s powerful multi-camera compositing and stereoscopic workflows, we are now taking on the unique challenges presented in developing high-quality, seamless VR experiences.
As we continue to scope and develop our solutions, we would like to keep you updated on our progress.
The future of VR with The Foundry
Our goal is to deliver to market a commercial toolset that addresses the challenges presented by live immersive content. We’re starting with a proven base, NUKE—currently the only platform that natively supports the compositing of multi-camera live action. From there, we’ve identified four key areas that can help our VR clients in this challenging field. First previewed at NAB 2015, these are:
- Improved calibration and stitching of live-action 360 footage from multi-camera rigs
- Live connection to Oculus Rift to review stitching, grading and depth
- Support for compositing with ray-trace rendering for CG placement and projections
- Spherically aware operations and viewing with equirectangular images
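To make the last bullet concrete, the core of any spherically aware operation is the mapping between pixels in an equirectangular (lat-long) image and directions on the sphere. The sketch below shows that mapping in plain Python with NumPy; it is a generic illustration of the underlying math under assumed orientation conventions, not The Foundry's implementation or NUKE's API.

```python
# Minimal sketch: map equirectangular pixel coordinates to 3D view
# directions. Image size and axis conventions here are assumptions.
import numpy as np

def equirect_to_direction(u, v, width, height):
    """Convert pixel (u, v) in a width x height lat-long image to a unit
    direction vector (x, y, z), assuming u spans longitude [-pi, pi] and
    v spans latitude [pi/2, -pi/2] from top to bottom."""
    lon = (u / width) * 2.0 * np.pi - np.pi       # longitude in [-pi, pi]
    lat = np.pi / 2.0 - (v / height) * np.pi      # latitude in [pi/2, -pi/2]
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.array([x, y, z])

# Example: the centre pixel of a 4096 x 2048 lat-long frame looks
# straight down the +z axis.
print(equirect_to_direction(2048, 1024, 4096, 2048))   # -> [0. 0. 1.]
```

Operations such as spherical blurs, rotations, or CG projections can then work on these directions rather than on raw pixel coordinates, which is what keeps them free of seams and pole distortion.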
NUKE has long been used for similar work, including image-based lighting, environment maps, and matte painting, and as a result it is already being used to create VR experiences around the world. MARI, meanwhile, often lends a hand with image-based modeling and painting techniques.
The Foundry’s NUKE is already a critical component in the majority of VR/AR content created today.
