Sure, Hollywood movies have used green screen technology to create amazing special effects, but never in real time. Previously, thousands of hours of post-production were required to match live-recorded actors to their special effect environments. That may no longer be the case. Watch this recent real-time compositing video from OnSetFacilities.com to see a live composite of a real actor rendered into a simulated scene.
About the video production from the filmmakers:
This is a great PreViz tool: we can record in real time the FG, the BG, the matte, and the camera track data (in FBX format, with zoom and focus), so that we can finish the final composition in NUKE using the same OTOY ORBX file we used for PreViz in UE4. Credit: Oscar Olarte Ruiz. PreViz, After Effects, Maya, Autodesk.

On-Set Facilities' real-time VFX systems are built by OSF teams in the UK and Madrid using virtual 360-degree 3D environments, camera tracking, hardware compositing tools, color correction software, and game engines. The camera tracking information is captured along with zoom and focus data in FBX; it can then be passed directly to NUKE for on-set compositing or saved to be manipulated later.
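To make the workflow above concrete: the key idea is that the tracked camera is recorded as per-frame data (position, rotation, zoom, focus) so a compositor can replay the exact same camera move over the rendered background. In production this travels as FBX via the vendor's tools; the sketch below is only a rough illustration of that concept in plain Python, with hypothetical field names and values and JSON standing in for the FBX container.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CameraSample:
    """One frame of tracked camera data (all units and names are illustrative)."""
    frame: int
    position: tuple         # (x, y, z) world position of the camera
    rotation: tuple         # (pan, tilt, roll) in degrees
    focal_length_mm: float  # zoom setting at this frame
    focus_distance_m: float # focus pull at this frame

def serialize_track(samples):
    """Serialize the per-frame track so a compositor can replay the camera move."""
    return json.dumps([asdict(s) for s in samples], indent=2)

# Two frames of a slow push-in with a small pan and a focus pull.
track = [
    CameraSample(1, (0.0, 1.6, 5.0), (0.0, 0.0, 0.0), 35.0, 4.5),
    CameraSample(2, (0.1, 1.6, 4.9), (1.5, 0.0, 0.0), 35.0, 4.4),
]
print(serialize_track(track))
```

Because zoom and focus are stored per frame alongside the transform, the downstream compositing step can match not just the camera's motion but also its lens behavior, which is what lets the foreground plate and the game-engine background line up.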
How do you think this technological breakthrough could be utilized in healthcare simulation? Leave us a comment and share below!