AI-enabled VFX creates a double-edged sword
Crossing a valley of hot lava or fighting on alien terrain? These photorealistic scenes are rendered to near perfection by AI-enabled visual effects (VFX).
By infusing a physics-based simulation of American actor Josh Brolin's performance with AI/ML (machine learning)-generated imagery, Marvel Studios brought Thanos, a 2D comic-book villain, to life in Avengers: Endgame.
Indeed, artificial intelligence (AI) is revolutionising the creation of VFX for blockbuster movies and simplifying VFX techniques for everyone.
In the past, the most labour-intensive and monotonous tasks, such as matchmove, tracking, rotoscoping, compositing and animation, were outsourced to foreign studios. With this new technology, however, these tasks can be fully automated and performed swiftly in-house.
Moreover, creating VFX requires a flair for replicating the natural nuances and motion of the elements in a story, and AI/ML can learn these movements from data and quickly produce a realistic simulation.
Pinscreen is one such company that uses AI to create hyper-realistic 2D and 3D images, trained on a dataset of millions of real 2D photos. This liberates artists from technical work so they can focus on more important creative tasks.
Pinscreen demonstrates paGAN model
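Tools like paGAN are built on generative adversarial networks (GANs), in which a generator learns to produce fakes that a discriminator can no longer tell apart from real data. The following is a minimal toy sketch of that adversarial training loop, not Pinscreen's actual method: the generator is a simple linear map and the discriminator a logistic classifier, and all parameters and learning rates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real data: samples from N(4, 1). The generator maps noise z ~ N(0, 1)
# through g(z) = a*z + b, so it should learn to shift b toward 4.
a, b = 1.0, 0.0            # generator parameters (toy linear generator)
w, c = 0.0, 0.0            # discriminator: D(x) = sigmoid(w*x + c)
lr = 0.05

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(2000):
    real = rng.normal(4.0, 1.0, 64)
    z = rng.normal(0.0, 1.0, 64)
    fake = a * z + b

    # Discriminator step: gradient ascent on log D(real) + log(1 - D(fake))
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean((1 - d_real) * real) - np.mean(d_fake * fake)
    grad_c = np.mean(1 - d_real) - np.mean(d_fake)
    w += lr * grad_w
    c += lr * grad_c

    # Generator step: gradient ascent on log D(fake) (non-saturating loss),
    # chain rule through g(z) = a*z + b.
    d_fake = sigmoid(w * fake + c)
    g_a = np.mean((1 - d_fake) * w * z)
    g_b = np.mean((1 - d_fake) * w)
    a += lr * g_a
    b += lr * g_b

# After training, b should have drifted toward the real mean of 4,
# i.e. the generator's output distribution now resembles the real one.
```

Real systems replace the linear generator and logistic discriminator with deep networks trained on millions of images, but the adversarial push-and-pull shown here is the same mechanism that makes the synthesised faces so convincing.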
However, AI-enabled VFX has also given rise to deepfakes — videos tampered with by combining footage of a real person with fabricated elements. Drawing on the wealth of video footage available online, researchers at the University of Washington created and posted a deepfake of former President Barack Obama to show how easily the technology can be abused.
While AI promises to free us from mundane chores, has its rapid development created more problems, such as deepfakes, than it can solve?
Image credits: VFX Voice