Brainstorm’s InfinitySet can cast virtual shadows that match the real chroma-keyed shadows, and can also create accurate selective de-focus and bokeh that matches the depth of field of the lenses in a given scene.
BY MIGUEL CHURRUCA
As augmented reality (AR) becomes a hot topic and is increasingly used in many different shows, how we create and develop such applications becomes more important.
There are two different aspects to take into account: one is which content makes sense to display as AR; the other is the quality and integration of that content in a given broadcast show.
AR allows data-driven graphics to be displayed alongside real images, with real footage or live video mixed with virtual backgrounds or scenes, chroma-keyed talents, and additional broadcast graphics or data-driven 3D graphics.
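The article does not describe how this mixing is implemented, but the core chroma-key compositing step can be illustrated with a deliberately naive sketch in Python with NumPy. The function name, key colour, and tolerance below are hypothetical; production keyers add soft mattes, spill suppression, and shadow preservation on top of this idea.

```python
import numpy as np

def chroma_key_composite(foreground, background, key=(0, 255, 0), tolerance=120):
    """Naive chroma key: foreground pixels close to the key colour
    (green by default) are replaced by the background plate.
    `foreground` and `background` are HxWx3 uint8 RGB arrays."""
    fg = foreground.astype(np.int32)
    key_arr = np.array(key, dtype=np.int32)
    # Euclidean distance of each pixel from the key colour
    dist = np.sqrt(((fg - key_arr) ** 2).sum(axis=-1))
    # True where the pixel is NOT green, i.e. the talent to keep
    keep = dist > tolerance
    return np.where(keep[..., None], foreground, background)

# Tiny 1x2 frame: a pure-green pixel (keyed out) and a red pixel (kept)
fg = np.array([[[0, 255, 0], [255, 0, 0]]], dtype=np.uint8)
bg = np.array([[[10, 10, 10], [20, 20, 20]]], dtype=np.uint8)
out = chroma_key_composite(fg, bg)
```

A hard binary matte like this is what makes cheap keys look artificial at hair and shadow edges; the photo-realistic results discussed later in the article depend on far softer, depth-aware keying.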
During election nights, news, sports or entertainment programmes, data graphics can interact with the talents, creating an attractive environment for the audience. This “mixed reality” allows virtual environments to be created where visually engaging representations of the data can be better explained by the presenters, making complex data easier to understand while enhancing the storytelling.
In such projects, seamless integration of the real and virtual objects with the backgrounds becomes essential, because what really makes the difference for the audience is being unable to tell whether the images they are watching are real video or digital renders.
However, for virtual set production and live broadcast operation, photo-realism is a tough challenge because of the constraints of real-time rendering and operation. So, what if we were able to create, in real time, photo-realistic scenes combining green-screen shots of talents and AR graphics, and mix them all together so seamlessly that the audience could not tell which are real and which are not?
The full story is in the APB July 2018 issue.