Project Pearl Virtual Production – Interview

Pink Media Group is a Serbian media company headquartered in Belgrade. It operates Pink TV (one of the leading commercial television stations in Serbia); Radio Pink (a successful radio station covering Belgrade); Fashion TV Southeast Europe, also known as Fashion TV SEE or FTV SEE (a regional channel franchise of Fashion TV International); and PFI Studios (international film production studios). The development of their virtual production system is nearing completion and will soon be ready for commercial use. Our interlocutor is Mirko Kantor, Head of the Animation Department at Pink Media Group and the leader of this project's development.

In the last few years, there has been a lot of talk about virtual production for TV and film. What were your reasons for deciding to develop a virtual production system at Pink Media Group?

The reason for starting virtual production is to keep up with the times, and even to be ahead of them, in order to reduce costs and speed up the workflow and the final results in film and TV production. I have had this idea since 2012, when this type of production did not yet exist, and finally, with the advancement of hardware technology and the increasing use of artificial intelligence, I was able to realize it. I have been in the 3D industry for a long time, and I know how demanding, even extremely hard, it is to reach the final result: from modeling, setting up the 3D scene and lights, to long rendering jobs on render farms. So I realized my idea and independently created the VisualSmartSystem software, based on real-time 3D compositing, where the actors in the studio are in full interaction with the virtual environment.

Unreal and Unity are the most widely used real-time engines in the gaming industry. Which real-time engine did you choose, and why?

I chose Unreal Engine as the rendering system, primarily because it is free for TV and film production and because it is the best-supported game engine on the market, so the source code of the engine itself can be modified. The Blueprint scripting system is fantastic and quite optimized, so it reduces the time required to write code, and realizing an idea is much faster. Unreal Engine also supports Python, which can be used for scripting.
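
As a rough illustration of the Python support mentioned above, here is a minimal sketch that runs inside the Unreal Editor's built-in Python environment and simply lists project assets. The folder path "/Game/VirtualStudio" is a hypothetical example, not part of the actual project.

```python
# Minimal sketch of editor automation with Unreal's built-in Python support.
# This runs inside the Unreal Editor, where the `unreal` module is provided
# by the engine itself.
import unreal

# List every asset under a hypothetical project folder and log its path.
asset_paths = unreal.EditorAssetLibrary.list_assets("/Game/VirtualStudio", recursive=True)
for path in asset_paths:
    unreal.log(f"Found asset: {path}")
```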

What is your pipeline and what hardware do you use? 

We worked to make the pipeline available to everyone and to achieve far better results than existing software and hardware solutions, which are extremely expensive; licenses for a more advanced system sometimes cost up to 100,000 USD a year. We started with a simple desktop computer with an NVIDIA GeForce RTX 2060, a Blackmagic or AJA video card, and SteamVR trackers, from the HTC Vive to the Oculus Rift. In general, this is hardware that is very cheap and available to everyone. Now we can achieve top results with a GeForce RTX 2080 card, and we are far ahead of the others. The reason we are ahead of the competition is the use of artificial intelligence to optimize the image. The super-resolution algorithm works like this: when the system starts running, it first renders one 16K image, and the artificial intelligence, running on servers, predicts the next image at a lower resolution using motion vectors. In our case, we use 2K for TV or 4K for film, and the results are much better, because the image quality and frame rate are up to three times higher than the Unreal Engine standard. The source code of Unreal Engine has been reworked to reach the full potential of the system. We are currently working on integrating gyroscope-accelerometer units, which cost about thirty dollars and can be found in literally every mobile device. We are also working on the development of a markerless motion capture system for full interaction with the virtual environment.
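
The production super-resolution pipeline described above is not public, but the core idea of predicting a frame by warping a previously rendered frame along motion vectors can be sketched in a few lines. Everything here (the function name, array layout, and the use of OpenCV) is an illustrative assumption, not the team's actual implementation.

```python
# Illustrative sketch of motion-vector-guided frame prediction: warp a previous
# high-resolution frame forward along a dense motion-vector field to estimate
# the next frame, which is the general idea behind temporal upscaling.
import numpy as np
import cv2

def predict_next_frame(prev_frame: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """Warp prev_frame along a dense motion-vector field (H x W x 2, in pixels)."""
    h, w = flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Sample the previous frame at the positions the motion vectors point back to.
    map_x = (grid_x - flow[..., 0]).astype(np.float32)
    map_y = (grid_y - flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```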

How did you solve camera tracking?

We have solved the problem of camera tracking, and we are currently using SteamVR tracking, which has millimeter precision. The calibration of the lens itself is done in software, so that the lens distortion is removed and the camera image matches the virtual camera.
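
The exact calibration procedure in VisualSmartSystem is not described in detail, but removing lens distortion in software typically follows a standard checkerboard workflow. The sketch below uses OpenCV and hypothetical image filenames purely for illustration.

```python
# A minimal sketch of lens calibration and undistortion with OpenCV's standard
# checkerboard workflow; filenames and the board size are assumptions.
import numpy as np
import cv2

PATTERN = (9, 6)  # inner corners of the printed checkerboard (assumed)
obj_template = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj_template[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for i in range(20):  # hypothetical capture of 20 board views
    gray = cv2.imread(f"calib_{i:02d}.png", cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(obj_template)
        img_points.append(corners)

# Solve for the intrinsic matrix and distortion coefficients, then undistort a frame.
ret, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, gray.shape[::-1], None, None)
undistorted = cv2.undistort(cv2.imread("camera_frame.png"), K, dist)
```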

How did you implement keying in real time?

Real-time keying already exists in Unreal Engine, but it has its imperfections. We applied two algorithms, difference keying and 3D color keying, with a despill effect and color correction of the keyed image.
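
The actual keying algorithms run on the GPU and are more sophisticated, but a minimal CPU sketch of green-screen keying with a simple despill pass, written here as an illustrative assumption rather than the team's method, looks roughly like this:

```python
# Rough sketch of green-screen keying with a simple despill pass.
import numpy as np

def green_key(frame: np.ndarray, threshold: float = 0.15):
    """frame: float32 RGB in [0, 1]. Returns (despilled RGB, alpha matte)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # A pixel is treated as green screen when green dominates the other channels.
    spill = np.clip(g - np.maximum(r, b), 0.0, 1.0)
    alpha = np.clip(1.0 - spill / threshold, 0.0, 1.0)
    # Despill: pull the green channel down toward the other channels' maximum.
    despilled = frame.copy()
    despilled[..., 1] = np.minimum(g, np.maximum(r, b) + threshold)
    return despilled, alpha
```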

Have you used artificial intelligence, and in which cases?

We use artificial intelligence all the time the system is running; it optimizes and increases the resolution of the image. All of this is done through servers where access is free. We use TensorFlow, Keras, and PyTorch for machine learning.
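
The trained network itself is not public; as an illustration of the kind of model such a pipeline might use, here is a minimal ESPCN-style super-resolution sketch in PyTorch, one of the frameworks mentioned above. The layer sizes and scale factor are assumptions.

```python
# Minimal ESPCN-style super-resolution network sketch in PyTorch.
import torch
import torch.nn as nn

class TinySuperRes(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            # Produce scale*scale feature maps per output channel, then rearrange
            # them into a higher-resolution image with pixel shuffle.
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

# Upscale a hypothetical 1080p frame to twice the resolution.
frame = torch.rand(1, 3, 1080, 1920)
upscaled = TinySuperRes(scale=2)(frame)  # -> (1, 3, 2160, 3840)
```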

It is interesting that you use 3D (depth) compositing instead of layer compositing. Can you say something more about that?

We use 3D compositing as the right solution for the interaction of actors with the virtual space. The actor wears a 3D tracker, and that is how the actor's position in 3D space is determined. We are now working on a markerless motion capture system with ordinary cameras, where the software can capture the motion of multiple actors in the studio at the same time; rigged 3D characters then appear in the scene, and each character's texture is a projection of the camera image of the real actor onto the 3D character.
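
One simple way to see the difference from layer compositing is that a depth (3D) composite can choose between the live-action plate and the CG render per pixel by comparing depth, so an actor can stand behind or in front of virtual objects without fixed layer ordering. The sketch below is a generic illustration, not the project's actual implementation.

```python
# Generic sketch of depth (3D) compositing: per pixel, the closer surface wins.
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """All inputs are H x W (x 3 for RGB) arrays; both depths in the same units."""
    # Where the live-action depth is closer, the actor occludes the CG element.
    live_in_front = (live_depth < cg_depth)[..., None]
    return np.where(live_in_front, live_rgb, cg_rgb)
```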

Today, there are several companies in the world that have already developed a system for virtual production. How is your system different from others? 

Several companies in the world do virtual production. The problem is that some of them do not keep up with the times and instead form partnerships with hardware vendors whose hardware is extremely expensive; such a system cannot be sustainable at that price, and there has been no wider commercialization of these systems. Some software houses known in virtual production remain on old paradigms, and if they do not adapt to the changes in time, they will soon disappear into history. New ideas, new solutions, and new paradigms are coming that change everything. Today's ordinary person cannot fathom the enormous changes that happen in the blink of an eye. We are witnessing the enormous development of artificial intelligence that can do literally anything; sometimes it is frightening what the possibilities are today, and I am worried that it could get out of human control.

Pink International has a TV station and film production studios. Do you plan to implement the system in your own production?

Right now, we are working on implementation in everyday production. From October, we will launch a new channel with full implementation of the new system and other new solutions in digital production.

What were the challenges during the development of this project, and how many people are on your team?

Every day we face new challenges, because we discover new things every day. I have a small development team of a couple of people who are a great support in the development of the system, and we function as a small family. Everyone on our team makes a great contribution, from Željko Mitrović, who wholeheartedly supports the project, to my close associates like Miloš Momčilović, who works with me all the time as part of the team, when we sometimes work up to 24 hours a day. We are all united by our vision and strategy to achieve the goals we have imagined.

We are aware that the global pandemic has brought the film industry to a state of inaction and a slow restart due to social distancing. How can virtual production help the film industry during the coronavirus pandemic?

It is true that the COVID-19 pandemic has led to very damaging changes in the film industry and the global economy in general. I believe that virtual production will generally reduce the need for large resources in film production and bring multiple savings. Under these conditions, we can now make a film or series in a single studio, without the need to build sets, scout locations and obtain permits for them, and without strenuous 3D rendering where everything has to start over when an error occurs. To me, the making of “Life of Pi” is a fantastic example: almost the entire film was done in the studio, and the film is visually breathtaking.

What development problems are you currently solving, and when can we expect to see the system in use on your television channel or in film studios?

During development, we faced many problems, from camera calibration and motion tracking, through conflicts between hardware and software, to image quality problems in Unreal Engine itself. We have been working on the development of this system for almost a year and have solved the main obstacles.

Now, VisualSmartSystem is production-ready.

Every day we are working to improve our software, and its development has no end. Our challenge is to stay competitive by bringing new ideas and implementing new solutions.
