You need video/data resync to achieve real remote production, even if you didn’t know it
Remote production is a hot topic at the moment. The ability to produce content while minimizing the equipment and personnel that must be mobilized is a trend that, given the current global context, will only become more important over the next few years. There are a number of ways to approach remote production, but looking ahead the obvious solution seems to be the IP route. Unfortunately, IP still has a long way to go before it answers all the challenges remote production poses.
IP video requires highly specialized equipment and very good connectivity between endpoints, which is not always achievable. Also, to solve timing issues when sending data over remote networks, much of the emphasis falls on the PTP protocol as a synchronization method. PTP focuses on synchronizing network equipment and adapting to changing delay conditions across the network, but it does not take into account the actual content of the data being transmitted.
TUBOC’s resync technology is a set of tools that can be used to regain lost synchronization between video and other data feeds. Different video feeds, audio feeds and raw data feeds can be sent from one location to another over whatever transmission method is needed (satellite, IP, fiber, internet, etc.) and will be resynced back to their relative times with millisecond accuracy. This includes IP video (for future use), but also any other technology already available on the market.
So what can resync do for you?
Resync is not a single product, but a set of highly customizable tools that can be combined to build almost any workflow needed. It does not rely on computing delays over the network; instead, it uses each video frame of every video signal in the system as the source of synchronization. This ensures that every video frame in the system carries the right data, and that this data is consumed with millisecond accuracy. It also means that a video feed can be stopped, rolled back or scrubbed over, and the data will follow, as illustrated by the sketch below.
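The following is a minimal illustrative sketch of this frame-driven idea, not TUBOC’s implementation, which is not public. It assumes each data sample is tagged at the source with the timestamp of the video frame it belongs to, so the receiver can re-pair data with frames by frame time rather than by arrival time. The class and parameter names (FrameSyncedBuffer, tolerance_ms) are hypothetical.

```python
# Illustrative sketch only: assumes every data sample (tracking, tally,
# control data, ...) is tagged at the source with the timestamp of the
# video frame it belongs to, so it can be re-paired with that frame
# regardless of how each feed travelled or how late it arrived.
import bisect

class FrameSyncedBuffer:
    """Stores data samples keyed by the originating frame timestamp (ms)."""

    def __init__(self, tolerance_ms=1.0):
        self.tolerance_ms = tolerance_ms
        self._timestamps = []   # sorted frame timestamps, in milliseconds
        self._samples = {}      # timestamp -> data payload

    def push(self, frame_ts_ms, payload):
        """Called as data arrives, in any order and over any transport."""
        if frame_ts_ms not in self._samples:
            bisect.insort(self._timestamps, frame_ts_ms)
        self._samples[frame_ts_ms] = payload

    def lookup(self, frame_ts_ms):
        """Return the data for the frame currently being shown.

        Works the same for live playout, pauses, roll-backs and scrubbing,
        because the query is driven by the frame's own timestamp rather
        than by wall-clock arrival time.
        """
        if not self._timestamps:
            return None
        i = bisect.bisect_left(self._timestamps, frame_ts_ms)
        candidates = self._timestamps[max(0, i - 1):i + 1]
        best = min(candidates, key=lambda t: abs(t - frame_ts_ms))
        if abs(best - frame_ts_ms) <= self.tolerance_ms:
            return self._samples[best]
        return None  # no sample close enough for this frame

# Example: tracking data arrives late and out of order, yet still pairs
# with the right frames when the video is played back or scrubbed.
buf = FrameSyncedBuffer()
buf.push(40.0, {"pan": 1.2})
buf.push(0.0, {"pan": 1.0})
print(buf.lookup(40.0))   # {'pan': 1.2}
print(buf.lookup(0.5))    # {'pan': 1.0}, within the 1 ms tolerance
```

The point of the sketch is simply that, once data is keyed by frame time, the transport used and the order of arrival stop mattering for synchronization.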
Some uses of this technology include:
- Remote use of data (sensor data, camera tracking data…).
- Remote control of equipment (tally data, graphics control data, mixer control…).
- Resynchronization of video feeds in different formats (using different encoder/decoders, different signal paths, even different video standards).
- Resynchronization of video and audio feeds.
The use cases for this system are endless. They include straightforward remoting of equipment, but also some more interesting applications:
- Remote production of graphics and AR in a centralized environment.
- Remote production of graphics and AR at each final broadcaster, allowing for unlimited customization.
- Resynchronization of video feeds at different encoding qualities and video standards.
- Fixing the “broken phone” audio effect in long-distance interviews.
- Real 3D integration of remote virtual studios: next-level “teleportation” of talent.
- Using real environments as virtual sets.
- Video-synchronized talkback.