AirPixel successfully tracked the in-stadium Skycam at State Farm Stadium during Super Bowl LVII to deliver accurate position and orientation data as well as the camera’s focus, iris and zoom values. The integration of this data enabled flawless, dynamic matching of live imagery with virtual imagery throughout the show.
AirPixel is highly scalable and easy to deploy, making it ideal for temporary installations within stadiums. It is unaffected by lighting changes or weather conditions, ensuring reliable data is available at any point in the production.
To match live imagery with virtual imagery, you need to know the exact position and orientation of the physical camera. Additionally, for the graphics to interact realistically with the camera’s focus, iris and zoom, you also need to capture and integrate these values.
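To see why both the pose and the lens values matter, consider a simplified pinhole-camera sketch. The function below, with illustrative names and a deliberately reduced model (pan and tilt only, no roll or lens distortion), projects a virtual 3D point into pixel coordinates; note how the focal length, which changes with zoom, scales the result.

```python
import math

def project_point(point_world, cam_pos, pan_deg, tilt_deg, focal_px, cx, cy):
    """Project a 3D world point to pixel coordinates for a camera at
    cam_pos with the given pan/tilt. Z is up; the camera looks along
    +Y at pan = tilt = 0. Illustrative sketch, not AirPixel's API."""
    # Translate into camera-centred coordinates
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    # Undo the pan (rotation about the vertical Z axis)
    p = math.radians(pan_deg)
    xr = x * math.cos(p) + y * math.sin(p)
    yr = -x * math.sin(p) + y * math.cos(p)
    # Undo the tilt (rotation about the camera's X axis)
    t = math.radians(tilt_deg)
    yt = yr * math.cos(t) + z * math.sin(t)
    zt = -yr * math.sin(t) + z * math.cos(t)
    # Pinhole projection: the focal length in pixels depends on zoom,
    # so the same point lands on a different pixel when the lens zooms
    u = cx + focal_px * xr / yt
    v = cy - focal_px * zt / yt
    return u, v
```

Doubling `focal_px` (zooming in) doubles a point's offset from the image centre, which is why a virtual graphic rendered without the live zoom value would visibly slide off its real-world anchor.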
Determining the position and orientation of a camera is known as camera tracking. Inside a broadcast studio, cameras are often tracked with optical marker-based systems. However, if you want to track a camera in a large open volume such as a football stadium with changing lighting conditions, these systems are very hard to operate reliably.
If a camera within a stadium is in a fixed position, on an encoded pedestal or crane, tracking values are relatively easy to obtain. However, this limits graphics to interacting with imagery from a static camera position and does not capture lens values.
Many stadium shots now take advantage of an aerial camera mounted on cables that flies above the field, such as a Skycam. It is a lot harder to get exact position and orientation of such a camera as it is moving at pace, in three dimensions, over a large area, and with bounce and motion within its cable mounting. This unpredictable motion is not detected by encoders on cable systems.
To track a Skycam, or other freely moving cameras such as a Steadicam, Racelogic developed a system called AirPixel, which combines Ultra-Wideband (UWB) radio signals with inertial sensors.
The principle of AirPixel is comparable to a GPS system where a receiver calculates its position and orientation by communicating with a number of satellites. In the case of AirPixel, the receiver is called a ‘rover’ which is mounted on the camera, and the satellites are called ‘beacons’ which are mounted around the stadium.
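The underlying geometry of this principle can be sketched as trilateration: given known beacon positions and measured ranges, the rover's position is the intersection of the range spheres. The toy function below solves the 2D, three-beacon case by subtracting the circle equations to get a linear system; a real UWB system works in 3D with more beacons and filters out measurement noise.

```python
def trilaterate_2d(beacons, ranges):
    """Estimate a rover's 2D position from distances to three fixed
    beacons, the same principle a GPS receiver uses with satellites.
    Illustrative only, not AirPixel's actual solver."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    # Subtracting the circle equation of beacon 1 from the others
    # cancels the quadratic terms, leaving two linear equations.
    a1 = 2 * (x2 - x1); b1 = 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2 = 2 * (x3 - x1); b2 = 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    # Solve the 2x2 system with Cramer's rule
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

For example, beacons at (0, 0), (100, 0) and (0, 100) with ranges measured from a rover at (30, 40) recover exactly that position.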
By not relying on encoders or optical tracking, AirPixel is unaffected by changes in lighting or weather conditions. The combination of UWB and inertial sensors allows high-accuracy tracking of all camera motion across a wide area, ensuring accurate data throughout a whole stadium.
Additionally, AirPixel integrates directly with the camera to capture all focus, iris and zoom values, so that graphics can respond to lens-driven changes as well as to the camera's physical movement.
An AirPixel system consists of a camera-mounted receiver (rover) and several positioning beacons that communicate with each other using ultra-wideband radio. The rover calculates its location and orientation using the UWB data and an internal IMU. This information is processed through a sophisticated filter algorithm to provide accurate X, Y, Z, Pan, Tilt, and Roll outputs.
The rover also connects to a lightweight control unit which combines the position data with lens FIZ data, genlocks the output and transmits to the render engine via Ethernet or serial connection.
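As a sketch of what such a combined message might look like, the function below packs one frame of pose plus lens FIZ values into a fixed-size binary payload suitable for a UDP or serial link. The field layout and names are purely illustrative assumptions, not AirPixel's actual wire format; render engines typically consume standardised tracking protocols for this purpose.

```python
import struct

def encode_tracking_frame(x, y, z, pan, tilt, roll, focus, iris, zoom):
    """Pack one genlocked frame of camera pose (position in metres,
    orientation in degrees) plus lens focus/iris/zoom into a binary
    message. Hypothetical layout: nine little-endian 32-bit floats."""
    return struct.pack("<9f", x, y, z, pan, tilt, roll, focus, iris, zoom)
```

A fixed 36-byte frame like this is trivial to parse on the receiving end and small enough to send once per video frame over either transport.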
To set up the system, beacons are placed around the perimeter of the stage and/or above the stage at varying heights and locations. The beacons are then surveyed using a high-speed robotic total station and bespoke in-house software. Using this method, it is possible to set the beacon locations with millimetre accuracy, and still complete setup in less than 2 hours for most configurations.
AirPixel works with many camera rigging systems, having been used on dolly, Steadicam, jibs, cranes and cable cams. It also works in any lighting conditions, including total darkness.
AirPixel integrates with popular third-party products to provide the best solution for your needs. This means that accurate data can often be provided on shots where traditionally tracking would be difficult or impossible.