Vision[Ai]ry Ft

Facial Tracking and Recognition System

Vision[Ai]ry Facial Tracking (Ft) is the first in a suite of products that use video analytics to automate the functions of a camera operator. Vision[Ai]ry Ft uses AI-based facial recognition to detect, locate, and track the position of faces within the video stream directly from the camera.

It then uses these facial positions to drive the pan, tilt and zoom axes of the robotic camera system to maintain the desired framing of the face or faces in the image. This eliminates the need for a camera operator to manually adjust for the position of the subject in the image.

Consistent Framing

Vision[Ai]ry Ft reduces the burden on the camera operator by eliminating the need for manual corrections of the camera position to compensate for day-to-day variations in talent seating position, posture, height, and more.

Hands-Free Camera Workflow

Framing settings can be saved as templates and automatically recalled with robotic presets, providing a hands-free camera workflow when combined with automated production control software such as OverDrive.

High-Quality, Consistent Tracking

Vision[Ai]ry Ft improves quality and consistency by automatically tracking on-air movements of the studio talent, driving the robotic camera to provide smooth, consistently well-framed images at all times, without relying on a skilled operator.

Vision[Ai]ry Ft
Combining a flexible feature set with reliable facial detection algorithms, Vision[Ai]ry Ft simplifies camera workflows and further improves consistency and quality in your productions. See it in action!


Robust Detection Algorithm

Vision[Ai]ry Ft’s algorithm ensures that faces will be accurately identified and located as long as at least 50% of the face is visible in the image. The algorithm is trained against a diverse set of race, gender and age data that includes a wide variety of poses, head coverings and accessories.

Simple, Powerful User Interface

The clean, simple UI provides a live display of the video from the camera, with detected faces and the framing target clearly indicated, surrounded by easy-to-understand status information, tracking controls, and the framing template library.

Adjustable Damping and Deadband

The adjustable damping and deadband settings allow the user to tailor the behaviour and performance of the system to suit the movements of the talent. This ensures that the system maintains optimal framing and tracking, eliminating undesirable movement and overshoot.
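The interplay of deadband and damping can be sketched as a simple correction function. This is an illustrative model only, assuming normalized image coordinates; the function and parameter names are hypothetical, not Ross's implementation.

```python
def pan_correction(face_x: float, target_x: float,
                   deadband: float, damping: float) -> float:
    """Return a pan adjustment toward the framing target.

    face_x / target_x: horizontal positions in normalized image
    coordinates (0.0 = left edge, 1.0 = right edge).
    deadband: error magnitude below which no correction is made,
    so small talent movements don't jiggle the camera.
    damping: 0..1 factor scaling the correction, trading
    responsiveness for smoothness and reduced overshoot.
    """
    error = face_x - target_x
    if abs(error) < deadband:
        return 0.0          # inside the deadband: hold position
    return damping * error  # damped correction toward the target
```

Raising the deadband makes the camera ignore small shifts in posture; lowering the damping slows the correction for smoother on-air motion.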

Automatic Zoom Compensation

Vision[Ai]ry Ft automatically calibrates the camera lens and adjusts tracking gain to avoid over-reacting at higher zooms, where small pan or tilt changes result in much larger changes in framing.
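The idea behind zoom compensation can be illustrated by scaling tracking gain with the lens's horizontal field of view: at longer focal lengths the same pan angle moves the subject much further across the frame, so the gain is reduced proportionally. The function names, the sensor width, and the reference wide-end focal length below are assumptions for illustration, not Ross's calibration.

```python
import math

def compensated_gain(base_gain: float,
                     focal_length_mm: float,
                     sensor_width_mm: float = 6.17) -> float:
    """Scale tracking gain by horizontal field of view.

    A narrower field of view (higher zoom) yields a smaller gain,
    preventing the controller from over-reacting at tight framings.
    """
    fov = 2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm))
    # Reference field of view at an assumed 4.3 mm wide end
    wide_fov = 2.0 * math.atan(sensor_width_mm / (2.0 * 4.3))
    return base_gain * (fov / wide_fov)
```

At the wide end the gain is unchanged; zoomed in tenfold, the gain drops sharply so pan and tilt corrections stay proportionate on screen.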

Unlimited Framing Templates

Framing templates allow all framing parameters, including the framing target location and size, deadband and damping, to be stored and recalled. This enables the operator to create an unlimited number of shot compositions that can be instantly applied to any camera.
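A framing template is essentially a named bundle of the parameters above that can be stored and applied to any camera. The sketch below shows one plausible shape for such a record; the field names and serialization are assumptions for illustration, not the product's actual format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class FramingTemplate:
    name: str
    target_x: float     # framing target centre, normalized 0..1
    target_y: float
    target_size: float  # desired face size as a fraction of frame height
    deadband: float
    damping: float

# Templates can be serialized for storage and recalled later on any camera.
template = FramingTemplate("Tight single", 0.5, 0.33, 0.25, 0.04, 0.6)
stored = json.dumps(asdict(template))
recalled = FramingTemplate(**json.loads(stored))
```

Because each template is self-contained, a library of shot compositions can grow without limit and be applied instantly.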

Recall Framing Templates with Robotic Presets

By linking a framing template to a robotic preset, the desired framing parameters are automatically recalled and applied to the camera. This is particularly useful in automation workflows where it eliminates the need for the automation operator to touch up shots before taking them to air, allowing them to focus on other aspects of the production.
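The preset-to-template link can be pictured as a simple lookup: when a robotic preset is recalled, the linked template is fetched from the library and applied. All identifiers below are hypothetical.

```python
from typing import Optional

# Hypothetical mapping of robotic presets to framing template names
preset_links = {
    "Cam1-Preset3": "Tight single",
    "Cam2-Preset1": "Two shot",
}

def on_preset_recall(preset_id: str, template_library: dict) -> Optional[dict]:
    """Return the framing template to apply when a preset is recalled,
    or None if no template is linked to that preset."""
    name = preset_links.get(preset_id)
    return template_library.get(name) if name else None
```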


Specifications: Vision[Ai]ry Ft

Number of robots controlled: No fixed limit
Minimum PC requirements: Intel Core i7, 2.9 GHz, 8 cores; 8 GB RAM; Intel integrated graphics; solid-state drive
Video sources: Local (e.g., SDI capture card)
Faces tracked simultaneously: Up to 30

Latest News and Resources

Solution Brief

Complete overview of Vision[Ai]ry Ft

How Can Facial Recognition and Tracking Assist Camera Operators?

Getting the best shots with robotic cameras.


Interested in more info?

Send us a Note!

The first step is a consultation with a member of our sales team or one of our certified Ross resellers to discuss the specific needs of your organization. Next, we will look for a mutually convenient opportunity to coordinate a demo.