The International Institute of Information Technology, Hyderabad, has developed GAZED, an Artificial Intelligence-powered video editing tool that combines footage from a single static camera with data on where viewers actually look.
GAZED, an acronym of GAZe-guided EDiting technique, was developed by a team from IIIT's Centre for Visual Information Technology comprising Prof. Vineet Gandhi and his students Moneish Kumar and K.L. Bhanu Moorthy.
Traditionally, events such as music concerts are recorded by trained camera operators stationed at several vantage points, and the captured footage is then meticulously assembled by a video editor. GAZED seeks to make this process much simpler even as it improves the visual experience.
From footage captured by a single static camera, the AI engine simulates multiple virtual pan-tilt-zoom cameras within the one video feed. GAZED then dons the hat of a video editor and selects the best shots. The result is a rich visual experience akin to cinematic footage.
To give the footage more authenticity, the algorithm has been designed to maintain rhythmic consistency. This is achieved by training it to avoid cuts between overlapping shots and to avoid rapid shot transitions. GAZED first composes virtual shots from actor or artiste tracks; it then selects the best shot at each time interval through an optimisation that incorporates viewer gaze together with cinematic editing principles.
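The selection step described above can be pictured as an optimisation over time: pick the shot viewers are gazing at, but pay a penalty for every cut, and a larger one for cuts between overlapping shots. The sketch below is a minimal, hypothetical dynamic-programming illustration of that idea; the function name, score values, and penalty weights are assumptions for illustration, not GAZED's actual implementation.

```python
def select_shots(gaze_scores, overlaps, transition_penalty=1.0,
                 overlap_penalty=2.0):
    """Illustrative sketch: choose one virtual shot per time step,
    maximising gaze-based score while penalising cuts, with an extra
    penalty for cuts between overlapping shots (jump cuts).

    gaze_scores: list over time steps of per-shot scores
    overlaps: set of frozenset({i, j}) pairs of shots whose frames overlap
    """
    n_steps = len(gaze_scores)
    n_shots = len(gaze_scores[0])

    # best[t][s] = best total score of any shot sequence ending
    # at time t in shot s; back[t-1][s] = predecessor shot index
    best = [list(gaze_scores[0])]
    back = []
    for t in range(1, n_steps):
        row, ptr = [], []
        for s in range(n_shots):
            cands = []
            for p in range(n_shots):
                cost = 0.0
                if p != s:                      # this transition is a cut
                    cost += transition_penalty
                    if frozenset((p, s)) in overlaps:
                        cost += overlap_penalty  # discourage jump cuts
                cands.append(best[t - 1][p] - cost)
            p_star = max(range(n_shots), key=lambda p: cands[p])
            row.append(cands[p_star] + gaze_scores[t][s])
            ptr.append(p_star)
        best.append(row)
        back.append(ptr)

    # trace back the optimal shot sequence from the best final state
    s = max(range(n_shots), key=lambda s: best[-1][s])
    seq = [s]
    for ptr in reversed(back):
        s = ptr[s]
        seq.append(s)
    return list(reversed(seq))
```

With hypothetical scores favouring shot 0 early and shot 1 late, the cut penalty lets the optimiser decide whether the late gain justifies a transition; raising the overlap penalty makes it hold the first shot instead.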
Given the COVID pandemic, GAZED is expected to help smaller companies that do not have the wherewithal to procure expensive and state-of-the-art recording equipment.