Data is gathered locally as the video plays. Graphs are generated automatically when the video is paused or comes to an end.
By default, the graph units are seconds and pixel position in the video frame, but they can be calibrated to the desired unit, including but not limited to centimeters, inches, feet, yards, meters, and football fields. Timing can also be adjusted for slow-motion or high-speed video.
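As a rough illustration of what such a calibration involves, the sketch below converts pixel distances to a real-world unit using a reference object of known length and rescales time using the capture frame rate. The function names and the reference-object approach are assumptions for illustration, not the app's actual implementation.

```python
# Minimal calibration sketch (assumption: a reference object of known
# real-world length is visible in the frame).

def make_calibration(ref_length_units, ref_length_pixels, capture_fps):
    """Return converters from pixels to the chosen unit and frames to seconds."""
    units_per_pixel = ref_length_units / ref_length_pixels
    seconds_per_frame = 1.0 / capture_fps   # use the capture rate, not the
                                            # playback rate, for slow motion

    def to_units(pixels):
        return pixels * units_per_pixel

    def to_seconds(frame_index):
        return frame_index * seconds_per_frame

    return to_units, to_seconds


# Example: a 1.0 m stick spans 400 px; footage captured at 240 fps slow motion.
to_meters, to_seconds = make_calibration(1.0, 400.0, capture_fps=240)
print(to_meters(200))   # 0.5 m
print(to_seconds(120))  # 0.5 s of real time
```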
Kinematic data (graph)
Shows keypoint positions as they change through the video, creating a three-dimensional visualization of two-dimensional movement through time.
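The sketch below illustrates the underlying idea with matplotlib: a 2D keypoint trajectory (x, y) plotted against time as a 3D curve. The trajectory data is simulated for illustration and stands in for a tracked keypoint.

```python
# Sketch: plot a 2D keypoint trajectory against time as a 3D curve.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0.0, 2.0, 120)                 # seconds
x = 0.4 * t                                    # horizontal position (m)
y = 1.0 + 0.8 * np.sin(np.pi * t) ** 2         # vertical position (m)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot(x, y, t)                               # movement through time
ax.set_xlabel("x (m)")
ax.set_ylabel("y (m)")
ax.set_zlabel("t (s)")
plt.show()
```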
Vertical displacement data (graph)
Plots each keypoint's vertical position over the length of the video or the selected region of the seek bar, perfect for tracking the height of a jump or identifying contact times in a sprint.
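As a hedged example of how such data could be used, the sketch below estimates a jump height as the peak rise of a hip keypoint above its standing baseline. The trajectory values and the choice of keypoint are made up for illustration.

```python
# Sketch: jump-height estimate from a keypoint's vertical displacement.
import numpy as np

t = np.linspace(0.0, 1.0, 101)                       # seconds
baseline = 0.95                                      # standing hip height (m)
# Simulated vertical trajectory: on the ground, then a short flight phase.
hip_y = baseline + np.where((t > 0.3) & (t < 0.8),
                            4.9 * (t - 0.3) * (0.8 - t), 0.0)

jump_height = hip_y.max() - baseline
print(f"estimated jump height: {jump_height:.2f} m")
```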
Horizontal displacement data (graph)
Tracks each keypoint's horizontal position through time, great for identifying symmetry issues and evaluating balance.
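The sketch below shows one possible symmetry check built on this kind of data, comparing how far the left and right ankles sit from the mid-hip line; the keypoint names and values are illustrative assumptions, not the app's method.

```python
# Sketch: compare left/right ankle offsets from the mid-hip line.
import numpy as np

rng = np.random.default_rng(0)
mid_hip_x = np.full(100, 320.0)                            # pixels
left_ankle_x = mid_hip_x - 40 + rng.normal(0, 2, 100)
right_ankle_x = mid_hip_x + 55 + rng.normal(0, 2, 100)

left_offset = np.mean(np.abs(left_ankle_x - mid_hip_x))
right_offset = np.mean(np.abs(right_ankle_x - mid_hip_x))
print(f"left/right mean offset: {left_offset:.1f} px vs {right_offset:.1f} px")
```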
Angular data (graph)
Shows the angles of the selected joints and how they change through time. This graph also includes the center of mass position, to be used as a reference.
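A joint angle can be computed from three keypoints, as in the minimal sketch below (knee angle from hip, knee, and ankle). The coordinates are illustrative, and the app's own angle conventions may differ.

```python
# Sketch: joint angle at a vertex keypoint from two adjacent keypoints.
import numpy as np

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c."""
    ba = np.asarray(a, float) - np.asarray(b, float)
    bc = np.asarray(c, float) - np.asarray(b, float)
    cos_angle = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

hip, knee, ankle = (310, 220), (325, 340), (320, 460)   # pixel coordinates
print(f"knee angle: {joint_angle(hip, knee, ankle):.1f} deg")
```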
Calculations (graph)
Shows the speed and acceleration of the center of mass throughout the video. Smoothing takes into account previous frames.
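The sketch below shows one way such values could be derived: finite-difference speed and acceleration of the center of mass, smoothed with a trailing average so that only previous frames contribute. The window size and the simulated data are assumptions for illustration.

```python
# Sketch: center-of-mass speed and acceleration with trailing smoothing.
import numpy as np

def trailing_mean(values, window=5):
    """Average each sample with the preceding `window - 1` samples."""
    out = np.empty_like(values, dtype=float)
    for i in range(len(values)):
        out[i] = values[max(0, i - window + 1):i + 1].mean()
    return out

fps = 60.0
com_x = np.cumsum(np.full(120, 0.05))          # center of mass x per frame (m)
com_y = np.full(120, 1.0)                      # constant height (m)

vx = np.gradient(com_x, 1.0 / fps)
vy = np.gradient(com_y, 1.0 / fps)
speed = trailing_mean(np.hypot(vx, vy))
acceleration = trailing_mean(np.gradient(speed, 1.0 / fps))
print(speed[:5], acceleration[:5])
```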
Annotations on graphs
You can make notes on the charts to highlight specific events for a consultation with your AI Biomechanics Assistant, for export, or for further analysis.
3D pose estimation
Real-time 3D projection of the estimated position of each of the subject's keypoints in the current video frame. You can orbit the frame to observe motion from different perspectives.
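For a feel of what orbiting a 3D pose looks like in code, the sketch below draws a handful of hypothetical 3D keypoints with matplotlib and sets the viewing angles; the joint coordinates and skeleton edges are made up, and a real estimator would supply them.

```python
# Sketch: view hypothetical 3D keypoints from a chosen orbit angle.
import matplotlib.pyplot as plt

# Hypothetical 3D keypoints (x, y, z) in meters for a single frame.
joints = {
    "head": (0.0, 0.0, 1.7), "hip": (0.0, 0.0, 1.0),
    "l_knee": (-0.1, 0.1, 0.55), "r_knee": (0.1, -0.1, 0.55),
    "l_ankle": (-0.1, 0.15, 0.1), "r_ankle": (0.1, -0.15, 0.1),
}
edges = [("head", "hip"), ("hip", "l_knee"), ("l_knee", "l_ankle"),
         ("hip", "r_knee"), ("r_knee", "r_ankle")]

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for a, b in edges:
    xs, ys, zs = zip(joints[a], joints[b])
    ax.plot(xs, ys, zs, marker="o")
ax.view_init(elev=15, azim=45)   # "orbit": change elev/azim for a new view
plt.show()
```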