Use Action's Analyzer to compute the path of a live-action camera and the motion of objects in 3D space. With the calculated position and motion of the virtual camera, you can match image sequences precisely, placing any element in the scene. As the camera moves, the perspective of the placed element changes with the perspective of the background. The virtual camera's motion is intended to be identical to that of the actual camera that shot the scene.
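The idea behind this perspective match can be illustrated with a minimal pinhole-camera sketch (this is illustrative math only, not the Flame API): projecting the same 3D point through the solved virtual camera at each frame is what keeps an inserted element locked to the background plate.

```python
# Minimal pinhole-camera sketch (illustrative, not Flame's implementation).
# The camera looks down +Z; the point and positions are made-up values.

def project(point, cam_pos, focal_length):
    """Project a 3D point (x, y, z) into a camera at cam_pos looking down +Z."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    # Perspective divide: screen coordinates shrink with distance.
    return (focal_length * x / z, focal_length * y / z)

# As the camera dollies toward the point, its projection moves outward,
# exactly as the background perspective changes in the plate.
p = (1.0, 0.5, 10.0)
frame_1 = project(p, (0.0, 0.0, 0.0), 25.0)   # camera at origin
frame_2 = project(p, (0.0, 0.0, 4.0), 25.0)   # camera moved 4 units closer
```

Because both the element and the background obey the same projection, solving for the camera once is enough to composite any number of elements consistently.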
Use the following workflow as a quick start guide to the Analyzer. Follow the links for more detailed information.
Step 1
Select back or front/matte media (mono or stereo) to analyze and add an Analyzer node.
Optional steps:
- Perform a lens correction.
- Adjust the four corners of the perspective grid to set the focal length.
- Add manual trackers to obtain a more predictable and consistent result.
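Why a lens correction helps can be sketched in a few lines (a hedged illustration, not Flame's solver): real lenses bend straight lines, so tracked points are shifted by radial distortion, and removing that shift before the solve makes the camera analysis more accurate. The single-coefficient model and one-step inversion below are assumptions for illustration.

```python
# Illustrative first-order radial lens correction (not Flame's implementation).
# k1 is an assumed single distortion coefficient; x, y are normalized
# image coordinates relative to the lens center.

def undistort(x, y, k1):
    """Approximately remove first-order radial distortion from a point."""
    r2 = x * x + y * y
    # Distorted ~= undistorted * (1 + k1 * r^2); invert with one
    # correction step using the distorted radius as an approximation.
    factor = 1.0 + k1 * r2
    return (x / factor, y / factor)
```

With no distortion (k1 = 0) the point is unchanged; with barrel distortion (k1 > 0), points far from the lens center are pulled back outward toward their true positions before tracking data feeds the solve.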
![](https://help.autodesk.com/cloudhelp/2024/ENU/Flame-Compositingin3d/images/GUID-9545B40C-ED86-4E2C-9A73-5B04CDDE23D8.png)
Tracking using the Analyzer Node
Step 2
Perform Camera Tracking in the Analyzer menu.
Optional steps:
- Add mask constraints to exclude moving areas, or any other areas you do not want in the analysis.
- Add properties of the camera that shot the footage to be analyzed.
Step 3
Fine-tune the camera tracking analysis by recalibrating or refining it. This step is optional, depending on the results of your initial analysis.
![](https://help.autodesk.com/cloudhelp/2024/ENU/Flame-Compositingin3d/images/GUID-0F93CC2E-8C69-47F4-8718-F740F12AE614.png)
Step 4
After the analysis, create a Point Cloud from selected points.
![](https://help.autodesk.com/cloudhelp/2024/ENU/Flame-Compositingin3d/images/GUID-DD4D335E-CAF8-4D24-B402-989BC0182599.png)
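How a point cloud can be recovered after the camera solve can be sketched as follows (an illustration of the principle, not Flame's implementation): once the camera's position is known at two frames, each tracked 2D feature can be triangulated back into 3D. A simple case is assumed here, with the camera translating along X between the frames.

```python
# Illustrative two-view triangulation (not Flame's implementation).
# The camera is assumed to translate along X by `baseline` between frames.

def triangulate(x1, x2, y, baseline, focal_length):
    """Recover (X, Y, Z) for one feature seen in two frames.

    x1, x2: horizontal image coordinates of the feature in frames 1 and 2
    y: vertical image coordinate (unchanged for a pure X translation)
    baseline: camera translation along X between the frames
    """
    disparity = x1 - x2
    z = focal_length * baseline / disparity   # nearer points shift more
    # Back-project the frame-1 observation to depth z.
    return (x1 * z / focal_length, y * z / focal_length, z)
```

Applying this to every selected tracked feature yields the cloud of 3D points that anchors new elements in the scene's geometry.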
Step 5
Perform Object Tracking. After camera tracking, you can optionally track moving objects in the scene, such as areas inside masks that were excluded from the camera tracking analysis.
![](https://help.autodesk.com/cloudhelp/2024/ENU/Flame-Compositingin3d/images/GUID-5B398BDC-89FD-45EA-9BF0-2E38E5DC83F3.png)