The nx:genai function calls an external AI media generation provider during the render pipeline and injects the result into a specified layer before After Effects starts compositing. This lets you combine AI-generated content with your AE templates in a single job - no manual steps, no intermediate storage wrangling.
The provider is called at render time, so the generated asset is always fresh and unique to each job.
Parameters
- The AI provider to call. Currently supported: fal.
- API key for the provider. Use a secret reference to avoid embedding credentials directly - e.g. ${secret.FAL_KEY}.
- The model identifier on the provider's platform. The exact value depends on which model you want to use - refer to your provider's model library for available options.
- Output type: video or image. Determines how the result is injected into the layer.
- The footage layer in your After Effects composition where the generated asset will be placed.
- Model-specific parameters passed directly to the provider's API. The shape of this object varies by model - refer to your provider's documentation for the full list of available fields.
Example
The following example uses FAL.AI to generate a talking-head video of a character speaking a provided script, then injects it into a footage layer. The data fields here are specific to the infinitalk/single-text model - a different model would expect a different set of parameters. Always consult the model's documentation on the provider platform for the exact payload shape.
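A minimal sketch of such a job payload. The field names (provider, apiKey, model, output, layerName, data) and the data values are illustrative, inferred from the parameter descriptions above rather than taken from the nx:genai reference - only fal, ${secret.FAL_KEY}, and infinitalk/single-text come from this page. Check the actual asset schema before copying:

```json
{
  "template": {
    "src": "file:///path/to/template.aep",
    "composition": "main"
  },
  "assets": [
    {
      "type": "nx:genai",
      "provider": "fal",
      "apiKey": "${secret.FAL_KEY}",
      "model": "infinitalk/single-text",
      "output": "video",
      "layerName": "avatar",
      "data": {
        "text_input": "Hello, and welcome to our demo!",
        "voice": "Bill"
      }
    }
  ]
}
```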
Adaptive Composition Duration
Since AI-generated videos have variable lengths, use nx:layer-duration-set immediately after nx:genai in your assets array to automatically adjust the layer and composition duration to match the generated output.
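A sketch of that ordering, with illustrative field names (the asset shape is assumed, not confirmed by this page): nx:layer-duration-set comes right after nx:genai in the assets array, targeting the same layer, so it can pick up the generated clip's length:

```json
"assets": [
  {
    "type": "nx:genai",
    "provider": "fal",
    "apiKey": "${secret.FAL_KEY}",
    "model": "infinitalk/single-text",
    "output": "video",
    "layerName": "avatar",
    "data": { "text_input": "..." }
  },
  {
    "type": "nx:layer-duration-set",
    "layerName": "avatar"
  }
]
```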
Storing Your Provider Key as a Secret
Never embed provider API keys directly in a job payload. Store them as Nexrender secrets and reference them with the ${secret.NAME} syntax - see Secrets Management for details.
