By Michael Perdomo — I Help Brands Create High ROI Ad Creatives Faster, For Less
Unlock a complete library of ready-to-use AI prompts and a thorough, field-tested workflow to produce cinema-grade AI-generated video assets for marketing, branding, and storytelling. Access proven prompts for base images, motion cues, texture and lighting, plus step-by-step strategies that shorten production cycles, reduce trial-and-error, and deliver scalable video assets without the guesswork. Ideal for teams and creators looking to accelerate AI-driven video workflows and maintain consistent quality.
Published: 2026-02-11 · Last updated: 2026-02-17
Create cinema-grade AI-generated videos faster by using a proven prompt library and step-by-step workflow.
Video production leads at marketing agencies seeking scalable AI-generated footage without traditional shoots; freelance editors exploring prompt-driven AI video workflows to speed up projects; brand marketers evaluating AI video prompts to shorten production cycles and ensure consistent quality.
Basic understanding of AI/ML concepts. Access to AI tools. No coding skills required.
Ready-to-use prompts, cinema-grade outputs, and case studies showing where they apply.
AI Video Prompt Masterclass: Exact Prompts & Step-by-Step Tutorial is a field-ready playbook that bundles exact prompts, templates, and a step-by-step workflow for producing cinema-grade AI-generated video assets. Valued at $50 but available free, it typically saves about 5 hours per project and is built for video production leads at marketing agencies, freelance editors, and brand marketers.
This playbook is a practical collection of ready-to-use prompts, motion cues, texture and lighting prompts, plus operational checklists and execution frameworks for AI-driven video production. It includes templates, workflow checklists, systemized prompt libraries, and step-by-step operational tools that turn the approach described above into repeatable outputs such as ready-to-use prompts and cinema-grade results.
The playbook reduces experimental cycles and standardizes outputs so teams can deliver predictable, scalable AI video without expensive shoots.
What it is: Structured prompt templates for generating high-fidelity base frames with explicit camera, lighting, and texture directives.
When to use: Start of every asset when you need a consistent visual baseline for animation and compositing.
How to apply: Use the provided templates, adjust focal length, skin descriptions, and lighting nodes, and lock the base image before animating.
Why it works: Constraining the base frame reduces downstream variation and provides a single reference that animation and grading stages can rely on.
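To make the idea of a constrained base-frame template concrete, here is a minimal Python sketch. The field names (subject, focal length, lighting, color temperature, texture, environment) and the template string are illustrative assumptions, not the playbook's exact prompt strings:

```python
# Hypothetical structured base-image prompt template. Locking these fields
# gives every asset the same visual baseline before animation begins.
BASE_IMAGE_TEMPLATE = (
    "Portrait of {subject}, shot on {focal_length}mm lens, "
    "{lighting} lighting at {color_temp}K, {texture} skin texture, "
    "{environment} background"
)

def build_base_prompt(subject, focal_length=85, lighting="soft key",
                      color_temp=5600, texture="natural matte",
                      environment="neutral studio"):
    """Fill the template so camera, lighting, and texture directives stay fixed."""
    return BASE_IMAGE_TEMPLATE.format(
        subject=subject, focal_length=focal_length, lighting=lighting,
        color_temp=color_temp, texture=texture, environment=environment,
    )

prompt = build_base_prompt("a smiling barista")
print(prompt)
```

Adjusting a single named field (say, `focal_length`) changes one directive while leaving the rest of the baseline untouched, which is the point of locking the base image before animating.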
What it is: A set of motion-cue prompts for subtle facial expressions and micro-movements suitable for marketing close-ups.
When to use: During animation pass when realism and emotional nuance are required without over-animating.
How to apply: Layer micro-motion cues on top of the base image, test at 24–30 fps, and run two quick subjective reviews before finalizing.
Why it works: Small, repeatable motion directives produce believable movement while preserving intent and brand tone.
What it is: A checklist and prompt set for matching studio-grade lighting, skin texture, and environmental reflections across assets.
When to use: For any asset that will appear in the same campaign or brand set to ensure visual continuity.
How to apply: Apply the lighting checklist, lock color temperature and specular descriptors in prompts, and compare with a reference frame.
Why it works: Consistent physical descriptors and a short validation checklist prevent drift between renders and reduce grading time.
What it is: A repeatable sequence that copies a working pipeline: generate base image, animate, and refine using the same toolchain and prompt structures.
When to use: When replicating a successful asset style across multiple variations or campaigns.
How to apply: Recreate the pattern from the LinkedIn workflow: base image via Nano Banana Pro, animation via Kling 3.0, then apply the same texture/lighting prompts and grading steps to each variant.
Why it works: Copying a proven toolchain and exact prompt sequence reduces variance and accelerates scale of cinema-grade outputs.
What it is: A set of acceptance criteria, version tags, and rollback rules to keep output stable across iterations.
When to use: Before handing assets to editing or delivery; after each major render pass.
How to apply: Run the acceptance checklist, tag the asset version, store prompts and seeds with the render, and require sign-off for major changes.
Why it works: Structured gates and version control prevent accidental drift and make reproduction deterministic.
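A minimal sketch of the metadata capture this gate implies, assuming a JSON-lines log per render. The `record_render` helper and its field names are hypothetical, not part of the playbook:

```python
import datetime
import hashlib
import json

def record_render(prompt: str, seed: int, toolchain: str, version_tag: str,
                  out_path: str = "render_log.jsonl"):
    """Append prompt, seed, and toolchain metadata so any render is reproducible."""
    entry = {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "version": version_tag,
        "toolchain": toolchain,
        "seed": seed,
        "prompt": prompt,
        # Short hash makes it easy to spot when a prompt silently drifted.
        "prompt_hash": hashlib.sha256(prompt.encode()).hexdigest()[:12],
    }
    with open(out_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

entry = record_render("locked base prompt v3", seed=42,
                      toolchain="base+animate+grade", version_tag="v1.2")
```

Storing the seed and prompt hash alongside the version tag is what makes reproduction deterministic: rerunning the same seed and prompt through the same toolchain should yield the same asset.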
Follow this step-by-step operational sequence to integrate the masterclass into an execution pipeline. Each step is actionable and designed for small teams to scale AI video production without disrupting existing workflows.
These mistakes reflect real trade-offs between speed, quality, and reproducibility; each entry includes a concrete fix.
Positioning: practical playbook built to help production teams and independent editors adopt a repeatable AI video pipeline that reduces turnaround and preserves brand quality.
Turn the playbook into a living operating system by integrating it into tooling, cadences, and automation.
This playbook was authored by Michael Perdomo and sits inside a curated marketplace of operational playbooks for creative teams. It is categorized under AI and is designed to be integrated into existing creative operations without marketing spin. See the full playbook reference and implementation details at the internal link: https://playbooks.rohansingh.io/playbook/ai-video-prompt-masterclass
The content is neutral and operational: use the templates, record your toolchain choices, and copy working patterns to scale outputs reliably.
It is a hands-on playbook that bundles exact prompts, workflow steps, templates, and checklists to produce cinema-grade AI video. The package includes base-image prompts, motion cue libraries, lighting and texture templates, and operational frameworks so teams can reproduce and scale outputs without excessive trial-and-error.
Start with a 2-hour kickoff to select reference assets, assign owners, and generate base images using the provided templates. Lock lighting and prompt versions, run the animation pass, apply the QA gate, and store prompts and seeds in your repo. Iterate using the three-try rule to limit exploratory renders.
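The three-try rule can be sketched as a simple loop; `render_fn` and `passes_qa` below are stand-ins for your actual toolchain call and QA checklist, not real APIs:

```python
def render_with_three_try_rule(render_fn, passes_qa, max_tries=3):
    """Attempt at most max_tries exploratory renders; stop at the first QA pass."""
    for attempt in range(1, max_tries + 1):
        asset = render_fn(attempt)
        if passes_qa(asset):
            return asset, attempt
    # All tries exhausted: escalate (revisit the base image or prompts)
    # instead of burning more renders.
    return None, max_tries

# Stub toolchain for illustration: the second attempt "passes" QA.
asset, tries = render_with_three_try_rule(
    render_fn=lambda n: {"id": n},
    passes_qa=lambda a: a["id"] >= 2,
)
```

Capping exploratory renders at three forces a decision point: either the locked base image is sound and a small prompt tweak fixes the issue, or the baseline itself needs rework.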
The playbook ships ready-made with exact prompts and step sequences and requires only minimal integration: capture prompt metadata at render time, add the templates to your PM system, and run the initial workshop. It's plug-and-play for teams willing to follow the operational checklist.
Unlike generic templates, this system pairs exact prompt strings with toolchain patterns, acceptance gates, and versioning rules. It prescribes a reproducible pipeline (base-image, animation, lighting lock, QA) so outputs are deterministic and suitable for brand-level delivery.
Ownership typically sits with a production lead or creative operations manager responsible for asset quality and delivery cadence. Technical leads should control toolchain versions and prompt repositories, while editors handle applied prompts and final QA sign-off.
Measure turnaround time, render iterations per asset, and acceptance rate after first QA pass. A practical metric set: hours saved per asset (target ~5 hours), percentage of assets passing QA on first submission, and number of reproducible variants produced per master.
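These metrics are straightforward to compute from a per-asset log. The dictionary keys below are illustrative, not a prescribed schema:

```python
def playbook_metrics(assets):
    """assets: list of dicts with hours_saved (float), first_pass_qa (bool),
    and variants (int) per master asset."""
    n = len(assets)
    return {
        "avg_hours_saved": sum(a["hours_saved"] for a in assets) / n,
        "first_pass_qa_rate": sum(a["first_pass_qa"] for a in assets) / n,
        "variants_per_master": sum(a["variants"] for a in assets) / n,
    }

metrics = playbook_metrics([
    {"hours_saved": 5.0, "first_pass_qa": True, "variants": 3},
    {"hours_saved": 4.0, "first_pass_qa": False, "variants": 2},
])
```

Tracking first-pass QA rate over time is the most direct signal that the prompt library and acceptance gates are actually reducing render iterations.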
Discover closely related categories: AI, Content Creation, Marketing, Education and Coaching, No Code And Automation
Most relevant industries for this topic: Artificial Intelligence, Media, Advertising, Education, Software
Explore strongly related topics: Prompts, AI Tools, AI Workflows, LLMs, ChatGPT, No-Code AI, Content Marketing, Growth Marketing
Common tools for execution: Runway, OpenAI, Midjourney, Descript, Loom, ElevenLabs