
About eighteen months ago, I spoke with the head of creative production at a mid-sized DTC brand who described her team's situation in a way that stuck with me. They had committed to a serious CTV program: meaningful budget, high ambitions, and genuine executive buy-in. The media strategy was solid. The targeting was thoughtful. The problem was that their creative production process was built for the old model: brief the agency, wait three weeks, receive finished spots, run them until performance drops, brief the agency again.
That cycle worked fine when you were running one or two creative versions across a national campaign. It completely fell apart when CTV performance data started telling them they needed eight or ten variations: different messages for different audience segments, different lengths for different placements, regional adaptations, and A/B tests running simultaneously. The agency model couldn't keep up with what the media strategy actually required. Something had to change.
What they landed on, and what a growing number of CTV teams are landing on, is rebuilding creative production around automation tools that can generate and iterate at the speed the channel demands. Tools like Starti’s AI Studio are built specifically for this, automating CTV ad creation, generating video creatives at scale, and optimizing performance across global streaming audiences without the production bottlenecks that kill campaign agility. But the technology is only part of the story. The bigger shift is in how teams are thinking about creative work itself.
The Old Production Model and Why It Breaks Down in CTV
Traditional TV ad production is a waterfall process. You start with a brief and move through concept development, scripting, casting, shooting, editing, finishing, and delivery. Each stage gates the next. The timeline from brief to finished spot is measured in weeks at minimum, often months for anything ambitious. And once the spot is finished, it’s fixed; changing anything meaningful requires going back through significant portions of the process.
This model made sense when you were producing a handful of spots per year for broadcast television. The investment per spot was large, the production quality was high, and the distribution context was relatively uniform: your ad would run in a fixed set of broadcast placements, and the same version would reach everyone.
CTV breaks every one of those assumptions. The data-driven nature of programmatic streaming means you quickly discover that different audience segments respond to different messages. The targeting capabilities mean you can serve different creatives to different households watching the same show. The performance feedback loop means you know within days which variations are working and which aren’t. All of that is only useful if you can actually produce and iterate on creative ideas fast enough to act on the data. A production process measured in months simply can’t keep pace with optimization cycles measured in days.
What Video Automation Actually Changes About the Workflow
The shift that video automation enables isn’t just faster production; it’s a fundamentally different relationship between creative and data. Instead of producing creative based on assumptions about what will work, testing it, and then waiting weeks to produce something different based on what you learned, automated workflows let you build variation into the creative from day one and iterate continuously based on live performance data.
In practice, the workflow looks something like this. A creative team produces a master template, the core visual and narrative structure of the ad, with defined variable slots for elements that can be swapped. Headline copy. Product imagery. Promotional offer. Call to action. Background visual. Voiceover tone. The template itself is produced with the same care and quality as a traditional spot. But instead of being a finished piece, it's a production framework.
The automation system then generates variations by filling those variable slots from an asset library, combining them according to audience targeting rules and testing logic. Twenty variations can be generated from one template in the time it would previously have taken to produce one finished spot. The variations go live simultaneously. Performance data comes back. The system, or the creative team, depending on how much human oversight you want in the loop, identifies what’s working, and generates the next round of variations that build on those insights.
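To make the mechanics concrete, here's a minimal Python sketch of that slot-filling step. The template name, slot names, and assets are invented for illustration; a real system would pull these from an asset management platform and apply targeting rules on top.

```python
from itertools import product

# Hypothetical asset library: each variable slot maps to candidate assets.
asset_library = {
    "headline": ["Free shipping on every order", "Rated 4.8 by 12,000 customers"],
    "cta": ["Shop now", "Get 20% off today"],
    "voiceover": ["warm", "energetic"],
}

def generate_variations(template_id, library):
    """Expand one master template into every combination of slot assets."""
    slots = sorted(library)
    variations = []
    for combo in product(*(library[s] for s in slots)):
        variations.append({"template": template_id, **dict(zip(slots, combo))})
    return variations

variations = generate_variations("summer_sale_master", asset_library)
print(len(variations))  # 2 headlines x 2 CTAs x 2 voiceovers = 8 variations
```

The point of the sketch is the combinatorics: a modest library of two or three assets per slot already yields far more finished variations than a traditional pipeline could produce in the same time.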
That feedback loop, running continuously over the life of a campaign, is genuinely different from anything traditional production workflows could support. It’s not just efficiency. It’s a different mode of creative learning.
Building the Asset Library, The Work Nobody Talks About
Here’s the part of video automation that vendors don’t talk about enough in sales presentations. The quality of what the automation system can produce is entirely dependent on the quality and variety of the assets you feed into it. A template with a shallow asset library produces limited, repetitive variation. A template with a deep, well-organized asset library produces genuinely meaningful creative differentiation.
Building that library is real work. For a DTC brand running a national CTV campaign, a proper asset library might include multiple headline options testing different value propositions, five or six product imagery treatments at different angles and contexts, regional background visuals for geographic targeting, multiple voiceover recordings at different tones and pacing, and several CTA variations testing different levels of urgency. Producing all of those elements takes time and creative investment; it’s not something you can skip and still expect the automation to produce great results.
The good news is that this investment is made once and then leveraged continuously. A well-built asset library feeds multiple campaigns over time, not just one. Each campaign generates performance data that informs which new assets are worth producing for the next round. Over time the library gets richer, and the automation gets smarter about how to use it.
Creative Testing in CTV, A Different Mental Model
Most marketers are familiar with A/B testing from email, landing pages, or paid social. The mental model usually tests one variable at a time, reaches statistical significance, declares a winner, and moves on. That model doesn’t quite work in CTV, and trying to apply it directly leads to frustration.
CTV campaigns run across too many audience segments, content environments, and device types for single-variable testing to produce clean results at a reasonable pace. The interaction effects between targeting and creative are too complex. And the impression volumes, while significant at scale, aren't always high enough within specific audience segments to reach the statistical significance thresholds that traditional A/B testing requires.
The approach that tends to work better is multivariate testing with a focus on directional insights rather than definitive conclusions. Run multiple creative variations simultaneously across your full campaign. Look for consistent patterns: which message angles perform well across multiple audience segments and which visual treatments drive better completion rates across different content environments, rather than trying to find one universal winner. Use those patterns to inform the next generation of creatives rather than waiting for perfect statistical certainty.
This requires a shift in how you think about creative decisions. You’re building probabilistic knowledge about what works, not seeking definitive answers. That’s actually closer to how good creative directors think anyway; they’re pattern-matching across lots of signals, not waiting for a p-value to make a decision.
The Role of Human Creative Judgment in an Automated System
I want to address something directly because I think it’s the question most creative professionals have when they first encounter these tools. Does video automation reduce the need for human creative judgment? The short answer is no, but it changes what that judgment gets applied to.
The things that automation does well, generating variations, assembling assets, managing delivery, and processing performance data, are not the things that require genuine creative thinking. Creative direction, conceptual development, brand voice definition, quality standards, and the intuitive judgment about whether something feels right, those remain entirely human. What changes is that the creative team spends less time on execution and more time on the thinking that actually differentiates good creative from mediocre creative.
Some creative professionals find this liberating. Others find it disorienting because execution has always been part of how they expressed their craft. That’s a real adjustment, and it’s worth acknowledging. Teams that work through that transition and find their footing in the new model, where creative intelligence is applied at a higher level and automation handles the volume, tend to produce better work than both teams still stuck in manual production workflows and teams that have handed everything to the automation without maintaining strong creative oversight.
Practical Steps for Teams Making This Transition
A few things worth doing if your team is moving toward automated CTV creative production. Start by auditing your existing creative assets honestly. What do you have that’s actually modular and reusable? What would need to be rebuilt from scratch as template-compatible elements? That audit gives you a realistic picture of how much production investment the transition requires.
Then run a contained pilot before you try to automate your entire creative program. Pick one campaign, build a proper template and asset library for it, run the automation, and evaluate both the process and the output quality before you commit to the full workflow change. The pilot will surface workflow issues, technology gaps, and creative process questions that are much better to solve at a small scale than after you’ve already made a broad organizational commitment.
Set clear quality standards before you start generating variations. What does a variation need to clear in terms of brand consistency, production quality, and message clarity before it goes live? Having those standards defined upfront prevents the automated system from producing a volume of mediocre content that technically works but doesn’t actually represent your brand well.
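Those standards are easiest to enforce when they're written down as checks the system runs before anything goes live. Here's a deliberately simple Python sketch; the length limit, banned phrases, and variation fields are all invented examples of the kinds of rules a team might codify.

```python
# Hypothetical brand-voice rules a team might define upfront.
BANNED_PHRASES = {"best ever", "guaranteed results"}
MAX_HEADLINE_LENGTH = 60  # example limit for on-screen overlay legibility

def passes_quality_gate(variation):
    """Return (ok, reasons): lightweight pre-flight checks, not a full review."""
    reasons = []
    headline = variation.get("headline", "")
    if len(headline) > MAX_HEADLINE_LENGTH:
        reasons.append("headline too long for overlay")
    if any(p in headline.lower() for p in BANNED_PHRASES):
        reasons.append("headline violates brand voice rules")
    if not variation.get("cta"):
        reasons.append("missing call to action")
    return (not reasons, reasons)

ok, why = passes_quality_gate(
    {"headline": "Guaranteed results in 7 days", "cta": "Shop now"}
)
print(ok, why)  # False ['headline violates brand voice rules']
```

Automated checks like these catch the mechanical failures; human review still owns the judgment call of whether a variation actually feels on-brand.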
What the Best CTV Creative Teams Look Like Now
The teams doing CTV creative production best in 2026 are smaller than you might expect, faster than you might believe is possible, and more data-fluent than most creative teams were five years ago. They’ve stopped thinking of creative and media as separate functions that hand off to each other and started treating them as one integrated process where data informs creative direction and creative variation enables data learning.
They’re also genuinely comfortable with imperfection in ways that traditional creative teams weren’t. Not every variation they generate is brilliant. Some are clearly better than others, and the data makes that visible quickly. The goal isn’t to produce perfect creative; it’s to produce enough variation that the best-performing versions can be identified and built on continuously. That’s a different relationship with the work, and it’s one that produces better campaign results over time even if no individual piece of creative would win an award. For more on how viewers engage with streaming content on their connected TV devices, check out streaming device tips and tricks, useful context for understanding the environment your creative is actually landing in.