Behind the Scenes
How this is actually getting made. The tools, the process, and what I'm learning along the way.
The Storyboards
145+ panels across 31 scenes, most of them AI-generated. This is how I'm planning the whole movie before committing to full renders.
View All Storyboards
The Idea
I wrote this screenplay in 2018. It sat in a drawer because making an animated film by yourself is basically impossible. Then AI tools got interesting enough that I figured: what if I actually tried?
I'm not here to sell you on AI filmmaking. Honestly, I don't know if this will work. But I'm going to find out in the open, because that's more fun than failing quietly.
The Pipeline
Here's roughly how a scene gets made:
The Script
Already exists. I wrote it in 2018, and AI didn't touch the story. That part is all mine.
Breaking It Down
I turn each scene into shot descriptions: what the camera sees, where characters are, what happens. This bridges "words on a page" and "actual images."
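To make that concrete, here's a rough sketch of what one of those shot descriptions looks like as data. The fields and the example values are illustrative, not my canonical format:

```python
# A rough sketch of a shot description. Field names and values are
# illustrative, not the exact format used in the project.
from dataclasses import dataclass

@dataclass
class Shot:
    scene: int              # which scene in the screenplay
    shot: int               # shot number within the scene
    camera: str             # what the camera is doing
    characters: list[str]   # who is on screen
    blocking: str           # where they are and what happens
    notes: str = ""         # anything that doesn't fit above

example = Shot(
    scene=12,
    shot=3,
    camera="medium close-up, slightly low angle, slow push-in",
    characters=["Mia"],
    blocking="Mia stands at the counter, turns toward the window",
    notes="quiet beat before the phone rings",
)
```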
AI Scene Setup
Claude + Blender generate initial scenes from my descriptions. It's like having a very fast intern who sometimes puts the lamp inside the wall.
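For a sense of what that looks like, here's a minimal sketch of the kind of Blender Python (bpy) code that comes back for a simple setup. The object names, positions, and light values are placeholders, not output from an actual scene:

```python
# Minimal sketch of an AI-written scene-setup script.
# Run inside Blender (needs the bpy module); values are placeholders.
import bpy
import math

def build_basic_set():
    # floor
    bpy.ops.mesh.primitive_plane_add(size=10, location=(0, 0, 0))
    bpy.context.object.name = "Floor"

    # a stand-in lamp (hopefully not inside a wall this time)
    bpy.ops.object.light_add(type='AREA', location=(2, -1, 2.5))
    bpy.context.object.data.energy = 300

    # camera: medium shot, roughly eye level, slightly tilted
    bpy.ops.object.camera_add(location=(0, -6, 1.4),
                              rotation=(math.radians(85), 0, 0))
    bpy.context.scene.camera = bpy.context.object

build_basic_set()
```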
Fix Everything
I review the output. Something is always wrong. Fix it manually, or describe the problem and let AI try again. Usually both. Repeat until it stops being terrible.
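Part of the reviewing is automated: cheap sanity checks that flag the obvious failures before I even open the file. A simplified example of the kind of check I mean, where the bounds are made up for illustration:

```python
# Simplified sanity check run after a generated setup.
# SET_BOUNDS is invented for this example; real checks vary per scene.
import bpy

SET_BOUNDS = {"x": (-5, 5), "y": (-5, 5), "z": (0, 4)}  # rough box the set lives in

def objects_outside_bounds():
    """Return names of objects whose origins fall outside the set."""
    bad = []
    for obj in bpy.context.scene.objects:
        x, y, z = obj.location
        if not (SET_BOUNDS["x"][0] <= x <= SET_BOUNDS["x"][1]
                and SET_BOUNDS["y"][0] <= y <= SET_BOUNDS["y"][1]
                and SET_BOUNDS["z"][0] <= z <= SET_BOUNDS["z"][1]):
            bad.append(obj.name)
    return bad

problems = objects_outside_bounds()
if problems:
    print("Check these before rendering:", ", ".join(problems))
```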
Make It Actually Good
The stuff that makes it feel like a real movie — timing, expressions, little details — still needs human hands. AI gets you 80% there fast, but the last 20% is where the magic lives.
What AI Actually Does
AI Is Good At:
- Setting up scenes from text descriptions
- Writing Python code for Blender
- Generating variations quickly
- Tedious technical stuff I don't want to do
- Being available at 2am
- Not getting annoyed when I change my mind
AI Is Not Good At:
- Knowing what's actually funny
- Emotional timing
- Understanding why something feels "off"
- Making creative judgment calls
- Telling me when an idea sucks
- That thing that makes animation feel alive
The Stack
Blender
Free and open source, with a Python API you can script the hell out of.
Claude
The AI doing the heavy lifting. Writes code, generates content, runs the pipeline. I use it through Claude Code for development.
Python
The glue. Automation scripts, Blender integration, validation: all Python. There's a sketch of a typical driver script just after this list.
Git
Everything is versioned. Scripts, screenplay, assets. When I break something, I can go back.
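To show how those pieces fit together, here's roughly what the glue layer looks like: a small Python driver that runs Blender headless over each generated scene script. The paths and file naming are placeholders; --background and --python are real Blender command-line flags:

```python
# Sketch of a pipeline driver: run Blender headless over generated scene scripts.
# Directory layout and file names are placeholders for this example.
import subprocess
from pathlib import Path

BLENDER = "blender"  # assumes Blender is on PATH

def run_scene_script(script: Path) -> None:
    # --background: no UI; --python: execute the generated setup script
    subprocess.run([BLENDER, "--background", "--python", str(script)], check=True)

for script in sorted(Path("scenes").glob("scene_*.py")):
    run_scene_script(script)
```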
The Hard Parts
If this were easy, everyone would be doing it. Here's what I'm struggling with:
Describing What I Want
"Make it feel more ominous" isn't something a computer can act on. I spend a lot of time figuring out how to translate artistic instinct into words AI understands.
Waiting
Everything takes longer than you'd think. Generate, wait. Tweak, wait. Render, wait. The automation helps, but the feedback loop is still slow.
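Most of what the automation buys me here is cheaper iterations: a quick preview pass at a fraction of the resolution before committing to a full render. A sketch, with settings that vary by scene:

```python
# Cheap preview render to shorten the feedback loop.
# Output path and percentage are examples, not fixed pipeline settings.
import bpy

def render_preview(filepath="/tmp/preview.png"):
    scene = bpy.context.scene
    scene.render.resolution_percentage = 25   # quarter-size frame
    scene.render.filepath = filepath
    bpy.ops.render.render(write_still=True)

render_preview()
```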
Consistency
Mia in Scene 1 needs to look like Mia in Scene 47. With AI generation, everything wants to drift a little. Asset management is weirdly important.
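My current answer is boring asset discipline: one master file per character, linked into scenes rather than copied, so there's a single source of truth for what Mia looks like. A sketch of the linking step, with paths and collection names invented for this example:

```python
# Link a character collection from a master library file instead of copying it.
# The library path and collection name are invented for this example.
import bpy

def link_character(collection_name="Mia"):
    lib_path = bpy.path.abspath("//assets/characters/mia.blend")
    with bpy.data.libraries.load(lib_path, link=True) as (data_from, data_to):
        data_to.collections = [c for c in data_from.collections if c == collection_name]
    for coll in data_to.collections:
        inst = bpy.data.objects.new(coll.name, None)   # empty that carries the instance
        inst.instance_type = 'COLLECTION'
        inst.instance_collection = coll
        bpy.context.scene.collection.objects.link(inst)

link_character()
```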
The "Almost Right" Problem
AI output often looks fine at first glance but weird when you actually watch it. Something about the motion, the expressions, the timing. Takes human eyes to catch it.
Follow Along
I write about the process on the blog. For technical details, there's the tech section.
Questions? Thoughts? I'm around.