About This Project

How I'm trying to make a real movie using AI — and whether it'll actually work

Why I'm Doing This

I wrote the screenplay for Fairy Dinosaur Date Night back in 2018. I liked it. Other people who read it liked it. And then it sat in a drawer, because actually making an animated film is impossibly expensive and time-consuming for one person.

Then AI tools started getting genuinely useful — not hype-useful, but actually-helps-you-do-things useful. So I figured: what if I tried to make this thing? Not as a demo or a proof of concept, but as an actual movie people might want to watch?

I don't know if it'll work. But I'm going to find out in the open, because that's more interesting than quietly failing.

How It Actually Works

Here's the rough pipeline. It's still evolving as I figure out what's possible.

1. The Screenplay

This already exists — I wrote it in 2018. AI didn't help with the story. That part is just... a script I wrote.

2. Breaking It Down

I turn each scene into shot descriptions: what the camera sees, where characters are, what happens. This is the bridge between "screenplay" and "actual images."
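
To make this concrete, here's roughly what one shot entry looks like as structured data. This is a sketch, not my exact format, and the names and fields are invented:

    from dataclasses import dataclass, field

    @dataclass
    class Shot:
        """One shot from a scene breakdown: what the camera sees and who's in frame."""
        scene: int
        shot: int
        camera: str                   # e.g. "wide, slow push-in"
        characters: list[str] = field(default_factory=list)
        action: str = ""              # what happens, in plain language
        notes: str = ""               # mood, lighting, anything the setup step should know

    # An invented example entry:
    opening = Shot(
        scene=1, shot=3,
        camera="wide, slow push-in",
        characters=["Mia", "Rex"],
        action="Mia drags the table into frame; Rex watches from the doorway.",
        notes="warm evening light, slightly ominous",
    )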

3. AI Does The Heavy Lifting

Claude + Blender generate initial scene setups from my descriptions. It's like having a very fast assistant who sometimes puts the couch on the ceiling.
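
To give a flavor of what comes out of this step, here's a minimal sketch of the kind of Blender Python script the AI writes. The objects, names, and positions are invented; real setups place actual assets:

    import bpy

    # Start from an empty scene
    bpy.ops.wm.read_homefile(use_empty=True)
    scene = bpy.context.scene

    # Rough set dressing: a floor and a stand-in couch (on the floor, ideally)
    bpy.ops.mesh.primitive_plane_add(size=10, location=(0, 0, 0))
    bpy.ops.mesh.primitive_cube_add(location=(1.5, 0, 0.5))
    bpy.context.active_object.name = "Couch_placeholder"

    # Camera, framed for a wide shot
    cam_data = bpy.data.cameras.new("ShotCam")
    cam = bpy.data.objects.new("ShotCam", cam_data)
    cam.location = (6, -6, 2.2)
    cam.rotation_euler = (1.25, 0, 0.8)
    scene.collection.objects.link(cam)
    scene.camera = cam

    # A key light so the render isn't black
    light_data = bpy.data.lights.new("Key", type='SUN')
    light = bpy.data.objects.new("Key", light_data)
    light.rotation_euler = (0.8, 0.2, 0.5)
    scene.collection.objects.link(light)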

4. Fix What's Wrong

I review the output. Something is always wrong. I either fix it manually or describe the problem and let AI try again. Usually both.

5. Polish

The stuff that makes it feel like a real movie — timing, expressions, the little details — still needs a human. AI can get you 80% there fast, but that last 20% is everything.

What AI Actually Does Here

AI Is Good At

  • Setting up 3D scenes from text descriptions
  • Writing Python code for Blender automation
  • Generating variations fast (see the sketch after this list)
  • Tedious technical stuff I don't want to do
  • Being available at 2am when I'm in the zone
  • Not complaining when I change my mind (again)
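
Here's the variations point in practice: a sketch that re-renders the same shot from several candidate angles, so I can pick from a contact sheet instead of guessing. The framings are invented for illustration:

    import bpy

    scene = bpy.context.scene
    cam = scene.camera  # assumes the scene already has an active camera

    # Invented candidate framings; in practice these come from the shot notes
    angles = {
        "wide": ((8, -8, 3.0), (1.2, 0, 0.8)),
        "low":  ((4, -5, 0.8), (1.5, 0, 0.7)),
        "high": ((5, -5, 6.0), (0.9, 0, 0.8)),
    }

    for name, (loc, rot) in angles.items():
        cam.location = loc
        cam.rotation_euler = rot
        scene.render.filepath = f"//variations/shot_{name}.png"
        bpy.ops.render.render(write_still=True)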

AI Is Not Good At

  • Knowing what's actually funny
  • Emotional timing
  • Understanding why a shot feels "off"
  • Making creative judgment calls
  • Telling me when something sucks
  • That ineffable thing that makes animation feel alive

The Stack

Blender

Free, open-source, and has a Python API you can script the hell out of

Claude

The AI doing most of the heavy lifting. Writes code, generates content, keeps the pipeline running

Python

Connects everything together. Blender scripts, automation, validation — all Python (there's a sketch of the validation idea below)

Git

Everything is versioned. Scripts, screenplay, assets. If I break something, I can go back
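
To show what the validation piece means, here's a sketch of the kind of cheap sanity check that can run after every generated setup, before any render time gets spent. The checks and thresholds are invented:

    import bpy

    def validate_scene():
        """Collect obvious problems in the current scene."""
        problems = []
        scene = bpy.context.scene

        if scene.camera is None:
            problems.append("no active camera")

        for obj in scene.objects:
            if obj.type == 'MESH':
                if not obj.data.materials:
                    problems.append(f"{obj.name}: no material")
                if obj.location.z > 5.0:  # crude couch-on-the-ceiling detector
                    problems.append(f"{obj.name}: suspiciously high (z={obj.location.z:.1f})")
        return problems

    for issue in validate_scene():
        print("WARN:", issue)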

The Hard Parts

If this were easy, everyone would be doing it. Here's what I'm actually struggling with:

Describing What I Want

Turns out "make it feel more ominous" isn't something a computer can work with. I spend a lot of time figuring out how to translate artistic instinct into words AI can act on. It's like being a translator between two languages I'm not fluent in.

Waiting For Renders

Everything takes longer than you'd think. Generate a scene, wait. Tweak it, wait. The automation helps, but the feedback loop is still slower than I'd like.
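
One concrete way to claw back time, sketched here assuming Cycles: drop samples and resolution for review renders, then restore full quality for finals. The numbers are invented:

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    def preview_settings():
        scene.cycles.samples = 32                # noisy but fast
        scene.render.resolution_percentage = 50  # half-size frames

    def final_settings():
        scene.cycles.samples = 512
        scene.render.resolution_percentage = 100

    preview_settings()
    scene.render.filepath = "//preview/shot_042.png"
    bpy.ops.render.render(write_still=True)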

Keeping Characters Consistent

Mia in Scene 1 needs to look like Mia in Scene 47. This is genuinely hard with AI generation, where everything wants to drift a little. Asset management has become weirdly important.
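
The approach that helps most, sketched below with an invented file path: keep one canonical Mia in a library .blend file and link (not append) her into every scene, so edits to the library propagate everywhere and Scene 47 can't quietly diverge:

    import bpy

    MIA_LIB = "//assets/characters/mia.blend"  # hypothetical path

    # Link, don't append: every scene references the same source of truth
    with bpy.data.libraries.load(MIA_LIB, link=True) as (data_from, data_to):
        data_to.collections = [c for c in data_from.collections if c == "Mia"]

    for coll in data_to.collections:
        if coll is not None:
            # Instance the linked collection into the current scene
            inst = bpy.data.objects.new("Mia_instance", None)
            inst.instance_type = 'COLLECTION'
            inst.instance_collection = coll
            bpy.context.scene.collection.objects.link(inst)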

The "Almost Right" Problem

AI output often looks fine at first glance but weird when you actually watch it. Something about the motion, the expressions, the timing. It takes human eyes to catch it, and human effort to fix it.

Want to Follow Along?

I write about the process on the blog — the wins, the failures, the stuff I'm figuring out. The production page has actual work-in-progress renders and audio.

If you have questions or just want to tell me this is never going to work, I'm around.