How I Created an AI Sci-Fi Film: Dear Mom
- S B
- Aug 29
- 6 min read
Updated: Aug 30

I love films. I love experimenting with new tools.
When Runway announced their Gen:48 Aleph AI film challenge, I decided to push myself in ways I had not before.
Dear Mom became my entry: a film that takes on the challenge of telling an emotional sci-fi story generated entirely with AI. In addition to making a film within a specific genre, I also decided that I wanted my characters to speak and that I would not use a musical soundtrack to guide the emotional pacing of the film. For someone with no formal film or art training, this was a formidable task, but I wanted to see how far I could go with my prompts and my coffee.
Many people still see AI as spectacle. I would argue that we have reached the point where AI outputs should be evaluated for quality and utility. This was my thinking when I entered Gen:48 Aleph. I wanted to build an AI film that was not centered on what AI could do, but on what it could make us feel.
Here is how I built Dear Mom.
The Challenge Brief
Runway's Gen:48 Aleph required participants to create a complete 1–4 minute film in only 48 hours. The brief included three required elements, each selected from a list of choices: a location, a character, and an event. The competition also required that the film integrate scenes created with Aleph, Runway's AI video editing model.
For Dear Mom I selected:
Location: a snowy field
Character: a wanderer
Event: a disappearance
More than 4,000 teams entered the challenge. I chose to compete solo, which meant every element of the production fell to me: story, visuals, editing, and final assembly.
Integrating the Requirements
To satisfy the snowy field requirement, I gave the story's main character, Ava, a memory of playing in the snow. The voiceover itself served as a breadcrumb, a clue to who or what "mom" was, as Ava said, "I wish you could have played in the snow with us."
The wanderer requirement was met through Ava's journey as she wandered and searched through her world in hopes of finding her mom.
The disappearance requirement came through another memory in which Ava returned home to find her mom gone. The voiceover made it clear that this was not a one-time event but a repeated pattern throughout her upbringing.
Creative Intent
Throughout my AI film work I have inadvertently developed a visual language. My films often center on women and children navigating themes of memory, identity, and purpose. These stories are told through rich imagery and carefully chosen color palettes.
In Dear Mom I used the colors pink and white for these purposes, tying them to memory and identity. The memories themselves were stylized for added effect: while Ava appeared in full color, the world around her was rendered in black and white.
The role of white was twofold. It established a futuristic aesthetic while allowing pink to stand out as the film's throughline. Attentive viewers will notice that pink is not just the color of Ava's clothes. It is central to her very existence.
I also took a creative risk in this film by showing a pregnant AI character. It is not something I have seen often in AI-generated imagery. My goal, however, was to tell a story grounded in humanity, not to produce an AI tech demo.

Day 1: Story Construction
The first day was focused on building the story. I developed the central idea: a woman writing a letter to her mother while experiencing flashbacks, before setting out to search for her. The plot builds to a twist ending: when she finally finds her, the revelation is far from what she expected.
I've always been a fan of The Twilight Zone and Black Mirror, both of which blend the uncanny with the routines of everyday life. Inspired by that tradition, I wanted this film to close with a bigger message about the meaning of "finding her mom." Ava found her "mom," but the truth carried a deeper, unsettling surprise.
By the end of Day 1, I had a clear narrative foundation.
Day 2: Production Sprint
The second day was a full sprint. I moved from idea to finished film in under 24 hours.
Image Generation: All stills were created in Google Whisk, refined through prompt iteration until they aligned with the story.
Animation: Using Runway Gen-4, I transformed the stills into moving scenes with cinematic pacing (a scripted sketch of this step appears after this list).
Performance Capture for Speaking: I used Runway Act Two to map a recorded performance of myself onto my characters.
To do this, I recorded a video of myself acting out the scenes and mapped that footage onto the AI-generated images of my character, which transferred my gestures, facial expressions, and voice to the characters. I then replaced my voice with an AI-generated voice that better fit the story.
Editing with Prompts: For select moments, I turned to Runway Aleph, which allowed me to make adjustments with prompts rather than regenerating entire clips. I used it to make the travel scenes dynamic by keeping the characters fixed while changing the world around them.
Assembly: The film was finalized in DaVinci Resolve, where I refined timing and color.
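For readers curious about scripting the animation step rather than working in Runway's web interface as I did, Gen-4 is also exposed through Runway's developer API. The sketch below is purely illustrative and not part of my actual workflow; it assumes the official runwayml Python SDK, a hypothetical hosted still, and model and ratio values that should be checked against the current API documentation.

```python
# Illustrative sketch only (not my actual workflow): submitting a Gen-4
# image-to-video task through Runway's developer API.
# pip install runwayml
import time

from runwayml import RunwayML

# The client reads the RUNWAYML_API_SECRET environment variable by default.
client = RunwayML()

task = client.image_to_video.create(
    model="gen4_turbo",  # assumed model id; confirm against current API docs
    prompt_image="https://example.com/ava_still.png",  # hypothetical hosted still
    prompt_text="Slow push-in on a woman in pink walking through a snowy field, cinematic lighting",
    ratio="1280:720",  # assumed aspect ratio; confirm supported values
)

# Generation is asynchronous, so poll the task until it finishes.
while True:
    result = client.tasks.retrieve(task.id)
    if result.status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(10)

if result.status == "SUCCEEDED":
    print("Generated clip URL(s):", result.output)
else:
    print("Task failed")
```

The web tools wrap the same generation step; the difference is simply whether you iterate in a browser or in a script.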
Tools I Used
AI Content Generation
Image Generation: Google Whisk (Model: Imagen 4)
Animation: Runway Gen-4
Prompt-based Video Editing: Runway Aleph
Character Speech Animation: Runway Act Two
Assembly
Video Editing and Color: DaVinci Resolve
Sound Design: Epidemic Sound
Voice Over Narration: ElevenLabs
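As with the animation step, the narration can also be scripted instead of generated in the ElevenLabs web app, which is where I worked. A minimal sketch, assuming the official elevenlabs Python SDK and placeholder API key and voice ID:

```python
# Illustrative sketch only: generating one voiceover line with the ElevenLabs SDK.
# pip install elevenlabs
from elevenlabs import save
from elevenlabs.client import ElevenLabs

client = ElevenLabs(api_key="YOUR_API_KEY")  # placeholder key

# Voice and model IDs are placeholders; choose a voice from your own library.
audio = client.text_to_speech.convert(
    voice_id="YOUR_VOICE_ID",
    model_id="eleven_multilingual_v2",
    text="Dear Mom, I wish you could have played in the snow with us.",
)

save(audio, "dear_mom_voiceover.mp3")
```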
Challenges
Time Pressure: 48 hours left almost no margin for error
Character Consistency: Maintaining faces and expressions required repeated regenerations
Dialogue on Camera: This was my first attempt at characters speaking directly to the viewer, which pushed me into new territory
Post-Production with Aleph: It took many iterations to get the desired results
Final Output
The result was my first genre-specific AI film. It wasn't an easy project. There were many moments of frustration when I asked myself why I chose this path, or why I didn't just make something visually pretty with lots of special effects.
But then my little one brought me a note that said "Mom, keep going".
Runtime: 2 minutes 32 seconds
Format: 1080p
Submitted to: Runway Gen:48 Aleph (2025)
Looking Ahead: The Future of AI Filmmaking
I submitted my film only five minutes before the deadline. It brought relief, but also a sense of completion. It was a family accomplishment. Making these films isn't just about experimenting with AI or testing new frontiers. For me, it's about preserving culture, heritage, and humanity in a world that is rapidly changing as AI reshapes how we create and interact with content.
That's why I believe I won the contest, as did every single contestant. We won something larger than the prizes: the experience, the opportunity to take part in reshaping how our humanity is reflected back to us through technology. Everyone who participated contributed to this moment of creative and technological evolution.
Dear Mom is more than just a sci-fi short. It represents my growth as an AI filmmaker, and maybe even my crossover into simply calling myself a filmmaker. By experimenting with dialogue, narration, and silence, I found new ways to tell a story that feels futuristic yet deeply personal.
And this is only the beginning. As AI tools evolve, so too will the ways in which we tell stories.
Join the Conversation
"AI is the tool, but the vision is human." — Sophia B.
👉 For weekly insights on navigating our AI-driven world, subscribe to AI & Me.
Let’s Connect
I’m exploring how generative AI is reshaping storytelling, science, and art — especially for those of us outside traditional creative industries.
About the Author
Sophia Banton works at the intersection of AI strategy, communication, and human impact. With a background in bioinformatics, public health, and data science, she brings a grounded, cross-disciplinary perspective to the adoption of emerging technologies.
Beyond technical applications, she explores GenAI’s creative potential through storytelling and short-form video, using experimentation to understand how generative models are reshaping narrative, communication, and visual expression.

