
NO.

02.

Meta

YEAR
2025—26
ROLE
Product Launch Content
CLIENT
Meta
OUTPUT
120 assets
TOOLS
Meta AI

After Meta licensed Midjourney's image model, I joined as a creative partner on the launch of their AI platform, Meta AI. I produced ~120 videos in closed alpha that became the platform's first showcase content.

PRODUCT LAUNCH CONTENT · 2025—26
120 AI videos produced
30 Grumbleverse episodes
5+ Modalities stacked per video
1 Public launch · day-0 content
01

The brief

FIG. B · 01

Meta licensed Midjourney's image model and started building toward something bigger: a Meta AI platform with native generative tools that needed to launch with content already living on it. Empty platforms and AI demos don't onboard users.

Meta needed an expert AI creator who already understood the underlying technology well enough to push the platform's full capability set in production and show the audience what's possible when AI generation moves past the gimmick stage.

The mission: test the platform inside its closed alpha, find the use cases that held up under real production, and produce AI video content that gave new users something worth landing on at launch.

02

The approach

FIG. A · 02

I came in as an early Midjourney expert. Meta had just licensed the model and wanted an advisor who knew the Midjourney platform deeply enough to push it past surface-level usage. From there, the work expanded into a creative partner role on the Meta AI platform itself, where the bet shifted from advising on the underlying tech to making content that proved what the new platform could actually do.

The discipline came down to one rule: never lean on a single capability when you could stack them. So each video pulled AI video generation, lip sync, audio, and whatever other modalities the platform supported, all layered together into something that read as finished work. The goal was videos other users would want to remix, post, or build from.

The bet was that the first wave of content on a generative platform shapes how every user after it shows up. Set the bar high in the closed alpha, and the public launch inherits the standard.

03

The work

FIG. W · 03

I produced ~120 AI-generated videos for the Meta AI platform during its closed alpha, and many of those videos went live as part of the first wave of content when the platform opened up to users.

The platform was new and the constraints were real, so every video had to work twice: once as a creative piece, and once as a stress test of what the toolset could pull off.

The biggest swing in that body of work was a series I called The Grumbleverse, set in a made-up place I named the Miserable Zoo, where the animals spend their days complaining about everything. Each video built out a different corner of that world: the animals grumble about their enclosures, the food, the humans who keep showing up to stare at them.

Then they escape, and the series takes a hard left turn into chaos. Zoo employees get interviewed in news-style segments, reporters cover the breakout from the streets, and the production starts treating a fictional zoo break as if it's a real breaking-news event. The bit holds because the production values catch up to the joke.

The craft was in the stacking. I'd pull animal scenes from one tool, interview footage from another, news graphics on top, audio design and lip sync layered through, and each video ended up assembling four or five of the platform's capabilities into something that watched like a show instead of a clip reel.

The series proved a few things at once. AI video can carry characters across episodes, hold a narrative thread, and land jokes that need a setup and a payoff. The technology has a personality if you give it one.

The work started as an advisor role on Midjourney's image model after Meta licensed it, then opened up into a creative partner role on the Meta AI platform. The two pieces fed each other. The Midjourney depth shaped how I prompted, and the platform partnership turned that prompting into content that landed at the top of new users' feeds.

What mattered most was what those videos did at launch. They proved the platform was creatively serious. They gave new users a bar to push toward. And they seeded the kind of inspiration that turns first-time visitors into people who post.