The Future of Ads Is Remixable: How a Generative AI Hackathon Gave Old Ads New Life
October in New York felt more like July, and I almost skipped the hackathon to take a walk along the Hudson River instead. Then I remembered that New York is the ad capital of the world, and probably the capital of media, finance, and creativity too. This was the first Generative AI Advertising Hackathon I had ever heard of, and it might have actually been the first of its kind anywhere. So I went.
I showed up at Betaworks in the Meatpacking District at nine in the morning, grabbed a fancy everything bagel, and spent the morning talking to about a hundred people. Founders, developers, and creative folks filled the space, all trying to figure out how AI could change the way advertising works. I told them about Ambistream, Social TV, and what we’ve been building to make ads and content live together instead of fighting for attention. It was an encouraging room: people saw the need for a product like this, and they were curious about my VJ background, too.
Around noon I met Zeel, a developer and AI researcher from NYU. We teamed up and landed on an idea we called the AI Remix Engine.
Brands already have years of ads and video content sitting in archives. Most of it never gets reused. Creative teams keep making new material because the old media is buried or forgotten. Some of those ads are beautiful, but nobody wants to sit through a two-minute spot anymore. Attention spans have changed. Everything has to be short, quick, and fit naturally in a feed. We wanted to turn old ads into new ones using AI.
We used TwelveLabs video intelligence to scan long ads and pull out the most interesting and emotionally charged scenes. Many AI tools can freeze a frame and describe it with text. This one goes deeper. It recognizes tone, pacing, and narrative flow. It can feel when a scene actually works.
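If you’re curious what that kind of scene search looks like in code, here’s a rough sketch. The base URL, endpoint, field names, and scoring threshold are placeholders for how a video-search API like this typically works, not the exact calls from our build.

```python
# Minimal sketch of a text-to-scene search against a video-understanding API.
# The base URL, endpoint, and response fields are illustrative assumptions,
# not the exact API we used at the hackathon.
import os
import requests

API_KEY = os.environ["VIDEO_API_KEY"]          # hypothetical env var
BASE_URL = "https://api.example-video.ai/v1"   # placeholder base URL

def find_scenes(index_id: str, query: str, min_score: float = 0.8):
    """Return (video_id, start, end) tuples for scenes matching a text query."""
    resp = requests.post(
        f"{BASE_URL}/search",
        headers={"x-api-key": API_KEY},
        json={
            "index_id": index_id,
            "query_text": query,          # e.g. "emotionally charged, cinematic moment"
            "search_options": ["visual", "audio"],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return [
        (hit["video_id"], hit["start"], hit["end"])
        for hit in resp.json().get("data", [])
        if hit.get("score", 0) >= min_score
    ]

if __name__ == "__main__":
    for video_id, start, end in find_scenes("archive-ads", "emotionally charged underwater scene"):
        print(f"{video_id}: {start:.1f}s - {end:.1f}s")
```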
In our demo, it found an octopus sequence in an automotive ad that hit all the right notes. The system matched that moment with ambient content and turned it into a short clip that looked and felt native to a modern feed. We also paired a jeans ad with dance videos where people were already moving the same way, and slotted an AI-generated ad for a luxury purse between a set of artsy fashion videos.
We built our own dashboard that plugged into TwelveLabs. It let us upload videos, set filters, and pull back scenes that matched what we were looking for. You could search by theme, tone, or concept and see clips that fit, complete with timestamps, previews, and short prompts to remix or caption them.
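To give a sense of the remix step, here’s a rough sketch of how a matched scene’s timestamps can be cut into a short, feed-ready vertical clip with ffmpeg. The file names, timestamps, and framing choices are illustrative, not the exact pipeline we ran that weekend.

```python
# Minimal sketch of the "remix" step: trim a matched scene and reframe it
# as a 9:16 vertical clip with ffmpeg. Paths and values are placeholders.
import subprocess

def cut_vertical_clip(src: str, start: float, end: float, out: str) -> None:
    """Trim src between start/end seconds and center-crop to 9:16 for a feed."""
    duration = end - start
    subprocess.run(
        [
            "ffmpeg",
            "-ss", str(start),          # seek to the matched scene
            "-i", src,
            "-t", str(duration),        # keep only the matched span
            "-vf", "crop=ih*9/16:ih,scale=1080:1920",  # center-crop to vertical
            "-c:v", "libx264",
            "-c:a", "aac",
            "-y", out,
        ],
        check=True,
    )

# Example: turn the octopus moment into a 12-second vertical clip
# (hypothetical timestamps).
# cut_vertical_clip("automotive_ad.mp4", 42.0, 54.0, "octopus_remix.mp4")
```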
By Sunday we had a working demo. The dashboard ran live, the pipeline was in place, and we had examples of original content and ads that blended into one continuous experience. We explained that reusing old media and editing ads to flow with surrounding content could create a new way to think about art and advertising. The judges understood right away.
We left tired, wired, and full of new ideas for how to build it further. The next version will add QR overlays, branded graphics, and voice layers. We plan to connect with more creative tools for editing and audio production. For now, the foundation works: AI that can search, remix, and approve.
Since Ambistream’s roots are in the programmatic ad industry, these kinds of integrations have been on our minds for a long time, but they always felt distant or hidden behind big budgets. Working through this hackathon made them real. We were able to connect to adjacent systems, test new ideas, and move faster than we expected.
It also helped us put some of our long-held ideas into the open:
Ads and content should blend together better without killing the vibe.
We can generate ads that fit around content, and content that fits around ads.
Every ad should include a clear logo and call to action.
Programmatic systems can evolve into creative remix engines for short-form media.
We’re grateful for the weekend, and for the chance to show what comes next for Ambistream.