


Tell us about the origins of Runway and how your background in researching and teaching design led to its founding. How is Runway an extension of that thinking and exploration?

My two co-founders and I met in art school at New York University. We were really interested to see how computational creativity and neural techniques (an approach used in machine learning models that mimics, in part, how we learn) could serve filmmakers, artists, and creatives. And so, we started building tools and conducting research for storytellers to interact with this emerging technology. Today, our research has continued to advance, and we have over 30 AI Magic Tools across video, image, 3D, and text, all with the goal of democratizing content creation and making creative tools more accessible. Runway is a full-stack applied AI research company: we invent and build AI models for content creation, and we develop tools for our users to create and edit content through every aspect of the creative process, from preproduction to postproduction. We're allowing anyone, regardless of skill level, to create professional-grade content.

For those of us without a background in machine learning, can you give us some of the fundamentals of generative AI and how it gets applied in your world?

At a general level, generative AI and AI systems are complex mathematical algorithms used to understand data or generate new data points. But for our users, they are simple tools that make videomaking workflows effortless. That's a huge part of why tools like ours have taken off: they fit right into an existing workflow with a familiar look and feel. Of course, underneath those interfaces, there are incredibly complex algorithms working together to produce the results you see. More broadly, generative AI tools work by learning patterns in existing data and then synthesizing new data that didn't exist before. To generate new things, like images, video, or text, you can use different input mechanisms. For example, natural language has become one of the easiest ways to control image generation algorithms, but other input systems, such as video, are also possible. This is how the first iteration of our video generation model, Gen-1, works.
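
As a concrete illustration of natural language as an input mechanism, the sketch below prompts an open-source text-to-image diffusion model through the diffusers library. It shows the general conditioning idea only, not how Gen-1 itself is implemented; the checkpoint name, prompt, and settings are assumptions chosen for the demo.

```python
# Minimal sketch: steering an image generation model with a natural-language prompt.
# Assumes the open-source `diffusers` library, a CUDA GPU, and the publicly
# released "runwayml/stable-diffusion-v1-5" checkpoint; these are illustrative
# choices, not Runway's internal Gen-1 pipeline.
import torch
from diffusers import StableDiffusionPipeline

# Load a pretrained text-to-image diffusion model.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# The prompt is the "input mechanism": the text conditions what gets generated.
prompt = "a hand-painted storyboard frame of a rainy city street at dusk"
image = pipe(prompt, num_inference_steps=30).images[0]

image.save("storyboard_frame.png")
```

Video-driven models follow the same conditioning idea, except that an input video (optionally alongside text or a reference image) guides the generation rather than text alone.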

How was AI used in making 'Everything Everywhere All at Once'? And can you also explain what the “multiverse” is?

Everything Everywhere All at Once had an unusually small team of visual effects artists who were working against tight deadlines. Evan Halleck was on that team, and he used Runway's AI tools to save time and automate tedious aspects of editing.
