Content creators will have more control over the look and feel of their AI-generated videos thanks to a new feature set coming to Runway’s Gen-3 Alpha model.

Advanced Camera Control is rolling out on Gen-3 Alpha Turbo starting today, the company announced via a post on X (formerly Twitter).

"Advanced Camera Control is now available for Gen-3 Alpha Turbo. Choose both the direction and intensity of how you move through your scenes for even more intention in every shot."

— Runway (@runwayml), November 1, 2024

The new Advanced Camera Control feature set expands on the model’s existing capabilities. With it, users can “move horizontally while panning to arc around subjects … Or, move horizontally while panning to explore locations,” per the company. They can also customize the direction and intensity of the camera’s movement through a scene “for even more intention in every shot,” and combine “outputs with various camera moves and speed ramps for interesting loops.”

Unfortunately, since the new feature is restricted to Gen-3 Alpha Turbo, you will need to subscribe to the $12-per-month Standard plan to access that model and try out the camera controls for yourself.

"Or quickly zoom out to reveal new context and story."

— Runway (@runwayml), November 1, 2024

Runway debuted the Gen-3 Alpha model in June, billing it as a “major improvement in fidelity, consistency, and motion over Gen-2, and a step towards building General World Models.” Gen-3 powers all of Runway’s text-to-video, image-to-video, and text-to-image tools. The system is capable of generating photorealistic depictions of humans, as evidenced in the X post, as well as creating outputs in a wide variety of artistic styles.

Advanced Camera Control arrives roughly a month after Runway revealed Gen-3’s video-to-video capabilities in mid-September, which let users edit and “reskin” a generated video in another artistic style using only text prompts. When those outputs are paired with Apple’s Vision Pro AR headset, the results are striking. The company also announced the release of an API so that developers can integrate Gen-3’s capabilities into their own apps and products.
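The article doesn’t go into the API itself, but for a rough sense of what integration might look like, here is a minimal sketch in Python. The base URL, version header, field names (promptImage, promptText), the gen3a_turbo model identifier, and the task-polling flow are assumptions drawn from Runway’s public developer documentation, not details confirmed by this announcement.

```python
import time
import requests

# Assumed values based on Runway's developer docs (late 2024); treat as a sketch.
API_KEY = "YOUR_RUNWAY_API_KEY"
BASE = "https://api.dev.runwayml.com/v1"
HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "X-Runway-Version": "2024-09-13",  # assumed API version date
}

# Kick off an image-to-video generation with the Gen-3 Alpha Turbo model.
task = requests.post(
    f"{BASE}/image_to_video",
    headers=HEADERS,
    json={
        "model": "gen3a_turbo",
        "promptImage": "https://example.com/first-frame.jpg",  # placeholder input frame
        "promptText": "slow dolly-in on the subject",
    },
    timeout=30,
).json()

# Generation is asynchronous: poll the returned task until it finishes.
while True:
    status = requests.get(
        f"{BASE}/tasks/{task['id']}", headers=HEADERS, timeout=30
    ).json()
    if status["status"] in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(5)

print(status)  # on success, the payload includes the output video URL(s)
```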

The new camera controls could soon be put to use by film editors at Lionsgate, the studio behind the John Wick and The Hunger Games franchises, which signed a deal with Runway in September to “augment” human efforts with AI-generated video content. The deal reportedly centers on the startup building and training a new generative AI model fine-tuned on Lionsgate’s 20,000-title catalog of films and television series.
