Runway Gen-2 adds multiple motion controls to AI videos with Multi Motion Brush

AI video is still in its early days, but the tech is advancing fast. Case in point: New York City-based Runway, a generative AI startup that enables individuals and enterprises to produce videos in different styles, today updated its Gen-2 foundation model with a new tool, Multi Motion Brush, which allows creators to add multiple directions and types of motion to their AI video creations.

The advance is a first of its kind among commercially available AI video products: rival offerings on the market at this stage use AI to add motion to the entire image or to a single selected area, not to multiple areas independently.

The offering builds on the Motion Brush feature that debuted in November 2023, which allowed creators to add only a single type of motion to their videos at a time.

Multi Motion Brush was previewed earlier this year through Runway’s Creative Partners Program, which rewards select power users with unlimited plans and pre-release features. But it’s now available for every user of Gen-2, adding to the 30+ tools the model already has on offer for creative video producers.

The move strengthens Runway’s position in the rapidly growing creative AI market, which includes players such as Pika Labs and Leonardo AI.

What to expect from Multi Motion Brush?

The idea behind Multi Motion Brush is simple: give users better control over the AI videos they generate by allowing them to add independent motion to areas of their choice.

This could be anything, from the movement of a face to the direction of clouds in the sky.

The user starts by uploading a still image as a prompt and “painting” it with a digital brush controlled by their computer cursor.

The user then uses slider controls in Runway’s web interface to set which way the painted portions of the image should move and how intensely, with each paint color controlled independently.

The user can adjust the horizontal, vertical and proximity sliders to define the direction of the motion (left/right, up/down, or closer/further) and hit save.

“Each slider is controlled with a decimal point value with a range from -10 to +10. You can manually input numerical value, drag the text field left or right or use the sliders. If you need to reset everything, click the ‘Clear’ button to reset everything back to 0,” Runway notes on its website.
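
Runway exposes these controls only through its web interface and has not published a Motion Brush API, but the settings lend themselves to a simple mental model. Below is a purely illustrative Python sketch of how the per-brush parameters described above could be represented; the MotionBrush class and its field names are hypothetical assumptions, not Runway code. Only the three sliders, the -10 to +10 decimal range and the Clear-to-zero reset are taken from Runway’s description quoted above.

```python
from dataclasses import dataclass

# Hypothetical model of Multi Motion Brush settings -- NOT a Runway API.
# The -10..+10 decimal range, the three sliders and the Clear-to-zero reset
# come from Runway's documentation quoted above; everything else is assumed.

SLIDER_MIN, SLIDER_MAX = -10.0, 10.0


@dataclass
class MotionBrush:
    """One painted region and its independently controlled motion."""
    color: str                # paint color identifying the region in the UI
    horizontal: float = 0.0   # left/right movement (sign convention assumed)
    vertical: float = 0.0     # up/down movement (sign convention assumed)
    proximity: float = 0.0    # closer/further movement (sign convention assumed)

    def __post_init__(self) -> None:
        # Enforce the -10 to +10 range Runway describes for each slider.
        for name in ("horizontal", "vertical", "proximity"):
            value = getattr(self, name)
            if not SLIDER_MIN <= value <= SLIDER_MAX:
                raise ValueError(f"{name}={value} is outside [{SLIDER_MIN}, {SLIDER_MAX}]")

    def clear(self) -> None:
        """Mirror the UI's 'Clear' button: reset every slider back to 0."""
        self.horizontal = self.vertical = self.proximity = 0.0


# Example: two regions animated independently, e.g. drifting clouds painted
# green and a face painted purple, each with its own direction and intensity.
brushes = [
    MotionBrush(color="green", horizontal=3.5),
    MotionBrush(color="purple", vertical=-1.0, proximity=2.0),
]
```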

Gen-2 has been getting improved controls

The introduction of Multi Motion Brush strengthens the set of tools Runway has on offer to control the video outputs from the Gen-2 model. 

Originally unveiled in March 2023, the model introduced text-, video- and image-based generation and came as a major upgrade over Gen-1, which only supported video-to-video generation.

However, in its initial stage, it generated clips only up to four seconds long. This changed in August 2023, when the company added the ability to extend clips up to 18 seconds.

Runway also debuted features such as a “Director Mode,” which allows users to choose the direction and intensity/speed of the camera movement in generated videos, as well as options to choose the style of the video to be produced, from 3D cartoon and render to cinematic to advertising.

In the space of AI-driven video generation, Runway takes on players like Pika Labs, which recently debuted its web platform Pika 1.0 for video generation, as well as Stability AI’s Stable Video Diffusion models.

The company also offers a text-to-image tool, which takes on offerings like Midjourney and DALL-E 3. However, while the outputs from these tools have improved over time, they are still not perfect and can generate images and videos that are blurred, incomplete or inconsistent in different ways.

Author: Shubham Sharma
Source: VentureBeat
Reviewed By: Editorial Team
