
Runway can now mimic everything from 35mm disposable cameras to ’80s sci-fi

Images showing Runway Frames. (Image: Runway)

AI startup Runway, maker of the popular Gen-3 Alpha video generation model, has debuted a new foundation model that “excels at maintaining stylistic consistency while allowing for broad creative exploration,” per the company.

The new model is called Frames, and it lets users generate images across a wide variety of subjects while strictly adhering to a consistent visual style and aesthetic. Whether it’s mimicking the look of ’80s camp films like Flash Gordon and Xanadu, aping ’90s-era 35mm disposable cameras or retro anime, or producing sweeping landscapes and carefully composed still-life shots, Frames sticks to the artistic style the user dictates.


Introducing Frames: An image generation model offering unprecedented stylistic control.

Frames is our newest foundation model for image generation, marking a big step forward in stylistic control and visual fidelity. With Frames, you can begin to architect worlds that represent… pic.twitter.com/kg8Fz0LEgU

— Runway (@runwayml) November 25, 2024

“With Frames, you can begin to architect worlds that represent very specific points of view and aesthetic characteristics,” the company wrote in Monday’s announcement post. “The model allows you to design with precision the look, feel and atmosphere of the world you want to create.”

Frames is not going to replace the current Gen-3 Alpha generation model but rather augment it. “We’re gradually rolling out access inside Gen-3 Alpha to allow you to build more of your worlds within a larger, more seamless creative flow,” the company wrote.

Gen-3 Alpha is a relatively new model, having been introduced in June 2024. It is built for large-scale multimodal training and marks a “major improvement in fidelity, consistency, and motion over Gen-2, and a step towards building General World Models,” the company announced at the time. The model has recently been updated with more precise camera controls and the capacity for video-to-video generation.

Gen-3 Alpha has reportedly been trained on thousands of YouTube videos, a practice that has led to accusations of copyright infringement from YouTube content creators. Tests conducted by 404 Media found that naming a specific creator (say, MrBeast) in the prompt led the system to generate content in that creator’s aesthetic.

Andrew Tarantola