Friday, November 22, 2024

How A.I. is reshaping the way movies get made

Nearly 30 years after Tom Hanks and Robin Wright starred in the Oscar-winning film Forrest Gump, the actors are collaborating again in the upcoming film Here.

Hanks and Wright will be decades older than when moviegoers last saw them share the silver screen.

Or will they? 

Hollywood is a town filled with plot twists. The latest is the emergence of A.I., which is mostly being used today to de-age actors like Hanks and Wright. But there is a whirlwind of future applications that could influence script development, predict casting or how much money a movie will make, and even ascertain moviegoers’ film preferences.

“A.I. is a tool that’s going to help people express themselves in ways they’ve been stifled until now,” says Helena Packer, senior vice president and general manager of DGene, a Silicon Valley– and Shanghai-based developer of A.I. technology. 

Miramax tapped A.I. technology developer Metaphysic to help its production company create younger versions of key characters featured in the film Here, which is adapted from Richard McGuire’s graphic novel. As cofounder and CEO Tom Graham explains it, Metaphysic’s A.I. model de-ages actors such as Hanks by generating a younger version of the Oscar winner and stitching it atop his live performance. This new technique is cheaper to produce than traditional VFX (visual effects) or CGI (computer-generated imagery) and also more realistic, according to Graham.

“We can actually do things with A.I. that are impossible with traditional methods to get that really hyperrealistic look, that doesn’t look uncanny, doesn’t look weird, and doesn’t look CGI,” says Graham.
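
Graham’s description follows a common pattern in neural face replacement: a generative model produces the younger face, and that output is composited back onto the live-action plate so the original performance, lighting, and background are preserved. The sketch below illustrates only that compositing step under assumed inputs; it is not Metaphysic’s pipeline, and the generator model and fixed face box are hypothetical placeholders (in practice a face tracker would supply the box for each frame).

```python
# Conceptual sketch of face-replacement compositing for de-aging (not Metaphysic's
# actual pipeline). Assumes a hypothetical pretrained generator that maps a cropped
# face to a "younger" version of the same performance.
import cv2
import numpy as np
import torch


def deage_frame(frame_bgr: np.ndarray, model: torch.nn.Module,
                face_box: tuple[int, int, int, int]) -> np.ndarray:
    """Replace the face region of a live-action frame with a generated younger face."""
    x, y, w, h = face_box
    face = frame_bgr[y:y + h, x:x + w]

    # Normalize the crop and run it through the (hypothetical) generative model.
    tensor = torch.from_numpy(face).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        young = model(tensor)
    young = (young.squeeze(0).permute(1, 2, 0).numpy() * 255).astype(np.uint8)
    young = cv2.resize(young, (w, h))

    # Blend the generated face back onto the original plate so the lighting,
    # hair, and background from the live performance are preserved.
    mask = 255 * np.ones(young.shape[:2], dtype=np.uint8)
    center = (x + w // 2, y + h // 2)
    return cv2.seamlessClone(young, frame_bgr, mask, center, cv2.NORMAL_CLONE)
```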

An example of the A.I. work performed at DGene.

Courtesy of DGene

Graham says A.I. is at the beginning stage but could eventually consume “everything.” Entertainment, gaming, basically any content that is on the internet could be driven by generative A.I. models that are trained on data from the real world. Metaphysic, known for its deepfakes of actor Tom Cruise and of Simon Cowell on the stage of America’s Got Talent, raised $7.5 million last year to help expand its content creation tools.

“The growth has been really fast in terms of the sophistication of the A.I. models, the techniques, and the software that supports it,” says Graham. “But in the last six months, it has really taken off, and there is no reason why that will not continue, because of the sheer number of people who are interested in building things on top of it.”

Count visual effects company Industrial Light & Magic among those who are intrigued. “We’re working on over 30 shows right now. I think all of them are using machine learning and A.I. in one way or another,” says Rob Bredow, chief creative officer at Industrial Light & Magic and senior vice president of creative innovation at ILM’s parent company, Lucasfilm.

ILM is a visual effects pioneer in film, responsible for the dinosaurs in the Jurassic Park franchise and the spaceships and lightsabers in Star Wars. Jurassic Park, in particular, was a watershed for the industry, among the first films to put photorealistic computer-generated creatures on-screen, an evolution from the models and miniatures used previously.

Today, Bredow says, the film industry is beginning to see some machine learning algorithms compete with, and sometimes even outperform, hand-coded algorithms. One area of notable excitement is denoising algorithms, which remove the noise that looks like grain on a rendered image. The newer technology can speed up artists’ workflows by reducing render times dramatically.
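
As a rough illustration of the kind of learned denoiser described here (not ILM’s actual tools), a small residual network can be trained to predict the grain in a low-sample render; subtracting that prediction yields a cleaner frame far faster than rendering more samples. The untrained miniature model below is a sketch of the idea.

```python
# Illustrative sketch of a learned denoiser: a small residual CNN predicts the
# noise in a grainy render, so the frame can be cleaned instead of re-rendered.
import torch
import torch.nn as nn


class TinyDenoiser(nn.Module):
    def __init__(self, channels: int = 3, width: int = 32, depth: int = 5):
        super().__init__()
        layers = [nn.Conv2d(channels, width, 3, padding=1), nn.ReLU()]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(width, width, 3, padding=1), nn.ReLU()]
        layers += [nn.Conv2d(width, channels, 3, padding=1)]
        self.net = nn.Sequential(*layers)

    def forward(self, noisy: torch.Tensor) -> torch.Tensor:
        # Residual learning: the network estimates the grain, which is subtracted.
        return noisy - self.net(noisy)


# Usage on a single frame (values in [0, 1], shape: batch x channels x height x width).
denoiser = TinyDenoiser()  # in practice, trained on pairs of noisy and clean renders
noisy_frame = torch.rand(1, 3, 256, 256)
with torch.no_grad():
    clean_frame = denoiser(noisy_frame)
```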

Machine learning–based algorithms were used by ILM to de-age actor Mark Hamill, who was featured as a younger Luke Skywalker in the TV series The Book of Boba Fett. Bredow says de-aging tools are “definitely an area that we are more than keeping an eye on,” but stresses that hundreds, even thousands, of hours of artists’ labor go into creating the final illusion. Animation, lighting, 3D models, and other techniques all help filmmakers obtain the ultimate results they are seeking.

The mixed use of computer graphics, practical effects, and new technologies like A.I. is what excites Bredow about the future of visual effects. He’s encouraged that ILM still uses some miniatures and models, including in the latest Jurassic World and The Mandalorian.

“I think every time a new technology comes, there is a question: What is it going to mean for the older generations of technology?” notes Bredow. “I think we are going to continue to see a mix into the future.”

A still from Lucasfilm’s “The Mandalorian,” featuring (from left) Rosario Dawson as Ahsoka and Mark Hamill, de-aged in this series via A.I. to depict a younger Luke Skywalker.

Courtesy of Lucasfilm

Hollywood’s heaviest hitters are investing in A.I. in various ways, including Warner Bros., which signed a deal with Cinelytic to better inform movie decision-making on release dates, marketing, and distribution strategies. And in its earlier incarnation as 20th Century Fox, 20th Century Studios worked with Google to use machine learning to better predict audience segmentation.

Startups like DGene help filmmakers by writing algorithms that can de-age characters, and they are exploring A.I. for even more aspects of moviemaking.

“From the first cave painting, you know, where they put their hand on the wall. They’re saying, ‘I’m here. I’m representing something,’” says Packer. “And visual effects are the same thing, and so is film. It’s a representation of feelings, of imagery, and of communication. So now A.I. comes along. And it’s a different way of representing these images, a faster way, perhaps a better way—definitely a different way.”

Packer views A.I. as a new tool for visual artists to expand the art of filmmaking. If A.I. can expand our view by even 10%, how can it influence art, Packer asks rhetorically. She believes the future of A.I. in movies ultimately relies on what audiences believe. They must accept A.I. as reality for it to work.

Packer sees both potential and challenges as this new tool evolves. A.I. is mostly used for de-aging techniques that focus on the face, eliminating wrinkles to make a character appear younger, but the tech may evolve to recreate all the tiny motions of the body. Packer says one of the most complex features to get right is the eyes, because they are so communicative.

A.I. is also playing a role in talent development. Digital studio TheSoul Publishing uses DALL-E 2 to create still images for its virtual music artist Polar, who has amassed 1.7 million TikTok followers since launching a debut single, “Close to You,” in September. “Over time, we definitely see artificial intelligence playing a bigger role in the production process,” says Patrik Wilkens, vice president of global operations at TheSoul.

Wilkens says the creative team is using A.I. to speed up the production process, including creating color variations of marketing materials and reducing the amount of time a music editor needs to create longer tracks. Ultimately, this tech investment allows TheSoul to create more content.
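
The article names DALL-E 2 as the generator behind Polar’s stills, so a minimal sketch of requesting such images through OpenAI’s Python client might look like the following; the prompt, image count, and size are illustrative assumptions rather than TheSoul’s actual workflow.

```python
# Minimal sketch: generating promotional stills with DALL-E 2 via the OpenAI
# Python SDK (pip install openai). The prompt is an illustrative placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

result = client.images.generate(
    model="dall-e-2",
    prompt="stylized portrait of a virtual pop artist with silver hair, neon stage lighting",
    n=4,                  # several variations to choose from for marketing materials
    size="1024x1024",
)

for image in result.data:
    print(image.url)      # each URL points to a generated still
```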

He says that younger consumers, in particular, are gravitating toward artists like Polar because the line between real and unreal people has blurred. “I don’t think they will make a distinction between authentic and inauthentic people, so they’re much more open and much more natural to engage with virtual influencers,” says Wilkens.

A digital representation of Polar, a virtual music artist.

Courtesy of TheSoul Publishing

The entertainment industry’s embrace of A.I. does raise a lot of questions. Last year, the virtual rapper FN Meka was dropped by Capitol Records over allegations of racism. Screenwriters are alarmed that ChatGPT and other tools could replace their jobs. If A.I. drives more of Hollywood’s decisions about what projects to green-light, what will that mean for artistic expression? Experts say the technology is so new that it is hard to say exactly how it will be used in the future.

“There’s a lot of debate right now about who should be viewed as a creator when using A.I.,” says Danny Tobey, partner at global law firm DLA Piper. “For the time being, the law says only a natural person can be an inventor.  But when you understand the technology, some of the concern goes away, because there can still be a real human element in using A.I. to generate creative content.”

