Harrison Ford has been donning the fedora and cracking the whip of the daring tomb-raider Indiana Jones for more than 40 years.
The series’ latest installment, “Indiana Jones and the Dial of Destiny,” leans on myths from ancient Greece about a device that can supposedly turn back the hands of time. But that’s fiction and, when “Dial of Destiny” started filming two years ago, 79-year-old Ford and the filmmakers were confronted with a real-life challenge: how to actually reverse the clock and make the veteran Hollywood actor appear decades younger.
Moviemakers have confronted the conundrum of how to make actors look convincingly younger before. In the past, pulling it off meant employing clunky workarounds that, in the eyes of visual-effects specialists, left a lot to be desired. But that’s rapidly changing, according to Rob Bredow, chief creative officer of Industrial Light & Magic, who was named in October to Insider’s inaugural AI 100 list, recognizing 100 transformational voices in the artificial-intelligence community.
To make Ford look like a sprightly, more youthful version of himself, producers on “Dial of Destiny” employed a special camera to shoot the “Indy” star’s every move, Bredow told Insider. With the aid of generative artificial intelligence, the studio employed a “face-swapping” procedure in VFX that spared Ford from having dozens of small “markers” applied to his face — a typical hallmark of the de-aging process. The result? Indiana Jones suddenly looked generations younger — perfect for the movie’s opening sequence, set during World War II.
“We had a fantastic reference that we could use for these machine learning-based models to learn from,” said Bredow, who also serves as senior vice president for creative innovation at Lucasfilm, pointing to the years’ worth of footage the studio had of Ford playing the character of Jones. “The quality of the work has gotten better, but also minimizing the impact to the performer and the camera crews and the rest of the team on set has been a real area of innovation for us and helped us move a lot faster.”
Compare that to the de-aging process on “The Irishman,” a 2019 Netflix film starring actors like Robert De Niro, Al Pacino, and Joe Pesci, for which ILM handled the VFX. For that film, Bredow estimated that camera operators lugged more than 50 pounds of equipment with them “to make sure we could capture all the detail we needed to create that likeness.” In just a handful of years, the process has been streamlined.
Founded in 1975 by celebrated “Star Wars” creator George Lucas, ILM — a division of Lucasfilm, which also produces “Indiana Jones” — is at the forefront of this new Hollywood frontier. A celebrated name in the visual-effects space, Bredow has worked on films including 2018’s “Solo: A Star Wars Story” and “Ready Player One.” ILM’s films have won 16 Academy Awards and been nominated for 50, as well as a number of Emmys and BAFTAs.
In Hollywood, Bredow has emerged as a sort of elder statesman at the crossroads of AI and filmmaking — a contentious space that contributed to the recently ended writers’ strike and ongoing actors’ strike. Members of the Writers Guild of America and SAG-AFTRA, the actors’ union, have been strident critics of artificial intelligence, warning that digital brains threaten to supplant human ones or replicate their likenesses, jeopardizing their creative crafts.
But Bredow is optimistic about where innovation around AI and machine learning can go. He spoke to Insider about evolutions in this arena that stand to dramatically impact the entertainment industry, as well as the opportunities for growth that he’s most excited about.
AI in entertainment is poised to be a $100 billion business
AI in media and entertainment is ballooning, with the size of the market expected to swell to nearly $100 billion by 2030, according to a report last year from Grand View Research. To keep ILM competitive, Bredow pointed to the company’s internal research and development team that generates prototypes as well as experiments with techniques to enhance the efficacy of the machine-learning algorithms.
One of the team’s biggest achievements has been speeding up the timeline by which filmmakers can implement these tools, Bredow said. “In the past, using digital tools to replicate the likeness of a person was a process that required a contingent of artists and could take six to 12 months,” he explained. But, with strides in the field, that timetable has shrunk to just a few weeks for a rough-quality approximation of a person.
He offered another example of how the tech is influencing studios’ casting decisions: When assessing multiple possible stunt doubles, producers can now use automated tests to evaluate which one would “play best” on camera while emulating one of the actors in the film.
Previously, adding visual effects was confined almost exclusively to the post-production phase, Bredow explained: “You’d cast the person that was great at doing the stunt, and you thought had the right likeness, and then in the end you find out how well it works or how well it doesn’t work, and how much fixing you have to do in the post-production process.”
Even though the tech has come far, there are areas for development, he noted.
Bredow is seeing big gains in the face-swap arena, as well as the creation of basic characters that can interact with audiences during live shows or engage with users on their phones. Another area where the tech is making leaps and bounds, he said, is in generating aesthetic digital environments “without artists having to hand-create every panel of glass and every brick” — in a background, for instance.
How AI and the creative community can co-exist
While AI tech is starting to reach ordinary consumers, with social media users sharing AI-enhanced photos of themselves in fanciful garb like astronaut suits, for now, the kinds of tools Bredow’s team uses are primarily confined to studios with deep pockets. But Bredow sees a world in which they’ll one day help at-home creators on platforms like YouTube to enhance their content.
Nevertheless, these tools have fomented deep division and consternation within the creative ecosystem, where some fear that digitization could encroach on their craft or livelihood. Bredow takes a different view — that it’s not a zero-sum game with humans losing out while robots or computers make gains.
Producers are enthusiastic that these tools can help on-camera talent tell stories that they couldn’t have imagined a few years ago, he added. He sees the evolution less as a threat, and more as the next chapter in an age-old story of advancement. “Sometimes people ask me, ‘Do you think some of these new techniques are going to replace movies?’” he said. “I always point out that even VHS or DVDs didn’t replace the movie theater.”
“What’s really important to us is to put these tools in the hands of artists who are using them to create higher-quality workflows” and “that this is done with a high degree of care for the artists who are actually powering these processes,” Bredow concluded. “Whether that artist is someone painting on a workstation and creating some of these performances digitally, whether that artist is the person in front of the camera — those artists’ performances need to be respected.”
This article was originally published October 13 and has been updated.
Are you a Hollywood insider with a story to share? Contact this reporter. Reed Alexander can be reached via email at [email protected], or SMS/the encrypted app Signal at (561) 247-5758.