
James Cameron doesn’t think generative AI technology will replace the need for human artists and actors – see what the filmmaker had to say below.
As the use of AI has risen across virtually every industry over the last couple of years, its inclusion in the realm of filmmaking and entertainment has been a topic of debate.
The debate heated up most recently after Dutch actor and comedian Eline Van der Velden debuted an AI actor named Tilly Norwood earlier this week at the Zurich Film Festival. Norwood has reportedly been drawing the attention of talent agents, though had not signed with a talent rep at the time of writing.
Norwood’s debut sparked outrage among actors and fans alike, with Emily Blunt and Natasha Lyonne among the big names slamming the AI actor’s creation. The Screen Actors Guild-American Federation of Television and Radio Artists has also condemned the AI actor and urged talent agencies not to lend their support to AI performers.
James Cameron has now weighed in on the topic of AI’s use in Hollywood. Speaking to Variety back in September, in an interview published only after the news of Tilly Norwood’s creation, the Avatar director said that his work with VFX artists has assured him that AI will never be able to replace human artists.
“The creative culture is so strong across all these artists that I can look at a shot for review for the very first time and say ‘It’s done.’ That is the craziest thing… So this idea of really encouraging them to think as storytellers is really paying off. And this is why the Gen AI stuff is never going to take the place of that. We need our artists. It’s artists in control of the process, right?”
The Titanic filmmaker went on record earlier this year saying he is exploring ways in which AI technology could help bring costs down in the film industry without having to fire staff, but stated in August that he remains wary of the possible future it could bring about, even suggesting a real-life unfolding of The Terminator could take place at some point.
Cameron said at the time: “I do think there’s still a danger of a Terminator-style apocalypse where you put AI together with weapons systems, even up to the level of nuclear weapon systems, nuclear defence counterstrike, all that stuff.”
“I feel like we’re at this cusp in human development where you’ve got the three existential threats: climate and our overall degradation of the natural world, nuclear weapons, and super-intelligence. They’re all sort of manifesting and peaking at the same time. Maybe the super-intelligence is the answer. I don’t know. I’m not predicting that, but it might be.”
Last September, Cameron joined the board of directors for Stability AI. “My goal was not necessarily make a shit pile of money,” he said at the time. “The goal was to understand the space, to understand what’s on the minds of the developers. What are they targeting? What’s their development cycle? How much resources you have to throw at it to create a new model that does a purpose-built thing, and my goal was to try to integrate it into a VFX workflow.”