
Video game actor doesn't want his performance to be viewed as just data

For hours, motion sensors attached to Noshir Dalal's body tracked his movements as he performed air punches, overhead strikes, and one-handed attacks that would later appear in a video game. Eventually, he swung the sledgehammer in his hand so many times that he tore a tendon in his forearm. By the end of the day, he could no longer open his car door.

The physical strain this type of movement work entails, and the time it takes, are among the reasons he believes all video game performers should be protected equally from the unregulated use of artificial intelligence.

Video game actors fear that AI could reduce or eliminate jobs because the technology could be used to translate one performance into a range of other movements without their consent. This fear led to a strike by the Screen Actors Guild-American Federation of Television and Radio Artists in late July.

“When motion capture actors, video game actors in general, are only making the money they make that day … that can be a really tricky thing,” said Dalal, who played Bode Akuna in “Star Wars Jedi: Survivor.” “Instead of saying, 'Hey, we're bringing you back' … they just don't bring me back at all, and don't tell me that they're doing this. That's why transparency and compensation are so important to us when it comes to AI protections.”

Hollywood's video game actors have announced a work stoppage – their second in a decade – after more than 18 months of negotiations over a new interactive media deal with the gaming industry giants failed over protections for artificial intelligence. Members of the union have said they are not against AI, but actors are concerned the technology could provide a way for studios to displace them.

Dalal said he took it personally when he heard that the video game companies negotiating a new contract with SAG-AFTRA wanted to treat some motion work as “data” rather than as performance.

If players were to add up the time they spend watching cutscenes in a game and compare it to the time they spend controlling characters and interacting with non-player characters, they would find that they interact with the work of motion artists and stunt performers “a lot more than you interact with my work,” Dalal said.

“They're the ones who sell the world that these games take place in, when you're doing combos and pulling off crazy, super cool moves using Force powers, when you're playing Master Chief or swinging around the city as Spider-Man,” he said.

Some actors argue that AI could deprive less experienced actors of the chance to play smaller supporting roles, such as non-player characters, where they usually earn their stripes before landing bigger jobs. The unregulated use of AI, performers say, could also lead to ethical issues if their voices or likenesses are used to create content they morally disagree with. That kind of dilemma has already cropped up with game “mods,” in which fans alter a game's content or create new content. Last year, voice actors spoke out against mods of the role-playing game “Skyrim” that used AI to generate performances and clone actors' voices for pornographic content.

Motion capture work in video games involves actors wearing special skin-tight Lycra suits fitted with markers. In addition to more complex interactions, actors perform basic movements like walking, running, or holding an object. Animators take these motion capture recordings and stitch them together so that characters respond to what the player is doing.

“AI allows game developers and game studios to automatically generate many of these animations from previous footage,” said Brian Smith, assistant professor in Columbia University's Department of Computer Science. “Studios no longer have to collect new footage for every single game and every type of animation they want to create. They can also draw on their archive of previous animations.”

If a studio has motion capture data from a previous game and wants to create a new character, the animators could use those saved recordings as training data, he said.

“With generative AI, you can generate new data based on the patterns in that previous data,” he said.
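To make that idea concrete, here is a minimal, hypothetical sketch (in Python, using NumPy and PyTorch) of the kind of pipeline Smith describes: archived motion-capture frames serve as training data for a small model that then produces poses no actor was recorded performing. The joint count, the synthetic “archive” and the next-pose model are illustrative assumptions, not any studio's actual tooling.

```python
# Illustrative sketch only: treat archived motion-capture sequences as training
# data and let a model generate new poses from patterns in the old ones.
# The skeleton size, synthetic data and model are assumptions for illustration.

import numpy as np
import torch
import torch.nn as nn

NUM_JOINTS = 24            # humanoid rigs commonly track a few dozen joints
POSE_DIM = NUM_JOINTS * 3  # one (x, y, z) rotation triplet per joint

# Stand-in "archive": smooth random sequences in place of real mocap clips.
rng = np.random.default_rng(0)
archive = np.cumsum(rng.normal(scale=0.01, size=(200, POSE_DIM)), axis=0)
frames = torch.tensor(archive, dtype=torch.float32)

# Simple next-pose predictor: given the current pose, guess the next frame.
model = nn.Sequential(
    nn.Linear(POSE_DIM, 256),
    nn.ReLU(),
    nn.Linear(256, POSE_DIM),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train on consecutive frame pairs drawn from the "archived" footage.
inputs, targets = frames[:-1], frames[1:]
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()

# Generate a new animation by rolling the model forward from one stored pose,
# i.e. producing frames no performer was ever recorded acting out.
pose = frames[-1:]
generated = []
with torch.no_grad():
    for _ in range(60):  # roughly two seconds at 30 frames per second
        pose = model(pose)
        generated.append(pose.squeeze(0).numpy())

print(f"generated {len(generated)} new frames of {POSE_DIM} values each")
```

A production system would use far richer models and a cleaned, skeleton-retargeted clip library, but the basic pattern is the one Smith outlines: learn from stored performances, then roll the model forward to produce new ones.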

A spokeswoman for the video game producers, Audrey Cooling, said the studios had offered “significant” AI protections, but the SAG-AFTRA negotiating committee said the studios' definition of who counts as a “performer” was key to understanding who would be protected.

“We worked hard to put forward proposals with reasonable terms that protect the rights of performers while ensuring we can continue to use the most advanced technology to deliver a great gaming experience to fans,” Cooling said. “We proposed terms that provide consent and fair compensation to all employees under the (contract) when an AI reproduction or digital recreation of their performance is used in games.”

The game companies have offered wage increases, she said: an initial 7 percent increase in rates and a further 7.64 percent increase starting in November. That represents a 14.5 percent increase over the life of the contract. The studios have also agreed to increases in per diem rates, overnight travel payments, overtime rates and bonuses, she added.

“Our goal is to reach an agreement with the union that will end this strike,” Cooling said.

A report on the global gaming market for 2023 by industry watcher Newzoo predicted that video games would increasingly feature AI-generated voices, similar to the voice acting in Squanch Games' “High on Life.” Game developers, the Amsterdam-based company said, would use AI to produce unique voices, eliminating the need for voice actors.

“Voice actors may have fewer opportunities in the future, especially as game developers use AI to reduce development costs and time,” the report says, noting that “major AAA prestige games like 'The Last of Us' and 'God of War' use motion capture and voice acting in a similar way to Hollywood.”

Other games, such as “Cyberpunk 2077,” have featured celebrity performers.

Actor Ben Prendergast said the data points collected for motion capture don't capture the “essence” of a person's acting performance. The same is true, he said, of AI-generated voices, which can't convey the nuanced decisions that occur in big scenes – or smaller, strenuous efforts like 20 seconds of screaming to portray a character's death by fire.

“The big problem is that someone somewhere has this massive amount of data and now I have no control over it,” said Prendergast, who voices Fuse in the game “Apex Legends.” “Nefarious or not, someone can now take this data and say we need a character who is nine feet tall, who sounds like Ben Prendergast and can do this fight scene. And I have no idea if that's going to happen until the game comes out.”

The studios would “get away with it,” he said, unless SAG-AFTRA could enforce the AI protections it is fighting for.

“It reminds me a lot of sampling in the '80s, '90s and 2000s, when a lot of people were sampling classic songs,” he said. “It's an art. If you don't protect the rights to someone's image, their voice, or their body and the way they move, you can't protect people from those kinds of uses.”