© 2024 WVIK

Video game performers want protections from artificial intelligence

AILSA CHANG, HOST:

Members of the union SAG-AFTRA are on strike against major video game companies. We're talking about performers whose voices, faces and movements are used to create video games. They want protections from artificial intelligence. NPR's Mandalit del Barco reports.

(SOUNDBITE OF MUSIC)

MANDALIT DEL BARCO, BYLINE: Jasiri Booker's real-life parkour, capoeira and breaking moves help make Marvel's Spider-Man: Miles Morales video game come to life.

JASIRI BOOKER: I stick to walls. I beat people up. I get beaten up constantly, being electrocuted and turning invisible.

DEL BARCO: To play Spider-Man and other video game characters, the 26-year-old wears full bodysuits and caps dotted with reflective sensors. His motions are captured by cameras surrounding the performance spaces. Then, his digitized moving skeleton is rendered into animated characters, a process that increasingly uses AI. And that process is at the center of this strike.

(SOUNDBITE OF VIDEO GAME SOUND EFFECTS)

DEL BARCO: A representative for the video game companies told NPR that, under their proposal, performers would be paid and asked for consent to use their digital replicas. That goes for actors' voices and faces. But Booker says their body movements would not be covered. On the picket line outside Warner Bros. Studios, he and others argued their work is more than just motion-capture reference data.

BOOKER: Our ask is not that they don't use AI altogether. We're saying, at the very least, please inform us and allow us to consent to the performances that you are generating with our AI doubles.

DEL BARCO: Animators have always relied on real-life movements of human performers. During the silent picture era more than a century ago, they began using live-action footage of people to create cartoon sequences. By hand, they traced over projected images, frame by frame - a time-consuming process that became known as rotoscoping.

(SOUNDBITE OF ARCHIVED RECORDING)

MAE QUESTEL: (As Betty Boop) Boop-oop-a-doop (laughter).

(SOUNDBITE OF ARCHIVED RECORDING)

JACK MERCER: (As Popeye, singing) I'm Popeye the sailor man.

DEL BARCO: The creator of the old "Betty Boop" and "Popeye" cartoons, Max Fleischer, patented the first rotoscope in 1915 for his film "Koko The Clown."

(SOUNDBITE OF FILM, "SNOW WHITE AND THE SEVEN DWARFS")

UNIDENTIFIED ACTORS: (As characters, singing) Heigh-ho, heigh-ho, heigh-ho. It's off to work we go.

DEL BARCO: In 1937, Walt Disney animators used rotoscope techniques for "Snow White And The Seven Dwarfs." The method was standard until the 1980s, when computer-generated images came along, like in this 1985 Super Bowl commercial for the Canned Food Information Council.

(SOUNDBITE OF AD, "BRILLIANCE")

UNIDENTIFIED ACTOR: (As character) A package that can last the three-year journey to Jupiter.

DEL BARCO: To create the innovative TV ad, visual effects pioneer Robert Abel and his team painted dots onto a real woman. Her body movements were the basis for the so-called sexy robot character.

In the early 1990s, mechanical engineer Alberto Menache advanced the technology. He developed animation software for the arcade video game Soul Edge.

(SOUNDBITE OF MUSIC)

ALBERTO MENACHE: It was a Japanese ninja fighting game. And they brought a ninja from Japan, and we put markers on the ninja. And we only had, like, a 7- by 7-foot area where he could act.

DEL BARCO: These days, new technologies allow performers to watch themselves moving virtually, in real time, as fully animated characters. And with his company, NPCx, Menache is developing AI that no longer requires performers to wear motion-capture bodysuits.

So you don't need little sensors anymore?

MENACHE: We won't need sensors.

DEL BARCO: Do you need people anymore?

MENACHE: People, yes. And people need to agree. And AI needs to be trained. So to train AI, you need data from people. We don't just grab people's motion. We get their permission.

DEL BARCO: The SAG-AFTRA members on strike argue that, for now at least, AI still needs their very human performances.

Mandalit del Barco, NPR News.

(SOUNDBITE OF MOTOR CITY DRUM ENSEMBLE'S "DETROIT")

CHANG: And just a note - many NPR employees are also members of SAG-AFTRA, but they're under a different contract.

(SOUNDBITE OF MOTOR CITY DRUM ENSEMBLE'S "DETROIT")

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

As an arts correspondent based at NPR West, Mandalit del Barco reports and produces stories about film, television, music, visual arts, dance and other topics. Over the years, she has also covered everything from street gangs to Hollywood, police and prisons, marijuana, immigration, race relations, natural disasters, Latino arts and urban street culture (including hip hop dance, music, and art). Every year, she covers the Oscars and the Grammy Awards for NPR, as well as the Sundance Film Festival and other events. Her news reports, feature stories and photos, filed from Los Angeles and abroad, can be heard on All Things Considered, Morning Edition, Weekend Edition, Alt.Latino, and npr.org.