How Disney is using artificial intelligence to figure out exactly how much you enjoy its films
Movie studios have a long tradition of testing new films to see how audiences react before launching them in wide release. But with its latest research innovation, Disney is taking that practice to a whole new level.
Now as you’re settling in to watch the latest Disney blockbuster, the movie could also be watching you.
And while this could signal an exciting new era of responsive storytelling in which movies are shaped around our likes and dislikes in real time, it also raises some red flags about yet another frontier in personal data collection.
At a conference in July, Disney Research presented a new process called factorized variational autoencoders (FVAEs). In plain English, the technique measures complex audience reactions by assessing facial expressions.
This deep-learning system has been trained to watch an audience of hundreds of faces in a darkened theatre, and to track their reactions: Are they smiling or crying? Bored or asleep, even?
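The article doesn’t describe the system’s internals, but the basic idea of labelling each face in each frame can be pictured with a toy sketch. Everything here, the two features, the centroid values, and the nearest-centroid rule, is invented for illustration; Disney’s actual system is a trained deep-learning model, not this simple classifier:

```python
import numpy as np

# Hypothetical sketch: each detected face is reduced to a small feature
# vector — here, two made-up features: mouth-corner lift and eye openness.
# A nearest-centroid rule then labels the expression.
CENTROIDS = {
    "smiling": np.array([0.8, 0.9]),  # lifted mouth corners, open eyes
    "neutral": np.array([0.1, 0.9]),  # flat mouth, open eyes
    "asleep":  np.array([0.0, 0.0]),  # flat mouth, closed eyes
}

def label_expression(features: np.ndarray) -> str:
    """Return the label of the centroid closest to the face's features."""
    return min(CENTROIDS, key=lambda k: np.linalg.norm(features - CENTROIDS[k]))

# One frame of a (tiny) audience: each row is one face.
frame = np.array([[0.75, 0.95],   # grinning
                  [0.05, 0.85],   # watching impassively
                  [0.02, 0.05]])  # dozed off
labels = [label_expression(f) for f in frame]
print(labels)  # → ['smiling', 'neutral', 'asleep']
```

Repeating this for hundreds of faces over every frame of a two-hour film is what produces the enormous data volumes described below.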
Disney Research can generate far more data than any human could process. In its tests, the team collected 16 million data points from 3,179 viewers.
And that’s where artificial intelligence (AI) plays a role. As Disney Research scientist Peter Carr explained to Phys.org, computers can much more easily synthesize that massive yield of data, allowing Disney to measure the success of a film with a granularity that goes far beyond a subjective “did you like the movie?”
In fact, FVAEs don’t just measure reactions; Disney says the process can reliably predict them, too.
After observing an audience member’s reactions for just a few minutes, the system can predict his or her facial expressions for the rest of the film, using a pattern-recognition technique that works much like a recommendation engine. It can also generalize the reactions of an entire audience and measure them against an input that states how viewers “should” be reacting.
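The recommendation-engine analogy can be made concrete with a minimal sketch, assuming a deliberately simple rank-1 model: a shared “scene signal” estimated from the rest of the audience, plus a per-viewer expressiveness coefficient fitted on the first few minutes. The model, variable names, and numbers are illustrative assumptions, not Disney’s actual FVAE method:

```python
import numpy as np

# Toy model: reaction[v, t] ≈ a[v] * s[t], where s[t] is a shared
# "how the scene plays" signal and a[v] is per-viewer expressiveness.
rng = np.random.default_rng(0)
n_viewers, n_frames = 50, 200
s_true = rng.random(n_frames)          # shared scene signal (unknown to us)
a_true = 0.5 + rng.random(n_viewers)   # per-viewer expressiveness
R = np.outer(a_true, s_true)           # noise-free reactions of the full audience

# A new viewer is observed only for the first 40 frames ("a few minutes").
observed = 40
new_a_true = 1.3
new_obs = new_a_true * s_true[:observed]

# Estimate the shared signal from the rest of the audience (up to scale),
# then fit the new viewer's coefficient by least squares on observed frames.
s_est = R.mean(axis=0)
a_est = new_obs @ s_est[:observed] / (s_est[:observed] @ s_est[:observed])

# Predict the new viewer's reactions for the rest of the film.
pred = a_est * s_est[observed:]
true_rest = new_a_true * s_true[observed:]
rmse = float(np.sqrt(np.mean((pred - true_rest) ** 2)))
print(rmse)  # near zero on this noise-free toy data
```

The scale ambiguity between `a_est` and `s_est` cancels in the prediction, which is why even this crude sketch recovers the held-out reactions exactly on clean synthetic data; real audience data would of course be noisy and need a richer model.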
So you know all of those times you started watching a movie, thinking you’d hate it, but actually ended up loving it? It could now be possible for Disney to predict your enthusiasm for the flick before you were even consciously aware of your change of heart.
Quantitative vs. qualitative
Why not? Our data is already being used to predict how we’ll vote in major elections; it only seems fitting that it could also be used as a predictor of our love of Wonder Woman or The Secret Life of Dogs.
For Disney, the new quantitative approach has big advantages over the old way of testing, which was qualitative: the audience would simply be asked what it thought.
After all, we’re not that reliable; we often don’t know or can’t explain why we feel the way we do. And it’s likely that after two hours of watching a film, we’re unable to remember our exact engagement at the 30-minute mark, or recall the parts where we might have dozed off.
But using this new process, Disney can measure the audience’s facial gestures and match those reactions to specific scenes, even individual frames, of a film.
“If data analysis of audience response in real time is used to customize content, either as revision or as new unfolding content, it will create a tighter loop between what studios and producers view as successful, marketable content in the context of audience demand,” says Siobhan O’Flynn, a digital storytelling consultant and University of Toronto instructor.
O’Flynn points out that a data-informed strategy is already being used by the likes of Disney-owned Marvel to determine its next movies. From Iron Man to Captain America, Marvel has used big data to keep track of the detailed storyworld that unites its comic book characters, identifying the most important ones and those with the strongest fan loyalty.
But what happens when that loop between audiences and creators is even tighter? Because AI can provide real-time feedback about how an audience is reacting, it’s possible that Disney could actually reshape a film as audiences are watching it.
So whereas our data has so far been collected to market to us more effectively (who hasn’t noticed their online ads change after researching a particular topic in a browser window?), that data could now be put to creative use, changing the shape and execution of a story.
There could potentially be multiple endings to a film, allowing the screening to actually change dynamically, depending on a viewer’s response to it.
Similarly, with something like a Disney musical, one of the company’s favourite formats, this smart system could measure how the audience is responding to the musical numbers. If it found that viewers weren’t responding favourably to the songs, perhaps a dialogue-driven version of the same scene could be swapped in.
From real time to real life
And, of course, there’s a natural application for this new face-reading process outside of the cinema, too.
Just as Disney is known for its movie magic, the brand is also synonymous with theme parks. With this kind of granular facial tracking, Disney Parks could react to visitors’ moods in real time, suggesting where attendees might want to go next or which ride to try, or even engaging them in a custom experience with a mascot.
But as O’Flynn points out, “scanning and archiving facial data in theme parks is a whole new level of privacy incursion and erosion.” And needless to say, once Disney brings this out of the theatre and into real life, you can be sure that other brands will be eager to get in on the action, too, measuring our facial gestures and modifying content in real time.
That’s where the red flags come in. Having our movies “watch” us inevitably means more cameras installed in theatres, in addition to those on our smartphones, on street corners, and practically everywhere else we frequent each day.
We already know that between our mobile devices and online browsing habits, there are troves of data about where we go, what we read, who we talk to, and what we like. This adds an additional layer of how we react — and the instinctive, emotional reactions we might not even be aware of.
As Ann Cavoukian, executive director of the Privacy and Big Data Institute at Ryerson University, points out, just as our browsers track what we search for and post about and serve us advertisements accordingly, this could open a whole new can of worms in which our reactions to the things around us trigger real-time feedback.
Cavoukian cautions that while only general features indicating pleasure or dismay, through smiles or frowns, would be obtained at this stage, “the much larger problem from a privacy perspective will be when the entire facial image — the actual biometric template uniquely identifying an individual — will be collected and retained.”
Such a biometric, she says, “could be used to track an individual’s activities and gather a host of other pieces of data associated with that individual.”
While the application of this new technology has the potential to change movie-going as we know it, Cavoukian suggests it will be important for Disney to get buy-in from its audiences, and questions how consent will be obtained.
Otherwise, she says, “This may come back to haunt Disney.”
And despite all of the company’s access to data, that would be an ending to the story that it hadn’t anticipated.