
Can a computer write a script? Machine learning goes Hollywood



Award-winning filmmaker Kevin Macdonald has directed many movies, including the drama “The Last King of Scotland” and thriller “State of Play.” But last year was the first time Macdonald worked with a script written by a machine.

Macdonald directed a 60-second Lexus sedan commercial using artificial intelligence that relied on tech giant IBM’s platform, Watson. The computer produced a script featuring a seemingly sentient Lexus ES that hits the open road, whizzing by stunning vistas of shoreline and forests before saving itself from a dramatic crash.

“I thought this was something amazing that had all these ambiguities in it and strangeness in it,” Macdonald said. “It’s only a matter of time where the formula of what makes up a great story, a great character can be learned by a computer.”


It may sound like science fiction, but the idea of using computers to help write scripts and other tasks is gaining serious traction in Hollywood. Machine learning — where computers use algorithms to sift through large amounts of data and often make recommendations — is infiltrating all corners of the industry. Entertainment companies are using the technology to color-correct scenes, identify popular themes in book adaptations and craft successful marketing campaigns. Even talent agencies are harnessing the technology for suggestions on how to market their stars.

“These are tools that enable us to make smarter decisions,” said Kenneth Williams, executive director of the Entertainment Technology Center at USC.


Unlike many Silicon Valley tech companies such as Netflix or Google, Hollywood studios have been slow to embrace artificial intelligence and machine learning, at least off screen. Cautionary tales of machines taking over abound. Think of HAL in the 1968 film classic “2001: A Space Odyssey.”

“You have this sort of Armageddon type of response to the Orwellian implications of computers ruling the world,” Williams said. “People get very fearful of any kind of automation, especially artificial intelligence.”

But attitudes are shifting — and for good reason. It can be hard for a team of humans to sort through overwhelming amounts of information, such as audience surveys and critical reviews, to understand just what makes a commercial or movie a hit. Feeding all of that information into a machine equipped with artificial intelligence — and programmed with a huge database of successes — can yield surprisingly prescient suggestions.
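In rough outline, that approach amounts to fitting a model to a table of past releases and asking it to score a new project. The sketch below is purely illustrative (the feature names and numbers are invented, not any studio’s actual data), but it shows the shape of the idea.

```python
# Minimal, illustrative sketch: fit a simple model on a hypothetical table of
# past releases, then score a new script. All features and values are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.DataFrame({
    "has_chase_scene":    [1, 0, 1, 1, 0, 0, 1, 0],
    "dialogue_density":   [0.40, 0.80, 0.50, 0.30, 0.90, 0.70, 0.45, 0.85],
    "share_female_lines": [0.45, 0.10, 0.30, 0.50, 0.05, 0.20, 0.40, 0.15],
    "was_hit":            [1, 0, 1, 1, 0, 0, 1, 0],
})

X = history.drop(columns="was_hit")
y = history["was_hit"]
model = LogisticRegression().fit(X, y)

# Score a new script and surface which features the model learned to reward.
new_script = pd.DataFrame([{"has_chase_scene": 0,
                            "dialogue_density": 0.85,
                            "share_female_lines": 0.12}])
print("Estimated hit probability:", round(model.predict_proba(new_script)[0, 1], 2))
print("Learned feature weights:", dict(zip(X.columns, model.coef_[0].round(2))))
```

A real system would be trained on thousands of titles and far richer features, but the output is the same kind of nudge: the learned weights point at what past hits had that a new script lacks.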


Ask the computer to analyze a plodding script, and it might respond: “Where’s your chase scene? Why is your dialogue so superficial? Why are there so few women on screen?” After all, if IBM can make a computer that defeated world chess champion Garry Kasparov, why can’t it be a critic?

Machine learning can provide a treasure trove of data on why certain movies or TV shows work and why others fail. The Entertainment Technology Center last year presented analysis showing correlations between a movie’s story structure and how well it performed worldwide at the box office. For example, films that led with action sequences, like the robbery in 2008’s “The Dark Knight” or a battle in 2010’s “How to Train Your Dragon,” did more than 13 times better at the box office on average than films that started with memory sequences.
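The ETC has not released its analysis as code, but the grouped comparison behind a claim like that can be sketched in a few lines; the grosses below are invented placeholders, not the real figures.

```python
# Illustrative only: compare average worldwide gross by type of opening scene.
# The titles are omitted and the numbers are made up for the example.
import pandas as pd

films = pd.DataFrame({
    "opening_scene": ["action", "action", "memory", "dialogue", "memory", "action"],
    "worldwide_gross_musd": [1005.0, 495.0, 41.0, 120.0, 28.0, 350.0],
})

avg_by_opening = films.groupby("opening_scene")["worldwide_gross_musd"].mean()
print(avg_by_opening)
print("action vs. memory ratio:",
      round(avg_by_opening["action"] / avg_by_opening["memory"], 1))
```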

“We’re not telling anyone what to make,” Williams said. “Maybe we’re saying there is a smaller audience statistically for this kind of movie and if you want to make this movie, maybe make it a little less expensively.”

To create its commercial, Lexus worked with several companies, including creative agency The&Partnership and marketing technology services firm Visual Voice, whose artificial intelligence platform was supported by IBM Watson. The artificial intelligence was fed 15 years’ worth of award-winning car and luxury-product ads as well as consumer insights data.

This helped the machine identify what would resonate with consumers, which the artificial intelligence interpreted to mean limited dialogue and a handful of visually appealing scenes, including a winding road that showed water on one side and trees on the other. It took location scouts two to three weeks to find such a road in Romania.

“We wanted to create something memorable,” said Michael Tripp, general manager of brand communications at Lexus Europe. “I believe there is a strong emotional connection with the ad because of the way it was scripted.”


Lexus said the ad, which appeared on YouTube and other social media sites in November, helped push European sales of the luxury sedan 35% above its goal.

It wasn’t the first entertainment project for Watson.

The computer was also used in 2016 to scour 100 horror movie trailers to determine what makes them effective. It used that knowledge to create a trailer for 20th Century Fox’s horror film “Morgan” by selecting different clips from the movie.
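IBM has not published the Watson trailer tooling, but in simplified form the workflow amounts to scoring candidate shots from the film and handing the highest-scoring ones to a human editor. In the sketch below, score_tension is a hypothetical stand-in for whatever the system actually learned from those 100 trailers.

```python
# Heavily simplified sketch of machine-assisted trailer selection. The scoring
# function is a hypothetical placeholder, not IBM's model.
from dataclasses import dataclass

@dataclass
class Shot:
    start_sec: float
    end_sec: float
    description: str

def score_tension(shot: Shot) -> float:
    """Stand-in for a learned model rating how 'trailer-worthy' a shot is."""
    keywords = ("scream", "dark corridor", "sudden", "whisper")
    return float(sum(word in shot.description for word in keywords))

shots = [
    Shot(12.0, 15.5, "character walks down a dark corridor"),
    Shot(44.2, 46.0, "sudden scream off camera"),
    Shot(61.0, 70.0, "calm breakfast conversation"),
]

# Surface the highest-scoring shots as candidates for a human editor to review.
for shot in sorted(shots, key=score_tension, reverse=True)[:2]:
    print(f"{shot.start_sec:.1f}-{shot.end_sec:.1f}s  {shot.description}")
```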

“Together man and machine create a better product,” said Michelle Boockoff-Bajdek, chief marketing officer for IBM’s Watson.

Fox also partnered with Google Cloud, the tech giant’s division that sells cloud computing services, to analyze movie trailers.

For example, the companies used software to identify different aspects of the trailer for the 2017 Fox action movie “Logan,” factoring in such images as the Marvel Comics hero’s facial hair. Then it examined other movie trailers on YouTube with similar images to gauge what types of audiences would go see “Logan.”

The software also was able to accurately predict movies that “Logan” viewers had seen in the past, showing an overlap in audiences for superhero films and movies with a “rugged male action lead.”
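One plausible way to approximate that kind of overlap (not necessarily Google’s actual pipeline) is to compare the sets of visual labels a vision model detects in each trailer. In this sketch, the labels and the titles other than “Logan” are invented.

```python
# Illustrative sketch: treat each trailer as a set of detected visual labels and
# use Jaccard similarity as a rough proxy for audience overlap. Labels and the
# non-"Logan" titles are hypothetical.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

trailer_labels = {
    "Logan":           {"facial hair", "car chase", "desert", "gun", "rugged male lead"},
    "action_film_a":   {"car chase", "desert", "gun", "rugged male lead", "explosion"},
    "romantic_comedy": {"city", "cafe", "wedding", "dinner party"},
}

target = trailer_labels["Logan"]
for title, labels in trailer_labels.items():
    if title != "Logan":
        print(title, round(jaccard(target, labels), 2))
```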


“You want to make sure you are getting as much intelligence behind those decisions as possible,” said Buzz Hays, Google Cloud’s global lead of entertainment industry solutions. “Machine learning is going to help us learn from that instinct to go with the gut or adjust as we go.”

The new technology can also pinpoint what stories are resonating online, isolating particular scenes or characters that people are most passionate about. Wattpad Studios, based in Toronto, identifies popular stories uploaded to Wattpad’s online platform and flags them for studios to develop into movies and shows.

Canadian company Entertainment One, or eOne, is developing two of Wattpad’s fictional stories into TV series, including a dystopian drama called “The Numbered.”

Wattpad’s technology showed a burst of user comments around the revelation that one of the characters is gay, convincing eOne to keep him in the TV version of the story.
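Wattpad has not detailed its algorithms, but spotting that kind of spike can be as simple as flagging chapters whose comment counts sit far above the story’s average; the numbers below are invented.

```python
# Toy burst detector over per-chapter comment counts (invented data): flag any
# chapter whose count is more than twice the story's average.
comments_per_chapter = [120, 135, 128, 140, 610, 150, 145]

average = sum(comments_per_chapter) / len(comments_per_chapter)
for chapter, count in enumerate(comments_per_chapter, start=1):
    if count > 2 * average:
        print(f"Chapter {chapter}: {count} comments (engagement burst)")
```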

“It’s an interesting form of development for us — new voices, new points of view, a built-in audience from a different kind of platform,” said Jocelyn Hamilton, an executive at eOne.


Talent agencies also are experimenting with data crunching. Los Angeles-based Creative Artists Agency operates a data analytics platform that uses machine learning to pull up information on consumer behavior. The platform, called CAAintell and developed in 2017, gathers data from dozens of sources, including social media sites and information on credit card purchases.

Agents can then use the data to support their recommendations to studios, such as why an actress should be in a film if she has a large global following. It could also give agents ideas on what types of brands their clients could represent, based on products and brands their fans like.

“It’s just building that picture around a particular talent,” said Steve Hasker, chief executive of CAA Global. “It creates a really interesting dialogue around a set of opportunities that a talent might not have thought about.”

But Hasker says that CAAintell is in no way meant to replace agents.

“If we can provide in real time better information and insights to our agents, we think they are going to be better at their jobs,” he added.

San Jose-based Adobe is experimenting with using machine learning to help mimic someone’s voice. Its technology can create a synthetic version of a person’s voice with 30 minutes of their audio.

“If you have a large body of someone’s audio, you can train that to make up extra words,” said Gavin Miller, head of Adobe Research.


That could make it easier to fix flubbed lines or re-record dialogue without having to fly actors back in for new recording sessions. Of course, it also raises the potential for abuse.

“This question has been around for a long time — ever since people could edit tape — so we feel an added responsibility to figure out steps we can take to mitigate some of the possible misuses of it,” Miller said.

Computer analytics can also examine diversity issues in Hollywood. USC’s Viterbi School of Engineering Signal Analysis and Interpretation Laboratory, or SAIL, and the Geena Davis Institute on Gender in Media used face tracking and audio analysis powered by machine learning to show studios how little screen time and dialogue women get in movies.
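Once a face tracker and a speech analyzer have labeled the footage, the measurement itself is simple arithmetic. The sketch below assumes hypothetical per-gender totals rather than SAIL’s actual outputs.

```python
# Illustrative calculation of screen-time and speaking-time shares from
# hypothetical per-gender totals produced upstream by face tracking and
# audio analysis.
face_frames = [("woman", 4200), ("man", 18300)]        # frames with a face on screen
speech_seconds = [("woman", 310.0), ("man", 2150.0)]   # seconds of attributed dialogue

def share(pairs, label):
    total = sum(value for _, value in pairs)
    return sum(value for key, value in pairs if key == label) / total

print(f"Screen-time share, women: {share(face_frames, 'woman'):.1%}")
print(f"Speaking-time share, women: {share(speech_seconds, 'woman'):.1%}")
```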

Studios including Sony Pictures are working with the institute’s machine learning tools to advance their diversity and inclusion initiatives. The goal is “not to tell people what to do, but give them a mirror of opportunity to see the unconscious bias that they have and make an active decision on whether they intended that,” said Megan Smith, chief executive of tech firm shift7. “If they didn’t, it’s a real opportunity.”

But there are limitations as to what machines can do.

Algorithms can also produce flawed and biased results if they are based on insufficient data. In one infamous case, photo categorization software Google Photos in 2015 mistakenly labeled black people as gorillas because the company failed to provide its algorithm with a diverse range of human faces to analyze.


USC’s SAIL is also working on how to analyze emotional sentiment in scripts, but its technology hasn’t quite figured out comedy yet, said Shri Narayanan, SAIL’s director.
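As a rough illustration of what scene-level sentiment scoring looks like (using NLTK’s off-the-shelf VADER model as a stand-in for SAIL’s research tools, with invented scene text), an emotional arc can be traced scene by scene:

```python
# Sketch: score each scene's text with NLTK's VADER sentiment model.
# Requires: nltk.download("vader_lexicon"). Scene text is invented.
from nltk.sentiment import SentimentIntensityAnalyzer

scenes = [
    "INT. KITCHEN - The family laughs over breakfast.",
    "EXT. HIGHWAY - Brakes scream. The car spins toward the guardrail.",
    "INT. HOSPITAL - She takes his hand. He finally opens his eyes.",
]

analyzer = SentimentIntensityAnalyzer()
for i, scene in enumerate(scenes, start=1):
    compound = analyzer.polarity_scores(scene)["compound"]  # -1 (negative) to +1 (positive)
    print(f"Scene {i}: {compound:+.2f}")
```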

“What is funny and how to be funny — that is something technology is not ready for,” Narayanan said.

[email protected]

Twitter: @thewendylee
