Content Insider #704 – Content AI
By Andy Marken – andy@markencom.com
“When your enemy’s making mistakes, don’t interrupt him.” – Billy Beane, “Moneyball,” Columbia, 2011
Netflix reportedly has a great set of analytics that will help them determine if a screenplay or film they’re considering from the Mumbai Film Festival will have the stickiness they need to attract and retain the broadest audience possible.
But after a rough day of Zooming, why, when we sit down in front of the flat screen and simply say, “Netflix, show me something of interest,” do we get three options we watched last week?
Dude … we saw them; they were good but … NO!
Machine learning and AI run across every segment of the M&E industry – and everywhere else – but they can’t replace the writer, actor, director, cinematographer, editor, CGI tech, audio engineer, animator, you name it, in creating content that people want to see and experience.
Sure, Netflix, Disney, Universal and every studio have long had their own process to determine what should be greenlighted, moved down the food chain or rejected. But the new AI platforms let them build on the approaches they have been refining for years.
The reliance on data analysis in the studio and film community, which has led to its embrace of AI today, goes back more than sixteen years.
RENTRAK, now part of Comscore, moved into the studio system in 2004, becoming an almost instant success by providing detailed analysis on how films might perform at the box office.
Today, with even more detailed data being fed into AI platforms, producers and editors are able to search through terabytes of stored content to find just the right scene(s) to enrich or complete the story.
Computer vision enables content producers to manage all of the digital visual content to help speed the media production process.
But it still takes the human touch to deliver the money shot – or a show/movie people will sit through and recommend to friends.
See That – Thomas Middleditch follows the AI-developed script in the short film “Sunspring” and takes out his eye. O.K., so he didn’t, but it was a logical move for the machine-learning system.
“Sunspring,” produced in 2016, was written by an AI – screenplay, stage directions, even the musical interlude – with director Oscar Sharp and technologist Ross Goodwin collaborating on the project as an example of how well the technology can handle creative work.
If you’re kind, you call it an incoherent sci-fi B-movie that had all of the elements – intrigue, romance, murder, a dark future world – but fell short … far short.
Sharp and Goodwin did their best. They had their system study/analyze dozens of sci-fi screenplays.
Using long short-term memory (LSTM) recurrent neural network technology (often used for text and speech recognition), the AI – named Benjamin – delivered the screenplay, stage directions and character lines for “Sunspring.”
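Under the hood, that kind of generator is straightforward to sketch. What follows is a minimal illustration of character-level LSTM text generation in the spirit of Benjamin, not the actual system – the corpus file name, window size, layer sizes and training budget are placeholders, and the real project trained a far larger model on dozens of sci-fi screenplays.

```python
# Minimal, hypothetical sketch of character-level LSTM text generation.
# Corpus file, window size and training budget are placeholders.
import numpy as np
import tensorflow as tf

text = open("scifi_screenplays.txt", encoding="utf-8").read().lower()  # assumed corpus
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
window = 60  # characters of context per training example

# Build (context window -> next character) training pairs.
starts = range(0, len(text) - window, 3)
X = np.array([[char_to_idx[c] for c in text[i:i + window]] for i in starts])
y = np.array([char_to_idx[text[i + window]] for i in starts])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 64),
    tf.keras.layers.LSTM(256),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=5)  # toy training budget

def generate(length=400, temperature=0.8):
    out = text[:window]  # seed with corpus text so every character is known
    for _ in range(length):
        ids = np.array([[char_to_idx[c] for c in out[-window:]]])
        probs = model.predict(ids, verbose=0)[0]
        probs = np.exp(np.log(probs + 1e-9) / temperature)  # temperature sampling
        out += chars[np.random.choice(len(chars), p=probs / probs.sum())]
    return out

print(generate())  # emits Sunspring-grade dialogue: grammatical-ish, rarely coherent
```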
The film finished in the top ten of entries in the Sci-Fi London contest, earning an assessment from sci-fi author Pat Cadigan, who said, “I’ll give them top marks if they promise never to do this again.”
In other words, machine learning and AI aren’t designed to replace human decision-making but to empower it.
A good or bad movie can’t be technically quantified any more than the difference between a 10-20 percent and an 80-90 percent Rotten Tomatoes score can be defined.
Quality, interesting, captivating, provocative, thrilling, beautiful, ugly and sordid are all soft or subjective words.
Measuring or quantifying these words is, at best, abstract.
The images and words that kept you glued to a film may be the very elements that bored or repulsed us.
It’s a lot like Justice Potter Stewart’s 1964 Supreme Court opinion on obscenity: “I know it when I see it.”
AI is now increasingly instrumental in helping studios, underwriters and producers determine the likely financial outcome of a show/project.
Industry analyst and producer Stephen Follows has found that, nebulous as the terms are, a positive or negative message can affect whether – and how much – a show/film profits.
Keep Positive – With people suddenly “locked away” in their homes away from each other, it’s a little obvious that a positive show/film is more meaningful, more profitable than one that is a downer.
Studying the financial performance of more than 4,000 films across all genres (sci-fi, adventure, comedy, drama, romance, fantasy), Follows found (which you might say is obvious) that the more positive the message, the more profitable the project.
Sure, there are a number of people who, in the midst of the pandemic, couldn’t wait to watch “Alien,” “Friday the 13th,” “Nightmare on Elm Street,” “Suspiria,” “It” or “Halloween”; but frankly, our wife would forever revoke our remote and voice control privileges – and worse – if we attempted to watch any of them.
No way … ain’t gonna happen!
While AI is data-driven, shows/movies are built around human creativity … the rest is just execution.
An early application of data analytics has been increasing diversity in films – not only because it is right, but because it is (and needs to be) a natural part of the life around us.
Smashing Barriers – It has been a long time coming, but women and people of color attract audiences as well as anyone else. Chadwick Boseman and the female warriors in “Black Panther” certainly showed that a good story, strong/positive characters and great acting are what draw audiences.
Films like “Black Panther,” “Parasite” and “Roma” showed the industry that people like strong, positive stories, regardless of race or sex.
AI is increasingly used to “show” studio executives and project underwriters that a project presents the diversity needed to appeal to the broadest possible audience.
The technology is being used to predict the MPAA (Motion Picture Association of America) rating, detect the emotions expressed, predict the target audience, determine whether the project passes the Bechdel Test (a baseline for female representation) and check that it includes a diverse cast of characters.
Today’s advanced technology helps studios and directors overcome biases and deliver a project that more accurately represents and presents a story that appeals to the broadest range of audiences.
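Of those checks, the Bechdel Test is simple enough to illustrate. Below is a toy heuristic, not any studio’s actual tool – the cast list, the male-reference pattern and the scene format are invented for illustration; real systems work from structured screenplays and cast metadata.

```python
# Toy Bechdel Test heuristic: does any scene contain two named women talking
# to each other about something other than a man? Cast list and scene format
# are hypothetical -- real tools parse structured scripts and metadata.
import re

FEMALE_CHARACTERS = {"SHURI", "OKOYE", "NAKIA"}            # assumed cast metadata
MALE_REFERENCES = re.compile(r"\b(he|him|his|t'challa)\b", re.IGNORECASE)

def passes_bechdel(scenes):
    """scenes: list of scenes, each a list of (speaker, line) tuples."""
    for scene in scenes:
        female_speakers = {s for s, _ in scene if s in FEMALE_CHARACTERS}
        if len(female_speakers) < 2:
            continue                                       # need two named women
        dialogue = " ".join(l for s, l in scene if s in FEMALE_CHARACTERS)
        if not MALE_REFERENCES.search(dialogue):
            return True                                    # and not about a man
    return False

example = [[("SHURI", "The new suit absorbs kinetic energy."),
            ("OKOYE", "Then we test it on the next mission.")]]
print(passes_bechdel(example))                             # True for this toy scene
```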
Still Climbing – While there have been bright spots in the M&E industry with females and people of color proving they are equal to the task, the numbers behind the camera and in studio management are still painfully low.
Diversity doesn’t ensure a successful project, but it does help – a lot.
Machine learning and AI are proving to be valuable tools throughout the TV/film ecosystem – from recommendation engines to the creative process itself: scripting, shooting, post-production, meta-tagging and distribution.
“It’s true that some jobs/activities will potentially become redundant as the industry progresses,” Allan McLennan, chief executive of PADEM Media Group, noted. “Most of the displacement will be tasks that are highly repetitive, boring jobs/tasks that can take advantage of today’s advanced technology. This frees the production team for more creative work.”
Post-production workflow is one of the prime areas already taking advantage of AI to minimize drudgery, reducing the cycle times and the multiple handoffs.
Just the Right Scene – It has been years since crew members pored through miles of celluloid to find just the right segment, scene and shot to include in the final film. Things have become faster, better and more accurate when technology is assigned the task.
Computer vision has eliminated hours of manual checks to spot dead pixels, fix aspect ratios, improve color correction and make touch-ups, and it helps render some of today’s breathtaking special effects more quickly and accurately.
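One of those checks is easy to picture in code. The sketch below flags suspected stuck/dead pixels by looking for positions whose value never changes across sampled frames – the file name, sampling step and threshold are assumptions, not a production QC pipeline.

```python
# Flag suspected dead (stuck) pixels: positions whose grayscale value barely
# changes across sampled frames while the rest of the image varies.
# File name, frame step and threshold are illustrative assumptions.
import cv2
import numpy as np

def find_stuck_pixels(video_path, step=24, var_threshold=1.0):
    cap = cv2.VideoCapture(video_path)
    frames, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:  # sample roughly one frame per second at 24 fps
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32))
        idx += 1
    cap.release()
    stack = np.stack(frames)                     # (samples, height, width)
    variance = stack.var(axis=0)                 # temporal variance per pixel
    ys, xs = np.where(variance < var_threshold)  # pixels that never change
    return list(zip(xs.tolist(), ys.tolist()))

# Hypothetical usage:
# suspects = find_stuck_pixels("dailies_reel_07.mov")
# print(f"{len(suspects)} pixel positions flagged for manual review")
```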
VFX teams have taken advantage of a growing list of AI tools that eliminate bottlenecks and cost overruns. Advanced solutions manipulate and extract images during capture, reducing the need for green screens on the shoot and cutting the time and cost of the rotoscoping stage.
Fine Hairs – Special effects, CGI and AI have improved dramatically in recent years, as you can easily see by watching the “Planet of the Apes” films from 1968 to the present. Caesar looks about as real as you can get.
If you study the technology advances made across the “Planet of the Apes” sequels, it is readily apparent that improvements in machine learning algorithms brought Caesar and the other apes, gorillas and orangutans to “life,” right down to the bristling and movement of individual hairs on their bodies.
Crowd Control – Russell Crowe played to a packed house in “Gladiator” thanks to the AI-enabled technology that filled the seats of the Colosseum.
Rather than hire thousands of extras for his epic “Gladiator,” Ridley Scott creatively used AI-driven VFX technologies to create and orchestrate the Colosseum’s massive crowd.
Sure, given enough time, talent and money, these and other epic, lifelike characters and crowd scenes could have been created by hand, but…
We don’t judge, but exactly how much more real was it for Christopher Nolan to crash a real 747 in the airport scene of “Tenet” rather than employ advanced camera work, CGI and scale models?
Okay, accounting ran the numbers, and it was a “retired” 747, which Nolan says turned out to be cheaper than recreating the crash with camera tricks, CGI and models.
We’re not tree-huggers, but studies by UCLA’s Institute of the Environment and a recent British Film Institute (BFI) report found that films with budgets over $70M produce an average of 2,840 tons of CO2 – roughly the amount absorbed by 3,700 acres of forest annually.
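For scale, those two figures imply a simple per-acre rate – a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the figures above.
tons_co2_per_tentpole = 2_840   # average for films budgeted over $70M
acres_of_forest = 3_700         # forest quoted as absorbing that CO2 in a year

print(f"{tons_co2_per_tentpole / acres_of_forest:.2f} tons of CO2 per acre per year")
# ~0.77 -- i.e., each big-budget production effectively 'spends' 3,700 acres
# of forest absorption for the year.
```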
Environmental Look – In addition to dealing with pandemic protocols, studios and production teams are closely examining how their work impacts the environment. We can do better.
We don’t know about where you live, but here in California we’re literally burning through the tree supply we need to absorb the stuff, so advanced creative tools will have a big, positive impact.
The report, backed by the British Academy of Film and Television Arts (BAFTA), recommended that studios and producers increase their use of advanced digital technology.
Beyond the noble goal of helping save the world around us, McLennan said that advanced machine learning and AI are helping M&E organizations realize cost advantages, increase their speed to market and strengthen viewer loyalty.
“Deep learning and AI have proven to deliver value to content management and distributors,” McLennan noted. “It is fast and accurate in A-B testing as well as segment selection in the development of trailers and helps build successful marketing campaigns and promotions.”
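The A-B testing McLennan mentions is the most concrete of those tasks, and a minimal version is easy to sketch: compare completion rates for two trailer cuts with a two-proportion z-test. The counts below are invented illustration data, not results from any real campaign.

```python
# Compare completion rates for two trailer cuts with a two-proportion z-test.
# All counts are made-up illustration data.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided
    return p_a, p_b, z, p_value

# Hypothetical counts: viewers who watched each trailer cut to the end.
p_a, p_b, z, p = two_proportion_z(4_210, 10_000, 4_480, 10_000)
print(f"Cut A {p_a:.1%} vs Cut B {p_b:.1%}  (z = {z:.2f}, p = {p:.4f})")
```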
But the acid test for the technology – and for the studio, producer and production team – is the show/film’s audience reach: being more interesting than the myriad of other shows/movies that still open in theaters but are increasingly streamed to your favorite screen.
Whether they have a smart TV, streaming media player or just a remote, people want/expect the service to offer shows/movies they actually want to watch.
With AI and data analytics, it is increasingly easy to meet those expectations – as long as the service has the personal data it needs to make appropriate recommendations.
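A toy version of that recommendation step – including the detail the opening of this column complains about, filtering out what the viewer has already watched – might look like the sketch below. The titles and the tiny viewing matrix are invented for illustration; real services work at vastly larger scale with far richer signals.

```python
# Item-based recommendation from a tiny (invented) viewing matrix using cosine
# similarity, with already-watched titles filtered out of the suggestions.
import numpy as np

titles = ["Heist Drama", "Space Opera", "Baking Show", "Crime Doc", "Rom-Com"]
views = np.array([          # rows = viewers, columns = titles, 1 = watched
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 0],
    [0, 1, 1, 0, 1],
    [1, 1, 0, 0, 0],
], dtype=float)

def recommend(user_row, k=2):
    norms = np.linalg.norm(views, axis=0) + 1e-9
    item_sim = (views.T @ views) / np.outer(norms, norms)  # title-to-title cosine
    scores = item_sim @ user_row                           # affinity to each title
    scores[user_row > 0] = -np.inf                         # drop what's already seen
    return [titles[i] for i in np.argsort(scores)[::-1][:k]]

print(recommend(views[1]))   # unseen titles closest to this viewer's history
```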
“The sharing of personal data – personal privacy – has become a hurdle streaming services have to address head-on,” McLennan emphasized. “Without authorized access to the customer’s data, all the streaming service has is a huge library of content and a nice but worthless computer program.”
“Netflix, Disney and Apple have strongly – and credibly – emphasized to their subscribers that they do not/will not share their data and only use it to offer up content in line with entertainment they have watched and might want to consider watching,” McLennan said.
“In addition, they advise the subscribers that they also use the data to help them develop future projects that they will be interested in viewing,” he continued.
Peace of Mind – People everywhere use AI-enabled technology in their daily work/lives and often don’t even realize it. But they do know when someone misuses their personal data.
“The services that promise and deliver strong personal data security are not only the ones with the best customer loyalty but also the best consumer participation in willingly sharing their data,” he added.
Robust AI and data analytic solutions, combined with aggressive security technology, not only increase consumer confidence in the service but also extend their loyalty.
China’s Kai-Fu Lee – the oracle of AI – explained on “60 Minutes” early last year that machine learning technology can have a broad range of applications in the world, but it can’t replace compassion, trust and empathy.
This is especially true of the creation, production and distribution of visual stories and entertainment.
Machine learning, deep learning and AI will improve and enrich the M&E industry; but as Lee said, “Neural networks and computation algorithms don’t have our soul.”
Despite what the technologists might say, AI doesn’t have the creative empathy needed to make you want to tune in to another show or movie.
As Billy Beane said in “Moneyball,” “Guys, you’re just talking. Talking, ‘la-la-la-la,’ like this is business as usual. It’s not.”
Andy Marken – andy@markencom.com – is the author of more than 700 articles on management, marketing, communications and industry trends in media & entertainment, consumer electronics, software and applications. He is an internationally recognized marketing/communications consultant with a broad range of technical and industry expertise, especially in the storage, storage management and film/video production fields. This has led to an extended range of relationships with consumer, business and industry trade press, online media and industry analysts/consultants.