Thursday morning began with discussions on HDR (High Dynamic Range) and its many implementations. The panel consisted of representatives from consumer electronics manufacturers, studios, broadcasters and technology developers. HDR is still a work in progress. That’s not to say it doesn’t work; it’s just that it demands a lot of attention across the overall workflow, from capture to mastering. The tools and implementations aren’t yet universal or widespread.
I talked with one studio leader who said that they work in standard dynamic range (SDR) throughout the post process. Only at the very end do they go back to the original files and grade the HDR. The time spent on that grade is a fraction of what it probably should be.
This presents a problem in that everyone is used to working in SDR throughout the creative process: “We are creating the lowest quality that we deliver and then at the end we jump to HDR.” But that’s changing.
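The panel didn’t get into the math, but the gap between SDR and HDR grading comes down to the transfer function. HDR deliverables are commonly mastered to the SMPTE ST 2084 (PQ) curve, which maps code values to absolute luminance up to 10,000 nits. As a rough illustration (a sketch, not anyone’s production code — the constants are from the published ST 2084 spec):

```python
# SMPTE ST 2084 (PQ) constants, as published in the spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    """Normalized PQ code value [0, 1] -> absolute luminance in nits."""
    n = code ** (1 / M2)
    return 10000 * (max(n - C1, 0) / (C2 - C3 * n)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Absolute luminance in nits -> normalized PQ code value."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# A 100-nit SDR reference white lands at roughly code value 0.508 --
# barely half the PQ range, which is why an SDR-first grade leaves so
# much of the HDR container untouched until that final pass.
```

The point of the sketch: an SDR grade only exercises the bottom of this curve, so the late-stage “jump to HDR” the panel described really is a separate creative decision, not a conversion.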
Of course, I won’t talk about the fact that TVs usually ship with image “enhancement” options turned on, preventing them from displaying the content creator’s original intent. I won’t.
But the next panel did talk a little bit about that. The discussion from DPs, studios and display manufacturers was encouraging. Metadata and standards were indicated as a possible solution.
Following was an update on the Academy Color Encoding System (ACES). This is the color workflow system that Hollywood professionals started working on as early as 2004. The first release of the system was in 2014 and it is being used on more and more features. The discussion was about what was happening with the spec and about adoption. More information on ACES can be found here: www.ACEScentral.com
Remote production wrapped up the morning. First, eGames/eSports took center stage to show how this fast-growing segment will impact mobile trucks and workflows. Following that were examples of concert live production. Then a 360 studio presented a behind-the-scenes glimpse of a 360 live production of the 2017 Eclipse. Some interesting facts from that presentation:
- They had 7 remote locations plus a host in Atlanta and a helicopter shot
- Each remote location had a 3-camera 360 rig
- Each location had satellite upload of each camera (3) to New Jersey
- Each location had 1 satellite back feed to control camera operation
- Real-time stitching was done in New Jersey
- There were more than 27 separate satellite feeds
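The feed count is easy to sanity-check. A quick tally (host and helicopter links excluded, since the presentation didn’t say how those traveled — the remote sites alone get you past 27):

```python
# Tally of the satellite feeds described in the eclipse presentation:
# 7 remote locations, each with a 3-camera 360 rig uplinked to New
# Jersey, plus 1 control back feed per location.
locations = 7
cameras_per_location = 3

uplinks = locations * cameras_per_location  # 21 camera feeds to New Jersey
back_feeds = locations                      # 1 camera-control feed per site
total = uplinks + back_feeds                # 28

print(f"{uplinks} uplinks + {back_feeds} back feeds = {total} feeds")
```

Twenty-eight feeds from the remote sites alone, consistent with the “more than 27” figure.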
The day moved on to localization. Pixar explained the need by studios to localize their content for worldwide distribution. For example, Pixar often uses newspapers in a scene to fill in the backstory quickly and efficiently. If the newspaper is in English it doesn’t make any sense to non-English speaking audiences, so that backstory doesn’t get told. This can’t be solved with captions and dubbing without destroying the moment on screen. Instead, new scenes are rendered with the appropriately translated text. Pixar mentioned over 40 different regions that have to be addressed.
There are also situations where text on screen isn’t part of the story but needs to be neutralized so that the scene isn’t localized by accident. For example, you might want a street to be just a street, not a street in America. Signs might be changed to an icon or a drawing – something that isn’t necessary for the story but keeps the location universal.
Another example was from Inside Out where Riley is offered broccoli. Her reaction is commonplace in the United States, where most kids don’t like broccoli. That joke doesn’t play in Japan where most kids like broccoli. Instead, green peppers were used.
The teams working on localization start this process in parallel with the production of the film, looking at scripts and storyboards and always asking, “Will that work in…?” This also points to the studios’ need to store all these versions efficiently. The IMF standard mentioned in my previous post addresses exactly this.
The last day at the Tech Retreat included an interesting presentation on how bad the consumer electronics media is at predicting trends. Various data showed how long “hot” products last in the media life cycle (about four years). Some tips on how to get it right:
- Trends don’t come out of nowhere – there are early stages
- Many trends are boring
An example of that last point is that talking about the industry prepping and moving to the cloud isn’t as exciting as the “smart home”, but it is happening.
The Tech Retreat brings together a lot of different perspectives. This year the trends topic was a presentation from an engineer at the Department of Defense, who talked about detecting modified video. Various samples of fake video were shown and then broken down to illustrate how the fakery was determined. Some fakes were blatantly obvious, like vehicles changing color one frame before an explosion, but others were more subtle. Techniques used to spot fakes include examining compression macroblocks to see if they line up or match cloned areas. Mind-boggling.
The next presentation was on how the fusion of Artificial Intelligence (AI), social media and advertising is changing how audiences are built and maintained. It was amazing to see how social advertising is able to create demographic groups so precisely. Equally amazing is the move to real-time bidding for advertising – not paying for the audience you hope to reach, but paying for the audience you do reach.
The retreat wrapped up with a presentation on perceptual fatigue in Film, Broadcast and VR and how we humans prioritize our senses. That was followed by thought-provoking questions about where we are headed. Examining the growth of data centers and their impact on in-house computing power made you wonder whether any of the tools we have now will still be relevant.
Wonder. Mind-boggling. Amazing. Those are all good words to describe the Hollywood Professional Association Tech Retreat.