The Future Of Mobile Storytelling

Filmmakers and visual storytellers have been trying to bring virtual-reality narrative content to audiences for almost a decade, and the production challenges alone have been numerous and daunting. The core questions remain the same: How do you create a 360º view of a narrative adventure without seeing camera equipment and crew? How do you generate sophisticated visual effects when the camera’s framing depends largely on the viewer rather than the director and cinematographer? And, most importantly, how do you create an immersive virtual experience without a massive circular IMAX theater or expensive, specialized VR headsets and playback systems?

Visual effects and creative content studio The Mill recently overcame these obstacles, teaming up with the Google Advanced Technology and Projects (ATAP) group, Bullitt Productions and Fast & Furious and Star Trek 3 director Justin Lin to create HELP, the latest Google Spotlight Story and the first immersive, live-action film created for mobile devices.

Google’s Spotlight Stories is a relatively new venture, launched in October 2013 and billed as a “storytelling platform for the mobile audience.” HELP is the fourth story in the series and the first to feature live-action actors. It takes place after dark in Los Angeles’ Chinatown and features aliens, a generous helping of explosions, and a cast of heroines and heroes. Playing back on a mobile phone gives the audience an accessible window they can point anywhere in a scene, creating a feeling of being part of the action that is far more immersive than passively watching from a couch or a computer desk.

The Mill, respected for its high-end visual effects in award-winning moving image, design and digital projects for the film, advertising, games and music industries, played a pivotal role in the creation of HELP, working in close partnership with Bullitt Productions and director Lin, as well as the team at Google ATAP. The challenge was to fully realize many ambitious goals for this new medium.

The Mill helped create a full 360º camera rig equipped with an array of four RED EPIC DRAGON digital cinema cameras to follow the action. Photographing each scene from four different vantage points produced four overlapping angles that were combined into a single 360º view. Gawain Liddiard, visual effects supervisor at The Mill, explains some of the initial challenges in making this format workable.

“We began this project almost two years ago,” Liddiard recalls. “One of the really tricky challenges to get around was the fisheye lenses used on the rig. These have an inherent bowing effect that’s meant to be there for virtual image projection, but mix that with the normal lens effects and distortions designed to project an image onto a flat imaging plane, and you have some challenges to overcome.

Collaborating with the Google ATAP team of experts and acclaimed live-action director Justin Lin through Bullitt Productions allowed The Mill to flex its creative and technical muscles to solve new and complex challenges. All photos courtesy of The Mill.

“That’s where we stepped in and wrote our own de-lensing tools to make the format work,” adds Liddiard. “The tools created are very accurate and powerful, and turn the four views into the single ‘unwrapped globe’ format. We also developed a proprietary software and hardware solution to work in real time on set. This performs a simplified version of stitching a continuous 360º view that we called Mill Stitch.”
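
Mill Stitch itself is proprietary and The Mill hasn’t published its internals, but the geometry behind “de-lensing” a fisheye view into an unwrapped globe is well understood. Here’s a minimal sketch in Python, assuming an idealized equidistant fisheye model; the function name and every parameter are illustrative, not The Mill’s actual calibration or code.

```python
# A minimal sketch of the "de-lensing" math described above: resampling
# one fisheye view into its slice of an equirectangular ("unwrapped
# globe") panorama. The equidistant fisheye model and every parameter
# here are illustrative assumptions, not The Mill's actual calibration.
import numpy as np
import cv2

def fisheye_to_equirect(src, pano_w, pano_h, f_px, cx, cy, yaw_deg=0.0):
    """Remap one fisheye frame into an equirectangular panorama.

    src      -- fisheye image (H x W x 3)
    f_px     -- fisheye focal length in pixels (equidistant: r = f * theta)
    cx, cy   -- optical center of the fisheye image, in pixels
    yaw_deg  -- which way this camera points around the rig
    """
    # Longitude/latitude of every output pixel.
    lon = (np.arange(pano_w) / pano_w - 0.5) * 2.0 * np.pi
    lat = (0.5 - np.arange(pano_h) / pano_h) * np.pi
    lon, lat = np.meshgrid(lon - np.radians(yaw_deg), lat)

    # Unit view vector for each pixel; the camera looks down +z.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant fisheye: radial distance from the image center is
    # proportional to the angle off the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    phi = np.arctan2(y, x)
    map_x = (cx + f_px * theta * np.cos(phi)).astype(np.float32)
    map_y = (cy - f_px * theta * np.sin(phi)).astype(np.float32)

    # Directions behind the camera land outside the source frame and come
    # back black; a blending pass would take those pixels from the other
    # three cameras on the rig.
    return cv2.remap(src, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)
```

Running four such remaps, one per camera at 90º yaw increments, and feathering the overlapping seams is, in spirit, what a real-time tool like Mill Stitch has to do for every frame.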

There were other significant technical challenges faced by the team, such as how to track motion in a virtual world where viewers can look almost anywhere they choose simply by moving or tilting their phones.


“In motion-tracking work for traditional media, the camera is mounted on a motion-control rig,” Liddiard explains, “where the moves can be programmed into the rig and the data for the camera movement, lens focal length, exposure and camera field of view are recorded and archived. This data is then passed on to the visual-effects team as tracking reference so that the backgrounds created can be accurately tracked to match the movement of the camera, resulting in a seamless and convincing composite.”

The bluescreen photography for HELP called for a slightly different approach to tracking the moving camera rig. “In order to accurately track the action, we would have to take a rectilinear crop of the image, assigning each camera a theoretical 4.6mm focal length, as if each camera had been using a normal 4.6mm lens,” Liddiard continues. “We discovered that if we tried to track one background plate image at a time, we could get a very accurate, stable result. However, combining the four images would result in drift of the tracking between the shots.”
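
For a sense of the numbers involved, the horizontal field of view of an ideal rectilinear lens follows directly from the focal length and the sensor width. The quick check below assumes the RED DRAGON’s published 6K sensor width of roughly 30.7mm, a figure not taken from the production itself.

```python
# Quick arithmetic behind the rectilinear crop: the horizontal field of
# view of an ideal rectilinear lens follows from focal length and sensor
# width. The 30.7mm width is RED's published figure for the DRAGON's 6K
# sensor, an assumption on our part rather than a number from the shoot.
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm):
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

print(round(horizontal_fov_deg(4.6, 30.7), 1))  # ~146.6 degrees per camera
```

At roughly 146º per camera, four views overlap generously, which is what makes a continuous 360º stitch possible in the first place.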

The team needed to bias the tracking for each shot toward a master camera, typically the “hero” camera pointing at the actors. “We would compensate by weighting the other three angles so that their tracking calculations were biased toward the ‘hero’ camera shot,” says Liddiard. “We were then able to obtain very steady and realistic tracking for the film in this way.”
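
The article doesn’t detail the weighting math, but the idea can be sketched simply: each camera’s independent solve produces its own per-frame estimate of the rig’s position, and the non-hero solves are pulled toward the hero’s to eliminate drift. Everything below, from the fixed weight to the data layout, is an illustrative assumption.

```python
# A toy version of the hero weighting: each camera's independent solve
# gives its own per-frame estimate of the rig position, and those
# estimates drift apart. Pulling the other solves toward the hero
# camera's keeps all four consistent. The fixed weight and the data
# layout are illustrative assumptions.
import numpy as np

def blend_toward_hero(solves, hero_index, hero_weight=0.75):
    """solves: list of (num_frames, 3) arrays of solved rig positions."""
    hero = solves[hero_index]
    blended = []
    for i, track in enumerate(solves):
        if i == hero_index:
            blended.append(track.copy())
        else:
            # Bias this camera's estimate toward the hero solve so the
            # four tracks agree on a single rig trajectory.
            blended.append(hero_weight * hero + (1.0 - hero_weight) * track)
    return blended
```

A production solver would treat rotations the same way (e.g., with quaternion interpolation) and likely vary the weight per frame with tracking confidence.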

One challenge of filming in any sort of VR environment is that the camera rig, by definition, photographs a 360º view. So where do the camera operator, crew, grip and lighting go in the final shot? The HELP team came up with an ingenious solution.

“You could move the camera in a lot of different ways, and we discovered that utilizing a cable camera rig like the ones used in sporting events seemed to work best,” reveals Liddiard. “The only visible items in the shot were the cable and a small portion of the cable cam rig above the camera that were easy to work around. We had rigging and gaffing challenges, as well. You just have to get creative as to where and how you rig lighting, a factor that drove us to a bluescreen approach. After trying a more live-action location, we evolved toward the idea that CGI environments shot with bluescreen were best. This made our workflow much more efficient by virtue of not having to do massive cleanups for every single shot.”

After numerous tests, experimentation and R&D, The Mill developed Mill Stitch (trademark pending), a proprietary software solution that takes images from multiple cameras and then stitches the output into a continuous 360º view. This on-set solution provided the director and the director of photography with the unique ability to dynamically move the camera to follow high-intensity action that is the dramatic hallmark of HELP.

One of the primary stars of the film is an alien creature that evolves during the story. It begins as a friendly and childlike being, but as the action heats up, the creature transforms into an increasingly aggressive and threatening presence. The Mill team worked to develop the creature’s proportions, musculature and skin textures to help sell the character at a massive scale, referencing crocodile, elephant and rhino skin, as well as bearded dragons, to get the right effect. As the creature grows from about six inches to 200 feet tall, balancing the artistic look against the technical demands became a considerable challenge.

During postproduction, a more refined version of Mill Stitch was used in tandem with several other proprietary Mill software tools to seamlessly rebuild each scene, combining the live action with vast CGI environments.

Lead character artist Majid Esmaeili shares, “One of the more difficult parts of designing the character’s evolution was to make the few spikes that appear in the second phase of the creature’s growth increase to more than 4,000 spikes in its fifth phase. We had to do the same with its internal skeleton for rigging purposes, creating a blend-able skeleton from Phase One to Phase Five.”
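
Esmaeili doesn’t describe the rig’s implementation, but a “blend-able” skeleton can be pictured as interpolating each joint’s rest position between the phase rigs, assuming they share joint topology. The joint names, positions and linear blend below are all illustrative assumptions.

```python
# A minimal sketch of a "blend-able" skeleton, assuming the phase rigs
# share joint topology: each joint's rest position is interpolated
# between the Phase One and Phase Five skeletons as the creature grows.
# The joint names, positions (in feet) and linear blend are illustrative.
import numpy as np

phase_one = {"spine_01": np.array([0.0, 0.25, 0.0]),
             "spine_02": np.array([0.0, 0.40, 0.0])}   # ~six-inch creature
phase_five = {"spine_01": np.array([0.0, 90.0, 0.0]),
              "spine_02": np.array([0.0, 140.0, 0.0])} # ~200-foot creature

def blend_skeleton(rig_a, rig_b, t):
    """Interpolate joint rest positions; t=0 is Phase One, t=1 is Phase Five."""
    return {joint: (1.0 - t) * rig_a[joint] + t * rig_b[joint]
            for joint in rig_a}

halfway = blend_skeleton(phase_one, phase_five, 0.5)
```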

With all of the production value, innovation and craft put into the creation of HELP, it stands out as groundbreaking in a field crowded with many technological achievements.

The experience of viewing HELP on your mobile phone is unique, and I can honestly say I’ve never experienced media this way before. While it isn’t yet as immersive as sitting in an IMAX theater, it’s more physically involving.

As an experiment, I handed my iPhone with HELP running on it to several friends. It was fascinating to witness the different ways each person experienced the film—some merely rotated the phone to slightly change their view, while others were literally spinning around the room, truly immersed in the story.

At this point, HELP and the other short films in the Google Spotlight Stories series seem to be more a technology demonstration than an actual product or entertainment stream. It seems natural that Google may eventually monetize or license this “VR for mobile” platform technology for other companies and individuals to create their own content.

While it may or may not become the future of visual entertainment, Google’s VR for mobile is a smart step forward, utilizing ubiquitous technology that we carry in our pockets and interact with daily. HELP and the Spotlight Stories app are impressive accomplishments that will leave you wanting more.

The best way to experience this new technology is to download the free Google Spotlight Stories app, available from the Google Play Store (play.google.com) if your smartphone is Android-powered or from the Apple App Store (itunes.apple.com). Visit The Mill’s website at themill.com.
