A Boy And His…Dog?

The story for The Terminator evolved from an initial image in the mind of writer/director James Cameron: a gleaming metal skeleton emerging from billowing flames. Filmmaker Oliver Daly's proof-of-concept short film MILES came about in a similar fashion, with the vision of a mechanized creature keeping pace alongside a motocross biker sparking his muse.

“I imagined this ‘robot dog’ that has escaped from a military base, very powerful, artificial intelligence, paired with this kid who crashes dirt bikes in the desert,” Daly explains. “I realized what I had on my hands was a badass E.T.—in the same way Spielberg looked at Old Yeller and said, ‘How can I update that?’”

For his gone-rogue rattlebox MAX (Mechanized Assailant and Explorer), Daly extrapolated from contemporary thought on military/industrial developments in robotics—like the walking machine BigDog and its various mechanized brethren birthed from Boston Dynamics—as well as how we’ve begun to build ourselves companions.

“It’s interesting to see how technology, surveillance and the military presence is encroaching on kids’ lives,” says Daly. “We’re building machines, and even though they’re not living things in the same way, we have a very primal relationship with technology, and we’re only going to become more emotionally attached to technology in the long run. The way that [we] pair identity to devices could be similar to how you pair with a dog. When I realized the story is, in essence, about a dog, it became a lot easier. We started with sketches and went to a number of talented designers for assistance.”

“MAX”-IMUM OVERDRIVE

Professing a desire to follow in the footsteps of Neill Blomkamp and Damien Chazelle, who both made proof-of-concept shorts that led to the feature films District 9 and Whiplash, respectively, Daly generated funding for the short through Kickstarter. The filmmaker sought to draw upon the well of creatives in Southern California, including animators and VFX artists, and offer up a tasty bite-sized nugget of what his prospective feature would include. Key talent included Andrew Gant, who handled visual-effects photography using the new DepthKit system, VFX house Territory Studio (see “NASA Goes Hollywood,” HDVideoPro, December 2015) and director of photography Isaac Bauman.

Bauman had been shooting a documentary when he received the script and storyboards for MILES. “Everything was completely thought out,” he recalls. “It looked like they’d been doing prepro for two years. Because this was science fiction, Oliver’s first inclination was to make it a very slick, modern-looking short—like shooting on a RED DRAGON with Master Primes, Cookes or Summilux—and that was my first thought, too. RED is great for making images look cold, sharp and slick, but Oliver hired me because he liked the soft, organic look of my work and wanted this to be different from anything else in the sci-fi genre.”

Reevaluating MILES with an eye toward the desired visual aesthetic, the cinematographer quickly settled on the ALEXA Plus with a 4:3 sensor. “Using vintage anamorphics [Japanese-made Kowa Cine Prominars from the 1960s] makes everything softer and creamier, almost dreamy,” he reports, going on to explain how this approach netted a more film-like look. “On film, every new frame is changing. You never know the imperfections of the chemistry and the temperature and the physics, so every physical element that goes into the corresponding frame of film could actually be brighter, darker, softer—and that’s why film looks so much more organic in the end. With digital, you have the same photosites for every frame, so there’s a more consistent look that appears much cleaner. The ALEXA looks better than the RED because it has fewer pixels. [With] fewer, you have larger, more light-sensitive photosites.”

In further pursuit of a more textural, organic feel to his imagery, Bauman rated the camera at 1600 ASA, a stop above the ALEXA's native 800, shifting the range from the usual seven stops over/seven stops under.

“That shifts [things] so you have eight stops over,” he points out. “So much of MILES takes place in sunlight that choosing 1600 wasn’t that counterintuitive, as it protected our highlights.”
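The arithmetic behind that quote is easy to verify: rating a sensor one stop above its native exposure index moves one stop of latitude from the shadows to the highlights. Here is a minimal sketch, assuming the ALEXA's base rating of EI 800 and the article's round seven/seven split (real sensor latitude is not perfectly symmetric, and the helper below is purely illustrative):

```python
# Sketch of how re-rating a camera's exposure index shifts its latitude
# split. Uses the article's round figures (seven stops each way) and the
# ALEXA's published native EI of 800; actual specs vary by camera.
import math

def latitude_split(base_ei, rated_ei, stops_over=7.0, stops_under=7.0):
    """Return the (over, under) stop split after re-rating the sensor."""
    shift = math.log2(rated_ei / base_ei)  # +1 stop per doubling of EI
    return stops_over + shift, stops_under - shift

over, under = latitude_split(base_ei=800, rated_ei=1600)
print(f"EI 1600: {over:.0f} stops over / {under:.0f} stops under")
# EI 1600: 8 stops over / 6 stops under
```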

To create the robot’s vision in MILES, a software/hardware combination called the DepthKit was used on set, enabling the production to film actors, objects—and even live motocross racers—and instantly turn them into 3D models. Once a scene was filmed, Daly could fly virtual cameras around the actors, change angles and layer motion design and other effects onto that same data. Territory Studio added the HUD display, user interface, particle effects and other motion design work.
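The DepthKit's internals aren't detailed in this article, but the core technique it builds on, back-projecting each depth pixel through a pinhole camera model into a 3D point that a virtual camera can later orbit, can be sketched generically. Everything below (the function name, the intrinsic values) is an illustrative placeholder, not DepthKit's actual calibration or code:

```python
# Illustrative only: turning a Kinect-style depth frame into a cloud of
# 3D points. Intrinsics (fx, fy, cx, cy) are placeholder values.
import numpy as np

def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a depth image (meters) through a pinhole model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # pixel column -> horizontal offset
    y = (v - cy) * z / fy          # pixel row    -> vertical offset
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]      # drop pixels with no depth reading

# e.g. a fake 480x640 depth frame reading 2 m everywhere
cloud = depth_to_points(np.full((480, 640), 2.0))
print(cloud.shape)  # (307200, 3)
```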

After quickly scouting locations, which included a biking spot in the Inland Empire, the two-day/two-night shoot followed immediately. “Oliver spent so long preparing that he didn’t feel rushed at all,” Bauman observes. “He had the whole thing worked out—the cast, locations, script and [VFX] storyboards were all locked, all except for my department. It felt less rushed because he was good to go. It was all about making sure that I knew what Oliver wanted and that I had what I needed to do it.”

The DP relied upon an Easyrig Cinema 3 700N with 5-inch arm for most handheld work. “The whole scene on the bed at the beginning is shot doc style; we had the actors improvise their dialogue as I moved around them capturing different moments. I shot everything as close to wide open as possible.”

Director and DP agreed on the improv approach, with Daly explaining blocking without getting specific about framing, which yielded more naturalistic, organic results from both camera and performance.

With extremely tight budgeting and just a two-man lighting crew, Bauman utilized available light in most instances, though shots of a scientist watching the goings-on of boy and robot on a monitor in his lab required a more directed approach, with lighting shaped to hit only the man and his desk.

“I rigged a Kino Flo 2ft 4Bank from the ceiling [and] used some diffusion,” says Bauman, “as well as a 1/2 CTB on it to give a cold, electronic vibe. I also used a close-focus diopter in the matte box to get the extreme close-up of the scientist’s eyes/glasses.”
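For readers curious what that 1/2 CTB does quantitatively: gels are rated by their mired shift (one million divided by the color temperature in kelvin). A small sketch, using nominal published gel values rather than anything measured on this set:

```python
# Gel math sketch. The -68 mired figure is a nominal published value for
# a half CTB (full CTB is roughly -131); it is an assumption here, not a
# measurement from the MILES shoot.
def apply_gel(source_kelvin, mired_shift):
    """Return the resulting color temperature after gelling a source."""
    return 1e6 / (1e6 / source_kelvin + mired_shift)

# A tungsten-balanced 3200K Kino Flo cooled by a half CTB:
print(round(apply_gel(3200, -68)))  # ~4090K: a colder, "electronic" white
```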

Visual effects photographer Andrew Gant tests the new DepthKit system on set.

Bauman reduced ambient illumination in the lab by blacking out the skylight. He found the removal of light to be his most useful resource in creating well-defined imagery. “It’s very important for light to have a shape,” he maintains, “and with a low budget, you get that shape not by adding light, but by taking it away. In the bedroom scene, there were windows on two sides of the room. I blacked out one and shot with the light coming in from the other direction.”

On day exteriors, black flags helped create a shadow side when shooting actor close-ups.

POV—WITH ATTITUDE

Creating distinctive points of view for nonhuman intelligence has often been among the most memorable visual aspects of past science-fiction films, ranging from HAL 9000’s fisheye view of his shipmates in 2001: A Space Odyssey to RoboCop’s RoboVision. Diverse visual treatments also inform the POV shots in the Terminator and Predator series, though it wasn’t until Star Trek: First Contact that audiences experienced Borg first-person perspectives that were long on mood and style without relying on heat signatures or scrolling data readouts.

Daly helped ensure MAX would come off as something beyond a knockoff of past movie bots like Johnny 5 from Short Circuit and Red Planet’s AMEE by envisioning a new look for how the machine perceived the world around him.

“When writing the robot dog character, I was seeing him in a ‘lensless’ way,” the director recalls, “as machines are starting to see lenslessly through lasers and different sound waves. I always imagined this quivering, almost organic and messy array of points that are then interpolated to create the robot vision. I didn’t know if that look really existed, but I Googled to the hills, and lo and behold, found artists who had hacked an Xbox Kinect system with open-source software.”

Cinematographer Isaac Bauman preps ARRI gear to get rolling for another day on MILES.

Exploiting Kinect-enabled systems for data capture—specifically, repurposing the unit's depth-sensing function—has become more than a hobby for some, and the DepthKit is proof of that. The rig allows volumetric incorporation of live-action elements into VFX animation, giving filmmakers the ability to reshape their visuals in post, producing everything from character work to virtual fly-arounds.

Artist/filmmaker Gant, to whom Daly first turned for his robot vision, believes, “It opens up a whole new world of cinematography, where you can turn any subject into a 3D model and add your own creative vision after that; [in MILES], it looks a little bit like TRON, mixed with glitching effects. You needed hacking and programming knowledge until these guys designed the DepthKit, a software/hardware combination package that allowed me to use this technology and go out there and film on set instantly. It was just a rig and a computer set up on a cart, ready to go.”

The DepthKit rig and software pair a camera (the production used a Canon EOS 5D Mark II) with the Kinect sensor and calibrate its footage against the depth data. The resulting scene can be exported as PNG or OBJ files, the latter for high-res 3D applications such as Maya.
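The OBJ side of that export is plain text, one "v x y z" vertex line per captured point, which is why packages like Maya can ingest it directly. The write_obj helper below is a hypothetical minimal sketch of that format, not the DepthKit's actual exporter:

```python
# Hypothetical helper: write a point cloud as an OBJ vertex list.
# OBJ is plain text, so each captured point becomes one "v x y z" line
# that Maya (or any OBJ reader) can load.
def write_obj(points, path):
    with open(path, "w") as f:
        f.write("# point cloud exported per frame\n")
        for x, y, z in points:
            f.write(f"v {x:.6f} {y:.6f} {z:.6f}\n")

write_obj([(0.0, 0.0, 2.0), (0.1, 0.0, 2.0)], "frame_0001.obj")
```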

After meeting up, Gant and Daly agreed that the DepthKit needed to work for high-speed action scenes. “There was a lot of testing needed,” acknowledges Gant. “Previously, I was just shooting static, locked off on a tripod, and then you can move around the image in post after collecting a 3D model. For example, if I film your face [on a tripod] on the rig, I can fly around it in post. The biggest challenge was figuring out a flawless workflow to film high-speed stuff on set. I would have 30 minutes to get my scene, so we couldn’t have a faulty rig. Nobody had ever done anything like this with the DepthKit, so we were voyaging into a new frontier. We ended up handholding the rig, trying to get the right angle where the laser can scan most effectively to capture information.”

Actor Robby Rasmussen, who plays the protagonist Miles, gears up for a racetrack scene.

While both parties were massively pleased with the final effects as rendered by Territory Studio, Gant sees an even broader future for the DepthKit, one freeing up filmmakers from current logistical constraints. “You don’t have to go to a motion-capture facility and put on a suit with all these balls attached,” he declares. “Now you can hire a kid with a 5D and an Xbox to create beautiful effects. I want to look at Lidar scanning, where you can scan an entire city and use that as a backdrop. Also, the new DepthKit is coming out, so it will provide a lot more features when we make the film.”

CINE-“MAX”
Response to the completed short was strongly positive, with Daly securing representation and a development deal with David S. Goyer. No stranger to science fiction and fantasy, Goyer co-wrote all three films in the Dark Knight Batman reboot, then created the series Da Vinci’s Demons and co-developed Constantine.

Writer/director Oliver Daly

“David told me that he wants to shepherd young filmmakers into their first films, and that’s what he’s doing with his company Phantom Four,” says Daly. “Science-fiction films about technology are often cautionary tales. Many studies show how people have a very emotional reaction to their phones. Scientists have examined people’s interactions with robot dogs; in Japan, when one robot dog was discontinued and the manufacturer could no longer repair it, owners were actually holding funerals for their robot pets to deal with the sense of loss.

“For me,” adds Daly, “the main idea of this film is to talk about how domesticated animals can be a proto-technology that we, as humans, have created and shepherded, how technology can amplify our senses and provide companionship—just like living things. I’m offering that maybe there’s some symbiosis there if this thing we create lives alongside us. Maybe we can be helpful to each other.”

For more information on MILES, visit crainesystems.com. Learn more about the DepthKit at rgbdtoolkit.com.
