The role of the editor has changed drastically from days of old, when professionals toiled in tandem to shape and carve out the final product. The ingest process would convert footage into an edit-friendly, low-resolution format that was logged by an assistant editor before an off-line editor shaped the storyline, culminating in an on-line edit that conformed the high-resolution footage to the off-line cut. This finished picture would then be sent to a colorist for grading, while a mixer arrived to shape the sound design and final audio. This basic postproduction process usually involved at least five savvy professionals. Today, however, the picture is different, with an editor expected to fill many, if not all, of these positions while cutting the work. As a result, we reached out to two versatile editors in the field to learn how technology has changed the editorial process for better and, sometimes, for worse.
Freelance editor Sarah LaSpisa has cut a wide variety of documentary and television promos, as well as shaping behind-the-scenes content for high-profile clients, including work on Alfonso Cuarón’s Gravity. LaSpisa also edited the Showtime documentary That Gal…Who Was in That Thing: That Guy 2.
“One of the great things about freelancing is that you get to try out your editing chops in a lot of different ways,” says LaSpisa. “I recently edited behind-the-scenes footage on Gravity and became enamored with Adobe Creative Cloud and how Premiere integrates so effortlessly with After Effects and Photoshop. It seems more and more that editors have caught on to using Premiere with After Effects in the freelance world, a trend I’ve gladly embraced, as it’s an efficient way of working.”
When asked to name specific tools that have significantly changed her workflow, LaSpisa references Video Copilot’s Optical Flares package. “It has some fantastic lighting effects that allow you to easily hide awkward transitions,” she reveals. “It can be used in almost any kind of situation. I’ve used it to good effect in a web series spoof on ABC’s The Bachelor.” The package gave LaSpisa a polished result in no time, allowing her to emulate the effects that give The Bachelor its signature look.
“I’m also very happy with the FxFactory plug-in architecture [used for Final Cut Pro, Motion, Premiere Pro and After Effects],” she continues. “It allows you to easily access dozens of impressive filters from a variety of designers.”
Such tools are gaining popularity with editors as clients operate on dwindling budgets yet maintain high expectations.
“I personally believe that the best work comes with a team effort, but clients often expect me to run the gamut myself—off-line, on-line, color correction, sound design, as well as mixing,” LaSpisa adds. “When this is the case, DaVinci Resolve becomes an excellent tool, so much so that I need to learn how to use it better. I like the color-correction tools in Premiere, and sometimes turn to Resolve on higher-end projects, but it ends up taking me a lot longer to finish the work due to lack of familiarity.”
When asked where she sees her future in editing, LaSpisa references the current resolution race in film and TV. “It’s sort of an odd paradox for me to be editing in high resolution these days—4K, 6K, even 8K—on some of the newest high-end cameras, but so much of the content ends up on tiny four-inch smartphone screens,” she responds. “We live in interesting times.”
For many editors, switching visual formats also has become the latest technological challenge. Livio Sanchez, acclaimed editor at bicoastal Spot Welders, is an accomplished, multi-award-winning artist who has carved out compelling content across commercials, music videos, documentaries and independent films.
Recently, though, he found himself entering a completely new medium: virtual reality (VR). Sanchez was up for the challenge and has since embraced more VR content, quickly learning how to craft compelling storylines in this fresh new format.
In VR programming, it’s the viewer—not the director, cinematographer or editor—who ends up choosing the shot and deciding what to watch. There are no zooms, less camera movement than in classic 2D programming, and every shot has the potential to be watched in a 360-degree sweep. This raises the question: How does an editor decide how long to hold any shot in VR? What are the challenges of switching from traditional video editing to the expansive world of VR?
“VR is still very much in its infancy, but growing,” Sanchez offers. “I’ve worked on three New York Times Magazine VR projects and, as you can imagine, it isn’t a movie studio, so large budgets don’t exist. In January, however, free Google Cardboard viewers were sent out to magazine subscribers, a low-cost system to encourage interest and further the development of VR and augmented reality [AR] applications. That ended up being the most expensive project that The New York Times had ever produced, offering photography and articles from the magazine, plus a VR film that I edited. It was an interesting new way of distributing VR to the masses.”
The method of “capturing” VR content is radically different from that of traditional film and video. What are some of the challenges of editing such content?
“Technically, I’m still using editing software, and still working on a timeline with video and audio tracks,” says Sanchez. “But this is where the similarities with normal video editing end, because I can’t do quick cuts in VR, as they can disorient the user. You have to allow time for the viewer to look around in the medium and experience the space around them, take it all in and get settled.
“In VR, I’m very conscious of taking care of the audience and guiding them through the experience,” he continues. “It’s my job to make sure that the viewer doesn’t ‘eject’ or rip off the headset because they become nauseous. I have to build trust with the audience and get them settled until I have room to be more aggressive, depending on what the story calls for.”
So what is his current toolset for shaping and crafting these unique VR experiences? “I’m utilizing Adobe Premiere to edit VR right now,” reveals Sanchez. “I’m kind of an Avid guy, but in VR, I’ve now become a Premiere guy because, at this point, it handles 4K files better than Avid. I’ve completed VR work with [Vrse CEO and director Chris Milk], whose team has developed proprietary software that gives me a live stitch in my editing system, which is how the separate camera angles are assembled into a live 360-degree view.
“I use an Oculus Developer’s Kit and Kolor Eyes 360-degree display software so that the director and I can check out the dailies and scenes in VR,” adds Sanchez. “I can’t edit the whole time wearing the VR headset, because it takes too much concentration and becomes way too intense, but I do check the dailies in VR, then I go back to editing in a 360-degree view on my flat screens while the director checks out everything in the VR headset.”
When asked if he thinks virtual reality is merely a fad or if it has the potential to become a new way of telling stories, Sanchez is reflective. “3D is an effect, while VR is a new medium,” he responds. “You can even use 3D within VR, so that gives it new life. I think VR is here to stay and this is just the beginning. It can be used in so many more ways than mere entertainment. It can help doctors in surgery, or if you want to travel to the Dominican Republic, you can put on a VR headset and experience the magic of the island before you go. The possibilities are only limited by your own imagination.”
One thing is clear: the definition of editor is continually evolving as technology advances. Whether it’s cutting content in higher resolution, slicing images with greater dynamic range or working in an entirely new medium, the editor has no choice but to embrace the material he or she is given. Yet no matter the project, one thing remains: the artistic integrity of the editor’s craft and the challenge of conveying storylines in the most engaging and compelling ways possible.