The Myth Of 4K, Part I

Okay, I’ll admit it. The title is a little hyperbolic, but stay with me here.

A few years ago, when 4K cameras became a little more accessible, I talked with a producer during an edit session. They told me their company was going to buy a new camera, and that it would be 4K.

I listened to all of the details about which camera they had decided on and what lenses they’d purchase. I asked if they’d finish in 4K after they received their new camera. No, they wouldn’t finish in 4K; they’d still finish in HD.

Then I asked if they'd shoot in 4K and whether they'd upgrade their storage. I knew that they struggled with the amount of storage they had and how difficult archiving was for them. So, my question about 4K shooting was really about the infrastructure they currently had and how it would cope with the increased file sizes. Their answer was that they weren't going to shoot 4K. Instead, they'd shoot in 2K so they could push in.

As an editor who has worked in a lot of supervised sessions, probably one of the most important skills I've learned is to keep my reactions to people's comments, suggestions and opinions in check—no eye-rolling, no smirking and no outright laughing. I used every ounce of that skill at that moment. Okay, maybe not. In the client's defense, these were the early days of 4K.

So, I paused a bit and explained that 2K has the same vertical resolution as HD—1080 pixels—but is slightly wider: 2048 pixels across versus 1920 for HD. Those extra 128 pixels would let them pan a little side-to-side, but with no extra vertical resolution, there's nothing to push in on without upscaling.
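If you like seeing the arithmetic spelled out, here's a quick sketch. The frame sizes are the standard DCI 2K and HD values; the script and its variable names are just an illustration:

```python
# Standard frame dimensions (width, height) in pixels
HD_W, HD_H = 1920, 1080  # HD (Rec. 709)
K2_W, K2_H = 2048, 1080  # DCI 2K

# Extra pixels available when cropping an HD frame out of a 2K frame
pan_headroom = K2_W - HD_W    # side-to-side repositioning room
push_headroom = K2_H - HD_H   # vertical room for a "push in"

print(f"Horizontal pan headroom: {pan_headroom} px")   # 128 px
print(f"Vertical push-in headroom: {push_headroom} px")  # 0 px
```

The 128-pixel difference explains the pan: you can slide the HD frame left or right within the 2K frame. But any actual push-in would crop below 1920×1080 and force an upscale for HD delivery.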

It also meant that if you’re monitoring on location in 2K but delivering in HD, you really aren’t seeing your true framing. So, I argued, shooting in 2K wouldn’t let them push in and it could end up causing problems due to inaccurate monitoring.

While all of that was true, my comments seemed to suggest that 4K was the answer to their goal of pushing in.

Next time: where the "shoot 4K and get two different framings" idea falls apart.