A disturbing trend has set into our business over the past few years. For lack of a better term, let’s call it “The Resolution War.” Just a decade or so ago, we finally made the long, painful and arguably necessary jump from standard-definition cameras to HD cameras, and with them, the entire video infrastructure. Looking back at material I shot 10 or more years ago in standard definition, basically 480 lines of resolution, the footage seems almost quaint when viewed on a modern 27” editing screen. The images were so small and so low in resolution. Of course, I was also shooting film for at least some of my projects, and I always knew that film, particularly 35mm film, was capable of much higher resolution than NTSC or PAL, necessary when the film was theatrically projected. But the majority of what I shot ended up on video screens at that standard (for NTSC) 640 x 480 4:3 resolution.
The Journey To FHD
I recall seeing NHK’s earliest versions of high-definition cameras and huge, expensive HD videotape recorders at the National Association of Broadcasters show in Las Vegas as early as the late 1990s. Little did we know back then that the transition to everyone shooting HD and all audiences viewing in HD would be so long and circuitous. It really wasn’t until December 2005, when Panasonic introduced the AG-HVX200 P2 HD camcorder, that I actually owned my first HD camera. Storage (P2 cards) was very expensive, so the price and relatively small capacity of the media were a strong incentive to shoot most of our material in 720p, not even in full 1080 HD. If you never shot 720p, it was somewhat like “HD Light,” with more lines of resolution than SD, but quite a few fewer lines of resolution than FHD 1080 footage.
The Rest Of The Production Process
One of the challenges during the transition from SD to HD was the dawning realization that the increased resolution of the cameras meant our images would reveal many things that would have been invisible in the days of standard-definition video. Hair and makeup departments had to become more fastidious to ensure that talent looked good in HD. Set and prop design had to evolve to cover up more seams, imperfections and details than ever before. Visual effects took on a whole new level of sophistication to match the increased resolution and detail-resolving ability of HD video.
From FHD To 4K And Beyond
Fast-forward 12 years to 2017, and most videographers, camera operators and cinematographers are still shooting cameras that are often only 1080-capable, but because of market demands and technical delivery specifications from clients like Netflix and others, many have had to switch over to at least 4K cameras. The most popular 4K format is referred to as UHD, or Ultra High Definition, and measures 3840 x 2160 pixels, quite an increase from FHD, or Full High Definition, which is a mere 1920 x 1080 pixels. Some 4K cameras also record to the DCI (Digital Cinema Initiatives) raster size of 4096 x 2160, a 17:9 aspect ratio instead of the 16:9 aspect ratio of UHD.
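The jump from FHD to 4K is bigger than the names suggest, because pixel count grows with the square of the linear dimensions. A quick sketch of the arithmetic (raster sizes taken from the paragraph above; the helper names are my own):

```python
# Compare the raster sizes discussed above: total pixels and the
# multiple over FHD. UHD doubles both width and height, so it
# carries exactly four times the pixels of FHD.
FORMATS = {
    "FHD (1080p)": (1920, 1080),
    "UHD 4K":      (3840, 2160),
    "DCI 4K":      (4096, 2160),
}

def megapixels(width, height):
    """Total pixel count, in megapixels."""
    return width * height / 1e6

fhd = megapixels(*FORMATS["FHD (1080p)"])
for name, (w, h) in FORMATS.items():
    mp = megapixels(w, h)
    print(f"{name}: {w}x{h} = {mp:.2f} MP ({mp / fhd:.1f}x FHD)")
```

Running it shows FHD at about 2.07 MP, UHD at 8.29 MP (4.0x), and DCI 4K at 8.85 MP (about 4.3x) — which is why every step up ripples through storage, processing and delivery costs.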
Is 8K Reaching Critical Mass?
I recently had a chance to shoot with a RED WEAPON that featured a 6K DRAGON imager (look for the hands-on review on the website soon), and while I had the camera, RED announced a new 8K imager they call MONSTRO. The raster size the MONSTRO imager produces is 8192 x 4320! I saw this week that Sharp has just introduced a camera that also features an 8K imager, the 8C-B60A, which resolves a 7680 x 4320 image. It’s interesting that the RED MONSTRO is aimed squarely at feature film and possibly episodic TV production, while the Sharp 8C-B60A is definitely aimed more at live broadcast, although it’s capable of internal recording as well.
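Those 8K raster sizes translate into daunting data rates. As a back-of-envelope sketch — assuming uncompressed 16-bit-per-photosite raw, which no shipping camera actually records (REDCODE and broadcast codecs compress far below this ceiling) — the numbers look like this:

```python
# Back-of-envelope storage math for the 8K rasters mentioned above.
# Assumption (mine, not the cameras' specs): uncompressed raw at
# 2 bytes per photosite. Real recording formats compress heavily.

def raw_data_rate(width, height, fps, bytes_per_photosite=2):
    """Return (MB per frame, GB per second) for an uncompressed raw stream."""
    frame_bytes = width * height * bytes_per_photosite
    return frame_bytes / 1e6, frame_bytes * fps / 1e9

mb_frame, gb_sec = raw_data_rate(8192, 4320, 24)
print(f"8192x4320 at 24 fps: {mb_frame:.1f} MB/frame, {gb_sec:.2f} GB/s uncompressed")
```

That works out to roughly 70 MB per frame and about 1.7 GB per second before compression — a reminder of why every resolution jump forces the entire recording and post pipeline to grow with it.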
It’s interesting to speculate about what it means that there’s now an established digital 8K cinema camera on the market, as well as an upcoming 8K alternative geared more toward live broadcast. What does all of this mean? An obvious conclusion would be that 8K, as a production format, is coming into its own. But perhaps there are other considerations to ponder as well.
Beyond The Numbers
It’s strange that a purely technical specification like 8K can cause us to ponder things philosophically, but this seems to be the case here. If 8K is the state of the art in high-end cameras in late 2017, where will we be this time next year? Is the relentless pursuit of increased resolution a good thing, or even a smart thing? Talking to actors, directors and cinematographers, it appears the answer is no. It can already be incredibly challenging to light and photograph people flatteringly in 1080, much less 4K, much less 8K. If you speak with art directors, set designers, production designers and prop masters, you’ll hear many of the same answers: each jump in resolution makes their job more difficult and expensive, not easier, without necessarily making the results more lifelike.
Is Resolution An Engineering And Design Copout?
It’s my opinion that increasing resolution is one of the simplest, least challenging methods camera manufacturers use to “improve” their cameras. Many camera makers buy their imagers wholesale from imager manufacturers, although some manufacture their own in-house. It’s much easier, from an innovation and engineering standpoint, to increase resolution than to truly innovate in how images are actually captured, processed and recorded. Most cinematographers and directors are far more interested in achieving greater dynamic range, better, more lifelike color reproduction, and more realistic and malleable images than in increasing spatial resolution. Do we need 12K and 16K imagers in our cameras? The distribution pipeline has barely become firmly established at 1080, while the pressure is on to increase sales of 4K TVs, projectors, cameras and editing systems. Where do 8K and greater cameras and pipelines belong? Is it really all just about “future-proofing”?
Are We There Yet?
Granted, there’s a loss of resolution during the debayering process, when raw image data is turned into a viewable image. Many feel that for the ultimate in 4K resolution, you really should be shooting at 6K, minimum. But taking a step back from the resolution war, where does it end? Or if the pursuit of greater resolution never ends, where does it begin to level off? Is it counterintuitive for us to be cheering and lusting for 12K, 16K and beyond? To what end? I have no simple or easy answers to these questions, but I do think they’re worth pondering. Sure, a larger raster allows a lot of creative freedom for reframing and making moves on the image in post. At some point, though, greater resolution, to me, ends up looking less realistic and more electronic. If our goal is to present imagery that looks more like real life, we may already be there as far as pure resolution.
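The “shoot 6K for a 4K finish” rule of thumb is simple ratio arithmetic. Using nominal DCI-style widths (6144 for 6K, 4096 for 4K — actual sensor rasters vary by camera), a rough sketch:

```python
# Rough oversampling math behind "shoot 6K for a 4K deliverable."
# Widths are nominal; real sensor dimensions differ per camera.

def linear_oversample(capture_width, delivery_width):
    """Linear oversampling factor from capture raster to delivery raster."""
    return capture_width / delivery_width

ratio = linear_oversample(6144, 4096)
print(f"6K -> 4K: {ratio:.2f}x linear, {ratio**2:.2f}x in pixel count")
```

A 1.5x linear margin (2.25x in total pixels) is what lets the debayered, downscaled image still hit full 4K measured resolution.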
Writer, producer and cinematographer Dan Brockett’s two decades of work in documentary film and behind the scenes for television and feature films have informed his writing about production technology for HDVideoPro Magazine, Digital Photo Pro Magazine and KenStone.net. Visit danbrockett.com.