The Boring Detector Is Anything But Boring

A recent beta release of Blackmagic Design’s DaVinci Resolve introduced a boring detector. By detecting long shots in your sequence, it can help highlight all the yawn-inducing scenes in your project.

I imagine it would light up like crazy if you edited “2001: A Space Odyssey” or “Rope” or “Birdman.” Maybe the next beta release will give us the exciting detector—lighting up whenever Jason Bourne is on the screen. Or the sad detector…

Like others, I’m having fun with this bold concept from Grant Petty at Blackmagic Design. For me, it’s not the tool but the marketing of it—mainly the name—that’s the problem. Defining a boring shot based on length is a myopic view of editing, as some of the examples above indicate.

You could simply call it a shot length detector. ("Boring," I know.) Unfortunately, the controversial "boring detector" name obscures the tool's real usefulness.

Being able to find longer shots might help, but there's another part of this tool that gets ignored in all the fuss: jump cut detection. The jump-cut threshold is user-defined, so you can set it to look for single frames, two frames, or more. This is useful on longer timelines, where at first glance you might not spot leftover frames from a mismarked edit point or from moving a clip without snapping to an edit point.
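As a thought experiment, the core of that jump-cut check could be as simple as scanning clip durations against a user-set threshold. This sketch assumes a timeline is just a list of (clip name, duration in frames) pairs; the names and data are made up for illustration.

```python
# Minimal sketch of jump-cut flagging, assuming a timeline is a list of
# (clip_name, duration_in_frames) tuples. The threshold, like Resolve's
# setting, is user-defined.

def find_jump_cuts(timeline, max_frames=2):
    """Return clips short enough to be leftover 'flash' frames."""
    return [name for name, frames in timeline if frames <= max_frames]

timeline = [
    ("interview_wide", 240),
    ("flash_frame", 1),       # a stray frame from a mismarked edit point
    ("drone_shot", 480),
    ("two_frame_slip", 2),    # left behind after moving a clip
]

print(find_jump_cuts(timeline))  # ['flash_frame', 'two_frame_slip']
```

On a long timeline, a list like this is easier to act on than scrubbing for single-frame flashes by eye.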

But I will take this analysis a step further. What if the analysis tool could detect how many times a shot has been moved, trimmed or otherwise affected? Call it the ignored detector. If a shot hasn't been touched since it was first inserted into a sequence, maybe it has been forgotten or hasn't received the attention it deserves. Was it ignored because you spent so much time finessing that drone shot?
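The hypothetical ignored detector could boil down to a touch count per clip. This sketch assumes the editing software logs how many times each clip has been moved, trimmed or had an effect applied; the clip names and counts here are invented.

```python
# Sketch of the hypothetical "ignored detector": assume the software
# tracks a touch count (moves, trims, effects) for every clip.

def find_ignored(touch_counts):
    """Return clips untouched since they were first inserted."""
    return [name for name, touches in touch_counts.items() if touches == 0]

touch_counts = {"drone_shot": 14, "interview_cu": 3, "b_roll_07": 0}
print(find_ignored(touch_counts))  # ['b_roll_07']
```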

Or maybe there are some quick global indicators that could color all of the clips not playing at 100 percent speed. Or maybe everything below 100 percent speed is blue, 100 percent is green and anything above 100 percent is red. A similar scheme could flag the positioning or scaling of shots.

Along the lines of checking for 100 percent scaling, how about a way to check which stock shots are “comps” and which aren’t? Yes, you can usually tell by the watermarks, but some stock accounts let you try out scenes without watermarks.

And since the software can find all the comp stock shots, how about having it create a simple text list of those shots? I could hand that off to whoever purchases the stock, and they'd be working from a list of the shots we actually used.

Analyzing for stock comps could be done by evaluating the codec or file format. MP4s could be a good indication of a stock comp. That analysis could also extend to verifying full-resolution shots versus proxies.
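Put together, the comp finder and the hand-off list could be a few lines. This sketch assumes, as the paragraph above suggests, that an MP4 container is a reasonable hint of a stock comp; the filenames are hypothetical.

```python
from pathlib import Path

# Assumption from the text: stock comps tend to arrive as MP4s.
LIKELY_COMP_EXTENSIONS = {".mp4"}

def list_stock_comps(clip_paths):
    """Return clips whose container format suggests a stock comp."""
    return [p for p in clip_paths
            if Path(p).suffix.lower() in LIKELY_COMP_EXTENSIONS]

clips = ["a010_c004.mov", "city_timelapse_comp.mp4", "crowd_aerial_comp.MP4"]
comps = list_stock_comps(clips)
print("\n".join(comps))  # the text list to hand off for purchasing
```

In practice you'd want to inspect the actual codec rather than trust the extension, but the hand-off list works the same way either way.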

I could also imagine quick checks to make sure that various shots all have the same effects, like LUTs or color grades. As I consider this type of analysis, the ideas keep rolling.
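A consistency check like that could compare each clip's applied effects against a reference set. A minimal sketch, with invented effect names standing in for LUTs and grades:

```python
def find_grade_mismatches(clip_effects, reference):
    """Flag clips whose applied effects differ from the reference set."""
    return [name for name, fx in clip_effects.items()
            if set(fx) != set(reference)]

reference = {"Rec709_LUT", "FilmGrain"}
clip_effects = {
    "shot_01": ["Rec709_LUT", "FilmGrain"],
    "shot_02": ["Rec709_LUT"],             # missing the grain pass
    "shot_03": ["FilmGrain", "Rec709_LUT"],  # order doesn't matter
}
print(find_grade_mismatches(clip_effects, reference))  # ['shot_02']
```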

I know I started writing this a bit tongue-in-cheek about the name of the boring detector. But the tool, not the name, is symbolic of the future—where edit tools are going.

Project deadlines are becoming shorter and shorter. As content needs to be posted ever more quickly, editors need all the help they can get to finish the job. A "shot length" detector might help an editor under pressure.