Color can be one of your greatest allies in the creative process. Even more than color itself, the assurance that your color balance looks as good as possible, is accurate from capture to export, and is consistent from screen to screen can be paramount to sharing your vision with others. To achieve this control over color, a color calibration process is a necessity.
The Two Pillars
When discussing calibration, you must consider not only the monitor on which you view your footage, but also the camera you used to capture that footage. We will look first at the importance of monitor calibration, but it is worth noting that although what it means to calibrate a camera has changed over time, the importance of doing so has not.
Why Bother, You Ask?
Working with properly calibrated monitors can save you a huge amount of time and frustration. Color-correcting and grading your finished project on a poorly calibrated monitor can leave it looking bad when it is displayed on a properly calibrated one. That can greatly affect audience response and can even hurt your reputation, so it is worth the investment of your time to work with calibrated monitors from the beginning. If you are thinking, “What's the point if I can't control the monitor in someone else's studio?” just remember: if your images start off looking their best, they are likely to hold up better than something that was graded too light or too dark. Trust me, I write from personal experience; there are no rocks big enough to hide under when you are at a festival and your movie doesn't look its best.
It Used to be Simultaneously Simple and Frustrating
In the past, calibrating your monitor was simple. I'm talking about the old days of standard-def video. For the most part, you would calibrate your field monitor by eye. Viewing environments have a great effect on how you perceive the image on the monitor, but whether you were out in the field or in a studio or color-correction suite, the process was the same: input an electronically generated color bar signal (NTSC or PAL), flip your monitor to blue only—yes, that is what it is for—and adjust the monitor's controls by eye until the bars meet certain brightness values.
Explanations of this process are available across the Internet, so there is no point in going over it here; in any case, most people developed their own version of how to calibrate a monitor in the field, since strictly following the guidelines never seemed to yield pleasing results. Suffice it to say, it was a simple enough process that required much trial and error, as well as going back and forth between steps. In the end, you had a monitor set to display contrast, brightness, white, black, hue, and tint properly, so that you could have an idea of what you were shooting. The concept is still applicable today, of course, with on-camera monitors and HD field monitors—only with updated color bars.
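For the curious, here is a minimal sketch of why the blue-only trick works. The RGB values below are approximate full-range 8-bit levels for 75% color bars (exact broadcast-legal levels vary by standard); the point is simply that alternating bars contain no blue at all, so in blue-only mode they go dark and the remaining lit bars can be matched to their reference patches by eye.

```python
# Approximate full-range 8-bit RGB values for the seven 75% color bars.
# These numbers are illustrative; exact broadcast levels depend on the standard.
bars = {
    "white":   (191, 191, 191),
    "yellow":  (191, 191,   0),
    "cyan":    (  0, 191, 191),
    "green":   (  0, 191,   0),
    "magenta": (191,   0, 191),
    "red":     (191,   0,   0),
    "blue":    (  0,   0, 191),
}

# In blue-only mode the monitor shows just the blue channel: white, cyan,
# magenta, and blue stay lit, while yellow, green, and red drop to black.
for name, (r, g, b) in bars.items():
    print(f"{name:>8}: blue channel = {b:3d} -> {'lit' if b else 'dark'}")
```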
You may even wonder about calibrating the viewfinder of your favorite camera. Though I can't say for sure for every camera, I'd hazard a guess that you aren't going to be able to finely tune that flip-out LCD screen. For judging exposure, color, and noise, you are going to want an external monitor that maintains good off-axis viewing for color and contrast, or a high-quality EVF to which you can press your eye tightly. Given the plethora of panels used in on-camera monitors, some of which are repurposed computer panels that may not display an HD color space very well, reviewing your footage on a properly calibrated monitor, rather than relying on how it looks on a random screen, has become crucial to ensuring you are getting the image quality you want.
Computer versus Television Monitors
Computer monitors, projectors, and television monitors are all different from each other, and the same image may look different on each. There are ways to compensate for this; however, once you have finished color-correcting your final project, you may need to create different versions with different color-correction settings, depending on the final display system. This is not uncommon, and something to be aware of as you ready your film or video for distribution.
Setting the Mood for Color Consistency
Perhaps you can't do your color correction in an exacting, precisely controlled environment, especially on a set or location, but before you calibrate your monitor, here are a few general guidelines you can follow.
- Neutral environments tend to be best; a darker room is preferable to a brighter one, because the monitor doesn't have to be as bright to compete with the room's ambient light.
- Avoid mixed color-temperature sources in your visual field when possible, as they can confuse your color perception. Also try to have the light sources match your monitor's chosen color temperature.
- Avoid bright sources of light within your field of view. Adding blackout curtains to your windows will help keep your perception neutral.
- Keep direct light from shining on your monitor.
Some DITs (Digital Imaging Technicians) will bring a blackout tent with them on location, and while this doesn't usually provide a neutral viewing environment (“neutral” refers to a gray of about 18% reflectance, while black is going to be down around 3%, and will be in your field of vision as you work), it does ensure a consistent viewing environment during the day, and from job to job. Something to remember when entering your color-correction sanctuary: your eyes will need a few minutes to adjust to the environment, especially if you are coming in from outside. While doing your color grade, you will want to look at something other than your screens every once in a while, to keep your eyes fresh and not have them adjust to the monitor.
Ready, Set, Calibrate
Before you begin, remember that properly calibrating your monitors does not mean that a 6-bit monitor masquerading as an 8-bit monitor will suddenly look as good as a 10-bit monitor. What it means is that different images will display consistently on a calibrated monitor, and that from one properly calibrated monitor to the next, the image will also be consistent, allowing you to make adjustments quickly and with confidence. Please note: if you are using a poorly designed or maintained monitor, the benefits of calibration will be limited. So aim to use a good-quality monitor—after all, it is your work, your reputation, and your future job prospects that you are protecting.
One very important tool for monitor calibration is the probe, which reads your monitor's true output, not just what your monitor is set to produce. The X-Rite i1 Basic PRO 3 probe is a popular choice, though I tend to use a Datacolor SpyderX Pro; both allow you to calibrate multiple monitors, so you can rest assured you will get a consistent viewing experience. Keep in mind that monitors drift over time, and even over the course of a day, depending on operating temperature, so keeping your monitor calibrated is essential. When working with computer monitors, or projectors that are driven by computers, there are a variety of software options that can connect with a probe and automatically calibrate your monitor. Calibrating a television monitor tends to be a manual process; however, some high-end televisions are capable of interfacing with a probe and even of self-calibrating using it.
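To give a sense of what that probe-driven software is doing under the hood, here is a minimal conceptual sketch, not any vendor's actual algorithm: display a ramp of gray patches, measure each with the probe, and build a lookup table that remaps the display's native response toward a target gamma. The read_probe_luminance function is a hypothetical stand-in for whatever measurement call your probe's software provides.

```python
import numpy as np

def read_probe_luminance(level: int) -> float:
    """Hypothetical stand-in for a probe SDK call that displays a gray patch
    at the given 8-bit level and returns the measured luminance (cd/m^2)."""
    raise NotImplementedError

def build_correction_lut(target_gamma: float = 2.4, steps: int = 17) -> np.ndarray:
    """Measure a gray ramp and build a 256-entry 1D LUT that remaps the
    display's native response toward the target gamma (2.4 is a common
    choice for a dim grading environment)."""
    levels = np.linspace(0, 255, steps).round().astype(int)
    measured = np.array([read_probe_luminance(l) for l in levels])

    # Normalize measurements to 0..1 between measured black and white.
    norm = (measured - measured.min()) / (measured.max() - measured.min())

    # Native response: drive level (0..1) -> normalized output, for all 256 codes.
    inputs = np.arange(256) / 255.0
    native = np.interp(inputs, levels / 255.0, norm)

    # For each input, find the drive level whose native output matches the
    # desired target-gamma output; that remapping is the correction LUT.
    desired = inputs ** target_gamma
    lut = np.interp(desired, native, inputs)
    return (lut * 255).round().astype(np.uint8)
```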
Once your monitor has been properly calibrated, you can feel confident working on your image in any of the color spaces that your monitor or color-correction software supports. If you want to see how your footage will look on a television monitor as opposed to a computer monitor, some of the available calibration software also supports the NTSC, PAL, and Rec. 709 (HD video) standards.
Camera Calibration
Back when Eli Whitney invented the cotton gin and video cameras used tubes—before CCDs—calibrating a camera was a painstaking process of making sure that the tubes were physically aligned, and if you were using more than one camera, you would aim them all at the same chip chart while a video engineer tweaked and adjusted the signal from each camera to minimize the differences between them. Cutting between cameras that produce significantly different-looking images can be very disconcerting to the audience and can take viewers out of the program. Making two cameras render similar-looking images required a skillful engineer, especially when trying to match cameras from two different manufacturers. The arrival of CCDs mounted directly on the prism alleviated the necessity of aligning tubes, but cameras still required color balancing with each other using a chip chart, and the talents of a very experienced engineer.
"...using a color-chip chart when you shoot... allows you to accomplish quickly and automatically what used to require a trained colorist."
Today, we have cameras equipped with sensors from a wide variety of manufacturers, with gamma curves and log shooting options, not to mention the capability of shooting raw. All this is in addition to increasing access to a maddening number of menu settings that affect the way our cameras record images. If you are only using one camera, this may not be much of an issue, but variations in exposure and switching between lenses can cause shifts in color and contrast that are very time-consuming and frustrating to correct, which ends up being counter-productive to the creative process.
How to avoid this? Well, you could bring your finished project to a colorist and have them tweak and adjust your footage. The problem is that you will spend a lot of time and money just getting all your footage to match, especially if you are using footage from more than one camera, and none of that effort goes toward creative choices. Something else to consider: if some of the footage is not up to the image quality of the rest, and you've waited until the end of the edit to balance it all, you may have to degrade the better-looking footage to make it all match, or go back and edit out the poor-quality footage. Neither option is desirable.
However, using a color-chip chart when you shoot is a time-honored technique, and, combined with software advances, it allows you to accomplish quickly and automatically what used to require a trained colorist. This doesn't remove the need for a colorist; rather, it gets your material to the stage of creative color choices quickly, and with minimal frustration. I used an X-Rite ColorChecker Classic card, although there are other color card systems, such as those from Calibrite, that work just as well for this purpose. In the screen grabs below, you can see images taken with three different cameras: a standard-definition DVX100, a Blackmagic Design Pocket Cinema Camera with a Panasonic Lumix 14-42mm lens, and a Panasonic Lumix DMC-G7 with an old Nikon 50mm AIS lens.
You can clearly see the differences between the color charts. However, using the built-in Color Match feature in DaVinci Resolve Studio, I'm able to match the three charts quickly so that they are much more similar.
I should note that the footage shot with the DVX is in NTSC color and was digitized using Final Cut Pro 6, while the Blackmagic Pocket Cinema Camera footage was shot using the Film recording mode in ProRes HQ, and the Panasonic G7 footage was recorded to .MP4. The key to this “magic” is a grid that you activate and line up with the color chart you shot. This isn't limited to Resolve; it is, or soon will be, available in other programs, and there are a few different color-chip charts you can use. I went with X-Rite here for demonstration purposes. You can see below an image with the grid on it, as well as the actual color adjustments that it makes in the color panel.
Essentially, what this is doing is creating a color correction that you can apply to each individual camera's footage—shot under the same lighting and exposure settings—which calibrates the cameras to each other quickly, without your having to spend a lot of time correcting each shot as you add it to your edit. As a side note, there are adjustments you can make; for example, adjusting the color temperature will make the auto-match grade warmer or cooler (more orange or more blue)—a useful trick, but I set it to 5600K to match the color of the lights I used to illuminate the color chart. You can see the subtle difference here with the Blackmagic Pocket Cinema Camera color chart, on the left at 5600K (daylight), and on the right at 3200K (tungsten).
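If you are curious about what a chart-matching tool is computing, here is a rough conceptual sketch, not Resolve's actual Color Match algorithm: sample the chart patches from the shot footage, compare them to the chart's published reference values, and solve for a 3x3 matrix that maps one to the other in a least-squares sense. Real tools also account for exposure, white balance, and gamma, but the core idea is the same.

```python
import numpy as np

def fit_color_matrix(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Fit a 3x3 matrix M so that measured patch colors map onto reference colors.

    measured, reference: (N, 3) arrays of linear RGB values, one row per chart
    patch (e.g., N = 24 for a classic 24-patch chart).
    """
    # Least-squares solve of measured @ X ~= reference, then transpose so the
    # result can be applied per pixel as rgb @ M.T.
    X, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return X.T

def apply_matrix(image: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply the fitted matrix to every pixel of a linear RGB image (H, W, 3)."""
    return np.clip(image @ M.T, 0.0, 1.0)
```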
Once you've established corrections that calibrate your different cameras' footage to each other, essentially creating a starting point, adding the correction to the appropriate footage should make intercutting between cameras far less jarring during the edit. Since the shots will be close to begin with, further color adjustments can be of the creative kind, not just attempts to match cameras. It isn't perfect, and it won't replace a good color-grading session. Although it isn't technically calibrating your cameras to each other, just adjusting the footage, it is a very powerful yet easy-to-use tool, and one that used to require an experienced colorist who knew how to use the color chart to grade the image.
I hope that this ramble through the color-calibrating brambles has shed some light on the value of calibrating your monitors, as well as your cameras. I also hope you can see the ease of use and precision that today's calibration methods hold over those of the past. The calibration tools available to virtually every cameraman, editor, and colorist today can help you achieve more consistent footage, with less frustration and more time left for creative choices. I encourage you to become familiar with, and practice, this amazingly powerful yet simple technical process that can lead to creative growth.