First up, this article has its genesis in a discussion over at the AVS forum. What are Calibration Nazis, or shall we call them Calibration Militants? Is their perspective reasonable, does it represent something that cannot be attained and they just can’t admit it to themselves, or is it outright hypocrisy? This article will examine all the steps in the display calibration process and identify where there are multiple ways of doing things and where there is only one way.
It should be noted that neither the THX Video Systems Calibration program nor the ISF Calibration program instructs its students to ignore the end user or the equipment that the end user actually watches. All calibrations must take into account the actual equipment used to view the programming. Do we calibrate strictly for the sake of calibration, or should we actually consider the client and factor them into the calibration process?
For a typical calibration, here is a list of the things that must be done:
1. Brightness
2. Contrast
3. Sharpness
4. Color and Tint
5. Overscan/Keystone
6. Gamma
7. Grayscale
8. Color Management Systems
10. Reference Material
Brightness. Let’s focus on the first item, brightness, and see how exact this step can actually be. The actual setting of this control can vary slightly depending on the test disc or signal-generator pattern that is used. Some patterns are on the coarse side in terms of the visible elements available. I often use the PLUGE pattern on the DVE disc for the initial setting and then fine-tune or double-check with the Spears & Munsil disc, which has finer elements. The biggest difference these two discs might account for is about one click on the brightness control of any TV. I will not get into the specifics of setting brightness, as the reader is welcome to watch the tutorial on how this is done. No need to repeat myself here.
First variable that can result in a different setting: the test pattern used. Different test patterns from different sources vary the amount of white in the image, from none to a little to a lot. How much white is in the pattern can affect the final brightness setting even in the same room, because that light output affects how we see things in the dark.
Second variable: viewing distance. Brightness can be set precisely with your nose to the screen, where one can just barely see a hint of the checkerboard pattern on the S&M disc. That would be the correct spot, except that the viewer at the normal viewing position cannot see this distinction and may not even see the 2% black bar. The display is technically correct, but the viewer experiences black crush, since he is losing shadow detail based on where he sits. Suggesting that he sit 6 inches from the display is absurd. Do you give the person the correct setting based on where he sits, or what is technically right? Who is writing the cheque here? In most cases, calibration clients are not production studios but real people in real home theater settings.
I’m going to give you the best tasting cake in the world … but I am not going to let you taste it … but you still have to pay me. And by the way, the cake will spoil over time and you have to get me to come bake you another perfect cake that you still won’t be allowed to taste.
Contrast. This control has far more flexibility than brightness and can be set correctly from 6 inches or from most viewing positions. Following the three rules of setting contrast should be fairly straightforward: use the test patterns that help you avoid clipping and discoloration of the image. When it comes to the eye-fatigue portion of setting contrast, or light output, there is a lot of variability. The recommended specification for direct-view displays is 35 fL of light output, but this number was determined at a time when digital displays did not exist, and it also accounted for fundamental weaknesses of CRT technology. (Not to mention that all displays were 4:3 at the time.) Today’s digital displays do not have these CRT weaknesses, so is it still wise to strictly adhere to an outdated specification? These specifications were also developed for editing bays in production facilities, which look nothing like a typical person’s home theater. The specification also assumes the presence of a backlight behind the display, which few people actually have. (As highly recommended as it still is.) There is a link on the site for those looking to get proper backlights for their displays. It leads to cinemaquestinc.com.
The specifications also account for viewing angles which may or may not apply to people at home.
In addition to the subjective elements of setting contrast, I throw in one more: the decision to clip image detail above 100% video white. Material can be mastered all the way out to 254, but not all films use this additional range; some do and some do not. Some calibrators have even suggested that the material above 100% is not important, so losing it does not mean much. I am not sure where the proof for that kind of statement comes from, though. It is okay to lose picture detail here, but sharpness must always be set precisely right? It seems okay to move the self-proclaimed militancy around when it is convenient, and just as easy to dismiss other things as stuff that doesn’t matter.
The open-ended nature of contrast means that some subjective decision is always made here: whether to clip the image to get more light output, or to move beyond specifications that are in need of updating. Any choice in setting contrast involves some element of compromise.
Sharpness. This is actually pretty easy on most sets: the control is turned down until the ringing or edge enhancement disappears from the test patterns. The problem is that on some TVs, finding this precise location is not easy. Pioneer Elite plasma sets were at optimal sharpness somewhere between a setting of -8 and -11. Even with the nose at the screen, it was that hard. LG displays are often equally challenging: too low softens the image, but the detent position at the middle is not right either. Five people in the same room might genuinely debate where that optimal position is. It can be that subtle at times. And then there are displays whose controls are too coarse, so edge enhancement cannot be fully removed without softening the image. (Some BenQ projectors were like that.) Depending on the TV, there is room to debate where optimal sharpness is, because it is not always cut and dried.
Let’s add in viewing distance and the visual acuity of the client. He sits here, and he will ultimately be watching the TV well after I am done. I describe the parameters of setting sharpness to him and get a sense of when he thinks it is set correctly from where he sits. If an image looks soft from the viewing position because it is technically right at 2 inches, do you waver? I’d prefer to talk to the client about the trade-offs he is facing: perfection that makes him perpetually unhappy, or a slight compromise whose downside he may never see. If a person cannot see sharpness ringing on a test pattern that is literally down to the pixel level, he will never see this effect on real material, since real material can never be smaller than the pixel level of the TV. Test patterns at the pixel level and real material somehow smaller than a single pixel? That doesn’t make sense.
Color and Tint. I don’t think anyone can be a militant here because of the inherent flaws in the blue filters we use. Setting color and tint with blue filters is a best guess; better than nothing. Different blue filters can give different answers. Add to that, color isolation modes in the TV are a better choice, but they again have limitations because of how the TV maker sets up the color decoders in the display. This is why there are now two ways to set color and tint on TV sets: the first method uses the blue filter, and the second uses a meter measuring the brightness of the color red relative to white. I have seen enough television sets where neither of these methodologies is 100% foolproof. Sometimes the best answer visually lies between what the meter method says and what the blue filter method says. Again, subjectivity comes into play. There are no perfect displays. Instruments do lie when the controls in the TV do not work as they should. The challenge for the calibrator is to see this and understand when it happens.
Overscan/Keystone. Forgetting about this part leaves a person with a 720p image or worse. See the article about getting 1080p for more information. The trade-off here is that full-resolution images come at a price: knowledge must be passed on to the end user about full-screen images that have no overscan. If for some reason the image is left with overscan, then all the insistence that everything else be done correctly to the letter pretty much goes out the window, because too many artifacts are added to the image when this is not addressed properly. This can only be handled with proper education for the client, or else the perfect image could result in an unhappy client.
Gamma. Here is one setting on a display with a large aura of subjectivity to it. Until late in 2011, there was no one right answer here. There was no industry consensus on what the gamma should be in a calibration; experts couldn’t even agree on it. All we knew was that a range of 2.2 to 2.6 seemed to cover the opinions of all the industry experts. The industry ultimately settled on 2.4 going forward (THX has adopted 2.4), right in the middle of this range. A 2.2 looks different from a 2.4, which is different from a 2.6. In practice the answer still comes down to calibrator preference and bias, as most sets today can’t even hit that number. There is no one right answer here in the short term.
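To make concrete why 2.2, 2.4 and 2.6 look visibly different, here is a minimal sketch. It is purely illustrative (the function name and the 50% test stimulus are my own choices, not from any calibration standard): display light output is approximately the input signal raised to the gamma exponent, so the same mid-gray comes out darker as gamma rises.

```python
# Display light output is approximately signal ** gamma, with the
# signal normalized to the 0.0-1.0 range. The same 50% gray therefore
# produces less light as the gamma exponent rises, which is why 2.2,
# 2.4 and 2.6 calibrations look visibly different in the midtones.
def relative_luminance(signal: float, gamma: float) -> float:
    """Normalized light output for a normalized input signal."""
    return signal ** gamma

for g in (2.2, 2.4, 2.6):
    print(f"gamma {g}: 50% signal -> {relative_luminance(0.5, g):.1%} of peak")
```

By these numbers, a 2.2 display puts out roughly a third more light at mid-gray than a 2.6 display, which is easily visible side by side.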
Grayscale. Since very few televisions behave in a linear fashion, doing grayscale always results in compromises and judgement calls somewhere, and what counts as good enough has a large subjective component. Perfection is simply not possible here, because no display is perfect and no measuring device is perfect either. The calibration classes give calibrators guidelines on when something is good enough and when it is time to step back. Add to this, some measuring instruments are not up to the task, and yet the militant calibrator insists on doing everything to the letter while using an error-prone device to finish the process. There may not be perfection, but some devices are known to be more accurate than others, and that is no secret. He chooses the compromises, but who is to say that the errors of the instrument don’t undermine something else he did? Complain that the sharpness is set wrong, but then knowingly use an instrument that could give very erroneous results and proclaim that the display is properly set up?
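The kind of judgement call involved can be sketched in a few lines. Everything here is hypothetical: the meter readings are made up, and the 5% tolerance is an assumed threshold, not a figure from any calibration class or standard. The idea is simply that each gray step’s measured red/green/blue balance is compared against the reference-white target, and steps outside the tolerance get flagged as the places where a compromise has to be chosen.

```python
# A minimal sketch of a grayscale-tracking check. The readings and the
# 5% tolerance are hypothetical, not from any calibration standard.
TOLERANCE = 0.05  # assumed threshold for "good enough" RGB balance

# gray step (% stimulus) -> (R, G, B) balance relative to reference
# white, where 1.0 on every channel would be perfect tracking
readings = {
    20: (1.02, 1.00, 0.97),
    50: (1.08, 0.99, 0.94),  # a red push at mid-gray
    80: (1.01, 1.00, 1.02),
}

def flag_steps(readings, tol=TOLERANCE):
    """Return the gray steps whose R/G/B balance strays beyond tol."""
    return [step for step, rgb in sorted(readings.items())
            if any(abs(channel - 1.0) > tol for channel in rgb)]

print(flag_steps(readings))  # flags only the 50% step
```

Where exactly that tolerance sits, and which flagged step to sacrifice when the controls cannot fix them all, is precisely the subjective part the article is describing.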
Color Management Systems. More fun with instruments here. Using the wrong instrument will potentially give a bunch of wrong results, which might still look pretty on a graph. No CMS is perfect, so calibrators will once more have to make judgement calls and compromises in the process. Often the TV simply does not offer the proper tools to do the work, or the native colors of the display cannot be made accurate short of redesigning the TV. Fortunately, by this stage, compromises do not account for much in the grand scheme of things. Our eyes are not sensitive to absolute color; there are errors a meter can pick up that are simply beyond human perception. And what happens when the CMS controls do not work at all and it comes down to simply the judgement call of the calibrator?
If the goal is only to get the display set the best it can be, we will not be using test discs from DVD players. Only HD signal generators outputting known reference patterns are good enough for this job. Set the display up with the generator, and no colorimeters are allowed here, since their inherent flaws and uncertainties are too great. Only spectroradiometers are permitted in this club, though profiled colorimeters are okay when accompanied by spectros.
The end user does not watch patterns from your signal generator. As pointed out in both the THX and ISF calibration classes, it is not a proper calibration if you do not account for the devices plugged into the system. Nothing is achieved if people cannot actually enjoy the calibration because their equipment does not measure up to the HD signal generator. I suppose the calibrator could tell them to buy something else, but we can bet that a calibrator with that attitude won’t be in business long. We acknowledge that the world is imperfect, so calibrations have to account for the weaknesses of the equipment in the system. Recommendations for new equipment can definitely be made, but leaving without optimizing the system for their equipment is precisely like giving people the best cake in the world that they are not allowed to taste. In time the cake will spoil, and they will bring you back to make another cake that they cannot taste. Uh … what world do you live in?
Display calibration is a science, but it has been described as both a science and an art because, as long as TVs are not perfect, many subjective and objective judgement calls have to be made in the process. No one person is above all others. One person’s judgement call might be considered very wrong by another, but turn the tables and no one is without fault here. People in glass houses should not be casting stones; if we scrutinize even the calibration militants, I am certain we will find something they do that is actually against their own doctrine. And they will no doubt have a good explanation for why that does not count.
Is the militant calibration game only run by those who make up their own rules along the way? Is this Prohibition again, where the holier-than-thou people are actually taking a drink behind the curtain? There is a time and place for sticking to doctrine, but that is when one does work for production studios. In a person’s home theater, the rules change because the environment for the calibration changes. And 95% of all calibration jobs will be for the enthusiast client rather than the studios.
They need a wake-up call. Almost the entire calibration process has judgement calls and subjectivity built into it, simply because our display devices are not perfect and the people who view them are not perfect. And that is only the start of it.
We calibrate for people to watch these displays. Displays are not calibrated simply as an academic exercise without regard for the viewer.
airscapes(February 4, 2012 - 10:09 am)
I tip my hat to you sir! Thanks so much for putting things in rational perspective that make sense and are actually obtainable.
tbaudoin(January 21, 2014 - 6:32 am)
Great Article! I keep reading them over and over and find new pearls each time…
Ludo(November 12, 2014 - 11:00 pm)
I have a few questions related to the bias lighting you mentioned in the article. If I understand it right, it is meant to prevent the pupils from varying too much in size between dark and bright scenes, thus avoiding eye muscle fatigue?
But as you said, those were originally designed for operators, who need to spend many hours in front of screens and also need to see their controls.
Now, the trouble I have is with us needing bias lighting when watching TV but not when watching front projection: is it because we spend more time in front of the former, while a projected movie generally won’t last more than two hours? (Though I guess it could just be that sufficient light output is already hard enough to get on bigger projection screens.)
If so, would it make sense to have a setting for longer TV watching sessions (with bias lighting and brighter calibration, say 30-35ft-L, so to compensate for the bias lighting ending up reflecting on the picture, which would lower the perceived contrast) and another setting for watching movies (without bias lighting, for more immersion, isolating oneself more from the compromised room, and with dimmer calibration)?
And seeing as the living rooms TVs generally dwell in are much brighter than properly dedicated theaters, should that latter setting be calibrated quite a bit dimmer than the 35 ft-L for TVs (with bias lighting), but still brighter than the 16 ft-L for front projection?
I get that you talk about flexibility, but understanding why and where there’s such a thing helps me in making a difference between flexible and broken 😉
Michael Chen(November 13, 2014 - 8:35 am)
Bias lights in home theater settings serve three main functions.
1. reduce eye fatigue
2. improve perceived contrast ratio of tv
3. reduce the tendency of environment colors to skew color perception.
Please read the article on setting contrast since it discusses the significance of the 30-35 f-L numbers or their lack of significance.
Flat panels tend to be much brighter than an FPTV setup. Projection often fills a greater field of view, and the more the image fills the field of view, the less light output is needed; people tend not to notice. FPTV also tends to be in rooms with better light control.
Properly placed bias lights don’t really reflect onto the screen for flat panels. They improve perceived black levels.
Ludo(November 13, 2014 - 11:08 am)
Thanks for the tips, Michael: I actually did read the article about contrast, but wondered what else was at play.
Indeed, in that article, you mentioned 30-35 ft-L for control rooms where the horizontal field of view is 40°, which is pretty much what is generally recommended for front projection (ca. 40-45°, if I remember right) – though for the latter, 16 ft-L is the target.
About the same perceived image size and dark room surfaces (at least as far as professional video control rooms are concerned ; minus the bias lighting), but significantly different luminance recommendations (even accounting for the +/-30% for the 35ft-L you introduced, which would only make it go as low as 25ft-L).
Now, I get that living rooms are generally more compromised, and that perception of comfort varies from person to person, but those recommendations also seem to hold valid in much less compromised environments.
Thing is, intuitively, I tend to watch TV programs with what I’d rather call ambient lights (no proper bias lighting yet : too yellow, and to the sides of the TV, up to now), so to avoid eyes fatigue.
But when watching a movie, I prefer to watch it with the lights off, so as to focus only on the picture and get more immersed (though not quite neutrally colored, the wall behind my TV is quite dark, and in such conditions I can’t even tell which color it is). It’s true that blacks could be darker, but I’m only starting to think about calibration (with a newer TV set in mind), and it’s true such conditions are much more tiring than even ambient lighting (though I generally watch under such conditions for shorter durations).
And I was wondering whether I should aim for lower luminance in the second case. Now, maybe I’m lying to myself, and I would still benefit from proper bias lighting even when watching films, instead of trying to ape what it’s like in proper theaters with a still smaller picture (about 30° horizontal field of view) and a much more compromised setting. Guess I’ll have to try things like Cinemaquest myself and see.
Michael Chen(November 13, 2014 - 11:59 am)
Even in THX’s own documentation about field of view, which is 36 degrees in the theater, they mention that some prefer a more immersive experience and can sit closer, and some want to sit further away. There is no one-distance-fits-all solution, nor is there a one-ft-L-fits-all solution. This stuff is not written in stone tablets like the Ten Commandments. So many enthusiasts make this mistake and think there is only one number. Context matters …
Had the complaints from consumers kept coming into Panasonic, that THX number could easily be 40 or 45 …
This is why we don’t really care about ft-L numbers when we calibrate professionally. It doesn’t matter all that much. You end up where you end up … and make sure that clients understand that eye fatigue is a moving target.
The theater spec of 16 is there only if you want to maintain some level of certification … since how often will you change out your bulb once it gets to 12 ft-L? Bulbs can get pricey if you swap them out that often. My own projector does 8 ft-L with a new bulb. That’s all there is. But in my light-controlled HT, it is just fine until things get down to about 4 ft-L. You will get about 700-800 hours before the drop reaches that level.
Ludo(November 13, 2014 - 12:34 pm)
It’s not so much the numbers themselves that made me wonder as why they may vary in different situations.
I’m not trying to become a calibration nazi: I just try to understand when something is important, and when it is less so, and I think I now get it a little better, as far as luminance go, thanks to you 😉