Friday, December 10, 2010

5 Features to ignore when buying a High Definition TV


1.) Dynamic contrast ratio: A regular contrast ratio is simply the difference between the whitest white and the blackest black on your TV. A measurement of light output (luminance) is taken when a pixel is at its brightest (white) and at its darkest (black). The higher the ratio, the better; for example, 10,000:1 is better than 5,000:1. The problem is that there is no standardized way for manufacturers to measure this ratio, so you cannot accurately compare one brand's numbers to another's.
To make matters worse, manufacturers have introduced what's known as dynamic contrast ratios. Dynamic ratios are inflated because they compare the brightest white and darkest black across the course of an entire movie rather than within a single frame. Native or static contrast ratios compare the luminance within a single frame, which is far more relevant. Dynamic ratios are often artificially enhanced by the TV's onboard processor, which usually attacks the "1" side of the ratio by manipulating blacks. In some cases, pixels may be shut off completely, which crushes blacks and wipes out shadow detail. This has even given manufacturers license to claim an "infinite" contrast ratio by dividing by zero, which is of course ridiculous.
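To make the arithmetic concrete, here is a minimal sketch of how a static ratio is derived. The luminance readings are hypothetical placeholders, not measurements from any real panel:

```python
# Static contrast ratio: peak-white luminance divided by black
# luminance, both taken from the same frame (the units cancel out).
white_nits = 450.0   # hypothetical full-white reading, cd/m^2
black_nits = 0.045   # hypothetical full-black reading, cd/m^2

print(f"{white_nits / black_nits:,.0f}:1")  # -> 10,000:1

# A "dynamic" spec instead pairs the brightest white and darkest black
# from *different* frames -- and if the backlight or pixel shuts off
# entirely, black is zero and marketing gets its "infinite" ratio.
```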
The take away: Ignore dynamic contrast ratios completely. Use static or native contrast ratios only to compare models from the same manufacturer, year, and type (LCD, plasma, DLP, etc.). The only real way to know if your TV has superior contrast is to see it in person or read a review from a credible third party. As a rule of thumb, if a claimed contrast ratio is greater than 10,000:1, it is meaningless.

2.) 1080p resolution on TVs under 50": HDTVs come in only two resolutions: 720p and 1080p. The numbers correspond to lines of resolution, so 720 and 1,080 horizontal scan lines respectively. The "p" stands for progressive, meaning the entire picture frame is drawn at once, versus interlaced ("i"), where every other line is drawn in alternation. Contrary to popular belief, once you're in the HD realm, overall picture quality depends more on contrast ratio, color saturation, and color accuracy than on resolution.
Most people cannot discern the difference between 720p and 1080p unless the screen is 50 inches or larger and the viewing distance is within about 6 feet. The smaller the screen, the less important the resolution. You also have to consider the source of the signal. The only 1080p sources to date are Blu-ray, the PS3, the Xbox 360, and select on-demand video services. Broadcast television comes in 1080i, 720p, or 480i (standard definition). Unless you watch a lot of Blu-ray movies or play a ton of video games on your PS3 or Xbox 360, you're not even utilizing 1080p resolution.
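You can sanity-check that rule of thumb yourself. The sketch below is back-of-the-envelope only: it assumes a standard 16:9 screen and that 20/20 vision resolves roughly one arcminute, and the screen sizes are just examples:

```python
import math

def max_benefit_distance_ft(diagonal_in, lines):
    """Farthest distance (feet) at which one pixel still subtends
    one arcminute -- roughly the limit of 20/20 vision."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 16:9 screen height
    pixel_in = height_in / lines                     # pixel pitch
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

for size in (32, 42, 50):
    d720 = max_benefit_distance_ft(size, 720)
    d1080 = max_benefit_distance_ft(size, 1080)
    # Sit farther back than the 1080p figure and the extra pixels
    # are literally invisible to you.
    print(f'{size}": 720p limit ~{d720:.1f} ft, 1080p limit ~{d1080:.1f} ft')
```

For a 50" screen the 1080p figure works out to roughly six and a half feet, which is exactly why the benefit evaporates at normal couch distances.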
All televisions on the market today are fixed-pixel displays, meaning each pixel is discrete and the total number of pixels never changes. When you're watching ABC, the source signal is 720p. If your TV is 1080p, it must scale the picture to display it correctly. Scaling occurs whenever the source resolution doesn't match your TV's native resolution, and it can create ringing artifacts, posterization, and double-scaling errors. These are glitches in your picture! Because of this, one could argue that when the source signal is 720p, a native 720p TV is actually superior to a 1080p one. Remember, scaling cannot add sharpness, resolution, or any other quality-enhancing information. Virtually every 40-plus-inch HDTV on the market today is 1080p anyway; 720p is still available, but far more common in screen sizes 37" and under.
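A quick way to see why a 720p broadcast can't map cleanly onto a 1080p panel:

```python
src_lines, panel_lines = 720, 1080   # broadcast signal vs. native panel
print(panel_lines / src_lines)       # -> 1.5

# A non-integer factor means each source line lands on one and a half
# panel lines, so the scaler must interpolate (invent) the in-between
# pixels -- that guesswork is where ringing and posterization creep in.
```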
The take away: There are only two resolution choices, 720p and 1080p, and unless you're buying a 50" screen or larger you won't see any added benefit from a 1080p set. You will be hard-pressed to find a large LCD or plasma TV that isn't 1080p anyway (a search on BestBuy.com yielded me 0 results). The only sources that utilize 1080p are Blu-ray, the Xbox 360, the PS3, and select video-on-demand services. Contrast ratio, color saturation, and color accuracy matter more to overall picture quality than the difference between 720p and 1080p.
3.) Refresh rates above 120Hz on LCD HDTVs: The refresh rate is the number of times per second your TV redraws the picture. Most displays operate at 60Hz, or 60 times per second. This includes computer monitors, tube TVs, and even most flat-screen high definition televisions. Until recently, 60Hz was pretty much the standard in the U.S. Video is shot at 30 frames per second, so assuming it was shot progressive, each frame is displayed twice (interlaced video instead delivers 60 fields per second, two per frame, but don't let that confuse you).
Time and time again, independent testing has shown that 60Hz is plenty fast to create the illusion of lifelike motion and eliminate flicker. Flicker occurs when the refresh rate is too low and each image fades slightly before the next one is drawn (it looks like someone is toggling the brightness up and down at superhuman speed).
So why fix what isn’t broken?
Advocates of higher refresh rates argue that on-screen motion, whether from live action or camera movement, becomes smoother when the refresh rate is increased, thus reducing motion blur. This argument really only applies to LCD displays; plasma and DLP TVs don't have as big a problem with motion blur. The difference comes down to how LCD pixels behave. On an LCD, liquid crystals must physically rotate to let light pass through, and that process takes considerably longer than what happens in a plasma. Plasmas contain a grid of cells charged by electrical voltage, and the charged cells respond by emitting light almost instantaneously.
When a pixel on an LCD display receives information to change, it stays lit until it is presented with new information (have you ever noticed a single 'stuck' defective pixel on your monitor?). This pixel behavior is referred to as sample-and-hold, and it is relatively slow compared to other display technologies. If the on-screen movement is fast enough (as it typically is in sports or action movies), motion blur will occur. By refreshing time-critical information more often, say a football player's arm throwing a pass, motion blur should be reduced. But motion blur is an incredibly complex phenomenon with many contributing factors, including refresh rate, pixel response time, compression, scaling, 3:2 pulldown, display type, and how the brain perceives movement. Because of this, motion blur can plague any display type, including plasma.
LCD TVs with refresh rates of 120Hz, 240Hz, or 480Hz do reduce some motion blur, but the difference beyond 120Hz isn't noticeable, and you'll typically pay more for the higher number. In addition, many LCD displays use interpolation, meaning they insert extra images by 'guessing' where an object should be between frames to make the transition appear more fluid. This comes at the cost of unwanted background artifacts, and some people notice a loss of depth and a 'plastic-like' appearance. Luckily, TVs with high refresh rates can be switched back to 60Hz with the click of a button; I have yet to see a TV that doesn't include this option.
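A crude way to see the diminishing returns: in a simplified sample-and-hold model, the smear your eye perceives is roughly the object's on-screen speed multiplied by how long each frame is held. This ignores pixel response time, interpolation, and every other factor listed above, so treat the numbers as illustrative only:

```python
def hold_smear_px(speed_px_per_s, refresh_hz):
    """Approximate smear width: the eye tracks the moving object while
    each frame sits still for one full refresh period."""
    return speed_px_per_s / refresh_hz

# A hypothetical object crossing the screen at 1,200 pixels per second:
for hz in (60, 120, 240, 480):
    print(f"{hz}Hz: ~{hold_smear_px(1200, hz):.1f} px of smear")
# 60 -> 20.0, 120 -> 10.0, 240 -> 5.0, 480 -> 2.5
```

Halving 20 pixels of smear to 10 is visible; shaving 5 down to 2.5 is not, which is why the premium for 240Hz and 480Hz rarely pays off.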
So why not just get a 60Hz set instead of a 120Hz one?
There is a lesser-known side benefit to 120Hz, and it has to do with film. Unlike video, film is shot at 24 frames per second. Try to divide 24 into 60Hz and you get a mess: each frame would have to be displayed two and a half times. To solve this problem, a method of converting film to video known as 3:2 pulldown was developed. But 3:2 pulldown decreases image quality and introduces judder, a subtle shaky or jerky effect most noticeable in slow camera pans. This is where 120Hz comes in.
If you're a math whiz, you may have already figured out that 24 goes into 120 exactly 5 times; it's a perfect fit! This means each frame is refreshed 5 times on a 120Hz TV, eliminating the need for 3:2 pulldown and its unwanted side effects. With the right equipment, you can now view movies at home exactly as you would see them in the theatre. The same holds true for 240Hz and 480Hz, which are also evenly divisible by 24 (10 and 20 refreshes per frame, respectively).
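A short sketch makes the cadence difference obvious. It simply distributes 24 film frames across one second's worth of refreshes:

```python
def cadence(refresh_hz, fps=24):
    """How many refreshes each film frame occupies in one second."""
    counts, shown = [], 0
    for i in range(1, fps + 1):
        elapsed = (i * refresh_hz) // fps   # refreshes used after frame i
        counts.append(elapsed - shown)
        shown = elapsed
    return counts

print(cadence(60)[:6])    # [2, 3, 2, 3, 2, 3] -> uneven pulldown = judder
print(cadence(120)[:6])   # [5, 5, 5, 5, 5, 5] -> every frame held equally
```

The uneven 2-3-2-3 pattern on a 60Hz set is precisely the judder described above; at 120Hz every frame gets the same screen time.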
Many Blu-ray players (and even some upconverting DVD players) are now capable of 1080p/24 output. Just to confuse and frustrate you further, not all 120Hz televisions can accept a 1080p/24 signal, and if your TV is 60Hz it definitely won't work. Do your homework first to make sure you're purchasing the right equipment if you want to watch movies at home in 1080p/24, but the extra effort is worth it: you can view movies just as you would in the theatre, without the price of a ticket or the annoying kid behind you kicking your chair.
The take away: Anything above 120Hz on an LCD television isn't worth the extra price, because the difference beyond 120Hz isn't noticeable to most people. Higher-refresh-rate HDTVs use interpolation, which smooths motion in fast action scenes and sports but comes at the cost of decreased image quality and unwanted artifacts when viewing movies. Luckily, pretty much every 120Hz-or-higher TV can be switched back to 60Hz when needed. A 1080p/24-compatible TV with a 120Hz refresh rate is a better value than a 480Hz TV that isn't 1080p/24 compatible. Look at the refresh rate if you watch a ton of sports and have to get an LCD display; otherwise, ignore it completely. I will be talking about plasma 600Hz sub-field drive in another blog post.
4.) Wide color gamuts: It is ingrained in the minds of consumers that bigger, faster, and higher are always better, and marketers know it. This is one instance where more of something is not only unnecessary but actually counterproductive to overall picture quality. Today's movie and television content is created and calibrated on three-color displays. Most people are familiar with the RGB color model: red, green, and blue. High definition televisions conform to the Rec.709 (BT.709) standard, which utilizes the RGB color model and sets forth the color space parameters. These standards are incredibly strict and help ensure accurate color reproduction.
What does this all mean in the real world? It means that whatever colors the director saw on his screen are EXACTLY what you see on yours. If your TV advertises a wide color gamut, you will get over-saturated and inaccurate color reproduction: bloody reds and sun-bright yellows, for example. Watch an episode of CSI: Miami for reference. That show intentionally over-saturates and alters colors as a stylistic device; the ocean is a blazing neon aqua, and each room in the glass crime-fighting lab is a distracting yellow, green, or blue.
It’s important to understand that virtually everything that comes out of Hollywood today is color graded in the studio. This means the colors have already been manipulated to create a particular mood, style, or simply to create the most aesthetically pleasing picture possible. Let’s say the director is shooting a scene that takes place in the forest and really wants those greens to stand out. He will have them enhanced to perfection in the studio. But oh wait, then your TV with its “superior” color gamut decides to reproduce those greens even greener! Do you get the picture yet? Or maybe you don’t because you’re reading this on a wide color gamut display.   
The exaggerated colors outside the Rec.709 color space are nearly impossible to find in nature anyway, so why would you want a gamut that creates colors that don't naturally exist?
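If you want to see what "inside the gamut" means concretely, here is a small sketch that tests whether a color's chromaticity falls inside the Rec.709 triangle. The primary coordinates are the published Rec.709 values; the two test points are just examples:

```python
# Rec.709 primaries in CIE xy chromaticity coordinates
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B

def inside_rec709(x, y):
    """True if chromaticity (x, y) falls inside the Rec.709 triangle."""
    def cross(a, b, p):
        # z-component of (b - a) x (p - a); sign says which side p is on
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    r, g, b = REC709
    s = [cross(r, g, (x, y)), cross(g, b, (x, y)), cross(b, r, (x, y))]
    return all(v >= 0 for v in s) or all(v <= 0 for v in s)

print(inside_rec709(0.3127, 0.3290))  # D65 white point -> True
print(inside_rec709(0.70, 0.30))      # a red beyond Rec.709 -> False
```

A "wide gamut" panel is one whose primaries sit outside that triangle; fed standard Rec.709 content, it stretches every color toward those extreme corners.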
The take away: If you want accurate color reproduction, stay away from TVs that advertise wide color gamuts or a fourth primary-color sub-pixel, such as the Sharp Quattron. (If you like exaggerated yellows, then the Quattron is your best bet.) Colors are already enhanced in the studio, with specific intentions and to precision. A wide color gamut will simply over-saturate and inaccurately display them.
 
5.) Energy Star qualified: Energy Star is a standard for energy-efficient consumer products that originated in the US and is now used internationally. Products that are Energy Star certified may display the Energy Star logo, and qualified TVs claim up to 40% less energy usage than standard models.
There are several problems with this. To begin with, the type, size, and settings of your TV have a far greater impact on power consumption than the Energy Star rating does. So despite the fact that your new 50" flat-screen LCD is Energy Star qualified, it is still consuming more electricity than the tube TV it replaced. CRT (tube) TVs are less efficient per inch, true, but they never made 50" tube TVs! The bigger the TV, the more power it consumes. The old rear-projection TVs are the most energy efficient per inch. And if you're comparing a plasma to an LCD and both carry the Energy Star logo, you might think either one will consume roughly the same amount of electricity. You would be wrong: plasmas consume anywhere from two to three times the energy of LCDs.
To make your life more complicated than it needs to be, all TVs ship with different out-of-the-box settings, including brightness and contrast levels, and most people never change them. If you were to go home and plug in a new 46" Sony KDL-46A750 LCD, in one year it would use $60.83 worth of electricity. Now let's say you instead purchased the 54" Panasonic TC-P54Z1 plasma. Over the course of a year, it would cost you $59.40, a savings of $1.43! But wait, didn't I just say plasmas use two to three times the energy of LCDs? They do, but only when they are calibrated equally. With all settings equal, the plasma would cost $69.55 a year and the LCD only $31.36, less than half the cost and about $38 back in your pocket.
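The underlying arithmetic is just watts, hours, and your electric rate. Here is a rough calculator; the wattages, viewing hours, and rate are all hypothetical placeholders, not measured figures for the models above:

```python
def annual_cost(on_watts, standby_watts=1.0, hours_on=5.0,
                rate_per_kwh=0.11):
    """Yearly electricity cost in dollars for a TV, given assumed
    power draw, daily viewing hours, and utility rate."""
    on_kwh = on_watts * hours_on * 365 / 1000
    standby_kwh = standby_watts * (24 - hours_on) * 365 / 1000
    return (on_kwh + standby_kwh) * rate_per_kwh

# e.g. a hypothetical 150 W LCD vs. a 300 W plasma, both calibrated:
print(round(annual_cost(150), 2))  # ~$31/yr
print(round(annual_cost(300), 2))  # ~$61/yr
```

Notice that doubling the on-power roughly doubles the bill, which mirrors the plasma-versus-LCD gap once both sets are calibrated equally.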
To frustrate you even further, the brightness control on your television doesn't truly control brightness; it controls the black level of the signal. That does have a small effect on brightness, but to make a real difference in energy usage you'll want to adjust the contrast control, which sets the amplitude of the video signal, i.e., how bright the picture actually gets. Confused yet? Good.
Even when your TV is off, it is still using electricity. Some newer TVs draw less than 1 watt in standby mode, which works out to around a dollar a year on your electric bill. Up until November of 2008, TVs could be branded with the Energy Star logo without even being turned on during testing; all that mattered was their standby power consumption. Then came the improved Energy Star 3.0, which required testing while the TV was on. The problem was that these standards were remarkably easy to meet, and virtually every TV qualified. Even worse, manufacturers could fudge the numbers by rating their TVs at a low brightness setting, and they were allowed to perform their own testing. As of 2010, we are now on the fourth generation of Energy Star.
Even if Energy Star gets its act together, an inherent problem remains: you can only improve efficiency so much before you must sacrifice something. Perhaps manufacturers will release TVs with inferior brightness, contrast, and other features in order to meet the new standards. There's simply no way a plasma TV can compete with the low energy consumption of a DLP TV, and there's no way a DLP can compete with a plasma in terms of picture quality, color accuracy, and viewing angle.
The take away: The Energy Star rating has nothing to do with the quality of your TV's picture and is an unreliable way to compare energy usage across different types of televisions. What makes the greatest difference is the size, type, and settings of your TV. If you are concerned about energy consumption, buy a small LED-backlit LCD, turn down your brightness, and steer clear of 50-plus-inch plasmas. According to CNET.com, the Sharp Aquos is the most efficient TV on the market.

 

