Last summer I wrote a multi-part series here that looked at how much color gamut displays really need. In those articles I used the gamut of colors found in the natural world, as defined by Pointer, as a possible design goal for an ideal color display. Kid Jansen at TFT Central has followed up on my piece with a much more detailed look at how several current color gamut standards and devices perform compared to Pointer's gamut. He's done some great analysis and it's well worth reading; check it out here.
In the previous post in this series, I made the case for displays with hybrid, custom color gamuts as a great way to deliver coverage of Pointer's gamut as well as the most important broadcast standards. We can build the hardware today to support these large color gamuts, so it seems like a great solution, but there is a catch: nobody is broadcasting or distributing these large color gamuts today. So, are we going to have to wait for broadcasters and content creators to slowly catch up, much like we did with HDTV?
What content delivery looks like today
Today, content creators are actually shooting in a wide variety of color spaces ranging from RAW to rec.709 to Adobe 1998. They are then forced to cram all of these different sources into the lowest common denominator rec.709 standard for broadcast or distribution. That same content is then displayed on devices with a range of different gamut capabilities, from tablets that only cover about 70% of rec.709, to HDTVs that do meet the spec, to OLED devices that oversaturate the content.
There’s a lot of diversity on both the capture and display sides and a clear bottleneck in the middle in the form of broadcast and distribution channels.
Adhering to broadcast standards is no longer sufficient to guarantee a good experience for consumers because there's already too much diversity on the display side alone to rely on one standard. You just can't be sure that consumers are actually looking at your content on a rec.709-capable device. We're also losing a lot of the value that creators are capturing and that could, in many cases, be delivered to end viewers who have the devices to show it.
How do we get around broadcast standards?
What content delivery looks like tomorrow
The first thing to note is that the internet is democratizing broadcast and distribution channels. With the web we can deliver whatever we want, whenever we want. Some players in the industry, notably Sony, are already doing this with 4K content. If there’s no content available and you believe in 4K resolution, you just deliver your own content directly to your customers.
Still, this leaves us with some potential experience problems. If incoming content is not matched to the right display gamut, the results will be no better than today, and that's why color management is key. There are several companies working on color management solutions and certification programs for devices that will make it possible for wide color gamut displays to handle a variety of incoming gamuts. Using metadata, for example, a wide color gamut display can be alerted to the presence of Adobe RGB content and then remap that content on the fly to ensure that it is displayed accurately on that specific panel.
With great color management, we can maximize the gamut on the display side and pull through the best possible gamut for the device we are looking at. In this way, we can deliver consistently accurate content that meets the designer's intent, whether artistic or commercial.
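To make the remapping step concrete, here is a minimal sketch, not any vendor's actual pipeline, of what a metadata-aware conversion might look like: linearize content tagged as Adobe RGB (1998), convert through CIE XYZ using the published D65 matrices, then encode for an sRGB-class panel, clipping anything that falls outside the smaller gamut.

```python
import numpy as np

# Minimal sketch of metadata-driven gamut remapping (illustrative only):
# Adobe RGB (1998) -> CIE XYZ -> sRGB, using the standard D65 matrices.

ADOBE_RGB_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2974, 0.6273, 0.0753],
                             [0.0270, 0.0707, 0.9911]])

XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def adobe_rgb_to_srgb(rgb):
    """Map gamma-encoded Adobe RGB values in [0, 1] to gamma-encoded sRGB."""
    linear = np.power(np.asarray(rgb, dtype=float), 563 / 256)  # Adobe gamma ~2.2
    xyz = ADOBE_RGB_TO_XYZ @ linear
    srgb_linear = np.clip(XYZ_TO_SRGB @ xyz, 0.0, 1.0)  # hard-clip out-of-gamut
    # Standard sRGB transfer function
    return np.where(srgb_linear <= 0.0031308,
                    12.92 * srgb_linear,
                    1.055 * np.power(srgb_linear, 1 / 2.4) - 0.055)
```

Because both spaces share a D65 white point, neutral colors survive the trip unchanged, while a fully saturated Adobe RGB green gets clipped at the sRGB gamut boundary. A production color management module would use the panel's measured primaries rather than ideal sRGB, and would apply gentler gamut mapping than a hard clip.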
Google announced an updated version of their Nexus 7 tablet this morning. Central to Google's pitch was the improved display with both more pixels and more color. The device does feature an impressively high resolution, packing 2.3 million pixels into a 7″ form factor. But I'm more interested in the color performance and, on this point, Google was vague, offering only that the display "has a 30% wider range of colors."
What do they mean by that?
It depends on their frame of reference: what color space they are using and what color gamut standard they are comparing against. Since Google talked about the accuracy of HD video at their event, let's assume that they are referring to the HDTV broadcast standard (rec.709) and using the common CIE 1976 (u'v') color space.
When I measured last year’s Nexus 7, I found it could only reproduce about 82%* of the colors found in the rec.709 standard. Color reproduction was not accurate and a little bit undersaturated on this device:
With just a simple calculation, increasing 82% by 30%, you’d get about 106% coverage of the HDTV broadcast standard. While that’s actually a slightly wider color gamut than the standard, it is not uncommon for device makers to use a wider color gamut in order to guarantee the color spec across all devices with some room for manufacturing tolerances. This means video and web content should be displayed accurately and it could make for a great looking display.
We’ll order and measure one as soon as they are available to verify so stay tuned…
* note: I always measure coverage of broadcast standards, not simply total area since that can be misleading. However, in this case, coverage and area are nearly the same since the Nexus 7’s gamut is smaller than rec.709.
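For readers who want to see the difference between the two numbers, here is a small sketch, using a made-up device gamut, that computes both coverage of rec.709 (intersection area over standard area) and the raw area ratio, via standard polygon clipping:

```python
# Coverage vs. area: a sketch with a hypothetical device gamut.
# Coverage = area(device ∩ rec.709) / area(rec.709); "area" alone can
# exceed coverage when part of the device gamut lies outside the standard.

REC709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # CIE 1931 xy, CCW
D65 = (0.3127, 0.3290)

def shoelace(poly):
    """Polygon area via the shoelace formula."""
    s = 0.0
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def clip(subject, clipper):
    """Sutherland-Hodgman clipping against a convex, CCW-wound clipper."""
    def inside(p, a, b):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0
    def intersect(p1, p2, a, b):
        (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, a, b
        den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / den
        py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / den
        return (px, py)
    out = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        if not out:
            break
        src, out = out, []
        s = src[-1]
        for e in src:
            if inside(e, a, b):
                if not inside(s, a, b):
                    out.append(intersect(s, e, a, b))
                out.append(e)
            elif inside(s, a, b):
                out.append(intersect(s, e, a, b))
            s = e
    return out

# Hypothetical undersaturated tablet: rec.709 shrunk 80% toward D65.
device = [(D65[0] + 0.8 * (x - D65[0]), D65[1] + 0.8 * (y - D65[1]))
          for x, y in REC709]

coverage = shoelace(clip(device, REC709)) / shoelace(REC709)
area_ratio = shoelace(device) / shoelace(REC709)
# Here the device gamut sits entirely inside rec.709, so the two agree;
# a gamut that bulges outside the standard would show area_ratio > coverage.
```

This also shows why quoting total area can be misleading: an oversaturated panel can post an area figure above 100% while still failing to cover the standard.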
Last week I set out to define the ultimate consumer display experience in terms of color performance. I laid out some potential color performance design goals for an ideal display, suggesting that such a display should be both accurate and capable of creating an exciting, immersive experience that jumps off the shelf at retail.
Can we achieve both goals? To find out, let’s start by looking at how we perceive color.
The color of objects that our eyes see in nature is determined by three components: the physical, the physiological and the psychological:
The physical component of our color perception is a constant based on the laws of nature. It is a combination of the quality of the illumination or light source, meaning the spectrum it contains, and the reflectance of the object. In the image above, the ball appears red to the eye because it is reflecting red light while absorbing most of the other colors from the light source.
The physiological part of our vision is also a relative constant that is based on the electrochemical processes of the eye. The back of the retina contains photoreceptor nerve cells which transform incoming light into electrical impulses. These electrical impulses are sent through the optic nerve on to the brain, which processes them and creates the image we see. And that's where the psychological component comes in.
Let’s look at how each of these components might affect display color performance, starting with the physical, which ought to be something we can measure.
Fortunately, a guy named Pointer has done this for us. For his 1980 publication, Pointer measured over 4,000 samples and was able to define a color gamut of real surface colors, the colors of objects found in nature. The result is commonly called "Pointer's Gamut:"
This already seems like a great place to start. It immediately looks like a great fit for our first ultimate color experience criterion, accuracy. If we could accurately capture and reproduce all of the colors found in the natural world, it would make for a much improved, more accurate ecommerce experience, for example.
But how important are those extra colors? Looking at Pointer’s gamut mapped against the color gamut of the latest iPhone in the chart above, you have to wonder if we really come across these deep cyans and reds in everyday life. Are they just infrequent, rare colors or something worth pursuing for our display?
Turns out we do. As an example, Pantone's color of the year for 2013 was Emerald, a deep green that falls outside of both the iPhone's gamut and the HDTV broadcast standard. This is an important and popular color that appears a bit too yellowish on your computer monitor when you are shopping for the perfect tie on Amazon. So there are some really important colors outside of what the iPhone can display today.
But what about our second criterion, the lifelike, exciting, immersive experience we want to give consumers? Is the gamut of the natural world enough?
If we look at the second component of the visual system, the physiological component, we'll see that we can actually perceive a much wider range of colors. The cells in the back of the retina can detect the entire range of the CIE diagram. That's almost double the range of colors that Pointer found in nature:
This is starting to sound like a much more immersive experience. Maybe we ought to pursue the full color capability of the human eye just like the industry has done for high, “retina” resolutions.
It sounds great but it would be a tall order. It would take quite a lot of power, brightness and extra bit depth to even begin to think about covering a color space this large. There certainly would be a high price to pay in terms of design tradeoffs to get there. So are there any truly valuable colors contained in that extra space, similar to the Pantone color in Pointer’s gamut, that would make us want to go for it?
This is where the psychological component comes into play.
Seeing is not passive. Our brains add meaning to the light that our eyes detect based on context, experience and memory. We are continuously and actively re-visualizing the signals that come from our retinas.
This may seem hard to believe but this fun demo created by neuroscientist Beau Lotto does a great job of showing just how much our brains actively interpret and change what we see.
The color of the chips has not changed in the video above, just our perception of the color. What’s happening here is our experience is telling us that the color chip in shadow must actually be a much brighter color than the chip under direct illumination, so our brain is just making the correction for us on the fly.
Artists absolutely play on this psychological element of our perception of color, sometimes using totally unrealistic or hyperreal colors to make us feel or experience something new or to help tell a story. In fact, one of the most influential art instructors of the 20th century, Josef Albers, once said that "the purpose of art is not to represent nature but instead to re-present it."
Whether it's Monet using saturated, contrasting colors of equal luminance to trick our brains into seeing poppy flowers sway in an imaginary breeze in a 19th-century painting, or modern films relying on the wider gamut capabilities of color film and digital cinema projection, creators use color to build uniquely cinematic experiences for audiences.
Take "The Ring," for example, which used a deep cyan cast throughout much of the film to create tension and help tell a scary story. Or Michael Bay's "Transformers" movies, which use deeply saturated oranges, reds and teal greens to create an exciting, eye-popping palette appropriate for a summer blockbuster sci-fi movie about giant robots:
There's certainly a place for wild, unexpected colors in art. But, as we go through some of these examples, I think we'll actually find that there is a huge range of expression possible within the gamut of surface colors that Pointer measured. The full gamut detectable by the human eye, while exciting to think about, is not really necessary to deliver both accurate and pleasing, engaging color to our visual system.
So where does that leave us?
In my next post I’ll look at existing wide color gamut standards and content delivery mechanisms to see both what we can do today and what’s next for wide color gamut displays.
Finally getting around to posting a follow-up to a follow-up to John The Math Guy’s recent series on color gamut size, colorblindness and tablet displays. I thought I might be able to at least shed a little more light on his question about the differences in color accuracy between some of these devices.
In his testing, John found no statistically significant difference in scores among different people taking the EnChroma colorblindness test on different devices. I found this somewhat surprising since, in my experience, even tablets with similar color gamuts tend to show colors with very different levels of accuracy.
To show what I mean by that, I measured how two different tablets show the colors found in the Gretag Macbeth color checker chart.
As you can see, the iPad mini and Nexus 7 each produce very different colors, even for those colors that are actually inside their gamuts.
For example, even though the iPad mini has enough gamut coverage to accurately display the Gretag chart's deepest blue, it cannot do so without distorting the image in another way. This is a consequence of the underlying image standard: most content today is encoded in sRGB. If the iPad were to show that Gretag blue exactly right, it would not have enough saturation headroom left over to show you a different color if a deeper blue, say one right at the bottom of the sRGB triangle, were called for.
A good real world example of this can be found in the picture below of my bloodhound, Louisa, racing down the beach at Carmel, CA. The middle of the sky in this image is right on the edge of the iPad’s color gamut, very similar to the Gretag blue in the charts above, while the deepest blues found in the ocean fall outside the iPad’s gamut.
If the iPad were striving for accuracy at all costs, it might map both colors right on top of each other at the edge of the gamut. There’d be no visible difference between the two in this case and the quality of the image would suffer but at least the sky would be accurate. In order to avoid this scenario, the designers of these devices have decided to compromise on accuracy so they can show a full range of color differences to the user.
They do this by remapping colors inward, away from the edges of the gamut, effectively compressing the gamut even further so that otherwise out-of-gamut colors can be seen. This is a good solution given the gamut limitations of the device since it results in more pleasing, if less accurate images.
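A toy sketch of that inward remapping, with made-up numbers on a single saturation axis (real devices work in three dimensions with far more sophisticated curves): values below a knee point pass through untouched, and everything above it, including out-of-gamut values, is compressed into the remaining headroom.

```python
# Illustrative soft-knee saturation compression (hypothetical numbers):
# instead of hard-clipping at the gamut edge, squeeze the top of the range
# so distinct out-of-gamut colors stay distinct.

def compress_saturation(s, knee=0.8, src_max=1.2):
    """Map saturation in [0, src_max] into [0, 1].

    Values below `knee` are untouched (accurate); the range [knee, src_max],
    including out-of-gamut values above 1.0, is compressed into [knee, 1.0].
    """
    if s <= knee:
        return s
    return knee + (s - knee) * (1.0 - knee) / (src_max - knee)

# Two blues that a hard clip would collapse onto the same point
# remain distinguishable:
sky = compress_saturation(1.0)    # on the gamut edge -> 0.9
ocean = compress_saturation(1.2)  # out of gamut -> 1.0
```

The trade-off is visible in the numbers: the sky blue is rendered slightly undersaturated (0.9 instead of 1.0) so that the deeper ocean blue still reads as a different color.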
As newer devices trend towards wider color gamuts this kind of compromise should become a thing of the past. In fact, tablet designers may be working on the reverse issue: how to avoid oversaturating images that were encoded for smaller gamuts.
Great, how does this relate to colorblindness again?
Taking another look at the Gretag results from the two devices plotted on top of each other, there clearly are major differences. But, in the reds and greens, two colors associated with a common form of color blindness, the devices are relatively close. So, the simple answer may just be that colorblindness tests do not require pinpoint accuracy to be effective, at least as basic screening tools.
App developer FiftyThree recently updated one of my favorite creativity apps for iOS, Paper, with an impressive new color-related feature. If you are not familiar with Paper, it’s a sketchbook app capable of making the work of even non-artists like me look gallery worthy with an intuitive and responsive interface.
The new feature, which FiftyThree calls "the biggest leap forward in color controls in the past 40 years," is a color mixer that allows you to create a wide array of colors within the app just as you would in real life. They say they put a lot of time and effort into making the new mixer feel natural, and it works just like finger painting as a kid: mixing yellow and blue in the Paper mixer produces green.
This is a great feature that expands the content creation capabilities of an already exceptional app. But, as great as this app is, it’s still limited by the color capability of the device it’s installed on. Even the latest iPad, which can produce 100% of the sRGB color gamut, still only shows about 1/3 of the visible color spectrum.
The experience you will have mixing and creating colors on today’s tablets just will not be nearly as dynamic or visceral as making a physical painting. Not until better, wide color gamut technology is adopted in displays will the digital color experience match the stunning world of color we live in.
This is a great, exhaustive tutorial on managing color gamut for photographers by color expert Andrew Rodney. He does a great job making the case for working in wide gamut color spaces like Pro Photo, especially when capturing in RAW. Using smaller gamuts like sRGB throws away useful color data that printers and more and more displays can recreate.
Commenter William thankfully double checked our math and we’ve corrected a small error in our % NTSC calculation.
We finally got our hands on an iPhone 5 yesterday. I tried asking Siri if she really has 44% more color saturation but she wouldn’t give up the goods, so I went with plan B and aimed our PR-655 spectroradiometer at the phone to find out just how impressive the screen really is. A lot has already been written about this display, but not much empirical evidence has been published about the color performance. How does the screen actually stack up to the marketing claims?
In short, Apple did an exceptional job improving color saturation and display quality in general, but the unit we measured just missed the 44% more color saturation claim.
The 44% more color claim for the iPhone 5 is the same claim Apple made for the new iPad. As with the iPad, increasing the color performance of the iPhone 4S by 44% of NTSC 1953 gamut, measured using the CIE 1931 color space, would result in color saturation matching the sRGB color standard. Using these standards as the goal posts, we measured the iPhone 5 at 70% of NTSC 1953 in CIE 1931, a 39% increase from the iPhone 4S, which measured at 50%. That's 5 percentage points short of Apple's 44% claim and just 99% of sRGB (measured against the sRGB primaries).
While 5 points short might seem like a big deal, getting to 99% of sRGB is a major feat and will result in a tremendously noticeable color improvement on the phone. Additionally, color filters are notoriously difficult to manufacture. Slight variances in performance like this are common and most likely outside the range of a just noticeable difference for the average person.
If you want to know more about NTSC, CIE and sRGB, and why we are using standards from the 1930s, I have written extensively about this issue in the past.
How did they do it?
Much like they did with the new iPad, Apple significantly improved the color filter performance† of the iPhone 5. Based on our experience, this type of improvement typically means that the display requires 20-30% more power to operate at the same brightness. Considering that the display is already a major source of battery drain on the phone, this further underscores the engineering effort Apple made to keep battery life about the same as on the 4S.
Let’s take a quick look at the changes in each of the red, green and blue color filters, starting with white, which is all three filters turned on:
Looking at the white spectrum of the iPhone 5, we see that the new color filters are very similar to those of the new iPad. Compared to the 4S, the peaks are slightly narrower, which improves color purity. In order to meet sRGB, they also moved to deeper reds and blues.
As with the new iPad, the biggest difference between the 4S and the 5 is in blue. Apple moved the peak to a deeper blue but, more importantly, they narrowed the filter so less green light leaks through. The green leakage causes blue to look a bit “aqua” on the 4S.
Retinal neuroscientist Bryan Jones looked at both displays under his stereo microscope earlier this week. His close-up shots really show off the difference in blue filters.
Apple again chose a slightly deeper wavelength of green which is less yellow and eliminated some of the blue leakage that had been muddying the green on the 4S.
The change here is subtle but, as with the other filters, the peak is narrower and deeper in the red, and leakage is reduced. One difference worth noting: while we are seeing less peak leakage in the red filter, the 4S also had relatively broadband leakage across yellow and green and into blue that has now been largely eliminated.
In all, it’s an exceptionally well-calibrated and accurate display for any kind of device, especially a smartphone. Apple has gone to great lengths to design a screen that brings the vibrancy of sRGB to the palm of your hand.
† If you are not familiar with color filters or the inner-workings of LCDs in general this great live teardown by Bill Hammack is well worth watching: http://youtu.be/jiejNAUwcQ8
Display improvements were once again featured at yesterday’s Apple keynote event. The most obvious improvements may have been the larger display and thinner form factor but most interesting to dot-color are the color claims.
Just like the new iPad, Apple claims that the iPhone 5 can display “44% more color saturation.”
Let’s do some simple math to see how the iPhone 5 stacks up against older iPhones and last week’s color performance claim from Motorola.
- iPhone 4S IPS LCD: 50% NTSC color gamut (CIE 1931†)
- iPhone 5 IPS LCD: 50% * 144% = 72% NTSC color gamut (CIE 1931)
- Motorola Droid Razr Maxx HD AMOLED: iPhone 4S (50%) * 185% = 92.5% NTSC (CIE 1931‡)
So Motorola is still king of fall 2012 smartphone color saturation, based solely on marketing claims. That said, I wouldn't be surprised if they updated their marketing to say that the Droid Razr Maxx HD offers 28% more color saturation than the iPhone 5 once it hits store shelves in a couple of weeks. I plan to measure all of the announced devices to verify these marketing claims, but for now, this is all we have to go on.
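The marketing arithmetic above is easy to check in a few lines (these are claims, not measurements):

```python
# Checking the marketing math, all figures % NTSC 1953 in CIE 1931.
iphone_4s = 0.50                             # measured baseline
iphone_5 = iphone_4s * 1.44                  # Apple's "44% more" claim -> 0.72
droid_razr = iphone_4s * 1.85                # Motorola's claim vs. the 4S -> 0.925
razr_vs_iphone5 = droid_razr / iphone_5 - 1  # ~0.28, i.e. "28% more"
```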
Apple also claimed to be able to match the sRGB standard used in TV and movies. With the addition of the iPhone 5, nearly all of Apple’s flagship products (with the exception of the MacBook Air) now meet this standard. This means content should look very consistent across all Apple devices and may open up the possibility for serious content creation apps in iOS.
It also means we’re only just now catching up to an average CRT display from circa 1990, as the sRGB standard is based on the capabilities of phosphor materials used in CRTs. And even still, the new displays are only covering about 35% of the range of colors a human eye can see. There’s still plenty of room for improvement in display color performance (as well as updated content delivery standards, but that is a whole different post). Hopefully if we keep on this kind of pace with display enhancements, next year we’ll start to see a push beyond the limits of last century’s color standards.
† We're using the long outdated CIE 1931 color space and NTSC 1953 gamut standards here since this is clearly Apple's reference when they claim 44% more saturation and sRGB coverage. 50% * 1.44 = 72%, and a gamut covering 72% of NTSC 1953 in the CIE 1931 color space corresponds to the sRGB color gamut.
‡ It is not clear which color space Motorola is referencing; we are assuming CIE 1931/NTSC 1953 for ease of comparison.
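For the curious, the footnote's figure can be sanity-checked by comparing the areas of the two chromaticity triangles directly, using the published primaries. The raw triangle-area ratio lands just under 71%, close to the conventional ~72%; exact figures vary slightly with the primaries and rounding used.

```python
# sRGB gamut area as a fraction of NTSC 1953, both as xy triangles in
# CIE 1931, using the standard published primaries.

NTSC_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]
SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

def tri_area(p):
    """Triangle area via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

ratio = tri_area(SRGB) / tri_area(NTSC_1953)  # ~0.708
```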
Over the weekend I saw this interesting tweet about color gamut and the NFL and I had to find out if it was true:
Could it be that something as simple as an NFL jersey is not within the color gamut of modern HDTVs? I mapped the Broncos team colors onto the CIE 1976 color space along with the HDTV color gamut standard, called rec.709. As you can see, the orange is right on the edge and the blue is indeed outside the gamut.
When we think of high color content, we think of action movies and video games, but this exemplifies how color performance affects everything we see on our TVs, right down to the jersey worn by our favorite sports team. Luckily, high color displays are on their way to fix this problem. As you can see, the Broncos' colors fall nicely within the much wider DCI-P3 color gamut.
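For anyone who wants to reproduce this kind of check, here is a small sketch: convert CIE 1931 xy chromaticities to CIE 1976 u'v' and test a point against a gamut triangle. The "deep blue" sample below is a hypothetical stand-in, not a measured Broncos color.

```python
# xy -> u'v' conversion plus a point-in-triangle gamut test.

def xy_to_uv(x, y):
    """CIE 1931 xy to CIE 1976 u'v'."""
    d = -2.0 * x + 12.0 * y + 3.0
    return (4.0 * x / d, 9.0 * y / d)

def in_gamut(p, tri):
    """True if point p lies inside triangle tri (consistent cross-product signs)."""
    signs = []
    for i in range(3):
        (ax, ay), (bx, by) = tri[i], tri[(i + 1) % 3]
        signs.append((bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) >= 0)
    return all(signs) or not any(signs)

# Standard primaries, converted to u'v'
REC709_UV = [xy_to_uv(*p) for p in [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]]
DCI_P3_UV = [xy_to_uv(*p) for p in [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]]

d65 = xy_to_uv(0.3127, 0.3290)    # white point: inside both gamuts
deep_blue = xy_to_uv(0.14, 0.05)  # hypothetical deep blue, outside rec.709
```

Feeding in measured chromaticities for a team's colors would reproduce the chart's verdict: in gamut for DCI-P3, out of gamut for rec.709.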