Talk me out of Eizo ColorEdge

...contrast on the Eizo matrix is just 1000:1, less than the modern standard, so that you can see the midtones etc., so most of it is happening at the hardware level.

A 1000:1 contrast ratio doesn't really mean much here. The metric represents the maximum luminance target possible divided by the best black point achievable. I can't speak for Eizo monitors, but I do use NEC SpectraView monitors, and the best I can ever achieve (using NEC software) is a black point of around 0.42 cd/m^2. Achieving a 1000:1 contrast ratio would then require a luminance target of ~420 cd/m^2. If you are calibrating the monitor for anything reasonably acceptable for accurate prints, the target luminance is probably going to be in the range of 90 to 120 cd/m^2, i.e. a contrast ratio of roughly 214:1 to 286:1.
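If you want to play with the numbers yourself, here is a minimal sketch of that arithmetic in plain Python (the black point and luminance targets are just the figures from this post, nothing universal):

```python
# Contrast ratio is just white luminance divided by black luminance,
# both in cd/m^2.

def contrast_ratio(white_cd_m2, black_cd_m2):
    """Contrast expressed as N (for an 'N:1' rating)."""
    return white_cd_m2 / black_cd_m2

def white_needed_for(ratio, black_cd_m2):
    """Luminance target required to actually reach a given N:1 ratio."""
    return ratio * black_cd_m2

black = 0.42  # best black point I can reach on my NEC, in cd/m^2

# Hitting the advertised 1000:1 would need a ~420 cd/m^2 white point:
print(white_needed_for(1000, black))  # 420.0

# At print-oriented calibration targets the real ratio is far lower:
for white in (90, 120):
    print(f"{white} cd/m^2 -> {contrast_ratio(white, black):.0f}:1")
# 90 cd/m^2 -> 214:1
# 120 cd/m^2 -> 286:1
```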
 
Most modern cards give you sufficient data at full resolution without much degradation if you use DVI/HDMI (avoid VGA); only older cards couldn't cope with high resolutions like 2560x1440 (creating a "lagging" effect). Only if you do gaming, 3D rendering, video rendering, or use the few functions in Photoshop (blur and some others) that run on the graphics card do you need a "capable" (spec- and tech-wise) card. But unless you use some cheap $50 no-name graphics card, picture quality is not an issue, since you calibrate the computer output anyway.

The problem with cheaper monitors is the (relatively speaking) mismatched hardware components and the internal processing of the signal to suit the specific LCD matrix hardware that draws the picture for you. One weak link makes the whole chain weak. High-end manufacturers such as Eizo or NEC have sorted this out by carefully choosing the best-matched hardware palette and making the "chain" work in harmony as strongly as possible (for the price you pay) to draw the most color- and tonally precise picture for the money; this also means high-end dedicated software support on the computer side (both Eizo and NEC have their own software). I.e. the 10-bit-input-to-16-bit LUT processing happens inside the monitor, not on your graphics card, and contrast on the Eizo matrix is just 1000:1, less than the modern standard, so that you can see the midtones etc., so most of it is happening at the hardware level.
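To make the LUT point concrete, here is a toy sketch (my own illustration, not Eizo's or NEC's actual pipeline): pushing a calibration-style tone curve through an 8-bit LUT merges neighboring levels, which shows up as banding, while a 16-bit LUT keeps every input level distinct.

```python
# Toy model: run all 256 8-bit input levels through a calibration-style
# tone curve (a plain 2.2 gamma here, standing in for a real correction
# curve) and count how many distinct output levels survive at a given
# LUT precision. Merged levels = lost tonal steps = banding.

def surviving_levels(lut_bits, gamma=2.2):
    max_out = (1 << lut_bits) - 1
    outputs = {round(((i / 255) ** (1 / gamma)) * max_out) for i in range(256)}
    return len(outputs)

print(surviving_levels(8))   # ~184 of 256 levels survive an 8-bit LUT
print(surviving_levels(16))  # 256 -- a 16-bit LUT keeps them all distinct
```

That is essentially the argument for doing the correction inside the monitor at high precision instead of in an 8-bit graphics-card LUT.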

Actually, no. For AMD and NVidia, 10-bit color, or 30-bit color to be more precise, is only enabled on their professional graphics cards. Their standard graphics cards only enable 8-bit color. All this LUT processing only works when the source provides enough data to even make sense of it. Anything else is just interpolation.
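For a concrete sense of what 8-bit vs 10-bit per channel (24-bit vs 30-bit color) means, a quick sketch:

```python
# Levels per channel at 8 vs 10 bits, and the resulting total palette.
for bits in (8, 10):
    levels = 2 ** bits      # tonal steps per R/G/B channel
    print(f"{bits}-bit/channel = {3 * bits}-bit color: "
          f"{levels} levels, {levels ** 3:,} total colors")
# 8-bit/channel = 24-bit color: 256 levels, 16,777,216 total colors
# 10-bit/channel = 30-bit color: 1024 levels, 1,073,741,824 total colors
```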
 
A 1000:1 contrast ratio doesn't really mean much here. The metric represents the maximum luminance target possible divided by the best black point achievable. I can't speak for Eizo monitors, but I do use NEC SpectraView monitors, and the best I can ever achieve (using NEC software) is a black point of around 0.42 cd/m^2. Achieving a 1000:1 contrast ratio would then require a luminance target of ~420 cd/m^2. If you are calibrating the monitor for anything reasonably acceptable for accurate prints, the target luminance is probably going to be in the range of 90 to 120 cd/m^2, i.e. a contrast ratio of roughly 214:1 to 286:1.

Actually, I was just stating the spec, since it can often be misleading. I manage a black point of around 0.19 cd/m^2 on mine at a working brightness of 100 cd/m^2 (which equals some 526:1, only about half of the spec), while I see advertisements for monitors and TVs claiming 20,000:1 or more! Hence the gap between spec and marketing.


Actually, no. For AMD and NVidia, 10-bit color, or 30-bit color to be more precise, is only enabled on their professional graphics cards. Their standard graphics cards only enable 8-bit color. All this LUT processing only works when the source provides enough data to even make sense of it. Anything else is just interpolation.

Just did a quick Google on this and you're right. I'd somehow always assumed all the Mac Pros (not iMacs or the other more consumer-oriented models) came with proper pro-grade graphics cards and the necessary signal support for pro monitors, since they're mostly used by graphic designers. Well, it turns out MacOS doesn't support 10-bit-per-channel color output at all!

The cards themselves would likely support it, judging by how good the ones in the Mac Pro models are, but the OS seems to be the current limit, or rather the bottleneck.

I will not switch to Windows for this (probably not even with a gun pointed at me!), but I guess we really have to start pressuring Apple on this obvious lack of support!?

Maybe Apple has its reasons, but as with the contrast example above, I know a "bit doesn't equal a bit" (listen to a pro sound card at 16 bits and a cheap consumer sound card at 16 bits on equal terms and the difference is astounding; I'm pretty sure it's the same scenario with 8-bit vs 8-bit or 8-bit vs 10-bit displays). Nonetheless, IMO Apple should provide the necessary OS support for the best the pro hardware market has to offer.
 
There is no doubt the Eizo is a good monitor. It's just that it might be better to get a lower-end model that offers just as good viewing quality and supports only 8-bit color, since you won't be using the higher-end features anyway.
 