best monitor for color grading ?



For HDR grading, you'll need to spend about $30,000 now.

For SDR, I use an Eizo computer display, but I've had to learn how to calibrate it correctly with a proper 3D LUT. This takes time and some money.
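
(For anyone wondering what "calibrate with a 3D LUT" actually involves mechanically, here is a minimal sketch, not tied to Eizo's tools or any particular calibration package, of loading a .cube 3D LUT and applying it to one pixel with trilinear interpolation. The file name "calibration.cube" is just a placeholder.)

```python
import numpy as np

def load_cube(path):
    """Parse a .cube 3D LUT. Returns (size, table) where table has shape
    (size, size, size, 3). In a .cube file the red index varies fastest,
    so the reshaped table is indexed as table[b, g, r]."""
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line and (line[0].isdigit() or line[0] in "+-."):
                rows.append([float(v) for v in line.split()])
    return size, np.array(rows).reshape(size, size, size, 3)

def apply_lut(rgb, size, table):
    """Apply the LUT to one normalized RGB triple via trilinear interpolation."""
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (size - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, size - 1)
    f = pos - lo
    out = np.zeros(3)
    # Blend the 8 surrounding lattice points.
    for dr, dg, db in np.ndindex(2, 2, 2):
        w = ((f[0] if dr else 1 - f[0]) *
             (f[1] if dg else 1 - f[1]) *
             (f[2] if db else 1 - f[2]))
        out += w * table[hi[2] if db else lo[2],
                         hi[1] if dg else lo[1],
                         hi[0] if dr else lo[0]]
    return out

# Hypothetical usage with a placeholder file name:
# size, table = load_cube("calibration.cube")
# print(apply_lut((0.5, 0.5, 0.5), size, table))
```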

For an "out of the box" calibrated display, get an FSI monitor. There may be a 24" version within your budget.


It depends on how much money you have, whether you need HDR, and what colorspace (sRGB, rec709, P3) you are working in.

You can drop $30,000 on a Sony which does everything.

The EIZO 319 is good, but $6,000.

Maybe a top-of-the-line DreamColor for rec709?

The new Apple Pro Display XDR looks interesting, but I'm not sure if it's usable with anything but a Mac.

I'd look at Flanders Scientific; they have a wide range of models. You're better off with a small but accurate monitor than a large unit that can't be trusted.

One of the nicest features of the Flanders is that you can send it back to the factory for calibration. This used to be free, but I'm not sure if that's still the case. Accurate calibration is actually quite important: you need access to the right equipment and someone who knows what they are doing. Flanders uses a $20-40k meter operated by a real video engineer.

You also get things like SDI I/O, a built-in waveform/vectorscope, LUT support and all sorts of good stuff you'd expect on a dedicated broadcast monitor. This also makes it a great monitor to take on set and drive directly from the camera.

I think the smaller 17-inch Flanders come in around $2,500.

PS: I have one of the small Flanders and like it very much. 


18 hours ago, Phil Rhodes said:

I have an Asus PA32UCX here right now. 1200 nits, 1152 zones of LED backlighting, £2500 or so in the UK.

Interesting display. My only gripe is that it's 16:9 UHD resolution and not 17:9 4K.

I'm still using an antique LG 31MU97. It's only 300 nits, but it's 10-bit, 17:9 and true 4K. I just turn off the lights when doing a final grade.


14 hours ago, Tyler Purcell said:

Interesting display. My only gripe is that it's 16:9 UHD resolution and not 17:9 4K.

I'm still using an antique LG 31MU97. It's only 300 nits, but it's 10-bit, 17:9 and true 4K. I just turn off the lights when doing a final grade.

Careful, Tyler, that's advancing the dangerous view that all this obsessing over monitoring numbers is more than offset by the unreliability of the human visual system, and we can't have that!

P


9 hours ago, Phil Rhodes said:

Careful, Tyler, that's advancing the dangerous view that all this obsessing over monitoring numbers is more than offset by the unreliability of the human visual system, and we can't have that!

P

HAHAHAH! But in all truth, we do a lot of 1.85:1 grading, so we need a 17:9 aspect ratio monitor.
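
(Back-of-the-envelope arithmetic, mine rather than anything stated above: a 1.85:1 frame at the full 2160-line height is about 3996 pixels wide, so it fits a 4096-wide 17:9 panel pixel-for-pixel, but has to be scaled or cropped on a 3840-wide UHD panel.)

```python
# Quick check: does a 1.85:1 frame at full 2160-pixel height fit each panel width?
needed = round(1.85 * 2160)  # ~3996 px wide
for name, width in [("UHD 16:9", 3840), ("DCI 4K 17:9", 4096)]:
    verdict = "fits" if width >= needed else "must downscale"
    print(f"{name}: need {needed}, have {width} -> {verdict}")
```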


You don't need a super bright display unless you're doing HDR or you're fighting ambient light contamination.

80-160 nits is the range for most sRGB / rec709 / P3 work. 

10-bit, coverage of the desired colorspace, uniformity across the display, and in-monitor LUT support are far more important.
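
(As a rough illustration of what "coverage of the desired colorspace" means, here is a small sketch of mine comparing the Rec.709 and DCI-P3 chromaticity triangles by area, using their published CIE xy primaries. Real coverage figures are quoted against the intersection of the gamuts, often in a perceptually uniform space, so treat the number as illustrative only.)

```python
# Compare the CIE 1931 xy chromaticity triangles of Rec.709 and DCI-P3.
def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2  # shoelace formula

rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

ratio = triangle_area(rec709) / triangle_area(dci_p3)
print(f"Rec.709 triangle is about {ratio:.0%} of the P3 triangle")  # roughly 74%
```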


Wild pitch, but I also believe you should have a "common denominator" monitor: a simple TV from an electronics store set to factory settings.

It's obviously the right idea to color grade with actual monitors for color grading (like the ones everyone has mentioned), but it's also good practice to see what your grade looks like on the mediums/formats your audience will be viewing your film on. This avoids the problem Game of Thrones had with their major night battle in their final season (news article).

Even though we trust our audience will view our project in the best conditions possible, we'd be lying to ourselves. Most people have their TVs set to factory settings, are watching your movie with sunlight blasting through the windows, or are consuming the film on their iPhone while riding the subway. Having a "common denominator" monitor gives you perspective into how your project will look on the crappiest of setups, which you can then account for.


9 hours ago, AJ Young said:

Wild pitch, but I also believe you should have a "common denominator" monitor: a simple TV from an electronics store set to factory settings.

It's obviously the right idea to color grade with actual monitors for color grading (like the ones everyone has mentioned), but it's also good practice to see what your grade looks like on the mediums/formats your audience will be viewing your film on. This avoids the problem Game of Thrones had with their major night battle in their final season (news article).

Even though we trust our audience will view our project in the best conditions possible, we'd be lying to ourselves. Most people have their TVs set to factory settings, are watching your movie with sunlight blasting through the windows, or are consuming the film on their iPhone while riding the subway. Having a "common denominator" monitor gives you perspective into how your project will look on the crappiest of setups, which you can then account for.

100%. You can, and maybe should, review your content on a consumer monitor before mastering, but the wild variety of looks and settings between brands, and even between generations of the same brand, means there is no middle-ground consumer monitor to use as a reference. So you MUST have something reliable and reproducible as your reference monitor. Grading as close to accurate as possible is the best bet to put you somewhere in the middle of all the wild variation in the consumer market.


A consumer TV should feature the Filmmaker Mode promoted by Martin Scorsese and other filmmakers. This disables all the image over-processing manufacturers tend to apply in order to make their TVs look better than their competitors' on demo stands, like over-saturation of colors and over-sharpening that make even the most cinematic films look like a soap.


11 hours ago, Shawn Sagady said:

100%. You can, and maybe should, review your content on a consumer monitor before mastering, but the wild variety of looks and settings between brands, and even between generations of the same brand, means there is no middle-ground consumer monitor to use as a reference. So you MUST have something reliable and reproducible as your reference monitor. Grading as close to accurate as possible is the best bet to put you somewhere in the middle of all the wild variation in the consumer market.

Of course, have both! I wouldn't degrade the image too badly for the common denominator monitor just to make sure that my look is good enough for a terrible TV.


Same. Way back in the day I'd actually color correct for a Rec709 consumer box TV set at factory settings, and I always had better results from that than from any flat screen or computer monitor. Now I have an HP DreamColor display calibrated with an X-Rite puck, and although it usually performs, there are always issues with getting true black. I use a waveform, but it's those highlights and shadows where you want to know how dark is too dark. A war of attrition given how little people seem to care about true black at all these days.

When calibrating, or when finishing a grade, I would often put a clip up online on an unpublished page and then visit a local Best Buy or Micro Center. I'd go through various brands of laptop PCs, tablets, Macs, etc., whatever devices they had online, and check how the clip looked on a variety of screens and conditions, just to get an overall benchmark.

Calibrating only to industry specs leads to issues like that Game of Thrones finale battle scene that nobody could see, because it was graded for a perfectly calibrated screen watched in a pitch-black room with perfect internet speed. Best to plan for people watching stuff on phones and computers with wildly different results.


10 hours ago, Michael LaVoie said:

Calibrating only to industry specs leads to issues like that Game of Thrones finale battle scene that nobody could see, because it was graded for a perfectly calibrated screen watched in a pitch-black room with perfect internet speed. Best to plan for people watching stuff on phones and computers with wildly different results.

OLED complicates this even further. The contrast ratio and blacks look nothing like even the best LED. At my last job we had one of the high-end broadcast Sony OLED monitors and it was on a planet of its own. Comparing it to any big-screen LED was a futile exercise.

That said, pretty much every professional grading suite I have ever been in has at least two monitors: a calibrated broadcast monitor for the colorist and a good-quality 55-inch or bigger high-end consumer panel for the clients. Lots of Panasonic units, but I've also seen a few LGs.

