
Larry Baum

Basic Member
  • Posts

    21
  • Joined

  • Last visited

Profile Information

  • Occupation
    Other
  • Location
    Northeast


  1. Unfortunately and shockingly, it's starting to seem like I was correct. One scanner operator doing experiments and someone from another forum seem to be reaching the same conclusion (apparently someone contacted an expert at film-tech, who concluded that it surprisingly outputs only REC709-primary-referenced data and simply clips away all the extra rich, beautiful colors that 35mm film can record beyond that): the Lasergraphics Scanstation 6.5K absurdly appears to remap the wide gamut output the hardware produces down to the limited sRGB/REC709 color gamut in software before writing out files. If they force the data to be interpreted with REC709 primaries, they agree the colors look most natural; if they force the primaries to be interpreted as various wide gamuts, it comes out over-saturated and/or odd. Nobody can seem to get a straight answer from Lasergraphics, who seem very evasive and stonewalling even to owners of the machine who pay $$$$$$$ each year for service contracts. Honestly, they could add an option for full native scanner gamut output in less than a single afternoon, but it seems they have no intention to. Maybe they're afraid to, since they would then have to tell everyone that every scan they ever did, including films archived by film archives and museums, had its color gamut clipped. It would be a horrible shame to think of the many films that got their one and only scan without anyone realizing the gamut was clipped; as the original film rots and fades, the original colors may be lost for all time because the digital archive failed to capture them. I don't know how a machine costing tens of thousands of dollars could do that when a $500 stills scanner over a decade ago could easily do wide gamut.
And it has nothing to do with Bayer; almost every DSLR has a Bayer sensor, and even the early Canon 10D DSLR did wide gamut color. So does the camera inside the Scanstation 6.5K. I dunno, I wish I was wrong, but it is starting to seem like I was likely correct. Maybe I and some others are wrong and I'm saying very unfair things about Lasergraphics (don't take this as 100.000000% proven gospel), but it's sadly sure looking like I was right. I'll be able to judge more myself when I get some full scans back soon and also get a color chart scanned. Hopefully I'll be proven wrong, though it doesn't seem like it. We'll see. If so, it's kinda frustrating. I think even Cintel likely allows full wide gamut color capture. The Scanstation seems clearly better in every other regard, so it's kind of a shame. Why have it crippled for no reason (if it really is)?
  2. Thanks! (The thing is, you need to know what the scan data is stored in reference to in order to properly translate it into whatever working space you choose. You could choose P3 and set a display to P3, but if the data is NOT in P3, the software needs to know what it IS in so it can translate it to P3. What if one (or all) of the primaries are in very different locations than the ones in P3? If you just treat it like P3 data, the color balance will be twisted in weird ways and the saturation of each channel will be way off. I just don't see how a motion picture scan can magically skip this initial translation step that all other visual data needs. Data from DSLRs needs it (and a DSLR is more or less what's inside a modern motion picture scanner), data from digital motion picture cameras needs it, and data from stills film scanners seems to. So far it almost seems like Lasergraphics pre-does this step and puts the data in reference to REC709 color primaries (though not the REC709 curve), at least for the Scanstation, perhaps not for the Director. Or maybe it's actually closer to P3, but then what exactly is it, why don't they put it in any metadata, and why does Resolve seem to default to loading the DPX16 and DNG files as if they were REC709 rather than some custom scene (scanner) referenced color gamut?)
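That translation step can be sketched numerically. This is a minimal illustration, not anything from the scanner's actual pipeline: the chromaticities are the published standards for AdobeRGB and P3-D65, while the pixel value and variable names are invented for the example.

```python
import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    """Build a linear RGB -> CIE XYZ matrix from chromaticity coordinates."""
    to_xyz = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([to_xyz(*xy_r), to_xyz(*xy_g), to_xyz(*xy_b)])
    scale = np.linalg.solve(prim, to_xyz(*xy_w))  # so RGB(1,1,1) hits the white point
    return prim * scale

D65   = (0.3127, 0.3290)
ADOBE = rgb_to_xyz((0.64, 0.33), (0.21, 0.71), (0.15, 0.06), D65)        # AdobeRGB
P3    = rgb_to_xyz((0.680, 0.320), (0.265, 0.690), (0.150, 0.060), D65)  # P3-D65

pixel  = np.array([0.8, 0.4, 0.1])            # linear AdobeRGB-referenced data
proper = np.linalg.solve(P3, ADOBE @ pixel)   # AdobeRGB -> XYZ -> P3
naive  = pixel                                # "just call the numbers P3"
print(proper, naive)                          # the two disagree noticeably
```

Only neutral values survive the naive shortcut, because both gamuts here share the D65 white point; anything saturated comes out with the wrong balance, which is exactly the twist described above.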
  3. That has nothing to do with what color primaries things are in reference to, though. Some of the Fuji DSLRs were true RGB, and their RAW files still had to be opened by something that knew what color primaries to reference. My Nikon 9000 ED stills scanner is true RGB per pixel, and its output still needs to be interpreted in reference to some primaries. Think in 1D and think of money. Say you give me $100,000 USD and I 'scan'/'convert' it to 100,000 Digital Dollars, where Digital Dollars are not USD, not Pounds, not Lira, not Euros, etc. You can't just take the 100,000 Digital Dollars and put them in whatever 'format' you want directly. I mean you can, but if I then called it Lira and gave you 100,000 Lira, I think you'd be a bit upset. You need to know what the Digital Dollars were in reference to; they were actually in reference to USD in this case, but if you don't account for that before applying a conversion rate, it doesn't work out so well for one party. Now suppose the scanner camera+filters had primaries really close in to the center of the CIE xy chart and you viewed the data on a P3 or REC2020 monitor directly: it would look insanely over-saturated, and the color balance would likely have a very strange twist to it. Or suppose the scanner camera+filters had primaries pretty far out, beyond P3, around REC2020, and you viewed the data on a monitor set to REC709 gamut: it would look pale and washed out, likely with a strange twist to the color balance. That is what would happen if you took RAW data from a DSLR and didn't apply any knowledge of the camera's color primaries, which is why that is never done.
The software companies find out what the primaries are, treat the RAW data in reference to those primaries, convert it to a standard intermediate color management space (CIE XYZ, D50), translate it from there to whatever working space you want to work in, and then translate that once more to whatever your display is set to.
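The money analogy can be made concrete with the standard published chromaticities for sRGB/REC709 and REC2020 (the code values below are invented for illustration): the same "full red" triple names a very different real-world color depending on the assumed primaries, and misreading the wide one as the narrow one pushes the result outside the displayable range.

```python
import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    """Build a linear RGB -> CIE XYZ matrix from chromaticity coordinates."""
    to_xyz = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([to_xyz(*xy_r), to_xyz(*xy_g), to_xyz(*xy_b)])
    return prim * np.linalg.solve(prim, to_xyz(*xy_w))

D65   = (0.3127, 0.3290)
SRGB  = rgb_to_xyz((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), D65)  # sRGB/REC709
R2020 = rgb_to_xyz((0.708, 0.292), (0.170, 0.797), (0.131, 0.046), D65)  # REC2020

code_value  = np.array([1.0, 0.0, 0.0])   # "full red" -- but relative to what?
xyz_if_709  = SRGB  @ code_value
xyz_if_2020 = R2020 @ code_value
print(xyz_if_709, xyz_if_2020)            # two very different real-world colors

# Take the REC2020 red and ask what an sRGB display would need to show it:
on_srgb_display = np.linalg.solve(SRGB, xyz_if_2020)
print(on_srgb_display)                    # components fall outside [0,1]: out of gamut
```

The negative components in the last line are the numeric form of "nuclear, twisted saturation": the color simply does not exist inside the smaller gamut.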
  4. The data has to be stored relative to something though, doesn't it? Either simply whatever the filters on the scanner's camera produce, or something else if they decide to remap it; I'm still not seeing a third option. At the end of the day, it's basically a DSLR inside the scanner, taking a digital picture of each frame. Any DSLR RAW photo stores its data relative to the raw performance of the sensor + color filters. Any program that loads it properly, like ACR in Photoshop, goes to a custom table that the software maker built to look up the proper primary locations (it would be nice if the camera makers just gave them in the metadata, but they seem to want to be all secretive). It would seem strange and annoying if the motion picture scan machines scanned relative to the scanner's primary locations but didn't put that in metadata, leaving you to hope some software happens to have the proper internal table (which would even be fine, although it doesn't seem like the software has this in these cases; perhaps Resolve does for Cintel, since that is Blackmagic's own scanner; after tons of searching, a little bit of info makes it seem like maybe Resolve does understand that scanner and loads custom wide gamut primaries for it, but I haven't tested it yet and info is hard to find). Otherwise one just randomly applies color gamuts and picks whichever one looks semi-reasonable as the starting point. And most grading tools aren't really meant to correct for wrong assumptions about where color primaries are located, I don't believe, so it might be tricky to get an exact match if whatever you pick isn't exactly right.
Maybe they figured that since motion picture scanning is so obscure compared to DSLR work, digital motion picture work, or stills scanning, a lot of software won't have the proper lookup info, so they just internally scale everything to REC709 gamut so anything can load it by default with no mistakes. But it would seem quite a shame to lose those colors that print film can hold beyond REC709. Or maybe they expect you to have a test sample of colors for every stock you use (not easy for old stock in particular), or to use a spectro to measure various colors on a projected print, compare them to values in the scan, custom-build your own color table, and apply your own input LUT to the scan data. But that's a helluva undertaking and not even easy to do well, when it seems it should be handled far more trivially. I think I've heard of studios doing stuff like that back in the early days, but we should be way past that. Maybe for a negative, not having the primaries chosen exactly isn't quite so bad, since no final look has been locked in (though it would still help, and could still introduce tricky issues), but for a finished print with all the colors already locked in, it seems really strange. Also strange: in Resolve, if you set Custom in the color management panel, you can select the input color space, which I believe is you manually telling Resolve what color gamut+curve the data is stored in (you can select each separately). It lets you do this when loading the DPX16 format. If you select something really wide like Arri Wide Gamut, it looks insane, wayyyy over-saturated and twisted in color. If you select a milder wide gamut like P3, which should be somewhat close to a print gamut, I have to say it still looks oversaturated to me.
I don't have the print on hand to compare with, but it seems hard to believe it looks quite THAT intense. OTOH, if I tell Resolve to treat the data in the DPX 16-bit linear Print file as REC709, it looks about like what I believe the saturation on the print actually is. If it were stored in wide gamut, I'd have thought REC709 would make it look quite bland and maybe noticeably twisted in color, and that P3 or something larger would look more ballpark. So the data just keeps seeming awfully close to REC709 gamut. I'll check into it more in case I made a mistake and it somehow turns out to be closer to P3, which would be good, but it seems otherwise.
  5. But whatever loads them needs to know what the R, G, B values stand for. Where on the CIE plot is the red primary that the data is in reference to? To make it easy, assume 8 bits per channel (yes, I know it's more): what would 255,0,0 represent? It totally depends on what reference primary red it's based upon. To work properly in a display-referred or standard colorspace, the program needs to know how to translate the data into it. A RAW file from a DSLR is stored in reference to whatever the sensor + color filter array make it. If you try to load it into a program that doesn't recognize the camera it was shot on, it won't load properly. The program needs to know the exact camera model and then consult a lookup table (since, for strange and unfortunate reasons, the DSLR makers don't list the info publicly or store it in the metadata) that the software maker had to build by studying the properties of the sensor and, more importantly, its color filter array. It treats the data in reference to the primary locations for that particular camera model and transforms it into the CIE standard color management base, D50 XYZ space. Then in the program you can choose whatever working space you want (for stills, more often sRGB or ProPhotoRGB; for video, more often REC709 or P3 or REC2020, or an ACES workflow), the data is transformed into that space, and everything is done there. (It may need a further transform to the display's exact space if that differs from the working space; in video people often set the display to match the working space, while in stills the working space is often larger than the display's max space.) But the point is that a program needs to know how to interpret the data. If not, it will almost certainly show the image with weird tints and either under- or oversaturation. I believe so; that was the impression I got, but I was not there.
Finished, balanced, print positive. Yeah, negative scans have to be inverted. There did seem to be a PrintLOG option as well; it definitely used a different sort of tone curve, but the actual goal in my case was just the regular Print mode. The AnalogBalance 3-entry matrix in the metadata, I'm guessing, instructs how to handle this and get the channels treated in the original balance, taking advantage of the extra DR that storing it this way allows, I believe. I would have thought so, but surprisingly it appears not to be the case. The cDNG doesn't seem to preclude 2-flash HDR. It even sets the metadata to claim that the DNG is stored in display-referred form rather than scene-referred form (which is what DSLRs and digital video cameras use). It could be that some software recognizes the IMX342 sensor, has its native primary locations, and uses those. But again: if I simply force a more stills-oriented program to treat the color gamut as sRGB/REC709, it loads with the same color tint/saturation as the DPX16 samples do in Resolve or Premiere Pro. If Resolve had a special table for the Scanstation and IMX342, and the data were referenced to the wide native gamut of that camera, then telling Photoshop or whatnot to treat the data as sRGB/REC709-referenced should make it load looking noticeably undersaturated, and very likely with a different tint as well. Instead it loads with almost identical saturation and only a very minor tint shift; it ends up looking extremely close, almost identical, to how the DPX16 automatically loads into Resolve. If I do the same thing with a wide gamut image from a DSLR, it loads way differently when a program assigns it the wrong color space reference.
  6. I thought it was for taking the raw scanner/camera data and converting it to the CIE XYZ D50 intermediate space where most color engines seem to process everything. If it were taking the raw data and converting it straight to sRGB/REC709, that would be another matter; the RAW data could then still be wide gamut, and that certainly would be nice! I hope I am just being dumb and interpreting that matrix from the wrong end. One weird thing is that the regular ColorMatrix values in the metadata are just the identity matrix, which doesn't seem correct at all. Programs that default to using those matrices first, and reverse-engineer a forward matrix from them, display the image in beyond-wild crazy colors with insane saturation and tint, so I don't think those can be correct. I thought I was reading the RAW primary locations out of the ForwardMatrix properly, though, and they came out quite close to sRGB/REC709, just a trace off. Unless the matrix is meant to be crunching the data down to that. But when I looked at the ForwardMatrix someone showed for a DSLR, it was very different from the one in this scan, and when I did the same decomposition on it, it popped out very wide gamut primary locations. Also, the fact that simply telling Photoshop to assign sRGB to the DPX 16-bit linear scan made it look reasonably as expected also made me think the data was stored in reference to sRGB/REC709 primaries. If I tell Photoshop to assign sRGB to a DSLR jpg that I saved in, say, AdobeRGB or even wider ProPhotoRGB, it looks very off when it loads. Same if I load a DNG from this scan into RawTherapee: first it uses the ColorMatrix and gets crazy results, but if I tell it to assume the data is stored in reference to sRGB/REC709 primaries, it looks kinda normal (unless the film is more saturated than I think and this is just some undersaturated, somewhat-off look).
Another problem: if the files really did have wide-gamut-referenced data in them, and the DPX16 files have zero metadata about the color gamut, then how in the world would any program know what to do with them? For DSLR RAW that actually is the case, but each raw converter has a custom database: it looks up the camera name in the metadata and applies whatever primary locations they found by testing. But the DPX aren't even really raw files, and Photoshop has no idea what to do with them. And in ACR for the DNG: if I load images using the ProPhotoRGB working space, wide enough to handle anything the data could reasonably contain, and then soft-proof down to sRGB/REC709 primaries, it doesn't show any clipping warnings, which does happen if I do that with a known wide gamut image. (Granted, many frames won't have wide gamut info, but a few of these were pegged straight along the edge of sRGB/REC709, which means the film should have gone beyond, and so should the file, if the data were really in there, I'd think.)
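That soft-proof style check can be approximated in a few lines. This is a rough sketch assuming linear, P3-referenced pixel data; the chromaticities are the published standards and the sample pixels are invented.

```python
import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    """Build a linear RGB -> CIE XYZ matrix from chromaticity coordinates."""
    to_xyz = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([to_xyz(*xy_r), to_xyz(*xy_g), to_xyz(*xy_b)])
    return prim * np.linalg.solve(prim, to_xyz(*xy_w))

D65  = (0.3127, 0.3290)
SRGB = rgb_to_xyz((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), D65)
P3   = rgb_to_xyz((0.680, 0.320), (0.265, 0.690), (0.150, 0.060), D65)

def clips_in_709(p3_pixels, tol=1e-6):
    """True for each linear P3-referenced pixel that sRGB/REC709 cannot contain."""
    srgb = np.linalg.solve(SRGB, P3 @ p3_pixels.T).T    # P3 -> XYZ -> sRGB
    return np.any((srgb < -tol) | (srgb > 1 + tol), axis=1)

pixels = np.array([[0.2, 0.5, 0.3],    # an ordinary, in-gamut color
                   [1.0, 0.0, 0.0]])   # full P3 red, beyond sRGB/REC709
print(clips_in_709(pixels))            # first is fine, second clips
```

If data claimed to be wide gamut never trips this kind of flag, even on frames with obviously intense colors, that is at least consistent with the gamut having been clipped upstream.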
Yeah, it is slightly off from a pure sRGB/REC709 D65 matrix, which I was guessing is perhaps because it is D50-based, although I didn't do the math to see whether that accounts for the little bits of difference. Here are the relevant tags:

CFA Pattern 2 : 0 1 1 2
Linearization Table : (Binary data 23879 bytes, use -b option to extract)
Black Level : 0
White Level : 65535
Color Matrix 1 : 1 0 0 0 1 0 0 0 1
Analog Balance : 1.03499997 0.9990000131 1.210325574
As Shot White XY : 0.3456999958 0.3585000039
Calibration Illuminant 1 : D50
Colorimetric Reference : 1
Forward Matrix 1 : 0.4360747041 0.3850649001 0.1430803985 0.2225044967 0.7168785933 0.06061689931 0.01393220016 0.0971044973 0.7141733173
CFA Pattern : [Red,Green][Green,Blue]

The ForwardMatrix looks pretty similar to a matrix taking data stored in reference to (maybe D50-adapted) sRGB/REC709 primaries and putting it into XYZ D50 space, and it looks very different from the few DSLR matrices I've seen.
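For what it's worth, that ForwardMatrix can be checked directly against the widely published Bradford-adapted sRGB-to-XYZ(D50) matrix (the `srgb_d50` values below are those standard published numbers, not anything from the scan), and its columns can be read back as chromaticities:

```python
import numpy as np

# ForwardMatrix 1 as quoted from the DNG metadata above
forward = np.array([[0.4360747041, 0.3850649001, 0.1430803985],
                    [0.2225044967, 0.7168785933, 0.06061689931],
                    [0.01393220016, 0.0971044973, 0.7141733173]])

# Standard published sRGB -> XYZ matrix, chromatically adapted to D50 (ICC-style)
srgb_d50 = np.array([[0.4360747, 0.3850649, 0.1430804],
                     [0.2225045, 0.7168786, 0.0606169],
                     [0.0139322, 0.0971045, 0.7141733]])

print(np.allclose(forward, srgb_d50, atol=1e-6))   # True: a dead match

# Read the red primary back out of the first column of the ForwardMatrix:
x, y, z = forward[:, 0]
s = x + y + z
print(x / s, y / s)   # ~(0.648, 0.331): the D50-adapted sRGB/REC709 red
```

So the "slightly off" part is fully explained by the D65-to-D50 adaptation; the matrix is, to the digit, an sRGB/REC709 forward matrix rather than anything wide gamut.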
  7. It wouldn't be. At that stage it would surely be in some very wide gamut referenced form. Yeah, it would be very surprising for them to use such limited filters, and I doubt they'd use weird low-spectrum LEDs in it. The only thing I can think of is that, for some reason, they decided to take the native scan data, referenced to the primary locations of the scanner (which are surely very wide gamut), and convert it down to sRGB/REC709 primary locations, maybe solely thinking of targeting the old broadcast standard, or of making it automatic that any program will load the files normally without having to have color management. But it would seem crazy restrictive not to have a toggle option somewhere to just leave it full wide gamut. Of course, maybe there is one and it's just well hidden or strangely labelled, and the scan operator simply didn't have it set to allow wide gamut even though the machine can deliver it. Either that, or there is something very different about these files that I am missing, being used to handling digitally shot RAW or stills film scanner RAW. Or the print is way more saturated than I recall, it's actually NOT loading correctly into programs (they're just defaulting to sRGB/REC709 since the DPX16 have no colorspace metadata and the DNG, for whatever reason, were given sRGB colorspace data), and the sRGB/REC709 interpretation that looks fairly normal is actually somewhat undersaturated with the colors somewhat twisted and tinted.
  8. P3 would be plenty decent enough. (It does still clip some real-world stuff; some flowers handily clip it noticeably, but few displays other than a few really high-level pro ones can display much more than P3 yet anyway, so going beyond it only really matters in a long-term preservation sense for future mainstream tech, not practically at this moment.) Certainly for typical print film P3 is good enough since, as you say, P3 was based somewhat on typical motion picture print film gamut, so if you have P3 or something very close, you have more or less everything your print film could contain, no matter what was shot on it and how. If I thought the scanner was giving something along the lines of P3, that would be perfectly fine; that would deliver everything I expect from the film scans. The problem is that it seems to be REC709, NOT P3 or something else wide. And yeah, for plenty of material even that won't matter much, but get into fall foliage, tropical waters, emeralds, flowers, 80s clothes or any clothes with rich saturation, fireballs, some birds, brightly colored sports cars, sunrises, sunsets, etc., and it can easily get noticeably clipped by REC709. Heck, even some animated stuff, if they wanna go wild (it depends a lot; some does, some doesn't), has colors easily beyond REC709: wild reds, oranges, purples, cyans, yellows, greens. Just on an orange/red bit in some of the frames I can see the gamut pegged hard along the edge, hard-clipped, so those frames obviously had more than REC709 in them. On other frames everything is easily within REC709. It all depends on what is in the scene.
Not really. I know that film can deliver a wider gamut than REC709, that even my own personal film stills scanner can, and that even far cheaper, totally average-Joe consumer stills film scanners can, so I'd naturally be surprised if it turned out there is no way to set something like a Scanstation 6.5K to get greater than a REC709 gamut (hopefully there is, or there is something about the files I am missing or not understanding).
  9. Yeah, it's not that sort of stuff that I am talking about though. Color gamut IS the sort of thing you could easily notice at 24fps. I mean in the extreme form, if you slide each color primary over to the white point you get B&W and that surely looks different than something shot in color with REC709 primaries or whatever. Lots of other stuff has been mentioned but that stuff is all NOT what I was getting at.
  10. I suppose for some reason Lasergraphics might have limited native-scanner-primary-referenced wide gamut scans to the Director, but that would seem surprising. Even cheap stills film scanners, even some using the Bayer method, allow wide gamut scans, and full-color scanners like the Nikon 9000 Coolscan cost 20x-100x less, came out years earlier, and allow wide gamut film scans. So I wonder if maybe there's some weirdly named option the scanner operator isn't seeing, with the machine set to REC709 color gamut when some little setting change could allow wide gamut. Maybe there is something about the scan files I am simply not interpreting correctly, but so far I can't see anything I'm doing wrong in that regard; open to suggestions. Yeah, Bayer sensors are not perfect, but almost all DSLRs use them, and virtually all DSLRs have put out wide gamut RAW files and JPGs from day 1. You lose a bit of color resolution and precision and get more metamerism and so on, but in terms of primary locations, most of them are set quite wide indeed, way beyond sRGB/REC709. I did just ask Lasergraphics the other day whether the 6.5K machine can output data referenced to color primaries wider than those of sRGB. I will see what they have to say, if anything. I've heard rumors that unless you own a machine or appear ready to buy one, they might not talk much, if at all, so I'm not 100% sure I'll even get a response. Someone else using one of the machines asked them a few questions along these lines and said they got a vague "it's all proprietary secrets" kind of answer that didn't make anything clear. I'll see if I get a more clarifying answer to a couple of more exactingly specific questions.
In the meantime, I was sort of hoping that in one of these forums someone would just say: oh yeah, you just have to make sure that such-and-such setting is set (or not set) and then you'll get wide-gamut-referenced output. Or, although it would be a shame: nope, that machine limits output to REC709 color primary locations and there is nothing the scanner operator could have done, but such-and-such brand and model can do it. It's almost impossible to find info on exactly what any of the models from any maker can do regarding color gamut. I found a few vague hints that the Cintels may actually give wide gamut color, but it's not entirely clear (they also seem otherwise inferior to stuff like the Scanstation). It's surprising, since in the film stills world it's trivial to find this info for any stills scanner, and people talk all over about standard vs. wide gamut scans. But the motion picture world is much smaller. I do see that sort of talk a lot in purely digital workflows, but I haven't had luck finding much for motion picture film scanned sources, only for digitally shot productions, where in recent times you can find plenty of talk about wide gamuts (or, back in the day when the first digital motion picture cameras appeared, you'd hear some bash them for small REC709 palettes and poor DR, lacking film's DR and its richer color possibilities).
  11. It wasn't a TAF. The little sample bit I got back didn't include any of the color or gamma tracking frames at the very start of the print, just a few seconds of the film itself. A TAF doesn't seem to have very saturated colors, so I'm not sure you could see much from that. Perhaps if it had one of the fuller color-checker-type charts, as some prints do (I can't recall exactly what this one has at the start), some of the colors would be beyond the sRGB/REC709 primary locations. It would be good to get those frames included in the full final scan, to see what tweaks are needed to get scans from the machine closer to perfectly matching the print, but that seems like a separate issue. I'm not sure why a TAF scan would tell you any more about the color gamut than just scanning a few seconds of a final print, seeing what the metadata in the DNG output says, and seeing how the DPX react when assigned different gamut profiles. Maybe one of the more comprehensive patterns would have a few patches outside REC709 gamut, and one could try to see if they looked a touch muted relative to the others, or whether any program or profile assignment could bring them back, but that is a bit subtle. You can easily tell roughly what sort of gamut any jpg was stored in reference to, so long as it has a reasonable amount of color in the scene: apply the wrong gamut assignment and it will look normal, or way oversaturated and twisted, or way undersaturated and twisted (and for that file type the metadata would tell you the reference gamut as well, even if the image were 100% white or pitch black).
  12. Also I should clarify, I don't think it is that all the output samples are in REC709 color SPACE, they just all seem to be in REC709 color GAMUT (solely with regard to the xy locations of the color primaries on an CIE xyY gamut plot).
  13. Thanks. Yeah, the Scanstation 6.5K also has a Bayer sensor, like most DSLRs, so it also has to debayer the file (or, in the case of DNG output, let the user's software do the de-Bayer). From what I read, the Director and Scanity are true RGB per pixel, etc. The real issue is whether the Scanstation is taking the native color data, referenced to the native gamut of the scanner (surely pretty wide), and converting it down to the small sRGB/REC709 primary locations (the long-time broadcast standard, but far less than film's color palette and far less than the new broadcast and home video standards) before writing out the files. It would seem shocking and baffling for them to do that, but it seems the samples I received have had that done to them. I'm not sure whether the person doing the scan had some setting forcing that, or the Scanstation 6.5K really is restricted to the smaller color gamut, or I'm somehow misinterpreting the files (but then why do the DNGs have an sRGB ForwardMatrix and so on?). How are Hollywood studios preserving the full color gamut of 35mm/70mm films they scan in and release on UHD if the scanners clip to a small gamut on output? Yeah, they can still get true HDR out of them, but what about the colors? I know for The Wizard of Oz they did essentially a B&W scan of each of the three Technicolor strips and probably measured the spectral response of what each strip recorded, ending up capturing the full, rich, huge color palette of three-strip Technicolor. But what about regular films on single-strip full-color Eastman or Fuji or whatnot? If you look at people scanning stills from film cameras, you see all sorts of talk about scanning into wide gamut formats and how one should not use sRGB/REC709 if you care about preserving all the possible colors in a slide or negative.
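If a scanner really does bake wide-gamut native data down to sRGB/REC709 with hard clipping before writing files, the loss is permanent; no later conversion can restore it. A toy sketch (standard published chromaticities; the pixel value is invented):

```python
import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    """Build a linear RGB -> CIE XYZ matrix from chromaticity coordinates."""
    to_xyz = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([to_xyz(*xy_r), to_xyz(*xy_g), to_xyz(*xy_b)])
    return prim * np.linalg.solve(prim, to_xyz(*xy_w))

D65  = (0.3127, 0.3290)
SRGB = rgb_to_xyz((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), D65)
P3   = rgb_to_xyz((0.680, 0.320), (0.265, 0.690), (0.150, 0.060), D65)

original = np.array([0.95, 0.10, 0.05])          # saturated red, P3-referenced
in_srgb  = np.linalg.solve(SRGB, P3 @ original)  # express it in sRGB terms...
baked    = np.clip(in_srgb, 0.0, 1.0)            # ...and hard-clip at the gamut edge
restored = np.linalg.solve(P3, SRGB @ baked)     # later attempt to go back to P3
print(original, restored)                        # they no longer match
```

The round trip is lossless only for colors that were inside sRGB/REC709 to begin with; for everything else, the clip destroys information that no archive-side grade can recover.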
  14. In an image file based on the normal tristimulus model, the data has to be stored in reference to some primary color locations; otherwise, how in the world do you even know how to interpret it? You need to know where on the CIE xy chart the R, G and B primaries are located. The DNG output file has a ForwardMatrix in the metadata that tells programs how to interpret the data: color management modules are standardized to work in CIE XYZ D50, and that ForwardMatrix is multiplied by the R,G,B color value to convert it from whatever reference it is in to D50 CIE XYZ. The matrix in the sample files I received is extremely close to the matrix one would use to convert data stored in reference to the sRGB/REC709 color gamut (the slight difference might be because the scanner light source was called D50 while sRGB/REC709 use a D65 white point). And again, if I load a DPX16 and tell a program to interpret the data as if it were stored in reference to sRGB/REC709 primary locations, it ends up looking pretty much the same as the file loads in Premiere Pro or Resolve, or as the REC709 ProRes output looks. That sort of translation is done all the time: if you have an image in, say, AdobeRGB and are viewing it on a screen set to sRGB, any color managed viewer will convert the gamut from AdobeRGB to sRGB so it looks normal on that screen. But you can't add back colors that were clipped to begin with. If the data was clipped away, it is clipped away. You can boost saturation (and saturation at bright levels) and sort of bring some of it back to an extent, but it won't be the same, and it's hard not to get other stuff looking artificially boosted. The tints might also go wrong, since the color primary location it was interpreted as might not lie on a straight line of saturation out from the white point, but might be shifted one way or another. I'm not talking about dynamic range though.
Dynamic range and color gamut are not the same thing (although more dynamic range can enlarge the gamut in a sense, that is not the part I am talking about). I'm not talking about SDR vs. HDR. I'm talking about the color gamut aspect of, say, sRGB vs. AdobeRGB vs. P3 vs. REC2020. I'm talking about stuff like: https://www.eizo.com/library/basics/lcd_monitor_color_gamut/ https://webkit.org/blog-files/color-gamut/ https://nick-shaw.github.io/cinematiccolor/common-rgb-color-spaces.html https://www.benq.com/en-us/business/resource/trends/understanding-color-gamut.html https://www.displaymate.com/Display_Color_Gamuts_1.htm I'm not talking about being able to see details inside a room as well as out the window in bright sun at the same time, and I'm not talking about whether you have large steps between tones or super fine gradations. Camera RAW files tend not to include the color primary locations, since the manufacturers get all secretive; that is also why you keep needing RAW processing updates when a new camera comes out. The software company has to measure the color filters and sensor, see where the camera's color primaries should be located, and only then does it know what the values in the RAW file mean. Say we use 8-bit to keep it simple: what does an R,G,B of 255,200,10 even mean? If you don't know where the R, G and B primaries are, you have no clue what 255/255 red, 200/255 green and 10/255 blue means. If the green is located where it is in sRGB/REC709, it means one thing; if it is way, way up there on the CIE xyY chart, it means something else. If the data was in reference to the former but you treated it as if it were in reference to the latter, you'd get absolutely nuclear-intense greens, and they wouldn't just be way oversaturated in the green component; they'd most likely also be shifted in tint, with too much blue or red in them. That is the sort of thing I am trying to get at.
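To put numbers on the 255,200,10 example: the very same code triple lands on measurably different chromaticities depending on which primaries you assume (standard published chromaticities for sRGB/REC709 and REC2020; transfer curves ignored for simplicity):

```python
import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    """Build a linear RGB -> CIE XYZ matrix from chromaticity coordinates."""
    to_xyz = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([to_xyz(*xy_r), to_xyz(*xy_g), to_xyz(*xy_b)])
    return prim * np.linalg.solve(prim, to_xyz(*xy_w))

D65   = (0.3127, 0.3290)
SRGB  = rgb_to_xyz((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), D65)
R2020 = rgb_to_xyz((0.708, 0.292), (0.170, 0.797), (0.131, 0.046), D65)

code = np.array([255, 200, 10]) / 255.0   # the bare triple means nothing by itself

def chromaticity(M, rgb):
    """CIE xy chromaticity of a linear RGB triple under the given matrix."""
    X, Y, Z = M @ rgb
    return X / (X + Y + Z), Y / (X + Y + Z)

print(chromaticity(SRGB, code))    # one yellow-orange...
print(chromaticity(R2020, code))   # ...and a different, punchier one
```

Same bits, two different colors; without the reference primaries, the file is ambiguous, which is the whole point of the post above.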
  15. Does anyone here think they have any DPX16 or DNG files from a Scanstation 6.5K stored in reference to some wide gamut (like P3 or REC2020, or more likely the native scanner gamut primary locations)? And again, I am NOT referring to the tonal curve of REC2020, or bit depth, or LOG vs. flat, simply the location of the color primaries and whether the color gamut is wide or standard broadcast. Like how the new UHD blu-rays not only have more bits per channel and HDR, but also store colors in reference to the very wide REC2020 color gamut (even if mostly restricted to the P3 gamut within it, which is still a good deal wider than the old REC709 gamut of blu-ray and HD broadcast TV). Like how you can look at intensely colored fall foliage, flowers, sunsets, and tropical waters and see much more intense colors if you view DSLR pics processed in a wide gamut format on a monitor set to wide gamut (even if still in SDR mode). Oh, also: the metadata in the DNG samples I received has the ColorimetricReference tag set to 1, which means display-referred, not scene-referred. This triggers Adobe ACR to load the DNGs as pseudo-RAWs and changes the color temp slider from a Kelvin scale to a points scale, like when you force jpgs into Adobe ACR. Which also seems strange. I downloaded some video shot on someone's drone and output as DNG, and it had that tag set to 0; same for shooting MLR on a DSLR. Adobe ACR loads those as true RAW and gives the normal Kelvin slider for color temp.