Graeme Nattress

Everything posted by Graeme Nattress

  1. Because talking about this wastes lots of time with endless back and forth. Backlit wedge tests are currently my best method. I don't like the stopping-down-the-lens / multiple-shot method due to the abundance of frames and data to analyse. It's also quite open to human error and relies on the calibration of the lens, among other things. Also, by DR we want to know what the dynamic range can be at one time, not over time, so doing it all in one frame makes sense. I think we agree on this! Imatest is a clever program indeed, but for me it's not a definitive way to test DR. That is because it is not consistent, in that using different curves gives different results (it should not, in my opinion), and because when I put known linear data through a known gamma curve, the software did not accurately tell me which gamma value I'd used. So to put it plainly, no, I don't accept Imatest as a tool for measuring dynamic range. I've said why, and I've said so multiple times. If I get different results by changing things which should not change the answer, I cannot accept it. As Jim says, we're not interested in doing comparative tests ourselves - but we do test constantly to help us improve what we do, and we use our own methods, as I've outlined before, for that. If Imatest worked properly with known linear light data (no futzing of curves allowed) I'd have been happy to use it, but without that, I'll probably have to write my own internal tool.
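A minimal sketch of the round-trip check described above: encode known linear values with a known gamma, then see whether a simple fit recovers that gamma. The values and the power-law curve are hypothetical; the point is that any analysis tool claiming to linearise data should pass something like this before its noise or DR figures are trusted.

```python
import numpy as np

# Hypothetical known linear "exposure" values, e.g. a perfect step wedge.
linear = np.linspace(0.01, 1.0, 21)

true_gamma = 2.2
encoded = linear ** (1.0 / true_gamma)   # apply a simple power-law curve

# Recover the gamma with a least-squares fit in log-log space:
# log(encoded) = (1 / gamma) * log(linear)
slope = np.polyfit(np.log(linear), np.log(encoded), 1)[0]
recovered_gamma = 1.0 / slope

print(f"applied gamma: {true_gamma}, recovered: {recovered_gamma:.3f}")
```

If a tool can't round-trip a clean test like this, the curve it assumes (and any dynamic range number derived from it) is suspect.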
  2. Evangelos, my only real comment is that I find measuring DR hard. It's not an easy test to perform. There are many parameters that can ruin the results, especially when comparing across cameras, and across tests done in different ways by different people. Not least, once your shooting is done, measuring the results in a meaningful way is also tricky. Then there's the fact that measuring noise is not the same as viewing noise. The eye can see that two different noises with the same PSNR can look very different. Stephen, I'd love to have the time to develop better ways of measuring DR, based on linear light values from calibrated charts, and have it automated, but I have much more pressing tasks. Such a scheme would be very useful for internal R&D and I'm sure I'll get to it at some point. Graeme
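To illustrate the "measuring noise is not the same as viewing noise" point, here is a small sketch (hypothetical flat-field data, NumPy only) in which random grain-like noise and row-correlated, fixed-pattern-like noise come out with roughly the same PSNR, even though the striped version looks far more objectionable to the eye.

```python
import numpy as np

def psnr(reference, test, peak=1.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((reference - test) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
flat = np.full((256, 256), 0.5)                       # a flat grey patch

sigma = 0.02
grainy  = flat + rng.normal(0.0, sigma, flat.shape)   # random, grain-like noise
rows    = rng.normal(0.0, sigma, (256, 1))
striped = flat + np.repeat(rows, 256, axis=1)         # row-correlated, pattern-like noise

print(psnr(flat, grainy), psnr(flat, striped))        # roughly equal numbers, very different look
```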
  3. As I pointed out, you can go on about Imatest until you're blue in the face, but because its noise figures are based on the curve of the data, which it does not linearize correctly, the numbers are essentially meaningless.
  4. 1) Adam's test, which was interpreted above, from the image Adam showed of him conducting the test, allowed stray light to hit the chart. That means it never got dark enough. It was a poorly conducted "quick" test, not representative of the proper tests that Adam can perform, and not representative of the camera either. 2) Imatest can give wildly different results depending on the curve applied to the data, even if that curve DOES NOT affect dynamic range. 3) Latitude is the amount you can over-expose or under-expose an image and still pull it back and have it look OK. It's related to dynamic range, but IS NOT dynamic range. If you mean latitude, say it, or if you mean dynamic range, say that, but don't mix the two - it's confusing. 4) We have always quoted dynamic range as measured on a Stouffer test chart, with the measurement being the counting of the wedges from clipping through to the point where you can no longer see the difference between one wedge and the next. This is a useful measurement as it's repeatable and easy to measure. It doesn't require complex software, and ignores issues like what curve you use on the data. 5) Every digital camera has its dynamic range noise limited. It is up to you, the user, to explore the available dynamic range and see what your tolerance is. Graeme
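A rough sketch of the wedge-counting idea in point 4, with hypothetical per-wedge readings: walk down from the clipped end and stop when the step to the next wedge is no longer clearly separable from the noise. The threshold k and the simulated readings are illustrative assumptions, just as visually judging "can no longer see the difference" is a judgement call.

```python
import numpy as np

def count_usable_stops(wedge_means, wedge_noise, stops_per_wedge=1/3, k=1.0):
    """Count usable dynamic range from a backlit transmission step wedge.

    wedge_means: mean code value of each wedge, brightest (clipped) first.
    wedge_noise: standard deviation measured on each wedge.
    A wedge counts while the step down to the next wedge is still clearly
    larger than the noise on that next wedge.
    """
    usable = 0
    for i in range(len(wedge_means) - 1):
        step = wedge_means[i] - wedge_means[i + 1]
        if step <= k * wedge_noise[i + 1]:
            break
        usable += 1
    return usable * stops_per_wedge

# Hypothetical readings: a 41-step, 1/3-stop-per-step wedge with a black offset
# and a flat read-noise floor (illustrative numbers only).
steps = np.arange(41)
means = 60000.0 * 2.0 ** (-steps / 3.0) + 100.0
noise = np.full_like(means, 8.0)
print(count_usable_stops(means, noise), "stops (illustrative numbers only)")
```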
  5. Because of the MTF rolloff as you get to higher and higher detail, a small amount of sharpening can compensate for this, to an extent, without going over the edge into producing those horrible, thick halos you see around the edges on some systems. It's therefore OK to add a very, very small amount early on. However, as you point out, it's best to add what you need at the end, if needed. If you're downsampling, you'd probably just pick an appropriate downsampling filter and not need sharpening at all.
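A minimal unsharp-mask sketch of the "very small amount early on" idea, using SciPy's Gaussian blur; the radius and amount values are arbitrary assumptions for illustration, not anyone's actual processing chain.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gentle_sharpen(image, radius=1.0, amount=0.15):
    """Mild unsharp mask: add back a small fraction of the high-frequency detail.

    Keeping `amount` small nudges the MTF back up a little without producing
    the thick halos that heavy sharpening creates around edges.
    """
    blurred = gaussian_filter(image, sigma=radius)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)
```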
  6. Lines are like square waves - they have infinite frequency content, and therefore we've just violated sampling theorem. If instead we'd used a sinusoidal pattern we'd be OK. But video projectors lack a reconstruction filter - they just show the samples as is, rather than doing reconstruction as you'd get in a CD player on audio waveforms, for instance. What you have pointed out is the inability of any sampled system to properly sample an image that has frequencies that are too high in it. Measurement of lines is completely interchangeable - you've just changed the rules midway. In your thought experiment, you've omitted the optical low pass filter, which would send your fine pattern of lines near the maximum limit to a uniform grey. That's what we see (or should see) on any resolution chart approaching maximum resolution. Also, the MTF of the system will be reducing contrast significantly at that point anyway. You can think of film as either having its own built-in OLPF due to the random grain structure, or that the random grain structure breaks up any aliasing into a random pattern that you cannot detect. Sharp-edged resolution charts are used because sinusoids are harder to print nicely, but if you want truer measures, sinusoidal charts are more useful. Also, zone plates are more useful than trumpets as they make the aliasing pop out as extra circles in the image that were not there in the target.
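For reference, a sinusoidal zone plate is easy to generate. A minimal sketch (NumPy, hypothetical size and frequency scaling): frequency rises linearly with radius, so any aliasing introduced by a camera or scaler shows up as spurious extra rings that were not in the target.

```python
import numpy as np

def zone_plate(size=512):
    """Sinusoidal zone plate: spatial frequency rises linearly with radius,
    reaching Nyquist (0.5 cycles/pixel) at the mid-edges and exceeding it
    in the corners, where aliasing will therefore be visible."""
    k = np.pi / size
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2].astype(float)
    return 0.5 + 0.5 * np.cos(k * (x * x + y * y))
```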
  7. I don't think interchangeables would be good, as you'd risk getting dust on the sensor. And you'd have to have a way of recording in the metadata the filter used. It would add a large degree of complexity. Obviously, it can be done, but it could be very tricky to do right. Also, different OLPFs would be different thicknesses and that would throw your focus off unless you could compensate somehow.... It's not trivial. As for a demo - how about I walk around NAB wearing a zone plate T-shirt and people can point cameras at me :-)
  8. If you have 2048 lines (samples) that means you can store any frequency up to 2048/2, i.e. 1024 line pairs. 1024 line pairs is of course 2048 lines. Of course, any frequency above 1024 line pairs must be removed to avoid aliasing, and indeed, optical filters don't roll off high frequencies very fast, and hence must be brought in early. As you say, if you could have optical filters with negative taps we'd get faster rolloff, but we may also get ringing and halos too....
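A small sketch of that fold-back (hypothetical 2048-sample frame, NumPy only): a sinusoidal pattern above the 1024 line-pair limit produces exactly the same samples as a lower-frequency pattern mirrored about Nyquist, which is why anything above the limit has to be filtered out before sampling.

```python
import numpy as np

samples = 2048                        # photosites across the frame
x = np.arange(samples)

def sampled_cosine(line_pairs):
    """Sample a sinusoidal pattern of `line_pairs` cycles across the frame."""
    return np.cos(2 * np.pi * line_pairs * x / samples)

below = sampled_cosine(900)           # under 1024 line pairs: representable
above = sampled_cosine(1200)          # over Nyquist...
alias = sampled_cosine(2048 - 1200)   # ...is indistinguishable from 848 line pairs

print(np.allclose(above, alias))      # True: the sampled data are identical
```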
  9. If you have such a card, divided into 2048 areas, alternate areas filled with black and white, you have a card that contains, if filling the FOV, a resolution of 2k lines or 1k line pairs. And yes, if you have a >2k camera, you should be able to accurately sample that image. For 2k projection, you'd be at the mercy of whatever downsampling filter you use to create a 2k image from your >2k camera image.
  10. 1500 lines is 1500 lines per picture height, which is 750 line pairs per picture height. A line pair is a white and a black line together. The best reference I've found for this is here, http://www.cst.fr/IMG/pdf/35mm_resolution_english.pdf in which film was tested through the stages of production through to typical real-world cinema projection, where best performance was 875 lines per picture height. For a projector to display 1500 lines it needs 1500 pixels. You're confusing things by thinking of aliasing at this point, and confusing frequency, which would equate to line pairs, with pixels and lines, which differ by a factor of 2, lines being twice as many as line pairs. Graeme
  11. But sampling theory says all sampled systems give less measured resolution than the number of pixels on the sensor, unless you allow for nasty aliasing, which I think we can both agree is not something we want on a motion picture image. That fact doesn't change whether you're a Bayer pattern single sensor or a 3-chip system with prism or any other kind of sampling.
  12. I dunno. Ever since people began asking about Bayer stuff I've been saying you get about 70% of the sensor resolution as measured luma resolution. Try to get much more than that (we're getting just over 75%, which I'm happy about) and you're using too weak an OLPF and you're in for a nasty alias surprise. However, you don't just want resolution - I've seen some demosaics from some stills cameras that are as nasty as hell in attempts to "win" on resolution. We've put a lot of effort into the raw compression / demosaic combo, doing some things differently from how you'd expect, and of course, that leads to the visual image which people are enjoying looking at, and enjoying working with.
  13. Microlenses are used to increase the fill factor. Although a larger fill factor can reduce the nasties of aliasing, they cannot act as anti-aliasing filters or reduce aliasing below what sampling theory dictates. Remember, your test chart for the "lines" should be sinusoidal in nature, as a square wave has, effectively, infinite frequency content because of the sharp edge. Graeme
  14. That's expensive! Last time I was at the Cinema here in Canada, it was about $10 CAD, and with lowering attendances I think they tried to drop it - or was it $12 that they dropped to $10 - either way, way too much when you can buy a DVD and all three of us can watch the movie at home for less than a trip to the Cinema, and as you may know with little ones, if they watch a movie once, they watch it ten times! Graeme
  15. Pay more to go to the Cinema? No way - it's way too expensive a "treat" as it is! Pay more because the film was shot on 65mm? I severely doubt the average audience cares what the movie was shot on, but they sure as hell care about how much it costs to get in.... Graeme
  16. 4000 photosites can sample 4000 lines, or 2000 line pairs, not the numbers you quote above. This is, of course, if you allow massive aliasing. In practical terms, a goal of 3000 (given that optical low pass filters are basically a 2-tap filter, and hence don't roll off the frequency very fast) is right for us. To resolve 4000 lines of luma resolution on a Bayer pattern sensor, I'd need somewhere between 5000 and 5500 photosites, depending on the strength of the OLPF. To get at least 4000 on chroma all the time, I'd need an 8k sensor. That would then give 6k luma, and at least 4k chroma. Well, NHK have an 8k science experiment...... You've got to think of the Genesis chip as a lot of RGB macropixels. It's not designed for a higher resolution to be extracted from it (or else they'd have used a Bayer pattern). At the moment, they should be optically low pass filtering for the 1920x1080 resolution, and therefore there shouldn't be more resolution than that getting to the sensor anyway. Chromakey is important, but again, aliases could screw that up as much as lower chroma resolution. In the end, it's the balance of all things. Graeme
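As a sketch, the rough arithmetic behind those photosite counts; the efficiency figures are just the roughly 70-80% demosaic yields mentioned in these posts, not a specification.

```python
def photosites_needed(target_luma_lines, demosaic_yield):
    """Horizontal photosites needed for a target measured luma resolution,
    given the fraction of sensor resolution the Bayer demosaic + OLPF delivers."""
    return target_luma_lines / demosaic_yield

for dy in (0.70, 0.75, 0.80):
    print(f"{dy:.0%} yield -> {photosites_needed(4000, dy):.0f} photosites for 4000 luma lines")
# Roughly 5000-5700, in line with the 5000-5500 range quoted above,
# depending on how strong an OLPF you are willing to accept.
```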
  17. In any sampling system, there's no way you can max out its resolution so that 1000 samples gives 1000 lines of resolution, unless you allow for massive aliasing artifacts. Output measured resolution from your sampled system is more a factor of the level of optical low pass filtering you use than of whether it is single-chip or three-chip. Those impact things a bit, but sampling theory and aliasing are the main concerns. For years we've had interlaced video where the measured vertical resolution has always been significantly lower than the pixel count due to the necessary anti-twitter interlace filtering. Basically, measured resolution has never been equal to the number of samples, and I don't see that changing. Graeme
  18. Just want to add: please don't judge image / demosaic quality from the grabs posted earlier. Please look at the new stuff that people are posting instead. We've made really significant improvements throughout the image chain since April. Also, I saw an image posted from elsewhere showing a Bayer pattern artifact, and again, we're working via different methods than that demosaic used (you can sort of tell what method people are using via the specific artifacts you see). Graeme
  19. When I get the chance, I will. BTW, "Graeme" please - this "Mr. Nattress stuff" reminds me of when I taught math :-) Graeme
  20. The chart was specially shot for an engineer to perform some analysis for me. I'd prefer, if we are to publish the results, that it's performed again, under more controlled conditions, and on production cameras rather than the hand-built prototypes with the older sensor boards. I'd also like to shoot other cameras alongside to add perspective to the proceedings. Graeme
  21. You do need a strong OLPF on a Bayer sensor so as to avoid nasty chroma fringing artifacts. There are some Bayer cameras around that posted test images that were absolutely riddled with such nasties, so it's vitally important to get that right. The OLPF, I think, is set to the pixel pitch of the sensor, and would be the same that we'd use if it were a monochrome sensor. I didn't perform the analysis on the 4k zone plate that I shot, but the 2k patterns are pretty darn near perfect for luma and green, with the slightest hint of aliasing on the red and blue. The 4k patterns show resolution out to just over 75% of the way, or just over 3000 lines for the luma, with slight aliasing beyond that, showing that the lens we used (a Cooke 65mm, if I remember rightly) had enough resolution for this chart. After using the zone plate chart, I'd love to point some other cameras at it and see how they perform. I think that the aliases you get on film are not so much of a problem because of the way that the sampling pattern is changing every frame, and is essentially random. On any system of regular samples, it's going to be easier to spot the aliases. I wonder if grain also has a masking effect? Graeme
  22. So, why do we not use a 3-chip and prism system? Cost, and we'd lose the ability to use standard cinema lenses, because we've chosen to use a large sensor, so we get high resolution and reasonably sized photosites. Why Bayer pattern? Well, it's very well understood. Why do we use a well specified optical low pass filter? We don't like nasty aliasing. Graeme
  23. Putting it simply, 4096 samples is enough to sample, obeying Nyquist, 4096 / 2 = 2048 cycles. 2048 cycles are made up of 2048 line pairs (you need a pair of lines to represent a cycle), and hence 2048 line pairs are made up of 4096 lines. So, you need 4096 samples to get 4096 lines and you're still obeying sampling theory. However, there is no such thing as a brick-wall optical low pass filter. So we use a practical filter that does low pass filtering. This stops aliasing, but will limit your measured resolution. We can measure around just over 3000 lines, with a minimum of nasty aliasing artifacts. That's luma. Chroma you get anywhere from 2000 to 3000 depending on image content. Now, with a 3-chip system, you still need that low pass filter or you'll get some nasty aliasing. You also get prism artifacts too. You'll get a bit higher luma resolution, perhaps, because you don't need to demosaic, and you may get slightly higher chroma resolution, but that might be affected by the prism and chip alignment. Pixels and sensors and lenses and prisms don't tell the whole story. What counts is the image. We've stated the size of our chip, its Bayer pattern, the number of photosites, the size of the photosites and the bit depth of the A-to-D. They're hard specs you can unambiguously count. Measured resolution is a soft spec as you have to measure it - and each way of measuring will give different answers. It depends on a vast number of variables.
  24. Thomas, listen to Tripurari as he's trying to educate you on your misunderstanding of sampling theory. I've tried to post two explanations here, but my crappy internet connection in the hotel screwed me up both times. Graeme
  25. All I remember from last year's IBC is that Phil told me it was a 1k camera, at which point I laughed and asked him if, after viewing the image on the 60ft screen, it looked like the output from a 1k camera. Now, Phil may have mis-spoken and meant to say 2k, or may not - that's for him to remember, not me, but I'm very clear in my recollection of the event. The pros and cons of a Bayer pattern sensor are well known and understood. As are other alternative imaging approaches, and none are in any way perfect. Every single one is a series of compromises, and I'm happy that the compromises we have taken result in a very nice image, with a very usable file size, from a compact camera that takes standard cinema lenses. When Phil gets to design and build his own camera, he can choose his own personal set of compromises and I wish him luck in achieving his goals. Graeme