Hi there, I'm asking myself some questions about compression in the Sony a7S II's various recording formats, and maybe some of you know the answer.

I tried to calculate the mean (file) size of a frame in the different formats to compare the amount of compression:

- 4K 30p 100Mbps: 3.33 Mb per image = 0.40 bit per pixel (on average, obviously)
- 4K 24p 100Mbps: 4.17 Mb per image = 0.50 bit per pixel
- HD 24p 50Mbps: 2.08 Mb per image = 1.00 bit per pixel
- HD 30p 50Mbps: 1.67 Mb per image = 0.80 bit per pixel
- HD 60p 50Mbps: 0.83 Mb per image = 0.40 bit per pixel
- HD 120p 100Mbps: 0.83 Mb per image = 0.40 bit per pixel

Is 4K 24p actually 80Mbps, so that it has the same image quality as 4K 30p, or is it truly 100Mbps and thus produces a less compressed image? Same question for HD 24p and 30p at 50Mbps. If HD 24p really is 50Mbps, it's the format with the least compression, with over 1 bit per pixel on average.

If the limiting factor of the camera is a data stream of 100Mbps max, why are there no HD 24/30/60p 100Mbps modes for better IQ? Is it to keep compression levels consistent across the various framerates even if that means suboptimal IQ? But then shouldn't 4K 24p be 80Mbps to match the 30p compression?

100Mbps for all the different res/framerate combinations would let 60p be better quality than 120p, as it should be, and would also allow an even less compressed 24p mode (with over 2 bits per pixel). That would come in handy for specific shots where you'd want low compression over high resolution.

Or am I just talking nonsense?

Cheers.
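For anyone who wants to double-check the arithmetic, here's a small Python sketch of the same calculation. It assumes 4K means UHD (3840x2160), HD means 1920x1080, and the nominal bitrate is spread evenly across frames, ignoring audio and container overhead:

```python
# Rough bits-per-pixel estimate for each a7S II recording mode.
# Assumptions: nominal video bitrate only, evenly divided per frame.

MODES = [
    # (label, bitrate in Mbps, frames per second, width, height)
    ("4K 30p 100Mbps",  100,  30, 3840, 2160),
    ("4K 24p 100Mbps",  100,  24, 3840, 2160),
    ("HD 24p 50Mbps",    50,  24, 1920, 1080),
    ("HD 30p 50Mbps",    50,  30, 1920, 1080),
    ("HD 60p 50Mbps",    50,  60, 1920, 1080),
    ("HD 120p 100Mbps", 100, 120, 1920, 1080),
]

for label, mbps, fps, width, height in MODES:
    bits_per_frame = mbps * 1_000_000 / fps       # average bits per frame
    bits_per_pixel = bits_per_frame / (width * height)
    print(f"{label:18s} {bits_per_frame / 1e6:5.2f} Mb/frame  "
          f"{bits_per_pixel:4.2f} bit/px")
```

Running it reproduces the numbers in the list above (e.g. 4K 30p comes out at about 3.33 Mb per frame and 0.40 bit per pixel), so the comparison between modes should hold even if the absolute values are only averages.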