As cameras keep coming out with higher and higher resolutions (far beyond what our eyes can resolve), and with companies/studios like Netflix pushing content creators to shoot in 4K (claiming to want to "future-proof" their investment)... Do any of you have strong opinions on this? Have you heard of anyone, or have you yourself, been unable to sell a film BECAUSE it wasn't UHD or 4K? (Obviously SD has some hurdles.) I am considering filming some doc material (a feature included) while abroad for a year in Sri Lanka, using a Super-16 lens (Canon 8-64) on a windowed 2.5K crop of the sensor (from a 4K BMPCC 4K), as I think the focal range, the character, the sharpness and the speed will make it the "perfect" lens while abroad... However, I am nervous about the workflow (that is, unless BMD decides to offer a S16 windowed mode, which they really should), and about having a hard time selling it once it is completed, realistically 3+ years down the road. Any thoughts?
I am trying to find out whether any major TV series have been broadcast in a scope aspect ratio. I know some shows such as The Walking Dead and True Detective s02 were shot anamorphic with a 16:9 extraction, but looking online I was not able to find any series broadcast in scope, except a British show called Broadchurch, which apparently was shot anamorphic for a 2:1 screening. Also, the great doco Wild Wild Country was shot scope (seems anamorphic). Any other thoughts?
Hi there, I'm asking myself some questions about compression in the Sony a7S II's various recording formats, and maybe some of you know the answer. I tried to calculate the mean (file) size of a pixel in the different formats to compare the amount of compression:

4K 30p 100Mbps: 3.33 Mb per image = 0.40 bit per pixel (on average, obviously)
4K 24p 100Mbps: 4.17 Mb per image = 0.50 bit per pixel
HD 24p 50Mbps: 2.08 Mb per image = 1.00 bit per pixel
HD 30p 50Mbps: 1.67 Mb per image = 0.80 bit per pixel
HD 60p 50Mbps: 0.83 Mb per image = 0.40 bit per pixel
HD 120p 100Mbps: 0.83 Mb per image = 0.40 bit per pixel

Is 4K 24p actually limited to 80Mbps so that it has the same image quality as 4K 30p, or is it a true 100Mbps and thus produces a less compressed image? Same question for 24p and 30p at 50Mbps in HD. If HD 24p really is 50Mbps, that makes it the format with the least compression, with over 1 bit per pixel on average.

If the limiting factor of the camera is a maximum data stream of 100Mbps, why are there no HD 24/30/60p 100Mbps modes for better IQ? Is it to keep compression levels consistent across the various frame rates, even if that means suboptimal IQ? But then 4K 24p should be 80Mbps to match the 30p compression? 100Mbps across all the different res/framerate combinations would let 60p be better quality than 120p, as it should be, and would also allow an even lower-compression 24p mode (with over 2 bits per pixel). This would come in handy for specific shots where you'd want low compression over high resolution. Or am I just talking nonsense? Cheers.
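For anyone who wants to check or extend these numbers, here's a minimal sketch of the calculation. It assumes the nominal bitrate is spent entirely on video (ignoring audio and container overhead, which slightly overstates the per-pixel budget) and that 4K means 3840×2160 and HD means 1920×1080:

```python
# Average bits-per-pixel estimate for each a7S II recording mode,
# computed as (bitrate / frame rate) / pixel count.

MODES = [
    # (label, width, height, fps, nominal bitrate in Mbps)
    ("4K 30p",   3840, 2160,  30, 100),
    ("4K 24p",   3840, 2160,  24, 100),
    ("HD 24p",   1920, 1080,  24,  50),
    ("HD 30p",   1920, 1080,  30,  50),
    ("HD 60p",   1920, 1080,  60,  50),
    ("HD 120p",  1920, 1080, 120, 100),
]

def bits_per_pixel(width, height, fps, mbps):
    """Average bits available per pixel of each frame."""
    bits_per_frame = mbps * 1_000_000 / fps
    return bits_per_frame / (width * height)

for label, w, h, fps, mbps in MODES:
    print(f"{label:8s} {bits_per_pixel(w, h, fps, mbps):.2f} bit/px")
```

This reproduces the figures above (e.g. 4K 30p ≈ 0.40 bit/px, HD 24p ≈ 1.00 bit/px); whether the camera actually sustains the full nominal bitrate in every mode is exactly the open question.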