I’ve got something I’ve been wondering about over and over this past weekend!
I read that downscaling from 4K to HD can produce the equivalent of 10-bit luma.
In other words: 8-bit 4K = 10-bit HD.
Is that actually true? How is it possible?
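If I've understood the idea correctly, it comes from averaging: each HD pixel is made from a 2x2 block of 4K pixels, and the sum of four 8-bit values (0–255 each) can range from 0 to 1020, which needs 10 bits to store. Here is a toy sketch of that reasoning (hypothetical sample values, just one 2x2 block, no real scaler involved):

```python
import numpy as np

# One 2x2 block of 8-bit luma samples from a hypothetical 4K frame.
# Use uint16 so the sum doesn't overflow 8 bits.
block_4k = np.array([[100, 101],
                     [102, 103]], dtype=np.uint16)

# Summing four 8-bit values gives a 0..1020 range: a 10-bit value.
sum_10bit = block_4k.sum()       # 406

# Dividing back to 8 bits throws those extra 2 bits away,
# so the downscale only helps if you keep the result in 10 bits.
avg_8bit = sum_10bit // 4        # 101

print(sum_10bit, avg_8bit)       # 406 101
```

So the extra precision only survives if the downscaled file is actually stored as 10-bit, am I right?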
The GH4 records 10-bit 4:2:2 externally, while the Sony A7 II with an external recorder only gives 8-bit 4:2:0. So I wonder: if I shoot with the Atomos Shogun (not real 4K but UHD, I think) and then downscale it to HD, would I get true 10-bit 4:2:2? With S-Log3 that would be much nicer to grade. Would that work because of 8-bit 4K = 10-bit HD, or doesn't it work that way?
thank you :)