A couple of colleagues and I are very close to producing real-time HDR video. We've made samples by combining three 4-second clips shot at three different exposures off an XHA1, and the results are unbelievable.
So, here's my dilemma...
We're looking for one of two possible pieces of the puzzle: either a small mirror/prism rig that three A1s can mount to so they film the exact same image, or a program that can take three slightly offset images/videos and warp them to match one of them so they all look identical. I imagine the computer taking the features of one image, comparing them to a slightly different image (taken from another camera two inches to the side), and warping that image to match the details and features of the first. I've looked into stereoscopic workflows and such, but they only get me so far, since stereoscopic video is trying to produce two slightly offset images rather than two identical ones. Again, I'm dealing with three images that I'm trying to make identical, even though the cameras will be inches apart.
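To make the software side concrete: the simplest version of that warp is estimating the offset between two frames and shifting one to match. Here's a rough sketch of that idea using phase correlation in NumPy. This is my own simplification, not any particular product's method, and it assumes the misalignment is a pure pixel translation; real cameras inches apart would also need rotation/perspective correction (a full homography from matched features), plus parallax will never fully cancel for near objects.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the (dy, dx) shift that aligns img to ref via phase correlation."""
    F_ref = np.fft.fft2(ref)
    F_img = np.fft.fft2(img)
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift
    cross = F_ref * np.conj(F_img)
    cross /= np.abs(cross) + 1e-9
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame around to negative offsets
    h, w = ref.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def align(ref, img):
    """Shift img so it lines up with ref (circular shift for simplicity)."""
    dy, dx = estimate_shift(ref, img)
    return np.roll(img, (dy, dx), axis=(0, 1))
```

In a real pipeline you'd run this (or a feature-based homography fit) once per camera pair on a calibration frame, then apply the same warp to every frame of that camera's footage.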
Any ideas? We have the rest of the workflow figured out; we're just missing that last piece of the puzzle. Once I get past this roadblock, I've done it. Real-time HDR video. Finally.
Here's what I've produced so far:
http://img269.imageshack.us/i/20191350.jpg/
Those two cars that look ghosted are actually the same car. Each exposure I combined was captured at a different moment in time, since this test was done with a single camera. I have yet to produce a great-quality image from three simultaneous cameras; obviously, that's what this post is for.
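For anyone curious about the merge step itself once the frames are aligned, here's a minimal sketch of one common approach (a stripped-down exposure fusion, along the lines of Mertens-style weighting). I'm not claiming this is what our workflow uses; it just illustrates the idea: weight each exposure per pixel by how well-exposed it is, so blown-out and crushed pixels contribute little.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend same-size grayscale frames (values in [0, 1]), one per exposure.

    Each pixel is weighted by "well-exposedness": a Gaussian centered
    on mid-grey, so near-black and near-white pixels get low weight.
    """
    stack = np.stack(frames)                       # shape (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)
```

Run per frame on the three aligned exposures; for color you'd apply the same weights to each channel, and production fusers do this over a multi-scale pyramid to avoid seams.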