
Apertus - open source cinema


Sebastian Pichelhofer


Hi, this is my first post on cinematography.com. Instead of just talking about myself, I want to share the project I have been working on for the last couple of years, which I think has the potential to be "a big thing" in digital cinematography in the future:

 

Introducing Apertus

www.apertus.org

We created a video introducing the Apertus Project and explaining our intentions. The video has subtitles in English, Spanish, Portuguese, French and Catalan (just press the "CC" button on YouTube's player to choose the one you prefer).

 

http://www.youtube.com/watch?v=mvI7rJ_AZys

 

Mission Statement

We intend to create an affordable, community-driven, free software (FLOSS) and open hardware cinematic HD camera for a professional production environment.

 

 

Apertus Stereo Rig

Community members Nathan Clark and Winne Yang have developed a new Apertus rig for stereo 3D cinematography. Needless to say, we've all been blown away by their ingenuity. Congratulations go out to them and their stunning machinists!

 

[Image: Apertus stereo rig (post-54757-0-13038200-1328347048.jpeg)]

 

More here: http://apertus.org/en/node/140

 

I will keep updating this thread as new developments and project related news emerge in the future.


Interesting. Are there plans to support 1.85 in the future?

 

Also, the high-res modes aren't capable of 25fps. Is this a problem with the camera or with the codec keeping up with that frame rate?

 

Lastly, is it possible to overlay framing guides of any kind?

 

love

 

Freya


Interesting. Are there plans to support 1.85 in the future?

 

Also, the high-res modes aren't capable of 25fps. Is this a problem with the camera or with the codec keeping up with that frame rate?

Resolution can be set freely to anything between 16x16 and 2592x1936 (in 16-pixel steps), which also lets you vary the aspect ratio to anything you like. As the number of pixels goes up, the maximum FPS goes down. Apertus AMAX and CIMAX have been designed to acquire maximum resolution at 24 FPS, so if you are happy with a few fewer pixels you can get 25fps easily.

 

 

Lastly, is it possible to overlay framing guides of any kind?

Yes, currently we have: "thirds", "safe area", "inner crosshair" and "outer crosshair", and it's pretty easy to add new ones in the software.

 

The following image shows just inner & outer crosshairs:

[Image: ElphelVision screenshot showing inner and outer crosshairs (EV_screenshot02_0.jpg)]
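To illustrate the idea, here is a rough sketch of how such guide geometry can be computed. This is a simplified illustration, not the actual ElphelVision drawing code, and the 5% safe-area margin is just an assumed convention:

```python
# Minimal illustration of framing-guide geometry.
# NOT the ElphelVision implementation, just a sketch of the idea.

def thirds_lines(width, height):
    """Return the vertical and horizontal rule-of-thirds line positions."""
    verticals = [round(width / 3), round(2 * width / 3)]
    horizontals = [round(height / 3), round(2 * height / 3)]
    return verticals, horizontals

def safe_area(width, height, margin=0.05):
    """Return (x, y, w, h) of a centered safe-area rectangle.

    `margin` is the fraction trimmed from each side; 5% per side
    (roughly a 90% action-safe area) is only an assumed convention.
    """
    x = round(width * margin)
    y = round(height * margin)
    return x, y, width - 2 * x, height - 2 * y

if __name__ == "__main__":
    w, h = 2592, 1936  # the full sensor resolution mentioned above
    print("thirds:", thirds_lines(w, h))
    print("safe area:", safe_area(w, h))
```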


Resolution can be set freely to anything between 16x16 and 2592x1936 (in 16-pixel steps), which also lets you vary the aspect ratio to anything you like. As the number of pixels goes up, the maximum FPS goes down. Apertus AMAX and CIMAX have been designed to acquire maximum resolution at 24 FPS, so if you are happy with a few fewer pixels you can get 25fps easily.

 

 

So it should be possible to capture in 1.85 at higher res than for the AMAX mode, but will that require extra programming? I notice there doesn't seem to be a WIMAX mode!

Similarly, what about high speed in scope aspect ratios (or even 1.85, for that matter)? It seems these would be preferable, as presumably you could then get higher resolutions at high speed?

 

Is the Ethernet version of the camera more limited in what it can achieve compared to the SATA version?

Obviously it's way cheaper so if it is just a matter of connectivity...


So it should be possible to capture in 1.85 at higher res than for the AMAX mode, but will that require extra programming? I notice there doesn't seem to be a WIMAX mode!

Similarly, what about high speed in scope aspect ratios (or even 1.85, for that matter)? It seems these would be preferable, as presumably you could then get higher resolutions at high speed?

 

Yes, these resolutions are possible; no programming experience required:

 

[Image: ElphelVision resolution settings screenshot (EV_Settings1.jpg)]

 

There is a "custom" resolution button where you can type in any resolution you like (in 16-pixel steps). If the 1.85 setting is something many people will want to use, I can also easily add a new button to the resolution settings.

 

Is the Ethernet version of the camera more limited in what it can achieve compared to the SATA version?

Obviously it's way cheaper so if it is just a matter of connectivity...

 

I would strongly advise getting the IO board as well.

With only Ethernet you are limited to 100 Mbit for video recording, and you need a more powerful PC to capture the stream without dropping any frames. It's possible, of course, to go that route.

With the additional IO board, though, you can write the video stream directly to camera-connected media (at slightly more than 100 Mbit) and you don't need to worry about the network stream dropping frames. I use a low-powered netbook to view the video stream for framing, etc.; dropping frames on the netbook only affects the preview I see, not the video that goes on the HDD.
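As a rough illustration of why the Ethernet path is marginal, here is a back-of-the-envelope calculation. The compressed bits-per-pixel figure is only an assumed example, not a measured value:

```python
# Back-of-the-envelope bandwidth check for the ~100 Mbit/s Ethernet path.
# The compressed bits-per-pixel value varies with JP4/JPEG quality settings;
# 2.5 bits/pixel below is an assumed example, not a measured Apertus figure.

def stream_mbit_per_s(width, height, fps, bits_per_pixel_compressed):
    """Approximate compressed stream bandwidth in Mbit/s."""
    return width * height * fps * bits_per_pixel_compressed / 1e6

if __name__ == "__main__":
    needed = stream_mbit_per_s(1920, 832, 25, 2.5)
    print(f"approx. {needed:.0f} Mbit/s needed vs ~100 Mbit/s over Ethernet")
```

With that assumption, a 1920x832 stream at 25 FPS already sits at roughly the 100 Mbit/s ceiling, which is why writing directly to camera-connected media is the safer route.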

 

 

So if we were talking 1080p res, cropped to scope aspect ratio, what would the maximum frame rate be?

 

I just tested that specifically for you ;)

 

2.35:1 aspect ratio: a 1920x832 resolution results in a maximum of 33.4 FPS in RGB mode and 40.2 FPS in JP4-RAW mode.

If you can live with 1 pixel missing in height (816 instead of 817), you can gain roughly 1 additional FPS.
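To make the height trade-off concrete, here is a small arithmetic sketch (not camera code) of how the 16-pixel step constraint brackets the exact 2.35:1 height at 1920 pixels wide:

```python
# Arithmetic sketch of the 16-pixel step constraint, not camera code.

STEP = 16

def candidate_heights(width, aspect):
    """Return the exact height for `width` at `aspect`, plus the nearest
    legal heights (multiples of 16) just below and above it."""
    exact = width / aspect              # 1920 / 2.35 = ~817
    lower = int(exact // STEP) * STEP   # 816: drops ~1 row, slightly faster
    upper = lower + STEP                # 832: keeps the full frame height
    return exact, lower, upper

if __name__ == "__main__":
    exact, lower, upper = candidate_heights(1920, 2.35)
    print(f"exact height {exact:.1f} px -> legal choices: {lower} or {upper}")
```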


Some clarification and bean spilling:

 

-) Yes, we are currently preparing material to start a crowdfunding (Kickstarter) campaign to develop our own Apertus Sensor Front End.

-) The sensor we currently favor is the CMV12000 from Cmosis (http://wiki.elphel.com/index.php?title=Sensors_table#Hi_speed_sensors).

-) It will be amazing and lift Apertus to a whole new level of possibilities, putting us in direct competition with the current big players.

-) It will cost a lot of money.

-) It will be open source/free software/open hardware.

-) We will go public with it once everything is ready. We decided rather to spend more time on preparations and make it a really good and complete campaign; then, if it still fails, we can at least go down with dignity ;)

 

 

If you fear you could miss the epic start of it all be sure to subscribe to our newsletter: http://apertus.org/newsletter


  • Premium Member

Are you planning to offer any form of very low contrast (log, etc) output for workflows that require it?

 

What's the cost of the camera module?

 

I have to say I'm not sure how useful this really is. As much as you say "open hardware", nobody's going to be fabricating sensors at home so really all you can do in software is the user interface. Also, and this is something that open source people constantly overlook, the fact that you can get the source code is actually only useful to you if you are a software engineer. Most camera operators and directors of photography are not software engineers.

 

P


Hi guys, this is my first post here.

 

I am a user and contributor to the Apertus project. I thought I might be able to shed some light on why I think Open Hardware is so important.

Even if you are not a developer or engineer (and I am neither), real-world cinematography experience can be a great influence on the project!

The power of a user-driven, open project means that anybody can contribute. At its simplest, this can be in the form of pure ideas / wishes!

For example, I am working on a stereoscopic rig and many of the (now robust) S3D features within our viewfinder software have been implemented as a result of my "stereo wishlist".

What's important to understand is that by using open hardware/software, features can be added / enhanced / evolved without having to "hack" the firmware...

Even though people are almost certainly not going to be building camera chips in their garages, open hardware means they can access and interface with that hardware without limitation!

 

So even if you are not a developer, you can still influence and assist with implementing the features that you want to see in a cinema camera. And if you are a developer, or have a good developer friend, she can program the hardware and software right down to the lowest levels, in a supportive community.

All for the greater good really!


  • Premium Member

Even if you are not a developer or engineer (and I am neither), real-world cinematography experience can be a great influence on the project!

 

 

I agree completely. Unfortunately, the record of opensource software is not particularly encouraging when it comes to accepting the advice of people who are not software engineers. It is of course possible that the Apertus project will be an exception, but my experience in this area is not encouraging. I have offered free consultancy to several open source projects, which has been universally turned down, often quite abruptly. The reason for this has generally been the tired old plaint that "you expect us to work to your specification for free."

 

Well, yes and no. Yes, because it is the job of a software engineer to work towards the needs of the user, and no, because we already have a perfectly good model for software engineers being paid to work to the user's specification. It's called commercial software, and in general I find it works much better than open source at creating programs that are feature complete and easy to use.

 

I will embarrass only one project I spoke to by naming it here - Cinelerra, the open source video editor, which at the time I approached them had no support whatsoever for timecode, an extremely basic feature without which the software is useless in many use cases. Nobody on the programming team had even the haziest idea what timecode was, or what it was used for. Instead they preferred to keep writing "fun" stuff, cheesy image filters and the like. They didn't see why "they should do what I say for nothing." As far as I understand it, there is still no feature-complete open source video edit software, simply because of this sort of horrible failure of project management.

 

And that is the problem with open source. It's not that there aren't enough hours of work being done by sufficiently able people, it's just horribly, terribly misdirected.

 

It is of course possible that Apertus will be the exception that proves the rule, but to be honest I've long since given up hoping that any piece of open source software will ever fulfil its potential.

 

P


I will embarrass only one project I spoke to by naming it here - Cinelerra, the open source video editor, which at the time I approached them had no support whatsoever for timecode, an extremely basic feature without which the software is useless in many use cases. Nobody on the programming team had even the haziest idea what timecode was, or what it was used for. Instead they preferred to keep writing "fun" stuff, cheesy image filters and the like. They didn't see why "they should do what I say for nothing." As far as I understand it, there is still no feature-complete open source video edit software, simply because of this sort of horrible failure of project management.

 

A bit off-topic, but since people realized that Cinelerra can't be fixed, an effort to write a good professional NLE from scratch has been going on for some years. No cheesy filters, no 'export directly to YouTube' kind of stuff. The only problem, of course, is that it is progressing rather slowly, because the project requires long-term commitment and, at this point, lots of under-the-hood work, which doesn't seem to attract too many developers. That might be related to the aforementioned problem of people wanting to do the fun stuff... Please see this page for more information about Lumiera: http://lumiera.org/ (as for their plans and how it is progressing, please see their rather promising and realistic roadmap: http://issues.lumiera.org/roadmap)

 

I myself believe that open source projects can be useful. I run Gentoo Linux on most of my personal computers. Unfortunately, when it comes to editing, there really hasn't been OSS that is stable enough and supports professional workflows yet. I know my way around the console and its commands, but when I'm editing a movie, I hope the tool does what it should and that I don't have to spend my time trying to find out why the software is giving me a segmentation fault...

 

For the Apertus project I wish all the best -- I have been following your progress from time to time and I'm certainly interested in seeing where your project is heading.


Hi.

It appears the raw format JP4 is not supported by any software, is that true? And since it must be transcoded to DNG, wouldn't it be better to use CinemaDNG as the native raw format? According to Adobe, CinemaDNG is unencrypted and free from intellectual property encumbrances or license requirements. It seems the logical option.


Are you planning to offer any form of very low contrast (log, etc) output for workflows that require it?

 

We can already load custom gamma curves, so you can design your own S-Log-style curve, etc. Support in the viewfinder software's GUI is still limited, though; we are working on improving it.
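As an illustration only (the actual file format the camera expects for custom curves is not shown here), a generic log-style tone curve could be tabulated like this, assuming a 12-stop encoding range:

```python
# Illustration of tabulating a generic log-style tone curve.
# The file format the camera expects for custom gamma curves is not
# shown here; only the curve math, with an assumed 12-stop range.
import math

def log_encode(x, stops=12.0):
    """Map a linear value x in [0, 1] to a log-encoded value in [0, 1]."""
    floor = 2.0 ** (-stops)          # clip values near black
    x = max(x, floor)
    return (math.log2(x) + stops) / stops

def build_lut(entries=256):
    """Tabulate the curve as a simple lookup table."""
    return [log_encode(i / (entries - 1)) for i in range(entries)]

if __name__ == "__main__":
    lut = build_lut()
    print(lut[0], lut[128], lut[-1])  # toe, mid-tone, white point
```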

 

What's the cost of the camera module?

If you buy the hardware from Elphel Inc. (www.elphel.com) that Apertus is based on, you can get the kit for around $1,300. Apertus itself does not sell any modified or custom hardware yet.

 

I myself believe that open source projects can be useful. I run Gentoo Linux on most of my personal computers. Unfortunately, when it comes to editing, there really hasn't been OSS that is stable enough and supports professional workflows yet. I know my way around the console and its commands, but when I'm editing a movie, I hope the tool does what it should and that I don't have to spend my time trying to find out why the software is giving me a segmentation fault...

 

For the Apertus project I wish all the best -- I have been following your progress from time to time and I'm certainly interested in seeing where your project is heading.

 

Thanks! I fully agree. Historically, Linux and open source have had a nerd image attached, and I also know a lot of software developers who have no idea about usability, graphic design, etc.

 

But:

1) This is changing now, as artists and creative people are becoming more and more the driving force behind the "open source" movement, and

2) you need to distinguish the religion from its followers.

 

Hi.

It appears the raw format JP4 is not supported by any software, is that true? And since it must be transcoded to DNG, wouldn't it be better to use CinemaDNG as the native raw format? According to Adobe, CinemaDNG is unencrypted and free from intellectual property encumbrances or license requirements. It seems the logical option.

 

Correct. JP4 RAW is a raw codec that Elphel Inc. originally developed for Google.

 

We provide software that can transcode from JP4 to DNG (and soon other formats as well).

 

CinemaDNG is high on our agenda, but since it doesn't have many significant advantages over a plain DNG sequence, its implementation is a bit behind schedule.
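For batch work, a simple wrapper script can drive a converter over a whole folder of clips. The command name and argument order below are placeholders rather than the real tool's interface, so substitute whichever converter you actually use:

```python
# Hypothetical batch wrapper around a JP4-to-DNG converter.
# "jp4-to-dng" and its argument order are placeholders, NOT the real
# tool's interface; substitute the converter you actually use.
import pathlib
import subprocess

def convert_all(src_dir, dst_dir, converter="jp4-to-dng"):
    """Run the (placeholder) converter on every .jp4 file in src_dir."""
    dst = pathlib.Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for jp4 in sorted(pathlib.Path(src_dir).glob("*.jp4")):
        out = dst / (jp4.stem + ".dng")
        subprocess.run([converter, str(jp4), str(out)], check=True)

if __name__ == "__main__":
    convert_all("footage/jp4", "footage/dng")
```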

Edited by Sebastian Pichelhofer

The way I was introduced to Linux was by producing a video for an open source developer. They were staging an event and wanted to raise awareness of open source software. They were very keen on me using open source video editing software to cut the project (they pretty much insisted). I knew nothing about it at the time, so I jumped in and tried out Kino, Kdenlive and a few others; Cinelerra was the worst of the bunch. None of the applications really EVER worked, and none of them supported an EDL of any kind. Crash upon crash, technical setbacks even the developers didn't understand. The apps were like video editors from 1990. These folks have no love for Windows, but neither do they like Mac. I had to get it done to get paid, so of course I used what I already had: Final Cut Pro. They were rather angry with me, but I think they understood that there really isn't an open source editor.

Lumiera's page and roadmap look like they were made by and for software developers, not filmmakers, and I don't give it much hope. I did, however, become an Ubuntu user and wish Apertus the best of luck. If it is being written by filmmakers and artists alike, and they listen and learn from the complaints about previous attempts (timecode support, EDLs), then perhaps there is hope. Digital cinema, with all the supposed freedom it gives filmmakers, is perhaps the most proprietary art form going. I'm hoping these folks can change that.

Edited by Chris Burke

  • Premium Member
None of the applications really EVER worked, and none of them supported an EDL of any kind. Crash upon crash, technical setbacks even the developers didn't understand.

This is generally my experience of open source software per se. It doesn't work very well, and even when it does, it's terribly incomplete, and they won't accept any sort of consultation on the basis that they're being "told what to do".

It's a tragic waste of time and resources.


Lumiera's page and roadmap look like they were made by and for software developers, not filmmakers, and I don't give it much hope. If it is being written by filmmakers and artists alike, and they listen and learn from the complaints about previous attempts (timecode support, EDLs), then perhaps there is hope. Digital cinema, with all the supposed freedom it gives filmmakers, is perhaps the most proprietary art form going. I'm hoping these folks can change that.

 

I can assure you there are such people involved in the Lumiera process. The reason the website looks like it was made for engineers instead of artists is to make sure no useless hype is generated. There really isn't any point in having large numbers of people rushing in asking "is this ready yet?" and then being disappointed when the project proceeds at the pace it can, with people working on it in their own time.

 

But I certainly like it that way. There are way too many websites made by artists who have no idea about coding. They make grandiose promises and never deliver on them. Compared to that, an engineer-oriented developer page is exactly what is needed until it is finished.

Edited by Heikki Repo

  • 4 weeks later...

Latest News:

[Image: design mock-up (JP4ToolsGUI_05.jpg)]

 

As we announced in February, we are preparing the material for a major crowdfunding campaign aimed at developing a new Sensor Front End for Apertus. The update to that statement is that we do indeed have a solid technical project on our hands, but there are still one or two factors we want to consolidate before opening everything up to the public. Meanwhile, other things have progressed.

 

The balsamiq service has granted us free unlimited lifetime usage, which we'll use for collaborative web and software GUI design (thanks, balsamiq!). Also, after almost 9,700 views and 216 replies in our logo creation thread, we've created a forum area for the graphic designers in our community. This section is read-only for everyone else, but the "old" logo creation thread is still open. If you're versed in what our project is aiming for and want to contribute, please continue joining in!

 

Concerning our plans for a Wireless Open Follow Focus, a prototype is currently in the construction phase, with community member Micheal Green using Arduino Uno, Arduino Pro Mini and XBee hardware components. The plan is to produce two wireless versions of a Follow Focus, one controlled via a smartphone and the other via a hardware controller (you can check the original discussion here).

 

Sebastian has begun working on an Android version of the ElphelVision software. It has been simplified from its tablet/PC counterpart and currently functions as a "remote control" for the Elphel camera, with live stream viewing disabled. You could think of this as a software version of what the Dictator hardware will represent.

 

The headlining image at the top of the page is a design mock-up for what we'd like to call “Raw Factory”, or maybe “Open Cine”. To be honest, we're still trying to decide on the best name for it. What we do know is that it's an application designed for working with DNG sequences. At present, this software does not exist, but it is envisioned to allow for a live preview of DNG clips over a timeline. It will offer a great solution for all cinema professionals, not just those working in the Apertus community. Anyone and everyone will be able to use this to convert RAW DNG footage into proxy/digital intermediate files or apply color corrections/manipulations. In March, our efforts will be directed towards getting this project accepted into the next Google Summer of Code. The discussion has just started here.

 

We're in the running for an award at Prix Ars Electronica 2012: International Competition for CyberArts. Prizes of $10,000 and $5,000 will be given to the first three winners. We'll be entering as a “Digital Community” and all the necessary information for doing so is being gathered in this thread.


  • 1 month later...

[Image: Dynamic Perception / Apertus banner (DP-Banner-01---Apertus2.jpg)]

 

This is huge! This is the moment we have all been waiting for! We're announcing our first giant steps towards a completely new Open Source Camera. You heard us right! We are doing so much more than just developing a Sensor Front End.

 

Apertus will create an entirely new, complete digital cinema camera!

A high-speed Super35, global-shutter, 4K CMOS sensor camera, powered by free software and open hardware, with a target retail price well below $10K. How does that sound?

 

We are currently ironing out details and creating 3D models and animations to illustrate our concept. Wait until you see them, they are just awesome! Once our preparations are complete we will make the entire plan available for review and feedback. We will engage in discussion with the community, as well as the broader public, for 2 weeks of intense consultation. During this time, we will be able to make appropriate, informed revisions to our final plan. The next step is officially launching the crowdfunding campaign and turning this project into reality. There is another thing we are very excited about and want to share with everyone.

 

Apertus is not going to do this alone!

We have established a partnership with Dynamic Perception. Chris 'Shutterdrone' Church and Jay 'Milapse' Burlage are the pioneers of HDR motion control time-lapse. They are the developers of the Open Source Motion Control System (OpenMoCo) and of popular open source sliders: the Stage Zero system, released in late 2010, and the soon-to-be-released Stage One system.

There are many benefits that will come from this cooperation: in addition to contributing personnel resources, Dynamic Perception has committed to funding part of the development of our new camera project. The Apertus camera will work seamlessly with Dynamic Perception motion control systems and offer freaking brilliant new possibilities. Finally, they will also help us establish a recognizable Apertus legal body in the United States.

This legal, non-profit structure is something we are discussing in our forums right now: the creation of an Apertus Foundation, based in the United States. The U.S. is the best strategic location for crowdfunding our project on the scale we require. We are currently researching the best avenue for creating this foundation, and you can join in the discussion at the specific forum thread.

 

Here’s the official announcement from the guys over at Dynamic Perception:

Official announcement:

---------------------------------------------------------------------------

The Apertus Project and Dynamic Perception team up

Dynamic Perception and the Apertus Project have teamed up to produce the first fully integrated camera and motion control system. Together we will expand the capabilities and tools available to filmmakers everywhere, while pushing the envelope in creativity and openness through open-source electronics and hardware.

 

 

About the Apertus Project

Born from the community in 2006, the Apertus Project seeks to design and build all aspects of a digital cinema camera using open-source technology, to create a truly unencumbered platform for professional production as well as experimentation, education and artistic expression.

 

The project has not only focused on software and hardware for recording video and interacting with the camera, but also on providing open-source alternatives for several key areas of the post-production workflow. In pursuit of this goal, the Apertus Project plans to create a RAW video conversion software suite. Work is also currently under way to develop a Blender-based post-production workflow.

 

MoCoBus

Dynamic Perception is bringing its new MoCoBus technology to market, which will provide an open framework for numerous types of devices to be connected and controlled both in the studio and in the field. Unlike other existing motion control and studio automation technology, all aspects of MoCoBus will be truly open and available for integration by open-source and closed-source products.

 

As a long-term project, one of the goals of Dynamic Perception and the Apertus Project is to provide product alternatives for all "intelligent" devices in the studio that are both open-source and competitive on pricing, features, and capabilities. We believe not only that open-source technology works well in the hands of creative individuals, but also that the rapid pace at which the technology can adapt to new situations helps prove out new capabilities which will impact traditional products as well.

 

The nanoMoCo controller board will be the first product released by Dynamic Perception using MoCoBus technology -- allowing DIY and OEMs to create stepper-based devices with high performance and capabilities easily and at an extremely low cost.

 

A New World, a New Camera

The result of this project will be a full-featured cinema camera with complete MoCoBus integration built directly into the camera's user interface, allowing control over all recording, motion, and other studio automation tasks directly from the camera, without the need for several other controllers and devices that must be independently configured and started. By reducing the number of devices that need to be interacted with, we can not only reduce the time between concept and execution, but also reduce the overall cost and labor required to create the most expressive shots you can imagine. Ultimately, we believe that a solo filmmaker should be able to produce expressive shots that would normally require an entire team to execute today, and to be able to do it on an independent's budget.

Of course, all of the control capabilities in the world wouldn't matter if the camera can't record breathtaking video, and to this end the Apertus team is investing heavily in creating a camera which will be on par with any production cinema camera at a fraction of the price. Details on the specifications of this camera will be released soon.

 

 

The Open-Source Studio

Our goal at Dynamic Perception is to support the creation of open-source alternatives for all tools used in filmmaking. While our roots are in motion control, our heart is in the entire creative process. Over the next year, we will be creating or supporting projects covering everything from automation to lighting, and on to post-production.

 

These tools are only as good as the connections between them, and knowing this, we are working to make intelligent, network-able tools that allow you to control each of these devices from the most appropriate interface for your needs -- whether it be in-camera, in a specialized hand-held controller, or on your desktop with your other applications.


  • 2 months later...

http://apertus.org/en/node/152

 

Now there is no turning back. Today we announced the product name of our new camera in the first of the Apertus talks in the Media, Radio, Television and Professional Graphics category at LSM. The future of open cinema cameras now has a name: APERTUS AXIOM.

 

[Image: Apertus Axiom banner (axiom-website-banner.jpg)]

 

We have actually been rather careful and secretive about not releasing too much information about the new camera on the Internet, as we do not want to follow the unfortunately quite common practice of how other companies announce their products: they just promise you whatever you want to hear, and a couple of years later they admit that the product is never going to see the light of day. So we made sure that what we announce is solid and well thought through. But still, a word of caution: we will need your help to create Axiom; only if we manage to succeed with our Kickstarter campaign will we be able to afford the development. Currently we are still working on preparing the material for the campaign; unfortunately it's taking longer than we hoped it would. Stay tuned.


