
Sebastian Pichelhofer

Basic Member
  • Posts: 10

Everything posted by Sebastian Pichelhofer

  1. We just released an article containing the First Moving Images from Axiom Alpha Prototype: Read more about it and watch the videos on: https://www.apertus.org/2014-first-moving-images-axiom-alpha
  2. Great news! We just released more details about Axiom on our newly launched Axiom website, and it also has a section about the prototype we are building, Axiom Alpha: http://axiom.apertus.org/
  3. http://apertus.org/en/node/152 Now there is no turning back. Today we announced the product name of our new camera in the first of the Apertus talks in the Media, Radio, Television and Professional Graphics category at LSM. The future of open cinema cameras from now on has a name: APERTUS AXIOM. We have actually been rather careful and secretive about not releasing too much information about the new camera on the Internet, as we do not want to follow the unfortunately quite common practice of how other companies announce their products: they just promise you whatever you want to hear, and a couple of years later they admit that the product is never going to see the light of day. So we made sure that what we announce is solid and well thought through. Still, a word of caution: we will need your help to create Axiom. Only if we succeed with our Kickstarter campaign will we be able to afford the development. We are still preparing the material for the campaign; unfortunately it's taking longer than we hoped it would - stay tuned.
  4. This is huge! This is the moment we have all been waiting for! We're announcing our first giant steps towards a completely new open source camera. You heard us right! We are doing so much more than just developing a Sensor Front End. Apertus will create an entirely new, complete digital cinema camera: a high-speed Super35 global-shutter 4K CMOS sensor camera, powered by free software and open hardware, with a target retail price well below $10k. How does that sound? We are currently ironing out details, creating 3D models and animations to illustrate our concept. Wait until you see them, they are just awesome! Once our preparations are complete we will make the entire plan available for review and feedback. We will engage in discussion with the community, as well as the broader public, for 2 weeks of intense consultation. During this time we will be able to make appropriate, informed revisions to our final plan. The next step is officially launching the crowdfunding campaign and turning this project into reality. There is another thing we are very excited about and want to share with everyone: Apertus is not going to do this alone! We have established a partnership with Dynamic Perception. Chris 'Shutterdrone' Church and Jay 'Milapse' Burlage are the pioneers of HDR motion-control time-lapse. They are the developers of the open source motion control system OpenMoCo and of popular open source sliders: the Stage Zero system, released in late 2010, and the soon-to-be-released Stage One system. There are many benefits that will come from this cooperation: in addition to contributing personnel resources, Dynamic Perception has committed to funding part of the development of our new camera project. The Apertus camera will work seamlessly with Dynamic Perception motion control systems and offer freaking brilliant new possibilities.
Finally, they will also help us establish a recognizable Apertus legal body in the United States. This legal, non-profit structure is something we are discussing in our forums right now: the creation of an Apertus Foundation, based in the United States. The U.S. is the best strategic location for crowdfunding our project on the scale we require. We are currently researching the best avenue for creating this foundation, and you can join in the discussion in the dedicated forum thread. Here's the official announcement from the guys over at Dynamic Perception: --------------------------------------------------------------------------- The Apertus Project and Dynamic Perception team up Dynamic Perception and the Apertus Project have teamed up to produce the first fully integrated camera and motion control system. Together we will expand the capabilities and tools available to filmmakers everywhere, while pushing the envelope in creativity and openness through open-source electronics and hardware. About the Apertus Project Born from the community in 2006, the Apertus Project seeks to design and build all aspects of a digital cinema camera using open-source technology, to create a truly unencumbered platform for professional production as well as experimentation, education and artistic expression. The project has focused not only on software and hardware for recording video and interacting with the camera, but also on providing open-source alternatives for several key areas of the post-production workflow. In pursuit of this goal, the Apertus Project plans to create a RAW video conversion software suite. Work is also currently under way to develop a Blender-based post-production workflow. MoCoBus Dynamic Perception is bringing its new MoCoBus technology to market, which will provide an open framework for numerous types of devices to be connected and controlled both in the studio and in the field.
Unlike other existing motion control and studio automation technology, all aspects of MoCoBus will be truly open and available for integration by open-source and closed-source products. As a long-term project, one of the goals of Dynamic Perception and the Apertus Project is to provide product alternatives for all "intelligent" devices in the studio that are both open-source and competitive on pricing, features, and capabilities. We believe not only that open-source technology works well in the hands of creative individuals, but also that the rapid pace at which the technology can adapt to new situations helps prove out new capabilities that will impact traditional products as well. The nanoMoCo controller board will be the first product released by Dynamic Perception using MoCoBus technology, allowing DIYers and OEMs to create stepper-based devices with high performance and capabilities, easily and at an extremely low cost. A New World, a New Camera The result of this project will be a full-featured cinema camera with complete MoCoBus integration built directly into the camera's user interface, allowing control over all recording, motion, and other studio automation tasks directly from the camera, without the need for several other controllers and devices that must be independently configured and started. By reducing the number of devices that need to be interacted with, we can not only reduce the time between concept and execution, but also reduce the overall cost and labor required to create the most expressive shots you can imagine.
Ultimately, we believe that a solo filmmaker should be able to produce expressive shots that would normally require an entire team to execute today, and to be able to do it on an independent's budget. Of course, all of the control capabilities in the world wouldn't matter if the camera couldn't record breathtaking video, and to this end the Apertus team is investing heavily in creating a camera that will be on par with any production cinema camera at a fraction of the price. Details on the specifications of this camera will be released soon. The Open-Source Studio Our goal at Dynamic Perception is to support the creation of open-source alternatives for all tools used in filmmaking. While our roots are in motion control, our heart is in the entire creative process. Over the next year, we will be creating or supporting projects covering everything from automation to lighting, and on to post-production. These tools are only as good as the connections between them, and knowing this, we are working to make intelligent, networkable tools that allow you to control each of these devices from the most appropriate interface for your needs, whether it be in-camera, in a specialized hand-held controller, or on your desktop with your other applications.
  5. Latest News: As we announced in February, we are preparing the material for a major crowdfunding campaign aimed at developing a new Sensor Front End for Apertus. The update to that statement is that we do indeed have a solid technical project in our hands, but there are still one or two factors we want to consolidate before opening everything up to the public. Meanwhile, other things have progressed. The Balsamiq service has granted us free unlimited lifetime usage, which we'll use for collaborative web and software GUI design (thanks, Balsamiq!). Also, after almost 9,700 views and 216 replies in our logo creation thread, we've created a forum area for the graphic designers in our community. This section is read-only for everyone else, but the "old" logo creation thread is still open. If you're versed in what our project is aiming for and want to contribute, please continue joining in! Concerning our plans for a wireless open follow focus, there is currently a prototype in the construction phase, built by community member Micheal Green using Arduino Uno, Pro Mini and XBee hardware components. The plan is to produce two wireless versions of a follow focus, one controlled via a smartphone and the other via a hardware controller (you can check the original discussion here). Sebastian has begun working on an Android version of the ElphelVision software. It has been simplified from its tablet/PC counterpart and currently functions as a "remote control" for the Elphel camera, with live stream viewing disabled. You could think of this as a software version of what the Dictator hardware will represent. The headlining image at the top of the page is a design mock-up for what we'd like to call "Raw Factory", or maybe "Open Cine". To be honest, we're still trying to decide on the best name for it. What we do know is that it's an application designed for working with DNG sequences.
At present, this software does not exist, but it is envisioned to allow for a live preview of DNG clips over a timeline. It will offer a great solution for all cinema professionals, not just those working in the Apertus community. Anyone and everyone will be able to use this to convert RAW DNG footage into proxy/digital intermediate files or apply color corrections/manipulations. In March, our efforts will be directed towards getting this project accepted into the next Google Summer of Code. The discussion has just started here. We're in the running for an award at Prix Ars Electronica 2012: International Competition for CyberArts. Prizes of $10,000 and $5,000 will be given to the first three winners. We'll be entering as a “Digital Community” and all the necessary information for doing so is being gathered in this thread.
  6. We can already load custom gamma curves, so you can design your own s-log curve, etc. Support in the GUI of the viewfinder software is still limited, though; we are working on improving it. If you buy the hardware from Elphel Inc. (www.elphel.com) that Apertus is based on, you can get the kit for ~$1,300. Apertus itself does not sell any modified or custom hardware yet. Thanks! I fully agree. Historically, Linux and open source had a nerd image attached, and I also know a lot of software developers who have no idea about usability, graphic design, etc. But: 1) this is changing now, as artists and creative people are becoming more and more the driving force behind the "open source" movement, and 2) you need to distinguish between the religion and its followers. Correct. JP4 RAW is a raw codec that Elphel Inc. originally developed for Google. We provide software that can transcode from JP4 to DNG (and soon other formats as well). CinemaDNG is high on our agenda, but since it doesn't have many significant advantages over a DNG sequence, its implementation is a bit behind schedule.
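To illustrate the custom-curve idea, here is a minimal sketch (a Python illustration of my own, not the actual Elphel/Apertus curve format) of how a log-style gamma lookup table could be generated; the generic log formula stands in for a real s-log curve, whose exact parameters are proprietary:

```python
import math

def log_gamma_lut(k=50.0, size=256):
    """Build an 8-bit lookup table mapping linear values to a
    generic log curve y = log(1 + k*x) / log(1 + k).
    Illustrative only; hypothetical helper, not camera code."""
    lut = []
    for i in range(size):
        x = i / (size - 1)                          # normalize to 0..1
        y = math.log(1 + k * x) / math.log(1 + k)   # log encoding
        lut.append(round(y * 255))
    return lut

lut = log_gamma_lut()
print(lut[0], lut[255])  # endpoints stay at 0 and 255
```

Raising k lifts the shadows more aggressively, which is the basic trade-off any log curve makes to preserve highlight headroom.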
  7. Some clarification and bean spilling: -) Yes, we are currently preparing material to start a crowdfunding (Kickstarter) campaign to develop our own Apertus Sensor Front End. -) The sensor we currently favor is the CMV12000 from CMOSIS (http://wiki.elphel.com/index.php?title=Sensors_table#Hi_speed_sensors) -) It will be amazing and lift Apertus to a whole new level of possibilities, putting us in direct competition with the current big players. -) It will cost a lot of money. -) It will be open source/free software/open hardware. -) We will go public with it once everything is ready. We decided to rather spend more time on preparations to make it a really good and complete campaign; then, if it still fails, we can at least go down with dignity ;) If you fear you could miss the epic start of it all, be sure to subscribe to our newsletter: http://apertus.org/newsletter
  8. Yes, these resolutions are possible, no programming experience required: there is a "custom" resolution button where you can type in any resolution you like (in 16-pixel steps). If the 1.85 setting is something many people will want to use, I can also easily add a new button to the resolution settings. I would strongly advise getting the IO board as well. With only Ethernet you are limited to 100 Mbit for video recording and need a more powerful PC to capture the stream without dropping frames. It's possible, of course, to go that route. But with the additional IO board you can write the video stream directly to camera-connected media (at slightly more than 100 Mbit), and you don't need to worry about the network stream dropping frames. I use a low-powered netbook to view the video stream for framing, etc.; dropping frames on the netbook will only affect the preview video I see, not the video that goes to the HDD. I just tested this specifically for you ;) 2.35:1 aspect ratio: 1920x832 resolution results in a max of 33.4 FPS in RGB mode and 40.2 FPS in JP4-RAW mode. If you can live with 1 pixel missing in height (816 instead of 817), you can gain ~1 additional FPS.
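As a side note, the 16-pixel height step is why 817 is not directly selectable. A small sketch (a hypothetical helper of my own, not part of the camera software) shows how a target aspect ratio snaps to the nearest valid height:

```python
def snap_resolution(width, aspect, step=16):
    """Return (width, height) with the height for the requested
    aspect ratio rounded to the sensor's 16-pixel step."""
    exact_height = width / aspect
    return width, round(exact_height / step) * step

print(snap_resolution(1920, 2.35))  # (1920, 816): nearest step below 817
print(snap_resolution(1920, 1.85))  # (1920, 1040)
```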
  9. Resolution can be freely set to anything between 16x16 and 2592x1936 (in 16-pixel steps), with any aspect ratio you like. As the number of pixels goes up, the max FPS goes down. Apertus AMAX and CIMAX have been designed to acquire maximum resolution at 24 FPS, so if you are happy with a few fewer pixels you can get 25 FPS easily. Yes, currently we have: "thirds", "safe area", "inner crosshair" and "outer crosshair", and it's pretty easy to add new ones in the software. The following image shows just the inner & outer crosshairs:
  10. Hi, this is my first post on cinematography.com, but instead of just talking about myself I want to share the project I have been working on for the last couple of years, which I think has the potential to be "a big thing" in digital cinematography in the future: Introducing Apertus www.apertus.org We created a video introducing the Apertus Project and explaining our intentions. The video has subtitles in English, Spanish, Portuguese, French and Catalan (just press the "CC" button on YouTube's player to choose the one you prefer): http://www.youtube.com/watch?v=mvI7rJ_AZys Mission Statement: We intend to create an affordable, community-driven, free software (FLOSS) and open hardware cinematic HD camera for a professional production environment. Apertus Stereo Rig: Community members Nathan Clark and Winne Yang have developed a new Apertus rig for stereo 3D cinematography. Needless to say, we've all been blown away by their ingenuity. Congratulations go out to them and their stunning machinists! More here: http://apertus.org/en/node/140 I will keep updating this thread as new developments and project-related news emerge.