Perry Paolantonio
Everything posted by Perry Paolantonio

  1. Stock. I think. My ACL 1.5 has this handgrip. It's more comfortable than you'd think, since you can adjust the angle. I always wanted a wooden handgrip like the Aaton though. Each of those slides holds a Wratten filter that goes into the camera body behind the lens.
  2. This doesn't sound like a problem with the film, it sounds like a scanning issue. Most scanners are designed for color film, and use color cameras. If you scan B/W film in color mode, it's going to have a subtle color tint to it (just like if you printed b/w to color stock). The easy fix is to just make your scan monochrome in your grading tool. I'd tend to do that as a first step anyway, when grading b/w material.
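Making the scan monochrome is a one-click fix in most grading tools (saturation to zero), but as a sketch of what's happening underneath — assuming the scan is a float RGB frame in a NumPy array, which is my illustration, not any scanner's actual pipeline — it's just a weighted channel average:

```python
import numpy as np

def to_monochrome(frame: np.ndarray) -> np.ndarray:
    """Collapse an RGB scan of B/W film to a neutral monochrome image.

    frame: float array of shape (H, W, 3), values in [0, 1].
    Uses Rec.709 luma weights; a grading tool's saturation control
    at zero does something broadly similar.
    """
    weights = np.array([0.2126, 0.7152, 0.0722])
    luma = frame @ weights                         # (H, W)
    # Duplicate the luma back into 3 channels so downstream tools
    # still see an RGB image, now with the color tint gone.
    return np.repeat(luma[..., None], 3, axis=2)
```

Any residual tint disappears because all three output channels are identical by construction.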
  3. No it won't. At least not definitively. It might tell you if the lens is broken, but that doesn't seem to be the issue here. As I mentioned above, the correct lens choice is dependent upon a number of factors, including the size of the sensor's photosites, the area of the sensor, as well as the magnification factor that's required. While you might be able to tell if something is grossly off with the lens by trying it on another camera, that doesn't sound like the problem here. Tyler has said that it's not as crisp as scanners like the ScanStation, which has a more expensive lens as well as a more complex focusing/rack system for that lens. I don't think this is a matter of the lens not working properly, I think it's probably just that it's the lens that fit FilmFabriek's budget and mechanical constraints, and it is what it is.
  4. The full ScanStation is different. Also a Schneider, but a Makro-Symmar. And the lens in our 5k was different than the one in our 6.5k, even though they're both the same basic model. We're using a very similar Makro-Symmar in the 70mm scanner, but there are real differences between revisions of the same model lens. According to Schneider, there is only one variant in their lineup that is suitable for the sensor, pixel size, and distance between film and sensor, for the 14k camera we're using -- and that all the others would be soft if used in that setup. A different sensor or different pixel size would require a different lens (which is why the 5k and 6.5k ScanStations both had different variations of the same lens). The point of my post above is that the manufacturer's choice of a lens is partly about what's available, but every one of those companies picked their specific lenses based on the mechanical and magnification requirements, the sensors being used, the film gauges they're optimized for, etc. That comes down to math, not guesswork and assumptions, of which there's been a ton in this thread. Just because a lens is good in one scanner doesn't mean it's the right choice for another... (I also have a 95mm Printing Nikkor 2.8 like the one in the Xena and Arriscan, but it wouldn't work in our scanner due to mechanical and space constraints)
  5. Then contact other vendors. I don't really see what the problem is here. You write one email and send the same one to 4-5 manufacturers and see what you get.
  6. Rather than guessing, why not contact a company like Schneider and ask for a recommendation? You will need to know the exact size of the sensor, the pixel size of the sensor (all available from the spec sheet), the distance from the film to the sensor, and the type of mount you're currently using. The calculation for a lens is just math. You pick the lens based on the enlargement you need, the space you have to work with, and the resolving power of the lens, factoring in the size of the photosites on the sensor. We chose our Schneider lens for our 70mm scanner after discussing with them exactly which model we'd need to cover everything from 35mm to 15p IMAX (it's a kind of macro/bellows setup with separate stages for lens and camera so we can move both camera and lens as needed). You can easily buy a lens with the same focal length, magnification factor and model name on ebay, but if you get the wrong version it's not optimized for the sensor's photosite size. There are a lot of factors to consider and this problem could be resolved with a couple of quick emails with a lens manufacturer. We also looked at lenses from Linos, Myutron and Nikon (Rayfact), but ultimately went with Schneider. All were helpful in choosing the correct lens (though Schneider was the most helpful), but you need to give them the right information.
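The "just math" part can be sketched in a few lines. The inputs below are illustrative assumptions, not any particular scanner's specs — the point is that these are exactly the figures a manufacturer like Schneider will ask you for:

```python
def lens_requirements(film_width_mm: float,
                      sensor_width_mm: float,
                      pixel_pitch_um: float):
    """Back-of-envelope lens numbers for a film scanner.

    Returns (magnification, sensor-side Nyquist limit in lp/mm,
    film-side resolving power the lens must deliver in lp/mm).
    """
    magnification = sensor_width_mm / film_width_mm
    # The sensor can't record detail finer than its Nyquist limit:
    sensor_lp_mm = 1000.0 / (2.0 * pixel_pitch_um)
    # A feature on the film appears magnification-times larger on the
    # sensor, so to feed the sensor's Nyquist limit the lens must
    # resolve proportionally finer detail at the film side:
    film_side_lp_mm = sensor_lp_mm * magnification
    return magnification, sensor_lp_mm, film_side_lp_mm

# Hypothetical example: a ~10.3mm-wide 16mm frame enlarged onto a
# 14.1mm-wide sensor with 3.45 micron photosites.
m, nyquist, needed = lens_requirements(10.3, 14.1, 3.45)
```

This is why a lens that's perfect for one sensor/gauge combination can be soft on another: change the photosite size or the magnification and the required film-side resolving power changes with it.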
  7. That does not look like a file direct off the scanner. As you noted, it appears to have been exported from Resolve. You should ask for a DPX file direct off the scanner, and look at that. Perhaps they're doing a one-light grade in Resolve and exporting DPX from that, which is why the file looks this way. In any case, ensure you have a DPX file made by the scanner software directly, and then look at the metadata.
  8. DPX can be linear as well. It's pretty flexible. You can even make YUV DPX files if you want, which are definitely not log. Log scans exist as a way to match the characteristics of negative film to a digital format. Log simply isn't applicable to positive film. When you scan positive film to a DPX file, it's linear. If they're treating it as log (which is certainly possible), you would need to also interpret it as such upon playback. But they would be doing it wrong if they are. My bet is that the specs are just incorrect or incomplete.
  9. Scans from reversal aren't log. Log is only applicable to color negative scans. What you want to ask for is a flat scan, as suggested above. When we scan positive film on our ScanStation 6.5k, we typically set the black level to about 10% (roughly the same point as dmin on a log scan), and we reduce the white levels so that the perforations (where there is no film, thus maximum brightness from the light source) are at 100%. The whitest whites in the film fall where they may but will never blow out this way.
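As a sketch (with made-up numbers, not the ScanStation's actual controls), the level mapping described above is just a linear rescale: the bare light source through the perforation defines 100%, and the range is lifted so black sits near 10%:

```python
import numpy as np

def flat_scan_levels(linear_frame: np.ndarray,
                     perf_level: float,
                     black_target: float = 0.10) -> np.ndarray:
    """Map a linear scan of positive film into a 'flat' range.

    perf_level: the sensor value measured through a perforation
    (bare light source); it is pinned to 1.0 (100%). Black is
    lifted to black_target, so nothing on the film -- which is
    always denser than a bare perf -- can clip or get crushed.
    """
    normalized = linear_frame / perf_level          # perf -> 1.0
    return black_target + (1.0 - black_target) * normalized
```

Since every white on the film transmits less light than the open perforation, the whitest whites land somewhere below 100% and never blow out.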
  10. That's the spec for the raw sensor. The camera that sensor is put into may impose further limits on speed/resolution/bit depth, depending on how it's configured. So the same sensor in two different cameras may produce slightly different image sizes, at different speeds, especially if one of the cameras is doing any kind of processing internally.
  11. Right. My point was more that you don't need Xeon or Threadripper to get the PCIe lane bandwidth to add a 25GbE NIC. The PC the ScanStation uses is nothing especially fancy, just a gaming rig really, with a mediocre CPU. A similar motherboard with an i9 should provide plenty of PCIe lanes. Of course, you also need the software to support whatever camera you put in it. I seriously doubt it's a simple drop-in replacement so this whole tangent is kind of moot.
  12. But the camera can't. Sorry - you're right. I was using my calculator set to monochrome. For color it would be 3x larger, so a 12bit file would be about 55MB/frame. For 10bit it would be about 46MB/frame. You could never do 10bit DPX faster than about 13fps because of the speed of the camera interface, and even that's a huge stretch. Probably more like half that speed in real-world usage. It's in the spec sheet for the camera. This is almost certainly why you can't get faster speeds. This is probably unnecessary. Our ScanStation's host PC has dual GTX 1070 cards, a 25GbE NIC (for the camera), a 10GbE NIC (network), a CameraLink card (optical track reader), a RAID card (internal RAID), and it's only an i7 9800 with 32GB RAM on a decent gaming motherboard. We can capture single-flash 6.5k scans at 15fps on this and 2k scans up to 60fps. A good i9 should be sufficient.
  13. There are a lot of things that could cause slowdowns, but my bet is the interface. That camera has a USB3.1 Gen1 interface, which is only 5Gb/s. (gigaBITs - note the small 'b'). That's 625MB/second, assuming you're getting full saturation, which you almost certainly are not, because it's USB. A 4k 12bit file is a bit over 18MB/frame. So the USB 3.1 connection can move 33 frames per second, theoretically, assuming it's an RGB image off the camera onto the disk. I would bet you're getting maybe 50-60% of your available 5gbps bandwidth if everything is perfectly configured, which might get you near 20fps max in bursts. But if the PC is doing other stuff (even background stuff you're not aware of), that's going to affect performance too. That could include any image processing they're doing on the computer, whether that's GPU or CPU based. This is why professional scanners don't use USB cameras. To get 4k files off the camera to the PC at speed, you need a proper interface on a PCIe card: CameraLink (though nobody is really using this anymore), or Coaxpress, or 25GbE (5x faster than USB3.1 gen1). And you need a motherboard/CPU combo with enough bandwidth to allow all that data to move back and forth. Also, that camera has a tiny image buffer onboard - it can only hold about 6-7 frames of 12bit 4k, so if the FPGA on the camera is doing any processing that could be a choke point as well. The problem isn't your disk speed.
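The arithmetic above is easy to check. Assuming a 4096x3072 sensor delivering undebayered 12-bit data — an assumption on my part, chosen because it reproduces the "bit over 18MB/frame" figure — the theoretical ceiling of a 5Gb/s link works out like this:

```python
def max_fps(link_gbps: float, width: int, height: int,
            bits_per_pixel: float, efficiency: float = 1.0) -> float:
    """Theoretical frames/second through a camera interface.

    link_gbps is giga*bits* per second (note the small 'b');
    efficiency is the fraction of the link you realistically
    saturate -- USB rarely gets anywhere near 1.0.
    """
    bytes_per_frame = width * height * bits_per_pixel / 8
    link_bytes_per_sec = link_gbps * 1e9 / 8 * efficiency
    return link_bytes_per_sec / bytes_per_frame

# USB 3.1 Gen1 at full saturation: ~33fps in theory...
theoretical = max_fps(5, 4096, 3072, 12)
# ...and under 20fps at a more realistic 55% link efficiency.
realistic = max_fps(5, 4096, 3072, 12, efficiency=0.55)
```

Swap in 25 for `link_gbps` and the same frame sails through at 5x the rate, which is the whole argument for a proper PCIe interface.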
  14. This is nowhere near the 2500MB/s you claim it requires. 10bit DPX needs about 1200MB/s, plus you want a bit of overhead. If you have RAIDs that can do 1500MB/s consistently (and I don't mean testing with BMD or AJA disk speed tools, because those barely generate enough data to get past the caches on most hard drives), then you should have no problem with this. I think you're conflating things here. You started off talking about the disk speed needing to be 2500MB/s, and that's not true. 1500 should be sufficient, though a bit tight. Now you're talking about the speed of the camera, and its USB-C interface is a separate bottleneck. The camera almost certainly isn't outputting a DPX file; it's pumping out the RAW data, and then the DPX is being made by the frame grabber (in this case, it seems that might be a software element rather than hardware). DPX isn't a format that machine vision cameras really understand. They can do things like pass the raw sensor data (which is pretty lightweight if it's not debayered yet), or they can convert to an RGB bitmap file and pass that back to the software. But not DPX. And if you're looking at the specs for the camera on the camera manufacturer's site, the top speed they rate is always based on an 8 bit image. That's the industry standard for machine vision cameras - to use the 8bit speed as the advertised speed of the camera. So you need to dig deeper to find out what the camera's speed really is, for a higher bit depth image. That should be in the datasheet for that camera. What you are describing here is not a bug in how the OS handles the data; it's that this is a very inefficient and inherently slow way to do the work. If it has to write the data to disk and then verify that data, it's writing and reading a big file. That's going to slow things down by adding a bunch of unnecessary I/O. But that's not the fault of the OS, that's just a matter of the way the scanner software is handling the files.
There's no reason they can't do the verification of the data in memory and then write the file. GPU has nothing to do with it. Since the camera in this system is a bayer sensor, you would probably be better off writing the raw data to the disk and then converting it from Resolve. What camera is used in the scanner you have? Exact manufacturer and model? It should be easy enough to tell in the System Information window in Windows, if you can't see it on the physical camera.
  15. What format are you scanning to? 4k 16bit DPX (a ridiculous format to use, especially if you're scanning color neg) is about 1700MB/s at 24fps. We easily get those speeds on our SAN, which is home-built using off-the-shelf WD Red drives. But even with the network removed, one could get enough speed to do 4k DPX on a local array with just 4-5 drives in RAID 0 (and RAID 0 is fine if the files are only going to be there temporarily). There is no issue with Windows slowing down DPX sequences. You've brought this up before, but it's simply not the case. Back in Windows XP it was a problem, yes, but hasn't been for many, many years. Working with a DPX sequence on an NTFS RAID 0 is pretty straightforward stuff and is done all the time, even at high bandwidth. We have local RAIDs for caching in all of our Windows machines - some are NVMe or SSD caches, but some are just simple 4x 7200RPM spinning disks. All are NTFS, and all can handle image sequences.
  16. Pardon? Please explain. All you need for consistent speed, if you're aiming for real time, is a closed loop stepper motor, hybrid stepper/servo motor, etc. This is all cheap and readily available, as are sophisticated motor control systems to drive those motors. The flywheel in machines like the old Magnatech dubbers was there because the stepper motors it used had fairly coarse steps, and it helps to smooth out the motion. What made the motor go the exact speed was the stepper (and the encoder to ensure it wasn't drifting; if it was, the speed was adjusted). The flywheel just smooths it out. It is not. Freely available open source code exists that makes this possible: OpenCV. There is no need for an AI; it's a simple matter of pattern matching. You have a predefined image of the perfs, for example, then with each frame you find that pattern in the overscanned image and figure out how far away from the target location that perf is. You move the image on the X and Y axes by that amount, and you have a well registered frame. If you wanted a standalone application to do this, sure, it would need to be written. But you can do the basics in a short Python script. Resolve 18's tracker looks very promising. I haven't tried it yet, but sophisticated tracking has existed for a long time in tools like After Effects, Nucoda/Phoenix, Resolve, Diamant, Baselight, etc. It's not an especially complex problem to solve, but you have to define the inputs (good references on which to stabilize). And all of those should work with a ProRes file, without requiring a DPX sequence. We use Phoenix for restoration on a 28 core Xeon PC. 3fps is par for the course in our world with 4k+ files. We're doing just fine, having earned plenty of dimes from this system.
  17. Before they made film scanners, they made recorders to make still images from a computer on film (slides I think). This went on one of those devices.
  18. Rob's right. This has gone off the rails, and I'm sorry about that. I've asked Dan not to contact me privately, but he keeps doing it, so I've blocked him where possible. Too many distractions when there's actual work to be done.
  19. The hole is yours, my friend. I have the thread with you from that conversation which contradicts all that you're saying above. I also have the emails with the client from after I figured out who you were talking about. Amicable is definitely the tone of the communications with the client, which began with a request from him for a new quote, not a complaint. You began that PM by telling me "I’ve just had a complaint against your company Perry." (you have no relation to my company so I'm not sure why you're the one they're talking to). He has never once contacted us about a problem. Had he, my first course of action would have been to ask him to send the film back so we could look into it. We did advise the client, who brought up the idea of scanning one reel to start, then the other as his budget allowed. That was not the first time we've done work like that. We advised him on ways he could reduce the cost without having a detrimental effect on the image quality. Your words, verbatim, in that thread: "Please if you ever have a collector who can’t afford a full scan with your company don’t do that again. That’s ridiculous. If you don’t want to do a full 4K scan of a 2hr movie for $1500 then send a customer to one of your competitors that will so that situation doesn’t happen again" (The "don't do that again" line is in reference to scanning one reel of a multi-reel film) The reality is that the only time you really need someone to be physically present is when the system is being installed, and travel is included in that cost. We had two people out here for 2 days when we got ours in 2013: one day to uncrate, assemble and calibrate the machine, one day for training. The cost included flights and hotels, and FWIW, Steve bought me lunch both days. In the years we've had the scanner we've never had a need for an on-site visit for support, and we've had a number of issues that had to be dealt with.
The scanner is designed such that most parts are user-replaceable and are modular in nature. For example, if you had to replace the camera for some reason, they will send you detailed step by step instructions on how to do so. It's a couple cables and a couple screws to remove the camera module from the machine. You send it to Lasergraphics and they repair or replace it. In some cases they will ship you the new part first (like a new camera upgrade), and you send the old one back to them in the box the new one came in. The scanner came with a set of tools required for assembly, and for future removal/reassembly of parts. The design principle of their machines is that you don't need a field tech because the parts are plug-and-play modules. All other support is done via email, phone and remote login on the machine. If there's a problem they may ask you to do a remote login session. You need to be at the machine to load film and do some things in meatspace that can't be done over the phone, but again, we've never had a need to have someone come out to fix something. This is different than how some companies do it - Arri, for instance, will send a tech (at your expense) to deal with things. I'm sure Lasergraphics could do that if necessary, but I don't know anyone who has had to do that.
  20. Nice. Yeah it can be really hard to pull the color out of something that faded. There's a little bit left in there, but what remains is highly variable from film to film. So while two reels may look equally pink to the eye, there's a very high probability of never getting any color at all from one of them. I've found that Resolve's Auto Color feature is often a good starting point for this kind of thing, and I suspect BMD is doing something similar to what Lasergraphics is doing in that feature. Yeah, ignoring is a possibility and that's good advice. Normally that's exactly what I'd do. But there's a reason I respond to his posts: This site and some others come up as the first hit in a lot of Google searches about scanning, and on more than one occasion I've had to spend an inordinate amount of time dealing with potential (or existing) clients who have questions about incorrect information they read here and elsewhere. Some of these are Dan's posts. Additionally, and this was in a private email thread on another forum, Dan accused us of being irresponsible to our customers because we scanned one reel of a film for a collector, instead of the whole film. This was the customer's choice, because of budget. His assertion was that we shouldn't accept work like that if we're not scanning the whole film, which is patently ridiculous. The customer only sent us one reel. Are we supposed to refuse to scan that on principle because it's an incomplete film? Many of our customers engaged in long term restoration projects will send us a reel at a time, as they can afford to, and will work with what they have in the meantime, sometimes over many years. A lot of these projects are unpaid labors of love and take a long time to complete because of the costs and effort involved. This is an *extremely* common scenario.
He had been conversing with this client privately, and relaying information to me without telling me what the film was or who the client was, telling me the client felt "ripped off." (the customer never contacted us with any complaints). His justification for this? The ProRes files seemed too small, indicating the customer didn't get the resolution they requested. They did. It was a 4k scan to ProRes 422HQ with HDR. The files were smaller simply because ProRes 422HQ files are smaller than 4444. But it's exactly the specification the customer asked for. He also accused us of not scanning the film in focus, telling me and the customer that we scanned it "too fast," causing focus issues (a completely cockamamie idea. It just doesn't work that way), and then that we should have focused on the edges of the perforations instead of the image (this is not how you do it, and doing so will itself result in an out of focus image), but he also told our customer as much, sowing confusion and doubt. The print in question was of low quality, and was a reduction from 35mm to 16mm, several generations removed from the original. The optical reduction was slightly out of focus - you could clearly see the film grain in our scan, indicating that we were in focus on the film. That, as you well know, is the only way to ensure it's in focus. Meanwhile, I heard from that client, and we explained why everything was the way it was, and that seemed to end amicably enough that he asked for a quote on a new job. But it cost me half a day of digging through old files, emails and order forms, and a fair amount of stress, to ensure that we didn't screw something up. Then I had to explain to the customer the litany of things Dan was incorrect about. Unfortunately, it's hard to ignore the pathologically uninformed, who post stuff about you and about things they have no direct knowledge of on the internet, as if they're the expert.
Because once bad information is up there, it's up there for good.
  21. And you are wrong. You are inferring that the tool is designed to eliminate post production, which is not what they're saying. Nobody is forcing you to use this feature, but it's there if you want it, and they will tell you straight up that you shouldn't use it for master scans. Again, as you say above, people know this already because it's an archival scanner, and grading in-scanner isn't how you're supposed to use it for master scans. So you're contradicting yourself here. If anyone's being contrived in their argument, it's you. I'd love to know what you think it explains, having never seen this particular reel of film before. In any case, you seem to live in a world based on assumptions, feelings and opinions, not facts. I wonder how that's working out for you? 1) 2k scan, SDR, done in about 5 minutes from an incredibly crappy reel of film that's badly faded. It's never going to look better than that, even if it's graded from the flat scan because there is virtually no color left on the film. The intention is to show that the feature works as advertised, which it does. 2) @Daniel D. Teoli Jr.: This is in part why we don't post example scans, because every last "expert" on the internet will pick it apart without knowing anything or having seen the source material. It's not worth the time or effort to have to argue with every last yahoo on the internet. (and yet, here I am). Dan, honestly, what is your problem, man? Why do you insist on going on and on about stuff you so obviously know nothing about, even when multiple people prove you wrong time and time again? I simply don't understand what your deal is.
  22. Motors do not run at perfectly consistent speeds and can speed up or slow down in small or large amounts while running due to a variety of factors. A rudimentary way to adjust for minor fluctuations is to use a flywheel. Fluctuations in speed result in wow and flutter, which affect the sound quality by causing variations in pitch. The Cintel scanner gets around this by using an encoder on the capstan, which is used to measure the speed fluctuations and to compensate for them digitally. The Lasergraphics ScanStation uses a digital camera to take a picture of the soundtrack, and it also correlates the audio samples captured by the camera to the encoder's readings, to eliminate wow and flutter. FilmFabriek could use a stepper motor running at real time speeds to capture audio, in a second pass. With a simple microcontroller, a closed loop stepper can effectively maintain a consistent speed. This is how the Magnatech dubbers do it, and those are early stepper motors. With a newer motor that can microstep you can get it moving very smoothly. Better. If you are not using the noise reduction feature in the Lasergraphics hardware reader, you are capturing all of the audio signal plus the randomness of the film grain, and this randomness is the hiss that Dan is referring to. That hiss is there, part of the film. If you look at those spectrographs I posted, you'll see that most (but not all) of the signal is below 6kHz. So if you apply a low-pass filter at 6kHz, you are lopping off everything above 6kHz. Because we perceive white noise as louder when it's in the higher frequencies, removing it results in a more pleasing sound. However, it's also cutting off any valid signal above 6kHz that might be on that soundtrack. So you have two options: one is to capture with the noise reduction feature on, because that does grain removal on the picture of the soundtrack *before* that picture is converted to an audio signal.
This removes the hiss and keeps the sound, even above 6kHz in the case of 16mm. Or, you can remove the hiss in a second pass in an audio workstation where you can fine-tune the noise reduction using appropriate tools. We've been doing a bunch of tests internally, and I have to say I'm pretty impressed with the Lasergraphics noise reduction. It only works with variable area tracks, but we will likely be using this going forward as the end result retains the underlying signal, which is harder to get to in post (though not that hard). This is incorrect. The job of a film scanner is to capture the film as faithfully as possible so you can post-process the image and sound later. What you're describing is a telecine, which is old-school and not used much anymore. In a telecine, you are capturing the film to an analog or digital format (traditionally tape, but can also be files with the right setup), while applying color correction and audio sweetening during the transfer. This is done in real time and the end result is the final product. But any decisions made are permanent, most telecines only work up to HD resolution, and the amount of equipment required to make it all work is why it was an expensive process, costing typically hundreds of dollars per hour because you're paying for the use of millions of dollars in equipment (telecine, color correction system, specially designed room, all the other computer and video hardware involved, the maintenance of that hardware, and so on), as well as the operator who is running it all and doing the color correction work.
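For comparison, the blunt low-pass approach is a few lines with SciPy — a sketch assuming a track already digitized at 48kHz, not anything a particular scanner does internally. It suppresses the grain hiss above 6kHz, but any legitimate signal up there goes with it:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def lowpass_track(audio: np.ndarray, sample_rate: int = 48000,
                  cutoff_hz: float = 6000.0) -> np.ndarray:
    """Low-pass an optical-track capture to suppress grain hiss.

    Everything above cutoff_hz -- noise and real signal alike --
    is attenuated, which is exactly the tradeoff described above.
    """
    sos = butter(4, cutoff_hz, btype="low", fs=sample_rate, output="sos")
    # filtfilt runs the filter forward and backward: zero phase shift,
    # and the stopband attenuation is effectively doubled.
    return sosfiltfilt(sos, audio)
```

This is why doing noise reduction on the soundtrack *image* before it becomes audio, or with tunable tools in an audio workstation, is preferable: the filter above can't tell hiss from signal, it only knows frequency.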
  23. I just need to address this specifically because it's been bothering me all weekend. Let's break down Lasergraphics' text, shall we? "Built in color grading tools for easy dye fade correction": This exists, and works to the extent that many people require. "applied during scan, eliminates secondary post processing step": This is also true. If your goal is to make an access copy of a print with the dye fade corrected, you can do that and it saves a step. If, however, your goal is to make an archival scan of the film, then this is the wrong approach. But this gets back to something we've discussed before: The person operating the machine has to know what they're doing, and if you're using the scanner to make archival scans with grading done in the scanner, I would submit that you don't know what you're doing and should do a little bit of reading about how this is all done first before proceeding. The dye fade correction tools are there primarily as a convenience and anyone who is spending $50,000 - $200,000 on a film scanner ought to know that. Even Lasergraphics will tell you this and they will tell you that you shouldn't be relying on their grading tools for mastering work. They are rudimentary controls and there's no way to properly monitor the image because it's all being displayed on a computer screen that is subject to Windows' color management. Basically, don't do that. If you know how to grade footage, you know this is not how it's done. This feature is there for people who want to make a quick access copy. We don't use it, but could, and it works fairly well if the film is consistent (such as a print made from one element with uniform fading). It does not work well if the film is spliced together from multiple sources.
We often make a secondary MP4 file alongside a master flat scan, with a one-light grade done in the scanner for the access copy, though we do that one-light manually, arriving at an average grade that doesn't clip or crush anything, even if there are multiple sources. That wouldn't work with this tool, but one should know that before using it. This morning we captured 30 seconds from a short dye-faded 16mm print and I just slapped this video together. The HD version should be done processing soon, but you get the idea even if you're limited to SD playback from YouTube. I am no longer waiting. BMD confirmed on their forum that the soundtrack reader in the Cintel is essentially what you get in a projector: a red light source with a photocell. There is no camera there. They are compensating for wow and flutter by monitoring the capstan with an encoder.
  24. Daniel, we last scanned film for you 4 years ago and haven't spoken other than via these forums, since. So I don't recall what you're referring to. That being said, we don't put up example scans for the following reasons: To do it properly you need to post, at minimum, ProRes files, or something uncompressed. These need to be viewed on the local machine and *not* streamed. If you put stuff on YouTube or Vimeo, the compression they apply to make it streamable is out of your hands and there's no guarantee it looks correct. Also, the compression applied will in many cases destroy the film grain, so it's not an accurate representation. Every reel of film is different. Posting a test scan from one reel is not representative of how another will look, because there are too many factors involved (type of film, number of generations from the original, damage to the film, fading, quality of the print or intermediate, sharpness of the original footage, etc). Back in the day, when I worked for a company that made one of the early computer-based nonlinear edit systems, we participated in several "shootouts" of these systems. Ours against all the others, using the same footage. Mostly it was about the quality of the picture, which was often coming from an analog tape source like BetacamSP. So you could objectively test, because you were looking for things like the level of compression for the same size image, etc. The people doing the tests were objective third parties. I would be happy to have someone who knows what they're talking about (not Dan Baxter) send us a reel to scan, with specific parameters, if it's going to be compared with other scanners and presented objectively. This means each scan should have the same cropping, should be scanned to the same format, and should be scanned in similar ways (eg: all flat scans). 
Then someone who knows what they're doing should take those files into a color grading system to look at them a bit deeper and see what they have to work with: push the grade as hard as you can to see where the image falls apart, stuff like that. I'm only willing to do this if it's someone credible doing the work though, because it's too easy (as you can see from the Fleugicker paper you posted in another thread) for someone's assumptions about what's happening to color their conclusions. That being said, I think you'll find that it's impossible to truly, objectively compare these scanners. They all have their strengths and weaknesses. Some are optimized for negative, some for print. They're all going to have different features, so you may not be able to do some tests on certain machines that you can on another, and so on.
  25. I have two working eyes and a brain. And I also have email and text message confirmation from Stefan, the CEO of Lasergraphics, and Steve, the owner of Galileo Digital, the reseller, that it is either/or. You cannot have both; it is one or the other. Ok, whatever you say. The record on this is pretty clear to anyone who wants to search this forum that this isn't true. I don't have time to dig up every claim but you have in fact said that the Archivist has a hardware optical reader, more than once. No, you're guessing wrong. We don't have it because we have the hardware reader. The software reader is software. It's part of the application and does not require a license, as it's part of the base module application. If you have a hardware reader, the software reader is disabled. You get one or the other. Again, confirmed with Stefan and Steve separately. Why is this so hard to understand? It has nothing to do with model year of the scanner as it's purely a software option, and it is Lasergraphics' decision to give you one or the other. No, this is not how Lasergraphics does its software updates. If it was there but not licensed, it would say as much in the license window where that stuff is listed. It does not. If you pay for a support contract you get the same software that runs on machines shipping from the factory today. The only features that are not necessarily enabled are hardware based. For example, the original machines didn't have variable tension adjustment for the dancer arms. Without a hardware update to add that feature, you cannot use those settings in the software. But they're there. You are the one making unsubstantiated claims in a public forum. I'm not going to message you privately about this. You are spreading misinformation about something you do not have direct experience with, and in many cases you're making assumptions based on incorrect guesses as to how things work.
Your claims are not credible, IMO, and I wish you would simply stop talking about things you don't know about.