
Landon D. Parks

Everything posted by Landon D. Parks

  1. While I don't work in Premiere Pro much any longer - having switched almost exclusively to Resolve and Fusion Studio - I don't recall having any stability or export issues with Premiere on any of their builds, including the latest, which I used to edit, color and export a commercial just a few weeks ago. I run an AMD Threadripper 32-thread and dual 1080 Ti system with 64GB of RAM.
  2. Super 8? That is pretty much a downgrade from any HD digital footage. I suppose if you can deal with the extreme grain, flicker - and, well, lack of good low-light performance - then by all means. Super 8 has pretty much always been the 'home video' format for a reason: while it does achieve the film look, the overall appearance of Super 8 is extremely amateur, and you're paying for that film look with a lack of resolution and audio-sync nightmares. Personally, I'd say no to Super 8, but to each their own. If I was going to go through the trouble of investing in a film-based workflow, I'd probably just save up and adopt Super 16 at the minimum.
  3. If you already own a lot of m43 support equipment, look into the GH5S. It's got a 10MP sensor, which means larger photosites than its higher-resolution siblings - and so it should have some of the best low-light performance in a non-full-frame camera.
  4. Personally, I don't know. I think they are assuming that you are storing your media on the same drive, in which case even 512GB is too small. The installer for Resolve is less than 2GB in size, as is the program's total install size. The one issue you might run into is the media cache / render preview cache, but I have that going to a separate drive anyway - you just need to set it up in the preferences menu. By default, Resolve will attempt to render temporary cache files to your program drive - which could also be why they say 512GB minimum. In your case, just make sure you tell Resolve to render the cache and previews to your other media drive - otherwise it'll eat up your OS/program drive really quickly. I currently operate with a total of 5 internal drives, plus a NAS (network-attached storage) system. Those are:
     Drive 1: 256GB SSD | Operating system and program installation (C:/)
     Drive 2: 2TB 7200rpm HDD | Non-media files, like pictures and documents.
     Drive 3: 2TB 7200rpm HDD | RAID 1 with Drive 2 for backup of computer files.
     Drive 4: 1TB SSD | Game installation files that need quick boot-up (most often played Steam games).
     Drive 5: 2TB 7200rpm HDD | Other game installation files that are not boot-time sensitive, or that I play less often.
     I used to have an additional internal 12TB running RAID 5, but last week I upgraded to a NAS: the Synology 8-bay system with a total of 32TB of raw storage running RAID 5, which gives me 28TB of network-attached media storage. That is where I point all my imports and exports for media.
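For anyone checking the arithmetic on setups like the one above, RAID usable capacity is easy to estimate. A minimal sketch, assuming the 8-bay, 32TB array holds eight 4TB drives (the post only gives totals) and ignoring filesystem overhead:

```python
def raid_usable_tb(num_drives, drive_tb, level):
    """Rough usable capacity for common RAID levels (ignores filesystem overhead)."""
    if level == 0:
        return num_drives * drive_tb        # striping, no redundancy
    if level == 1:
        return drive_tb                     # full mirror of a single drive
    if level == 5:
        return (num_drives - 1) * drive_tb  # one drive's worth of parity
    if level == 6:
        return (num_drives - 2) * drive_tb  # two drives' worth of parity
    raise ValueError("unsupported RAID level")

# The 8-bay, 32TB setup above: 8 x 4TB in RAID 5
print(raid_usable_tb(8, 4, 5))  # -> 28
```

The same function shows why RAID 6 on the same bays would drop usable space to 24TB in exchange for surviving two simultaneous drive failures instead of one.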
  5. I ran Resolve on a 240GB SSD along with my OS and a host of other programs. Keep in mind I never actually stored any project files or media on that drive, though. In general, it is a good idea to run your media off another drive, but as long as you have enough storage space for the programs and OS, plus extra space for the media - go for it.
  6. I can't imagine trying to shoot something on my phone. Seems like you'd have to rig the thing way out to even make it usable, and then you're stuck with the inferior sensor. I understand the whole 'use what you have' thing, but I find it odd that a filmmaker this established would see the need to use an iPhone - unless it was just for publicity.
  7. That is a nice idea, but 5/70 is not common, nor has it ever been. The 65mm and, to a greater extent, IMAX film formats are huge, require bulky cameras, and are expensive. The sheer cost of these formats has meant that only rarely was a film ever shot on anything other than 35, and to my knowledge photo-chemical IMAX has not even been done for an entire feature-length film. So as nice as 500 5/70 screens would be, there would be little financial incentive to make that happen, as there just isn't enough content to make it feasible. It's like suggesting theaters should upgrade to 8K projectors - why? There is no content. In some alternate universe the idea is nice, but 5/70 has never been, nor will it ever be, a standard format in this universe. The reason 65/70 formats are concentrated in metro areas is that the vast majority of people don't care about 65/70 vs 35. If you don't place your 65/70 screen in a large enough metro area, the audience willing to pay the extra amount just won't show up. Larger formats are expensive and require higher ticket prices - which people don't want to pay unless the whole experience is worth it. A simple 65/70 upgrade alone brings little to the table. I saw The Hateful Eight in 70mm in Cleveland, and honestly didn't find it that impressive. The mystique around the format and the 'road tour' was what attracted most of the people I saw it with, not the format itself. If 70 became standard, it would quickly lose its cool factor. And 3D TV sales and 3D movie tickets were also all the rage for a couple of years, until people realized it was a fad. Niche markets almost always experience a 'resurgence' for a short period of time; that does not bring them out of the niche market. Yes, sales figures are nice and all, but the overall number of people who own record players is still exorbitantly small, almost to the point of being unimportant, and digital-audio-capable devices obviously far outsell record players.
A digital music file can be played on many things - phones, MP3 players, computers, tablets, laptops, car stereos... Records can only be played on record players, and usually only at home. So while records were making a comeback (I haven't seen any recent data suggesting this trend is still happening, and only Sony has started making LPs available again - and still not in any large numbers), that does not mean they are not a niche market. I also can find no evidence that in 2015 record players were the number-one-selling home electronic item. Am I to believe that record players outsold TVs? DVD players? Hell, I'd be really shocked if record players outsold Blu-ray players. I'd love to see the evidence of that.
  8. Funny thing is, the wide-angle lens isn't even an excuse... They make lens adapters, even for the iPhone. I don't think it was a technology issue; I think he wanted to shoot it that way. In other news, though, my Galaxy S8 has far better image quality than this.
  9. Whether film 'looks' better or not, the format is no longer commonplace. It could be argued that vinyl sounds better than CDs, but that doesn't mean the world is going to suddenly give up its iPods and CD players for an old record player. Film is nostalgia, and there will always be those who decry new technology as inferior - just look at vinyl purists - but the reality is, it doesn't matter: the vast majority of the world switched to CDs and downloads, making vinyl an extremely small niche market. The same thing is happening to photo-chemical film. Humans will always try to find more convenient ways to do things, and digital has done that. No longer are you shooting blind, waiting for dailies to process, etc. Many still photographers gave up film long before the cinema world did. Perhaps if the audiences had thrown a bigger fit over the switch, things might have stalled or slowed down; instead, audiences were mostly indifferent. Why is this? Because while film and digital purists can usually tell the difference, modern professional digital cameras produce images almost identical to their film counterparts, and 99.5% of the world are not purist enough to see it. As for an iPhone, I don't know - I'm sure the image is okay. Honestly, I have never seen anyone attempt anything serious on one.
  10. I do a lot of green-screen type work, and can say that I have never seen a need to shoot in anything other than 24fps and at a 180 degree shutter. Then again, I'm no fan of wild camera pans and such, favoring slower movements - so motion blur has never been a problem. Ultimately if you are planning to do fast camera pans or other sudden camera movement, the first place I would look is to bring the shutter down rather than mess with the frame rate - that is just my opinion though. Trying to key blurred footage is bad enough, trying to track it is a nightmare.
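The 180-degree shutter mentioned above is just a ratio: exposure time per frame = (shutter angle / 360) / frame rate. A quick sketch of that formula:

```python
def shutter_seconds(fps, shutter_angle_deg):
    """Exposure time per frame for a given rotary-shutter angle and frame rate."""
    return (shutter_angle_deg / 360.0) / fps

# 24fps at 180 degrees -> the classic 1/48s exposure
print(1 / shutter_seconds(24, 180))  # -> 48.0

# Narrowing the shutter (e.g. 90 degrees) halves motion blur at the same frame rate
print(1 / shutter_seconds(24, 90))   # -> 96.0
```

This is why "bring the shutter down" reduces blur for keying and tracking without changing the frame rate: the angle, not the fps, sets how long each frame smears.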
  11. There is no need to shoot at frame rates above 24p for visual effects. 'Bluescreen' movies are still shot at 24p, unless you're Peter Jackson and you are working on The Hobbit. Motion blur is rarely a problem, but you do need to take extra care when filming to avoid REALLY bad motion blur.
  12. I use filmemporium.com for equipment and E&O. I've never had to make a claim with them yet, but they seem like good people and a lot of people work with them. For production insurance, I usually take out a short-term policy from theeventhelper.com, which offers a $1/$2 million general liability and property damage policy for filmmakers. I had to make a claim with them once, and it was smooth.
  13. No, you can't compare a $65,000 camera with a $6,000 camera. I wasn't. I was saying that Arri is still the go-to camera for a reason. It produces an amazing, cinematic-looking image, works well with proper film accessories, and works well in post. I thought you liked softer, more filmic images? That is exactly what the Arri delivers. The 3K upscale is what makes the image look so filmic when combined with Arri's color science. 6K down-sampled to 4K is extremely sharp, which sounds to me like the opposite of what you have said you liked in previous posts. You have claimed multiple times that even 4K is too sharp, so I fail to see how 6K down-sampled to 4K is going to look less sharp. I simply find it hard to believe that someone who values the film look so much would prefer to work with Red over Arri. Red images, while looking nice, look like digital camera footage. They are sharp, and don't share Arri's very film-like color science. Red images really don't look filmic at all, and look on par with my GH4 - with more DR, if anything. As for the 'please name a sub' questions - I'm not going to, because they don't exist. I'm not comparing the URSA to an Arri, or a Red, or anything else. I'm simply saying that Arri is still the go-to camera system for award-winning, stunning-looking digital cinematography - and Blackmagic is not. It has nothing to do with price. There is no way a $6,000 camera is going to match the build quality of a $65,000 camera, which is another reason Blackmagic has not taken over the industry. Arri cameras work reliably. I'm not saying that BMD has a lot of issues, but they're not known for making cameras that never experience a hiccup or have missing features. As someone else on here said, they are primarily aimed at small-time owner-operators who work in corporate, music video, documentary and such. BMD did not design the cameras to be used on a well-funded feature film set. If they had, they would have priced them accordingly.
  14. The best of the best in terms of digital, right now, is still Arri. There is a reason it's the most-used camera system at this point for real productions. Blackmagic, even the URSA Mini Pro, is still a low-cost camera, which means many corners are being cut. The image from the URSA is good, but the build quality of Blackmagic gear in general leaves a lot to be desired, and they are still not as reliable as Arri. As Robin said, if you're shooting your own stuff, a Blackmagic is fine - I would not use one on a set where I could afford a real camera rental.
  15. Come on Samuel, you didn't get the memo? I always win. Like death and taxes, it's just a fact of life. :lol:
  16. I'm not sure how many people 'live in their own world'... I spend roughly 12-14 hours a day - every day - stuck behind my computer. It's a side effect of having a post-production business and a book publishing business, and having a deep interest in designing and playing games and writing books. I literally spend most of my waking time on the computer. If Windows had half the problems you say it does, I'd certainly have run into them already. My computer is internet-connected and shares files with authors and publishing houses around the world; files are sent from authors to me and back again. I download probably 5GB worth of data every day, and send nearly as much out. Chrome scans any files you download, including EXE files - but the thing is, don't download files or programs you don't know, and they won't infect your computer. Period. It's 100% user error. If you go downloading the .doc file Princess Umba from Nigeria sends you, you deserve a virus. Period. I'm guessing your statement was probably a 'slight' at me, assuming I live in my 'own little world' over here and as such can't possibly know what I'm talking about. Maybe it wasn't, and I don't really care - but let me tell you, I'm FAR from someone who lives in his own little world. I share more files, and download and upload more crap, than 95% of other users - PC or Mac. But you've never owned more than a couple of Windows computers, and most of them ran XP or earlier? The problem with only 'supporting' Windows is that you only ever see the bad in it, or what you perceive as bad. If I were a Mac technician yet never used a Mac day to day, I'd despise Macs for being unreliable... I mean, heaven forbid, they must be bad - all these people bringing them in to get repaired... I also question how you've supported Windows PCs your entire life... Are you saying you have worked in IT? Have you worked for PC repair shops? Have you built PCs?
If you can't answer yes to each of these, then I don't consider the person well-versed in the Windows OS. Personally, I hold an A+ certification and a CCNP certificate. I worked at a PC repair shop for 5 years, and have been repairing and building PCs for myself and others since I was 12 - going on 19 years now. I worked in IT at a call center that ran around 80 Windows PCs in a Windows Server configuration. I'm certified in PC repair and also in networking from Cisco. Believe it or not, I know what I'm talking about here. I have literally seen more computers in my time than I care to admit, and while they are not perfect, the doom-and-gloom you paint around Windows is just not there. It never has been, for the most part, and most of it is fear-mongering from Apple and their hardcore fan base, which borders on being a cult. Again: 5 years' experience in a PC repair shop. 19 years building and repairing Windows PCs. 19 years of using Windows computers, for many different tasks, many hours a day. A Cisco CCNP certificate. An A+ certificate. I am a PC professional - and I don't agree at all with these 'experts' you mention. Chalking something up to 'just Windows' can mean any number of things. Even I have been known to say that, but that's not because Windows is bad - it's because we have come to know the quirks of Windows over the years, and when we see anything, good or bad, we chalk it up to 'just a Windows thing'. I don't really want to get into an argument with you about Android vs iOS - but since you brought it up, here goes. You first say that Android is designed as a toy for 5-year-olds, then go on to explain how it has smaller buttons and requires a lot more reading... THEN you go on to explain the simplicity of iOS, including the big flashy buttons - and somehow Android is the one designed for 5-year-olds? You're kind of defeating your own argument here. However, I'll indulge you: have you ever actually used Android?
I'm guessing either no, or you're just trying to be dramatic about how bad Android is compared to iOS. First, the size of the Android grid can be adjusted, which adjusts the icon sizes. If I really wanted 'kid-sized iOS' icons, I could have them on my Android. Second, I'm not sure what 'reading' you need to do to use Android... If you want to change a setting, you literally swipe down from the top to open the settings bar. From there, you can adjust display properties, sounds, etc. - JUST LIKE iOS. Third, the sounds the phone plays are fully adjustable - and it doesn't require any reading to make it happen. You simply swipe down from the top, tap the gear, tap Sound, and change the sound to whatever you want - or turn it off. I could literally go on all day about how easy Android is to use. The reality is, though, I have the last-generation iPod touch as well - so I know exactly how iOS works too, and neither one is simpler than the other. Both do the same things, and both have similar settings menus. I never said it was simple. In fact, I said building a Hackintosh is exceedingly hard. I said this just a few posts ago, when I mentioned that OSX just flat out refuses to work with most hardware that is not already in Apple computers. I simply said you can do it, which you can.
  17. You CAN install OSX on PC hardware - it's called a Hackintosh. You just have to make sure the hardware you use is right. For that matter, you can install Windows on Mac hardware as well... I still don't see why anyone needs ProRes... The DNx codec from Avid is the same size and has the same quality, and is just as portable between Mac and PC. And your PC can play and decode ProRes, and can even write it (if you need it for a deliverable) with a special program - which would be less work, in my mind, than trying to run a dual-boot OS. And yes, Windows 10 is the way to go; Windows 7 is no longer supported.
  18. I actually do a lot more than just film industry stuff - I also run a company that publishes books for self-published authors, and I actually have a MacBook Pro from around 2012 that I bought off a friend about a year ago. So I do have some experience with Mac - but not much. It's mostly used to transcode book files and audiobook files for the iTunes Store, and for the rare cases when I need to make or work on an iOS app, since Apple won't let you do anything on Windows in terms of design, coding, or transcoding. I'm not impressed with its performance - I'd actually say it's about on par, power-wise, with my 2013-era Acer laptop, which cost like $200 at the time. But it works, and I don't need it for much, so I'm not complaining. I don't mean to come off as 'pushing the Windows OS' on people. I understand that all operating systems have their issues, and each is appropriate for different people. However, I will defend Windows to the death when it comes to people making misleading statements about reliability and usability. If you are choosing between a Mac and a Windows PC, you should not be basing your decision on which one is more reliable - that is not the important part, and it ignores the other factors that need to go into your decision. A Mac and a Windows PC are two different beasts, really designed for two different mindsets. If you purchase the wrong one, you're not going to be happy. Period. Apple hardware and OSX clearly work for Tyler, and he likes them - which is fine. I know many people who love and swear by Apple products. At the same time, I know people (myself included) who would be lost with the hand-holding OSX does with its users. My biggest gripe, though, comes from the way many pro-Mac people make it sound as if a Mac is the only thing you could ever use that will work reliably.
It's not just Tyler; I have a couple of friends who argue with me all the time about how their Macs are superior to my PC (even when, in their cases, they clearly are not). The way many pro-Mac people argue makes it appear as if they own a large amount of Apple stock and have a financial stake in people buying Apple hardware and software... I don't have the same loyalty to Windows, nor do many other Windows users I know. We use it because it works, allows us to customize as we need, allows an infinite array of hardware choices, and doesn't try to hold our hand. It just works, and it's what we are used to. That will be my last post on the Apple vs Microsoft debate. Bottom line: do your research before buying one over the other, and never take my word (or Tyler's word) alone on which one to choose.
  19. 1. 64-bit ProRes is nice, but only if you require working in ProRes. DNx runs 64-bit on both Windows and Mac, which makes it superior - and a more widely supported format - in my mind. Yes, it might not play back as smoothly on a Mac as ProRes, but ProRes on a PC is 32-bit, and my 64-bit DNx is going to play back faster. I can also get QuickTime Pro for my PC, but I don't know of anything it does that my other programs don't.
     2. FCPX is not a good program, and programs like Compressor and Motion don't come standard on the Mac - you have to buy them. Media Encoder and After Effects do the same things, and they are actively updated, unlike the half-abandoned Mac programs.
     3. Google Chrome can give me real-time flight info, track currency conversions, etc. with a simple written command. Yes, it's one extra step, but then again I don't regularly track flights or convert currency, so if it needs to happen once every 3 months, I can be bothered to open Chrome. It won't kill me.
     4. Windows has the ability to run on speech commands built in. It also has text-to-speech. I'm not sure about speech-to-text, since I don't use it - but considering that's a fairly basic thing, I imagine it has that as well.
     5. Microsoft Edge works perfectly fine, and is built in. The days of the bad 'Internet Explorer' are over. But my question is, why use a built-in browser? Chrome is much better and more feature-filled than any built-in browser, Safari included.
     6. I believe Windows has this as well. With your Microsoft account, it syncs all this stuff with OneDrive. I don't use it, so I can't say how well it works. But it's there.
     7. I'm not sure what 'pooping' means in terms of programs, but I take it you mean the way Windows installs programs. Some people imagine this evil Windows machine throwing files all over the place. It doesn't. Most programs install everything into a single folder under C:/Program Files, and then usually write some registry entries. Some programs that use external resources will create folders in your My Documents section as well. HOWEVER, to get rid of all of that, all you have to do is uninstall the program, and it takes everything with it. Mac programs also write preference files, and the programs are still stored in a separate folder away from your desktop - it's just invisible to you - and some programs will still create folders for your project files and such, even on a Mac. Let's also be honest: how many times do we uninstall a program? Yes, Mac makes it easy by simply dragging the icon into the trash can, but then again Windows doesn't require many more steps - and given how rarely most people fully uninstall a program from their computer, the time savings on the Mac are pointless. Like a lot of OSX and iOS features, it's 'fancy ware' that has little practical use beyond being cool and snazzy.
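If you do need to move between ProRes and DNx on a PC, ffmpeg handles both without any dual-boot gymnastics. A hedged sketch that just builds the command line - the `dnxhd` encoder and `dnxhr_hq` profile are standard ffmpeg options, but check your build's `ffmpeg -encoders` output before relying on them:

```python
def dnxhr_cmd(src, dst, profile="dnxhr_hq"):
    """Build an ffmpeg command list for a DNxHR intermediate (Windows, Mac, or Linux)."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd",        # ffmpeg's encoder name covers both DNxHD and DNxHR
        "-profile:v", profile,  # e.g. dnxhr_hq, dnxhr_hqx, dnxhr_444
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",    # uncompressed audio, typical for edit intermediates
        dst,
    ]

print(" ".join(dnxhr_cmd("input.mov", "output.mxf")))
```

Wrapping DNxHR in MXF (as here) tends to travel between NLEs more predictably than QuickTime containers; the filenames above are placeholders.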
  20. I'd say more like fortunately... I'd hate to see the low-light performance of a sensor with 33.1 million pixels crammed onto a 2/3". :blink:
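The low-light worry here is easy to put numbers on: photosite pitch shrinks with the square root of pixel count. A rough sketch using nominal sensor dimensions (the 8.8 x 6.6mm '2/3 inch' figure and the 17.3 x 13mm Four Thirds figure are textbook approximations, not exact specs for any particular camera):

```python
import math

def pixel_pitch_um(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate photosite pitch, assuming square pixels tiling the full sensor."""
    pixel_area_mm2 = (sensor_w_mm * sensor_h_mm) / (megapixels * 1e6)
    return math.sqrt(pixel_area_mm2) * 1000  # mm -> micrometres

# Nominal 2/3" type sensor (~8.8mm x 6.6mm) with 33.1MP
print(round(pixel_pitch_um(8.8, 6.6, 33.1), 2))   # -> 1.32 (um)

# A 10MP Four Thirds sensor (~17.3mm x 13mm) for comparison
print(round(pixel_pitch_um(17.3, 13.0, 10.0), 2)) # -> 4.74 (um)
```

A ~1.3µm photosite gathers roughly a tenth the light of a ~4.7µm one per pixel, which is the intuition behind both this post and the GH5S comment earlier in the thread.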
  21. Come on, Tyler, really? Massive amounts of tinkering? I used to install the Windows OS as part of my job, and I can say that this is totally untrue. The Windows 10 setup process does everything for you - you only need to create a username and password and it sets it all up. Yes, you might need to install some drivers, but that depends on what hardware you are using; Windows automatically installs almost all the drivers you need, and the Windows Update process keeps those drivers up to date for you. The only drivers I have had to manually install on my build are the nVidia drivers and the Blackmagic video software, which has the driver for my 4K mini-monitor card. I doubt even a Mac knows to install that Blackmagic driver by itself, and the graphics drivers - while I did have to install them once - were painless to set up and keep themselves updated automatically in the background. I have not encountered any driver compatibility issues, and like I said, the two drivers I had to install were 'point and click' installs. I'm not trying to paint Windows as the best OS - it has its issues - but it does not require massive amounts of tinkering. In fact, it requires no tinkering whatsoever - this has been true at least back to Windows 7 / Vista, and even XP wasn't that hard to set up. If you want to tinker, you can - you can overclock your CPU, RAM, and GPU to get more power from them, like I have - but you don't have to. The question I have is how you know so much about how hard Windows is to work with, when you claim you won't have any Microsoft product in your house. Are you basing this on old work you have done in the past?
  22. That is nice, but it's nothing my Windows PC can't do. The motherboard in my PC has 4x PCIe 3.0 x16 slots, 1x PCIe 2.0 x4, and 1x PCIe 2.0 x1 slot. I can run 6 PCIe cards at the same time, plus an M.2 drive. AMD Threadripper also has 64 lanes of PCIe, meaning it supports up to 7 full-power PCIe devices at the same time. If I wanted to, I could run 4 graphics cards in my case and still run my Blackmagic card and my audio card. Of course, I have no need for 4 graphics cards - but the option is there if I ever needed it. The thing about Mac hardware that gets me is the cost: I am running 32 threads at 4.1GHz over 64 PCIe lanes with 64GB of RAM (upgradeable to 128GB if I wanted), and that processor cost me less than $800. The only Intel chip that comes close is the $2,000 Core i9. I don't even think Apple offers that chip in their device options, so the best you're going to get on a new build is a 10-core - and you can't put an i9 in an old Mac either; the socket is entirely different. Yes, you can customize Mac hardware to a degree - but the problem is, once you start venturing outside of Apple-approved hardware, it might or might not even work. This is why Hackintoshes are so hard to get working: Apple supports only specific hardware.
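The lane math above is worth sketching, because the naive sum can exceed the CPU's 64 lanes - in which case the motherboard typically negotiates some slots down (e.g. x16 to x8) rather than failing outright. A toy budget check (device names and lane counts are illustrative, not a real parts list):

```python
def lanes_needed(devices):
    """Sum the PCIe lanes requested by a list of (name, lanes) devices."""
    return sum(lanes for _, lanes in devices)

# Hypothetical build: four x16 GPUs plus capture and audio cards on a 64-lane CPU
build = [("GPU 1", 16), ("GPU 2", 16), ("GPU 3", 16), ("GPU 4", 16),
         ("capture card", 4), ("audio card", 1)]
total = lanes_needed(build)
print(total, "lanes requested;",
      "fits at full width" if total <= 64 else "some slots will run at reduced width",
      "on 64 CPU lanes")
```

Running it shows the four-GPU-plus-cards scenario requests 69 lanes, so at least one GPU would drop to x8 - which in practice costs very little performance, but it is not literally "4 cards at full power" plus extras.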
  23. DigiPrimes are not generally a good idea for such use... Aren't they designed for 2/3" prism sensors? You will see the black circle of death when trying to use a DigiPrime on an m4/3 sensor - period. I'm also not sure how the optics differ when designed for cameras with a prism... Also, from the video posted above, it looks as if the DigiPrimes are not producing very good image quality on the GH4. I don't know if that is because of the lack of a prism, or if the lenses are just not sharp enough for 4K...
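The 'black circle of death' claim comes down to one comparison: a lens hard-vignettes when the sensor diagonal exceeds the lens's image circle. A sketch (the ~11mm image circle for a 2/3-inch lens is an assumption based on that format's nominal diagonal, and the sensor dimensions are nominal figures):

```python
import math

def vignettes(image_circle_mm, sensor_w_mm, sensor_h_mm):
    """True when the sensor diagonal exceeds the lens image circle (hard vignette)."""
    diagonal = math.hypot(sensor_w_mm, sensor_h_mm)
    return diagonal > image_circle_mm

# 2/3"-format lens (~11mm circle) on a Micro Four Thirds sensor (~17.3 x 13mm,
# ~21.6mm diagonal): the corners fall well outside the circle.
print(vignettes(11.0, 17.3, 13.0))  # -> True
```

Cropping the m4/3 sensor to a 2/3"-sized window (as some cameras allow) is the usual workaround, at the cost of resolution and a tighter field of view.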
  24. This depends on the user, what the user is doing, and how they like to work. If you don't ever want to touch the 'workings' of the OS, then OSX is definitely more user-friendly software. Although nothing says you have to delve into the Windows registry - that is your choice - and this again hearkens back to user error: if you don't need access to the internals of Windows, just don't mess around in the system settings. However, what if you want access to more computer-level features? If ALL you are doing is video editing and such, you probably don't need it - but if you are using custom code, designing games, or writing software, having access to the registry and the internal OS settings is vital. It is also possible to tweak the performance of your system when you have access to the internals - overclocking, RAM management, etc. Personally, I wouldn't like a computer telling me what I can and can't have access to - it's too restricting for me; but then again, I know what I'm doing within the system. Just because you don't want access to the OS internals doesn't mean I don't - and I think this is where Windows and OSX users differ the most. That is why Apple computers are not the go-to for all professional work. There is a reason coders and game designers use Windows: Apple software does not allow you to make many adjustments to the OS. There is a reason many server farms use Windows over Apple: while Apple's networking is fairly user-friendly, it doesn't have the same server-level support that Windows and, to a greater extent, Linux have. Visual effects is also heavily skewed toward Windows-based PCs, which can overlap with the small-time video editor if he is doing his own VFX work. There is a reason VFX is mostly a Windows-based endeavor: Mac hardware is not up to snuff for the networking and graphics requirements of such use.
I have noticed that in the professional design world, Macs have been popular with graphic designers and video editors, and not much beyond that. I liken that to the fact that neither use requires access to any part of the OS other than opening a program. As I have said before and will say again: OSX is perfectly fine, as is Apple hardware. There is nothing 'bad' about it per se. If you like Apple products, or you need a Mac, get one and use it. It'll serve you just fine. However, a Windows-based PC will also serve you just as well if you don't need a Mac specifically. There is no software of any importance that will work on a Mac and not on a PC (FCPX excepted - but then that program is basically iMovie). Windows no longer has the problems it used to have with security issues and software bugs / system failures, and Windows 10 is a much more user-friendly version of Windows. I work on Windows-based PCs all the time, and don't experience any of the problems people decry Windows for - BSODs, hardware failures, corrupt systems. I have also never had a virus on my PC. The bottom line is that what you choose to buy for a computer system should be based on your preference, the job you are doing, the software you need access to, and your budget. Don't use the 'X system is unreliable, so go with this one' mindset; it's outdated and doesn't give you all the facts. OSX, Windows 10, and pretty much all recent Linux builds are stable, fast, and have no issues that you won't run into with other OSes.
  25. I've had issues with Windows-based PCs, don't get me wrong. No piece of technology is perfect. However, I will say that the way computers 'stop working' or 'break' is usually the result of user error. I think this is why PCs and Macs are more reliable for some people and not for others. Computers in general are subject to a lot of user error - Windows more so, since the Mac 'hides' much of what you can do to tear up the computer. I'm very careful when using my computers. I always run antivirus programs, never install unknown or untrusted applications, and take care of the computer by staying proactive. You need to run HDD checks to verify drive health, ensure your computer maintains good airflow and is free of dust, and install a temperature monitor to make sure temps aren't putting hardware at risk. If you don't monitor these things, then you won't know of potential problems until they become REAL problems. If I know in advance that my drive is failing a check disk, I can replace the drive; otherwise, I end up losing data to a sudden hard drive failure, and then blaming Windows for allowing it to happen. If I know that my graphics card is overheating, I can fix the problem (replace the heatsink or such) before it tears up the card - or I can wait until the GPU chip becomes so hot that it malfunctions. As someone who has worked on PCs, pretty much 95% of all issues I have seen have been a direct result of either user error or user carelessness. Just like a car, a PC needs a checkup and tuneup every once in a while. If you don't change the oil, you're going to blow the engine. I'm sure Macs are subject to the same issue.
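Part of that 'stay proactive' advice can be automated. A minimal free-space check using only the Python standard library - real drive-health monitoring needs SMART data from tools like smartctl or CrystalDiskInfo, which this sketch does not read, and the path and threshold are placeholders:

```python
import shutil

def drive_report(path="/", warn_below_pct=10.0):
    """Proactive check: flag a drive whose free space has dropped below a threshold."""
    usage = shutil.disk_usage(path)            # (total, used, free) in bytes
    free_pct = usage.free / usage.total * 100
    status = "OK" if free_pct >= warn_below_pct else "WARNING: low free space"
    return free_pct, status

pct, status = drive_report("/")
print(f"{pct:.1f}% free - {status}")
```

Run it on each mount point (or drive letter on Windows, e.g. `drive_report("C:\\")`) from a scheduled task so a filling drive shows up before it becomes a failed render.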