
8K TVs?!


Keith Walters


  • Premium Member

http://www.dailymail.co.uk/sciencetech/article-3237451/Sharp-sell-8K-TV-October-Screen-produce-images-clear-appear-3D-ll-need-133-000-one.html

 

I mean, do we really need 8K TVs?

I think the manufacturers have gone into pixel overload.

 

The reality is, most digital TV being transmitted in the world today is either 576i or 480i, with an effective pixel resolution of about 650 x 350, and often less, depending on how much compression the broadcasters think they can get away with. This is basically the resolution standard the NTSC recommended in 1941!
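
To put that gap into numbers (my own back-of-envelope arithmetic, using the figures above and a nominal 8K frame size):

    # Rough pixel counts: the "effective" SD figure quoted above vs. a nominal 8K panel.
    sd_effective = 650 * 350          # ~0.23 megapixels actually delivered by much SD broadcast
    uhd_8k = 7680 * 4320              # ~33.2 megapixels on an 8K screen
    print(uhd_8k / sd_effective)      # roughly 145x more pixels than most broadcasts contain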

 

HD, when it is transmitted, is either 720p (1280 x 720) or 1080i (1920 x 1080 interlaced, which has about the same effective resolution as 720p). As far as I'm aware, nobody transmits 1080p, at least not on a regular basis.

 

From a standing start in about 2000, the TV manufacturing industry went through a rag-tag assortment of panel resolutions (often just 1024 x 768) before finally settling down to a mixture of 1366 x 768 ("HD") and 1920 x 1080 ("Full HD") by about 2010.

 

4K screens made their debut a couple of years ago, but rapidly migrated to the "clearance" areas, replacing the 3-D "Next Big Thing" models that preceded them.

 

My Samsung Galaxy S5 phone shoots 4K video, and it really is 4K. You can zoom in on details that are not readable at the normal 1920 x 1080 screen resolution (like the "Sci Fi" zooms you see on CSI and other overly-imaginative, reality-decoupled cop operas). That is actually quite useful, since you can pull quite respectable stills (about 8 megapixels) from the video stream. Naturally a major limiting factor is the tiny lens, but it's amazing for what it is. The two main killers are the 1GB/minute memory requirement and the fact that it really guzzles the battery. The video is also a bit jerky on fast-moving scenes, and generally everything works better if you throttle back to 1920 x 1080.
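
For what it's worth, here are the numbers behind that (my own arithmetic, assuming the S5's 3840 x 2160 UHD mode and the 1GB/minute figure above):

    # Frame size of a UHD video still, and the approximate bitrate implied by 1 GB per minute.
    pixels_per_frame = 3840 * 2160                  # ~8.3 million pixels, i.e. ~8 megapixel stills
    bitrate_mbps = (1 * 1000**3 * 8) / 60 / 1e6     # 1 GB per minute expressed in megabits/second
    print(pixels_per_frame, round(bitrate_mbps))    # ~8,294,400 pixels, ~133 Mbps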

 

But why the S5 particularly needs a full HD screen is another matter, since I can only just discern the individual pixels even with a powerful magnifying glass. (You need a microscope to see them clearly, and it's interesting that the OLEDs are tiny dots with a large amount of clear space around them, so it would actually be possible to make the "see-through" monitor screens currently beloved of the aforesaid reality-disconnected cop shows.)

 

Meanwhile, not to be outdone, the current Galaxy S6 screen is 2560 x 1440 pixels; not 4K (yet!), but watch this space!

 

So, after the complete crash and burn of 4K TVs, why do they think anybody is going to be interested in 8K?

 

As it is, I know a few people with 4K TVs, and the only source of 4K video they have had (apart from the nauseating sample footage the manufacturers supply) was what I'd given them from my phone. The only software I have that can edit 4K is the freeware AVIDemux app, which can only cut and splice footage, rather like cutting film.


  • Premium Member

Eventually I'll get a 4K TV to replace my 47" 1080P Sony Bravia but I'm in no hurry.

My Philips Blu-Ray player died a couple of months back, but since virtually the only thing anybody ever used it for was playing DVDs, I replaced it with a $25 job from Target. Not only does it work just as well, it fires up instantly instead of spending two minutes trying to understand why it can't find a network or WiFi connection. (I mean, this has gotta be a mistake, right? How can anybody NOT want to take advantage of all that BD+ has to offer??!)


  • Premium Member

I absolutely LOVE watching movies on blu-ray.

So do I. But just about all the movies we watch are rentals, and if you get the Blu-Ray version, you can only play it on that one TV, whereas with DVDs you have a larger choice of viewing areas. Sometimes I get the Blu-Ray as well, but nobody in my household seems able to either tell the difference or care.

 

Anyway I haven't thrown the Blu-Ray player out yet. I might be able to fix it, but they're so cheap now it's probably not worth the bother.

 

Remember that there were once multi-billion-dollar industries based on 78 RPM records and VHS videotape. Both formats took an extraordinarily long time to die even when vastly better alternatives became available....

 

Just sayin....


  • Premium Member

I remember when DV came out, people HAD to shoot on DV, if you didn't own a DV camcorder you were screwed.

I remember when HDV came out, those DV camcorder guys all had to upgrade because everything had to be shot on HDV.

I remember when RED hit the market, OMG if you tried to shoot anything but RED, you were an idiot.

I remember when 2k hit the market, if you weren't shooting and finishing in 2k you were worthless.

I remember the first 4k finish; it was a huge deal in 2004, and today (10 years later) most films aren't finishing or distributing in 4k.

 

So in my eyes, isn't the first step to send 2k progressive scan material to the home? Currently there isn't a single service that does this, and there is a good reason why. The standards we've set in the US are fixed; they aren't dynamic like the Japanese system. We stuck to a rudimentary and archaic MPEG-2 system that was conceived almost 10 years prior to its official rollout. At the time, it was the easiest/best method to upgrade broadcasters and, of course, consumers to a system that was proven. Broadcasters spent billions upgrading, that's billions with a big B. This was great for equipment manufacturers, who were on the verge of going out of business at the time: force everyone who watches content to buy a new television and force all broadcasters to build new studios. The switch from analog SD to digital HD would put many broadcasters out of business.

 

So here we are, next year will mark two decades of digital HD broadcast in the US, yet NOTHING has changed. We still broadcast substandard 1080i/60 or 720p/59.94 signals, mixed with 525i for people with square televisions and converters. To make the leap from 1920 x 1080i to 2048 x 1080 progressive would require a complete re-working of the current system. It's absolutely insane to think the jump to 4k will ever happen on full-time terrestrial broadcast television, for one key reason: the cost to upgrade. For big sporting events I can see it happening, with special decoder boxes sold to those individuals who want that extra quality for a premium, but for normal everyday broadcast, no way. The current workflow is pretty much set in the 1080i world and won't change, because there is no reason to change and there is no mandate for updating.

 

Now, the internet is another way to get 2k or higher signals into the home for high-resolution monitors. However, one thing that people don't realize about the internet is that the more people join, the slower it gets. We need trillions of dollars to be spent on infrastructure before we can stream even 2k to the home with any fluidity. Sure, we're seeing fiber come to the home, but switches are getting overloaded, the OC lines that run across the country are too packed, and unfortunately consumer data doesn't get the bandwidth allocation commercial data gets. So until that infrastructure is improved, until there are more servers capable of streaming high-bandwidth content, we won't see any changes. The only solution is to download the entire movie first and then watch it, but even that is very time consuming on high-speed internet, and it doesn't deliver anything but 8-bit 4:2:0 ultra-compressed signals.
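
A quick sanity check on the "download it first" option (my assumptions: a 50 GB UHD download and a 25 Mbps connection, neither figure is from the post):

    # How long a full-quality UHD download takes on a typical fast home connection.
    movie_size_gb = 50                  # assumed size of a UHD feature at Blu-ray-like bitrates
    connection_mbps = 25                # assumed sustained download speed
    hours = movie_size_gb * 8 * 1000 / connection_mbps / 3600
    print(round(hours, 1))              # ~4.4 hours before you can press play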

 

We will probably all agree that content providers want high-resolution content in the home, simply because they can charge more. However, they're currently struggling to deliver high-resolution content to theaters, with high-end/expensive servers. Heck, even IMAX, the name in "quality," has resorted to 4k distribution as its mainstay, because no 6k or 8k distribution protocol has been developed. Their double-projection system is fully capable of doing 8k, but they don't use it because there is no way of getting it from the content makers to the theaters. There are also only 4 theaters in the world using that double-projection system, and it will be a few more decades before everyone has made the switch.

 

With all that said… a 4k consumer-grade monitor seems pretty pointless. The monitor has to take the 1080i signal and turn it into a progressive-scan one, which immediately softens the image. Then it needs to scale it up, and that process also softens the image. So a 4k monitor with a 1080i signal actually looks worse than a native 1080 monitor from 10 years ago. Sure, monitor technology has gotten better with LED backlights and now OLED; however, we're not actually seeing higher quality at home, we're just allowing the MPEG noise to be more visible.
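
If you want to see those two softening steps explicitly, here's a rough offline equivalent using ffmpeg's deinterlace and scale filters (my own sketch, assuming ffmpeg is installed; the filenames are made up and this is not what any particular TV's scaler actually does):

    import subprocess

    # Deinterlace the 1080i source (yadif), then upscale it to UHD - the same two lossy
    # steps a 4k set has to perform on a broadcast signal.
    subprocess.run([
        "ffmpeg", "-i", "broadcast_1080i.ts",
        "-vf", "yadif=1,scale=3840:2160:flags=lanczos",
        "-c:v", "libx264", "-crf", "18",
        "upscaled_2160p.mp4",
    ], check=True)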

 

We're expecting Sony to announce a 4k BluRay disk (H.265) at CES in 2016. That could be the first big push towards 4k at home, but it will be at least a decade or so before acceptable content is available. Just like the move from DVD to BluRay, most original HD disks looked like crap because they used old transfers. Studios aren't going to go back through their collections and make 4k disks; it's not going to happen anytime soon. In that case, what's the point of even contemplating upgrading when we're still so far away from a working solution?

 

By the way… H.265 is the first MPEG-4-based format which can deliver 10-bit 4:4:4 signals to the home. So if 4k BluRay does happen and it is actually 10-bit 4:4:4, that in my opinion will be the biggest upgrade, rather than just resolution.
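
To put rough numbers on why the bit depth and chroma matter more than the pixel count (my arithmetic, not figures from the post):

    # Tonal levels per channel, and chroma samples per UHD frame, 8-bit 4:2:0 vs 10-bit 4:4:4.
    levels_8bit, levels_10bit = 2**8, 2**10          # 256 vs 1024 steps per colour channel
    chroma_420 = 2 * (1920 * 1080)                   # both chroma planes at quarter resolution
    chroma_444 = 2 * (3840 * 2160)                   # both chroma planes at full resolution
    print(levels_10bit / levels_8bit)                # 4x finer gradations
    print(chroma_444 / chroma_420)                   # 4x as many colour samples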


I've been using a 4K TV at home (which is also my studio) for about nine months now. I just got it for use as a computer screen - in particular for working on 4K scans of film material. But I otherwise use it for whatever: watching TV, YouTube, etc.

 

A job I worked on last year involved writing the software for a one-off custom screen that was 12K wide! It required 3 computers running in parallel to drive it. It was for a real-time interactive computer animation, so it's not as if it needed a 12K camera. The great thing about code-driven computer animation is that the code can generate any size image it wants. The only limit is the display hardware. The downside, of course, is that it's not photographic, i.e. it's synthesised, but it was awesome to see the outcome anyway.
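
As a trivial illustration of the "any size the code wants" point, here's a toy NumPy/Pillow sketch that synthesises a single 12288 x 1080 frame procedurally (my own example, nothing to do with the software the actual installation used):

    import numpy as np
    from PIL import Image

    # Synthesise one arbitrarily wide frame; only the display hardware limits the size.
    W, H = 12288, 1080
    x = np.linspace(0.0, 1.0, W, dtype=np.float32)
    y = np.linspace(0.0, 1.0, H, dtype=np.float32)
    xx, yy = np.meshgrid(x, y)
    frame = np.stack([xx, yy, 0.5 * (xx + yy)], axis=-1)   # a simple RGB gradient
    Image.fromarray((frame * 255).astype(np.uint8)).save("frame_12k_wide.png")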

 

C

Edited by Carl Looper

  • Premium Member

I find it interesting that the majority of the reworking that would need to be done is to the hardware on the broadcast side. Current coaxial copper lines running throughout the US can sustain data (and energy, for that matter) transfer rates far exceeding the current standards. If the consumer devices can support it, maybe it's time for the broadcasters to upgrade their equipment.


When one discusses 'broadcast', one needs to remember that since the invention of TV signal specs, there has been a great amount of reluctance to just 'cut the cord' on the previous standard.

 

So, here in NTSC land we have the idiocy of the 29.97 frame rate, because of the color subcarrier's interference with the audio carrier. When ATSC came along, the max resolution was 1080p, but interlace was required to be dragged along.
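
For anyone curious, the 29.97 number drops out of locking the line rate to the 4.5 MHz sound carrier; this is the standard NTSC derivation as best I know it:

    # NTSC colour timing: the line rate was tied to the 4.5 MHz audio carrier, which drags
    # the frame rate down from 30 to ~29.97 and puts the colour subcarrier at ~3.58 MHz.
    audio_carrier_hz = 4.5e6
    line_rate_hz = audio_carrier_hz / 286          # ~15,734.27 lines per second
    frame_rate_hz = line_rate_hz / 525             # ~29.97 frames per second
    subcarrier_hz = line_rate_hz * 455 / 2         # ~3,579,545 Hz
    print(round(frame_rate_hz, 3), round(subcarrier_hz))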

 

PAL land may have avoided a few such oddities, but current DVB-T is still limited to 1080p.

 

Both ATSC and DVB have 4K 'in the works', but most broadcasters, that is, the entities that send over-the-air radio signals, probably won't be upgrading with any great rapidity... except....

 

One of the reasons, at least in the US, for upgrading to ATSC-3, which would give 4K potential, is not so broadcasters can send in 4K, but so that they can 'pack' more subchannels at lower resolutions into the same 'TV band channel'. In the US, NTSC channels are 6 MHz, while DVB can go up to 8 MHz.
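
As a rough illustration of that 'packing' argument (the bitrates here are my own ballpark assumptions, not figures from the post):

    # How many services fit in one 6 MHz ATSC channel at typical MPEG-2 bitrates.
    atsc_payload_mbps = 19.39     # approximate ATSC 1.0 payload per 6 MHz channel
    hd_service_mbps = 12.0        # ballpark for one 720p/1080i MPEG-2 service
    sd_service_mbps = 3.0         # ballpark for one SD subchannel
    print(int(atsc_payload_mbps // hd_service_mbps))   # 1 HD service (plus change)
    print(int(atsc_payload_mbps // sd_service_mbps))   # 6 SD subchannels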

 

For the last 2-3 years there has been a major shift in 'broadcast'... the growth of personal Internet-based 'viewing devices' has reached a point where 'broadcasters' have to seriously rethink their systems from the ground up.

 

Over-the-air broadcasts have dominated the thinking, and cable originally just extended the delivery of 'broadcasts' to remote regions. There are a number of archaic regulations about 'local' broadcasting vs. the cable companies, etc. But with 'anytime, anywhere' demand growing because of the availability of these mobile/portable/fixed-in-home devices, the model of an antenna sitting on top of a mountain, with broadcasters 'in charge', will become far less important for many people.

 

Over-the-air broadcasts may not disappear; there is a growing trend to return to antennas for local TV programming and to 'cut the cable cord' for other media. Netflix and Hulu were 'leaders' in the trend of mobile streaming. The content producers are now experimenting with their own 'direct to consumer' streaming services. While it could be that, if someone signed up for all these streaming services, the monthly fees would be far greater than the consumer's current cable bill... if they only sign up for one or two such services, the bill would be far less than the current cable packages.

 

Unlike the film-to-digital transition... I don't hear anyone pining for their 'old' NTSC or PAL TV sets, RF interference and low-signal noise included for free...

Edited by John E Clark
