4K TVs
glynor:
--- Quote from: 6233638 on November 14, 2013, 04:44:54 pm ---What do you think needs to change from what we have today? A good Blu-ray encode seems about as good as 1080p is likely to get on the consumer side of things.
--- End quote ---
I disagree. Only over the past two years have hardware H.264 decode blocks gotten truly competent, and even now, these are often limited to higher-end GPUs and CPUs (the FPGA chips in TVs and set-top boxes are almost universally generations old). Display technology still has massive color accuracy improvements it could incorporate.
Plus, and perhaps more importantly, there has been a revolution in capture and post-production processes. A few things have happened over the past few years:
1. The full-frame DSLR is now used for "real" production. Suddenly, high-quality, fast glass is cost-effective even for television productions and lower-budget movies, and we have a way to capture it as well. DSLRs (and DSLR-inspired tech in "pro" cameras) are used everywhere now in both big-budget and small-screen productions.
2. Full, high-quality 2K production workflows, end-to-end, only filtered across the "studio system" in the past 2-5 years. A big problem has long been the data rates for capture (there's a rough back-of-envelope of those rates sketched after this list). We could build sensors (for lots of money) that would catch the light, but they'd dump out gigabits of data per second, and it is very difficult to get rugged, reliable, high-capacity, removable storage that can record at those kinds of data rates. Another huge problem has been aliasing (and aliasing-removal algorithms, of course).
B-Roll sticks around for a long time. Footage is constantly mish-mashed. And, you're constantly looking at footage from a variety of sources, shot under different conditions and with different constraints. It is only now that most editors are really starting to live in a world free of "bad capture" (well, bad capture due to technology anyway, says the cutter).
3. There have been dramatic improvements in rendering power and in software editing suite capabilities. Now we have 4K edit suites in our workstations, and disks that can actually feed the data fast enough to matter.
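To put rough numbers on the capture problem mentioned in #2, here is a back-of-envelope sketch (illustrative figures only, not any particular camera's spec):

```python
# Back-of-envelope uncompressed sensor data rates (illustrative numbers only,
# not any particular camera's actual recording format).
def raw_rate_gbps(width, height, bits_per_photosite, fps):
    """Raw sensor data rate in gigabits per second."""
    return width * height * bits_per_photosite * fps / 1e9

print(raw_rate_gbps(2048, 1080, 12, 24))  # ~0.6 Gbit/s for a 2K, 12-bit, 24 fps capture
print(raw_rate_gbps(4096, 2160, 12, 24))  # ~2.5 Gbit/s for a 4K, 12-bit, 24 fps capture
```

Sustaining multiple gigabits per second to rugged, removable media is exactly the kind of problem that only recently became practical to solve at reasonable cost.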
It isn't that 4K doesn't matter. It is that 4K 4:4:4 (or, probably more commonly, 4:2:2) workflows are part of the system now, and they make better 1080p video. The problem is that this stuff is expensive and has a long shelf life. While the edit bay might have a nice system (or, usually, the visual effects artists do), does the colorist? Does the editor on his or her laptop? Can we render it at sufficiently high quality? Will the studio (after all of the artists who made the thing are long gone) screw up the Blu-ray transfer anyway?
As this stuff filters out, and as the DSPs, scalers, and decoders in your computer and Blu-ray player get faster and smarter, yes, we will absolutely see an increase in 2K quality over time. This will plateau, but I don't think it has yet. 4K, by contrast, is immature. We're still at the point where a $100k ARRI digital 4K cinema camera can't record RAW to media you'd actually want to use on, say, a helicopter, and doesn't even have a sensor that can resolve full 4K in widescreen once you account for the pixels needed for slop, the viewfinder, and so on.
Lower-end cameras are pushing the limits of what the sensor and glass can resolve, and well... it makes really nice 2K footage, is what it does.
--- Quote from: 6233638 on November 14, 2013, 04:44:54 pm ---I'm really hoping that this returns, because I desperately want an Ultrawide 4K OLED TV.
--- End quote ---
OLED has a lot of promise, if they can get saturation and the grid under control. It is expensive, and difficult, to make good OLED displays. You can make high resolution ones, and you can make bright ones, but making color accurate ones is tough.
Still, the black levels can't be beat, and the display tech has the capability to easily reproduce a wide gamut, so it is promising. But doing it in production, with big displays, is hard.
Ultrawidescreen?
I don't know. The whole idea of widescreen is already about the periphery. But it doesn't matter... I really, really doubt we're going to see movement away from 16:9 anytime soon, en masse. There's way too much momentum behind it, and ultrawidescreen isn't "better enough" to be worth replacing everything, yet again.
--- Quote from: 6233638 on November 14, 2013, 04:44:54 pm ---I'm not sure that's actually possible now. I think most people are satisfied enough with how thin their TVs are, and things like contrast or color accuracy are of no interest to them. Maybe wide gamuts or high framerates will work, but a lot of people seem to be against high framerate content, and that's only been 48fps so far.
--- End quote ---
Sure it is, but maybe not in the way you think.
Same reason Blu-ray never took off quite like DVD did (and why people still buy and rent a ton of DVDs now). Blu-ray is better than DVD, in a variety of ways. But it isn't like the difference between VHS and DVD, because that difference was also about the fact that you didn't have to rewind. It was about chapters, and jumping directly to the part of the film you wanted to see. The differences between tape and digital weren't just quality; they were convenience. And you don't get that again with the jump to Blu-ray.
But you do with the jump to Netflix/iTunes, in a way (though at much lower quality). Clearly, though, the market has largely spoken. Mark my words: there won't be another major optical-disc-based format. There may very well be improvements to the current format over time, and some companies will certainly try to make some kind of 4K optical format, but it'll do even worse than Blu-ray in the end.
But hardware makers won't just sit still. I do think we will see good-quality 4K gear, and reasons to use it. I'm just not convinced that it is here, and if I were picking, I'd vastly prefer improved color accuracy. Once you get used to seeing the footage as it was shot, the conversion to 4:2:0 just looks so flat, and then most TVs do an awful job displaying it, and gunk it up with all kinds of post-processing to hide their deficiencies.
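For anyone curious what that 4:2:0 step actually throws away, here's a rough sketch of the bookkeeping, counting stored samples per 2x2 block of pixels:

```python
# Samples stored per 2x2 block of pixels under common chroma subsampling schemes.
# 4:4:4 keeps full-resolution chroma; 4:2:2 halves it horizontally;
# 4:2:0 halves it both horizontally and vertically.
schemes = {
    "4:4:4": 4 + 4 + 4,  # 4 luma + 4 Cb + 4 Cr = 12 samples
    "4:2:2": 4 + 2 + 2,  # 4 luma + 2 Cb + 2 Cr = 8 samples
    "4:2:0": 4 + 1 + 1,  # 4 luma + 1 Cb + 1 Cr = 6 samples (half the data of 4:4:4)
}
for name, count in schemes.items():
    print(name, count, "samples per 2x2 block")
```

Full-resolution luma, quarter-resolution color. That's where the flatness comes from, before the TV even starts "helping."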
But it is hard to sell, so who knows.
But I do know that getting content from an IP network is the future, and in that realm we are still far short of the data rates we need for even nice-quality 2K content, much less 4K.
glynor:
--- Quote from: 6233638 on November 14, 2013, 02:29:09 am ---Well, a lot of people sit too far from their TVs, or choose displays which are too small. But the distance at which 4K provides a benefit is further than many people expect. On many AV sites there's a bogus chart I see posted all the time, which says you need to sit much closer than you actually do.
--- End quote ---
--- Quote from: Sparks67 on November 14, 2013, 07:42:33 pm ---The chart is based on several years of studying the human eye. So, it is not bogus. Are you referring to this chart? http://carltonbale.com/does-4k-resolution-matter/ Well, if you plug in 70" then you get 4 feet, but I think his calculation is a bit off. Sharp recommends 6 feet on their new 70-inch 4K TV.
--- End quote ---
The problem with those charts, and there is a big problem, is that they're measuring the wrong thing.
They're, effectively, measuring the eye's capability to resolve individual details (of text, actually) in isolation. The goal of a large video display, for theater use anyway, is to trick the eye into thinking it is seeing a continuous-tone image. You don't want to be able to resolve the individual pixels; you want them to be well below your eye's ability to resolve fine detail. In other words, just because you can't read text at a certain size and distance doesn't mean you can't see any difference between the letters at all. You can perceive detail even where you couldn't read text.
There is also not a linear relationship between the details you are able to resolve at 1 foot and those you are able to resolve at 20 feet, or 60, or 300.
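For reference, here is roughly what those charts are computing: a sketch assuming the usual 1-arcminute figure for 20/20 acuity applied to a 16:9 panel, nothing more.

```python
import math

def pixel_limit_distance_ft(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    """Distance (in feet) at which one pixel of a 16:9 panel subtends
    `acuity_arcmin` of visual angle; closer than this, a 20/20 eye can in
    principle resolve individual pixels."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from the diagonal
    pixel_in = width_in / horizontal_px                # width of one pixel
    theta = math.radians(acuity_arcmin / 60.0)         # acuity expressed as an angle
    return pixel_in / (2 * math.tan(theta / 2)) / 12.0

print(pixel_limit_distance_ft(70, 1920))  # ~9.1 ft for a 70" 1080p panel
print(pixel_limit_distance_ft(70, 3840))  # ~4.6 ft for a 70" 4K panel
```

That's more or less where the "4 feet for a 70-inch 4K set" number comes from. My argument is that matching this pixel-resolving limit isn't the only thing that matters.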
That said, I do think for most uses, you'd get much more bang for the buck in first going:
1. Bigger
2. More color accurate
The problem with 4K is that it slows both of those advances, in the name of resolution that we don't yet have the production capability to fully harness. And, of course, the problem with #2 is all distribution and economies of scale.
glynor:
--- Quote from: 6233638 on November 13, 2013, 11:44:01 pm ---My point, though, is that it is not expensive to make a 4K television - they're only expensive right now because they're new, and that means they can get away with charging a premium price for them.
--- End quote ---
A tiled 4K display, true. You can basically just glue 4 smaller 1080p screens together, after all. But getting a bunch of separate pieces of LCD in production that all "match" color performance is actually pretty tough, and expensive.
So, making a good 4K display is expensive.
Wake me up when they have good 70-90" single-tile displays, thanks.
Matt:
--- Quote from: glynor on November 14, 2013, 11:12:45 pm ---The problem with those charts, and there is a big problem, is that they're measuring the wrong thing.
--- End quote ---
I think there's a wow-factor in being immersed in the image and having your peripheral vision filled.
I sit 8 feet from a 9 foot screen. I did this even when my projector was 720p. I'd sit a little closer yet or make the screen bigger if my room had a way to handle it (but it doesn't).
A lot of charts say that's too close to sit because the resolution can't really support it. I think that misses the cool effect on your body in being immersed in the picture.
With that said, I'm in line to buy a 4K projector when they're priced reasonably. This is more interesting to me than 3D, after seeing how little I've used my 3D.
glynor:
--- Quote from: Matt on November 14, 2013, 11:32:07 pm ---I think there's a wow-factor in being immersed in the image and having your peripheral vision filled.
I sit 8 feet from a 9 foot screen. I did this even when my projector was 720p. I'd sit a little closer yet or make the screen bigger if my room had a way to handle it (but it doesn't).
A lot of charts say that's too close to sit because the resolution can't really support it. I think that misses the cool effect on your body in being immersed in the picture.
--- End quote ---
Completely agree. The optimal viewing angle is pretty close to 50 deg. Maybe even a bit less. That's what I'm saying. Buy the biggest display you can reasonably fit into the space, well before you go for resolution.
And, a nice 4K projector is a completely different story. If you have the space for it... Mmmm. Tasty. You just need a dark room, and that's impractical for most people.
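If you want to sanity-check your own setup, the viewing angle is just trigonometry. A quick sketch; I'm treating the 9 feet above as the image width in the first line, and as a 16:9 diagonal in the second, since it could be either:

```python
import math

def viewing_angle_deg(image_width_ft, distance_ft):
    """Horizontal viewing angle for a flat screen viewed on-axis."""
    return math.degrees(2 * math.atan((image_width_ft / 2) / distance_ft))

print(viewing_angle_deg(9, 8))                            # ~59 degrees if the 9 ft is the width
print(viewing_angle_deg(9 * 16 / math.hypot(16, 9), 8))   # ~52 degrees if it's a 16:9 diagonal
```

Either way it lands in the neighborhood of that 50 deg figure, which is the point.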