What do you think needs to change from what we have today? A good Blu-ray encode seems about as good as 1080p is likely to get on the consumer side of things.
I disagree. Only over the past couple of years have hardware H.264 decode blocks gotten truly competent, and even now they are often limited to higher-end GPUs and CPUs (the decoder chips in TVs and set-top boxes are almost universally generations old). Display technology has massive color accuracy improvements it could incorporate.
Plus, and perhaps more importantly, there has been a revolution in capture and post-production processes. A few things have happened over the past few years:
1. The full-frame DSLR is now used for "real production". Suddenly, high quality, fast glass is cost effective even for television productions and lower-budget movies, and we have a way to capture it as well. DSLRs (and DSLR-inspired tech in "pro" cameras) are used everywhere now in both big-budget and small-screen productions.
2. Full, high-quality 2K production workflows, from end to end, only filtered across the "studio system" in the past 2-5 years. A big problem has long been the data rates for capture. We could build sensors (for lots of money) that would catch the light, but they'd dump out gigabits of data per second, and it is very difficult to build rugged, reliable, high-capacity, removable storage that can keep up with those kinds of data rates. Another huge problem has been aliasing (and the aliasing-removal algorithms, of course).
B-roll sticks around for a long time. Footage is constantly mish-mashed, and you're constantly looking at footage from a variety of sources, shot under different conditions and with different constraints. It is only now that most editors are really starting to live in a world free of "bad capture" (well, bad capture due to technology anyway, says the cutter).
3. There have been dramatic improvements in rendering power and in software editing suite capabilities. Now we have 4K edit suites in our workstations, and disks that can actually feed the data fast enough to matter.
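To put rough numbers on the capture data-rate problem mentioned in point 2, here is a back-of-the-envelope sketch. The resolutions, frame rate, and 12-bit readout are illustrative assumptions, not any specific camera's spec:

```python
# Back-of-the-envelope raw sensor readout rates (illustrative assumptions,
# not any particular camera's published numbers).

def raw_data_rate_gbps(width, height, fps, bits_per_photosite):
    """Uncompressed sensor readout, in gigabits per second."""
    return width * height * fps * bits_per_photosite / 1e9

# A 2K-class sensor (2048x1080) at 24 fps, 12-bit raw readout:
rate_2k = raw_data_rate_gbps(2048, 1080, 24, 12)

# A 4K-class sensor (4096x2160) at 24 fps, 12-bit raw readout:
rate_4k = raw_data_rate_gbps(4096, 2160, 24, 12)

print(f"2K raw: {rate_2k:.2f} Gbit/s")  # ~0.64 Gbit/s
print(f"4K raw: {rate_4k:.2f} Gbit/s")  # ~2.55 Gbit/s
```

Even the 2K case is more than half a gigabit per second, sustained, before any compression, which is why rugged removable media that can keep up was such a bottleneck, and why 4K quadruples the pain.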
It isn't that 4K doesn't matter. It is that 4K 4:4:4 (or, probably more commonly, 4:2:2) workflows are part of the system now, and they make better 1080p video. The problem is that this gear is expensive and has a long shelf life. The edit bay might have a nice system (or, usually, the visual effects artists do), but does the colorist? Does the editor on his or her laptop? Can we render it at sufficiently high quality? Will the studio (after all the artists who made the thing are long gone) screw up the Blu-ray transfer anyway?
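For readers who don't live in chroma subsampling notation: 4:4:4 keeps full-resolution color, while 4:2:2 and 4:2:0 throw away half and three-quarters of the chroma samples respectively. A quick sketch of per-frame sample counts (ignoring bit depth and compression) shows the gap:

```python
# Per-frame sample counts under common Y'CbCr chroma subsampling schemes.
# Illustrative only: ignores bit depth, packing, and compression.

def samples_per_frame(width, height, scheme):
    """Total luma + chroma samples for a frame under the given scheme."""
    luma = width * height
    # Fraction of luma resolution kept by EACH of the two chroma planes.
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    return int(luma * (1 + 2 * chroma_fraction))

w, h = 1920, 1080
for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, samples_per_frame(w, h, scheme))
# 4:4:4 -> 6220800 samples, 4:2:2 -> 4147200, 4:2:0 -> 3110400
```

Consumer delivery (DVD, Blu-ray, streaming) is 4:2:0, so a 4:2:2 production chain carries twice the color information per frame right up until the final encode.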
As this stuff filters out, and as the DSPs, scalers, and decoders in your computer and Blu-ray player get faster and smarter, yes, we will absolutely see an increase in 2K quality over time. It will plateau, but I don't think it has yet. 4K, by contrast, is immature. We're still at the point where a $100k ARRI digital 4K cinema camera can't record RAW to media you'd actually want to use on, say, a helicopter, and doesn't even have a sensor that can resolve full 4K in widescreen once you account for the pixels needed for slop, the viewfinder, and so on.
Lower-end cameras are pushing the limits of what the sensor and glass can resolve, and well... it makes really nice 2K footage, is what it does.
I'm really hoping that this returns, because I desperately want an Ultrawide 4K OLED TV.
OLED has a lot of promise, if they can get saturation and the grid under control. It is expensive, and difficult, to make good OLED displays. You can make high resolution ones, and you can make bright ones, but making color accurate ones is tough.
Still, the black levels can't be beat, and the display tech has the capability to easily reproduce a wide gamut, so it is promising. But doing it in production, with big displays, is hard.
Ultrawidescreen?
I don't know. The whole point of widescreen is already the periphery of your vision. But it doesn't matter... I really, really doubt we're going to see movement away from 16:9 anytime soon, en masse. There's way too much momentum behind it, and ultrawidescreen isn't "better enough" to be worth replacing everything, yet again.
I'm not sure that's actually possible now. I think most people are satisfied enough with how thin their TVs are, and things like contrast or color accuracy are of no interest to them. Maybe wide gamuts or high framerates will work, but a lot of people seem to be against high framerate content, and that's only been 48 fps so far.
Sure it is, but maybe not in the way you think.
Same reason Blu-ray never took off quite like DVD did (and why people still buy and rent a ton of DVDs). Blu-ray is better than DVD, in a variety of ways. But it isn't like the difference between VHS and DVD, because that difference was also about the fact that you didn't have to rewind. It was about chapters, and jumping directly to the part of the film you wanted to see. The differences between tape and digital weren't just quality; they were convenience. And you don't get that again with the jump to Blu-ray.
But you do with the jump to Netflix/iTunes, in a way (though at much lower quality). Clearly, the market has largely spoken. Mark my words: there won't be another major optical disc-based format. There may well be improvements to the current format over time, and some companies will certainly try to make some kind of 4K optical format, but it'll do even worse than Blu-ray in the end.
But hardware makers won't just sit still. I do think we will see good quality 4K gear, and reasons to use it. I'm just not convinced that it is here yet, and if I were picking, I'd vastly prefer improved color accuracy. Once you get used to seeing footage as it was shot, the conversion to 4:2:0 just looks so flat, and then most TVs do an awful job displaying it, gunking it up with all kinds of post-processing to hide their deficiencies.
But it is hard to sell, so who knows.
But I do know that getting content over an IP network is the future, and we are still far behind having the data rates we need for even nice quality 2K content in that realm, much less 4K.
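A rough transfer-size comparison makes the gap concrete. The 30 Mbit/s and 5 Mbit/s figures below are illustrative assumptions (a Blu-ray-class average bitrate vs. a streaming-class one), not any provider's actual numbers:

```python
# Transfer size for a 2-hour film at different average video bitrates.
# Bitrate figures are illustrative assumptions, not measured values.

def transfer_size_gb(bitrate_mbps, hours):
    """Total bytes moved (in GB) for a stream at a given average bitrate."""
    return bitrate_mbps * 1e6 * hours * 3600 / 8 / 1e9

print(f"Disc-class (30 Mbit/s):     {transfer_size_gb(30, 2):.1f} GB")  # 27.0 GB
print(f"Streaming-class (5 Mbit/s): {transfer_size_gb(5, 2):.1f} GB")   # 4.5 GB
```

Multiply the disc-class number by four or more for 4K and it's clear why, over typical broadband, IP delivery has to trade so much quality away.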