I agree QuickSync could be cool. But so far... I'm not sure.
My view is certainly "tainted" by my experience doing "real" transcoding. To me, it would make much more sense to get a dual-Xeon 16-core behemoth for those kinds of purposes and do software transcoding. But, of course, money. But, like I said... from my point of view, you can now do on a single workstation what used to (only a few years ago) take a cluster of high-end machines running in parallel.
Well, for me, the only time I would personally want transcoding is to view files on my iPad - and those aren't feature-length films, and on that display it's more about convenience than quality.
But JRemote doesn't currently support transcoding for video anyway.
QuickSync is probably better quality than anything my CPU can handle in real-time, and without the CPU load and power consumption associated with it.
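Just to sketch what that hardware path looks like in practice - this assumes an ffmpeg build with Intel QuickSync (QSV) support and a QuickSync-capable CPU, and the file names and settings are purely illustrative, not anything JRiver actually does under the hood:

```shell
# Sketch only: software decode, downscale to 720p for an iPad-sized screen,
# then encode H.264 on the iGPU's fixed-function block via the h264_qsv
# encoder. -global_quality is quality-based rate control (lower = better).
ffmpeg -i input.mkv \
  -vf scale=-2:720 \
  -c:v h264_qsv -global_quality 23 \
  -c:a aac -b:a 160k \
  output.mp4
```

The point being that the video encode runs on the GPU's dedicated block, so CPU load stays low - a software x264 encode of the same file would peg every core.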
Streaming to the other TVs is not something I would make use of, because I only watch films on the main TV, which is hooked up to the PC via HDMI.
But it would be nice to have good-enough quality that doesn't have a performance impact, for anyone else here that might want to use it on one of the other TVs.
Now, it'd have been WAY BETTER if they'd included the new sleep states in the desktop variants, which is mostly what the new on-die VRMs were for. The point of the new sleep states is that the CPU can "micro-sleep" all the time, shutting down even the VRMs themselves. So it's not that it does some task and then sleeps when you stop using it; it sleeps for 50ms, or 2 seconds, here and there while you are using it - in between keypresses and the like.
That's actually one of the things that had me excited about Haswell after reading this article some time ago - I didn't realise they only planned on it being available on the mobile chips.
Apple has actually been advertising similar things for years now - such as sleeping in between every keystroke:
http://www.apple.com/macbook-pro/environment/
I have to say though, while it may add up to save power (which is often why you get longer battery life in OS X than in Windows), I can't stand hearing the CPU switching power states all the time in their notebooks. Those high-pitched whines drive me nuts.
That, plus the DRAM-backed display tech Intel is pushing could make a huge difference on laptops.
Yep.
Also, the Airs I've played with got 4-6 hours real-world easily for normal usage. Not gaming, of course, and running Garage Band kills them (which is probably all GPU and memory), but for "regular stuff" that you'd do on an Air (web browsing, Office, etc) they were pretty good.
Well I suppose it depends what your normal usage is. Most people I know with Airs are complaining that they only get 2-3 hours before the battery dies.
For my money, though, I'd much rather have a 13" Retina MacBook Pro. But not with that crappy Ivy GPU.
Given that most of my work is done on a desktop machine, and I'm used to the portability of an iPad, I'm not sure that I want something as big as that now. I don't know that I'd buy another MacBook Pro again anyway, because they're so expensive for the performance that you get from them. I'm always wanting to upgrade long before I've had my money's worth from them.
But a 13" Retina MacBook Pro with a Haswell GT3+Crystalwell? That looks like it could be pretty darn interesting.
Perhaps. It's a shame that while we went "Retina" on the iPhones, iPods, and iPads without a price penalty, going "retina" on the MacBooks is a big price increase.
I'm not great at predicting trends due to my personal weirdo-quotient (i.e. I often like weird things). But isn't this sort of a self-fulfilling prophecy?
I don't think so. I don't know anyone that actually wants to own a desktop computer these days - at most they will consider an iMac if they need "a lot of power" and that's essentially a laptop with a big screen attached.
These days it's mostly gamers that are left buying desktop PCs - and they're moving towards smaller form factor systems that are no bigger than a full length video card.
I wouldn't mind one of them if it weren't for the noise and lack of storage options. With everything shifting towards smaller form factors and lower power consumption, I'm starting to regret buying a large tower though.
Most people - if they want a computer at all now, and aren't satisfied with an iPad or even just an iPhone - want a laptop that they can use at a desk/table if necessary, but are mostly just using on their lap when sitting on the sofa, lying in bed, taking it with them to a café etc.
In fact, people who are only a few years younger than I am are starting to use their laptops as their sole entertainment devices. I know a worrying number of people who are happy to carry their laptop around the house with them and use it as their music system, streaming via Spotify or similar services. All video content is just streamed to it via Netflix. High fidelity is a completely foreign concept to most people under, say, 25. The most you are likely to find is people who are into headphones for fashion, sound isolation, or bass more than they actually care about fidelity (hence the popularity of Beats).
Now that's potentially good news for you, as it means more people are shifting towards computer-based audio playback, but most people would rather pay for a streaming service that costs roughly the price of a single album a month than actually own music and have a "library" to manage.
And in some ways, I don't blame them. I personally hate having a huge library of physical discs that I have to store somewhere, and that are likely to be surpassed in quality in a few years.
I feel sorry for people that had big VHS libraries, then had collections of hundreds if not thousands of DVDs, and now we have Blu-rays which are a significant improvement. And in a few years time we will likely have 4K and eventually 8K too.
I'm happy enough to purchase Blu-rays though, because the quality is generally very good, and while it may not stand up to native 4K/8K video, I think it should remain watchable for a long time.
Even a good DVD never really impressed me when they were the current thing - they were all full of MPEG-2 artefacts, sharpening, noise reduction etc.
If you're paying for a streaming service, you've had free upgrades from SD to 720p, to 1080p, and beyond.
Right now the baseline quality isn't good enough for me, but I'm sure it will be eventually - and why not pay the equivalent of buying a single disc for instant access to any film you want, even if that means you don't own it?