aren't there actually tools to convert your 320x240-kitten-playing-with-a-string video into 1920x1080?
Well... Yes and no.
You can convert and upscale anything if you want. Going from 320x240 to 1080p is going to look like doodie, but sure you can scale it.
In Final Cut, you just make a new 1080p sequence and drop the source footage in, and it automatically scales it up. It has to render, and it doesn't use the
smartest scaling system in the world, but it does a decent job. I'm sure most other video editing applications have something similar. A smarter way is to run the source video through Compressor (or Telestream Episode if you have it) and upscale there first, as they have higher-quality scaling algorithms built in (though they're slower). Likewise, you could do the same in After Effects or Apple Motion if you choose (Motion has a really nice automatic stabilization system too). If you care, there are tutorials for this on the Creative Cow forums.
Here's one such discussion about going from 4x3 480p footage up to 16x9 1080p.
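If you want a feel for why the 320x240 case is such a stretch, the arithmetic alone tells the story (just back-of-the-envelope Python, nothing to do with any particular scaler):

```python
# How much data an upscaler has to invent going 320x240 -> 1920x1080.
src_w, src_h = 320, 240
dst_w, dst_h = 1920, 1080

scale_x = dst_w / src_w                          # 6.0x wider
scale_y = dst_h / src_h                          # 4.5x taller
pixel_ratio = (dst_w * dst_h) / (src_w * src_h)  # 27x the pixels

print(f"{scale_x}x / {scale_y}x, {pixel_ratio:.0f}x the pixels")
# 26 out of every 27 output pixels have to be interpolated,
# and that's before the web-compression artifacts get magnified too.
```

No scaling algorithm, however smart, can conjure real detail out of that ratio; it can only interpolate what's there.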
There are also hardware scalers built into many pro I/O systems, like
the AJA Kona or the
Blackmagic Design DeckLink. Lastly, there are a TON of
third-party software plugins that specialize in cleaning and upscaling.
But going from a 320x240, probably web (aka junk) compressed, source video to 1080p is laughable. Is there some quick and easy consumer tool to do it? I don't know... You can do it using
MPEG Streamclip (which is free and actually has an awesome quality scaler).
Staying in two dimensions, 1440x1080 seems to be HDV, which I guess was used a lot a few years back for sports shoots, for example the Salt Lake Olympics whose clips I was viewing.
Like I said, that resolution is very common, and Canon's implementation of HDV certainly used it. I would be very surprised if they used HDV for the Olympics (at least for the high-profile events), because it was a decidedly "prosumer" codec, but whatever.
To be clear, HDV is a
codec. Just like H.264 or XviD or Apple ProRes or AAC. It doesn't particularly care about the resolution, though HDV was typically stored at 1440x1080 because that resolution would "fit" on a standard MiniDV cassette. That's what it was designed for: shooting HD on MiniDV cassettes originally meant to store crappy-quality standard-def DV content. HDV is a 4:2:0 codec, so it carries even less color information than broadcast-standard 4:2:2 formats. I'm sure it was probably used heavily in some spaces (the cameras were "cheap" and lightweight), but DVCProHD was certainly much, much more popular professionally a few years back in the same "era" (DVCProHD is a 4:2:2 codec and records at a MUCH higher bitrate with much less noticeable compression artifacting). DVCProHD also uses the 1440x1080 resolution for its "1080p mode".
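The 1440x1080 trick works because the stored pixels aren't square: HDV (and DVCProHD in its 1080 mode) records anamorphic pixels with a 4:3 pixel aspect ratio, which the player stretches back out at display time. A quick sketch of that math:

```python
# Anamorphic storage: 1440 non-square pixels stretch to a 1920-wide picture.
stored_w, stored_h = 1440, 1080
par = 4 / 3  # pixel aspect ratio used by HDV / DVCProHD "1080" modes

display_w = round(stored_w * par)  # stretched out on playback
display_h = stored_h               # height is unaffected

print(display_w, display_h)  # a full 16:9 raster when displayed
```

That's 25% fewer luma samples per line to squeeze onto tape, stretched back out (approximately) on playback.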
Is it possible this "quote HD" wasn't necessarily--as Lady G says--born this way?
If
they're (the content creators) really choosing to distribute it at 1440x1080 and they're professionals (one would hope), I'd guess it is being shot on DVCProHD gear at that resolution natively. They might also be using AVCIntra (or one of the other H264 based variants) which sometimes uses that frame size. In any case,
it was almost certainly shot at 1440x1080 if they're distributing it via that format. The same cannot be said for something that is "real" 720p (1280x720) or "real" 1080p (1920x1080), because people do up/downscale to those resolutions, as they're standard broadcast sizes. But there'd be no reason to distribute at 1440x1080 unless you shot it that way.
But, much more likely, the frame size was "picked" by YouTube and they use 1440x1080 for ALL "1080p" content. I don't know enough about the compression magic YouTube does to comment for sure. I know their quality is absolutely HORRID, and that their data rates for "1080" are so low that it's a joke they distribute things in that format at all, but it wouldn't surprise me to learn that all 1080p content there uses that 1.333 PAR. When you upload a video to YouTube, they compress it to their own delivery format, regardless of what source you give them. YouTube does
not upscale content. So, if you only give them a 480p source, then your video's max resolution for playback will be 480p. But they do downscale, so you'll also get a 360p version (or whatever smallest size they use). If you upload a 1080p source (or higher), they create a 1080p "version" of their own (H.264 compressed and conforming to their standards).
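That downscale-only behavior can be sketched as a simple filter over a rendition ladder (the ladder below is illustrative, not YouTube's actual internal list):

```python
# Hypothetical rendition ladder; YouTube's real one is a black box.
LADDER = [2160, 1440, 1080, 720, 480, 360, 240]

def renditions(source_height: int) -> list[int]:
    """Versions offered for a given source: downscales only, never upscales."""
    return [h for h in LADDER if h <= source_height]

print(renditions(480))   # a 480p upload only yields 480p and below
print(renditions(1080))  # a 1080p upload gets everything up to 1080p
```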
All video uploaded to YouTube is recompressed on their servers. There are no "user-facing" controls for this as a content creator. You get what you get. Now, if you are NBC, they might give you a little more control (but from what I've heard at NAB, not much at all). They NEVER use your source footage directly, even if it is "web-friendly" compressed already.
So, the takeaway is this:
If YouTube is offering a "1080p" version, then the content creator gave them a 1080p source (or better). If it was upscaled, the content creator did it, not YouTube.

If I had to guess? It is probably being shot either at 4-5K on a digital cinema camera of some kind (maybe a
RED Epic?) or it is being shot on DVCProHD, Sony's format (whatever it is called now), or AVCIntra gear at 1080p of some variety. Probably a mix of both depending on the event, actually.
It doesn't much matter, though, because YouTube then takes the nice quality source the content creator gives them, beats the living daylights out of it, and serves you crap. If you drop down to the lower quality one on YouTube? It might actually be better (if the bits-per-pixel available to the compression algorithm are more favorable) or it might be worse (if they use even crappier settings for lower-res stuff, figuring it's web video and who cares).
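The "might actually be better" case comes down to bits-per-pixel. With made-up (but plausible) bitrates purely for illustration:

```python
# bits-per-pixel = bitrate / (width * height * framerate)
# The bitrates below are invented for illustration, not YouTube's real numbers.
def bits_per_pixel(bitrate_bps, width, height, fps=30):
    return bitrate_bps / (width * height * fps)

bpp_1080 = bits_per_pixel(4_000_000, 1920, 1080)  # "1080p" at a stingy 4 Mbps
bpp_480 = bits_per_pixel(1_500_000, 854, 480)     # "480p" at 1.5 Mbps

print(f"1080p: {bpp_1080:.3f} bpp, 480p: {bpp_480:.3f} bpp")
# With these numbers the 480p stream gets nearly twice the bits per pixel,
# so it can show fewer compression artifacts despite the lower resolution.
```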
YouTube's compression system is a black box though, and what they did last week might not be the same as what they do now.
That's why Vimeo is so much better.