When entering the world of HD or picking a format for your productions, there is a lot to consider and a lot that can dictate which format you choose. In a perfect world you want the same native resolution from camera to projector, to web, to archiving. Most devices nowadays can switch between formats; however, all of these devices, such as cameras, projectors, and monitors, have a native resolution and a native format, and they convert everything else.
So figure out the native resolution of whatever you currently have in the chain and start there, and once you pick a format, keep everything in that same format. I would say projectors and cameras are the most important pieces to keep in their native format.
Interlaced vs. Progressive
Back when television was first introduced, we operated in a 525-line interlaced format, the "i" standing for interlaced. Interlaced scanning draws half the lines of a frame (every other line) in the first field, then the remaining lines in a second field, so it takes two fields to make one frame. Because we operate on a 60 Hz power grid, one cycle equals one field, giving us 60 fields per second, or 30 frames.
Progressive scanning draws every line in a single pass, so rather than two fields making one frame, each pass is a complete frame. That gives us 60 frames per second, doubling the effective scanning and producing a sharper, cleaner image. The reason interlaced isn't as sharp is that with constant movement you can't afford to wait another 60th of a second to complete the image, which causes smear.
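To put numbers on the field-versus-frame relationship, here is a quick sketch (using the round 60 Hz rate from above; real NTSC-derived systems actually run at 59.94 fields per second):

```python
# Interlaced: two fields (odd lines, then even lines) make one frame.
# Progressive: every scan pass delivers a complete frame.

FIELD_RATE_HZ = 60  # tied to the 60 Hz power grid; one field per cycle

interlaced_frames_per_sec = FIELD_RATE_HZ / 2   # 2 fields = 1 frame -> 30 frames/s (60i)
progressive_frames_per_sec = FIELD_RATE_HZ      # 1 pass = 1 frame  -> 60 frames/s (60p)

print(interlaced_frames_per_sec, progressive_frames_per_sec)
```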
For anyone who does still photography: you know that when shooting motion, you need a fast shutter speed to get a sharp, clean image. Think of scanning the same way.
We have been on the interlaced backbone since TV started, and there is nothing wrong with interlaced scanning, but I do think it is safe to say that progressive is, in most cases, the better way to go.
720p vs. 1080i
When you're choosing a camera, most can switch between 720p and 1080i, but before you choose one over the other, understand your camera's chipset. Every camera has an intended native format it is meant to be used in.
Most camera chipsets are fixed: they don't scan differently when you switch formats; instead, the signal is converted. Some cameras can natively switch formats at the chip, but most cannot. That means if your camera has a native 1920x1080 interlaced chipset and you set the output to 720p, the sensor is still capturing an interlaced image that is then converted to a 720p signal. You are not actually getting a progressive image, but a cross-converted output based on a 1080i chip. In reality you are getting something closer to a "720i" signal, with the converter trying to create a progressive scan on the way out. If the progressive information was never there from the start, it isn't there.
So when going from 1080i to 720p you are down-converting the pixels but up-converting the scanning; when going from 720p to 1080i you are up-converting the pixels and down-converting the scanning. That is the idea of cross conversion: the conversion between 720p and 1080i.
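A rough way to see why this is called "cross" conversion is to note that pixel count and scanning move in opposite directions. A small sketch with the standard frame sizes:

```python
# Frame sizes and scan types for the two HD broadcast formats.
fmt_720p = {"width": 1280, "height": 720, "scan": "progressive"}
fmt_1080i = {"width": 1920, "height": 1080, "scan": "interlaced"}

def pixels(fmt):
    """Total pixels in one full frame."""
    return fmt["width"] * fmt["height"]

# 1080i -> 720p: pixel count goes DOWN (by 2.25x), scanning goes "up"
# (fields deinterlaced into progressive frames).
# 720p -> 1080i: pixel count goes UP, scanning goes "down" -- hence "cross".
print(pixels(fmt_720p))                       # 921600 pixels per frame
print(pixels(fmt_1080i))                      # 2073600 pixels per frame
print(pixels(fmt_1080i) / pixels(fmt_720p))   # 2.25
```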
720p and 1080i are very different, but I don't think you can call one better than the other. Over SDI, they both take the same amount of bandwidth and travel the same distance.
I want to stress that these two formats are very different and yet very equal at the same time. They both serve different purposes and are both widely used in the industry.
Some cameras have one-million-pixel chips. Be careful with these: just because it is a one-million-pixel design doesn't mean it is a 720p chip. Most of these cameras, such as the Panasonic HVX 500 and Hitachi z-HD5000, are meant to be up-converted to 1080i, sticking with the native interlaced scanning. Up-converting resolution, in my opinion, is better than up-converting scanning, and I believe that is what most manufacturers intend. If you want 720p but don't need the benefits of progressive scanning, this can be a great option even though you have to convert. Sometimes 720p is simply easier to work with when dealing with computers and projectors that have a native 720p resolution.
Most editors I talk to love editing in 720p; working with codecs, shooting with DSLRs, and file storage all tend to work really well in 720p, at least from what I have heard from editors. Not to mention pretty much every sport on TV is broadcast in 720p. Sports broadcasters love 720p because the progressive-scan chip offers a crisper image that handles fast motion really well. Lots of smaller-format projectors also have a native 720p resolution, which makes 720p the better option there.
If you are doing any type of streaming or using encoders, note that some encoders don't handle interlaced scanning as well as they do progressive. Most encoders prefer a 720p input and stream.
So if progressive scan is usually better, why would people go with 1080i? I believe it is because of the higher resolution: the benefit of progressive scanning loses out to the overall pixel count. If you aren't dealing with lots of motion and just want overall resolution, 1080i is the way to go. I also think that the bigger the screen, the more resolution matters, but make sure your projector can handle it natively.
Also, there is something psychological about 1080i over 720p. More is better, right? Many people feel that if they go 720p they aren't as up to date as they could be, and therefore aren't making a good long-term investment. I hope that by this point you realize that isn't true, but sometimes you just can't mentally justify less as more, or in this case, less as equal.
1080p or 3G
So as I have said, progressive scan is better than interlaced; interlaced just lets us get higher resolution more cheaply, using less bandwidth. Naturally we want to go progressive in every video format we use, which is where 1080p comes in. However, it ends up being twice the bandwidth: where 720p and 1080i take about 1.5 gigabit, 1080p takes twice that at 3 gigabit, or "3G" as we call it.
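Those bandwidth figures line up with a back-of-the-envelope calculation for uncompressed 10-bit 4:2:2 video. This is a sketch over active pixels only (an assumption; real SDI also carries blanking and ancillary data, which is why HD-SDI runs at 1.485 Gb/s rather than the ~1.1-1.2 Gb/s below):

```python
BITS_PER_PIXEL = 20  # 10-bit 4:2:2 sampling: 10 bits luma + 10 bits shared chroma

def raw_gbps(width, height, frames_per_sec):
    """Approximate active-picture data rate in gigabits per second."""
    return width * height * BITS_PER_PIXEL * frames_per_sec / 1e9

print(round(raw_gbps(1280, 720, 60), 2))   # 720p60:  ~1.11 Gb/s -> fits HD-SDI
print(round(raw_gbps(1920, 1080, 30), 2))  # 1080i60: ~1.24 Gb/s -> fits HD-SDI
print(round(raw_gbps(1920, 1080, 60), 2))  # 1080p60: ~2.49 Gb/s -> needs 3G-SDI
```

Note how 720p60 and 1080i60 land in the same ballpark (both ride the same 1.485 Gb/s HD-SDI link), while 1080p60 is exactly double 1080i60, hence the jump to 3G.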
1080p doesn't feel like a new idea; consumer TVs and Blu-ray players have been doing 1080p for a while, but in the broadcast world we have had to figure out how to push and process twice the data. Some devices were easy to manufacture, such as DAs, frame syncs, converters, and routers. A few years back 3G cameras came on the market, and only within the last year have switcher manufacturers started to make 3G production switchers.
But there is still more of the chain that has to change: you are now pushing twice the data to everyone's TVs at home. Although Dish and cable providers carry a few 3G channels, the remaining infrastructure will take a big upgrade to get up and running. I think we are still a few years out before the transition to a 3G standard really starts to unfold.
If you really want to future-proof yourself, go for a 3G system, but in most cases the cost-versus-reward gap is too wide to justify it, especially the cost of cameras and production switchers. Not to mention that in our world cable distance can be a problem, and twice the bandwidth means half the distance (150-175 ft on 1694a coax).
Ultra HD and 4K
The next evolution of video is another jump in resolution. It has gotten a lot of hype from consumers, which has broadcasters treating it seriously as well. Film companies have been using 4K and even higher resolutions for some time. In the TV and live-production industry, though, there are some skeptics and a few problems that need to be sorted out.
First, let's understand 4K. Up until 4K, video formats were named for their line counts: 525, 720, 1080. "4K" instead refers to the horizontal pixel count, with a total resolution of 4096x2160, double the line count of 1080p and over four times its pixel count. But 4096x2160 isn't a 16:9 format; it is roughly 17:9, a little wider than the current standard. That isn't surprising, since the format comes from the film industry, which has consistently played with different aspect ratios. Close to 4K is Ultra HD, which despite the confusion is different from 4K: Ultra HD is still 16:9, operates at 3840x2160, and as of now is considered the next consumer standard.
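The resolution and aspect-ratio claims above are easy to check with a little arithmetic:

```python
# Width x height for 1080p, consumer Ultra HD, and cinema ("DCI") 4K.
formats = {
    "1080p": (1920, 1080),
    "UltraHD": (3840, 2160),
    "4K DCI": (4096, 2160),
}

base = formats["1080p"][0] * formats["1080p"][1]  # 2,073,600 pixels

for name, (w, h) in formats.items():
    # Print total pixels, the multiple of 1080p, and the aspect ratio.
    print(name, w * h, round(w * h / base, 2), round(w / h, 2))
```

Ultra HD comes out at exactly 4x the pixels of 1080p and the same 1.78 (16:9) aspect ratio, while DCI 4K is just over 4x at a wider ~1.9 (17:9) ratio.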
As we have gone through different standards since TV began, broadcasters have pushed to keep everything tied to one cable: composite, SDI, HD-SDI. There have been a few detours with S-Video, component, and Level B 3G, but the push has always been back toward a single-cable system. In the current structure there are two options for 4K: a four-cable system, or the newly developed 6G infrastructure. 6G is twice the bandwidth of 3G (1080p), but Ultra HD is four times the pixel count while 6G only doubles the data, so 6G's current limitation is frame rate. Rather than going interlaced, 6G carries 30 progressive frames at Ultra HD, half the frame rate of 1080p. Early research indicates that Ultra HD and 4K are pushing toward higher frame rates, upward of 120 fps, and delivering that much data would require four 6G cables.
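The frame-rate ceiling falls straight out of the numbers. Reusing the same rough 10-bit 4:2:2 estimate over active pixels (an assumption that ignores blanking and overhead, so real link counts can be higher), Ultra HD outgrows a single 6 Gb/s link as soon as the frame rate climbs:

```python
BITS_PER_PIXEL = 20  # assumed 10-bit 4:2:2 sampling
LINK_GBPS = 6.0      # capacity of one 6G-SDI link

def links_needed(width, height, fps):
    """Return (approx. Gb/s, whole 6G links needed) for a raw video stream."""
    gbps = width * height * BITS_PER_PIXEL * fps / 1e9
    links = -(-gbps // LINK_GBPS)  # ceiling division
    return gbps, int(links)

for fps in (30, 60, 120):
    gbps, links = links_needed(3840, 2160, fps)
    print(f"UltraHD {fps}p: ~{gbps:.1f} Gb/s -> {links} x 6G link(s)")
```

At 30p, Ultra HD squeezes onto one 6G link (~5 Gb/s); at 60p it needs two; at 120p it needs four, which matches the multi-cable delivery described above.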
Progress has been made in transport and delivery with the 6G infrastructure, and the industry doesn't seem to be stopping there. 4K is just the beginning; higher resolutions will eventually come. But before any of it can realistically go mainstream, new standards need to be worked out. 6G and the new H.265 codec have been great steps forward, but they are just the beginning.
Early studies comparing 1080p and 4K have reached a consensus that, for consumer use, 4K isn't noticeably better until the TV size is above 85 inches, which makes 4K a hard sell for consumers who can't see the difference. But other benefits of 4K will continue to spur development. Sports has begun to use 4K for instant replay because of its ability to do high-quality zoom, and the digital-signage industry will use 4K wherever resolution is needed. The broadcast standard may stay at 720p/1080i for a while yet, but 4K has already been found useful. 4K will eventually happen; the question is when. Will it be immediate, or will it take time to get new standards in place? The entire broadcast industry still needs to make the turn, and it is a ways from mainstream. Realistically, 3G has been around for five-plus years, and only within the last year has 3G become realistic for the mainstream.