To: -YYZ-
They could just as easily have settled on 1080p30 (1080 lines, 30 full frames per second), which would have no higher broadcast bandwidth requirements and would be easier to deal with on the increasingly common fixed-pixel displays (plasma, LCD, LCD and DLP rear projection, etc.).
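
A back-of-the-envelope check bears that out (a sketch in Python; raw pixel payload only, ignoring blanking intervals and chroma subsampling):

def pixel_rate(width, lines_per_picture, pictures_per_sec):
    return width * lines_per_picture * pictures_per_sec

# 1080i60: 60 fields/sec, each field carrying 540 of the 1080 lines
rate_1080i = pixel_rate(1920, 540, 60)     # 62,208,000 px/s
# 1080p30: 30 full frames/sec
rate_1080p30 = pixel_rate(1920, 1080, 30)  # 62,208,000 px/s

assert rate_1080i == rate_1080p30  # identical raw payload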

Given the amount of logic required to decompress digital video, how hard would it have been to allow any resolution up to 2048, and any frame rate up to 60 (or even 120), and let television sets deal with it as they see fit? Would it cost significantly more to produce a receiver that could convert a 2400x1800@72 image down to NTSC than to make one that was restricted to 1080i resolution?
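
As a sketch of how little logic that downconversion actually takes, here is a naive box-average downscaler in Python (NumPy assumed; the exact resolutions are illustrative, and a real scaler would use proper filtering rather than integer block averages):

import numpy as np

def box_downscale(frame, out_h, out_w):
    """Shrink a frame by averaging equal-size pixel blocks.
    Assumes input dimensions are integer multiples of the output's."""
    in_h, in_w = frame.shape[:2]
    fh, fw = in_h // out_h, in_w // out_w
    return frame[:out_h * fh, :out_w * fw].reshape(
        out_h, fh, out_w, fw, -1).mean(axis=(1, 3)).astype(frame.dtype)

# 2400x1800 source down toward an NTSC-ish raster; 720x480 is not an
# integer divisor, so this sketch uses a clean 4:1 shrink to show the idea.
src = np.zeros((1800, 2400, 3), dtype=np.uint8)
small = box_downscale(src, 450, 600)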

It seems to me that for viewing movies, 24 fps is better than 30 fps (and raster-scan sets could display it as 72 frames/sec progressive with each frame repeated three times, as 72 fields/sec with 3:1 interlacing, or as 48 fields/sec with 2:1 interlacing). Was/is there any reason not to allow broadcast in whatever format best describes the source data?
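
The refresh-rate arithmetic is simple enough to sketch (Python; the only assumption is that the display picks an integer multiple of 24 Hz):

def display_plan(source_fps, refresh_hz):
    """How many times each source frame repeats at a given refresh rate;
    only exact integer multiples qualify."""
    if refresh_hz % source_fps:
        return None  # needs pulldown or rate conversion instead
    return refresh_hz // source_fps

for hz in (48, 72):
    print(f"24 fps at {hz} Hz: show each frame {display_plan(24, hz)}x")
# 24 fps at 48 Hz: show each frame 2x
# 24 fps at 72 Hz: show each frame 3x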

22 posted on 12/02/2005 5:16:05 PM PST by supercat (Sony delinda est.)


To: supercat

The ATSC standard does include a 1080p24 setting, I believe; certainly that format is widely used in digital movie cameras and post-production work. And the same 2-3 pulldown trick that turns film-sourced interlaced NTSC video on DVD back into progressive-scan frames works for 1080i as well. However, as you say, it requires turning 24 fps into 30 fps, which is itself problematic. CRT televisions would not work well at 24 fps, and the ATSC formats that have settled out as the standards for broadcast, cable, etc., were all oriented toward CRT displays. Of course, at the screen sizes that make the best use of HDTV signals, CRTs are going to disappear.
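
For reference, the 2-3 pulldown cadence looks like this as a Python sketch (the A/B/C/D labels are the conventional film-frame names; this is the standard cadence, not anything ATSC-specific):

def pulldown_23(frames):
    """Map 24 fps film frames to 60i fields using the 2-3 cadence:
    frame A -> 2 fields, B -> 3, C -> 2, D -> 3, so every 4 frames
    become 10 fields (24 frames/sec -> 60 fields/sec)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_23(list("ABCD")))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# Inverse telecine just detects the repeated fields and discards them.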

I also have wondered why they didn't just make the standard fairly open-ended in terms of supported resolutions and refresh rates. I guess they felt it would be asking too much of the kind of equipment they expected to receive these signals. Really, it's a mindset that was locked into the concepts that worked for analog sets. As you say, it would be virtually trivial for the decoding and scaling chips in modern TVs and set-top boxes to deal with a wide variety of inputs and then output something the video electronics could handle.
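
A sketch of what that "accept anything, output what the panel wants" step amounts to (the panel geometry and function name here are hypothetical, purely for illustration):

def adapt(src_w, src_h, src_fps, panel_w=1366, panel_h=768, panel_hz=60):
    """Work out the scale factor and frame-rate ratio needed to put an
    arbitrary source mode on a fixed-pixel, fixed-refresh panel."""
    scale = min(panel_w / src_w, panel_h / src_h)  # letterbox/pillarbox
    out_w, out_h = int(src_w * scale), int(src_h * scale)
    return out_w, out_h, panel_hz / src_fps        # e.g. 2.5 for 24 -> 60

print(adapt(2400, 1800, 72))  # exotic source, same code path
print(adapt(1920, 1080, 30))  # 1080p30, same code path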


25 posted on 12/02/2005 5:58:40 PM PST by -YYZ-

To: supercat

"Was/is there any reason not to allow broadcast in whatever format best describes the source data?"

I should add that a lot of the work that went into setting these standards was done in the '80s and early '90s; the concepts of digital video that we all use on computers every day now were pretty exotic stuff back then. I don't think they had really come to terms with the idea that transmission-signal properties such as frame rate, aspect ratio, and resolution could be essentially independent of the display's abilities. They just hadn't realized how cheap and ubiquitous the processing power needed to do these jobs would become.

As it is, digital cable and satellite outfits use various schemes, and the effective resolutions and aspect ratios of their transmissions are often wildly different from the nominal values for 1080i or 720p. And they're already moving on to MPEG-4 and similar higher-quality/lower-bandwidth compression schemes. I've downloaded high-def DivX videos that were smaller than their standard-def MPEG-2 versions, with less artifacting.
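
The bitrates below are illustrative assumptions rather than measured values, but the arithmetic shows how an HD file can come out smaller than an SD one:

def size_gb(bitrate_mbps, minutes):
    return bitrate_mbps * minutes * 60 / 8 / 1000  # Mbit/s -> gigabytes

movie_min = 120
print(size_gb(6.0, movie_min))  # SD MPEG-2 at ~6 Mbps -> ~5.4 GB
print(size_gb(4.0, movie_min))  # HD MPEG-4 at ~4 Mbps -> ~3.6 GB
# A more efficient codec spends fewer bits per pixel, so it can deliver
# more pixels in fewer bytes with less artifacting.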


26 posted on 12/02/2005 6:29:05 PM PST by -YYZ-
