In what way do you think the requirement that signals be viewable on black-and-white televisions impaired the technology? While there are certainly other possible encoding methods, the only ones I can think of that wouldn't be compatible with black-and-white sets would have required too much circuitry to be practical in a consumer-level product in the vacuum-tube era. (Betacam splits a color signal into YUV components, then on each scan line outputs Y at double speed, followed by U at quad speed and V at quad speed; an excellent method, but I don't know how to handle the conversion without an analog shift register or some other buffering device.)
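The Betacam-style scheme above can be sketched in discrete time. This is a toy model, not Betacam's actual spec: I'm assuming an output clock at 4x the input pixel clock, so a double-speed Y sample spans two output ticks and a quad-speed chroma sample spans one, and the whole multiplexed line fits in exactly one line time.

```python
def mux_line(y, u, v):
    """Pack one scan line: Y at double speed (first half of the
    line), then U and V at quad speed (a quarter line each).
    With the output clock modeled at 4x the pixel clock, each
    Y sample occupies 2 output ticks, each chroma sample 1."""
    assert len(y) == len(u) == len(v)
    line = []
    for s in y:          # Y: N samples spread over 2N ticks
        line.extend([s, s])
    line.extend(u)       # U: N samples in N ticks
    line.extend(v)       # V: N samples in N ticks
    return line          # 4N ticks total: one full line time

def demux_line(line):
    """Recover (y, u, v) from a multiplexed line."""
    n = len(line) // 4
    y = line[0:2 * n:2]  # every other tick of the Y region
    u = line[2 * n:3 * n]
    v = line[3 * n:4 * n]
    return y, u, v
```

The mux side is easy; the hard part the post alludes to is the demux in a receiver, which needs to buffer and re-stretch each component back to real time, hence the need for an analog shift register or similar in the tube era.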
They did. The regulators, and the market, appear to have settled on 1080i60 (1080 lines of resolution, interlaced, 60 Hz, i.e. 60 half-frames per second) as the main standard for broadcast. The main reason for using an interlaced broadcast format was the limited internal bandwidth of analog tube-based televisions. They could just as easily have settled on 1080p30 (1080 lines, 30 full frames per second), which would have no higher broadcast bandwidth requirements and would be easier to deal with on the increasingly common fixed-pixel displays (plasma, LCD, and LCD or DLP rear projection, etc.). Tubes are basically a dead technology; good as they are, they will never be seen in sizes bigger than currently available (34" diagonal for a 16:9 widescreen set). Anyway, it's not the end of the world, but it's not ideal, either.
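The "no higher bandwidth" claim is just pixel arithmetic: 60 half-frames of 540 lines and 30 full frames of 1080 lines push the same number of pixels per second. A quick check, assuming the standard 1920-sample HD line width:

```python
WIDTH = 1920  # active samples per line in the standard HD raster

# 1080i60: 60 fields per second, each field carries half the lines
interlaced_pixels_per_sec = 60 * (1080 // 2) * WIDTH

# 1080p30: 30 full frames per second, all 1080 lines each time
progressive_pixels_per_sec = 30 * 1080 * WIDTH

assert interlaced_pixels_per_sec == progressive_pixels_per_sec
print(interlaced_pixels_per_sec)  # 62208000 pixels/s either way
```

The raw pixel rate is identical; the difference is purely in how the lines are ordered in time, which is why a fixed-pixel display has to deinterlace 1080i but can show 1080p30 directly.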
Too late. Digital TV signals are screwed up to maintain source compatibility with the 40-year-old landfill TVs. Why they didn't go with a progressive-scan (non-interlaced) digital signal is beyond me. Instead of converting the old video/film/whatever to the new format with a few pieces of expensive equipment at the broadcasters, we're stuck with crappy low-end converters in millions/billions of HDTV sets. A total pooch screw, imho.