"Is there a special input for 1080p? Thought that DVI and HDMI have the non-interlaced resolution covered."
As I understand it, HDMI (and, I think, DVI too) has sufficient bandwidth for 1080p, but many TVs with HDMI inputs can't accept a 1080p signal.
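For a rough sanity check (a sketch using the standard CEA-861 timing figures; actual sets vary in what they accept), 1080p at 60 Hz needs a 148.5 MHz pixel clock, and a single TMDS link on DVI or early HDMI tops out around 165 MHz, so the interface itself has headroom even when the TV's electronics don't:

    # Back-of-the-envelope check that single-link DVI / early HDMI can carry 1080p60.
    pixel_clock_1080p60_mhz = 148.5   # standard CEA-861 timing, including blanking
    single_link_limit_mhz = 165.0     # single-link TMDS ceiling for DVI and HDMI 1.0-1.2

    fits = pixel_clock_1080p60_mhz <= single_link_limit_mhz
    print(f"1080p60 needs {pixel_clock_1080p60_mhz} MHz of the "
          f"{single_link_limit_mhz} MHz link: {'fits' if fits else 'does not fit'}")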
"Question: what resolution do the cable companies broadcast? 1080i, 1080p or something less? And what bit rate?"
The cable companies rebroadcast whatever they receive. As far as I know, all broadcast HD is either 720p or 1080i (or, rarely, 480p), never 1080p. The bit rate a cable company uses is typically variable and depends on how it has configured its system; the available bandwidth can be split up in a variety of ways.
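As a back-of-the-envelope illustration (the figures here are typical assumptions, not any particular operator's configuration): one 6 MHz 256-QAM cable channel carries roughly 38.8 Mbps of payload, and deciding how many MPEG-2 HD streams to pack into it is essentially this division:

    # Illustrative split of one QAM channel among HD streams (assumed figures).
    QAM256_PAYLOAD_MBPS = 38.8   # approximate payload of one 6 MHz 256-QAM channel
    HD_STREAM_MBPS = 12.0        # assumed per-stream MPEG-2 HD allocation (varies by operator)

    streams = int(QAM256_PAYLOAD_MBPS // HD_STREAM_MBPS)
    leftover = QAM256_PAYLOAD_MBPS - streams * HD_STREAM_MBPS
    print(f"{streams} HD streams per QAM channel, {leftover:.1f} Mbps left for other services")

Allocate more per stream and fewer channels fit; allocate less and the picture starts to suffer.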
"The higher resolutions mean more data down the cable. I am always suspicious that the cable company may cut corners and feed a bit rate that is less than the HD standard. Some "HD" shows look like they have macro distortion. Others are crisp and clear. Some shows are labeled 1080p, too.""
The distortion may be introduced upstream of the cable company, either in transmission or in poor-quality compression during post-production. As you know, the cable itself has limited bandwidth, and the MPEG-2 compression scheme used on cable (Dish is switching to MPEG-4 for HD) is not the most efficient, so in many cases there isn't enough bandwidth to carry all the detail, and you get macroblocking.
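To put rough numbers on why that happens (illustrative arithmetic only; the 12 Mbps allocation is an assumption, not a standard): uncompressed 1080i at 8-bit 4:2:0 works out to roughly 750 Mbps, so squeezing it into a single cable slot means a compression ratio on the order of 60:1, and when the encoder runs out of bits in a busy scene, the blocky artifacts show up:

    # How hard MPEG-2 has to squeeze a 1080i picture (assumed allocation).
    width, height = 1920, 1080
    frames_per_sec = 30          # 1080i: 60 fields/s, roughly 30 full frames/s
    bits_per_pixel = 12          # 8-bit samples with 4:2:0 chroma subsampling

    raw_mbps = width * height * frames_per_sec * bits_per_pixel / 1e6
    allocated_mbps = 12.0        # assumed cable allocation for one HD stream

    print(f"raw: {raw_mbps:.0f} Mbps  allocated: {allocated_mbps} Mbps  "
          f"compression ratio ~{raw_mbps / allocated_mbps:.0f}:1")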