The requirement to scan every packet on the Internet is never going to come into play. Since packets are routed along whatever path is fastest at the moment, well, let's put it this way: the page on FreeRepublic you're reading now likely passed through three separate major network hubs. The top part of the page went down one path, the graphic for the freepathon went down another, and this reply likely went down a third.
Reassembling all those pieces to figure out whether they add up to a piece of copyrighted material, or even whether they were encrypted, would be virtually impossible without introducing lengthy delays while picking the relevant packets out of everything else on the wire.
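To make that concrete, here is a minimal sketch in Python of the per-flow bookkeeping an inline filter would need before it could even look at content. It assumes each captured packet has already been parsed into a dict with addressing fields, a TCP sequence number, and a payload; the field names are illustrative, and real traffic adds retransmits, overlapping segments, and encryption on top of this.

```python
from collections import defaultdict

def reassemble(packets):
    """Group captured packets by TCP flow and stitch payloads in sequence order."""
    flows = defaultdict(list)
    for pkt in packets:
        # A flow is identified by its source/destination address and port.
        key = (pkt["src"], pkt["sport"], pkt["dst"], pkt["dport"])
        flows[key].append(pkt)

    streams = {}
    for key, pkts in flows.items():
        # Order by sequence number; any packet that took a route past a
        # different tap is simply missing here, leaving a gap in the stream.
        pkts.sort(key=lambda p: p["seq"])
        streams[key] = b"".join(p["payload"] for p in pkts)
    return streams
```

The point is the sort in the middle: a tap that sees only some of the routes sees only some of the packets, so the stream it stitches together is full of gaps.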
It is also unlikely that network administrators would try to sniff packets, since it would be much easier to simply check the files posted on a public share.
In some cases, such as a personal web page, the public share lives on a network server; in other cases, such as a P2P network, the shared directory is on a client computer.
On a network server, it is easy for the network operator to scan the files in the public directory for copyrighted material.
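A server-side scan of that sort amounts to little more than the following sketch: walk the public directory, hash each file, and flag anything whose digest appears on a blocklist of known-infringing hashes. The share path and the blocklist file name here are hypothetical stand-ins for whatever the operator actually uses.

```python
import hashlib
import os

def load_blocklist(path="known_hashes.txt"):
    """Read one hex SHA-256 digest per line from the blocklist file."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def scan_share(root, blocklist):
    """Yield paths under root whose SHA-256 digest is on the blocklist."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            with open(full, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in blocklist:
                yield full

if __name__ == "__main__":
    # "/srv/public_share" is a hypothetical share location.
    for hit in scan_share("/srv/public_share", load_blocklist()):
        print("flagged:", hit)
```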
On a P2P network, all the files would have to be downloaded, or, better, a crawler would be used to walk the shared-file lists that peers advertise.
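The crawler variant might look something like this. The sketch assumes a network whose peers expose their shared-file index (name plus content hash) over a simple HTTP endpoint; the peer addresses and the /shared path are purely hypothetical stand-ins for whatever query mechanism the real protocol offers.

```python
import json
import urllib.request

def crawl(peers, blocklist):
    """Ask each peer for its shared-file index and report blocklisted hashes."""
    hits = []
    for peer in peers:
        try:
            # Hypothetical index endpoint returning JSON like
            # [{"name": "song.mp3", "hash": "ab12..."}, ...]
            with urllib.request.urlopen(f"http://{peer}/shared", timeout=5) as resp:
                index = json.load(resp)
        except OSError:
            continue  # peer offline or unreachable; move on
        for entry in index:
            if entry["hash"] in blocklist:
                hits.append((peer, entry["name"]))
    return hits
```

Comparing advertised hashes this way avoids downloading every file, which is why a crawler scales where brute-force downloading does not.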
Any way you cut the cake, they will put a stop to all this copyright-violation activity; the question in the air is how much collateral damage will occur.
"The requirement to scan every packet on the Internet is never going to come into play."

Suggested Reading: AT&T and Other I.S.P.s May Be Getting Ready to Filter
As I mentioned the other day, this may well be a pretext for requiring an ADK (additional decryption key), and that is rather more likely than its being an effective means of hunting copy-pirates.