To: ScuzzyTerminator; Lonesome in Massachussets; cryptical
I agree it is not a hard problem. My point is speed. How many machine cycles will be needed to test each decryption for plaintext? If you're trying to test a billion keys per second, this definitely becomes a consideration.
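To make the cost concrete, here's a minimal sketch (my own illustration, not anything the NSA is known to use) of the kind of cheap per-key plaintext filter a brute-forcer might run: just count the fraction of printable bytes in a candidate decryption, which takes only a handful of operations per byte.

```python
def looks_like_text(data: bytes, threshold: float = 0.95) -> bool:
    """Cheap first-pass filter: fraction of printable/whitespace bytes.

    A wrong key almost always produces high-entropy bytes that fail
    this test, so most candidate keys are rejected in one linear pass.
    """
    if not data:
        return False
    printable = sum(1 for b in data if 32 <= b < 127 or b in (9, 10, 13))
    return printable / len(data) >= threshold

# English plaintext passes; an even spread of all byte values does not.
assert looks_like_text(b"Attack at dawn.\n")
assert not looks_like_text(bytes(range(256)))
```

Even a test this simple is a multiply and compare per byte, which is exactly why cycles-per-candidate matters at a billion keys per second.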
There would be many ways around brute-force recognition algorithms for skilled operatives. For example, you could write your text in a graphics application and save it as a .jpg. Then uuencode it or yEnc it to disguise the filetype, and apply your encryption algorithm. The brute-force cracker cannot be programmed to recognize too many variants without slowing it down so much that it would be worthless.
To: proxy_user
I think the point is that even things like that have headers in them, which would make them even easier to recognize once decrypted. Besides, AQ don't want to be slowed up, they just wanna fire off emails without all the rigamarole.
To: proxy_user
How many machine cycles will be needed to test each decryption for plaintext?
I doubt that the NSA would use CPUs to crack well known algorithms. I assume they use hardware designed for the task, like the EFF DES Cracker project. The "randomness meter" would likely be a module that works in parallel without slowing anything down.
There would be many ways around brute-force recognition algorithms for skilled operatives. For example, you could write your text in a graphics application and save it as a .jpg. Then uuencode it or yEnc it to disguise the filetype, and apply your encryption algorithm.
Steganography before encryption doesn't help, because you know you've found the key when you've found, for example, a .jpg or a uuencode. Generally, you need to encrypt before you hide.
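The reason is that disguised formats announce themselves. A rough sketch (the signature list here is my own illustration) of how a cracker would confirm a key by matching well-known file headers in a candidate decryption:

```python
from typing import Optional

# Known "magic numbers" let a cracker confirm a key instantly,
# even when the plaintext is a disguised binary or encoded file.
SIGNATURES = {
    b"\xff\xd8\xff": "JPEG",     # JPEG files start with FF D8 FF
    b"begin ": "uuencode",       # classic uuencode header line
    b"=ybegin ": "yEnc",         # yEnc header line
}

def identify(candidate: bytes) -> Optional[str]:
    """Return the format name if the candidate starts with a known header."""
    for magic, name in SIGNATURES.items():
        if candidate.startswith(magic):
            return name
    return None
```

So wrapping the message as a .jpg or uuencoding it gives the attacker a *better* known-plaintext target, not a worse one.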
The brute-force cracker cannot be programmed to recognize too many variants...
Variants don't matter. You're not looking for recognizable data patterns but just measuring randomness. Any message carrying meaningful information, no matter what the format, will show up as non-random provided you have a sample at least as large as the unicity distance.
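A format-blind "randomness meter" can be as simple as Shannon entropy over byte frequencies. This is a sketch of the idea, not any particular agency's method: structured data in any format scores well below the 8 bits/byte of random output, so a wrong-key decryption is rejected without knowing anything about the plaintext's format.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; approaches 8.0 for random data."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# English text clusters far below 8 bits/byte; a decryption under the
# wrong key is statistically indistinguishable from random bytes.
assert byte_entropy(b"the quick brown fox jumps over the lazy dog" * 10) < 5.0
assert byte_entropy(os.urandom(4096)) > 7.5
```

The sample-size caveat in the post is the important part: on a snippet shorter than the unicity distance, several keys can yield equally "plausible" decryptions, and no statistic can break the tie.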
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson