Author: Ondřej Vejpustek 2018-01-17 11:39:42
Published on: 2018-01-17T11:39:42+00:00
The entropy argument is a precautionary principle based on Shannon's information theory: it is safer to encrypt random or compressed data than natural language, because plaintext with low redundancy gives an attacker less structure to exploit. Although no generic attack is known, cryptographic attacks can arise when this rule is breached. Two examples of such attacks are the Related Message Attack and the Stereotyped Messages attack, both of which exploit redundancy in the messages to recover the plaintext from an RSA ciphertext. The relative redundancy of the messages in these attacks is at least one half and (1 - 1/e), respectively, and more sophisticated variants with weaker premises exist nowadays.

RSA and SSS (Shamir's Secret Sharing) bear a considerable similarity, since both schemes are algebraic in nature. CRCs, and error-correcting codes in general, introduce redundancy into the message. However, the redundancy introduced by a CRC takes the form of a linear relationship among the message bits, which differs from the premise of the Related Message Attack. When redundancy is introduced, using a hash function such as SHA instead of a CRC is suggested, unless the redundancy is introduced in some other suitably complicated way.

There is no randomization, such as padding or an initialization vector, in the scheme used. The advantages of error-correcting codes over hash functions, namely the guaranteed minimum distance between codewords and better performance, were also considered.
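
To make the Related Message Attack concrete, below is a minimal sketch of its textbook formulation (Franklin-Reiter) for two RSA plaintexts that differ by a known constant r and a small public exponent e = 3: the gcd of x^e - c1 and (x + r)^e - c2 in Z_N[x] is (x - m1), which reveals the first message. All parameters (N, e, m1, r) and helper names are illustrative toy values, not taken from the original post. The Stereotyped Messages attack is not sketched here, since Coppersmith's method additionally requires lattice reduction (LLL).

    # Sketch of the Franklin-Reiter related message attack with toy parameters.
    # Assumes m2 = m1 + r for a known r and public exponent e = 3.
    # (pow(x, -1, n) for modular inverses needs Python 3.8+.)

    def poly_trim(p):
        """Drop leading zero coefficients (coefficients listed lowest degree first)."""
        while len(p) > 1 and p[-1] == 0:
            p.pop()
        return p

    def poly_mod(a, b, n):
        """Remainder of a modulo b in Z_n[x] (leading coefficient of b must be invertible)."""
        a = a[:]
        inv = pow(b[-1], -1, n)
        while len(a) >= len(b) and any(a):
            coef = a[-1] * inv % n
            shift = len(a) - len(b)
            for i, bc in enumerate(b):
                a[i + shift] = (a[i + shift] - coef * bc) % n
            poly_trim(a)
        return a

    def poly_gcd(a, b, n):
        """Euclidean algorithm in Z_n[x]; returns a monic gcd."""
        while any(b):
            a, b = b, poly_mod(a, b, n)
        inv = pow(a[-1], -1, n)
        return [c * inv % n for c in a]

    # Toy RSA parameters (far too small to be secure).
    p, q = 10**9 + 7, 10**9 + 9
    N, e = p * q, 3

    m1 = 123456789123456789            # first message
    r = 42                             # known difference between the two messages
    m2 = m1 + r

    c1 = pow(m1, e, N)
    c2 = pow(m2, e, N)

    # g1(x) = x^3 - c1 and g2(x) = (x + r)^3 - c2 share the root m1,
    # so their gcd is the linear polynomial (x - m1).
    g1 = [(-c1) % N, 0, 0, 1]
    g2 = [(r**3 - c2) % N, 3 * r**2 % N, 3 * r % N, 1]

    g = poly_gcd(g1, g2, N)
    recovered = (-g[0]) % N
    print(recovered == m1)             # True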
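
The algebraic character of SSS that the analogy with RSA relies on can be seen from a minimal sketch of Shamir's scheme over a prime field. This is a generic 2-of-3 illustration with made-up parameters, not the scheme discussed in the post: splitting is polynomial evaluation and reconstruction is Lagrange interpolation, both plain modular arithmetic, much like RSA's modular exponentiation.

    # Minimal 2-of-3 Shamir's Secret Sharing over a prime field (illustration only).

    import random

    P = 2**127 - 1                     # a Mersenne prime used as the field modulus

    def split(secret, k, n):
        """Split secret into n shares; any k of them reconstruct it."""
        coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
        shares = []
        for x in range(1, n + 1):
            y = sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
            shares.append((x, y))
        return shares

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    secret = 123456789
    shares = split(secret, k=2, n=3)
    print(reconstruct(shares[:2]) == secret)   # True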
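
The point about CRC redundancy being a linear relationship can be checked directly. The snippet below (an illustration not taken from the original post, using the standard CRC-32 from zlib rather than any particular CRC mentioned there) verifies that for equal-length messages crc(a XOR b) = crc(a) XOR crc(b) XOR crc(0...0), i.e. the check bits are an affine function of the message bits, while the analogous identity fails for a hash such as SHA-256.

    import hashlib
    import os
    import zlib

    def xor_bytes(x, y):
        return bytes(i ^ j for i, j in zip(x, y))

    a = os.urandom(32)
    b = os.urandom(32)
    zeros = bytes(32)

    # CRC-32 is affine in the message bits: this identity always holds
    # for messages of equal length.
    lhs = zlib.crc32(xor_bytes(a, b))
    rhs = zlib.crc32(a) ^ zlib.crc32(b) ^ zlib.crc32(zeros)
    print(lhs == rhs)                  # True

    # A cryptographic hash has no such structure: the analogous identity
    # (truncated to 32 bits for comparison) fails with overwhelming probability.
    def h32(x):
        return int.from_bytes(hashlib.sha256(x).digest()[:4], "big")

    print(h32(xor_bytes(a, b)) == (h32(a) ^ h32(b) ^ h32(zeros)))   # almost certainly False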
Updated on: 2023-06-12T23:34:39.100446+00:00