
I've read excerpts from three different information security books, and to summarize, this is what they say:

"Block size impacts security, complexity, and performance. A larger block size is more desirable. It is more costly to implement though."

I have yet to find a reason WHY it is more desirable. WHY is a larger block size more desirable than a smaller block size? How does a larger block size impact security versus a smaller block size? What is considered a large block size? What is considered a small block size? Is 3DES only compatible with a 64-bit block size?

If someone can help explain this to me, that would be awesome. I haven't been able to find a source that can explain this concept to me without being extremely vague.

ihoaxed
  • In general, for a block cipher, the larger the block, the more data (in length and number of messages) you can encrypt without duplicating a block or leaking other key-related information. By analogy, consider the Birthday Attack (see Wikipedia) applied to a class of 30 students: matching on day-of-month alone (31 possible values) makes a shared value all but certain, matching on month alone (12 values) guarantees one as soon as you have 13 people, but matching 30 people of diverse ages on their full birthdate (mm/dd/yy) makes a shared value unlikely. (A rough numeric version of this is sketched after these comments.) – BillR Oct 10 '16 at 06:10
  • Note that if "cost" means time or power consumption, then the conclusion is not true. We care more about efficiency than the time it takes to compute one block. We use measures like megabytes per second or cycles per byte to describe algorithm speed. It may be the case that doubling the block size for a family of algorithms increases the time to encrypt one block, but if it only increases by, say, 25%, then it's less costly per byte to use the larger block size. – Future Security Nov 01 '18 at 04:25
  • It makes sense to look at both the cost per block and the cost per byte; we associate the two with short messages and long messages respectively. If the message length is not evenly divisible by the block size, then we need to round up to a whole number of blocks. It may still be more secure to use a sufficiently large block size, but it doesn't make sense from a performance perspective to switch from 128-bit blocks to 256-bit blocks if your message is only 4 bytes. – Future Security Nov 01 '18 at 04:32
  • If "cost" means how many transistors are needed in a hardware implementation, then bigger is costlier. For software implementations, executable size may increase by a few bytes or kilobytes. – Future Security Nov 01 '18 at 04:36
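
A rough numeric version of the birthday-bound point made in the comments above, as a minimal Python sketch (standard library only; the group size of 30 and the roughly 60-year age range are illustrative assumptions): the bigger the space of possible values, the less likely a collision, which is exactly the effect a larger block size has.

```python
import random

def match_probability(group_size, space, trials=50_000):
    """Monte Carlo estimate of the chance that at least two people in a
    group share a value drawn uniformly from `space` possible values."""
    hits = 0
    for _ in range(trials):
        draws = [random.randrange(space) for _ in range(group_size)]
        if len(set(draws)) < group_size:   # a duplicate means a "match"
            hits += 1
    return hits / trials

# 30 people matched on day-of-month only, on month+day, and on a full
# birthdate spread over roughly 60 years of possible ages.
for label, space in (("day of month only (31 values)", 31),
                     ("month + day (365 values)", 365),
                     ("full birthdate (~21,900 values)", 365 * 60)):
    print(f"{label}: P(at least one match among 30) ~ "
          f"{match_probability(30, space):.3f}")
```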

2 Answers


A larger block size increases the combinatorics: an n-bit block can take 2^n different values, so there are vastly more possible blocks and repeated blocks become far less likely.

3DES and Blowfish always work on 64-bit blocks, regardless of key size. AES always works on 128-bit blocks, regardless of key size. We now know 64-bit blocks are insecure if you send a lot of data.

For more, read the Sweet32 site and paper.
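
To put a rough number on "a lot of data": by the birthday bound, a repeated ciphertext block is expected after roughly 2^(n/2) blocks for an n-bit block cipher, and in CBC mode each repeat leaks the XOR of two plaintext blocks, which is what Sweet32 exploits. A minimal sketch of that threshold, assuming nothing beyond the block sizes named above:

```python
# Birthday bound: a collision between ciphertext blocks is expected after
# roughly 2**(n/2) blocks for an n-bit block cipher.
for name, block_bits in (("3DES / Blowfish", 64), ("AES", 128)):
    blocks = 2 ** (block_bits // 2)          # blocks until a repeat is likely
    data_bytes = blocks * (block_bits // 8)  # total data encrypted by then
    print(f"{name} ({block_bits}-bit blocks): ~{data_bytes / 2**30:.3g} GiB "
          f"of data before a repeated block is expected")
```

With 64-bit blocks the danger zone is on the order of tens of gigabytes under a single key, which a long-lived TLS or VPN connection can reach; with 128-bit blocks it is far beyond anything you will ever encrypt under one key.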

Z.T.
  • 64-bit blocks are basically bad and should be abandoned, but calling them insecure is much too strong a conclusion. – Luis Casillas Oct 31 '18 at 21:08
  • They are insecure in _more situations_ than ciphers with a larger block size, but @LuisCasillas is correct that calling them insecure is not necessarily right. If you use, say, Blowfish and rekey every 4 GiB, you should be completely fine. – forest Nov 01 '18 at 07:17

Cryptography describes the science and methods we can use to mathematically obscure (and, when intended, recover) information. The way this plays out in practice has a lot of nuance to it, but the main process is to take some data and perform calculations on it that are essentially impossible to undo without the key. (In theory they can be reversed by brute force, but thanks to constraints like the sheer computation, and even the energy, required, that's not feasible or, for any relevant purpose, possible.) These calculations can be performed on the data a small unit at a time, typically bit by bit or byte by byte, using a design known as a stream cipher. The alternative is to take the input data, break it into fixed-size chunks (blocks), and perform the calculations on those blocks. This design is known as a block cipher.
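
Purely to illustrate the difference in shape between the two designs, here is a toy sketch in Python (not a real cipher; the key, message, and SHA-256-based stand-ins are made up for illustration): the stream-style function consumes and transforms the data byte by byte, while the block-style function first rounds the data up to whole fixed-size blocks and then transforms each block.

```python
import hashlib

def toy_stream_encrypt(key: bytes, data: bytes) -> bytes:
    """Stream-cipher shape: derive a keystream and XOR it into the data
    byte by byte. (Toy only -- never reuse a keystream in practice.)"""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

def toy_block_encrypt(key: bytes, data: bytes, block_size: int = 16) -> bytes:
    """Block-cipher shape: pad to a whole number of blocks, then transform
    each fixed-size block. (Toy only -- the hash here is just a stand-in for
    a real keyed permutation and is not even decryptable.)"""
    pad = (-len(data)) % block_size or block_size
    data += bytes([pad]) * pad
    out = b""
    for i in range(0, len(data), block_size):
        out += hashlib.sha256(key + data[i:i + block_size]).digest()[:block_size]
    return out

msg = b"attack at dawn"
print(toy_stream_encrypt(b"demo key", msg).hex())  # same length as the message
print(toy_block_encrypt(b"demo key", msg).hex())   # rounded up to one 16-byte block
```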

Given the nature of your question, I'm sure this is all information you already have, but I think it's worth outlining for future readers.

To answer your question, larger blocks are safer for much the same reason that cryptographic algorithms lean on hard-to-reverse building blocks like modular arithmetic and elliptic-curve operations: they increase the work an attacker has to do. In the case of block size specifically, each added bit doubles the number of possible block values, which makes the contents of a block harder to guess and makes repeated blocks rarer. No matter how hard you try, a 2-bit block, spun round and round and calculated on in circles, is going to show patterns over time, no matter how good the underlying algorithm is; with only four possible values, identical input blocks produce identical output blocks almost immediately. Those patterns show seams, and the encryption can be cracked.

A 128-bit block, on the other hand, is much harder to guess. For one, it's difficult to guess the contents of such a block on its own, even in cleartext. And given a cryptographically secure algorithm, recovering a 128-bit block is computationally infeasible. Good luck spotting patterns, and you won't see any collisions until you've encrypted an astronomical amount of data.
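
To make the "small blocks show patterns" point concrete, here is a minimal sketch (the 2-bit block size and the repetitive message are made-up illustrations): even with a perfectly random keyed permutation, a tiny block space means equal plaintext blocks map to equal ciphertext blocks, so the structure of the message shows straight through.

```python
import random

BLOCK_BITS = 2                       # deliberately tiny: only 4 possible blocks
values = list(range(2 ** BLOCK_BITS))

# The "key" is simply which of the 4! = 24 possible permutations was picked.
rng = random.Random(1234)
perm = values[:]
rng.shuffle(perm)
encrypt_block = dict(zip(values, perm))

plaintext_blocks = [0, 1, 0, 1, 3, 0, 1, 0, 1, 3]    # a repetitive message
ciphertext_blocks = [encrypt_block[b] for b in plaintext_blocks]

print("plaintext :", plaintext_blocks)
print("ciphertext:", ciphertext_blocks)
# Equal plaintext blocks give equal ciphertext blocks, so the repetition (the
# "pattern") survives encryption. With 128-bit blocks, repeats among blocks
# are astronomically unlikely, so no such pattern shows up.
```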

Another way to think of it is as layered security: larger blocks give the encryption function more data to mix together in each operation, and that extra mixing leaves less structure for an attacker to work with.

securityOrange