I believe you're talking about the Lempel-Ziv (LZ) family of adaptive dictionary algorithms. It isn't called redundancy because anything gets duplicated while building the zip file; the term comes from how this method of compression works.
To illustrate, here's an example. Let's say I had a document containing the phrase:
It is what it is because that's what it is
If I wanted to make this phrase shorter by exploiting its redundancy, I would first build a dictionary of all the repeated words, like so:
1 → it
2 → is
3 → what
And then I would rewrite the sentence as:
12312becausethats312
If I then want to compress it further, I can add the following entries to my dictionary:
312 → x
12 → y
So that the sentence becomes:
yxbecausethatsx
As you can see, the more substitution passes you make, the greater the compression. But you're also increasing the likelihood of corruption: as the dictionary grows, it becomes more prone to damage, and if any portion of the dictionary is damaged, everything encoded with it can no longer be read.
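The two passes above can be sketched in a few lines of Python. This is a toy illustration of dictionary substitution, not how zip actually encodes data; the function names and the single-character tokens are my own choices for the example.

```python
def first_pass(phrase, table):
    """Replace each repeated word with its token, then join
    without spaces, as in the example above."""
    return "".join(table.get(word, word) for word in phrase.split())

words = {"it": "1", "is": "2", "what": "3"}
stage1 = first_pass("it is what it is because thats what it is", words)
print(stage1)  # 12312becausethats312

# Second pass: substitute repeated token sequences. Order matters
# here -- "312" must be replaced before "12", or the "12" inside
# "312" would be consumed first.
sequences = {"312": "x", "12": "y"}
stage2 = stage1
for pattern, token in sequences.items():
    stage2 = stage2.replace(pattern, token)
print(stage2)  # yxbecausethatsx
```

Decompression simply walks the dictionaries in reverse, which is exactly why a damaged dictionary makes the compressed text unrecoverable.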
CDs use their own sort of redundancy to compensate for scratches etc., it's called eight-to-fourteen modulation. I think the better use case would be downloads that get corrupted for whatever reasons. – slhck – 2011-05-13T11:05:10.930