Why do secure HDD wipe algorithms overwrite the same sectors multiple times?

3

I was reading somewhere that secure HDD erasing programs overwrite every disk sector multiple times with random sequences of bytes.

My question is this: logic tells me that doing it once to every single sector (including those the HDD has marked bad) should surely be enough?

If it weren't enough, then we should be worried about the HDD's ability to reliably write and retrieve data in the first place.

To me it sounds like the sort of thing a bureaucrat without any real technical knowledge decided would be a good idea.

Am I missing something?

EDIT: I'd like to add the following link that seems to support my argument. http://hostjury.com/blog/view/195/the-great-zero-challenge-remains-unaccepted

Matt H

Posted 2011-09-14T08:39:31.397

Reputation: 3 823

Answers

2

You're thinking like a programmer, not like a hardware engineer. If the data stored on disk were just 1s and 0s, then when a 1 got changed to a 0 it would be irrecoverable. But actual physical disks are more complicated than that, and when a 1 gets changed to a 0 there may be some traces left behind. In practice, it's not worth worrying about unless you need to protect your erased data against someone with the resources of the NSA -- but some people do have this requirement, or at least they think they do.

Mike Scott

Posted 2011-09-14T08:39:31.397

Reputation: 4 220

So what you're saying is that it's very unlikely you'd recover anything useful. I think your last statement sums it up: "some people do have this requirement, or at least think they do". – Matt H – 2011-09-14T08:59:20.817

The problem is the imperfection of disk tracking. If an overwrite pass is slightly to the left or slightly to the right of the write pass it's overwriting, there's a fear that it may be possible to recover the original data by using a narrower reading head and aiming it off the center of the track area. However, there's no evidence this is actually possible on modern hard drives. – David Schwartz – 2012-03-10T07:25:38.770

2

It's really an urban legend that a drive has to be overwritten with more than one pass. This stems from a paper written by Peter Gutmann, who made up numbers for how many times a drive had to be overwritten before the data was obscured (he said 35 times!).

Later researchers have concluded that writing over the data a single time suffices. Here's an article linking to actual research, not urban legends:

http://www.infosecisland.com/blogview/16130-The-Urban-Legend-of-Multipass-Hard-Disk-Overwrite.html

Fletch

Posted 2011-09-14T08:39:31.397

Reputation: 21

0

When you delete a file, it is not actually erased; only the file allocation table is modified to show that the space the file occupied is now free and can be used to store other files.

Using a recovery program, you can still recover the deleted file from those sectors unless they have since been overwritten by some other file, i.e. another file has been written to the sectors where the deleted file was previously stored.

Since secure HDD erasing programs have to delete files in a way that no recovery program can bring them back, they write random sequences of bytes over the sectors where the deleted file was stored, and then delete that random data as well.

This is done multiple times to ensure that the data is entirely gone and no part of it can be recovered.
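
To make the idea concrete, here is a minimal sketch in Python (the path, pass count and block size are just illustrative, and this is not how any particular wiping tool is implemented): find the file's size, overwrite its contents in place with random bytes a few times, flush each pass to disk, and only then delete it.

    import os

    def overwrite_and_delete(path, passes=3, block=64 * 1024):
        """Overwrite a file's contents with random bytes, then delete it."""
        size = os.path.getsize(path)
        with open(path, "rb+") as f:              # open in place, no truncation
            for _ in range(passes):
                f.seek(0)
                remaining = size
                while remaining > 0:
                    chunk = min(block, remaining)
                    f.write(os.urandom(chunk))    # random bytes, as described above
                    remaining -= chunk
                f.flush()
                os.fsync(f.fileno())              # force this pass onto the disk
        os.remove(path)                           # finally drop the directory entry

Note that on journaling or copy-on-write filesystems, and on SSDs, the rewritten bytes may not land on the same physical sectors, so this only illustrates the idea rather than guaranteeing anything.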

Also, you may want to read about Data remanence.

Alpine

Posted 2011-09-14T08:39:31.397

Reputation: 1 076

Actually I was referring to disk erasing programs that bypass the disk partitions and erase the drive at a low level, like doing "dd if=/dev/zero of=..." – Matt H – 2011-09-14T09:04:15.567

0

The idea is that, since the data is stored on the platters magnetically, even after blanking there may remain residual magnetization where the bits used to reside. This residue is too faint for the HDD's own read head to pick up, but there is the possibility that very advanced and sensitive forensics hardware could detect these faint traces and re-create the data.

Ishmaeel

Posted 2011-09-14T08:39:31.397

Reputation: 435

0

Most drive erasing programmes tend to use the Gutmann method. It was designed for older drives with lower data density and older encoding schemes, and it assumed you did not know which encoding your drive used.

With modern drives, a simple single-pass wipe will do the trick, but many organisations still insist on a Gutmann wipe as part of their standards.
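
To illustrate the difference, here is a hedged sketch in Python (the device path "/dev/sdX" is a placeholder and the pattern list is deliberately abbreviated; the real Gutmann method specifies 35 particular passes aimed at old MFM/RLL encodings):

    import os

    # Illustrative only: this alternates a few fixed bit patterns with random
    # passes to show the shape of a multi-pass wipe. "/dev/sdX" is a
    # placeholder device; running this destroys everything stored on it.
    PATTERNS = [b"\x55", b"\xaa", b"\x92\x49\x24", None, None]  # None = random data

    def multi_pass_wipe(device="/dev/sdX", block=1024 * 1024):
        with open(device, "rb+") as disk:
            size = disk.seek(0, os.SEEK_END)      # block devices report their size here
            for pattern in PATTERNS:
                disk.seek(0)
                remaining = size
                while remaining > 0:
                    chunk = min(block, remaining)
                    if pattern is None:
                        data = os.urandom(chunk)
                    else:
                        data = (pattern * (chunk // len(pattern) + 1))[:chunk]
                    disk.write(data)
                    remaining -= chunk
                disk.flush()
                os.fsync(disk.fileno())           # finish each pass before the next

A simple single-pass wipe is the same loop run once with an all-zero buffer, which is effectively what the "dd if=/dev/zero of=..." approach mentioned in the comments above does.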

Journeyman Geek

Posted 2011-09-14T08:39:31.397

Reputation: 119 122