There are well-defined procedures for permanently erasing data from a traditional hard drive. But for solid-state drives (SSDs), which use flash memory instead of magnetic platters, things are quite different. The problem stems from two peculiarities of SSDs: “they can only erase data in larger chunks than they can write it, and their storage cells can only be written a certain number of times (10,000 is standard) before they start to fail.” Because of these constraints, the SSD’s firmware performs a lot of behind-the-scenes remapping (wear leveling, among other things) when writing data to the drive.
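To see why that remapping defeats erasure, here is a toy model of a flash translation layer in Python. It is purely illustrative (the class and method names are invented, and no real controller is this simple), but it shows the key behavior: a logical “overwrite” lands in a fresh physical page, and the old data stays physically intact.

```python
# Toy flash translation layer (FTL) -- illustration only, not a real controller.
class ToyFTL:
    def __init__(self, num_pages=8):
        self.physical = [None] * num_pages   # physical flash pages
        self.mapping = {}                    # logical address -> physical page
        self.next_free = 0                   # naive wear-leveling cursor

    def write(self, logical_addr, data):
        # Flash pages can't be rewritten in place, so the controller writes
        # to a fresh page and just updates the mapping table.
        self.physical[self.next_free] = data
        self.mapping[logical_addr] = self.next_free
        self.next_free += 1

    def read(self, logical_addr):
        return self.physical[self.mapping[logical_addr]]

ftl = ToyFTL()
ftl.write(0, "secret")
ftl.write(0, "XXXXXX")   # the user "overwrites" logical address 0
print(ftl.read(0))       # XXXXXX -- the overwrite looks successful...
print(ftl.physical)      # ...but "secret" still sits untouched in page 0
```

This is the behavior behind the UCSD findings summarized next: software that overwrites a file “in place” may never touch the physical pages holding the original data.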
Researchers at UCSD have determined the following:
- Built-in erase commands are effective, but are sometimes implemented incorrectly (a sketch of invoking one appears after this list).
- Overwriting the entire visible address space of an SSD twice is usually, but not always, sufficient to sanitize the drive.
- None of the existing techniques for individual file sanitization are effective on SSDs.
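For context, the “built-in erase commands” in question are the drive’s ATA Secure Erase feature set. The sketch below shows one common way to issue it on Linux via the hdparm utility, driven from Python; the device path and password are placeholders, and, as the first finding warns, the firmware on the other end may or may not implement the command correctly.

```python
# Sketch: issuing ATA Secure Erase via hdparm on Linux (must run as root).
# /dev/sdX is a placeholder -- pointing this at the wrong device destroys it.
import subprocess

DEVICE = "/dev/sdX"   # placeholder: the SSD to sanitize
PASSWORD = "p"        # temporary security password required by the ATA spec

# Step 1: set a user password to unlock the drive's security feature set.
subprocess.run(["hdparm", "--user-master", "u",
                "--security-set-pass", PASSWORD, DEVICE], check=True)

# Step 2: issue the erase. The drive's firmware does the actual wiping,
# which is exactly why a buggy implementation can silently leave data behind.
subprocess.run(["hdparm", "--user-master", "u",
                "--security-erase", PASSWORD, DEVICE], check=True)
```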
That said, law enforcement agencies are finding it hard to do forensics on SSDs because the drive automatically wipes a significant percentage of deleted data without any intervention by the user. This may seem to contradict the UCSD findings directly, but those concerned the purposeful sanitization of an entire drive and the erasure of individual files. So while it’s difficult to deliberately wipe everything, it’s also hard to prevent some amount of deleted data from being wiped automatically.
The Ars Technica article (link #3 below) briefly discusses the paper in link #1, then goes on to mention other erasure techniques that are in the pipeline. For now, however, it suggests encrypting the drive as a good way to keep private data secure.
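The logic behind that suggestion: if only ciphertext ever reaches the flash, then destroying the key renders unreadable whatever stale copies the firmware left behind. The article means whole-drive encryption, but the idea is easy to illustrate at the file level. A minimal sketch using the third-party cryptography package’s Fernet recipe (the file name and data are invented for illustration):

```python
# Sketch: encrypt-before-write, so leftover flash copies are only ciphertext.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()           # keep this key off the SSD it protects
cipher = Fernet(key)

plaintext = b"tax records, passwords, etc."
with open("private.bin", "wb") as f:  # only ciphertext ever hits the drive
    f.write(cipher.encrypt(plaintext))

# To "erase": destroy the key. Any stale pages the SSD kept are useless
# without it, regardless of what the firmware did behind the scenes.
```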
Link #1: http://www.usenix.org/… (via Slashdot)
Link #2: http://news.techworld.com/… (via Slashdot)
Link #3: http://arstechnica.com/…