Archive for the ‘Encryption’ Category

Using “dd” and “/dev/urandom” to write random data to a disk

Sunday, April 6th, 2008

dd is an essential Linux tool, kind of a Swiss Army knife. That said, it took me a couple of years on Linux before I even used it once. Whilst there are plenty of uses for dd, today we are using it to fill a hard disk with random data.

There are two possible reasons for doing this: wiping confidential data from a hard disk prior to disposal or increasing the level of protection given by using encrypted disks.


Wiping confidential data

Filling a hard disk with random data will prevent all but the most determined people (those with specialized tools) from recovering the data. That said, this job is probably better done with a tool built specifically for it, such as shred.
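As a rough sketch of what a shred-based wipe looks like, run here against a scratch file so it is safe to execute (on a real disk you would point it at the block device instead, e.g. shred -v -n 1 /dev/sda):

```shell
# Safe demo of shred: overwrite a scratch file with one pass of
# pseudo-random data (a real wipe would target the disk device itself).
f=$(mktemp)
printf 'confidential data\n' > "$f"
shred -n 1 "$f"              # one overwrite pass; the file itself remains
od -An -tx1 "$f" | head -n 1 # the old plain text is gone, replaced by random bytes
```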

Raising the effectiveness of encryption

Filling a hard disk with random data prior to storing encrypted data on it raises the level of security significantly, because it hides where the encrypted data actually lives on the disk. I’m not a cryptographer and don’t fully understand the process, but I believe it’s something like this.

If a brand-new hard disk, or a disk that previously held unencrypted data, is used in an encrypted setup, parts of the disk will contain recognizable data (or completely blank regions). By analyzing the disk, an attacker could find the edge between encrypted and non-encrypted data, and then use that information to aid an attack against the encryption.

By filling the disk with random data first, it becomes nearly impossible to spot the edge between the encrypted data (which looks random) and the unused parts of the disk.
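The difference is easy to see for yourself: compare a few bytes read from /dev/zero (a stand-in for blank disk space) with a few read from /dev/urandom.

```shell
# An all-zero (blank) region is instantly recognizable; random data is not.
zeros=$(head -c 16 /dev/zero | od -An -tx1 | tr -d ' \n')
random=$(head -c 16 /dev/urandom | od -An -tx1 | tr -d ' \n')
echo "blank bytes:  $zeros"
echo "random bytes: $random"
```

The first line is all zeros every time; the second is different on every run, and indistinguishable from encrypted data.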

Generating Random Data

On a Linux machine we have two main sources of random data: /dev/random and /dev/urandom. /dev/random provides very high quality random output, generated using environmental noise collected from device drivers and sources such as sound cards or mouse movement. You can see this by executing:

cat /dev/random

and then moving your mouse around. When you first execute the above command there may be a fair amount of random data stored up, but once it is exhausted, new data only trickles through very slowly. This means that filling a hard disk from this source could take years. Enter /dev/urandom.
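On Linux you can peek at how much entropy the kernel thinks it currently has available for /dev/random (a read-only check; the exact numbers vary by kernel version):

```shell
# Current estimate, in bits, of the entropy available to /dev/random.
entropy=$(cat /proc/sys/kernel/random/entropy_avail)
echo "entropy available: $entropy bits"
```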

/dev/urandom (“unlimited” random) is, as the name suggests, an effectively unlimited source of random data. The quality of the randomness is not as high as that of /dev/random, as it combines input from the kernel’s entropy pool with a pseudo-random generation algorithm. That said, it is more than random enough for our purpose.

Putting it together

So filling up our disk (/dev/sda in this example) with random data involves just one command (to be run as root). The time command at the start is optional, but will report the elapsed time when the command completes.

time dd if=/dev/urandom of=/dev/sda
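One optional tweak: dd defaults to a 512-byte block size, and a larger bs= (bs=1M is my suggestion here, not part of the command above) usually speeds the copy up considerably. Sketched below against a scratch file so it is safe to run:

```shell
# Same idea as the real command, but writing only 4 MB to a temp file.
# Against a disk it would be: time dd if=/dev/urandom of=/dev/sda bs=1M
out=$(mktemp)
dd if=/dev/urandom of="$out" bs=1M count=4 2>/dev/null
size=$(stat -c %s "$out")
echo "wrote $size bytes"
rm -f "$out"
```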

Now, depending on your disk size and processor speed, this could take anywhere from a couple of hours to a couple of days. Because we are using a pseudo-random generation algorithm, a side effect is that your CPU will be running at close to 100%. I recently used this command to fill up a 400GB disk on a PC with an Athlon-64 3000 processor. The output is shown below.

time dd if=/dev/urandom of=/dev/sdb
dd: writing to `/dev/sdb': No space left on device
781422769+0 records in
781422768+0 records out
400088457216 bytes (400 GB) copied, 141572 seconds, 2.8 MB/s
real 2359m32.492s
user 4m8.912s
sys 1781m26.368s

A quick bit of maths tells us that it took a bit under 40 hours to complete.
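The arithmetic can be checked with a one-liner:

```shell
# 141572 seconds from the dd output above, converted to hours.
awk 'BEGIN { printf "%.1f hours\n", 141572 / 3600 }'
```

which prints 39.3 hours, i.e. a bit under 40.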

Checking on progress

The dd command doesn’t give us any useful feedback whilst it’s running, but we can force it to cough up some useful data by prodding it with a USR1 signal. Assuming you have only one instance of dd running, we can use:

kill -USR1 `pidof dd`

(Note, they are backticks around the pidof dd command.)
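If you want to see the effect without touching a real disk, here is a harmless self-contained demo (the temp file and transfer size are arbitrary choices of mine): it starts a long-running dd copying zeros to nowhere, prods it with USR1, then stops it.

```shell
# Start a harmless long-running dd, ask it for progress, then stop it.
log=$(mktemp)
dd if=/dev/zero of=/dev/null bs=1M count=10000000 2>"$log" &
ddpid=$!
sleep 1
kill -USR1 "$ddpid"   # dd reports progress to stderr (our log file)
sleep 1
kill "$ddpid" 2>/dev/null
cat "$log"            # shows "N+0 records in", "N+0 records out", etc.
```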

So sit back, make yourself a cup of coffee (or 40) and let your system struggle under the responsibilities of generating a continuous stream of pseudo-random data.

(In the near future I will be placing a full guide about using encrypted hard disks in the guides section of