Junk Food for the Brain

Open Source and Awesomesauce :)

Generate Files With Random Content and Size in Bash


Occasionally you need to generate a bunch of files with random content and random sizes, usually for testing compression, user quotas or other miscellaneous stuff.

Here’s one way, using the bash shell and a few handy linux utilities.

  1. The bash $RANDOM variable. It expands to a random integer between 0 and 32767.
  2. The dd utility, to write out the files.
  3. /dev/(h|s)da, your hard drive under Linux.
  4. All wrapped in a bash while loop.
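You can get a feel for $RANDOM straight from the prompt. (A quick illustration; the modulo trick at the end is just a common idiom for narrowing the range, it isn't used later in this post.)

```shell
# $RANDOM expands to a fresh pseudo-random integer between 0 and 32767
# every time it is referenced.
echo $RANDOM
echo $RANDOM
# Narrow the range with modulo arithmetic if needed, e.g. 0-99:
echo $(( RANDOM % 100 ))
```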

So let's start. First, define a bash variable with the number of files we wish to create, let's say 10.

no_of_files=10

Then we'll assign a bash variable for the counter.

counter=1

As for the dd command, this creates a 1 KB file filled with raw content read from your hard disk. (Mine's /dev/sda.) The count switch tells dd to copy one block of bs=1024 bytes exactly once, hence the 1 KB file size. skip makes dd skip that many blocks (not bytes) before it starts reading. Since this requires raw access to your hard drive, you'll unfortunately have to run this as root. :(

dd bs=1024 count=1 skip=0 if=/dev/sda of=random-file
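The size arithmetic is easy to verify without touching the raw disk. Here /dev/zero stands in for /dev/sda (my substitution, purely to illustrate that bs × count determines the output size; stat -c is the GNU coreutils form):

```shell
# Write exactly one 1024-byte block, then check the resulting size.
dd bs=1024 count=1 if=/dev/zero of=size-test 2>/dev/null
stat -c %s size-test   # prints the file size in bytes: 1024
```

Delete size-test afterwards; it only exists to confirm the arithmetic.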

So now imagine: if we let bash assign random numbers to count and skip, we get files with random sizes and random contents. (Light bulbs flashing, eh?)

Of course, all of it will be written to a single file, called random-file in my case. To add a slight amount of variety, we can append the counter variable from our while loop as an extension. The dd command now becomes:-

 dd bs=1024 count=$RANDOM skip=$RANDOM if=/dev/sda of=random-file.$counter

Finally, we will wrap it up in a bash while loop like this:-

while [[ $counter -le $no_of_files ]]; 
 do echo Creating file no $counter;
  dd bs=1024 count=$RANDOM skip=$RANDOM if=/dev/sda of=random-file.$counter;
  let "counter += 1";
 done
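If you'd rather not run as root, the same loop works with /dev/urandom as the input device. (This is my substitution, not the method above; skip= is dropped because the urandom stream is already random, so seeking into it gains nothing.)

```shell
#!/bin/bash
# Non-root variant: read from /dev/urandom instead of the raw disk.
no_of_files=10      # how many files to create
counter=1           # loop counter, also used as the file extension

while [[ $counter -le $no_of_files ]]; do
  echo "Creating file no $counter"
  # count=$RANDOM gives each file a random size of 0-32767 KB.
  dd bs=1024 count=$RANDOM if=/dev/urandom of=random-file.$counter 2>/dev/null
  let "counter += 1"
done
```

The files land in the current directory, named random-file.1 through random-file.10, just like in the root version.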

When you run it, you will get output like this:-

Creating file no 1
16614+0 records in
16614+0 records out
17012736 bytes (17 MB) copied, 0.29308 s, 58.0 MB/s
Creating file no 2
14456+0 records in
14456+0 records out
14802944 bytes (15 MB) copied, 0.100101 s, 148 MB/s
[...]
Creating file no 10
25224+0 records in
25224+0 records out
25829376 bytes (26 MB) copied, 0.492113 s, 52.5 MB/s

When you do a directory listing, you'll see this:-

[root@atreides rd-test]# ls -lh
total 226M
-rw-r--r-- 1 root root 17M 2009-07-29 00:25 random-file.1
-rw-r--r-- 1 root root 25M 2009-07-29 00:25 random-file.10
-rw-r--r-- 1 root root 15M 2009-07-29 00:25 random-file.2
-rw-r--r-- 1 root root 20M 2009-07-29 00:25 random-file.3
-rw-r--r-- 1 root root 21M 2009-07-29 00:25 random-file.4
-rw-r--r-- 1 root root 30M 2009-07-29 00:25 random-file.5
-rw-r--r-- 1 root root 22M 2009-07-29 00:25 random-file.6
-rw-r--r-- 1 root root 27M 2009-07-29 00:25 random-file.7
-rw-r--r-- 1 root root 25M 2009-07-29 00:25 random-file.8
-rw-r--r-- 1 root root 29M 2009-07-29 00:25 random-file.9

For more info, refer to the $RANDOM variable section of the Advanced Bash-Scripting Guide.