Backing Stuff up with dd

dd is a very handy tool, and there are some more practical things we can do with it. For example, if we want to dump a 3TB drive to preserve it, and only 200GB of the 3TB is actually in use, we can save a lot of space with gzip.

How to use dd to back up a raw hard drive and gzip it at once
Change /dev/sda to the drive you want to back up.
Change /mnt/extras........
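In other words, something like this (the output filename sda-backup.img.gz is just a placeholder, and the bs/conv options are my usual choices, not necessarily what the original command used):

dd if=/dev/sda bs=64K conv=noerror,sync | gzip -c > /mnt/extras/sda-backup.img.gz

To restore later, reverse the pipeline:

gunzip -c /mnt/extras/sda-backup.img.gz | dd of=/dev/sda bs=64K

The unused space only compresses well if it is mostly zeros, so zeroing free space on the source filesystem beforehand helps.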
The folder I was trying to archive is about 72GB, but much like rsync, it chokes at about 17GB because of the file size. What's with so many common and essential Linux tools having such limitations? I suppose the authors never wrote their code expecting files to be this large, but it's still very annoying. It's important to stay on top of these limitations on production servers, because I didn't realize what had happened until I checked the file with "........
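If you're stuck with a tool that has a limit like this, one common workaround is to keep every individual file under the limit by splitting the stream into chunks. A rough sketch (the chunk size, source path, and filename prefix are just examples):

tar czf - /path/to/folder | split -b 2G - backup.tar.gz.part-

And to restore, concatenate the chunks back into tar:

cat backup.tar.gz.part-* | tar xzf -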
Even though I normally use proftpd, I decided to see what else I could find, and used yum to help me decide.
yum search ftp
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* rpmforge: ftp-stud.fht-esslingen.de
* base: mirrors.netdna.com
* updates: updates.interworx.info
* addons: yum.singlehop.com
* extras: mirrors.netdna.com
rpmforge........
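Whichever server you end up picking from the results, installing it is the usual one-liner (vsftpd here is just an example choice, not necessarily what I went with):

yum install vsftpd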