Copying / syncing files over a local network with rsync
Just a short article to document for myself how to copy a large directory (e.g. a user folder) over a local network. While (s)cp might work for smaller operations, rsync is my preferred tool: you can restart it when it breaks, and if you spot an optimization along the way you can simply abort and restart. Some things to take into account before I share the command:
- Do not mount a drive; just use ssh
- If you’re sharing from macOS, make sure file sharing has access to the entire hard drive, otherwise some important folders will sync empty (e.g. Documents(!)); a quick check for this follows after this list
- Make sure you exclude files you don’t need (a home folder typically contains many cache files that you don’t want to sync to a new machine)
- Do not enable compression (it wastes CPU cycles when your network is fast enough)
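As a quick sanity check for the first two points (using the same placeholder user, address and home folder as the command below), ssh in and list a protected folder such as Documents; if it comes back empty or with a permission error, fix the sharing permissions before you start:

ssh murb@someaddress 'ls /Users/username/Documents | head'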
So here is the command:
rsync -aWP --inplace --exclude-from=exclude-file.txt murb@someaddress:/Users/username/ .
Breakdown:
- -a is the archival option, and it is typically what you want, preserving the most important bits
- -W does a dumber, whole-file comparison instead of rsync’s delta-transfer algorithm, which makes restarting the sync faster on a fast local network (although it might result in a few duplicate bytes being synced) (found at SuperUser.com)
- -P gives you a progress indicator, which also helps you identify time-consuming wasted syncs that should be in your exclude-file.txt (I just abort, and continue)
- --inplace is a minor optimization that updates files in place instead of writing to temporary files first.
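For readability, here is the same command again with the long-form flags spelled out (-a is --archive, -W is --whole-file, and -P is shorthand for --partial --progress):

rsync --archive --whole-file --partial --progress --inplace --exclude-from=exclude-file.txt murb@someaddress:/Users/username/ .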
My exclude file looked like this:
*/node_modules/*
*/.rbenv*/*
.rbenv/*
Library/Caches/*
Library/Developer/*
Library/Containers/*
Library/Containers/com.docker.docker/Data/vms/0/data/Docker.raw
*/.git/objects/*
Note that some of the excluded files might be useful if you’re offline for a longer period of time on your new machine, but typically you can rebuild the contents directly from the sources.
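If you want to verify what the exclude patterns match before kicking off the real transfer, a dry run (the -n / --dry-run flag) makes rsync list what it would copy without actually transferring anything; same command as above, just with -n added:

rsync -aWPn --inplace --exclude-from=exclude-file.txt murb@someaddress:/Users/username/ .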