Just a short article to document for myself how to copy a large directory (e.g. a user folder) over a local network. While (s)cp might work for smaller jobs, rsync is my preferred tool: you can restart it when it breaks, and if you spot an optimization along the way you can simply abort and rerun it. There are a few things to take into account, which I'll go through after the command itself.
So here is the command:
rsync -aWP --inplace --exclude-from=exclude-file.txt murb@someaddress:/Users/username/ .
Breakdown:
-a
is the archival option, and it is typically what you want, preserving the most important bits.
-W
does a dumber, whole-file comparison when restarting the sync, which is faster (although it might result in a few duplicate bytes being synced) (found at SuperUser.com).
-P
gives you a progress indicator, which also helps you identify time-consuming wasted syncs that should be in your exclude-file.txt (I just abort, and continue; see the dry-run sketch after the exclude list below).
--inplace
a minor optimization, writing directly to the destination files instead of to temporary files.
My exclude file looked like this:
*/node_modules/*
*/.rbenv*/*
.rbenv/*
Library/Caches/*
Library/Developer/*
Library/Containers/*
Library/Containers/com.docker.docker/Data/vms/0/data/Docker.raw
*/.git/objects/*
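A quick way to check whether the exclude patterns do what you expect is rsync's --dry-run (-n) mode combined with --itemize-changes, which lists what would be transferred without actually copying anything. A minimal sketch, using the same placeholder host and paths as above:
rsync -aWn --itemize-changes --exclude-from=exclude-file.txt murb@someaddress:/Users/username/ . | less
Anything that shows up here but shouldn't be copied is a candidate for another line in exclude-file.txt.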
Note that some of the excluded files might be useful if you'll be offline for a longer period on your new machine, but typically you can rebuild their contents directly from the sources.
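As a sketch of what that rebuilding could look like once you're back online (the project names and the Ruby version are made up for illustration; the exact commands depend on what you excluded):
cd ~/projects/some-node-app && npm install    # regenerate an excluded node_modules directory
cd ~/projects/some-rails-app && bundle install    # reinstall the gems that lived under the excluded .rbenv paths
rbenv install 3.2.2    # reinstall an excluded Ruby version (assumes the ruby-build plugin is present)
For repositories whose .git/objects were excluded, re-cloning them from their remotes is probably the safest route.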
Enjoyed this? Follow me on Mastodon or add the RSS, er, ATOM feed to your feed reader.
This article from murblog by Maarten Brouwers (murb) is licensed under a Creative Commons Naamsvermelding (Attribution) 3.0 Nederland licence.