[Discuss] cloud storage
Bill Bogstad
bogstad at pobox.com
Sun Jul 15 09:37:42 EDT 2012
On Sun, Jul 15, 2012 at 8:18 AM, Eric Chadbourne
<eric.chadbourne at gmail.com> wrote:
>....
> If you put everything into a directory, tarred and then say gpg
> --symmetric, and then rsync'ed that would be a mistake I think because
> it would have to copy the whole huge file.
You are probably right, but I can't tell if you think this is because
the file is huge or something else. Rsync does chunk-at-a-time
checksumming on both ends of a transfer before sending anything for a
modified file. I routinely rsync huge virtual machine images
(essentially large disk images) and it works pretty well. OTOH, I
suspect the steps you suggest (a brand-new tar and then gpg) will
produce many changes in the resulting large file. In particular, if
you compress two almost identical files, you can get compressed files
that are very different. Encryption programs often compress files
before encrypting, and gpg (by default) appears to do so. So rsync
doesn't have any trouble working on large tar files (or any other
binary file), but using gpg is likely to magnify small changes in the
tar file so that the final encrypted versions are very different.
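That compression effect is easy to demonstrate. Here's a small Python
sketch (illustrative only; zlib stands in for gpg's built-in
compression) showing that changing a single byte of the input can
change most of the compressed output:

```python
import zlib

# Two inputs that differ in exactly one byte.
original = b"The quick brown fox jumps over the lazy dog. " * 2000
modified = bytearray(original)
modified[100] = ord("X")          # flip a single byte near the start
modified = bytes(modified)

z1 = zlib.compress(original)
z2 = zlib.compress(modified)

def similarity(x, y):
    """Fraction of positions holding the same byte in both strings."""
    n = min(len(x), len(y))
    return sum(x[i] == y[i] for i in range(n)) / n

print("raw similarity:        %.4f" % similarity(original, modified))
print("compressed similarity: %.4f" % similarity(z1, z2))
```

The raw inputs are more than 99.9% identical byte-for-byte, while the
compressed outputs share almost nothing past the header, which is
exactly what defeats rsync's block matching.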
Wikipedia has a short writeup on the rsync algorithm:
http://en.wikipedia.org/wiki/Rsync#Algorithm
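For the curious, the heart of the algorithm is a "rolling" weak
checksum that can slide a window forward one byte at a time with
constant work per step. A rough Python sketch of that idea (following
the description on that page, not rsync's actual code):

```python
M = 1 << 16  # checksums are taken modulo 2^16

def weak_checksum(block):
    """Weak checksum of a block, returned as an (a, b) pair."""
    a = sum(block) % M
    b = 0
    n = len(block)
    for i, x in enumerate(block):
        b = (b + (n - i) * x) % M
    return a, b

def roll(a, b, out_byte, in_byte, n):
    """Slide an n-byte window right by one: drop out_byte, add in_byte."""
    a = (a - out_byte + in_byte) % M
    b = (b - n * out_byte + a) % M
    return a, b

data = b"rsync rolling checksum demo data"
n = 8
a, b = weak_checksum(data[0:n])
for i in range(1, len(data) - n + 1):
    a, b = roll(a, b, data[i - 1], data[i + n - 1], n)
    assert (a, b) == weak_checksum(data[i:i + n])  # rolled == recomputed
```

This is why rsync can cheaply test every byte offset on one side for a
matching block; a stronger hash is only computed when the cheap weak
checksum matches.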
Now the original poster appears to want to encrypt his files
individually rather than en masse. If he is modifying these
individual encrypted files, then there is a good chance that most (if
not all) of each encrypted file will change for minor source file
changes. The result is that rsync will end up sending the whole thing
each time. This would still be better than the gpg'ed tar file, though.
It would also probably not be a good idea to use the "-z" option with
rsync, as the files will already have been compressed by gpg.
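That one is also easy to check: gpg output looks like random data, and
compressing random-looking data buys you nothing. A quick Python
illustration (zlib standing in for the compression that rsync -z would
apply to each file):

```python
import os
import zlib

# High-entropy bytes stand in for a gpg-encrypted file.
encrypted_like = os.urandom(200_000)

recompressed = zlib.compress(encrypted_like)

# No savings: the "compressed" copy is at least as large as the input,
# so -z would just burn CPU on both ends.
print(len(encrypted_like), len(recompressed))
```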
Bill Bogstad