I back all my media up over the internet. This has worked quite well overall. One issue I run into, though, is when MC rewrites tags for an entire media category. Such as when I filled the [Album] tag for all photos to match the parent folder name. Suddenly, I have to upload 85GB of photos. Or how MC must have rewritten all my FLAC tags on April 6th. Again, I suddenly have a ton of data to upload. This amounts to 528GB.
I have full access to the remote server (both machines are Windows). Is there an application that will intelligently synchronize only the changed portions of files? Clearly, less than 1% of the data in these files has actually changed. If a server/client software product could run checksums on halves, then quarters, eighths, sixteenths, etc., of the file, it could narrow down the portion that actually needs copying significantly. It probably needs to run on the server itself to get full-speed reading of the files. Performing this from one system alone wouldn't really work, because it would have to download the entire file to scan it, defeating the purpose.
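For what it's worth, the checksum-subdivision idea above is essentially what delta-transfer tools like rsync/rdiff do, just with fixed-size blocks rather than recursive halving: each side checksums its blocks, and only blocks whose checksums differ get sent. A minimal sketch of the comparison step (the block size, function names, and SHA-256 choice here are my own illustrative assumptions, not any particular product's design):

```python
import hashlib

BLOCK_SIZE = 1 << 20  # 1 MiB per block; an arbitrary choice for this sketch


def block_checksums(data: bytes, block_size: int = BLOCK_SIZE) -> list[str]:
    """Checksum each fixed-size block of a file's contents."""
    return [
        hashlib.sha256(data[i:i + block_size]).hexdigest()
        for i in range(0, len(data), block_size)
    ]


def changed_blocks(local: bytes, remote: bytes,
                   block_size: int = BLOCK_SIZE) -> list[int]:
    """Indices of blocks whose checksums differ; only these need uploading."""
    local_sums = block_checksums(local, block_size)
    remote_sums = block_checksums(remote, block_size)
    # A block that exists on only one side (file grew or shrank) counts as changed.
    n = max(len(local_sums), len(remote_sums))
    return [
        i for i in range(n)
        if i >= len(local_sums) or i >= len(remote_sums)
        or local_sums[i] != remote_sums[i]
    ]
```

Since a tag rewrite typically touches only the metadata header near the start of a FLAC or image file, a scheme like this would flag only the first block or two as changed, so a 50MB file might need only 1-2MB re-uploaded.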
I don't like being inhibited from making major library changes. If I need to make massive adjustments to tags, I want to be able to do so without running into this hassle. This update will take approximately two months of uploading, not to mention stealing my upload bandwidth the whole time!
Does SVN handle this on its own? I've thought about putting the directories under TortoiseSVN. Then I'd see when files changed and could commit the changes, followed by updating on the other system. Although I think it still transfers the whole file each time; deltas are merely used for storage on the server, right?
It's a tough issue, but backups are certainly essential, even with good server redundancy.