Hello! I'm trying to prepopulate a backup with 3 TB of data for an internet client over the local network. The problem is that file hashing doesn't seem to be working properly: if I have a folder with 1 TB of data split across several subfolders, and I add those subfolders one by one, then when I finally add the root folder the server copies every file again instead of just transferring the differences.
The same thing happens across different clients; it's as if the share_hashes setting isn't being honored. I'm worried that once I get the full copy done (3 TB) and switch to backing up over the internet, it will try to send all 3 TB again, which would take months to complete, if it doesn't fail outright.
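For reference, this is the kind of rough check I've been using to see whether a finished backup on the server was actually deduplicated or copied in full. It's a minimal sketch, assuming the server stores file backups as hard links into a pool (the default on non-btrfs storage), so a link count above 1 should mean the file was reused rather than transferred again; the path at the bottom is just a placeholder:

```python
#!/usr/bin/env python3
"""Rough estimate of how much of one backup folder was deduplicated.

Assumption: the server stores file backups as hard links into a pool,
so st_nlink > 1 means the file is shared with another backup (deduped)
and st_nlink == 1 means it is a unique copy (transferred in full).
"""
import os
import sys

def scan(root):
    linked = linked_bytes = copied = copied_bytes = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            st = os.lstat(os.path.join(dirpath, name))
            if st.st_nlink > 1:   # shared with another backup/file
                linked += 1
                linked_bytes += st.st_size
            else:                 # unique copy: was sent over the wire
                copied += 1
                copied_bytes += st.st_size
    return linked, linked_bytes, copied, copied_bytes

if __name__ == "__main__":
    # Placeholder path -- point this at a single finished backup folder.
    root = sys.argv[1] if len(sys.argv) > 1 else "/media/backups/client1/250101-0000"
    linked, lb, copied, cb = scan(root)
    print(f"hard-linked (deduplicated): {linked} files, {lb / 2**30:.1f} GiB")
    print(f"single-link (full copies):  {copied} files, {cb / 2**30:.1f} GiB")
```

When I run something like this against the backups made after re-adding the root folder, almost everything shows up as a single-link full copy, which matches the re-copying I'm seeing.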
It's baffling me, since I've been trying to get this copy done for two weeks now and it keeps failing.
Any tips?