My main machine (18TB worth, Windows) is currently doing its 2nd full file backup, and 6TB in I'm running into a bunch of similar files that all have an 8-byte patch applied to them. Could this actually be caused by metadata (e.g. file access times) rather than actual file content differences?
Yes. If the patch is empty (no actual content changed), it might be 8 bytes in size.
Unfortunately I had to give up, because those empty patches still caused UrBackup to copy the entire file before "patching" it. The server (set to retain only one full file backup) was going to run out of space before finishing and deleting the previous full file backup… not to mention that the whole process was taking many times longer than it needed to.
I have no idea why the first 6TB worth of files went through normally while pretty much every single file after that was considered modified. It could be because the initial backup was interrupted and resumed, but I don't remember whether the interruption happened around that point.