Incremental image too big with CBT

I had this suspicion in the past and was told to purchase CBT, which I did, and it confirmed my suspicion.

A few hours after completing an image backup, running another image backup results in a few gigabytes of changed data. I assume these are system files (swap, etc), because nothing else really changed on that disk.

The client is running over a slow internet connection. Even a few gigabytes take hours to transfer and slow down the client’s overall internet connection to the point of frustration.

Is there a way to see exactly which files have changed?
Or do you have another idea how to solve this?

It would have to iterate over all the files in the volume… I’ll have a look at that when I have some time.

@uroni Still waiting for you to get that time :wink:

Create a file named log_image_changed_files in C:\Program Files\UrBackup and it’ll log the changes (as info).
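If you prefer to script that step, here is a minimal sketch of creating the marker file (it assumes the default install path; adjust it if UrBackup is installed elsewhere, and run it with administrator rights since Program Files is write-protected):

```python
from pathlib import Path

# Default UrBackup client install directory on Windows (adjust if installed elsewhere).
urbackup_dir = Path(r"C:\Program Files\UrBackup")

# An empty file with this exact name is enough; only its presence is checked.
marker = urbackup_dir / "log_image_changed_files"
marker.touch(exist_ok=True)
```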


Where should it log them, in debug.log?
Because I don’t see it there…

Using version 2.3.4-cbt beta.

CBT promises much and delivers almost nothing compared to non-CBT… in fact the ONLY point in buying it is to support development of the free functionality.

Did you enable info level logging?

It does what it says on the web page. It does not change the amount of data being backed up:

Change block tracking for UrBackup Client on Windows speeds up image and file backups performed by UrBackup by tracking which blocks change between backups. Without change block tracking all data has to be read and inspected in order to find and transfer the differences during an incremental image or file backup. This can take hours compared to the same taking minutes with change block information. This enables backup strategies with hourly (or less) incremental image backups and speeds up incremental file backups with large files (e.g. virtual disks).

That is indeed what it says; however, I have 2 clients at my sister’s house, one using CBT and the other not. If there was any significant difference in backup speed I’d have bought a second CBT licence, they’re cheap enough…

Yes, I have some with the same issues (DSL clients with slow upload speeds).

They have their own group with speed and hour restrictions on backups.

For me the difference was noticeable. Not only do the backups finish faster, but the system is also not slowed down during backups by the disk being fully read.

Also, I’m happy to support this amazing project. I was looking for such a solution for years.

As suspected, I see many files marked as “changed” even though they most certainly weren’t. Just to be sure, I’ll store their md5 hash and compare the next time they’re marked as changed, and post back the results.
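For reference, this is roughly the kind of check I mean (the file list and snapshot path are just placeholders, nothing UrBackup-specific):

```python
import hashlib
import json
from pathlib import Path

SNAPSHOT = Path("md5_snapshot.json")  # where the previous run's hashes are stored

def md5_of(path: Path) -> str:
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def compare(files):
    old = json.loads(SNAPSHOT.read_text()) if SNAPSHOT.exists() else {}
    new = {str(p): md5_of(Path(p)) for p in files}
    for name, digest in new.items():
        if name in old and old[name] != digest:
            print(f"actually changed: {name}")
        elif name in old:
            print(f"unchanged despite being logged as changed: {name}")
    SNAPSHOT.write_text(json.dumps(new, indent=2))

# Files marked as "changed" in the UrBackup log; these paths are examples only.
compare([r"C:\data\photo1.jpg", r"C:\data\installer.exe"])
```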

Could be. The problem is that the image backup is built for local networks (or high bandwidth, and it is also simpler) and always transfers/compares 512KB blocks. And Windows can change a lot of 512KB blocks in a short amount of time, especially if it does idle defragmentation or whatever it does (which you cannot turn off anymore in Windows 10). If it’s small files, or Windows puts blocks of different files into one 512KB block, changes in one file mark others as changed as well. The CBT also marks blocks as changed that have been overwritten with the same content (it does not check whether the content differs). But that doesn’t actually get transferred in the end, as hashes are still compared.

Nevertheless, it should still only transfer changed blocks, obviously.
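To illustrate the point: roughly, only the blocks CBT flagged are read and hashed, and of those, only blocks whose hash differs from the previous backup get transferred. This is a simplified sketch, not the actual implementation:

```python
import hashlib

BLOCK_SIZE = 512 * 1024  # illustrative 512KB block size, matching the explanation above

def block_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def blocks_to_transfer(current_image: bytes, previous_hashes: dict[int, str],
                       cbt_marked: set[int]) -> list[int]:
    """Only blocks flagged by CBT are read and hashed; of those, only blocks whose
    hash differs from the previous backup are transferred. A block rewritten with
    identical content is flagged by CBT but skipped here."""
    changed = []
    for idx in cbt_marked:
        block = current_image[idx * BLOCK_SIZE:(idx + 1) * BLOCK_SIZE]
        if block_hash(block) != previous_hashes.get(idx):
            changed.append(idx)
    return changed
```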

Thanks for the explanation, that makes it clearer now. I also verified, by comparing their md5 signatures, that the files marked as “changed” in the log were not actually changed (some of them are JPEG images, some are installation files).

For the record, I’m running on Windows 8, and have disabled background defragmentation. But reading your explanation made me think that maybe running a manual defrag will actually help, because it will potentially store unchanged files close to each other, and reduce the number of changed blocks.

Does it even work like that anymore with SSD drives?

Just wanted to chime in that the CBT driver speeds things up for me, and regardless is a good way to support the project. Being a one-time-purchase, it’s a total bargain as well.

We do image backups over the internet… any way to make the block sizes different or tune for internet backups?