Dear Colleagues,
Maybe this has been asked already, but I can't find a good description of the behaviour.
My situation is as follows: I have to back up a considerable amount of data (300 GB) over an unreliable internet connection, with occasional server downtimes. The most important question is whether the entire 300 GB has to be transferred again after an interruption; if so, the backup will never complete…
- Is it possible for the administrator to interrupt and resume a backup?
- What happens if the client goes down?
- What happens if the server goes down?
- Is it possible to copy the data over a reliable connection to a hard disk, transport the hard disk to a computer with a fast connection, back the data up from there, and then link this backup to the original client?
Thanks in advance,
Thomas
Hi Thomas,
You need to copy all the data to a portable hard drive.
Take the drive to the location of the backup server and connect it to a computer that is hard-wired into the same network.
Now add the portable drive to the backup paths and perform a backup of the ‘foster’ computer.
Once that completes, perform a full backup of the remote system with the original data. Since the data is already on the server, it won't need to be transferred again.
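The copy step above can be sketched roughly as below. This is just an illustration using temporary directories as stand-ins for the real data set and the mounted portable drive (the real paths are site-specific, and the exact copy tool is your choice):

```shell
# Stand-ins for the real locations -- replace with your actual paths.
src=$(mktemp -d)                 # stands in for the 300 GB data set
drive=$(mktemp -d)               # stands in for the portable drive mount
echo "sample" > "$src/file.txt"

# cp -a preserves permissions and timestamps, so the seeded copy matches
# the originals byte-for-byte and deduplicates cleanly later.
cp -a "$src/." "$drive/"         # rsync -aH works too and is resumable
ls "$drive"                      # -> file.txt
```

The key point is preserving file contents exactly; any archive-mode copy (`cp -a`, `rsync -a`, `robocopy /MIR` on Windows) should do.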
I do this all the time for the initial full backup of remote clients with lousy internet connections. It works well.
To answer your first three questions: 1. Yes, from the server's admin activities page. 2 & 3. The backup will appear to start again from scratch, but thanks to deduplication it will only transfer what's left.
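To see why a restart only transfers what's left, here is a toy illustration of hash-based deduplication, not UrBackup's actual protocol: the server stores files by content hash, so anything it already received before the interruption is skipped on the next run.

```shell
store=$(mktemp -d)    # stands in for the server's deduplicated store
src=$(mktemp -d)      # stands in for the client's data
printf 'A' > "$src/a"; printf 'B' > "$src/b"

# "Transfer" a file only if its content hash is not on the server yet.
transfer() {
  h=$(sha256sum "$1" | cut -d' ' -f1)
  [ -e "$store/$h" ] || cp "$1" "$store/$h"
}

transfer "$src/a"                       # first run sends a, then "crashes"
transfer "$src/a"; transfer "$src/b"    # restart: a is skipped, only b is sent
ls "$store" | wc -l                     # -> 2
```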
It’s a truly brilliant project. High five to Martin and the gang!
Glenn
Very cool, thanks a lot. This saves a LOT of time. I have evaluated several backup solutions (Bacula, Bareos, etc.), but with the automatic detection of clients on the same network, btrfs support, and so on, UrBackup seems to be the system of choice for my environment. It is really good work.