First, I need to credit @Eran, because he's the one who got me started on this. I'm not at all an expert, but I do have it working, at least for the last couple of hours, so here goes. I'll give you a little background on the idea.
I have had a local backup server working for a few years. It's currently running Ubuntu 16.04 LTS and UrBackup 2.1.19 on a Dell R530 with 8x 6TB drives in a RAID 5, around 40TB usable storage.
I have an internet server running Ubuntu 16.04 (EC2) with an S3 bucket mounted at /mnt/s3bucket. (Uroni will just tell me that S3 is not a good use for UrBackup, but that's irrelevant to the goals of this tutorial.)
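For anyone wanting to reproduce the S3 mount: the post doesn't say how the bucket is mounted, but a common approach is s3fs-fuse. A minimal sketch, assuming s3fs-fuse is installed and using a placeholder bucket name and credentials (not from the original post):

```shell
# Hypothetical s3fs-fuse mount; "my-backup-bucket" is a placeholder.
# Credentials file format is ACCESS_KEY_ID:SECRET_ACCESS_KEY.
echo 'AKIAEXAMPLE:secretkey' > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the bucket at the path UrBackup will use as its storage dir.
sudo mkdir -p /mnt/s3bucket
sudo s3fs my-backup-bucket /mnt/s3bucket \
    -o passwd_file="$HOME/.passwd-s3fs" \
    -o allow_other \
    -o use_cache=/tmp/s3fs-cache
```

The `use_cache` option gives s3fs a local disk cache, which helps a little with the random-access patterns UrBackup generates, though (as the poster notes) S3 is still not an ideal backend.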
The first thing we do after getting the two servers running is copy the server_ident keys in /var/urbackup between the two servers so they are both identical. It doesn't matter if you go internet > local or local > internet, as long as they're the same files.
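Assuming default install paths, SSH access, and the identity file names used by UrBackup 2.x (verify the exact names in your own /var/urbackup), the copy might look roughly like this. The hostname "internet-server" is a placeholder:

```shell
# Copy the server identity files from the local server to the internet
# server so both servers present the same identity to clients.
# "internet-server" is a placeholder hostname; adjust user/paths as needed.
sudo scp /var/urbackup/server_ident.key \
         /var/urbackup/server_ident.priv \
         /var/urbackup/server_ident_ecdsa409k1.priv \
         root@internet-server:/var/urbackup/

# Sanity check: the checksums on both servers should match.
md5sum /var/urbackup/server_ident*
ssh root@internet-server 'md5sum /var/urbackup/server_ident*'
```

Restart the UrBackup service on whichever server received the files so it picks up the new identity.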
Make sure the clients can back up to the local server.
Turn on the internet pieces on the internet server.
Add a new client to the internet server using the same, identical, case-sensitive name it has on the local server. When you see the next screen, copy the internet authentication key and put it in the settings for this same client on the LOCAL server, so that when the client pulls its settings from the local server it gets this key automatically. Here is a shot of what one of my "dual" clients looks like on the local server.
The last thing I did was restart both UrBackup servers and then restart the service on the client so it immediately downloads its internet auth key.
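On systemd-based installs the restarts above can be done from the command line; the unit names below are the defaults for the UrBackup server and Linux client packages (check yours with `systemctl list-units | grep -i urbackup`):

```shell
# On each UrBackup server (Ubuntu 16.04 uses systemd):
sudo systemctl restart urbackupsrv

# On a Linux client:
sudo systemctl restart urbackupclientbackend

# On a Windows client, from an elevated prompt:
# net stop UrBackupClientBackend && net start UrBackupClientBackend
```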
I think I'm forgetting something, and I'm kinda busy at the moment, so just ask questions if something isn't clear. I like to help with this project as it has helped me a lot in the last few years.
Here is what your client's status window should look like when it's done.
Really appreciate you taking the time to write this up. I have a few internet clients that would like to back up locally as well, so this is just the ticket!
So I had been running for a while, but now I am unable to run backups to the internet server anymore. It looks like local backups are working, but internet ones are failing. If I manually run a full image backup via the internet web GUI, I get this:
03/07/17 11:45 INFO Starting unscheduled full image backup of volume "C:"...
03/07/17 11:45 DEBUG Backing up SYSVOL...
03/07/17 11:45 DEBUG Backing up SYSVOL done.
03/07/17 11:45 INFO Basing image backup on last incremental or full image backup
03/07/17 11:45 ERROR Error opening hashfile (/mnt/s3bucket/UrBackups/servername/170303-0124_Image_C/Image_C_170303-0124.vhdz.hash). No such file or directory (code: 2)
03/07/17 11:45 INFO Time taken for backing up client DC-DCDNS0: 5s
03/07/17 11:45 ERROR Backup failed
I can see the file that is apparently non-existent, so what's the deal?
I added another volume to a client and did a full image backup. The C drive still failed, but the new volume didn't have any problems and was backed up. However, subsequent full or incremental backups on that new volume now fail too.
Your problem can be solved with either a cleanup or a remove_unknown run, I think.
The server is trying to pull files and compare them to the database, and when there is a mismatch between the database and the files actually stored, backups fail.
Not a database cleanup; just a regular cleanup or remove_unknown should fix it.
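If this refers to the server's built-in maintenance commands, they are run against the stopped server, roughly as below. Commands and flags are from my reading of the UrBackup 2.x server; confirm them with `urbackupsrv --help` on your install before running:

```shell
# Stop the server first; the maintenance commands need exclusive
# access to the backup database.
sudo systemctl stop urbackupsrv

# Reconcile storage with the database: remove database entries for
# backups no longer on disk, and files on disk unknown to the database.
sudo urbackupsrv remove-unknown

# Or run a regular cleanup pass (here freeing up to 10% of storage):
sudo urbackupsrv cleanup --amount 10%

sudo systemctl start urbackupsrv
```

`remove-unknown` is the heavier-handed of the two; try the regular cleanup first if you only suspect a small mismatch.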
I have two UrBackup services running in two different locations (my house and my Mum's house!), both on Raspberry Pi 2 systems running stock Raspbian. The backup destinations in both cases are standard NAS units (a Synology RackServer at my house and a Netgear ReadyNAS v2 at Mum's house). There are clients at both locations.
I want my systems to back up primarily to my local UrBackup server, and use the one at Mum's house as an Internet server.
I want my Mum's systems to back up primarily to her local UrBackup server, and use the one at my house as an Internet server.
Before I muck about with my parents' laptops, I'm trying to set it up and get it working from my side.
Unfortunately, my clients change their "Internet server name/IP" to the hostname of the local server - as if they're connecting to Mum's server and receiving the "Internet server name/IP" I configured on the UrBackup server at her house. Naturally, each client then stops connecting to Mum's server.
I've even tried write-protecting the settings.cfg file, but the client seems to override this and change it anyway.
I thought that a per-client setting for "Internet server name/IP" would be good, but when I tick the box for "Separate settings for this client", there's no option for "Internet server name/IP". If there were one, I would tell both my server and hers that each of the clients at my house should use her server's Internet name, and vice versa.
So what's the trick? Why do my clients keep pulling the config from the Internet server? How do I get them to not do that?
Hi all… is anyone still using this technique for getting a second repository for their backup data?
I really wish there was a "proper" way to do this, but perhaps there just isn't sufficient call for it.
If anyone is using this method, can they tell me one thing first… surely, the way UrBackup is designed, it assumes that once it has a successful backup it doesn't need another one? Hence each of the two servers ends up with an incomplete backup set?