Local and Internet Backups Simultaneously - Tutorial (kinda)

First, I need to credit @Eran, because he is the one who got me started on this. I am not at all an expert, but I do have it working, at least for the last couple of hours, so here goes. First, a little background on the idea.

  • I have had a local backup server working for a few years. It is currently running Ubuntu 16.04 LTS and UrBackup 2.1.19 on a Dell R530 with 8x 6TB drives in RAID 5, for around 40TB of usable storage.

  • I have an internet server running Ubuntu 16.04 (EC2) with an S3 bucket mounted at /mnt/s3bucket. (Uroni just tells me that S3 is not a good fit for UrBackup, but that's irrelevant to the goals of this tutorial.)
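For reference, an S3 bucket can be mounted like that with s3fs-fuse. This /etc/fstab line is only a sketch: the bucket name ("my-backup-bucket") and the credentials file path are placeholders, not details from my setup.

```
my-backup-bucket /mnt/s3bucket fuse.s3fs _netdev,allow_other,passwd_file=/etc/passwd-s3fs 0 0
```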


  1. The first thing to do after getting the two servers running is to copy the server_ident keys in /var/urbackup between the two servers so they are both identical. It doesn't matter whether you go internet>local or local>internet, as long as they're the same files.

  2. Make sure the clients can back up to the local server.

  3. Turn on the internet pieces on the internet server.

  4. Add a new client to the internet server using the same, identical, case-sensitive name it has on the local server.
    When you see the next screen, copy the internet authentication key and put it in the settings for this same client on the LOCAL server, so when the client pulls its settings from the local server it gets this key automatically. Here is a shot of what one of my “dual” clients looks like on the local server.

  5. The last thing I did was restart both UrBackup servers and then restart the service on the client so it immediately downloads its internet auth key.
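The key copy in step 1 can be sketched like this. The block is a dry run that only prints the commands it would execute; the remote host is a placeholder, and the exact file list is an assumption (UrBackup keeps its identity files under /var/urbackup, but check your own directory for the full set).

```shell
#!/bin/sh
# Dry-run sketch of step 1: print the scp commands that would sync the
# server identity files from the local server to the internet server.
# REMOTE is a placeholder, not a real host.
IDENT_DIR=/var/urbackup
REMOTE=root@internet-server.example.com
for f in server_ident.key server_ident.priv server_ident.pub; do
    printf 'scp %s/%s %s:%s/%s\n' "$IDENT_DIR" "$f" "$REMOTE" "$IDENT_DIR" "$f"
done
```

Remove the `printf`/dry-run wrapper and run the real `scp` commands once the paths match your installation; the direction (local to internet or the reverse) does not matter, only that both ends finish with identical files.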

I think I’m forgetting something, and I’m kinda busy at the moment, so just ask questions if something isn’t clear. I like to help with this project, as it has helped me a lot over the last few years.

Here is what your client’s status window should look like when it’s done.

Really appreciate you taking the time to write this up. I have a few internet clients that would like to back up locally as well, so this is just the ticket!

Cool! Hope this is enough documentation to get you going.

So I had been running for a while, but now I am unable to run backups to the internet server anymore. Local backups are working, but internet ones are failing. If I manually run a full image backup via the internet web GUI, I get this:
03/07/17 11:45 INFO Starting unscheduled full image backup of volume “C:”…
03/07/17 11:45 DEBUG Backing up SYSVOL…
03/07/17 11:45 DEBUG Backing up SYSVOL done.
03/07/17 11:45 INFO Basing image backup on last incremental or full image backup
03/07/17 11:45 ERROR Error opening hashfile (/mnt/s3bucket/UrBackups/servername/170303-0124_Image_C/Image_C_170303-0124.vhdz.hash). No such file or directory (code: 2)
03/07/17 11:45 INFO Time taken for backing up client DC-DCDNS0: 5s
03/07/17 11:45 ERROR Backup failed

I can see the file that is apparently non-existent, so what’s the deal?

I added another volume to a client and did a full image backup. The C drive still failed, but the new volume didn’t have any problems and was backed up. Then, if I do a full or incremental backup on the new volume, it fails.

Hey,

I have a “No Internet server configured” error in my connection status. Did you see this error when connecting your server?

I installed locally first; I will try installing the preconfigured UrBackup client just to check.

EDIT: Yes, installing from the preconfigured setup fixed that one, although I’m still having trouble.

Your problem can be solved with either a cleanup or remove_unknown script, I think.

It’s trying to pull files and compare them to the database, and when there is a mismatch between the database and the files saved on disk, it becomes an issue.

Not a database_cleanup; just a regular cleanup or remove_unknown should fix it.

Having a weird problem with this setup.

I have two UrBackup servers running in two different locations (my house and my Mum’s house!), both on Raspberry Pi 2 systems running stock Raspbian. The backup destinations in both cases are standard NAS units (a Synology RackStation at my house and a Netgear ReadyNAS v2 at Mum’s house). There are clients at both locations.

I want my systems to back up primarily to my local UrBackup server, and use the one at Mum’s house as an Internet server.

I want my Mum’s systems to back up primarily to her local UrBackup server, and use the one at my house as an Internet server.

Before I muck about with my parents’ laptops, I’m trying to set it up and get it working from my side.

Unfortunately, my clients change their “Internet server name/IP” to the hostname of the local server, as if they’re connecting to Mum’s server and receiving the “Internet server name/IP” I configured on the UrBackup server at her house. Naturally, they then stop connecting to Mum’s server.

I’ve even tried write-protecting the settings.cfg file, but the client seems to override this and change it anyway.

I thought that a “per-client setting” for “Internet server name/IP” would be good, but when I tick the box for “Separate settings for this client”, there’s no option for “Internet server name/IP”. If there was one, I would tell both my server and hers that each of the clients at my house would use her server’s Internet name, and vice versa.

So what’s the trick? Why do my clients keep pulling the config from the Internet server? How do I get them to not do that?

I know this is an old thread, but I have a question. You state to copy the server_ident key(s); is that the .key files and the .priv/.pub files as well?

Another question: what if I have two local servers and one internet server?
local office 1 server
local office 2 server
corp office Internet server

I’m going to play with this, but I would assume yes, the keys need to be identical so that any of the systems can sync with the internet server.

Randy