5 TB of data on a Linux client shows as >150 TB to back up

Why is this happening?

And what happens at the backup destination, which only has 8 TB?

Am I the only one seeing this?

What happened? Did you get an answer? Are you doing a File Backup? What type of files are you moving?

I also have many TBs of data to back up and am also trying to find a solution. My initial attempt at using image backups failed due to a 2 TB limit.

This forum is a bit quiet, which probably means things generally work too well for everyone else. :grinning_face_with_smiling_eyes:

I have a different opinion: UrBackup is buggy as hell, and I switched to Bareos. That I got no answer to such a trivial question didn’t help either.

Apparently yes.

Now you have your answer to your trivial question.

My guess, which is all it is, is that your backup either includes some sort of loop due to symbolic links, or the default excludes have been removed (a rough way to check for the former is sketched below).

Sorry, I’m just a user, so I can’t take a stab at fixing it; even if you posted your config, backup paths, and excludes, I only read English and don’t have a Linux client to compare against.
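For what it’s worth, here is a minimal sketch of that check (my guess at the mechanism, not UrBackup’s actual logic). It walks a backup path without following symlinks and reports every link that points back inside the tree; an estimator that does follow those links would count the same data again, or loop forever. The path /data is hypothetical, so substitute one of your own backup paths.

```python
#!/usr/bin/env python3
# Sketch: find symlinks that point back inside the backed-up tree.
# /data is a hypothetical backup path -- substitute your own.
import os

ROOT = "/data"
root_real = os.path.realpath(ROOT)

for dirpath, dirnames, filenames in os.walk(ROOT):  # does not follow links
    for name in dirnames + filenames:
        path = os.path.join(dirpath, name)
        if os.path.islink(path):
            target = os.path.realpath(path)
            if target == root_real or target.startswith(root_real + os.sep):
                print(f"{path} -> {target}  (points back inside the tree)")
```

Anything that prints here is a candidate for the excludes list.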

@BatterPudding use a Linux-based server with Btrfs for your storage; you’ll then be able to image bigger drives in a non-VHD format.
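If it helps anyone going that route, here is a quick sanity check (a sketch only) that the storage path really sits on Btrfs, by matching it against /proc/mounts. The path /mnt/backups is hypothetical.

```python
#!/usr/bin/env python3
# Sketch: report which filesystem a storage path sits on via /proc/mounts.
# /mnt/backups is a hypothetical storage path -- substitute your own.
import os

STORAGE = "/mnt/backups"
real = os.path.realpath(STORAGE)

best = ("", "?")  # (mount point, fstype) with the longest matching prefix
with open("/proc/mounts") as mounts:
    for line in mounts:
        _dev, mnt, fstype = line.split()[:3]
        if (real == mnt or real.startswith(mnt.rstrip("/") + "/")) and len(mnt) > len(best[0]):
            best = (mnt, fstype)

print(f"{STORAGE} is on {best[1]} (mounted at {best[0]})")
```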

After digging deeper into the forum, I’m not the only one with this. I’m also not the only one getting no answers. I found a user who mentioned that he even pays for UrBackup and gets no answer.
Honestly, a backup tool that needs this much fiddling to back up a Linux server with the usual well-known excludes isn’t worth it, especially when the same includes and excludes work with Borg, Bacula, Bareos, BackupPC, Duplicity, and Duplicacy.
The interesting part for me is that the replies only started coming after I stated that UrBackup is crap.
So, good luck using it.

Personally I find a dead-quiet forum is usually a good sign. If there were loads of problems with the product, this place would be full of ranting users. One of the reasons I bailed on Duplicati was the number of people with problems on its forum and devs replying in a confused manner. Nothing is worse than backing up a Windows PC and finding very limited Microsoft knowledge on the dev team. Having a Windows Update delete the backup database is not exactly ideal. (I also had a few too many machines with Duplicati “coincidentally” losing hard drives…) And yet many people are very happy with that product.

I am using UrBackup in a number of offices with small sets of PCs backing up to a spare hard drive. It has been working very well and has already saved some people a few times. Clearly the fundamental product “just works”, which sends a support forum into silence.

That means when we come along with our hard-to-repeat issues, there ain’t many people around to answer. :slight_smile:


Yes, I read elsewhere that UrBackup is Windows-focused. That’s okay, but then it shouldn’t advertise Linux capabilities that just don’t work.

I looked at Bareos last year when I was searching for a backup solution. Due to the subscription model, I didn’t test it out, but I found UrBackup to be stellar for my needs (both personal and for customers).

Sorry that your experience has not been stellar with it.

As for fiddling, I found Duplicati much more finicky for my networks, which led me to UrBackup. Different strokes for different folks.


From my experience so far, I would have expected those many-TB numbers you are seeing to vanish due to the deduplication. If you really have less than 8 TB on your source, it won’t find more than 8 TB to move. I guess you have various symbolic links pointing back to the same data. The backup should back up the link, not the data. Even if it did back those files up multiple times, it would not actually store them, due to the dedup rules.

I bet the error here is just in how it has done the maths of calculating how much data needs to be backed up: the estimate behind that figure has counted everything your links point at as extra data (see the toy sketch below).
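To show the kind of double counting I mean, here is a toy sketch under my own assumptions, not UrBackup’s actual estimator: summing sizes with stat() charges each file symlink the full size of its target, while lstat() charges only the few bytes of the link itself. Symlinked directories that get followed would multiply the gap even further. The path /data is hypothetical.

```python
#!/usr/bin/env python3
# Toy size estimator (an assumption about the mechanism, not UrBackup's
# real code): stat() follows symlinks and charges the target's size,
# lstat() charges only the link itself. /data is a hypothetical path.
import os

def estimate(root, follow_file_links):
    sizer = os.stat if follow_file_links else os.lstat
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                total += sizer(os.path.join(dirpath, name)).st_size
            except OSError:
                pass  # broken link or unreadable file; skip it
    return total

ROOT = "/data"
print("links counted as links:  ", estimate(ROOT, False))
print("links counted as targets:", estimate(ROOT, True))
```

A big gap between the two numbers would point at exactly this kind of estimate inflation.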

I don’t know about UrBackup being “Windows-focused”. I am crossing the streams with various Windows and Linux systems as both sources and targets and have been pretty happy so far. I have a number of offices where I stumbled through the UrBackup setup and just left it running, returning months later to find a happy system still operating fine.

Whereas Duplicati leans so far towards Linux that I would return to offices where it was set up and find it totally missing! A Windows update would have removed it, because the devs were installing vital database files into the Windows system folder! They seemed surprised that breaking the OS rules was deleting their database.

As @BrainWaveCC says - we all have different needs. Different software fits our different ways of doing things.


I’ve only got this far through the thread so far, but I’m running the server on Ubuntu and it has been pretty much flawless… even though I am in no way a *nix guru and have only just got back into using it…

Most of my [client] computers are running Linux (again, Ubuntu) and I have had absolutely zero issues getting them backing up. The three Windows machines are able to back up over the internet without issues; all but one of those [that are still working, the others have been decommissioned] have never even seen my DMZ subnet, as they are at remote sites…

As mentioned by @BatterPudding above, a quiet “support” forum means a lack of general issues - which means a good and functional solution… :sunglasses:
