I am thinking about using the software with Amazon's S3 product. Do you see any reason this would not work, and could you give me any pointers on setting it up?
You’d need a file system that stores its files on S3. There are several; I think S3QL (https://bitbucket.org/nikratio/s3ql/overview) is the best one.
It does block-level deduplication and stores the block metadata in a SQLite database. Since UrBackup already has performance problems with file-level deduplication on top of a SQLite database, I would take a close look at whether it can handle the load if you plan to store a large amount of data on it.
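If you try S3QL, the basic setup is roughly as follows (a sketch based on its documentation; the bucket name and mount point are placeholders, and credentials go in ~/.s3ql/authinfo2):

```bash
# Format the bucket once as an S3QL file system:
mkfs.s3ql s3://my-urbackup-bucket

# Mount it where UrBackup will store its backups:
mount.s3ql s3://my-urbackup-bucket /media/backup

# Unmount cleanly when done (flushes cached blocks to S3):
umount.s3ql /media/backup
```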
OpenDedup (http://opendedup.org/) would be another candidate to evaluate, as it can store to S3 as well.
Can you keep us updated on what’s best?
Just to share my implementation:
## Ver 1.4 RC1 - Image backup only
UrBackup server: an AWS instance running TNTDrive mapped to S3 storage.
UrBackup clients: send full images to the UrBackup server.
UrBackup is configured to save the compressed images to the mapped drive presented by TNTDrive, which is actually S3.
Throughput is pretty impressive now on 1.4; the previous version was too slow.
Clients are Win 2003 and 2008 flavours. The UrBackup server is 2003. Needed to use a c3.large instance size, though.
### Would like
As I have one policy, I have simply set it to back up drives c,d,e,f,g, etc.
The clients report an error when they hit a drive that doesn’t exist, but I prefer that to giving each client its own setting. It would be nice if the message were a warning rather than an error.
With 1.4 you can use “ALL” instead of “c,d,e,f” and it will back up all volumes.
I didn’t think of this option, as it is Windows-only. I don’t know how performant it would be for file backups.
Could you tell me the amount of storage you are backing up? Are you having any performance issues, and how is the pricing with the AWS and S3 products? Is it profitable?
@Uroni - thanks for the “ALL” - will try that
@Troye - The amount of storage is going to be about 200 GB. The reason for using UrBackup with S3 is to keep costs down, and it is a very specific use case. As such I will only keep a very limited number of image backups, no more than one per system. Full costs can be worked out with the AWS calculator.
## Performance
On the topic of performance: I see a variety of speeds at the network level, but all acceptable. The clients take no hit on the CPU/memory side. The UrBackup server does get maxed out on CPU and network with 6-7 image backups running sequentially. The plan is to do a once-a-week image backup of each client and only have the UrBackup server switched on for that period, again to keep costs down.
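A rough sketch of how that switch-on window could be automated, assuming the AWS CLI is configured on a machine that stays up (the instance ID below is a placeholder):

```bash
# Hypothetical cron-driven commands: start the UrBackup server instance for
# the weekly backup window; a second cron entry stops it again to save costs.
INSTANCE_ID="i-0123456789abcdef0"   # placeholder instance ID

aws ec2 start-instances --instance-ids "$INSTANCE_ID"
aws ec2 wait instance-running --instance-ids "$INSTANCE_ID"

# Later, once the backup window is over:
# aws ec2 stop-instances --instance-ids "$INSTANCE_ID"
```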
I haven’t quite got it nailed but will keep you posted
Have you attempted this with Google Drive? I cannot seem to get the permissions correct for it to install and for the user to read the folder at the same time.
This makes me think…hmmmm…
Just saw this post; yes, I know the original is a year old.
CloudBerry Drive will work, but it will be painfully slow unless you have massive bandwidth. They do have a backup solution; I have not tried it.
Another option is AWS Storage Gateway. It costs $125.00/month plus S3 storage. We have been using it and it does work well.
See post here: Urbackup to the Cloud
If you have very large amounts of data to back up, full backups will take a very long time to sync to S3 unless you have a lot of bandwidth. Currently we are only backing up C: drives for one of our field locations; their file server data drive is a cache drive presented by the gateway, so all files are backed up to S3 in real time. C: drive full image backups take about 72 hours over a 10 Mbit/s line. We only do these every 30 days and try to stagger the servers. The differentials only take about an hour or so.
I am working on something in this area. If anyone wants to help with testing, please contact me via PM/mail.
This may sound overly simplistic… but with Dropbox’s background syncing (and automatic bandwidth limiting), would it make a good candidate for a repository?
I am trying to do something like this as well. I have an in-house Ubuntu server running UrBackup with some test PCs, and it is working fine. But I want to sync my backups with S3. For now I have an S3 bucket mounted, and I rsync the UrBackup folder to the S3 mount. This seems to be the most cost-effective way of doing it, and it was fairly simple to set up.
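For reference, a minimal sketch of that setup, assuming s3fs for the mount (the post above doesn't say which tool was used) and placeholder bucket names and paths:

```bash
# Mount the S3 bucket with s3fs (credentials in ~/.passwd-s3fs, chmod 600).
# Bucket name and mount point are placeholders.
s3fs my-urbackup-bucket /mnt/s3 -o passwd_file="$HOME/.passwd-s3fs"

# One-way sync of the UrBackup storage directory to the S3 mount;
# /media/BACKUP is a common UrBackup storage path, adjust to yours.
rsync -av --delete /media/BACKUP/ /mnt/s3/urbackup/
```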