How to get a clean server database

I am trying to set up a backup for some of our servers. I tested with a server we were not using, but that server contained a lot of files, and now my UrBackup does not work correctly anymore (it reported that it was active while it was not). So I decided to move the files aside and start with a fresh database (the current one was about 6 GB and another file was 13 GB). I stopped the server, moved the files aside, and restarted the server. The files are created again, but not correctly. What's the correct way of creating a new database?
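What I did, roughly, looked like this (the paths and service name are assumptions from my install; on my Debian machine the server keeps its databases under /var/urbackup):

```shell
# Stop the server before touching its databases (service name may differ per distro)
sudo systemctl stop urbackup-server

# Move the old databases aside instead of deleting them (assumed data directory)
sudo mv /var/urbackup /var/urbackup.old

# Restart; the server recreates its database files on startup
sudo systemctl start urbackup-server
```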

Does anyone have a suggestion, or should I just purge the package and reinstall it?

I guess by now you have it worked out somehow.

For others: I haven’t checked but I guess:

Thanks for the reply, Uroni. What I did was purge the whole package and reinstall it.
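On Debian/Ubuntu the purge-and-reinstall went roughly like this (package name and data directory are assumptions; check what `dpkg -l | grep urbackup` reports on your system):

```shell
sudo systemctl stop urbackup-server

# purge removes the package together with its configuration files
sudo apt-get purge urbackup-server

# the databases may survive a purge; remove them by hand
# (assumed path -- this throws away all backup metadata!)
sudo rm -rf /var/urbackup

sudo apt-get install urbackup-server
```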

I think it’s mentioned somewhere else, but what are the recommended settings for backing up a lot of files (or is it better not to back up a lot of files)?

Is there any way to run the server database on MySQL? After 6 GB it seems to me SQLite is unable to handle it.

Or Postgres? I believe it is more compatible with SQLite, IIRC.

For me that does not make much difference. I am used to running MySQL and have never run Postgres, but I don’t mind learning it :). I do think a real database server is needed for handling more files.

This will be improved in 1.5. See here for what you can do for the current version:

Postgres/MySQL would only improve concurrency. Postgres uses the same basic data structures as SQLite, for example. In your case it probably would not improve the situation significantly (or how many backups are you running concurrently?).

I was only running about 2 backups, but I think the file count killed it. After the SQLite files grew past 6 and 13 GB, it was not doing anything anymore. (I was trying to back up about 200 websites :smile:). I know that’s a lot, but I still think it should be possible.
I am not a database expert, but wouldn’t a database service avoid repeatedly loading the SQLite file, saving I/O at the cost of some memory?

SQLite (like Postgres) uses the operating system's file cache, so more RAM should automatically help.

The server I tried this on has 4 GB; with the server and client running there are still about 2 GB of memory free.

What would you recommend if I want to back up a lot of small files using UrBackup? Or would you recommend a tar.gz per set instead?
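For context, what I have in mind with tar.gz is just packing each site into a single archive, so the backup is one large sequential write instead of millions of small file operations. A minimal, self-contained sketch (the directory and file names are made up for the demo):

```shell
set -e

# Demo tree standing in for one small website (names are made up)
mkdir -p demo-site/assets
echo '<html></html>' > demo-site/index.html
echo 'body {}' > demo-site/assets/style.css

# Pack the whole tree into a single compressed archive
tar -czf demo-site.tar.gz demo-site

# List the archive contents to confirm everything was captured
tar -tzf demo-site.tar.gz
```

The trade-off is that a single archive loses per-file deduplication and incremental transfer, which is exactly what UrBackup is good at.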

I am running UrBackup with 144 clients; the database size is 63 GB and the occupied storage space is 8 TB on an iSCSI NAS (4x SATA). The server hardware is an HP DL360e Gen8, but it also hosts surveillance software. It runs like a charm :smile:
Now I want to connect a bigger iSCSI storage with 23 TB of space. I am wondering if starting with a clean database is a good idea…?