We currently have about 1,600 clients being backed up over the internet, and we are adding more daily.
Are there specific changes we should consider with that many internet clients?
What we are seeing is that even when the server is idle (no backups or cleanup running), CPU usage sits at 35-45%. Based on packet captures, I think it is due to all the internet clients checking in every minute. Can this interval be increased to reduce the load on the server? I would think that checking in every 5 to 15 minutes should be fine.
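A rough back-of-envelope sketch of why the interval matters (assuming check-ins are spread evenly, which may not hold in practice):

```python
# Approximate check-in rate the server must handle for ~1600 internet
# clients at different check-in intervals, assuming an even spread.
clients = 1600

for interval_min in (1, 5, 15):
    per_sec = clients / (interval_min * 60)
    print(f"check-in every {interval_min:>2} min -> ~{per_sec:.1f} connections/sec")
```

At a 1-minute interval that is roughly 27 check-ins per second around the clock, versus under 2 per second at 15 minutes, which lines up with the idle CPU load we are seeing.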
Also, is there anything we can do to load balance these backups once we outgrow what a single server can do in a night? I can't see how we could point them all at a single destination and still load balance it. It seems like we would just have to manually point new clients at the next server?
Another thing I cannot find documented is the impact of the "Database cache size during batch processing" setting. Is there a recommended value based on how much RAM you have, how large the databases are, etc.? We have 32 GB in the server and only do file backups. The data per client is relatively small, but it still adds up to a lot of files.
Any pointers would be helpful as we keep scaling.