Last night I had a lot of incremental file backups deleted, and I believe it is due to a bug in the filesystem quota calculations. I have a test server set up with ~4TB of storage capacity, but it is a Server 2012 R2 system that runs deduplication on the volume that holds the backup data. The volume is ~50% full, but it is also deduplicating at ~50%. I believe the incrementals were deleted because the server is not looking at the actual free space on the volume, but only comparing the amount of data on the volume to the size of the volume.
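For what it's worth, here is a rough way to see the mismatch I mean (just a sketch; the drive letter is an assumption, adjust to wherever your backup volume is mounted):

```powershell
# Compare what the volume reports as physically used against the logical
# size of the files on it. With ~50% dedup the logical total can be about
# twice the physical usage, which is what the quota check seems to see.
$vol = Get-Volume -DriveLetter D
$logical = (Get-ChildItem -Path 'D:\' -Recurse -File -ErrorAction SilentlyContinue |
            Measure-Object -Property Length -Sum).Sum
'Physically used: {0:N2} TB' -f (($vol.Size - $vol.SizeRemaining) / 1TB)
'Logical size:    {0:N2} TB' -f ($logical / 1TB)
```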
@uroni, if there are any logs that I can provide to you, please let me know, and likewise if there are any settings I should check. Also, is there any way to disable the global filesystem quota? If I blank that input and then save, would that take care of it?
I was going to post the same question. I have the exact same setup, except with 7.5TB of storage. Thanks to a 75% deduplication rate, I am only roughly 30% full, but it appears incrementals are still being deleted.
You are correct. It is probably difficult, and perhaps impossible, to correctly estimate the space a client uses if you have deduplication. The correct thing to do would be to count the number of unique blocks a client has and then add each shared block divided by the number of clients it is shared with. I know of no filesystem/deduplication technology where this is currently possible. Usually you only get the total undeduplicated size, or sometimes the uniquely referenced size.
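To illustrate roughly what I mean (purely hypothetical: the block size, block IDs and client names are invented, since no real filesystem exposes this per-client reference map):

```powershell
# Each block a client references counts fully if it is unique to that
# client, or 1/N if N clients share it. The reference map is made up.
$blockSize = 64KB
$refs = @{                 # block id -> clients referencing it
    'b1' = @('clientA')
    'b2' = @('clientA', 'clientB')
    'b3' = @('clientA', 'clientB', 'clientC')
}
$estimate = 0
foreach ($clients in $refs.Values) {
    if ($clients -contains 'clientA') {
        $estimate += $blockSize / $clients.Count
    }
}
'Fair-share estimate for clientA: {0:N0} KB' -f ($estimate / 1KB)
```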
So the soft quotas should definitely be disabled if set to “100%”, rather than being limited to the storage capacity (that is what it currently does: it sees 100% = 4TB, the client appears to use more than 4TB, and so the backups are deleted).
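In other words, the current check amounts to something like this (numbers made up for illustration):

```powershell
# "100%" resolves to the full volume size, while the client's usage is
# counted undeduplicated, so it can exceed the volume and trigger cleanup.
$volumeSize  = 4TB
$softQuota   = 1.0 * $volumeSize   # "100%" => 4TB
$clientUsage = 4.2TB               # undeduplicated size of the backups
if ($clientUsage -gt $softQuota) {
    'Over quota - nightly cleanup deletes incremental backups'
}
```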
I’ll fix that. As a workaround for the current version, you can set the soft client quota to an empty value; then it will not try to enforce the quota during the nightly cleanup.
Just to clarify, the “Global soft filesystem quota” should still work fine, and it’s the “Soft client quota” that is causing the problem, correct?
So basically the only way would be through a batch or PowerShell script that would walk through all of the symlinks and return the size of each directory? Man, that would take quite a while. I am trying a “du”-style script at the moment and will see what I can come up with.
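Something along these lines is what I am trying (a rough sketch only; $backupRoot is a placeholder for my storage directory):

```powershell
# Two caveats: Length is the logical (undeduplicated) size, and on
# Windows PowerShell, Get-ChildItem -Recurse does not follow symlinks
# or junctions, so linked incrementals may need resolving separately.
$backupRoot = 'D:\urbackup'
Get-ChildItem -Path $backupRoot -Directory | ForEach-Object {
    $bytes = (Get-ChildItem -Path $_.FullName -Recurse -File -ErrorAction SilentlyContinue |
              Measure-Object -Property Length -Sum).Sum
    '{0}: {1:N2} GB' -f $_.Name, ($bytes / 1GB)
}
```

Since it sums logical sizes it will overstate usage on a deduplicated volume, but it should at least show which folders are the biggest.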