Best config for deduplicated incremental file backup

Hi,

I have been using UrBackup on my private server for many years now and have read the documentation several times.

I tried to configure a similar setup for a friend, but I never managed to get the same results, especially regarding the deduplication features (symbolic linking) and strict incremental file backup behavior.

This is why I want to clarify once and for all which server and client settings are recommended or mandatory to get such a setup running.

Main criteria:
Windows Server
Windows Client
deduplicated data (each file shall be stored only once on disk, no redundant data across backups)
incremental data (the delta is detected quickly and only changed data is transferred)
many small files (over 1M files)

Here is a list of things that may be related:
*) The server disk shall be formatted with NTFS (e.g. ReFS does not support all features)
*) Does the cluster block size of NTFS influence the UrBackup features? (see the sketch right after this list for how I check it)
*) Settings → General → File Backups
Interval for incremental file backups: 1h
Interval for full file backups: disabled
*) Settings → General → Advanced
Temporary files as file backup buffer: disabled?!
Local/passive incremental file backup transfer mode: Raw?! (since it is fastest)
Database cache size during batch processing: 600 MB?! (I raised this)
Use symlinks during incremental file backups: enabled (this is the most important one for deduplication?!)
Debugging: Verify file backups using client side hashes: disabled?!
Periodically readd file entries of internet clients to database: disabled (incremental backups shall be forced)
Create symbolically linked views for each user on the clients after file backups: disabled?! (not sure, but probably not needed?)
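
Regarding the cluster block size question above, here is the small Python sketch (Windows only) I use to check the file system and cluster size of the storage volume. The drive letter "D:\" is only a placeholder for the UrBackup storage drive:

```
# Minimal sketch (Python 3 on Windows): show the file system and cluster size
# of the backup volume. "D:\\" is a placeholder for the UrBackup storage drive.
import ctypes

def volume_info(root="D:\\"):
    kernel32 = ctypes.windll.kernel32
    fs_name = ctypes.create_unicode_buffer(261)
    # Query the file system name (e.g. "NTFS" or "ReFS").
    if not kernel32.GetVolumeInformationW(root, None, 0, None, None, None,
                                          fs_name, len(fs_name)):
        raise ctypes.WinError()
    sectors_per_cluster = ctypes.c_ulong()
    bytes_per_sector = ctypes.c_ulong()
    free_clusters = ctypes.c_ulong()
    total_clusters = ctypes.c_ulong()
    # Cluster size = sectors per cluster * bytes per sector.
    if not kernel32.GetDiskFreeSpaceW(root,
                                      ctypes.byref(sectors_per_cluster),
                                      ctypes.byref(bytes_per_sector),
                                      ctypes.byref(free_clusters),
                                      ctypes.byref(total_clusters)):
        raise ctypes.WinError()
    cluster_size = sectors_per_cluster.value * bytes_per_sector.value
    print(f"{root} file system: {fs_name.value}, cluster size: {cluster_size} bytes")

if __name__ == "__main__":
    volume_info()
```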

Other things:
How do I force the MS Shadow Copy service (VSS) to back up locked files?
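
On the client I check whether VSS itself is healthy with something like this (Python 3 on Windows, run from an elevated prompt; the string matching assumes English Windows output, so it is only a rough check):

```
# Rough check: list the VSS writers and flag any that are not in the normal
# "Stable" / "No error" state. Requires an elevated prompt.
import subprocess

result = subprocess.run(["vssadmin", "list", "writers"],
                        capture_output=True, text=True)
print(result.stdout)
for line in result.stdout.splitlines():
    line = line.strip()
    if line.startswith(("State:", "Last error:")) and \
       "Stable" not in line and "No error" not in line:
        print("Possible VSS problem:", line)
```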

Maybe there are other settings that also influence this behavior.
I hope you can clarify how to force deduplication and incremental backups.
Currently the setup uses four times the storage of the client, although deduplication should be enabled and no big changes were made to the data over time.
From time to time it also makes full backups and copies all the data again.
And processing the large number of small files takes very long.
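
To check whether deduplication is actually happening, I use roughly this sketch against the server storage (Python 3 on Windows; the path is just a placeholder for one client's backup folder). It compares the apparent size with the unique size, counting each NTFS file ID only once so that hardlinked copies do not add up, and it counts symlinked backup directories:

```
# Rough sketch: estimate how much of the UrBackup file backup storage is
# deduplicated. "D:\\urbackup\\client1" is a placeholder path.
import os

def dedup_stats(root="D:\\urbackup\\client1"):
    apparent = unique = 0
    seen = set()          # (volume serial, file ID) pairs already counted
    symlinked_dirs = 0
    for dirpath, dirnames, filenames in os.walk(root):
        # os.walk does not descend into symlinked directories by default, so
        # backups that are only symlinks to an older backup are not recounted.
        symlinked_dirs += sum(
            os.path.islink(os.path.join(dirpath, d)) for d in dirnames)
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue
            apparent += st.st_size
            if (st.st_dev, st.st_ino) not in seen:
                seen.add((st.st_dev, st.st_ino))
                unique += st.st_size   # hardlinked copies are counted once
    print(f"apparent size: {apparent / 2**30:.1f} GiB")
    print(f"unique size:   {unique / 2**30:.1f} GiB")
    print(f"symlinked directories: {symlinked_dirs}")

if __name__ == "__main__":
    dedup_stats()
```
If the unique size stays close to the apparent size, the links are apparently not being created at all.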

Thanks.