Answers to that will vary widely by system specs, storage type, server specs, and whether the client is local (LAN) or remote and backing up over the internet. I'm not sure how useful random numbers from setups probably quite different from yours will be to you.
I haven't particularly tried to average anything, but the internet backup of a remote client running right now appears to be going at ~34Mbps (it varies a lot), which equates to a bit over 4MBps; at other times it slows right down to virtually zero. I couldn't say what file types it's currently handling, though.
Looking at my logs, a local (LAN) backup of roughly 500MB of mixed files is shown as having taken 44 minutes.
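For comparison, those two figures can be put in the same units with a quick back-of-envelope conversion (just a sketch using the numbers from my logs above; note Mbps is megabits, so divide by 8 to get megabytes):

```python
def mbps_to_mbytes_per_s(mbps: float) -> float:
    """Convert megabits per second to megabytes per second."""
    return mbps / 8

def avg_throughput_mbytes_per_s(size_mb: float, minutes: float) -> float:
    """Average throughput in MB/s for a backup of size_mb taking `minutes`."""
    return size_mb / (minutes * 60)

# Remote client over the internet, observed at ~34Mbps:
print(f"remote: {mbps_to_mbytes_per_s(34):.2f} MB/s")        # ~4.25 MB/s
# LAN backup of ~500MB that took 44 minutes:
print(f"LAN:    {avg_throughput_mbytes_per_s(500, 44):.2f} MB/s")  # ~0.19 MB/s
```

So counterintuitively, the LAN backup in my logs averaged far slower than the remote link's current burst rate, which is why raw link speed alone doesn't tell you much here.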
I have observed that I get better performance from clients with fewer, larger files than from those with tens or hundreds of thousands of tiny files, especially when creating user views after the backup reaches an indicated 100%. For those clients I actually see faster performance and backup times if I do disk images rather than file backups. I have a less than ideal setup, though, with the server database on a spinning drive and USB storage.