Client-side deduplication instead of server-side deduplication (Dedup Server)
We are not giving up on deduplication, but we are changing our strategy to offer you a more effective backup. There are several reasons we are shifting from one approach to the other.
We did some research, and it turned out that most of our customers manage clients with multiple sites. Here is what we came across:
- When using the Dedup server, a separate server has to be provisioned and kept up and running 24/7.
- Computers have to send full backups (FULLs) to the Dedup server for better deduplication.
- Most computers start their backup routines at the same time, so the Dedup server has to have enough local storage capacity to hold the FULLs from all machines.
- The Dedup server requires an SSL certificate to be manually configured on it to secure data transfers.
- The Dedup server is a potential single point of failure.
- Given the cost of keeping a powerful, publicly available server up and running 24/7 (#1) with a large local storage capacity (#2), the extra spending sometimes exceeds the storage savings gained from using the Dedup server.
The Dedup Server is deprecated: long live Client-side deduplication
Instead, we have polished the engine and are now preparing Client-side deduplication built into the backup agents. It will be released this summer as part of the new data backup format, which will in turn be part of GFS (coming out later this year).
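The actual agent implementation is not described here, but the general idea behind client-side deduplication is simple: the agent splits data into blocks, hashes each block, and uploads only blocks it has not sent before. The sketch below illustrates that idea only; the names `dedup_chunks`, `known_hashes`, and the 4 MiB block size are illustrative assumptions, not the product's real API or parameters.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # illustrative fixed block size (4 MiB)

def dedup_chunks(data: bytes, known_hashes: set) -> list:
    """Split data into fixed-size blocks and return only the blocks
    whose content hash has not been seen before.

    known_hashes acts as the agent's local index of already-uploaded
    blocks, so duplicate data never leaves the machine.
    """
    new_blocks = []
    for offset in range(0, len(data), CHUNK_SIZE):
        block = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in known_hashes:
            known_hashes.add(digest)  # remember locally; no server round-trip
            new_blocks.append((digest, block))
    return new_blocks

# Backing up the same data twice: the second pass finds nothing new to upload.
index = set()
first_pass = dedup_chunks(b"hello world" * 1000, index)
second_pass = dedup_chunks(b"hello world" * 1000, index)
print(len(first_pass), len(second_pass))  # the repeated data yields no new blocks
```

Because the hash index lives on each machine, no always-on central server, no staging storage for FULLs, and no extra SSL configuration are needed, which addresses the drawbacks listed above.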
Tech questions: firstname.lastname@example.org