New Backup Format - Size comparison on the bucket vs legacy
Thanks for sharing your schedule, Steve; it gave some good insight into the process.
David, our requirement in this scenario is mostly for file backups.
We would like to keep 1-year retention of files and at least 3 versions.
Today, we run the legacy format with a 1-year delete policy locally and keep at least 3 versions.
We have one customer with 6 million files, and restoring the entire dataset would take forever.
So we thought about using the new format.
But from what I'm seeing, an always-incremental approach wouldn't work, and running monthly fulls would consume something like 10x more GB on the storage than the legacy format, because of the monthly full backups sitting inside the retention window.
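For the 10x figure, this is the rough back-of-envelope math I'm doing. It's only a sketch: the source size and change rate below are made-up placeholders, not numbers from our environment, and it ignores compression and dedup inside each chain.

```python
# Rough back-of-envelope estimate (all numbers are placeholder assumptions,
# not measurements from our environment).

source_tb = 10.0            # assumed size of the protected file share, in TB
monthly_change_rate = 0.05  # assumed fraction of data changing per month
retention_months = 12       # 1-year retention

# Legacy format: roughly one copy of the data plus the changed file
# versions kept for retention (simplified, ignores per-file overhead).
legacy_tb = source_tb + source_tb * monthly_change_rate * retention_months

# New format with monthly fulls: every monthly full inside the retention
# window stays on the bucket, plus the incrementals in between
# (simplified, ignores compression and dedup savings).
new_format_tb = (source_tb * retention_months
                 + source_tb * monthly_change_rate * retention_months)

print(f"Legacy format : ~{legacy_tb:.1f} TB")
print(f"Monthly fulls : ~{new_format_tb:.1f} TB")
print(f"Ratio         : ~{new_format_tb / legacy_tb:.1f}x")
```

With those placeholder numbers it comes out to roughly 8x, which is where my "something like 10x" ballpark comes from.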
I wanted to use the new format because of the deduplication and the much shorter recovery time; fetching the packages rather than millions of small files is much faster.
I know there is no rule of thumb and it depends on requirements, but are you seeing traction for file-based backups with this new format?
Thanks and regards,