On a Windows 10 PC, suppose I have a folder "A" containing general data. I decide to reorganize this data, splitting it between two existing folders "B" and "C", and then deleting folder "A". Assume I have been backing up this PC to a local NAS and will continue to do so. For this use case, how would the legacy and new file backup formats handle retention of the moved files? How would the new format handle deduplication? Would I need to manually synchronize the repository for either?
No need to do a repository sync. In either format, the moved files would be backed up all over again in their new locations, and the backups from the original location would be deleted from backup storage based on your retention setting for deleted files.
Synthetic fulls do not work for local storage, so just move the files, run the backups, and make sure you have a purge setting for deleted files configured; you will be fine.
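To make the retention behavior concrete, here is a minimal Python sketch of the "keep deleted files for N days" idea. Everything in it (the `purge_deleted` function, the repo layout, the 30-day window) is invented for illustration and is not any product's actual implementation:

```python
from datetime import datetime, timedelta

# Hypothetical model of "keep deleted files for N days" retention.
RETENTION_DELETED = timedelta(days=30)

def purge_deleted(repo, source_paths, now):
    """Drop repo entries whose source file no longer exists and whose
    last-seen timestamp is older than the retention window."""
    for path in list(repo):
        if path in source_paths:
            repo[path]["last_seen"] = now      # still present: refresh
        elif now - repo[path]["last_seen"] > RETENTION_DELETED:
            del repo[path]                     # gone long enough: purge

# Before the move: A\report.docx is in the repo.
repo = {r"A\report.docx": {"last_seen": datetime(2024, 1, 1)}}

# After the move, the next backup sees the file only under B.
# B\report.docx is backed up as a brand-new file; A\report.docx
# lingers until the deleted-file retention window expires.
source = {r"B\report.docx"}
repo[r"B\report.docx"] = {"last_seen": datetime(2024, 1, 2)}

purge_deleted(repo, source, now=datetime(2024, 2, 15))
print(sorted(repo))   # ['B\\report.docx'] -- A's copy has been purged
```

The point is that the move looks like a delete in "A" plus brand-new files in "B" and "C"; the old copies simply age out under the deleted-files setting.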
The problem with the new format is that you have to do periodic true full backups (where all files get backed up regardless of whether they have been modified), and you have to keep at least two generations. That is not the case for the legacy format, so we use legacy for local and cloud file backups. We use the new format for cloud image and VHDx backups because BackBlaze supports synthetic fulls, but we still need to keep two generations, which takes up twice the storage.
But it is worth it to get the backups completed in 75% less time thanks to the synthetic full capability.
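For anyone unfamiliar with the term: a synthetic full stitches the new full together from data already sitting in backup storage, so unchanged files never cross the wire from the source machine again. Here is a minimal Python sketch of the idea; the `synthetic_full` function and the dict-based data layout are made up for the example, not any vendor's actual mechanism:

```python
# A synthetic full is assembled from data already in backup storage
# (previous full + incrementals); only changed files were ever
# uploaded from the source machine. Purely illustrative.

def synthetic_full(previous_full, incrementals):
    """Overlay each incremental onto the previous full, oldest first."""
    full = dict(previous_full)          # start from the old full
    for inc in incrementals:            # apply changes in order
        for path, content in inc.items():
            if content is None:
                full.pop(path, None)    # tombstone: file was deleted
            else:
                full[path] = content    # new or modified file
    return full

full_v1 = {"B\\a.txt": "v1", "C\\b.txt": "v1"}
inc_1 = {"B\\a.txt": "v2"}                     # a.txt modified
inc_2 = {"C\\b.txt": None, "C\\c.txt": "v1"}   # b.txt deleted, c.txt added

full_v2 = synthetic_full(full_v1, [inc_1, inc_2])
print(full_v2)   # {'B\\a.txt': 'v2', 'C\\c.txt': 'v1'}
```

That is why the wall-clock savings are so large: the heavy lifting happens on the storage side rather than on the PC being backed up.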
Sorry if this is confusing.
Is the reason you use legacy for local and cloud file backups the amount of storage space taken up by the local backups? In other words, do you keep only one copy of the backup data, reducing your required backup storage by about 50%?
If you do monthly full backups and set retention to 30 days, would the prior full backup be deleted upon creation of the new full backup?
The reason we use the legacy format for file backups, both cloud and local, is that they are incremental forever: once a file gets backed up, it never gets backed up again unless it is modified.
The new backup format requires periodic "true full" backups, meaning that even unchanged files have to be backed up again. It is true that the synthetic full process will shorten the time to complete a "true full" backup, but why keep two copies of files that have not changed?
As I have said before, unless you feel some compelling need to imitate tape backups with the GFS (grandfather-father-son) paradigm, the legacy format's incremental-forever approach is the only way to go for file-level backups.
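To show what incremental forever means in practice, here is a short Python sketch that decides what to upload by comparing content hashes against what the repository already holds. The names (`plan_backup`, `repo_hashes`) are invented for the example, and real products generally detect changes via timestamps or archive bits rather than hashing every file; this just illustrates why an unchanged file never gets transferred twice:

```python
import hashlib

# Conceptual sketch of "incremental forever": each run backs up only
# files whose content differs from what the repository already holds.
def plan_backup(source_files, repo_hashes):
    """Return the paths that actually need uploading this run."""
    to_upload = []
    for path, content in source_files.items():
        digest = hashlib.sha256(content).hexdigest()
        if repo_hashes.get(path) != digest:   # new or modified file
            to_upload.append(path)
            repo_hashes[path] = digest
    return to_upload

repo = {}
run1 = {"B\\a.txt": b"hello", "C\\b.txt": b"world"}
print(plan_backup(run1, repo))   # ['B\\a.txt', 'C\\b.txt'] -- first run uploads all

run2 = {"B\\a.txt": b"hello", "C\\b.txt": b"world!"}
print(plan_backup(run2, repo))   # ['C\\b.txt'] -- only the modified file
```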