I have a backup going to a Wasabi bucket which has 89-day immutability set at the bucket level, and my backup job is set to delete files after a 90-day retention period.
This has been running fine for some time, but recently something happened (not sure what) and I've started to see a strange sequence of events which recurs on a daily basis.
Basically what seems to have happened is that the backup client tried to delete a file which was still immutable at the time. The delete obviously failed, but the real issue is that the client then created a new 'cbbdeleted' marker file in the storage location, which was then ITSELF subject to the 89-day immutability.
Now I have a recurring sequence of events where every day the client tries to delete the file, fails, and creates another 'cbbdeleted' file. I now have almost 90 days' worth of these files.
How can I break this vicious cycle of file creation?
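For anyone wanting to confirm how many of these markers have piled up, here is a minimal sketch using boto3 against Wasabi's S3-compatible API. The bucket name, endpoint, and the assumption that the marker keys contain 'cbbdeleted' are placeholders, not confirmed details of the backup software's naming scheme.

```python
# Minimal sketch: count accumulated 'cbbdeleted' marker objects.
# Bucket name, endpoint and the key-naming assumption are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",  # assumed Wasabi endpoint
)

paginator = s3.get_paginator("list_objects_v2")
markers = []
for page in paginator.paginate(Bucket="my-backup-bucket"):  # placeholder bucket
    for obj in page.get("Contents", []):
        if "cbbdeleted" in obj["Key"].lower():  # assumed marker naming
            markers.append(obj["Key"])

print(f"Found {len(markers)} 'cbbdeleted' marker objects")
```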
FYI, after quite a bit of back and forth with the support team, the conclusion is that the issue is caused by the combination of bucket-level immutability and the 'Mark objects as deleted in a backup source' option enabled in my legacy backup job.
This option creates 'cbbdeleted' files as markers for files deleted in the backup source, and the backup software expects to be able to delete these markers at any time. Making them subject to the bucket-level immutability breaks that expectation.
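If you want to see when a particular marker finally becomes deletable, the sketch below checks its object-lock retention via the same S3-compatible API. The bucket name and object key are placeholders, and the call requires the s3:GetObjectRetention permission; this is only a way to inspect the lock, not a workaround from the vendor.

```python
# Minimal sketch: check when a marker object's retention expires.
# Bucket and key are placeholders; adjust to your environment.
import boto3

s3 = boto3.client("s3", endpoint_url="https://s3.wasabisys.com")  # assumed endpoint

resp = s3.get_object_retention(
    Bucket="my-backup-bucket",            # placeholder bucket
    Key="path/to/file.ext.cbbdeleted",    # placeholder marker key
)
retention = resp["Retention"]
print(retention["Mode"], retention["RetainUntilDate"])
```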