What would you want Duplicacy to do if deleting the oldest revision (of which backup?) isn’t enough? Should it just keep pruning revisions permanently until there’s room, likely in violation of any pruning policies? It seems like it would be easy for someone to permanently destroy an entire storage with a single typo in a backup command.
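To be concrete, a pruning policy is normally a set of `-keep` rules, something like this (adapted from the prune documentation; the exact intervals are just an example):

```
# Keep one revision per day after 7 days, one per week after 30 days,
# one per month after 180 days, and nothing older than 360 days
duplicacy prune -keep 0:360 -keep 30:180 -keep 7:30 -keep 1:7
```

An auto-prune-until-there's-room feature would have to cut right through rules like these.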
What if even a single snapshot revision exceeds the specified threshold because the backup directories have grown too large? I would expect Duplicacy to throw an error, which would still stop backups and require manual intervention, so the proposed feature wouldn't even eliminate that failure mode.
In either case, deleting anything seems like terribly destructive behavior for a backup command.
The current behavior of erroring out once a storage fills and requiring manual intervention seems ideal to me. Only the user knows what is worth deleting to make space. Maybe it's worth removing an intermediate revision, or all of the revisions for an old separate backup that's no longer needed, etc.
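And that kind of targeted cleanup is already straightforward to do by hand. A rough sketch (the revision number and the `old-laptop` snapshot ID here are hypothetical):

```
# Preview first: -dry-run reports what would be deleted without touching the storage
duplicacy prune -dry-run -r 8

# Remove a single intermediate revision of the current repository's backup
duplicacy prune -r 8

# Remove every revision of a retired backup ID that's no longer needed
duplicacy prune -id old-laptop -r 1-9999
```

The point being: each of those is a deliberate decision the user makes, not something the backup command should guess at.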