Avoid disk full condition

It would be nice to have a flag that limits how big the backup can get.

For example: --max-size 100G

So if my schedule is a daily backup for the last week, a weekly backup for the last month, a monthly backup for the last year, and so on, I would set the max backup size, and when the backups get too big the oldest backup gets deleted.
This could be added to the web UI too.
I think a full disk is one of the worst things that can happen to a backup.
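To make the requested behavior concrete, here is a minimal sketch of what such a size cap might do after each backup. This is not Duplicacy code; the function name, the revision list, and the 100 GB figure are all illustrative assumptions.

```python
def enforce_max_size(revisions, max_bytes):
    """Delete oldest revisions until the total size fits under max_bytes.

    `revisions` is a list of (name, size_in_bytes) tuples, ordered
    oldest-first. Returns (kept, deleted). This mirrors the request:
    when the backups get too big, the oldest backup gets deleted first.
    """
    kept = list(revisions)
    deleted = []
    while kept and sum(size for _, size in kept) > max_bytes:
        deleted.append(kept.pop(0))  # drop the oldest revision
    return kept, deleted

# Example: a 100 GB cap (decimal GB, purely for illustration)
revs = [("rev-1", 60e9), ("rev-2", 30e9), ("rev-3", 25e9)]
kept, deleted = enforce_max_size(revs, 100e9)
# rev-1 (the oldest) is deleted; rev-2 and rev-3 total 55 GB and are kept
```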

What would you want Duplicacy to do if deleting the oldest revision (of which backup?) isn’t enough? Should it just keep pruning revisions permanently until there’s room, likely in violation of any pruning policies? It seems like it would be easy for someone to permanently destroy an entire storage with a single typo in a backup command.

What if even a single snapshot revision exceeds the specified threshold due to the backup directories growing too large? I would expect it to throw an error, which would still cause backups to stop and require manual intervention.
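That edge case can be sketched as well: a size cap would have to stop and raise an error once deleting more would remove the only remaining revision, rather than pruning the storage down to nothing. Again, the function and names here are hypothetical, not Duplicacy's implementation.

```python
def enforce_max_size_safe(revisions, max_bytes):
    """A size cap that refuses to delete the last remaining revision.

    `revisions` is oldest-first (name, size_in_bytes) tuples. If even
    the single newest revision exceeds max_bytes, raise instead of
    deleting it, leaving the decision to the user.
    """
    kept = list(revisions)
    while sum(size for _, size in kept) > max_bytes:
        if len(kept) == 1:
            raise RuntimeError(
                "newest revision alone exceeds the size cap; "
                "refusing to delete it -- manual intervention required"
            )
        kept.pop(0)  # drop the oldest revision
    return kept
```

With this guard, a backup directory that has simply grown too large stops the job with an error instead of silently destroying the storage.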

In either case, deleting anything seems like terribly destructive behavior for a backup command.

The current behavior of erroring out once a storage fills and requiring manual intervention seems ideal to me. Only the user would know what is worth deleting to make space. Maybe it’s worth removing an intermediate revision, or all of the revisions for an old separate backup that’s no longer needed, etc.