Is it normal to have a very large "fossils" folder?

I noticed my “fossils” folder in the Duplicacy directory has a size of 2.2 TB. Simply judging by the name of the folder, I’m wondering if it’s normal that it’s so large? Is that all data that could be deleted?

I am running the prune command with the flags `-keep 2:60 -keep 1:7 -a -threads 100` once a week. So I would have assumed that anything that's not required would always be deleted by that.

If prune is interrupted, there may be some orphans left in the datastore.

Run `prune -exhaustive` to clean up orphans.
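A cleanup pass along these lines might look like the following sketch (assuming you run it from the initialized repository directory; the thread count of 4 here is an illustrative choice, not a recommendation from the docs):

```shell
# One-off exhaustive prune: scans the entire storage and also collects
# unreferenced chunks (orphans) left behind by interrupted prune runs,
# in addition to the normal retention logic.
duplicacy prune -exhaustive -threads 4

# The weekly retention policy from the original post, with a lower
# thread count that is gentler on Google Drive's API rate limits:
# keep one revision every 2 days for revisions older than 60 days,
# one per day for revisions older than 7 days, for all snapshot IDs (-a).
duplicacy prune -keep 2:60 -keep 1:7 -a -threads 4
```

Note that `-exhaustive` makes prune enumerate every chunk in the storage, so it is much slower than a regular prune and is meant as an occasional cleanup rather than the weekly job.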

Don’t delete anything manually.

Also, 100 threads?! What is the target?

Ok, thanks! The target for this is Google Drive; when I configured the prune back then, 100 threads was what worked fastest without leading to timeouts.

When I was using Google Drive via shared access in the past, there were some issues where prune failed to delete data unless I gave the account "manager" access. Prune Fails: GCD_RETRY: The user does not have sufficient permissions for this file - #21 by saspus. Probably not what's happening in your case – but check whether there are some silent failures; 100 threads seems like too much :slight_smile: