Storage has filled up and prune does not seem to work

I use this command:
/usr/local/bin/duplicacy prune -keep 30:30 -exclusive -exhaustive -threads 6

I have 26 revisions older than 30 days, plus one for each day of the last 30 days.
Roughly 60 revisions at around 5 GB each on the source (it started out at around 3 GB and has now grown to 5 GB, so take 5 GB as a worst-case figure) should be ~300 GB, and that's without deduplication!
Most files never change; files only get added.

My backup folder at the remote site is now 3300 GB!

What am I doing wrong? Why does the “CHUNKS” folder keep growing?

Remove the -exclusive flag. It tells prune to assume it has exclusive access to the storage, so chunks are deleted immediately instead of going through the normal two-step fossil collection; if anything else (for example, a concurrently running backup) is touching the datastore, this can lead to data loss.
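
A safer version of your command simply drops that flag (everything else unchanged):

/usr/local/bin/duplicacy prune -keep 30:30 -threads 6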

Also, there is no need to run -exhaustive every time. The -exhaustive flag forces Duplicacy to enumerate every chunk in the storage; it is useful for finding orphaned chunks after you have manually deleted snapshot files, but it is very slow.
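
If you do suspect orphaned chunks, an occasional dedicated run should be enough, for example:

/usr/local/bin/duplicacy prune -exhaustive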

That said, what kind of data do you back up? You can run check -tabular or check -stats to see the size difference between revisions, and whether it matches your expectation of ~5 GB of new data per day.
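
Run from the repository, something like:

/usr/local/bin/duplicacy check -tabular

should print a per-revision table including unique and total chunk sizes, which makes it easy to spot which revisions are contributing unexpected growth.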

What is the destination storage? And how are you measuring the size of the chunks folder on the destination?
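
For example, if the destination is a plain filesystem (local disk, or a mount over SFTP), and assuming a hypothetical storage path, something like:

du -sh /path/to/storage/chunks

would report the actual on-disk usage; some tools report apparent size instead, which can make the number look larger or smaller than what the disk really holds.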