Remote storage is full, prune fails instantly

So I’m running into an issue with Duplicacy like one I had with ARQ backup: my remote storage completely filled up, and I’m unable to prune my backups to free up space.

Unfortunately, despite my weekly prune, the storage got packed completely full, and now I can’t prune from the web UI.

With a command prompt open in Duplicacy’s install folder, I tried “duplicacy_web_win_x64_1.0.0.exe -d -log prune -exclusive -exhaustive”, but that did nothing.

How can I force a prune operation to free up space? It’s pretty annoying that there’s no obvious way to use command-line options when you have the web edition.

What errors did you get from the CLI run (in debug mode)? Would you mind pasting the output here?

How are you trying to prune within the web UI? Can you post a screenshot of the schedule that contains the prune job?

Based on the name, that isn’t the CLI executable; that’s the web UI executable. I don’t know where the CLI gets downloaded to on Windows, but you can download the CLI executable from here if you want to run the prune command directly. It would probably be easier to use the web UI, though.

It depends on the error you’re getting. In the web UI, click on “error” or “warning” or “completed” (or whatever the status is) next to the prune job, then see what the log file says. You can also add the -exclusive and -exhaustive options in the web UI, but you also need to specify your retention rules using -keep.
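For reference, the same options can be spelled out in a direct CLI invocation. A rough sketch; the -keep values below are illustrative placeholders only, not your configured retention rules or a recommendation:

```shell
# Illustrative retention rules: -keep <n:m> means, for revisions older
# than m days, keep one revision every n days; n = 0 deletes everything
# older than m days. Adjust to match your own policy.
duplicacy -v prune -all -exclusive -exhaustive \
    -keep 0:180 -keep 30:90 -keep 7:30 -keep 1:7
```

Note that -exclusive requires that no other backup or prune job is accessing the storage at the same time.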


@XOIIO The CLI .exes are already downloaded into .duplicacy-web\bin, but you have to run it from .duplicacy-web\repositories\localhost\all for storage-wide operations like check and prune…

cd %HOMEPATH%\.duplicacy-web\repositories\localhost\all
..\..\..\bin\duplicacy_win_x64_2.3.0 -v prune -all -exclusive -exhaustive

Just make sure no backup jobs are running (to disable them, uncheck all the days of the week; Mon-Fri).

I wonder if it’d be feasible to place a link in the Web UI to open command prompt directly in that location.

At the very least, every time Duplicacy Web downloads an updated CLI .exe, it could perhaps add a symbolic link duplicacy.exe in \bin pointing to the latest version, and then add that folder to the PATH environment variable, so all you have to do is go to \repositories\localhost\all or 1 or 2 etc…
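As a manual stopgap, that link can be created by hand today. A sketch for Windows cmd; the version number is just an example of whichever CLI build actually sits in \bin:

```shell
:: Windows cmd sketch; mklink needs an elevated prompt (or Developer Mode).
:: The CLI version below is an example; substitute the file actually in \bin.
cd %HOMEPATH%\.duplicacy-web\bin
mklink duplicacy.exe duplicacy_win_x64_2.3.0.exe
:: Then add this folder to PATH via System Properties > Environment Variables.
```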

Running that CLI command gives me this output.

Storage set to sftp://ARQBackup@64.110.218.138:45/SFTP Backup
Using 16384 iterations for key derivation
Listing all snapshot ids
Listing revisions for snapshot DL380_G6_Alpha
Failed to decrypt the file snapshots/DL380_G6_Alpha/744: No enough encrypted data (0 bytes) provided

My prune operation is set to run once every Sunday, and my backups were scheduled for every two hours.

I don’t get a warning, just an immediate failure, and it turns out the same error from the CLI is what pops up. My SFTP server password has not been changed, and it is authenticating the account; I can see it connect, but that’s all it does before it disconnects.

I’m running a check from the cli to see if that helps at all.

edit: Nope, still doesn’t work. Ugh.

Does a check in the Web UI return the same error? It looks like your snapshot file snapshots/DL380_G6_Alpha/744 (revision 744) on the sftp storage is 0 bytes. Are you able to verify that manually by listing the directory? If it is 0 bytes, you may have to delete it manually and do a fresh check.
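If you have shell access to the SFTP host, a quick way to spot such truncated files is to search for zero-byte entries under the snapshots directory. The path here is a placeholder for your actual storage root:

```shell
# List all zero-byte files under the storage's snapshots directory.
# /path/to/storage is a placeholder for the actual SFTP storage root.
find /path/to/storage/snapshots -type f -size 0
```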

It gives the exact same error. I went and deleted a bunch of 0 KB snapshot files, which actually freed up about 15 GB of space, and now a prune operation is running successfully in the CLI.

If I may ask a possibly totally unqualified question: couldn’t duplicacy avoid such situations by never filling up the storage to 100 percent? @gchen

How did you delete those? What did you delete exactly? Reading that makes me think something really, really weird is going on with your SFTP storage.


I just logged into the machine hosting the SFTP server and manually found those chunks in the file system. As I mentioned, my storage filled up completely, so I’m guessing Duplicacy needs some free space available even to perform deletions.

Through the CLI I ran a really aggressive prune to limit the age and number of versions, and a prune is now deleting files and freeing up space, so I think the issue is solved. I’ll update/close the thread tomorrow when I know for sure.

Looks like the issues are resolved. At least it’s possible to recover from a full storage with Duplicacy, unlike with ARQ, though I do still miss some of ARQ’s features.