Duplicacy hangs/stalls on "list"

Please describe what you are doing to trigger the bug:

I’m running the unlicensed duplicacy CLI on my home server.

My setup is as follows:
duplicacy runs on a Fedora VM with 2 cores and 8 GB of memory.
The target is a Raspberry Pi 3 Model B+, accessed via SSH over a tinc VPN. Attached to it is an 8 TB USB drive holding ~3 TB of duplicacy backup data in ~100 revisions (with little difference between revisions; ~90% of the data has been unchanged since revision 1).
For testing, the connection happens via local network.

I had to rebuild the Fedora box, so I downloaded the current duplicacy release and ran “duplicacy init sftp://…”, which worked fine.

However, the next command, “duplicacy list”, stalls after a few lines: it quickly lists ~10-20 revisions, then slows down and simply stops. Checking both boxes shows no CPU load, no memory pressure, no I/O wait, and no network congestion. It simply hangs.
When I stop it with CTRL+C and re-issue the list command, it quickly shows all previously listed revisions plus 5-10 new ones.
I repeated this about 15-20 times until it had listed all revisions.
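For anyone hitting the same hang, the CTRL+C loop above can be automated. This is only a workaround sketch, not a fix: it assumes each re-run makes progress because already-fetched chunks land in the snapshot cache, and the command and retry count are hypothetical parameters so the loop itself can be exercised with any command.

```shell
# Workaround sketch: re-run a command (e.g. "duplicacy list") under a timeout
# so each hung attempt is killed automatically instead of by hand with CTRL+C.
retry_list() {
    cmd="$1"          # command to run, e.g. "duplicacy list"
    max="${2:-20}"    # maximum number of attempts (hypothetical default)
    i=1
    while [ "$i" -le "$max" ]; do
        # "timeout 120" kills the run if it stalls longer than 2 minutes;
        # a zero exit code means the command completed normally.
        if timeout 120 sh -c "$cmd"; then
            echo "completed after $i attempt(s)"
            return 0
        fi
        i=$((i + 1))
    done
    return 1
}

# In practice: retry_list "duplicacy list" 20
```

This only papers over the hang, but it beats pressing CTRL+C fifteen times by hand.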
Then I tried “duplicacy -d list -r 292 -files” (revision 292 is the most recent).

Output is attached (from run ~3):

[root@fedora~]# duplicacy -d list -r 292 -files
Storage set to sftp://duplicacy@raspberry//mnt/something/backups
Reading the environment variable DUPLICACY_SSH_KEY_FILE
Attempting public key authentication
Reading the environment variable DUPLICACY_SSH_KEY_FILE
Reading the environment variable DUPLICACY_SSH_KEY_FILE
Chunk read levels: [1], write level: 1
Compression level: 100
Average chunk size: 4194304
Maximum chunk size: 16777216
Minimum chunk size: 1048576
Chunk seed: 6475706c6963616379
Hash key: 6475706c6963616379
ID key: 6475706c6963616379
File chunks are encrypted
Metadata chunks are encrypted
id: fedora-myraid, revisions: [292], tag: , showFiles: true, showChunks: false
Loaded file snapshots/fedora-myraid/292 from the snapshot cache
Snapshot bosco-redraid revision 292 created at 2019-11-26 00:05 cron
Chunk f13d5e8a46b00dd5529a1eed283ef831769f49fa4efbd3587d1b5e8f39470d27 has been loaded from the snapshot cache
Fetching chunk c15dea9465ffff8563ba784c19d866cf2a1ec72afee3e78886d4d74aa4b2b592
Chunk c15dea9465ffff8563ba784c19d866cf2a1ec72afee3e78886d4d74aa4b2b592 has been loaded from the snapshot cache
Fetching chunk acf5dac1017ebfbd5d981bbf1bca139907d92a0e15800e0e1154087f01075568
Chunk acf5dac1017ebfbd5d981bbf1bca139907d92a0e15800e0e1154087f01075568 has been loaded from the snapshot cache
Fetching chunk 034451f2de377c291c7e5f928d85f2cdf5d4867b3e5a35f3f5f3661c15062ac3
Chunk 034451f2de377c291c7e5f928d85f2cdf5d4867b3e5a35f3f5f3661c15062ac3 has been loaded from the snapshot cache
Fetching chunk 42e781caa71fca1f9cd96ead5b466620e49e6a2f1ea6afd902fc9c5b3507eca5
Chunk 42e781caa71fca1f9cd96ead5b466620e49e6a2f1ea6afd902fc9c5b3507eca5 has been loaded from the snapshot cache
Fetching chunk 6c30398b02368050786ed5c52018f65bddb7164513b99ba8c1e398da87a1bb0d
Chunk 6c30398b02368050786ed5c52018f65bddb7164513b99ba8c1e398da87a1bb0d has been loaded from the snapshot cache
Fetching chunk 8556532b1bb0657f312d3386998395c347a270f83ced8ad5bebd8ec79a3796fa
Chunk 8556532b1bb0657f312d3386998395c347a270f83ced8ad5bebd8ec79a3796fa has been loaded from the snapshot cache
Fetching chunk 0047d9c888f8f446b564511c0815c6fd9ac2be4cfc44302299890c08cd3b2a71
Chunk 0047d9c888f8f446b564511c0815c6fd9ac2be4cfc44302299890c08cd3b2a71 has been loaded from the snapshot cache
Fetching chunk 1288acc033e5ab787965b10879e458854397987fd2ff4afc32f12d8cac7d0259
Chunk 1288acc033e5ab787965b10879e458854397987fd2ff4afc32f12d8cac7d0259 has been loaded from the snapshot cache
Fetching chunk 94c4e93aa32f7264be644d5db37fb7277f8b2bd3bef972a85bf9f58b3c5a5003
Chunk 94c4e93aa32f7264be644d5db37fb7277f8b2bd3bef972a85bf9f58b3c5a5003 has been loaded from the snapshot cache
Fetching chunk 756f4912514fc5f61d4b767c49cafe541035c9c35dcaa01fcf7fdd4495bee6e0
Chunk 756f4912514fc5f61d4b767c49cafe541035c9c35dcaa01fcf7fdd4495bee6e0 has been loaded from the snapshot cache
Fetching chunk de00b1d74930023e63a31e19be683335bd082bb6a7f8dc70d85e200359bc6ecf
Chunk de00b1d74930023e63a31e19be683335bd082bb6a7f8dc70d85e200359bc6ecf has been loaded from the snapshot cache
Fetching chunk 32d2de2bb7aa2a124d0b085321365a8606b638067177dc7fca49be419bdfe031
Chunk 32d2de2bb7aa2a124d0b085321365a8606b638067177dc7fca49be419bdfe031 has been loaded from the snapshot cache
Fetching chunk 34f07c4f704b605e20f306b752d8c24109dae3c56e57df04b63868a99fb92325
Chunk 34f07c4f704b605e20f306b752d8c24109dae3c56e57df04b63868a99fb92325 has been loaded from the snapshot cache
Fetching chunk 2b95318ae5d82d070d5d756335cd161ee657953e2413aab2ac0ffb3e49470ba6
Chunk 2b95318ae5d82d070d5d756335cd161ee657953e2413aab2ac0ffb3e49470ba6 has been downloaded
Fetching chunk 357789613f17886c9d3379a5b97fe78c566a146b42eb3440f2c8b8cd98037142
Chunk 357789613f17886c9d3379a5b97fe78c566a146b42eb3440f2c8b8cd98037142 has been downloaded
Fetching chunk 1cfd62f9a44b4afc466fa3baa1be62f1d250ef89c24ef36039711ca2d0cb6419
Chunk 1cfd62f9a44b4afc466fa3baa1be62f1d250ef89c24ef36039711ca2d0cb6419 has been downloaded
Fetching chunk 038f7bd383d1e34b8ddca6c7a1680288f43702958cd1432a76693446a464b5c4
Chunk 038f7bd383d1e34b8ddca6c7a1680288f43702958cd1432a76693446a464b5c4 has been downloaded
Fetching chunk 408b3e60ef7ee2b0826c34f81c2e8168585e36a11fee9ce0c79f8dbe5b4c0906
Chunk 408b3e60ef7ee2b0826c34f81c2e8168585e36a11fee9ce0c79f8dbe5b4c0906 has been downloaded
Fetching chunk 8c05fc7c36442afb28488bd600fb1e4c995b0dad6dd6f8a54861e005ada50d4b

Again, it hangs at this point. Pressing CTRL+C and running the command again, it loads the chunks from the snapshot cache, downloads a few new ones, and hangs again.
Obviously I didn’t want to keep operating by pressing CTRL+C a hundred times over.
After waiting another 5-10 minutes, it did print:

Failed to decrypt the chunk 8c05fc7c36442afb28488bd600fb1e4c995b0dad6dd6f8a54861e005ada50d4b: corrupt input; retrying
Encountered an error (failed to send packet: EOF); retry after 1 second(s)
Attempting public key authentication
Reading the environment variable DUPLICACY_SSH_KEY_FILE
Chunk 8c05fc7c36442afb28488bd600fb1e4c995b0dad6dd6f8a54861e005ada50d4b has been downloaded
Fetching chunk abc2709a2ef5e0ae8591390e4335c478bbd96ebb53ab3f20da5346f8c449dc99

However, now it hangs again.

These are the current stats from the Raspberry Pi box:

top - 02:29:51 up  1:07,  1 user,  load average: 0.05, 0.01, 0.00
Tasks: 168 total,   1 running, 117 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.2 us,  0.2 sy,  0.0 ni, 99.7 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem :   949448 total,   497856 free,    76912 used,   374680 buff/cache

Please describe what you expect to happen (but doesn’t):

Run without hanging, or at least print a clear message about what the issue is.

Please describe what actually happens (the wrong behaviour):

see above

I’d be thankful for any help!

Kind regards
xpac

What version of the CLI are you using? This sounds like a deadlock with sftp that was fixed in v2.3.0.

I literally downloaded the latest release less than 24 hours ago, so it’s 2.3.0.

offtopic but <3 your avatar @xpac!

I also believe it has something to do with the SFTP connection, as I see

Encountered an error (failed to send packet: EOF); retry after 1 second(s)

Could it be that the server on the storage side (the Raspberry Pi, from what I understand) is rate-limiting duplicacy because it thinks it’s being DDoSed?

Not yet exactly sure what’s causing it (there’s no firewall in place), but it’s somehow related to the tinc VPN. Bypassing the VPN (both boxes are in the same LAN now; usually they’re not) solves the issue for now.

What really confuses me is that so far all but the initial backup were carried out over said VPN, and it never caused issues :-/
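Since the stall only happens inside the tunnel, one thing worth ruling out (just a guess, not a confirmed diagnosis) is path-MTU blackholing: tinc encapsulation shrinks the usable MTU, and large packets that get silently dropped make TCP streams stall exactly like this. The sketch below probes decreasing payload sizes with Don’t-Fragment pings; the hostname `raspberry` is taken from the log above, and the probe command is a parameter so the loop can be tested without a real network.

```shell
# Diagnostic sketch: walk ICMP payload sizes downward and report the first
# size that passes with the Don't-Fragment bit set. If only small payloads
# pass over the VPN but all pass on the LAN, the tunnel MTU is suspect.
probe_mtu() {
    host="$1"
    # Default probe: Linux iputils ping; -M do sets DF, -s is payload size,
    # -c 1 sends one probe, -W 2 waits up to 2 seconds for a reply.
    probe="${2:-ping -M do -c 1 -W 2}"
    for size in 1472 1400 1300 1200; do
        if $probe -s "$size" "$host" >/dev/null 2>&1; then
            echo "payload $size passes to $host"
            return 0
        fi
    done
    echo "no probe passed; tunnel may be blackholing large packets"
    return 1
}

# In practice: probe_mtu raspberry
```

If this shows large packets dying only over tinc, lowering tinc’s MTU (or enabling its PMTU discovery) would be the next thing to try.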