Unexpected Network Error Occurred


#1

Hello,

I’m using Duplicacy to upload my first backup to Backblaze B2, and I’m running into an issue where the upload randomly fails with the error below and the backup stops. I’m backing up my QNAP NAS over a UNC path.

When I configured the backup, I specified the path as \\nasserver\directory and it picks it up just fine. When I kick off the backup, it indexes correctly and starts to upload chunks, but randomly I’ll get the following error:

Failed to read 0 bytes: read \\blah\blah\blah\blah.file An unexpected network error occurred.

I’ve tried the backup from several different computers and it didn’t help. I also verified that all my network settings are correct. I can’t seem to get a full backup to complete, and I’ve been trying for a few days now.


#2

Try running with duplicacy -d -log backup <other settings> and paste the detailed log here – maybe that helps.


#3

Thank you, I’ll try that


#4
2019-05-05 14:13:26.600 INFO UPLOAD_PROGRESS Uploaded chunk 62950 size 1732425, 17.59MB/s 16:29:58 22.7%
2019-05-05 14:13:26.658 DEBUG BACKBLAZE_CALL URL request 'HEAD https://f002.backblazeb2.com/blahblahblah' returned status code 404
2019-05-05 14:13:26.712 DEBUG BACKBLAZE_LIST b2_download_file_by_name did not return headers
2019-05-05 14:13:29.004 ERROR CHUNK_MAKER Failed to read 0 bytes: read \\blahblah\blah\blahblah An unexpected network error occurred.
2019-05-05 14:13:29.228 INFO INCOMPLETE_SAVE Incomplete snapshot saved to C:\duplicacy/.duplicacy/incomplete

This is the error I’m getting…


#5

That looks a bit weird.
What do you think about waiting a few days until @gchen releases the new CLI/GUI versions of Duplicacy and retrying? (Update on the Web GUI)

I saw that he reworked the Backblaze backend, so that could simply solve your problem without the need to debug the issue any further.


#6

Sure, I don’t mind waiting.


#7

This has something to do with your network – Duplicacy couldn’t read the file due to some temporary network issues. The new version won’t fix this issue.

My suggestion is to run Duplicacy directly on your QNAP NAS to back up the directory.


#8

This is a great suggestion. Just to confirm: running Duplicacy directly on a QNAP device is supported?


#9

Yep. Download the Linux ARM build from Release Duplicacy 2.2.0 Command Line Version · gilbertchen/duplicacy · GitHub.


#10

Is there any way I can increase the upload throughput? When I was running Duplicacy from a server accessing the NAS as a file share, I was getting 15 MB/s up. Right now I’m getting 1.3 to 1.5 MB/s running it directly on the QNAP.

I used the -threads 16 argument in my backup command on the QNAP…


#11

Well, after a week of uploading I was nearly 70% done and got this error…

Failed to upload the chunk 5e52986b551165a7384cb2f753ed976251c0d52e5ba3af70fe21d0acf1ae862c: Maximum backoff reached

I restarted the process and it’s starting from the first chunk. 9 more days to upload :man_facepalming:

For some reason Duplicacy isn’t storing my application key and other credentials and asks for them every time I kick it off. Uploads are painfully slow, going around 1.5 MB/s.

help.


#12

You should try version 2.2.0, which has a reworked B2 backend (including a better retry mechanism that is more resilient to temporary B2 errors like the ones you’ve experienced).
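For context, “Maximum backoff reached” means the client retried the failed upload with increasing delays until it hit a cap and then gave up. Here is a rough Python sketch of that general retry pattern – the function name, parameters, and limits are illustrative, not Duplicacy’s actual code:

```python
import random
import time

def upload_with_backoff(upload, base_delay=1.0, max_delay=64.0):
    """Retry `upload` (a callable that raises IOError on transient
    failure) with exponential backoff plus random jitter.

    Gives up once the next delay would exceed `max_delay` -- roughly
    the point where a client would report "Maximum backoff reached".
    Illustrative sketch only, not Duplicacy's implementation.
    """
    delay = base_delay
    while True:
        try:
            return upload()
        except IOError:
            if delay > max_delay:
                raise RuntimeError("Maximum backoff reached")
            # Sleep between `delay` and `2 * delay`, then double the cap.
            time.sleep(delay + random.uniform(0, delay))
            delay *= 2
```

A more resilient backend can raise the retry count or cap, but if the failures persist long enough, any bounded backoff will eventually give up the same way.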

As for the -threads 16 option, that may be too many, especially if your QNAP doesn’t have a powerful CPU. Run the benchmark command to determine the optimal value.