I used gdfs as described in this post. It worked smoothly and the speed was great, until gdfs started reporting some vague errors and some files appeared in the “lost_and_found” folder under C:\…\Local\DriveFS.
I checked the folder and found 52 chunks in it: 21 of them had been uploaded to the remote correctly, but 31 were missing there. I decided to leave them as is.
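For anyone curious, a comparison like that can be done roughly as follows (all paths are placeholders, and it assumes DriveFS kept the original chunk file names):

```powershell
# Chunk file names that DriveFS dropped into lost_and_found
# (fill in the real path under C:\...\Local\DriveFS)
$local = Get-ChildItem "C:\...\Local\DriveFS\lost_and_found" -Recurse -File |
         Select-Object -ExpandProperty Name

# Chunk file names that actually made it to the remote,
# listed through the Drive File Stream mount (placeholder path)
$remote = Get-ChildItem "G:\My Drive\backups\chunks" -Recurse -File |
          Select-Object -ExpandProperty Name

# Chunks present locally but missing on the remote
Compare-Object $local $remote |
    Where-Object SideIndicator -eq '<=' |
    Select-Object -ExpandProperty InputObject
```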
I then started a new initial backup with a changed snapshot ID, going directly to Google Drive (“gcd://” as the storage). According to the log file, it seemed to upload exactly 31 chunks.
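For context, starting that new backup looked roughly like this (the snapshot ID and storage path are placeholders; gcd:// storages also need a Google Drive token file set up):

```powershell
# New snapshot ID pointing straight at Google Drive instead of the DriveFS mount
duplicacy init my-new-id "gcd://backups"

# Initial backup; as I understand it, chunks already present on the storage
# are skipped, which would explain why only the 31 missing ones went up
duplicacy backup -stats
```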
Then I found some zero-size files with .tmp extensions on the GDrive remote. I assumed these were partially uploaded chunks. The file names (before the extensions) had no duplicates anywhere.
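They are easy to list as zero-byte entries, for example through the Drive mount (placeholder path):

```powershell
# Zero-byte .tmp leftovers under the storage folder
Get-ChildItem "G:\My Drive\backups" -Recurse -Filter *.tmp |
    Where-Object Length -eq 0 |
    Select-Object FullName, LastWriteTime
```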
I started `check -chunks`, and it ran for half a day before returning this error:
"ERROR DOWNLOAD_CHUNK Failed to download the chunk chunkname: stream error: stream ID 90737; INTERNAL_ERROR; received from peer"
There is no such chunk on GDrive or under the lost_and_found folder.
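For completeness, a search for that chunk name can be done along these lines (paths are placeholders, and "chunkname" stands for the name from the error):

```powershell
# Look for the chunk name both in lost_and_found and on the mounted storage
$chunk = "chunkname"   # placeholder for the actual name from the error
Get-ChildItem "C:\...\Local\DriveFS\lost_and_found",
              "G:\My Drive\backups\chunks" -Recurse -File |
    Where-Object Name -like "*$chunk*"
```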
I ran the check again and this time it reported that all chunks were successfully verified.
My questions:
- Does `check -chunks` compare the uploaded chunks with the actual corresponding files in the repository? Does it use checksums? (The exact commands I mean are shown below the questions.)
- Does `check` take into account the filters file under the .duplicacy folder?
- Can I do anything else to be completely sure that the backup is solid?
- Is it necessary?
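For reference, these are the two invocations my questions refer to, with my (possibly wrong) understanding of each in the comments:

```powershell
# Plain check: as far as I understand, it only confirms that every chunk
# referenced by the snapshots exists on the storage
duplicacy check

# The run that took half a day: also downloads the chunks to verify them
duplicacy check -chunks
```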
Thank you, gentlemen.