My check fails with "Missing Chunks"

I have read through this post: Fix missing chunks

I have cleared the cache, but that did not fix the issue.

I have never run a “prune” task on any of my backups yet, so it can’t be caused by “prune”.

I don’t know how I could manually check whether a missing chunk exists on the storage, as I do not know the path of a chunk. The log of the failed check command does not tell me the path, only the name of the chunk. But I would need to know the path to manually check whether it exists, right?

Just to give some info about how I am using Duplicacy, in case it helps to figure out why “check” fails:

I started using Duplicacy around 1 month ago. The first time I ran a “check” task was on December 7. “Check” worked fine until the early morning of December 15, when it started failing because of “Missing Chunks”. Since then, “check” has never completed successfully.

Duplicacy is still working on my initial backup every day; it will take quite a while before everything finishes uploading.

I am using Google Cloud Storage (Archive).

A chunk’s folder is named after the first two characters of the chunk ID. For example, chunk

02c25aea4621acdd4c8751d5ab7ff438fb47308ce8738f030b7db0741c37ecb5

is in the folder

02

and it’s the file

c25aea4621acdd4c8751d5ab7ff438fb47308ce8738f030b7db0741c37ecb5
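
So, as a minimal sketch (assuming the default single level of two-character nesting described above), you can derive the path from the chunk ID in a shell:

    # Derive a chunk's storage path from its ID, assuming the default
    # one-level nesting (first two characters become the folder name).
    chunk=02c25aea4621acdd4c8751d5ab7ff438fb47308ce8738f030b7db0741c37ecb5
    echo "chunks/${chunk:0:2}/${chunk:2}"
    # -> chunks/02/c25aea4621acdd4c8751d5ab7ff438fb47308ce8738f030b7db0741c37ecb5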

So the first checks did not return errors, even with the first backup still incomplete?

Interesting, thanks! I very much suggest adding that info to the article here: Fix missing chunks

The folders in the “chunks” folder go from “00” to “31”, in nice ascending hexadecimal order. So I have a total of 50 folders.

I have now manually checked whether some of the files “check” complains about exist, and they do not. The first missing chunk “check” complains about is this one:

0e5665992600082539fab2db6a4806a3d56fd49f239b71edd16cf947b07aa237

And in the folder “0e”, the chunk whose file name starts with the “biggest number” is this:

034e0a46280a5988d1aee80bfdcacc0635b5466c6349dcd4b7a07d14ac97aa

So “03” is way less than “56”.
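
For anyone who wants to do this kind of check from the command line instead of the web console, a quick existence test with the gsutil CLI would look like this (the bucket name is a placeholder; use your own):

    # Existence test for the missing chunk (placeholder bucket name).
    gsutil ls gs://my-duplicacy-bucket/chunks/0e/5665992600082539fab2db6a4806a3d56fd49f239b71edd16cf947b07aa237
    # Prints the object URL if the chunk exists, or
    # "CommandException: One or more URLs matched no objects." if not.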

Yes, that’s correct. I started my backup around December 1st. At first I only had “backup” jobs, as I didn’t know I had to manually add a “check” job too. On December 6 I added a “check” to my schedule as well, and it always worked fine until December 15, when the “Missing Chunks” error appeared. At first I thought “check” might not be compatible with the “Parallel” option, which I had ticked for it, but removing the “Parallel” checkbox from the check did not fix the issue. So then I tried the steps described in the “Fix Missing Chunks” article, which also did not help.

But it is there (well, maybe not in a very explicit way…)

Thanks for the suggestion, I just added a note there.

As for your other questions, let’s wait for someone with more experience with the web version.

There should be 256 folders, going from ‘00’ to ‘ff’, under the chunks folder. If any subfolder is missing, it may be that Archive Storage hides or moves the folder due to some rule. I don’t have first-hand experience with this storage class of Google Cloud Storage, nor have I read its docs, but this is the only reasonable explanation.
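
A quick way to verify the count from the command line (again with a placeholder bucket name) is shown below; a complete layout should report 256.

    # Count the two-character prefix folders under chunks/
    # (placeholder bucket name; replace with your own).
    gsutil ls gs://my-duplicacy-bucket/chunks/ | wc -l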

I just noticed I only looked at page 1 of the files/folders. There are more that I overlooked; the console only displayed the first 50 entries on page 1, and I can page through the rest.

Yes, I see now that the folders go from 00 to ff, so those are all there.

So again, it complains that this chunk does not exist:

0e5665992600082539fab2db6a4806a3d56fd49f239b71edd16cf947b07aa237

There is a 0e folder, and there is a huge number of files in that folder, but not that file. One file starts with 565 and another with 567, but none start with 566:

/chunks/0e/56549538a3f8180e79b3f29cc03f115beaa2a6875180982c75bcfbd596967e
/chunks/0e/5672807d398ca6fe1a0841df139b3d807b5b9ef5f9dc9f368333f4d15ba8a7
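
A wildcard listing makes the gap easy to see (placeholder bucket name again):

    # List every file in the 0e folder whose name starts with 56...;
    # the missing 5665... file should be conspicuously absent.
    gsutil ls gs://my-duplicacy-bucket/chunks/0e/56*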

I don’t understand this part. If the initial backup is still in progress, the check command wouldn’t see any snapshot file and shouldn’t report missing chunks. Did you see any file under the snapshots folder on the storage?

There are 2 folders in the snapshots folder. One folder is called “1/”, and the other is called “4/”.

I am backing up 4 different drives with Duplicacy. Two of the drives finished the initial backup quickly, because they are small SSDs. The other 2 drives are bigger HDDs, which take way longer to backup. I am backing up these 4 drives in parallel.

So Duplicacy is now doing incremental backups of the two small SSDs, while it is still working on the initial backups of the two HDDs.

Can you check the prune logs to see if that particular chunk was deleted by a prune operation (Fix missing chunks)?

For the web GUI you can find the prune logs under ~/.duplicacy-web/repositories/localhost/all/.duplicacy/logs.
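
For example, something like this would show whether any prune log mentions that chunk (assuming the prune logs sit in the location above and are named with a prune- prefix, like the backup and check logs):

    # Print the names of any prune logs that mention the missing chunk.
    grep -l 0e5665992600082539fab2db6a4806a3d56fd49f239b71edd16cf947b07aa237 \
        ~/.duplicacy-web/repositories/localhost/all/.duplicacy/logs/prune-*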

I have never, ever, run a “prune” job on a backup with Duplicacy yet. Only “backup” and “check”.

The folder .duplicacy-web/repositories/localhost/all/.duplicacy does not contain a “logs” folder. All the logs I can find are stored in .duplicacy-web\logs. That folder contains a lot of “backup-number” and “check-number” log files, but no “prune” ones.

It could be the backup done on December 15 that introduced the missing chunks. If so, you should see in the log that all previous backups passed the check.

If you’re unsure whether this is the case, you can post the check log here.

The last “check” log where check completed successfully is “check-20201215-030001.log” from 3:01 AM:

2020-12-15 03:00:02.038 INFO SNAPSHOT_CHECK Listing all chunks
2020-12-15 03:01:22.630 INFO SNAPSHOT_CHECK 1 snapshots and 13 revisions
2020-12-15 03:01:22.637 INFO SNAPSHOT_CHECK Total chunk size is 1152G in 342348 chunks
2020-12-15 03:01:22.913 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 1 exist
2020-12-15 03:01:23.526 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 2 exist
2020-12-15 03:01:24.109 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 3 exist
2020-12-15 03:01:24.666 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 4 exist
2020-12-15 03:01:25.328 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 5 exist
2020-12-15 03:01:25.914 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 6 exist
2020-12-15 03:01:26.500 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 7 exist
2020-12-15 03:01:27.063 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 8 exist
2020-12-15 03:01:27.651 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 9 exist
2020-12-15 03:01:28.224 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 10 exist
2020-12-15 03:01:28.740 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 11 exist
2020-12-15 03:01:29.263 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 12 exist
2020-12-15 03:01:30.806 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 13 exist
2020-12-15 03:01:44.623 INFO SNAPSHOT_CHECK 
  snap | rev |                               |   files |    bytes | chunks |    bytes |   uniq |    bytes |   new |    bytes |
     1 |   1 | @ 2020-12-06 20:00 -hash -vss |  841121 | 375,771M |  65172 | 205,003M |   4957 |  18,876M | 65172 | 205,003M |
     1 |   2 |      @ 2020-12-06 22:01  -vss | 3493695 | 920,231M | 142680 | 446,871M |   2047 |   4,560M | 82467 | 260,747M |
     1 |   3 |      @ 2020-12-07 02:00  -vss | 3493335 | 920,539M | 142937 | 447,378M |   1777 |   3,878M |  2775 |   6,133M |
     1 |   4 |      @ 2020-12-07 14:00  -vss | 3495045 | 921,089M | 143244 | 451,287M |   1806 |   3,880M |  4586 |  12,323M |
     1 |   5 |      @ 2020-12-07 16:01  -vss | 3495094 | 926,981M | 144228 | 453,438M |   3044 |   6,584M |  3408 |   7,525M |
     1 |   6 |      @ 2020-12-07 17:01  -vss | 3495761 | 923,303M | 143771 | 452,482M |   2573 |   5,545M |  2848 |   6,209M |
     1 |   7 |      @ 2020-12-07 18:01  -vss | 3495951 | 918,698M | 143019 | 451,105M |   1807 |   4,040M |  1948 |   4,455M |
     1 |   8 |      @ 2020-12-07 19:06  -vss | 3496541 | 922,543M | 144055 | 453,656M |   2293 |   4,979M |  3290 |   7,203M |
     1 |   9 |      @ 2020-12-07 20:23  -vss | 3494367 | 909,802M | 142049 | 447,683M |   2283 |   5,123M |  2651 |   6,021M |
     1 |  10 |      @ 2020-12-08 16:01  -vss | 3497178 | 908,526M | 141928 | 447,687M |   2775 |   6,240M |  3138 |   7,458M |
     1 |  11 |      @ 2020-12-13 19:01  -vss | 3442828 | 817,179M | 130394 | 397,382M |   1779 |   4,467M | 17057 |  37,039M |
     1 |  12 |      @ 2020-12-15 01:09  -vss | 3445679 | 823,334M | 131736 | 400,204M |   1280 |   2,731M |  3362 |   7,873M |
     1 |  13 |      @ 2020-12-15 02:01  -vss | 3446099 | 823,233M | 131755 | 400,783M |   1758 |   4,220M |  1758 |   4,220M |
     1 | all |                               |         |          | 194460 | 572,215M | 194460 | 572,215M |       |          |

On the next “check”, “check-20201215-033001.log”, from 3:33 AM, the check fails because of missing chunks:

2020-12-15 03:30:02.294 INFO SNAPSHOT_CHECK Listing all chunks
2020-12-15 03:32:53.469 INFO SNAPSHOT_CHECK 1 snapshots and 14 revisions
2020-12-15 03:32:53.474 INFO SNAPSHOT_CHECK Total chunk size is 1159G in 344356 chunks
2020-12-15 03:32:53.705 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 1 exist
2020-12-15 03:32:54.241 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 2 exist
2020-12-15 03:32:54.777 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 3 exist
2020-12-15 03:32:55.277 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 4 exist
2020-12-15 03:32:55.814 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 5 exist
2020-12-15 03:32:56.337 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 6 exist
2020-12-15 03:32:56.853 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 7 exist
2020-12-15 03:32:57.355 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 8 exist
2020-12-15 03:32:57.882 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 9 exist
2020-12-15 03:32:58.373 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 10 exist
2020-12-15 03:32:58.843 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 11 exist
2020-12-15 03:32:59.315 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 12 exist
2020-12-15 03:32:59.799 INFO SNAPSHOT_CHECK All chunks referenced by snapshot 1 at revision 13 exist
2020-12-15 03:33:02.915 WARN SNAPSHOT_VALIDATE Chunk e4e9e41353d799de55fa3da82be6c990a2f55733ad597f4e302452127c9962ea referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:02.982 WARN SNAPSHOT_VALIDATE Chunk 4c396583e904185a35fd676444327f4eaec6f49782bb20735cfd39c5273641aa referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.042 WARN SNAPSHOT_VALIDATE Chunk c8895514cb0d41daf2e54eaddb9435a905024a7e1d455358e646bda67d1240ef referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.097 WARN SNAPSHOT_VALIDATE Chunk dc75c3384b6cc9aa3830f3d0e1ff64eccd403577a319193845a7980cef1b7aa2 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.153 WARN SNAPSHOT_VALIDATE Chunk b937266c40d5535ca900218378b8b95f60af09bef37a1c4b4a91af9f9b5b4fd0 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.218 WARN SNAPSHOT_VALIDATE Chunk 0e5665992600082539fab2db6a4806a3d56fd49f239b71edd16cf947b07aa237 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.283 WARN SNAPSHOT_VALIDATE Chunk e6195555cd684e0c2408842b699766b78a3bf5e90f3429858784eb05616d97e9 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.348 WARN SNAPSHOT_VALIDATE Chunk dc6510a1c9bc1905bfb124cef17234f499081ee1c922c2fb8334a83bb698614b referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.405 WARN SNAPSHOT_VALIDATE Chunk 07a43479d7e8008b96aac0b01f88229e4bd66575e2760e543986dd9342b0a81d referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.468 WARN SNAPSHOT_VALIDATE Chunk ce2d9a6c2d3ed0ad7891edb5b3b5e657f1461e6f9ae77f2dee79231d17dc8635 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.526 WARN SNAPSHOT_VALIDATE Chunk 0be36e22ce57a02e5d7eacdafc8b3044ccbab8b0e99787f1a884857f0352d1d2 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.587 WARN SNAPSHOT_VALIDATE Chunk 2a4b289bea387dc34ff04efcb4d4edb5d28991b319cb64f727acb86cba232d06 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.652 WARN SNAPSHOT_VALIDATE Chunk 41812459ee93575897f84190fbc3f1eb8806238f9b71824bb63b2c7bdcbeee1e referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.715 WARN SNAPSHOT_VALIDATE Chunk c73f69884763c6e1cf4886b90f70662a237b248e9314421c78f4098cbe102bdc referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.784 WARN SNAPSHOT_VALIDATE Chunk 6ae999236d5c69c459e3b258c6ea3cfcdbc0cab645b78990ab9dcf25ae44f321 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.842 WARN SNAPSHOT_VALIDATE Chunk 3e8ba9ad287232b189147ce47e370f8ec16421647e6403eb9286c13e1cd2df1a referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.900 WARN SNAPSHOT_VALIDATE Chunk 8ef0d569025488ca8738193d804c5d5764696ac4a96eb5b2e13453f446422c95 referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:03.957 WARN SNAPSHOT_VALIDATE Chunk dd32c0bd0757785a976bfb21efa9d44426f9417ba37115a7c3bf39d8cfcaa28a referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:04.019 WARN SNAPSHOT_VALIDATE Chunk c7a7ed6b225f7f1a2939f367544cd5492cad71dad3c1df660488643e4d82cd9d referenced by snapshot 1 at revision 14 does not exist
2020-12-15 03:33:04.023 WARN SNAPSHOT_CHECK Some chunks referenced by snapshot 1 at revision 14 are missing
2020-12-15 03:33:04.023 ERROR SNAPSHOT_CHECK Some chunks referenced by some snapshots do not exist in the storage
Some chunks referenced by some snapshots do not exist in the storage

In between those two checks, one of the SSD backups (a backup of my full C: drive) successfully finished an incremental backup at 3:16 AM. That backup log ends with this:

2020-12-15 03:16:31.392 INFO BACKUP_END Backup for C:\ at revision 14 completed
2020-12-15 03:16:31.392 INFO BACKUP_STATS Files: 3447608 total, 821,440M bytes; 2844 new, 4,513M bytes
2020-12-15 03:16:31.392 INFO BACKUP_STATS File chunks: 171300 total, 849,377M bytes; 787 new, 4,305M bytes, 1,774M bytes uploaded
2020-12-15 03:16:31.392 INFO BACKUP_STATS Metadata chunks: 212 total, 1,096M bytes; 92 new, 519,177K bytes, 149,965K bytes uploaded
2020-12-15 03:16:31.392 INFO BACKUP_STATS All chunks: 171512 total, 850,474M bytes; 879 new, 4,812M bytes, 1,921M bytes uploaded
2020-12-15 03:16:31.392 INFO BACKUP_STATS Total running time: 00:16:29
2020-12-15 03:16:31.392 WARN BACKUP_SKIPPED 3 directories and 1 file were not included due to access errors
2020-12-15 03:16:31.817 INFO VSS_DELETE The shadow copy has been successfully deleted

So since the missing chunks are in “Revision 14”, and that backup created “Revision 14”, I guess that backup introduced the missing chunks, right?

Yes, revision 14 is clearly broken. I don’t know what could cause that, but if you just want to fix the problem, you can manually delete the file snapshots/1/14 from the storage.
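
With Google Cloud Storage, that would be something like the following (placeholder bucket name; double-check the path before deleting):

    # Delete the broken revision file; after this, check will no longer
    # look for revision 14's chunks (placeholder bucket name).
    gsutil rm gs://my-duplicacy-bucket/snapshots/1/14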


It seems that I have a related problem. I back up to my local HDD. Everything has gone fine so far, but the check fails every time, even though it seems like all the chunks are there. The log says:

2023-04-28 17:14:44.501 ERROR SNAPSHOT_CHECK 1 chunks have a size of 0
1 chunks have a size of 0

What does that mean? It doesn’t change when I add revisions.
Here is the whole log:

*"Running check command from /Users/user1/.duplicacy-web/repositories/localhost/all*
*Options: [-log check -storage DateisicherungFestplatteS2 -a -tabular]*
*2023-04-28 17:13:04.529 INFO STORAGE_SET Storage set to /Volumes/userS2*
*2023-04-28 17:13:04.587 INFO SNAPSHOT_CHECK Listing all chunks*
*2023-04-28 17:14:21.237 WARN SNAPSHOT_CHECK Chunk 7ce0a1401a9669512d0334e1563c711834906d45087a3f39510d8dc9298b41e8 has a size of 0*
*2023-04-28 17:14:22.807 INFO SNAPSHOT_CHECK 2 snapshots and 43 revisions*
*2023-04-28 17:14:22.812 INFO SNAPSHOT_CHECK Total chunk size is 2477G in 519720 chunks*
*2023-04-28 17:14:24.017 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 1 exist*
*2023-04-28 17:14:25.054 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 2 exist*
*2023-04-28 17:14:26.061 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 3 exist*
*2023-04-28 17:14:27.050 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 4 exist*
*2023-04-28 17:14:28.048 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 5 exist*
*2023-04-28 17:14:29.047 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 6 exist*
*2023-04-28 17:14:30.049 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 7 exist*
*2023-04-28 17:14:31.043 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 8 exist*
*2023-04-28 17:14:32.041 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 9 exist*
*2023-04-28 17:14:33.036 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 10 exist*
*2023-04-28 17:14:34.029 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 11 exist*
*2023-04-28 17:14:35.032 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 12 exist*
*2023-04-28 17:14:36.026 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 13 exist*
*2023-04-28 17:14:37.027 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 14 exist*
*2023-04-28 17:14:38.036 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 15 exist*
*2023-04-28 17:14:39.037 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 16 exist*
*2023-04-28 17:14:40.034 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 17 exist*
*2023-04-28 17:14:41.040 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 18 exist*
*2023-04-28 17:14:42.034 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 19 exist*
*2023-04-28 17:14:43.036 INFO SNAPSHOT_CHECK All chunks referenced by snapshot BilderHDDS2 at revision 20 exist*
*2023-04-28 17:14:43.098 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 1 exist*
*2023-04-28 17:14:43.161 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 2 exist*
*2023-04-28 17:14:43.224 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 3 exist*
*2023-04-28 17:14:43.283 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 4 exist*
*2023-04-28 17:14:43.339 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 5 exist*
*2023-04-28 17:14:43.394 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 6 exist*
*2023-04-28 17:14:43.449 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 7 exist*
*2023-04-28 17:14:43.506 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 8 exist*
*2023-04-28 17:14:43.562 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 9 exist*
*2023-04-28 17:14:43.617 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 10 exist*
*2023-04-28 17:14:43.672 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 11 exist*
*2023-04-28 17:14:43.727 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 12 exist*
*2023-04-28 17:14:43.784 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 13 exist*
*2023-04-28 17:14:43.841 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 14 exist*
*2023-04-28 17:14:43.896 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 15 exist*
*2023-04-28 17:14:43.952 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 16 exist*
*2023-04-28 17:14:44.007 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 17 exist*
*2023-04-28 17:14:44.067 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 18 exist*
*2023-04-28 17:14:44.124 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 19 exist*
*2023-04-28 17:14:44.178 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 20 exist*
*2023-04-28 17:14:44.233 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 21 exist*
*2023-04-28 17:14:44.288 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 22 exist*
*2023-04-28 17:14:44.501 INFO SNAPSHOT_CHECK All chunks referenced by snapshot DokumenteHDDS2 at revision 23 exist*
*2023-04-28 17:14:44.501 ERROR SNAPSHOT_CHECK 1 chunks have a size of 0*
*1 chunks have a size of 0"*

Does this mean my backup is corrupted?
I hope you can help me.
Thanks, Paul

This is the problem right here: a local HDD cannot guarantee data consistency.

This should never happen with reliable storage. Zero-size chunks could have been created as a result of an abrupt shutdown/system hang/panic/reset, filesystem corruption, or the disk being full at the time of the backup.

You can try to repair those:

  1. Delete all zero-size chunks (see the sketch after this list).
  2. Create another temporary snapshot ID and make a single backup: the hope here is that Duplicacy may end up producing the same chunks and upload them to the storage, thus re-creating the deleted zero-size ones.
  3. Delete the new snapshot ID; we don’t need it anymore.
  4. Run check again on the old snapshot ID.
  5. The remaining missing chunks, if any, are unfixable. You will lose access to the version history of the files that rely on data in those chunks.
  6. Optional: run prune -exhaustive to get rid of orphaned chunks from the temporary snapshot.
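
For step 1, on a local disk something like this works (the /Volumes/userS2 path is taken from your log; list first, and only delete once the list looks right):

    # Step 1: list all zero-size chunk files under the storage first...
    find /Volumes/userS2/chunks -type f -size 0
    # ...then delete them.
    find /Volumes/userS2/chunks -type f -size 0 -delete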

I strongly recommend not backing up to a single hard drive, especially an external one. It’s a matter of when, not if, you lose data.

Okay, thanks, I got it. It’s not a good idea to use it for local backup.
From time to time I get this error message from my cloud storage. What does it mean?

Running check command from /Users/xxx/.duplicacy-web/repositories/localhost/all
Options: [-log check -storage SicherungGoogleDrive -threads 100 -a -tabular]
2023-04-29 12:02:07.634 INFO STORAGE_SET Storage set to gcd://Sicherungen Duplicacy
2023-04-29 12:02:11.141 INFO SNAPSHOT_CHECK Listing all chunks
2023-04-29 12:02:38.942 INFO GCD_RETRY [0] Maximum number of retries reached (backoff: 100, attempts: 15)
2023-04-29 12:02:38.942 ERROR LIST_FILES Failed to list the directory chunks/: googleapi: Error 403: Quota exceeded for quota metric 'Queries' and limit 'Queries per minute' of service 'drive.googleapis.com' for consumer 'project_number:xxxx'., rateLimitExceeded
Failed to list the directory chunks/: googleapi: Error 403: Quota exceeded for quota metric 'Queries' and limit 'Queries per minute' of service 'drive.googleapis.com' for consumer 'project_number:xxxx'., rateLimitExceeded

I run the check after the prune on my regular schedule. It just happens from time to time.
Best regards.

Google Drive is rate-limiting you. Reduce the number of threads to 4 at most.
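
In other words, drop the -threads value in the check job’s options from 100 to 4; the equivalent CLI invocation would be:

    # The same check with a Google Drive-friendly thread count; in the
    # web GUI, edit the -threads value in the check job's options instead.
    duplicacy check -storage SicherungGoogleDrive -threads 4 -a -tabular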

That’s not what I said.

Thanks for the hint! That was the point! It is surprising, because I can run the backup task with -threads 100, and it is so much faster than with -threads 4.