Based on this, can I be sure that everything is fine with the backup? The check takes less than two minutes.
Duplicati takes a very long time to check the same archive. I understand that Duplicati works on a different principle, but I want to be sure that when the archive is actually needed, it will really restore.
Running check command from C:\Users\arty/.duplicacy-web/repositories/localhost/all
Options: [-log check -storage test_local -a -tabular]
2019-06-11 01:00:54.112 INFO STORAGE_SET Storage set to D:/TEST/DUPLICACY_WEB/TEST_BACKUP
2019-06-11 01:00:54.245 INFO SNAPSHOT_CHECK Listing all chunks
2019-06-11 01:02:14.091 INFO SNAPSHOT_CHECK 1 snapshots and 4 revisions
2019-06-11 01:02:14.092 INFO SNAPSHOT_CHECK Total chunk size is 228,022M in 49988 chunks
2019-06-11 01:02:14.652 INFO SNAPSHOT_CHECK All chunks referenced by snapshot test_backup at revision 1 exist
2019-06-11 01:02:15.015 INFO SNAPSHOT_CHECK All chunks referenced by snapshot test_backup at revision 2 exist
2019-06-11 01:02:15.357 INFO SNAPSHOT_CHECK All chunks referenced by snapshot test_backup at revision 3 exist
2019-06-11 01:02:15.994 INFO SNAPSHOT_CHECK All chunks referenced by snapshot test_backup at revision 4 exist
2019-06-11 01:02:18.795 INFO SNAPSHOT_CHECK
snap        | rev | timestamp                | files  | bytes    | chunks | bytes    | uniq | bytes   | new   | bytes    |
test_backup |   1 | @ 2019-06-09 12:54 -hash | 173994 | 282,483M | 49936  | 227,922M | 16   | 23,178K | 49936 | 227,922M |
test_backup |   2 | @ 2019-06-09 21:47       | 173930 | 282,484M | 49937  | 227,922M | 0    | 0       | 17    | 23,442K  |
test_backup |   3 | @ 2019-06-10 01:00       | 173930 | 282,484M | 49937  | 227,922M | 0    | 0       | 0     | 0        |
test_backup |   4 | @ 2019-06-11 01:00       | 174051 | 282,531M | 49956  | 227,977M | 35   | 79,072K | 35    | 79,072K  |
test_backup | all |                          |        |          | 49988  | 228,022M | 49988| 228,022M|       |          |
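For context: as far as I understand, the default check shown above only verifies that every chunk referenced by each snapshot *exists* in the storage; it does not download or decode the chunk data, which is why it finishes in under two minutes. The CLI also documents slower, deeper modes. A sketch of the three levels, assuming the `duplicacy` binary is on PATH and using the `test_local` storage name from the log above:

```shell
# Level 1 (what the log above shows): verify that every chunk referenced
# by each snapshot exists in the storage. Fast -- list-and-compare only.
duplicacy check -storage test_local -a -tabular

# Level 2: verify the integrity of every chunk by downloading and
# decoding it. Slow -- reads the entire ~228 GB of chunk data.
duplicacy check -storage test_local -a -chunks

# Level 3: verify every file in a given revision by reconstructing it
# from its chunks and checking file hashes. Slowest, but closest to
# proving "the archive will really restore" (here: revision 4).
duplicacy check -storage test_local -r 4 -files
```

So a fast existence check and Duplicati's long verification are not measuring the same thing; the `-chunks` or `-files` modes would be the closer comparison.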