I really need the option of checking files after upload in the web version, especially when archiving to a remote server. Do you plan to add it in the future?
Wouldn’t it be enough to run `check` daily?
Does this command really check the status of the uploaded files? As far as I can tell, it only checks all revisions.
Yes, and in 99% of cases I expect that to be enough.
If you want 100% assurance that everything is in order, then run the check with the `-files` option, but this will download EVERYTHING from the storage each time in order to compare it with the files on your computer. (I think that’s a waste of time, processing power, and bandwidth.)
I don’t think so. For the initial backup, yes, but if the check only downloads the latest changes it shouldn’t be a problem.
for example:
- full backup: 100GB uploaded = 100GB downloaded to check that the files on the remote are fine
- incremental backup: 20MB of changes = 20MB downloaded to check (okay, a bit more or less; I don’t know exactly how the backup works)
But you don’t need to download all files to check that everything is fine on the remote, just the changes you uploaded. This should also perform well when run in parallel: while the backup is still running and uploading files, you could already download and check the files that have been uploaded.
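A rough sketch of that parallel idea: one thread "uploads" files while a second thread downloads and verifies what has already been uploaded. All the names here (`FILES`, `uploader`, `verifier`) are hypothetical; this is not a Duplicacy API, just an illustration of the pipeline.

```python
# Hypothetical producer/consumer sketch: verify files while the backup runs.
import hashlib
import queue
import threading

FILES = {"a.bin": b"data-a", "b.bin": b"data-b"}  # path -> local content
uploaded: queue.Queue = queue.Queue()

def uploader() -> None:
    for path in FILES:           # pretend each file is uploaded here
        uploaded.put(path)       # announce it as ready to verify
    uploaded.put(None)           # sentinel: backup finished

def verifier(results: dict) -> None:
    while (path := uploaded.get()) is not None:
        remote = FILES[path]     # stand-in for downloading the remote copy
        # compare the remote copy's hash against the local file's hash
        results[path] = (hashlib.sha256(remote).digest()
                         == hashlib.sha256(FILES[path]).digest())

results: dict = {}
t_up = threading.Thread(target=uploader)
t_check = threading.Thread(target=verifier, args=(results,))
t_up.start(); t_check.start()
t_up.join(); t_check.join()
print(results)                   # {'a.bin': True, 'b.bin': True}
```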
I don’t know if this should be a feature request. If so, I can create a new post for it.
The way `check -files` currently works is that it downloads everything, for each revision. This happens because each revision is a “full image” of all the files you’re backing up, and there’s no deduplication logic in the check command.
So for your example we would have the following:
- 1st revision: uploaded 100GB, revision size: 100GB; total storage size: 100GB
- 2nd revision: uploaded 20MB, revision size: 100GB (I’m assuming here some files were just replaced); total storage: 100GB + 20MB
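Spelling out the arithmetic behind those numbers (plain Python; the dict layout is just for illustration and has nothing to do with Duplicacy’s storage format):

```python
# Compare total storage, today's per-revision check cost, and a
# hypothetical changes-only check for the two-revision example above.
GB = 1.0
MB = 1.0 / 1024

revisions = [
    {"uploaded": 100 * GB, "revision_size": 100 * GB},  # 1st: full backup
    {"uploaded": 20 * MB, "revision_size": 100 * GB},   # 2nd: 20MB changed
]

total_storage = sum(r["uploaded"] for r in revisions)
naive_check = sum(r["revision_size"] for r in revisions)
changes_only = revisions[0]["revision_size"] + revisions[1]["uploaded"]

print(f"total storage:      {total_storage:.2f} GB")  # 100.02 GB
print(f"check -files today: {naive_check:.2f} GB")    # 200.00 GB
print(f"changes-only check: {changes_only:.2f} GB")   # 100.02 GB
```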
The first `check -files` will download all 100GB of chunks from the first revision. The second `check -files` will download all 100GB of chunks from the second revision, even though all but 20MB of it is the same.
I don’t know how difficult it would be to keep a list of already-checked chunks.
I think this would be a fantastic way to fix the behaviour of `check -files` without a revision being specified, but I don’t think it would easily work, because it’s a file-based check and not a chunk-based check.
Some of those chunks may be used in other files.
It might be possible if, instead of keeping a list of chunks, it worked out a diff between revisions and only checked the files that changed since the previously-checked revision.
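The already-checked-chunks idea could look roughly like this. The data model here (a revision as a mapping of file path to ordered chunk hashes) is an assumption for illustration, not Duplicacy’s real snapshot format:

```python
# Sketch: skip chunks that an earlier check already verified.
from typing import Dict, List, Set

Revision = Dict[str, List[str]]  # path -> chunk hashes (assumed model)

def chunks_to_verify(revision: Revision, verified: Set[str]) -> Set[str]:
    """Return only the chunks of this revision not verified earlier."""
    needed = {h for hashes in revision.values() for h in hashes}
    return needed - verified

verified: Set[str] = set()

rev1 = {"a.bin": ["c1", "c2", "c3"], "b.bin": ["c4"]}
todo = chunks_to_verify(rev1, verified)  # first check: all 4 chunks
verified |= todo

rev2 = {"a.bin": ["c1", "c2", "c5"], "b.bin": ["c4"]}  # one chunk changed
todo = chunks_to_verify(rev2, verified)  # second check: only {"c5"}
```

The objection above still applies, though: a set of verified chunks says nothing about whether a given *file* assembles correctly, which is why a file-level diff between revisions might be the better unit.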
Otherwise, I think it might be prudent to recommend (and note in the documentation) that `check -files` should be used with `-r`.
Or, change the default behaviour to automatically check only the last revision, unless another `-r` or `-all` is specified?
> It might be possible if, instead of keeping a list of chunks, it worked out a diff between revisions and only checked the files that changed since the previously-checked revision.
That’s what I think too. Only the files that get uploaded should be downloaded and checked, to see whether a file was corrupted during upload.