Help: Duplicacy didn't work for the last year!

It pruned 275 days ago: https://i.imgur.com/lswYu1d.png

And it says the backup completed successfully, but now I checked the logs and it actually does nothing… the backup finishes in 2 seconds.

This is the whole log:

Running backup command from D:/Cache/localhost/0 to back up C:/Backup Symbolic Links
Options: [-log backup -storage Duplicacy -stats]
2020-11-13 17:33:03.293 INFO REPOSITORY_SET Repository set to C:/Backup Symbolic Links
2020-11-13 17:33:03.360 INFO STORAGE_SET Storage set to gcd://Duplicacy Backup
2020-11-13 17:33:07.382 INFO BACKUP_START Last backup at revision 4894 found
2020-11-13 17:33:07.382 INFO BACKUP_INDEXING Indexing C:\Backup Symbolic Links
2020-11-13 17:33:07.382 INFO SNAPSHOT_FILTER Parsing filter file \\?\D:\Cache\localhost\0\.duplicacy\filters
2020-11-13 17:33:07.382 INFO SNAPSHOT_FILTER Loaded 8 include/exclude pattern(s)
2020-11-13 17:33:08.825 INFO BACKUP_END Backup for C:\Backup Symbolic Links at revision 4895 completed
2020-11-13 17:33:08.825 INFO BACKUP_STATS Files: 3 total, 41K bytes; 0 new, 0 bytes
2020-11-13 17:33:08.825 INFO BACKUP_STATS File chunks: 2 total, 3,971K bytes; 0 new, 0 bytes, 0 bytes uploaded
2020-11-13 17:33:08.825 INFO BACKUP_STATS Metadata chunks: 3 total, 787 bytes; 0 new, 0 bytes, 0 bytes uploaded
2020-11-13 17:33:08.825 INFO BACKUP_STATS All chunks: 5 total, 3,971K bytes; 0 new, 0 bytes, 0 bytes uploaded
2020-11-13 17:33:08.825 INFO BACKUP_STATS Total running time: 00:00:02

As you can tell, I'm backing up a folder of symbolic links. I've been doing that for multiple years already, only to find out it stopped working almost a year ago.
Is there any way I can get it working again with my current Google Drive backup structure, as it's multiple TBs?

Thanks!

You might be best using or mapping it as a Share. I'd say that would be more reliable for you.

The image you posted shows a check 3 days ago. What is the content of this check log?

Where can I find the logs?

Click on the “Checked” in front of the job:

(screenshot of the web GUI at 127.0.0.1)

I do not have that option or button anywhere.

Do you have a scheduled ‘check’ job under the Schedule tab? Sounds like it’s been deleted or the schedule has been set to never run (Mon-Fri unticked). If you don’t have such a job, you need to create one.

No, it has a schedule that starts every x hours or so.

I deleted the backup schedule and remade it, and now it's stuck on indexing, and the log file won't open (permanently loading browser tab).

Which schedule did you delete? Your screenshot - even though you deleted it - shows a schedule with just a prune operation, that runs hourly. It probably isn't advisable to run it more often than once a day, as otherwise runs could overlap each other(?), and I can imagine it might get stuck.

Now you describe a running backup job. The browser isn’t very good at loading the log while in progress, so probably best to let it run.

You can probably load the actual log file (found in .duplicacy-web\logs) in notepad and keep refreshing/reloading.

It's already been stuck on indexing for over 5 hours now. However, the log keeps adding new entries that are named “POST /get_backup_status”.

Well it’s probably having to re-index everything because your previous snapshot wasn’t able to follow the symbolic links.

If it seems really stuck, it shouldn't be a problem to forcefully kill the duplicacy_*_2.7.1 process, but you may also want to restart the GUI process too (how you go about that depends on whether it's installed as a service or not).
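If you'd rather find the process programmatically, here's a minimal Python sketch using the third-party psutil package (my choice of tool, not something Duplicacy ships):

```python
# Minimal sketch: find and force-kill the CLI worker by the name
# pattern mentioned above. Requires: pip install psutil
import fnmatch
import psutil

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"] or ""
    if fnmatch.fnmatch(name, "duplicacy_*_2.7.1*"):
        proc.kill()  # forceful kill, as suggested above
```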

But re-check your logs - you should have duplicacy_web.log (for the GUI) and backup-*.log for the actual backup, which should be more informational. You could try adding -v or -d to the global options for the job, to get more detail about what it’s really doing.

Now I get this error…
2020-11-25 16:12:02.785 INFO BACKUP_LIST Listing all chunks
2020-11-25 22:08:29.786 INFO GCD_RETRY [0] Maximum number of retries reached (backoff: 64, attempts: 15)
2020-11-25 22:08:29.786 ERROR UPLOAD_CHUNK Failed to upload the chunk 08d8a41b8bbca806e6605339c1b351e18e954b6cffe05a63f44fec726ccdf950: googleapi: Error 403: The limit for this folder’s number of children (files and folders) has been exceeded., numChildrenInNonRootLimitExceeded
2020-11-25 22:08:29.788 INFO INCOMPLETE_SAVE Incomplete snapshot saved to D:\Cache\localhost\0/.duplicacy/incomplete

Sounds like you’re backing up to a Google Drive storage that was initialised with an older version of Duplicacy, which didn’t support nested chunk folders at the time. There’s a limit of 500K files per folder, which nesting practically mitigates.
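To picture what the nesting buys you, here's a minimal Python sketch of how a chunk ID maps to a storage path (the single nesting level is my assumption; the "nesting" file in the storage root records the levels actually in use). With 256 possible two-character prefixes, each subfolder stays far below the 500K limit:

```python
# Minimal sketch: mapping a chunk ID to its storage path. One level
# of nesting is assumed here; the "nesting" file in the storage root
# records what the storage actually uses.

def chunk_path(chunk_id: str, levels: int = 1) -> str:
    """Peel off `levels` two-hex-character prefixes as subfolders."""
    parts = [chunk_id[2 * i:2 * i + 2] for i in range(levels)]
    return "/".join(["chunks", *parts, chunk_id[2 * levels:]])

# The chunk from the error message above:
cid = "08d8a41b8bbca806e6605339c1b351e18e954b6cffe05a63f44fec726ccdf950"
print(chunk_path(cid, 0))  # flat (old storage): chunks/08d8a41b...
print(chunk_path(cid, 1))  # nested: chunks/08/d8a41b...
```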

I have the (second) nesting file next to the config in Google Drive, and I still get the same error:
2020-11-27 14:42:57.771 ERROR UPLOAD_CHUNK Failed to upload the chunk 08d8a41b8bbca806e6605339c1b351e18e954b6cffe05a63f44fec726ccdf950: googleapi: Error 403: The limit for this folder’s number of children (files and folders) has been exceeded., numChildrenInNonRootLimitExceeded

I checked the folder, and it seems all the chunks are in a single folder, not multiple.

I’m guessing here - because you may already be at the 500K limit - that Duplicacy is unable to create the 00 to ff (256 in total) subdirectories within the chunk directory. Thus you may have to manually move a handful (actually about 256) files into a nested folder structure, in order to free up the limit…

Say if you have a chunk file called 00e816dda5f7aae26709180a20fb4870458084ec49bfc9a21d79724fce117b86. This would need to be renamed to e816dda5f7aae26709180a20fb4870458084ec49bfc9a21d79724fce117b86 (omit the first two characters) and placed in a subfolder called 00.

Unfortunately, you may have difficulty creating this subfolder within the chunks directory, since you're at the limit. 🙂 So you might have to create this somewhere else on your Google Drive, move the chunk(s) inside, and move the folder back into the root of the chunks directory.

You probably have a good number of chunks that begin with 00, so you could start with that folder and move ~256 or so chunks that begin with 00 into it. Then you'd manually have to rename each file by removing the leading 00. Move the 00 directory into chunks and hopefully Duplicacy can then work with the storage.
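Here's a rough Python sketch of that shuffle, assuming (a big assumption) that the storage is reachable as a local folder, e.g. through Google Drive for desktop; both paths below are hypothetical, and doing it by hand in the Drive web UI works just as well:

```python
# Rough sketch of the manual shuffle described above, ASSUMING the
# storage is mounted as a local folder (e.g. Google Drive for desktop).
# Both paths are hypothetical; test on a small batch first.
from pathlib import Path

CHUNKS = Path("G:/My Drive/Duplicacy Backup/chunks")  # hypothetical mount
STAGING = Path("G:/My Drive/staging")  # created OUTSIDE chunks/, see above

def stage_prefix(prefix: str, limit: int = 256) -> None:
    """Move up to `limit` chunks starting with `prefix` into a staging
    folder named after the prefix, stripping the prefix from each name."""
    dest = STAGING / prefix
    dest.mkdir(parents=True, exist_ok=True)
    moved = 0
    for f in CHUNKS.iterdir():
        if f.is_file() and f.name.startswith(prefix):
            # e.g. 00e816dda5... -> e816dda5... inside the 00 folder
            f.rename(dest / f.name[len(prefix):])
            moved += 1
            if moved >= limit:
                return

stage_prefix("00")
# Finally, move STAGING/00 into CHUNKS/ - freeing 256 slots first means
# the single folder move fits under the limit.
```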

I'm trying to start a new backup; it will probably take 3 weeks to complete, but I don't think I can fix this otherwise so easily.
But starting a new backup, it only backs up files in the root folder, not the symbolic-link folders inside the root folder.

2020-11-30 15:30:57.004 INFO SNAPSHOT_FILTER Loaded 5 include/exclude pattern(s)
2020-11-30 15:30:57.005 ERROR SNAPSHOT_EMPTY No files under the repository to be backed up
No files under the repository to be backed up

The loaded patterns are the 5 symbolic-link folders in the root of the backup repo.

But yeah… it says there is nothing to back up.
The backup is placed in a new Google Drive folder, so it shouldn't have a problem.

I didn't include or exclude anything, and now it is working, luckily 🙂
There's another issue now, but I will open a separate thread for that.