Failure to back up: "Input/output error"

Grateful if anyone can point me in the right direction or suggest ideas to try. I tried searching the forum to no avail.

Error message:

Running backup command from /Users/xxxxx/.duplicacy-web/repositories/localhost/9 to back up /Volumes/xxxxx

Options: [-log backup -storage xxxxx -threads 4 -stats]
2022-12-13 22:51:20.554 INFO REPOSITORY_SET Repository set to /Volumes/xxxxx
2022-12-13 22:51:20.554 INFO STORAGE_SET Storage set to b2://xxxxx
2022-12-13 22:51:21.165 INFO BACKBLAZE_URL download URL is: xxxxx
2022-12-13 22:51:23.861 INFO BACKUP_EXCLUDE Exclude files with no-backup attributes
2022-12-13 22:51:23.984 INFO BACKUP_START No previous backup found
2022-12-13 22:51:24.137 INFO INCOMPLETE_LOAD Previous incomplete backup contains 143901 files and 580 chunks
2022-12-13 22:51:24.137 INFO BACKUP_LIST Listing all chunks
2022-12-13 22:55:13.034 INFO BACKUP_INDEXING Indexing /Volumes/xxxxx
2022-12-13 22:55:13.034 INFO SNAPSHOT_FILTER Parsing filter file /Users/xxxxx/.duplicacy-web/repositories/localhost/9/.duplicacy/filters
2022-12-13 22:55:13.034 INFO SNAPSHOT_FILTER Loaded 0 include/exclude pattern(s)
2022-12-14 00:37:01.265 ERROR CHUNK_MAKER Failed to read 0 bytes: read /Volumes/xxxxx/xxxxx: input/output error
2022-12-14 00:37:12.088 INFO INCOMPLETE_SAVE Incomplete snapshot saved to /Users/xxxxx/.duplicacy-web/repositories/localhost/9/.duplicacy/cache/xxxxx/incomplete_snapshot
Failed to read 0 bytes: read /Volumes/xxxxx.jpg: input/output error

Additional information:

  • I am backing up to B2
  • The directory to back up is on a local NAS
  • Duplicacy is on a local Mac (middle man)
  • Mac and NAS are connected by CAT6 cable, not wireless
  • I have the NAS mounted to the Mac via AFP, not SMB
  • I have rebooted the Mac
  • I have rebooted the NAS (although not a full power-off for 5 minutes thing)
  • I have performed a full disk scrub on the NAS (I am using BTRFS and SHR-1)
  • Disk scrub returned “healthy”, no errors
  • I have attempted this backup at least 5 times and get the same error (sometimes the exact same filename in the error, sometimes not)
  • This is a new backup I recently added and it has never run successfully
  • I have performed other backups successfully to B2 with same setup and config (same Mac, same network, same CAT6 cable, same Duplicacy, same NAS)

Feeling a bit stuck. Greatly appreciate any thoughts on what I can try next.
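One way to reproduce the error outside Duplicacy is to read every file on the mount once and see which paths fail. This is only a hypothetical sketch; the mount point is a placeholder for your actual `/Volumes/...` path:

```shell
# Read every regular file under a directory once and print any path that
# fails to read. /Volumes/NAS is a placeholder -- pass your actual mount point.
scan_mount() {
  find "$1" -type f 2>/dev/null | while IFS= read -r f; do
    cat "$f" > /dev/null 2>&1 || echo "read error: $f"
  done
  echo "scan of $1 complete"
}

# Example:
# scan_mount /Volumes/NAS
```

If this loop hits "read error" lines on different files across runs, the problem is with the mount or the network share rather than with Duplicacy itself.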

Assuming this is one of the files you are trying to back up, can you open Terminal and do

cp /Volumes/xxxxx/xxxxx /tmp/

Just to confirm that that file is actually accessible over the mount?

You can also add the -d flag to duplicacy for debug output.

Or, if this is the first file it is trying to read, it likely does not have permission to read the mount. This is about macOS privacy "privileges", not actual Linux permissions. Did you give DuplicacyWeb Full Disk Access?

Update…

Assuming this is one of the files you are trying to back up, can you open Terminal and do

I was able to run this successfully; the file copied to /tmp/ without error.

You can also add the -d flag to duplicacy for debug output.

Not sure if it’s the correct process, but I used the web GUI to open "Options" in the backup job and typed "-d".
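For reference, when running the CLI directly rather than through the web GUI, global flags such as -d go before the subcommand. A sketch using the options from the log above (the echo makes this a dry run; the storage name is masked as in the log):

```shell
# Build the debug variant of the backup command from the log above.
# The echo makes this a dry run; drop it to actually execute the command
# from inside the repository directory.
debug_backup_cmd() {
  echo "duplicacy -d -log backup -storage xxxxx -threads 4 -stats"
}
debug_backup_cmd
```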

Did you give DuplicacyWeb Full Disk Access?
I assume so, but honestly I cannot say with certainty. Is there a way to check, or to grant it again?

I ran the backup job again this morning, and it’s currently ~15% complete.

Only changes:

  1. The computer was restarted since yesterday. I woke up this morning and the Mac was off; according to the crash report shown on boot, it was a macOS kernel panic shutdown.
  2. I added the -d flag

I’ll post another update when it fails or succeeds.

(It might take a day to complete, it’s a large backup.)

In Global options, right?

So, it occurs after a while, not immediately?

Yes, in System Preferences → Security and Privacy, Full Disk Access. But if you are saying it works for a while and then stops, that's not it. If it did not have access, it would not be able to read the first file.

I’ll bet Synology broke AFP again. Last time it was leaking handles. Try switching to SMB, or, better yet, NFS or SFTP. (This nonsense of breaking working third-party tools with every other update is why I threw in the towel on Synology years ago.) By the way, SFTP is also broken there (but you should not be surprised by now): you’ll need to provide a full path to the storage, even though it’s not a full path. For example, if your share is called "backups" and the duplicacy folder is under it, the SFTP URL will be sftp://user@nas://backups/duplicacy. Note the two slashes after the colon.

NFS appears to be the best approach here, because:

  • SFTP adds unnecessary overhead with encryption/decryption
  • SMB is likely broken (see this post of mine from years ago: https://www.reddit.com/r/synology/comments/j7s4qk/afp_vs_smb_not_as_clearcut_anymore/)
  • AFP is likely broken again, per your observations. They use an ancient version of netatalk and don’t configure it correctly either.
  • NFS is implemented in the kernel, is very small, and there are few opportunities for Synology’s middleware department to screw it up; kernel folks don’t mess around. So it’s likely to work well, and it has the benefit of being more performant than any of the protocols above.
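If you try the switch from Terminal rather than Finder, the macOS mount commands for the SMB and NFS alternatives might look like the sketches below. The host ("nas"), user, share name ("backups"), and export path ("/volume1") are all placeholders, and NFS must first be enabled on the NAS:

```shell
# Print the macOS mount commands for the SMB and NFS alternatives.
# "nas", "user", "backups", and "/volume1" are placeholder names.
smb_mount_cmd() { echo "mount_smbfs //user@nas/backups /Volumes/backups"; }

# -o resvport is commonly needed when mounting Linux-based NAS exports on macOS,
# since many servers reject requests from non-reserved ports by default.
nfs_mount_cmd() { echo "sudo mount -t nfs -o resvport nas:/volume1/backups /Volumes/backups"; }

smb_mount_cmd
nfs_mount_cmd
```

The functions only print the commands (a dry run); create the mount point with mkdir first and drop the echo to actually mount.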

AWESOME. This gives me some traction… I’ll give it a shot.

Meanwhile, macOS kernel panicked again last night and bombed the whole job. I woke up to the machine powered off. I’ve been delaying the upgrade to Ventura; perhaps that should be my next step before proceeding?

Can you share the panic file? DM me if you want. macOS should not panic under normal use.


Thanks again Saspus for all your help with this.

For posterity, anyone finding this post in the future…

Here’s the solution that worked for me: using SMB, instead of AFP.

The backup finished relatively quickly the first try, with no input/output errors. I’ll need to test it more thoroughly, but for now it seems good. Previously (years ago) I had problems with SMB randomly disconnecting the mounted drive, leading to my switch to AFP. I’m going to roll with SMB for now, and see if any issues develop.
