LZ4 compression error: input too large

Please describe what you are doing to trigger the bug:
Set -max-chunk-size to 2048M or greater with encryption enabled.

Please describe what you expect to happen (but doesn’t):
Either fail on init (with a “chunk size too large” or similar message) or work normally.

Please describe what actually happens (the wrong behaviour):
Once a backup is run, it will eventually fail with:

Failed to encrypt the chunk <hash>: LZ4 compression error: input too large

2GB chunks seem a bit excessive to me. :grimacing:
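
For reference, the limit appears to come from LZ4 itself: lz4.h defines LZ4_MAX_INPUT_SIZE as 0x7E000000 bytes (2016 MiB), so a single 2048M chunk is larger than what one LZ4 block can compress. Below is a minimal sketch of the kind of up-front check I'd expect init to do; the flag handling and messages are purely illustrative, not Duplicacy's actual code.

```go
// Hypothetical sketch of a pre-flight check at init time, not Duplicacy's
// actual code. The constant matches LZ4_MAX_INPUT_SIZE from lz4.h.
package main

import (
	"flag"
	"fmt"
	"os"
)

const lz4MaxInputSize = 0x7E000000 // 2,113,929,216 bytes = 2016 MiB

func main() {
	// Illustrative flag; the real -max-chunk-size option accepts sizes like "2048M".
	maxChunkSize := flag.Int64("max-chunk-size", 16*1024*1024, "maximum chunk size in bytes")
	flag.Parse()

	// Fail fast with a clear message instead of erroring mid-backup.
	if *maxChunkSize > lz4MaxInputSize {
		fmt.Fprintf(os.Stderr,
			"chunk size too large: %d exceeds the LZ4 input limit of %d bytes (2016 MiB)\n",
			*maxChunkSize, lz4MaxInputSize)
		os.Exit(1)
	}
	fmt.Printf("max chunk size %d is within the LZ4 limit\n", *maxChunkSize)
}
```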


Google team drives have a limit of 400,000 files. Maybe I’m overestimating my future wants, but an average size of 512MiB/chunk (where the corresponding default max size would be 2GiB) gives approx 195TiB of storage. The default average of 4MiB/chunk only gives a max of ~1.5TiB (rough arithmetic below).

So maybe 2G/chunk is just ridiculous, but I still think init should show an error in that case.
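
For anyone who wants to check that arithmetic, here is a quick sketch; it's my own back-of-the-envelope based on the 400,000-file limit, nothing Duplicacy- or Google-official.

```go
// Back-of-the-envelope capacity math for a 400,000-file limit; the chunk
// sizes are just the figures from the post above, nothing official.
package main

import "fmt"

func main() {
	const fileLimit = 400_000
	const mib = 1 << 20
	const tib = 1 << 40

	for _, avgChunk := range []int64{512 * mib, 4 * mib} {
		total := float64(fileLimit) * float64(avgChunk)
		fmt.Printf("avg %3d MiB/chunk -> about %.1f TiB before hitting the file limit\n",
			avgChunk/mib, total/tib)
	}
}
```

This prints about 195.3 TiB for 512MiB chunks and about 1.5 TiB for the 4MiB default.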

I agree, it should probably show an error.

What type of data are you backing up? If it’s media, I’d sooner recommend using Rclone for the job instead of locking such large files into a chunk format. Otherwise, do you really need 195TB of space (if Google would even let ya :stuck_out_tongue: )?

It’s a wide variety. I really hope I don’t need to use one tool/backup solution for big files, and a different one for small files.

“Google team drives” and “195 TiB” are two things that shouldn’t be in the same sentence. :laughing:
