OOM killed during diff

Please describe what you are doing to trigger the bug:
Running duplicacy diff my/file/path on a file that's 1.5 GB.

Please describe what you expect to happen (but doesn’t):
The command completes successfully without errors.

Please describe what actually happens (the wrong behaviour):
duplicacy uses up all available RAM and is killed by the OOM killer:

...
265 chunks have been allocated
Currently allocated: 3225.53M, total allocated: 10756.14M, system memory: 3831.38M, number of GCs: 22
Currently allocated: 3225.54M, total allocated: 10756.14M, system memory: 3831.38M, number of GCs: 22
Chunk size: 5352486 bytes, data size: 7493748, parity: 5/2
Chunk 80432fec24a42a93b94ccf9ba7b521f3a801eaf4d2badd5fa2649bf650d34d27 has been downloaded
Killed

Follow-up questions
These may not warrant their own topics and were discovered at the same time, so…

I noticed that a ‘./’ prefix on the file path causes diff to fail with a ‘file not found’ error, even though other commands like history handle it fine - seems like another minor bug. See the example below.
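
For example (paths are placeholders, as above):

duplicacy history ./my/file/path   # works
duplicacy diff ./my/file/path      # fails with 'file not found'
duplicacy diff my/file/path        # works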

I did a dry-run backup to see what had changed in my repo and noticed one file had been truncated by a bad transfer between laptops. I restored it using duplicacy and re-ran the dry-run, but duplicacy still reported the same file as different (which is what sent me down the path of discovering the diff OOM issue above). Am I missing something, or is this another bug? The timestamp and file size are the same (not sure about milliseconds though), and in any case, after a restore I expect the file to be bit-identical.
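
One check that might narrow it down - assuming I'm reading the docs right that the -hash option makes backup compare file contents rather than size and timestamp:

duplicacy backup -dry-run -hash

If that run no longer flags the file, the difference would be metadata-only (e.g. sub-second timestamp precision).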

Is it at all possible to cache file chunks when running a diff? I did read in the docs that they are explicitly not cached (only snapshot chunks are), but diffing files in the order of gigabytes is just not a good experience when all the chunks have to be downloaded over and over. Perhaps I missed some way to cache them? If not, I think it would be a good feature to add - see the sketch below.
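
To make the request concrete, here is a minimal sketch of the read-through cache I have in mind. This is illustrative Go only - the Downloader interface, the CachingDownloader type, and the cache location are all hypothetical, not duplicacy's actual internals:

// chunkcache is an illustrative sketch, not duplicacy's actual code: the
// Downloader interface and all names here are hypothetical.
package chunkcache

import (
	"os"
	"path/filepath"
)

// Downloader fetches a chunk by its hex-encoded hash.
type Downloader interface {
	DownloadChunk(hash string) ([]byte, error)
}

// CachingDownloader consults a local directory before hitting remote storage.
type CachingDownloader struct {
	Backend  Downloader
	CacheDir string // e.g. .duplicacy/cache/chunks (hypothetical location)
}

func (c *CachingDownloader) DownloadChunk(hash string) ([]byte, error) {
	path := filepath.Join(c.CacheDir, hash)

	// Serve from the local cache when the chunk is already present.
	if data, err := os.ReadFile(path); err == nil {
		return data, nil
	}

	// Otherwise fetch from the real storage backend...
	data, err := c.Backend.DownloadChunk(hash)
	if err != nil {
		return nil, err
	}

	// ...and keep a copy for the next diff run. A real implementation would
	// also need a size cap and an eviction policy so the cache can't grow
	// without bound.
	if err := os.MkdirAll(c.CacheDir, 0o700); err == nil {
		_ = os.WriteFile(path, data, 0o600)
	}
	return data, nil
}

With something like this wrapping the storage backend, a second diff of the same 1.5 GB file would mostly hit local disk instead of re-downloading every chunk.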