Bots can/should be used to keep Go dependencies more up-to-date

Looking through the GitHub repo for duplicacy, it appears that a lot of duplicacy’s dependencies are very out of date. Just as an example, the latest version of klauspost/compress is currently v1.18.3 (which, incidentally, fixes a CVE). Duplicacy is using v1.16.3, released almost three years ago.
klauspost/reedsolomon’s latest is v1.13.0. Duplicacy is using v1.9.9, from May 2020, almost six years ago.
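(If anyone wants to check for themselves, the pinned versions are all visible in `go.mod`. Here’s a quick offline sketch against a toy `go.mod` carrying the versions above; in an actual checkout, `go list -m -u all` additionally shows the latest available version of each module, and `govulncheck ./...` reports dependencies with known CVEs.)

```shell
# Sketch: list what a Go project pins, straight from go.mod, no network needed.
tmp=$(mktemp -d)
cat > "$tmp/go.mod" <<'EOF'
module example.com/duplicacy-demo

go 1.21

require (
    github.com/klauspost/compress v1.16.3
    github.com/klauspost/reedsolomon v1.9.9
)
EOF

# Print "module version" for every line inside the require block.
awk '/^require \(/{in_req=1; next} /^\)/{in_req=0} in_req{print $1, $2}' "$tmp/go.mod"
# prints:
# github.com/klauspost/compress v1.16.3
# github.com/klauspost/reedsolomon v1.9.9
```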

The last commit in duplicacy’s GitHub repo was 9 months ago. Given that some of its dependencies have had CVE fixes in the years since they were last updated, the longer this goes without any updates, the more duplicacy looks like it’s becoming abandonware. Yes, it currently works, and works well, and I get that there are “if-it-ain’t-broke…” arguments to be made.

However, we have seen time and time again that this is precisely the attitude that gets out-of-date software exploited by bad actors. Yes, keeping dependencies up to date can sometimes introduce more work, but that is the price to be paid to keep up with security fixes, which are discovered on a continual, rolling basis.

This is especially confusing to me when it comes to dependencies, as there are GitHub bots that can do this maintenance work for you. I really don’t see a particularly good reason for duplicacy to be depending on six-year-old Go modules (which have had CVE fixes since they were last pulled into duplicacy).

I can’t speak for others, but I know it would give me much more confidence in continuing to use and recommend duplicacy if I knew it was pulling in upstream security fixes, rather than needlessly resting on 6-year-old dependencies.

1 Like

Duplicacy (CLI) is not an Internet-facing service. Security bugs in Duplicacy CLI dependencies are non-exploitable and irrelevant.

On the other hand, blindly updating Go and the underlying libraries for no reason is asking for trouble. Stuff keeps breaking upstream all the time. Every change in a lock file must be thoroughly tested, and that requires work for no benefit in return.

Dependencies are frozen for a reason and unless something breaks — there is no reason to update that something.

Web UI on the other hand — totally different story. It must be religiously updated, but it isn’t.

The absolutely abysmal progress in fixing actual user facing bugs is disheartening, and I would agree, but not because dependencies are frozen.

1 Like

Every change in a lock file must be thoroughly tested and this requires work for no benefit in return.

I would argue that that is precisely the job that being the maintainer of a project like this calls for; otherwise code just becomes stale. The longer you put off pulling upstream fixes and changes, the larger the delta grows, and the more likely it is that at some point in the future, when you actually need a fix, you’ll have to work around potentially large and breaking changes.

I’m not saying duplicacy should blindly pull every single upstream update, but at least keeping on top of them seems like a good goal to have, and part of the role a maintainer of this project should play, especially when we have tools to make this easy (precisely why something like dependabot exists and people use it).
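(For reference, a minimal Dependabot setup for a Go project is one file checked into the repo; the schedule and limits below are just illustrative defaults, not a recommendation for duplicacy specifically:)

```yaml
# .github/dependabot.yml: minimal sketch for a Go modules project.
version: 2
updates:
  - package-ecosystem: "gomod"   # watch go.mod / go.sum
    directory: "/"               # where go.mod lives in the repo
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5  # cap concurrent version-bump PRs
```

Setting `open-pull-requests-limit: 0` disables the routine version-bump PRs entirely while GitHub’s security updates still open PRs for vulnerable dependencies, which is effectively a security-fixes-only mode.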

And just because a service is not internet-facing is a poor reason not to pull already-patched CVEs from upstream. Sysadmins everywhere will tell you that something wasn’t a problem, until it was. Can you absolutely guarantee that an IoT device on the same LAN as duplicacy will never one day exploit a known CVE in it? It just seems pointlessly risky to me, and bad coding practice.

I feel like even if you want to be conservative with which fixes you pull (dependabot can even be configured to only pull security fixes), there is a middle ground between pulling everything and pulling nothing and letting code go stale for years.

And it’s not like security fixes are the only benefit. Duplicacy depends on storj.io/uplink v1.12.1; the latest is v1.13.1, and v1.12.2 already “improved download performance”. v1.18.1 of klauspost/compress fixed an “incorrect buffer size in dictionary encodes” for the zstd module duplicacy uses. There is no reason why duplicacy shouldn’t be benefiting from such upstream fixes and improvements.

We have the power and technology to make things like this easy. I don’t see why we shouldn’t use them.

2 Likes

Just as an experimental proof-of-concept, I forked duplicacy and used dependabot to update all the out-of-date Go modules duplicacy depends on.

It took hardly any effort, the duplicacy binaries built fine, and they’re currently running perfectly for me.

Isn’t this one of the benefits of Go’s modular structure and self-contained binaries in the first place?

1 Like

I agree. Hard to argue the obvious… it would be great if it were kept up to date and if the five-year-old bugs that are the only reason most people create a forum account were getting fixed. (Interrupted prune leaving ghost snapshots is the one I’m referring to. There are even PRs to fix this, auto-closed because whoever opened them gave up and deleted their fork.) Since this is still not addressed, I’m assuming there are glaring resource issues, and expecting proactive dependency updates would be naive.

This, however, is not sufficient. The problem is not that it won’t run, but that any change can introduce and/or expose corner cases that need to be stress-tested again. I don’t know how gchen tests, but I’m sure it’s not just run backup/restore and call it a day.

No. Modularity and dependency encapsulation don’t change how those dependencies interact. That’s why we have unit tests and integration tests. And integration tests are far more volatile.

1 Like

This, however, is not sufficient. … I don’t know how gchen tests, but I’m sure it’s not just run backup/restore and call it a day.

Yeah, I mean, like I said, it was just a proof-of-concept, and I realise it would require more testing if deployed in the real world by gchen. But the point I was trying to make is that there are fixes upstream (both security fixes for public CVEs and QOL fixes and improvements such as the ones mentioned) that could easily be pulled. Yes, it would require time and work to test the result (arguably this could be somewhat automated too), but to me it seems obvious that that time and effort goes hand-in-hand with being the maintainer of a project like this.

To me, the fact that none of these dependencies have been touched or tested in years, despite getting upstream bug and security fixes and QOL improvements, when we have tools to make doing that easy, is just indicative of duplicacy getting little to no effort put into it recently and increasingly becoming abandonware.

I certainly take your point about dependencies requiring further testing, but I disagree that there is no point to updating them. To me, the fact no effort is being made to pull upstream improvements and perform the subsequently required testing is indicative of a larger issue with the ongoing maintenance of duplicacy, and just goes hand-in-hand with the points you make about PRs going ignored and the web-ui getting nothing.

2 Likes

Old thread about the same issue. Is this project dead?

100% agree. This is one of the reasons I’ve recently stopped recommending Duplicacy on subreddits etc…

Another reason is the aforementioned interrupted pruning leaving snapshots, which @gchen doesn’t seem to want to acknowledge is an issue.

Yup! Can attest to that. Leave it too long and you’re gonna be bitten by something you couldn’t foresee back then. This is the way.

Just because Duplicacy CLI isn’t ‘internet facing’ doesn’t mean a compromised backend server can’t trigger an exploit in your client’s buggy API and do RCE! A tale as old as time.

I did the same, for over a year. (IIRC, these are just security updates.)

Even though one might go through each one and carefully reason ‘doesn’t apply’ or ‘Duplicacy only uses it for this or that, so isn’t vulnerable’, it’s sad to see that no apparent effort is made to look at them.

Which you would have to do if you were a responsible developer - but leaving the most egregious dependencies unpatched for years suggests to me this step has been forgotten. (The Terrapin one is well known, and there should be no reason it’s not patched already, even if low impact.)

IMO, it’s infinitely easier to gradually patch stuff (and have dependabot et al handle the minor versions) than to leave yourself in the position of having to deal with major PRs and then heavily test after a big jump. Or to keep going back to an ever-bigger list of issues and try to reassure yourself ‘it’s fine’.

Plus the potential performance improvements are a no-brainer.

klauspost/compress 1.18.3: that fix is for a bug that occurs when opening ZIP archives. Duplicacy only uses zstd to compress and decompress its own data chunks, so it is not affected.

klauspost/compress 1.18.1: the buffer size fix is related to zstd dictionaries. Duplicacy doesn’t use dictionaries when compressing or decompressing.

storj.io/uplink: this is a dependency that I do want to update, because it is used in a less critical part, and any bugs are safeguarded by Duplicacy’s built-in checksums and can be detected early.

Generally, I have reservations about updating dependencies just to make them look “current”. The worst-case scenario is one where a dependency introduces a subtle behavioral change that only manifests under rare conditions.

That said, I agree that the current development status of Duplicacy is not ideal. Over the past several years, I’ve spent most of my time on my other project, building an AI ball-tracking camera, which has taken longer than I originally expected. I’ve always thought that once that project reaches a stable point I’ll come back to continue the work on Duplicacy, but that has taken longer and longer.

What I’m going to do next is make a new web GUI version. I promised @saspus a long time ago that I would do this, but I didn’t, so I owe him an apology. One reason (not a particularly good excuse) is that every time I updated the web GUI to make security tools happy, I had to upgrade the Go compiler, which in turn would drop support for some older operating systems. The last time I updated the web GUI, the new version couldn’t even run on one of my old Macs, which made me hesitant to push further changes.

I would also like to thank everyone for the continued support and for taking the time to raise concerns and provide feedback. Believe me, Duplicacy will never be abandonware – even if the progress has been slower than anyone including myself would like.

3 Likes

Great news! I hope your other project is successful and brings you satisfaction 🙂

I can’t wait to see the next version of the web GUI; I hope I won’t be retired by then! 😄

Very happy to see this post, thank you. Good luck with the computer vision project. 🙂