Request: Ability to sort recovery list

This order is the order in which files/directories are stored in the backup. I feel changing to a different order is unnecessary and may not be liked by some users.

The order in which files are stored in the backup is completely irrelevant, an implementation detail the user never had any control over. At the very least there must be a sortable list (by last modified, by recently added, by creation time, etc.), but what's more important, the list should be filterable.

Throwing thousands of files at the user in random order is hardly a feature users love. Literally no one will dislike a more useful UI.


In a vacuum perhaps, but this is not true if polishing the UI detracts from other features that might be more useful.

We’ve discussed this before.

Specifically because we are not in a vacuum, polishing the UI must be a top priority, ahead of all those features that might seem important to a few but don't matter to most.

We want the product to succeed, right? For that it needs to be commercially viable. For that it needs to attract new paying users: users who interact with, and judge, the product by its UI, and who will not put up with a substandard experience when plenty of alternatives exist. And when users go the extra mile and start these threads, instead of just throwing in the towel like I did initially, that should be treated as the top priority. Because for every user who cared enough to start a thread, there are many who tried and silently decided to use a competitor.

Right now configuring a backup is a minefield and restoring data involves unnecessary hurdles. And I'm not even talking about configuring exclusions. If this were temporary, fine. But nothing in the UI has been improved in years.

Fixing the UI is a top priority. Otherwise new users don't come, gchen doesn't get paid, loses interest, the product gets slowly abandoned, and people like me, who don't even use or care about the UI, lose in the end.

That's the selfish reason I want a usable, let alone polished, UI: so that Duplicacy does not vanish.


Indeed, this is a rather philosophical discussion. You're making assumptions, though, as to what is and is not important to the majority, where the majority is with respect to current/potential users of a backup tool like Duplicacy. And I don't know if "users…judge the product by UI" necessarily applies in the case of :d:. The "average" computer user might be repelled by a non-intuitive UI, but what are the chances that such a user would even try :d: in the first place? I'd argue that most average users don't use, nor even look for, backup solutions like this, outside maybe something that is already baked into their OS of choice. Try the pitch line "lock-free deduplication cloud backup tool" on an average consumer and see how attracted they are to it.

People who use :d: are already technically inclined, and usually have specific requirements and features they're attracted to, and in all likelihood it is not the UI. Backup tools as a category are already the domain of power users and admins to begin with, and I don't see that changing. Not to mention that outside of initial setup and probably rather infrequent restores, interactions with the UI of a backup tool are few and far between. I've been a Crashplan user for more than 10 years, and outside of the initial setup and a couple of test restores I never touched the UI, and my backup requirements are probably more extensive than most.

A tool doesn't need a slick (G)UI to be successful if that's not how it is primarily used. Linux could be one example: the desktop experience has improved over the years, but "average" users and those interested in UI generally wouldn't pick it over Windows or macOS, and that's not Linux's main selling point anyway. An even more extreme example (not dissimilar to backup tools) would be enterprise networking software, which is primarily operated via CLI, with some limited (if any) GUI, because the people who use these tools don't need more than that.

It's all about who your target audience is and what the value proposition is. The UI does not necessarily need to be part of that.

That's precisely what needs to change. Today those users find and stick with Duplicati, that horrible, unstable crap (which still has no official stable release), because it's first on the Google results page and has a half-decent UI.

Same. With Duplicacy I ended up not using the UI either, and I wrote a bunch of crutches to make it work on macOS. That's way further than any average user would go. If the tool only targets those users, it will be unsuccessful, albeit well respected in narrow circles.

I'd argue that it is specifically the seldom-used apps that must have the best UI, because it makes no sense to invest time in learning a (fast and productive) CLI for something you are only going to use a few times.

It's a bad example, because making a general-purpose OS usable is an insurmountable task, but one specific app can absolutely have a great UI on every platform.

which is primarily operated via CLI, with some limited (if any) GUI, because the people who use these tools don't need more than that.

Right. In addition to the point made above about seldom-used tools: do we want to limit Duplicacy to only the users who don't need a UI? Those users are a minority and are already covered. According to the other thread, gchen feels Duplicacy is not as successful as he had hoped. So, do we throw in the towel, or do we tap into the massive horde of people who don't even know they need a backup tool? To attract those people it needs a great UI and yes, likely separate branding. Something better than "lock-free deduplication something", as you noted, and ideally something with "backup" in the name. Users who need a backup tool do indeed type "backup" into Google, not "lockless dedupe"; you are right about that.

I'm talking strictly about taking market share away from Arq and Duplicati among non-technical users: those who want to pay for a working product rather than become experts in configuring one.

If this is not a goal, and Duplicacy is OK with targeting just the minority of highly technical CLI users, why bother with a UI in the first place? At that point it only adds support workload, wastes effort, and damages the reputation.

So, who is the target audience here? Corporations won't touch it; for them software quality is generally unimportant, and there are different selection criteria, mostly driven by SLAs and the terms of the support contract. This leaves free home users and enthusiasts, and a minority of tech-inclined small business owners (I'm putting MSPs into this bucket).

Not addressing the existence of hordes of Duplicati customers who are itching for a competitor to jump to is literally leaving money on the table. Yes, it will be hard to compete with Arq. But worthy competition is good.

Today I feel the Web UI does more damage to the brand than it helps sell the tool.

If it were my business, I would cut support for *Drive services, outsource the UI design on every platform that matters (read: Windows and macOS) to a reputable design studio, and the implementation to the same studio or to other third parties with a proven track record of delivering quality software. And I would stop putting more work into the backup engine itself until that's done: it's already good enough, and improvements there benefit a very small subset of the potential customer base. (OK, I would also fix the prune bug that leaves bad snapshots behind when interrupted. Come on, it was reported years ago, and it directly drives support volume up and trust in the tool down. Exemplary of how much care this tool is receiving today.) That's what I would do if I wanted to grow the business.


I'd say Duplicacy generally feels like a tool written by a programmer for other programmers, by someone who isn't thinking much about what a normal user wants. A normal user wants to install a backup tool, click one button that automatically configures a whole backup for their PC, and that's it. I know from my own experience that it's not really "fun" to improve the UX compared to working on low-level improvements that feel more challenging and interesting, but if the goal is to actually make money with it and get a significant number of users, then UX needs to be the most important thing.

And with Duplicacy, it feels like 90% of the work is there, and the remaining 10%, the part that really makes it usable for the average person, was just skipped somehow. A default schedule that simply backs up the whole PC should be super easy to add, but instead users are asked to read up on what all the different operations (backup, check, prune) actually mean, what kind of command-line arguments they should use, etc. If I weren't a programmer myself, and therefore inclined to write software the same way, I wouldn't have bothered with any of that.


That is the question (though quite far from the original topic of this thread). Different market niches within the same category are possible, and can be successful or unsuccessful just the same. Right now, :d: has advanced underlying technology and a not-so-advanced UI. What you're advocating is to stop working on the core (a strength) and instead work on buffing up the UI (not exactly a strength), even outsourcing the UI. Spending effort on weak(er) non-core areas just to bring them up to average is often a business mistake. It would make sense if this were just another generic tool that does exactly the same thing as dozens of others; then differentiating with a slightly better UI might be worthwhile.

But :d: has competitive advantages in its core technology, and that is valuable to the right audience. As I mentioned, not all products need to be marketed to the general public in order to be successful. Veeam grew into a several-billion-dollar company without marketing to home users, and its underlying products are not miles away from what :d: and Vertical do (though they do more things nowadays). Saying that corporations won't touch it is simply not true. If a vendor's product solves a specific problem, it will get money thrown its way, provided you get it in front of the right people who actually have that problem. Global IT budgets are massive, and those customers are far less fickle than a general populace attracted by fancy UI.

But either approach (or even a combination) can be successful; it's just that right now the UI is not a competitive advantage, and it is unlikely to become one, as that requires quite a different skill set. You can't be all things to all people, and there are good reasons to play to one's strengths.


And thank god it isn’t eh.

What would be the logic of removing perfectly good, working features that are among its biggest selling points, and which a big chunk of the user base demands and relies on?

They’d attempt to pair it with Rclone mount or serve and end up in a worse position support-wise. Utterly pointless exercise.

Again, cutting out NAS, RasPi, and Linux users - a big chunk of the user base.

There are plenty of FOSS projects with dozens or hundreds of devs that have spent years polishing and iterating on their UIs. This is not as easy or cheap an endeavour as you make it sound, let alone a desirable one.

Show me a successful product of this scale that 'outsourced' its UI, and tell me it wouldn't get abandoned by the very fact that it was outsourced, cobbled together with barely adequate frameworks you have to trust are somewhat future-proof because, y'know, you didn't have time to pick your own and DIY.

It was never perfectly good; we've discussed this before. File-sharing services are not designed for bulk data, period. Google Shared Drives specifically limit the number of items. OneDrive limits the number of threads. Both have horrific latency. Instead of supporting something poorly, it's best to drop support altogether. Search this forum: all the issues users are having are with drive services, including WebDAV. Nobody has ever complained about S3. Edit: oh, and I personally used Duplicacy with Google Workspace for quite a while and can attest firsthand that the experience was horrible.

You are claiming this based on what? Backup to Google Drive is a silly non-starter. The fact that some people are duped into the idea does not validate it in any way.

Yes. Focus on one thing at a time, and do it well. Raspberry Pi and other Linux users can use the CLI for the time being.

Great. How is this relevant?

It's not cheap and it's not easy, but it is necessary.

Today Kopia, the FOSS project you are probably referring to, is a superset of Duplicacy's features and design. It supports everything we have ever asked for from Duplicacy. So, what competitive advantage does Duplicacy have over Kopia today? Usable UX could be that advantage, because Kopia's UI is the same degree of horseshit, albeit more functional.

You see, one person cannot be great at everything. There is nothing wrong with commissioning the UX design from a design studio. I can't give you an example, because how am I supposed to know whether a given design was developed in-house or contracted out? But when I see a crap design, it's almost always a DIY job by an indie developer who wants to do everything themselves. Good design is expensive, but to make money you have to spend money.

I don't know what you are referring to here. I'm talking about having the user experience designed by someone who does that professionally. Implementation is separate. And while I'd argue that native frameworks on each platform should be used, precisely to avoid the problems you are describing and to ensure future-proofing, that is really an implementation detail.

Coming up with excuses for why this is impossible is counterproductive; there are always a million reasons not to do something. There are plenty of mediocre software tools. With limited resources, focusing on one thing at a time and doing it well is the winning strategy. Otherwise, why pay for Duplicacy if Kopia has already caught up and overtaken it?

We have, and it has been working perfectly well for me for years, thanks.

You can keep making this unfounded claim, but it has no factual basis when users are happily storing ‘bulk data’ in the TBs and PBs on such services. These are the facts.

Forums and similar support channels are mainly used by those who have issues. Incidentally, they're also the places where users resolve their problems and eventually get to use the tool how they intended.

Really?!

Encouraging normal users down this path has been far more harmful than a few troubleshooting issues with any other cloud drive.

The fact that you were incapable of getting it to work says nothing about whether other users would want to, or would succeed in doing so. We should have a poll; I guarantee you that after we're done scratching off all the storage types you don't like, and which people are currently using, poor GChen would be left broke.

So again, you’re encouraging the developer to drop support for paying customers just because you don’t like some design choices or features you personally don’t use?

Absolutely crazy.

I wasn’t referring to that project, but this simply isn’t true in any case.

I won't go into all of the missing functionality or the lack of trust I put in its datastore. Aside from some slight implementation improvements (which Duplicacy could easily improve on down the line), Kopia's basic engine is inferior to Duplicacy's, and I wouldn't be able to migrate to it without significant pain.

This, before even considering their UIs.

If implementation is separate, then nothing changes. Outsourcing at this scale simply doesn’t happen. Plenty of user suggestions have already been made - a good portion of them haven’t been implemented, likely due to time constraints and different prioritisation.

IMO, GChen can fix this by open sourcing the GUI as a starting point for others to iterate on. Comp Sci grads will have studied UX; there just need to be more people working on it. Hell, even I'd help out if this was OS'd!


I wouldn't use Duplicacy if I couldn't use it with Google Drive. And I haven't yet had any issue related to Google Drive limitations. All the slowness I've seen was either due to slow Duplicacy code doing local work, or due to using the shared Google Drive project, which allows only a few threads before rate limiting kicks in. But with the custom token thing that's been officially supported since v3, and being able to use something like 100 threads, Google Drive is perfect for Duplicacy.

You probably have a very low bar for acceptable user experience, and your anecdotal positive experience does not matter; it is cancelled out by my anecdotal negative experience. So let's go by general principles instead.

Are those users paying adequately for the amount of data they store, or are they freeloaders abusing the service? If you don't pay for what you use, you are abusing the system. I know you disagree; you think you are within the "terms of service", that "as much as needed" means "unlimited", and that you can actually store petabytes for $12/month and feel good about it.

Right. I'm incapable. It has nothing to do with the software itself. In your opinion it's my fault, the user's fault, that my backup to a supported out-of-the-box destination had tons of issues and generally wasted my time over a few months. You expect me to spend time making it work. Hard no. I'll just go to a competitor.

It's funny you brought up this specific example. It's an application issue that it let the user configure it in a way that caused 100% data egress from the backup location. That is an app bug. Why does a backup tool egress data in the first place without telling the user they are about to incur huge costs?! It's an unacceptable design flaw. And in case it's not clear: this is not an example of an issue with S3. S3 worked as designed. It's an example of an issue with the app.

Because of the deficient software, we are forced to send users down the wrong paths instead of making S3 work well.

How is that my problem, though? I, the user, see a horrid design and no improvement over the years, and I go elsewhere.

That's why you hire a professional (a freelance designer will do) for the job you don't have the time, resources, or expertise for.

Ultimately, it's a closed-source, paid piece of software that the author has no time to work on or improve but still wants to get paid for. Brilliant.

GChen can fix this by open sourcing the GUI as a starting point for others to iterate on

No, because the web UI as it exists today needs to be scrapped and rewritten from scratch. You, of course, are free to contribute or start a new one, if gchen does not mind. I'll just grab an app that already exists and works instead.

Comp Sci grads will have studied UX

That does not mean anything. I have studied UX, but I could not design a good UI if my life depended on it.

Backup software that works well for the average user should specifically be designed to work as well as possible with *Drive services, because that's what the average user will use. No average user wants to learn how S3 or Google Cloud works. Those services are designed for enterprise customers. I have tried to understand them, and it's way too much work. For any regular person, Google Drive, OneDrive, or similar services are the only thing they can or want to understand.

They should not have to. The app should be managing that complexity for them. That's the point. Otherwise, what do you need an app for if you have to do all the work yourself?!

Correct. It’s for app developers to provide value to their users.

An example would be Arq. It supports both use cases: you can use S3 or Google Cloud buckets as-is, if you care to configure them, or you can have the app manage Google Cloud storage for you if you don't. It also manages cost for you, so the situation in the other thread, where a user unknowingly incurred a huge cost due to full egress, would not have happened.

That's a consequence of the lack of proper support for storage services designed for the backup job (which also forces users onto hot storage, btw). Drive services are designed with a different purpose in mind. They perform poorly for bulk-data workloads and are expensive for the user: to maintain my 2TB backup on Google Drive I had to pay $13/month. I'm now paying $3/month on Amazon S3, and I get better performance.
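Back-of-envelope, the arithmetic looks like this; the per-GB rate below is an assumed number standing in for a cold object-storage tier, not a quoted price from either provider:

```python
# Back-of-envelope monthly storage cost for a 2 TB backup.
# The PAYG per-GB rate is an illustrative assumption, not a price quote.
DATA_GB = 2 * 1024  # 2 TB expressed in GB

flat_drive_tier = 13.00            # flat-rate 2 TB drive plan: you pay for the tier
payg_cold_tier = DATA_GB * 0.0015  # pay-as-you-go at an assumed ~$0.0015/GB/month

print(f"flat drive tier: ${flat_drive_tier:.2f}/month")
print(f"PAYG cold tier:  ${payg_cold_tier:.2f}/month")  # ~$3/month for the same 2 TB
```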

Drive services can be made to work, but that requires more work in the tool, e.g. tweaking the average chunk size to reduce the number of chunks. Just slapping on the connection backend is a start, not the end.

Google Shared Drives, for example, limit the number of items that can be stored in them, and Duplicacy does nothing to warn the user (nor does it increase the chunk size 100x to mitigate that, or drop support for Shared Drives). Instead, a Shared Drive is the first choice in the UI, setting users up for data loss after about 500GB has been backed up.
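To put rough numbers on the item cap (assuming a 400,000-item limit and a ~4 MiB default average chunk size, and ignoring snapshot/metadata files, both of which are round assumptions for illustration):

```python
# Ceiling on backup size imposed by a per-drive object-count limit.
# Assumptions for illustration: a 400,000-item cap and an average
# chunk size of roughly 4 MiB.
ITEM_LIMIT = 400_000

def max_backup_tib(avg_chunk_mib: float, item_limit: int = ITEM_LIMIT) -> float:
    """Rough storage ceiling in TiB, ignoring snapshot/metadata files."""
    return avg_chunk_mib * item_limit / (1024 * 1024)

print(f"{max_backup_tib(4):.1f} TiB")    # ~1.5 TiB at the default chunk size
print(f"{max_backup_tib(400):.0f} TiB")  # ~150 TiB if the average chunk is 100x larger
```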

They should not have to. The app should be managing that complexity for them. That's the point. Otherwise, what do you need an app for if you have to do all the work yourself?!

Well yeah, if a user doesn't need to do anything except create an account at S3, then it's fine. But a user shouldn't need to know anything about how to create a bucket, or what a bucket even is. I didn't know that backup software could do all that automatically. If that's the case, then it's fine, yeah.

To maintain my 2TB backup on Google Drive I had to pay $13/month. I'm now paying $3/month on Amazon S3, and I get better performance.

Your full restore would not just cost you $3 though, right? But if all those costs are fully transparent to the user, displayed in the UI of the backup software, then it can be fine to use such services. If not, a flat, fully predictable, and stable X dollars per month is better for the user.

Drive services can be made to work, but that requires more work in the tool, e.g. tweaking the average chunk size to reduce the number of chunks. Just slapping on the connection backend is a start, not the end.

Google Shared Drives, for example, limit the number of items that can be stored in them, and Duplicacy does nothing to warn the user (nor does it increase the chunk size 100x to mitigate that, or drop support for Shared Drives). Instead, a Shared Drive is the first choice in the UI, setting users up for data loss after about 500GB has been backed up.

Well, I fully agree with that. As I said, if software like Duplicacy wants to be successful, it should of course be designed to make good use of something like Google Drive and have a UI that makes sense for it. The part that "requires more work in the tool" is work that should be high priority for the developer if the goal is a tool that an average user wants to use.

After all, the real competition is something like Crashplan Small Business, which costs $10 per month for unlimited backup, or Backblaze Personal Backup, which costs $7 per month for unlimited backup. That's what any backup tool needs to compete with. It can't be much more expensive, or much harder to use, if it wants to be successful.


Sure. There is an API to create buckets, configure permissions, etc.
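For instance, with AWS it could look roughly like this boto3 sketch (the bucket name and region are placeholders, and error handling is omitted):

```python
# Minimal sketch: create a private S3 bucket programmatically with boto3.
# Bucket name and region are placeholders; error handling is omitted.
import boto3

s3 = boto3.client("s3", region_name="us-west-2")

s3.create_bucket(
    Bucket="example-backup-bucket",
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# Keep the bucket private by default.
s3.put_public_access_block(
    Bucket="example-backup-bucket",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```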

Right. The cost of a full restore will depend on how soon I want all the data back: from free (slowly restoring only what's needed right now) to exorbitant (getting all of it immediately). That cost, however, needs to be weighted by the probability of ever needing a full restore: you always pay for storage, that's a 100% probability, while the expectation of paying for a full restore is essentially never. Or 1%, or whatever else you think is appropriate.
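In expected-value terms the argument looks like this; every rate and the restore probability below are illustrative assumptions:

```python
# Expected yearly cost = storage paid with certainty, plus restore cost
# weighted by the probability of ever needing a full restore.
# All rates and the probability are illustrative assumptions.
DATA_GB = 2 * 1024

storage_per_gb_month = 0.0015    # assumed cold-tier storage rate
egress_per_gb = 0.09             # assumed per-GB retrieval/egress rate
p_full_restore_per_year = 0.01   # assumed 1% chance of a full restore in a given year

expected_yearly = (
    12 * DATA_GB * storage_per_gb_month                   # always paid
    + p_full_restore_per_year * DATA_GB * egress_per_gb   # paid only if disaster strikes
)
print(f"expected cost: ${expected_yearly:.2f}/year")  # dominated by storage, not restore
```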

Absolutely. And what pains me is that today the technology and services exist to make such a tool fast (modern storage is very fast), cheap (most users don't store enough data to need to pay $10/month; even the most expensive tier from the hyperscalers will be cheaper than $10 for most users), and with a UI tailored to humans (doctors, chefs, and music producers, not just IT pros). Crashplan did a great UI, but their pricing model caused them to throttle heavy users. Which is OK, except "heavy" in this case was about 1TB, which is not so much anymore. Backblaze had the right idea in terms of UI but utterly botched the implementation. It's abandonware, at least on the Mac.

The market has room for a nice little tool to fill that void. I hoped Duplicacy would be that tool; it has all the prerequisites. I probably still hope so, which is why I'm still hanging out here.


No, that's not the way it works. You can't just come in, claim GCD somehow 'doesn't work' because it wasn't designed for the purpose, and insist it should be removed, when that's patently untrue. It's like saying your car broke down, therefore there's a design flaw, with no evidence of any such thing.

My own experience with GCD - with personal and customer backups - has been extremely straightforward, with backups running flawlessly for years. Compared with B2 and S3, the ‘user experience’ is no different, so pretending your clear bias against Drive services has anything to do with Duplicacy usability and not with your own, frankly odd, ideas of what a backend should be and do, doesn’t wash.

First of all, none of your business. Secondly, it's entirely irrelevant to your claim that it's 'not designed for bulk data'.

As I've stated before, I personally store only a handful of TBs, use as much as I need, and do not abuse the service, contrary to what you say. I also benefit from being able to do full restores, whenever I choose, without being charged an arm and a leg. I feel pretty good about it.

GCD is clearly capable of storing bulk data.

You know well enough that Duplicacy isn't downloading hundreds of GBs unless the user has run a check -files, a check -chunks, or a full restore. The guy won't admit it, but it's obviously user error. (Which you will, of course, blame on Duplicacy, because a standalone backup tool should sort your sock drawer as well! :wink: )

If there’s anything to be removed from Duplicacy, it’s S3.

However, instead I’d suggest a big phat red warning: “Duplicacy is a standalone backup tool - used in conjunction with your chosen local storage or cloud service, which is not included. Henceforth, you are responsible for any and all fees. Be especially careful with enterprise-grade PAYG services, such as Amazon S3, as these incur hefty bandwidth costs in addition to storage.”

But this already went without saying.

Duplicacy, as a standalone tool, is not and cannot be responsible for your storage costs, in the same way that it doesn't give you free hard drive space or purchase your cloud account for you. It's not for complete newbies, nor should it be. Trying to fit it into that 'average joe user' box is misguided.

Duplicacy isn't 'forcing' users onto anything. Duplicacy doesn't even support archival tiers, the only reason anyone would want to use S3 anyway. Yet the option is there for those who know (and accept the risk of) what they're doing.

There's been a litany of horror stories about why such services are unsuitable for home-user backups. Even big corps and experts get burned:

https://www.troyhunt.com/how-i-got-pwned-by-my-cloud-costs/

https://www.theregister.com/2020/12/10/google_cloud_over_run/

https://dev.to/juanmanuelramallo/i-was-billed-for-14k-usd-on-amazon-web-services-17fn

https://news.ycombinator.com/item?id=13072830

The list goes on. I consider myself an expert, but I wouldn’t touch enterprise-grade PAYG cloud with a bargepole.

We haven’t seen the source so it’s hardly for anyone to judge.

I said starting point; I don't see why the current GUI couldn't be iterated on, since the underlying frameworks are open, in common use, and could be replaced while keeping the rest of the engine, which works well enough and can also be improved.

What's untrue exactly? That it has high latency? That it does not perform well with millions of files? That a Shared Drive does not limit the number of objects? That you either don't pay for what you use or you overpay, with no middle ground? That the future is murky, given that every single other unlimited service has perished?

Well, in this case you are an MSP. Your opinion does not matter; there are many more users than MSPs. You've done some calculations and concluded that, for the time being, it's OK to abuse Google's storage: customers don't care about latency, they pay a low price, and Google won't lose the data, so win-win.

Except nothing is ever free, and every single other unlimited service has collapsed. If someone comes to you offering "hey, wanna store 1PB for $12?", the only reasonable response is "get the hell out; I don't know how and where you are screwing me, but I don't want to know and don't care to figure it out."

I highly doubt that; at least for me it was drastically different. Your users won't tell you; they pay you money to handle that for them.

Yes, through the magic of having tried to use all of them, I have formed my opinions and biases.

These are two separate claims.

You are therefore overpaying for an inferior service, but that's your choice. We are not discussing what is good or bad for you or me personally; we are discussing what applies to most average users.

I do. It’s likely.

Yes. A backup tool should not let users make expensive mistakes. It's not rocket science to add a warning for known backends to remind the user about the cost. You say it's not its job; I say it's what makes the difference between abandonware and a successful product.
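As a sketch of what I mean (nothing like this exists in Duplicacy today; the backend names and per-GB rates are made up for illustration):

```python
# Hypothetical pre-flight warning before an operation that downloads many chunks
# (a full restore, or check -files / -chunks). The rate table is illustrative only.
EGRESS_RATE_PER_GB = {   # assumed per-GB egress prices for metered backends
    "s3": 0.09,
    "gcs": 0.12,
    "b2": 0.01,
}

def warn_before_download(backend: str, bytes_to_download: int) -> None:
    """Print an estimated egress cost before a bulk download starts."""
    rate = EGRESS_RATE_PER_GB.get(backend)
    if rate is None:
        return  # flat-rate or unknown backend: nothing to warn about
    gb = bytes_to_download / 1024**3
    print(f"Warning: this will download ~{gb:.0f} GB from {backend}, "
          f"costing roughly ${gb * rate:.2f} in egress fees. Continue? [y/N]")

warn_before_download("s3", 2 * 1024**4)  # a full restore of a 2 TB backup
```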

Well then. That's the root of the disagreement, it seems. If this is not for newbies and is intended for IT pros, I take back everything I said, stop expecting any positive change in this realm, wish the project success, and show myself out.

I just need to shake off the feeling that you might be wrong, because the whole Web UI exists: one that does not support multi-user environments or HTTPS properly, and that downloads unsigned executables from the internet as root. IT pros will love it! It's clearly for them.

If you are right, the Web UI should be scrapped and efforts should be focused on the "not complete newbies". Which rules out most customers, including myself. I don't want to be a backup pro; I just want to have backups done. But hey, maybe Microsoft will buy it out.

I’ll look through them.

In the same vein, I would not touch that free cheese. Google Workspace is a collaboration platform. Google does provide a storage service; it's called Google Cloud Storage. I prefer to use tools designed for the job. I want to know what I'm paying for, and to know that I pay for what I use, no more, no less. Sometimes I will miss a great, albeit temporary, "deal", but most of the time I will be right, and I will avoid wasting time analyzing the merits and finding the flaws in every too-good-to-be-true offering. (That saved time I will promptly waste arguing with strangers on forums, obviously.)

The S3 pricing model is not that hard; I absolutely love it. It incentivizes the intended use of the platform with pre-determined outcomes. I don't yet know what's in those articles you've linked, so I won't comment just yet.

We have seen the outcome, and the lack of progress over two years. I'll be the judge. It's dead-end abandonware at best, or an unsupportable, complex concoction of workarounds at worst. You don't need to see the code for that.

Nothing is impossible; the amount of effort determines whether it's feasible. But even without seeing the code, it's obvious to me that trying to steamroll a "control panel" component into a usable UI for a backup tool was a bad idea from the start. Why obvious? Because I see the state of it today, and two years ago, as a user. I have also seen the previous Duplicacy GUI. And I strongly suggest outsourcing the next version of it; no offense to anyone here. "The rest of the engine", a.k.a. Duplicacy on GitHub, is indeed good, and efforts would be best spent iterating on it instead of fumbling with web frameworks. But to pay for those efforts, someone needs to build an application usable by regular humans, because I personally would not pay the license fee when there are free alternatives available with a better UI. (I bought licenses as a way to donate to the project; I don't even use them.)

Again, to your point, if average Joes, including myself, are not the intended audience, then most of what I say is not applicable. And it's also not clear to me why I shouldn't use restic, borg, or Kopia if I have to learn a CLI anyway. They will work just fine for the vast majority of users. "Lockless deduplication" is cool on paper and appeals to nerds, but it makes little difference in the real world (in my anecdotal experience), and some of the listed tools support it already anyway.

So, where is the competitive advantage? Why should I pay money? I know for a fact that if there were a usable UI, I would now be paying for 8 annual subscriptions ($2/year is laughable and needs to be increased at least 10x, but that's indeed not for me to judge), and I would continually advocate for Duplicacy.

But I'm not an MSP. I'm a user, and I vote with my wallet.


What high latency? In what way - if that actually exists - does it prevent GCD from working with Duplicacy? It doesn’t.

In what way does it not perform well with millions of files? It performs fine; I'm not aware of any such scaling bottleneck with GCD. If you're referring to slow checks and prunes with tens of thousands of revisions, you know full well that has nothing whatsoever to do with GCD and is a processing bottleneck that happens with any backend.

Already known and accounted for. I don't use Shared Drives; I don't need to, since My Drive is perfectly suitable for the task. If I DID consider using one, 400k items is plenty for most situations, and I can always bump the chunk size.

This is wrong. There are multiple Workspace tiers which do provide a middle ground, and I can upgrade or downgrade at will. At present, I pay for the top-tier Enterprise Plus at ~£23/month, but of course you'll complain I'm not paying enough! Thankfully, this is not up to you. :slight_smile:

Again, enough with the unfounded nonsense. I do not abuse Google’s storage in any way.

Precisely! You're claiming GCD 'wasn't designed for bulk storage', which is provably false. So now you move the goalposts and appeal to the ethics of 'abusing' a system, which again, provably, isn't happening. This is equivocating, because your argument against GCD is biased and weak.

I was referring to the non-GUI-related code of the closed-source GUI.

There's still a large amount of work that makes it tick, and using that as a starting point, even if only to rewrite the visual framework, would be far better for any volunteer endeavour than embarking on a total rewrite.

Legally, I'm sure a free-for-personal-use, custom-made GUI front end would be okay, but implicit permission to improve on the GUI, granted by releasing the source code, would be a far better proposition, regardless of how much of it needs to be rewritten. It can be iterated on; it does not need to be a total rewrite.

At the very least, basic UX improvements can be made on the existing stack, which is far more likely to happen with many hands than just the one.
