Don't buy this scam

Huge thanks for your detailed explanation; I will avoid Kopia.

Btw there is no “official” container.

I am using Unraid as my main NAS system and don’t intend to switch any time soon, so I need to use Docker or a VM.

How does Duplicati rank on your list? I have tried it today, and to me it seems like a good compromise, with a huge community and an OK web UI for my needs.

Not necessarily. See this recent thread: How to generate ssh keys in dupkicacy web ui docker container on unraid - #6 by saspus

It’s not on my list. It used to be when I started, but now it’s beyond the bottom of the list. It’s not a piece of software worthy of consideration. Besides obviously lacking a stable version (why EOL the old codebase before the new one is stable?), it’s absolute garbage at the one task it is designed to do — keeping your backups intact. Its database gets corrupted if you just look at it wrong, and the reliance on the Mono framework does not help overall reliability.

Pretend it does not exist.

The problems were very well detailed by @saspus: both are essentially unfinished software (one still at a zero version, the other in eternal beta). They have one huge point of failure in common: they use databases.

One of the main reasons for Duplicacy’s robustness is that it only relies on the storage filesystem.


I don’t think Kopia uses central databases; its design is very similar to Duplicacy’s: both use content-addressable storage (CAS).
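For readers unfamiliar with the term, here is a minimal sketch of what content-addressable storage means in practice (my own illustration, not Kopia’s or Duplicacy’s actual code): each chunk is stored under the hash of its contents, so identical chunks deduplicate for free and a chunk can never be silently replaced with different data without its name changing.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"os"
	"path/filepath"
)

// putChunk writes a chunk into a content-addressable store rooted at dir.
// The file name is the SHA-256 of the content, so the same chunk is only
// ever stored once, and the name itself verifies the data.
func putChunk(dir string, data []byte) (string, error) {
	sum := sha256.Sum256(data)
	id := hex.EncodeToString(sum[:])
	path := filepath.Join(dir, id)
	if _, err := os.Stat(path); err == nil {
		return id, nil // chunk already exists: free deduplication
	}
	return id, os.WriteFile(path, data, 0o644)
}

func main() {
	dir, _ := os.MkdirTemp("", "cas")
	id1, _ := putChunk(dir, []byte("hello world"))
	id2, _ := putChunk(dir, []byte("hello world")) // same content, same id
	fmt.Println(id1 == id2)                        // true
}
```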


My conclusion over the past many years was, and still is, as follows:

  • FreeBSD and Linux: Duplicacy CLI (CLI only, free for personal use, paid for commercial use)
  • macOS and Windows: Arq7 (commercial, excellent UI)

Are there any reasons, outside of the nice GUI, to prefer Arq7 over Duplicacy CLI? (Windows user)


I don’t want to advertise a competitor’s product here, but since this can be framed as “why did Duplicacy lose a customer” and may ultimately help improve the product – I will.

I’m a macOS user, but some things will still apply to Windows too.

Dealbreakers (each has been requested of Duplicacy many times; any one of them is enough):

  • Support for AWS Archival storage. This is a full stop. Forcing users to use hot storage for backup is recklessly wasteful.
  • Support for user-mounted filesystems (via a helper process and impersonation). On Windows this is a separate disaster, because the OS does not support concurrent connections to the same server as different users. (Microsoft invented SMB, and yet here we are.)
  • Support for macOS multi-user environments (SIP, access permissions; I made a workaround for myself here).

Nice-to-haves (not dealbreakers, but have been requested many times too):

  • CPU throttling
  • Prevent/pause/throttle backups while on battery power
  • AWS cost management
  • AWS object lock management

New developments:


Thank you very much, appreciated!

With Arq 6 I lost all of my backups. It was a terrible disaster.

The publisher has raised the bar with Arq 7, which is now beginning to mature; however, it remains weak on reliability. For example, I encountered a lot of VMDK corruption issues on backups of powered-off Workstation VMs, an issue I never had with Duplicacy.

But the most annoying thing is that there is no procedure or possibility of repair in case of file corruption. If some files are corrupted, it is not uncommon to lose the whole backup set because you cannot restore it.

Apart from the rigid and unintuitive side of the Arq 7 interface, it remains interesting as a second backup tool.

The big difference that makes me put Duplicacy in front is its incredible robustness, and all the possibilities for recovering your data when a backup is corrupted. The huge advantage is that there is no database system to store metadata or anything else; everything happens at the filesystem level. It’s incredibly powerful and robust.

And with Duplicacy, the only limitation on performance is the hardware: CPU, RAM, or disk I/O. Arq 7 sometimes does nothing for hours (no CPU load, no disk I/O, RAM not full), then goes back to its slow pace until the next shutdown.

With Duplicacy, I almost halved my backup times.


I believe Kopia does use central databases, or what they call an index. I briefly looked at how it worked when it first came out. In my opinion, their way of handling the databases is too complicated, even more complicated than other competitors’.

This has changed, I think around 0.9: indexes are now append-only and also live in the CAS.

Can’t comment on Arq 6. It was a very short-lived, Electron-based experiment. Not even a contender. It was rightfully scrapped shortly after.

I doubt it had anything to do with the backup tool. Any backup solution is only as reliable as the underlying storage. On the other hand, with encryption enabled in any tool, including Arq and Duplicacy, there is no way for you to end up with bad data even if the storage rots. You either get back exactly what was backed up or nothing.
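To illustrate the “exactly what was backed up or nothing” point: authenticated encryption fails loudly if even a single ciphertext bit has rotted. A minimal sketch using Go’s standard AES-GCM (an illustration of the general property, not either tool’s actual on-disk format):

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

func main() {
	key := make([]byte, 32)
	nonce := make([]byte, 12)
	rand.Read(key)
	rand.Read(nonce)

	block, _ := aes.NewCipher(key)
	gcm, _ := cipher.NewGCM(block)

	ct := gcm.Seal(nil, nonce, []byte("backup chunk"), nil)

	ct[0] ^= 0x01 // simulate one bit of storage rot

	if _, err := gcm.Open(nil, nonce, ct, nil); err != nil {
		// You never get subtly wrong data back; decryption simply fails.
		fmt.Println("corruption detected:", err)
	}
}
```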

There are two factors that affect reliability (assuming your RAM and CPU are working correctly): storage and networking.

For storage it’s easy: use reliable storage that guarantees data consistency. Repairing data rot is the job of the filesystem, not applications. Datastore corruption then becomes an impossibility.

Duplicacy does support erasure coding, so it can tolerate some rot — but the problem is, if even one byte is allowed to rot, who is to say the damage will be limited to that one byte? Ultimately, the storage either guarantees data integrity or it does not. In the former case you don’t need to worry about corruption. In the latter, you have no guarantees at all, so don’t use that storage.
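For context on what erasure coding buys you, here is a hedged sketch of the general technique using the klauspost/reedsolomon library (I believe Duplicacy’s erasure coding is Reed-Solomon based as well, but this is not its actual implementation): with k data shards and m parity shards, up to m lost or damaged shards can be reconstructed, and anything beyond that is unrecoverable, which is why it complements rather than replaces trustworthy storage.

```go
package main

import (
	"fmt"

	"github.com/klauspost/reedsolomon"
)

func main() {
	// 10 data shards + 3 parity shards: any 3 shards may be lost.
	enc, err := reedsolomon.New(10, 3)
	if err != nil {
		panic(err)
	}

	data := make([]byte, 10*1024) // pretend this is a chunk file
	shards, _ := enc.Split(data)
	if err := enc.Encode(shards); err != nil {
		panic(err)
	}

	// Simulate rot: drop two shards entirely.
	shards[2], shards[7] = nil, nil

	// Reconstruct the missing shards from the survivors.
	if err := enc.Reconstruct(shards); err != nil {
		panic(err)
	}
	ok, _ := enc.Verify(shards)
	fmt.Println("all shards consistent again:", ok)
}
```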

The other factor — resilience to network interruptions — is indeed important, and both tools handle it very well, according to my limited synthetic testing and years of use.

Performance is never a selling point of a backup tool. It’s simply irrelevant. Yes, Duplicacy is very fast, which is nice, but I’ll take CPU throttling over any super-fast backup that uses full resources. There is no reason to hurry and finish a backup in 5 minutes only to then wait 12 hours for the next one.

Duplicacy benefits from a very fast regex engine, which helps a lot with scanning when exclusions are specified. Arq 5 used to be ridiculously slow, for reasons I did not look into. I don’t know about Arq 7, because I no longer use manual filters; everything gets backed up indiscriminately.
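(Duplicacy is written in Go, and Go’s regexp package is RE2-based, which guarantees roughly linear-time matching no matter how the patterns are written. A toy illustration of matching paths against exclusion regexes during a scan — my own code, not Duplicacy’s filter implementation:)

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Hypothetical exclusion patterns, compiled once before the scan.
	excludes := []*regexp.Regexp{
		regexp.MustCompile(`(^|/)node_modules/`),
		regexp.MustCompile(`\.(tmp|cache)$`),
	}

	paths := []string{
		"projects/app/node_modules/left-pad/index.js",
		"documents/report.pdf",
		"build/objects.cache",
	}

	for _, p := range paths {
		skip := false
		for _, re := range excludes {
			if re.MatchString(p) {
				skip = true
				break
			}
		}
		fmt.Printf("%-45s excluded=%v\n", p, skip)
	}
}
```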

This statement applies to any piece of software on earth :slight_smile:

Arq 7 sometimes does nothing for hours (no CPU load, no disk I/O, RAM not full), then goes back to its slow pace until the next shutdown

Have you gotten to the root cause of this? Apps don’t just do nothing; when they appear to not be doing anything, with all stacks idle, they are likely waiting for some asynchronous request, network, or disk IO. This is not specific to Arq or Duplicacy or Microsoft Word. Nobody inserts deliberate delays in the code :slight_smile:

For the VMDK corruption, support and I think the corruption happened during restore. We searched but never found out why.
I ran a new test just a few minutes ago, and with the latest version of Arq 7 it seems to work. If I reinstall the old Arq 7 version that I was using when I had the trouble, the problem occurs again with a new backup.

If I launch several backups, taking care to have modifications in the VMDK (just starting then stopping the VM), then in one snapshot out of two I have no files displayed and therefore nothing to restore.

This doesn’t seem to happen with the latest version. I couldn’t find anything in the changelogs that might relate to this issue, which is not very reassuring.

So the easiest thing for me was to go with Duplicacy: much less obscure, very practical in its CLI version, and it works just as well under Linux, which allows me to have a single backup format for my three platforms.

On performance I don’t agree. Nowadays it’s not up to humans to wait for tools, so I prefer a reliable and fast tool over a reliable but slow one. When I launch a new backup, I need the time it takes to be realistic. Arq 7 has improved on this point, but not enough for my taste, and given the articles found on the internet I am not the only one to think that.

But fortunately there are many tools; this makes it possible to satisfy more people, each finding the tool best suited to their needs. :slight_smile:

PS: I really like restic too, which seems like a very good tool :smiley:


Ha! This is very interesting. Maybe some weird versioning incompatibility. If this is repeatable – i.e. you back up a file, restore it, and it’s different – send that file to Arq; they should be able to debug it.

Can you clarify this a bit? Are you saying Arq does not detect changed files? Or does the VM solution change files without changing the timestamp?

Absolutely. Simplicity and robustness are what I value in Duplicacy as well. This is great for servers. But:

  • For workstations and personal laptops I value creature comfort more.
  • Support for archival storage is still missing. I don’t want to pay 4x in storage fees for no reason; for 4 TB this is literally the difference between $50 and $200/year (see the rough arithmetic below). I can find a better use for $150.
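To make the 4x concrete (using illustrative per-GB rates I’m assuming for the sake of the arithmetic, not a quote of current AWS pricing): a hot-ish tier at about $0.004/GB-month works out to 4096 GB × $0.004 × 12 ≈ $197/year, while an archival tier at about $0.001/GB-month comes to 4096 GB × $0.001 × 12 ≈ $49/year, hence the $50 vs $200 figures above.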

If that was about anything else I would agree. But backup specifically is a background task; there is never a hurry, and nobody is waiting for it to end. If you back up once every hour, it has the whole hour to make that incremental backup.

Just like when some people don’t use dishwashers because “I can wash dishes faster!” Well, guess what you are not doing while the dishwasher takes 2 hours to do that – you are not washing dishes!

Also, when I’m running on 5% battery remaining and Duplicacy decides to complete a backup in 30 seconds, fans blaring, immediately sending my laptop to sleep, I strongly dislike it. I’d prefer it to take 3 hours and not disturb me.

Two comments here:

  • initial backup is a one-time operation, and therefore it does not matter how long it takes.
  • I have yet to see a backup tool that is limited by anything but the network. Granted, I don’t run them on Raspberry Pis…

This is a bit of a sideways comment… but I have increasingly grown to take stuff written on the internet with a giant boulder of salt :slight_smile: In the vast majority of cases there is some sort of undisclosed factor making the whole ordeal pointless and misleading. The remaining genuine testimonials are so few and far between that they are not worth the effort of sifting through mountains of garbage. So I don’t read reviews anymore, no longer ask questions on forums, and don’t form an opinion on anything unless I’ve looked at and tried things myself. Especially where technology is concerned.

Yes. Competition is good, and nothing is perfect.

Restic is great. Borg is also fine. Some people get away with using git for backup. My main day-to-day backup is, in fact, Time Machine. Why? Because it’s integrated into the OS and provides an excellent UI. It takes 4 hours to complete an incremental backup – because it’s a background service and it stays out of the way. It supports exclusions if needed, but I don’t use them; storage is cheap. Arq to AWS is for disaster recovery; I never expect to need to restore from it.

At some point it becomes too much to keep poring over one solution, and you want something that “just works” without the need to read forums and write scripts. Time Machine and Arq allow me to do that (so I can obsess over other things :slight_smile: )

LoL, duplicacy.com doesn’t even comply with GDPR; good luck getting sued.
(C) 2019 is just sad; who doesn’t auto-update that?
You handle customer data, so where are your privacy policies?
Even with external payment processing, you still have at least the name and email of the person.

This indeed looks like an oversight. @gchen?

https://duplicacy.com/privacy_policy.html

Constructive criticisms are welcome, but labeling my paid software as a scam (while continuing to use my free software) is highly unprofessional, to say the least.


There are no links to it from anywhere I looked — including the homepage of duplicacy.com. It’s impossible to find it. Per GDPR it shall be prominently displayed and easy to find.

Completely agree.

while continuing to use my free software) is highly unprofessional, to say the least.

Yes, of course it is highly unprofessional. As a company, it is also highly unprofessional to completely ignore basic consumer rights and to interpret the law in a very strange way. That’s the cost of doing worldwide business. GDPR is not an opt-out law, where you tell users in your policy to install a browser plugin to disable Google Analytics. It’s the complete opposite: customers have to actively agree, and only after that can you send their data away. If the link is not accessible from the site, the policy basically doesn’t exist.
Yes, calling your product a scam is overdramatic, but there are some shady things going on, and everything I touch feels deprecated.
The constructive part of the criticism above is the following:
Either redesign the whole site and make the paid version actually feature-complete with a good GUI, or pay someone to do it.


This is for the forum: Privacy - Duplicacy Forum. You can access it by clicking the three-line icon in the top right corner and then clicking Privacy. It is just the default Discourse template and was set up before GDPR existed. It certainly needs an update.
