Yep, most chunks will be de-duplicated regardless of which snapshot ID they're under…
Note, however, that when you're uploading a new repository for the first time (or when the file history for a particular repository ID has been clobbered by another location's file metadata under the same ID, as you've just experienced), the job takes longer to finish because Duplicacy re-hashes all file contents locally. It will skip uploading the majority of chunks, even though it may look like it's still uploading. Those numbers aren't upload speeds, but the throughput of how much data is being processed.
As for deleting your 'Live' files, there probably isn't much point, as most of the chunks will be referenced by the Archive. Instead, I'd set a normal retention period for your prune operation on that repo, and those old snapshots will disappear eventually. You can always prune individual revisions to speed that process up, e.g. to tidy up the period when it was interleaving between the two repositories with the same ID.
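For instance, a retention schedule and a one-off revision prune might look something like this (the -keep values and revision numbers below are just placeholders, adjust them for your own setup):

```
# Hypothetical retention schedule: delete snapshots older than a year,
# keep one per week after 30 days, and one per day after 7 days
duplicacy prune -keep 0:365 -keep 7:30 -keep 1:7

# Or prune specific revisions right away (revision numbers are made up)
duplicacy prune -r 12 -r 13

# Either way, adding -dry-run shows what would be removed without deleting anything
duplicacy prune -dry-run -keep 0:365 -keep 7:30 -keep 1:7
```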
You really don’t need to do anything special with the backup jobs when moving stuff from Live to Archive - just move your files between your repositories and Duplicacy will take care of the de-duplication for you.
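As a rough sketch of that workflow (the paths and folder names here are hypothetical):

```
# Move the files on disk from the Live repository to the Archive one
mv ~/live/old-project ~/archive/old-project

# Back up the Archive repo: the moved files hash to chunks that already
# exist in the storage, so almost nothing new gets uploaded
cd ~/archive && duplicacy backup -stats

# Back up the Live repo so its next snapshot records the files as removed
cd ~/live && duplicacy backup -stats
```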
Edit: Oh, and as @leerspace mentioned, it’s a good idea to test with