RAM usage for a 30 TB backup

Hello,
I just got Duplicacy WebUI, and one of the folders I need to back up is about 30TB.
I started the backup, but I can see the Duplicacy Docker container eating up memory very quickly, and it looks like it will use up all the available RAM on the server well before the backup finishes. Is there a way to limit its memory usage? I'm also not sure what happens once all the memory is used up; does Duplicacy just start over?
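
I know Docker itself can cap a container's memory, so my first thought is something like the snippet below (the limit value and image name are just placeholders for my setup, and I haven't tested how Duplicacy behaves when it hits the cap):

```sh
# Hypothetical example: run the Duplicacy Web container with a hard 8 GB memory cap.
# Image name and limit are placeholders; adjust for the actual setup.
docker run -d \
  --name duplicacy-web \
  --memory=8g \
  --memory-swap=8g \
  my-duplicacy-web-image
```

My worry is that the container would just get OOM-killed mid-backup instead of Duplicacy slowing down gracefully.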

What should I do?