Benchmark command details

The benchmark command tests the upload and download speeds of a specific storage, as well as the read and write speeds of the local disk holding your repository.


Quick overview

NAME:
   duplicacy benchmark - Run a set of benchmarks to test download and upload speeds

USAGE:
   duplicacy benchmark [command options]

OPTIONS:
   -file-size <size>            the size of the local file to write to and read from (in MB, default to 256)
   -chunk-count <count>         the number of chunks to upload and download (default to 64)
   -chunk-size <size>           the size of chunks to upload and download (in MB, default to 4)
   -upload-threads <n>          the number of upload threads (default to 1)
   -download-threads <n>        the number of download threads (default to 1)
   -storage <storage name>      run the download/upload test against the specified storage

Sample output:

duplicacy benchmark
Storage set to sftp://gchen@192.168.1.125/storage
Generating 244.14M byte random data in memory
Writing random data to local disk
Wrote 244.14M bytes in 3.05s: 80.00M/s
Reading the random data from local disk
Read 244.14M bytes in 0.18s: 1388.05M/s
Split 244.14M bytes into 53 chunks without compression/encryption in 1.69s: 144.25M/s
Split 244.14M bytes into 53 chunks with compression but without encryption in 2.32s: 105.02M/s
Split 244.14M bytes into 53 chunks with compression and encryption in 2.44s: 99.90M/s
Generating 64 chunks
Uploaded 256.00M bytes in 62.88s: 4.07M/s
Downloaded 256.00M bytes in 63.01s: 4.06M/s
Deleting 64 temporary files
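
The options can be combined to shape the test. As a rough sketch, a quicker 64M transfer test (16 chunks × 4M) against a named secondary storage might look something like this, where b2-test stands in for whatever storage name you have configured:

duplicacy benchmark -storage b2-test -chunk-count 16 -chunk-size 4 -upload-threads 4 -download-threads 4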


I’m wondering why the -file-size option only changes the size of the local file benchmark, while the upload/download to the remote storage is still 256M?

I was attempting to benchmark B2 on the free account with its 1GB cap, so I could only fit 3 full 256M runs. I wanted a 4th run of 150M using the -file-size option, but it still uploaded/downloaded 256M.

The local test split 150M into 34 chunks while the remote test split 256M into 64 chunks.

duplicacy benchmark -storage b2-test -upload-threads 4 -download-threads 4 -file-size 150
Repository set to D:/testuploadsmallCopy
Storage set to b2://duplicacytest
download URL is: https://f003.backblazeb2.com
Generating 150.00M byte random data in memory
Writing random data to local disk
Wrote 150.00M bytes in 0.18s: 845.98M/s
Reading the random data from local disk
Read 150.00M bytes in 0.03s: 5354.80M/s
Split 150.00M bytes into 34 chunks without compression/encryption in 0.64s: 234.75M/s
Split 150.00M bytes into 34 chunks with compression but without encryption in 0.87s: 173.00M/s
Split 150.00M bytes into 34 chunks with compression and encryption in 0.89s: 168.92M/s
Generating 64 chunks
Uploaded 256.00M bytes in 65.59s: 3.90M/s
Downloaded 256.00M bytes in 22.82s: 11.22M/s
Deleted 64 temporary files from the storage

I’m also wondering whether this has been fixed yet?

The size of the download/upload test is controlled by the -chunk-count and -chunk-size options: download/upload size = chunk-count × chunk-size. The -file-size option only affects the local disk read/write test.
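
Worked out with the defaults, that is 64 chunks × 4M = 256M, which matches the output above. To stay within a 150M budget you could shrink the chunk count instead of -file-size; for instance, something like this would transfer 37 × 4M = 148M (reusing the b2-test storage name from the post above):

duplicacy benchmark -storage b2-test -chunk-count 37 -chunk-size 4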
