CloudBerry Backup is designed for fast backups, but in some cases backup can be slow. This article covers what you can do to speed it up.
Most Windows users keep their firewalls and antivirus programs active. In some cases these can be the cause of low backup speed: firewalls and antivirus programs can block unknown applications, limit bandwidth, and interrupt long transfers. That means either of them can slow down a backup.
As a first step, check your connection whitelists:
Here are some more things that can improve the overall performance of a backup.
1) Adjust the chunk size so that it is equal to (or a factor of) the size on the screenshot.
This is not an exact calculation, just a heuristic: a single chunk (or N x chunks) should fit approximately 70% of your files.
That way there won't be many chunks containing too little data; for example, a 21.5 MB file cannot be split into 7 MB chunks very efficiently.
For example, if the average size of a single file is 0.5 MB, lower the chunk size to the minimum of 5 MB to optimize the process.
If the average size of a single file is 17 MB, set the chunk size to 18 MB. However, also consider the recommendations in point 2).
In short: smaller chunks for smaller files, bigger chunks for larger files.
Be careful, though: RAM consumption may grow according to the formula below:
RAM = 2 x thread count x chunk size
To distinguish "small" from "large" chunk/file sizes, it is important to consider network bandwidth: the greater the bandwidth, the larger the chunk size can be.
For a gigabit connection a 10 MB chunk is too small (unless it was chosen because the files themselves are very small); a value around 120 MB works better.
For a modem connection (and, to a lesser extent, satellite or cellular), set the minimal value of 5 MB.
It is also important to note that a site's connection is most likely shared by multiple processes on multiple machines (or by multiple processes on a single machine, in the case of a dedicated connection).
So instead of the full theoretical bandwidth, we recommend using the expected available bandwidth for these calculations.
2) Adjust the thread count to utilize network bandwidth and CPU most effectively: too many threads may overload both and decrease overall performance; too few will be slow.
If the network connection is not very stable, smaller chunks and a small thread count work better.
For small files, use a high thread count and small chunks.
For mid-size or large files, we recommend 6 threads and a chunk size chosen according to point 1).
3) Exclude unimportant files (or include only important ones) using Advanced Filters.
Here is how it looks on different systems (clickable):
Windows, Linux/Mac GUI, Linux web UI.
For example, you can split your data across several backup plans.
Pictures and videos, say, do not change frequently, so separate them into a monthly backup plan.
MS Word documents, on the contrary, change several times a day, so schedule a daily plan for [ *.doc; *.docx ] files.
And you probably never want to back up temporary files, so exclude (or simply don't include) all [ *.tmp; *.temp; *~*; *.bak ] files from all plans.