-
Does anyone have an idea of how much memory clickhouse-backup can consume when running as a container? Its footprint is fairly small while running, but I'd guess it can balloon while a backup is in progress. Is it some percentage of the database size, or is it fairly well controlled by the software?
-
See https://github.com/AlexAkulov/clickhouse-backup#concurrency-cpu-and-memory-usage-recommendation

Most of the memory is allocated during the upload and download operations, per concurrent stream. How much RAM it allocates therefore depends on how much upload/download concurrency you use. By default that is half of the available CPU cores, so setting UPLOAD_CONCURRENCY=3 and DOWNLOAD_CONCURRENCY=3 can significantly reduce memory usage (a config sketch follows below).
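A minimal config.yml sketch of the same settings, assuming the environment variables above map to the upload_concurrency / download_concurrency keys of the general section as in the project's default config (the values here are illustrative):

  general:
    # fewer parallel upload/download streams means fewer in-flight
    # buffers, so lower peak RAM during backup upload and restore download
    upload_concurrency: 3
    download_concurrency: 3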
It also depends on the compression format: compression_format: none or compression_format: tar is cheaper for memory/… (a sketch of that setting follows below). In our tests for a 2Tb backup on 32 cores (16 concurrency) to S3 with zstd compression …
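A minimal sketch of where that setting lives, assuming S3 is the remote storage (as in the tests mentioned above) and the layout of the README's default config, where compression_format sits in the storage section; tar is shown here and none works the same way:

  s3:
    # tar (or none) avoids the in-memory compression buffers that
    # formats such as zstd or gzip need, at the cost of a larger remote backup
    compression_format: tar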