Capacity Planning for Storage-Compute Decoupled Mode #51991
-
You can start from the general rule of thumb of 1 CPU core : 4 GB memory and 1 CPU core : 50 GB data for estimation. However, there are no universal stress-test metrics for cluster sizing; it depends closely on your actual scenario and your business. For example: is the data log-based, report-based, or of another type? These factors can lead to significant differences in data scan volumes. Another example: if data is stored on a daily basis, then even if the total volume reaches 1 TB (accumulated over a year), the daily data volume may be only around 3 GB. In that case, if queries filter by day, the actual scan volume will be very small. Therefore, cluster scale must be evaluated through stress testing with your business data. It usually depends on two key factors: the concurrency of access in the scenario and the computational complexity of the data requirements. For detailed recommendations, we can have a deeper discussion.
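To make the rule of thumb above concrete, here is a rough back-of-envelope sketch (not an official Doris sizing tool; the `estimate_cluster` helper and its defaults are illustrative assumptions based on the 1 core : 4 GB memory and 1 core : 50 GB data ratios). The second call shows how day-partitioned data with day-filtered queries shrinks the sizing input dramatically:

```python
def estimate_cluster(hot_data_gb, mem_per_core_gb=4, data_per_core_gb=50):
    """Estimate cores and memory from the data volume queries actually scan.

    Ratios follow the 1 core : 4 GB memory and 1 core : 50 GB data
    rule of thumb; treat the result as a starting point for stress tests,
    not a final sizing.
    """
    cores = max(1, -(-hot_data_gb // data_per_core_gb))  # ceiling division
    memory_gb = cores * mem_per_core_gb
    return cores, memory_gb

# 1 TB accumulated over a year, stored daily:
total_gb = 1024
daily_gb = total_gb / 365  # ~2.8 GB scanned by a day-filtered query

print(estimate_cluster(total_gb))          # full-year scans -> (21, 84)
print(estimate_cluster(round(daily_gb)))   # day-filtered scans -> (1, 4)
```

The point of the sketch: the same 1 TB dataset implies very different cluster sizes depending on whether queries scan the whole table or a single day's partition, which is why stress testing against real query patterns matters more than total data volume.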
-
I am setting up Doris 3 in compute-storage decoupled mode on Kubernetes.
My target Doris cluster will only be used by a single application and will host around 100 TB of data. What config would you recommend for this setup?