Hi,
Our development server is a single node with 30 GB of RAM. Loading (ETL) 250 MB of data takes approximately 2 minutes.
Is there a way to estimate how much time would be required to load 1 GB of data on a 3-node cluster with 122 GB of RAM per node?
We need this to assess the number of nodes or the amount of RAM required for queries to run faster on a raw data size of 1 TB.
Are there any formulas available for extrapolating load times so that we can plan the hardware upgrade beforehand?
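For illustration, the kind of naive linear extrapolation we have in mind is sketched below. The linear-scaling assumption (time proportional to data volume, inversely proportional to node count) is ours and is almost certainly optimistic, since real load times also depend on disk and network I/O and projection design:

# Naive back-of-envelope extrapolation from a measured baseline.
# Assumes load time scales linearly with data volume and inversely
# with node count -- treat the output as a first guess only.

def estimate_load_minutes(data_gb, nodes, baseline_gb=0.25,
                          baseline_minutes=2.0, baseline_nodes=1):
    """Scale a measured baseline load time to a new data size and cluster."""
    per_node_gb_per_min = baseline_gb / (baseline_minutes * baseline_nodes)
    return data_gb / (per_node_gb_per_min * nodes)

# 1 GB on a 3-node cluster, from the 250 MB / 2 min / 1-node baseline:
print(estimate_load_minutes(1.0, 3))     # ~2.7 minutes
# 1 TB (1024 GB) on the same 3-node cluster:
print(estimate_load_minutes(1024.0, 3))  # ~2731 minutes (~45.5 hours)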
Thanks,
Saritha
Workload Management & Capacity Planning
JimKnicely (Site Admin)
Re: Workload Management & Capacity Planning
Hi,
2 minutes seems like an awfully long time to load 250 MB of data.
Can you run the vioperf utility to check the performance of your host's input and output subsystem?
Go here to see how to use vioperf:
https://my.vertica.com/docs/7.1.x/HTML/ ... ioperf.htm
Also, take a look at the "Recommendations for Sizing HP Vertica Nodes and Clusters" guide here:
https://community.dev.hp.com/t5/Vertica ... a-p/228148
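As a rough sanity check, you can also compare your observed load rate with the raw disk bandwidth that vioperf reports. A minimal sketch (the disk write rate below is a placeholder; substitute the MB/s figure from your actual vioperf run):

# Compare the observed load rate with the disk's raw write bandwidth.
# The 250 MB / 2 min figure comes from the original post; the disk
# rate below is hypothetical -- replace it with your vioperf result.

observed_mb = 250.0
observed_seconds = 2 * 60
observed_rate = observed_mb / observed_seconds       # ~2.1 MB/s

vioperf_write_rate = 200.0  # MB/s, hypothetical placeholder

print(f"Observed load rate: {observed_rate:.1f} MB/s")
print(f"Disk write rate:    {vioperf_write_rate:.1f} MB/s")
print(f"Load uses ~{100 * observed_rate / vioperf_write_rate:.0f}% "
      "of raw disk bandwidth")

If the load is only using a few percent of what the disks can sustain, the bottleneck is likely elsewhere (network, the ETL tool, or the load method) rather than the database itself.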
Jim Knicely
Note: I work for Vertica. My views, opinions, and thoughts expressed here do not represent those of my employer.