> In PgTune I can get configuration changes based on RAM, disk, and number of connections,
> but can we recommend RAM, disk, and number of connections based on DB size? Is there any calculation for that?
> For example, for a 1 TB database, how much RAM and disk space is required for good performance?
> The DB size will increase by 20 GB per day.
> Frequent deletes and inserts will happen.
The database size by itself is not enough to provide any sensible
recommendation for RAM. The connection count and usage patterns matter
at least as much.
There are 1 TB databases that work really well with as little as
40 GB of RAM, if the connection count is limited, all queries are
index-based, and the active data set is fairly small.
On the other hand, if you have many connections and non-indexed
access, you might need 10x or 20x more RAM for sustained
performance.
That's why the PgTune configurator requires you to enter RAM,
connection count and a DB access pattern class (OLTP/Web/DWH).
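To make the "blind guess" concrete, here is a hypothetical sketch of PgTune-style rules of thumb (these are commonly cited heuristics, not PgTune's exact published formulas): shared_buffers around 25% of RAM, effective_cache_size around 75%, and work_mem derived from the remaining memory divided across connections, scaled by workload class. The divisors are illustrative assumptions.

```python
def suggest(ram_gb, max_connections, workload):
    """Blind-guess memory settings from RAM, connections and workload.

    workload is one of 'oltp', 'web', 'dwh'. The percentages and
    per-connection divisors are assumptions for illustration only.
    """
    ram_mb = ram_gb * 1024
    shared_buffers = ram_mb // 4            # ~25% of RAM
    effective_cache_size = ram_mb * 3 // 4  # ~75% of RAM
    # DWH queries sort and hash more, so give each backend more work_mem.
    per_conn_divisor = {'oltp': 4, 'web': 4, 'dwh': 2}[workload]
    work_mem = max(4, (ram_mb - shared_buffers)
                   // (max_connections * per_conn_divisor))
    return {
        'shared_buffers': f'{shared_buffers}MB',
        'effective_cache_size': f'{effective_cache_size}MB',
        'work_mem': f'{work_mem}MB',
    }

# The 40 GB / limited-connections example from above:
print(suggest(40, 100, 'oltp'))
```

Notice that database size appears nowhere in the calculation, which is exactly the point: the inputs are RAM, connections and workload class.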
Anyway, what PgTune gives is just an approximate "blind guess"
recommendation. If auto-configuration were easy, we would have had it
in core Postgres a long time ago. It would be nice to have a
configuration advisor based on active data set size... but I doubt it
will be created, for several reasons: first, it would still be a
"blind guess"; second, the current version of pgtune is not that nice
for contributors (fairly ugly JS code).