If I use Postgres's EXPLAIN command, the output includes a top-level "cost". Assuming that estimate is accurate (i.e. setting aside the fact that in reality the cost can be quite unreliable and/or inconsistent), what is the very approximate conversion from cost to query duration in minutes or seconds, for a "large" cost?
In my case, the query cost is 60 million.
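By "cost" I mean the total cost on the topmost node of the plan, i.e. the second number in its `cost=startup..total` pair. Roughly like this (the node type and figures here are only illustrative, not my actual plan):

```
GroupAggregate  (cost=59850000.00..60000000.00 rows=1000 width=72)
  ...
```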
For context, my hardware is a regular laptop and the data is 12M rows joining to 250K rows on an indexed column, grouped on several columns to produce 1K rows of output.
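In other words, the query is shaped roughly like the sketch below. The table and column names are placeholders rather than my real schema, but the row counts match what I described:

```sql
-- ~12M-row table joined to a ~250K-row table on an indexed column,
-- grouped on a few columns down to ~1K output rows.
SELECT f.category,
       f.region,
       d.status,
       count(*)      AS n_rows,
       sum(f.amount) AS total_amount
FROM   fact_rows f                     -- ~12M rows
JOIN   dim_rows  d ON d.id = f.dim_id  -- ~250K rows; f.dim_id is indexed
GROUP  BY f.category, f.region, d.status;  -- ~1K groups
```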
This question is not about the query itself per se - there may well be better ways to write it. It is also not about how inaccurate, unreliable or inconsistent the EXPLAIN output can be.
This question is about estimating how long a query would take to run, given its EXPLAIN cost and assuming that the EXPLAIN output is in fact an accurate analysis of the query.
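To be concrete, I'm hoping for the kind of back-of-the-envelope conversion sketched below. The 0.000001 seconds-per-cost-unit factor is purely a placeholder; a realistic value for it (even to within an order of magnitude) is exactly what I'm asking for:

```sql
-- Placeholder rule of thumb: seconds ≈ total_cost * seconds_per_cost_unit.
-- 0.000001 is NOT a real factor, just an example of the shape of answer I want.
SELECT 60000000 * 0.000001 AS estimated_seconds;  -- 60 seconds with this made-up factor
```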