I have a Python list of upwards of 2000 elements. I have a Postgres table I need to query for each one of these elements, and I am currently using the following:
cur.execute("SELECT * FROM patent_info WHERE patentno = ANY (%s);", (list_of_patent,))
What this does is query the table one at a time for every element in the list, and I store this data in a dataframe afterwards using
data = cur.fetchall()
The issue is that with over 2000 elements the response sometimes takes long enough that my web app or the browser times out.
I was looking for a way to query all the elements from the table faster or more efficiently.
I thought about using a for loop over every element in the list and calling cur.fetchone(), but I assumed that would be slower.
Do a join instead:
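Here is a minimal sketch of that idea, assuming a psycopg2 connection; the connection string and the unnest/USING query are illustrative, and only patent_info, patentno, and list_of_patent come from your question. The list is sent once as an array, unnested into a row set on the server, and joined against the table, so all values are matched in a single set-based pass:

import psycopg2

conn = psycopg2.connect("dbname=yourdb")  # placeholder connection details
cur = conn.cursor()

# psycopg2 adapts the Python list to a Postgres array; unnest turns that array
# into a one-column relation named wanted(patentno) that we join on.
# Adjust the ::text[] cast to match the actual type of the patentno column.
cur.execute("""
    SELECT p.*
    FROM unnest(%s::text[]) AS wanted(patentno)
    JOIN patent_info AS p USING (patentno);
""", (list_of_patent,))

data = cur.fetchall()

Whichever form you use, make sure patentno is indexed; without an index, both the ANY filter and the join have to scan the whole table for every request.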