Query a Postgres table using a large Python list and psycopg2, with speed?


I have a Python list of upwards of 2000 elements, and a Postgres table I need to query for each of these elements. I am currently using the following:

cur.execute("SELECT * FROM patent_info WHERE patentno = ANY (%s);", (list_of_patent,))

This queries the table for every element in the list, and I store the result in a dataframe afterwards using

data = cur.fetchall()

The issue is that with over 2000 elements the response takes a while to generate, sometimes causing my webapp or the browser to time out.

I was looking for a way to query all the elements from the table faster or more efficiently.

I thought about using a for loop over every element in the list with cur.fetchone(), but I assumed that would be slower.
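One workaround worth trying before restructuring the query is batching: split the list into fixed-size chunks and run the existing `ANY (%s)` query once per chunk, so no single round trip has to handle all 2000 values. This is a sketch, not the answer's method; `list_of_patent` and `cur` are the names from the question, and the batch size of 500 is an arbitrary illustration.

```python
def chunked(seq, size):
    """Yield successive slices of seq, each at most `size` elements long."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# Hypothetical usage with the question's cursor `cur` (not run here):
# rows = []
# for batch in chunked(list_of_patent, 500):
#     cur.execute("SELECT * FROM patent_info WHERE patentno = ANY (%s);", (batch,))
#     rows.extend(cur.fetchall())
```

Batching bounds the work per statement, but it still issues multiple queries; the join approach in the answer below keeps everything in one round trip.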


1 Answer

Answer by Clodoaldo Neto:

Do a join instead:

cursor.execute("""
    select *
    from
        patent_info
        inner join
        unnest(%s) u(patentno) using (patentno)
""", (list_of_patent,))