String buffer fails to write data to database table


I am porting a MongoDB database over to PostgreSQL and I came across an issue. I am using psycopg2's copy_from, which takes a file object, the name of the table to write to, and other optional arguments. My original code looked like the following:

records = '\n'.join(','.join([row['id'], row['att1'], row['att3']]) for row in data)
fio = io.StringIO(records)
cursor.copy_from(fio, 'mytable', sep=',')
postgres.commit()

The code above works fine but fails for values containing commas, since the separator splits them into extra columns. I therefore want to escape commas and any other punctuation that may interfere. Python's csv module handles this, so I changed the code to the following:

fio = io.StringIO()
writer = csv.writer(fio)
writer.writerows([row['id'], row['att1'], row['att3']] for row in data)
cursor.copy_from(fio, 'mytable', sep=',')
postgres.commit()

Using the code above, mytable remains empty no matter what. I tried iterating over fio after writing the rows and its content matches the first snippet (the one using ','.join). I also checked the size of the object, and it is about the same in both snippets after writing the records.

What am I missing here? Why isn't the data being written to the table in the second example?

1 Answer

Answered by Nick (accepted):

After writing to fio, the stream position is at the end of the buffer. You need to seek back to the start before psycopg2 reads it.

It's an easy modification:

fio = io.StringIO()
writer = csv.writer(fio)
writer.writerows([row['id'], row['att1'], row['att3']] for row in data)

fio.seek(0) # Return to beginning of file.

cursor.copy_from(fio, 'mytable', sep=',')
postgres.commit()
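The stream-position behavior can be demonstrated without a database connection. The snippet below (using hypothetical sample rows in place of the original data) shows that reading the buffer right after writing returns nothing, while reading after seek(0) returns the full CSV content:

```python
import csv
import io

# Hypothetical sample rows standing in for the original `data`.
data = [
    {'id': '1', 'att1': 'a,b', 'att3': 'c'},
    {'id': '2', 'att1': 'd', 'att3': 'e'},
]

fio = io.StringIO()
writer = csv.writer(fio)
writer.writerows([row['id'], row['att1'], row['att3']] for row in data)

# Without seek(0), reading starts at the end of the buffer: nothing comes back.
assert fio.read() == ''

# After rewinding, the full CSV content is available again.
# Note the csv module quoted the field containing a comma.
fio.seek(0)
assert fio.read().splitlines() == ['1,"a,b",c', '2,d,e']
```

One caveat worth checking: copy_from with sep=',' uses PostgreSQL's plain text COPY format, which does not interpret CSV-style quoting, so a quoted field like "a,b" may still be split or loaded with literal quote characters. If that turns out to be a problem, cursor.copy_expert("COPY mytable FROM STDIN WITH (FORMAT csv)", fio) tells PostgreSQL to parse the buffer as real CSV.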