Inserting large binary blobs with aiomysql


I'm trying to insert a large binary blob (column type LONGBLOB) into a local MariaDB instance using aiomysql version 0.0.20 (https://pypi.org/project/aiomysql/).

I pass the blob as a Python str object to an INSERT statement using %s parameter substitution: execute('INSERT INTO blah VALUES (%s)', (large_blob,)).
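Roughly, what I'm doing looks like this (connection details, database name, and table name are placeholders; the real blob is of course much larger):

```python
import asyncio
import aiomysql

async def insert_blob(large_blob: bytes) -> None:
    # Placeholder connection parameters for a local MariaDB instance.
    conn = await aiomysql.connect(
        host="127.0.0.1", port=3306,
        user="user", password="password", db="mydb",
    )
    try:
        async with conn.cursor() as cur:
            # Parameterized INSERT: the driver escapes the value and embeds
            # it into the statement it sends to the server.
            await cur.execute("INSERT INTO blah VALUES (%s)", (large_blob,))
        await conn.commit()
    finally:
        conn.close()

# Tiny stand-in value just to show the call shape.
asyncio.run(insert_blob(b"\x00" * 10))
```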

The blob is rather large (~500 million characters, so probably 0.6 to 1 GB), but certainly below the 4 GB limit of LONGBLOB.

The insert seems to fail due to network errors... How may I go about doing this? Is there a way to do it with aiomysql? What would be the easiest async alternative (e.g. starting an async sub-process and having MariaDB read the blob from a file)?
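For the sub-process idea, I imagine something roughly like the sketch below: dump the blob to a file the server can read, then have the mariadb CLI run LOAD_FILE() so the bytes never pass through the Python driver. The CLI invocation, credentials, and file path are assumptions on my part, and LOAD_FILE() needs the FILE privilege plus a path permitted by secure_file_priv.

```python
import asyncio

async def insert_blob_from_file(path: str) -> None:
    # Assumed table name "blah" with a single LONGBLOB column;
    # the file at `path` must already contain the raw blob bytes
    # and be readable by the MariaDB server process.
    sql = f"INSERT INTO blah VALUES (LOAD_FILE('{path}'))"
    proc = await asyncio.create_subprocess_exec(
        "mariadb", "-u", "user", "-ppassword", "mydb", "-e", sql,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    _, stderr = await proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError(stderr.decode())

asyncio.run(insert_blob_from_file("/var/lib/mysql-files/blob.bin"))
```

Would something along those lines be the recommended route, or is there a cleaner option within aiomysql itself?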

