I am conducting the following data migration in SQLAlchemy and the operation keeps timing out, so I would like to batch the query in a way that allows me to complete the migration.
session = Session()
q = session.query(RawDocument)
for i in q:
    my_tf_dict = get_word_count(i)   # word counts for this document
    new_ob = ten_q()                 # new row that will hold the result
    new_ob.tf_dict_phrases = my_tf_dict
    session.add(new_ob)
    del my_tf_dict
session.commit()
I think yield_per() may be an option: https://www.codepowered.com/manuals/SQLAlchemy-0.6.9-doc/html/orm/query.html
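For concreteness, here is roughly what I imagine a yield_per() version would look like. This is an untested sketch: the BATCH_SIZE of 1000 is an arbitrary guess, and I am using a second session for the writes so that committing each batch does not interfere with the streaming read.

read_session = Session()
write_session = Session()
BATCH_SIZE = 1000  # arbitrary guess at a chunk size

pending = 0
for doc in read_session.query(RawDocument).yield_per(BATCH_SIZE):
    new_ob = ten_q()
    new_ob.tf_dict_phrases = get_word_count(doc)
    write_session.add(new_ob)
    pending += 1
    if pending >= BATCH_SIZE:
        write_session.commit()  # persist this batch and free the session's memory
        pending = 0

write_session.commit()  # commit whatever is left over
read_session.close()
write_session.close()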
Also, maybe an approach along the lines of these batch queries could work: https://carto.com/docs/carto-engine/sql-api/batch-queries/
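Those docs describe Carto's Postgres batch SQL API rather than anything SQLAlchemy-specific, but maybe the same idea, processing the table in fixed-size chunks and committing one chunk at a time, would work here. A sketch of that, assuming RawDocument has an integer primary key column named id:

BATCH_SIZE = 1000  # arbitrary guess at a chunk size
last_id = 0

while True:
    session = Session()
    batch = (session.query(RawDocument)
                    .filter(RawDocument.id > last_id)
                    .order_by(RawDocument.id)
                    .limit(BATCH_SIZE)
                    .all())
    if not batch:
        session.close()
        break
    for doc in batch:
        new_ob = ten_q()
        new_ob.tf_dict_phrases = get_word_count(doc)
        session.add(new_ob)
    last_id = batch[-1].id  # keyset pagination: resume after the last id seen
    session.commit()        # one short transaction per chunk instead of one huge one
    session.close()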
Table I am trying to update (tf_dict_phrase is the column I am populating, so it is currently empty):

id  cik     tf_dict_phrase  factset_id  key_name
1   706688  (empty)         000BFT-E    10q/706688_2005-5-10_10-Q
2   706688  (empty)         000BFT-E    10q/706688_2005-8-8_10-Q
3   706688  (empty)         000BFT-E    10q/706688_2005-11-8_10-Q