Looping over records to get the data and push to an API


I am using the following code. It works, but for 20,000 records the loop takes too long and the request times out.

So basically it is like this:

<cfset x = new myapi()>
<!--- batch size; the break below assumes a full page returns this many rows --->
<cfparam name="length" default="1000">
<!--- row offset to start from --->
<cfparam name="start" default="0">
<cfloop condition="true">
    <cfquery name="rs">
        select * from mytable
        limit <cfqueryparam value="#start#" cfsqltype="cf_sql_integer">,
              <cfqueryparam value="#length#" cfsqltype="cf_sql_integer">
    </cfquery>
    <cfset start += rs.recordCount>
    <cfset myst = queryToJson(rs)>
    <cfset call = x.UpsertData(myst)>
    <!--- a short page means this was the last batch --->
    <cfif rs.recordCount LT length>
        <cfbreak>
    </cfif>
</cfloop>

I can't use the latest ColdFusion because I'm still stuck on CF11, and the upsert endpoint expects the data to be sent as JSON.


There are 2 answers

Sebastian Zartner

If you don't care how long your script runs and just want to avoid the timeout, you can increase the time limit via <cfsetting>. For example, you can set the timeout to 10 minutes like this:

<cfsetting requesttimeout="600">
Adrian J. Moreno

From the comments, the asker clarified: "UpsertData is a call to the API where it pushes the data for uploading and inserting. Can you explain what you mean by putting it separate and how that can help? It's already doing a batch upload."

If the "upsert" is happening in the same database, then running a query to read data from one table and update another is infinitely faster done directly on the database.

SQL update from one Table to another based on a ID match

Example from that answer:

UPDATE
    Sales_Import
SET
    Sales_Import.AccountNumber = RAN.AccountNumber
FROM
    Sales_Import SI
INNER JOIN
    RetrieveAccountNumber RAN
ON 
    SI.LeadID = RAN.LeadID;
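
That example is T-SQL. Since the LIMIT offset syntax in your question suggests MySQL, the same-database equivalent there would typically be INSERT ... ON DUPLICATE KEY UPDATE. A minimal sketch, assuming a hypothetical staging table and column names, and a default datasource configured in Application.cfc:

<!--- one set-based statement replaces the whole paging loop; --->
<!--- mytable_staging, id, name, and amount are hypothetical names --->
<cfquery>
    insert into mytable (id, name, amount)
    select id, name, amount
    from mytable_staging
    on duplicate key update
        name = values(name),
        amount = values(amount)
</cfquery>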

If the data is coming from your system and being sent to another system's API, it may be better to send that data one record at a time, or in smaller batches. You can create a Scheduled Task in CF Admin that pings your script every minute or so, sending a much lower volume of data per request. It will take more requests, but could well complete in a more timely manner than trying to shove a large volume of data over at once.
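
A minimal sketch of that approach, reusing myapi and queryToJson from the question; the "synced" flag column is a hypothetical way to track progress between runs (any persisted offset would do):

<!--- batch.cfm: called by a CF Admin Scheduled Task every minute; --->
<!--- sends one small batch per run, then marks those rows as sent --->
<cfsetting requesttimeout="55">
<cfset x = new myapi()>

<cfquery name="rs">
    select * from mytable
    where synced = 0
    limit 500
</cfquery>

<cfif rs.recordCount>
    <cfset x.UpsertData(queryToJson(rs))>
    <cfquery>
        update mytable
        set synced = 1
        where id in (<cfqueryparam value="#valueList(rs.id)#" cfsqltype="cf_sql_integer" list="true">)
    </cfquery>
</cfif>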

For example, I had a product sending over 10k records to mine at a time. It would often time out, failing to sync data. That product was already sending single-record updates to a microservice, so we updated my product to subscribe to the same microservice. I would get individual record updates, but my system (CF based) could process 10k small requests quickly and without fail, as opposed to trying (and often failing) to process one large request.
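
The receiving side of that design can be very small. A sketch, assuming the microservice POSTs one record as JSON; the file name, table, and columns are hypothetical:

<!--- webhook.cfm: upserts a single record per request --->
<cfset rec = deserializeJSON(toString(getHttpRequestData().content))>
<cfquery>
    insert into mytable (id, name, amount)
    values (
        <cfqueryparam value="#rec.id#" cfsqltype="cf_sql_integer">,
        <cfqueryparam value="#rec.name#" cfsqltype="cf_sql_varchar">,
        <cfqueryparam value="#rec.amount#" cfsqltype="cf_sql_decimal">
    )
    on duplicate key update
        name = values(name),
        amount = values(amount)
</cfquery>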