Controlling async batch requests inside a redux action


I have a redux action that performs a search in the application. When the user starts a search, it batches the queries, sends 30 queries per request, and queues the first 10 requests. Whenever any one of the requests succeeds, it adds the next request to the queue. All of this happens inside a redux action, and each successful request dispatches an action that appends the result to the store. I would like input on how to handle the case where the user clicks "cancel search" and enters a new search term. How can I cancel the existing requests and redux actions so that the previous search's requests do not complete and append to the result store?

Example code below:

function search(queries) {
  // returned as a thunk so `dispatch` is in scope
  return function (dispatch) {
    // split the queries into chunks of size 30
    const batches = _.chunk(queries, 30);

    let count = 1; // batches[0] is fetched manually below

    function getBatch(batch) {
      axios.post('url', batch.join(',')).then((response) => {
        // dispatch action to append the result into the store
        dispatch({ type: 'APPEND_RESULT', payload: response });

        // recursively call getBatch to fetch the rest of the data
        if (count < batches.length) { getBatch(batches[count++]); }
      });
    }

    // get the first batch manually
    getBatch(batches[0]);
  };
}

This is minimal code to illustrate the concept.

I have read about cancellable promises, and axios supports them. But I am not sure how to stop this recursive call when the same function is executed a second time.

E.g. the user input will be `{ ids: [1, 2, 3, ..., 1000] }`, and I am trying to create batches and send parallel requests: `{ ids: [1, 2, ..., 29, 30] }`, `{ ids: [31, 32, ..., 59, 60] }`, etc.
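One dependency-free way to stop the recursion is a generation counter: each call to the search bumps a shared counter, and every response checks whether it still belongs to the latest generation before appending or recursing. The sketch below is illustrative, not the asker's actual code: `sendBatch` stands in for the real `axios.post` call, and `results` stands in for dispatching `APPEND_RESULT`.

```javascript
// Generation-counter cancellation: starting a new search invalidates
// every response still in flight from the previous one.
let currentSearchId = 0;
const results = []; // stand-in for dispatch({ type: 'APPEND_RESULT', ... })

function cancellableSearch(queries, sendBatch, batchSize = 30) {
  const searchId = ++currentSearchId; // this call becomes the latest search
  const batches = [];
  for (let i = 0; i < queries.length; i += batchSize) {
    batches.push(queries.slice(i, i + batchSize));
  }
  let next = 1; // batches[0] is fetched first, below

  function getBatch(batch) {
    return sendBatch(batch).then((response) => {
      // A newer search has started: drop this response and stop recursing.
      if (searchId !== currentSearchId) return;
      results.push(response);
      if (next < batches.length) return getBatch(batches[next++]);
    });
  }

  return getBatch(batches[0]);
}
```

The stale requests still complete on the network, but their responses are silently discarded and the recursive chain dies; combining this with axios cancel tokens would additionally abort the in-flight HTTP calls.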


There are 2 answers

Jinto On BEST ANSWER

This looks like a candidate for redux-observable. You can create an observable, add a debounce (as mentioned in the comment above), and then easily cancel, or use switchMap to switch to the new request, so that any remaining items in the previous queue won't be processed.
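The key semantics of RxJS's switchMap are "latest wins": subscribing to a new inner stream unsubscribes the previous one. The sketch below is not redux-observable code; it is a hypothetical, dependency-free illustration of those semantics, with `doRequest` and `onResult` standing in for the HTTP call and the dispatch.

```javascript
// switchMap-style "latest wins" without RxJS: starting a new search
// marks the previous one cancelled, so stale results are never delivered.
let latestSearch = null;

function runSearch(term, doRequest, onResult) {
  const me = { cancelled: false };
  if (latestSearch) latestSearch.cancelled = true; // switch away from the old search
  latestSearch = me;
  return doRequest(term).then((result) => {
    if (!me.cancelled) onResult(result); // only the latest search dispatches
  });
}
```

With real redux-observable, the epic's switchMap does this bookkeeping for you, including tearing down the inner observable (and its queued batches) when a new search action arrives.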

Alex Antonov On

This answer is entirely IMHO.

In my opinion, the whole approach to the problem is broken here. You'll overload your server by constantly querying it on every input change. Just imagine how much traffic your website will consume on mobile!

The best workaround here is to:

  1. Add `_.debounce` to your input with a small wait time (100ms works well)
  2. Keep only one query in memory. On each new user-input event, cancel the in-flight request and replace it with the new one
  3. Dispatch an action with the data on each server response
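The three steps above can be sketched as a single input handler. This is a minimal illustration under assumptions: `sendQuery` stands in for the axios call, `onData` for the dispatch, and the hand-rolled debounce plays the role of `_.debounce`.

```javascript
// Debounced input handler that keeps only one query in flight:
// step 1 debounces keystrokes, step 2 cancels/overrides the current
// request, step 3 delivers data only for the surviving request.
function makeSearcher(sendQuery, onData, waitMs = 100) {
  let timer = null;
  let inFlight = null;

  return function onInput(term) {
    clearTimeout(timer);                       // step 1: debounce the input
    timer = setTimeout(() => {
      if (inFlight) inFlight.cancelled = true; // step 2: cancel and override
      const req = { cancelled: false };
      inFlight = req;
      sendQuery(term).then((data) => {
        if (!req.cancelled) onData(data);      // step 3: dispatch on response
      });
    }, waitMs);
  };
}
```

In a real app the cancellation flag would typically be an axios cancel token (or an AbortController signal) so the HTTP request itself is aborted, not just its result ignored.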