So I have a foo table, which is huge, and whenever I try to read all the data from that table, Node.js runs out of memory. I can still fetch chunks of data using offset and limit, but I cannot merge all of the chunks and keep them in memory, because then I run out of memory again. In my algorithm I have lots of IDs and need to check whether each ID exists in the foo table or not. What is the best solution (in terms of algorithmic complexity) for checking whether an ID exists in the foo table when I cannot hold all of the data in memory?
PS: The naive solution is to fetch the data in chunks and, for each ID, scan chunk by chunk; but that is O(n²). There should be a better way, I believe...
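Roughly what my naive version looks like (I'm using node-postgres here; the driver and table details are just placeholders for whatever you have):

```js
// Naive check: scan the whole table once per ID -- O(n * m), roughly O(n^2).
const { Pool } = require('pg'); // placeholder driver; any DB client works
const pool = new Pool();

async function idExists(id, chunkSize = 10000) {
  for (let offset = 0; ; offset += chunkSize) {
    // ORDER BY keeps the OFFSET pagination stable between queries.
    const { rows } = await pool.query(
      'SELECT id FROM foo ORDER BY id LIMIT $1 OFFSET $2',
      [chunkSize, offset]
    );
    if (rows.length === 0) return false; // table exhausted, ID not found
    if (rows.some(row => row.id === id)) return true;
  }
}
```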
Under the constraints you specified, you could create a hash table containing the IDs you are looking for as keys, with all values initialized to false.
Then, read the table chunk by chunk, and for each item in a chunk, look its ID up in the hash table. If found, set that hash table entry to true.
After going through all the chunks, your hash table will hold true for every ID that was found in the table.
Provided that a hash table lookup takes constant time, this algorithm has a time complexity of O(N), where N is the number of rows in the table.
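A minimal sketch in Node.js, assuming node-postgres and a plain Map as the hash table (the driver, table layout, and chunk size are just placeholders):

```js
const { Pool } = require('pg'); // assumed driver; any chunked reader works
const pool = new Pool();

// Returns a Map of id -> boolean: true if the ID was seen in foo.
async function markExistingIds(ids, chunkSize = 10000) {
  // Hash table keyed by the IDs we care about, all values start as false.
  const found = new Map(ids.map(id => [id, false]));

  // Single pass over the table, one chunk at a time: O(N) rows total.
  for (let offset = 0; ; offset += chunkSize) {
    const { rows } = await pool.query(
      'SELECT id FROM foo ORDER BY id LIMIT $1 OFFSET $2',
      [chunkSize, offset]
    );
    if (rows.length === 0) break; // no more rows
    for (const { id } of rows) {
      if (found.has(id)) found.set(id, true); // O(1) average lookup
    }
  }
  return found;
}
```

Since a Map lookup is O(1) on average, the scan stays linear in the table size no matter how many IDs you are checking.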