We have a GitHub Action that copies data from one Firebase project to another using the firebase-tools package (we are on the latest version, 9.11.0):
firebase use fromProject && firebase database:get / -o export.json
firebase use toProject && firebase database:set -y / export.json
This worked fine until our data grew larger; now we are getting the following error:
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
As a temporary fix, we've been able to apply the node --max-old-space-size
flag, which simply increases the memory available to the Node process:
node --max-old-space-size=4096 /home/runner/work/foo/foo/node_modules/firebase database:set -y / export.json
Considering our data will keep growing, we'd like to implement a proper fix, which in my understanding would be to set the data by streaming the JSON. However, I'm not sure firebase-tools
allows that. Searching through GitHub issues didn't yield anything useful.
Perhaps, apart from streaming, another useful approach would be to split the huge JSON file into chunks before setting them?
Thanks!
We used HTTP streaming with the Firebase REST API to overcome the trouble of saving, modifying locally, and re-uploading a huge (over 256 MB) JSON file.
We built a function that pipes an HTTP data stream from one project's database to another:
And we use it like this: