The documentation for Google Cloud Functions is a little vague - I understand how to deploy a single function contained within index.js, even in a specific directory, but how does one deploy multiple cloud functions located in the same repository?
AWS Lambda allows you to specify a specific file and function name:
/my/path/my-file.myHandler
Lambda also allows you to deploy a zip file containing only the files required to run, omitting all of the optional transitive npm dependencies and their resources. For some libraries (e.g. Oracle DB), including node_modules/** would significantly increase the deployment time, and possibly exceed storage limits (it does on AWS Lambda).
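For comparison, building such a trimmed Lambda package might look like the sketch below. The file names and exclusion list are illustrative, and the steps are wrapped in a function so they can be invoked from a build script:

```shell
# Sketch: build a trimmed Lambda deployment package.
# Assumes a conventional Node.js layout (index.js, lib/, package.json);
# adjust the paths for your own repository.
build_package() {
  npm prune --production                # drop devDependencies from node_modules
  zip -rq package.zip index.js lib node_modules \
      -x "node_modules/aws-sdk/*"       # the Lambda runtime already ships aws-sdk
}
```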
The best that I can manage with Google Cloud Function deployment is:
$ gcloud alpha functions deploy my-function \
--trigger-http \
--source-url https://github.com/user-name/my-repo.git \
--source-branch master \
--source-path lib/foo/bar \
--entry-point myHandler
...but my understanding is that this deploys lib/foo/bar/index.js, which contains function myHandler(req, res) {}, with all dependencies concatenated into the same file? That doesn't make sense at all - like I said, the documentation is a little vague.
The current deployment tool takes a simple approach: it zips the directory and uploads it. This means you should (currently) move or delete node_modules before running the command if you don't want it included in the deployment package. Note that GCF will resolve dependencies from your package.json automatically.

As to deployment options, please see:

$ gcloud alpha functions deploy --help

You might opt to use the --source flags to upload the files once, then deploy the functions without re-uploading. You can also instruct Google to pull the functions from a repository in the same manner. I suggest you write a quick deployment script to help you deploy a list of functions in a single command.
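To illustrate that last suggestion, here is a minimal sketch of such a script. The entry-point names and source path are hypothetical, and it assumes each entry point is exported from the same index.js:

```shell
#!/usr/bin/env bash
# Sketch: deploy a list of HTTP-triggered functions in one command.
# myHandler, anotherHandler, and lib/foo/bar are placeholders.
set -euo pipefail

deploy_all() {
  for fn in myHandler anotherHandler; do
    gcloud alpha functions deploy "$fn" \
        --trigger-http \
        --source-path lib/foo/bar \
        --entry-point "$fn"
  done
}
```

Calling deploy_all then redeploys every function in the list, so adding a new function to the repository only requires appending its entry point to the loop.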