The gsutil command has options to optimize upload/download speed for large files, for example:
GSUtil:parallel_composite_upload_threshold=150M
GSUtil:sliced_object_download_max_components=8
see this page for reference.
What is the equivalent in the google.cloud.storage Python API? I didn't find the relevant parameters in this document.
In general, do the client API and gsutil have a one-to-one correspondence in terms of functionality?
I think it's not natively supported.
However (!) if you're willing to split the file yourself and then use `threading` or `multiprocessing` to upload the pieces, there is a `compose` method that should help you assemble the parts into one GCS object. Ironically, `gsutil` is written in Python, but it uses a library, `gslib`, to implement parallel uploads. You may be able to use `gslib` as a template.
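To make the split-then-`compose` idea concrete, here is a minimal sketch. It assumes the `google-cloud-storage` package is installed and default credentials are configured; the bucket name, blob name, and part count are placeholders, and the `chunk_ranges` helper is my own, not part of the library. Note that `Blob.compose` accepts at most 32 source objects per call, so larger part counts would need composing in stages.

```python
import concurrent.futures
import math


def chunk_ranges(total_size, num_parts):
    """Split [0, total_size) into up to num_parts contiguous (start, end) byte ranges."""
    chunk = math.ceil(total_size / num_parts)
    return [(start, min(start + chunk, total_size))
            for start in range(0, total_size, chunk)]


def parallel_composite_upload(bucket_name, blob_name, filename, num_parts=8):
    """Upload a local file as num_parts temporary objects in parallel,
    then assemble them into one object with Blob.compose (max 32 sources)."""
    from google.cloud import storage  # assumes google-cloud-storage is installed

    client = storage.Client()
    bucket = client.bucket(bucket_name)

    with open(filename, "rb") as f:
        f.seek(0, 2)          # seek to end to get the file size
        size = f.tell()

    def upload_part(index, start, end):
        part = bucket.blob(f"{blob_name}.part-{index}")
        with open(filename, "rb") as f:
            f.seek(start)
            part.upload_from_string(f.read(end - start))
        return part

    ranges = chunk_ranges(size, num_parts)
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_parts) as pool:
        parts = list(pool.map(lambda args: upload_part(*args),
                              [(i, s, e) for i, (s, e) in enumerate(ranges)]))

    # Compose the temporary parts into the final object, then clean up.
    destination = bucket.blob(blob_name)
    destination.compose(parts)
    for part in parts:
        part.delete()
```

This is only a template; `gsutil` additionally handles retries, integrity checks (CRC32C), and cleanup of orphaned parts, which you would want before using this in production.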