How to generate a Blob signed url in Google Cloud Run?


Under Google Cloud Run, you can select which service account your container runs as. Generating a signed URL fails when using the default compute service account.

The workaround listed in the GitHub issue below works on Google Compute Engine, provided you allow all the scopes for the service account. There does not seem to be a way to do that in Cloud Run (not that I can find).

https://github.com/googleapis/google-auth-library-python/issues/50

Things I have tried:

  1. Assigned the service account the role: roles/iam.serviceAccountTokenCreator
  2. Verified the workaround in the same GCP project in a Virtual Machine (vs Cloud Run)
  3. Verified the code works locally in the container with the service account loaded from a private key (via JSON file).

from google.cloud import storage
from datetime import datetime, timedelta

client = storage.Client()
bucket = client.get_bucket('EXAMPLE_BUCKET')
blob = bucket.get_blob('libraries/image_1.png')
expires = datetime.now() + timedelta(seconds=86400)
blob.generate_signed_url(expiration=expires)

Fails with:

you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py, line 51, in ensure_signed_credentials

When I try to add the workaround, it fails with:

Error calling the IAM signBytes API:
{
  "error": {
    "code": 400,
    "message": "Request contains an invalid argument.",
    "status": "INVALID_ARGUMENT"
  }
}
Exception Location: /usr/local/lib/python3.8/site-packages/google/auth/iam.py, line 81, in _make_signing_request

Workaround code as mentioned in the GitHub issue:

from google.cloud import storage
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta

def get_signing_creds(credentials):
    auth_request = requests.Request()
    print(credentials.service_account_email)
    signing_credentials = compute_engine.IDTokenCredentials(auth_request, "", service_account_email=credentials.service_account_email)
    return signing_credentials


client = storage.Client()
bucket = client.get_bucket('EXAMPLE_BUCKET')
blob = bucket.get_blob('libraries/image_1.png')
expires = datetime.now() + timedelta(seconds=86400)
signing_creds = get_signing_creds(client._credentials)
url = blob.generate_signed_url(expiration=expires, credentials=signing_creds)
print(url)

How do I generate a signed url under Google Cloud Run? At this point, it seems like I may have to mount the service account key which I wanted to avoid.

EDIT: To try and clarify, the service account has the correct permissions - it works in GCE and locally with the JSON private key.

There are 6 answers

guillaume blaquiere (BEST ANSWER)

Yes you can, but I had to deep dive to find how (jump to the end if you don't care about the details)

If you look in the _signing.py file at line 623, you can see this:

if access_token and service_account_email:
   signature = _sign_message(string_to_sign, access_token, service_account_email)
...

If you provide the access_token and the service_account_email, you can use the _sign_message method. This method uses the IAM service SignBlob API at this line

This is important because you can now sign blobs without having the private key locally! That solves the problem, and the following code works on Cloud Run (and I'm sure on Cloud Functions too)

def sign_url():
    from google.cloud import storage
    from datetime import datetime, timedelta

    import google.auth
    credentials, project_id = google.auth.default()

    # Perform a refresh request to get the access token of the current credentials (Else, it's None)
    from google.auth.transport import requests
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket('EXAMPLE_BUCKET')
    blob = bucket.get_blob('libraries/image_1.png')
    expires = datetime.now() + timedelta(seconds=86400)

    # If you run with user credentials (e.g. locally), set the service account email manually (development only)
    service_account_email = "YOUR DEV SERVICE ACCOUNT"
    # If you run with a service account credential, you can use its embedded email
    if hasattr(credentials, "service_account_email"):
        service_account_email = credentials.service_account_email

    url = blob.generate_signed_url(expiration=expires, service_account_email=service_account_email, access_token=credentials.token)
    return url, 200

Let me know if it's not clear
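
Not part of the original answer, but since sign_url() ends with "return url, 200" it is presumably called from a web handler. A minimal sketch of wiring it into a Flask route on Cloud Run (the route path and port handling here are assumptions for illustration):

# Hypothetical Flask wrapper around the sign_url() function defined above
import os

from flask import Flask

app = Flask(__name__)

@app.route("/signed-url")
def signed_url_handler():
    url, status = sign_url()  # sign_url() as defined in the answer above
    return url, status

if __name__ == "__main__":
    # Cloud Run provides the listening port via the PORT environment variable
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))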

glasnt

You can't sign URLs with the default service account.

Try your service code again with a dedicated service account that has the required permissions, and see if that resolves your error.
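
If you want to confirm which service account your Cloud Run revision is actually running as, you can query the metadata server from inside the container. A small illustrative sketch (not part of the original answer):

# Query the Cloud Run metadata server for the active service account email
import urllib.request

METADATA_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/email"
)

request = urllib.request.Request(METADATA_URL, headers={"Metadata-Flavor": "Google"})
with urllib.request.urlopen(request) as response:
    print(response.read().decode())  # e.g. my-service@my-project.iam.gserviceaccount.com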


Miguel Rueda

An updated approach has been added to GCP's documentation for serverless instances such as Cloud Run and App Engine.

The following snippet shows how to create a signed URL from the storage library.

import datetime

from google.cloud import storage


def generate_upload_signed_url_v4(bucket_name, blob_name):
    """Generates a v4 signed URL for uploading a blob using HTTP PUT.

    Note that this method requires a service account key file. You can not use
    this if you are using Application Default Credentials from Google Compute
    Engine or from the Google Cloud SDK.
    """
    # bucket_name = 'your-bucket-name'
    # blob_name = 'your-object-name'

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )


    return url

Once your backend returns the signed URL, you can execute a curl PUT request from your frontend as follows:

curl -X PUT -H 'Content-Type: application/octet-stream' --upload-file my-file 'my-signed-url' 
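
The equivalent upload from Python with the requests library would look roughly like this (a sketch; "my-file" and "my-signed-url" are the same placeholders as in the curl example):

# Sketch: upload a local file to the v4 signed URL returned by the backend
import requests

with open("my-file", "rb") as f:
    response = requests.put(
        "my-signed-url",
        data=f,
        headers={"Content-Type": "application/octet-stream"},  # must match the content_type used when signing
    )
response.raise_for_status()  # 200 means the object was written to the bucket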

Brett Bond

I store the credentials.json contents in Secret Manager then load it in my Django app like this:

import json
import os

from google.cloud import secretmanager
from google.oauth2 import service_account

project_id = os.environ.get("GOOGLE_CLOUD_PROJECT")
client = secretmanager.SecretManagerServiceClient()
secret_name = "service_account_credentials"
secret_path = f"projects/{project_id}/secrets/{secret_name}/versions/latest"
credentials_json = client.access_secret_version(name=secret_path).payload.data.decode("UTF-8")
service_account_info = json.loads(credentials_json)
google_service_credentials = service_account.Credentials.from_service_account_info(
    service_account_info)
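
Since the private key material is then available in memory, the storage client can sign URLs directly with those credentials. A minimal sketch of how they might be used (bucket and blob names are placeholders, not from the original answer):

# Sketch: sign a download URL with the credentials loaded from Secret Manager
from datetime import timedelta

from google.cloud import storage

storage_client = storage.Client(credentials=google_service_credentials)
blob = storage_client.bucket("EXAMPLE_BUCKET").blob("libraries/image_1.png")
url = blob.generate_signed_url(version="v4", expiration=timedelta(hours=1))
print(url)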

I tried the answer from @guillaume-blaquiere and I added the permission recommended by @guilherme-coppini but when using Google Cloud Run I always saw the same "You need a private key to sign credentials.the credentials you are currently using..." error.

Guilherme Coppini

The answer @guillaume-blaquiere posted here does work, but it requires an additional step not mentioned there: adding the Service Account Token Creator role in IAM to your default service account, which allows that account to "Impersonate service accounts (create OAuth2 access tokens, sign blobs or JWTs, etc)."

This allows the default service account to sign blobs, as per the signBlob documentation.

I tried it on AppEngine and it worked perfectly once that permission was given.

import datetime as dt

from google import auth
import google.auth.transport.requests  # ensures auth.transport.requests is available below
from google.cloud import storage

# SCOPES = [
#     "https://www.googleapis.com/auth/devstorage.read_only",
#     "https://www.googleapis.com/auth/iam"
# ]

credentials, project = auth.default(
#     scopes=SCOPES
)
credentials.refresh(auth.transport.requests.Request())

expiration_timedelta = dt.timedelta(days=1)

storage_client = storage.Client(credentials=credentials)
bucket = storage_client.get_bucket("bucket_name")
blob = bucket.get_blob("blob_name")

signed_url = blob.generate_signed_url(
    expiration=expiration_timedelta,
    service_account_email=credentials.service_account_email,
    access_token=credentials.token,
)

I downloaded a key for the AppEngine default service account to test locally, and in order to make it work properly outside of the AppEngine environment, I had to add the proper scopes to the credentials, as per the commented lines setting the SCOPES. You can ignore them if running only in AppEngine itself.

Gillespie

I had to add both Service Account Token Creator and Storage Object Creator to the default compute engine service account (which is what my Cloud Run services use) before it worked. You could also create a custom Role that has just iam.serviceAccounts.signBlob instead of Service Account Token Creator, which is what I did.