I plan to run hundreds of websites within one Google Cloud Platform project (using GKE). Each of them will use two Google Cloud Storage buckets for storing its assets.
I planned to create one service account per website in order to grant each site access only to its own respective buckets. However, there's a limit of 100 service accounts per project, which apparently can't be raised.
How can I make sure that each website has access only to the buckets (or sub-paths within a bucket) it is allowed to see?
We have a similar use-case and I believe I've found a solution to this problem. The key insight is that service accounts from *other* projects can be granted access to the buckets in your GCS-enabled project.
Basically you'll use two kinds of GCP projects:

1. One "data" project, which holds all the GCS buckets.
2. Any number of "user pool" projects, whose only purpose is to hold service accounts.
The service accounts from the "user pool" projects can be given access to the buckets of your data project with fine granularity (1 service account -> 1 bucket). When the current user pool project gets close to the 100-account limit, just create a new user pool project and start adding new service accounts there.
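A small sketch of the bookkeeping this scheme implies, assuming hypothetical names (`user-pool-N` project IDs, per-site service-account IDs) that are not part of the answer above — only the 100-account quota and the one-SA-per-bucket `gsutil iam ch` grant come from the text:

```python
SA_QUOTA = 100  # per-project service-account limit mentioned above


def pool_project(site_index: int) -> str:
    """Which user-pool project hosts this site's service account.

    Rolls over to a fresh project every SA_QUOTA sites.
    """
    return f"user-pool-{site_index // SA_QUOTA + 1}"


def sa_email(site: str, site_index: int) -> str:
    """Service-account email inside its user-pool project (hypothetical naming)."""
    return f"{site}@{pool_project(site_index)}.iam.gserviceaccount.com"


def grant_cmd(site: str, site_index: int, bucket: str) -> str:
    """gsutil command granting this site's SA object access to one bucket
    in the data project (1 service account -> 1 bucket)."""
    return (
        "gsutil iam ch "
        f"serviceAccount:{sa_email(site, site_index)}:roles/storage.objectAdmin "
        f"gs://{bucket}"
    )
```

For example, site number 150 would get its service account in `user-pool-2`, and `grant_cmd("site150", 150, "site150-assets")` prints the exact `gsutil` grant to run against the data project's bucket.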