As an organization, we're following the data mesh approach. Many teams expose their products' data in BigQuery, and those data objects are then accessed by many different teams and departments. Each team uses its own set of projects for providing its data or running queries.
One of the issues we're facing: while producers know exactly who is accessing their data, they usually have no idea how, how often, or why consumers are accessing it.
As a consequence, data is often provided in a way that makes ingestion quick and easy for the producer but consumption more expensive for the consumers. Producers don't seem to care much, because they neither pay for other teams' consumption nor know how often their data is read by others.
What would be quite helpful in this context is an overview of how frequently each table is queried by other teams, the number of bytes and slots those queries consume, and ideally even the query texts themselves. Is there any way of achieving this?
In INFORMATION_SCHEMA, you can only see your own project's queries unless you're an organization admin. Would the audit logs help here?
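For reference, this is roughly the kind of overview I'm after, sketched against the region-qualified `INFORMATION_SCHEMA.JOBS_BY_ORGANIZATION` view (which, as mentioned, needs organization-level permissions). The region `region-us` and the project id `producer-project` are placeholders for illustration:

```sql
-- Sketch: which other projects read our tables, how often, and at what cost,
-- over the last 30 days. Requires org-level job-listing permissions.
SELECT
  ref.project_id || '.' || ref.dataset_id || '.' || ref.table_id AS table_name,
  jobs.project_id AS consumer_project,
  COUNT(*) AS query_count,
  SUM(jobs.total_bytes_billed) AS total_bytes_billed,
  SUM(jobs.total_slot_ms) AS total_slot_ms
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_ORGANIZATION AS jobs,
  UNNEST(jobs.referenced_tables) AS ref  -- one row per table a job touched
WHERE jobs.creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND ref.project_id = 'producer-project'  -- hypothetical producer project id
GROUP BY table_name, consumer_project
ORDER BY total_bytes_billed DESC;
```

Without org-admin rights this only works per project (e.g. via `JOBS_BY_PROJECT`), which is exactly the limitation I'm running into.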