I have a BigQuery table for which I want to capture data changes. Let's say the table has a userid and a repeated string field for tags. Every time a tag is deleted or a new tag is created, a row should be saved to another table with the change type (DELETE/INSERT) and the tag.
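For concreteness, here is a minimal sketch of the two tables I have in mind (project, dataset, and column names are just illustrative):

```sql
-- Source table: one row per user, tags stored as a repeated field.
CREATE TABLE `myproject.mydataset.users` (
  userid STRING,
  tags   ARRAY<STRING>
);

-- Desired change-log table: one row per tag change.
CREATE TABLE `myproject.mydataset.user_tag_changes` (
  userid      STRING,
  tag         STRING,
  change_type STRING,    -- 'INSERT' or 'DELETE'
  change_time TIMESTAMP
);
```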
I already saw the Eventarc trigger + Cloud Functions architecture described here. But that approach doesn't contain the actual data (userid, tag), only BigQuery event metadata.
Is there any way to accomplish this without a microservice approach, i.e. without routing all deletes/writes through a microservice that handles the data changes?
I also thought of scheduling a diff query every minute, but that would miss changes that are made and then reverted within the one-minute window, as in the sketch below.
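Roughly, the scheduled version would diff the live table against a snapshot refreshed on each run (all names here are hypothetical):

```sql
-- Hypothetical scheduled diff: log tags that appeared (INSERT) or disappeared
-- (DELETE) since the previous snapshot.
INSERT INTO `myproject.mydataset.user_tag_changes` (userid, tag, change_type, change_time)
SELECT
  COALESCE(cur.userid, prev.userid)        AS userid,
  COALESCE(cur.tag, prev.tag)              AS tag,
  IF(prev.tag IS NULL, 'INSERT', 'DELETE') AS change_type,
  CURRENT_TIMESTAMP()                      AS change_time
FROM (SELECT userid, tag FROM `myproject.mydataset.users`, UNNEST(tags) AS tag) AS cur
FULL OUTER JOIN
     (SELECT userid, tag FROM `myproject.mydataset.users_snapshot`, UNNEST(tags) AS tag) AS prev
  ON cur.userid = prev.userid AND cur.tag = prev.tag
WHERE cur.tag IS NULL OR prev.tag IS NULL;

-- Replace the snapshot so the next run diffs against the current state.
CREATE OR REPLACE TABLE `myproject.mydataset.users_snapshot` AS
SELECT * FROM `myproject.mydataset.users`;
```

A tag added and removed between two runs never shows up in either diff, which is the gap I mean.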
As an option, you can use the APPENDS TVF for this, as in the example below (using public data):
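As a minimal sketch, here is what such a query looks like against a hypothetical table with the question's schema (substitute the public dataset or your own table); the NULL start and end timestamps ask for the full history available in the time travel window:

```sql
-- Sketch: read the append history of a table shaped like the one in the question
-- (table name is hypothetical), flattening the repeated tags field.
SELECT
  a.userid,
  tag,
  a._CHANGE_TYPE,        -- type of change recorded for the row
  a._CHANGE_TIMESTAMP    -- when the change was committed
FROM
  APPENDS(TABLE `myproject.mydataset.users`, NULL, NULL) AS a,
  UNNEST(a.tags) AS tag;
```

The TVF returns the table's own columns plus the `_CHANGE_TYPE` and `_CHANGE_TIMESTAMP` columns, which you can write into your change-log table.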
with the output below: