Given the following use case: a stream processing architecture where events go into Kafka and are then processed by a job with a MongoDB sink. The database name is `myWebsite`, the collection is `users`, and the job sinks user records into the `users` collection.
- So Debezium will monitor the `users` collection for changes and, at every change, will produce events into Kafka on the topic `dbserver1.myWebsite.users`? (Assuming `dbserver1` is the name of the connector.)
- If so, can I then have a Kafka consumer that consumes from the `dbserver1.myWebsite.users` topic and reacts to these events? (See the sketch after this list.)
- From what I understood, the events produced by Debezium also contain the value of the database record? If it is an update, does the event contain both the old and new values? If a record is created, is the old value null?
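To make the consumer side concrete, the sketch below is roughly what I have in mind, assuming the kafka-python client and a local broker (the broker address and JSON deserialization are my assumptions):

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Consume the Debezium change-event topic for the users collection.
consumer = KafkaConsumer(
    "dbserver1.myWebsite.users",
    bootstrap_servers="localhost:9092",  # assumed broker address
    auto_offset_reset="earliest",        # start from the oldest available event
    value_deserializer=lambda v: json.loads(v) if v else None,
)

for message in consumer:
    event = message.value
    if event is None:
        continue  # tombstone record emitted after a delete
    # With the default JSON converter the change data sits under "payload".
    payload = event.get("payload", event)
    print(payload.get("op"), payload.get("before"), payload.get("after"))
```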
I would like some sort of confirmation of my understanding so far. Thank you!
The answers are simple. The change events contain both the old (`before`) and new (`after`) values. In the case of an `INSERT` only `after` is present, in the case of an `UPDATE` both `before` and `after` are present (Postgres needs a special setting, `REPLICA IDENTITY FULL`, for this), and in the case of a `DELETE` only `before` is present.
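As an illustration, here is roughly what the `payload` section of a change event looks like for each operation. This is a sketch with made-up field values; the exact envelope shape depends on the connector and converter settings:

```python
# Illustrative payloads of Debezium change events (field values are invented).

insert_event = {
    "op": "c",                               # c = create (INSERT)
    "before": None,                          # no prior state for an INSERT
    "after": {"_id": 1, "name": "alice"},    # the newly created record
}

update_event = {
    "op": "u",                               # u = update
    "before": {"_id": 1, "name": "alice"},   # old value (Postgres: REPLICA IDENTITY FULL)
    "after": {"_id": 1, "name": "alicia"},   # new value
}

delete_event = {
    "op": "d",                               # d = delete
    "before": {"_id": 1, "name": "alicia"},  # last known state of the record
    "after": None,                           # record no longer exists
}
```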