Given the following use case:
A stream-processing architecture: events go into Kafka, then get processed by a job with a MongoDB sink.
Database name: `myWebsite`
Collection: `users`
The job sinks user records into the `users` collection.
- So Debezium will monitor the `users` collection for changes and, at every change, will produce events into Kafka on the topic `dbserver1.myWebsite.users`, assuming `dbserver1` is the name of the connector?
- If so, can I have a Kafka consumer that consumes from the `dbserver1.myWebsite.users` topic and reacts to these events?
- From what I understood, the events produced by Debezium also contain the value of the database record? If it's a change, does the event contain the old and new values? If a record is created, is the old value null?
I would like some sort of confirmation of my understanding so far. Thank you!
The answers are simple: yes to all of the above. Debezium change events contain both the old (`before`) and new (`after`) values. In case of `INSERT` only `after` is present; in case of `UPDATE` both `before` and `after` are present (Postgres needs a special setting for it); and in case of `DELETE` only `before` is present.
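To make the `before`/`after` semantics concrete, here is a minimal Python sketch that classifies a Debezium change-event envelope by its `op` field (`c` = create, `u` = update, `d` = delete). The event shape shown here is simplified for illustration (the MongoDB connector in particular serializes the document images as extended-JSON strings), and the function name is my own, not part of any Debezium API:

```python
import json

def describe_change(event_value: bytes) -> str:
    """Classify a (simplified) Debezium change-event envelope.

    Debezium envelopes carry an `op` code plus `before`/`after`
    images of the affected record under the `payload` key.
    """
    payload = json.loads(event_value)["payload"]
    op = payload["op"]
    before, after = payload.get("before"), payload.get("after")
    if op == "c":   # insert: only `after` is populated, `before` is null
        return f"created: {after}"
    if op == "u":   # update: both images (source DB permitting)
        return f"updated: {before} -> {after}"
    if op == "d":   # delete: only `before` is populated
        return f"deleted: {before}"
    return f"other op {op!r}"

# Example envelope shaped like a simplified Debezium insert event:
insert_event = json.dumps({
    "payload": {"op": "c", "before": None, "after": {"_id": 1, "name": "alice"}}
}).encode()
print(describe_change(insert_event))
```

A consumer subscribed to `dbserver1.myWebsite.users` would receive each record's value in this envelope form and could dispatch on `op` exactly like this.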