I would like to know how to get all DATETIME values (io.debezium.time.Timestamp) coming from the Debezium SQL Server connector in YYYY-MM-DD hh:mm:ss format. Right now they arrive as epoch int64 values, which makes the data difficult to transform with Spark.
The connector configuration I am using is below:
{
  "name": "localDB-sqlserverconnector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "192.168.1.22",
    "database.port": "1433",
    "database.user": "user",
    "database.password": "user_123",
    "database.dbname": "localDB",
    "database.server.name": "DEV1",
    "table.whitelist": "dbo.testtabledebezium",
    "database.history.kafka.bootstrap.servers": "192.168.1.81:32105",
    "database.history.kafka.topic": "history_DB.DEV1",
    "include.schema.changes": false,
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": true,
    "transforms.unwrap.delete.handling.mode": "rewrite",
    "snapshot.mode": "schema_only"
  }
}
Result from Debezium for topic DEV1.dbo.testtabledebezium:
{"id":9,"column1":"t6","column2":1601480866593,"column3":18535,"__deleted":"false"}
Actual data in the SQL Server table:
| id | column1 | column2                 | column3    |
| 9  | t6      | 2020-09-30 15:47:46.593 | 2020-09-30 |
Required output:
{"id":9,"column1":"t6","column2":"2020-09-30 15:47:46.593","column3":"2020-09-30","__deleted":"false"}
Add "time.precision.mode": "connect" to your connector config.
Refer https://debezium.io/documentation/reference/1.3/connectors/sqlserver.html#sqlserver-temporal-values
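For example, here is a sketch of the configuration from the question with just that one property added (everything else unchanged); in "connect" mode the connector emits temporal columns using Kafka Connect's built-in logical types (e.g. org.apache.kafka.connect.data.Timestamp with millisecond precision) instead of the io.debezium.time.* types:

{
  "name": "localDB-sqlserverconnector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "192.168.1.22",
    "database.port": "1433",
    "database.user": "user",
    "database.password": "user_123",
    "database.dbname": "localDB",
    "database.server.name": "DEV1",
    "table.whitelist": "dbo.testtabledebezium",
    "database.history.kafka.bootstrap.servers": "192.168.1.81:32105",
    "database.history.kafka.topic": "history_DB.DEV1",
    "include.schema.changes": false,
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": true,
    "transforms.unwrap.delete.handling.mode": "rewrite",
    "time.precision.mode": "connect",
    "snapshot.mode": "schema_only"
  }
}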