I am loading data from Oracle to BigQuery, and I want to add an additional column to the BigQuery table based on an existing column in Oracle.
I created a simple flow: ExecuteSQL -> UpdateRecord -> PutBigQueryBatch
In Oracle, I have A(string), B(string), and C(timestamp) columns.
To test whether this flow works, I created a new property '/D' in UpdateRecord with the value '/A'. The output data had an additional column D with the same value as A (string). This worked perfectly.
I then changed the '/D' property in UpdateRecord to the value '/C'. The output data had an additional column D with the same value as C (Avro logical type timestamp-millis). This worked as well.
However, when uploading the data to BigQuery, I want column D to be the BigQuery DATE data type instead of the TIMESTAMP data type.
How can I convert the Avro logical type timestamp-millis to the Avro logical type date?
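For reference, Avro's date logical type annotates an int counting days since the Unix epoch, while timestamp-millis annotates a long counting milliseconds since the epoch. A minimal Python sketch of the value-level conversion I am after (assuming UTC semantics, and a hypothetical helper name):

```python
from datetime import date, datetime, timedelta, timezone

EPOCH = date(1970, 1, 1)
MS_PER_DAY = 86_400_000

def timestamp_millis_to_date_days(millis: int) -> int:
    """Convert an Avro timestamp-millis value (long, ms since epoch, UTC)
    to an Avro date value (int, days since epoch)."""
    return millis // MS_PER_DAY

# Example value from the question: 2013-12-10T13:24:40.000Z
ts = int(datetime(2013, 12, 10, 13, 24, 40, tzinfo=timezone.utc).timestamp() * 1000)
days = timestamp_millis_to_date_days(ts)
print(EPOCH + timedelta(days=days))  # 2013-12-10
```

This is just to show the semantics of the two logical types; the question is how to get NiFi's record processors to do this conversion.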
Example:
Oracle:
A | B | C
--+---+-------------------------
1 | 2 | 2013-12-10T13:24:40.000Z
Bigquery:
A | B | C                        | D
--+---+--------------------------+-----------
1 | 2 | 2013-12-10T13:24:40.000Z | 2013-12-10
Output schema ('fields' section) from ExecuteSQL:
{ "name": "A", "type": ["null","string"] },
{ "name": "C", "type": { "type" : "long", "logicalType" : "timestamp-millis" } },
{ "name": "B", "type": ["null","string"] }
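If I understand the Avro spec correctly, the schema for the new D field would need to declare the date logical type (an int of days since the epoch), something like:

```json
{ "name": "D", "type": { "type": "int", "logicalType": "date" } }
```

What I am missing is how to make UpdateRecord (or another record processor) emit D in that form.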