How to treat a field as binary/blob with Oracle GoldenGate


I have an ancient database with an EBCDIC character set.

I'm setting up CDC (Oracle GoldenGate with the Kafka Connect handler) to expose the data to other services.

In some tables, VARCHAR fields were used as binary fields (the content isn't character data but all sorts of things: sounds, images, integers...). Don't ask why this situation exists... :-D

My need is "simple": tell OGG to treat these fields like a BLOB or binary column, so it doesn't convert the character set and passes each byte through as is.

My try: myNonBinaryField=@BINARY(myNonBinaryField)

ERROR 2024-02-13 16:19:37.000568 [main] - Unable to decode column 11 : Input length = 1
java.nio.charset.MalformedInputException: Input length = 1
        at java.nio.charset.CoderResult.throwException(CoderResult.java:281) ~[?:1.8.0_281]
        at java.nio.charset.CharsetDecoder.decode(CharsetDecoder.java:816) ~[?:1.8.0_281]
        at oracle.goldengate.datasource.UserExitDataSource.createColumnValue(UserExitDataSource.java:1106) [ggdbutil-21.9.0.0.3.001.jar:21.9.0.0.3.001]
Exception in thread "main" oracle.goldengate.util.GGException: Unable to decode column 11 : Input length = 1
        at oracle.goldengate.datasource.UserExitDataSource.createColumnValue(UserExitDataSource.java:1203)

There is 1 answer

Answered by Domenico Lorusso:

I found this solution:

myNonBinaryField=@BINARY(myNonBinaryField), COLCHARSET(PASSTHRU, myNonBinaryField)
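For context, here is a minimal sketch of how that line might sit inside a full Replicat MAP statement. The schema, table, and column names are placeholders (not from the original post), and this is an assumption about where the clauses fit, not a verified parameter file:

```
-- Hypothetical GoldenGate parameter-file excerpt.
-- COLCHARSET(PASSTHRU, ...) tells OGG to skip character-set conversion
-- for the named column, and @BINARY copies its bytes verbatim instead of
-- treating them as a null-terminated string.
MAP src_schema.legacy_table, TARGET tgt_schema.legacy_table,
COLMAP (USEDEFAULTS, myNonBinaryField = @BINARY(myNonBinaryField)),
COLCHARSET (PASSTHRU, myNonBinaryField);
```

The key point is that both pieces appear to be needed together: @BINARY alone still triggered the MalformedInputException above, presumably because the decoder ran before the function was applied.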

To be honest, it seems like a workaround, because the Oracle doc about @BINARY says:

Use the @BINARY function when a source column referenced by a column-conversion function is defined as a character column but contains binary data that must remain binary on the target. By default, binary data in a character column is converted (if necessary) to ASCII and assumed to be a null-terminated string. The @BINARY function copies arbitrary binary data to the target column.

Anyway, it's working.