I have an ancient database with an EBCDIC character set.
I am setting up CDC (Oracle GoldenGate with the Kafka Connect Handler) to expose the data to other services.
In some tables a VARCHAR field was used as a binary field (the content isn't character data but all sorts of things: sounds, images, integers...). Let's not dwell on why this situation exists... :-D
My need is "simple": tell OGG to treat these fields like a BLOB/binary column, i.e. do not convert the character set and pass every byte through as is.
My attempt: myNonBinaryField = @BINARY(myNonBinaryField)
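For context, @BINARY is a column-conversion function, so (as far as I understand it) the expression has to live in a COLMAP clause of a TABLE (Extract) or MAP (Replicat) statement. A sketch of what I mean, with placeholder schema/table names and assuming the mapping is done on the Replicat side:

    MAP MYSCHEMA.MYTABLE, TARGET MYSCHEMA.MYTABLE,
      COLMAP (USEDEFAULTS, myNonBinaryField = @BINARY(myNonBinaryField));

With that mapping in place, the process fails with: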
ERROR 2024-02-13 16:19:37.000568 [main] - Unable to decode column 11 : Input length = 1
java.nio.charset.MalformedInputException: Input length = 1
at java.nio.charset.CoderResult.throwException(CoderResult.java:281) ~[?:1.8.0_281]
at java.nio.charset.CharsetDecoder.decode(CharsetDecoder.java:816) ~[?:1.8.0_281]
at oracle.goldengate.datasource.UserExitDataSource.createColumnValue(UserExitDataSource.java:1106) [ggdbutil-21.9.0.0.3.001.jar:21.9.0.0.3.001]
Exception in thread "main" oracle.goldengate.util.GGException: Unable to decode column 11 : Input length = 1
at oracle.goldengate.datasource.UserExitDataSource.createColumnValue(UserExitDataSource.java:1203)
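As far as I can tell, the exception itself is just standard java.nio behaviour: raw binary bytes are not valid text in the configured character set, so a CharsetDecoder whose malformed-input action is REPORT (the default for a freshly created decoder) throws MalformedInputException instead of replacing the bad bytes. Here is a tiny stand-alone illustration, not the actual adapter code, with UTF-8 standing in for whatever charset the handler uses and 0xFF as an arbitrary "binary" byte:

    import java.nio.ByteBuffer;
    import java.nio.charset.CharacterCodingException;
    import java.nio.charset.CharsetDecoder;
    import java.nio.charset.CodingErrorAction;
    import java.nio.charset.StandardCharsets;

    public class DecodeBinaryDemo {
        public static void main(String[] args) throws Exception {
            // 0xFF can never start a valid UTF-8 sequence.
            byte[] binaryPayload = { (byte) 0xFF };

            CharsetDecoder decoder = StandardCharsets.UTF_8.newDecoder()
                    .onMalformedInput(CodingErrorAction.REPORT)        // REPORT is also the default
                    .onUnmappableCharacter(CodingErrorAction.REPORT);
            try {
                System.out.println(decoder.decode(ByteBuffer.wrap(binaryPayload)));
            } catch (CharacterCodingException e) {
                // Prints: java.nio.charset.MalformedInputException: Input length = 1
                System.out.println(e);
            }
        }
    }

In other words, as long as the adapter believes these columns contain text, the decode step is bound to fail on some byte sequences.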
I've found this solution:
To be honest, it feels like a workaround, because the Oracle doc about @BINARY says:
Anyway, it's working.