We are facing a challenge reading COMP-3 data in Java embedded inside Pentaho ETL. A few float values are stored as packed decimals in a flat file alongside other plain-text fields. The plain-text fields are read correctly, but the packed fields come out as junk characters; we tried Charset.forName("CP500"), but that did not work.
Since Pentaho scripts don't support COMP-3, the Pentaho forums suggested using a User Defined Java Class step. Has anyone come across and solved this?
Is it a Cobol file? Do you have a Cobol Copybook? Possible options include:
Converting Comp-3
In Comp-3 (packed decimal), each byte holds two decimal digits, one per nibble, and the low nibble of the last byte holds the sign (x'C' positive, x'D' negative, x'F' unsigned). There is more than one way to convert a comp-3 field to a decimal number. One way is to extract the digit nibbles one at a time and apply the sign from the final nibble.
Java code to convert comp-3 from a byte array:
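A minimal sketch of the nibble-by-nibble approach (the class name `Comp3` and the `scale` parameter for implied decimal places are illustrative choices, not part of any standard API; it treats a sign nibble of x'D' as negative and everything else as positive):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class Comp3 {
    /**
     * Decode a COMP-3 (packed decimal) field into a BigDecimal.
     * Each byte holds two digit nibbles; the low nibble of the last
     * byte is the sign (x'D' = negative, x'C'/x'F' = positive).
     *
     * @param bytes the raw packed-decimal bytes, exactly as read from the file
     * @param scale number of implied decimal places (from the copybook PIC clause)
     */
    public static BigDecimal unpack(byte[] bytes, int scale) {
        StringBuilder digits = new StringBuilder();
        for (int i = 0; i < bytes.length; i++) {
            int high = (bytes[i] & 0xF0) >>> 4;
            int low  = bytes[i] & 0x0F;
            digits.append(high);
            if (i == bytes.length - 1) {
                // low nibble of the last byte is the sign, not a digit
                if (low == 0x0D) {
                    digits.insert(0, '-');
                }
            } else {
                digits.append(low);
            }
        }
        return new BigDecimal(new BigInteger(digits.toString()), scale);
    }
}
```

For example, the two bytes x'123C' with scale 1 decode to 12.3, and x'123D' to -12.3.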
JRecord
In JRecord, if you have a Cobol Copybook, the options include:
- Cobol2Csv, a program to convert a Cobol data file to CSV using a Cobol Copybook
- Data2Xml, a program to convert a Cobol data file to XML using a Cobol Copybook
- reading a Cobol data file with a Cobol Copybook
- reading a fixed-width file with an XML description
- defining the fields in Java
Reading with a Cobol Copybook in JRecord
Defining the File in Java with JRecord
Zoned Decimal
Another mainframe Cobol numeric format is zoned decimal. It is a text format where the sign is over-punched on the last digit: once translated from EBCDIC to ASCII, zoned-decimal 123 reads as "12C" while -123 reads as "12L".
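A sketch of decoding such an ASCII-translated zoned-decimal string (the class name `ZonedDecimal` is illustrative; it assumes the standard over-punch mapping where '{' and 'A'-'I' mean positive 0-9 on the last digit, and '}' and 'J'-'R' mean negative 0-9):

```java
public class ZonedDecimal {
    /**
     * Decode a zoned-decimal value that has been translated from
     * EBCDIC to ASCII text, e.g. "12C" -> 123, "12L" -> -123.
     */
    public static long parse(String s) {
        char last = s.charAt(s.length() - 1);
        boolean negative = false;
        int lastDigit;
        if (last >= '0' && last <= '9') {
            lastDigit = last - '0';                      // unsigned digit
        } else if (last == '{') {
            lastDigit = 0;                               // positive 0
        } else if (last >= 'A' && last <= 'I') {
            lastDigit = last - 'A' + 1;                  // positive 1-9
        } else if (last == '}') {
            negative = true;
            lastDigit = 0;                               // negative 0
        } else if (last >= 'J' && last <= 'R') {
            negative = true;
            lastDigit = last - 'J' + 1;                  // negative 1-9
        } else {
            throw new IllegalArgumentException("Bad over-punch char: " + last);
        }
        // leading characters are plain digits; the over-punch is only on the last one
        long value = (s.length() > 1
                ? Long.parseLong(s.substring(0, s.length() - 1))
                : 0) * 10 + lastDigit;
        return negative ? -value : value;
    }
}
```

Note this only works after the rest of the record has been translated to ASCII; if you read the raw EBCDIC bytes instead, check the zone nibble (x'C'/x'D') of the last byte directly.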