Hey guys, hoping you might be able to help.
I am using DataStage 9.1 and I am having an issue with the job log in Director. Let me first say the company I work for just bought and installed InfoSphere about 6 months ago, so I am fully expecting some growing pains, and this might very well be something on the admin side. I am familiar with 8.5 and older (3 years using the tool) but not 9.1. ...On to the issue...
I have a large sequence with many sub-sequences under it, and I am reusing several of the same parallel jobs (with the 'Allow Multiple Instance' option checked) multiple times. I set the Invocation ID for each multiple-instance job per system being loaded, by passing it down as a parameter (the target table name).
Here is an example:
Multiple Instance job name = Temp_To_Final
When used for System_A = System_A_Temp_To_Final
When used for System_B = System_B_Temp_To_Final
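For reference, this is roughly how the same multiple-instance job ends up running twice under different invocation IDs. The `dsjob` lines are shown as comments only as a sketch (the project name and `TargetTable` parameter are placeholders I made up, not from my actual setup); the live part of the snippet just shows how the distinct instance names get composed from job name plus ID:

```shell
# Hypothetical sketch only -- project and parameter names are placeholders:
#
#   dsjob -run -param TargetTable=SYSTEM_A MyProject Temp_To_Final.System_A
#   dsjob -run -param TargetTable=SYSTEM_B MyProject Temp_To_Final.System_B
#
# Each run gets its own instance name built from the job name and the
# invocation ID, so the two runs should be distinguishable in the log:
JOB=Temp_To_Final
for ID in System_A System_B; do
  echo "${JOB}.${ID}"
done
```

So each instance has a unique name, which is exactly why the interleaved log below surprised me.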
The problem I am having is that when I view the job log, it mixes the System_A_Temp_To_Final entries in with the System_B_Temp_To_Final entries.
Example of what the job log looks like:
Starting job System_A_Temp_To_Final.(....) <---System_A
Environment variable settings (....) <---System_A
OSH script (....) <---System_A
Starting job System_B_Temp_To_Final.(....) <---System_B
Parallel job reports successful completion <---System_A
Environment variable settings (....) <---System_B
OSH script (....) <---System_B
When I have 18 jobs running and doing this, it turns into a huge mess, and it's hard to step through the log or tell which errors or warnings belong to which job. Does anyone know of a way to keep this organized? I didn't have this issue with 8.5, but that environment was established and stable.
Thanks for all of the help!
So I figured out what the issue was and wanted to share in case anybody else has the same problem. The issue was on the server/administration side (which I have no experience with), so I'll apologize now if I don't make sense. When DataStage logs a job's runs, it keeps a log file for that job out on the server. The log file for this particular job was corrupt. All we had to do was take the log file from another job, rename it to match the corrupt file's name, and use it to replace the corrupt file. That fixed the issue immediately.
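To make the repair concrete, here is a hedged sketch of what our admin did, simulated in a throwaway temp directory rather than a real project. The `RT_LOGnn` names follow the usual DataStage convention for per-job log files under the project directory, but the numbers, paths, and file contents here are all placeholders; on a real server you'd stop the job, back up the corrupt log, and copy a healthy job's log structure over it:

```shell
#!/bin/sh
# Simulated version of the fix: replace a corrupt per-job log file with a
# healthy one. Everything below is a stand-in -- RT_LOG101/RT_LOG202 and
# DATA.30 are illustrative names, and PROJ is a temp dir, not a project.
set -e
PROJ=$(mktemp -d)                                 # stand-in for the project directory
mkdir -p "$PROJ/RT_LOG101" "$PROJ/RT_LOG202"
echo "healthy log structure" > "$PROJ/RT_LOG101/DATA.30"   # good job's log
echo "garbage" > "$PROJ/RT_LOG202/DATA.30"                 # corrupt job's log

# 1. Set the corrupt log aside as a backup.
mv "$PROJ/RT_LOG202" "$PROJ/RT_LOG202.corrupt"
# 2. Copy a healthy job's log structure in under the corrupt job's name.
cp -r "$PROJ/RT_LOG101" "$PROJ/RT_LOG202"

cat "$PROJ/RT_LOG202/DATA.30"
```

Obviously on a production server you'd want the backup step and a quiet system before touching anything under the project directory.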