Wednesday, 28 September 2016

DS_PXDEBUG - DataStage Parallel Debugging Variable

* Controlled with an environment variable that is not exposed in the GUI. Set DS_PXDEBUG to activate the feature (e.g. DS_PXDEBUG=1).
* A warning is logged when a job is run with this debug feature on.
* Debug output is collected under a new project-level directory "Debugging" on the server, with subdirectories created as required on a per-job basis and named after the job. For multi-instance jobs run with a non-empty invocation ID, the directory will be "<jobname>.<invocationID>".
* Internally turns on the Osh environment variable APT_MSG_FILELINE so that warnings/errors issued by Osh have the source file name and line number attached.
* Internally turns on the Osh environment variable APT_ENGLISH_MESSAGES so that unlocalised copies of PX-originated error/warning messages are issued in addition to the localised copy (where available).

* Internally turns on the Osh environment variables APT_PM_PLAYER_TIMING, APT_PM_PLAYER_MEMORY and APT_RECORD_COUNTS for more reporting from players.
* Places the content of the job's RT_SC<jobnum> directory in the debug location (includes the job parameter file, Osh script, parent shell script, and any osh and compile scripts associated with transformers). These will be in the same character set as the original files.
* Places the content of the job's RT_BP<jobnum>.O directory in the debug location. This includes library file binaries for PX transformers (plus possibly binaries associated with any Server portions of the job).
* A dump of environment variable values at startup (the same as in the log) is placed in a named file in the debug location.
* A dump of osh command options is placed in a named file in the debug location. Note that this is as issued from the Server wrapper code. Particularly on Windows, it may not represent exactly what is received by the Osh command line, due to the action of the OshWrapper program and its interpretation of quotes and backslash escapes.
* A copy of the received raw osh output messages is placed in a named file in the debug location. These will typically be in the host character set, even though on an NLS system Orchestrate will be originating them in UTF-8.
* A copy of the PX configuration file is placed in the debug location. This will be in the same character set as the original file.
* This new feature collects together and enhances a number of debug features already exposed with other environment variables. In order to minimise code impact risk, the original features will not be removed at this stage.
* The exception is the "dump of raw osh output messages"; it was previously placed in the &COMO& directory. If the old and the new debug options are both enabled, the new one will take precedence and there will not be a copy in &COMO&. Again this decision has been taken to minimise code change.
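Since the variable is not exposed in the GUI, one way to enable it is from the shell environment that starts the job (for example, before invoking the dsjob CLI, or in the project's dsenv file to apply it server-wide). A minimal sketch:

```shell
# Enable the parallel debug feature for jobs started from this shell.
# Setting it in the project's dsenv file would apply it server-wide instead.
export DS_PXDEBUG=1
echo "DS_PXDEBUG=$DS_PXDEBUG"   # prints DS_PXDEBUG=1

# Debug output is then collected under the project-level "Debugging"
# directory, in a per-job subdirectory such as Debugging/<jobname>,
# or Debugging/<jobname>.<invocationID> for multi-instance runs.
```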

Contributed by Christ Thornton 2/2/2007


Monday, 19 September 2016

#1 DataStage Solutions to Common Warnings/Errors - Datatype

Warnings/Errors Related to Datatype

The warnings and errors described in this section are related to data types, such as data type mismatches and column length variations.
A few common data-type-related warnings/errors are:

1.1    Conversion error calling conversion routine decimal_from_string: data may have been lost
Cause:    This can happen when the input is in an incorrect format for the target data type, or contains a null value, so that the conversion function is not able to convert it to the target data type.
Resolution: Check for the correct date or decimal format, and for null values in the date or decimal fields, before passing them to the StringToDecimal function.
    A similar issue can occur with the DataStage StringToDate, DateToString and DecimalToString conversion functions as well.
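A defensive pattern in the Transformer stage is to validate the string before converting it, for example with IsValid. The link and column names below are hypothetical; this is a sketch, not the only correct form:

```
If IsNull(lnk_in.amount_str) Or Not(IsValid("decimal", lnk_in.amount_str))
Then SetNull()
Else StringToDecimal(lnk_in.amount_str)
```

The same guard shape works for the date conversions, e.g. IsValid("date", ...) before StringToDate.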

1.2    Possible truncation of input string when converting from a higher length string to a lower length string
Cause:    This can happen when the input has a length greater than the length given on the output of the same stage.
Resolution: Change the length of the specified column in the specified stage by giving the output the same length as the input.
This can happen in stages such as Merge, Sort, Join, Lookup etc.

1.3    APT_CombinedOperatorController,0: Numeric string expected for input column 'XXXXX'. Use default value.
Cause:    This can happen when the input data type and output data type are different and the type conversion is not handled in the transformer.
Resolution: A type conversion function should be applied based on the target data type.
Ex:    Input data type = Char, Output data type = BigInt
In this case, direct mapping without any type conversion will give this message; the type conversion function needs to be provided.

i. The log normally doesn't show this message as a Warning/Error; it is reported as "Info".
ii. When this happens, the records will not be inserted into the table/file.
iii. The stage name will not be mentioned in the log. To get the name of the stage where this issue is happening, add one environment variable, $APT_DISABLE_COMBINATION, to the job properties and set it to "True".
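For the Char-to-BigInt case above, one way to make the conversion explicit in the Transformer derivation is to validate the string and convert it through a decimal, letting the default decimal-to-integer conversion fill the BigInt column. The link and column names are hypothetical and this is only a sketch:

```
If IsValid("int64", lnk_in.code_char)
Then StringToDecimal(lnk_in.code_char)
Else SetNull()
```

Rows that fail the IsValid check then arrive as NULL rather than triggering the "Numeric string expected" message for each bad value.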
1.4    Reading the WVARCHAR database column XXXXX into a VARCHAR column can cause data loss or corruption due to character set conversions
Cause:    This can happen when the data type is supposed to be Unicode but it is not specified in the stage.
Resolution: Change the data type for the column to Varchar with "Unicode" set, instead of Varchar alone, i.e. select Unicode from the drop-down provided in the Extended column.

1.5    Schema reconciliation detected a type mismatch for field SSSSS. When moving data from field type CHAR(min=100,max=100) into INT64
Cause:    This can happen when the data type is supposed to be Char but it is mentioned as BigInt in the stage.
Resolution: Change the data type for the column to Char with length 100, instead of BigInt, in the corresponding stage.


Monday, 12 September 2016

Python Points #15 - Exceptions