
Monday, 16 January 2017

Linux Shell Utilities


This shell script can be sourced from any other shell script with ". UXUtils.sh", after which any of its functions can be called. The functions can also be kept in .profile and made available by sourcing .profile.


The following functions are provided here:
to_lower         [ Converts an upper-case string to lower case. ]
to_upper         [ Converts a lower-case string to upper case. ]
check_numeric    [ Checks whether an input is a number. ]
check_decimal    [ Checks whether an input is a floating point number. ]
check_null       [ Checks for NULLs. ]
string_length    [ Returns the length of a string. ]
concatAll        [ Concatenates 'n' number of strings. ]
token_n          [ Returns the n'th token of a string. ]

How To:
This is how you call the script to use any of its functions in your current script.

In the caller script:
----------------------
# Calling UXUtils.sh
. UXUtils.sh

x1="COMPUTER"
RV=`to_lower $x1`
RV1=`string_length $x1`
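
For reference, here is a minimal sketch of what a couple of these functions might look like in POSIX shell. The names follow the list above, but the bodies are assumptions for illustration, not the original UXUtils.sh:

# UXUtils.sh (sketch) - source it with:  . UXUtils.sh
to_lower()      { echo "$1" | tr '[:upper:]' '[:lower:]'; }   # upper -> lower
to_upper()      { echo "$1" | tr '[:lower:]' '[:upper:]'; }   # lower -> upper
string_length() { echo ${#1}; }                               # length of the first argument
check_numeric() {                                              # prints 1 if the input is all digits, else 0
    case "$1" in
        ''|*[!0-9]*) echo 0 ;;
        *)           echo 1 ;;
    esac
}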


Find the complete script here -





Monday, 12 December 2016

vmware player powering on internal error



Last night I struggled with the "vmware player powering on internal error" for almost an hour while trying to run a VMware guest OS on my machine, and tried many tweaks to resolve it with no success.
After lots of googling, I found one solution which worked for me, so I am sharing it here in case it helps someone stuck like me :)


Saturday, 8 October 2016

#2 DataStage Solutions to Common Warnings/Errors - Null Handling


Warnings/Errors Related to Null Handling -



1.1       When checking operator: When binding output interface field “XXXXX” to field “XXXXX”: Converting a nullable source to a non-nullable result

Cause: This can happen when reading from an Oracle database, or in any processing stage where the input column is defined as nullable but the metadata in DataStage is defined as non-nullable.

Resolution: Convert the nullable field to a non-nullable one by applying one of the null-handling functions available in DataStage (for example, NullToValue or NullToEmpty in a Transformer) or by handling the null in the query itself.


1.2       APT_CombinedOperatorController(1),0: Field 'XXXXX' from input dataset '0' is NULL. Record dropped.

Cause: This can happen when there is no null handling on a column and the same column is used in constraints/stage variables.

Resolution: Apply a null-handling function to the column referenced in the constraint/stage variable.


http://www.datagenx.net/2016/09/datastage-solutions-to-common.html


1.3       Fatal Error: Attempt to setIsNull() on the accessor interfacing to non-nullable field "XXXX".

Cause: This can happen when the column in the source is nullable but is defined as non-nullable in the DB2 stage.

Resolution: Change the Nullable property for the column to "Yes" instead of "No".


1.4       Exporting nullable field without null handling properties

Cause: This can happen when columns are defined as nullable in the Sequential File stage but no representation for null values is specified.

Resolution: Specify the "Null field value" property in the Format tab of the Sequential File stage.







Monday, 19 September 2016

#1 DataStage Solutions to Common Warnings/Errors - Datatype


Warnings/Errors Related to Datatype

The warnings/errors described in this section are related to data types, such as data type mismatches and column length variations.
A few common warnings/errors we get based on data type are:

1.1    Conversion error calling conversion routine decimal_from_string data may have been lost
Cause: This can happen when the input is in an incorrect format for conversion to the target data type, or contains a null value, so that the conversion function cannot convert it.
Resolution: Check for the correct date or decimal format, and for null values in the date or decimal fields, before passing them to the StringToDecimal function.
    A similar issue can occur with the DataStage StringToDate, DateToString and DecimalToString conversion functions as well.




1.2    Possible truncation of input string when converting from a higher length string to a lower length string
Cause: This can happen when the input has a length greater than the length defined on the output of the same stage.
Resolution: Change the length of the specified column in the specified stage so that the output length matches the input length.
This can happen in stages like Merge, Sort, Join, Lookup etc.

1.3    APT_CombinedOperatorController,0: Numeric string expected for input column 'XXXXX'. Use default value.
Cause: This can happen when the input data type and output data type are different and the type conversion is not handled in the Transformer.
Resolution: A type conversion function should be applied based on the target data type.
Ex:    Input data type = Char, Output data type = BigInt
In this case, a direct mapping without any type conversion will give this message; a type conversion function needs to be provided.

Note:
i. The log normally doesn't show this message as a Warning/Error; it is reported as "Info".
ii. When this happens, the records will not be inserted into the table/file.
iii. The stage name is not mentioned in the log. To get the stage name where this issue is happening, add the environment variable $APT_DISABLE_COMBINATION to the job properties and set it to "True".
   
1.4    Reading the WVARCHAR database column XXXXX into a VARCHAR column can cause data loss or corruption due to character set conversions
Cause: This can happen when the data type is supposed to be Unicode but this is not specified in the stage.
Resolution: Change the data type for the column to Varchar with "Unicode" instead of Varchar alone, i.e. select Unicode from the drop-down provided in the Extended column.

1.5    Schema reconciliation detected a type mismatch for field SSSSS. When moving data from field type CHAR(min=100,max=100) into INT64
Cause: This can happen when the data type is supposed to be Char but it is defined as BigInt in the stage.
Resolution: Change the data type for the column to Char with length 100 instead of BigInt in the corresponding stage.







Friday, 6 May 2016

DS Fatal Error: Destination "APT_TRinput0Rec0" is already bound


Fatal Error: Destination "APT_TRinput0Rec0" is already bound - Transformer Stage Error


Solutions:
* Check whether the output stage has identical column names
* Check whether RCP (Runtime Column Propagation) is enabled on the input links

If yes,
rename the target output columns accordingly
or disable RCP.






Sunday, 6 March 2016

Python SyntaxError - Non-ASCII character '\xe2' in file


If you get the below error while running your Python code -

SyntaxError: Non-ASCII character '\xe2' in file .\set_learn.py on line 32, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details

and you are using Notepad++, here is how you can resolve it -

1. By converting the text encoding

Go to Menu -> Encoding -> Convert to UTF-8

and save the file. (The error message points to PEP 263, so you can also add an encoding declaration such as # -*- coding: utf-8 -*- as the first or second line of the script.)


2. By searching for the \xe2 character and replacing it with empty

Open the Replace dialog (Ctrl-H)
Find: \xe2
or Find: [^\x00-\x7F]+ to delete all non-ASCII characters
Leave "Replace with" empty
Select Search Mode as "Regular expression"
Click "Replace All"


3. In Linux

a. Find the lines which contain bad characters -
grep -nP "[\x80-\xFF]" INPUT_FILE


b. Some ways to remove them -
sed 's/[^[:print:]]//g' INPUT_FILE > clean-file
sed 's/[\x80-\xff]//g' INPUT_FILE > clean-file
tr -cd '\11\12\15\40-\176' < INPUT_FILE > clean-file

(Note: do not combine sed -i with the "> clean-file" redirection - use -i to edit INPUT_FILE in place, or the redirection to write a new file, not both.)

** Word of caution - since we are using character ranges, these commands may remove some characters you actually need in the file, so take a backup of your file first.
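
As a minimal sketch of that workflow (the file names are just placeholders), you could back up the file, clean it, and then verify that nothing non-ASCII is left:

cp INPUT_FILE INPUT_FILE.bak                                # keep the original
tr -cd '\11\12\15\40-\176' < INPUT_FILE.bak > INPUT_FILE    # keep tab, LF, CR and printable ASCII only
grep -cP "[\x80-\xFF]" INPUT_FILE                           # should print 0 if the file is clean
diff INPUT_FILE.bak INPUT_FILE                              # review exactly what was removed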






Sunday, 10 January 2016

A Quick DataStage Recipe


Under this series, I am trying to cook up some quick solutions for DataStage problems, issues, and technical implementations of re-usable logic that we face in day-to-day tasks.

        Hope you will find them useful. Keep watching this space.

A Quick DataStage Recipe -> http://www.datagenx.net/search/label/aQDsR?max-results=12






Wednesday, 7 October 2015

Shell Script Scenario #2 - Solution


You can find other scenarios HERE









Monday, 24 August 2015

Linux Shell Script Scenario Solution - 1




You can find the question HERE



Solution: