Showing posts with label Shell Script. Show all posts

Thursday, 9 March 2017

Perl Script to get content difference of a file

This Perl script will help you compare the content of a file that exists in two directories and print the differences.

Usage: filename dir1 dir2

Working: the script picks the filename from the arguments, compares the file's content in the two directories, and prints the
differences. The assumption here is that the same file is available in both directories,
but we can alter this script to take two file names and one directory, or whatever else suits our need.
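For readers who prefer shell, the same idea can be sketched with the standard diff utility (this is my own stand-in, not the Perl script itself; the function name is illustrative):

```shell
#!/bin/sh
# Compare the content of one file that exists in two directories.
# Usage: diff_file <filename> <dir1> <dir2>
diff_file () {
    fname="$1"; dir1="$2"; dir2="$3"
    if [ ! -f "$dir1/$fname" ] || [ ! -f "$dir2/$fname" ]; then
        echo "ERROR: $fname must exist in both $dir1 and $dir2" >&2
        return 2
    fi
    # diff exits 0 when the files are identical, 1 when they differ
    diff "$dir1/$fname" "$dir2/$fname"
}
```

As with the Perl version, the assumption is that the same file name exists in both directories.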

View or Download :

Like the below page to get update

Thursday, 23 February 2017

Shell script to access DataStage Director ETL joblogs

We have various data warehouses hosted on AIX/Linux/Unix operating systems. DataStage Director is one of the ETL monitoring & scheduling tools used in numerous data warehouses. In case of ETL job failures, we need to log in to DataStage Director and check the log for error messages. The ETL job logs can span several pages of various types of informative messages, and usually we need to locate the error message under the 'Fatal' message type. Doing this by searching the DataStage log manually can be very time consuming and tiring.

This shell script lets the user access the error messages from the ease of the Linux screen. The script also has a facility to email the filtered log messages to the user's mailbox.

-  Accepts the job name and other parameters at script execution.
-  Establishes the proper environment settings for local DataStage use.
-  Locates the event ids for fatal errors in the ETL job log.
-  Extracts the detail log for those fatal errors.
-  Mails the filtered job log, with the exact fatal error message, to the user.
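The event-id filtering step can be sketched in shell. This assumes 'dsjob -logsum'-style output with the event id in the first column and the message type in the second; column positions can differ between installations, so treat the field numbers as an assumption:

```shell
#!/bin/sh
# Pull the event ids of FATAL entries out of a DataStage job log
# summary read from stdin (assumed format: "<eventid> <type> <message...>").
fatal_ids () {
    awk '$2 == "FATAL" {print $1}'
}
# Each id returned here would then be fed to "dsjob -logdetail",
# and the collected detail log mailed to the user.
```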


Tuesday, 21 February 2017

PDF Split & Merge with Perl

Sharing two utility scripts (Perl) which can be used for extracting just a few pages from a PDF file, and for combining different PDF files into a single PDF file. There are lots of situations in which we require only a few pages from a big PDF file, so instead of storing or carrying the whole file, we can extract just the page or pages we need from the original. We can also combine different PDF files into a single PDF file using the second script. As an example, if we have extracted pages 10-15 and pages 100-120 from a big PDF file using the extract script, we can combine these two PDFs (i.e. the PDF which contains pages 10-15 and the PDF which contains
pages 100-120) into a single PDF file using the merge script.

NOTE : These two Perl scripts use a Perl module called PDF::API2. If this module is not present on your system as part of the Perl installation, you can download and install it. Please see the installation section for more details.

These two scripts can be used on Windows, Unix or Linux. They are currently tested on Windows with ActivePerl 5.8.8, but they should work on Unix and Linux as well. For the extract script to work on Unix and Linux, please change the variable called "path_separator" to "/" instead of "\\"; this variable can be found at the start of the script. The merge script can be used on both Windows and Unix/Linux without any modification.



     perl -i <input pdf file> -p <page or page range which needs to be extracted>

        -i : Please give the full path to the input PDF file
        -p : Page Number or Page range which needs to be extracted from the input PDF

        example : To extract pages 3 to 5, execute

           perl -i /tmp/abc.pdf -p 3-5

        example : To extract only page 3, execute

        perl -i /tmp/abc.pdf -p 3

Executing with the -h option will display the usage on the screen

Example : perl ./ -h


perl <output pdf file with full path> <input pdf file 1> <input pdf file 2> etc

Execute the script with all the PDF files that need to be merged.
The script will merge them in the same order in which they are given in the input

i.e. If you execute like /tmp/out.pdf /tmp/abc.pdf /tmp/xyz.pdf

then the pages from xyz.pdf will come after the pages from abc.pdf

Executing with the -h option will display the usage on the screen

Example : ./ -h




Thursday, 9 February 2017

Get Queue Depths

Monday, 16 January 2017

Linux Shell Utilities

This shell script can be sourced from any other shell script with "." so that any of its functions can be called. The functions can even be kept in .profile and made available by sourcing .profile.

The following functions are provided here:
to_lower         [ Changes an upper case word to lower case. ]
to_upper         [ Changes a lower case word to upper case. ]
check_numeric    [ Checks whether an input is a number. ]
check_decimal    [ Checks whether an input is a floating point number. ]
check_null       [ Checks for NULLs. ]
string_length    [ Evaluates the length of a string. ]
concatAll        [ Concatenates 'n' number of strings. ]
token_n          [ Returns the n'th token of a string. ]

How To:
This is how you call this script to utilize any of its functions in your current script.

In the caller script:
# Calling

RV=`to_lower $x1`
RV1=`string_length $x1`
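A minimal sketch of a few of these functions (my own versions for illustration; the complete script may implement them differently):

```shell
#!/bin/sh
# Small utility functions meant to be sourced with ". ./utils.sh"
# (the file name utils.sh is illustrative)
to_lower ()      { echo "$1" | tr '[:upper:]' '[:lower:]'; }
to_upper ()      { echo "$1" | tr '[:lower:]' '[:upper:]'; }
string_length () { echo "${#1}"; }
check_numeric () {
    # true (0) only when the input consists entirely of digits
    case "$1" in
        ''|*[!0-9]*) return 1 ;;
        *)           return 0 ;;
    esac
}
```

For example, `RV=\`to_lower "HELLO"\`` sets RV to "hello".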

Find the complete script here -


Monday, 21 November 2016

Reading DSParam - datastage parameter file

I am sharing a utility which can help you read the DSParams file, which holds all the DataStage environment parameters.

Utility to view the contents of the DSParams file. Useful when trying to see everything the customer has set at the project level.

$ cat DSParams | ./ | more
$ cat DSParams | ./ > outputfile

1. Copy the script text below to a file on a UNIX system.
2. Set execute permissions on this file: chmod 777
3. Usually perl is in /usr/bin/perl, but you might have to adjust this path if necessary. (Hint: "which perl" should tell you which one to use.)
4. cat the DSParams file from the project you are concerned with and redirect the output to this script. You may have to use the fully qualified path for this file.
5. Capture the output to screen or file. A file may be useful if you want the customer to send you the info in an email.


Wednesday, 12 October 2016

Script to Auto Compress the System Log Files

This script was originally written by Andy Welter to compress the Linux system log files. I have modified it to work better. You can find the modified version below -

Script Usage:
logroll [-compress|-nocompress]

$ logroll -compress
# compress the log files and move them to the archive directory

$ logroll -nocompress
# move the log files to the archive directory without compressing them
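For reference, the general shape of such a log-roll routine can be sketched as below (a simplified stand-in with illustrative paths and arguments, not Andy Welter's original):

```shell
#!/bin/sh
# logroll sketch: move *.log files from a log directory to an archive
# directory, optionally compressing them with gzip.
logroll () {
    mode="$1"       # -compress or -nocompress
    logdir="$2"
    archive="$3"
    stamp=$(date +%Y%m%d)
    mkdir -p "$archive"
    for f in "$logdir"/*.log; do
        [ -f "$f" ] || continue
        mv "$f" "$archive/$(basename "$f").$stamp"
        if [ "$mode" = "-compress" ]; then
            gzip -f "$archive/$(basename "$f").$stamp"
        fi
    done
}
```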



Wednesday, 20 April 2016

Shell Script Scenario #8 - Odd and Even No

 Write a shell script to find whether an input integer is even or odd. 
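One possible solution, using the shell's arithmetic modulo operator:

```shell
#!/bin/sh
# Report whether an input integer is even or odd
odd_or_even () {
    if [ $(( $1 % 2 )) -eq 0 ]; then
        echo "$1 is even"
    else
        echo "$1 is odd"
    fi
}
```

For example, `odd_or_even 7` prints "7 is odd".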


Tuesday, 19 April 2016

Best book for Linux Shell Scripting - Intermediate User

Today I am going to share my secret of Linux scripting. The book shared below is largely responsible for my shell scripting technique and ease. I love the tips and tricks, the tweaks and much more; you could say I simply love this book :-)

So here it is. Hoping you all benefit from it too.

For intermediate Linux shell scripting users, this book provides all the tips and tricks we can include in our scripts to make them work in a more efficient way.

Mastering Unix Shell Scripting: Bash, Bourne, and Korn Shell Scripting for Programmers, System Administrators, and UNIX Gurus
by Randal K. Michael



Sunday, 31 January 2016

Create a Daemon to Trace New Processes


The following code can be used to create a daemon that will watch for processes that show up in the "ps -ef" output with certain characteristics. When it identifies such processes, it will attach to them with a trace utility (e.g. strace, truss, tusc; you must change the code to use whichever is right for the platform this is run on). The tool does not follow these processes with a fork flag, since it will trace any children that show the same "ps -ef" characteristics. This makes it useful for tracing DS PX programs that contain rsh, since truss's fork flag (i.e. "-f") blocks the rsh from executing.


The script below should be saved to a file such as /tmp/ and given rwx permissions.  The trace utility name that is appropriate for your platform should be altered in the "ps -ef" command and in the "for" loop.  The script would then be run using this syntax:
    /tmp/ <search string>
As mentioned above, the search string can be any value that would appear in the "ps -ef" output.  Such values might be a user id, a particular time, a command, or arguments to a command.  The fifth and eighth lines of this script gather lists of all commands to be traced and then attempt to remove commands that should be ignored.  If you find too many processes getting traced, identify why each was selected and then alter these two lines by adding a "grep -v" to the list of items being ignored.
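The pid-selection part of that loop can be sketched like this (the function names are my own; swap strace for truss or tusc as appropriate for your platform):

```shell
#!/bin/sh
# Extract candidate pids from "ps -ef"-style input, ignoring the
# grep and strace helper processes themselves.
candidate_pids () {
    grep "$1" | grep -v grep | grep -v strace | awk '{print $2}'
}

# Daemon loop: attach a tracer to every new matching pid.
trace_daemon () {
    search="$1"
    while true; do
        ps -ef | candidate_pids "$search" | while read -r pid; do
            # trace each pid only once, writing output under /tmp
            [ -f "/tmp/trace.$pid" ] || strace -p "$pid" -o "/tmp/trace.$pid" &
        done
        sleep 1
    done
}
```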


Friday, 22 January 2016

Shell Script for getting lines after and before of search String

Sometimes when we search for a text string in a Unix environment, we also need the lines that come before or after the matched string.
The small shell script below does exactly that, returning the output in one file based on the user's choice of how many lines to print before and after any searched string.

This script takes as input the search string and the number of lines you want to print before and after each matching line, and writes the output to search.txt
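With GNU grep, the core of such a script is the -B (before) and -A (after) context options (a sketch with a function name of my own):

```shell
#!/bin/sh
# Write the matching lines, plus the requested number of lines of
# context before and after each match, to search.txt.
# Usage: search_context <pattern> <lines before> <lines after> <file>
search_context () {
    pattern="$1"; before="$2"; after="$3"; file="$4"
    grep -B "$before" -A "$after" "$pattern" "$file" > search.txt
}
```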


Thursday, 24 December 2015

Python Points #4 - Conditions

Thursday, 19 November 2015

Python points #2 - Data Type & String Manipulations

Sunday, 15 November 2015

Shell Script Scenario #7 - Anagram words

Two words are called anagrams when you can rearrange the letters of one to spell the other.
e.g. -
Coat and Taco
Heater and Reheat
Cloud and Could

So, write a script which accepts two inputs from the user and reports whether they are anagrams or not.
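One possible approach: two words are anagrams exactly when their sorted letters match.

```shell
#!/bin/sh
# Compare two words letter by letter after lower-casing and sorting
is_anagram () {
    a=$(echo "$1" | tr '[:upper:]' '[:lower:]' | fold -w1 | sort | tr -d '\n')
    b=$(echo "$2" | tr '[:upper:]' '[:lower:]' | fold -w1 | sort | tr -d '\n')
    if [ "$a" = "$b" ]; then
        echo "anagram"
    else
        echo "not anagram"
    fi
}
```

For example, `is_anagram Coat Taco` prints "anagram".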


Tuesday, 10 November 2015

Check Memory Utilization by Datastage processes

While we are running lots of DataStage jobs on a Linux DataStage server, or when different environments share the same server, the resulting resource crunch on the server side can affect job performance.

It is always preferable to keep an eye on resource utilization while jobs are running. Typically, a DataStage admin sets up a cron job with a resource monitoring script which is invoked every five minutes (or more), checks the resource statistics on the server and sends notifications accordingly.

The following processes run on the DataStage Engine server:

dsapi_slave - server side process for DataStage clients like Designer

osh - Parallel Engine process
DSD.StageRun - Server Engine Stage process
DSD.RUN - DataStage supervisor process used to initialize Parallel Engine and Server Engine jobs. There is one DSD.RUN process for each active job

ps auxw | head -1;ps auxw | grep dsapi_slave | sort -rn -k5  | head -10
atul 38846  0.0  0.0 103308   856 pts/0    S+   07:20   0:00 grep dsapi_slave

The example shown lists the top 10 dsapi_slave processes from a memory utilization perspective. We can substitute or add an appropriate grep argument, such as osh, DSD.RUN, or even the user name that was used to invoke a DataStage task, to get a list that matches your criteria.


Friday, 9 October 2015

Shell Script for listing out Running Datastage jobs

This script uses the logic/commands from Get currently running DataStage jobs (link). You can run this script in your Linux terminal with the argument and keep getting updates on which jobs are running on the server. You can give a small number as the refresh time for quicker refreshes.



You may need to change some commands according to your Linux flavour, such as the 'sort' command, or you can simply remove them, whatever suits you better.
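The refresh loop described above can be sketched like this (a simplified stand-in with function names of my own; the field number passed to awk depends on your installation, as noted):

```shell
#!/bin/sh
# Filter "ps -ef"-style input down to running DataStage job names
# (assumes the job name is the 10th field, as in the linked post).
running_job_names () {
    grep DSD.RUN | grep -v grep | awk '{print $10}' | sort
}

# Refresh loop: redisplay the running jobs every N seconds.
watch_jobs () {
    refresh="${1:-10}"
    while true; do
        clear
        date
        ps -ef | running_job_names
        sleep "$refresh"
    done
}
```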

You can find many more posts like this HERE


Wednesday, 7 October 2015

Shell Script Scenario #2 - Solution

You can find other scenarios HERE



Thursday, 1 October 2015

Get currently running DataStage jobs

To get a list of DataStage jobs that are currently running, use the command:

ps -ef | grep DSD.RUN

i.e. - 

>ps -ef | grep -v grep | grep DSD.RUN

"Inv_Rep3_word" and "stand1" is the running job names.

We can use this command in a shell script to get the list of running jobs whenever we want, or schedule it in cron to get an hourly report.

To get the job list only, use the below command -

ps -ef | grep DSD.RUN | grep -v grep | awk '{print $10}' 

This will give you the running job list; the awk command needs to be modified as per your server installation host, whether it is Linux or Windows.

Shell Script for listing out Running Datastage jobs


Wednesday, 30 September 2015

Shell Script Scenario #5 - Letters order

Design a script to produce the output below.
Requirement - Check whether all the letters in a word are in alphabetical order or not.


almost is in ORDER
cereal is not in ORDER 
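One possible solution: a word's letters are in order exactly when sorting them reproduces the word (assuming lower-case input):

```shell
#!/bin/sh
# Check whether the letters of a (lower-case) word are in
# alphabetical order by comparing the word against its sorted letters.
letters_in_order () {
    sorted=$(echo "$1" | fold -w1 | sort | tr -d '\n')
    if [ "$sorted" = "$1" ]; then
        echo "$1 is in ORDER"
    else
        echo "$1 is not in ORDER"
    fi
}
```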

For more scenarios -  CLICK HERE

Wednesday, 23 September 2015

Shell Script Scenario #6 - Shuffle the input

- Design a shell script which takes characters, numbers or strings as input, separated by spaces, and generates a shuffled output from that input. The output should be random.

Input: 1 2 3 4 5 6 7

Output1: #when we run script 1st time (it can differ from your output)
7 1 4 2 3 6 5

Output2: #when we run script next time (it can differ from your output)
5 1 7 3 2 6 4


Input: a p s t u b

Output: #when we run script 1st time (it can differ from your output)
t b p s a u
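One possible solution, using shuf from GNU coreutils to randomize the tokens (on systems without shuf, an awk-based shuffle would be needed instead):

```shell
#!/bin/sh
# Shuffle whitespace-separated input tokens into a random order
shuffle_tokens () {
    echo "$1" | tr ' ' '\n' | shuf | tr '\n' ' ' | sed 's/ $//'
}
```

For example, `shuffle_tokens "1 2 3 4 5 6 7"` prints the same seven numbers in a random order.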
