
Tuesday, 6 August 2019

Setup HTTP/S Proxy for Docker


When working with Docker behind a proxy firewall, Docker is unable to communicate with the public Docker registry (Docker Hub) to download images or install dependencies for your Docker scripts. You can resolve this by altering or creating a few config files for Docker. Let's see how --

1. In command line:

Pass the environment variables below as arguments to the docker command to enable the proxy for it:

# In Command Line:
--env HTTP_PROXY="http://<user>:<password>@<proxy_server>:<port>"
--env HTTPS_PROXY="http://<user>:<password>@<proxy_server>:<port>"
--env ALL_PROXY="http://<user>:<password>@<proxy_server>:<port>"
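
For example, a complete docker run invocation with these flags could look like the sketch below (the alpine image and the env check are illustrative, not part of the original post):

# Pass the proxy at run time and verify the container sees it
docker run --rm \
  --env HTTP_PROXY="http://<user>:<password>@<proxy_server>:<port>" \
  --env HTTPS_PROXY="http://<user>:<password>@<proxy_server>:<port>" \
  alpine env | grep -i proxy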

 

2. In Dockerfile:

You can add the proxy settings in your Dockerfile if you don't have access to change them at the system level. Just add the lines below to your Dockerfile.

# In DockerFile
ENV HTTP_PROXY "http://<user>:<password>@<proxy_server>:<port>"
ENV HTTPS_PROXY "http://<user>:<password>@<proxy_server>:<port>"
ENV ALL_PROXY "http://<user>:<password>@<proxy_server>:<port>"
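
A minimal Dockerfile using these settings could look like this (the base image and installed package are assumptions for illustration):

# Illustrative Dockerfile: the proxy lets the RUN step reach the internet
FROM ubuntu:18.04
ENV HTTP_PROXY="http://<user>:<password>@<proxy_server>:<port>"
ENV HTTPS_PROXY="http://<user>:<password>@<proxy_server>:<port>"
RUN apt-get update && apt-get install -y curl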



3. Using the config.json file (client level):

Alternatively, you can create a user-level config file for the Docker environment if one does not exist. These settings are applied to your Docker commands by default unless overridden by other means.

# Create/Edit the ~/.docker/config.json file (put all of the JSON below in this file; create the file if it does not exist)
 

{
 "proxies":
 {
   "default":
   {
     "httpProxy": "http://<user>:<password>@<proxy_server>:<port>",
     "httpsProxy": "http://<user>:<password>@<proxy_server>:<port>",
     "allProxy": "http://<user>:<password>@<proxy_server>:<port>"
   }
 }
}
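
Once the file is saved, the Docker client injects these variables into new containers on its own; a quick check (assuming any small image such as alpine):

# No --env flags needed now; HTTP_PROXY, HTTPS_PROXY and ALL_PROXY
# should show up in the container's environment
docker run --rm alpine env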



4. Using a Docker system-level config change (Docker server level):

You can change the proxy at the Docker server (daemon) level as below:


# Altering the Docker system file (you have to be root/sudo for this)
sudo mkdir -p /etc/systemd/system/docker.service.d
sudo vi /etc/systemd/system/docker.service.d/http-proxy.conf
 
# (add the content below to this file)
[Service]
Environment="HTTP_PROXY=http://<user>:<password>@<proxy_server>:<port>"
Environment="HTTPS_PROXY=http://<user>:<password>@<proxy_server>:<port>"
Environment="ALL_PROXY=http://<user>:<password>@<proxy_server>:<port>"

# now run the commands below to restart the Docker daemon
sudo systemctl daemon-reload
sudo systemctl restart docker

# verify
sudo systemctl show --property=Environment docker
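
If the proxy is in place, the output echoes the environment you configured, along the lines of:

Environment=HTTP_PROXY=http://<user>:<password>@<proxy_server>:<port> HTTPS_PROXY=http://<user>:<password>@<proxy_server>:<port> ALL_PROXY=http://<user>:<password>@<proxy_server>:<port>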







Wednesday, 7 December 2016

Import the jobs from DS windows client #iLoveScripting


We have already discussed a script which can export the DataStage jobs from your client system (http://bit.ly/2frNPKj); likewise, we can write another one to import the jobs. Let's see how -

DsImportJobsClient.bat :

This script reads all the *.dsx job files from the specified directory and its sub-directories and imports them into the specified project. It can also build (BUILD only) an existing package created in Information Server Manager and send it to the specified location on the client machine.

To use the build feature, you need to make sure the package has been created with all the needed jobs, then saved and closed. Updates to the selected jobs are picked up automatically; to add or delete a job, you need to do it manually.

Modify the Import.properties and ImportJobList.txt files, go to the .bat directory, and then execute importAndBuild.bat.




Import.properties :


ImportJobList.txt :


DsImportJobsClient.bat :
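
As a rough, hypothetical sketch only (not the original script), the core import loop of such a .bat file could look like the lines below; the server names, credentials, and dscmdimport flags are placeholder assumptions, and the exact syntax varies by client version (see dscmdimport /?):

REM Illustrative sketch, not the original DsImportJobsClient.bat
REM Assumes the DataStage client bin directory is on PATH
SET PROJECT=MyProject
SET IMPORT_DIR=C:\datastage\import

REM Walk the directory tree and import every .dsx file found
FOR /R "%IMPORT_DIR%" %%F IN (*.dsx) DO (
    echo Importing %%F ...
    dscmdimport /D=services_host:9080 /H=engine_host /U=dsuser /P=dspassword %PROJECT% "%%F"
)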






Sunday, 6 November 2016

Export the jobs from DS windows client



DataStage job export/import is an occasional activity (deployment time :-)) for a developer, but it becomes very tedious if the job list is long or if it is a daily routine to export or import jobs.

So I have written a batch script (Windows script) which we can execute from the client machine (where the DataStage clients are installed) to automate this process.



DsExportJobsClient.bat :
The export script reads the job names from the file ExportJobList.txt, exports each jobName.dsx from the project to the export base location, and maintains the folder structure specified in ExportJobList.txt. The files "ExportJobList.txt" and "Export.properties" should be updated before running the export script.

Copy the files below to any location on your DS Windows client machine and update the "ExportJobList.txt" and "Export.properties" files:
-    Export.properties
-    ExportJobList.txt
-    ExportJobs.bat   



Export.properties :


ExportJobList.txt :


DsExportJobsClient.bat :
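
As a rough, hypothetical sketch only (not the original script), the core export loop could look like the lines below; server names, credentials, and dsexport flags are placeholder assumptions, and the exact syntax varies by client version (see dsexport /?):

REM Illustrative sketch, not the original export script
REM Reads one job name per line from ExportJobList.txt and exports each job to its own .dsx
SET PROJECT=MyProject
SET EXPORT_DIR=C:\datastage\export

FOR /F "usebackq tokens=*" %%J IN ("ExportJobList.txt") DO (
    echo Exporting %%J ...
    dsexport.exe /D=services_host:9080 /H=engine_host /U=dsuser /P=dspassword /JOB=%%J %PROJECT% "%EXPORT_DIR%\%%J.dsx"
)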





Tuesday, 12 April 2016

Export DataStage job log from Director Client



These steps are not suited to exporting logs for more than 3-5 jobs, as they are manual and must be repeated for each job (yes, each job individually), and NO CUSTOMIZATION is available in the export.

** If you want a customized export, use the dsjob command in your script.
1. Open the job log view for one of the jobs in Director Client.
2. Choose Project > Print All entries > Full details > Print to file, and then enter a file name in which you want to save the log.
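
For a scripted export, a minimal dsjob sketch could look like this (run on the DataStage engine; the project and job names are placeholders):

# Hedged sketch: dump one job's log summary to a file
. $DSHOME/dsenv
$DSHOME/bin/dsjob -logsum MyProject MyJob > /tmp/MyJob_log.txt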





Monday, 16 November 2015

List DataStage jobs which used this Parameter



Open the DataStage Administrator Client

Click on the Projects tab and select the project you would like to generate a list for.

Click the Command Button

In the command entry box type:

LIST DS_JOBS WITH JOBTYPE = 3 AND EVAL "TRANS('DS_JOBOBJECTS','J\':@RECORD<5>:'\ROOT',14,'X')" LIKE ...<VARNAME>...

<VARNAME> should be the name of the parameter or environment variable.



Example:

LIST DS_JOBS WITH JOBTYPE = 3 AND EVAL "TRANS('DS_JOBOBJECTS','J\':@RECORD<5>:'\ROOT',14,'X')" LIKE ...TMPDIR...

Click Execute

If the output spans more than one page, click Next to page through it, and click Close when finished.


In this example, a job type of 3 is a parallel job. Valid job type values are:
0 = Server
1 = Mainframe
2 = Sequence
3 = Parallel






Monday, 2 November 2015

5 Tips For Better DataStage Design #4



1) While using the AsInteger() function in a DataStage transformer, always trim the input column before passing it to the function, because extra spaces or unwanted characters generate zeros where actual integer values are expected. We should use the APT_STRING_PADCHAR=0x20 (space) environment variable for fixed-field padding. (See the example derivation after this list.)

2) Len(col) will return an incorrect length if the input column contains non-ASCII or double-byte characters, so check the NLS settings of your job to fix this.



3) To remove embedded spaces from decimal data, use the StripWhiteSpace(input.field) function, which removes all spaces.

4) To get the DataStage job number, open the log view of the job in DataStage Director and double-click on any entry of the log. The job number will be listed under the field "Job Number:".

5) Set the two parameters APT_NO_PART_INSERTION and APT_NO_SORT_INSERTION to TRUE to prevent DataStage from inserting partitioning or sorting operators at compile time to improve job performance. This also removes the warning "When checking operator: User inserted sort "<name>" does not fulfill the sort requirements of the downstream operator "<name>"".
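
For tip 1, a transformer derivation could look like the line below (the link and column names are placeholders):

AsInteger(Trim(lnk_in.amount_str))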






Thursday, 15 October 2015

Behavior of Multi-Instance job in Director Client



Multi-Instance Job:
DataStage supports multi-instance jobs, which can run at the same time with different invocation ids. Today, we will discuss the behavior of a multi-instance DataStage job in Director.


Running Jobs:
When we run a multi-instance job, it asks for an invocation id to be passed. While the job is running, Director displays a new job in the format <JOB_Name>.<Invok_Id>. Nothing changes in the original job; it stays in compiled status. So, if we invoke the job 3 times with 3 invocation ids, it generates 3 jobs in Director -

Jobname.InvkId1
Jobname.InvkId2
Jobname.InvkId3
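
The same instances can also be started from the command line with dsjob (a hedged sketch; the project, job, and invocation names are placeholders):

# Run one instance of a multi-instance job by appending the invocation id
dsjob -run -jobstatus MyProject Jobname.InvkId1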


Monitoring Jobs: 
We can monitor each invoked job as it is generated and becomes visible in Director with its invocation id. But the tool uses the same RT_LOGnn file to write the job log for all invocation ids, so while we see the n instances and their logs in Director, in the backend it is a single file. We can monitor, stop, and check each instance's job log individually.


Deleting Jobs:
If we delete a job instance from Director, it is deleted while the other instances remain. But the job log for this instance is still in the RT_LOGnn file (we can access it through the DataStage command line, but not in DataStage Director, as the job instance has been deleted).


Purging Job logs:
If we purge the job log in DataStage, it deletes the job instances as well as the job logs from the RT_LOGnn file. So the difference here is that the Director delete action only deletes records from RT_STATUS, whereas the purging mechanism deletes records from RT_LOG.
                  




