
Datastage Developer Resume

Plano, TX


  • 9+ years of IT experience in the development, implementation and testing of database/data warehousing applications for the financial, insurance, auto and retail industries, using data extraction, data transformation, data loading and data analysis.
  • Proficient with IBM InfoSphere DataStage, Ascential DataStage, QualityStage, Information Analyzer, ProfileStage and AuditStage.
  • Experience integrating heterogeneous data sources such as XML, mainframe COBOL files, flat files, Oracle, SQL Server, Teradata and DB2 UDB EEE into a data warehouse.
  • Extensive experience in loading high-volume data and in performance tuning.
  • Hands-on experience with the DataStage client components: Designer, Director and Manager.
  • Experience with UNIX shell scripting for file validation.
  • Used Teradata bulk-load stages to load data into the Teradata database.
  • Migrated/upgraded DataStage 8.x to 11.x using SVN for version control, the istool export/import utilities and the Export/Import options in the Designer client.
  • Imported and exported jobs category-wise during the DataStage 8.x to 11.x migration and maintained regular backups.
  • Experience in designing and developing complex DataStage jobs, routines and sequencers.
  • Hands-on experience writing, testing and implementing triggers, procedures and functions at the database level using PL/SQL.
  • Primary on-call support for production jobs on a rotation basis.
  • Managed ETL processes that pull large volumes of data from OLTP systems into a staging database, using SSIS for data transformation.
  • Hands-on experience with SSIS ETL processes, ensuring proper implementation of event handlers, logging, checkpoints, transactions and package configurations.
  • Created SSIS packages to integrate data from text and Excel files.
  • Provided on-call production support on a rotation basis for newly installed code and all production issues.
  • Worked with scheduling tools such as TWS, Autosys and Control-M.
  • Experience in 24/7 production support for various projects.
  • Experience in the design and implementation of DW projects with star and snowflake schemas. Strong understanding of fact tables, dimension tables, normalized/de-normalized tables, data marts, data mappings and the Ralph Kimball/Bill Inmon methodologies.
  • Extensive experience in the design and development of OLTP, OLAP and decision support systems (DSS).
  • Experienced with all phases of the software development life cycle; involved in business analysis, design, development, implementation and support of software applications.
  • Experience with Waterfall and Agile project methodologies.
  • Highly adaptive team player with a proven ability to work in fast-paced environments and excellent communication skills.
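The UNIX file-validation work mentioned above can be sketched as a small shell function. The file name, delimiter and expected column count here are illustrative assumptions, not the actual production checks:

```shell
#!/bin/sh
# File-validation sketch: the incoming file must exist, be non-empty, and
# have the expected number of pipe-delimited columns in its first record.
validate_file() {
    file=$1
    expected_cols=$2

    [ -s "$file" ] || { echo "FAIL: $file missing or empty"; return 1; }

    actual_cols=$(head -1 "$file" | awk -F'|' '{print NF}')
    if [ "$actual_cols" -ne "$expected_cols" ]; then
        echo "FAIL: $file has $actual_cols columns, expected $expected_cols"
        return 1
    fi
    echo "OK: $file"
}
```

A wrapper script would typically call this for each expected feed before kicking off the load jobs.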


ETL Tools: IBM Information Server 11.5/11.3/9.1/8.5/8.1/8.0.1, Ascential DataStage 7.5/7.1/6.0, Parallel Extender (Orchestrate), QualityStage, Information Analyzer (ProfileStage), FastTrack, AuditStage, Business Glossary and Metadata Workbench.

Databases: Oracle 11g/10g/9i/8i/8.0/7.0, SQL Server 2000/2005/2008, Teradata V2R6/V2R5, DB2 UDB 9.5/9.1/8.2/8.1 EEE, MS Access 97/2000

Languages: SQL, Transact SQL (T-SQL), SQL*Plus, UNIX Shell Scripts, Perl, C, C++, Java, HTML, XML and .Net

Data Modeling: Erwin 3.5.1/4.2, Power Designer 6.0/9.5 and MS Visio

Other Software: Hadoop Distributed File System (HDFS), TOAD, MS Office, Pro*C, BTEQ, Teradata SQL Assistant 6.1.0, SecureCRT

Versioning Tools: SVN, CVS and PVCS.

Scheduling Tools: TWS, Autosys and Control-M.

Operating Systems: IBM AIX 5.2, HP-UX 10.2, Windows 9x/NT/2000/XP, Windows Server 2003/2008, Solaris 2.8/SunOS 5.8, Red Hat Linux AS


Datastage Developer

Confidential, Plano, TX


  • Analyzed the data model and created mapping documents for the fact and dimension tables; worked on existing reusable DataStage jobs to load the data warehouse and data mart tables.
  • Used the Confidential E-3 framework to create config and parm files for ETL jobs, and used the existing initialization and run scripts to run the ETL jobs.
  • Used Aginity Workbench as the query tool for Netezza and SQuirreL for DB2.
  • Worked with the Control-M scheduler for ETL job flow and job dependencies.
  • Analyzed data with Hive queries (HiveQL) and ran Pig scripts.
  • Used Hive for data warehousing and summarization.
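A Hive summarization step of the kind described above might be driven from a shell script like the following sketch. It only builds the HiveQL text; the table and column names (claims_raw, claims_daily_summary, claim_amt) are hypothetical:

```shell
#!/bin/sh
# Builds the HiveQL for a daily summary table. In practice the query would be
# submitted with: hive -e "$(build_summary_query 2016-01-31)"
# Table and column names are assumed examples, not the actual schema.
build_summary_query() {
    load_date=$1
    cat <<EOF
INSERT OVERWRITE TABLE claims_daily_summary PARTITION (load_dt='$load_date')
SELECT claim_type, COUNT(*) AS claim_cnt, SUM(claim_amt) AS claim_total
FROM claims_raw
WHERE load_dt = '$load_date'
GROUP BY claim_type;
EOF
}
```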

Environment: IBM DataStage 11.5/9.1, Netezza, UNIX shell scripting, HDFS, Sqoop, Control-M, Aginity, SQuirreL, DB2, RTC

Confidential, Minneapolis, MN

ETL-IBM Datastage Developer


  • Interacted with SMEs, BAs, DBAs, operations and Scrum Masters in the daily Scrum.
  • Gathered requirements from the business.
  • Prepared the design document capturing the mapping information.
  • Designed the ETL solution per the mapping requirements.
  • Involved in high-level design, low-level design, code development and performance tuning of long-running queries.
  • Raised HP ALM tickets to DBAs for any DDL changes needed in SIT, UAT and Prod.
  • Extensively used DataStage tools such as the InfoSphere DataStage Designer and Director to develop jobs and view log files for execution errors.
  • Extensively used the DataStage Director to monitor jobs and check run statistics, and used SVN and CLM to export/import DataStage components.
  • Migrated code to SIT and UAT using the component migration tool (CLM) and to PROD with a change ticket through HPSM.
  • Migrated/upgraded DataStage 8.x to 11.x using SVN for version control, the istool export/import utilities and the Export/Import options in the Designer client.
  • Imported and exported jobs category-wise during the DataStage 8.x to 11.x migration and maintained regular backups.
  • Worked on Medicare claims for various domains of Confidential.
  • Analyzed technical issues in the code and identified the changes needed.
  • Implemented best practices for better performance.
  • Primary on-call support for production jobs on a rotation basis.
  • Provided on-call production support on a rotation basis for newly installed code and all production issues.
  • Prepared technical specs, unit test plans, ORG documents and other project documents, and uploaded them to SharePoint.
  • Created shell scripts to run DataStage jobs from UNIX and scheduled those scripts through the scheduling tool.
  • Created OnDemand jobs in UNIX to run the DataStage jobs through TWS to load the tables.
  • Created UNIX scripts to automate status reporting for long-running and failed jobs.
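A shell wrapper of the kind described above, running a DataStage job from UNIX, could be sketched as follows. The project and job names are placeholders, and the status mapping assumes the conventional dsjob exit codes (1 = finished OK, 2 = finished with warnings):

```shell
#!/bin/sh
# Wrapper that runs a DataStage job through the dsjob CLI and maps the exit
# status to a readable result, suitable for calling from a scheduler.
# DSJOB can be pointed at the real binary; defaults are assumptions.
DSJOB=${DSJOB:-dsjob}

run_ds_job() {
    project=$1
    job=$2
    rc=0
    "$DSJOB" -run -jobstatus "$project" "$job" || rc=$?
    case $rc in
        1|2) echo "$job finished (status $rc)" ;;      # OK / OK with warnings
        *)   echo "$job FAILED (status $rc)"; return 1 ;;
    esac
}
```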

Environment: IBM DataStage 11.5/8.5, Teradata, SSIS, Tivoli Workload Scheduler (TWS), UNIX shell scripting, SVN, ITG, SharePoint, CLM

Confidential, San Leandro, CA

DataStage and QualityStage Developer


  • Interacted with business users and other project teams to analyze the requirements.
  • Created functional specifications, mapping documents and technical specifications.
  • Defined the project plan and delivery timelines for each phase.
  • Designed the process for loading data from the foundation layer into dimension and fact tables.
  • Designed the process flows for Master Data Management.
  • Used Information Analyzer to understand data patterns through column analysis and key-relationship analysis.
  • Developed ETL jobs using the Transformer, Modify, Change Capture, Aggregator, Funnel, Data Set, File Set, CFF and enterprise plug-in stages to transform and load data into the data warehouse.
  • Standardized and matched data using QualityStage stages such as Investigate, Standardize, Match Frequency, Survive and CASS.
  • Altered the Match stage (QualityStage) specification to produce correct matches.
  • Designed and implemented the survivorship logic.
  • Managed ETL processes that pull large volumes of data from OLTP systems into a staging database, using SSIS for data transformation.
  • Built SSIS packages to load data into the staging, ODS and dimensional models.
  • Developed ETL processes using SSIS to transfer data from heterogeneous data sources.
  • Extensively worked with job sequences using Job Activity, Email Notification, Sequencer and Wait For File activities to control and execute the DataStage parallel jobs.
  • Created BASIC subroutines to capture and report row counts and job status.
  • Created sequences to maintain restartability and recoverability, and UNIX scripts to execute the jobs.
  • Enhanced and converted existing processes to advanced architectures and the Managed Services framework. Used the PAC2000 v7.1 tool to create work orders for production installs and problem tickets for production issues.
  • Provided on-call production support on a rotation basis for newly installed code and all production issues.
  • Reviewed and provided feedback on the designs and jobs created by other team members.
  • Provided mentoring and support to Macy's team members for DataStage design, code, testing and system administration.
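The restartability described above is often built on a simple checkpoint file: each completed step is recorded, and reruns skip steps that already finished. This sketch uses a hypothetical step name and a plain-text checkpoint:

```shell
#!/bin/sh
# Checkpoint-based restartability sketch: every completed step is appended to
# a checkpoint file and skipped on rerun. CKPT path and step names are examples.
CKPT=${CKPT:-/tmp/etl_run.ckpt}

run_step() {
    step=$1
    shift
    if grep -qx "$step" "$CKPT" 2>/dev/null; then
        echo "skip $step (already done)"
        return 0
    fi
    # run the step command; record the checkpoint only on success
    if "$@"; then
        echo "$step" >> "$CKPT"
    fi
}
```

On a clean run the checkpoint file is removed first; after a failure the same script is simply rerun and resumes at the failed step.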

Environment: IBM Information Server 8.7/9.1, HDFS, ksh, Perl, SSIS, C, C++, Java, SharePoint, SAS, Teradata R13, PostgreSQL, Oracle, DB2, CLM, TWS, SVN and Linux.

Confidential, Tampa, FL

DataStage Developer


  • Met with business analysts to collect and analyze the requirements, implemented them, and prepared specification documents for the EDW process.
  • Worked with the project lead, technical lead and functional analysts to understand the functional requirements, and designed the technical specifications from them.
  • Developed the jobs that generate output CSV files.
  • Extensively worked with plug-in stages such as the Stored Procedure stage.
  • Extensively worked with Join, Lookup and Merge stages.
  • Extensively worked with Sequential File, Data Set, File Set and Lookup File Set stages.
  • Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging purposes.
  • As a DataStage developer, created parallel, server and sequence jobs.
  • Used the client components Designer, Director and Administrator.
  • Created the user variables required for the jobs.
  • Used Perl and UNIX scripts for moving files, scheduling jobs and removing nulls from flat files.
  • Created framework scripts and jobs to streamline the ETL process flow.
  • Managed ETL processes that pull large volumes of data from OLTP systems into a staging database, using SSIS for data transformation.
  • Created SSIS packages to integrate data from text and Excel files.
  • Implemented performance-tuning techniques at various stages of the ETL process.
  • Followed up the deployment of DataStage code migrations across environments (development, test and production) with the admin team.
  • Maintained the data warehouse by loading dimensions and facts; also worked on enhancements to the fact tables.
  • Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
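The flat-file cleanup step mentioned above (removing nulls before the load) can be a one-line tr filter in a shell function; file paths here are examples:

```shell
#!/bin/sh
# Flat-file cleanup sketch: strips NUL bytes and carriage returns from an
# extract before it is handed to a DataStage load job.
clean_flat_file() {
    in_file=$1
    out_file=$2
    tr -d '\000\r' < "$in_file" > "$out_file"
}
```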

Environment: IBM InfoSphere Information Server 8.5/8.7/9.1, HP-UX, SSIS, Oracle 11g, C/C++, ksh, Perl, SVN, Linux.

Confidential, Greenville, SC

ETL Datastage Developer


  • Designed, developed and implemented ETL jobs to load internal and external data into the data mart.
  • Worked closely with the business team to gather requirements.
  • Developed the source-to-target mapping for each dimension and fact table.
  • Developed and modified ETL jobs to meet monthly and quarterly reporting needs.
  • Developed ETL processes to load data into fact tables from multiple sources such as files, Oracle, Teradata and SQL Server databases.
  • Wrote Perl and UNIX scripts for preliminary file checks and for extracting data from vendors.
  • Developed processes to schedule and control ETL production runs using Control-M.
  • Hands-on experience with the HP OpenView tool for migration requests from Dev to Test and Production.
  • Developed Visio process-flow diagrams for better understanding of the process.
  • Performed ETL tuning to improve performance.
  • Wrote archival scripts for extracted source data.
  • Used a CVS repository to track ETL changes.
  • Created test cases and performed unit and system testing for ETL jobs.
  • Worked closely with the testing team to rectify defects and document them in the ClearQuest defect tracker.
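An archival script of the kind mentioned above might compress each extracted source file into a dated directory after a successful load; the directory layout and date format are assumed examples:

```shell
#!/bin/sh
# Archival sketch: compress the extracted source file into a dated archive
# directory and remove the original.
archive_source_file() {
    src=$1
    archive_dir=$2
    run_date=${3:-$(date +%Y%m%d)}
    mkdir -p "$archive_dir/$run_date"
    gzip -c "$src" > "$archive_dir/$run_date/$(basename "$src").gz"
    rm -f "$src"
}
```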

Environment: IBM InfoSphere Information Server 8.5/8.7, HP-UX, UDB, Oracle 9i/10g, C/C++, ksh, Perl, Erwin 4.0, Cognos, Control-M.


ETL Datastage Developer


  • Produced effort estimates and impact analyses, and designed code using best practices so the existing system was least affected.
  • Allocated work to the offshore team and reviewed the code before delivery.
  • Wrote complex Oracle SQL queries for aggregating and joining sales data and loading it into denormalized tables.
  • Refreshed materialized views to provide data for downstream applications.
  • Wrote complex UNIX scripts using grep, awk and sed to run multi-step processes for transforming data and loading it into Oracle tables.
  • Loaded data into hierarchy models and wrote complex PL/SQL procedures and functions.
  • Provided support for the Cognos team.
  • Wrote complex DataStage jobs implementing complex transformations.
  • Migrated DataStage jobs from version 7.5 to 8.1.
  • Ran DataStage jobs in multiple instances and loaded partitioned/sub-partitioned Oracle tables with 180 million records.
Environment: IBM DataStage 8.1, Oracle 10g, UNIX shell scripting, PL/SQL, Control-M, Cognos, TOAD.


DataStage Developer


  • Coded and tested end-to-end DataStage applications.
  • Created DataStage jobs to extract job logs and load them into an Oracle database.
  • Created DataStage jobs using the Aggregator stage to build counters of webpage hits per customer.
  • Created implementation plans (scheduling documents) and handled day-to-day client interaction.
  • Created and reviewed LLD, HLD, BSD, unit testing, SIT, UAT and implementation-plan documents.
  • Performed impact analysis across processes and created the business solution document.
  • Implemented UNIX scripts to replace personal user IDs/passwords with generic user IDs.
  • Modified all DataStage jobs that utilized developer user IDs.
  • Created UNIX scripts to access a secured file of sensitive user IDs/passwords used by the DataStage jobs.
  • Involved in unit, system, integration and performance testing of the jobs.
  • Involved in the creation and execution of test plans, test scripts and job-flow diagrams.
  • Worked closely with data quality analysts and business users to ensure data accuracy and consistency after table loads.
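The credential-swap scripts described above typically read generic credentials from a locked-down file rather than hard-coding personal user IDs in jobs. In this sketch the KEY=VALUE file format and the variable names (DS_USER, DS_PASS) are assumptions:

```shell
#!/bin/sh
# Sketch of loading generic credentials from a secured file for DataStage jobs.
# Refuses the file if it is readable by group or others, then exports each
# KEY=VALUE line into the environment.
load_generic_credentials() {
    cred_file=$1
    # check group/other permission bits (columns 5-10 of ls -l output)
    perms=$(ls -l "$cred_file" | cut -c5-10)
    case $perms in
        *[rwx]*) echo "ERROR: $cred_file must be owner-readable only" >&2; return 1 ;;
    esac
    while IFS= read -r line; do
        case $line in ''|'#'*) continue ;; esac   # skip blanks and comments
        export "$line"
    done < "$cred_file"
}
```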

Environment: Ascential DataStage 7.1, Oracle 9i, PL/SQL, UNIX shell scripting, Perl scripting, Control-M, HP Quality Center.
