
DataStage ETL Developer Resume

Chicago, IL

SUMMARY:

  • 8 years of professional ETL (Extract, Transform, and Load) experience using IBM InfoSphere DataStage 11.5/11.3/9.1/8.5/8.1/7.5.
  • Expert-level experience with the DataStage client components: DataStage Designer, DataStage Manager, DataStage Director, and DataStage Administrator.
  • Extensively worked on Server Edition and Enterprise Edition (Parallel Extender) and on development of data warehouse/data mart applications.
  • DataStage experience connecting to Hadoop/big data platforms.
  • Experience with SAP systems as sources and with configuring SAP-related stages in DataStage.
  • Extensively used Aggregator, Sort, Merge, Join, Change Capture, and Peek stages in Parallel Extender jobs.
  • Used DataStage Manager to import/export DataStage projects and jobs and to define table definitions in the repository.
  • Used DataStage Director to debug, validate, schedule, run, and monitor DataStage jobs.
  • Experience designing job batches and job sequences for scheduling server and parallel jobs using DataStage Director and UNIX scripts.
  • Proficient in data warehousing techniques for slowly changing dimensions, surrogate key assignment, and change data capture.
  • Applied DataStage runtime column propagation (RCP) and created DataStage parallel jobs to sync tables between environments.
  • Extensive experience handling high-volume data, performance tuning, and maintaining multiple jobs.
  • Expertise in writing UNIX shell scripts, with hands-on experience scheduling shell scripts using Autosys/Control-M.
  • Expertise in data migration and upgrades.
  • Expertise in using Erwin for data modeling.
  • Proven track record in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancements.
  • Split complex job designs into separate job segments executed through a job sequencer for better performance and easier maintenance.
  • Strong knowledge of data warehouse architecture: designing star schemas, snowflake schemas, and fact and dimension tables; physical and logical modeling using Erwin.
  • Extensive experience developing strategies for extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using DataStage.
  • Involved in performance fine-tuning of ETL programs; tuned DataStage jobs and stored procedure code.
  • Excellent experience with relational databases (RDBMS): Oracle 11g/10g/9i, Microsoft SQL Server, Netezza, and Teradata (Load and MultiLoad utilities), along with SQL, PL/SQL, and TOAD.
  • Involved in logical and physical design, backup, restore, data integration and data transformation services, and creating database objects (tables, indexes, triggers, views, and stored procedures).
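
The Autosys/Control-M scheduling noted above typically wraps DataStage's `dsjob` command-line client in a small shell script. A minimal sketch, assuming the project and job names below, which are placeholders rather than anything from this resume:

```shell
# Hypothetical wrapper a scheduler (Autosys/Control-M) would invoke.
# PROJECT and JOB are placeholder names, not real projects.
PROJECT=DW_PROJ
JOB=load_dim_customer

if command -v dsjob >/dev/null 2>&1; then
  # -run -wait blocks until the job finishes; -jobinfo reports its status
  dsjob -run -wait "$PROJECT" "$JOB"
  dsjob -jobinfo "$PROJECT" "$JOB"
  status="ran"
else
  status="dsjob not on PATH (sketch only)"
fi
echo "$status"
```

In practice the scheduler reads the script's exit code to decide whether downstream jobs are released.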

TECHNICAL SKILLS:

ETL tools: IBM InfoSphere DataStage 11.5/11.3/9.1/8.7/8.5/8.1

Data Modeling: E-R Modeling, Star and Snowflake Schema Modeling

Databases: Oracle 11g/10g/9i/8i, SQL Server 2008/2005, Netezza 7.x/6.x, Teradata 13.x, DB2 UDB, Cassandra

Languages: C/C++, SQL, PL/SQL and Shell Scripting

Operating Systems: UNIX, Linux, Windows XP/7/8, Windows Server

Scheduling Tools: Autosys, Control-M, Tivoli

PROFESSIONAL EXPERIENCE:

DataStage ETL Developer

Confidential, Chicago, IL

Responsibilities:

  • Designed and developed DataStage jobs that load millions of records from source to target tables.
  • Established best practices for DataStage jobs to ensure optimal performance and reusability.
  • Enhanced SQL scripts and queries to improve their performance.
  • Prepared technical design documents.
  • Updated the existing process-flow document (Microsoft Visio) to reflect code newly added to the scheduler.
  • Involved in system analysis and design of DataStage jobs.
  • Designed and developed ETL processes for extracting data from legacy systems and loading into target tables using SQL and the DataStage Designer client.
  • Developed jobs using processing stages such as Join, Lookup, Column Generator, and Funnel, applying Parallel Extender partitioning concepts.
  • Used DataStage Director to verify logs and to monitor jobs during and after runs.
  • Supported the Product Verification team in identifying and fixing issues.
  • Performed unit testing and integration testing.
  • Triggered job streams from the Tivoli scheduler and captured the Tivoli logs.
  • Worked with the Tivoli scheduler team to add new jobs to the schedule.
  • Enhanced UNIX shell scripts to support business requirements.
  • Ran shell scripts in debug mode to identify the cause of failures.
  • Checked DataStage jobs and shell scripts in and out of the ClearCase repository.
  • Provided knowledge transfer to newly onboarded resources.
  • Analyzed production issues and fixed them in collaboration with business analysts.
  • Responsible for creating the deployment verification guide.
  • Responsible for verifying code after deployment to the production and Product Verification environments.
  • Responsible for synchronizing code across development environments.
  • Compared DataStage code between environments using DataStage tooling.
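
Running shell scripts in debug mode, as above, usually means `sh -x`: the shell traces each command, so the last traced line in the log points at the failing step. An invented example (the script name and contents are illustrative, not an actual project script):

```shell
# Create a tiny stand-in load script that fails when no batch id is given.
cat > load_data.sh <<'EOF'
#!/bin/sh
set -e
echo "loading batch $1"
test -n "$1"   # fails (and aborts, via set -e) on an empty batch id
EOF

# Run it in debug mode; the command trace goes to trace.log.
sh -x ./load_data.sh "" > run.log 2> trace.log || echo "failed; see trace.log"
tail -n 2 trace.log   # the last traced command is the one that failed
```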

Environment: IBM InfoSphere DataStage 11.5, Toad for DB2 6.5, SQL Server, UNIX, WinSCP, ClearCase version control, IBM Tivoli scheduler, ClearQuest.

DataStage ETL Developer

Confidential, Lebanon, NJ

Responsibilities:

  • Involved in full development life cycle.
  • Participated in all stages of the development life cycle, including reverse engineering, requirement analysis, preparing mapping documents, and designing and developing ETL jobs and sequences.
  • Worked on reverse engineering between existing source and target layers using Oracle SQL.
  • Developed jobs using stages such as Oracle Connector, Copy, Pivot, Funnel, Lookup, Join, Merge, Sort, Transformer, Dataset, Row Generator, Column Generator, and Aggregator.
  • Used the Peek stage extensively for debugging.
  • Developed multi-instance jobs to generate picklist values.
  • Used Checksum and Change Data Capture (CDC) stages for incremental loads.
  • Implemented Type 1 and Type 2 slowly changing dimensions (SCD).
  • Worked with the QA team to help them understand the requirements, and analyzed and fixed the bugs they raised.
  • Worked on the testing of code and fixed defects during the conversion.
  • Involved in design and code reviews and in extensive documentation of standards, best practices, and ETL procedures.
  • Responsible for backing up code, checking it into TFS, and maintaining versions of code changes.
  • Responsible for adopting company standards for stage and link naming conventions.
  • Designed technical documents on delivered functionality/code as required.
  • Reviewed, unit tested, and documented code written by teammates.
  • Resolved defects and issues in the production environment and fixed bugs.
  • Created sequences for running the jobs and logging the row counts into audit tables.
  • Used the Loop stage in a sequence to call a multi-instance job that copies data from one virtual database to another multiple times.
  • Responsible for all activities related to the development, implementation, administration and support of ETL processes for large-scale data warehouses using IBM Information Server.
  • Developed SQL queries to perform DML against the databases.
  • Performed Unit and System Testing.
  • Worked on migrating code from 11.5 to 9.1.
  • Worked on data model changes to add newly required fields to existing tables, and implemented those changes in the existing code.
  • Developed test data, conducted performance testing on the developed modules, and prepared unit test plans and documents.
  • Extensively worked with user-defined SQL to override auto-generated SQL queries in DataStage.
  • Worked in Agile/Scrum methodologies.
  • Identified, tracked, reported, and resolved issues in a timely manner.
  • Fine-tuned jobs and processes for higher performance and debugged complex jobs.
  • Good knowledge of creating users and assigning roles and role hierarchies through the Salesforce front-end application.
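
The Checksum/CDC-based incremental loads above boil down to forwarding only the rows that changed since the previous extract. A file-level sketch of that delta, with invented filenames and mock data:

```shell
# Yesterday's and today's extracts (pipe-delimited; contents are mock data).
printf 'id|name\n1|alice\n2|bob\n'            > extract_prev.txt
printf 'id|name\n1|alice\n2|bobby\n3|carol\n' > extract_curr.txt

# Rows in today's file that match no line of yesterday's file are the
# delta (inserts plus updates) that an incremental load would process.
grep -vxF -f extract_prev.txt extract_curr.txt > delta.txt
cat delta.txt
```

Here the unchanged rows (and the header) drop out, leaving the updated `2|bobby` row and the new `3|carol` row in `delta.txt`.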

Environment: IBM InfoSphere DataStage 11.5/9.1, Toad for Oracle 12.7, UNIX, WinSCP, TFS version control, Jira, ESP scheduler, Salesforce.com.

DataStage ETL Developer

Confidential, Dallas, TX

Responsibilities:

  • Created low-level design documents from the requirements and developed jobs accordingly.
  • Worked on a data acquisition project that extracted data from different sources, processed it, generated files, and transferred those files to target systems.
  • Worked on several change requests created from production incidents and from requirement changes to code in the production environment.
  • Responsible for developing jobs using stages such as ODBC Connector, Oracle Connector, DB2 Connector, Teradata Connector, Transformer, Join, and Sequential File.
  • Developed DataStage parallel and sequence jobs.
  • Developed common jobs, shared containers, and server routines used across the project in most of the interfaces.
  • Used job parameters and stage variables, and created parameter files for flexible job runs based on changing variable values.
  • Imported the required metadata from heterogeneous sources at the process level.
  • Created batches (DS job controls) and sequences to control sets of jobs.
  • Scheduled jobs using Control-M scheduler utility based on the requirements and monitored the production processes closely for any possible errors
  • Created UNIX shell scripts for end-to-end automation: the scripts trigger DataStage jobs, transfer the output files, and perform basic validations on the files.
  • Implemented and hand-coded high-performance DataStage routines.
  • Supported the testing, integration, and reporting teams after ETL data loads.
  • Performed integration and system testing on the ETL jobs.
  • Responsible for generating the DDL statements executed for database creation.
  • Deployed developed code to the SIT and production environments and validated it.
  • Fixed defects raised by the testing team and maintained their status in HP Quality Center.
  • Extensively used SQL tuning techniques to improve the performance of DataStage jobs.
  • Tuned DataStage transformations and jobs to enhance their performance.
  • Provide Post Implementation Support.
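
The "basic validations on file" step above is typically a small shell function: confirm the feed is non-empty and that every row has the expected column count. A sketch with an invented filename and layout:

```shell
# Mock pipe-delimited feed file (two columns: id|amt).
printf 'id|amt\n10|5.00\n11|7.25\n' > feed.dat

# Validate a feed: non-empty, and every row has the expected column count.
validate_feed() {
  f=$1; cols=$2
  [ -s "$f" ] || { echo "EMPTY $f"; return 1; }
  bad=$(awk -F'|' -v c="$cols" 'NF != c' "$f" | wc -l)
  [ "$bad" -eq 0 ] || { echo "BAD_ROWS $bad"; return 1; }
  echo "OK $(awk 'END { print NR }' "$f") rows"
}

validate_feed feed.dat 2
```

A non-zero return code lets the calling automation script halt the load before bad data reaches the target.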

Environment: IBM InfoSphere DataStage 11.3/9.1/8.7, UNIX shell scripting, Oracle 11g, SQL Developer, DB2, Teradata, Cassandra, AQT (for accessing SQL Server and DB2), Office.

Senior DataStage Developer

Confidential, Cincinnati, OH

Responsibilities:

  • Handled production issues and turned new design specifications into ETL code following mapping standards.
  • Extensively used DataStage Designer to develop processes for extracting, transforming, integrating, and loading data from various sources into the data warehouse.
  • Created ETL processes composed of multiple DataStage jobs using the job sequencer, developed shell scripts to automate the processes, and tested them.
  • Extensively worked with databases such as Oracle, DB2, Netezza, SQL Server, and Cassandra to extract data from and load data into one another.
  • Used stages such as Transformer, CDC (Change Data Capture), Remove Duplicates, Aggregator, ODBC, Join, Funnel, Dataset, and Merge to develop different jobs.
  • Involved in the migration of jobs from DB2 to Netezza.
  • Extensively used parallel stages such as Row Generator, Column Generator, and Peek for debugging.
  • Used DataStage Director and its runtime engine to schedule job runs, test and debug job components, and monitor the resulting executables.
  • Documented data sources and transformation rules required to populate and maintain Data Warehouse content.
  • Customized UNIX scripts as required for preprocessing steps and to validate input and output data elements.
  • Developed Multi-Instance reusable Datastage jobs.
  • Used the Netezza Enterprise stage for loads into the Netezza database.
  • Effectively implemented Partitioning and Parallelism techniques to fully utilize the resources and enhance job performance.
  • Wrote SQL scripts to extract and load data from source and target databases.
  • Implemented Type 1 and Type 2 slowly changing dimensions (SCD).
  • Designed and developed various jobs for scheduling and running jobs under job sequencer and DataStage Director.
  • Extensively implemented import/export utilities for migrating code.
  • Replaced transformer stages with other stages to improve performance of job.
  • Attended daily meetings to review the status of the schedule and go through pending issues.
  • Coordinated with Release manager and DA to migrate the components from one environment to another environment. Integrated with other Projects, for sharing the table structures and data.
  • Involved in performance tuning by rewriting the queries and modifying existing Datastage jobs.
  • Involved in production support for production cycle runs and ETL related issues. Communicated data availability with users and management.
  • Documented the purpose of each mapping so that personnel can understand the process and incorporate changes as and when necessary.
  • Also involved in ETL test plans, test scripts, and validation based on design specifications for unit and functional testing.
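
Stripped to its core, the SCD Type 2 logic above expires the current dimension row and inserts a new version whenever a tracked attribute changes. A toy illustration (the actual jobs used DataStage stages for this; the keys and columns below are invented):

```shell
# Active dimension rows and today's source feed (id|name|state; mock data).
printf '1|alice|NY\n2|bob|TX\n' > dim_current.txt
printf '1|alice|NY\n2|bob|CA\n' > source_today.txt

# For each key present in both files with different values, emit an EXPIRE
# action for the old row and an INSERT for the new version (SCD Type 2).
awk -F'|' 'NR == FNR { cur[$1] = $0; next }
           ($1 in cur) && cur[$1] != $0 { print "EXPIRE: " cur[$1]
                                          print "INSERT: " $0 }' \
    dim_current.txt source_today.txt > scd_actions.txt
cat scd_actions.txt
```

Only the changed key (id 2, whose state moved from TX to CA) produces an EXPIRE/INSERT pair; unchanged rows pass through untouched.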

Environment: IBM InfoSphere DataStage 9.1/8.7/8.1, Oracle 10g, Netezza, DB2, SQL, PL/SQL, UNIX shell scripting, DataStage Version Control, MS SQL Server, Control-M 7.5, RTC.

DataStage Developer

Confidential, PA

Responsibilities:

  • Participated in the business analysis development phase and in gathering requirements; worked with the development team to translate business requirements into the data mart design.
  • Broadly involved in data extraction, transformation, and loading (the ETL process) from source to target systems using IBM InfoSphere DataStage 8.7.
  • Worked with the database team on the logical and physical data modeling process using Erwin, and guided business group and user sessions.
  • Developed several jobs that improved performance by reducing runtime through different partitioning techniques.
  • Took part in planning and managing the entire data warehouse migration process.
  • Designed complex job-control processes to manage a large number of jobs.
  • Involved in creating the strategy for star schemas with fact and dimension tables.
  • Translated business requirements into the data mart design in coordination with team members; created fact, dimension, and aggregate tables and loaded the data warehouse tables.
  • Used DataStage Director and the runtime engine to schedule running the parallel jobs, monitoring and debugging its components
  • Used development/debug stages to test the environment by creating samples of data from given high volume data or by creating mock data
  • Developed Shell Scripts for taking backup and recovery of database. Performed physical and logical backup.
  • Used Control-M to schedule the Datastage ETL batch jobs on weekly and monthly basis.
  • Used DataStage Manager to import, create, and edit metadata.
  • Used DataStage Administrator to assign privileges to users or user groups; to move, rename, or delete projects; and to manage or publish jobs from development to production status.
  • Used several stages in the sequencer, such as Abort Job, Wait For Job, and Mail Notification, to build an overall main sequencer and to accomplish restartability.
  • Developed server-side functionality using PL/SQL and UNIX shell programming.
  • Constructed SQL scripts to validate the data after the loading process.
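
Post-load validation like the above is often a source-versus-target row-count comparison driven from a shell script. A hedged sketch with stubbed counts (a real run would pull each count from a database query):

```shell
# Stubbed counts; in practice each would come from a query such as
# "SELECT COUNT(*) FROM <table>" against the source and target databases.
src_count=1500
tgt_count=1500

if [ "$src_count" -eq "$tgt_count" ]; then
  echo "counts match: $src_count rows"
else
  echo "MISMATCH src=$src_count tgt=$tgt_count" >&2
  exit 1
fi
```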

Environment: IBM InfoSphere DataStage 8.7/8.5, SQL Server 2005, Oracle 10g/9i, PL/SQL, Control-M, UNIX, shell scripting.

DataStage Developer

Confidential, Dallas, TX

Responsibilities:

  • Developed Shared containers and Server Routines which are used across the project in most of the interfaces.
  • Worked in Data Migration project.
  • Involved in each phase of the project, end to end.
  • Wrote complex queries to facilitate the supply of data to other teams.
  • Responsible for developing jobs using stages such as Sequential File, Sort, Aggregator, Transformer, and ODBC.
  • Designed jobs using parallel job stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Lookup File Set, Change Data Capture, Modify, and Aggregator.
  • Used DataStage Director to schedule jobs in production for some of the projects.
  • Prepared unit test cases and executed them.
  • Provided support to the testing team and fixed defects raised in different phases of testing.
  • Developed DataStage Parallel and Sequence Jobs.
  • Developed UNIX shell Scripts used to validate the files.
  • Analyzed the existing PL/SQL procedures used to extract data loaded by PeopleSoft.
  • Created error files and log tables containing the data with discrepancies, to analyze and reprocess that data.
  • Tuned DataStage transformations and jobs to enhance their performance.
  • Used Control-M for scheduling jobs and monitoring them.
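
Error-file handling like the above typically routes rows that fail a validation rule to a reject file for later reprocessing. An illustrative sketch (the filenames and the numeric-amount rule are made up):

```shell
# Mock input: id|amount rows, one of which has a non-numeric amount.
printf '10|5.00\n11|oops\n12|7.25\n' > input.dat

# Route valid rows to good.dat and discrepancies to error.dat.
awk -F'|' '$2 ~ /^[0-9]+\.[0-9]+$/ { print > "good.dat"; next }
                                   { print > "error.dat" }' input.dat
cat error.dat
```

The load proceeds with `good.dat`, while `error.dat` feeds the analysis and reprocessing step.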

Environment: IBM InfoSphere DataStage 8.5/8.1, UNIX shell scripting, Oracle 10g/11g, SQL Developer client for Oracle 10g, Office, Visio.

DataStage Developer

Confidential

Responsibilities:

  • Documented proofs of concept and delivered them to clients.
  • Wrote the technical specification of the project.
  • Performed design activities for the batch control architecture, mapping relationships, keys, and indexes.
  • Designed parallel jobs using stages such as Join, Remove Duplicates, FTP, and Filter.
  • Extensively used TOAD for analyzing data, writing SQL and PL/SQL scripts, and performing DDL operations.
  • Profiled samples of data to determine their quality and structure, minimizing overall costs and resources for critical data integration projects.
  • Used development/debug stages to test the environment by creating samples of data from given high volume data or by creating mock data
  • Worked on code fixes and on tickets raised due to job failures.
  • Supported unit, integration, and end-user testing by resolving identified defects.
  • Ensured timely delivery of work items to the client.
  • Involved in implementing ETL standards and best practices within our portfolio.
  • Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Reused logic from DataStage jobs in real time.
  • Extensively used almost all database stages, file stages, processing stages, and development/debug stages
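
Creating mock data for the development/debug stages mentioned above mirrors what a Row Generator stage produces. A tiny sketch (the column layout is invented):

```shell
# Generate five mock id|name|amount rows, as a Row Generator stage would.
awk 'BEGIN { for (i = 1; i <= 5; i++) printf "%d|cust_%d|%d\n", i, i, i * 100 }' > mock.dat
head -n 1 mock.dat   # first generated row
```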

Environment: IBM WebSphere DataStage 8.0.1 (Designer, Director, Administrator, and Manager), DB2, Oracle, Windows Server 2008, SQL*Loader, Control-M, SQL, PL/SQL, Oracle SQL*Plus, UNIX shell scripting, MS SQL Server.
