
Senior DataStage Developer Resume


Tucson, AZ

SKILLS SUMMARY

  • Over 7 years of IT experience with expertise in analysis, design, development and implementation of Data warehouses, data marts and Decision Support Systems (DSS).
  • Extensive experience using ETL tools with RDBMS like Oracle, MS SQL Server, DB2, SAP R/3, and Teradata, plus XML, on Windows and UNIX platforms.
  • Over 5 years of experience in building Data Warehouses/Data Marts using the ETL tool IBM DataStage: Information Server 8.5 (Parallel Jobs), IBM WebSphere DataStage 8.1, and IBM-Ascential DataStage 7.5x2/7.0/6.0 (DataStage Enterprise Edition (PX) and Server Edition).
  • Proficient in understanding business processes/requirements and translating them into technical specifications in the context of Data Mart/EDW and metadata management.
  • Extensive experience in extracting and transforming metadata directly from heterogeneous source systems like flat files, Excel, Sybase, Oracle, Greenplum, and SQL Server, and performing data cleansing, data profiling, data integration, operations, and loading into their metadata repositories.
  • Extensive experience in designing and monitoring jobs using DataStage Designer, DataStage Manager, and DataStage Director.
  • Expertise in data modeling, OLAP/OLTP systems, and generation of surrogate keys; data modeling experience using the Ralph Kimball and Bill Inmon methodologies, implementing Star and Snowflake schemas using the data modeling tool Erwin.
  • Experience in UNIX shell scripting for file manipulation, with strong knowledge of scheduling DataStage jobs using Control-M as well as familiarity with Autosys.
  • Gained expertise in investigating and solving data integrity issues.
  • Effective in cross-functional and global environments (supply chain and logistics, healthcare, financial/wealth management, and sales domains), managing multiple tasks and assignments concurrently.

EDUCATION 
Bachelor of Technology from Confidential University

TECHNICAL SKILLS
ETL: IBM WebSphere DataStage 8.7/8.5.0/8.0.1/7.5.2/7.0/6.0 (DataStage Enterprise Edition, MVS Edition, and Server Edition)
Databases: Oracle 11g/10g/9i/8i/7.x, DB2 UDB 9.0/8.2/7.2/6.1, Teradata, Greenplum, SQL Server 2008 R2
Operating Systems: IBM AIX 5.2/5.1, SunOS 5.8, SuSE Linux V9.0, Windows XP/NT/2000/98
BI Tools: Siebel Analytics 7.x, Business Objects 5.x/6.x, Cognos Impromptu 6.0/7.0/7.1, PowerDesigner 15.0, Crystal Reports 9.0
Languages: C, C++, Visual Basic, SQL, PL/SQL, PostgreSQL, FORTRAN, and COBOL
Version Control: Rational ClearCase, CVS, and DataStage Version Control

WORK EXPERIENCE

Confidential, Tucson, AZ Aug ’12 – Present
Role: Senior DataStage Developer
Project: Mosaic

Confidential is the name of Confidential's Enterprise System Replacement Project (ESRP). The purpose of this project is to update and augment the aging core administrative systems. It is a multi-phase project with five key areas: Employee Administration, Financials (Kuali Financial System, KFS), Human Resources/Payroll, Research Administration, and Business Intelligence.

  • Involved in various roles of Administrator and Developer throughout the project. Worked extensively on the design and development of the data acquisition process for the data warehouse, including the initial load and subsequent refreshes.
  • Prepared the Technical Design Approach document for the UAZ Server-to-Parallel DataStage job migration and migrated around 1,500 DataStage jobs from Server 7.5 to Parallel 8.5, replacing existing server routines with parallel Transformer functions as applicable in DS 8.5.
  • Designed and developed DataStage jobs that handle the initial load and the incremental load for the Financial System’s EPM.
  • Created PX jobs using different stages like Aggregator, Join, Merge, Lookup, Data Set (source), Row Generator, Change Capture, Peek, and Column Generator.
  • Involved in the design and development of both server jobs and parallel jobs to extract data into Oracle, text files, sequential files, and flat files.
  • Created master controlling sequencer jobs using the DS Job Sequencer. Extensively developed and deployed UNIX shell scripts as wrappers that provided values to DataStage jobs at runtime (a sketch of such a wrapper follows this list).
  • Involved in providing technical design review, development plan review, code review, test plans, and results as per best practices of IBM DataStage.
  • Involved in full integration test of all jobs within each sequence before deploying the jobs and sequencers from the Development environment (Dev) to the subsequent environments.
  • Involved in production scheduling to set up jobs in order and provided 24x7 production support.
  • Coordinated tasks with onsite and offshore team members in India.
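A minimal sketch of such a runtime wrapper, assuming the dsjob CLI from the Information Server engine (project, job, and parameter names are hypothetical; adjust the dsenv path for your install):

    #!/bin/ksh
    # Wrapper that resolves values at runtime and passes them to a DataStage job.
    # PROJECT, JOB, and pRunDate are hypothetical placeholders.
    PROJECT=MOSAIC_DEV
    JOB=seq_load_epm
    RUN_DATE=$(date +%Y%m%d)                 # value computed at runtime

    # Source the DataStage environment so dsjob is on the PATH (path varies by install)
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv

    # Run the job with the runtime parameter and wait for its final status
    dsjob -run -mode NORMAL -param pRunDate=$RUN_DATE -jobstatus $PROJECT $JOB
    rc=$?
    # With -jobstatus the exit code mirrors the job status: 1 = OK, 2 = finished with warnings
    if [ $rc -ne 1 ] && [ $rc -ne 2 ]; then
        echo "$JOB failed with status $rc" >&2
        exit 1
    fi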

Environment: IBM DataStage Information Server 8.5 (Parallel Jobs), Ascential DataStage 7.5.2 (Server Jobs), Oracle Business Intelligence Suite Enterprise Edition (OBIEE), Oracle 10g, Toad, SQL Developer 1.5, SQL*Plus, Oracle PeopleSoft Campus Solutions, Windows XP, Linux.

Confidential, OR Sep ’11 – Jul ’12
Senior ETL Developer 
Confidential is an industry-leading, non-asset-based supply chain management company that optimizes the freight management process, delivering an efficient flow of goods from origin to destination. The current project, One World Freight Forwarding, is a data integration project that consolidates data marts at different locations into one single enterprise data warehouse.

Responsibilities:

  • Involved in the One World Freight Forwarding data integration project, using the ETL tool IBM DataStage Information Server 8.5 to extract, transform, sort, and integrate data from SQL Server and load it into the base tables of the warehouse.
  • Involved in understanding and developing business requirements for data consistency, deriving fields from source fields, and providing default field values.
  • Worked alongside data architects, data modelers, and other functional teams to understand and analyze the requirements in order to develop the workflow in the warehouse.
  • Involved in designing and development of common jobs to update common batch status, control, file info, and app parameter tables.
  • Developed an ETL process design in DataStage that extracts data from the source SQL Server and loads it into the target data warehouse.
  • Designed and developed DataStage jobs that handle the initial load and the incremental load for the freight forwarding systems.
  • Designed and worked on parallel jobs using IBM DataStage Information Server 8.5 to denormalize the history tables in the staging area.
  • Created DataStage jobs for data loads into fact and dimension tables using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Sequential File, Dataset, Funnel, Remove Duplicates, Copy, Modify, Filter, and Surrogate Key.
  • Designed sequence jobs in DataStage that extract the data on a daily basis, load it into the target data warehouse, and notify the status of the load upon completion.
  • Created master controlling sequencer jobs and implemented restartability using various checkpoints while automating the entire EDW loading process.
  • Created sanity-check jobs to identify and match the row counts of source and target databases (see the sketch after this list).
  • Developed ETL DataStage jobs to maintain referential integrity and ensured the parent and child relationships were in conjunction with the physical and logical data models. Used jobs to identify and remove duplicate rows using the Remove Duplicates stage.
  • Involved in production support for the developed application and successfully performed knowledge transfer to operations team for further maintenance of the application.
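A minimal sketch of such a sanity check, assuming the extract count is landed in a flat file and the target is Oracle reached via SQL*Plus (connect string, file, and table names are hypothetical):

    #!/bin/ksh
    # Compare the source-extract row count against the loaded target table.
    CONN=etl_user/passwd@EDW                     # hypothetical connect string

    # Row count recorded by the extract job (a single number in a flat file)
    SRC_CNT=$(cat /data/landing/freight_fact.cnt)

    # Row count actually present in the target table
    TGT_CNT=$(echo "set heading off feedback off pagesize 0
    select count(*) from edw.fact_freight;
    exit;" | sqlplus -s $CONN)
    TGT_CNT=$(echo $TGT_CNT)                     # strip stray whitespace

    if [ "$SRC_CNT" -ne "$TGT_CNT" ]; then
        echo "Row count mismatch: source=$SRC_CNT target=$TGT_CNT" >&2
        exit 1
    fi
    echo "Sanity check passed: $SRC_CNT rows"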

Environment: IBM DataStage Information Server 8.5 (Parallel Jobs), MS SQL Server 2008 R2, Oracle RDBMS 10g/11g, SQL Server Management Studio, PowerDesigner 15.0, UNIX (ksh), and Windows 7.

Confidential, Portland, OR 
Senior ETL Developer Mar ’10 – Sep ’11 

Confidential is a nonprofit health insurance company that offers a variety of service and product options, including medical and dental insurance for state employers and individuals. As a part of the RITS A & E Data Management team, I was responsible for building new ETL designs for various member eligibility and health claims processes. I was also a prime resource in efficiently transferring data from the legacy systems to the staging area and subsequently to the data warehouse.
Projects: Coordination of Medical Benefits (COB), Claims Extract to MMC, OMAP Eligibility Extract, MORRI

Responsibilities:

  • Involved in meetings with business analysts and DBAs to gather and understand the business requirements and process flow details in order to plan the ETL extraction and loading.
  • Extracted data using different strategies, running native SQL against relational databases like Oracle, Sybase, and Greenplum as well as flat files, for the multiple projects I was involved in.
  • Enhanced and developed DataStage jobs for data loads into fact and dimension tables using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Sequential File, Dataset, Funnel, Remove Duplicates, Copy, Modify, Filter, and Surrogate Key.
  • Used pgAdmin III and PostgreSQL to access the Greenplum database for extraction and loading purposes in the migration effort.
  • Imported healthcare data from various transactional data sources residing on Facets, Oracle, Sybase, and flat files, and performed null-value handling and data cleansing using null-handling functions and UNIX routines.
  • Developed complex jobs and made sure that each record in all processed files had at least one ETL action indicator in its corresponding ETL audit table, identifying whether the record was an insert, update, no change, or delete, followed by loading into the Oracle database and flat files.
  • Enhanced the reusability of the jobs by making and deploying shared containers and multiple instances of the jobs.
  • Involved in writing UNIX shell scripts for automation, job runs, file processing, initial loads, batch loads, cleanup, job scheduling, and reports in a Linux/UNIX environment.
  • Created audit jobs and generated reconciliation reports balancing the source and target record counts using DSJobReport functions and dsjob commands, reading the job start time, end time, source record count, target record count, reject record count, and number of warning messages from the DataStage Director log (a sketch follows this list).
  • Worked extensively on error handling, data cleansing, and performing lookups for faster access of data.
  • Used lookup stage with reference to Oracle tables for insert/update strategy and updating of slowly changing dimensions.
  • Involved in unit testing, integration testing, and UAT by creating test cases and test plans and helping the DataStage administrator deploy code across the Dev, Test, and Prod repositories.
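A minimal sketch of pulling those run statistics from the Director log with the dsjob CLI (project, job, and report paths are hypothetical; the dsenv path varies by install):

    #!/bin/ksh
    # Append run statistics for a finished job to a reconciliation report.
    PROJECT=CLAIMS_PRD
    JOB=px_load_claims
    REPORT=/reports/recon_$(date +%Y%m%d).txt

    . /opt/IBM/InformationServer/Server/DSEngine/dsenv

    # Job start/end times and final status
    dsjob -jobinfo $PROJECT $JOB              >> $REPORT

    # Detailed job report, including per-link row counts (source, target, reject)
    dsjob -report  $PROJECT $JOB DETAIL       >> $REPORT

    # Warning messages logged during the run
    dsjob -logsum -type WARNING $PROJECT $JOB >> $REPORT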

Environment: IBM WebSphere DataStage and QualityStage 8.0.1, Oracle 10g/11g, Sybase, Greenplum, pgAdmin III, AIX 5.3, Erwin 4.0, SmartCVS 7.0.9, UNIX scripts, IBM Rational ClearQuest Web Client 7.1.1, Toad 9.7.2.5, Harvest 12.0.2, Control-M, TOAD for Data Analysts 2.7.0.

Confidential (ETC), TX Nov ’08 – Mar ’10
DataStage Developer
The REPORT initiative's objective was to provide a standardized means of transforming, calculating, and reporting the customer survey data received from various sources for client reporting purposes. This system provides a standardized reporting process for customer study data. The project objective was to integrate different departments/DBs into the new specialty department DSS data repository. The integration effort was divided into a multiple-phase approach.

Responsibilities:

  • Involved in the entire product life cycle, from requirements gathering through implementation in production.
  • Worked with Data Modeler and DBAs to build the data model and table structures.
  • Involved in analyzing the system requirements, implementing the star schema, and designing and developing the data warehouse environment.
  • Compiled source to target mapping documents to design the ETL jobs.
  • Prepared detailed design documents containing the job designs and the functionality of each job.
  • Developed DataStage extract jobs using Oracle stages and loaded the data into a staging area in the form of flat files/Data Sets.
  • Used Join/Merge/Lookup Stages to replicate integration logic based on the volumes of the data.
  • Extensively used the IBM tools Information Analyzer and QualityStage, including the Investigate, Match, and Survive (survivorship) stages.
  • Worked extensively with Parallel Stages like Row Generator, Column Generator, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates and Transformer Stages.
  • Collaborated with analysts, data modelers, and developers to capture business logic and translated it into DataStage ETL jobs.
  • Extended data profiling capabilities and provided a foundation for data governance initiatives using IBM Information Analyzer, while improving job maintainability.
  • Coordinated tasks with onsite and offshore team members in India.

Environment: IBM WebSphere DataStage and Quality Stage 8.1/8.0.1, UNIX Shell Script, Erwin, SQL*Loader, Oracle 10g, DB2/UDB, Autosys, Windows XP.

Confidential, MI Mar ’08 – Oct ‘08
ETL Developer
Confidential EDW Project consolidates more than ten years of data from over 30 source systems. It includes data related to vehicles, parts, customers, dealers, marketing campaigns, global procurement, and supply performance trends.

Responsibilities:

  • Involved with Business Analysts to understand the Business Requirement Specifications and implemented the ETL jobs using DataStage. Deployed the solutions that maximize the consistency and usability of data.
  • Extensively used DataStage Director for monitoring and debugging of the jobs and Sequences. Used DataStage Manager to Import and Export DataStage components and Import table definitions from source databases.
  • Worked with the SQL Programmer and DB2 QMF tools to extract the data from the source DB2 database and the target Oracle database.
  • Implemented the underlying logic for Slowly Changing Dimensions.
  • Executed pre- and post-session commands on the source and target databases using UNIX shell scripting (sketched below).
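A minimal sketch of such pre- and post-session commands, assuming an Oracle target reached via SQL*Plus (connect string and table names are hypothetical):

    #!/bin/ksh
    # Pre-session: truncate the staging table so the load starts clean.
    # CONN and stg.vehicle_parts are hypothetical placeholders.
    CONN=etl_user/passwd@EDW

    echo "truncate table stg.vehicle_parts;
    exit;" | sqlplus -s $CONN

    # ... the DataStage load job runs here (e.g. via dsjob -run) ...

    # Post-session: refresh optimizer statistics on the freshly loaded table.
    echo "exec dbms_stats.gather_table_stats('STG', 'VEHICLE_PARTS');
    exit;" | sqlplus -s $CONN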

Environment: DataStage 7.5.2 (Parallel Extender EE), Oracle 10g, SQL, PL/SQL, Unix Shell scripts, SQL*Loader, Toad, Sun Solaris 5.8, Windows XP.

Confidential, NY Nov ’06 – Mar ‘08
DW Developer
The data warehousing development for Confidential involves the development of a Data Mart that will feed downstream reports, plus a User Access Tool with which users can create ad-hoc reports and run queries to analyze data in the proposed cube. The current project replicated the functionality incorporated in the Underwriting Measures Cube, but with daily updates, a more robust architecture, and enhanced reliability.

Responsibilities:
  • Aided in the design and development of the logical and physical data models, business rules and data mapping for the Enterprise Data Warehouse system.
  • Involved in the design and development of Data Marts for specific business areas, including Policy, Claim, Enterprise Billing, and eQuote.
  • Coordinated tasks with onsite and offsite team members.
  • Worked with Business Users and designed report layouts.
  • Designed and wrote the tech specs (source-to-target mappings) for the ETL jobs along with the unit test scripts, mapping data items from source systems to target systems.
  • Involved in designing various PX jobs in DataStage per the given specs.
  • Used DataStage EE 7.x to extract data from source systems such as Oracle, flat files, and XML into the target data warehouse.
  • Developed SQL scripts to validate the data after the loading process.
  • Modified and tested PL/SQL stored procedures.
  • Developed UNIX Korn shell scripts to load the flat files from the source system into the tables in the staging area (sketched below).
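A minimal sketch of such a Korn shell load, assuming SQL*Loader with a prebuilt control file (directories, control file, file pattern, and connect string are hypothetical):

    #!/bin/ksh
    # Load each flat file from the source system into the staging table,
    # then archive the file so it is not picked up twice.
    LANDING=/data/landing
    ARCHIVE=/data/archive
    CTL=/etl/ctl/stg_policy.ctl        # control file mapping fields to staging columns

    for f in $LANDING/policy_*.dat; do
        [ -f "$f" ] || continue        # skip when no files match the pattern
        sqlldr userid=etl_user/passwd@DWH control=$CTL data=$f \
               log=$f.log bad=$f.bad errors=0
        if [ $? -eq 0 ]; then
            mv "$f" $ARCHIVE/
        else
            echo "SQL*Loader failed for $f" >&2
        fi
    done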

Environment: DataStage 7.5.3, Oracle 9i, DB2/UDB, UNIX AIX, Unix Shell Scripting, PL/SQL, SQL, PVCS, TOAD, ERWIN

Confidential Apr ’05 – Sep ‘06
Database Developer/Analyst
The aim of the project was to gather a great deal of medical data related to its members and participating independent medical facilities. The data came from several of its databases as flat files; the task was to integrate the data into an Apollo database to provide a single vision and single platform that enables healthcare providers to analyze and use the information to improve the healthcare services they provide.

Responsibilities:

  • Created database objects such as tables, views, synonyms, indexes, sequences and database links as well as custom packages tailored to business requirements.
  • Used PL/SQL features such as stored procedures, functions, packages and database triggers for maintaining complex integrity constraints and implementing the complex business rules.
  • Responsible for extracting, transforming, and loading data using UTL_FILE and SQL*Loader.
  • Involved in writing UNIX shell scripts.

Environment: Oracle 9i, UNIX Shell Scripting, SQL*Loader, PL/SQL, TOAD, Windows 2000, UNIX.
