
ETL/DataStage Developer Resume


Minneapolis, MN

SUMMARY

  • 8 years of IT experience in the development, implementation and testing of database/data warehousing applications for the financial industry, covering data extraction, transformation, loading and analysis.
  • Proficient in data warehousing with Ascential DataStage, QualityStage, ProfileStage and AuditStage.
  • Experience integrating various data sources such as XML, mainframe COBOL files, flat files, Oracle and SQL Server into the data warehouse.
  • Extensive experience loading high-volume data and performance tuning.
  • Hands-on experience with the DataStage client components - Designer, Director and Manager.
  • Experience with UNIX shell scripting for file validation.
  • Experience designing and developing complex DataStage jobs, routines and sequencers.
  • Hands-on experience writing, testing and implementing triggers, procedures and functions at the database level using PL/SQL.
  • Worked with scheduling tools such as Autosys and Control-M.
  • Experience in 24/7 production support for various projects.
  • Experienced with all phases of the software development life cycle, including business analysis, design, development, implementation and support of software applications.
  • Followed up on the deployment of DataStage code migration across environments (Development, Test and Production) with the admin team.
  • Involved in the design and implementation of data warehouse fact and dimension tables (star schema), with identification of measures and hierarchies.
  • Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
  • Highly adaptive team player with a proven ability to work in a fast-paced environment and excellent communication skills.
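As an illustration of the shell-based file validation mentioned above, a minimal sketch (the trailer layout and file names are assumptions for illustration, not taken from any actual project):

```shell
#!/bin/sh
# Minimal file-validation sketch. Assumed layout: a pipe-delimited feed
# whose last line is a trailer "TRL|<expected detail row count>".

validate_feed() {
    feed="$1"
    expected=$(tail -1 "$feed" | cut -d'|' -f2)     # count carried in the trailer
    actual=$(( $(wc -l < "$feed") - 1 ))            # detail rows = total lines - trailer
    if [ "$actual" -eq "$expected" ]; then
        echo "PASS: $feed has $actual detail rows"
    else
        echo "FAIL: expected $expected, found $actual" >&2
        return 1
    fi
}

# Tiny sample feed: three detail rows plus a trailer
printf 'A|1\nB|2\nC|3\nTRL|3\n' > /tmp/sample_feed.dat
validate_feed /tmp/sample_feed.dat
```

A script along these lines would typically run before the load job is triggered, so that short or truncated feeds are rejected up front.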

TECHNICAL SKILLS

ETL Tools: IBM InfoSphere Information Server 9.1/8.7/8.5/8.1/8.0.1, Ascential DataStage 7.5, Parallel Extender (Orchestrate), QualityStage, Information Analyzer (ProfileStage), FastTrack, AuditStage, Business Glossary, Metadata Workbench, pgAdmin III.

Databases: Oracle 11g/10g, SQL Server 2008

Languages: SQL, UNIX Shell Scripts, C, C++, HTML, XML, VB 5.0/6.0, .NET

Data Modeling: Erwin 3.5.1/4.2, Power Designer 6.0/9.5, MS Visio

Other Software: TOAD, MS Office, Pro*C, SecureCRT, SQL Developer

Versioning Tool: PVCS, SVN

Scheduling Tools: Autosys, Control-M, Tivoli Workload Scheduler

Operating Systems: IBM AIX 5.2, HP-UX 10.2, Windows 9x/NT/2000/XP, Windows Server 2003/2008, Solaris 2.8/SunOS 5.8, Red Hat Linux AS

PROFESSIONAL EXPERIENCE

Confidential, Minneapolis, MN

ETL/Data Stage Developer

Responsibilities:

  • Prepared the required application design documents based on the functionality required.
  • Used DataStage Parallel Extender stages such as Sequential File, Lookup, Change Capture, Funnel, Transformer, Column Export and Row Generator in the ETL coding.
  • Developed Teradata SQL queries to load data from Teradata staging into the enterprise data warehouse.
  • Extensively worked on error handling and delete handling.
  • Designed and developed jobs for extracting, transforming, integrating and loading data using DataStage Designer.
  • Used DataStage Director and its run-time engine to monitor running jobs.
  • Involved in performance tuning and optimization of DataStage mappings, using features such as partitioning and data/index cache to manage very large data volumes.
  • Involved in unit testing and system testing to verify that data extracted from the different source systems loaded into the target according to user requirements.
  • Extracted data from the data warehouse using Business Objects for reporting purposes.
  • Used surrogate keys to track Slowly Changing Dimensions (SCD).
  • Extensively used the Pivot stage to reshape source data into the required table structures, such as converting rows into columns.
  • Used Tivoli Workload Scheduler extensively to schedule and monitor jobs.
  • Developed unit test plans and was involved in system testing.
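The row-to-column reshaping that the Pivot stage performs can be sketched in shell with awk; the key|attribute|value input layout here is hypothetical:

```shell
#!/bin/sh
# Vertical-pivot sketch: collapse key|attribute|value rows into one
# line per key. The input layout is hypothetical.

pivot() {
    awk -F'|' '
        { vals[$1] = vals[$1] "|" $3 }          # append each value to its key
        END { for (k in vals) print k vals[k] } # one output line per key
    ' "$1" | sort
}

printf '101|jan|10\n101|feb|20\n102|jan|5\n102|feb|7\n' > /tmp/rows.dat
pivot /tmp/rows.dat
# 101|10|20
# 102|5|7
```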

Environment: IBM InfoSphere Information Server 9.1/8.7/8.5, HP-UX, Teradata, C/C++ and Tivoli Workload Scheduler

Confidential, Tampa, FL

ETL/Data Stage Developer

Responsibilities:

  • Met with the Business Analysts to gather, analyze and implement requirements, and prepared specification documents for the EDW process.
  • Worked with the Project Lead, Technical Lead and functional analysts to understand the functional requirements and design the technical specifications from them.
  • Developed jobs that generate output CSV files.
  • Extensively worked with plug-in stages such as Stored Procedure.
  • Extensively worked with the Join, Lookup and Merge stages.
  • Extensively worked with the Sequential File, Data Set, File Set and Lookup File Set stages.
  • Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging.
  • Created parallel, server and sequence jobs as a DataStage developer.
  • Used the client components Designer, Director and Administrator.
  • Created the user variables required for the jobs.
  • Used UNIX scripts for moving files, scheduling jobs and removing nulls from flat files.
  • Implemented performance-tuning techniques at various stages of the ETL process.
  • Followed up on the deployment of DataStage code migration across environments (Development, Test and Production) with the admin team.
  • Maintained the data warehouse by loading dimensions and facts as part of the project, and worked on various enhancements to the fact tables.
  • Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
  • Extracted data from an Oracle database, transformed it, and loaded it into a Teradata database according to the specifications.
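The null-stripping step mentioned in the UNIX scripting bullet can be sketched as follows (the feed name is illustrative):

```shell
#!/bin/sh
# Sketch: strip embedded NUL bytes from a flat file before DataStage
# reads it. The file name is illustrative.

clean_flat_file() {
    tr -d '\000' < "$1" > "$1.clean" && mv "$1.clean" "$1"
}

printf 'abc\000def\nghi\000\n' > /tmp/feed.dat
clean_flat_file /tmp/feed.dat
cat /tmp/feed.dat
# abcdef
# ghi
```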

Environment: IBM InfoSphere Information Server 9.1/8.7/8.5, HP-UX, Oracle 11g, C/C++, PVCS, SecureCRT, Red Hat Linux AS

Confidential, Tampa, FL

ETL/Data Stage Developer

Responsibilities:

  • Met with the Business Analysts to gather, analyze and implement requirements, and prepared specification documents for the EDW process.
  • Worked with the Project Lead, Technical Lead and functional analysts to understand the functional requirements and design the technical specifications from them.
  • Developed jobs that generate output CSV files.
  • Extensively worked with plug-in stages such as Stored Procedure.
  • Extensively worked with the Join, Lookup and Merge stages.
  • Extensively worked with the Sequential File, Data Set, File Set and Lookup File Set stages.
  • Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging.
  • Created parallel, server and sequence jobs as a DataStage developer.
  • Used the client components Designer, Director and Administrator.
  • Created the user variables required for the jobs.
  • Used UNIX scripts for moving files, scheduling jobs and removing nulls from flat files.
  • Implemented performance-tuning techniques at various stages of the ETL process.
  • Followed up on the deployment of DataStage code migration across environments (Development, Test and Production) with the admin team.
  • Maintained the data warehouse by loading dimensions and facts as part of the project, and worked on various enhancements to the fact tables.
  • Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
  • Extracted data from an Oracle database, transformed it, and loaded it into a Teradata database according to the specifications.

Environment: IBM InfoSphere Information Server 8.7/8.5, HP-UX, Oracle 11g, C/C++, PVCS, SecureCRT, Red Hat Linux AS

Confidential, Detroit, MI

DataStage Developer

Responsibilities:

  • Involved with Business users and ETL Leads from different teams to implement ETL Frame Work using DataStage Server/PX combination of jobs.
  • Sourced data from DB2 UDB, flat files and CSV files.
  • Designed jobs using different parallel job stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Change Data Capture, Switch, Modify, Aggregator, DB2 Enterprise, and DB2 API.
  • Involved in developing DataStage Designer- Server and PX jobs for Extracting, Cleansing, Transforming, and Integrating /Loading Data into Data Warehouse
  • Developed User Defined subroutines using Universe BASIC to implement some of the complex transformations, date conversions, code validations and calculations using various DataStage supplied functions and routines.
  • Developed Job Sequencers with restart capability for the designed jobs using Job Activity, Exec Command, E-Mail Notification Activities and Triggers.
  • Extensively designed, developed and implemented Parallel Extender jobs using Parallel Processing (Pipeline and partition parallelism), Restartability techniques to improve job performance while working with bulk data sources.
  • Created projects using DataStage Administrator.
  • Changed user group assignments.
  • Unlocked the jobs from administrator and director.
  • Extensively used DataStage Director to Monitor and check the run statistics of the Jobs.
  • Extensively used DataStage Manager to Export/import DataStage components.
  • Extensively used SQL tuning techniques to improve the database read performance through DataStage Jobs and used Frame Work approach to improve transformation and loading steps.
  • Involved in Unit Testing, System Testing, Integration and Performance Testing of the jobs.
  • Involved in the creation and execution of test plans, test scripts and job flow diagrams.
  • Worked closely with Data Quality Analysts and Business Users for data accuracy and consistency after table loads.
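The post-load accuracy checks done with the Data Quality Analysts can be sketched as a row-count reconciliation; in practice the counts would come from DB2/Oracle queries, so plain files stand in for them here:

```shell
#!/bin/sh
# Post-load reconciliation sketch: compare the source extract's row
# count with the count reported by the load. Files stand in for the
# database queries that would be used in practice.

reconcile() {
    src_cnt=$(( $(wc -l < "$1") ))   # rows in the source extract
    tgt_cnt=$(cat "$2")              # row count the load job reported
    if [ "$src_cnt" -eq "$tgt_cnt" ]; then
        echo "RECONCILED: $src_cnt rows"
    else
        echo "MISMATCH: source=$src_cnt target=$tgt_cnt" >&2
        return 1
    fi
}

printf 'r1\nr2\nr3\n' > /tmp/extract.dat   # sample source extract
echo 3 > /tmp/load_count.txt               # sample target count
reconcile /tmp/extract.dat /tmp/load_count.txt
```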

Environment: IBM InfoSphere Information Server 8.1, Oracle 10g, DB2 UDB 9.1 Enterprise Edition, Red Hat Linux, Autosys 4.5, Connect:Direct, PuTTY, Microsoft Visio, Microsoft Project Server, Microsoft Portal, ClearQuest, Microsoft Office (Excel, Word and PowerPoint), Acrobat Distiller, ClearCase.

Confidential

Jr. DataStage Developer

Responsibilities:

  • Extensively used DataStage Designer to develop various jobs to extract, cleanse, transform, integrate and load data into Database (DB2 UDB RDBMS).
  • Worked with DataStage Manager to import/export metadata from database, jobs and routines between DataStage projects
  • Used DataStage Director to schedule, monitor, cleanup resources and run job with several invocation ids.
  • Used DataStage Administrator to control purging of the repository, manage DataStage client applications and the jobs they run, clean up resources, execute TCL commands, and move, manage and publish jobs from development to production.
  • Wrote batch jobs to automate the system flow using DS job controls, with restartability of jobs within a batch.
  • Developed user defined Routines and Transforms by using DataStage Basic language
  • Used TCL commands in DS Jobs to automate Key Management of surrogate keys and used DataStage Command Language 7.0
  • Developed DataStage server jobs using the Sequential File, Hashed File, Sort, Aggregator, Transformer, ODBC and Link Collector/Partitioner stages.
  • Designed several DataStage parallel jobs using the Sequential File, Lookup File Set, Join, Merge, Lookup, Change Apply, Change Capture, Remove Duplicates, Funnel, Filter and Pivot stages.
  • Extensively used Teradata load and unload utilities such as MultiLoad, FastExport and bulk-load stages in jobs for loading and extracting huge data volumes.
  • Developed various SQL scripts using Teradata SQL Assistant; used some in DS jobs with the BTEQ utility and others in Teradata stages as SQL overrides.
  • Involved in Unit testing, System and Integration testing
  • Wrote UNIX shell Scripts for file validation and scheduling DataStage jobs
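The surrogate-key management mentioned above (handled in the jobs themselves with DataStage key-management routines) can be sketched with a simple counter file; the file names here are made up:

```shell
#!/bin/sh
# Surrogate-key sketch: a counter file holds the last key issued, and
# each call hands out the next value. The file name is made up.

next_key() {
    last=$(cat "$1")
    next=$((last + 1))
    echo "$next" > "$1"     # persist the new high-water mark
    echo "$next"
}

echo 100 > /tmp/last_key.txt
k1=$(next_key /tmp/last_key.txt)
k2=$(next_key /tmp/last_key.txt)
echo "$k1 $k2"
# 101 102
```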

Environment: DataStage 7.1/7.5/8.1, Teradata V2R5.0, IBM AIX 4.1.8
