
Sr. ETL Developer Resume Profile

NY

Professional Summary

  • Experience in programming using SQL, PL/SQL, and UNIX Shell Scripting.
  • Experience in designing Batch Jobs and Job Sequences for scheduling server and parallel jobs using DataStage Director, Enterprise Scheduling tools, UNIX scripts.
  • Expertise in troubleshooting and enhancing DataStage jobs and in addressing post-production activities such as performance tuning.
  • Experience in source systems analysis and data extraction from various sources like Flat files, Oracle 10g/9i/8.x, DB2 UDB, DB2 AS400, Mainframe files/COBOL Copybooks, PeopleSoft and SAP systems.
  • Created framework, best-practice, standards, and naming-convention documents used in the ETL environment.
  • Good knowledge of Netezza database architecture and working knowledge of utilities such as nzload and nzsql.
  • Excellent understanding of Teradata V12/V2R6/V2R5 Architecture and expertise in using the utilities.
  • Extensive experience in IBM DB2 UDB 9.7/9.0/8.0, Oracle 10g/9i/8i, Microsoft SQL Server 2005/2000, Teradata, Netezza databases.
  • Experience in Relational Database Modeling and Dimensional Modeling-Star schema, Snow-Flake schema, and hands on experience in data modeling tools like ERwin and ER Studio.
  • Worked with business users to elicit business requirements and formulate technical requirements.
  • Involved in all the phases of the SDLC from Requirement gathering, Design, Development, Unit testing, UAT, Production, Maintenance and support.
  • Teradata Certified Professional V2R5.
  • IBM Certified for IBM InfoSphere Information Server 8.5.
  • 7 years of experience in IBM InfoSphere Information Server 9.1/8.7/8.5/8.1/7.5 and Ascential DataStage 7.x using components such as DataStage/QualityStage Designer, Director, Administrator, and Information Analyzer.
  • Over 8 years of experience in the software industry, including requirement gathering, analysis, design, development, testing, data analysis, implementation, maintenance, and production support of various Data Warehousing and OLAP applications using ETL tools.
  • Excellent communication and interpersonal skills; a flexible, self-directed, energetic team player with strong problem-solving skills.

Technical Skills:

  • Programming Languages: SQL, PL/SQL, UNIX Shell Script, Perl Script, C, Python, Java, HTML, CSS
  • ETL Tools: DataStage 9.1/8.7/8.5/8.1/8.0.1/7.5.2/7.1 (Administrator, Manager, Designer, Director, Parallel Extender/Orchestrate), Information Server, QualityStage; ClearCase 9.1
  • Reporting Tools: MicroStrategy 9.0/9.0.1, Cognos 10.1, Business Objects
  • Tools/Utilities: Microsoft Office Suite, Citrix, SQL Assistant, Advanced Query Tool, TOAD, Control-M, MS Visio, Oracle SQL Developer
  • RDBMS: Teradata, Netezza TwinFin, Oracle 11g/10g/9i/8i/7.3, SQL Server 2000/2005/2008, DB2, MS Access
  • Operating Systems: Windows 8/7/XP/2000, Linux, UNIX, Sun Solaris

Professional Experience:

Confidential

Sr. ETL Developer

The Unemployment Insurance System Improvement (UISIM) Project at the NYS Department of Labor is an initiative to modernize the current Unemployment Insurance system. The programs that support the nearly 3.5 billion New York State Unemployment Insurance program are not adaptable to current technologies, and the new system to be built focuses on meeting New York State's standards. The objective of the project is to develop a system able to support the Unemployment Insurance program.

Responsibilities:

  • Communicated with business analysts to obtain ETL requirements from the mainframe programs.
  • Involved in understanding the scope of the application, the existing schema and data model, and in defining relationships within and between groups of data.
  • Responsible for designing, developing and building DataStage parallel jobs using DataStage designer.
  • Extensively used Reject Link, Job Parameters, Parameter Sets and Stage Variables in developing jobs
  • Designed parallel jobs using Sequential File, Dataset, XML, Join, Merge, Lookup, Surrogate Key Generator, Funnel, Filter, Copy, Column Import, Peek, Oracle Connector, Aggregator, Transformer Stages.
  • Used DataStage Director to identify errors in sequence and parallel jobs
  • Coded UNIX wrapper scripts to trigger ETL jobs (a sketch follows this list).
  • Created jobs for managing batch requirements and loads into batch processing tables.
  • Worked on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs.
  • Extensively worked on error handling, cleansing data, creating lookup files, and performing lookups for faster data access. Used DataStage Designer for importing metadata from the repository, creating new job categories, and creating new data elements.
  • Documented the design, sequence and functioning of ETL jobs
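
Below is a minimal sketch, assuming a ksh environment, of the kind of UNIX wrapper used to trigger a DataStage job through the dsjob command-line interface; the project name, job name, and parameter name are hypothetical placeholders rather than values from the project.

```sh
#!/bin/ksh
# Hypothetical wrapper: runs a DataStage parallel job via dsjob and checks its status.
PROJECT="UISIM_DEV"              # assumed project name
JOB="jb_load_claims"             # assumed job name
RUN_DATE=${1:-$(date +%Y%m%d)}   # business date passed in as a job parameter

# Trigger the job; -jobstatus makes dsjob wait and return the job's finishing status.
dsjob -run -mode NORMAL -param pRunDate=$RUN_DATE -jobstatus $PROJECT $JOB
RC=$?

# With -jobstatus, 1 means finished OK and 2 means finished with warnings.
if [ $RC -eq 1 ] || [ $RC -eq 2 ]; then
    echo "$JOB completed (status $RC)"
    exit 0
else
    echo "$JOB failed (status $RC)" >&2
    exit 1
fi
```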

Environment: IBM InfoSphere DataStage and QualityStage 9.1, ClearCase 9.1, Oracle 11g, SQL, PL/SQL Developer, XML, Windows 7, UNIX AIX 6.1/Solaris 9.

Confidential

Sr. DataStage Developer

Confidential serves as the company's headquarters for the Americas and employs nearly 3,000 people. Astellas is dedicated to improving the health of people around the world through the provision of innovative and reliable pharmaceutical products.

Responsibilities:

  • Worked closely with Business analysts and Business users to understand the requirements and to build the technical specifications.
  • Responsible for creating source-to-target (STT) mapping documents.
  • Responsible for designing, developing, and building DataStage parallel jobs using DataStage Designer.
  • Developed and supported the Extraction, Transformation, and Load (ETL) processes for a data warehouse from disparate data sources using DataStage Designer.
  • Designed and developed parallel jobs to extract, cleanse, and transform data and to load the target tables using DataStage Designer.
  • Designed and developed job sequences to run multiple jobs in a predefined order to populate data warehouse tables.
  • Used DataStage Designer for importing the source and target database schemas, importing and exporting jobs/projects, creating new job categories and table definitions.
  • Designed and developed the Routines and Job Sequence for the ETL jobs
  • Prepared Technical Specification Documents for DataStage Jobs.
  • Involved in Unit testing and integration testing.
  • Responsible for running jobs using Job Sequencers, Job Batches.
  • Developed Parallel jobs using various stages like Join, Merge, Funnel, Lookup, Sort, Transformer, Copy, Remove Duplicate, Filter, Peek, Column Generator, Pivot and Aggregator stages for grouping and summarizing on key performance indicators used in decision support systems
  • Implemented logical and physical data modeling with Star and Snowflake schema techniques using ERwin for the data warehouse.
  • Extensively worked with Netezza utilities such as nzload and nzsql to transform and load data (see the sketch after this list).
  • Extensively used Reject Links, Job Parameters, Parameter Sets, and Stage Variables in developing jobs.
  • Used the DataStage Director to run, schedule, monitor, and test the application on development, and to obtain the performance statistics.
  • Developed packages, customized functions, and triggers based on the business logic.
  • Involved in performance tuning of complex queries.
  • Developed Oracle PL/SQL stored procedures, functions, packages, and SQL scripts to support the functionality of various modules.
  • Researched and developed functional specifications for an ecommerce project based on best market practices and contributed to the business growth
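
Below is a minimal sketch, assuming the Netezza client utilities are on the path and credentials are supplied through the standard NZ_USER/NZ_PASSWORD environment variables, of how nzload and nzsql might be driven from a shell script; the database, table, and file names are hypothetical.

```sh
#!/bin/ksh
# Hypothetical Netezza load: bulk-load a pipe-delimited extract, then verify the row count.
NZ_DB="EDW_STG"                                       # assumed database
NZ_TABLE="STG_SALES"                                  # assumed staging table
DATA_FILE="/data/extracts/sales_$(date +%Y%m%d).dat"  # assumed extract file

# nzload streams the flat file into the target table; rejected rows go to the bad file.
nzload -db $NZ_DB -t $NZ_TABLE -df $DATA_FILE -delim '|' \
       -lf /logs/${NZ_TABLE}.log -bf /logs/${NZ_TABLE}.bad || exit 1

# nzsql runs an ad hoc query to confirm what was loaded.
nzsql -d $NZ_DB -c "SELECT COUNT(*) FROM $NZ_TABLE;"
```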

Environment: IBM InfoSphere DataStage and QualityStage 8.5/8.1.1, Oracle 10.2, Oracle SQL Developer, Netezza TwinFin, Aginity Netezza Workbench, XML, Autosys, Windows XP, Red Hat Linux 6.5.2.

Confidential

Sr. ETL Developer

Confidential provides financial services through over 175 bank locations in and around Tennessee. Individual banking services include checking accounts, savings and CDs, loans and credit cards, investments and retirement, online services, financial planning, and insurance. West Point, PA

Responsibilities:

  • Interacted with Business Analysts to gather ETL requirements
  • Used DataStage Designer for importing metadata from the repository, creating new job categories, and creating new data elements.
  • Designed and developed ETL jobs according to business requirements using DataStage to load the data warehouse and data marts.
  • Used SAP R/3 Load IDoc Stage to load data into different SAP target Tables.
  • Ran UNIX shell scripts to concatenate files before loading data into target tables (see the sketch after this list).
  • Wrote custom batch job control to run multiple instances of jobs for parallel job processing.
  • Loaded data into target data warehouse tables using Oracle PL/SQL stored procedures.
  • Analyzed data from multiple source systems
  • Initiated meetings between the Development and Architecture teams.
  • Created and documented ETL mapping documents to implement business logic.
  • Extensively involved in Developing Shared Containers.
  • Used the Complex Flat File stage to read COBOL copybooks and load data into SQL Server tables.
  • Used Director for executing, analyzing logs and scheduling the jobs.
  • Extensively worked with Team to design complex DataStage jobs.
  • Identifying and addressing the data quality issues.
  • Optimized and tuned code to improve performance.
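
Below is a minimal sketch, assuming ksh and SQL*Plus, of a pre-load step that concatenates branch extract files and then calls an Oracle PL/SQL load procedure; the directories, connection variables, and the dw_load_pkg.load_target_tables procedure are hypothetical.

```sh
#!/bin/ksh
# Hypothetical pre-load step: merge per-branch extract files into one feed,
# then call a PL/SQL stored procedure that loads the target warehouse tables.
EXTRACT_DIR="/data/inbound/branches"
MERGED_FILE="/data/staging/branch_feed.dat"

# Concatenate all branch files into a single file for the load to pick up.
cat ${EXTRACT_DIR}/branch_*.dat > ${MERGED_FILE} || exit 1

# Invoke the (assumed) load procedure; credentials come from the job environment.
sqlplus -s ${ORA_USER}/${ORA_PASS}@${ORA_SID} <<EOF
WHENEVER SQLERROR EXIT FAILURE
EXEC dw_load_pkg.load_target_tables;
EXIT;
EOF
```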

Environment: IBM InfoSphere DataStage and QualityStage 8.0.1 / Ascential DataStage 7.5 Parallel Extender, SAP R/3, SQL Server 2000, Ascential Version Control, DB2 UDB, AQT.

Confidential

Datastage Developer

Responsibilities:

  • Interacted with Business Users to understand the business requirements and in identifying data sources.
  • Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
  • Used DataStage stages such as Column Export, Column Import, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator in the ETL coding.
  • Developed job sequence with proper job dependencies, execute command stages, user variable activities, job control stages, triggers.
  • Used Zeke job scheduler for automating the monthly regular run of DW cycle in both production and UAT environments.
  • Reviewed reports on Mainframe TSO ISPF environment and allocated datasets on the mainframe using Classic Federation jobs.
  • Created shared containers to simplify job design.
  • Involved in performance tuning of the DataStage jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, functional, and regression testing; prepared test data and handled error analysis (a scripted validation sketch follows this list).
  • Worked on the change management system for code migrations from Dev to QA to Prod environments.
  • Extensively worked on building ETL interfaces to read and write DB2 data using the DB2 Enterprise and DB2 API stages.
  • Involved in functional and technical meetings and responsible for creating ETL Source to Target mapping documents.
  • Modified existing jobs and data sets according to changing business rules.
  • Developed jobs for transforming data using stages such as Join, Merge, Lookup, Funnel, Transformer, Pivot, and Aggregator.
  • Experience with the Zeke scheduling tool for automating the ETL process.
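
Below is a minimal sketch, assuming the DB2 command-line processor and a catalogued database alias, of the kind of scripted row-count validation used during unit and system testing; the alias, credentials, and table names are hypothetical.

```sh
#!/bin/ksh
# Hypothetical post-load check: compare staging and target row counts in DB2
# after an ETL interface run.
DB_ALIAS="DWDB"    # assumed catalogued database alias

db2 connect to $DB_ALIAS user $DB2_USER using $DB2_PASS > /dev/null || exit 1

SRC_CNT=$(db2 -x "SELECT COUNT(*) FROM STG.CUSTOMER")     # assumed staging table
TGT_CNT=$(db2 -x "SELECT COUNT(*) FROM DW.CUSTOMER_DIM")  # assumed target table

db2 connect reset > /dev/null

if [ $SRC_CNT -eq $TGT_CNT ]; then
    echo "Row counts match: $SRC_CNT"
else
    echo "Row count mismatch: source=$SRC_CNT target=$TGT_CNT" >&2
    exit 1
fi
```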

Environment: Ascential DataStage 7.5 (Designer, Director, Manager, and Administrator), Test Director, ClearCase, Zeke, K-shell scripts, Mainframe TSO/ISPF, IBM DB2 9.1, AIX 5.3, WinSQL for DB2, PeopleSoft 9.2

Confidential

Data Warehouse Analyst

Responsibilities:

  • Coordinated the requirement gathering stage using Rational Requisite Pro which included interviews, brainstorms and meetings with stakeholders
  • Extensively used DataStage Manager, Designer, Administrator and Director for creating and implementing jobs.
  • Extensively worked on error handling, cleansing of data, creating lookup files and performing lookups for faster access of data.
  • Used Dynamic RDBMS stage to pull the data from source.
  • Used DataStage Manager to import the Metadata from sources and targets.
  • Involved in creating technical documentation for source to target mapping procedures to facilitate better understanding of the process and incorporate changes as and when necessary.
  • Tuned DataStage transformations and jobs to enhance performance.
  • Developed DataStage jobs based on business requirements using various stages like lookup file, lookup stage, join stage, merge stage and sort stage.
  • Used Shared Containers for code reuse and implementing complex business logic.
  • Involved in creating design and mapping documents for DataStage jobs.
  • Involved in writing shell scripts for reading parameters from files and invoking DataStage jobs.
  • Extensively worked with Teradata utilities such as TPump, MultiLoad, FastLoad, and FastExport (a FastLoad sketch follows this list).
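
Below is a minimal sketch, assuming ksh and the Teradata FastLoad client, of a wrapper that reads connection parameters from a file and loads a pipe-delimited extract into an empty staging table; the parameter file, TDPID, and table names are hypothetical.

```sh
#!/bin/ksh
# Hypothetical FastLoad wrapper: connection settings are sourced from a
# key=value parameter file instead of being hard-coded in the script.
. /etc/etl/td_load.params     # assumed to set TDPID, TD_USER, TD_PASS, TD_DB

fastload <<EOF
LOGON ${TDPID}/${TD_USER},${TD_PASS};
DATABASE ${TD_DB};

/* Delimited input: all fields are defined as VARCHAR for VARTEXT records. */
SET RECORD VARTEXT "|";
DEFINE acct_id   (VARCHAR(18)),
       acct_name (VARCHAR(60)),
       open_dt   (VARCHAR(10))
FILE = /data/extracts/account.dat;

/* FastLoad requires an empty target table and two error tables. */
BEGIN LOADING ${TD_DB}.STG_ACCOUNT ERRORFILES ${TD_DB}.STG_ACCOUNT_E1, ${TD_DB}.STG_ACCOUNT_E2;
INSERT INTO ${TD_DB}.STG_ACCOUNT (acct_id, acct_name, open_dt)
VALUES (:acct_id, :acct_name, :open_dt);
END LOADING;
LOGOFF;
EOF
```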

Environment: Ascential DataStage 7.5 (Designer, Manager, Director), flat files, Microsoft SQL Server 2005/2000, Teradata V2R5, Oracle 8i, shell scripting, Sybase, HP-UX, Windows 2000

Confidential

PL/SQL Developer

Responsibilities:

  • Assisted the PM in developing Use Cases and project plans and also managed changes to the scope of the project.
  • Conducted user interviews, gathered Requirements, analyzed the Requirements and managed changes using Rational Suite.
  • Conducted interviews with end-users to collect requirement and business process information.
  • Involved in creating High-Level and Low-Level Design Documents for Oracle ETL process development.
  • Developed complex SQL queries and PL/SQL scripts per requirements.
  • Involved in the development of tables, keys, constraints, sequences, global temporary tables, views, and materialized views per the project design.
  • Developed UNIX Shell scripts per project requirements.
  • Used SQL*Loader to perform load operations for ad hoc tasks (see the sketch after this list).
  • Wrote database triggers for audit and data validation.
  • Designed the entity-relationship model and normalized database.
  • Resized data files and managed tablespaces and data files.
  • Reviewed and participated in developing test cases with the QA team.
  • Fixed defects logged in Test Director by the QA team.
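
Below is a minimal sketch, assuming ksh and the Oracle client tools, of an ad hoc SQL*Loader run with the control file generated inline; the table, columns, and file paths are hypothetical.

```sh
#!/bin/ksh
# Hypothetical ad hoc load: build a SQL*Loader control file and load a CSV
# into an Oracle staging table.
CTL_FILE=/tmp/load_stg_customer.ctl

cat > ${CTL_FILE} <<'EOF'
LOAD DATA
INFILE '/data/adhoc/customer.csv'
APPEND INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_name, city, state)
EOF

# Run the load; rejected rows land in the bad file, run details in the log.
sqlldr userid=${ORA_USER}/${ORA_PASS}@${ORA_SID} control=${CTL_FILE} \
       log=/tmp/load_stg_customer.log bad=/tmp/load_stg_customer.bad
```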

Environment: Oracle, UNIX, TOAD, SQL*Loader, SQL Developer, Mercury Test Director, PVCS Tracker
