
ETL Developer Resume


Hoboken, NJ


  • More than 7 years of experience in the IT industry, especially in client/server business systems and Decision Support Systems (DSS) analysis, design, development, testing, and implementation.
  • Over 6 years of experience in loading and maintaining data warehouses and data marts using DataStage ETL processes.
  • Strong knowledge of Extraction, Transformation/Verification, and Loading (ETL/EVL) processes using Ascential DataStage, UNIX shell scripting, and SQL*Loader.
  • Expertise in extracting data from operational sources such as DB2, DB2 UDB, SQL Server, Oracle, Teradata, Sybase, and flat files into a staging area.
  • Extensively worked with Parallel Extender in the Orchestrate environment to split bulk data into subsets and dynamically distribute them across all available processors for better job performance.
  • Experience in evolving strategies and developing architecture for building a data warehouse using the data modeling tool Erwin.
  • Designed and developed data models such as Star and Snowflake schemas.
  • Excellent database migration experience with Oracle 7.x/8i/9i, DB2 UDB, and SQL.
  • Hands-on experience in writing, testing, and implementing triggers, procedures, and functions at the database level using PL/SQL.
  • Extensive experience in loading high-volume data and in performance tuning.
  • Played a significant role in various phases of the project life cycle, including requirements definition, functional and technical design, testing, production support, and implementation.
  • Excellent team member with problem-solving and troubleshooting capabilities; a quick learner, highly motivated, result-oriented, and an enthusiastic team player.



OLAP Tools: BusinessObjects 4.1/5.0/5.1 (Supervisor, Designer, Reports, Broadcast Agent, Set Analyzer), Web Intelligence 2.5/2.6, SDK.

EDI: HIPAA EDI X12 transactions such as 834, 837, 835, 270, 271, 276, 277.

Programming Languages: COBOL, C, C++, Java 1.2, SQL, and PL/SQL (with TOAD as the SQL development tool).

Internet Technologies: HTML 4.0, JavaScript 1.2, JSP, and Servlets.

Databases: Oracle 7.x/8.0/8i, SQL Server 6.5/7.0, DB2, DB2 UDB 7.2/8.1 EEE.

Data Modeling Tools: Erwin, Oracle Designer.

GUI: Oracle D2K with Forms 4.x/5.x/6.x, Reports 4.x/5.x/6.x, Applets, Visual Basic 5.0, Visual C++, Crystal Reports.


Operating Systems: Windows NT/2000, UNIX, AIX, Solaris.



ETL Developer


  • Worked on error handling, creating hashed files, and performing lookups for faster access to data.
  • Created tables, indexes, and synonyms in an Oracle 10g database to load data.
  • Used version control to retrieve the latest versions of updated jobs and sequences.
  • Extensively used Parallel Extender to extract, transform, and load data into the data warehouse and data mart, with partitioning in an MPP environment.
  • Worked with various PX stages, including Join, Lookup, Funnel, Filter, Merge, Aggregator, and Transformer.
  • Defined reference lookups, aggregations, constraints, and derivations.
  • Created and used DataStage shared containers and local containers for DataStage jobs and for retrieving error log information.
  • Provided support, including off-hours support, on DataStage issues.
  • Developed parallel shared containers and reused them in multiple jobs.
  • Performed import and export of DataStage components and table definitions using DataStage Manager.
  • Extensively used the Hashed File, Aggregator, Sequential File, and Oracle OCI stages.
  • Maintained the DataStage server processes.
  • Created new projects and associated data directories on the DataStage server node.
  • Created Business Objects reports and queries with constant interaction with the end users.
  • Wrote routines for data cleansing.
  • Created master controlling sequencer jobs using the DataStage Job Sequencer.
  • Performed performance tuning of ETL jobs.
  • Used DataStage Director and its run-time engine to schedule job runs, test and debug job components, and monitor the resulting executable versions.
  • Worked with other technical teams to set up and configure the DataStage environments.
  • Interacted with the vendor's technical support staff to resolve issues with the DataStage product suite.
  • Managed space allocation and usage within the DataStage file systems.
  • Extensively used Ascential DataStage Manager, Designer, and Director for creating and implementing jobs.
  • Tuned the DataStage server processes.
  • Developed UNIX shell scripts for scheduling jobs and automating ETL processes.
  • Unit tested ETL code and created test cases.
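
The shell-script automation above can be sketched as a minimal job wrapper. This is an illustrative sketch, not the original scripts: the `dsjob` invocation in the comment, the log path, and the function name are assumptions.

```shell
#!/bin/sh
# Minimal sketch of an ETL job wrapper of the kind used for scheduling/automation:
# run a command, retry once on failure, and report the outcome on stdout.
run_with_retry() {
    log="${ETL_LOG:-/tmp/etl_run.log}"
    if "$@" >>"$log" 2>&1; then
        echo "OK: $1"
        return 0
    fi
    echo "retrying: $*" >>"$log"
    if "$@" >>"$log" 2>&1; then
        echo "OK after retry: $1"
        return 0
    fi
    echo "FAILED: $1"
    return 1
}

# Hypothetical use from a scheduler (the DataStage CLI call is only an example):
#   run_with_retry dsjob -run -jobstatus MyProject LoadCustomerDim
```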

Environment: Ascential DataStage 7.5 (Server Edition and Parallel Extender), BusinessObjects 6.5.1, WebI, Oracle 9i, Oracle 10g, DB2 UDB 8.1, SQL, PL/SQL, UNIX shell scripts, Windows XP.

Confidential, Hoboken, NJ

ETL Developer


  • Experience in information technology with special emphasis on the design, development, and administration of database, data warehousing, and client/server applications.
  • Experience in all phases of the data warehouse life cycle, involving the design, development, analysis, and testing of data warehouses using ETL, data modeling, online analytical processing, and reporting tools.
  • Strong database, data warehousing, and ETL background with Oracle, DB2, Teradata, SQL Server, and Informatica PowerCenter 6.2/8.
  • Excellent understanding of data marts and multi-dimensional models.
  • Created complex mappings using Expression, Aggregator, Lookup, Update Strategy, Stored Procedure, and Router transformations in the Informatica PowerCenter Designer.
  • Created unique primary key values to replace missing primary keys using the Sequence Generator transformation, making it reusable so that the same Sequence Generator could be used in multiple mappings.
  • Wrote stored procedures and functions to calculate business days and time dimensions, using connected and unconnected Stored Procedure transformations in mappings.
  • Created sequential batches to run batch sessions one after the other, and concurrent batches to start all the sessions in a batch at once.
  • Practical understanding of dimensional and relational data modeling concepts such as star schema modeling, snowflake modeling, fact and dimension tables, pivot tables, and modeling of data at both the logical and physical levels.
  • Strong experience in coding with SQL, PL/SQL, and Oracle 10g: procedures, functions, triggers, and packages.
  • Performed unit testing and created test cases.
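
The sequential vs. concurrent batch behavior described above can be sketched at the shell level. The session commands are placeholders, not the original Informatica batch definitions:

```shell
#!/bin/sh
# Sketch: a sequential batch runs sessions one after another, stopping on the
# first failure; a concurrent batch launches all sessions at once and waits.
run_sequential() {
    for cmd in "$@"; do
        $cmd || return 1    # stop the batch on the first failed session
    done
}

run_concurrent() {
    for cmd in "$@"; do
        $cmd &              # start every session in the background
    done
    wait                    # block until all sessions complete
}
```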

Environment: Oracle 8i, Oracle 10g, Erwin 3.5/4.5.2, Informatica PowerCenter 6.2/8, ETL, Linux, Windows 2000.


ETL Developer


  • Used Ascential Integrity for cleansing source data coming from heterogeneous sources such as ANSI X12 (fixed-width flat files), CSV files, and COBOL files, and loaded it using DataStage jobs.
  • Used plug-in stages such as FTP, Merge, and Pivot, as well as the Sequential, Hashed File, ODBC, DB2, Aggregator, Link Partitioner/Link Collector, and Inter-Process stages.
  • Created batches (DS job controls) and sequencers to control sets of DataStage jobs.
  • Extensively used pre-built DataStage transforms, functions, and routines for conditioning, conversion, validation, and loading, and developed new ones where required.
  • Developed stored procedures for complex jobs.
  • Created different types of reports, such as Master/Detail, Cross Tab, and Chart (for trend analysis).
  • Scheduled BO reports using Broadcast Agent and monitored them through the Broadcast Agent console.
  • Involved in unit testing, system testing, and integration testing.
  • Designed mappings from sources to operational staging targets using a star schema, and implemented logic for slowly changing dimensions.
  • Participated in the review of the Technical and Business Transformation Requirements documents.
  • Participated in discussions with the team leader, group members, and technical manager regarding technical and business requirement issues.
  • Involved in the analysis of the physical data model for ETL mapping and the process flow diagrams for all business functions, and in designing the procedures for getting data from all systems into the data warehousing system.
  • Involved in designing data cleansing procedures using Ascential Integrity; used pre-built procedures for cleansing customer address data for internal business analysis, standardizing the data to store various business units in tables.
  • Implemented extracting, cleansing, transforming, integrating, and loading of data into the data warehouse using DataStage Designer.
  • Used DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements.
  • Worked with DataStage Director to schedule running the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).
  • Extensively wrote user-defined SQL to override the generated SQL queries in DataStage.
  • Developed user-defined routines and transformations to implement business logic, and shell scripts to automate file manipulation and data loading procedures.
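
The file-manipulation scripts referred to above might look like this minimal helper; the archive layout and the function name are assumptions for illustration. It timestamps each processed flat file so a rerun of the same feed never overwrites history.

```shell
#!/bin/sh
# Sketch: move a processed flat file into an archive directory with a
# timestamp suffix, preserving every run's input for audit and replay.
archive_file() {
    src="$1"
    dest_dir="$2"
    mkdir -p "$dest_dir"
    stamp=$(date +%Y%m%d%H%M%S)
    mv "$src" "$dest_dir/$(basename "$src").$stamp"
}
```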

Environment: DataStage 7.1, PeopleSoft, Oracle 9i, DB2 UDB, HP-UX, AIX, MS Excel 2000, Erwin, Business Objects, WebI, SQL Navigator, SQL*Loader, Solaris, and Windows NT

Confidential, Cypress, CA

Ascential DataStage Consultant


  • Used DataStage Manager to store and manage reusable metadata, and created custom routines and transforms for the jobs.
  • Experienced in using DataStage Director to validate, run, schedule, and monitor DataStage jobs.
  • Used DataStage Administrator to assign privileges to user groups.
  • Worked extensively with stages such as OCI, Sequential, Aggregator, Hashed File, Sort, Link Partitioner, Link Collector, and ODBC.
  • Developed jobs in Parallel Extender using stages such as Transformer, Aggregator, Lookup, Source Dataset, External Filter, Row Generator, Column Generator, and Peek.
  • Distributed the load among different processors by implementing data partitioning in Parallel Extender.
  • Extensively used MetaBroker for importing metadata from Erwin and exporting warehouse data to Business Objects for reporting purposes.
  • Created, modified, deployed, optimized, and maintained Business Objects universes using Designer, and exported the universes to the repository to make resources available to the users.
  • Made extensive use of slice-and-dice for generating master/detail and tabular reports.
  • Used Ascential MetaStage for data integration and standardization when loading the Oracle data warehouse.
  • Used Ascential MetaStage for data management when extracting data from various databases.
  • Experienced in using Ascential QualityStage GUI tools for customizing data mart business logic.
  • Used partitioning and collecting methods for implementing parallel processing.
  • Extracted data from Oracle and DB2 databases, transformed the data, and loaded it into the Oracle data warehouse.
  • Developed UNIX scripts to automate the data load processes to the target data warehouse using the Autosys scheduler.
  • Worked on the implementation and production support of X12 EDI maps using Mercator 5.0/6.5.
  • Made systems changes to comply with HIPAA standards.
  • Worked on different HIPAA EDI X12 transactions such as 834, 837, 835, 270, 271, 276, and 277.
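
The transaction types above (834, 837, and so on) are carried in the ST segment of an X12 interchange. A minimal way to identify a file's transaction set from the shell, assuming the common '~' segment terminator and '*' element separator:

```shell
#!/bin/sh
# Sketch: read the first ST segment of an X12 file and print its transaction
# set identifier (e.g. 837 for a claim, 834 for enrollment).
# Assumes '~' terminates segments and '*' separates elements, as is typical.
x12_txn_type() {
    tr '~' '\n' < "$1" | awk -F'*' '$1 == "ST" { print $2; exit }'
}
```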

Environment: Ascential DataStage 7.5/7.0 with Parallel Extender (Designer, Director, Manager), DataStage BASIC language expressions, Oracle 9i, Windows 2000, IBM AIX 4.2/4.1, Java, Sybase, PVCS, MQ Series, Erwin, TOAD, dBase III files, MS Access, CSV files, XML files, Tivoli Workload Scheduler.


Ab Initio Developer


  • Reviewed the scope of the migration project with a team of technical and business resources.
  • Identified and analyzed data, reporting needs, and highly complex business processes to formulate the best business strategy and approach for the data warehouse.
  • Analyzed the mapping document for the list of transformations applied to the input fields.
  • Set up the NDM process to transfer the compressed feed files from the FDR to the ETL server.
  • Created DDLs and DMLs in the development environment for use in the migration process.
  • Created lookup files from the billing cycle and client tables.
  • Developed graphs with multistage components for massive parallelism.
  • Created duplicate-file-check graphs to determine whether a file duplicates any previously received file, for registering and archiving new files.
  • Ensured that the produced data was clean and integrated for transformation.
  • Created checkpoints in different phases of the ETL process to ensure data was extracted, transformed, and loaded correctly.
  • Transformed and stored the data in a multifile system for parallel processing.
  • Responsible for the automation of Ab Initio graphs using Korn shell scripts.
  • Worked in the Enterprise Meta Environment and created Ab Initio scripts for data conditioning, transformation, validation, and loading.
  • Applied business rules for the transformation of data as per the target table specifications.
  • Scheduled jobs in the staging area and prepared drafts for each process.
  • Ensured cleansing for each process was completed before running each job for stress testing.
  • Performed validations using the Validate component in the Card Holder master ETL process.
  • Created the ETL Billing Promo History load process design, which describes the techniques used for the ETL process, its timing and control, and the sequencing of the different ETL process steps.
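
The duplicate-file check described above can be sketched outside Ab Initio as a checksum registry; the registry path and the function name are hypothetical. A feed whose checksum was already registered is flagged as a duplicate of a previously received file.

```shell
#!/bin/sh
# Sketch: register each incoming feed file's checksum; if the checksum is
# already in the registry, the file duplicates a previously received feed.
REGISTRY="${REGISTRY:-/tmp/feed_registry.txt}"

is_duplicate() {
    sum=$(cksum "$1" | awk '{print $1}')
    if grep -qx "$sum" "$REGISTRY" 2>/dev/null; then
        echo duplicate
    else
        echo "$sum" >> "$REGISTRY"
        echo new
    fi
}
```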

Environment: Ab Initio GDE 1.13, Co>Operating System 2.13, UNIX, Autosys, Oracle 8i, Queryman, TestDirector 8.0, MS Project, Visio 2000, TOAD, Erwin, Windows XP.


Software Engineer


  • Involved in the development of data entry screens and report generation.
  • Stored the re-order level, safety level, and cost for each item.
  • Built the stock verification function, which helps keep a check on all materials in the store.
  • Wrote PL/SQL stored procedures and database triggers for enforcing business rules.
  • Generated reports by branch and by product.
  • Designed and developed the Product Installation and Leasing System with Oracle Developer 2000.

Environment: Oracle 7.3, SQL, PL/SQL, Forms 3.0/4.5, Reports 2.5, Windows 95, and UNIX
