
Senior ETL Developer Resume


SUMMARY

  • Over five (5+) years of IT experience in Teradata, Netezza, Informatica, DataStage, DB2, and Unix shell scripting.
  • Strong experience in ETL development.
  • Extensive experience with the Teradata database, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.
  • Expertise in maintaining data quality, data organization, metadata, and data profiling.
  • Experience in business analysis and data analysis, user requirement gathering and analysis, data cleansing, data transformations, data relationships, source systems analysis, and reporting analysis.
  • Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle, DB2, and SQL Server databases.
  • Used DataStage Enterprise Edition/Parallel Extender stages such as Data Set, Sort, Lookup, Change Capture, Funnel, and Row Generator in ETL coding.
  • Extensive experience with development, testing, debugging, implementation, documentation, and production support.
  • Experienced in loading data from flat files, XML files, Oracle, Teradata, Netezza, DB2, and SQL Server into data warehouses/data marts using Informatica and DataStage.
  • Proficient in coding optimized Teradata batch processing scripts for data transformation, aggregation, and load using BTEQ.
  • Expertise in RDBMS and database normalization and denormalization concepts and principles.
  • Strong skills in coding and debugging Teradata utilities such as FastLoad, FastExport, MultiLoad, and TPump.
  • Experience in all testing phases: unit testing, integration testing, regression testing, performance testing, and acceptance testing.
  • Sound knowledge of data warehousing concepts, E-R modeling, and dimensional modeling (Star Schema, Snowflake Schema), as well as database architecture for OLTP and OLAP applications, data analysis, and ETL processes.
  • Created mapping documents, workflows, and data dictionaries.
  • Extensive experience in data modeling using the Erwin tool.
  • Designed and modeled many data marts as per business requirements.
  • Worked extensively on the development of large projects with complete end-to-end participation in all areas of the software development life cycle, and maintained documentation.
  • Quick adaptability to new technologies and zeal to improve technical skills.
  • Good analytical, programming, problem solving and troubleshooting skills.
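
The BTEQ batch processing scripts mentioned above typically follow a pattern like the sketch below. All database objects (stg.sales_txn, dw.sales_daily_agg), the logon string, and file names are illustrative placeholders, not from any actual engagement:

```shell
#!/bin/sh
# Generate a nightly BTEQ batch script for an aggregation load.
# Object names and credentials are placeholders only.
BTQ_FILE=load_sales_agg.btq

cat > "$BTQ_FILE" <<'EOF'
.LOGON tdprod/etl_user,password;

/* Aggregate staged transactions into the daily summary table */
INSERT INTO dw.sales_daily_agg (sale_date, store_id, total_amt)
SELECT txn_date, store_id, SUM(txn_amt)
FROM   stg.sales_txn
GROUP BY txn_date, store_id;

/* Fail the batch with a non-zero return code if the load step errored */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF

echo "Generated $BTQ_FILE"
# In production this would be executed as:  bteq < "$BTQ_FILE"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` check is what lets the calling shell script or scheduler detect a failed load step.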

TECHNICAL SKILLS

ETL tools: Informatica, Datastage

Programming Language: Teradata SQL, PL/SQL, Shell Scripting

Databases: Teradata, Netezza, DB2, Oracle, SQL Server

Teradata Tools & Utilities: BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant

Data Modeling Tools: Erwin

Process/Methodologies: Waterfall, Agile Methodology

Scheduler: CA Workload Automation, Autosys

Configuration Management tools: Serena Dimensions, SVN

PROFESSIONAL EXPERIENCE

Confidential

Senior ETL Developer

Responsibilities:

  • Served as primary on-site ETL developer during the analysis, planning, design, development, and implementation stages of the project.
  • Prepared data mapping documents (DMDs) and designed the ETL jobs based on them, with the required tables in the development environment.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into data warehouse databases.
  • Created DataStage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.
  • Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
  • Extensively worked with sequential file, dataset, file set and look up file set stages.
  • Extensively used parallel stages such as Row Generator, Column Generator, Head, and Peek for development and debugging purposes.
  • Used the DataStage Director and its run-time engine to schedule solution runs, test and debug components, and monitor the resulting executables on an ad hoc or scheduled basis.
  • Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
  • Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
  • Created job sequences.
  • Maintained Data Warehouse by loading dimensions and facts as part of project. Also worked for different enhancements in FACT tables.
  • Created shell scripts to run DataStage jobs from UNIX and scheduled them through the scheduling tool.
  • Coordinated with team members and administered all onsite and offshore work packages.
  • Analyzed performance and monitored workloads for capacity planning.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
  • Participated in weekly status meetings.
  • Developed Test Plan that included the scope of the release, entrance and exit criteria and overall test strategy. Created detailed Test Cases and Test sets and executed them manually.
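
A shell wrapper for running a DataStage job from UNIX, of the kind described above, might look like the sketch below. The project and job names are placeholders, and DSJOB defaults to `echo` so the sketch runs without a DataStage installation; in production it would point at the real `dsjob` binary:

```shell
#!/bin/sh
# Run a DataStage job from UNIX, suitable for invocation by a scheduler.
# DW_PROJ and seq_load_warehouse are illustrative names only.
DSJOB="${DSJOB:-echo dsjob}"
PROJECT=DW_PROJ
JOB=seq_load_warehouse

$DSJOB -run -wait -mode NORMAL "$PROJECT" "$JOB"
STATUS=$?
echo "$STATUS" > dsjob_run.status   # record exit status for downstream steps
if [ "$STATUS" -ne 0 ]; then
    echo "DataStage job $JOB failed with status $STATUS" >&2
    exit "$STATUS"
fi
echo "DataStage job $JOB completed"
```

The scheduler then keys off the script's exit status to decide whether dependent jobs may start.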

Environment: IBM InfoSphere DataStage 8.5, Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2, SQL Server, Teradata, Oracle 11g, CAWA, UNIX, Windows.

Confidential

Senior ETL Developer

Responsibilities:

  • Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
  • Designed ETL specification documents for all the projects.
  • Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
  • Extracted data from flat files, DB2, SQL Server, and Oracle to build an Operational Data Store (ODS), and applied business logic to load the data into the global data warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Extensively used the Add Currently Processed Flat File Name port to load the flat file name and to load contract number coming from flat file name into Target.
  • Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
  • Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Conducted code reviews of code developed by teammates before moving it into QA.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Modified existing mappings for enhancements of new business requirements.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Involved in production support.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
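
Scheduling the workflows above from the command line typically uses Informatica's `pmcmd` utility, along the lines of this sketch. The service, domain, folder, and workflow names are placeholders; PMCMD defaults to `echo` so the sketch runs anywhere:

```shell
#!/bin/sh
# Start a PowerCenter workflow and wait for completion, as a scheduler would.
# INT_SVC, DOMAIN_DEV, DW_FOLDER, and wf_daily_incremental_load are
# illustrative names only.
PMCMD="${PMCMD:-echo pmcmd}"
PM_PASS="${PM_PASS:-changeme}"   # placeholder credential

$PMCMD startworkflow -sv INT_SVC -d DOMAIN_DEV -u etl_user -p "$PM_PASS" \
       -f DW_FOLDER -wait wf_daily_incremental_load
RC=$?
echo "$RC" > pmcmd_run.status    # record exit status for the scheduler
[ "$RC" -eq 0 ] || { echo "workflow failed rc=$RC" >&2; exit "$RC"; }
echo "workflow completed"
```

Running with `-wait` blocks until the workflow finishes, so the exit status reflects the workflow result rather than just the submission.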

Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, IBM iSeries (DB2), MS Access, Windows XP, Toad, SQL Developer.

Confidential

Senior ETL Developer

Responsibilities:

  • Analyzed the requirements and framed the business logic for the ETL process.
  • Extracted data from Oracle as one of the source databases.
  • Involved in the ETL design and its documentation.
  • Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.
  • Followed Star Schema to design dimension and fact tables.
  • Experienced in handling slowly changing dimensions.
  • Collect and link metadata from diverse sources, including relational databases Oracle, XML and flat files.
  • Responsible for the development, implementation and support of the databases.
  • Extensive experience with PL/SQL in designing and developing functions, procedures, triggers, and packages.
  • Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
  • Developed reusable Mapplets and Transformations.
  • Used data integrator tool to support batch and for real time integration and worked on staging and integration layer.
  • Optimized the performance of the mappings through various tests on sources, targets, and transformations.
  • Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve their performance.
  • Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
  • Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Scheduled the tasks using Autosys.
  • Loaded the flat files data using Informatica to the staging area.
  • Created shell scripts for generic use.
  • Created high level design documents, technical specifications, coding, unit testing and resolved the defects using Quality Center 10.
  • Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
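
Scheduling tasks with Autosys, as mentioned above, is done through JIL job definitions. A sketch of one such definition follows; the job, machine, and path names are illustrative placeholders:

```shell
#!/bin/sh
# Emit an Autosys JIL definition for a daily ETL command job.
# All names and paths are placeholders, not from any actual system.
cat > daily_load.jil <<'EOF'
insert_job: dw_daily_load   job_type: c
command: /opt/etl/bin/run_daily_load.sh
machine: etlhost01
owner: etl_user
start_times: "02:00"
condition: s(dw_stage_load)
std_out_file: /var/log/etl/dw_daily_load.out
std_err_file: /var/log/etl/dw_daily_load.err
alarm_if_fail: 1
EOF
echo "Wrote daily_load.jil"
# Loaded into Autosys with:  jil < daily_load.jil
```

The `condition: s(dw_stage_load)` line makes the job wait for the success of an upstream staging job, which is how batch dependencies are chained.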

Environment: Windows XP/NT, Informatica PowerCenter 9.1/8.6, UNIX, Teradata, Oracle, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer, MS Visio, Autosys, Korn Shell, Quality Center 10.

Confidential

ETL Developer

Responsibilities:

  • Created UML diagrams, including use case diagrams, activity/state chart diagrams, sequence diagrams, collaboration diagrams, deployment diagrams, data flow diagrams (DFDs), and ER diagrams, using MS Visio.
  • Prepared low-level technical design documents and participated in the build/review of BTEQ, FastExport, MultiLoad, and FastLoad scripts; reviewed unit test plans and system test cases.
  • Used SQL Assistant to query Teradata tables.
  • Analyzed business requirements and designs, and wrote technical specifications to design and redesign solutions.
  • Experienced in developing parallel jobs using various development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicates).
  • Worked with complex SQL queries to test the data generated by the ETL process against the target database.
  • Involved in the complete software development life cycle (SDLC), including requirements gathering, analysis, design, development, testing, implementation, and deployment.
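
The FastExport scripts built and reviewed above generally follow a skeleton like the one below; the log table, logon string, source table, and output path are placeholders:

```shell
#!/bin/sh
# Generate a FastExport script that unloads a table to a flat file.
# All object names and paths are illustrative only.
cat > export_customers.fx <<'EOF'
.LOGTABLE utillog.cust_export_log;
.LOGON tdprod/etl_user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/out/customers.dat MODE RECORD FORMAT TEXT;
SELECT cust_id, cust_name, open_dt
FROM   dw.customer
WHERE  status_cd = 'A';
.END EXPORT;
.LOGOFF;
EOF
echo "Wrote export_customers.fx"
# Run in production as:  fexp < export_customers.fx
```

The `.LOGTABLE` entry is FastExport's restart log; `SESSIONS 4` controls how many parallel sessions the utility opens against the database.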

Environment: IBM InfoSphere DataStage 8.5, Microsoft Visio, Teradata, Oracle, Queryman, UNIX, Windows.

Confidential

ETL Developer

Responsibilities:

  • Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) into the Teradata RDBMS using BTEQ, MultiLoad, and FastLoad.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Responsible for collecting statistics (COLLECT STATISTICS) on FACT tables.
  • Performed performance tuning of sources, targets, mappings, and SQL queries in transformations; designed, created, and tuned physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Used volatile tables and derived queries to break up complex queries into simpler ones.
  • Created a cleanup process to remove all intermediate temp files used prior to the loading process; streamlined the migration of Teradata scripts and shell scripts on the UNIX box.
  • Developed UNIX shell scripts to run batch jobs in production.
  • Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
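
The post-load statistics collection on the fact tables, mentioned above, can be sketched as a small BTEQ step in the nightly batch. The table, column, and index names are illustrative placeholders:

```shell
#!/bin/sh
# Generate a BTEQ script that refreshes optimizer statistics after a load.
# dw.sales_fact and its columns are placeholder names only.
cat > collect_stats.btq <<'EOF'
.LOGON tdprod/etl_user,password;
COLLECT STATISTICS ON dw.sales_fact COLUMN (sale_date);
COLLECT STATISTICS ON dw.sales_fact COLUMN (store_id);
COLLECT STATISTICS ON dw.sales_fact INDEX (sale_id);
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
EOF
echo "Wrote collect_stats.btq"
# Executed in the nightly batch as:  bteq < collect_stats.btq
```

Refreshing statistics after each load keeps the Teradata optimizer's EXPLAIN plans accurate for the tuned queries described above.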

Environment: UNIX, Teradata, Oracle 11g, SQL, PL/SQL, SQL Developer, Erwin, MS Visio.
