
Senior ETL Developer Resume


NYC

PROFESSIONAL SUMMARY

  • 9+ years of IT experience in data integration, data migration, data quality, data profiling, data virtualization, data validation, and data warehousing implementations across various industries.
  • Experience in all phases of the data warehouse life cycle, involving Requirement Analysis, Design, Coding, Testing, and Deployment.
  • Experience in working with business analysts to identify, study, and understand requirements, and in translating them into ETL code.
  • Experience in business model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, and Cache Management.
  • Extensively worked on ETL mappings and on the analysis and documentation of OLAP report requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Well versed in OLTP data modeling and data warehousing concepts.
  • Strong knowledge of the Entity-Relationship concept, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
  • Strong background in ETL data warehousing using Informatica PowerCenter.
  • Strong knowledge of Informatica Data Quality (IDQ) with the Informatica Developer and Informatica Analyst tools.
  • Experienced in using the IDQ tool for profiling, applying rules, creating scorecards, and developing mappings to move data from source to target systems (a simplified Python profiling check is sketched after this list).
  • Extensively used IDQ Developer transformations such as Match, Exception, Association, Consolidation, Standardizer, and Parser.
  • Excellent knowledge of the IDQ AddressDoctor address validator for correcting addresses and identifying deliverable addresses.
  • Experience in integrating various data sources like Oracle, DB2, Teradata, Sybase, SQL Server, and MS Access, and non-relational sources like flat files, into the staging area.
  • Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer, and Rank) and Mappings using Informatica Designer, and in processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in creating Reusable Tasks (Session, Command, Email) and Workflow Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Tuned mappings using PowerCenter Designer, applying different logic to provide maximum efficiency and performance.
  • Experienced in the UNIX work environment: file transfers, job scheduling, and error handling.
  • Exposure to data profiling in Informatica Data Explorer and SSIS.
  • Extensive work developing and debugging Informatica mappings, mapplets, sessions, and workflows.
  • Good exposure to Informatica Data Quality, where data cleansing, de-duping, and address correction were performed.
  • Loaded data from various data sources and legacy systems into Teradata production and development warehouses using BTEQ, FASTEXPORT, MULTILOAD, FASTLOAD, and Informatica.
  • Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Experience in writing, testing, and implementing PL/SQL triggers, stored procedures, functions, and packages.
  • Involved in creating test data and in unit and system testing to check whether data loads into the target are accurate.
  • Expertise in using Tableau to join data from heterogeneous systems and generate reports according to user requirements.
  • Exposure to Hive (Hue), Pig, and Impala; able to write queries for business data analysis over huge volumes of data.
  • Good experience and expertise in business and data analysis, user requirement gathering, user requirement analysis, gap analysis, data cleansing, data transformations, data relationships, source systems analysis, and reporting analysis.
  • Experience in support and knowledge transfer to the production team.
  • Proficient in interacting with business users by conducting client meetings during the Requirements Analysis phase.
  • Extensive functional and technical exposure; experience working on high-visibility projects.
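
As a simplified illustration of the profiling and scorecard work mentioned above, the following Python sketch computes completeness and conformity metrics for a column. The sample rows, column names, and ZIP rule are illustrative assumptions only; the actual profiles were built in the IDQ Analyst and Developer tools.

```python
import re

ZIP_PATTERN = re.compile(r"^\d{5}(-\d{4})?$")

# Hypothetical source rows; real profiles ran against full source extracts in IDQ.
rows = [
    {"customer_name": "Acme Corp", "zip_code": "10001"},
    {"customer_name": "", "zip_code": "10001-0001"},
    {"customer_name": "Beta LLC", "zip_code": "1A001"},
]

def profile_column(rows, column, conformity_check=None):
    """Scorecard-style metrics: fraction populated and fraction conforming."""
    values = [row.get(column, "").strip() for row in rows]
    populated = [v for v in values if v]
    completeness = len(populated) / len(values) if values else 0.0
    conforming = [v for v in populated if conformity_check(v)] if conformity_check else populated
    conformity = len(conforming) / len(populated) if populated else 0.0
    return {"column": column, "completeness": completeness, "conformity": conformity}

print(profile_column(rows, "zip_code", ZIP_PATTERN.match))  # completeness 1.0, conformity ~0.67
print(profile_column(rows, "customer_name"))                # completeness ~0.67, conformity 1.0
```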

TECHNICAL SKILLS

Databases: Oracle 7.x/8.x/9i/10g/11g, SQL Server 2012/2008/2005/2000, DB2 UDB 7.2, MySQL 5.0/4.1, MS Access, Netezza, Teradata. Editors: SQL Navigator, TOAD.

ETL Tools: Informatica 8.x/9.x, Informatica Data Quality 9.5, Pentaho Kettle 7.x

Data Modeling: ERWIN, Visio.

Reporting Tools: MS SQL Server Reporting Services, Business Objects, Tableau Desktop.

Database Skills: SQL, PL/SQL.

Scripting Languages: Python.

Operating Systems: Windows 7/XP/2000, UNIX.

PROFESSIONAL EXPERIENCE:

Confidential, NYC

Senior ETL Developer

Responsibilities

  • Understood the requirements of the various source systems that provide transactional data to the DB (Record Retention).
  • Analysis, design, and development (performance engineering) of new regulatory requirements to onboard new systems with a record-level approach, using Informatica 9.6.1, Oracle 11g, and UNIX.
  • Dealt with various source types (flat file, XML, JSON, relational database, JMS).
  • Performance tuning and optimization of various complex SQL queries.
  • Involved in data and solution architecture discussions and contributed effectively to making the system reliable and secure.
  • Data transformation using Python (a minimal sketch appears after this section).
  • Onboarded new source systems.
  • Loaded data into Hive tables for data analysis.
  • Worked on trade cycle analysis, gap analysis, trade mappings, and the data dictionary.
  • Developed and tested all backend programs, stored procedures, data flows, and update processes.
  • Worked on both structured and unstructured data.
  • Wrote documentation to describe program development, logic, coding, testing, changes, and corrections.

Tools and Environment: Oracle 11g, PowerCenter 9.5, flat files, XLS, Windows, Centera, UNIX, Python, Hive, Hue, Spark.
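
A minimal, self-contained sketch of the kind of Python transformation referenced above, assuming a hypothetical JSON trade feed with invented field names (trade_id, trade_date, notional); the production logic and source layouts were project-specific.

```python
import csv
import json

# Hypothetical source payload; the real feeds were flat files, XML, JSON, and JMS.
RAW_JSON = """[
  {"trade_id": " T-1001 ", "trade_date": "2016-03-01T09:30:00", "notional": "2500000.456"},
  {"trade_id": "T-1002",   "trade_date": "2016-03-01T10:05:00", "notional": "120000"}
]"""

def transform_record(raw: dict) -> dict:
    """Normalize one source record before staging."""
    return {
        "trade_id": raw["trade_id"].strip(),
        "trade_date": raw["trade_date"][:10],          # keep only YYYY-MM-DD
        "notional": round(float(raw["notional"]), 2),  # enforce 2-dp precision
    }

records = [transform_record(rec) for rec in json.loads(RAW_JSON)]

# Write a staging file for a downstream loader (e.g., a PowerCenter session) to pick up.
with open("stg_trades.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["trade_id", "trade_date", "notional"])
    writer.writeheader()
    writer.writerows(records)
```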

Confidential, Philadelphia, PA

ETL /IDQ Developer

Responsibilities

  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Worked with the Informatica Data Quality 9.5.1 (IDQ) toolkit for analyzing, standardizing, cleansing, matching, conversion, exception handling, reporting, and monitoring of data (a simplified matching check is sketched after this section).
  • Implemented data quality rules using IDQ.
  • Developed complex mappings in Informatica to load data from various sources into the data warehouse, using transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, and Rank.
  • Created and configured workflows, worklets, and sessions to transport the data to target Oracle warehouse tables using Informatica Workflow Manager.
  • Involved in creating new table structures and modifying existing tables to fit the existing data model.
  • Involved in debugging Informatica mappings, testing stored procedures and functions, and performance and unit testing of Informatica sessions, batches, and target data.
  • Optimized mapping performance through various tests on sources, targets, and transformations. Identified and removed bottlenecks, and implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Developed and tested all backend programs, stored procedures, data flows, and update processes.
  • Wrote documentation to describe program development, logic, coding, testing, changes, and corrections.

Tools and Environment: SQL Server Management Studio 2012, PowerCenter 9.5, IDQ 9.5, flat files, XLS, Windows.
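
As a conceptual stand-in for the IDQ matching work above (IDQ's Match transformation is configuration-driven, not hand-coded), here is a short Python sketch of a fuzzy duplicate check; the fields, weights, and threshold are made up for the example.

```python
from difflib import SequenceMatcher

# Hypothetical threshold; IDQ match scores were tuned per data set.
MATCH_THRESHOLD = 0.85

def similarity(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def is_probable_duplicate(rec_a: dict, rec_b: dict) -> bool:
    """Flag two customer records as likely duplicates using weighted name/city scores."""
    name_score = similarity(rec_a["name"], rec_b["name"])
    city_score = similarity(rec_a["city"], rec_b["city"])
    return (0.7 * name_score + 0.3 * city_score) >= MATCH_THRESHOLD

print(is_probable_duplicate(
    {"name": "Jon Smith", "city": "Philadelphia"},
    {"name": "John Smith", "city": "Philadelphia"},
))  # True with these weights
```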

Confidential, NYC, NY

ETL /IDQ Developer

Responsibilities

  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Worked with the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Worked with Salesforce components, creating mappings to pull data from Salesforce.com and troubleshooting failed workflows.
  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Created mapping documents to outline the data flow from sources to targets.
  • Involved in dimensional modeling (Star Schema) of the data warehouse and used Erwin to design the business process, dimensions, and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
  • Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Connected Lookup, Unconnected Lookup, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL overrides (a parameter-driven override is sketched after this section).
  • Created mapplets for reuse in different mappings.
  • Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created scorecards to present to business users for trending analysis (Informatica Analyst).
  • Worked on detailed analysis of master data based on attributes such as completeness, conformity, consistency, accuracy, duplication, and integrity; the analysis involved creating a presentation.
  • Worked with various Developer modules such as profiling, standardization, and matching.
  • Presented record counts, created scorecards using Informatica Analyst, and produced trend analysis reports.
  • Utilized Informatica Data Quality (IDQ) software to provide name and address cleansing for improved data quality.
  • Analyzed and extracted data from different source systems, then transformed and loaded it per targeted business requirements.
  • Developed mappings between source systems and warehouse components.
  • Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
  • Developed and tested all backend programs, mappings, and update processes.
  • Wrote documentation to describe program development, logic, coding, testing, changes, and corrections.
  • Performed analysis and resolution of help desk tickets and maintenance for assigned applications.

Tools and Environment: Informatica PowerCenter 9.1, IDQ 9.1, Teradata 14.0, Salesforce.com, SQL Server, flat files, XLS, Windows, UNIX, MicroStrategy.
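
To illustrate the mapping-parameter pattern above, a hedged Python sketch of a parameter file driving a SQL override; the section name, parameter names, and query are invented for the example and do not reflect the project's actual configuration.

```python
from configparser import ConfigParser
from string import Template

# Hypothetical parameter file content, analogous to an Informatica parameter file.
PARAM_FILE = """
[wf_load_orders]
LOAD_DATE = 2014-06-30
REGION = NORTHEAST
"""

params = ConfigParser()
params.optionxform = str  # preserve the case of parameter names
params.read_string(PARAM_FILE)

# SQL override template; $-placeholders are resolved from the parameter section,
# so the same mapping can be rerun for any load date or region.
override = Template(
    "SELECT order_id, customer_id, amount "
    "FROM stg_orders "
    "WHERE load_date = '$LOAD_DATE' AND region = '$REGION'"
)

print(override.substitute(params["wf_load_orders"]))
```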

Confidential, Tallahassee, FL

ETL Developer

Responsibilities

  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Analyzed and extracted data from different source systems, then transformed and loaded it per targeted business requirements.
  • Developed data models and mappings between source systems and warehouse components.
  • Used Kettle to develop ETL procedures that ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
  • Developed and tested all backend programs, mappings, and update processes (a simplified update process is sketched after this section).
  • Created database objects such as tables, views, materialized views, procedures, and packages.
  • Wrote documentation to describe program development, logic, coding, testing, changes, and corrections.
  • Performed analysis and resolution of help desk tickets and maintenance for assigned applications.

Tools and Environment: Pentaho Data Integration (Kettle), Oracle, PL/SQL, flat files, XLS, XML, Windows.
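
A minimal sketch of the "update process" pattern above, using Python's sqlite3 as a stand-in for the Oracle target; the table and column names are invented, and the real jobs ran in Kettle and PL/SQL.

```python
import sqlite3

# sqlite3 stands in for the Oracle target; table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

def upsert_customer(row: tuple) -> None:
    """Type-1 update process: overwrite the target row if the key exists, else insert."""
    customer_id, name, city = row
    updated = conn.execute(
        "UPDATE dim_customer SET name = ?, city = ? WHERE customer_id = ?",
        (name, city, customer_id),
    ).rowcount
    if updated == 0:
        conn.execute(
            "INSERT INTO dim_customer (customer_id, name, city) VALUES (?, ?, ?)",
            row,
        )

for staged in [(1, "Acme Corp", "Tampa"), (1, "Acme Corporation", "Tampa")]:
    upsert_customer(staged)

print(conn.execute("SELECT * FROM dim_customer").fetchall())  # [(1, 'Acme Corporation', 'Tampa')]
```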

Confidential

ETL Developer

Responsibilities

  • Analyzed flat file relationships and the systems to extract from; met with end users and business units to define requirements.
  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Extracted data from Excel files, high-volume data sets in data files, Oracle, DB2, and Salesforce.com (SFDC) using Informatica ETL mappings and SQL/PLSQL scripts, and loaded it into the data store area.
  • Developed data mappings between source systems and warehouse components.
  • Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
  • Developed and tested all backend programs, Informatica mappings, and update processes.
  • Created and monitored batches and sessions using the Informatica PowerCenter Server.
  • Responsible for tuning ETL procedures and star schemas to optimize load and query performance.
  • Worked extensively on performance tuning of programs, ETL procedures, and processes.
  • Wrote documentation to describe program development, logic, coding, testing, changes, and corrections.
  • Generated context filters and used performance actions while handling huge volumes of data.
  • Generated Tableau dashboards for sales with forecasts and reference lines.

Tools and Environment: Informatica PowerCenter 9.1, Oracle, Tableau 7.x, PL/SQL, flat files, MS Access, XML, Windows.

Confidential

ETL Developer

Responsibilities:

  • Analyzed the source system and was involved in designing the ETL data load.
  • Developed and designed Informatica mappings by translating the business requirements.
  • Worked with various transformations like Lookup, Joiner, Sorter, Aggregator, Router, Rank, and Source Qualifier to create complex mappings.
  • Involved in performance tuning of Informatica mappings using components such as parameter files and round-robin and key-range partitioning to ensure source and target bottlenecks were removed (a simplified key-range split is sketched after this section).
  • Implemented documentation standards and practices to make mappings easier to maintain.
  • Performed extensive SQL querying for data analysis; wrote, executed, and performance-tuned SQL queries for data analysis and profiling. Extracted business rules and implemented business logic for extracts and loads.
  • Worked with Teradata utilities like FastLoad and MultiLoad.
  • Involved in automating the retail prepaid system process; created packages and dependencies for the processes.
  • Identified common issues in established dashboards and business reports.
  • Used Autosys for scheduling various data cleansing scripts and load processes; maintained the batch processes using UNIX scripts.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Tuned mappings by removing source/target bottlenecks and tuning expressions to improve the throughput of data loads.

Tools and Environment: Informatica PowerCenter 8.x, Oracle 10g, PL/SQL, MS SQL Server, Business Objects, Autosys.
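
As a conceptual analogue of key-range partitioning, a Python sketch that splits a load into key ranges processed in parallel; the ranges, worker count, and load step are illustrative assumptions, since the actual partitioning was configured inside PowerCenter sessions.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical key ranges; the real ranges were defined in the PowerCenter session.
KEY_RANGES = [(1, 250_000), (250_001, 500_000), (500_001, 750_000), (750_001, 1_000_000)]

def load_partition(key_range: tuple) -> str:
    """Stand-in for loading one slice of the source keyed by a surrogate key."""
    low, high = key_range
    # A real worker would run: SELECT ... WHERE order_key BETWEEN :low AND :high
    return f"loaded keys {low}-{high}"

# Each range runs in its own worker, mirroring one partition per session thread.
with ThreadPoolExecutor(max_workers=len(KEY_RANGES)) as pool:
    for result in pool.map(load_partition, KEY_RANGES):
        print(result)
```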

Confidential

ETL Developer

Responsibilities:

  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
  • Developed mappings and workflows to generate staging files.
  • Developed various transformations such as Source Qualifier, Sorter, Joiner, Update Strategy, Lookup, Expression, and Sequence Generator for loading data into the target tables.
  • Created multiple mapplets, workflows, tasks, and database connections using Workflow Manager.
  • Created sessions and batches to move data at specific intervals and on demand using Server Manager.
  • Responsibilities included creating and scheduling the sessions.
  • Recovered failed sessions and batches (a simple retry pattern is sketched after this section).
  • Extracted data from Oracle, DB2, CSV, SQL Server, and flat files.
  • Implemented performance tuning techniques by identifying and resolving bottlenecks in sources, targets, transformations, mappings, and sessions to improve performance; understood the functional requirements.
  • Designed the dimensional model of the OLAP data marts.
  • Prepared documents for test data loading.

Tools and Environment: Informatica PowerCenter 8.x, Oracle 9i, Teradata, Erwin, TOAD, DB2, flat files.
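
A minimal sketch of the session-recovery habit above, expressed as a generic Python retry wrapper; the retry count, delay, and sample job are invented for illustration, since actual recovery used PowerCenter's own session and batch recovery features.

```python
import time

def run_with_recovery(job, max_attempts: int = 3, delay_seconds: int = 60):
    """Rerun a failed load a bounded number of times before escalating."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except RuntimeError as exc:  # stand-in for a session failure
            print(f"attempt {attempt} failed: {exc}")
            if attempt == max_attempts:
                raise  # out of retries: surface the failure for manual recovery
            time.sleep(delay_seconds)

# Hypothetical job used only to exercise the wrapper.
state = {"calls": 0}
def flaky_load():
    state["calls"] += 1
    if state["calls"] < 2:
        raise RuntimeError("target table locked")
    return "load complete"

print(run_with_recovery(flaky_load, delay_seconds=0))
```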
