Senior ETL Developer Resume

Charlotte, NC

SUMMARY

  • Seven-plus years of progressive IT experience using Informatica for implementation of ETL methodology in data extraction, transformation and loading.
  • Experience in coding using SQL, TOAD, SQL*Plus, PL/SQL procedures/functions, triggers and exception handling.
  • Experience in design and maintenance of financial and pharmaceutical data warehouse applications.
  • Experience across the data warehouse life cycle; performed ETL procedures to load data from different sources into the data warehouse using Informatica Power Center.
  • Working knowledge of Informatica CDC (Change Data Capture).
  • Experience working with scheduling tools such as Autosys, IS602 DIS, Control-M and Maestro.
  • Expertise in OLTP, OLAP, system study, analysis and E-R modeling; developed database schemas such as star schema and snowflake schema used in relational, dimensional and multidimensional data modeling.
  • Experience in profiling data and performing a proof of concept for Informatica Data Quality (IDQ).
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad).
  • Six-plus years of database experience using DB2, IBM iSeries (DB2), Oracle 10g, MS SQL Server 2005/2000 and Sybase.
  • Worked with all SDLC methodologies, notably CRD (Charles River Development), Agile and Waterfall.
  • Experience in UNIX and Java programming environments, writing UNIX shell scripts for Informatica pre- and post-session operations.
  • Experience in developing and architecting BI solutions supporting financial, sales and marketing performance reporting.
  • Excellent communication and interpersonal skills; ability to work effectively as a team member as well as individually.

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 9.x/8.x, Power Exchange Change Data Capture (CDC) 8.1/8.5.1, IDQ, Data Profiling, Data Cleansing, OLAP, ROLAP, MOLAP, SQL*Plus, SQL*Loader.

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, ERwin 4.0/3.5.2.

Databases: DB2, Oracle 11g/10g, Sybase, Mainframe, MS SQL Server 2005/2000, MS Access 2000.

Programming: SQL, PL/SQL, Visual Basic 6.0/5.0, HTML, C, UNIX Shell Scripting, Java Programming

Environment: IBM AIX 4.0/3.1, Mainframe, Sun Solaris 9/8/7/2.6/2.5, Linux, Windows 2000/XP, Windows NT 4.0, ERwin, TOAD, AQT (Advanced Query Tool).

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Senior ETL Developer

Responsibilities:

  • Interviewed business users, business analysts and the project design team, asking detailed questions to gather business requirements for the project.
  • Followed agile methodology throughout the project.
  • Created low level design documents and unit test documents for the ETL jobs.
  • Developed and documented Data Mappings/Transformations, and Informatica sessions as per the business requirement.
  • Developed complex mappings in Informatica to load data from various sources using transformations such as Custom, Union, Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router. Used the Debugger to test the mappings and fix bugs.
  • Extensively worked with SCD Type-I, Type-II and Type-III dimensions and data warehousing Change Data Capture (CDC).
  • Implemented the business rules and slowly changing dimension (SCD) logic.
  • Developed Pre-Session and Post-Session SQL commands.
  • Designed and developed Oracle SQL procedures and wrote SQL scripts for extracting data to the target system.
  • Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
  • Performed database administration of all database objects including tables, clusters, indexes, views, sequences, packages and procedures.
  • Used SQL*Loader as an ETL tool to load data into the staging tables.
  • Maintained existing data migration program with occasional upgrades and enhancements.
  • Worked with IS602 DIS (Data Integration Scheduler), a job scheduler that accommodates various types of job flows, including Informatica workflows, and interdependencies among multiple data processing environments.
  • Wrote SQL Server code to create DIS jobs.
  • Documented all DIS jobs and design details in SharePoint.
  • Used TFS (Team Foundation Server) for version control of Informatica, Oracle SQL and SQL Server code so that it could be deployed to QA and PROD through Release Management resources.
  • Worked closely with the QA team to understand the defects and provide code resolutions.
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit and integration testing of Informatica sessions, batches and target data.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Identified bottlenecks and performed performance optimization for Informatica mappings and SQL overrides.
  • Participated in the release management review meetings for code implementations.
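
The pre- and post-session commands mentioned above are typically small shell wrappers around the database client. A minimal sketch, assuming a hypothetical staging table and sqlplus as the client (the table, paths and credentials are illustrative, not the project's actual objects):

```shell
#!/bin/sh
# Sketch of an Informatica pre-session command script.
# STG_ORDERS and the file paths are hypothetical placeholders.
STAGE_TABLE="STG_ORDERS"
SQL_FILE="/tmp/pre_session.sql"

# Generate the SQL the pre-session command would hand to sqlplus:
# clear the staging table before the session loads it.
cat > "$SQL_FILE" <<EOF
TRUNCATE TABLE ${STAGE_TABLE};
EXIT;
EOF

# In the actual environment this would then be executed as:
#   sqlplus -s "$DB_USER/$DB_PASS" @"$SQL_FILE"
echo "pre-session SQL written to $SQL_FILE"
```

A post-session script would follow the same shape, typically gathering row counts or archiving the processed source files.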

Environment: Informatica Power Center 9.5.1, Power Exchange 9.5.1, Oracle 11g, Microsoft SQL Server 2008, Microsoft Team Foundation Server 2010, SQL Developer, SQL, IS602 DIS (Data Integration Scheduler).

Confidential, Portland, OR

ATG Ecommerce Re-platform Project:

Responsibilities:

  • Participated in project kickoff, design review, resource allocation and key milestones and deliverables meetings.
  • Involved in the ETL technical design discussions and prepared ETL high level technical design document.
  • Set up daily working sessions to ensure a like-for-like implementation from ATG to the EDW and to understand the business requirements for marketing intelligence.
  • Created data lineage documents for user profiles, orders and offers, from the ATG source through the various channels to the EDW and data at rest.
  • Extracted data from Oracle, SQL Server, Lotus Notes, XML, MS Excel and flat files, then transformed and loaded it into a common Oracle staging area.
  • Developed the extract, transform and load (ETL) process for profiling the data in Teradata from the source systems (relational and files) using Informatica Power Center.
  • Designed Complex mappings, Used Lookup (connected and unconnected), Update strategy and filter transformations for loading historical data.
  • Wrote complex PL/SQL functions, procedures and packages to generate table DDLs, session XMLs, parameter files and load functions to load the target tables.
  • Performance-tuned user queries and frequently executed SQL operations to improve performance.
  • Involved in performance tuning of Informatica code and PLSQL scripts
  • Reviewed multiple design change approaches during the actual development of the project.
  • Reviewed ETL Informatica code, Teradata Database object definitions, UNIX shell scripts and design documentation. Provided feedback after the review.
  • Created tables, views in Teradata, according to the requirements.
  • Used Teradata utilities such as MultiLoad, TPump and FastLoad to load data into the Teradata data warehouse.
  • Fixed invalid mappings and performed unit and integration testing of Informatica sessions, batches and target data.
  • Created reference/master data for profiling using the IDQ Analyst tool. Used the Address Doctor geocoding table to validate addresses, and performed exception handling, reporting and monitoring of the data.
  • Profiled the data and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Performed benchmarking on mapping and session level to increase the performance of the ETL jobs.
  • Performed Unit testing on code and performed data validation.
  • Followed agile methodology throughout the development lifecycle to show project progress with minimum defects in production.
  • Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Set up User Acceptance Test meeting with our Marketing intelligence team (Business Users) and our ATG website Front end team.
  • Developed, designed and tested ETL processes for the enterprise system.
  • Designed and documented JSON requests according to the front-end pages.
  • Closely involved in the deployment plan, periodically notifying the management team of deployment status.
  • Conducted post-production validation of all the production-deployed objects.
  • Provided production warranty support for a month, during which the code was constantly monitored.
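
The Teradata load utilities named above are script-driven: a shell wrapper generates the load script and feeds it to the utility. A sketch of the FastLoad case, where the logon, table, column and file names are hypothetical placeholders rather than the project's actual objects:

```shell
#!/bin/sh
# Illustrative generation of a Teradata FastLoad script; all names
# (tdprod, stg.orders, /data/in/orders.dat) are assumptions.
FL_SCRIPT="/tmp/load_stg_orders.fl"

cat > "$FL_SCRIPT" <<'EOF'
LOGON tdprod/etl_user,password;
BEGIN LOADING stg.orders ERRORFILES stg.orders_e1, stg.orders_e2;
SET RECORD VARTEXT "|";
DEFINE order_id (VARCHAR(10)),
       order_dt (VARCHAR(10))
  FILE = /data/in/orders.dat;
INSERT INTO stg.orders VALUES (:order_id, :order_dt);
END LOADING;
LOGOFF;
EOF

# In practice the script is fed to the utility:  fastload < "$FL_SCRIPT"
echo "FastLoad script written to $FL_SCRIPT"
```

MultiLoad and TPump wrappers look similar, differing mainly in the utility name and the DML blocks the script defines.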

Environment: Informatica Power Center 9.1, Power Exchange 9.1, Java, Informatica Address Doctor, Informatica Data Transformation, Mainframe, Teradata 13.10.03, UNIX and Windows 7, ESP Mainframe Scheduler.

Confidential, St Paul, MN

Senior Informatica Specialist/Lead

Responsibilities:

  • Developed high level technical design specification and low level specifications based on the business requirements.
  • Extensively used Informatica client tools (Source Analyzer, Warehouse Designer, Mapping Designer and Workflow Manager).
  • Used Informatica Designer for developing mappings, using transformations including Aggregator, Update Strategy, Lookup, Expression, Filter, Sequence Generator, Router and Joiner.
  • Created reusable transformations and mapplets and used them in mappings to reduce redundancy in coding.
  • Extensively used Informatica Power Exchange Change Data Capture (CDC) for creation of Data Maps using Mainframe Tables.
  • Responsible for creating the data maps, extracting incremental CDC data from mainframe sources, exporting and updating them to the repository, and importing the required source files into the staging environment using Informatica Power Exchange.
  • Designed, developed and managed Power Center upgrades from v7.x to v8.5; integrated and managed the Power Exchange CDC workload.
  • Coded a number of batch and online programs using COBOL, DB2 and JCL.
  • Designed Complex mappings, Used Lookup (connected and unconnected), Update strategy and filter transformations for loading historical data.
  • Extensively used SQL commands in workflows prior to extracting the data in the ETL tool.
  • Implemented different tasks in workflows, including Session, Command, Decision, Timer, Assignment, Event-Wait, Event-Raise, Control and E-Mail tasks.
  • Used Debugger to test the data flow and fix the mappings.
  • Involved in Performance tuning of the mappings to improve the performance.
  • Performed Unit Testing and prepared unit testing documentation. Developed the Test Cases and Test Procedures.
  • Extensive use of IDQ for data profiling and quality.
  • Built a UNIX script that identifies the Power Center folders, checks the mapping, session and workflow names, builds an XML, then tars all documents based on the given names and deploys them across the environments.
  • Responsible for migration of Target Flat Files across the environments (DEV, IT, UAT, PROD) using Connect Direct.
  • Supported during QA/UAT/PROD deployments.
  • Worked with RM, DBA and DI tool Administrators to migrate code.
  • Scheduled Jobs and box jobs in Autosys and analyzed the Run status of both jobs and box jobs in DB2 Environment.
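
The export-and-package step of that deployment script can be sketched as follows; the folder layout, workflow name and paths are hypothetical stand-ins, not the actual repository contents:

```shell
#!/bin/sh
# Simplified sketch of packaging exported Power Center objects for
# deployment. DEV_SALES and wf_load_orders are assumed names.
FOLDER="${1:-DEV_SALES}"
EXPORT_DIR="/tmp/pc_export/$FOLDER"
mkdir -p "$EXPORT_DIR"

# In the real script, pmrep exports each workflow to XML, e.g.:
#   pmrep objectexport -n wf_load_orders -o workflow -f "$FOLDER" \
#         -u "$EXPORT_DIR/wf_load_orders.xml"
# Stand-in export so the packaging step below can be exercised:
printf '<workflow name="wf_load_orders"/>\n' > "$EXPORT_DIR/wf_load_orders.xml"

# Bundle everything exported for this folder into one deployable tar.
tar -cf "/tmp/${FOLDER}_deploy.tar" -C /tmp/pc_export "$FOLDER"
echo "packaged /tmp/${FOLDER}_deploy.tar"
```

The resulting tar can then be shipped across DEV/IT/UAT/PROD with a file-transfer tool such as Connect:Direct, as described above.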

Environment: Informatica Power Center 8.5.1, Informatica Power Exchange CDC 8.5.1, DB2 Client v8.2.6 and Mainframe, AQT (Advanced Query Tool), Autosys, Toad, Windows XP, UNIX

American Express Co. Inc., Atlanta, GA (Jan 2011 - April 2012)

Senior Informatica Developer

Responsibilities:

  • Interviewed business users and various customer groups, asking detailed questions to gather business requirements, wants and needs for the project, and carefully recorded the requirements in a format understandable to both the business and technical teams.
  • Worked closely with the end users and decision makers to develop the business logic and business rules to be used in CDE and Informatica as well.
  • Used the CDE tool to convert unstructured and semi-structured data into structured data.
  • Created a mapping with parsers, mappers and serializers to convert the data from the source to the required format.
  • Created an Informatica mapping that uses the CDE object through the complex data transformation.
  • Hands-on experience building Informatica repositories, mappings (including re-useable objects such as mapplets), sessions/batches with Informatica Power Center.
  • Created Mappings, Sessions and Workflows which used mapping parameters, session parameters.
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches.
  • Used parameter files in automated jobs and ad-hoc jobs; users supplied these values when running jobs through the web front end.
  • Performed Data cleansing prior to Transformation and Loading for heterogeneous sources.
  • Involved in writing PL/SQL stored procedures and functions used in combination with Informatica's ETL interfaces.
  • Assisted the DBA in identifying performance bottlenecks and tuning them.
  • Actively involved in Performance Tuning, Error handling, Product support.
  • Used Data Analyzer to export the metadata of Data Junction and Mercator mappings to an Excel file.
  • Used a Subversion repository to check out the maps, work on them and check them back in.
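
Parameter files like those mentioned above follow Informatica's section-header format, and ad-hoc runs often generate them on the fly. A sketch, where the folder, workflow, session and parameter names are purely illustrative:

```shell
#!/bin/sh
# Sketch of generating an Informatica parameter file for an ad-hoc run.
# ETL_FOLDER, wf_daily_load, s_m_load_stage and the $$ parameters are
# hypothetical names, not the project's actual objects.
RUN_DATE="${1:-2010-01-31}"
PARAM_FILE="/tmp/wf_daily_load.param"

# The [folder.WF:workflow.ST:session] header scopes the $$ parameters
# to one session; the heredoc escapes \$\$ so they stay literal.
cat > "$PARAM_FILE" <<EOF
[ETL_FOLDER.WF:wf_daily_load.ST:s_m_load_stage]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/incoming
EOF

echo "parameter file written to $PARAM_FILE"
```

The workflow is then started with this file, typically via pmcmd's parameter-file option.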

Environment: Informatica Power Center 8.1.3, Complex Data Exchange 4.4, DB2 8.0/7.0, SQL, PL/SQL, Mainframe, Power Center Data Analyzer, Subversion, Perl Scripting and Java Scripting.

Confidential, Pittsburgh, PA

Senior Informatica Consultant

Responsibilities:

  • Collected requirements from business users and analyzed them.
  • Designed and built data marts using star schemas. Created logical and physical data models extensively using ERwin 4.5.
  • Designed the dimensional model of the data warehouse and confirmed source data layouts and needs.
  • Developed and documented Data Mappings/Transformations, and Informatica sessions as per the business requirement.
  • Extensively worked in the performance tuning of programs, ETL procedures and processes.
  • Developed complex mappings in Informatica to load data from various sources using transformations such as Custom, Union, Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router. Used the Debugger to test the mappings and fix bugs.
  • Developed Mapplets using corresponding Source, Targets and Transformations.
  • Extracted data from multiple operational sources, loading the staging area, data warehouse and data marts using CDC/SCD (Type 1/Type 2/Type 3) loads.
  • Executed sessions and sequential and concurrent batches for proper execution of mappings, and sent e-mail using Server Manager.
  • Used session partitions, dynamic cache memory and index cache to improve the performance of the Informatica server.
  • Designed Source Qualifier transformations for file and relational systems and Normalizer transformations. Assisted in migrating the repository.
  • Created database triggers for Data Security.
  • Wrote SQL, PL/SQL codes, stored procedures and packages.
  • Error checking and testing of the ETL procedures and programs using Informatica session log.
  • Developed Unix Shell scripts to extract and convert data from different legacy systems in databases.
  • Used Autosys for scheduling the jobs.
  • Designed and developed Oracle PL/SQL procedures and wrote SQL and PL/SQL scripts for extracting data to the target system.
  • Designed database for Client Server based applications.
  • Planned and implemented backup strategies for the database.
  • Handled complex data loading from various sources into the Oracle database using SQL*Loader.
  • Performance tuning of applications to improve data access and response time. Tuning Memory Structures and disk striping for I/O Load balancing to improve database response time.
  • Developed and reviewed PL/SQL codes including triggers, cursors, procedures and functions and assisted developers in tuning SQL statements.
  • Trained developers in SQL/PL-SQL, database concepts, SQL statement tuning, DBA areas and UNIX-related areas.
  • Oracle DBA experience includes planning, creation, performance tuning and reorganization of databases, and monitoring tablespaces, indexes, extents and performance.
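
SQL*Loader jobs like the ones described above are driven by a control file. A sketch of generating one from a shell wrapper, with a hypothetical staging table, column list and data file standing in for the real ones:

```shell
#!/bin/sh
# Illustrative SQL*Loader control file; stg_customers, its columns and
# /data/in/customers.dat are assumed names, not the project's objects.
CTL="/tmp/stg_customers.ctl"

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/data/in/customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id,
 customer_name,
 created_dt DATE 'YYYY-MM-DD')
EOF

# In practice the load is then run as:
#   sqlldr userid=etl_user/password control="$CTL" log=/tmp/stg_customers.log
echo "control file written to $CTL"
```

APPEND keeps existing staging rows; TRUNCATE or REPLACE would be used instead when the staging table is cleared on each load.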

Environment: Oracle 7.3, 8i, UNIX.
