Sr. ETL Informatica Developer Resume
Cedar Rapids, IA
SUMMARY:
- 8+ years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica Power Center and Power Exchange, including Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager & Workflow Monitor, and SQL Developer.
- Strong experience in creating ETL processes using Informatica 10.2/9.6.0/9.5.0/9.1.0/8.6/8.5/8.1/7.1.4/7.1.2: Informatica Designer (Joiner, Expression, Lookup, Filter, Stored Procedure transformations, etc.), Informatica Workflow Manager (sessions, sources, targets, transformations, and direct/indirect sources), Informatica Workflow Monitor, Informatica Administration Console, and Repository Manager.
- Strong skills in Oracle server-side programming, including writing simple and complex SQL, PL/SQL, functions, procedures, packages, and creation of Oracle objects - tables, views, materialized views, triggers, partitioned tables, indexes, synonyms, user-defined data types, nested tables, collections, pipelined functions, and XML DB functions.
- Experienced in creating entity-relationship and dimensional data models using the Kimball methodology, i.e., Star Schema and Snowflake Schema.
- Experience in integration of various data sources with Multiple Relational Databases like DB2, Oracle, SQL Server and worked on integrating data from flat files like fixed width and delimited.
- Expert in using tools like MS SQL Profiler, Index Tuning Wizard, and Windows Performance Monitor for monitoring and tuning SQL Server performance.
- Have extensively worked in developing ETL for supporting Data Extraction, transformations and loading using Informatica Power Center.
- Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
- Hands-on experience in data modeling (Erwin), data analysis, data integration, data mapping, and ETL/ELT processes, and in applying dimensional data modeling (star schema) concepts in data warehouses.
- Used Agile methodology for the SDLC, with scrum meetings for creative and productive work.
- Expertise in full life cycle of ETL (Extraction, Transformation and Loading) using Informatica Power Center (Repository Manager, Server Manager, Mapping Designer, Workflow Manager, Workflow monitor).
- Extensively created mapplets, common functions, reusable transformations, look-ups for better usability.
- Well versed in OLTP data modeling and data warehousing concepts. Analyzed COBOL code for field-level analysis.
- Extensively worked on Change Data Capture/Incremental loading of SCD Type I/II. Implemented performance tuning techniques at application, database and system levels.
- Experience in UNIX shell programming.
- Developed strategies for incremental data extraction as well as data migration loads into Teradata.
- Used Apache Camel with URIs to work directly with any kind of transport or messaging model, such as ActiveMQ and RabbitMQ.
- Solid understanding of Relational (ROLAP) and Multidimensional (MOLAP) modeling, broad understanding of data warehousing concepts, star and snowflake schema database design methodologies, and metadata management.
- Experienced with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, and SQL Assistant.
- Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Links, Snapshots, Roles, Privileges, Tables, Constraints, Views, Indexes, Sequences, and Synonyms, as well as Dynamic SQL and SQL*Loader in distributed environments.
- Expertise building Data Integration APIs to communicate data between various data sources like SQL Server, Oracle, Flat Files, and XML files.
- Expertise in working with different data sources like flat files, XML, Teradata, DB2, and Oracle.
- Experience in SQL-tuning using Hints, and Materialized Views.
- Excellent analytical and logical programming skills with a good understanding at the conceptual level; excellent presentation and interpersonal skills, with a strong desire to achieve specified goals.
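The Change Data Capture / SCD Type II work mentioned above can be illustrated with a minimal sketch. This is a hypothetical Python stand-in for logic that was actually built with Informatica Update Strategy and Lookup transformations; the dimension layout (key, value, effective dates, current flag) is an assumption for illustration:

```python
from datetime import date

def scd2_merge(dimension, incoming, load_date):
    """Apply SCD Type II logic: expire changed rows, insert new versions.

    dimension: list of dicts with keys 'key', 'value', 'eff_from', 'eff_to', 'current'
    incoming:  list of dicts with keys 'key', 'value' (the CDC delta)
    """
    current_by_key = {row["key"]: row for row in dimension if row["current"]}
    for rec in incoming:
        existing = current_by_key.get(rec["key"])
        if existing is None:
            # New business key: insert as the current row
            dimension.append({"key": rec["key"], "value": rec["value"],
                              "eff_from": load_date, "eff_to": None, "current": True})
        elif existing["value"] != rec["value"]:
            # Changed attribute: expire the old row, insert a new version
            existing["eff_to"] = load_date
            existing["current"] = False
            dimension.append({"key": rec["key"], "value": rec["value"],
                              "eff_from": load_date, "eff_to": None, "current": True})
        # Unchanged rows require no action under Type II
    return dimension
```

In the Informatica implementation, the lookup against the current dimension row and the insert/expire branching above map to a Lookup transformation feeding an Update Strategy.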
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 10.2/9.6.0/9.5.0/9.1.0/8.6/8.5/8.1/7.1, Informatica Power Exchange, Informatica Data Quality (IDQ)
Reporting Tools: Oracle BI Business Intelligence (OBIEE 10.1.3.x, Siebel Analytics 7.x.), SSRS, SSIS, Business Objects, Cognos 8.4 (Awareness).
Dimensional Data modeling: Star & Snowflake modeling, Erwin, Putty.
Scheduling Tool: OBIEE DAC, Autosys, Tidal, JIRA, Puppet, Chef
Operating System: UNIX, WINDOWS & LINUX
Database: MS SQL Server 2012/2008/2005/2000, Oracle 11g/10g/9i, Netezza, DB2, Teradata
Database tools: SQL PLUS, Toad, SQL Developer, SQL Assistant
Languages: Basic, PL/SQL, SQL, SQL*Loader, UNIX shell programming, C, C++.
PROFESSIONAL EXPERIENCE:
Confidential, Cedar Rapids, IA
Sr. ETL Informatica Developer
Responsibilities:
- Actively involved in interacting with business users to gather user requirements and perform business analysis.
- Translated requirements into business rules & made recommendations for innovative IT solutions.
- Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into star schema and snowflake schema structures.
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Imported source/target tables from the respective databases and created reusable transformations and mappings using the Designer tool set of Informatica.
- Worked with Power Center Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
- Used PL/SQL Developer Tool to do the backend development. Wrote complex Oracle SQL using joins, sub queries and correlated sub queries.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups Using Connected, Unconnected, Expression, Aggregator, Update strategy & stored procedure transformation.
- Developed PL/SQL Packages, Procedures and Functions in accordance with business requirements.
- Implemented Star Schema, snowflake methodologies in enhancing data warehouse.
- Worked on Informatica PowerCenter tool - Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.
- Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
- Maintained Development, Test and Production Mappings, migration using Repository Manager. Involved in enhancements and maintenance activities of the data warehouse.
- Created Workflows containing command, email, session, decision and a wide variety of tasks. Developed parameter files for passing values to the mappings for each type of client.
- Scheduled batch and sessions using Informatica scheduler and also wrote shell scripts for job scheduling.
- Understood the entire functionality and major algorithms of the project and adhered to the company's testing process.
- Review SQL queries, Create Data Mapping Documents and work with Data Modeling team for analysis and documentation.
- Responsible for checking data from the source system for nullable and non-nullable fields.
- Responsible for verifying data types within the table and then for the same field across tables in the database.
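The nullable-field checks described above can be sketched as a simple validation pass. This is a hypothetical Python illustration (the column names are made up); in practice these checks ran as SQL against the source system:

```python
def check_non_nullable(rows, non_nullable_cols):
    """Return (row_index, column) pairs where a required field is NULL or empty.

    rows: list of dicts representing the source extract
    non_nullable_cols: columns the target schema declares NOT NULL
    """
    violations = []
    for i, row in enumerate(rows):
        for col in non_nullable_cols:
            value = row.get(col)
            if value is None or value == "":
                # Record the offending row and column for the rejection report
                violations.append((i, col))
    return violations
```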
Environment: Informatica Power Center 10.2, PL/SQL, SQL, Microsoft SQL Server 2013/2012, DB2, Teradata 12, Oracle 11g/10g, Shell Scripting, Python, Dynamic SQL, Oracle SQL*Loader, UNIX, OBIEE, Windows XP
Confidential, Milwaukee, WI
Sr. ETL Informatica Developer
Responsibilities:
- Created technical design specification documents for Extraction, Transformation and Loading Based on the business requirements.
- Extracted the provider records based on the given requirements and sent error messages for the records excluded from the state file.
- Created ETL design documents and technical specifications for the ETL team to understand the Star & Snowflake schema designs related to fact & dimension tables.
- Wrote SQL overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.
- Involved in the development of PL/SQL stored procedures, functions and packages to process business data in OLTP system.
- Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank transformations.
- Developed data conversion, quality, and cleansing rules and executed data cleansing activities such as data consolidation and standardization for the unstructured flat file data.
- Extensively used SQL Scripts and worked in Windows Environment.
- Created a new table that holds process details and auto-generates a Batch ID to populate the data in the extract and exclusion tables.
- Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
- Developed PL/SQL Scripts to write data into flat files and then loaded data from flat files to custom tables.
- Created data breakpoints and error breakpoints for debugging the mappings. Involved in analyzing/building Teradata EDW using Teradata ETL utilities and Informatica.
- Exported, imported the mappings, sessions, worklets, and workflows from development to Test Repository and promoted to Production.
- Used Session parameters, Mapping variables, parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
- Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
- Extracted the data from various Flat files and Loaded in Data warehouse Environment and written Unix Shell scripts to move the files across the Servers.
- Monitored scheduled, running, completed, and failed sessions using the Workflow Monitor, and debugged mappings for failed sessions.
- Involved in creating queues and topics with ActiveMQ; created messages and sent them to queues.
- Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
- Worked with re-usable sessions, decision task, control task and Email tasks for on success, on failure mails.
- Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
- Designed and implemented Tables, Functions, Stored Procedures and Triggers in SQL Server 2014 and wrote the SQL queries, Stored Procedures and views.
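The parameter-file work noted above follows the standard Informatica PowerCenter format: a `[Folder.WF:workflow_name]` section header followed by `$$NAME=value` lines for mapping parameters and variables. A minimal sketch of generating such a file (folder, workflow, and parameter names here are hypothetical):

```python
def render_param_file(folder, workflow, params):
    """Render the text of an Informatica PowerCenter workflow parameter file.

    Section header: [Folder.WF:workflow_name]; each mapping
    parameter/variable appears as $$NAME=value on its own line.
    """
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"
```

Generating these files from a script is what allowed flexible runs of the same workflow for different load dates or clients without editing the sessions.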
Environment: Informatica Power Center 10/9.6, EBS, Informatica IDQ, Microsoft SQL Server 2012/2016, JIRA, DB2, Teradata 12, ActiveMQ, Oracle 11g/10g, PL/SQL, Perl Scripting, Autosys, Shell Scripting, Python, Dynamic SQL, Oracle SQL*Loader, UNIX, OBIEE, Windows XP.
Confidential, Phoenix, AZ
ETL Developer
Responsibilities:
- Extensively developed various mappings that incorporated business logic, which performed Extraction, Transformation and Loading of source data into OLAP schema.
- Successfully designed and implemented a data structure with multiple facts/dimension tables that held the source data for building new reports with SSRS.
- Extensively used transformations like Source Qualifier, Expression, Lookup, Update Strategy, Aggregator, Stored Procedure, Filter, Router, Joiner etc. Implemented Lookup transformation to update already existing target tables.
- Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of star schema and snowflake schema.
- Extensively used reusable objects like mapplets, sessions and transformations in the Mappings.
- Developed sequential, concurrent sessions and validated them. Scheduled these sessions using workflow manager.
- Tuned target, source, transformation, mapping, and session to improve session performance.
- Expertise in understanding the reporting requirements.
- Responsible for improving SSRS performance, as well as testing and debugging stored procedures.
- Developed stored procedures and tested the application with Toad. Developed PL/SQL procedures and functions to build business rules which are helpful to extract data from source and load data to target.
- Coordinated with the Oracle DBA and UNIX admins on Oracle upgrades and server-related issues, respectively.
- Utilized Informatica Data Quality (IDQ) for data profiling and matching/removing duplicate data, fixing the bad data, and fixing NULL values.
- Extensively used debugger to find out errors in mappings and later fixed them. Expertise in running and controlling workflows using the pmcmd.
- Fixed or reduced SSIS errors by configuring packages that utilized error logging and event handling.
- Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
- Proactively evaluated the quality and integrity of data by unit test and system test for Informatica mappings as per the business needs.
- Managed open defects and brought them to closure by working closely with Developers during SIT, UAT and Performance testing.
- Involved in converting COBOL code into PL/SQL.
- Writing different test cases for business rules validations during UAT phase and logging the defects in Quality Center.
- Extensively used change data capture concept in Informatica as well as in the Oracle Database to capture the changes to the Datamart.
- Knowledge on implementing hierarchies, relationships types, packages and profiles for hierarchy management in MDM Hub implementation.
- Recommended and implemented best practices for Quality Center implementation and usage.
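The IDQ profiling and duplicate-matching work above can be illustrated with a simplified cleansing pass. This is a hypothetical Python sketch of the idea (fill NULLs from defaults, drop rows that match on normalized key fields); the real implementation used IDQ match and standardization rules:

```python
def dedupe_records(records, key_fields, defaults):
    """Simplified data-cleansing pass in the spirit of IDQ matching:
    fill NULLs from per-column defaults, then drop duplicates that
    share the same normalized key fields (case/whitespace-insensitive).
    """
    seen = set()
    cleaned = []
    for rec in records:
        # Replace NULLs with the configured default for that column
        rec = {k: (defaults.get(k) if v is None else v) for k, v in rec.items()}
        # Normalize the match key so "John Smith" and " john smith " collide
        key = tuple(str(rec[f]).strip().lower() for f in key_fields)
        if key not in seen:
            seen.add(key)
            cleaned.append(rec)
    return cleaned
```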
Environment: Informatica Power Center 9.1, Windows-XP, Informatica Data Quality (IDQ), Power Designer, Power Exchange, Work Flows, ETL, Oracle 11g, Teradata, XML, SSRS, SSIS, Web-Services, MS SQL Server 2008, TOAD 9.6.1, Unix Shell Scripts, Autosys.
Confidential, Charlotte, NC
ETL Developer
Responsibilities:
- Responsible for requirement gathering from various groups. Followed the Iterative Waterfall model for the Software Development Life Cycle (SDLC) process.
- Designed and developed Informatica mapping codes to build business rules to load data. Extensively worked on Informatica Lookup, stored procedure and update transformations to implement complex rules and business logic.
- Analyzed and created facts and dimension tables.
- Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.
- Developed the functionality at the database level PL/SQL using Toad tool as well at the Unix OS level using shell scripting.
- Experience with dimensional modelling using star schema and snowflake models.
- Experienced in identifying and documenting data integration issues and challenges such as duplicate data, non-conformed data, and unclean data.
- Imported Source/Target Tables from the respective databases and created Reusable Transformations (Joiner, Routers, Lookups, Rank, Filter, Expression and Aggregator), Mapplets and Mappings using Designer module of Informatica.
- Used Informatica power center for (ETL) extraction, transformation and loading data from heterogeneous source systems.
- Participated in PL/SQL and Shell scripts for scheduling periodic load processes.
- Worked extensively on SQL coding to check the data quality coming from the respective parties.
- Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and databases.
- Applied performance tuning techniques for cubes to reduce calculate time and partitioned cubes.
Environment: Informatica Power Center 8.1, Oracle 9i, SQL, SQL Server 2008, SQL Server Management Studio, Bitbucket, PL/SQL, SQL*Plus, Windows XP, Unix Shell Scripting.