Sr. ETL Informatica Technical Lead Resume
Reston, VA
SUMMARY:
- 8+ years of experience in ETL architecture, analysis, design, development, testing, implementation, maintenance, and support of enterprise-level Data Integration, Enterprise Data Warehouse (EDW), and Business Intelligence (BI) solutions using Operational Data Store (ODS), Data Warehouse (DW)/Data Mart (DM), ETL, OLAP, ROLAP, and Client/Server and Web applications on Windows and UNIX platforms.
- Implemented end-to-end tasks effectively throughout the project lifecycle through delivery to the customer.
- Experience with federal and state government, financial, insurance, and product-based companies.
- Extensively worked on the Repository Server Administration Console and Informatica Power Center transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Java, Web Service Consumer, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union, and XML Source Qualifier.
- Data Processing experience in designing and implementing Data Mart applications, mainly transformation processes using ETL tool Informatica.
- Experience with data extraction, transformation, and loading from different data sources such as Oracle and MS SQL Server into a common analytical data model using Informatica Power Center 7.1/8.1/9.1/9.5.
- Experience in performance tuning of Informatica sources, targets, mappings, transformations, and sessions.
- Excellent team player with very good communication skills and leadership qualities.
- Extensive experience in designing data models for OLTP and OLAP database systems.
- Strong data modeling experience using ER diagrams, dimensional data modeling, Star Schema modeling, and Snowflake modeling using tools like Erwin.
- Designed and developed Data Marts following Star Schema and Snowflake Schema methodologies, using industry-leading data modeling tools like Erwin.
- Highly skilled at configuring OBIEE Metadata Objects including repository, variables, interactive dashboards, and reports.
- Experienced in designing customized interactive dashboards in OBIEE using drill down, guided navigation, prompts, filters, and variables.
- Expert in using OBIEE Answers to create queries, format views, charts, and add user interactivity and dynamic content to enhance the user experience.
- Expertise in data extraction using the Power Exchange tool with COBOL copybook layouts.
- Expertise in OLTP/OLAP system study, analysis, and E-R modeling, developing database schemas such as Star schema and Snowflake schema used in relational, dimensional, and multidimensional modeling.
- Expert in understanding data and designing/implementing enterprise platforms such as Hadoop data lakes and large data warehouses.
- Dimensional data modeling experience, including physical and logical data modeling, using Erwin (4.0/3.5.5).
- Worked with non-relational data sources such as flat files and XML files, and relational sources such as Oracle, SQL Server, AS400, and DB2.
- Business Intelligence experience using Business Objects 6.5, Web Intelligence 2.5/2.6, Cognos 9.x/8.x suite, Cognos Impromptu 9.x/8.x, PowerPlay, Transformer, and Impromptu Web Reports.
- Extensive experience creating and maintaining database objects such as tables, views, materialized views, indexes, constraints, sequences, table partitions, synonyms, and database links.
- Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions.
- Strong experience in Oracle database performance tuning and extensive experience with SQL*Loader imports/exports.
- Experience with performance tuning for Oracle RDBMS using EXPLAIN PLAN and hints (a brief sketch follows this summary).
- Experience with industry standard methodologies like Waterfall, Agile, and Scrum methodology within the Software Development Life Cycle.
- Test automation experience using Cucumber and Gherkin.
- Created data structures (tables and views) and applied referential integrity.
- Implemented Data Cleansing, Transformation Scripts, Stored Procedures/Triggers and necessary Test plans to ensure the successful execution of the data loading processes.
- Experience with Business Intelligence using Cognos 8.x.
- Familiar with UNIX shell scripting, PL/SQL coding, and normalized database creation in relational databases such as Oracle 8i/9i/10g/11g and MS SQL Server 2008, and with optimizing SQL to improve performance.
- Strong understanding of Performance tuning in Informatica and Databases.
- Worked with Oracle and SQL Server Stored Procedures, Triggers, Index, Restore points and experienced in loading data into Data Warehouse/Data Marts using Informatica.
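A minimal sketch of the Oracle tuning workflow mentioned above (EXPLAIN PLAN plus an optimizer hint); the ORDERS table and ORDERS_CUST_IDX index are hypothetical examples:

```sql
-- Generate and inspect the optimizer's plan for a candidate query.
EXPLAIN PLAN FOR
SELECT o.order_id, o.order_total
FROM   orders o
WHERE  o.customer_id = :cust_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- If the plan shows a full table scan where an index lookup is cheaper,
-- a hint can pin the access path while the root cause is investigated.
SELECT /*+ INDEX(o orders_cust_idx) */ o.order_id, o.order_total
FROM   orders o
WHERE  o.customer_id = :cust_id;
```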
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.x/8.x/7.x, Informatica Data Quality 9.x, Power Exchange 9.x, Star Schema, Snowflake Schema, OLAP, OLTP.
Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.0/3.x
Databases: Oracle 8i/9i/10g/11g/12c, SQL Server 2008, DB2, MS Access 2008, Netezza 7.0.x.
Reporting Tools: OBIEE 11.1.1.5, 10.1.3.3/10.1.3.4, Oracle BI Apps, BI Publisher, Cognos 9.x/8.x suite.
GUI: Visual Basic, Macromedia Dreamweaver MX, FrontPage 98/2K, Visio, MS Project, TOAD, Putty, WinSCP, Jenkins.
Programming: SQL, PL/SQL, Cucumber, Gherkin, Ruby, Transact SQL, HTML, DHTML, XML, C, C++, ASP.NET and Shell.
Environment: Windows 2000/XP/2003/2005/2007/2008/2010, UNIX-AIX 5.2/4.3, Linux, Windows NT 4.0, Autosys, CA7.
WORK EXPERIENCE:
Confidential, Reston, VA
Sr. ETL Informatica Technical Lead
Responsibilities:
- Involved in business requirements analysis and prepared the functional requirements document.
- Involved in ETL technical design discussions and prepared the high-level ETL technical design document.
- Analyzed the source-to-target mappings provided by data analysts and prepared functional and technical design documents.
- Mentored junior developers and reviewed the code of mappings they developed.
- Drove POC initiatives to assess the feasibility of traditional and big data reporting tools (Spotfire, Business Objects, Tableau, etc.) against the data lake.
- Developed metadata repository and configured metadata objects in all three layers using Oracle BI Administration tool.
- Built the repository by importing data, defining keys and joins, creating the business model, defining complex joins, mapping columns and sources, creating measures, and developing subject areas.
- Developed various Reports, Interactive Dashboards with drill-down capabilities, with various charts and views, and tables using global and local Filters.
- Developed Reports and Dashboards with different Analytics Views including Pivot Table, Chart, Gauges, Column Selector, and View Selector with global and local Filters using Oracle BI Presentation Services.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
- Data modeling in Erwin, design of target data models for enterprise data warehouse.
- Extracted data from various heterogeneous sources like Oracle, SQL Server, and Flat Files.
- Created complex Informatica mappings using various active and passive transformations such as Filter, Router, Expression, Java (generateRow transformation API), Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator transformations.
- Responsible for best practices such as naming conventions, performance tuning, and error handling.
- Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
- Solid Expertise in using both Connected and Unconnected Lookup transformations.
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
- Developed reusable transformations and reusable mapplets.
- Worked with shortcuts across shared and non-shared folders.
- Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs (see the SCD Type 2 sketch after this list).
- Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
- Responsible for Unit Testing and Integration testing of mappings and workflows.
- Validated, standardized, and cleansed data as part of implementing the business rules.
- Created indexes on the tables for faster retrieval of the data to enhance database performance.
- Performed SQL, PL/SQL, and application tuning using tools such as EXPLAIN PLAN, SQL*TRACE, TKPROF, and AUTOTRACE.
- Data belonging to various members and providers was carried through all phases of development.
- Responsible for Production Support and Issue Resolutions using Session Logs, and Workflow Logs.
- Wrote unit test cases and executed unit test scripts successfully.
- Supported QA/UAT/PROD deployments and bug fixes.
- Adopted a BDD/Cucumber approach that allows non-programming QA engineers to write new automated tests.
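As referenced above, a minimal set-based sketch of the SCD Type 2 pattern; in the project this logic lived in Informatica Lookup/Update Strategy transformations, and the staging and dimension tables below are hypothetical:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_customer d
SET    d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND   (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a fresh current version for new or changed customers.
INSERT INTO dim_customer
  (customer_key, customer_id, address, status,
   effective_start_dt, effective_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```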
Environment: Informatica Power Center 9.6.1, Informatica Data Quality 9.6.1, OBIEE 11.1.1.5, Oracle BI Apps, Erwin 4.0, Oracle 12c, Mainframes, IBM Netezza 7.0.x, XML, SQL, PL/SQL, Unix, Autosys, Cucumber, Aginity Workbench, Toad, Cognos, Flat files, Windows 10.
Confidential, Indianapolis, IN
Sr. ETL Informatica/ OBIEE Developer
Responsibilities:
- Worked on Conversion project from ICES to IEDSS using ETL Informatica tool.
- Worked on Informatica Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
- Developed complex mappings in Informatica to load data from various sources using different transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router transformations.
- Involved in modifying existing Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica source and Target Data.
- Analyzed source systems (SQL Server, XML, and COBOL files) from the State of Indiana vendor; these files were the source to the Informatica interfaces used to load data into the target downstream systems.
- Worked with mapping parameters and variables to load data from different sources into the corresponding partitions of the database table, and worked mostly on constraint-based loading.
- Created stored procedures, sequences, and triggers to insert keys into database tables (see the sequence/trigger sketch after this list).
- Worked with the maps to reconcile the source data through the ODS.
- Functional knowledge of FSSA programs (food stamps, Medicaid, TANF, SNAP, and Medical Assistance for lower-income people) and of eligibility determination using the IEDSS application.
- Involved in low-level design for the scripts of database sequences, constraints, triggers, and stored procedures.
- Knowledge of restore points, backups, and partitioning of database tables.
- Developed batch file to automate the task of executing the different workflows and sessions associated with the mappings using CA7 scheduler.
- Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.
- Created Low level documents for creating maps to load the data from the ODS through the warehouse.
- Integrated the Data Mart with WebLogic Server, creating connection pools and data sources using Oracle and SQL drivers.
- Involved in unit testing and User Acceptance Testing (UAT) to verify that data loaded into the targets, extracted from different source systems, was accurate according to user requirements.
- Prepared and used test data/cases to verify accuracy and completeness of ETL process.
- Used the application to navigate through all screens for eligibility determination and performed smoke and unit testing.
- Used the IBM Rational tool (RTC) to track service requests, build requests, and defects.
- Coordinated with the Architecture, Batch, Interfaces, Testing, Front Office, and DBA teams to set up environments for various runs in the Dev, Sys, UAT, and Prod environments.
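As noted above, a hypothetical sketch of the sequence-plus-trigger pattern used to populate surrogate keys (object names are illustrative):

```sql
CREATE SEQUENCE case_seq START WITH 1 INCREMENT BY 1 NOCACHE;

CREATE OR REPLACE TRIGGER trg_case_pk
BEFORE INSERT ON case_master
FOR EACH ROW
WHEN (NEW.case_id IS NULL)
BEGIN
  -- Assign the next surrogate key only when one was not supplied.
  SELECT case_seq.NEXTVAL INTO :NEW.case_id FROM dual;
END;
/
```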
Environment: Informatica Power Center 9.5.1, OBIEE 10.1.3.4, Java, Oracle 11g, Mainframes, XML, SQL, PL/SQL, CA7 Scheduler, Quality Center, Designer, Toad, Cognos 9.x, Flat files, CVS, Windows 8.
Confidential, Dallas, TX
Sr. ETL Informatica IDQ Developer
Responsibilities:
- Worked with heterogeneous sources including relational sources and flat files.
- Worked with the data modeler to understand the data warehouse architecture and mapping documents.
- Designed mappings implementing complex business logic provided by the data modeler, including dimension and fact tables.
- Designed complex mapping logic to implement SCD1 and SCD2 dimensions, and worked on critical dimensional models that structure and organize data uniformly, with constraints placed within the structure, a core concept of data modeling.
- Worked on IDQ file configuration at user’s machines and resolved the issues.
- Used IDQ for initial data profiling and for removing duplicate data (see the duplicate-check sketch after this list).
- Involved in the designing of Dimensional Model and created Star Schema using E/R studio.
- Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of dimensions and fact tables and Technical Specification Document.
- OLTP and OLAP systems provide source data to the data warehouse, which supports OLAP data analysis.
- Interacted with front-end users to present proofs of concept and gather the team's deliverables.
- Deployed Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and reference data.
- Researched and resolved production issues and data discarded during workflow runs.
- Extensively used Informatica functions LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE, and DATE_COMPARE in transformations.
- Also used user-defined functions, declared once globally and reused across mappings.
- Ensured the integrity, availability, and performance of DB2 database systems by providing technical support and maintenance; maintained database security and disaster recovery procedures; performed troubleshooting and maintenance of multiple databases; and resolved many database issues in an accurate and timely fashion.
- Worked extensively in Informatica Designer and Workflow Manager to create sessions and workflows, monitor the results, and validate them against requirements.
- Designed reference data and data quality rules using IDQ and cleansed data in the Informatica Data Quality 9.1 environment.
- Extensively involved in monitoring jobs to detect and fix unknown bugs and track performance.
- Used Informatica Workflow Manager to create, schedule, and monitor workflows and to send notification messages on process failures.
- Involved in performance tuning of sources, targets, mappings, sessions, and data loads by increasing data cache size, sequence buffer length, and the target-based commit interval.
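As referenced above, a set-based duplicate check of the kind surfaced during IDQ profiling; table and column names are hypothetical, and rows with rn > 1 are the candidate duplicates handed to the cleansing rules:

```sql
SELECT *
FROM  (SELECT t.*,
              ROW_NUMBER() OVER (
                PARTITION BY UPPER(TRIM(first_name)),
                             UPPER(TRIM(last_name)),
                             birth_dt
                ORDER BY last_update_dt DESC) AS rn
       FROM   stg_member t)
WHERE rn > 1;
```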
Environment: Informatica Power Center 9.5, Java, SQL Server 2008, Greenplum, DB2, XML, SQL, PL/SQL, Linux, Oracle 11g, Teradata, Autosys, Quality Center, Designer, Toad, Business Objects, Flat files, Windows 7.
Confidential, San Leandro, CA
Sr. Informatica Developer
Responsibilities:
- Analyzed the source systems data and the current reports at the client side to gather requirements for design inception.
- Extracted and transformed data from high volume data sets of fixed width, delimited and relational sources to load into target systems.
- Developed and maintained critical, complex mappings and transformations involving Normalizer, Aggregator, Expression, Joiner, Filter, Sorter, Sequence Generator, Stored Procedure, connected and unconnected Lookup, Update Strategy, and SQL transformations using Informatica Power Center 9.1.1.
- Performed performance tuning to increase throughput at both the mapping and session levels for large data files by increasing the block size, data cache size, sequence buffer length, and target-based commit interval.
- Developed procurement interfaces between Maximo and SAP using XML, XSLT, and Java, successfully implementing the XML-driven Maximo MXES 5.2.
- Debugged mappings and sessions by creating break points using debugger wizard.
- Redesigned some of the existing mappings in the system to meet new business logic.
- Created mapplets, worklets and other transformations that enable the reusability of code.
- Used parameters and variables extensively in all mappings, sessions, and workflows for easier code modification and maintenance.
- Created pipeline session partitions for concurrent loading of data and to optimize performance when loading target tables.
- Effectively used error-handling logic for data and process errors.
- Extensively used pmcmd commands on command prompt and executed Unix Shell scripts to automate workflows and to populate parameter files.
- Coordinated and led the complete code migration/merge process from Dev and UAT to Production environments.
- Involved in Unit, data and Integration Testing of Informatica source and Target Data.
- Tuned DB2 SQL queries for better performance (see the rewrite sketch after this list).
- Involved in extensive backend testing of various modules and documented the results using Quality Center.
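As referenced above, an illustrative DB2 tuning rewrite with hypothetical tables: replacing a correlated subquery that runs once per row with a single grouped join.

```sql
-- Before: the scalar subquery executes once per policy row.
SELECT p.policy_no,
       (SELECT MAX(t.txn_dt)
        FROM   txn t
        WHERE  t.policy_no = p.policy_no) AS last_txn_dt
FROM   policy p;

-- After: one grouped outer join, which the optimizer handles far
-- better on large tables.
SELECT p.policy_no, MAX(t.txn_dt) AS last_txn_dt
FROM   policy p
LEFT JOIN txn t
       ON t.policy_no = p.policy_no
GROUP BY p.policy_no;
```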
Environment: Informatica Power Center 9.1.1, OBIEE 10.1.3.4, IBM DataPower, SQL Server, DB2 (AS400), Oracle 10g, Unix, Autosys, Quality Center, Designer, Toad, Cognos 8.x, Flat files, CVS.
Confidential, Palo Alto, CA
ETL Informatica Developer
Responsibilities:
- Involved in the existing OLTP system analysis to determine the data flow.
- Interacted frequently with end users and business units to define requirements.
- Created a standalone Java application using ArrayList to read input from MS Excel and update tables in Oracle, automating database configurations.
- Participated in system design and data modeling (logical and physical).
- Involved in data warehousing architecture to provide better access to information for faster, better-informed decision-making.
- Developed customized routines (ETL system) to capture and load data from external data sources (Oracle 8.x, flat files) into the Oracle database.
- Developed Data Mart for Sales and Marketing.
- Extensively used Informatica to load data from Central Data Warehouse.
- Configured Informatica Repository Manager; created folders and managed repository objects.
- Assigned Permissions and Privileges to Objects.
- Based on the data model, segregated, fine-tuned, and transformed the data using Informatica.
- Created Informatica mappings to build business rules to load data.
- Used most of the common transformations, such as Source Qualifier, Aggregator, Lookup, Filter, and Sequence Generator.
- Performed data integration from the GIF (Global Integration Factory) to the ODS (Operational Data Store) and from the ODS to the DM (Data Mart).
- The ETL tool is an integral part of the data warehousing architecture, used to model and create the target warehouse, extract data from multiple data sources, transform the data to make it accessible for business analysis, and load the target data marts.
Environment: Informatica Power Center 8.6.1 (Repository Manager, Designer), Power Exchange 8.6.1, Oracle 10g, Toad, Teradata, Netezza, UNIX-AIX 4.3, UltraEdit 15.10, TextPad 5.4, Business Objects.
Confidential, Cleveland, OH
Informatica Developer
Responsibilities:
- Developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Project source data (legacy systems, flat files, and relational databases) was extracted from all the operational systems and loaded into the data warehouse, which is the union of all the constituent pieces of the business.
- Analyzed source systems (SQL Server and Oracle) from the third-party vendor Andesa; these files were the source to the Informatica interfaces used to load data into the target downstream systems.
- Defined and implemented the required business transformation rules and logic for ETL design and development using Informatica Power Center as the ETL tool.
- Wrote the technical specifications document and the data mapping document.
- Created a design document for data flow process from different source systems to target system.
- Worked on the design and development of the data acquisition process for the data warehouse including the initial load and subsequent refreshes.
- Created reusable transformations and mapplets in Informatica Designer to import into common mappings, reducing mapping complexity.
- Used PL/SQL procedures and functions in mappings to perform complex database-level operations with Oracle 10g (see the sketch after this list).
- Built and automated interfaces with the Informatica ETL tool, PL/SQL, and shell scripting.
- Designed and developed exception handling and data cleansing/standardization procedures.
- Involved in migrating the Mappings, Sessions, Workflows from Test environment to Production environment.
- Created Business Objects queries and reports by analyzing user requirements, and created complex reports by linking data from multiple sources.
- Created documents for the data flow and ETL process using Informatica mappings to support the project once it was implemented in production.
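As referenced above, a hypothetical PL/SQL function of the kind called from a mapping's Stored Procedure transformation to centralize a database-level rule (names are illustrative):

```sql
CREATE OR REPLACE FUNCTION fn_policy_status (p_policy_no IN VARCHAR2)
  RETURN VARCHAR2
IS
  v_status policy_master.status%TYPE;
BEGIN
  SELECT status
  INTO   v_status
  FROM   policy_master
  WHERE  policy_no = p_policy_no;
  RETURN v_status;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN 'UNKNOWN';  -- default for policies not yet in the master table
END fn_policy_status;
/
```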
Environment: Informatica Power Center 8.1, UNIX, EDW, SQL Server 2005, Oracle 10g, TOAD, Cognos 7, Shell Scripting, Windows NT.