Informatica Technical Lead And Developer Resume
Louisiana, LA
SUMMARY
- 8+ years of strong data warehousing experience with extensive SDLC knowledge: data analysis, design, development, implementation, and testing of data extraction, transformation, and loading (ETL) processes using Informatica PowerCenter 9.5/9.1.1/8.6/8.1/7.1.
- Strong documentation and knowledge-sharing skills; conducted data modeling review sessions for different user groups and participated in requirements sessions to assess requirement feasibility.
- Extensive experience in relational and dimensional data modeling, creating logical and physical database designs and ER diagrams using data modeling tools such as Erwin and ER/Studio.
- Experience in BI / Data warehouse projects involving Informatica and Cognos suite of products.
- Experience in working with data warehouses and data marts using Informatica PowerCenter (Designer, Repository Manager, Workflow Manager, and Workflow Monitor).
- Understanding & Working knowledge of Informatica CDC (Change Data Capture).
- Worked as a Technical Team Lead, helping team members resolve issues related to coding, testing, and performance tuning in developing Informatica mappings and workflows.
- Proficient in designing and developing complex standard and reusable mappings and mapplets using various transformation logic, utilizing the Expression, Filter, Joiner, Lookup, Router, Rank, Normalizer, and Aggregator transformations.
- Extensive knowledge of Facets.
- Expertise in unit testing at various levels of the ETL process as well as SQL, UNIX, etc. Good exposure to Informatica MDM, where data cleansing, de-duplication, and address correction were performed.
- Developed strategies for the extraction, transformation, and loading (ETL) mechanism.
- Extensive experience in dimensional data modeling: star schema/snowflake schema modeling, fact and dimension tables, and physical and logical data models.
- Experience in developing OBIEE/Siebel Analytics repositories (.rpd) across all three layers (Physical, Business Model, and Presentation).
- Experience in customizing reports, dashboards, filters and charts in Oracle Answers.
- Extensive work with PL/SQL and Oracle performance tuning.
- Experienced with Informatica PowerExchange (5.x/8.x/9.x) for loading/retrieving data from mainframe systems.
- Familiarity with Master Data Management (MDM) concepts and the Informatica MDM tools: Informatica MDM Hub Console, Hierarchy Manager (HM), and Informatica Data Director (IDD).
- Good experience with configuring Data Warehouse Administration Console (DAC) for creating execution plans, metadata management and monitoring the Informatica ETL Process.
- Configured and registered the custom workflows in DAC.
- Worked with Data Warehouse Administration Console (DAC) to schedule and run full and Incremental ETL loads, to schedule the Informatica jobs, and to customize Execution Plans.
- Experienced with Oracle Applications (Oracle E-Business Suite).
- Data modeling experience using Erwin 3.x.
- Experience with Sybase as the target for the data marts.
- Extensive Unix Shell Scripting knowledge.
- Extensive experience with relational databases (Sybase 15.x, Oracle 8i/9i/10g, Netezza, Teradata V2R6, SQL Server 2000/2005). Performed data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Analyzed Big Data through successful installation, configuration, and administration of Hadoop ecosystem components and architecture.
- Strong understanding of data warehousing principles: fact tables, dimension tables, star schema and snowflake schema modeling, slowly changing dimensions, foreign keys, and referential integrity.
- Maintained outstanding relationships with business analysts and business users to identify information needs per business requirements. Experience working in an onsite-offshore model.
- Willing to relocate: Anywhere
- Authorized to work in the US for any employer
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.1/8.6.1.1/7.1.2/6.2/6.1/5.1, Pentaho (Kettle), PowerMart 5.0/4.7, IDD, Data Warehouse Builder, Ab Initio
Operating Systems: Windows XP/2000/NT/98/95, Windows 2003 Server, UNIX, MS-DOS
RDBMS: Oracle 11g/10g/9i/8i/8.0, MS SQL Server 2005/2000/7.0/6.5
Data Modeling: Erwin 4.1.4/3.5.2
Database Tools: SQL*Loader, TOAD 4.0.3, MS Visio, MS Access
Reporting Tools: Business Objects XI/6.x/5.x, OBIEE 10.x/9.x, PowerPlay and Transformer
Scripting: UNIX Shell Scripting
Web Technologies: HTML, JavaScript
Languages: SQL, PL/SQL, C, C++, Java 1.2, XML
PROFESSIONAL EXPERIENCE
Informatica Technical Lead and Developer
Confidential - Louisiana, LA
Responsibilities:
- Gathered requirements, produced technical designs for ETL interfaces, and reviewed team members' mapping designs; led the team in coding, testing, and performance tuning of Informatica mappings and workflows, as well as production migration and stabilization support.
- Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business processes, dimensions, and measured facts.
- Performed data modeling for existing and new mappings based on business requirements.
- Responsible for contributing to technical and functional design, coding, testing, and implementation.
- Met project schedules; performed performance tuning and troubleshooting for problem determination and resolution.
- Worked with DBAs on any database issues.
- Worked with the offshore team to resolve production issues and developed new code for the data warehouse using Informatica, UNIX, and Oracle.
- Transformed business requirements into comprehensive solutions using appropriate methods, tools, and technologies.
- Coordinated and delivered application development activities with a team of six data integration team members to achieve success in design, development, and deployment.
- Worked on different workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
- Analyzed the FACETS requirements and conducted gap analysis.
- Modified existing mappings for enhancements of new business requirements.
- Migrated the code into QA (Testing) and supported QA team and UAT (User).
- Conducted code reviews of code developed by teammates before moving it into QA.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- Scheduled the tasks using Autosys (a sample job definition is sketched after this list).
- Created generic, reusable shell scripts.
- Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
- Mentored and challenged junior software engineers and other team members while growing their technical, functional, and leadership skills.
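For illustration only, a minimal Autosys JIL definition of the kind used to schedule these batch jobs; the job name, wrapper script path, host, and owner shown are hypothetical.

  /* sample JIL: run a generic workflow wrapper script nightly at 2:00 AM */
  insert_job: DW_DAILY_LOAD   job_type: cmd
  command: /opt/etl/scripts/run_wf.sh wf_daily_load
  machine: etl-prod-01
  owner: etlbatch
  start_times: "02:00"
  days_of_week: all
  std_out_file: /opt/etl/logs/DW_DAILY_LOAD.out
  std_err_file: /opt/etl/logs/DW_DAILY_LOAD.err
  alarm_if_fail: 1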
Environment: Oracle 11g, CA WA Workstation, PuTTY, Informatica PowerCenter 9.1, MS SQL Management, SQL, PL/SQL, Admin Tool, Client/User Interface (UI), Facets, API (Application Program Interface), RMA (Rules Maintenance Application), UNIX (CAHIL Informatica, DEB Oracle RAC), SDLC Deployment (Rally, uDeploy, Jenkins, SonarQube, Git, Fortify)
Sr. ETL Developer
Confidential - Pittsburgh, PA
Responsibilities:
- Coordinated Joint Application Development (JAD) sessions with business analysts and source developers to perform data analysis and gather business requirements.
- Developed technical specifications of the ETL process flow.
- Extensively used ETL to load data from Flat files, PeopleSoft, DB2 and Oracle into Oracle.
- Worked extensively in Informatica Designer to design a robust end-to-end ETL process involving complex transformations (Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure, Transaction Control) for efficient extraction, transformation, and loading of data into staging and then into the data mart (data warehouse), including the complex logic for computing the facts.
- Created mapping variables and parameter files for mappings and sessions so that code can migrate easily across environments and databases (a sample parameter file is sketched after this list).
- Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Involved in configuration of FACETS Subscriber/Member application.
- Used reusable sessions across different levels of workflows.
- Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait and so on.
- Implemented sending of Post-Session Email once data is loaded.
- Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
- Used Debugger to test the mappings and fixed the bugs.
- Tuned performance of mapping and sessions by optimizing source, target bottlenecks and implemented pipeline partitioning.
- Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
- Involved in exporting databases, tablespaces, and tables using Data Pump (10g) as well as traditional export/import (through 9i).
- Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
- Informatica Data Quality (IDQ 8.6.1) was the tool used here for data quality measurement.
- Involved in writing UNIX shell scripts to run and schedule batch jobs.
- Configured and installed the Informatica MDM Hub Server, Cleanse Server, and Resource Kit in the Development, QA, Pre-Prod, and Prod environments.
- Configured Informatica Data Director (IDD) in support of data governance for users, IT managers, and data stewards.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Used UNIX commands and shell scripting to interact with the server, move flat files, and load the files onto the server.
- Implemented error-processing strategy to reprocess the error data and manage notification of error data to corresponding business team.
- Worked with Informatica Workflow Monitor to run and debug workflow components and monitor the resulting executions.
- Involved in unit testing and documentation of the ETL process.
- Monitored daily the mappings that ran the previous day and fixed any issues.
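For illustration, a minimal Informatica session parameter file of the sort described above; the folder, workflow, session, connection, and parameter names are hypothetical.

  [DW_LOADS.WF:wf_member_load.ST:s_m_load_dim_member]
  $DBConnection_Source=ORA_FACETS_DEV
  $DBConnection_Target=ORA_EDW_DEV
  $InputFile_Member=/data/inbound/member_extract.dat
  $$Load_Start_Date=01/01/2015

Keeping one such file per environment (DEV/QA/PROD) lets the same workflow run unchanged; only the parameter file assigned to the session differs.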
Environment: Informatica PowerCenter 9.5, Informatica Multidomain MDM 9.5.0, Informatica IDD, IDQ 8.6.1, Workflow Manager/Monitor, Erwin 4.0, Facets, Oracle 10g/9i, SQL, PL/SQL, TOAD, SQL*Loader, UNIX/LINUX Shell Scripting, PeopleSoft, Microsoft SQL Server 2012, OBIEE.
Sr. ETL Developer
Confidential - Newark, NJ
Responsibilities:
- Coordinated with source system owners, monitored day-to-day ETL progress, and designed and maintained the data warehouse target schema (star schema).
- Designed Informatica mappings by translating the business requirements.
- Used the data integration tool Pentaho to design ETL jobs in the process of building data warehouses and data marts.
- Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
- Developed mappings for customers, Investments and Risk analysis.
- Developed reusable Transformations.
- Responsible for data management, data modeling, and data mapping; for writing story cards and checking whether they were implemented on time; and for mapping EDI X12 data into XML and then into the FACETS system using Extreme.
- Used the Hierarchies tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD.
- Widely used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, and Workflow Manager.
- Used look up, router, filter, joiner, stored procedure, source qualifier, aggregator and update strategy transformations extensively.
- Assisted in adding physical and conceptual data models using Erwin 4.0.
- Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
- Performed extensive bulk loading into the target using Oracle SQL*Loader (a sample control file is sketched after this list).
- Used workflow manager for session management, database connection management and scheduling of jobs.
- Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
- Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
- Designed, developed, implemented, and maintained Informatica PowerCenter and IDQ 8.6.1 applications for the matching and merging process.
- Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
- Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ 8.6.1) were the tools used here: IDE for data profiling over metadata and IDQ 8.6.1 for data quality measurement.
- Good experience in creating cubes using Pentaho Schema Workbench.
- Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Involved in the process design documentation of the DW dimensional upgrades. Installed and documented the Informatica PowerCenter setup on multiple environments.
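As an illustration of the SQL*Loader bulk loads noted above, a minimal control file; the table, column, and file names are hypothetical.

  -- stg_claims.ctl: append a pipe-delimited extract into a staging table
  LOAD DATA
  INFILE '/data/inbound/claims_extract.dat'
  APPEND
  INTO TABLE stg_claims
  FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
  TRAILING NULLCOLS
  ( claim_id,
    member_id,
    claim_amount,
    service_date DATE "YYYY-MM-DD"
  )

A load of this kind is typically invoked from the shell as sqlldr control=stg_claims.ctl direct=true log=stg_claims.log, with credentials supplied at runtime.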
Environment: Informatica PowerCenter 9.5, Workflow Manager/Monitor, Pentaho Schema Workbench, Erwin 4.0, Oracle 10g/9i, IDQ 8.6.1, SQL, PL/SQL, TOAD, SQL*Loader, UNIX/LINUX Shell Scripting, DB2, Facets, Microsoft SQL Server 2012, Tivoli Scheduling Tool, OBIEE.
Sr. ETL Developer
Confidential - Chantilly, VA
Responsibilities:
- Interacted with business analysts, architects, and application developers to analyze the data model and gather the requirements.
- Involved in Code reviews and determining Informatica standards for Mappings/Sessions/ Workflows.
- Created reusable transformations and mapplets and used them in mappings.
- Implemented the slowly changing dimensions (SCD) type1 and type2 to maintain current information and history information in the dimension tables.
- Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
- Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, SQL, Union, Lookup, Joiner, XML Source Qualifier, Unconnected lookup transformations.
- Created shell scripts to kick off Informatica workflows and PL/SQL procedures (a wrapper script sketch appears after this list).
- Created Database Triggers, Stored Procedure, Functions and Packages.
- Extensively used debugger to trace the errors.
- Extensively tested the code and documented the Unit Test Cases.
- Assisted the testers in system and integration testing and preparing the test cases and test plan.
- Took backups of the repository at regular intervals depending on the amount of work done.
- Extensively used UNIX scripts for automation of ETL process.
- Extensively worked on tuning mappings and reducing processing times.
- Worked with the DBA to identify source and target bottlenecks.
- Resolved inconsistent and duplicate data to support strategic goals with Multidomain MDM.
- Analyzed existing requirements and consolidated the reports for effective and efficient reporting.
- Involved in production support for the data warehouse Informatica jobs.
- Involved in bug fixing and debugged issues at the QA level.
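A minimal sketch of such a wrapper script, assuming the pmcmd client is available on the ETL host; the domain, Integration Service, folder, and credential variable names are hypothetical and would follow site standards.

  #!/bin/ksh
  # run_wf.sh -- hypothetical wrapper: start an Informatica workflow and wait for it
  # Usage: run_wf.sh <workflow_name>
  # INFA_USER and INFA_PASS are assumed to be exported in the batch user's profile.
  WF_NAME=$1
  INFA_DOMAIN=Domain_PROD       # assumed domain name
  INFA_SERVICE=IS_PROD          # assumed Integration Service name
  INFA_FOLDER=DW_LOADS          # assumed repository folder

  pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f "$INFA_FOLDER" -wait "$WF_NAME"
  RC=$?

  if [ $RC -ne 0 ]; then
      # notify the support team and surface the failure to the scheduler
      echo "Workflow $WF_NAME failed (rc=$RC)" | mailx -s "ETL failure: $WF_NAME" dw-support@example.com
  fi
  exit $RC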
Environment: Informatica PowerCenter 9.5, Sybase 15.x, CA Scheduling, DB Artisan 9.0, YART 1.0, UNIX/LINUX Shell Scripting, Informatica MDM, Data Profiling, Data Cleansing, SVN, MDM, PuTTY, TOAD, OBIEE.