Sr. Informatica Developer Resume
New Orleans, LA
SUMMARY:
- Overall 8+ years of work experience in ETL (Extraction, Transformation and Loading) of data from various sources into EDW, ODS, and data marts using the data integration tools Informatica 9.x/8.x, SSIS, SSRS, and Pentaho Kettle in the Banking, Insurance, Retail, Telecom, Healthcare, and Medicare/Medicaid domains.
- Hands-on experience with the migration of Informatica from version 9.1 to 9.6.
- Experience implementing the full lifecycle of Enterprise Data Warehouses, Operational Data Stores (ODS), and business data marts with dimensional modeling techniques, including Star and Snowflake schemas.
- Hands-on experience in performance tuning and optimization in Informatica and databases.
- Hands-on experience with reconciliation processes and audit tables.
- Strong Experience in ETL design, ETL Architecture Solution, Development and maintenance.
- Expertise in Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads.
- Certified experience in Informatica performance tuning of targets, sources, sessions, mappings, and transformations.
- Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data Cleansing, and Data Validation.
- Good working knowledge of Informatica Data Quality (IDQ).
- Experience in managing onsite - offshore teams and coordinated test execution across locations.
- Extensively worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.
- Implemented a system for a fast-growing Indian financial market.
- Extensive experience with Slowly Changing Dimensions (Type 1 and Type 2) in different mappings as per requirements.
- Worked with databases such as Oracle, DB2, SQL Server, and Microsoft Access, and integrated data from flat files (fixed-width/delimited), XML files, and COBOL files.
- Experience writing stored procedures, functions, triggers, views, and materialized views, including T-SQL, on Oracle 10g/9i/8i, DB2, and SQL Server 2008/2005/2000.
- Extensively worked on Monitoring and Scheduling of Jobs using UNIX Shell Scripts.
- Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts.
- Experience in ER data modeling, developing fact and dimension tables and logical and physical models.
- Expertise with tools such as Toad, Autosys, and SQL Server Management Studio.
- Involved in unit testing, functional testing, and user acceptance testing in UNIX and Windows environments.
- Completed documentation including detailed work plans and mapping documents.
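The pmcmd and UNIX scheduling work mentioned above can be sketched as a minimal wrapper script. This is an illustrative sketch only; the service, domain, folder, and workflow names are placeholders, not taken from any actual project.

```shell
#!/bin/sh
# Sketch: launch an Informatica workflow from a scheduled shell script via pmcmd.
# All names below are placeholders.
INT_SVC="IS_DEV"          # integration service (placeholder)
DOMAIN="Domain_DEV"       # Informatica domain (placeholder)
INFA_USER="etl_user"      # repository user (placeholder)
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_load"

# Build the command first so it can be logged before it runs; the password is
# read from the INFA_PWD environment variable (-pv) rather than hard-coded.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -u $INFA_USER -pv INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "launching: $CMD"
# In a live environment the next line would run the workflow and propagate a
# non-zero exit code to the scheduler (e.g. Autosys):
# $CMD || exit 1
```

With `-wait`, pmcmd blocks until the workflow finishes, so the scheduler sees a meaningful exit status.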
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.x/8.x, SSIS, SSRS, Pentaho Kettle
Microsoft BI Suite: SSRS, SSAS, Cubes
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling
Database: Oracle, SQL Server, Teradata, Netezza, DB2
DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, Teradata SQL Assistant, SSIS, SSRS, OBIEE, SSAS, DTS Packages, SAS, Visual Studio.
Programming Language: PL/SQL, Shell Scripting
Environment: Windows, Linux
Schedulers: Autosys, Informatica Scheduler, SQL Agent Services
PROFESSIONAL EXPERIENCE:
Confidential, New Orleans, LA
Sr. Informatica Developer
Responsibilities:
- Responsible for gathering project requirements by interacting directly with the client and performing analysis accordingly.
- Helped build the new ETL design.
- Coordinated the work flow between onsite and offshore teams.
- Defined various facts and dimensions in the data mart, including factless facts and aggregate and summary facts.
- Extracted, scrubbed, and transformed data from mainframe files, Oracle, SQL Server, DB2, and Teradata, then loaded it into an Oracle database using Informatica and SSIS.
- Worked on optimizing the ETL procedures in Informatica.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Implementing logical and physical data modeling with STAR and SNOWFLAKE techniques using Erwin in Data warehouse as well as in Data Mart.
- Used Type 1 and Type 2 mappings to update Slowly Changing Dimension Tables.
- Involved in the performance tuning process by identifying and optimizing source, target, mapping, and session bottlenecks.
- Configured incremental aggregation to improve data-load performance, and worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
- Used Informatica Data Quality (IDQ 8.6.1) for initial data profiling, data quality measurement, and matching/removing duplicate data.
- Used the ActiveBatch scheduling tool for scheduling jobs.
- Checked session and error logs to troubleshoot problems, and used the Debugger for complex troubleshooting.
- Negotiated with superiors to acquire the resources necessary to deliver the project on time and within budget, bringing resources onsite when required to meet deadlines.
- Delivered projects in an onsite-offshore model, with direct responsibility for deliverables.
- Developed UNIX Shell scripts for calling the Informatica mappings and running the tasks on a daily basis.
- Created and automated UNIX scripts to run sessions and handle dynamic files at the desired date and time for imports.
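The dynamic-file and parameter-file handling above can be sketched as a script that writes a per-run PowerCenter parameter file so the workflow picks up the current batch date. The folder, workflow, parameter names, and paths are illustrative assumptions, not project values.

```shell
#!/bin/sh
# Sketch: generate a per-run Informatica parameter file before starting the
# workflow. All names and paths are placeholders.
RUN_DATE=$(date '+%Y%m%d')
PARAM_FILE="/tmp/wf_daily_load_${RUN_DATE}.parm"

# PowerCenter parameter files are INI-style: a [Folder.WF:workflow] header
# followed by $$name=value pairs. The backslashes keep the shell from
# expanding the literal $$ prefix.
cat > "$PARAM_FILE" <<EOF
[DW_LOADS.WF:wf_daily_load]
\$\$BATCH_DATE=$RUN_DATE
\$\$SRC_FILE=/data/inbound/claims_${RUN_DATE}.dat
EOF

echo "wrote $PARAM_FILE"
```

The generated file would then be passed to the workflow run (for example via pmcmd's `-paramfile` option), so each batch reads its own source file.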
Environment: Informatica PowerCenter 9.6.1, IDQ, SSIS, SSRS, Team Foundation Server, SharePoint, Oracle 12, TOAD, Erwin 7.0, UNIX, SQL Server 2012, Autosys, Windows.
Confidential, MD
Data Integration Programmer
Responsibilities:
- Worked on establishing the ETL design so that each ETL job corresponds to a data model subject area.
- Responsible for building the system design, the ETL data flow diagram, the data mapping sheet, and the run book.
- Monitored loads and troubleshot any issues that arose.
- Built a dynamic-parameter ETL process for importing data from the mainframe.
- Set up dynamic parameter files for source and target connections, updated per batch.
- Built an AddressDoctor transformation to validate addresses.
- Built in-depth test cases and automated them by writing stored procedures, and performed impact analysis for downstream applications.
- Worked on data profiling and automated the test-case process.
- Created data mapping sheets for subject areas such as Claims, Procedures, Drugs, Provider, Recipient (Members), Finance, and Prior Authorization.
- Completed training on HIPAA compliance policies.
- Built views for validating data mapping sheets.
- Built views in the integration layer.
- Built views for data validation of the source system SAK.
- Wrote functions/procedures for data validation.
- Wrote shell scripts for pre- and post-session commands.
- Set up the reconciliation ETL design for the staging and integration layers.
- Set up Event IDs and Batch IDs for the ETL process.
- Deployed objects using shell scripting and created the DG group in Informatica.
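The object-deployment scripting above can be sketched as a dry-run export step: pulling a workflow out of the source repository as XML so it can be imported into the target environment. Repository, folder, and object names are placeholders, and the exact pmrep options should be verified against the installed PowerCenter version.

```shell
#!/bin/sh
# Sketch: export an Informatica workflow as XML for cross-environment
# deployment. All names are placeholders; the command is only echoed here.
SRC_REPO="REP_DEV"
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_load"
EXPORT_XML="/tmp/${WORKFLOW}.xml"

# pmrep objectexport writes the named object to an XML file; a pmrep connect
# call would precede this in a live deployment, and an objectimport with a
# control file would follow on the target repository.
EXPORT_CMD="pmrep objectexport -n $WORKFLOW -o workflow -f $FOLDER -u $EXPORT_XML"
echo "would run against $SRC_REPO: $EXPORT_CMD"
```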
Environment: Informatica PowerCenter 9.1, Power Exchange 9.1, IDQ, Oracle 11g, SQL Server, MS Visio, ALM, Mainframes, ERP, UNIX, MDM, SharePoint, SSIS, SSRS.
Confidential, Los Angeles, CA
Sr. ETL Developer
Responsibilities:
- Developed complex Informatica power Center Mappings with transformations like Source qualifier, Aggregator, Expression, lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy.
- Created Mapplet, reusable transformations and used them in different mappings.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Implemented Star schema logical, physical dimensional modeling techniques for data warehousing dimensional and fact tables using Erwin tool.
- Implemented Tidal Scheduler for scheduling Informatica workflows.
- Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplet and Transformation objects.
- Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain the complete historical data.
- Created jobs for automation of the Informatica Workflows and for DOS copy and moving of files using Tidal Scheduler
- Worked on Informatica mapping performance issues, evaluated existing logic for tuning opportunities, and created PL/SQL procedures, triggers, and views for better performance.
- Tuned SQL Queries in Source qualifier Transformation for better performance.
- Tuned the Informatica ETL code at the mapping and session levels.
- Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
- Wrote test plans and executed them during unit testing, and supported system, volume, and user testing.
- Provided production support by monitoring the processes running daily, Provided data to the reporting team for their daily, weekly and monthly reports.
- Involved in weekly and bi-monthly team status meetings.
Environment: Informatica Power Center 8.6.1, Power Exchange 8.6.1, Oracle, SQL Server, MySQL, Toad, SQL, PL/SQL (Stored Procedure, Trigger, Packages), Erwin, MS Visio, Windows XP, AIX, UNIX Shell Scripts.
Confidential, Houston, TX
Informatica Developer
Responsibilities:
- Worked with Informatica PowerCenter to move data from multiple sources into a common target area, an ERP system.
- Coordinated with Business system analyst (BSA) to understand and gather the requirements.
- Accordingly, documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
- Responsibilities included designing and developing complex Informatica mappings including Type-II slowly changing dimensions.
- Involved in writing the Technical Requirements Document, Mapping Documents, and Low Level Design Documents.
- Designed and developed mappings using Source Qualifier, Aggregator, Joiner, Lookup, and Sequence Generator, Stored Procedure, Expression, Filter and Rank transformations.
- Used the Mapping Designer module for ETL tasks to load data from various centers, held in different source systems, to build data marts.
- Created Informatica parameter files with workflows and sessions.
- Extensively used SQL overrides at Source Qualifier and Lookup Transformations while extracting data from multiple tables.
- Tuned mappings using different logic to provide maximum efficiency and performance.
- Categorized different alerts and exception types to be built from User’s Data Validation list and built mappings to generate exceptions and alerts accordingly.
- Tested the target data against source system tables.
- Used Command task to move the parameter files to the desired position at the start of the session.
- Worked with pre-session and post-session SQL and stored procedures.
- Worked with Performance tuning of the Mappings, Sessions and SQL Queries Optimization.
- Coordinated with the OBIEE team on reporting data for Answers and BI Publisher reports, and on the dashboard designs.
- Involved in requesting input forms and creating log-in Windows for users to input the feeds.
- Performed Validations and Worked on Quality center tool for defect resolution.
- Performed Informatica Code Migration across environments.
- Documentation of ETL extracts developed using the Informatica extract tool.
Environment: Informatica PowerCenter 8.x, Oracle, SQL Server 2008, Informatica Power Exchange, Mainframe, XML files.
Confidential, OK
Informatica Developer
Responsibilities:
- Worked extensively with data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
- Created and Modified T-SQL stored procedures for data retrieval from MS SQL SERVER database.
- Automated mappings to run using UNIX shell scripts, which included Pre and Post-session jobs and extracted data from Transaction System into Staging Area.
- Extensively used Informatica Power Center to extract data from various sources and load in to staging database.
- Extensively worked with SSIS.
- Built dashboards using SSRS.
- Designed the mappings between sources (external files and databases) to operational staging targets.
- Extensive work experience in the areas of Banking, Finance, Insurance and Manufacturing Industries.
- Involved in data cleansing, mapping transformations and loading activities.
- Tuned broken SSIS and SSRS jobs.
- Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
- Performing ETL & database code migrations across environments.
Environment: SSIS, SSRS, PL/SQL, MS Access, Oracle, Windows.