Sr. ETL-Informatica Developer Resume Profile
Bloomington, IL
PROFESSIONAL SUMMARY:
- Over 8 years of experience in designing and developing data warehouse, data migration, identity resolution, and data quality projects using Informatica products (Informatica Power Center 6.x/7.x/8.x/9.x and Informatica Power Exchange for SAP).
- Extensive programming experience in Oracle, MS SQL Server, MYSQL and DB2.
- Extensive reporting experience in OBIEE.
- Good knowledge of the Hadoop ecosystem, HDFS, and Big Data.
- Knowledge of ecosystem tools such as Hive, Pig, Sqoop, MapReduce, and HBase.
- Strong knowledge of Hadoop and Hive, including Hive's analytical functions.
- Experienced in Design, Development, Implementation and Testing of Oracle Databases and ETL Processes.
- Data warehouse architecture based on the Kimball methodology.
- Structured and iterative data application development life cycle methodologies.
- Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, MySQL, Salesforce, SAP, Task, and flat files.
- Well versed in exposing stored procedures as web services in Informatica for data warehousing projects.
- Back-end database table design and implementation for the data warehouse and related ETL processes.
- Analysis, design, development, and implementation of data warehouse ETL client/server applications.
- Extensive working experience in data migration using Informatica Power Center.
- Good knowledge in data warehouse implementation using tools such as Informatica Power Mart and Power Center, Power Connect, Power Exchange CDC, IDE (Informatica Data Explorer), B2B Data Transformation, and B2B Data Exchange.
- IDQ (Informatica Data Quality) for ETL operations; experience with Informatica RTM (Reference Table Manager), OBIEE for reporting, and Erwin for data modeling.
- Proficient in using Informatica Workflow Manager, Workflow Monitor, Server Manager, and pmcmd (the Informatica command-line utility) to create, schedule, and control workflows, tasks, and sessions.
- Strong in developing logical, physical, and conceptual data models, as well as dimensional models using star schemas for data warehousing projects.
- Experienced in Tuning Informatica Mappings to identify and remove processing bottlenecks.
- Automation and scheduling of UNIX shell scripts and Informatica sessions and batches using Autosys.
- Involved in deployment of Informatica as part of the Release Management of the Application.
- Postproduction support, enhancements and performance tuning.
- Excellent analytical and communication skills and leadership qualities; able to work in a team and communicate effectively at all levels of the development process.
TECHNICAL SKILLS:
Data Warehousing | Informatica Power Center 9.1/8.6/7.2/6.2/5.1, Designer, Mappings, Mapplets, Transformations, Workflow Manager, Workflow Monitor, Power Exchange, B2B Data Transformation, B2B Data Exchange, IDE, IDQ |
OLAP Tools | OBIEE 10.1.3, SSRS |
Dimensional Data Modeling | Erwin r7.2, Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimension Tables. |
Databases | Toad 9.1/8.6, Oracle 10g/9i/8i, SQL, PL/SQL, Teradata, SQL*Plus, MS SQL Server 2008/2005/2000/7.0/6.5, DB2, MS Access 7.0/2000, MySQL |
Programming | SQL, PL/SQL, SQL*Plus 8.0/3.3, C, C++, C# |
Environment | MS-DOS, Windows 95/98/NT/2000, Linux (Red Hat 7.0), UNIX |
Tools | HP Quality Center 9.0, Tidal 5.3.1, Autosys, CA7 scheduling, Pentaho |
Web Services | SAP, Taleo, SFDC, and Task |
Confidential
Role: Sr. ETL-Informatica Developer
Description:
The Confidential project will provide customers the ability to acquire and service all of their State Farm insurance and financial services products across all access points. State Farm data is stored in the IMS Corp database; the data was moved from Corp to the Haplex staging database and the ICP/TP layer (final target database).
Responsibilities:
- Designed the functional specifications covering source, target, current and proposed processes, and the interface process flow diagram.
- Designed the ETL process, load strategy, and requirements specification after gathering the requirements from the end users.
- Maintained release-related documents using configuration management techniques.
- Involved in data design and modeling, system study, design, and development.
- Created mappings using Informatica Designer to build business rules to load data.
- Created Folders and placed all the tested ETL, DB2 and UNIX scripts in the Staging path for DST Production movement.
- Developed stored procedures and triggers in PL/SQL and T-SQL for various data cleansing activities.
- Extensively used ETL and Informatica to load data from DB2 Server, flat files and XML files into the target Oracle 11g database.
- Developed Informatica parameter files to filter the daily data from the source system.
- Prepared the data flow of all the Informatica objects created, to accomplish testing at various levels (unit, performance, and integration).
- Studied Session Log files to find errors in mappings and sessions.
- Created UNIX scripts for handling the ftp of source files, to make them execute in sequence as per the time stamp and to archive the processed files for future reference.
- Created UNIX shell scripts for Automation and parameter file generation.
- Identified and analyzed data discrepancies and data quality issues, and worked to ensure data consistency and integrity. Performed audits on ETLs and validated source data vs. target table loads.
- Worked on bug fixes on existing Informatica Mappings to produce correct output.
- Identified bottlenecks in the source, target, mapping, and loading process, and successfully resolved performance issues across the project.
- Helping the team in fixing the technical issues if any and Tuning of Database queries for better performance.
- Gathered information from subject matter experts (SAs and BAs) across different data stores, such as Auto, Fire, and Health, to create documentation for future reference.
- Provided production support, including ETL executions and root-cause resolution.
- Responsible for unit testing and integration testing of mappings and workflows.
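The parameter files mentioned above can be generated by a small script; a minimal sketch follows. The folder, workflow, session, parameter, and connection names (e.g. `$$LOAD_DATE`, `SRC_DB2`) are illustrative assumptions, not the project's actual configuration, though the `[Folder.WF:workflow.ST:session]` header follows PowerCenter's documented parameter-file layout.

```python
from datetime import date, timedelta

def build_param_file(folder, workflow, session, load_date):
    """Render a PowerCenter-style parameter file that filters daily source data."""
    lines = [
        f"[{folder}.WF:{workflow}.ST:{session}]",
        f"$$LOAD_DATE={load_date.isoformat()}",  # hypothetical mapping parameter
        f"$$PREV_LOAD_DATE={(load_date - timedelta(days=1)).isoformat()}",
        "$DBConnection_SRC=SRC_DB2",             # hypothetical connection names
        "$DBConnection_TGT=TGT_ORA11G",
    ]
    return "\n".join(lines) + "\n"

text = build_param_file("ICP_TP", "wf_daily_load", "s_m_stage_policy", date(2013, 6, 1))
print(text)
```

A scheduler can regenerate this file each night before invoking the workflow, so the mapping's source filter always picks up only the current day's data.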
Environment: Informatica Power Center 9.1, DB2, Oracle 11g, MS SQL Server 2008, PostgreSQL, ER Studio, Mainframe (Rumba), RMS, Control-M, Pentaho (open source), SpringSource, and UNIX.
Confidential
Role: Sr. ETL-Informatica Developer.
This project implements changes to the Issuer Funding System, the FRT, and HEAT-Funding that were identified at the beginning of the current funding process for the 2012 issuer funding cycle. The scope of the new system includes interfaces to MS Dynamics Great Plains (Great Plains), the HEAT correspondence database (HEAT), the PCAOB's mailing house, and a payment system for brokers and dealers to pay invoices electronically.
Responsibilities:
- Designed the ETL process, load strategy, and requirements specification after gathering the requirements from the end users.
- Maintained release-related documents using configuration management techniques.
- Involved in data design and modeling, system study, design, and development by implementing dimensional modeling.
- Designed the functional specifications covering source, target, current and proposed processes, and the interface process flow diagram.
- Created mappings using Informatica Designer to build business rules to load data.
- Extensively worked on data extraction, transformation, and loading of S&P (Standard & Poor's) data from CSV and XML files into a Microsoft SQL Server database.
- Created Folders and placed all the tested ETL, SQL Server and UNIX scripts in the Staging path for production movement.
- Developed Informatica parameter files to filter the daily data from the source system.
- Implemented the Type 2 Slowly Changing Dimension methodology to keep track of historical data.
- Prepared the data flow of all the Informatica objects created, to accomplish testing at various levels (unit, performance, integration, and baseline).
- Studied Session Log files to find errors in mappings and sessions.
- Created UNIX scripts for handling the ftp of source files, to make them execute in sequence as per the time stamp and to archive the processed files for future reference.
- Worked closely with DBA in creating the Tables, indexes, views, Index rebuilds.
- Involved in table partitioning in the database.
- Automated jobs using the built-in Informatica scheduler, running daily with data validations maintained.
- Performed performance tuning to increase throughput at both the mapping and session levels, and optimized T-SQL queries.
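The Type 2 SCD logic referenced above boils down to expiring the current dimension row and inserting a new version whenever a tracked attribute changes. The sketch below illustrates this with in-memory rows; the column names and the conventional open-ended high date are assumptions for illustration, not the project's actual schema.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open" end date for the current row

def apply_scd2(dim_rows, incoming, today):
    """Type 2 SCD: expire the matching current row and append a new version.

    dim_rows: dicts with keys key, city, eff_start, eff_end, current.
    incoming: a dict with the business key and possibly changed attributes.
    """
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["city"] == incoming["city"]:
                return dim_rows  # nothing changed: keep the current version
            row["current"] = False   # expire the old version
            row["eff_end"] = today
            break
    dim_rows.append({                # insert the new current version
        "key": incoming["key"], "city": incoming["city"],
        "eff_start": today, "eff_end": HIGH_DATE, "current": True,
    })
    return dim_rows

dim = [{"key": 1, "city": "Austin", "eff_start": date(2011, 1, 1),
        "eff_end": HIGH_DATE, "current": True}]
dim = apply_scd2(dim, {"key": 1, "city": "Dallas"}, date(2012, 7, 15))
print(len(dim))  # → 2: the expired Austin row plus the new current Dallas row
```

In PowerCenter this same pattern is typically built with a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing rows to update (expire) or insert.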
Environment: Informatica Power Center 8.6 and 9.1, SQL Server 2005 and 2008, flat files, Windows NT, and UNIX.
Confidential
Role: ETL-Informatica Developer.
This project moved the Confidential Services and Professional Services data warehouses to the Adobe EDW and integrated the data sets. After that, I worked on the Adobe HR Analytics project, in which the first step was to aggregate data from a variety of sources (SAP/ECC, SAP/BW, Taleo, and HR FLEX applications) into one place, the Adobe Enterprise Data Warehouse.
Responsibilities:
- Worked with the Informatica design documents covering functional description, scope, and detailed functional requirements.
- Extensively worked on data extraction, transformation, and loading from XML files, large-volume data, and Adobe PDF files to the EDW using B2B Data Transformation and B2B Data Exchange.
- Designed and Created data cleansing and validation scripts using Informatica ETL tool.
- Responsible for creating, importing all the required sources and targets to the shared folder.
- Worked with different sources such as relational databases (Oracle, SQL Server, and MySQL), flat files, Adobe PDFs, XML files, Salesforce, SAP/BW, SAP/ECC, Taleo, and Task.
- Created mappings using Informatica Designer to build business rules to load data.
- Created Folders and placed all the tested ETL, Oracle and UNIX scripts in the Staging path for production movement.
- Created UNIX scripts for handling the ftp of source files, to make them execute in sequence as per the time stamp and to archive the processed files for future reference.
- Involved in table partitioning in the database.
- Scheduled the jobs in Dev and Testing environment using Informatica Scheduler.
- Prepared the data flow of the entire Informatica objects created, to accomplish Integration testing.
- Prepared Unit Test Plans.
- Supported user queries against availability of data from Data Warehouse.
- Performed troubleshooting for data non-uniformity.
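The file-handling scripts described above were UNIX shell scripts that processed FTP'd source files in timestamp order and archived them afterward; the same sequencing logic can be sketched in Python. The directory layout and file names here are illustrative only.

```python
import os
import shutil
import tempfile

def process_in_order_and_archive(inbox, archive):
    """Process files oldest-first (by modification time), then move each to archive."""
    os.makedirs(archive, exist_ok=True)
    files = sorted(
        (os.path.join(inbox, f) for f in os.listdir(inbox)),
        key=os.path.getmtime,  # sequence files by timestamp, as the shell scripts did
    )
    processed = []
    for path in files:
        processed.append(os.path.basename(path))  # stand-in for the real load step
        shutil.move(path, os.path.join(archive, os.path.basename(path)))
    return processed

# demo: two files with explicit timestamps in throwaway directories
inbox, archive = tempfile.mkdtemp(), tempfile.mkdtemp()
for i, name in enumerate(["batch_002.dat", "batch_001.dat"]):
    path = os.path.join(inbox, name)
    with open(path, "w") as fh:
        fh.write("x")
    os.utime(path, (1000 + i, 1000 + i))  # batch_002 is made OLDER than batch_001
order = process_in_order_and_archive(inbox, archive)
print(order)  # → ['batch_002.dat', 'batch_001.dat']: timestamp order, not name order
```

Moving processed files out of the inbox makes reruns idempotent, which is the main reason for the archive step.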
Environment: Informatica Power Center 8.6, Informatica B2B Data Exchange and Transformation, Oracle 10g/11g, SQL Server, MySQL, Toad 9.1, Tidal 5.3.1, flat files, Windows NT, UNIX, Salesforce, SAP/BW, SAP/ECC, Taleo, and Task.
Confidential
Role: ETL-Informatica Developer.
Responsibilities:
- Designed the algorithm to implement Type 2 slowly changing dimensions.
- Extensively worked on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, and MultiLoad.
- Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target data warehouse.
- Performed error handling and performance tuning of Teradata queries and utilities.
- Performed data reconciliation across various source systems and Teradata.
- Configured sessions, setup workflow and tasks to schedule data loads at desired frequency.
- Provided resolution to issues by correcting and validating specification changes.
- Involved in the creation of the RPD.
- Developed dashboards and dashboard pages.
- Created prompts and filters.
- Designed, created, and tested reports in the dashboard, and created ad hoc reports according to client needs.
- Expertise in building the metadata model (.rpd): the Physical layer, Business Model and Mapping layer, and Presentation layer in OBIEE, along with development of dynamic and interactive dashboards.
- Implemented time-series functions (AGO and TODATE) in the BMM layer of the repository.
- Created Dimensional Hierarchies in Logical Layer to achieve Drill down functionality.
- Used Analytics Web Catalog to set up groups, access privileges and query privileges.
- Experience in configuring interactive dashboards with drill-down capabilities using global and local filters, OBIEE security setup (groups, access/query privileges), metadata objects, and Web Catalog objects (dashboards, pages, folders, reports).
- Prepared validation scripts for user acceptance testing.
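Data reconciliation of the kind listed above typically compares row counts, distinct keys, and summed measures between source and target, the same checks a `SELECT COUNT(*), SUM(col)` query would run per table. A minimal sketch using in-memory rows; the key and measure column names are hypothetical.

```python
def reconcile(source_rows, target_rows, key="acct_id", measure="balance"):
    """Compare row count, distinct keys, and a summed measure between two systems.

    Returns {check: (source_value, target_value, match)} for each check.
    """
    def profile(rows):
        return {
            "row_count": len(rows),
            "distinct_keys": len({r[key] for r in rows}),
            "measure_sum": round(sum(r[measure] for r in rows), 2),
        }
    src, tgt = profile(source_rows), profile(target_rows)
    return {k: (src[k], tgt[k], src[k] == tgt[k]) for k in src}

src = [{"acct_id": 1, "balance": 10.0}, {"acct_id": 2, "balance": 5.5}]
tgt = [{"acct_id": 1, "balance": 10.0}, {"acct_id": 2, "balance": 5.0}]
report = reconcile(src, tgt)
print(report["measure_sum"])  # → (15.5, 15.0, False): counts match, sums do not
```

Any `False` flag points at the table to investigate, which narrows a discrepancy down before row-by-row comparison is attempted.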
Environment: Informatica Power Center 7.1, Oracle 9i, SQL Server, Toad, Tidal, Teradata, flat files, Windows NT, UNIX, OBIEE, and HP Quality Center.
Confidential
Role: ETL-Informatica Developer.
Responsibilities:
- Requirement gathering and Business Analysis
- Extensively used ETL and Informatica to load data from MS SQL Server, Excel spreadsheet, flat files and XML files into the target Oracle 9i database.
- Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
- Developed PL/SQL procedures/packages to kick off the SQL Loader control files/procedures to load the data into Oracle 9i.
- Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
- Prepared the data flow of the entire Informatica objects created, to accomplish Integration testing.
- Scheduled the jobs in Dev, and Testing environment using Informatica Scheduler.
- Performed performance tuning to increase throughput at both the mapping and session levels, and optimized SQL queries.
- Created Folders and placed all the tested ETL, Oracle and UNIX scripts in the Staging path for production movement.
- Created UNIX shell scripts and automated ETL processes.
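The Router transformation mentioned above splits one input stream into multiple output groups, evaluating every group's condition for each row and sending unmatched rows to a default group. A simplified Python analogue; the group names and predicates are invented for illustration.

```python
def route(rows, groups):
    """Mimic an Informatica Router: each row goes to every group whose
    predicate matches, and to DEFAULT if no predicate matches."""
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, predicate in groups.items():
            if predicate(row):       # all group conditions are checked per row
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out

rows = [{"id": 1, "amount": 50}, {"id": 2, "amount": 500}, {"id": 3, "amount": -1}]
routed = route(rows, {
    "SMALL": lambda r: 0 <= r["amount"] < 100,  # hypothetical business rules
    "LARGE": lambda r: r["amount"] >= 100,
})
print([r["id"] for r in routed["DEFAULT"]])  # → [3]: negative amount matched no group
```

This is why a Router differs from a chain of Filters: one pass over the data feeds all targets, with the default group catching rejects for error handling.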
Environment: Informatica Power Center 7.x, Oracle 9i, MS SQL Server 2000, UNIX, PL/SQL, and SQL.