- Over 5 years of experience in Information Technology with emphasis on Data Warehouse/Data Mart development, including strategies for extraction, transformation, and loading (ETL) using Informatica PowerCenter from various database sources.
- Experienced with the full Software Development Life Cycle (Planning, Analysis, Design, Deployment, Testing, Integration, and Support).
- Extensively worked on different connectors (AWS S3, Workday, Twitter, Glassdoor) to retrieve data using ICS (Informatica Cloud Services).
- Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, Type 2, Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and physical and logical data modeling.
- Extensive experience writing UNIX shell scripts and automating ETL processes using UNIX shell scripting.
- Strong work experience in the Data Warehouse development life cycle; performed ETL procedures to load data from different sources like SQL Server, Oracle, Mainframe, Teradata, and flat files into data marts and the data warehouse using Informatica PowerCenter - Designer, Workflow Manager, and Workflow Monitor.
- Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
- Created mappings in Mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (Connected and Unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, and more.
- Experienced with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
- Developed Oracle PL/SQL stored procedures, functions, packages, and SQL scripts to facilitate functionality for various modules.
- Extensive knowledge of various Performance Tuning Techniques on Sources, Targets, Mappings and Workflows using Partitions/Parallelization and eliminating Cache Intensive Transformations.
- Strong RDBMS concepts and experience in creating, maintaining, and tuning views, stored procedures, user-defined functions, and system functions using SQL Server and T-SQL.
- Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.
- Hands on experience working in LINUX, UNIX and Windows environments.
- Good knowledge of data quality measurement using IDQ and IDE.
- Strong experience in Dimensional Modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using Erwin and ER-Studio.
- Working experience using Informatica Workflow Manager to create sessions and batches, schedule workflows and worklets, build reusable tasks, and monitor sessions.
- Proficient with Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
- Designed and developed IDQ mappings for address validation/cleansing, doctor master data matching, data conversion, exception handling, and exception data reporting.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Experience in handling initial/full and incremental loads.
- Expertise in scheduling workflows using Windows Scheduler, UNIX, and scheduling tools like Control-M and Autosys.
- Designed, installed, and configured core Informatica/Siperian Hub components such as Informatica Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD, and data modeling.
- Experience in support and knowledge transfer to the production team.
- Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
- Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.
- Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
ETL Tools: Informatica PowerCenter 7.1.4/8.6/9.1.0, Pentaho Data Integration (PDI Kettle) 4.3.0
Scheduling Tools: Data Warehouse Administration Console (DAC) 10.1.3.4.1, Informatica Scheduler, crontab in Linux
Relational Databases: Oracle 10g/11g, MySQL, MS SQL Server, DB2, IDMS
Data Modelers: Sybase Power Designer 16.1, Toad Data Modeler, Erwin
Data Base Tools: Toad, SQL Developer, PL/SQL, SQL, SQL Plus
Other Source Systems / Tools: Oracle EBS, Microsoft Dynamics CRM 4.0, BMC Remedy 7.6
Oracle BI Subject Areas: Oracle Financial Analytics 7.9.6, Human Resource Analytics
Reporting Tools: OBIEE (Oracle Business Intelligence Enterprise Edition), BI Answers
Source Code Versioning Tools: Subversion, Git
ETL Informatica Developer
- Gathered user Requirements and designed Source to Target data load specifications based on business rules.
- Used Informatica PowerCenter 9.0.1 for extraction, transformation, and loading (ETL) of data into the data mart.
- Participated in the review meetings with functional team to signoff the Technical Design document.
- Involved in Design, Analysis, Implementation, Testing and support of ETL processes
- Worked with the Informatica Data Quality 9.6 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.
- Validated HIPAA EDI transactions such as 837 (Health Care Claim or Encounter), 835 (Health Care Claim Payment/Remittance), 270/271 (Eligibility Request/Response), and 834 (Enrollment/Disenrollment in a health plan) by developing mappings.
- Developed IDQ mappings using various transformations like Labeler, Standardization, Case Converter, Match & Address validation Transformation.
- Designed, Developed & Supported Extraction, Transformation & Load Process (ETL) for data migration with Informatica Power Center.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
- Worked extensively with the connected lookup Transformations using dynamic cache.
- Worked with complex mappings having an average of 15 transformations.
- Coded PL/SQL stored procedures and successfully used them in the mappings.
- Coded UNIX scripts to capture data from different relational systems into flat files for use as source files in the ETL process, and to schedule automatic execution of workflows.
- Scheduled jobs using Informatica Scheduler and Jobtrac.
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
- Monitored Workflows and Sessions using Workflow Monitor.
- Performed Unit testing, Integration testing and System testing of Informatica mappings.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning at various levels like mapping level, session level, and database level.
- Provided production support by monitoring the processes running daily.
- Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
- Coordinated with the offshore team and directly interacted with the client for clarifications and resolutions.
- Introduced and created many project related documents for future use/reference.
- Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.
- Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
- Built complex reports using SQL scripts.
- Created complex calculations, various prompts, conditional formatting and conditional blocking etc., accordingly.
- Created complex mappings to load the data mart & monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator transformations.
- Ran the workflows on a daily and weekly basis and monitored them using Workflow Monitor.
Environment: Informatica 9.0.1/8.6.1/9.5, Informatica Data Quality (IDQ) 9.6, Oracle 9i/11g, SQL, PL/SQL, UNIX, Informatica Scheduler, SQL*Loader, SQL Developer, Framework Manager, Transformer, Teradata, TOAD, Windows Server 2008.
Confidential, Madison WI
- Met with business stakeholders and other technical team members to gather and analyze application requirements.
- Worked on Source Analyzer, Target Designer, Mapping and Mapplet Designer, Workflow Manager, and Workflow Monitor.
- Created mappings for initial load in Power Center Designer using the transformations Expression, Router and Source Qualifier.
- Extensively worked on performance tuning of Informatica and IDQ mappings.
- Created Informatica workflows and IDQ mappings for - Batch and Real Time.
- Created complex mappings for full load into target in Power Center Designer using Sorter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Union etc.
- Created Mapplets to reuse all the set of transformations for all mappings.
- Created mappings, mapplets, and sessions for data loads and data cleansing; enhanced existing mappings using Informatica PowerCenter.
- Involved in developing logical and physical data models that capture the current state. Developed and tested all Informatica data mappings, sessions, and workflows, involving several tasks.
- Responsibilities included creating and scheduling sessions.
- Worked on SAS Data management.
- Created various tasks to give various conditions in the workflows.
- Involved in extracting data from Oracle and flat files. Developed and implemented various enhancements to the application through production and new production rollouts.
- Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
- Extensively worked on conformed dimensions for the purpose of incremental loading of the target database.
- Created parameters and variables for the reusable sessions.
- Analyzed the various bottlenecks at source, target, mapping and session level.
- Tuned mappings and SQL scripts for better performance.
- Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, then tuned it by identifying and eliminating bottlenecks for optimum performance.
- Assigned work to onshore and offshore developers and provided technical expertise for the design and execution of ETL projects.
Environment: Informatica 8.6, IDQ 8.6.1, Teradata, Oracle 10g, PL/SQL, DB2, XML, SQL*Plus, MS Excel, UNIX (AIX), UNIX Shell
Confidential, Nashville, TN
- Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
- Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
- Extracted data from an Oracle database, spreadsheets, and CSV files, staged it in a single place, and applied business logic to load it into the central Oracle database.
- Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Migrated code between environments and maintained code backups.
- Integrated various data sources like Oracle, SQL Server, fixed-width and delimited flat files, and DB2.
- Involved in the Unit Testing and Integration testing of the workflows developed.
- Extensively worked with Korn-Shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
- Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.
- Used Address Doctor to validate addresses and performed exception handling, reporting, and monitoring of the system. Created different rules as mapplets, Logical Data Objects (LDO), and workflows; deployed the workflows as an application to run them. Tuned the mappings for better performance.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
- Profiled data on Hadoop to understand the data and identify data quality issues.
- Imported and exported data between relational databases and the Hadoop Distributed File System (HDFS) using Sqoop.
- Developed shell scripts for running batch jobs and scheduling them.
- Handled User Acceptance Testing and System Integration Testing in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented the Unit Test Plan (UTP) for the code.
- Involved in Production Support
Environment: Informatica PowerCenter 9.6, Oracle 11g, SQL Server, PL/SQL, UNIX, WinSCP, Informatica Big Data Edition 9.6.1, Hadoop, HDFS, Hive, Sqoop.
Confidential, Phoenix AZ
- Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables, and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.
- Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.
- Assisted in building the ETL source-to-target specification documents.
- Communicated effectively with business users and stakeholders.
- Worked on SQL overrides for generated SQL queries in Informatica.
- Involved in unit testing to validate data from different data sources.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; worked with partitioned tables and automated the process of dropping and creating partitions in the Oracle database.
- Performed data validation in the target tables using complex SQL to make sure all modules were integrated correctly.
- Performed data conversion/data migration using Informatica PowerCenter.
- Involved in performance tuning for a better data migration process.
- Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
- Created UNIX shell scripts for Informatica pre/post-session operations.
- Automated the jobs using CA7 Scheduler.
- Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
- Created Data Model for the DataMarts.
- Used materialized views to create snapshots of the history of main tables and for reporting purposes.
- Coordinated with users to migrate code from Informatica 8.6 to Informatica 9.5.
- Contacted the Informatica technical support group regarding unknown problems.
- Provided on-call support during weekends.
- Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production in order to meet SLAs.
- Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
- Prepared SQL Queries to validate the data in both source and target databases.
Environment: Informatica 9.5/8.6, Oracle 11g, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, OBIEE, UNIX, flat files
Confidential, Richmond VA
- Understanding the Business Design Documents & creating an overall design based on requirements & Business reviews.
- Designed and reviewed the ETL solutions in Informatica Power Center.
- Analyzed the requirements to identify the necessary tables that need to be populated into the staging area.
- Developed Informatica mappings to load data into various fact tables and dimension tables.
- Created Mappings using Mapping Designer to load data from various sources like Oracle, Flat Files, MS SQL Server and XML.
- Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc. for developing Informatica mappings.
- Prepared ETL standards, naming conventions and wrote ETL flow documentation.
- Importing source and target tables from their respective databases.
- Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
- Used shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Performed Repository Administration tasks (Creating Repositories, Users, Assigning privileges, creating backups and recovery)
- Involved in designing the ETL testing strategies for functional, integration & system testing for Data warehouse implementation.
- Implemented versioning of folders in the Informatica Designer tool.
- Used Parameter files to specify DB Connection parameters for sources.
- Used debugger to test the mapping and fixed the bugs.
- Created mappings with flat files from different ERP systems and used Change Data Capture to reduce the load on SAP BI.
- Prepared unit test cases to meet the business requirements and performed unit testing of mappings.
Environment: Informatica PowerCenter 9.5.1, Teradata 14, Oracle 11g, SQL Server 2012, PL/SQL, T-SQL, Toad, Erwin, Teradata SQL Assistant, SQL Server Management Studio, JIRA, UNIX, Windows 7.
- Analyzed the project plan to gather business requirements.
- Designed and Customized data models for Data Mart supporting data from multiple sources on real time.
- Extracted data from flat files, Oracle, SQL*Plus, and MS SQL Server 2008, and loaded the data into the target database.
- Extensively used Informatica PowerCenter 7.1, an ETL tool, to extract, transform, and load data from remote sources to the data warehouse.
- Created complex mappings using transformations like Filter, Expression, Joiner, Aggregator, Router, and Stored Procedure to populate target tables in an efficient manner.
- Developed complex joins in the mappings to process data from different sources.
- Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors of target data load.
- Developed workflow tasks like reusable Email, Event Wait, Timer, Command, and Decision tasks.
- Performed unit testing of Informatica sessions, batches and the target Data.
- Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
- Effectively utilized shared/ persistent caching techniques and incremental aggregation strategies to improve performance.
- Designed and developed UNIX shell scripts as part of the pre-session and post-session command to automate the process of loading, pulling, renaming and pushing data from and to different servers.
- Designed and developed SQL, PL/SQL and UNIX shell scripts.
Environment: Informatica PowerCenter Designer 7, Oracle 9.x/10g, TOAD, flat files, UNIX, MS SQL Server 2008, SQL, PL/SQL, and SQL*Plus