- Over 9 years of experience in software development and administration, with expertise in extracting data from heterogeneous sources using ETL mappings and scripts in Informatica 9.6.1/8.x/7.1.4.
- Extensive knowledge of end-to-end data flow in Master Data Management (MDM), from sources and back to sources via MDM inbound landing (LND), staging, and BO tables, using Informatica MDM 10.1/9.7.
- Experience working with databases such as Teradata, Oracle, SQL Server, DB2, and MS Access, and writing efficient, complex SQL against large volumes of data.
- Expert in implementing business rules using sources, targets, and transformations such as Source Qualifier, Sequence Generator, Filter, Router, Joiner, Lookup, Expression, Update Strategy, Aggregator, Data Masking, and Web Services Consumer to populate the data.
- Proficient in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching and data quality exception monitoring and handling.
- Developed mappings and mapplets to load data into the data mart using Slowly Changing Dimension (SCD) methodologies (SCD Type 1, Type 2, and Type 3).
- Excellent experience in tuning Informatica Mappings and Sessions for optimum performance.
- Involved in massive data cleansing prior to data staging.
- Good knowledge of applying rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loading confidential data into targets.
- Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
- Experience in UNIX shell scripting and file management in various UNIX environments.
- Worked extensively on web services for automating mappings in MDM and Powercenter by passing parameters through SoapUI.
- Expertise in Data Warehouse Administration Console (DAC) to configure, monitor and schedule ETL.
- Experience in developing the OBIEE Repository (.rpd) across three layers (Physical, Business Model, and Presentation), Time Series objects, and interactive dashboards with drill-down capabilities using global and local filters; security setup (groups, access/query/report privileges); and configuring Analytics metadata objects (Subject Area, Table, Column) and Web Catalog objects (Dashboards, Pages, Folders, Reports).
- Expertise in implementing Oracle BI Apps across all phases: implementing out-of-the-box prebuilt mappings, designing ETL, metadata management using DAC, creating the Oracle-provided data warehouse, OBIEE RPD development, and OBIEE reports and dashboards.
- Good Knowledge in Developing and customizing Reports using OBIEE 10.1.3.4.1/10.1.3.4.0/10.1.3.3/10.1.3.2, Oracle Reports 6i/4.5/2.5 Discoverer 10g/4i/3i, XML Publisher 5.6.3/5.6.2, SOA Gateway.
- Expertise in Oracle BI Publisher, Oracle BI Answers, BI Interactive Dashboards and Oracle BI Server.
- Well versed with using SQL and PL/SQL for BI Publisher and XML Publisher and SOA gateway.
- Knowledge of the Informatica Test Data Management (TDM) data masking tool for data subsetting and data security.
- Exceptionally well organized, with a strong work ethic and willingness to work hard to achieve employer objectives.
- Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.
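As background for the SCD methodologies cited above, a minimal sketch of Type 2 handling (expire the current row, insert a new version). This is an illustrative stand-in for an Informatica SCD mapping; the column names (`key`, `value`, `eff_from`, `eff_to`, `current`) are hypothetical:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply a Slowly Changing Dimension Type 2 update.

    dimension: list of dicts with keys 'key', 'value', 'eff_from',
               'eff_to', 'current' (eff_to=None means the open-ended row).
    incoming:  dict with 'key' and 'value' from the source extract.
    """
    today = today or date.today()
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["value"] == incoming["value"]:
                return dimension            # no change: nothing to do
            row["current"] = False          # expire the old version
            row["eff_to"] = today
            break
    dimension.append({                      # insert the new version
        "key": incoming["key"], "value": incoming["value"],
        "eff_from": today, "eff_to": None, "current": True,
    })
    return dimension
```

A Type 1 mapping would instead overwrite `value` in place; Type 3 would keep the prior value in an extra column.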
Programming Languages: SQL, PL/SQL, C, Java 1.3/1.4/1.5/1.6, HTML, XML
ETL Tools: Informatica 9.6.1/8.x/7.x, Informatica MDM 10.1/9.7, Data Masking
BI Tools: OBIEE 11g, 10.1.3.x, Siebel Analytics 7.x.x., BI Apps 7.9.x.x.
Database: Oracle 11g /10g/9i/8i/7.x, SQL Server 2005/2000, MS Access, Teradata
Scheduling Tools: DAC (Data warehouse Administration Console)
Data Modelling Tools: Erwin, Toad, and Microsoft Visio
Oracle Tools: SQL Developer, TOAD, SQL*Loader, Oracle Developer 2000
Scripting Languages: Java script, UNIX shell script
Operating Systems: Windows Vista, Windows 95/98/2000/2003/XP, Linux, UNIX
Confidential, Midland, TX
ETL Informatica Developer/IDQ/TDM
- Supporting the ETL inbound for heterogeneous sources using CDC (Change Data Capture) mappings and scripts in Informatica Powercenter.
- Copied subset of secured data from Production databases to development and Testing environments of Test Data Management (TDM).
- Built mappings to extract data and load it into ASCII files.
- Involved in Performance Tuning of Informatica ETL mappings and sessions.
- Involved in identifying and resolving potential bottlenecks.
- Involved in Database Designing reviews.
- Worked on building mappings in such a way that scheduling tool invokes Informatica mappings using Web services.
- Involved in meetings to determine data quality rules.
- Worked on supporting all the ETL Inbounds and Outbound of TDM in production environment with SOA gateway.
Environment: Informatica Powercenter 9.6.1 HotFix3, Informatica Data Analyst (IDA), SoapUI 5.2.1, Oracle 11g, PL/SQL, SQL Server 2008, UNIX.
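The TDM practice described in this role, copying masked production subsets into dev/test, can be sketched as deterministic hashing of confidential columns. This is a simplified stand-in for the TDM masking transformation, not its actual algorithm; the field names are hypothetical:

```python
import hashlib

def mask_value(value, salt="dev-copy"):
    """Deterministically mask a confidential value: equal inputs map to
    equal tokens, so joins across masked tables still line up."""
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return "MASK_" + digest[:12]

def mask_rows(rows, confidential_cols):
    """Return a copy of a production subset with sensitive columns masked."""
    return [
        {col: mask_value(val) if col in confidential_cols else val
         for col, val in row.items()}
        for row in rows
    ]
```

Determinism is the key design choice here: because the same SSN always masks to the same token, referential integrity between the subset's tables survives the copy.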
Confidential, Houston, TX
ETL Informatica ETL/IDQ Developer
- Supporting the ETL inbound for heterogeneous sources using CDC (Change Data Capture) mappings and scripts in Informatica Powercenter.
- Copied subset of secured data from Production databases to development and Testing environments using Informatica Test Data Management by applying masking techniques.
- Created test data subsets for cleansing team to test and part of team to create data cleansing rules.
- Involved in discussions with source system users to gather requirements for new inbounds.
- Building custom cleanse functions in IDQ for Informatica Data Analyst (IDA) to take data cleansing to the next level.
- Building the requirement specifications for ETL.
- Built custom IDQ mapplets and imported them into Powercenter to perform data cleansing and raise DQ (data quality) errors.
- Developed DQ mappings in Powercenter using mapplets that are designed in IDQ.
- Built several MDM landing-to-staging mappings in the Informatica MDM Hub.
- Working on building the match and merge rules; working with trust and validation rules to arrive at survivorship among multiple attribute values coming from different sources.
- Working with User Exits to provide additional functionality at various levels of Batch Jobs in HUB.
- Worked on Informatica web services Consumer transformation to fetch the MDM executed batch related data using SIF API WSDL by passing parameters to Powercenter mapping through SoapUI.
- Gained extensive knowledge on SIF APIs in communication between external systems and MDM HUB.
- Worked on automating the Powercenter mappings using Web Services through SoapUI.
- Interacting with users to notify them of needed corrections and get them made, ensuring WELL changes are consistent across systems and thereby preserving data integrity.
- Supporting, maintaining, installing, and customizing Informatica Powercenter 9.x systems and related components.
- Handling day-to-day migration requests within Informatica across the different environments, and from IDQ to Informatica.
- Supporting Informatica application performance monitoring, server restarts, capacity planning, and tuning.
- Coordinating with different teams like DBA, Network and Linux admin team to troubleshoot the various issues.
Environment: Informatica Powercenter 9.6.1 HotFix3, Informatica Siperian MDM 10.1, Informatica Data Quality (IDQ) 9.6.1, Informatica Data Analyst (IDA), SoapUI 5.2.1, Oracle 11g, PL/SQL, SQL Server 2008, Toad, UNIX.
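A custom cleanse function of the kind built in IDQ for this role can be sketched in plain Python: standardize a field and emit a DQ violation instead of passing bad data downstream. The rule (US phone format) and the error code are illustrative assumptions, not the project's actual rules:

```python
import re

def cleanse_phone(raw):
    """Standardize a US phone number to 'NNN-NNN-NNNN'.

    Returns (clean_value, dq_error); dq_error is None when the value
    passes, mirroring a mapplet that routes failures to an error target.
    """
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                 # strip the country code
    if len(digits) != 10:
        return None, f"DQ_PHONE_INVALID: {raw!r}"
    return f"{digits[0:3]}-{digits[3:6]}-{digits[6:10]}", None
```

In a Powercenter mapping the two elements of the returned pair would feed the clean target and the DQ-violation target respectively.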
Confidential, Houston, TX
ETL Informatica ETL/IDQ Developer
- Interacted with Business Analysts and Data Analysts for analysis of business requirements.
- Involved in documenting the business requirements and the technical requirements based on the understanding of the existing systems.
- Involved in preparing mapping documents outlining source systems, target systems and transformation logics used.
- Applied Change Data Capture (CDC) methodology using Informatica Powercenter Designer to extract data from the source databases (SQL Server, Oracle, Teradata, Hadoop (HDFS)) and flat files into a staging area in an Oracle database.
- Applied an incremental loading technique in Informatica mappings to load data from the staging area to the ODS (Operational Data Store) in an Oracle database.
- Developed mappings and mapplets to load data into the data mart in an Oracle database using Slowly Changing Dimension (SCD) methodologies (SCD Types 1, 2, and 3).
- Developed Mappings for Error Handling to fix any errors captured in Datamart after they are fixed in the source system.
- Worked extensively on reusable and non-reusable transformations in simple and complex mappings. Examples of transformations used in Designer: Source Qualifier, Expression, Filter, Normalizer, Aggregator, Lookup, Update Strategy, Sequence Generator, Joiner, Router, Rank, Data Masking, and Web Services Consumer.
- Developed Informatica mappings by usage of SQL overrides in Lookups (connected and unconnected), source filter in Source Qualifier, overrides in Update Strategy and data flow management into multiple targets using Router transformations.
- Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
- Developed FastLoad and MLoad control-file scripts, and developed BTEQ scripts to process the data into the landing area.
- Built complex Informatica mappings/ mapplets using Mapping Variables and Parameter Files.
- Built mappings to mask the production data using Data masking transformation which had confidential data related to agreements made by the client.
- Used Informatica Workflow Manager for creating reusable and non-reusable sessions and worklets. Examples of tasks used in Workflow Manager: Session, Event Wait, Event Raise, Email, Decision, and Worklets.
- Responsible for scheduling and launching workflows to run at specified times and recovering the failed sessions.
- Responsible for identifying the missed records in different stages from source to target and resolving the issue.
- Involved in monitoring Informatica loads using Informatica Workflow Monitor in Development, Integration, and UAT- User Acceptance Testing and Production environments.
- Responsible for data testing, validation and documentation at each level.
- Developed UNIX shell scripts as part of the ETL process for loading.
- Created test data subsets for the testing team, and was part of building the test cases.
- Supporting the ETL Inbound for the Legacy MDM solution built.
- Debugged existing MDM outbound views and changed it according to the requirement.
- Worked on the Informatica Data Quality toolset, gaining proficiency in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching, and data quality exception monitoring and handling.
- Worked on building mappings to populate data into the MDM landing tables, processing errors as DQ violations and re-processing them.
- Worked on adding new DQ violation checks to the existing mappings by building IDQ mapplets for data cleansing.
- Debugged several DQ related issues and fixed them.
- Built mappings that fetch data from MDM HUB outbound JMS message Queues where published XML messages are processed and feed back to the sources.
Environment: Informatica 9.5.1, Informatica Data Quality (IDQ) 9.5.1, Informatica MDM 9.7, Toad, WinSCP, PuTTY, Oracle 11g, UNIX, SQL
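The CDC/incremental-loading pattern used in this role can be sketched with a persisted watermark: each run extracts only rows changed since the last successful load. This mirrors the mapping-variable/parameter-file technique in Informatica incremental loads, but the function and column names here are hypothetical:

```python
from datetime import datetime

def incremental_extract(source_rows, last_watermark):
    """Return rows changed since the previous run, plus the new watermark.

    last_watermark plays the role of an Informatica mapping variable
    persisted between runs (e.g. via a parameter file).
    """
    changed = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark
```

Running the extract twice with the returned watermark yields an empty second batch, which is exactly the idempotence an incremental load needs.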
Informatica ETL/IDQ Developer
- Teamed with business analysts to deduce the business requirements to produce effective technology solutions
- Responsible for understanding the business requirements and Functional Specifications document and prepared the Source to Target Mapping document following Organization standards.
- Worked closely with Business Analyst and involved in Data Analysis to identify the business rules.
- Developed reusable objects like Mapplets and worklets combined with user defined functions to use across multiple mappings to pull data into MDM from heterogeneous sources like flat files, Oracle, SQL server, and Epic and Cerner database tables.
- Designed various mappings and Mapplets using different transformations such as key generator, match, labeler, case converter, standardizer, Address Validator, parser and lookup.
- Analyzing the issues and providing the solutions to dev and test teams.
- Implemented complex business rules to identify error records.
- Involved in optimization to identify the bottlenecks for the existing jobs.
- Developed debugging and testing mechanisms for the data flow mappings.
- Developed complex SCD-2 mappings in process of building operational data store (ODS).
- Preparing test scripts and reviewing the test results.
- Monitored and ran the Powercenter workflows and the MDM stage, load, and match/merge jobs using the Batch Viewer and automation processes.
- Involved in fine-tuning SQL overrides and lookup SQL overrides for performance enhancements; executed pre- and post-session commands on the source and target databases; optimized mappings by changing the logic to reduce run time.
- Handling issues raised by clients in Production Support environment.
- Good experience working with different tasks such as Session, Command, Event Wait, Event Raise, Timer, and Assignment.
- Performed Unit testing and Performance Tuning testing.
- Troubleshooting load failure issues and data quality issues on a day to day basis.
- Identified and eliminated duplicates in datasets through IDQ components.
- Created tasks in the Workflow Manager, and exported IDQ mappings and executed them.
- Worked with the offshore team and supervised their development activity.
Environment: HealthInsuranceData, Informatica 9.1, IDQ, MDM, Oracle 10g, Epic and Cerner, Toad, PL/SQL, UNIX.
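Duplicate elimination of the kind done here with IDQ components can be sketched as grouping on a normalized match key and keeping the most trusted record per group, in the spirit of MDM trust-based survivorship. The normalization rule and the trust ordering below are illustrative assumptions:

```python
def dedupe(records, key_fields, trust_order):
    """Keep one record per normalized match key.

    trust_order ranks source systems; a lower index wins survivorship,
    loosely mirroring MDM trust scores.
    """
    def match_key(rec):
        # crude standardization: case-fold and strip non-alphanumerics,
        # so 'JOHN  SMITH' and 'John Smith' collide on the same key
        return tuple(
            "".join(ch for ch in str(rec[f]).lower() if ch.isalnum())
            for f in key_fields
        )

    best = {}
    for rec in records:
        k = match_key(rec)
        if (k not in best
                or trust_order.index(rec["source"])
                < trust_order.index(best[k]["source"])):
            best[k] = rec
    return list(best.values())
```

Real IDQ matching uses fuzzy comparators (edit distance, phonetic keys) rather than exact normalized keys; this sketch only shows the group-and-survive shape of the process.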
Sr. BI APPS Consultant
- Configured Oracle Business Intelligence Applications (BI Apps), including setting up the Oracle Business Analytics Warehouse (OBAW) using the Data Warehouse Administration Console (DAC).
- Worked extensively on performance tuning including creating aggregate tables, caching and event polling tables.
- Created Financial Analytics reports which are used to analyze Trial Balance, Intercompany Balances, Journal Inquiries, Flow Analysis, Expense Analysis, Treasury Movement of Cash analysis and Payable reports.
- Developed many complex initial and incremental ETL processes and Oracle packages, i.e., mappings and mapplets using various transformations, and workflows with sessions/tasks.
- Developed many Reports/Dashboards with different Analytics Views Drill-Down & Drill-Across, Pivot Table, Graph, View Selector, and Column Selector, with global & local Filters using Oracle BI Web.
- Used Filters (global and local) and provided customized Prompts to avoid excessive data showing up in the reports.
- Built BI Publisher reports with various templates by using BI Publisher Desktop and scheduled to users based on the requirement.
- Used DAC to schedule and run the Full and Incremental ETL loads and to schedule the Informatica jobs in Oracle Business Analytic Warehouse.
- Configured Intelligence dashboards by including embedded content such as links/images & HTML objects.
- Upgraded the RPD and web catalogs from OBIEE 10.1.3.4 to OBIEE 11g.
- Configured Usage Tracking and developed reports for BA’s to track the users using the reports based on Subject Areas.
- Created Application Roles in Oracle Enterprise Manager 11g and set the permissions on dashboard pages using the application roles.
- Involved in performance tuning by setting Cache, limiting the number of initialization blocks, limiting select table types, pushing the calculation to the database.
- Provided end-user training and documentation and second-line support to power users to develop dashboards, reporting metrics and reports.
- Involved in Repository Configuration, Troubleshooting, Migration and Server Administration of DEV, QA and PROD environment
Environment: OBIEE 11g, OBIEE 10.1.3.4, OBIA 7.9.6, DAC 7.5, Informatica 8.6.1, Oracle Applications R12, Siebel Analytics (Financials, Order Management, HR Analytics, Procurement & Spend, Project Analytics), Linux, UNIX, SQL Developer, TOAD.
Confidential, Raleigh, NC
Informatica / Teradata developer
- Understood the requirements documents and interacted with the client team to get clarification on any issues.
- Participated in weekly all-hands meetings with onsite teams, took part in client interactions to resolve business logic issues/concerns, participated in knowledge transition activities, escalated techno-functional issues to the leads, and ensured task timelines and quality.
- Designed and developed end-to-end ETL process
- Worked on loading data from several flat file sources using Teradata MLOAD and FLOAD in Teradata 14.
- Wrote Teradata SQL, BTEQ, MLoad, FastLoad, and FastExport scripts for ad-hoc queries, and built UNIX shell scripts to run ETL interfaces (BTEQ, FastLoad).
- Created ETL Scripts & Procedures to extract data and populate the data warehouse using Oracle Gateway to Teradata.
- Prepared technical specifications to develop Informatica ETL mappings to load data into various tables conforming to the business rules.
- Creating schematic diagrams for depicting the pseudo mapping logic, and workflow logic. Creation of Unit test cases for each component in workflow and for other tasks as part of ETL process.
- Conduct/participate in the peer reviews of Mapping Documents - Get the sign-off from client to start construction phase.
- Unit test the ETL workflow & mappings - fixing the defects which came out of unit testing and if needed, make modifications to the documents to ensure that those are up to date as per the functionality in the system.
- Migrate the code from one environment to other environments
Environment: Informatica Power Center 8.6.1, Teradata SQL Assistant, TPT, BTEQ, FastLoad, MultiLoad, SQLA, Oracle 11g, SQL Server 2012/2008, UNIX, Flat Files
- Coordinated in the client installations and integration of OBIEE 10.1.3.2, DAC and Informatica 7.1.4
- Interacted with business representatives for requirement analysis, and to define business and function/technical specifications.
- Customized OBIEE OOTB metadata repository for AR, AP and GL.
- Created the new columns in existing fact and dimension tables and creating all together the new dimension and fact tables.
- Imported the custom facts and dimension tables and configured it in the 3 layers of the RPD. Implemented Level Based Metrics in the BMM layer. Created Compound facts in the BMM as per the client requirement.
- Wrote the RRD (Report Requirement Document), RDD (Report Development Document), and TDD (Technical Design Document).
- Designed Schema using custom fact, dimensions, physical, logical, Alias and extension tables
- Created Dimension Hierarchies in the business model and mapping Layer to drill down custom Store and product dimensions.
- Implemented conditional formatting using OBI Publisher.
- Customized the out of the box Informatica SDE and SIL mappings to include mappings for new columns created in the existing facts/Dimensions or created new SDE and SIL mappings for new facts/dimensions.
- Created Security settings in OBIEE Admin Tool to setup groups, access privileges and query privileges.
- Performance tuning in OBIEE for custom made star schemas by implementing aggregate navigation and cache management.
- Developed different reports using Siebel Dashboard and Answers, including views & charts, pivot, narrative, and tabular views, using global and local filters.
- Created the Custom Dashboard for the custom reports and aligned the reports with in the dashboard according to client requirement.
- Tuned Performance for the reports by implementing aggregate navigation and cache management.
- Configured iBots and Delivers to send alerts to subscribed users.
Environment: OBIEE 10.1.3.2, OBIEE Publisher 10.1.3.2, EBS, OBI Applications, Informatica 7.1.4, Oracle 10g, SQL, Toad, Linux.
- Extracted source definitions from various databases such as Oracle, DB2, and SQL Server into the Informatica Power Center repository.
- Worked on dimensional modeling to design and develop star schemas, using Erwin to identify fact and dimension tables.
- Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.
- Worked with different Operation Data Sources such as Oracle, SQL Server and Legacy Systems, Excel, Flat files.
- Used Informatica to extract data into Data Warehouse.
- Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
- Used Server Manager for session management, database connection management, and scheduling of jobs to be run in the batch process.
- Created dimensional star schemas using the Kimball methodology.
- Developed number of Complex Informatica Mappings, Mapplets and Reusable Transformations for different Health Plan Systems to facilitate Daily, Monthly and Yearly Loading of Data.
- Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data
- Preparation of Test Execution Report.
- Arrived at the dimension model of the OLAP data marts in Erwin.
- Recovered failed sessions.
- Optimized the mappings by changing the logic and reduced running time.
- Finished tasks within the allocated time for every release, consistently on time and on target.
Technologies: Informatica Power Center 6.0, Business Objects 5.1, Oracle, UNIX & Windows.