- Seven-plus (7+) years of total IT experience in analysis, design, development, implementation, testing, and maintenance of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, and OLTP client/server applications.
- Strong data warehousing and ETL experience using Informatica PowerCenter 9.5/9.1.x/9.0.x/8.6.x/8.1.x/7.1.x, Metadata Manager, IDQ Developer/Analyst 9.5.1, Warehouse Designer, Source Analyzer, Mapping Designer, Transformation Developer, and Mapplet Designer.
- Excellent knowledge of RDBMS, OLAP, OLTP, data marts, and ODS systems.
- Mapping experience using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Sequence Generator, Unconnected/Connected Lookup, Update Strategy, Aggregator, Sorter, Union, Transaction Control, Match, Key Generator, Association, Consolidation, Labeler, Parser, Address Validator, Classifier, Decision, Standardizer, Case Converter, and Merge.
- Efficiently handled granularity, indexing, and partitioning in data warehouses.
- Mapping experience with SCD types, master and transactional subject areas, and other dimensional models for ETL data warehouses.
- Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
- Performed profiling and standardization, identified anomalies, created and applied rules, and built mappings/mapplets.
- Good experience with Business Intelligence reporting using Business Objects, Cognos, OBIEE, and Jaspersoft.
- Performed end-to-end data lineage with IDQ while maintaining an excellent relationship with the end client.
- Worked closely with users to understand the current state of information availability in the enterprise and to identify future needs based on analysis of business requirements, current-state environments, gap analysis, and future-state warehousing implementation.
- Performed gap analysis between the current state and the future data warehouse environment, identifying data gaps and quality issues and recommending potential solutions.
- Worked closely with the data warehouse development team to ensure user requirements and issues were addressed while controlling scope.
- Data modeling experience using star schema/snowflake modeling, fact and dimension tables, and relational and dimensional tables for both master and transactional data.
- Good knowledge of databases using Teradata, Oracle 11g/10g/9.x/8.x/7.x, MS SQL Server 2000/2005, SQL, PL/SQL, SQL*Plus, SQL*Loader, and Toad 7.3/8.x/9.1/10.3.3.0/11.5.
- 1+ years of Informatica PowerCenter administration in 7.1.x/8.x/9.x, including server setup, configuration, client installations, deployments, backups, repository management, server health and maintenance, performance monitoring and improvement, patching, connectivity to other databases, and ODBC setup.
- Experience in Performance tuning in Informatica Power Center.
- Experience preparing use cases and activity diagrams using Rational Rose.
- Good expertise in migrating various Informatica sessions and mappings from one version to another.
Data Warehousing- Informatica PowerCenter 9.5/9.1.x/9.0.x/8.6.x/8.1.x/7.1.x (Repository Manager, Source Analyzer, Designer, Server Manager, Workflow Monitor, Warehouse Designer, Mapplet Designer, Mapping Designer, Workflow Manager), IDQ Developer/Analyst 9.5.1, Metadata Manager 8.x, OLAP, OLTP, IDQ/IDE, SQL*Plus, SQL*Loader, Informatica Data Quality, Informatica Data Profiler.
BI & Reporting- Business Objects 6.5/6.0/5.1/5.0, SQL Server Reporting Services 2005, Siebel Analytics 7.8, OBIEE 10.x/11.x, Cognos 8.1/8.2/8.3/8.4
Data Modeling- Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Entities, Attributes, ER Diagrams, ERwin, ER/Studio 7.1/6.1
Other Tools- HP Service Manager, IBM Rational ClearQuest.
Databases- Oracle 11g/10g/9i/8i, Teradata 13.0, MS SQL Server 2008/2005/2000, Oracle SQL Developer 3.0.04, SQL*Loader, IBM Data Studio, EDW DB2
Programming- SQL, UNIX
Job Scheduling - IBM Tivoli Manager 5.0, Autosys, Control-M, Mainframes.
Environment- UNIX, Red Hat Linux 5.x/4.x/3.x, Linux 2.6.32, Solaris, Windows 2008/2003
Other Tools- TOAD, RapidSQL, SQL*Plus, CYGWIN (X-Windows Server), WinSCP, Core FTP LE 2.2, PuTTY, AIX Servers, SONAS, EDGE Servers.
Confidential, Denver, CO
Roles & Responsibilities:
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Responsible for converting Functional Requirements into Technical Specifications and production support.
- Worked with the team to integrate data from multiple source systems, such as Health Answers, Power System, and HealthPlex, into a single data warehouse system.
- Worked on Designer tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Expression, Router, Filter, Aggregator, Sequence Generator.
- Worked with shortcuts across shared and non-shared folders.
- Developed Oracle views to identify incremental changes for full extract data sources.
- Developed the automated and scheduled load processes using Tidal scheduler. Involved in migration of mappings and sessions from development repository to production repository.
- Responsible for Unit testing and Integration testing of mappings and workflows.
- Provided daily production support of batches and system processes.
- Worked on production tickets to resolve issues in a timely manner.
- Monitored the testing project in Quality Center, ensuring defects were entered, tested, and closed.
- Analyzed, documented and maintained Test Results and Test Logs.
- Exposure to partitioning for loading large volumes of data.
- Scheduled Informatica workflows using Informatica Scheduler to run at regular intervals.
- Actively participated in Agile development practices, including daily scrum (stand-up) meetings.
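The delta-detection technique behind the Oracle views above (identifying incremental changes in sources that only provide full extracts) can be sketched in Python against an in-memory SQLite store; the table and column names here are hypothetical, not from the actual project:

```python
import sqlite3

# Toy stand-ins for the current full extract and the prior-load snapshot.
# Table/column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_current  (member_id INTEGER PRIMARY KEY, name TEXT, plan TEXT);
CREATE TABLE src_previous (member_id INTEGER PRIMARY KEY, name TEXT, plan TEXT);
INSERT INTO src_previous VALUES (1, 'Ann', 'HMO'), (2, 'Bob', 'PPO');
INSERT INTO src_current  VALUES (1, 'Ann', 'EPO'), (2, 'Bob', 'PPO'), (3, 'Cam', 'HMO');
""")

def incremental_changes(conn):
    """Rows that are new or changed since the last full extract,
    mimicking a delta view defined over two snapshots."""
    sql = """
    SELECT c.member_id, c.name, c.plan
    FROM src_current c
    LEFT JOIN src_previous p ON c.member_id = p.member_id
    WHERE p.member_id IS NULL                      -- brand-new row
       OR c.name != p.name OR c.plan != p.plan     -- changed row
    ORDER BY c.member_id
    """
    return conn.execute(sql).fetchall()

print(incremental_changes(conn))  # [(1, 'Ann', 'EPO'), (3, 'Cam', 'HMO')]
```

Only the delta rows then feed the ETL session, which is what keeps a full-extract source from reloading unchanged data.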
Environment: Informatica PowerCenter 9.5/9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), SQL Query Analyzer 8.0, Oracle 10g/11g, Teradata, SQL Developer, Windows, UNIX/Linux, PuTTY.
Confidential, Columbus, Ohio
Roles and Responsibilities:
- Designing the source to target mappings that contain the Business rules and data cleansing during the extract, transform and load process.
- Worked efficiently with mapping parameters and variables.
- Identified and tracked slowly changing dimensions and created complex mappings using SCD concepts.
- Acted as a liaison between various teams to connect to business users.
- Implemented indirect file loading for source files sharing the same structure.
- Worked with Oracle BI Administration Tool, Presentation Services, Answers, & Interactive Dashboards and Security Implementation.
- Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring.
- Used IDQ for data profiling, standardization, and structuring of data.
- Proficient in using SQL and PL/SQL to extract, transform, and load data into the data warehouse.
- Performed data quality monitoring and profiling for enterprise-wide data elements.
- Developed Business Objects reports.
- Classified and enriched metadata requirements to ensure performance and functionality of the metadata management system for future releases.
- Assisted in developing the back end of the managed metadata environment (the process of getting data into the metadata repository), extracting metadata from its sources and programmatically integrating it into a database.
- Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removing duplicate data.
- Implemented target load order plans.
- Used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) 8.6.1: IDE for data profiling over metadata and IDQ for data quality measurement.
- Performed many sequential and parallel loads.
- Created Event Wait and Event Raise tasks and performed many successful loads to targets.
- Prepared HLD, LLD, BRD, and FRD documents and gathered complete requirements.
- Used Type II slowly changing dimensions in various mappings.
- Handled Maintenance and Enhancement Requests as a team.
- Translated Business processes into Informatica Mappings for building Data marts.
- Imported various sources, targets, and transformations using Informatica PowerCenter Server Manager, Repository Manager, and Designer.
- Used heterogeneous files from different sources and imported stored procedures from Oracle for transformations.
- Participated in team meetings and performance reviews.
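The Type II SCD logic referenced in the bullets above (expire the current dimension row and insert a new version when a tracked attribute changes) can be illustrated with a minimal Python/SQLite sketch; the dimension table and its columns are hypothetical:

```python
import sqlite3

# Hypothetical Type II dimension: each change closes the current row
# and inserts a new version with a fresh effective date.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)
""")

def scd2_upsert(conn, customer_id, city, load_date):
    cur = conn.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[1] == city:
        return  # no change: keep the current version
    if cur:    # change detected: expire the current version
        conn.execute("UPDATE dim_customer SET end_date=?, is_current=0 WHERE sk=?",
                     (load_date, cur[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date))

scd2_upsert(conn, 101, "Columbus", "2012-01-01")
scd2_upsert(conn, 101, "Atlanta", "2012-06-01")   # triggers a new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer "
    "WHERE customer_id=101 ORDER BY sk").fetchall()
print(rows)  # [('Columbus', 0), ('Atlanta', 1)]
```

In PowerCenter the same branching is typically expressed with a Lookup plus an Update Strategy transformation; the surrogate key (`sk`) preserves full history for the reporting layer.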
Environment: Informatica PowerCenter 9.1/8.6 (Repository Manager, PowerCenter Designer, Workflow Manager, Workflow Monitor), Informatica IDQ/IDE, Informatica Metadata Manager, Oracle 10g, OBIEE, SQL Server 2008, MS Access, UNIX, Windows 7, SQL, PL/SQL, MS Visio, Microsoft tools, PuTTY.
Confidential, Atlanta, Georgia
ACS provides BI solutions to Confidential for the ND, NH, and AK states. Conversion is one of the most critical activities in the entire implementation process; the purpose of the conversion project is to convert/migrate and load source data from legacy systems into the new MMIS database.
The interface process provides the state with information regarding inbound and outbound interface files processed within MMIS. It defines interface processes for trading partners who are authorized to submit and/or transmit interface files and have a valid Trading Partner Agreement (TPA) with the Department of Health and Human Services (DHHS). The interface process affects various functional areas, including EDMS, Member, Claims, Prior Authorization, Provider, Contact Management, EDI, Reference, Global, Managed Care, Claims Front End, Third Party Liability, Data Management, Administrative Reporting, and Pharmacy.
Roles & Responsibilities:
- Created Procedures to Generate Tables Data into Files Dynamically.
- Created database objects such as tables, views, stored procedures, and functions in Oracle.
- Involved in ETL design, coding, testing and implementations.
- Designed ETL processes for optimal performance.
- Created test cases and test scripts to validate data
- Tuned the performance of complex Informatica mappings and database.
- Worked with cross functional teams to resolve the issues.
- Created slowly changing dimensions & Fact tables to meet the business requirements.
- Used various transformations, including Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy. Tuned mapping performance by following Informatica best practices and applied several methods to reduce workflow run times.
- Performed data validations.
- Created SQL Server ODBC connections in Informatica workflow Manager.
- Worked extensively on Informatica partitioning when dealing with huge volumes of data and also partitioned the tables, using pass-through, round-robin, and hash partitioning.
- Involved in the defect analysis call for UAT environment along with users to understand the data and to make any modifications to code.
- Created complex queries, procedures, and functions in SQL Server.
- Performed Root cause analysis and resolved complex issues.
- Created SQL Queries to validate the data in both source and target databases.
- Prepared the recovery process in case of workflow failure due to database issues or network issues.
- Supported the applications in production on rotation basis and provided solutions for failed jobs.
- Actively attended production support calls, carried the on-call pager, and answered business questions.
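The source-versus-target validation queries mentioned above can be sketched as a small Python/SQLite check comparing row counts and amount totals after a load; table names and the claims data are hypothetical:

```python
import sqlite3

# Hypothetical source and target tables produced by an ETL run.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_claims (claim_id INTEGER, amount REAL);
CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL);
INSERT INTO src_claims VALUES (1, 100.0), (2, 250.5), (3, 75.25);
INSERT INTO tgt_claims VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def validate(conn, src, tgt):
    """Compare row counts and amount totals between source and target,
    the kind of reconciliation the post-load validation queries perform."""
    q = "SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {}"
    src_stats = conn.execute(q.format(src)).fetchone()
    tgt_stats = conn.execute(q.format(tgt)).fetchone()
    return src_stats == tgt_stats

print(validate(conn, "src_claims", "tgt_claims"))  # True
```

Count-and-sum reconciliation is a coarse but cheap first check; mismatches then justify a more expensive row-by-row comparison.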
Environment: Informatica 8.6, Informatica Power Exchange, Oracle 10g, PL/SQL, Java, Windows, UNIX.
Informatica Developer/ Analyst
Roles and Responsibilities:
- Involved in design and development of data warehouse environment, liaison to business users and/or technical teams gathering requirement specification documents and presenting and identifying data sources, targets and report generation.
- Handled FTP, Command task, and Event Wait utilities in Informatica.
- Optimized Query Performance, Session Performance and Reliability.
- Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
- Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
- Scheduled the batches to be run using the Workflow Manager.
- Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs effectively.
- Set up the data mart on an Oracle database by running the SQL scripts from the ERwin designer.
- Translated business processes into Informatica mappings for building data marts.
- Involved in the Migration process from Development, Test and Production Environments.
- Used SQL tools like TOAD to run SQL queries and validate the data.
Environment: Informatica PowerCenter 8.6/8.1, Oracle 10g/9i, SQL Server 2005, flat files, TOAD, Visio, PL/SQL, SQL, UNIX (AIX), Windows XP.
Roles and Responsibilities:
- Analyzing the specifications and identifying the source data that needs to be moved to the data warehouse.
- Participated in various Source Code reviews.
- Created different source definitions to extract data from flat files, relational tables and external sources.
- Worked extensively with master and transactional data in the OLAP systems.
- Performed historical data migration.
- Hands-on experience with surrogate keys and natural keys.
- Strong understanding of the managed metadata environment (MME) and its architectural components.
- Ability to work at multiple levels of a metadata management project, including defining the metadata management architecture, managing the metadata project, mentoring metadata developers, etc.
- Created and maintained Data Models working with Data Architect.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Developed Stored Procedure to support front-end application.
- Involved in creating Naming Standards, Best Practices for ETL development.
- Used all types of transformations, such as Expression, Router, Lookup, Filter, Aggregator, and Sorter.
- Expertise in Teradata SQL and Teradata utilities such as MultiLoad, FastExport, BTEQ (Batch Teradata Query), FastLoad, Teradata Parallel Transporter, TPump, Viewpoint, and Data Mover.
- Designed dimension and fact tables for Star Schema and Snowflake Schema to develop the Data warehouse.
- Participated in Integration of Siebel Database
- Involved in Integration of Siebel UI screen.
- Actively interacted with business analysts.
- Worked widely on Salesforce.com (SFDC).
- Extensively utilized the Debugger utility to test the mappings.
- Efficiently worked upon Connected and Unconnected Lookups.
- Created tables and constraints for improved database performance.
- Handled workflow logs, session logs, and error logs.
- Created tables and views for reporting purposes.
- Created and used different tasks, such as Decision, Event Wait, Event Raise, Timer, and E-mail.
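The surrogate-key bullet above describes replacing source natural keys with warehouse-generated keys; a minimal Python sketch of that lookup-or-mint pattern (class and key names are illustrative, not from the project):

```python
# Hypothetical surrogate-key lookup: natural keys arriving from the source
# are mapped to warehouse-generated surrogate keys, new keys minted on
# first sight, existing keys reused thereafter.
class SurrogateKeyGenerator:
    def __init__(self):
        self._keys = {}   # natural key -> surrogate key
        self._next = 1    # next surrogate key to mint

    def key_for(self, natural_key):
        """Return the existing surrogate key, or mint a new one."""
        if natural_key not in self._keys:
            self._keys[natural_key] = self._next
            self._next += 1
        return self._keys[natural_key]

gen = SurrogateKeyGenerator()
print(gen.key_for("CUST-001"))  # 1  (first sighting mints a key)
print(gen.key_for("CUST-002"))  # 2
print(gen.key_for("CUST-001"))  # 1  (repeat sighting reuses the key)
```

In PowerCenter this role is usually played by a Sequence Generator plus a Lookup transformation; decoupling the warehouse key from the natural key is what makes Type II history and cross-system integration workable.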
Environment: Informatica PowerCenter 8.1.1, MS SQL Server, SQL*Loader, Oracle 9i/8i, ERwin, SQL, Salesforce.com (SFDC), Siebel CRM, XML, Windows 2000/NT
Roles and Responsibilities:
- Development of Informatica Mappings, Sessions, and Workflows of varying complexity.
- Migrated the ETL Codes.
- Used various transformations, such as Lookup, Filter, Joiner, Aggregator, Expression, Router, Sequence Generator, and Normalizer, in the mappings.
- Worked with VSAM and COBOL sources.
- Collected Business Analysis and Requirements by working with the business end users.
- Promptly analyzed existing Informatica mappings to understand and resolve issues.
- Created test cases for unit testing, system integration testing, and UAT to check data quality.
- Converted the requirements into business rules for ETL transformation.
- Prepared use cases and activity diagrams, identifying users and actors.
- Prepared the ER Diagram Models.
- Understood the complete business requirements and the need for each report.
Environment: Informatica PowerCenter 7.4.x/7.1, ERwin, MS SQL Server, Oracle 8i, IMS, Windows NT, flat files, SQL, PL/SQL, SQL*Loader, XML, Cognos 5, Autosys, UNIX.