- IT professional with 8 years of experience in Data Warehousing using Informatica PowerCenter 9.x/8.x/7.x (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart 7.x/6.x, PowerConnect and PowerExchange
- Heavily involved in the complete life cycle of an enterprise data warehouse
- Experienced in OLTP/OLAP system study, analysis and ER modeling; developed database schemas such as Star and Snowflake schemas used in relational and multidimensional modeling, using Erwin/MS Visio
- Strong knowledge of relational database concepts, Entity Relationship Diagrams, and Normalization and De-normalization concepts
- Ability to write the complex SQL needed for ETL jobs and data analysis; proficient with databases such as Oracle 12c/11g/10g/9i, SQL Server 2012/2008/2005, Teradata, Netezza, DB2 and Sybase, as well as flat files, COBOL files and XML files
- Experienced in Installation, Configuration, and Administration of Informatica Power Center
- Developed complex mappings using transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Sorter, Aggregator, Joiner, Union and Update Strategy, as well as less frequently used transformations such as Java and Stored Procedure
- Good exposure to Development, Testing, Debugging, Implementation, Documentation, End-user and Production support
- Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica Mappings
- Research-oriented; raises issues upfront and addresses them as soon as they are identified
- Experienced in doing Error Handling and Troubleshooting using various log files
- Experienced in using Batch, Perl and UNIX shell scripts
Data Warehousing Tools: Informatica PowerCenter 9.5/9.1/8.x/7.x, Informatica Developer/Data Quality 9.5/9.1, Informatica PowerExchange; Schedulers: Cisco Tidal and Control-M
Databases: SQL Server 2012/2008/2005, Oracle 12c/11g/10g/9i, Teradata V2R6, Netezza, Sybase and DB2
Languages: SQL, PL/SQL, C, C++, Java and HTML
ERP Tool: SAP
Database Utilities: TOAD 8.0/7.1, Aginity, Oracle SQL Developer, SQL Server Management Studio and Teradata Studio
Operating Systems: Red Hat Linux and Windows XP/Vista/7
Data Modeling: Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables, Normalization and De-normalization
Confidential, Omaha, NE
Senior Informatica Developer
- Understood business requirements and enhanced the existing data warehouse architecture
- Used Informatica Designer to create load and update mappings, using different transformations to move data into different data marts in the Data Warehouse
- Successfully loaded data from various source systems such as Oracle, flat files, ODS and SQL Server into the staging area and then into the Netezza target
- Developed and documented data mappings, transformations and Informatica sessions
- Developed stored procedures and triggers in PL/SQL and T-SQL for various data cleansing activities
- Used SQL*Loader to load data from files
- Created source, target, transformations, sessions, batches and defined schedules for the sessions
- Worked on Teradata Insert, Update and Dynamic SQL to transform and load summary tables built on the Core Data Warehouse
- Worked on Cisco Tidal for scheduling and automating the Informatica workflows
- Used Shell Scripting to automate the loading process
- Used incremental loading technique to load incremental data into enterprise data warehouse.
- Identified and tracked slowly changing dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
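The incremental-load pattern mentioned above can be sketched as a small wrapper script; every path, parameter name and the commented-out pmcmd invocation below are hypothetical illustrations, not the project's actual artifacts:

```shell
#!/bin/sh
# Hypothetical incremental-load wrapper: read the last successful extract
# timestamp, expose it to the mapping via an Informatica parameter file,
# then (commented out) start the workflow with pmcmd.

STAMP_FILE=${STAMP_FILE:-/tmp/last_extract.stamp}
PARAM_FILE=${PARAM_FILE:-/tmp/wf_incr_load.param}

# Fall back to a full-load date when no previous run is recorded
LAST_RUN=$(cat "$STAMP_FILE" 2>/dev/null || echo "1900-01-01 00:00:00")

cat > "$PARAM_FILE" <<EOF
[Global]
\$\$LAST_EXTRACT_DATE=$LAST_RUN
EOF

# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u user -p pwd \
#   -f FOLDER -paramfile "$PARAM_FILE" wf_incr_load

# Advance the high-water mark only after a successful load
date '+%Y-%m-%d %H:%M:%S' > "$STAMP_FILE"
```

In practice the high-water mark would be advanced only after the workflow reports success, so a failed run is re-extracted on the next cycle.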
- Performed data profiling using Informatica Developer to empower data analysts to investigate and document data quality issues.
- Involved in version control of code from development to test and Production environments.
- Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Source, Target, Mapplet, and Transformation objects.
- Involved in Unit & Integration Testing of Mappings & Sessions.
- Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
- Involved in migrating Mappings, Sessions and Workflows between development, test and production environments.
- Scheduled sessions and batch processes to run on demand, at a scheduled time, or only once using the Informatica Scheduler
- Troubleshot various reports using Cross Tab, Master Detail and different charts, including Line and Pie charts, for analysis
- Designed reports with Slice and Dice & Drill down analysis
Environment: ETL/Informatica PowerCenter 9.5/9.1, Informatica Power Exchange 9.1, Oracle 12c, Teradata, Netezza, PL/SQL, XML, Flat files, UNIX, Toad, SQL Developer, Cisco Tidal, DB2, Perl, Batch, Windows Server 2008
Confidential, Miami, FL
Senior Informatica Developer
- Responsible for gathering business requirements and translating them into technical requirements
- Developed mappings using various transformations to suit business user requirements and business rules, loading data from SQL Server 2008, Sybase, Oracle and other data sources into the Netezza target and filtering out unneeded data
- Extensively used Teradata SQL Assistant to extract data from Data warehouse
- Expertise in debugging and production support
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Control, Command, Decision, Session in the workflow manager
- Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes
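A post-session success command of the kind described above might look like the following cleanup sketch; the directories, file pattern and log location are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical post-session success command: move processed flat files into
# a dated archive folder and append a line to a run log. Paths are
# illustrative assumptions, not the project's real locations.

SRC_DIR=${SRC_DIR:-/tmp/srcfiles}
ARCH_DIR=${ARCH_DIR:-/tmp/archive}
RUN_LOG=${RUN_LOG:-/tmp/load_run.log}

TODAY=$(date +%Y%m%d)
mkdir -p "$ARCH_DIR/$TODAY"

for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue        # nothing matched the glob
    mv "$f" "$ARCH_DIR/$TODAY/"
done

echo "$(date '+%Y-%m-%d %H:%M:%S') cleanup complete" >> "$RUN_LOG"
```

Such a script would be wired into the Session task's post-session success command so it runs only when the load completes cleanly.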
- Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer
- Developed mappings/sessions using Informatica PowerCenter 9.1/8.6.1 and PowerExchange 8.5.1 for data loading.
- Responsible for managing Informatica services such as the Repository Service, Integration Service, Table Manager Service, etc.
- Designed and developed error handling
- Involved in data quality profiling, standardization and testing
- Monitored daily loads and provided on-call (L2) production support during data loads.
- Migrated repository objects and scripts from the development environment to the QA/production environments; extensive experience in troubleshooting and resolving migration and production issues.
- Created Change Control Requests as required by assigned projects, ensured end-to-end support as a Data Analyst and addressed overall data quality
- Installation & configuration of Informatica services
- Created repository & assigned grants to users to access the repository
- Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, Sorter and Sequence Generator
Environment: ETL/Informatica PowerCenter 9.1, Informatica Power Exchange 8.6, Oracle 11g, Sybase, Teradata, Netezza, PL/SQL, XML files, Flat files, UNIX, Toad, SQL Server 2008, DB2, Perl, Shell Scripting, HPQC, Red Hat Linux
Confidential, Boston, MA
- Involved in Data transfer from OLTP systems forming the extracted sources
- Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system
- Analyzed the sources, transformed the data, mapped the data and loaded the data into targets using Power Center Designer
- Designed and developed Oracle PL/SQL Procedures
- Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing
- Participated in the design of Star & Snowflake schema data model
- Designed and developed end-to-end ETL process from various source systems to Staging area, from staging to Data Marts
- Worked on Informatica Utilities - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer
- Worked with various transformations like Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner transformations
- Responsible for creating business solutions for Incremental and full loads
- Involved in creating Shell Scripts to automate Pre-Session and Post-Session Processes
- Developed workflows, sessions and job groups to schedule the loads at the required frequency using the Cisco Tidal scheduler
- Involved in data quality profiling, standardization and testing
Environment: Informatica Power Center 8.6.1, Oracle 10g, SQL, PL/SQL, Quest TOAD and Windows Server 2003
Confidential, Dublin, Ohio
- Acted as a key contributor for production support of the Data Integration (ETL) application; used Functional Design Documents (FDS) and Systems Design Specifications (SDS) to understand the overall application
- Worked as onsite lead in the Onsite/Offshore Model Production Support
- Monitoring the data loads and Enhancement of existing application for business changes/ improvement/audits
- Performance Tuning of the mappings to handle increasing data volume
- Resolved various issues in the support environment, such as code issues, connectivity issues, data issues, scheduling conflicts, dependency-job management, space issues on the UNIX server or databases, capacity planning and system maintenance, and worked with the appropriate teams to identify and resolve production issues in a timely manner.
- Responsible for proper knowledge transfer from development team and ensure all the required documents are handed over.
- Responsible for maintaining different logs like mapping status document, Issue log, Defect detection log etc.
- Responsible for analyzing the error table data as well as rejected data and fixing them.
- Played an active role in preparing the ETL specification documents for customization.
- Managed the workload by distributing it appropriately among the team and prioritizing high-severity issues in a timely manner.
- Able to quickly investigate and identify the root cause of an issue, reach out to other teams when the issue is external, and fix it promptly.
Environment: Informatica 7.1.1, Oracle 9.1, Teradata V2R5/V2R6, SAP R/3, SQL Server 8.0, UNIX and Unicenter.
Sr. Informatica Developer
- Understanding the customer requirements and analyzing or resolving the discrepancies in the business requirements
- Worked to design an architecture that isolates the application component (business context) of the data integration solution from the technology; this also promotes reuse of skills, design objects and knowledge
- Prepared the Technical Design documents as per the Functional Requirements document and Logical Data model
- Worked with IDE/IDQ to create profiles identifying different patterns in the source data, and reviewed the findings with the business team for correction and analysis
- Extensively used Informatica to load data from VSAM, flat files and DB2
- Developed Power Exchange data maps that were used to pull the data from files on the Mainframe / VSAM files
- Developed shell scripts for retrieving files from the FTP server, archiving the source files, concatenating files and delivering them to a remote shared drive
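The retrieve / concatenate / archive / deliver pattern described above can be sketched roughly as follows; the host, directories and `*.dat` file pattern are hypothetical, and the FTP pull is shown commented out because it needs a real server and credentials:

```shell
#!/bin/sh
# Sketch of the retrieve / concatenate / archive / deliver pattern.
# Host, directories and the *.dat pattern are hypothetical assumptions.

WORK_DIR=${WORK_DIR:-/tmp/ftp_in}
ARCHIVE_DIR=${ARCHIVE_DIR:-/tmp/ftp_archive}
DELIVERY=${DELIVERY:-/tmp/outbound/combined.dat}

mkdir -p "$WORK_DIR" "$ARCHIVE_DIR" "$(dirname "$DELIVERY")"

# Retrieval step (hypothetical host, requires credentials):
# sftp user@ftp.example.com:/outgoing/*.dat "$WORK_DIR"/

: > "$DELIVERY"                      # start with an empty delivery file
for f in "$WORK_DIR"/*.dat; do
    [ -e "$f" ] || continue
    cat "$f" >> "$DELIVERY"          # concatenate into the delivery file
    mv "$f" "$ARCHIVE_DIR/"          # archive the original
done
```

The staged delivery file would then be copied to the shared drive by a final transfer step.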
- Developed error logic for streamlining and automating the data loads and cleansing incorrect data, and developed an auditing mechanism to maintain load statistics for transactional records
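An auditing mechanism of the kind mentioned above typically boils down to recording per-file load statistics; a minimal sketch, where the file and log locations are assumptions:

```shell
#!/bin/sh
# Minimal audit sketch: append one pipe-delimited line per processed file
# (timestamp | file | row count) to an audit log. Locations are assumptions.

DATA_FILE=${1:-/tmp/txn.dat}
AUDIT_LOG=${AUDIT_LOG:-/tmp/load_audit.log}

touch "$DATA_FILE"                   # ensure the sketch has a file to count
ROWS=$(wc -l < "$DATA_FILE")
printf '%s|%s|%s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$DATA_FILE" "$ROWS" >> "$AUDIT_LOG"
```

Comparing these logged counts against target row counts after each load is a simple way to catch silently rejected records.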
- Designed and developed standard load strategies to load data from source systems to Atomize Database which is the final target system.
- Prepared documentation on the design, development, implementation, daily loads and process flow of the mappings, and participated in design document reviews
- Used IBM DB2 Control Center and Command Editor to check table designs and the records populated in the database, and tested the data to verify it was loaded correctly into the required schemas per the business requirements
- Extensively involved in performance tuning of mappings, sessions and database tables, using parameter files, variables, cache mechanisms and SQL overrides
Environment: Informatica Power Center 8.1, Power Exchange Navigator v.8.1, Mainframe source system, DB2, AIX OS as platform and CA7 Scheduling tool.