- ETL expert with 7 years of data warehousing experience using Informatica PowerCenter, including implementation of ETL methodology, data extraction, transformation, and loading for Business Intelligence.
- Thorough understanding of the System Development Life Cycle (SDLC), with experience from requirements analysis through design and development using modern programming languages.
- Extensively worked on ETL using Informatica PowerCenter/PowerMart.
- Designed complex Mappings and expertise in performance tuning.
- Experience in data warehousing, specializing in Star Schema/Snowflake Schema design, ETL strategy design, and Business Objects.
- Participated in business users' meetings to understand their requirements; discussed and designed reports.
- Extensive experience in database back-end programming: writing and tuning complex SQL queries and writing stored procedures in PL/SQL.
- Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as Star Schema and Snowflake Schema, and HR/Payroll models used in dimensional modeling.
- Proficient in using Informatica Workflow Manager, Workflow Monitor, and Server Manager (interactive and command-line utility) to create, schedule, and control workflows, tasks, and sessions.
- Excellent working knowledge of multiple platforms such as UNIX and Windows XP/Vista/7.
- Extensive experience in developing Stored Procedures, Database Triggers.
- Handled various databases like Oracle, DB2 and Netezza.
- Highly proficient in data modeling; strong in data warehousing concepts and dimensional, Star Schema, and Snowflake Schema methodologies. Complete understanding of the Ralph Kimball and Inmon approaches to data warehousing.
- Excellent analytical and logical programming skills with a good understanding at the conceptual level; excellent presentation and interpersonal skills with a strong desire to achieve specified goals.
- Excellent communication and interpersonal skills; ability to quickly grasp new concepts, both technical and business-related.
- Ability to work individually as well as in a team with excellent problem solving and troubleshooting capabilities.
- Aptitude for communicating at all levels of management, with experience coordinating client meetings, presentations, and group discussions. Particularly excel in roles that require liaison between user groups and IT.
- Experience in working in an Agile and Scrum environment.
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x
Languages: SQL, PL/SQL, C, XML, HTML, Visual Basic 6.0
Operating Systems: UNIX, Linux, Windows NT/98/2000/2003/7
Tools: Netezza, TOAD, SQL Navigator, SQL Developer, Salesforce, DB Visualizer
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2000/2005/2008 (T-SQL), MS Access, DB2
Job Scheduling: Tidal, Control-M
Confidential Los Angeles, CA
- Part of Production Support and Enhancement Team.
- The purpose of the project was to participate in production support, maintenance, and enhancement tasks.
- Development tasks involved Incremental Fix, History Fix and Full Load tasks.
- Worked on Slowly Changing Dimensions both Type 1 and Type 2.
- Conducted root cause analysis on assigned defects.
- Coordinated with various teams within the organization and worked with Business Users, Analysts and Quality Analysts.
- Used DB Visualizer to run SQL queries and worked on OLTP sources and OLAP sources.
- Worked extensively with multiple databases, including Oracle and Netezza, on this project.
- Fixed defects raised and detected by various users.
- Prepared mapping documents and raised CTASKs to migrate ETL objects to different environments such as QA and Production.
- Used applications such as Quality Center to track defects and post updates on their resolution.
- Worked in an intense environment following the Waterfall methodology, which involved daily stand-up meetings with a Business Intelligence team of 20 members.
- Working in the auto insurance domain helped in understanding the business of insurance companies.
- Worked on parameter files for the mappings, managed the parameters across various servers, and performed impact analysis before pursuing development.
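The Type 1 and Type 2 slowly changing dimension work mentioned above follows a standard pattern: Type 1 overwrites in place, Type 2 expires the current row and versions a new one. A minimal sketch of that pattern is below; the dimension layout, key names, and function names are illustrative, not taken from the actual project.

```python
from datetime import date

def apply_scd_type1(dim_rows, key, new_row):
    """Type 1: overwrite the matching dimension row in place (no history kept)."""
    for row in dim_rows:
        if row[key] == new_row[key]:
            row.update(new_row)
            return dim_rows
    dim_rows.append(dict(new_row))  # brand-new member: plain insert
    return dim_rows

def apply_scd_type2(dim_rows, key, new_row, today=None):
    """Type 2: expire the current version and insert a new one (full history)."""
    today = today or date.today()
    for row in dim_rows:
        if row[key] == new_row[key] and row["current_flag"]:
            if all(row.get(k) == v for k, v in new_row.items()):
                return dim_rows  # no attribute changed: nothing to do
            row["current_flag"] = False  # expire the old version
            row["end_date"] = today
    versioned = dict(new_row)
    versioned.update(current_flag=True, start_date=today, end_date=None)
    dim_rows.append(versioned)
    return dim_rows
```

In an Informatica mapping this corresponds to an Update Strategy transformation routing rows to DD_UPDATE (Type 1) or DD_INSERT plus an expiry update (Type 2).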
- Responsible for the execution of the project and for meeting the business requirements.
- The purpose of the project was to normalize data to suit the business requirements; the information comprised confidential data from a government website.
- Automated the process of tracking changes and controlling the inserts, updates, and deletes for the weekly and monthly data loads.
- Responsible for the mappings, logic, sessions, workflows and worklets.
- Involved in daily scrum meetings to report progress and changes in strategy for meeting the requirements.
- Dealt with multiple schemas and repositories of the organization's data warehouse.
- Performed unit testing, validated the logic, and handed off to the Quality Analysts for further testing.
- Interfaced with Java developers and system engineers, alongside a senior Informatica developer, while calling web services to extract and load data into the data warehouse.
- Assisted the team lead in migrating objects to different environments.
- Used Normalizer, Aggregator, Expression, Lookup, Sequence Generator, and Router transformations on a regular basis to implement the logic designed to meet the business requirements.
- Worked with multiple teams comprising developers, analysts, and testers.
- Participated in discussions on enhancing mapping performance by applying best practices during mapping creation.
- Used parameters and variables in creating process-log workflows to run all sessions simultaneously, loading all tables that had parent-child relationships.
- Used FileZilla to transfer and import files from the server to the local system and to check the results of the logic.
- Worked with flat files throughout the project and used functions such as MD5 to track and capture changes across different load times.
- De-normalized (flattened) data on a couple of occasions while joining different tables to return results for specific searches involving Tax IDs, SSNs, and other personal IDs.
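The MD5-based change capture used with the flat files above boils down to hashing the tracked columns of each row and comparing the digest against the previous load. A minimal sketch follows; the column names and key layout are hypothetical.

```python
import hashlib

def row_hash(row, columns):
    """Concatenate the tracked columns and hash them, mirroring Informatica's MD5()."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_changes(previous_hashes, rows, key, columns):
    """Classify incoming rows as inserts or updates by comparing MD5 digests."""
    inserts, updates = [], []
    for row in rows:
        digest = row_hash(row, columns)
        old = previous_hashes.get(row[key])
        if old is None:
            inserts.append(row)       # key never seen: insert
        elif old != digest:
            updates.append(row)       # digest moved: attributes changed
        previous_hashes[row[key]] = digest
    return inserts, updates
```

Rows whose digest is unchanged are skipped entirely, which is what makes MD5 comparison cheaper than comparing every column on weekly and monthly loads.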
Environment: Informatica v9, Salesforce 8.5, UNIX, Oracle 11g/10g, Toad 10.5, SQL Developer
Confidential NJ
- Responsible for the design of Informatica mappings.
- Handled various databases like Oracle and Netezza for this project.
- Used various transformations such as Filter, Aggregator, Lookup, Update Strategy, and Source Qualifier.
- Performed impact analysis and requirements analysis.
- Extensively involved in working on different active and passive transformations and involved in generating, modifying and processing the data.
- Responsible for the development/testing of Informatica mappings.
- Performed data analytics and exploration to fix the history data by developing SCDs; also involved in unit testing and performance testing.
- Implemented incremental loading of mappings using mapping variables and parameter files.
- Used Workflow Manager to create, validate, test, and run sequential and concurrent sessions, scheduling them to run at specified times while reading data from different sources and writing to target databases.
- Supported large data warehouses for Pfizer's sales and marketing division; used SQL and briefly worked on scheduling with Tidal.
- Worked with Salesforce to extract data from the cloud and copy it to the Oracle database, then extracted data from Oracle and Netezza sources.
- Dealt with development and enhancement of Pfizer's existing data warehouses.
- Successfully implemented Type 1 and Type 2 dimensions for inserting and updating slowly changing dimensions in targets, maintaining history per the Kimball methodology.
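Incremental loading driven by mapping variables and parameter files, as described above, amounts to persisting a high-water mark between runs and filtering the source on it. The sketch below mimics an Informatica-style parameter file, but the session name, variable name, and column names are illustrative only.

```python
def parse_param_file(text):
    """Parse a minimal Informatica-style parameter file into {section: {name: value}}."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section is not None:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

def incremental_rows(rows, last_run_ts):
    """Keep only rows modified after the persisted high-water mark."""
    return [r for r in rows if r["updated_at"] > last_run_ts]
```

In PowerCenter the equivalent is a `$$LAST_RUN_TS`-style mapping variable referenced in the Source Qualifier filter and advanced with SETMAXVARIABLE at the end of a successful session.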
Environment: Informatica v9, Salesforce, Netezza, UNIX, Oracle 11g/10g, Toad 10.5, SQL Developer, T-SQL, Tidal
Confidential San Jose, CA
Sr. ETL/Informatica Consultant
- Responsible for requirements gathering, project coordination, and analysis of source data.
- Translated business rules and functional requirements into ETL procedures.
- Implemented extraction/transformation/load (ETL) interfaces to facilitate transitioning from legacy Salesforce systems to Oracle systems, and to Netezza on a few occasions.
- Integrated all jobs using complex mappings, including mapplets and workflows, built with Informatica PowerCenter Designer and Workflow Manager.
- Implemented best practices in ETL design and development, including loading data into highly de-normalized tables following the Kimball methodology.
- Performed performance tuning to increase throughput at both the mapping and session level, along with SQL query optimization.
- Successfully implemented Type 1 and Type 2 dimensions for inserting and updating slowly changing dimension tables in the target, maintaining history.
- Extensively used the Informatica Debugger to diagnose problems in mappings; also involved in troubleshooting existing ETL bugs.
- Worked extensively on Informatica Partitioning when dealing with huge volumes of data to optimize the performance of mappings.
- Prepared unit and integration test plans, code review, testing, debugging, deployment of Informatica Jobs into production environment.
- Worked extensively on SQL, PL/SQL and UNIX Shell Scripts.
- Worked on Data Warehouse Builder to integrate data during the later stages of the project, along with Business Objects.
- Provided support and quality validation through test cases for all stages of unit and integration testing.
- Provided end user training and Production System Support.
- Briefly worked on data quality and data profiling using IDQ.
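The kind of column profiling a tool like IDQ performs (null counts, distinct counts, fill rates) can be approximated with a few lines; the sample data and metric names below are invented for illustration and do not reflect IDQ's actual output format.

```python
def profile_column(rows, column):
    """Basic column profile: null count, distinct count, and fill rate."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "fill_rate": len(non_null) / len(values) if values else 0.0,
    }
```

Profiles like this are typically run on staging tables before mapping design, to surface columns that need defaulting, standardization, or de-duplication rules.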
Environment: Informatica v9, Salesforce, ER/Studio Enterprise 8.5, UNIX, Oracle 11g/10g, Toad 10.5, SQL Developer, IDQ
Confidential
Sr. ETL Developer
- Implemented a near real-time data warehousing solution. Informatica PowerExchange was used to read the Oracle database logs to capture changes in the data. The changes were then propagated to a staging area through a "pull" mechanism using the Informatica PowerCenter & PowerExchange connector.
- Involved in designing and developing logical & physical data models using Erwin to support the operational reporting applications.
- Performed peer code reviews to ensure compliance with ETL standards.
- Enhanced system performance by optimizing and tuning database objects, reports, and ETL processes, and handled Netezza.
- Assisted the team members in designing & developing ETL mappings and workflows.
- Tuned Existing Oracle scripts for better performance.
- Coded SQL scripts to create the development, testing, and production databases, including tablespaces (adding datafiles), control files, rollback segments, users, synonyms, roles, profiles, and privileges.
- Developed Informatica mappings, executed sessions, and validated the results.
- Developed SQL and PL/SQL code for various procedures, functions, and packages to implement the business logic in an Oracle database.
- Managed Metadata associated with Informatica. Queried the metadata for reporting purposes.
- Worked on the complete Life cycle of Business Intelligence project with focus on Extraction, Transformation and Loading of data using Informatica Power Center.
- Developed several complex Oracle Scripts.
- Strong in using workflow tasks such as Session, Control, Command, Decision, Event Wait, and Email tasks, as well as pre-session, post-session, and pre/post commands.
- Created, updated and maintained technical documentation, Unit Test plans and release notes.
- Worked as a backup Admin, handled production support issues and resolved issues during Project deployment.
Environment: Informatica PowerCenter 8.6/9.1, Oracle 11g, UNIX, Windows 2000/XP, Quality Center & Toad.
Confidential Overland Park, Kansas City
- Extensively used Informatica 8.1 to load data from sources involving SQL Server and Flat files to DB2 databases.
- Involved in the Extraction, Transformation and loading of the data from various sources into the dimensions and the fact tables in the Data Warehouse.
- Created reusable Transformations and Mapplets and used them in various mappings.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Created Informatica mappings with PL/SQL procedures to build business rules to load data.
- Used most of the common transformations, such as Source Qualifier, Aggregator, Connected & Unconnected Lookup, Filter, and Sequence Generator.
- Assisted in generating reports using the reporting tool Business Objects 6.0.
Environment: Informatica 8.6, Business Objects XI, Oracle 11g, SQL Server 2005, Flat files, SQL, PL/SQL, Windows NT.
Confidential
- Interacted with business community and gathered requirements based on changing needs and incorporated identified factors into Informatica mappings to build Data Marts.
- Extensively worked on Power Center Designer to develop mappings using several transformations such as Filter, Joiner, Lookup, Rank, Sequence Generator, Aggregator and Expression transformations.
- Implemented Type 1 and Type 2 Slowly Changing Dimensions.
- Created parameter files and used mapping parameters and variables for incremental loading of data.
- Involved in performance Tuning of Transformations, mappings, and Sessions for better performance.
- Used pre-session and post-session scripts for dropping and recreating indexes on the target tables.
- Generated test plans and performed unit testing.
- Involved in developing the Tables, Views, Stored Procedures, Materialized views and Indexes.
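The pre-/post-session index handling above is a bulk-load optimization: dropping target indexes before the load so inserts skip index maintenance, then recreating them afterwards. A minimal sketch of the DDL such scripts emit is below; the table and index names are placeholders, not from the actual project.

```python
def pre_session_sql(indexes):
    """Drop target indexes before a bulk load so inserts skip index maintenance."""
    return [f"DROP INDEX {name}" for name in indexes]

def post_session_sql(table, indexes):
    """Recreate the same indexes once the load completes."""
    return [f"CREATE INDEX {name} ON {table} ({cols})"
            for name, cols in indexes.items()]
```

In PowerCenter these statements would typically live in the session's pre-SQL and post-SQL properties, or in shell scripts invoked as pre-/post-session commands.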
Environment: Informatica PowerCenter, Oracle 8i, TOAD, Windows 2000/XP
Confidential
- Involved in design and development activities of the project.
- Interacted with end users and senior staff at the branch, region, and corporate levels to collect user requirements.
- Analyzed requirements and drew E-R diagrams.
- Developed stored procedures and triggers in PL/SQL
- Developed around 35 screens to support all activities
Environment: Oracle 8i, PL/SQL, SQL, SQL Loader, HTML 4.0, XML, Flat files, Windows NT, HP-UX, Shell Scripting