Sr. Informatica Developer Resume
Atlanta, GA
PROFESSIONAL SUMMARY
- Over 7 years of extensive experience with Informatica Power Center in all phases of analysis, design, development, implementation, and support of data warehousing applications using Informatica Power Center 10.x/9.x/8.x/7.x and IDQ.
- Extensive experience using tools and technologies such as Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2, Teradata 13/12/V2R5/V2R4, SQL, PL/SQL, SQL*Plus, stored procedures, and triggers.
- Strong data modeling experience using Star/Snowflake schemas, Dimensional Data modeling, Fact & Dimension tables, Physical & logical data modeling.
- Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse.
- Involved in troubleshooting data warehouse bottlenecks, performance tuning (session and mapping tuning), session partitioning, and implementing pushdown optimization.
- Designed and developed complex ETL mappings using Source Qualifier, Application Source Qualifier, XML Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator transformations.
- Strong understanding of OLTP, OLAP, data warehousing, and the ETL process using Informatica Power Center 10.x/9.x/8.x.
- Expertise in developing mappings, mapplets, sessions, workflows, worklets, and tasks using Informatica, and in creating parallel and sequential jobs and parameter files.
- Worked with the Informatica Data Quality (IDQ) 10 toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Hands-on experience with Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
- Experience with Teradata utilities such as FastLoad, FastExport, and MultiLoad, and in creating Basic Teradata Query (BTEQ) scripts.
- Experience in all phases of data warehouse development, from requirements gathering through coding, unit testing, and documentation.
- Experience in integration of various data sources like SQL Server, Oracle, Teradata, Flat files, DB2.
- Experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, DTS.
- Extensively used Control-M for scheduling Informatica workflows and created UNIX shell scripts and Perl scripts (a minimal wrapper sketch follows this summary).
- Extensively worked on various transformations like Lookup, Joiner, Router, Rank, Sorter, Aggregator, Expression, etc.
- Strong understanding of data warehousing, data architecture, data modeling, data analysis, and SDLC methods.
- Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
- Experienced in dealing with outsourced technical resources and coordinating global development efforts.
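Below is a minimal sketch of the kind of Control-M-invoked shell wrapper referenced above for starting an Informatica workflow with pmcmd; the service, domain, folder, workflow, and credential names are illustrative assumptions, not actual project values.

```sh
#!/bin/ksh
# Hypothetical wrapper called by Control-M to run an Informatica workflow.
# Service, domain, folder, and workflow names are illustrative; INFA_PASSWD
# is assumed to be exported by the job's environment.
pmcmd startworkflow \
  -sv IS_PROD -d Domain_PROD \
  -u etl_user -pv INFA_PASSWD \
  -f FOLDER_SALES -wait wf_load_sales_daily
rc=$?

if [ $rc -ne 0 ]; then
  echo "Workflow wf_load_sales_daily failed with return code $rc"
  exit $rc          # non-zero exit lets Control-M mark the job as failed
fi
echo "Workflow wf_load_sales_daily completed successfully"
```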
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 10.1/9.6/9.5/8.6/7.1, Informatica Data Quality (IDQ), SSIS, etc.
DB Tools: Oracle 11g/10g/9i, SQL Server 2008/2005, IBM DB2, Teradata 13.1/V2R6/V2R5, Sybase, MS Access
Languages: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl, Python.
Operating Systems: UNIX, Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS.
Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.
Scheduling Tools: Control-M, Autosys, TWS.
PROFESSIONAL EXPERIENCE
Confidential, Atlanta, GA
Sr. Informatica Developer
Responsibilities:
- Working with business users and business analysts for requirements gathering and business analysis.
- Designing and customizing data models for the data warehouse, supporting data from multiple sources in real time.
- Worked on Informatica Power Center 10.1: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Designer. Developed Informatica mappings and tuned them for better performance.
- Extensively used Power Center to design multiple mappings with embedded business logic.
- Involved in data profiling using IDQ (Analyst Tool) prior to data staging.
- Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the mapping designer.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target files efficiently.
- Extracted data from different flat files, MS Excel, and MS Access, transformed the data based on user requirements using Informatica Power Center, and loaded it into targets by scheduling the sessions.
- Extensively involved in performance tuning of Informatica ETL mappings by using caches, overriding SQL queries, and using components such as parameter files and variables.
- Wrote various UNIX shell scripts for scheduling data cleansing scripts and loading processes and for automating the execution of maps.
- Wrote unit test scripts to test the developed mappings.
- Involved in business document walkthroughs with functional teams to design application documents, mapping documents, and data flow diagrams.
- Developed mapplets and rules using Expression, Labeler, and Standardizer transformations in IDQ.
- Deployed code from IDQ to Power Center.
- Extensively used debugger to test the logic implemented in the mappings.
- Performed error handling using session logs and tuned Informatica session performance for large data files by increasing block size, data cache size, and the target-based commit interval.
- As per business requirements, implemented control tables to track the operations of source and target systems.
- Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture (an illustrative SQL sketch of the Type 2 pattern follows this list).
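The Type 2 change data capture above was built as Power Center mapping logic; the following is only an illustrative SQL equivalent of the expire-and-insert pattern, run through a sqlplus here-document, with hypothetical table, column, and connection names (DB_USER, DB_PASS, and DB_TNS are assumed to be exported).

```sh
#!/bin/ksh
# Illustrative SQL equivalent of the SCD Type 2 expire-and-insert pattern.
# Table, column, and connection names are hypothetical placeholders.
sqlplus -s "${DB_USER}/${DB_PASS}@${DB_TNS}" <<'EOF'
-- Expire the current dimension row when a tracked attribute changes
UPDATE dim_customer d
   SET d.eff_end_dt   = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address    <> d.address);

-- Insert a new current version for changed and brand-new customers
INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y');

COMMIT;
EXIT;
EOF
```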
Environment: Informatica Power Center 10.1, Informatica Data Quality 9.6.1, Oracle 11g, UNIX, DB2, Control-M Job Scheduler, FTP Client, IBM Data Studio, PuTTY, JIRA, Confluence.
Confidential, Los Angeles, CA
Sr. Informatica/IDQ Developer
Responsibilities:
- Worked with the Informatica Data Quality (IDQ) 9.6.1 toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Performed system analysis and requirements analysis, and designed and wrote technical documents and test plans.
- Created a hybrid process in IDQ by combining the IDQ Developer and Analyst versions through Logical Data Objects (LDOs).
- Worked on IDQ Analyst for Profiling, Creating rules on Profiling and Scorecards.
- Worked with management to create requirements and estimates for the project.
- Coordinated with the DBA in creating and managing tables, indexes, tablespaces, auditing, and data quality checks.
- Designed IDQ mappings that are used as mapplets in Power Center.
- Writing and tuning SQL queries, views, stored procedures, and functions in support of Data Quality.
- Involved in gathering business requirements for the data-warehouse as well as business-intelligence reports to be used by the management.
- Involved in massive data profiling using IDQ (Analyst Tool).
- Used IDQ's standardized plans for address and name cleanups.
- Worked on IDQ file configuration on users' machines and resolved issues.
- Used IDQ to complete initial data profiling and removing duplicate data.
- Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
- Worked on data cleansing and standardization using the cleanse functions in Informatica IDQ.
- Developed numerous mappings using the various transformations including Address Validator, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, Parser etc.
- Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
- Created Complex ETL Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, and Connected and unconnected lookups, Filters, Sequence, Router and Update Strategy.
- Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
- Implemented Slowly Changing Dimensions (SCD, Both Type 1 & 2).
- Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
- Created post-session UNIX scripts to perform operations such as copying, removing, and touching files, and automated the entire process using UNIX shell scripts (a minimal sketch follows this list).
- Extensively worked on Autosys to schedule the jobs for loading data.
- Developed Test Scripts, Test Cases, and SQL QA Scripts to perform Unit Testing, System Testing and Load Testing.
- Responsible for creating reusable transformations and complex mappings, partitioning.
- Communicated with business users to understand their problems when required, provided workarounds or steps to resolve issues, identified root causes, and advised on any required enhancements.
- Designed and developed mappings, sessions, and workflows as per the requirements and standards.
- Responsible for making changes in the existing configuration wherever required, making customizations according to the business requirements, testing and successfully moving the changes into production.
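Below is a minimal sketch of the post-session file-handling script mentioned above; directory paths, file patterns, and the retention window are hypothetical.

```sh
#!/bin/ksh
# Hypothetical post-session script: archive processed source files,
# purge old archives, and touch a marker file for the next run.
SRC_DIR=/infa/srcfiles/sales
ARC_DIR=/infa/archive/sales
RETENTION_DAYS=30

# Copy processed files to the archive area with a date stamp, then remove them
for f in "$SRC_DIR"/*.dat; do
  [ -f "$f" ] || continue
  cp "$f" "$ARC_DIR/$(basename "$f").$(date +%Y%m%d)"
  rm "$f"
done

# Purge archives older than the retention window
find "$ARC_DIR" -type f -mtime +$RETENTION_DAYS -exec rm -f {} \;

# Touch an indicator file used by the downstream workflow's event-wait task
touch "$SRC_DIR/.load_complete"
```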
Environment: Informatica Data Quality 9.6.1, Informatica Power Center 9.6.1, Oracle 11g, WinSQL 10, UNIX, WinSCP, PuTTY, shell scripts, Microsoft SQL Server Management Studio, Application Lifecycle Management (ALM).
Confidential, ST. Louis, MO
Sr. Informatica / Teradata Developer
Responsibilities:
- Documented high and low-level design document specifications for source-target mapping, based on the transformation rules.
- Documented technical requirements for ETL process and Design documents for each source. Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration.
- Experience with incremental changes in the source systems for updating the staging area and data warehouse respectively.
- Created Mappings using the transformations like Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Joiner, etc.
- Analyzed the business requirements and functional specifications.
- Extensively worked on data extraction, Transformation and loading data from various sources like Oracle, SQL Server and Flat files.
- Used Informatica Power Center tool for extraction, transformation and load (ETL) of data in the data warehouse.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
- Using Informatica PowerCenter, created mappings to transform the data according to the business rules.
- Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
- Created tasks, workflows, and sessions to move the data at specific intervals on demand using Workflow Manager.
- Understood the requirements specification and analyzed the source systems.
- Migrated data with the help of Teradata FastExport, INSERT/SELECT, and flat files from one system to another.
- Involved in Unit, Integration, System, and Performance testing levels.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Used the BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD) Teradata utilities to export and load data to/from flat files (a minimal BTEQ sketch follows this list).
- Involved in test data set up and data quality testing and Unit testing and Integration Testing.
- Used BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
- Performed unit testing for the developed jobs to ensure they meet the requirements.
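Below is a minimal BTEQ sketch of the kind referenced above; the TDPID, credentials, and table names are hypothetical, and TD_PASSWORD is assumed to come from an exported environment variable.

```sh
#!/bin/ksh
# Hypothetical BTEQ load: insert staged rows into a Teradata target with
# basic error handling. TDPID, user, and object names are illustrative.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD};

INSERT INTO edw_db.fact_orders (order_id, customer_id, order_dt, order_amt)
SELECT order_id, customer_id, order_dt, order_amt
FROM   stg_db.stg_orders;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
  echo "BTEQ load of edw_db.fact_orders failed with return code $rc"
  exit $rc
fi
```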
Environment: Informatica Power Center 9.5, Teradata 14.0, Informatica Data Quality 9.5, Oracle 11g, PL/SQL, UNIX, WinSCP, DB2, ServiceNow ticketing system.
Confidential, Denton, TX
Sr. Informatica Developer
Responsibilities:
- Designed the mapping document, which serves as a guideline for ETL coding; standards for naming conventions and best practices were followed in mapping development.
- Extensively used various Informatica tasks such as Decision, Command, Event Wait, Event Raise, Assignment, Timer, Control, Link, and Email.
- Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, lookups, Stored procedure and data flow management into multiple targets using router transformations.
- Optimized the SQL override to filter unwanted data and improve session performance.
- Used Debugger to track the path of the source data & also to check the errors in mapping.
- Prepared unit testing documents covering field-to-field validations and source-to-target counts.
- Scheduled workflows comprising different sessions for their respective mappings in order to load data into the Oracle database.
- Handled slowly changing dimensions of Type 1 and Type 2 to populate current and historical data to dimensions and fact tables in the Data Warehouse.
- Conducted data analysis and helped business leads in understanding and designing new reports.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
- Automated job processing and established automatic email notification to the concerned persons.
- Created an automated process to monitor space in the data directory using Perl and Korn shell (a minimal sketch follows this list).
- Handled creation, modifications, and documentation of Oracle Packages, Procedures, Functions, and Indexes.
- Assisted the other ETL Developers in resolving complex scenarios.
- Involved in promoting the folders from Development to Test and Test to UAT and UAT to Production Environment.
- Involved in different Team review, Time estimation and UAT meetings.
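Below is a minimal sketch of the data-directory space monitor mentioned above; the path, threshold, and mail recipients are hypothetical.

```sh
#!/bin/ksh
# Hypothetical space monitor for the Informatica data directory.
# Path, threshold, and mail recipients are illustrative.
DATA_DIR=/infa/shared/data
THRESHOLD=85
NOTIFY="etl-support@example.com"

# df -k use% is typically the 5th field of the second output line
USED=$(df -k "$DATA_DIR" | awk 'NR==2 {gsub("%", "", $5); print $5}')

if [ "$USED" -ge "$THRESHOLD" ]; then
  echo "Usage on $DATA_DIR is ${USED}% (threshold ${THRESHOLD}%)" |
    mailx -s "ETL data directory space alert" "$NOTIFY"
fi
```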
Environment: Informatica Power Center 9.1, SSIS, Oracle 11g, Toad, Autosys, PL/SQL, SQL*Plus, XML, Unix Shell Scripting, ClearCase (migration tool).
Confidential
Data warehouse Developer
Responsibilities:
- Involved in the design, development and implementation of the Enterprise Data Warehousing (EDW) process.
- Provided data warehouse expertise including data modeling, Extract, Transform and Load (ETL) analysis, design and development.
- Hands-on experience working with Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplets to extract, transform, and load data.
- Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Worked wif various Active and Passive transformations like Source Qualifier, Sorter, Aggregator, Filter, Union, and Router Transformations, Sequence Generator and Update Strategy Transformations.
- Handled versioning and dependencies in Informatica.
- Developed schedules to automate the update processes and Informatica sessions and batches.
- Resolved technical and design issues.
- Developed data transformation processes and maintained and updated loading processes.
- Developed and implemented UNIX shell scripts for the start and stop procedures of the sessions (a minimal sketch follows this list).
- Used UNIX shell scripts to run the batches.
- Developed standards and procedures to support quality development and testing of data warehouse processes.
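Below is a minimal sketch of a start/stop wrapper of the kind mentioned above; the Integration Service, domain, folder, and workflow names are illustrative, and INFA_PASSWD is assumed to be an exported environment variable.

```sh
#!/bin/ksh
# Hypothetical start/stop wrapper for an Informatica workflow.
# Service, domain, folder, and workflow names are illustrative placeholders.
ACTION=$1            # "start" or "stop"
WF=wf_edw_nightly_load

case "$ACTION" in
  start)
    pmcmd startworkflow -sv IS_EDW -d Domain_EDW \
      -u etl_user -pv INFA_PASSWD -f FOLDER_EDW "$WF"
    ;;
  stop)
    pmcmd stopworkflow -sv IS_EDW -d Domain_EDW \
      -u etl_user -pv INFA_PASSWD -f FOLDER_EDW "$WF"
    ;;
  *)
    echo "Usage: $0 {start|stop}"
    exit 1
    ;;
esac
```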
Environment: Informatica Power Center 9.1, Oracle 10g, SQL Server, Autosys, UNIX, Shell scripting.