Sr. Lead Informatica Developer/Automation Tester Resume
Columbus, OH
SUMMARY:
- Over 7 years of professional experience in Data Warehousing, Relational Databases and System Integration, with proficiency in gathering and analyzing user requirements and translating them into business solutions.
- Around 5 years of strong experience in the design and development of ETL (Extract, Transform and Load) solutions for data transformation and processing using Informatica PowerCenter (9.x/8.x/7.x) and Informatica Data Quality (IDQ).
- Experience in developing ETL mappings and scripts using Informatica PowerCenter 9.6.1/9.1.0/8.6.1/8.5/8.1 and Informatica Data Analyst (IDA), using Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Repository Manager, Workflow Manager and Workflow Monitor.
- Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data Cleansing, and Data Validation
- Experience in performance tuning of SSIS packages using row transformations and blocking/non-blocking transformations.
- Experience in using SSIS tools like Import and Export Wizard and SSIS package designer.
- Experience in writing SQL queries, functions, packages, cursors and stored procedures, database triggers using PL/SQL in Oracle.
- Expert in creating PL/SQL procedures, packages, functions, database triggers and other database objects to generate back-end and front-end reports.
- Understand the data quality rules defined by the business/functional teams, propose optimizations of these rules where applicable, and design and develop them in IDQ, including complete unit test planning and execution.
- Experience in a wide range of business Domains - HealthCare, Banking and Insurance
- Excellent coding skills in SQL, SQL*Plus, PL/SQL, Procedures/Functions, and Triggers.
- Experienced in integration of various data sources like Salesforce, Oracle, DB2, SQL Server and MS Access into the staging area.
- Good experience with application design and development using PL/SQL (Functions, Procedures, Triggers and Packages).
- Knowledge of data warehousing techniques, Dimensional Data Modeling, Star/Snowflake schemas, Fact and Dimension tables, physical and logical data modeling, OLAP, and report delivery methodologies.
- Expertise in Performance tuning mappings, identifying and resolving performance bottlenecks at various levels like sources, targets, mappings, and sessions.
- Extensive experience in Data profiling and Data cleansing using Informatica tools, SQL scripts and Stored Procedures, and in executing test plans for loading data successfully into target systems.
- Planned, created and executed SSIS packages to integrate data from varied sources like Oracle, flat files and SQL databases and loaded into landing tables of Informatica MDM Hub.
- Experience in OBIEE 10g for scheduling tasks and unit testing ordered tasks.
- Used Teradata utilities like FastLoad, MultiLoad and FastExport.
- Knowledge of Pentaho components like Database Lookup and Join, Generate Rows, Calculator, Row Normalizer and Row Denormalizer.
- Involved in massive data profiling using IDQ (Analyst tool) prior to data staging
- Performed data validation by Unit testing, integration testing and System Testing.
- Developed and supported data marts, and designed applications utilizing SQL data marts using SSIS and SSRS.
- Extensive experience in managing teams/On Shore-Off Shore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
- Installation and configuration of Informatica Server and Informatica Repository Server in Windows-UNIX Operating Systems.
- Data Validation and Profiling based on the business and functional requirements.
- Experience in integration of various data sources like Oracle, SQL Server, Teradata and flat files into staging area.
- Experience in writing database scripts such as SQL queries, PL/SQL Stored Procedures, Indexes, Functions, Views, and Triggers.
- Proficient in T-SQL and in the creation of views, triggers, user-defined functions, stored procedures, and CTEs using dynamic queries, subqueries and joins.
- Experience in writing ETL or API programs using Oracle.
- Developed UNIX shell scripts to FTP source files, validate source files, automated archival of Log files, create ETL event start/stop files.
- Knowledge of reporting tools like Cognos, MicroStrategy, OBIEE and Business Objects, with an understanding of Universes, reports and ad hoc reports.
- Involved in complete software development life cycle (SDLC) of project with experience in domains like Clinical, Finance, and Banking.
- 24x7 Production Support for business continuity.
- Willingness to learn new concepts and ability to articulate alternative solutions.
- Design and build the DR (Disaster Recovery) Informatica Infrastructure setup.
- Informatica infrastructure Capacity planning and Charge back process.
- Experience in preparing documentation like HLD, LLD and Test case documentation.
- Quick learner, ability to work in groups or alone, ability to work on tight schedules.
- Team player with good interpersonal and communication skills and the ability to work in a team environment.
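Several of the bullets above involve data profiling and validation prior to staging. As a minimal sketch (the column names and rules below are hypothetical, not from any specific project), a profiling pass that counts nulls per field and flags duplicate keys might look like:

```python
from collections import Counter

def profile_rows(rows, key_field):
    """Simple data-profiling pass: per-field null counts and duplicate keys.
    Field names and rules are illustrative stand-ins for IDQ profiling rules."""
    null_counts = Counter()
    key_counts = Counter()
    for row in rows:
        for field, value in row.items():
            if value in (None, ""):
                null_counts[field] += 1
        key_counts[row[key_field]] += 1
    duplicates = [k for k, c in key_counts.items() if c > 1]
    return {"total": len(rows), "nulls": dict(null_counts), "duplicate_keys": duplicates}

# Hypothetical customer extract with one blank phone, one null name, one repeated key.
rows = [
    {"cust_id": "C1", "name": "Ann", "phone": ""},
    {"cust_id": "C2", "name": None, "phone": "555-0100"},
    {"cust_id": "C1", "name": "Ann", "phone": "555-0101"},
]
report = profile_rows(rows, "cust_id")
```

A real IDQ profile covers pattern analysis and value-frequency statistics as well; this only shows the completeness/uniqueness idea.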
TECHNICAL SKILLS:
Data Warehousing: Informatica Power Center 10.1, 9.6.1/9.5.1/9.1/9.0.1/8.6/8.5/8.1/7.x, B2B, IDQ, Informatica MDM v10, Informatica B2B/DTS 8.6.2, Informatica Designer, Informatica Cloud REST API, Workflow Manager, Workflow Monitor, ETL, DataMart, Data cleansing, SSIS, Data Profiling, OLAP, OLTP, Mapplet, Transformations, Autosys, SQL*Loader, Control Center, DataStage 11.5
Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.5/4.0, Oracle Designer, Visio
Business Analysis/Data Analysis: Functional Requirements Gathering, User Interviews, Business Requirements Gathering, Process Flow Diagrams, Data Flow Diagrams
Reporting Tools: Business Objects, OBIEE, Microstrategy 9.x, SSRS, Cognos 10.1
Databases: Oracle 11g/10g/9i, MS SQL Server 2012/2008/2005, Sybase ASE 12/12.5.3, Netezza, Teradata, DB2
Languages: C, C++, Java, SQL, T-SQL, PL/SQL, UNIX Shell Scripting, Siebel 7.x, Perl, RubyMine, Python, R
Environment: UNIX variants: HP-UX 10/9, Sun Solaris 8/7/2.6/2.5, AS/400, Linux; Windows 95/98/2000/XP, Windows NT 4.0
Big Data Technologies: Hadoop, Hive, HDFS
Others: MS-Project, MS-Office, MS-VISIO 2010/2013, ESP Scheduling
PROFESSIONAL EXPERIENCE:
Confidential, Columbus, OH
Sr. Lead Informatica Developer/Automation Tester
Roles & Responsibilities:
- Developed mappings and mapplets to load the profiling results using the rules and the logical data objects.
- Developed Informatica mappings using heterogeneous sources and targets including flat files, Oracle, Teradata, SQL Server, DB2, customized XML, etc.
- Enabled pushdown optimization on Teradata using Informatica.
- Developed product/project-level acceptance test cases for all objects using RubyMine and Cucumber for acceptance and IT testing.
- Performed acceptance-test-driven development (ATDD) and TDD.
- As part of test-driven development, tests were written and data generated using Ruby. Once a test was complete, the ETL was developed in Informatica from the source environment (flat files) to Stage (Oracle database), from Stage to Vault (Oracle database), and from Vault to the DataMart (Netezza database).
- Performed pair programming and worked on advanced Informatica concepts including command-line parameterization, multi-loads, concurrent execution, partitioning techniques, and SCD Type I and Type II.
- Ran Informatica workflows through scheduling tools such as CA ESP Workstation and Perl scripts.
- Used the SCM Workbench tool for code promotion to higher environments.
- Involved in the QA, Prod migration of Informatica Workflows, Oracle Tables, and Perl Scripts.
- Performed IT testing through Oracle and Teradata queries and by running Ruby scripts.
- Extensively used various Data Cleansing and Data Conversion functions in various transformations.
- Followed the required client security policies and obtained the required approvals to move code from one environment to another (e.g., from DB2 to Teradata).
- Checked session and error logs to troubleshoot problems, and used the Debugger for complex mappings.
- Developed test plans and developed test cases/test strategy with input from the assigned business analysts and data architect.
- Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings, and tested the ETL Informatica mappings and other ETL processes (Data Warehouse testing).
- Validated the data flow from source to target tables by verifying the mappings and transformations in Informatica.
- Upgraded Informatica repositories from v9.6.1 to v10.1.
Environment: Informatica Power Center 9.6.1, Oracle 11g, MS-SQL Server, Informatica servers on Unix Putty, TOAD, Ruby, Cucumber, SVN, ESP, Teradata, Perl, Harvest-Workbench, MS Office Suite
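The acceptance-test-driven flow described above (tests written before the mapping, then validated stage by stage) can be sketched as a source-to-target reconciliation check. This is a simplified Python stand-in for the Ruby/Cucumber tests, with hypothetical key and row names:

```python
def reconcile(source_rows, target_rows, key):
    """Acceptance-style ETL check: counts must match and every source key
    must land in the target. An illustrative stand-in for the Cucumber
    acceptance tests described above, not a real test harness."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
    }

# Hypothetical stage vs. vault rows: one source row failed to load.
src = [{"id": 1}, {"id": 2}, {"id": 3}]
tgt = [{"id": 1}, {"id": 2}]
result = reconcile(src, tgt, "id")
```

Writing the check before building the mapping is what makes the flow test-driven: the mapping is complete only when the reconciliation passes for each stage.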
Confidential, Ann Arbor, MI
Sr. Informatica Lead/Developer
Roles & Responsibilities:
- Reviewed business and functional requirements for various modules to gain a better understanding of the overall application.
- Used Informatica Power Center 9.6.1 for extraction, loading and transformation (ETL) of data in the data warehouse
- Configured and installed the Informatica MDM Hub Server, Cleanse Server, and Resource Kit in the Development, QA, Pre-Prod and Prod environments.
- Developed complex mappings in Informatica to load the data from various sources
- Exported mappings from IDQ to Informatica Power Center
- Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
- Designed the ETL process to source the data from source systems and load it into ODS tables.
- Created mappings and sessions to implement technical enhancements for data loads.
- Developed complex mappings and SCD Type II mappings in Informatica to load the data from various sources into ODS tables.
- Created MDM mappings using IHA best practices to capture errors and ensure a clean load using Informatica.
- Loaded data to the warehouse by extracting data from sources like Oracle and delimited flat files.
- Used Informatica B2B Data Transformation to support transformations and mappings via XML for most healthcare standards.
- Used the SQL*Plus database tool to write complex SQL queries against other legacy systems to retrieve data and validate it for the Siebel import process.
- Experienced in creating profiles, rules and mappings with IDQ; worked on IDQ transformations like Labeler, Parser, Standardizer, Match, Merge and Exception.
- Involved in data standardization, such as migrating a reference data set to a new standard.
- Parameterized all variables and connections at all levels in UNIX.
- Performed performance tuning of Informatica components for daily and monthly incremental load tables, and performed PL/SQL performance tuning.
- Used Oracle SQL developer for creating PL/SQL (Trigger, sequence, stored procedure).
- Developed stored procedures in PL/SQL for cleaning up data and providing underlying structure for reporting using SQL.
- Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, Mapplets and parameter files in Mapping Designer, using both IDQ and Informatica PowerCenter.
- Created complex stored procedures, Triggers, Functions, Indexes, Tables, Views, SQL joins and other T-SQL code to implement business rules.
- Created SSIS packages to perform filtering operations and to import the data on a daily basis from the OLTP system and Staging DB to the Data Warehouse and Data Marts.
- Transformed data from various data sources using OLE DB connection by creating various SSIS packages.
- Worked on UNIX and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
- Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
- Worked with ETL Migration Team and Migrated Informatica folders from Dev to Test repository and from Test to Prod Repository.
- Profiled the data using the Informatica Analyst tool to analyze source data (departments, party and address) coming from legacy systems, and performed a Data Quality audit.
- Performed Unit testing, Integration testing, and coordinated with QA for UAT for code change and enhancement.
- Established policies and procedures for a data governance group within the corporation to initiate plans for enterprise data.
- Exposure to Informatica B2B Data Transformation, which supports transformation of semi-structured and unstructured data types.
- Involved in redesigning ETL mappings to improve data quality
- Extensively used mapping parameters, mapping variables to provide the flexibility and parameterized the workflows for different system loads.
- Created sessions, worklets and workflows for the mappings to run daily, biweekly and monthly based on the business requirements.
- Created Connected, Unconnected and Dynamic Lookup transformations for better performance.
Environment: Informatica Power Center 9.6.1/9.5, Informatica B2B, Siebel tools, Oracle 11g, Informatica Data Quality (IDQ), DVO, MS-SQL Server, SSIS, Informatica servers on Unix Putty, TOAD, Business Objects XI, MS VISIO, MS Office Suite
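A Type II slowly changing dimension, as used in the SCD mappings above, expires the current row and inserts a new version whenever a tracked attribute changes. A minimal sketch of that merge logic (the customer fields below are hypothetical, not a real schema):

```python
from datetime import date

def scd_type2_merge(dimension, incoming, today):
    """SCD Type II: on an attribute change, close the current row
    (set end_date, current=False) and append a new current row.
    New keys are simply inserted. Fields are illustrative."""
    by_key = {r["cust_id"]: r for r in dimension if r["current"]}
    for rec in incoming:
        cur = by_key.get(rec["cust_id"])
        if cur is None:
            # New customer: first version becomes the current row.
            dimension.append({**rec, "start_date": today, "end_date": None, "current": True})
        elif cur["city"] != rec["city"]:
            # Tracked attribute changed: expire old version, insert new one.
            cur["end_date"] = today
            cur["current"] = False
            dimension.append({**rec, "start_date": today, "end_date": None, "current": True})
    return dimension

dim = [{"cust_id": "C1", "city": "Detroit",
        "start_date": date(2020, 1, 1), "end_date": None, "current": True}]
dim = scd_type2_merge(dim, [{"cust_id": "C1", "city": "Ann Arbor"}], date(2021, 6, 1))
```

In PowerCenter the same effect comes from a Lookup against the dimension plus an Update Strategy routing rows to update or insert; this sketch only shows the versioning rule itself.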
Confidential, Phoenix, AZ
Sr. Informatica Developer
Roles & Responsibilities:
- Analyzed the key functionalities of the institute, performed data analysis, and abstracted the transactional nature of the data.
- Analyzed the source systems for erroneous, duplicate, and integrity issues in the data.
- Reviewed business and functional requirements for various modules to gain a better understanding of the overall application.
- Used Informatica Power Center 9.1/9.6.1 for extraction, loading and transformation (ETL) of data in the data warehouse
- Developed complex mappings in Informatica to load the data from various sources
- Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
- Designed mapping using B2B data transformation.
- Experience in working with SQL Server Integration Services (SSIS).
- Worked with Informatica IDQ to determine Data Quality issues and Remediation process for bad data.
- Developed PL/SQL scripts to transfer tables across the schemas and databases.
- Wrote validation packages using PL/SQL packages and data structures.
- Used ref cursors and collections to access complex data resulting from joins of a large number of tables in PL/SQL blocks, with exception handling.
- Developed various complex stored procedures, packages, interfaces and triggers in PL/SQL.
- Created mappings and sessions to implement technical enhancements for data loads.
- Loaded data to the warehouse by extracting data from sources like Oracle and delimited flat files.
- Involved in data standardization, such as migrating a reference data set to a new standard.
- Parameterized all variables and connections at all levels in UNIX.
- Created test cases for unit testing and functional testing
- Coordinated with testing team to make testing team understand Business and transformation rules being used throughout ETL process.
- Developed various mapping, mapplets/rules using the Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the data using different IDQ transformations.
- Created SSIS packages to Export data from text file to SQL Server Database.
- Created Customized BI reports to meet the user requirements using Cognos.
- Worked with creating Dimensions and Fact tables for the data mart
- Worked with data import/export and integration with legacy systems; modified and designed batch jobs to import/export data between the Siebel database and legacy systems.
- Installed B2B Data Exchange, server plugin and client plugin, and configured them with PowerCenter.
- Created Informatica mappings, sessions, workflows, etc., for loading fact and dimension tables for data mart presentation layer
- Developed simple Data Services jobs to pull data from SAP to Netezza.
- Coordinated with the offshore team to analyze, develop and test migration projects.
- Familiar with Data Governance standards and root-cause detection for data quality problems.
- Have implemented SCD (Slowly Changing Dimensions) Type I and II for data load
- Performed performance tuning by analyzing and comparing turnaround times between SQL and Cognos.
- Did performance tuning of Informatica components for daily and monthly incremental loading tables
- Designed dynamic SSIS packages to transfer data across different platforms, validate and cleanse data during transfer, and archive data files for different DBMSs.
- Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
- Worked on Netezza database for loading data into data warehouse.
- Successfully developed drill down reports and hierarchies, created cascading values using Cognos.
- Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
Environment: Informatica Power Center 9.6.1/9.5, Informatica Data Quality (IDQ), DVO, Informatica B2B 8.6.2, Informatica Cloud REST API, Oracle 11g, MS-SQL Server, Informatica servers on Unix Putty, SSIS, Siebel tools, TOAD, Cognos 10.1, HP Application Lifecycle Management, MS Office Suite
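The SSIS packages above export data from text files into a SQL Server database. The staging-load pattern can be sketched in Python, with sqlite3 standing in for SQL Server and a hypothetical orders table and columns:

```python
import csv
import io
import sqlite3

def load_text_to_staging(conn, text_data):
    """Load a delimited text extract into a staging table, as an SSIS
    text-file-to-database package would. sqlite3 stands in for SQL Server;
    the table name and columns are illustrative."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_orders (order_id TEXT, amount REAL)")
    reader = csv.DictReader(io.StringIO(text_data))
    rows = [(r["order_id"], float(r["amount"])) for r in reader]
    # Parameterized bulk insert, analogous to an SSIS fast-load destination.
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load_text_to_staging(conn, "order_id,amount\nA1,10.5\nA2,20.0\n")
```

Loading raw extracts into a staging table first, then transforming downstream, is what keeps validation and cleansing separate from the initial file ingest.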
Confidential, NYC, NY
Sr. Informatica Developer
Roles & Responsibilities:
- Involved in Preparing the High Level Design (HLD) and Low Level Design (LLD) documents for ETL Informatica process.
- Responsible for the development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica.
- Developed several generic Informatica jobs which can be reused for several processes.
- Designed and developed complex mappings that involved Slowly Changing Dimensions, Error handling, Business logic implementation
- Created sessions, worklets, workflows for the mapping to run daily, biweekly and monthly based on the business requirements.
- Created Connected, Unconnected and Dynamic Lookup transformations for better performance.
- Developed complex mapping logic using various transformations like Expression, Lookups (Connected and Unconnected) Joiner, and Filter, Sorter, Router, Update strategy, Sequence generator, Java, Rank, Aggregator, SQL, Xml, and Normalizer.
- Profiled customer data and identified various patterns of phone numbers to be included in IDQ plans.
- Worked with ETL Migration Team and Migrated Informatica folders from Dev to Test repository and from Test to Prod Repository.
- Involved in uploading of the data from flat files into Databases and validated the data with PL/SQL procedures.
- Wrote PL/SQL stored procedures, functions, Packages and Package constructors to enforce business rules.
- Extensively used bulk collection in PL/SQL objects for improving the performance.
- Worked with the B2B Operation Console and configured partner management.
- Extensively used mapping parameters, mapping variables to provide the flexibility and parameterized the workflows for different system loads.
- Extensively worked on Mapplets, reusable transformations and Worklets, thereby providing flexibility to developers in subsequent increments.
- Created Mapplets and used them in different mappings.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Designed and developed mappings to extract, transform and load data into target tables using different IDQ transformations.
- Used Informatica to extract data from various data sources and data marts like Netezza.
- Implemented SCD Type 1 and SCD Type 2 methodologies in ODS tables loading, to keep historical data in data warehouse.
- Involved in developing the logical and physical data models using Erwin.
- Involved in Upgrade of Informatica Power Center Standard edition to advanced edition.
- Designed and developed approaches to acquire data from new sources like Mainframe (DB2), and AS400 (DB2).
- Created Macros in Teradata to enable Change Data Capture (CDC) to identify the delta and maintain the data mart in sync with the source system.
- Designed mappings to read data from various relational and file source systems such as Teradata, Oracle, flat files and XML files.
Environment: Informatica Power Center 9.1.1, Informatica B2B data exchange, Oracle 11g/10g, Autosys Scheduler, Netezza, Informatica Servers on UNIX Putty, TOAD
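The Teradata CDC macros above identify the delta between the source system and the data mart. One common way to express that comparison is to hash the tracked columns of each row and classify source rows as inserts or updates. A Python sketch under that assumption (column names are hypothetical):

```python
import hashlib

def row_hash(row, cols):
    """Hash the tracked columns so changed rows can be detected cheaply."""
    return hashlib.md5("|".join(str(row[c]) for c in cols).encode()).hexdigest()

def detect_delta(source, mart, key, cols):
    """CDC-style delta detection: new keys are inserts, existing keys whose
    column hash differs are updates. An illustrative stand-in for the
    Teradata macros described above, not their actual implementation."""
    mart_hashes = {r[key]: row_hash(r, cols) for r in mart}
    inserts = [r for r in source if r[key] not in mart_hashes]
    updates = [r for r in source
               if r[key] in mart_hashes and row_hash(r, cols) != mart_hashes[r[key]]]
    return inserts, updates

mart = [{"id": 1, "status": "open"}, {"id": 2, "status": "closed"}]
src = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}, {"id": 3, "status": "new"}]
inserts, updates = detect_delta(src, mart, "id", ["id", "status"])
```

Only the delta then needs to flow through the mappings, which is what keeps the mart in sync without full reloads.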
Confidential, Buffalo, NY
Informatica Developer
Roles & Responsibilities:
- Created different transformations using Informatica power center 8.6 for loading the data into targets using various transformations like Source Qualifier, Java Transformation, SQL transformation, Joiner transformation, Update Strategy, Lookup transformation, Rank Transformations, Expressions, Aggregator, and Sequence Generator. Designed the mappings between sources (external files and databases) to operational staging targets.
- Prepared a detailed Technical Design Document after analyzing the Functional Specifications and the Architectural Diagram. The Technical Design Document captured all functional and technical requirements.
- Developed numerous Complex Informatica Mappings, Mapplets and reusable Transformations.
- Designed and created complex source to target mapping using various transformations inclusive of but not limited to Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
- Extensively used Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions.
- User and group management (assigned privileges and permissions to users/groups).
- Upgraded Informatica repositories from v8.6.0 to v9.1.0.
- Configured high-availability recovery and set sessions to recover automatically in case of failure.
- Security management (accessibility, folder and group permissions).
- Server administration (start-up/shutdown, configuring services) and system maintenance (log analysis, backups and archival).
- Used Informatica to extract data from DB2, UDB, XML, flat files and Excel files, and loaded the data into Teradata.
- Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
- Worked on Informatica Power Center tool - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor
- Used Power Exchange to source copybook definition and then to row test the data from data files etc.
- Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
Environment: Informatica Power Center 9.1, Oracle 11g, Power Exchange 9.1, Teradata V13.0, Fast Load, Multiload, Teradata SQL Assistant, MS SQL Server, UNIX.
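The lookup caches above differ in when they are refreshed: a static cache is built once before the load, while a dynamic cache is updated as rows are inserted, so duplicates arriving within the same load are caught. A sketch of the dynamic behavior (the records are hypothetical):

```python
def dynamic_lookup_load(target, incoming, key):
    """Dynamic-lookup-cache behavior: the cache is updated as rows are
    inserted, so a key repeated within the same load is skipped.
    With a static cache, both copies would pass the lookup.
    Illustrative sketch only, not Informatica's implementation."""
    cache = {r[key] for r in target}   # built from the target, like a lookup cache
    inserted, skipped = [], []
    for rec in incoming:
        if rec[key] in cache:
            skipped.append(rec)
        else:
            cache.add(rec[key])        # dynamic: cache reflects the new row at once
            inserted.append(rec)
            target.append(rec)
    return inserted, skipped

target = [{"id": 1}]
inserted, skipped = dynamic_lookup_load(target, [{"id": 2}, {"id": 2}, {"id": 1}], "id")
```

This is why dynamic caches are the usual choice when the source can deliver the same key twice in one run.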
Confidential
ETL/Informatica Developer
Roles & Responsibilities:
- Responsible for dimensional modeling of the data warehouse to design the business process.
- Parsing high-level design specification to simple ETL coding and mapping standards.
- Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
- Worked on SQL tools like TOAD to run SQL queries and validate the data.
- Worked on database connections, SQL Joins, views in Database level.
- Created Informatica mappings with PL/SQL procedures to build business rules to load data.
- Most of the transformations used were Source Qualifier, Aggregator, Connected and Unconnected Lookup, Filter, and Sequence Generator.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
- Worked extensively with the connected lookup Transformations using dynamic cache.
- Worked with complex mappings having an average of 15 transformations.
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
- Monitored Workflows and Sessions using Workflow Monitor.
- Performed Unit testing, Integration testing and System testing of Informatica mappings
- Coded PL/SQL scripts.
- Implemented design improvements to increase scalability of the target data mart and avoid duplicate records, including increased parallelism to allow sessions to be spread across all available processors. This scalability proved crucial in allowing enterprise-wide usage of the data mart.
- Used Informatica repository manager to backup and migrate metadata in development, test systems.
- Involved in data cleansing, mapping transformations and loading activities.
- Developed Informatica mappings and Mapplets and also tuned them for Optimum performance, Dependencies and Batch Design.
Environment: Informatica Power Center 8.6.1, Oracle 10g, Power Exchange 8.6, MS SQL Server 2008, UNIX, Erwin Data Modeler 4.1, Autosys