Informatica Developer Resume
San Francisco, CA
SUMMARY:
- Around 8 years of professional IT experience in Analysis, Design, Development, Testing and Implementation of various Data warehousing Software Applications.
- Over six years of experience with Informatica PowerCenter and Power Exchange, with strong business understanding and knowledge of Financial, Marketing, Consumer, and Inventory Management projects. Hands-on experience in all aspects of the Software Development Life Cycle (SDLC).
- Expertise in Informatica PowerCenter 7.x/8.x/9.1 Designer tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer, along with SSIS, Workflow Manager, IDQ, InfoSphere MDM, and Workflow Monitor.
- Experience in extracting, transforming, and loading data for data warehousing using Informatica PowerCenter and Power Exchange from heterogeneous source systems such as flat files, Excel, XML, UDB DB2, Teradata, mainframe files, SAP R/3, Oracle, SSIS, Sybase, and SQL Server.
- Experience with relational and dimensional modeling; created Star and Snowflake models for dimension and fact tables in data warehousing applications using Erwin.
- Knowledge of dimensional modeling (Ralph Kimball methodology), relational (3NF) modeling of OLTP systems (Bill Inmon methodology), and the hybrid approach combining star schema and 3NF.
- Expertise in implementing complex business rules by creating robust mappings, Mapplets, and reusable transformations such as Source Qualifier, Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Good Experience in Talend DI Administration, Talend Data Quality and Talend Data Mapping.
- Experience in data migration across different databases and in upgrading Informatica from version 8.6 to 9.1.
- Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).
- Extensive experience in implementing CDC using Informatica Power Exchange 8.x/7.x.
- Good working knowledge of the Big Data ecosystem: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop, MongoDB, and Flume.
- Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from different servers.
- Experience in Talend Big Data Integration to meet business demands for Hadoop and NoSQL.
- Redacted JSON-format document files and used MongoDB (NoSQL) to analyze the data.
- Designed complex UNIX scripts and automated them to run the workflows daily, weekly, and monthly.
- Expert in implementing Type 1 and Type 2 Slowly Changing Dimensions/CDC as per requirements (an SCD Type 2 sketch follows at the end of this summary).
- Hands-on experience in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
- Expertise in Session partitioning, tuning of lookup, aggregator and joiner caches for performance enhancements.
- Expertise in sharding distributed Cassandra and MongoDB systems. Experience in building Cassandra clusters and monitoring them for resource utilization.
- Strong in database programming using PL/SQL and SQL with creation of packages, stored procedures, functions, triggers, materialized views, cursors and performance tuning of SQL.
- Practical knowledge of data warehousing concepts and principles (Kimball/Inmon): Star schema, Snowflake schema, surrogate keys, normalization/denormalization, and fact and dimension tables.
- Strong in scheduling ETL loads using job scheduling utilities such as Control-M, ESP, Tidal Scheduler, and Autosys.
- Good experience in performing unit testing, System Integration Testing (SIT), and User Acceptance Testing (UAT), and in providing production support for issues raised by application users.
- Ability to work effectively as a team member as well as individually. Experience working and coordinating in offshore/onsite models.
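The Type 2 SCD pattern referenced above can be summarized in a short sketch. This is a minimal illustration, not production code: the customer_dim table, its columns, and the load dates are hypothetical, and sqlite3 stands in for the actual warehouse database so the example runs standalone.

```python
# Minimal Type 2 SCD sketch; sqlite3 stands in for the warehouse database
# so the example runs standalone. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE customer_dim (
        customer_id INTEGER,   -- natural key from the source system
        name        TEXT,      -- tracked attribute
        eff_date    TEXT,      -- version effective start date
        end_date    TEXT,      -- '9999-12-31' while current
        current_flg TEXT       -- 'Y' marks the active version
    )""")

def scd2_upsert(customer_id, name, load_date):
    """Expire the current row when the tracked attribute changes, then
    insert a new current version; history is never overwritten."""
    cur.execute("SELECT name FROM customer_dim "
                "WHERE customer_id = ? AND current_flg = 'Y'",
                (customer_id,))
    row = cur.fetchone()
    if row is not None and row[0] == name:
        return                              # unchanged: nothing to do
    if row is not None:                     # changed: close the old version
        cur.execute("UPDATE customer_dim SET end_date = ?, current_flg = 'N' "
                    "WHERE customer_id = ? AND current_flg = 'Y'",
                    (load_date, customer_id))
    cur.execute("INSERT INTO customer_dim VALUES (?, ?, ?, '9999-12-31', 'Y')",
                (customer_id, name, load_date))

scd2_upsert(1, "Acme Corp", "2015-01-01")
scd2_upsert(1, "Acme Corporation", "2015-06-01")  # rename -> two versions
print(cur.execute("SELECT * FROM customer_dim ORDER BY eff_date").fetchall())
```

A Type 1 load differs only in that the UPDATE overwrites the attribute in place instead of closing out a version.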
TECHNICAL SKILLS:
RDBMS: Oracle 11g/10g/9i/8i, SQL Server 2005/2008, Teradata V2R5, DB2 UDB 8.1, MongoDB
ETL Tools: Informatica PowerCenter 9.1/8.6/8.5/8.1/7.x, Power Exchange, InfoSphere MDM, Power Connect, Talend
BI Tools: Business Objects XI R2/R1/6.x, Cognos
Operating Systems: UNIX, IBM AIX 4.3/4.2, Windows NT/2000/XP
Job Scheduling Tools: Control-M 7.0, TIDAL Scheduler, Autosys 4.5
Defect Management Tools: HP Quality Center 9.x, Test Director 7.x, Remedy 7.x
Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, XML, SSIS, C, C++, C#, COBOL, JavaScript, NZSQL
Data Modeling: ER Studio, Erwin 4.1, UML, Kimball Data Modeling
Database frontend tools: Oracle SQL Developer, Aqua Data Studio 8.0.7, Quest Toad for Oracle, DB Artisan.
PROFESSIONAL EXPERIENCE:
Confidential, San Francisco, CA
Informatica Developer
Responsibilities:
- Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
- Involved in requirement analysis and ETL design, and performed extensive data cleansing prior to loading staging tables from Oracle, MDM, flat files, DB2 UDB, SSIS, IDQ, SQL Server, MySQL, and XML sources.
- Developed complex Informatica PowerCenter mappings with transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, MDM, IDQ, Filter, Rank, Sequence Generator, and Update Strategy.
- Created Mapplets, reusable transformations and used them in different mappings.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Extensively worked on Teradata stored procedures and functions to conform the data and load it into the target tables.
- Developed the required data warehouse model using Star schema for the generalized model following Kimball Methodology.
- Implemented Star schema logical, physical dimensional modeling techniques for data warehousing dimensional and fact tables using Erwin tool.
- Developed and implemented the UNIX shell script for the start and stop procedures of the sessions.
- Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.
- Implemented read preferences in a MongoDB replica set (see the pymongo sketch after this list).
- Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain the complete historical data.
- Created Talend Mappings to populate data into dimensions and fact tables.
- Worked on handling performance issues of Informatica mappings and InfoSphere MDM, evaluating current logic for tuning possibilities; created PL/SQL procedures, triggers, and views for better performance.
- Tuned SQL Queries in Source qualifier Transformation for better performance.
- In-depth understanding of MongoDB HA strategies, including replica sets and sharding
- Tuned the Informatica ETL code at the mapping and session levels.
- Worked on IDQ and Power Exchange for change data capture (CDC).
- Developed design and transformation techniques for implementing Master Data Management.
- Experience in using Talend Administration Console (TAC) for scheduling and executing Incremental ETL loads.
- Wrote test plans and executed them during unit testing, and supported system testing, volume testing, and user testing.
- Experience in configuring high availability using MongoDB replica sets across multiple data centers.
- Expert in Talend job migration and deployment to different environments; successfully scheduled jobs in TAC.
- Provided production support by monitoring daily processes and provided data to the reporting team for their daily, weekly, and monthly reports.
- Worked on Data Warehouse and Business Intelligence projects along with the Informatica and Talend (ETL) team.
- Participated in weekly and bi-monthly team status meetings.
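As a companion to the replica-set work above, here is a minimal pymongo sketch of routing reads to secondaries; the hostnames, replica-set name, database, and collection are placeholders, not details from the project.

```python
# Hedged pymongo sketch of read preferences on a replica set; hostnames,
# the replica-set name, and the database/collection are placeholders.
from pymongo import MongoClient, ReadPreference

client = MongoClient(
    "mongodb://node1.example.com,node2.example.com,node3.example.com",
    replicaSet="rs0")

# Send reads to a secondary when one is available (writes still go to the
# primary); acceptable here because reporting can tolerate slight staleness.
db = client.get_database("reporting",
                         read_preference=ReadPreference.SECONDARY_PREFERRED)

for doc in db.daily_loads.find({"status": "COMPLETE"}).limit(10):
    print(doc["_id"])
```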
Environment: Informatica PowerCenter 9.1, MDM, IDQ, Teradata, Power Exchange, Oracle 10g, Kimball Data Modeling, MongoDB, SQL Server 2008, MySQL, Toad, Talend 4.1, SQL, PL/SQL (Stored Procedures, Triggers, Packages), Erwin, SSIS, MS Visio, Tidal, Windows XP, AIX, WebSphere, UNIX Shell Scripts, XML.
Confidential, Baltimore, MD
Informatica Developer
Responsibilities:
- Performed major role in understanding the business requirements and designing and loading the data into data warehouse (ETL)
- Created Informatica PowerCenter 8.6 mappings to populate the data into dimension and fact tables.
- Used Informatica 8.6 client tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer) for defining source and target definitions, and coded the process of data flow from the source systems to the data warehouse.
- Used Teradata Data Mover to copy data and objects, such as tables and statistics, from one system to another.
- Part of Business Intelligence and Warehousing (BIW) group, supported design and development of Enterprise-wide, integrated Data Warehouse environment for future deployment of Decision support data marts utilizing Kimball’s methodology
- Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements
- Used different transformations like Aggregator, Lookup, Filter, Expression, Router, Update Strategy and Sequence Generator.
- Created mappings for various sources like DB2, IDQ, MDM, SSIS, Oracle, etc., to load and integrate instrument-level details into the warehouse.
- Created Dimensional and Relational Physical & logical data modeling fact and dimensional tables using Erwin and ER/Studio.
- Used the Informatica Debugger to test mappings and fix bugs.
- Developed UNIX shell scripts to get data from external systems' flat files to the EDW stage area, and to schedule various workflows to load the data into the target system (a pmcmd sketch follows this list).
- Used MongoDB third-party tools (Robo Mongo, MongoOwl, MongoVUE) and MongoDB built-in binaries to monitor and analyze MongoDB performance.
- Experience integrating Talend, Hive, and NoSQL from source databases to target databases.
- Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status
- Participated in performance tuning of ETL maps at the mapping, session, source, and target levels, as well as writing complex SQL queries against an abstract data model.
- Used techniques like source query tuning, single pass reading and caching lookups to achieve optimized performance in the existing sessions.
- Experience managing security/encryption for MongoDB installations
- Defined various ETL jobs and data extraction and data loading jobs using Talend.
- Involved in Unit testing, System testing and User Acceptance Testing.
- Involved in Production support and trained the other developers to handle issues.
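The shell-script scheduling described above typically wraps Informatica's pmcmd utility. Below is a hedged sketch of that pattern in Python via subprocess; the integration service, domain, folder, workflow name, and credentials are all assumed placeholders.

```python
# Hedged sketch of the pmcmd wrapper pattern; the integration service,
# domain, credentials, folder, and workflow names are assumed placeholders.
import subprocess

def start_workflow(folder, workflow):
    """Start an Informatica workflow with pmcmd and fail loudly on error."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", "INT_SVC_DEV",     # integration service (assumed name)
           "-d", "Domain_Dev",       # Informatica domain (assumed name)
           "-u", "etl_user",
           "-p", "etl_password",     # real scripts would use pmpasswd/env vars
           "-f", folder,
           "-wait",                  # block until the workflow finishes
           workflow]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"{workflow} failed:\n{result.stderr}")

start_workflow("EDW_STAGE", "wf_load_stage_from_flat_files")
```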
Environment: Informatica PowerCenter 8.6/8.0, SQL, PL/SQL, MDM, IDQ, Teradata, Kimball Data Modeling, Oracle 10g, MS SQL Server 2005, MySQL, Toad 8.5, Sybase, Erwin, UNIX, WebSphere, Shell Scripting, AIX, SSIS, Flat Files, Windows NT/2000.
Confidential, Dallas, TX
Informatica Developer
Responsibilities:
- Worked with source system developers, technical experts, business owners, and users to identify data sources for defining data extraction methodologies. Analyzed the source data and made decisions on appropriate extraction, transformation, and loading strategies for the data warehouse.
- Worked exclusively with Informatica 9.1 designer tools: Source Analyzer, Mapping Designer, Mapplet Designer, Workflow Manager, IDQ, and Workflow Monitor.
- Used the conversion process for SAP, VSAM, DB2, InfoSphere MDM, IDQ, and mainframe source files using Informatica Power Exchange.
- Experienced in logical and physical data modeling of staging and target tables for data warehousing environments using the data modeling tools Erwin and Oracle Designer.
- Expert in using Agile/Kimball Data Modeling Methodology.
- Installed, Configured Talend ETL on single and multi server environments.
- Imported Metadata from Teradata tables.
- Modified reports and Talend ETL jobs based on the feedback from QA testers and Users in development and staging environments.
- Familiar with MongoDB write concern to avoid loss of data during system failures (a write-concern sketch follows this list).
- Experience with security, authentication, and authorization.
- Well versed in Kimball data warehouse Philosophies. Experienced in Normalization/De-normalization techniques for optimum performance in OLTP and OLAP environments.
- Developed the source data extraction process from the source systems to intermediate staging, and the loading from staging to the data warehouse.
- Implemented Slowly Changing Dimensions Type I & II with Change Data Capture to capture history, and created process-flow mappings for insert/update using effective end dates while loading data into the data warehouse.
- Created Workflows and sessions using Informatica workflow manager and monitor the workflow run and statistic properties on Informatica Workflow Monitor.
- Involved in performance tuning of sources, targets, SSIS, mappings, and SQL queries in the transformations to increase performance.
- Created Mapping parameters and variables and Session parameters according to the requirements and performance related issues.
- Worked on Data Warehouse and Business Intelligence projects along with the Informatica and Talend (ETL) team, using Cognos 10, Impromptu, and PowerPlay.
- Used SQL tools like SQL Developer, TOAD to run SQL queries to validate the data.
- Involved in working with DBAs and data modelers for project coordination. Involved in creating detailed designs for each interface.
- Defined collections to store data in MongoDB.
- Created UNIX shell scripts using pmcmd to automate workflow and session scheduling.
- Used Control-M as the scheduling tool for scheduling Batch jobs.
- Expertise in MongoDB schema design using DBRef and manual references.
- Created test cases, test plans, test scripts, and system integration and UAT plans to deliver a quality product.
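A brief pymongo sketch of the write-concern safeguard mentioned above; the connection string, database, and collection names are illustrative assumptions.

```python
# Hedged pymongo sketch of a majority write concern; the host, database,
# and collection names are illustrative assumptions.
from pymongo import MongoClient, WriteConcern

client = MongoClient("mongodb://node1.example.com", replicaSet="rs0")

# w="majority" makes the write survive a primary failover; j=True also
# waits for the on-disk journal before acknowledging.
runs = client.etl_audit.get_collection(
    "load_runs", write_concern=WriteConcern(w="majority", j=True))

runs.insert_one({"workflow": "wf_load_dim_customer", "status": "COMPLETE"})
```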
Environment: Informatica PowerCenter 9.1, Teradata V2R6, IDQ, Power Exchange, InfoSphere MDM, Kimball Data Modeling, Oracle 10g, DB2 UDB, Talend 4.1, SQL Server 2008, MySQL, XML, HP Quality Center 9.0, Toad, MongoDB, SQL Developer, Control-M, PL/SQL (Stored Procedures, Triggers, Packages), Erwin, MS Visio, SSIS, UNIX Shell Scripting, AIX, WebSphere, Windows XP.
Confidential, Richmond, VA
Informatica Developer
Responsibilities:
- Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
- Involved in requirement analysis and ETL design, and performed extensive data cleansing prior to loading staging tables from Oracle, flat files, DB2 UDB, InfoSphere MDM, SQL Server, and MySQL.
- Developed complex Informatica PowerCenter mappings with transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy.
- Created Mapplets, reusable transformations and used them in different mappings.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Implemented Star schema logical, physical dimensional modeling techniques for data warehousing dimensional and fact tables using Erwin tool.
- Developed and implemented the UNIX shell script for the start and stop procedures of the sessions.
- Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, QueryMan).
- Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain the complete historical data.
- Developed Type 1 and Type 2 SCDs with Change Data Capture (CDC).
- Worked on handling performance issues of Informatica mappings, evaluating current logic for tuning possibilities; created PL/SQL procedures, triggers, and views for better performance.
- Tuned SQL Queries in Source qualifier Transformation for better performance.
- Tuned the Informatica ETL code at the mapping and session levels.
- Wrote test plans and executed them during unit testing, and supported system testing, volume testing, and user testing.
- Used MongoDB third-party tools (Robo Mongo, MongoOwl, MongoVUE) and MongoDB built-in binaries to monitor and analyze MongoDB performance (a serverStatus sketch follows this list).
- Provided production support by monitoring daily processes and provided data to the reporting team for their daily, weekly, and monthly reports.
- Participated in weekly and bi-monthly team status meetings.
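Alongside the third-party monitoring tools above, MongoDB's built-in serverStatus command covers the basics. A minimal pymongo sketch, with a placeholder host and a small sample of the available fields:

```python
# Minimal health check built on MongoDB's serverStatus admin command;
# the host is a placeholder and the printed fields are a small sample.
from pymongo import MongoClient

client = MongoClient("mongodb://node1.example.com")
status = client.admin.command("serverStatus")

print("current connections:", status["connections"]["current"])
print("resident memory MB :", status["mem"]["resident"])
print("total inserts      :", status["opcounters"]["insert"])
```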
Environment: Informatica PowerCenter 8.6, InfoSphere MDM, IDQ, Power Exchange, Oracle 10g, SQL Server 2008, MySQL, Toad, SQL, PL/SQL (Stored Procedures, Triggers, Packages), Teradata, Erwin, MS Visio, Tidal, MongoDB, Windows XP, AIX, UNIX Shell Scripts.
Confidential, Ashburn, VA
ETL Informatica Developer
Responsibilities:
- Involved in study of existing operational systems, data modeling, and analysis and translated business requirements into data mart design.
- Defined the entity relations (ER diagrams) and designed the physical databases for OLTP and OLAP (data warehouse).
- Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
- Designed migration procedures and created applications in Python to migrate data from Oracle to MongoDB (a sketch follows this list).
- Developed mappings to extract data from SQL Server, MySQL, Oracle, flat files, DB2, mainframes, XML, and WebSphere, and load it into the data warehouse using PowerCenter and Power Exchange.
- Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Expression, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Router, Filter, Aggregator and Sequence Generator transformations.
- Created reusable transformations and mapplets based on the business rules to ease the development.
- Designed and developed UNIX shell scripts to schedule jobs. Also wrote pre-session and post-session shell scripts
- Collaborated with the Informatica Admin in the process of the Informatica upgrade from PowerCenter 7.1 to PowerCenter 8.1.
- Defined collections to store data in MongoDB.
- Developed workflow tasks such as reusable Email, Event-wait, Timer, Command, and Decision tasks.
- Used various debugging techniques to debug the mappings.
- Created test cases for Unit Test, System Integration Test, and UAT to check the data quality.
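A condensed sketch of the Oracle-to-MongoDB migration pattern described above, using cx_Oracle and pymongo; the DSN, credentials, source table, and target collection are placeholders.

```python
# Condensed Oracle-to-MongoDB migration sketch using cx_Oracle and pymongo;
# the DSN, credentials, source table, and target collection are placeholders.
import cx_Oracle
from pymongo import MongoClient

ora = cx_Oracle.connect(user="etl_user", password="etl_password",
                        dsn="orahost.example.com/ORCL")
target = MongoClient("mongodb://node1.example.com").migrated.customers

cur = ora.cursor()
cur.execute("SELECT customer_id, name, city FROM customers")
cols = [d[0].lower() for d in cur.description]   # column names become keys

while True:
    rows = cur.fetchmany(1000)                   # stream in modest batches
    if not rows:
        break
    target.insert_many([dict(zip(cols, row)) for row in rows])

print("migrated", target.count_documents({}), "documents")
```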
Environment: Informatica PowerCenter 8.1, Oracle 9i, SQL Server 2000, SQL, PL/SQL, Shell Scripts, MongoDB, Mainframe DB2, MySQL, SAP EDI, MS Visio, Erwin Data Modeling tool, WebSphere, Toad 8.0, Windows 2000, UNIX, XML.
Confidential, Reston, VA
Informatica Developer
Responsibilities:
- Extensively used Informatica Power center for extracting, transforming and loading data from relational sources and non-relational sources.
- Extensively used the transformations Sequence Generator, Expression, Filter, Router, Sorter, Rank, Aggregator, LOOK UP (Static and Dynamic), Update Strategy, Source Qualifier, Joiner, Normalizer.
- Developed Informatica mappings, re-usable transformations, re-usable mappings and Mapplets for data load to data warehouse.
- Designed complex UNIX scripts and automated them to run the workflows daily, weekly, and monthly.
- Developed mapping parameters and variables to support SQL override.
- Extensively worked on performance tuning to increase the throughput of the data load (e.g., reading data from flat files and writing to target flat files to identify bottlenecks).
- Created and administered repositories using Informatica Repository Manager.
- Designed and developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.
- Also involved in moving the mappings from the Test repository to Production after duly testing all transformations.
- Calculated the cache requirements for Lookup, Joiner, and Aggregator transformations and increased the memory on the Informatica server to avoid paging to disk (a rough sizing sketch follows this list).
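The cache calculation mentioned above can be approximated with simple arithmetic. This is a rough back-of-envelope estimate under an assumed 25% overhead factor, not Informatica's published sizing formula.

```python
# Back-of-envelope lookup-cache sizing; the 25% overhead factor is an
# assumption for illustration, not Informatica's published formula.
def estimate_lookup_cache_mb(row_count, key_bytes, row_bytes, overhead=1.25):
    """Index cache holds the lookup keys, data cache holds the connected
    output columns; pad both for alignment and internal structures."""
    index_cache = row_count * key_bytes * overhead
    data_cache = row_count * row_bytes * overhead
    return (index_cache + data_cache) / (1024 * 1024)

# e.g. a 5M-row lookup with a 16-byte key and ~120 bytes of output ports
print(f"~{estimate_lookup_cache_mb(5_000_000, 16, 120):.0f} MB")
```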
Environment: Informatica Power Center 7.1, Oracle 9i, UNIX, Sybase, MS SQL Server 2004, Windows 2000
Confidential
Data Warehouse/Database programmer
Responsibilities:
- Responsible for requirement analysis of the application
- ETL tool Informatica was used to load strategic source data to build the data marts.
- An operational data store was created, and a metadata build-up was designed for performing data mapping.
- Also involved in Mass data loads, refreshing the data in various applications, Performance evaluations, modifying the existing code to accommodate new features.
- Used various Transformations like Aggregator, Router, Expression, Source Qualifier, Filter, Lookup, Joiner, Sorter, XML Source qualifier, Stored Procedure and Update Strategy.
- Worked extensively on flat files, as the data from various legacy systems arrived as flat files.
- Set up Test and Production environments for all mappings and sessions.
- Created and configured Sessions in Workflow Manager and Server Manager.
- Debugged the sessions using Debugger and monitored Workflows, Worklets and Tasks by Workflow Monitor.
- Created and used mapping parameters, mapping variables using Informatica mapping designer to simplify the mappings.
- Developed Oracle stored procedures, packages, and triggers for data validations (an invocation sketch follows this list).
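For illustration, invoking one of those validation procedures from Python with cx_Oracle might look like the sketch below; the procedure name, its signature, and the connection details are hypothetical.

```python
# Hedged sketch of calling an Oracle validation procedure via cx_Oracle;
# the procedure name, its signature, and connection details are hypothetical.
import cx_Oracle

conn = cx_Oracle.connect(user="etl_user", password="etl_password",
                         dsn="orahost.example.com/ORCL")
cur = conn.cursor()

# Assumed signature: validate_load(p_batch_id IN NUMBER,
#                                  p_error_count OUT NUMBER)
error_count = cur.var(cx_Oracle.NUMBER)
cur.callproc("validate_load", [42, error_count])

if error_count.getvalue() > 0:
    raise RuntimeError(f"{int(error_count.getvalue())} validation errors")
conn.commit()
```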
Environment: Informatica Power Center 6.2, Oracle 8i, SQL, PL/SQL, IBM DB2, TOAD, Windows 2000, XML.