
Lead Informatica Developer Resume

SUMMARY

  • 10+ years of experience in Business Intelligence as an ETL Informatica PowerCenter / Informatica IDQ analyst and developer and as a database developer.
  • Extensively used Informatica PowerCenter 10.2/9.6/9.1/8.6 and Informatica Data Quality (IDQ) 10.2/9.6/9.1 as ETL tools for extracting, transforming, loading and cleansing data from various source data inputs to various targets, in batch and real time.
  • Experience in relational databases like Oracle, DB2, Teradata, Netezza and SQL Server.
  • Experience in integrating data from flat files (fixed width and delimited), XML, WSDL and Web Services sources using various Informatica transformations such as Source Qualifier, XML Parser and Web Services Consumer.
  • Extensively used the data masking transformation to mask NPI data (SSN, birth date, account number, etc.) in Dev and QA environments.
  • Used various Informatica Powercenter and Data quality transformations such as - source qualifier, aggregator, update strategy, expression, joiner, lookup, router, sorter, filter, web services consumer transformation, XML Parser, labeler, parser, address validator, match, comparison, consolidation, standardizer, merge to perform various data loading and cleansing activities.
  • Complete understanding of exact and fuzzy matching logic and of dedupe limitations in the IDQ suite.
  • Complete knowledge of using XML, MQ series as the source and target.
  • Extensively used data masking transformation for masking / scrubbing various sensitive fields such as social security number, credit card number, agreement numbers etc.
  • Integrated Informatica data quality mappings with Informatica powercenter.
  • Worked closely with MDM teams to understand their needs in terms of data for their landing tables.
  • Knowledge (haven’t worked on the product yet) of Informatica MDM (Siperian MDM).
  • Created various profiles using Informatica Data Explorer (IDE) & IDQ from existing sources and shared those profiles with business analysts to support business-strategy decisions such as assigning matching scores for different criteria.
  • Used various performance techniques in Informatica such as - partitioning, tuning at source/target/transformation, usage of persistent cache, replacing transformations that use cache wherever possible.
  • Created complex mapplets to be shared among team members.
  • Extensive experience in developing various tasks, workflows, worklets, mapplets and mappings.
  • Extensive knowledge on debugging of Informatica mapping/workflows, unit testing of the mapping/workflows.
  • Extensively used Informatica Repository Manager for exporting/importing workflows, and Workflow Monitor for checking workflow status.
  • Extensive knowledge in database external loaders - SQL loader (Oracle), LOAD (DB2), TPT, Fastload, Multiload, Tpump (Teradata), Bulk writer (Netezza).
  • Hands on experience using query tools like TOAD, SQL Developer, PLSQL developer, Teradata SQL Assistant and Query man.
  • Extensive experience with using Bulk Writer (external loader) of Netezza to load flat files as well as data using Informatica.
  • Generated explain plans to identify bottlenecks, path of the query, cost of the query, broadcasting in partitioned database, indexes that are getting picked.
  • Extensive knowledge of different types of dimension tables - type 1, type 2, junk, conformed, degenerate, role-playing and static dimensions.
  • Extensive experience in writing UNIX scripts to invoke Informatica workflows using the pmcmd command and to perform count-based data validations between source and target.
  • Created complex UNIX scripts using awk, sed and arrays to update Informatica parameter files in parallel with the correct process dates and connection-string details.
  • Extensive knowledge of Business Objects: updated the existing universe with new data objects and classes using Universe Designer, built joins between new tables, and used contexts to avoid loops and Cartesian products.
  • Used Jira for creating stories and tracking task status for Agile.
  • Extensive knowledge on scheduling tools - Control-M, Autosys.
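The parameter-file refresh and pmcmd launch described in the bullets above can be sketched roughly as follows. All file, parameter and workflow names here are invented for illustration, and the pmcmd call itself is shown commented out since it requires a live Integration Service:

```shell
#!/bin/sh
# Sketch: refresh $$PROCESS_DATE in an Informatica parameter file before
# launching a workflow. Names are illustrative, not from the resume.

PARAM_FILE="wf_daily_load.param"
TODAY=$(date +%Y-%m-%d)

# A sample parameter file (in a real run this file already exists).
cat > "$PARAM_FILE" <<'EOF'
[Global]
$$PROCESS_DATE=1900-01-01
$$DB_CONNECTION=DEV_ORA
EOF

# Rewrite the process date in place, sed-style.
sed 's/^\$\$PROCESS_DATE=.*/$$PROCESS_DATE='"$TODAY"'/' "$PARAM_FILE" \
    > "$PARAM_FILE.tmp" && mv "$PARAM_FILE.tmp" "$PARAM_FILE"

# The workflow would then be launched with pmcmd, e.g.:
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -uv PM_USER -pv PM_PASS \
#     -f DAILY_LOADS -paramfile "$PARAM_FILE" wf_daily_load
cat "$PARAM_FILE"
```

In practice a script like this runs once per environment ahead of the nightly batch, so the workflow always picks up the current process date.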

TECHNICAL SKILLS

ETL Technology: Informatica PowerCenter 10.2/9.6/9.5/9.1/8.6/7.1, Informatica Data Quality 9.6.1, Informatica PowerExchange 9.6/9.5/9.0.1/8.6

Data Warehouse: Multidimensional Model Design, Star Schema Development

Data Modelling: MS Visio 2010/2007, Erwin 4.5/3.1

Databases: Oracle 12c/11g/10g/9i, MS SQL Server 2012/2008/2000, MS Access, Sybase, DB2, MySQL, Teradata 14/13/12, Netezza

Programming: SQL, PL/SQL, HTML, UNIX Scripting, HDFS, Hive, Spark

Reporting Tools: Power BI, Cognos 7, QlikView 12, Qlik Sense 3

Operating Systems: Windows 7/XP, Linux, Sun Solaris, AIX, MS-DOS

Applications: MS Office, MS Project, FrontPage, Toad 9.2/8.6

Management: MS-Office 2010/2007, MS Project

Deployment: Perforce, TeamCity

Defect & Task Tracking: Quality Center, JIRA, Agile Central

PROFESSIONAL EXPERIENCE

Confidential

Lead Informatica Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse, used Erwin to design the business process, dimensions, and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the Data Warehouse.
  • Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.
  • Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL* loader to load data from flat files to the database tables in Oracle.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Worked on Various Teradata utilities like FLOAD, MLOAD and TPump.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
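The SQL*Loader work mentioned above can be sketched as a small script that writes a control file for a delimited flat file. The table and column names below are hypothetical, and the sqlldr invocation is commented out because it needs a live Oracle instance and credentials:

```shell
#!/bin/sh
# Sketch: a SQL*Loader control file for a pipe-delimited flat file.
# Table/column names are made up for illustration.

cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  customer_name,
  load_date  "SYSDATE"
)
EOF

# Real invocation (requires an Oracle client and credentials):
# sqlldr userid=etl_user/secret@ORCL control=customers.ctl \
#        log=customers.log bad=customers.bad errors=50
cat customers.ctl
```

Direct-path loading (`direct=true`) is the usual lever when volumes are large, at the cost of bypassing some constraints and triggers during the load.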

Environment: Informatica Power Center 10.2/9.6.1/9.5/8.6.1, Workflow Manager, Workflow Monitor, Informatica Power Connect / Power Exchange, Teradata, BTEQ, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0.

Confidential, East Hartford, CT

IDQ Developer /Informatica Developer/Informatica IDQ Administrator

Responsibilities:

  • Performed the roles of senior ETL Informatica and Data Quality (IDQ) developer on a data warehouse initiative and was responsible for requirements gathering, preparing the mapping document, architecting the end-to-end ETL flow, building complex ETL procedures, developing the strategy to move existing data feeds into the Data Warehouse (DW), and performing data cleansing activities using various IDQ transformations.
  • Collaborated with data architects, BI architects and data modelling teams during data modelling sessions.
  • Extensive experience in building high level documents depicting various sources, transformations and targets.
  • Extensively used Informatica transformations - Source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator and normalizer transformations to extract, transform and load the data from different sources into DB2, Oracle, Teradata, Netezza and SQL Server targets.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Extensively used ETL Informatica to integrate data feed from different 3rd party source systems - Salesforce and TouchPoint.
  • Used Informatica Data Quality transformations to parse the “Financial Advisor” and “Financial Institution” information from Salesforce and Touchpoint systems and perform various activities such as standardization, labeling, parsing, address validation, address suggestion, matching and consolidation to identify redundant and duplicate information and achieve MASTER record.
  • Extensively used Standardizer, Labeler, Parser, Address Validator, Match, Merge, Consolidation transformations.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Created Informatica workflows and IDQ mappings for - Batch and Real Time.
  • Converted and published Informatica workflows as Web Services using Web Service Consumer transformation as source and target.
  • Created reusable components, reusable transformations and mapplets to be shared among the project team.
  • Used ILM & TDM to mask sensitive data in Dev, QA environments.
  • Used XML & MQ series as the source and target.
  • Used built-in reference data such as token sets, reference tables and regular expressions, and built new reference data objects for various parse/cleanse/purge needs.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Worked closely with MDM team to identify the data requirements for their landing tables and designed IDQ process accordingly.
  • Created Informatica mappings with Informatica MDM requirements in mind.
  • Extensively used XML, XSD / schema files as source files, parsed incoming SOAP messages using XML parser transformation, created XML files using XML generator transformation.
  • Worked extensively with Oracle external loader - SQL loader - to move the data from flat files into Oracle tables.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump and Teradata Parallel Transporter (TPT) to load huge amounts of data from flat files into Teradata database.
  • Hands on experience using query tools like TOAD, SQL Developer, PLSQL developer, Teradata SQL Assistant and Query man.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace in both Teradata and Oracle.
  • Generated explain plans to identify bottlenecks, path of the query, cost of the query, broadcasting in partitioned database, indexes that are getting picked.
  • Extensively used OLAP window functions - LEAD, LAG, FIRST_VALUE, LAST_VALUE - to analyze and tag rows for type-2 processing.
  • Generated explain plans for performance tuning of queries and identifying the bottlenecks for long running queries, worked with DBA to fix the issues.
  • Extensively used Power Exchange for Mainframe to read data from mainframe / VSAM/ COBOL files and load into Oracle tables.
  • Extensively used Power Exchange for Salesforce to read data from relational sources (Oracle) and load into Salesforce objects.
  • Used Netezza Bulk writer to load huge amounts of data into Netezza database.
  • Extensive experience in querying Salesforce objects using workbench.
  • Extensively used JIRA & ServiceNow for creating requests for access, production migrations, component migrations & production related service requests.
  • Extensively used Enterprise Manager tool in Control-M to load the charts and run the jobs for initial load of the tables whenever a new environment is created.
  • Owned the defects from production as well as from system testing and worked on the solutions.
  • Coordinated with QA team during QA environment build, reviewing test cases, test execution and defect assignments.
  • Extensive knowledge of the TRAC defect-tracking tool and of RMS as a version-control management tool.
  • Assisted in preparing implementation documents for every release, worked on initial loads and data catchup process during implementations and provided on-call support for first few days of execution.
  • Extensive experience in PL/SQL programming, stored procedures, functions and triggers.
  • Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purge/delete operations, data loading and the ELT process.
  • Worked extensively on the Business Objects reporting tool; created universes from scratch, accounting for all possible scenarios, contexts, loops and derived tables.
  • Created Procedure Document for Implementation procedures for every release for all the UNIX, Informatica Objects and if any catch-up process needed to be done.
  • Provided on-call support for the newly implemented components and existing production environments and made sure that the SLA has been met.
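The workflow-launch shell scripting described above can be sketched as a small wrapper with return-code handling. The pmcmd() function below is a local stub standing in for the real Informatica binary (which talks to the Integration Service), and all service, folder and workflow names are invented:

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper with return-code handling. The pmcmd()
# function is a stub so the script can run without an Informatica
# install; in production the real pmcmd binary would be on the PATH.

pmcmd() {
    echo "stub pmcmd called with: $*"
    return 0
}

WORKFLOW="wf_stage_to_dw"

# -wait blocks until the workflow finishes so the exit code is meaningful.
pmcmd startworkflow -sv INT_SVC -d DOMAIN -uv PM_USER -pv PM_PASS \
      -f DW_LOADS -wait "$WORKFLOW"
RC=$?

if [ "$RC" -eq 0 ]; then
    echo "$WORKFLOW completed successfully"
else
    echo "$WORKFLOW failed with return code $RC" >&2
    exit "$RC"
fi
```

A scheduler such as Control-M or Autosys then only has to inspect the script's exit code to decide whether downstream jobs may run.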

Environment: Informatica Power Center 10.2/9.6, Informatica Data Quality (IDQ) 10.2/9.6, Informatica Data Explorer (IDE) 9.6, Salesforce, TDM, Teradata, Oracle 11i, DB2 10.1, SQL Server 2012, Fastload, Multiload, Tpump, Fastexport, Teradata Parallel Transporter (TPT), Teradata SQL assistant, BTEQ, SQL Developer, SQL Loader, Netezza, Bulk writer, MQ series, Load, Ingest, T-SQL, PL/SQL, RMS, Linux, AIX, ERWIN, Teradata modelling, Toad, Winsql, Putty, UltraEdit, PowerExchange for mainframes, PowerExchange for Salesforce, XML, Rally, UC4, JIRA, Jenkins, ServiceNow, Control-M, Enterprise Manager, Autosys, JIL Scripts, Lotus Notes, Unix shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3.

Confidential, Richmond, VA

Informatica Developer/IDQ

Responsibilities:

  • Performed the roles of Senior ETL Informatica and IDQ developer on a data warehouse initiative and was responsible for requirements gathering, preparing mapping document, architecting end to end ETL flow, building complex ETL procedures, developing strategy to move existing data feeds into the Data Warehouse (DW) and additional target data warehouses.
  • Collaborated with data architects, BI architects and data modelling teams during data modelling sessions.
  • Involved in functional design reviews and lead technical design reviews.
  • Extensively used Informatica transformations - Source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator and normalizer transformations to extract, transform and load the data from different sources into Teradata, Oracle, DB2 and SQL Server targets.
  • Extensively used ETL Informatica to integrate data feed from different 3rd party source systems - Claims, billing, payments.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Used Informatica Data Quality transformations to parse individual consumer information from various vendors such as Acxiom, Alliant to identify the true source of policy holder’s origination and assign credits accordingly by performing various activities such as standardization, labeling, parsing, address validation, address suggestion, matching and consolidation to identify redundant and duplicate information and achieve MASTER record.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Created reusable components, reusable transformations and mapplets to be shared among the project team.
  • Extensively used Informatica Data Quality transformations - Labeler, Parser, Standardizer, Match, Association, Consolidation, Merge, Address Validator, Case Converter, and Classifier.
  • Extensively used in-built reference data such as - token sets, reference tables and regular expressions and created new set of reference tables to identify the noise data and standardization of input data.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Created various data quality mappings in Informatica Data Quality tool and imported them into Informatica powercenter as mappings, mapplets.
  • Used TDM to mask sensitive data in Dev, QA environments.
  • Extensively used XML, XSD / schema files as source files, parsed incoming SOAP messages using XML parser transformation, created XML files using XML generator transformation.
  • Worked extensively with Oracle external loader - SQL loader - to move the data from flat files into Oracle tables.
  • Coordinated with QA team during QA environment build, reviewing test cases, test execution and defect assignments.
  • Extensive knowledge of the TRAC defect-tracking tool and of RMS as a version-control management tool.
  • Assisted in preparing implementation documents for every release, worked on initial loads and data catchup process during implementations and provided on-call support for first few days of execution.
  • Extensive experience in PL/SQL programming, stored procedures, functions and triggers.
  • Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purge/delete operations, data loading and the ELT process.
  • Used Informatica powerexchange to connect to Netezza and loaded huge amounts of data using Bulk writer external loader.
  • Knowledge on Business objects, updated existing universe with the new data objects and classes using universe designer, built joins between new tables, used contexts to avoid loops and Cartesians.
  • Used Jenkins as a packaging and deployment tool for migrating ETL & Unix components.
  • Created Procedure Document for Implementation procedures for every release for all the UNIX, Informatica Objects and if any catch-up process needed to be done.
  • Provided on-call support for the newly implemented components and existing production environments and made sure that the SLA has been met.

Environment: Informatica Power Center 9.6, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, TDM, Netezza, Bulk Writer, Teradata 13, Oracle 11i, DB2 10.1, SQL Server 2012, SQL Developer, SQL Loader, Fastload, Multiload, Tpump, Fastexport, Teradata Parallel Transporter (TPT), Teradata SQL assistant, BTEQ, T-SQL, PL/SQL, RMS, Linux, AIX, Toad, Winsql, Putty, UltraEdit, PowerExchange for mainframes, MQ Series, PowerExchange for Salesforce, XML, UC4, Control-M, Rally, Enterprise Manager, Autosys, JIL Scripts, Jenkins, Lotus Notes, JIRA, Unix shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3.

Confidential

ETL / Informatica Developer / Informatica Data Quality

Responsibilities:

  • Experienced in the use of agile approaches, including Extreme Programming, Test-Driven Development and Scrum.
  • Performed data profiling using IDQ Developer and Informatica Analyst before starting mapping work, and notified the client and SMEs about data quality issues.
  • Interaction with the Business Partners to understand the business of the application, gather requirements, analyzed and put into technical design documents (HLD & LLD).
  • As an active team member, was involved in the initial phases of the design and took part in the analysis of the product.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Involved in the creation of oracle Tables, Table Partitions, Materialized views and Indexes and PL/SQL stored procedures, functions, triggers and packages.
  • Working with various sources such as Flat files, Relational, XML and Webservices as part of Informatica Mappings.
  • Worked with Mapping and Workflow variables by using the value of mapping variables on the workflow level.
  • Used the staged-mapping approach to perform asynchronous Web Services request and response handling as part of Informatica mappings.
  • Responsible for tuning ETL mappings to optimize load and query performance.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session steps and to run the workflows that create the front-end screen insert activities.

Environment: Informatica Power Center 9.6.1 (Designer, Repository Manager, Workflow Manager), Oracle 9i, Oracle 11g, TOAD, SQL Developer, UNIX Shell Scripts, Microsoft Visio 2007, Admin server, UNIX, Windows 2000/NT

Confidential

ETL Informatica Developer

Responsibilities:

  • Analyzed source systems and worked with business analysts to identify, study and understand requirements and translate them into ETL code.
  • Perform analysis on quality and source of data to determine accuracy of information being reported.
  • Performed data profiling using IDQ Developer and Informatica Analyst before starting mapping work, and notified the client and SMEs about data quality issues.
  • Worked on complete life cycle from Extraction, Transformation and Loading of data using Informatica.
  • Prepared high-level design document for extracting data from complex relational database tables, data conversions, transformation and loading into specific formats.
  • Designed and developed the Mappings using various transformations to suit the business user requirements and business rules to load data from SQL Server, file and XML file sources targeting the views (views on the target tables) in the target database (Oracle).
  • Developed custom data transformation (structured/unstructured data) processes using the Itemfield ContentMaster data transformation tool.
  • Worked on Informatica PowerCenter tool - Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Developed standard and re-usable mappings and mapplets using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Source Qualifier, Sorter, Update strategy and Sequence generator.
  • Developed PL/SQL stored procedures for Informatica mappings.
  • Scheduled the workflows to pull data from the source databases at weekly intervals, to maintain most current and consolidated data for management reporting.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Providing technical support and troubleshooting issues for business users.
  • Actively Participated in problem solving and troubleshooting for the applications implemented with Informatica.
  • Used Workflows Manager to create the Sessions to transport the data to target warehouse.

Environment: Informatica PowerCenter 9.6.1, Power Exchange 9.6.1, Erwin 4.1, EDI, Autosys r11.0, Oracle 9i, PL/SQL, Teradata 12, SQL, XML, SQL Server 2005, Windows 2003, Unix.
