Senior Informatica PowerCenter & IDQ Developer Resume
Weehawken, NJ
SUMMARY:
- 10 years of IT experience in the analysis, design, development, testing and implementation of business application systems for the insurance, financial, supply chain management and healthcare sectors.
- Strong data warehousing ETL experience using Informatica PowerCenter, Informatica IDQ, IDE and different databases for extracting, transforming, cleansing and loading data from various sources such as flat files, relational tables and XML into various targets, in batch & real time.
- Used various Informatica PowerCenter and Data Quality transformations such as Source Qualifier, Aggregator, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, Web Services Consumer, XML Parser, Labeler, Parser, Address Validator (Address Doctor engine), Match, Comparison, Consolidation, Standardizer and Merge to perform various data loading and cleansing activities.
- Complete understanding of regular (exact) matching, fuzzy-logic matching and dedupe limitations in the IDQ suite.
- Strong experience in creating various profiles and scorecards from existing sources using Informatica Data Explorer (IDE) & IDQ, and shared those profiles with business analysts to support decisions such as assigning match scores to different criteria.
- Strong experience in configuring MQ Series as a source and parsing the XML payload of MQ Series messages using the XML Parser transformation in batch and real time.
- Good understanding of Informatica MDM (formerly Siperian MDM), including landing tables, staging tables, base objects, and the match, merge and unmerge processes.
- Created an ELT process to automate the handling of type-1 & type-2 SCDs, thus minimizing development effort (see the SQL sketch following this summary).
- Used various performance techniques in Informatica such as partitioning, tuning at the source, target and transformation levels, use of persistent cache, and replacing cache-based transformations wherever possible.
- Designed and enabled Informatica workflows as web services using Web Service source & target definitions and exposed them for real-time processing by third-party clients, including Java applications.
- Extensive knowledge of WSDL, XML, SOAP messages and web services.
- Strong experience in invoking REST web services through the HTTP transformation.
- Used the HTTP transformation to invoke SOAP & REST web services by URL and to post files to various third-party applications.
- Extensive knowledge of database external loaders: SQL*Loader (Oracle), LOAD (DB2), TPT, FastLoad, MultiLoad and TPump (Teradata), and the bulk writer (Netezza).
- Extensively used data modeling tools such as ERwin to design Teradata tables and the relationships among them.
- Worked extensively with the Teradata utilities FastLoad, MultiLoad, TPump and Teradata Parallel Transporter (TPT) to load large volumes of flat-file data into the Teradata database, used FastExport to export data out of Teradata tables, and created BTEQ scripts to invoke the load utilities, transform data and query the Teradata database.
- Extensive experience using the Netezza bulk writer (external loader) to load flat files as well as data from Informatica.
- Created complex SQL queries using common table expressions (CTEs) and analytic functions such as LEAD, LAG, FIRST_VALUE and LAST_VALUE (see the SQL sketch following this summary).
- Extensive knowledge of the different types of dimension tables: type 1, type 2, junk, conformed, degenerate, role-playing and static dimensions.
- Created complex PL/SQL programs, stored procedures, functions and triggers.
- Declared cursors within stored procedures to move data between tables and to stage data temporarily for various DML operations (see the PL/SQL sketch following this summary).
- Extensive experience in writing UNIX Korn shell scripts to invoke Informatica workflows using the pmcmd command, perform source-to-target count validations, and cleanse and purge source staging data using commands such as sed and awk.
- Extensive knowledge of Business Objects: updated existing universes with new objects and classes using Universe Designer, built joins between new tables, and used contexts to avoid loops and Cartesian products.
- Extensively used Jenkins for packaging and deployment of various components.
- Used Rally for creating stories and tracking task status in Agile projects.
- Extensive knowledge of scheduling tools: Control-M, UC4, Autosys (JIL scripts), Tivoli Workload Scheduler (TWS) and cron.
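The sketches below illustrate, in SQL, the patterns referenced in the summary above. All schema, table, column and procedure names are hypothetical and shown only to indicate the approach, not actual production code.

A minimal sketch of the expire-then-insert pattern an ELT process can generate for type-2 SCDs (type-1 attributes would simply be overwritten in place):

```sql
-- Hypothetical staging and dimension tables: stg_customer, dim_customer.
-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE dim_customer
SET    end_dt     = CURRENT_DATE - 1,
       current_fl = 'N'
WHERE  current_fl = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
               AND   (s.addr_line1 <> dim_customer.addr_line1
                  OR  s.city       <> dim_customer.city));

-- Step 2: insert a new current version for new or changed business keys.
INSERT INTO dim_customer (customer_id, addr_line1, city, start_dt, end_dt, current_fl)
SELECT s.customer_id, s.addr_line1, s.city, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON  d.customer_id = s.customer_id
       AND d.current_fl  = 'Y'
WHERE  d.customer_id IS NULL;   -- no current row: key is new or was just expired
```

A minimal sketch of a CTE combined with analytic functions (LAG, FIRST_VALUE) to detect changed rows, the same style used to tag rows for type-2 processing:

```sql
-- Hypothetical snapshot table: account_snapshot (account_id, snapshot_dt, balance).
WITH acct_hist AS (
    SELECT account_id,
           snapshot_dt,
           balance,
           LAG(balance)         OVER (PARTITION BY account_id ORDER BY snapshot_dt) AS prev_balance,
           FIRST_VALUE(balance) OVER (PARTITION BY account_id ORDER BY snapshot_dt) AS opening_balance
    FROM   account_snapshot
)
SELECT account_id, snapshot_dt, balance, opening_balance
FROM   acct_hist
WHERE  prev_balance IS NULL          -- first snapshot for the account
   OR  balance <> prev_balance;      -- balance changed since the prior snapshot
```

A minimal PL/SQL sketch of a cursor declared in a stored procedure to move data between tables with conditional DML:

```sql
-- Hypothetical objects: stg_policy (source), policy_dim (target), load_policy_dim (procedure).
CREATE OR REPLACE PROCEDURE load_policy_dim IS
    CURSOR c_stg IS
        SELECT policy_id, policy_status, premium_amt
        FROM   stg_policy;
BEGIN
    FOR r IN c_stg LOOP
        -- Update the target row if the policy already exists ...
        UPDATE policy_dim
        SET    policy_status = r.policy_status,
               premium_amt   = r.premium_amt
        WHERE  policy_id = r.policy_id;

        -- ... otherwise insert it.
        IF SQL%ROWCOUNT = 0 THEN
            INSERT INTO policy_dim (policy_id, policy_status, premium_amt)
            VALUES (r.policy_id, r.policy_status, r.premium_amt);
        END IF;
    END LOOP;
    COMMIT;
END load_policy_dim;
/
```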
PROFESSIONAL EXPERIENCE:
Confidential
Senior Informatica PowerCenter & IDQ Developer
Responsibilities:
- Senior ETL Informatica & Data Quality (IDQ) developer on a data warehouse initiative, responsible for requirements gathering, preparing mapping documents, architecting the end-to-end ETL flow, building complex ETL procedures, developing the strategy to move existing data feeds into the Data Warehouse (DW), and performing data cleansing activities using various IDQ transformations.
- Used Informatica transformations (Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge and Consolidation) to extract, transform, cleanse and load data from different sources into DB2, Oracle, Teradata, Netezza and SQL Server targets.
- Extensively used the profiling capabilities of Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) to profile various sources, generate scorecards, and create and validate rules, and provided the results to business analysts for rule definition.
- Used Informatica PowerExchange to read data from and load data into Salesforce objects.
- Created Informatica workflows and IDQ mappings for batch and real-time processing.
- Created an ELT process to automate the handling of type-1 & type-2 SCD changes, thus minimizing development time.
- Integrated Informatica with USPS address validation using the Address Validator (Address Doctor) transformation to validate incoming address requests, and converted and published Informatica workflows as web services using Web Service source & target definitions and Informatica's web service provider capabilities.
- Used the Web Services Consumer transformation to access various third-party web services, and used the HTTP transformation to call REST GET & POST methods to download and upload attachments to different applications.
- Parsed incoming MQ Series messages containing customer information & log details.
- Used the Informatica Data Masking transformation to mask sensitive data (SSN, date of birth) when copying data from the production environment to lower environments.
- Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
- Worked on a POC using Informatica PowerExchange for Cloud with the Workday connector to read data from and load data into the Workday application.
- Worked closely with MDM team to identify the data requirements for their landing tables and designed IDQ process accordingly.
- Extensively used XML and XSD schema files as source files, parsed incoming SOAP messages using the XML Parser transformation, and created XML files using the XML Generator transformation.
- Worked on performance tuning of Informatica and IDQ mappings.
- Worked extensively with the Oracle external loader (SQL*Loader) to move data from flat files into Oracle tables.
- Worked extensively with the Teradata utilities FastLoad, MultiLoad, TPump and Teradata Parallel Transporter (TPT) to load large volumes of flat-file data into the Teradata database.
- Created BTEQ scripts to invoke the various load utilities, transform data and query the Teradata database (see the BTEQ sketch at the end of this section).
- Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
- Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle.
- Extensive experience with the different Teradata index types (PI, SI, JI, AJI, and PPI including MLPPI and SLPPI) and COLLECT STATISTICS; wrote Teradata macros and used various Teradata analytic functions (see the Teradata sketch at the end of this section).
- Knowledge of Teradata Manager, TDWM, PMON and DBQL; extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries and correlated subqueries.
- Generated explain plans to identify bottlenecks, the query path and cost, broadcasting in the partitioned database, and which indexes were being picked.
- Extensively used OLAP functions (LEAD, LAG, FIRST_VALUE, LAST_VALUE) to analyze and tag rows for type-2 processing.
- Extensively used PowerExchange for Mainframe to read data from mainframe VSAM / COBOL files and load it into Oracle tables.
- Extensively used PowerExchange for Salesforce to read data from relational sources (Oracle) and load it into Salesforce objects.
- Used the Netezza bulk writer to load large volumes of data into the Netezza database.
- Extensive experience in querying Salesforce objects using Workbench.
- Extensively used JIRA & ServiceNow for creating access requests, production migrations, component migrations & production-related service requests.
- Used Jenkins to automate packaging and deployment of various ETL and UNIX components.
- Scheduled jobs using Control-M Enterprise Manager and Autosys.
- Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purging, deleting and loading data, and for the ELT process.
- Created complex Business Objects reports from scratch using InfoView.
- Created implementation procedure documents for every release, covering all UNIX and Informatica objects and any catch-up processes that needed to be run.
- Provided on-call support for newly implemented components and the existing production environment, and ensured that SLAs were met.
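The sketches below illustrate the BTEQ scripting and Teradata tuning work referenced in this section; the logon TDPID, databases, tables and columns are hypothetical placeholders, not the actual project objects.

A minimal BTEQ sketch that loads from a staging table into a target, aborts on error and verifies the load:

```sql
.LOGON tdprod/etl_user
-- Hypothetical load: insert today's staged claims into the fact table.
INSERT INTO edw.claim_fact (claim_id, member_id, claim_amt, load_dt)
SELECT claim_id, member_id, claim_amt, CURRENT_DATE
FROM   stg.claim_stage;

.IF ERRORCODE <> 0 THEN .QUIT 8

-- Row-count check for the reconciliation report.
SELECT COUNT(*) AS rows_loaded
FROM   edw.claim_fact
WHERE  load_dt = CURRENT_DATE;

.LOGOFF
.QUIT 0
```

A minimal Teradata sketch of a fact table with a single-level PPI, statistics on the primary index and partitioning column, and an EXPLAIN used to confirm partition elimination:

```sql
-- Hypothetical fact table with a partitioned primary index (SLPPI on load_dt).
CREATE MULTISET TABLE edw.claim_fact
(
    claim_id   INTEGER NOT NULL,
    member_id  INTEGER NOT NULL,
    claim_amt  DECIMAL(12,2),
    load_dt    DATE    NOT NULL
)
PRIMARY INDEX (claim_id)
PARTITION BY RANGE_N (load_dt BETWEEN DATE '2015-01-01'
                                  AND DATE '2020-12-31'
                                  EACH INTERVAL '1' MONTH);

COLLECT STATISTICS ON edw.claim_fact COLUMN (claim_id);
COLLECT STATISTICS ON edw.claim_fact COLUMN (load_dt);

-- Check the access path and partition elimination before scheduling the query.
EXPLAIN
SELECT member_id, SUM(claim_amt) AS total_claim_amt
FROM   edw.claim_fact
WHERE  load_dt BETWEEN DATE '2019-01-01' AND DATE '2019-03-31'
GROUP  BY member_id;
```

In practice, scripts like these would be wrapped in the shell scripts mentioned above and launched through the schedulers listed in the environment.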
Environment: Informatica PowerCenter 10/9.6, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, Informatica MDM 10.1, Data Masking, Salesforce, PowerExchange for Salesforce, Teradata, Oracle 11g, DB2 10.1, SQL Server 2012, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), Teradata SQL Assistant, BTEQ, SQL Developer, SQL*Loader, Netezza, HTTP, REST web services, Netezza bulk writer, MQ Series, SQL*Plus, Ingest, T-SQL, PL/SQL, RMS, Linux, AIX, ERwin, Teradata modeling, TOAD, WinSQL, PuTTY, UltraEdit, PowerExchange for Mainframes, XML, Rally, UC4, JIRA, Jenkins, ServiceNow, Control-M, Enterprise Manager, Autosys, TWS, JIL scripts, Lotus Notes, UNIX shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3.
Confidential, Weehawken, NJ
Senior Informatica PowerCenter & IDQ Developer
Responsibilities:
- Data warehousing initiative to fetch data related to loan accounts and the various securities submitted for those loans, and to analyze various inputs and trigger points in order to create a dedicated data mart holding “Securities Backed Account” information.
- The process included reading from various sources including the EDW, invoking third-party web services (bills and payments), and standardizing, matching, consolidating and loading the data into the SBL data mart using Informatica PowerCenter & IDQ.
Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality (IDQ) 9.6.1, Informatica Data Explorer (IDE) 9.6, Informatica PowerExchange for Mainframes, Oracle 11g, Teradata 13, FastLoad, HTTP, REST web services, MultiLoad, TPump, Teradata Parallel Transporter (TPT), SQL*Plus, Netezza, Netezza bulk writer, AIX UNIX, T-SQL, shell scripting, Autosys, JIL scripts, SQL*Loader, RMS, ServiceNow.
Confidential, Bloomington, IL
Senior Informatica PowerCenter & IDQ Developer
Responsibilities:
- One of the largest implementations in the insurance industry, involving multiple teams: migrating 20 years of legacy data, converting the existing claims application into a modernized solution, redirecting all claims data into the new solution, and integrating all existing applications with the new application.
- Used Informatica as the ETL tool to perform data integration, cleansing, standardization, matching and consolidation. Alongside Informatica, used proprietary database external loaders to load large volumes of data into the databases.
Environment: Informatica PowerCenter 9.5, Informatica Data Quality (IDQ) 9.5, Teradata, DB2 10.1, Oracle 11g, SQL Server 2012, SQL*Loader, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), Teradata SQL Assistant, BTEQ, WinSQL, PuTTY, SQL*Plus, UltraEdit, Data Masking, PowerExchange for Mainframes, MQ Series, PowerExchange for Salesforce, XML, UC4, Rally, Autosys, JIL scripts, Jenkins, JIRA, UNIX shell scripting, XML Spy.
Confidential, Fort Myers, FL
Informatica Developer
Responsibilities:
- Data warehousing initiative to migrate & integrate the various sources involved in providing quotes, locations, location rentals, member rewards, member classification, rentals and discounts for the parent firm and the three firms it had newly acquired. Used Informatica PowerCenter to integrate all of these sources and moved the data into Salesforce.
Environment: Informatica PowerCenter 8.6, Informatica PowerExchange for Mainframes, PowerExchange for Salesforce, Oracle 10g, DB2 8.6, AIX UNIX, SQL*Plus, shell scripting, SQL*Loader, LOAD, PL/SQL, PVCS, Visio.