Informatica Lead Developer/Architect Resume
EXPERIENCE SUMMARY:
- Over 11 years of experience in IT software and ETL tools
- Experience in the Insurance, Retail, Banking, Financial, and Healthcare domains
- Proficient in requirements gathering, integration architecture, team leadership, and hands-on development
- Experienced in Informatica Data Quality (IDQ/BDQ), BDM, EDC, Axon, IICS, ICRT, PowerCenter (PC), PowerExchange (PWX), and MDM
- Built IDQ rules, mapplets, and mappings for cleansing, standardization, parsing, matching, merging, and enrichment using the Informatica Developer tool (IDQ/Big Data Quality) and Big Data Masking
- Worked in Informatica Big Data Management (BDM/BDE) to process flat files, RDBMS (Oracle, SQL Server), JSON, XML, REST APIs, AWS cloud sources (S3, Redshift, RDS, HBase on EC2/EMR), and Hadoop (HDFS, Hive, HBase) using Native, Blaze, Sqoop, Spark, and MapReduce execution modes, along with YARN, Cloudera, Ambari, and Python scripts
- Programming knowledge in UNIX shell scripting and Python
- Experienced in integrating Informatica Cloud mappings with Salesforce applications
- Experienced in Informatica B2B Data Transformation (DT), using the Data Processor transformation to handle semi-structured and unstructured file formats, with knowledge of B2B Data Exchange (DX)
- Worked with various sources and targets in Informatica, such as JSON files, Oracle, SQL Server, Teradata, Salesforce (SFDC), Sybase, VSAM, DB2, XML, flat files, and COBOL files
- Worked on MDM table configuration and data mapping for Informatica MDM landing, staging, and base objects, including match/merge rule configuration and cleansing; implemented Slowly Changing Dimension (SCD) and Change Data Capture (CDC) models in Power Center
- Good experience in data warehouse, data mart, and dimensional data modeling
- Experience in analyzing existing systems, performing impact analysis, and understanding business functionality
- Hands-on experience in performance tuning of Informatica mappings, debugging, code reviews, version control, unit testing, and maintaining standards and best practices
- Experience in installation, configuration, and administration of Informatica products on Windows, UNIX, and Linux
- Worked in both Agile and Waterfall methodologies across the software development life cycle (SDLC)
- Excellent interpersonal skills, with the ability to work independently and within a team and to communicate with senior management, stakeholders, and business owners
TECHNICAL SKILLS:
Data Warehousing ETL: Informatica Power Center, Informatica Big Data Edition (BDE/BDM), Axon, EDC/EIC, IDL, Informatica Power Exchange, Informatica Cloud, Informatica Data Quality (IDQ/IDE), Informatica MDM Hub, Informatica B2B Data Transformation/Data Exchange, Metadata Manager (IMM), SAP Business Objects Information Design Tool (IDT), Erwin Data Modeler, OBIEE
Mainframe: JCL, COBOL, CICS, VSAM, DB2, IMS, z/OS
Others: SQL, PL/SQL, XML, JSON, TOAD, SQL Developer, Shell scripting, Tidal, Autosys, Teradata Studio Express, Tableau
Environment: UNIX/Linux, Windows (7, XP, NT/95/98/2000)
Databases: Oracle 11g/10g/9i, MS SQL Server 2008, Teradata, IBM DB2 UDB, VSAM, IMS, Netezza, Sybase, SFDC (Salesforce) API
PROFESSIONAL WORK EXPERIENCE:
Informatica Lead Developer/Architect
Confidential
Responsibilities:
- Involved in requirements analysis, overall architecture design, anomaly identification, Informatica IDQ/BDM mapping development, and testing.
- Migrated data from on-premise databases to an AWS cloud data lake (a minimal S3 landing sketch follows this list)
- Worked in IDQ/BDM on profiling, DQ rule development (completeness, conformity, consistency, accuracy), cleansing, standardization, parsing, grouping, matching, bad-record and duplicate exception management (Human task) to correct data quality issues, and scorecard development
- Pushed processing down to Hadoop and integrated IDQ scorecard results with Axon Data Governance
- Experienced with the data viewer, column profiling, parameter file generation, and application deployment.
- Knowledge and experience in integrating with Axon Data Governance and EDC/EIC (Enterprise Data Catalog) on the AWS cloud
- Knowledge and experience in installation, configuration, and administration of Informatica Big Data products
- Attended official Informatica training for Axon and EDC/EIC
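A minimal sketch of the S3 landing step referenced above, assuming the on-premise extracts have already been unloaded to delimited flat files; the bucket, prefix, and file names are placeholders, not project values.
```
#!/bin/sh
# Sketch: push nightly on-premise extracts into the S3 data lake landing zone.
# Bucket, prefix, and file names are placeholders.
EXTRACT_DIR=/data/extracts
BUCKET=s3://example-datalake-bucket/landing/oracle
RUN_DATE=$(date +%Y%m%d)

for f in "$EXTRACT_DIR"/*_"$RUN_DATE".csv; do
  [ -f "$f" ] || continue
  # aws s3 cp uploads a local file; --sse AES256 requests server-side encryption
  aws s3 cp "$f" "$BUCKET/$RUN_DATE/" --sse AES256 || exit 1
done
```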
Environment: Informatica Data Quality/BDM 10.2.0/10.2.1 Developer/Analyst, EDC/EIC 10.2, Axon 5.4, Power center, Amazon Web Services (AWS), S3, JSON, RDS - Oracle, Redshift, HBASE, HUE, SQL Server, Infacmd Batch, Windows 7, Linux/Unix, Unix Shell scripts/Python scripting, Scala, Aginity, EC2, EMR, Putty, WinSCP, Data lake
Informatica BDM Lead
Confidential
Responsibilities:
- Involved in requirements analysis, Informatica BDM mapping development, and testing.
- Used the data viewer to inspect data at each transformation; validated code in Native and Hive/Blaze modes; created workflows and applications and deployed them; ran workflows using the infacmd command as well as directly from mappings (a minimal infacmd wrapper sketch follows this list); generated parameter files; and performed testing and application maintenance.
- Migrated data from SQL Server to AWS S3, S3 to HBase, and S3 to Redshift, and built Hive tables over S3
- Strong experience debugging mappings in Hadoop (Blaze pushdown) and Native modes.
- Independently developed a number of Informatica BDM/IDQ mappings, including dynamic mappings
- Created mapplets to handle common logic such as address, email, and phone validation
- Experience analyzing BDM, YARN, and Blaze/Hadoop logs, with and without verbose data mode.
- Created mapping and workflow parameters and generated parameter files
- Excellent communication, presentation, and project management skills; a team player and self-starter with the ability to work independently and as part of a team.
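A minimal wrapper sketch for the infacmd-based workflow runs mentioned above. The domain, service, application, and workflow names are placeholders, credentials are assumed to come from environment variables, and the option names are illustrative only; they vary by Informatica version and should be checked against the installed infacmd.
```
#!/bin/sh
# Illustrative wrapper: start a deployed BDM application workflow via infacmd.
# All names are placeholders; option names are indicative, not authoritative.
INFA_HOME=/opt/informatica/10.2.0
DOMAIN=Domain_DEV
DIS=DIS_DEV
APP=app_customer_load
WF=wf_customer_load

"$INFA_HOME"/isp/bin/infacmd.sh wfs startWorkflow \
  -dn "$DOMAIN" -sn "$DIS" -un "$INFA_USER" -pd "$INFA_PASS" \
  -a "$APP" -wf "$WF"
rc=$?

if [ $rc -ne 0 ]; then
  echo "Workflow $WF failed with return code $rc" >&2
  exit $rc
fi
```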
Environment: Informatica Data Quality 10.1.1, Informatica BDM 10.1.1, Amazon Web Services (AWS), S3, RDS - Oracle, Redshift, JSON, HBASE, HUE, HDFS, Hive, SQL Server, Control-M, Windows 7, UNIX shell/Python scripts, Scala, Spark, Tableau, Aginity, EC2, EMR, Putty, WinSCP, Data warehouse, Data Lake
ETL Lead developer
Confidential, Omaha, NE
Responsibilities:
- Involved in requirements analysis, Informatica mapping development, table creation, debugging, testing, and application maintenance.
- Built effective working relationships with the client team to understand support requirements and manage client expectations.
- Involved in the Agile methodology process; updated work status in the daily stand-up.
- Independently developed a number of Informatica mappings.
- Experience in integrating various data sources such as SQL Server, Oracle, and flat files
- Experience with the UNIX operating system and shell scripting.
- Implemented various projects in production.
- Used the Debugger in Informatica Power Center Designer to find errors in mappings
- Created mapping parameters, session parameters, mapping variables, and session variables (a parameter-file sketch follows this list).
- Excellent communication, presentation, and project management skills; a team player and self-starter with the ability to work independently and as part of a team.
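A minimal sketch of how such mapping and session parameters are typically externalized: a shell step writes a Power Center parameter file and the workflow is started against it with pmcmd. Folder, workflow, session, connection, and parameter names are placeholders.
```
#!/bin/sh
# Sketch: generate a Power Center parameter file, then start the workflow with it.
# Folder, workflow, session, connection, and parameter names are placeholders.
PARAM_FILE=/infa/paramfiles/wf_daily_load.param
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[FOLDER_EDW.WF:wf_daily_load.ST:s_m_load_orders]
\$\$RUN_DATE=$RUN_DATE
\$DBConnection_Src=CONN_SQLSERVER_SRC
\$DBConnection_Tgt=CONN_ORACLE_TGT
EOF

# pmcmd startworkflow runs the workflow on the Integration Service with the
# generated parameter file; -wait blocks until the workflow completes.
pmcmd startworkflow -sv IS_DEV -d Domain_DEV -u "$INFA_USER" -p "$INFA_PASS" \
  -f FOLDER_EDW -paramfile "$PARAM_FILE" -wait wf_daily_load
```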
Environment: Informatica Power Center 9.6, SQL Server, TFS Visual Studio, Tidal, Windows 7, PL/SQL, UNIX, SQL developer, WinSCP, Scrum, Agile Methodology, Data warehouse, ODS
ETL Lead developer
Confidential
Responsibilities:
- Involved in all stages of the software development life cycle (SDLC).
- Created stored procedures and functions in the database, staged procedure calls in Informatica ETL, and built a delta/merge process to load high-volume data.
- Worked on creating profiles, scorecards, and rules in the Analyst tool; built mapplets and mappings for cleansing, standardization, de-duplication, and enrichment using IDQ transformations in Developer
- Created match/merge rules, consolidation (BDD/IDD), Human tasks, and application deployments.
- Integrated data quality rules, mapplets, and mappings with Power Center for execution.
- Worked on creating configurations and data mappings for MDM landing, staging, and base objects
- Implemented various projects in production; analyzed customer queries and provided resolutions
- Worked on Change Data Capture (CDC) to implement Slowly Changing Dimensions (SCD).
- Worked on the Informatica Salesforce PowerExchange connector and developed multiple staging jobs
- Created Oracle PL/SQL scripts, stored procedures, functions, and packages
- Created Teradata scripts using the BTEQ, FastLoad, MultiLoad, FastExport, and TPump utilities
- Interacted with end customers to understand requirements.
- Analyzed session and workflow logs and debugged code to fix testing issues
- Created UNIX shell scripts for pre- and post-session commands (a minimal sketch follows this list)
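A minimal sketch of the kind of post-session command script referenced above; directory names and the archive layout are placeholders.
```
#!/bin/sh
# Sketch of a post-session command script: compress and archive processed
# source files, failing the session if the archive step fails. Paths are placeholders.
SRC_DIR=/infa/srcfiles/orders
ARCH_DIR=/infa/archive/orders/$(date +%Y%m%d)

mkdir -p "$ARCH_DIR" || exit 1

for f in "$SRC_DIR"/*.dat; do
  [ -f "$f" ] || continue            # nothing to archive
  gzip -c "$f" > "$ARCH_DIR/$(basename "$f").gz" && rm -f "$f" || exit 1
done

echo "Archived source files to $ARCH_DIR"
```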
Environment: Informatica Power Center 9.6/10.0, Informatica Data Quality v9.5, Informatica MDM 9.7/10.0, IDD, IDS, JBoss, SIF, HM, Oracle, Teradata, BTEQ Scripts, SFDC (Salesforce) Power Center plug-in, Informatica Cloud (On Demand and Real Time), SQL Server, Rapid SQL, Teradata Studio Express, Erwin Data Modeler, Windows 7, PL/SQL, UNIX, SQL Developer, WinSCP, Confluence, JIRA, Scrum, Agile Methodology, Data warehouse, ODS
ETL Lead developer
Confidential
Responsibilities:
- Involved in all stages of the software development life cycle (SDLC).
- Worked in Informatica Big Data Edition (BDE) to process data on Hadoop
- Created Informatica mappings to write to HDFS and Hive (a Hive target-table sketch follows this list)
- Imported Power Center mappings into the IDQ Model repository (Developer tool)
- Redesigned mappings to handle sorting and statefulness in Hadoop
- Processed inserts and updates via rebuilds
- Ran mappings via workflows in Native and Hive modes
- Worked on column profiling using Hive pushdown
- Worked on Informatica Cloud mappings (on demand and real time) for Salesforce
- Knowledge of creating weblog Data Processor parsers and serializers
- Implemented various projects in production and provided production support
- Worked in IDQ on standardization, parsing, grouping, matching, identity matching, association, automatic and manual consolidation, Address Doctor, bad-record and duplicate exceptions (Human task), and scorecards
- Analyzed session and workflow logs and debugged code to fix testing issues
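A sketch of the Hive target side for the HDFS/Hive mappings above: an external table defined over the HDFS landing path, created through beeline. The JDBC URL, database, table, columns, and location are placeholders.
```
#!/bin/sh
# Sketch: create an external Hive table over the HDFS path the BDE mappings
# write to. URL, database, table, columns, and location are placeholders.
BEELINE_URL="jdbc:hive2://hiveserver2.example.com:10000/edw_stage"

beeline -u "$BEELINE_URL" -e "
CREATE EXTERNAL TABLE IF NOT EXISTS stg_orders (
  order_id      BIGINT,
  customer_id   BIGINT,
  order_date    STRING,
  order_amount  DECIMAL(18,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/edw/stage/orders';
"
```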
Environment: Informatica Power Center 9.6/9.1, Informatica Data Quality(IDQ), Informatica BDM/BDE 10.1/9.6.1, Informatica Cloud, IMM, Hadoop, Cloudera, Hive, Salesforce, Teradata, Teradata Utilities - TPT, MLOAD, Oracle 11g, SQL Server, DB2UDB,Rapid SQL, Teradata Studio Express, Erwin Data Modeler, SAP Business Objects (IDT), Hyperion, PL/SQL, Unix Shell scripts, Python Scripts, SQL developer, WinSCP, Remedy Change management, Confluence, JIRA, Scrum, Agile, Data warehouse, ODS
ETL Lead developer
Confidential
Responsibilities:
- Good understanding of data warehousing principles, ER diagrams, and dimensional data modeling
- Created mappings with transformations such as Parser, Standardizer, and Address Validator/Verification (AV) for cleansing, de-duplication, and data enrichment in the IDQ tool; integrated the data quality mapplets with Power Center
- Created column profiles, midstream profiles, scorecards, and rules/mapplets using Developer (IDQ)
- Experience configuring MDM Multidomain Edition, including data mappings for landing, staging, and base object tables, the match/merge process, and automatic and manual (IDD) consolidation in the Informatica MDM Multidomain Edition Hub.
- Interacted with end customers to understand requirements.
- Analyzed customer queries and provided resolutions; fixed QC defects and provided solutions.
- Analyzed session and workflow logs in case of failures
- Created stored procedures, functions, and UNIX shell/Perl scripts (a wrapper sketch follows this list).
- Involved in mainframe/COBOL data extraction and integration with Informatica
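A minimal sketch of the shell wrappers mentioned above, invoking an Oracle stored procedure through sqlplus and returning the result to the scheduler; the connect string, credentials, and procedure name are placeholders.
```
#!/bin/sh
# Sketch: call an Oracle stored procedure from a shell wrapper and propagate
# failures to the scheduler. Connect string and procedure name are placeholders.
sqlplus -s "$ORA_USER/$ORA_PASS@ORCL_DEV" <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE
SET SERVEROUTPUT ON
EXEC edw_pkg.load_customer_dim;
EXIT
EOF

rc=$?
if [ $rc -ne 0 ]; then
  echo "edw_pkg.load_customer_dim failed with code $rc" >&2
  exit $rc
fi
```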
Environment: Informatica Power Center 9.6/9.5/9.1, Informatica Developer (IDQ), Informatica Analyst, Informatica B2B Data Transformation/Data Exchange (DX/DT), JMS, Informatica MDM Multidomain Hub Edition 9.5, Oracle 11g, JBoss, SIF, HM, Windows 7, PL/SQL, DB2, UNIX, Data warehouse, ODS, WinSCP, HP OpenView Service Center, Quality Center.
ETL Lead
Confidential
Responsibilities:
- Developed technical specifications for the ETL process flow
- Worked in IDQ on standardization, parsing, grouping, matching, identity matching, association, and automatic and manual consolidation (IDD, Informatica Data Director) in the Informatica Developer tool
- Implemented data quality rules to address issues identified during the analysis phase.
- Standardized address data using Address Doctor and reference tables.
- Integrated data quality rules, mapplets, and mappings with Power Center for execution.
- Experience configuring MDM and data mappings for landing, staging, and base object tables
- Experience with the MDM/IDQ match/merge process and automatic and manual (IDD) consolidation in the Informatica MDM Hub Multidomain Edition.
- Installed and configured Informatica Power Center client tools and connected them to each database
- Extensively worked with Aggregator, Sorter, Router, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator transformations.
- Set up a metadata-driven utility design for the ETL processes using Informatica.
- Involved in production support, resolving issues and bugs.
- Worked on SQL stored procedures, functions, and packages in Oracle.
- Created and maintained UNIX shell/Perl scripts for pre/post-session tasks
- Scheduled various daily and monthly ETL loads.
Environment: Informatica Power Center 9.1, Informatica Developer (IDQ) V9, Informatica Analyst V9, Informatica MDM Hub V9, Oracle 11g, JBoss, SIF, PL/SQL, TOAD, Teradata, Rapid SQL, Studio Express, SQL Server 2005/2008, Tidal, XML, Web services, SOAP, REST, WSDL, UNIX, Windows 7, EDI X12, Mainframe JCL, COBOL, DB2, VSAM, Data warehouse, ODS
Senior Software Engineer
Confidential
Responsibilities:
- Responsible for development, support, and maintenance of ETL (Extract, Transform, and Load) processes; redesigned existing mappings to enhance performance.
- Created stored procedures, PL/SQL scripts, tables, views, synonyms, and test data.
- Good understanding of data warehousing principles, ER diagrams, and dimensional data modeling
- Extensively used ETL to load data from various sources into the staging area and then into the target tables
- Experience in performance tuning of sources, mappings, targets, and sessions
- Wrote complex SQL queries based on the given requirements, using volatile tables, temporary tables, and derived tables to break complex queries into simpler ones
- Involved in 24x7 production support; resolved issues and bugs.
- Used the Debugger to test mappings and fix bugs.
- Created Teradata scripts using the BTEQ, FastLoad, MultiLoad, FastExport, and TPump utilities (a BTEQ sketch follows this list)
- Analyzed session log files when sessions failed, to resolve errors in the mapping or session configuration.
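A minimal BTEQ sketch of the pattern described above, with a volatile table used to break a complex query into simpler steps; the tdpid, logon, and object names are placeholders.
```
#!/bin/sh
# Sketch: BTEQ run from a shell heredoc, staging intermediate rows in a
# volatile table before the final insert. Logon and object names are placeholders.
bteq <<EOF
.LOGON tdprod/etl_user,etl_password;

CREATE VOLATILE TABLE vt_daily_sales AS (
  SELECT store_id, sales_date, SUM(sale_amt) AS total_amt
  FROM   edw.sales_detail
  WHERE  sales_date = CURRENT_DATE - 1
  GROUP BY 1, 2
) WITH DATA PRIMARY INDEX (store_id) ON COMMIT PRESERVE ROWS;

INSERT INTO edw.daily_sales_summary
SELECT store_id, sales_date, total_amt
FROM   vt_daily_sales;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
```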
Environment: Informatica Power Center 8.6.1, IBM DB2, Teradata, Mainframe JCL, COBOL, DB2, CICS, VSAM, UNIX, Oracle 11g, Data warehouse, ODS
Software Engineer
Confidential
Responsibilities:
- Developed a number of complex Informatica mappings and reusable transformations.
- Handled performance tuning of Informatica code (mappings, sessions).
- Worked on processing semi-structured and unstructured files such as EDI X12, JSON, PDF, and Word using Informatica Data Transformation
- Created and onboarded EDI partners and integrated them with Informatica for automation using Informatica Data Exchange (DX)
- Involved in designing staging and data mart environments and building DDL scripts to reverse-engineer the logical/physical data model using Visio and PowerDesigner.
- Created stored procedures and functions in various databases.
- Provided on-call production support for the application
- Tracked project statuses, conducted team meetings, and provided status updates to the manager.
- Analyzed session and workflow logs and debugged code in case of failures.
- Worked on JCL, COBOL, and DB2 programs
Environment: Informatica Power Center 9.1, Informatica B2B DX/DT, Oracle 11g, Teradata, SQL Server 2005, Control-M, Windows XP, PL/SQL, UNIX, SQL Developer, COBOL, JCL, DB2, CICS, IMS, SQL, VSAM, z/OS, XML, Java, Web services, SOAP, REST, WSDL, Data warehouse, ODS
Software Engineer
Confidential
Responsibilities:
- Developed Informatica mappings, workflows to load data.
- Installed and Configured Informatica Power Center client tools
- Extensively used Informatica to load data from Oracle and flat files into Oracle.
- Extensively worked with Aggregator, Sorter, Router, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator transformations.
- Used the Debugger to test mappings and fix bugs.
- Used the Workflow Manager to create workflows, tasks, and worklets.
- Worked on SQL stored procedures, functions and packages.
- Created and maintained UNIX shell/Perl scripts for pre/post-session operations
- Worked on COBOL programs and JCL batch job creation
- Extensively involved in Optimization and Tuning of mappings and sessions in Informatica
- Scheduled various daily and monthly ETL loads using Control-M
- Involved in writing UNIX shell/Perl scripts to run and schedule batch jobs.
- Involved in Production Support in resolving issues and bugs.
Environment: Informatica Power Center 8.1/8.6.1, Oracle 9i, SQL Server 2000/2005, Teradata, PL/SQL, TOAD, UNIX Shell Scripting, JCL, COBOL, DB2, CICS, Z/OS, Windows XP, DB2, UNIX, Autosys, Data warehouse, ODS