Sr. Informatica IDQ/MDM Developer Resume
Charlotte, NC
SUMMARY:
- Hands-on experience in all aspects of the Software Development Life Cycle (SDLC) and Agile/Scrum methodologies.
- Hands-on experience with the Informatica MDM Hub Console, including data mappings from landing to staging to base objects, trust and validation rules, match paths, match columns, match rules, merge properties, and batch group creation; also involved in customizing and configuring IDD applications.
- Expertise in the design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages.
- Excellent experience designing and developing multi-layer web-based information systems using web services, including Java and JSP.
- Experience in Data Warehouse, Relational Database and System Integration. Proficiency in gathering and analyzing user requirements and translating them into business solutions.
- Efficient in creating source and Confidential databases and developing extraction, transformation, and loading (ETL) strategies using Informatica Power Center Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Worked with complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Update Strategy, Union, Rank, Normalizer, Unconnected / Connected Lookup, Java, Sorter, Sequence Generator and Aggregator.
- Practical knowledge on Git, AWS, Docker, Chef, Jenkins, Linux, Python, Java, Teradata, MySQL, Oracle and scripting.
- Experience using MongoDB, open-source software that avoids the traditional table-based relational database structure in favor of JSON-like documents with dynamic schemas (a format MongoDB calls BSON); a minimal pymongo sketch follows this list.
- Designed and implemented slowly changing dimensions (SCD types 1, 2, and 3) and CDC (change data capture).
- Hands on experience in creating ETL transformations and jobs using Pentaho Kettle Spoon designer and scheduling them on Pentaho BI Server.
- Have worked with technologies including: XML, Java, Hermes JMS, SOAP UI, ASP, VB, Oracle, SQL Server and IBM Mainframes.
- Strong understanding of Dimensional Modeling, OLAP, Star, Snowflake Schema, Fact, and Dimensional tables and DW concepts.
- Experienced in the SSIS programming model; coded several SSIS packages using multiple programming languages.
- Extensive experience integrating various data sources such as Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, Mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, flat files, MQ Series, and XML.
- Serial/parallel batch processing and real-time ETL, including CDC and queue-based sources (MQ Series, JMS, TIBCO, etc.).
- Wrote and executed test cases and tracked defects using Quality Center.
- Involved in migration of objects across all project phases (DEV, QA, and PRD) and trained developers to maintain the system in production.
- Good knowledge of the AutoSys scheduling tool, RedGate tools, TeamCity (build management and continuous integration server), TortoiseSVN (Subversion client), and StarTeam (revision control system).
- Excellent analytical skills and a great ability to grasp newer technologies. Effective team player with excellent communication skills; a self-starter and results-oriented, with the ability to manage multiple tasks simultaneously.
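As a minimal illustration of the MongoDB document model mentioned above, the pymongo sketch below inserts documents with differing shapes into one collection and runs a regular-expression query; the connection string, database, collection, and field names are all hypothetical.

    # Hypothetical pymongo sketch: schema-less, JSON-like documents (stored as BSON).
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # assumed local server
    customers = client["demo_db"]["customers"]          # illustrative names

    # Documents in the same collection need not share a fixed schema.
    customers.insert_one({"name": "Acme Corp", "tier": "gold"})
    customers.insert_one({"name": "Beta LLC", "contacts": [{"email": "a@beta.example"}]})

    # Regex search; the projection returns only the name field.
    for doc in customers.find({"name": {"$regex": "^A"}}, {"name": 1, "_id": 0}):
        print(doc)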
TECHNICAL SKILLS:
Databases: Oracle 9i/10g/11g/12c; SQL Server 2005/2008/2012/2014
Languages: C, C++, Java, J2EE, Visual Basic, SQL, PL/SQL, Siebel, COBOL, Python and UNIX Shell Scripting.
ETL Tool: Informatica Power Center 10.x/9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server)
MDM Packages: Informatica MDM Multi Domain Edition 10.0, 9.7.1, 9.5, 9.1; Informatica Data Director (IDD) 10.0, 9.7, 9.5; Informatica Data Quality (IDQ) 10.0, 9.6; DDM 9.6; SAP
RDBMS: Oracle 12c/11g/10g/ 9i (SQL/PLSQL)
DB Tools: SQL*Plus, SQL*Loader, TOAD, OBIEE, BTEQ, FastLoad, MultiLoad, FastExport, SQL Assistant, Teradata Administrator, PMON, Teradata Manager, MicroStrategy, Cognos, BO
Tools and Utilities: TOAD 10.1, TextPad, WordPad, SQL Developer 4.0.3
Modeling Tools: Erwin 4.0 data modeler, ER studio 7.5, MS Visio 2007
Environment: Windows 7/XP/2000, Windows Server 2003/2008, UNIX, Linux
Packages: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word), Visual Studio, Java Eclipse
Scheduling Tools: Autosys
Version Control Tools: Clear Case
Data Methodologies: Logical/Physical/Dimensional modeling, Star/Snowflake schemas, ETL, OLAP, complete software development cycle, ERwin 4.0
BI: MicroStrategy, SAS, Cognos.
Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX
PROFESSIONAL EXPERIENCE:
Confidential, Charlotte, NC
Sr. Informatica IDQ/ MDM Developer
Responsibilities:
- Used Address Doctor extensively for North America address validation. Built several reusable IDQ components using Standardizers and reference tables that can be applied directly to standardize and enrich address information.
- Extracted addresses from multiple heterogeneous sources such as flat files, Oracle, SAS, and SQL Server.
- Created custom rules to validate ZIP codes and states, and segregated address data by country (see the Python sketch after this list).
- Created web services for address mapplets of different countries to integrate with SOAP UI.
- Used the Informatica MDM 10.1 (formerly Siperian) tool to manage master data in the EDW.
- Extracted consolidated golden records from MDM base objects and loaded into downstream applications.
- Extensively involved in ETL testing: created unit and integration test plans to test the mappings, created test data, and used debugging tools to resolve problems.
- Created reference tables to standardize data.
- Validated data quality and business rules against the mapping document and FSD to maintain data integrity.
- Used Python scripts to update content in the database and manipulate files.
- Worked with a team of developers on Python applications for risk management.
- Wrote SQL test cases for data quality validation.
- Performed various data validation and data analysis activities for data quality testing.
- Investigated data quality issues and failures, communicated them to the onsite DQ development team, and fixed them.
- Performed end-to-end data quality testing and support in an enterprise warehouse environment.
- Maintained data quality, data consistency, and data accuracy for data quality projects.
- Provided production support by scheduling and executing production batch jobs and analyzing log files on Informatica 8.6 and 9.1 Integration servers.
- Experience in Data profiling and Scorecard preparation by using Informatica Analyst.
- Strong knowledge of Informatica IDQ 9.6.1 transformations and the Power Center tool.
- Strong exposure to source-to-Confidential data flows and data models for various data quality projects.
- Involved in daily status call with onsite Project Managers, DQ developers to update the test status and defects.
- Strong knowledge of databases, data warehouse concepts, ETL processes, and Business Intelligence.
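As an illustration of the custom validation rules described above, a minimal Python sketch follows; the field names, the abbreviated state list, and the rule logic are hypothetical stand-ins for the actual IDQ rule specifications.

    # Hypothetical sketch of ZIP/state validation analogous to the IDQ custom rules.
    import re

    US_STATES = {"NC", "NY", "CA", "TX", "MN"}   # abbreviated sample set, not complete
    ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")     # US ZIP or ZIP+4

    def validate_address(rec):
        """Return a list of rule violations for an address record."""
        errors = []
        if rec.get("country", "US") == "US":     # segregate rules by country
            if not ZIP_RE.match(rec.get("zip", "")):
                errors.append("invalid_zip")
            if rec.get("state") not in US_STATES:
                errors.append("invalid_state")
        return errors

    print(validate_address({"zip": "28202", "state": "NC"}))   # []
    print(validate_address({"zip": "2820", "state": "ZZ"}))    # ['invalid_zip', 'invalid_state']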
Environment: Informatica MDM 10.1/10.2, Informatica Data Director 10.1/10.2, Python, Informatica ActiveVOS 9.2.4.1/4.2, Informatica Power Center 10.1, JBoss 6.4 EAP, RHEL 7, MS SQL Server.
Confidential, Minneapolis, MN
Senior Informatica Developer
Responsibilities:
- Translated the business processes/SAS code into Informatica mappings for building the data mart.
- Used Informatica Power Center to load data from different sources such as flat files, Oracle, and Teradata into the Oracle data warehouse.
- Implemented pushdown optimization, pipeline partitioning, and persistent cache for better performance.
- Applied business rules that identify relationships among the data using Informatica Data Quality (IDQ 8.6).
- Modified existing Informatica Data Quality (IDQ 8.6) workflows to integrate the business rules and certify the quality of the data.
- Defined measurable metrics and required attributes for the subject area to support a robust and successful deployment of the existing Informatica MDM 9.5 platform.
- Planned Informatica MDM 9.5 requirement-analysis sessions with business users.
- Created Informatica MDM 9.5 Hub Console Mappings.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
- Performed data modeling using Star and Snowflake schemas. Strong in source-to-Confidential data mapping and CDC (Change Data Capture) using Slowly Changing Dimension mappings and incremental loads (see the SCD Type 2 sketch after this list).
- Hands-on experience creating and converting Oracle scripts (SQL, PL/SQL) to Teradata scripts.
- Configured rules for the Power Center operations team covering no-file monitoring, processes not started, reject records, and long-running jobs.
- Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
- Performed POCs on the latest products and technologies that can be used in the Enterprise Business Intelligence area.
- Extensively worked on creating NoSQL data models and data loads with Bloom filters and TTL columns in column families.
- Used MongoDB, an open-source database that avoids the traditional table-based relational structure in favor of JSON-like documents with dynamic schemas (a format MongoDB calls BSON).
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Assisted the QC team in carrying out its QC process of testing the ETL components.
- Created pre-session and post-session shell scripts and email notifications.
- Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
- Created mappings using Data Services to load data into SAP HANA.
- Involved in Data Quality checks by interacting with the business analysts.
- Performed unit testing and tuned the mappings for better performance.
- Maintained documentation of ETL processes to support knowledge transfer to other team members.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
- Responsible for requirement definition and analysis in support of Data Warehousing efforts.
- Extensive experience with relational databases: Oracle 10g/11g, DB2, SQL Server, Teradata, Greenplum, and Amazon AWS Redshift.
- Sourced data from RDS and AWS S3 buckets and populated the Teradata Confidential.
- MongoDB supports regular-expression searches; queries can return specific fields of documents, perform range queries, and include user-defined JavaScript functions.
- Used the Source Analyzer and Warehouse Designer to import the source and Confidential database schemas, and the Mapping Designer to map the sources to the Confidential.
- Developed data Mappings between source systems and Confidential system using Mapping Designer.
- Developed shared folder architecture with reusable Mapplets and Transformations.
- Extensively worked with the Debugger for handling the data errors in the mapping designer.
- Created events and various tasks in the workflows using Workflow Manager.
- Responsible for tuning ETL procedures to optimize load and query Performance.
- Set up batches and sessions to schedule the loads at the required frequency using Informatica Workflow Manager and an external scheduler.
- Used the Aggregator transformation to load the summarized data for Sales and Finance departments.
- Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
- Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
- Took part in Informatica administration; migrated development mappings and applied hot fixes in the production environment.
- Involved in writing shell scripts for file transfers, file renaming and several other database scripts to be executed from UNIX.
- Created web service jobs by configuring WSDL in designer and used Informatica Web Services Hub to start the Informatica tasks.
- Troubleshot issues in TEST and PROD; performed impact analysis and fixed the issues.
- Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.
- Developed Unit test cases and Unit test plans to verify the data loading process and Used UNIX scripts for automating processes.
- Served as part of the production support team.
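As a minimal illustration of the SCD Type 2 logic referenced above, the in-memory Python sketch below expires the current dimension row on change and inserts a new current version; the table layout and column names are hypothetical, and the actual implementation used Informatica mappings.

    # Hypothetical in-memory sketch of SCD Type 2: close the old version, add a new one.
    from datetime import date

    dim = [{"cust_id": 1, "city": "Austin", "eff_from": date(2020, 1, 1),
            "eff_to": None, "current": True}]

    def apply_scd2(dim_rows, cust_id, new_city, as_of):
        for row in dim_rows:
            if row["cust_id"] == cust_id and row["current"] and row["city"] != new_city:
                row["eff_to"], row["current"] = as_of, False    # expire old version
                dim_rows.append({"cust_id": cust_id, "city": new_city,
                                 "eff_from": as_of, "eff_to": None, "current": True})
                return

    apply_scd2(dim, 1, "Dallas", date(2021, 6, 1))
    print(dim)   # expired Austin row plus a new current Dallas row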
ENVIRONMENT: Informatica Power Center 9.1, Informatica MDM 9.5, Informatica IDQ 8.6, Power Exchange, Teradata, Oracle 11g, MS Access, Oracle WebLogic 10.3.2, UNIX Shell Scripts, Windows NT/2000/XP, SQL Server 2008, SSIS, OBIEE, QlikView, Linux, SQL Assistant, Netezza, DB2.
Confidential, Maryland
Sr Informatica Developer
Responsibilities:
- Involved in analyzing scope of application, defining relationship within & between groups of data, star schema, etc.
- Analyzed the star schema in dimensional modeling and identified suitable dimensions and facts for the schema.
- Involved in the Design and development of Data Mart and populating the data from different data sources using Informatica.
- Participated in development of Reports using Informatica.
- Parsing high-level design spec to simple ETL coding and mapping standards.
- Created and reviewed Informatica mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Lookup, Stored Procedure, Filter, Sequence Generator, Router, Union, and Update Strategy, and created PL/SQL procedures and functions, adhering to the Time Warner coding standards.
- Enabled Agile Business Intelligence (BI) with data virtualization.
- Extensively used Oracle, Netezza, flat file, XML file, and DB2 data as source and Confidential.
- Wrote multiple Python programs to monitor virtual machine usage data using VMware API calls (see the sketch after this list).
- Wrote stored procedures in PL/SQL for dropping and re-creating indexes.
- Created and executed the test cases for Informatica mappings and UNIX scripts.
- Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
- Regularly interacted with Business Intelligence leadership on project work status, priority setting, and resource allocation.
- Worked with the Security, SiteMinder, Web Hosting, and WebLogic teams to implement SSO.
- Worked on JMS integration with Informatica.
- Worked extensively with advanced analytics such as reference lines and bands, and trend lines.
- Expertise in working with data: building groups, hierarchies, and sets.
- Mastered different formatting techniques using annotations and mark labels.
- Developed effective and interactive dashboards using parameters and actions.
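As an illustration of the VM usage monitoring mentioned above, a short sketch follows, assuming the pyVmomi client for the VMware API; the host, credentials, and choice of quick-stats metrics are hypothetical.

    # Hypothetical pyVmomi sketch: list CPU/memory quick stats for all VMs.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()               # lab setting; skip cert checks
    si = SmartConnect(host="vcenter.example.com",        # assumed vCenter host
                      user="monitor", pwd="secret", sslContext=ctx)
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)  # all VMs in the inventory
    for vm in view.view:
        stats = vm.summary.quickStats
        print(vm.name, stats.overallCpuUsage, "MHz,", stats.guestMemoryUsage, "MB")
    view.Destroy()
    Disconnect(si)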
Environment: Oracle, Informatica Power Center, Power Exchange, Business Intelligence Development Studio, Netezza, UNIX Shell Script, Tableau 9.5, PuTTY, DB2, Mainframe COBOL, SQL*Plus, SQL*Loader.
Confidential, MA
Sr. Informatica Developer
Responsibilities:
- Involved in gathering, analyzing, and documenting business requirements, functional requirements, and data specifications from users, and transformed them into technical specifications.
- Extracted data from various sources such as flat files, XML files, and Oracle, and loaded it into the enterprise data warehouse.
- Worked on Informatica 9.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
- Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.
- Created complex mappings using the Mapping designer, respective workflows and worklets using the Workflow manager.
- Troubleshot the mappings using the Debugger and improved data-loading efficiency using SQL overrides and Lookup SQL overrides.
- Used version control to track changes.
- Developed SCD Type I and Type II mappings.
- Implemented incremental loads, change data capture, and incremental aggregation.
- Used bulk load utility to load bulk data to the database.
- Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
- Created UNIX shell scripts and invoked them as pre-session and post-session commands.
- Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
- Responsible for moving the mappings and sessions from development repository to production repository and provided 24/7 production support.
- Developed Unit test plans for every mapping developed and executed the test plans.
Environment: Informatica Power Center 8.6.1, Oracle 10g, TOAD for Oracle, Flat Files, XML Files, Erwin 7.3, MS Visio, Windows 2000, UNIX AIX, Shell Scripting, IDQ, PL/SQL, SQL, OBIEE, AppWorx Scheduling Tool.
Confidential, IN
Informatica Developer
Responsibilities:
- Extensively involved in gathering requirements by holding meetings with users.
- Constructed context diagrams and data-flow diagrams based on descriptions of business processes; analyzed the data model and identified heterogeneous data sources.
- Constructed an extended entity-relationship diagram based on a narrative description of a business scenario.
- Created the Source and Confidential Definitions using Informatica Power Center Designer.
- Used Informatica Power center to load the data into data warehouse.
- Developed Informatica mappings and mapplets and tuned them for optimum performance and dependencies.
- Created reusable Transformations for modifying data before loading into Confidential tables.
- Created mapplets in the Informatica Designer which are generalized and useful to any number of mappings for ETL jobs.
- Created transformations using SQL scripts to modify the data before loading it into tables.
- Created and used mapping parameters, mapping variables using Informatica mapping designer to simplify the mappings.
- Used the Business Objects features Slice and Dice and Drill Down for multi-dimensional analysis.
- Scheduled various daily and monthly ETL loads using Control-M.
- Inserted objects, conditions, classes, subclasses, and user objects according to the client's requirements.
- Prepared SQL queries to validate the data in both source and Confidential databases (see the validation sketch after this list).
- Managed database objects: indexes, triggers, procedures, functions, packages, and cursors.
- Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
- Worked with the UNIX team to write UNIX shell scripts customizing the server scheduling jobs.
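As an illustration of the source-versus-destination validation queries mentioned above, a self-contained Python sketch follows; it uses sqlite3 so it runs anywhere, whereas the real checks ran against the project's source and Confidential databases, and the table names here are hypothetical.

    # Hypothetical reconciliation checks: compare row counts and sums across tables.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src_orders (id INTEGER, amt REAL);
        CREATE TABLE tgt_orders (id INTEGER, amt REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
    """)

    checks = {
        "row_count_diff": "SELECT (SELECT COUNT(*) FROM src_orders)"
                          " - (SELECT COUNT(*) FROM tgt_orders)",
        "amt_sum_diff": "SELECT (SELECT SUM(amt) FROM src_orders)"
                        " - (SELECT SUM(amt) FROM tgt_orders)",
    }
    for name, sql in checks.items():
        diff = con.execute(sql).fetchone()[0]
        print(name, "PASS" if diff == 0 else "FAIL (diff=%s)" % diff)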
Environment: Informatica Power Center 8.1/8.6, Oracle 9i, SQL Server, TOAD, Control M, Windows 2003 and UNIX.
Confidential
Oracle Developer
Responsibilities:
- Created the mapping documents based on the data model and the client requirements.
- Developed Informatica Mappings as per the documents.
- Responsible for testing and migration of ETL maps from the development to production environment.
- Worked on performance tuning.
- Prepared test cases and test plans.
- Loaded the data received from the wholesalers and distributors, residing on the UNIX server, into the data warehouse on a daily and weekly basis.
- Involved in analysis of data in various modules of the project.
- Worked with the data-modeling tool Erwin.
- Loaded the Dimension tables with the various Sales force data at various levels.
- Worked with MS-Excel spreadsheet and MS-Word for Documentation.
Environment: Informatica Powercenter 6.2, Cognos 5.x/6.1, DB2, Oracle 8i, UNIX (Solaris), Windows NT.