
Informatica Data Quality Lead Developer Resume


NYC, NY

SUMMARY:

  • Over 12 years of IT experience in ETL architecture design, system analysis, application design, development, testing, implementation, maintenance, and support of enterprise-level Data Integration, Enterprise Data Warehouse (EDW), and Business Intelligence (BI) solutions built on Operational Data Stores (ODS), Data Warehouses (DW), and Data Marts (DM) using the Informatica PowerCenter ETL tool.
  • 7+ years of experience in the finance industry, with domain knowledge of securities, collateral/loans, fixed income, trades, customer, employee, and other financial/compliance data.
  • 11 years of extensive experience with the Informatica PowerCenter ETL tool, including PowerExchange, IDQ, and MDM.
  • Good knowledge of big data technologies using Informatica ETL with HDFS and Hive adapters.
  • Experience in data modeling techniques such as dimensional modeling, star/snowflake/hybrid schema modeling, and conceptual/logical/physical data modeling using Erwin and Visio. Built data quality assessments and measurements of critical data used by strategic systems and processes.
  • Experienced in using IDQ for profiling, applying rules, and developing mappings to move data from source to target systems.
  • Experience in developing Transformations, Mapplets and Mappings using Informatica Designer to implement business logic.
  • Experience in integrating various data sources such as Oracle, Teradata, Exadata, Netezza, Sybase, flat files, XML, network shares, web services, CDC (Change Data Capture), and Informatica Data Replication (IDR).
  • Experience in PowerCenter and IDQ developing plans for analysis, standardization, matching and merging, Address Doctor validation, and consolidation of data from different components.
  • Analyze data quality issues and support the Global Functions, ensuring data quality scorecards measure fit-for-purpose data and serve audit purposes.
  • Designed and developed BDM mappings in Hive mode for large-volume data inserts/updates.
  • Implemented Informatica BDM mappings to extract data from the data warehouse into the data lake.
  • Proven understanding of Hadoop, Hive, Pig, and HBase.
  • Applied rules and profiled source and target table data using IDQ.
  • Developed mappings, applying rules and transformation logic per source and target system requirements.
  • Scheduled jobs with AutoSys, Control-M, and Tidal.
  • Experience in creating Perl, Python and UNIX shell scripts
  • Expertise in creating complex Informatica mappings and reusable components such as reusable transformations, mapplets, worklets, and reusable control tasks that encapsulate shared business logic.
  • Experienced in configuring Nodes and Repositories in Informatica Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Experience in implementing SCD (Slowly Changing Dimension) Type 1, Type 2, and Type 3.
  • Good experience with performance techniques such as partitioning, pushdown optimization, SQL query tuning, Informatica mapping tuning, and other database-related tuning.
  • Conducted System, UAT and Functionality testing and investigated software bugs.
  • Extensive experience conducting requirement-gathering sessions and writing Business Requirement and Functional Requirement documents.
  • SQL tuning and index creation for faster database access and better query performance in Informatica, using partitioning, explain plans, and SQL hints for queries and indexing (a brief sketch follows this list).
  • Scheduled batch jobs and deployed objects between environments.
  • Knowledge of designing and developing Business Intelligence reports using the BI tools Hyperion Essbase, Business Objects, and Cognos.
  • Implemented match-and-merge rules in Informatica MDM 10.1 to find duplicates and derive the golden record.
  • Knowledge of Informatica MDM concepts and implementation of the de-duplication process and IDD.
  • Hands-on experience maintaining version control.
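
A brief, hedged sketch of the explain-plan step mentioned above, assuming a UNIX host with Oracle's sqlplus client on the PATH; the connect string, table, and index names are illustrative placeholders, not details of any actual engagement:

    #!/bin/sh
    # Illustrative only: generate and display an Oracle explain plan for a
    # candidate query before deciding on indexes, partitions, or hints.
    {
        echo "SET LINESIZE 200 PAGESIZE 100"
        echo "EXPLAIN PLAN FOR"
        echo "SELECT /*+ INDEX(t trades_dt_idx) */ trade_id, trade_dt, notional"
        echo "FROM stg_trades t WHERE trade_dt >= TRUNC(SYSDATE) - 1;"
        echo "SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);"
    } | sqlplus -s etl_user/"$ORA_PWD"@ORCL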

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6/8.6/8.5/8.1/8.0/7.x/6.x/5.x, IDQ 9.5.1/9.1, PowerExchange, CDC, Source Analyzer, Mapping Designer, Mapplets, Transformations, Workflow Manager, Workflow Monitor, Power Analyzer, Data Profiling and Data Cleansing, Flat Files (Fixed Width, Delimited), OLAP

Business Intelligence: Business Objects, Cognos and Hyperion Essbase

RDBMS: Oracle 12C/11G/10G/9i, SQL*Loader, Teradata, Netezza, Exadata, Sybase

Data Modelling: Erwin 9.X, Microsoft Visio

Tools: Toad, SQL DBx, PL/SQL Developer, SQL Assistant, Rapid SQL 8.x, Putty, WinSCP, Autosys, Remedy, Quality Centre, Control-M, JIRA

Languages: SQL, PL/SQL, Perl, Python, UNIX Shell Scripting

Operating Systems: Windows, UNIX/LINUX, Sun OS

PROFESSIONAL EXPERIENCE:

Confidential, NYC, NY

Informatica Data Quality Lead Developer

  • Support the Data Quality Program, including the creation and execution of DQ queries using Informatica IDQ and JIRA, management of DQ exceptions, and support of a DQ Scorecard.
  • Assist in expanding the monitoring and measurement activities across the Americas.
  • Analyze and create new DQ rules using Informatica, introduce new features, coordinate software upgrades, and migrate existing procedures.
  • Assist in expanding the data quality scorecard to incorporate new items and improve efficiency.
  • Analyze gaps and deficiencies within business areas by studying current reporting and analytical practices.
  • Support data clean-up and reconciliation through root cause analysis (RCA) and coordination of production changes. RCA will require hands-on data analysis using SQL.
  • Implement and update data governance procedures, standards, training documents, presentations, and Data Governance detail plans.
  • Assist Data Stewards and Business Process Owners to achieve Data Quality & Monitoring goals.
  • Analyze and provide data metrics to management to help prioritize areas for data quality improvement.
  • Write code for complex system designs, including programs that span platforms, and code against and/or create Application Programming Interfaces (APIs).
  • Write shell scripts to invoke mappings and workflows (a minimal sketch follows this list).
  • Proficient at writing SQL queries and verifying the results.
  • Migrate objects from IDQ to PowerCenter and other Informatica tools.
  • Write test scenarios, perform unit testing, and verify end results against business requirements.
  • Perform both record-level and large-scale manual additions, adjustments and corrections to continuously ensure overall data quality and integrity
  • Build subject matter expertise.
  • Identify gaps between current capabilities and new requirements.
  • Assist the Chief Data Office with various department related activities.
  • Perform data quality activities, i.e., data quality rule creation, edit checks, identification of issues, root cause analysis, value case analysis, and remediation.
  • Provide strategic direction for the planning, development, and implementation of various projects in support of the enterprise data governance office and data quality teams.
  • Own the development of the 'Integrated Control Framework (ICF)' for data quality.
  • Use Informatica Data Quality (IDQ) to profile the project source data, define or confirm the definition of the metadata, cleanse and accurately check the data.
  • Check for duplicate or redundant records and provide guidance on how the backend ETL process should proceed.
  • Partner with data stewards to provide summary results of data quality analysis, which will be used to make decisions regarding how to measure business rules and quality of the data
  • Implement data quality process including translation, parsing, analysis, standardization and enrichment of data at point of entry and batch modes.
  • Deploy mappings that run in scheduled, batch, or real-time environments.
  • Collaborate with various business and technical teams to gather requirements around Data quality rules and propose the optimization of these rules, then design and develop these rules with IDQ. Design and develop Custom Objects and rules, Reference Data tables and create/import/export mappings.
  • Per business requirements, perform thorough data profiling across multiple usage patterns, root cause analysis, and data cleansing, and develop scorecards using Informatica Data Quality (IDQ).
  • Develop 'matching' plans, help determine best matching algorithm, configure identity matching and analyze duplicates. Build human task workflows for exception management and processing.
  • Develop KPIs and KRIs for the data quality function. Drive improvements that maximize the value of data quality (e.g., changes that provide access to required metadata and quantify the cost of poor data quality).
  • Strengthen the data steward role within the Bank's first and second lines of defense.
  • Augment and execute the data quality plan to ensure achievable and measurable milestones are mapped out for delivery and effectively communicated.
  • Document all mappings, mapplets and rules in detail and handover documentation to the Customer.
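
A minimal sketch of the kind of workflow wrapper script referred to above, assuming the Informatica pmcmd client is on the PATH; the domain, service, folder, and workflow names are illustrative placeholders:

    #!/bin/sh
    # Illustrative only: start a PowerCenter workflow and wait for completion.
    # INFA_DOMAIN, INFA_USER, and the INFA_PWD password variable are assumed
    # to be set in the environment.
    pmcmd startworkflow -sv Int_Svc_DQ -d "$INFA_DOMAIN" \
        -u "$INFA_USER" -pv INFA_PWD \
        -f DQ_FOLDER -wait wf_dq_scorecard_load
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_dq_scorecard_load failed with return code $rc" >&2
        exit $rc
    fi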

Environment: Informatica Developer 10.2, Informatica PowerCenter 10.2, Tableau, Oracle 11G, PL/SQL, DB2, Greenplum, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Tidal, JIRA, Rational Rose/Jazz, SharePoint, HP ALM

Confidential, NYC, NY

Informatica Data Quality Lead Developer

  • As the lead Data Quality developer, initiated data profiling across different formats and sources of data, analyzing its dimensions to determine its actual structure and the rules to implement as part of standardization.
  • Validated, standardized, and cleansed data in the course of implementing the business rules.
  • Data belonging to various members and providers was handled throughout development.
  • Extensively worked on Informatica IDE/IDQ
  • Involved in impact analysis and estimating development efforts for CRs
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging
  • Very strong in implementing data profiling and scorecards.
  • Created reference tables and documented data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency (a measurement sketch follows this list).
  • Skilled in analyzing scorecard trend charts to determine the thresholds to apply in further development.
  • Skilled in understanding and developing business rules for standardization, cleansing, and validation of data in various formats.
  • Very strong knowledge of Informatica Data Quality transformations like Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations
  • Worked on exception management and human task in IDQ
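
A minimal sketch of how the completeness and duplication dimensions noted above might be spot-checked outside IDQ, assuming sqlplus and an illustrative MEMBER table; in practice these metrics fed IDQ scorecards:

    #!/bin/sh
    # Illustrative only: completeness and duplicate counts for member_id.
    # COUNT(member_id) skips NULLs, so the ratio measures completeness.
    {
        echo "SELECT ROUND(100 * COUNT(member_id) / COUNT(*), 2) AS completeness_pct,"
        echo "       COUNT(member_id) - COUNT(DISTINCT member_id) AS duplicate_rows"
        echo "FROM member;"
    } | sqlplus -s dq_user/"$ORA_PWD"@ORCL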

Environment: Informatica Developer 10.1.1 HotFix 1, Informatica PowerCenter 10.12, Oracle 11G, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Tidal, JIRA, Rational Rose/Jazz, SharePoint

Confidential

Senior BI Analyst

  • Provided Architectural Road Map, direction and work packets for ETL needs.
  • Created detailed ETL standards documents for design, development, release management, and production support.
  • Designed detailed ETL specs for development and ensured quality ETL deliverables.
  • Created detailed ETL migration processes for the Informatica, database, scheduling, O/S, and H/W teams.
  • Designed and developed reusable common objects shared across various repositories.
  • Automated, redesigned, and tuned several ETL processes for optimal use of time and resources.
  • SAP Integration using ALE, IDOC, BAPI
  • Used XML as both source and target; in-depth knowledge of XML, XSD, and WSDL.
  • Worked in design, development and unit testing of the jobs
  • Responsible for overseeing the quality procedures related to the project.
  • Tuned Informatica Mappings and Sessions for optimum performance
  • Involved in code reviews and bug fixes during testing.
  • Scheduled, ran, and monitored jobs through the Tidal scheduler.
  • Applied data validation rules and raised exceptions through exception-handling mechanisms.
  • Validated data accuracy and integrity for partner's sales and delivery metrics
  • Applied reject reprocessing.
  • Involved in impact analysis and estimating development efforts for CRs
  • Using Informatica PowerCenter 10.1, pulled data from the Hadoop ecosystem (i.e., Hive tables) and processed it in the Developer tool with quality checks (see the sketch after this list).
  • Involved in setting up Hadoop configuration (Hadoop cluster, Hive connection) using Informatica BDM.
  • Designed and developed BDM mappings in Hive mode for large-volume data inserts/updates.
  • Proven understanding of Hadoop, Hive, Pig, and HBase.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
  • Involved in migrating mappings from IDQ to PowerCenter.
  • Applied rules and profiled source and target table data using IDQ.
  • Developed mappings, applying rules and transformation logic per source and target system requirements.
  • Worked on data quality transformations such as Match, Merge, and Key Generator.
  • Worked on exception management and Human Task in IDQ.
  • Extensive experience in the Analyst tool creating rules, exception management, scorecards, and profiles.
  • Scheduled/automated scorecards and profiles
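
A minimal sketch of the kind of pre-load quality gate referred to above, assuming the Hive command-line client; the database, table, and threshold are illustrative placeholders:

    #!/bin/sh
    # Illustrative only: block the downstream load if too many staged rows
    # arrive with a NULL business key.
    null_cnt=$(hive -S -e "SELECT COUNT(*) FROM dwh.stg_trades WHERE trade_id IS NULL;")
    if [ "$null_cnt" -gt 100 ]; then
        echo "Quality check failed: $null_cnt NULL trade_ids in dwh.stg_trades" >&2
        exit 1
    fi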

Environment: Informatica PowerCenter 10.2, Informatica BDM 10.1, Oracle 11G, Netezza, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Tidal, Rational Rose/Jazz, SharePoint

Confidential

Senior ETL Developer/IDQ Developer

  • Provided Architectural Road Map, direction and work packets for ETL needs.
  • Created detailed ETL standards documents for design, development, release management, and production support.
  • Designed detailed ETL specs for development and ensured quality ETL deliverables.
  • Created detailed ETL migration processes for the Informatica, database, scheduling, O/S, and H/W teams.
  • Designed and developed reusable common objects shared across various repositories.
  • Automated, redesigned, and tuned several ETL processes for optimal use of time and resources.
  • Identified and investigated sensitive fields in each source file to determine which fields to anonymize.
  • Identified the best target solution for the entire anonymization process, to be incorporated into the current development process (a masking sketch follows this list).
  • Re-engineered Informatica mappings affected by anonymization.
  • Tested the changes and prepared for production implementation.
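
A minimal sketch of the field-level anonymization described above, assuming pipe-delimited source files; the field position and masking rule are illustrative placeholders:

    #!/bin/sh
    # Illustrative only: mask the third field (e.g., an account number) of a
    # pipe-delimited feed, keeping only the last four characters visible.
    awk -F'|' 'BEGIN { OFS = FS }
    {
        $3 = "XXXX" substr($3, length($3) - 3)
        print
    }' source_feed.dat > source_feed_anon.dat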

Overall Technical Responsibilities:

  • Designing Extraction, Transformation and Load strategies for Trades, Sensitivities, Collaterals, Positions, Securities, Issuers, Ratings, Counterparty and Reference data.
  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets
  • Developed rules and mapplets that are commonly used in different mappings
  • Used various transformations such as Address Validator, Parser, Joiner, Filter, and Match to develop the mappings.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
  • Involved in migrating mappings from IDQ to PowerCenter.
  • Applied rules and profiled source and target table data using IDQ.
  • Developed mappings, applying rules and transformation logic per source and target system requirements.
  • Worked on different environments with different source and target databases, including Teradata, DB2, and SQL Server.
  • Developed mappings to process multiple flat files as sources and staged the data into Teradata and DB2 databases.
  • Responsible for migrating work from the development environment to the testing environment.
  • Responsible for resolving testing issues.
  • Involved in building and maintaining the Data Dictionary for Trades, Sensitivities, Collaterals, Positions, Securities, Issuers, Ratings, Counterparty and Reference data.
  • Involved in impact analysis and estimating development efforts for CRs
  • Tuning the huge data loads for optimal performance.
  • Worked in design, development and unit testing of the jobs
  • Responsible for overseeing the quality procedures related to project
  • Tuned Informatica Mappings and Sessions for optimum performance
  • Involved in code reviews and bug fixes during testing.
  • Scheduled, ran, and monitored jobs through the Dollar Universe scheduler.
  • Used mapplets and reusable transformations to prevent redundant transformation usage and to aid maintainability.
  • Prepared SQL queries for testing mappings and balancing data.
  • Created test case scenarios for data verification and provided the necessary input to the QA team for validating the data
  • Created deployment groups, migrated the code into different environments
  • Created functional specifications for the reporting module and was appreciated for a quick turnaround.
  • Involved in user training on reporting related to business processes and responsible for implementing it.
  • Analyzed failed jobs and fixed issues for smoother production runs.
  • Resolved feed issues in a timely manner to avoid affecting processes and the business; monitored feeds regularly thereafter and kept RMM updated on each issue.
  • Tracked every action in Remedy and assigned it to the appropriate team/person for timely resolution; updated Remedy tickets at each step taken to resolve the issue.
  • Developed a script to notify on failure of a particular feed with its component sizes and record counts; the script runs recurrently in the background and emails the details (a simplified sketch follows this list).
  • Developed a script to check whether crontab entries for critical processes (EUR COB, NA COB, PAC COB, etc.) are commented out ahead of their scheduled time, avoiding delays to process starts.
  • Developed a script to check the FeedGen source entries of a particular feed and the arrival of the feed's components.
  • Developed a script to report a feed's current status and load duration and to encrypt passwords.
  • Identified and escalated performance issues for particular feeds, covering component arrivals, process times, PowerMart processing, and load times, to higher levels for resolution.
  • Updated users with detailed information and solutions within the agreed time frame.
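
A simplified sketch of the feed-failure notification script described above, assuming mailx is available; the path, feed name, and recipient are illustrative placeholders:

    #!/bin/sh
    # Illustrative only: mail an alert if today's feed file is missing or
    # empty; otherwise report its size and record count.
    FEED=/data/feeds/incoming/trades_$(date +%Y%m%d).dat
    TO="etl.support@example.com"
    if [ ! -s "$FEED" ]; then
        echo "Feed $FEED missing or empty at $(date)" | \
            mailx -s "FEED FAILURE: $(basename "$FEED")" "$TO"
    else
        size_kb=$(du -k "$FEED" | cut -f1)
        recs=$(wc -l < "$FEED")
        echo "Feed $FEED arrived: ${size_kb} KB, ${recs} records" | \
            mailx -s "FEED OK: $(basename "$FEED")" "$TO"
    fi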

Customer Relationship

  • Provided timely feedback and maintained professional communication with all stakeholders of the Bank.
  • Worked with business users to clarify requirements and translate the requirement into technical specifications. Involved in business analysis and technical design sessions with business and technical staff to develop requirements document, and ETL specifications.

Environment: Informatica PowerCenter 9.6, IDQ 9.5.1, CDC, Oracle 11G, Exadata, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Centre, Remedy, SharePoint

Confidential

ETL System Analyst

  • Interacted directly with end users such as traders, business users, and other support teams to coordinate issue resolution.
  • Proactively monitored applications to keep systems up and to identify incidents before users reported them (a health-check sketch follows this list).
  • Responsible for diagnosing and remediating issues and reporting back to the end user.
  • Ensured all critical reports were sent to business users and downstream systems per SLA.
  • Resolved batch issues through initial investigation and coordination with development teams.
  • Liaised with development teams on bugs, fixes, and new features to be implemented in the applications.
  • Suggested improvements to enhance application supportability and improved the documentation around common support tasks.
  • Investigated data issues such as wrong feed deliveries, missing information in data files, and corrupted data flowing downstream, and communicated delays to business users.
  • Ensured issues were resolved within the SLA period and KPIs were met without exception.
  • Investigated incident root causes and applied remedial solutions within the agreed time period.
  • Promptly communicated issue resolutions to users and obtained their confirmation.
  • Performed health checks of the trading applications and took a proactive approach to resolving incidents before the business reported them, ensuring most incidents were resolved early and the business day ran smoothly.
  • Applied proper workarounds for known issues to improve incident resolution times and kept the business regularly informed of incident status.
  • Maintenance, support and enhancement of new and existing scripts as part of the change/release management process.
  • Experienced in intensive use of AutoSys and HP OpenView.
  • Enhanced reports based on user requests.
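
A minimal sketch of the kind of proactive health check referred to above; the process name, mount point, and recipient are illustrative placeholders, not details of the actual applications:

    #!/bin/sh
    # Illustrative only: pre-open health check that alerts when a critical
    # process is down or the feed landing area is nearly full.
    TO="app.support@example.com"
    if ! pgrep -f "pmserver" > /dev/null; then
        echo "pmserver not running on $(hostname)" | \
            mailx -s "HEALTHCHECK: pmserver down" "$TO"
    fi
    used=$(df -P /data/feeds | awk 'NR == 2 { sub(/%/, "", $5); print $5 }')
    if [ "$used" -gt 90 ]; then
        echo "/data/feeds at ${used}% capacity on $(hostname)" | \
            mailx -s "HEALTHCHECK: disk space" "$TO"
    fi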

Environment: Informatica PowerCenter 9.6, IDQ 9.5.1, Oracle 11G, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Centre, Remedy, SharePoint

Confidential

Sr. System Engineer

Responsibilities:

  • Designer and developer for the Expense, Workforce, and Underwriting (U/W) modules.
  • Created Business Rules, Calculation Scripts
  • Created web forms and standard reports using Smart View.
  • Created Rules file to load data into Essbase
  • Created tables in Oracle SQL developer to facilitate data load into Essbase Staging
  • Created required Hierarchies in the respective cubes using Outline Load Utility.
  • Migrated artifacts such as web forms and Smart Lists from one environment to another using Shared Services.
  • Created MaxL scripts to schedule calc scripts to run in order for Essbase data loading (a wrapper sketch follows this list).
  • Assigned access to users and groups.
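
A minimal sketch of how such MaxL scripts might be driven from the shell, assuming the Essbase MaxL shell (essmsh) is installed; the script path and credential variables are illustrative placeholders:

    #!/bin/sh
    # Illustrative only: run a nightly MaxL script (which loads data and then
    # executes the calc scripts in order), passing credentials as arguments
    # that the MaxL script reads as $1 and $2.
    essmsh /scripts/nightly_load.mxl "$ESS_USER" "$ESS_PWD" || {
        echo "Essbase nightly load failed" | \
            mailx -s "ESSBASE LOAD FAILURE" epm.support@example.com
        exit 1
    }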

Environment: Informatica PowerCenter 7, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Center, Remedy, SharePoint

Confidential

Sr. System Engineer

Responsibilities:

  • Created new dimension build and data load rules for new ASO and BSO cubes.
  • Resolved DTD (Drill to Detail) issues in the Excel Spreadsheet Add-in.
  • Provided timely support for data load/dimension build batch failures, identified the root cause, and eliminated it so it would not recur.
  • Created user accounts and granted access to users.
  • Created filters and assigned them to groups and users.
  • Wrote and ran PL/SQL procedures in Oracle to load data into data sources.
  • Ran and enhanced UNIX scripts used in the cube-loading process.
  • Resolved P1-priority data integrity issues, as our application's cube carried P1 priority.
  • Generated reports using the Essbase Excel Add-in and Business Objects 6.5 and supported any issues that arose.
  • Handled Remedy tickets raised by the EA business community and resolved them within the defined SLAs.

Environment: Informatica PowerCenter 7, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Centre, Remedy, SharePoint

Confidential

Software Engineer

Responsibilities:

  • Created new dimension build and data load rules for new ASO and BSO cubes.
  • Analyzed the data loaded into the data warehouse daily through files provided by 16 data providers.
  • Built Informatica mappings to extract data from XML sources and load it into staging tables.
  • Analyzed the star schema models to understand data in the data warehouse. Developed mappings to implement slowly changing dimensions and built code with reliable error/exception handling and rollback framework.
  • Migrated workflows, sessions, mappings and database scripts from Development environment to production at the time of elevation.
  • Used version control to check-in and check-out the code that is being migrated.
  • Extensively involved in production support, data testing, and day-to-day operation of the data warehouse, including deploying new code and loading history data.
  • Supported and monitored the Oracle databases and coordinated with the Oracle DBA on any database-related issues.
  • Extensively used PL/SQL code while loading data and updating data in the warehouse.
  • Developed pre-source UNIX shell scripts to format data files coming from the mainframe and convert them into CSV files for processing through Informatica (see the sketch after this list).
  • Developed shell scripts to automate daily parameter file updates.
  • As a production support person coordinated file transfer process with Amex.
  • Responded to help desk calls from customers and coordinated solutions with the rest of the developer team; handled process failures, quickly identified root causes, and resolved issues.
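
A minimal sketch of the pre-source conversion described above, assuming a fixed-width mainframe extract; the field positions are illustrative placeholders:

    #!/bin/sh
    # Illustrative only: convert a fixed-width mainframe extract
    # (account 1-10, name 11-40, amount 41-52) into CSV for Informatica.
    awk '{
        acct   = substr($0, 1, 10)
        name   = substr($0, 11, 30)
        amount = substr($0, 41, 12)
        gsub(/ +$/, "", acct); gsub(/ +$/, "", name); gsub(/ +$/, "", amount)
        printf "%s,%s,%s\n", acct, name, amount
    }' mainframe_extract.dat > mainframe_extract.csv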

Environment: Informatica PowerCenter 6, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Centre, Remedy, SharePoint.

Confidential

Sr. System Engineer

Responsibilities:

  • Created new dimension build and data load rules for new ASO and BSO cubes.
  • Maintained, monitored, and supported dimension builds, data loads, and reporting.
  • Monitored daily job runs from $Universe (scheduling tool) in the UNIX environment.
  • Solved Priority 1 business cases and maintained and supported production on a full-cycle basis.
  • Involved in critical, high-priority, time-bound issues, providing detailed root causes and resolutions to business users and documenting issues and solutions for future reference.
  • Provided batch job support (24x7), including month-end, quarter-end, and year-end support, to make data available for reporting at the earliest.
  • Provided production and non-production support and coordinated with DBAs.
  • Published reports to users (monthly/quarterly/yearly) or on demand from business users.
  • Responsible for maintenance of proper quality standards in deliverables, version control, defect tracker, error log etc., and Major Business and Operations Support.
  • Ensured that the whole team is in sync with the Customer Advocacy Finance Business policy.
  • Maintaining robust security access to the applications as defined by the Business users.

Environment: Informatica PowerCenter 6, Teradata, Hyperion Essbase, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), VISIO, UNIX/LINUX, Shell Scripting, Autosys/Informatica Scheduler, HP Quality Centre, Remedy, SharePoint
