
Sr Informatica Data Quality Developer/Analyst Resume


Charlotte, NC

SUMMARY

  • Around 7 years of strong data warehousing experience using Informatica, with strong experience in designing, testing, deploying, and documenting data quality procedures and their outputs for Data Quality and Data Analysis using Informatica IDQ/IDE 9.5/9.6/10.1/10.2/10.4.
  • Knowledge of master data management, which plays a key role in delivering trusted data to key business initiatives quickly.
  • Strong business understanding and knowledge of Oil & Gas, Insurance, and Financial projects.
  • Expert in all phases of the software development life cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, and post-implementation support and maintenance.
  • Used IDQ (Informatica Data Quality) to extract, transform and load (ETL) rules and source-to-target mappings to derive additional business rules for data quality checks.
  • Experienced in working with Metadata Manager.
  • Used various Data Quality transformations such as Standardizer, Parser, and Match in the mappings.
  • Performed metadata import, column profiling, dependency profiling, redundancy profiling (cross-table analysis), and orphan analysis.
  • Managed exception tables; created reference tables and maintained them by uploading data on a regular basis from the inputs of the Source Record Team.
  • Experience in building logical data objects as the foundation for rules.
  • Experience in Building the Data Services.
  • Performed administrator activities in the Admin Console, such as changing application properties (e.g., enabling caching for logical data objects and enabling indexing), redeploying applications, and starting and stopping applications.
  • Monitored virtual table performance in the Admin Console.
  • Built custom profiles using IDQ to analyze data and run scorecards.
  • Performed HANA remediation tasks.
  • Experience in validating dashboard results for each business unit.
  • Experience with SQL editors such as TOAD.
  • Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
  • Good team player with leadership abilities and excellent communication skills.

TECHNICAL SKILLS

Languages: C, C++, Java.

Design Tools: Erwin 4.0/3.5, Oracle Designer/2000, ER Studio

Databases: Oracle 12c/11g/10g/9i/8i; Microsoft Access, SQL Server 2005/2008; MySQL

Query Tools: TOAD, SQL Developer, SQL*Plus

ETL Tools: Informatica Data Quality 10.4/10.2/10.1/9.6/9.5

Operating Systems: Microsoft Windows 7/Vista/XP/2000/NT 4.0, OS/2; UNIX (Sun Solaris, HP-UX)

Others: Collibra, Jira, SharePoint, MS Office, ServiceNow, PuTTY, Tidal, Power BI, Spotfire

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Sr Informatica Data Quality Developer/Analyst

Responsibilities:

  • Interacted with subject matter experts and the data management team to gather information about the business rules.
  • Worked with data stewards to define the terms associated with the rules defined in the Collibra data governance tool.
  • Created logical data objects that serve as the source for profiles and scorecards.
  • Built business rules as mapplets using various passive transformations.
  • Followed best-practice methods for rule development.
  • Created profiles and scorecards.
  • Created various financial checkouts in IDQ to match and validate various amounts per the business, and created scorecards to analyze the current process.
  • Exported the profile results to data stewards to analyze the data.
  • Added multiple rules to a single profile.
  • Collaborated with data stewards to identify critical data elements and build business rules on different dimensions of data quality such as completeness, conformance, and accuracy.
  • Created new business rules where necessary and reused existing ones to reduce development time and overhead.
  • Performed code reviews on other developers' work before objects went for release.
  • Developed the mappings with business rules and loaded the data into tables.
  • Created metrics and added them to scorecards where users can drilldown on live data.
  • Performed performance tuning on existing mappings.
  • Created workflows and applications.
  • Configured the scorecards in analyst tool.
  • Monitored the daily jobs in production and tuned them when problems persisted.
  • Guided other developers and data stewards when they had problems using the Analyst tool.
  • Parameterized the filter conditions at the mapping level.
  • Created reference tables and used them as lookup in creating business rules.
  • Created custom queries to improve performance (see the SQL sketch after this list).
  • Created physical data objects on files and relational tables.
  • Used different active and passive transformations as the requirements demanded.
  • Monitored the logs at the UNIX level and in the Admin Console.
  • Redeployed the applications from the Admin Console.
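
A minimal sketch of the kind of custom query referenced above, computing the sort of completeness/conformance counts a profile or scorecard metric reports; the table customer_master and column tax_id are hypothetical placeholders, not the client's actual schema:

    -- Completeness and conformance counts for one critical data element.
    -- customer_master and tax_id are hypothetical names for illustration.
    SELECT COUNT(*)                                    AS total_records,
           COUNT(tax_id)                               AS populated_records,
           SUM(CASE WHEN REGEXP_LIKE(tax_id, '^[0-9]{9}$')
                    THEN 1 ELSE 0 END)                 AS conformant_records
      FROM customer_master;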

Environment: UNIX, Informatica Data Quality 10.4/10.2, IDQ/IDE, TOAD for Oracle, Oracle 12c/11g, SQL Server 2005/2008, PowerBuilder, Core FTP, SharePoint.

Confidential, Raleigh, NC

Informatica Data Quality Developer/Analyst

Responsibilities:

  • Interacted with subject matter experts and the data management team to gather information about the business rules for cleansing the source system data.
  • Created business terms and defined them in the glossary.
  • Created and configured data lineage in Metadata Manager.
  • Created profiles and scorecards in the analyst tool.
  • Created logical data objects based on different entities for each domain and reused them for building business rules and data services.
  • Used the Match transformation effectively to match data between two systems and helped SMEs fix the data at the source level based on the match results.
  • Optimized the performance of data services and other mappings by caching the LDO (logical data object) and using the cache instead of hitting the production tables more than once.
  • Created mapplets for common objects so that all domain users can reuse them, which in turn reduces development time.
  • Collaborated with the business users in gathering the requirements for rule building and exceptions.
  • Built almost 400 rules as mapplets and integrated them with a dashboard where business users can see failing and passing records in bar and chart views for each rule or field.
  • Helped business users use the dashboard so they could send reports to end users to begin cleansing their source system data.
  • Worked on different source systems such as SAP, TOBIN, QUORUM, WELLVIEW, TOW, TDM, PI, and IHS.
  • Worked with data stewards in implementing the exceptions based on the system timestamp.
  • Used the Router transformation very effectively: pulled the data once and passed it to multiple rules instead of pulling it again for each rule, then used a Union transformation to combine all rule results before loading the target.
  • Developed rules using passive transformations such as Expression, aggregating the results either before or after the mapplet based on the requirement.
  • Built reusable transformations such as Lookup and used them effectively in different mappings.
  • Used transformations such as Sorter, Filter, Aggregator, unconnected Lookup, connected Lookup, and Joiner based on the requirement.
  • Built simple and complex SQL queries for unit testing based on the requirement.
  • Built SQL scripts for inserting, updating, and deleting records.
  • Used a stored procedure in the mappings to truncate tables (see the PL/SQL sketch after this list).
  • Troubleshot the mappings at the source, transformation, and target levels.
  • Implemented the exceptions by collaborating with the business.
  • Built some high-priority business rules on different data quality dimensions such as completeness, conformance, and consistency, which helped the business save millions of dollars.
  • Evaluated the results at the dashboard level based on each business unit.
  • Created mappings for business users to track passing and failing records.
  • Followed best-practice methods for rule development.
  • Collaborated with the Tidal team in setting up dependencies on various mappings.
  • Migrated the applications and mappings from development to stage using the templates.
  • Monitored the jobs in the Admin Console for job failures.
  • Created various applications, added mappings to them, and redeployed them to the Integration Service.
  • Stopped each application in the Admin Console before redeployment and started it again afterward.
  • Implemented and governed the best practices related to enterprise metadata management standards.
  • Provided stakeholders with information on key fields, such as each field's owner, definition, and source.
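
A hypothetical sketch of the truncate stored procedure pattern referenced above; the procedure name and schema are illustrative, not the client's actual objects:

    -- TRUNCATE is DDL, so it must be issued via dynamic SQL inside PL/SQL.
    -- prc_truncate_stage is a hypothetical name for illustration.
    CREATE OR REPLACE PROCEDURE prc_truncate_stage (p_table_name IN VARCHAR2)
    AS
    BEGIN
        -- DBMS_ASSERT guards against injection through the table-name parameter.
        EXECUTE IMMEDIATE 'TRUNCATE TABLE '
            || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
    END prc_truncate_stage;
    /

Such a procedure can then be invoked from a mapping's pre-SQL step before the load begins.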

Environment: UNIX, Informatica Data Quality 9.5/9.6/10.2, IDQ/IDE, TOAD for Oracle, Oracle 12c/11g, SQL Server 2005/2008, PowerBuilder, Core FTP, SharePoint, Shell Scripting.

Confidential, Tampa, FL

Informatica Data Quality Developer/Analyst

Responsibilities:

  • Created profile reports using IDQ to identify the strengths and weaknesses of source data, and created scorecards.
  • Applied Informatica Data Quality standards, guidelines, and best practices across the various sources.
  • Developed mappings, workflows, logical data objects, profiles, and scorecards, and automated the entire process.
  • Analyzed the existing DQ landscape to provide key insights into the scope and magnitude of DQ issues and the supporting processes and procedures.
  • Involved in conceptual, logical, and physical data modeling; used a star schema in designing the data warehouse.
  • Identified key in-scope data sources, validating and defining critical data elements and business rules.
  • Optimized the performance of the Informatica PowerCenter mappings through various tests on sources, targets, and transformations; identified bottlenecks, removed them, and implemented performance-tuning logic on targets, sources, mappings, and sessions for maximum efficiency and performance.
  • Exported the valid and invalid documents to data stewards and business analysts so they could understand the data.
  • Applied the completeness, conformity, integrity, timeliness, and synchronization/consistency dimensions to create scorecards and trend charts.
  • Identified data quality issues, anomalies, and patterns based on defined business rules, and created metrics and scorecards through IDQ profiling.
  • Applied data analysis, data cleansing, data matching, exception handling, and reporting and monitoring capabilities in IDQ (Informatica Data Quality).
  • Worked closely with end users and business analysts to understand the business and develop the transformation logic used in Informatica Data Quality.
  • Created logical data objects in IDQ for profiling, with multiple tables joined based on the business requirement.
  • Exported IDQ mappings to PowerCenter to maximize performance on high-volume loading.
  • Responsible for identifying reusable logic to build several rules/mapplets in Informatica Data Quality to speed up the profile retrieval process.
  • Created various rules in IDQ to satisfy completeness, conformity, integrity, timeliness, and synchronization/consistency.
  • Extensively used IDQ profiling for database and flat-file data, applying different strategies to gain a detailed understanding of the data.
  • Applied all the created rules in IDQ profiles to produce scorecards for business users and data stewards to analyze.
  • Cleansed, standardized, labeled, and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
  • Created various financial checkouts in IDQ for matching and validating amounts as per the business, and created scorecards to analyze the current process.
  • Tuned the IDQ development process to push most of the processing down to the database and removed unwanted ports and transformations.
  • Worked extensively on address validation and name matching (standardization) tasks in IDQ.
  • Used IDQ profiling on the given business data, applying column profiles, primary key profiles, functional dependency profiles, foreign key profiles, and join profiles (a rough SQL equivalent of a column profile follows this list).
  • Created exceptions to highlight accepted and rejected records so data stewards could understand the exception data process.
  • Used Metadata Manager to show how data objects will be impacted by a proposed change before it is implemented.
  • Performed detailed impact analysis with Metadata Manager to show the results when changes are made to metadata used in mappings, sessions, workflows, sources, and targets.
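
As a rough illustration of the column profiling mentioned above, the following query approximates the statistics a column profile reports (IDQ computes these natively); src_customer and email are hypothetical names used only for illustration:

    -- Approximate column-profile statistics for a single column.
    -- src_customer and email are hypothetical names for illustration.
    SELECT COUNT(*)                AS total_rows,
           COUNT(email)            AS non_null_rows,
           COUNT(DISTINCT email)   AS distinct_values,
           MIN(LENGTH(email))      AS min_length,
           MAX(LENGTH(email))      AS max_length
      FROM src_customer;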

Environment: UNIX, Informatica 9.5.0, IDQ/IDE, TOAD for Oracle, Oracle 10g/11g, Netezza, SQL Server 2005/2008, Erwin 4.0, DataFlux, PL/SQL, OBIEE, PowerBuilder, Core FTP, Sun Solaris 8.0, Shell Scripting.

Confidential

ETL Developer

Responsibilities:

  • Extensively worked on Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Designer, Mapplet Designer, and Mapping Designer.
  • Extensively used all the transformations, such as Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, and Sequence Generator, and used transformation language features such as expressions, constants, system variables, and date format strings.
  • Involved in running the loads to the data warehouse and data mart involving different environments.
  • Extensively worked with Workflow Manager and Workflow Monitor to create, schedule, and monitor workflows, sessions, and tasks.
  • Configured incremental aggregation to improve the performance of data loading.
  • Extensively worked with SCD Type-I, Type-II, and Type-III dimensions and data warehouse change data capture (a Type-II SQL sketch follows this list).
  • Involved in unit testing and resolution of various bottlenecks that came across.
  • Prepared technical design documents and test cases.
  • Built mapplets using Expression transformations for reusability.
  • Used pre-SQL and post-SQL at the source and target levels.
  • Involved in troubleshooting and performance tuning of data loading process.
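
As an illustration of the SCD Type-II handling noted above, here is a minimal SQL sketch of the expire-and-insert pattern; in Informatica this is typically implemented with Lookup and Update Strategy transformations rather than hand-written SQL, and dim_customer, stg_customer, and their columns are hypothetical names:

    -- Expire the current row for any customer whose tracked attribute changed.
    UPDATE dim_customer d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert a new current row for changed and brand-new customers.
    INSERT INTO dim_customer
        (customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT s.customer_id, s.address, SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d2
                        WHERE d2.customer_id = s.customer_id
                          AND d2.current_flag = 'Y');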

Environment: Informatica PowerCenter 9.1, Oracle, UNIX, SQL*Plus, PL/SQL.
