Informatica Developer/IDQ Developer Resume

SUMMARY

  • 6.5 years of progressive, hands-on experience in analysis, ETL, and data quality processes, and in the design, development, coding, testing, and integration of enterprise-level data warehouse architectures.
  • Good experience working with Informatica PowerCenter and IDQ high-availability (HA) environments on Grid.
  • Good experience configuring the Model Repository Service, Data Integration Service, Analyst Service, Content Management Service, Search Service, Scheduler Service, and Email Service on the IDQ domain.
  • Experience with security administration, such as creating native users, native groups, and custom roles, and then applying the roles to the native groups.
  • Good experience creating different types of source/target connections in the Informatica domain, including Oracle, DB2, SQL Server, HDFS, Hive, Teradata, and file connections.
  • Strong experience with Address Doctor configuration: troubleshooting issues, adding different types of licenses, and updating the address reference files monthly from DQC.
  • Good experience upgrading the Address Doctor engine version and configuring the AD50.cfg file for PowerCenter environments.
  • Experience automating code migration, repository backups, and health-check processes (see the shell sketch after this summary).
  • Good experience with the change management process for production implementations.
  • Troubleshooting existing ETL, Data Quality, and BDM processes for product issues and working with the product vendor's R&D team on resolution steps.
  • Establishes, enhances, implements, enforces, and maintains ETL, Data Quality, and data integration best practices.
  • Experience in data extraction, column profiling, rule profiling, mid-stream profiling, join analysis profiling, data domains, domain discovery, scorecards, data lineage, data cleansing, data standardization, match and merge, and data de-duplication using Informatica Data Quality 10.2 HF2.
  • Good experience with rule specifications, mapping specifications, virtual data objects, mapplets, rules, and reference data management.
  • Experience deploying mapplets as services and using them in downstream processes.
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions and incremental fact loads using control tables, mapping-level variables, and parameter files.
  • Extensively worked with complex mappings using transformations such as Expression, Lookup, Filter, Router, Union, Aggregator, Joiner, Update Strategy, Sequence Generator, Java Transformation, Labeler, Match, Merge, Decision, Exception, Key Generator, Standardizer, Case Converter, Consolidation, Parser, and Address Validator, as well as reusable transformations and user-defined functions.
  • Good experience with HDFS, Hive, and Sqoop connectors.
  • Good experience automating IDQ and BDM profiles, scorecards, mappings, and workflows.
  • Good experience with the Autosys scheduler, creating box and command jobs through both the GUI and JIL (see the JIL example after this summary).
  • Good working experience in the onshore/offshore delivery model.
  • Experience providing production on-call support for complex highly available databases: checking logs, debugging, and providing workarounds to support the entire system for daily, weekly, and monthly loads.
  • Identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
  • Excellent analytical, problem-solving, technical, project management, training, and presentation skills.
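
The automation bullet above refers to the kind of shell wrapper sketched below. This is a minimal, hypothetical example assuming a standard PowerCenter/IDQ 10.x install with pmrep and infacmd.sh on the PATH; all domain, repository, user, and path names are placeholders, and option spellings should be confirmed against the installed version's command-line help.

#!/bin/sh
# Hypothetical nightly repository backup and domain health check.
# DEV_DOMAIN, DEV_REPO, admin_user and the paths are placeholders.
STAMP=$(date +%Y%m%d)
BACKUP_DIR=/infa/backups

# Take a PowerCenter repository backup
pmrep connect -r DEV_REPO -d DEV_DOMAIN -n admin_user -x "$PMREP_PASS"
pmrep backup -o "$BACKUP_DIR/DEV_REPO_$STAMP.rep" -f

# Basic health check: ping the domain and capture the list of services
# (exact infacmd options can differ between releases)
infacmd.sh ping -dn DEV_DOMAIN
infacmd.sh isp listServices -dn DEV_DOMAIN -un admin_user -pd "$INFA_PASS" \
    > "$BACKUP_DIR/services_$STAMP.log"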
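
The Autosys bullet above is illustrated by this second sketch: a box job with one command job defined in JIL and loaded through the jil utility. The job names, machine, owner, and script paths are invented for illustration only.

#!/bin/sh
# Load a sample box + command job definition into Autosys via the jil utility.
# All names and paths are illustrative placeholders.
jil << 'EOF'
insert_job: DQ_DAILY_BOX
job_type: b
owner: infa_batch
description: "Daily IDQ profile and scorecard box"

insert_job: DQ_PROFILE_RUN
job_type: c
box_name: DQ_DAILY_BOX
command: /apps/infa/scripts/run_profiles.sh
machine: etl_node01
owner: infa_batch
std_out_file: /apps/infa/logs/dq_profile_run.out
std_err_file: /apps/infa/logs/dq_profile_run.err
EOF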

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.1.1/10.2 HF2

DQ /Governance Tools: Informatica Data Quality (IDQ) 10.1.1 HF1/10.2 HF1/10.2 HF2

Big Data Tools: Informatica Big Data Management (BDM) 10.2

Databases: Oracle 12c/11g, Teradata 15.1, DB2 UDB 8.1, MS SQL Server 2012/2014, Greenplum, and MySQL

Operating Systems: UNIX, AIX, and Linux

Programming: SQL, UNIX Shell Scripting

Scheduling Tools: Autosys (JIL & GUI) and Control-M

Software Development Methodology: Agile and Waterfall.

SDLC Tools: JIRA and Rally

Big Data Hadoop: HDFS, Hive, Pig, Sqoop

Versioning tools: Tortoise SVN

Code Deployment/DevOps Tools: UCD, Jenkins and App Picker

Domain Expertise: Finance/Banking, Healthcare/Insurance, Sales, and Mortgage

PROFESSIONAL EXPERIENCE

Confidential

Informatica Developer/IDQ Developer

Responsibilities:

  • Worked with Business Analysts (BAs) to analyze data quality issues, find the root cause of each problem, and identify the proper solution to fix it.
  • Performed data profiling and created scorecards to analyze the data against different measures, and automated the profile and scorecard runs with UNIX scripting (see the sketch after this list).
  • Created mappings, workflows, mapping specifications, rule specifications, mapplets, rules, reference data, LDOs, CDOs, VDOs, and applications in IDQ.
  • Performed IDQ code deployments.
  • Created the bad-record and duplicate-record exception processes and configured Human Tasks to automate exception handling.
  • Worked with different Data Governance teams on IDQ code promotion across all environments.
  • Applied Data Governance principles and metadata management practices.
  • Created UNIX scripts for the profile and scorecard automation process.
  • Exposed IDQ address validation mapplets as web services and tested them through SOAP calls.
  • Created different types of database connections on the IDQ and BDM domains.
  • Performed performance tuning of LDOs, profiles, and scorecards.
  • Created change requests per the change control process to promote Data Quality objects to higher environments.
  • Built the automation process for code migration using UCD, Jenkins, and App Picker.
  • Scheduled jobs and monitored them through the CA scheduler (Autosys).
  • Used SQL tools such as TOAD to run SQL queries and validate the data in the warehouse.
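
A hedged sketch of the profile/scorecard automation mentioned in the second bullet above. The infacmd ps plugin and the option names shown here (-dsn, -msn, -opn) are assumptions based on a typical 10.x environment and should be verified with the version-specific infacmd help; all service, project, and object names are placeholders.

#!/bin/sh
# Illustrative wrapper for kicking off an IDQ profile run from UNIX.
# Everything below is a placeholder; confirm plugin and option names with
# "infacmd.sh ps help" on the target installation.
DOMAIN=DEV_DOMAIN
DIS=DIS_DEV          # Data Integration Service name
MRS=MRS_DEV          # Model Repository Service name
PROFILE="DQ_Project/Profiles/CUSTOMER_ADDRESS_PROFILE"

if ! infacmd.sh ps executeProfile \
    -dn "$DOMAIN" -un infa_user -pd "$INFA_PASS" \
    -dsn "$DIS" -msn "$MRS" -opn "$PROFILE"; then
  echo "Profile run failed for $PROFILE" >&2
  exit 1
fi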

Environment: Informatica PowerCenter 10.x, IDQ 10.x, MDM, UNIX, Shell, PL/SQL, Erwin, Tidal, Autosys, Oracle 11g/10g, Java, Teradata.

Confidential

Informatica Data Quality Developer

Responsibilities:

  • Worked with Business Analysts (BAs) to analyze data quality issues, find the root cause of each problem, and identify the proper solution to fix it.
  • Performed data profiling and created scorecards to analyze the data against different measures, and automated the profile and scorecard runs with UNIX scripting.
  • Created mappings, workflows, mapping specifications, rule specifications, mapplets, rules, reference data, LDOs, CDOs, VDOs, and applications in IDQ.
  • Performed IDQ code deployments.
  • Created the bad-record and duplicate-record exception processes and configured Human Tasks to automate exception handling.
  • Worked with different Data Governance teams on IDQ code promotion across all environments.
  • Applied Data Governance principles and metadata management practices.
  • Created UNIX scripts for the profile and scorecard automation process.
  • Created different types of database connections on the IDQ and BDM domains.
  • Performed performance tuning of LDOs, profiles, and scorecards.
  • Created change requests per the change control process to promote Data Quality objects to higher environments (see the promotion sketch after this list).
  • Scheduled jobs and monitored them through the CA scheduler (Autosys).
  • Used SQL tools such as TOAD to run SQL queries and validate the data in the warehouse.
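
The change-request bullet above refers to promoting objects between Model repositories; the sketch below shows one hedged way to script it with the infacmd oie export/import commands. The plugin name and options (-rs, -pn, -fp) are assumptions to confirm against the installed version, and the domains, services, and project name are placeholders.

#!/bin/sh
# Export an IDQ project from the DEV Model repository and import it into QA.
# All names are illustrative; verify syntax with "infacmd.sh oie help".
EXPORT_FILE=/tmp/DQ_Project_$(date +%Y%m%d).xml

# Export from DEV
infacmd.sh oie exportObjects -dn DEV_DOMAIN -un infa_user -pd "$DEV_PASS" \
    -rs MRS_DEV -pn DQ_Project -fp "$EXPORT_FILE"

# Import into QA under the approved change request
infacmd.sh oie importObjects -dn QA_DOMAIN -un infa_user -pd "$QA_PASS" \
    -rs MRS_QA -fp "$EXPORT_FILE"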

Environment: Informatica PowerCenter 10.x, Informatica Data Quality 10.x, Erwin, Teradata, Autosys, SQL Assistant, DB2, XML, Oracle 11g, MQ Series, TOAD, and UNIX shell scripts.

Confidential

Developer / Data Quality Analyst

Responsibilities:

  • Worked with Business Analysts (BAs) to analyze data quality issues, find the root cause of each problem, and identify the proper solution to fix it.
  • Created the Address Validation and Data Processor mapplets and integrated them with PowerCenter.
  • Performed data profiling and created scorecards to analyze the data against different measures, and automated the profile and scorecard runs with UNIX scripting.
  • Created mappings, workflows, mapping specifications, rule specifications, mapplets, rules, reference data, LDOs, CDOs, and applications in IDQ.
  • Performed IDQ code deployment and code integration with Informatica PowerCenter.
  • Managed privileges, roles, groups, and users in the IDQ Administrator console.
  • Created connections in the IDQ Administrator console.
  • Created the bad-record and duplicate-record exception processes and configured Human Tasks to automate exception handling.
  • Created Informatica PowerCenter mappings, sessions, and workflows and automated the runs (see the pmcmd sketch after this list).
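
A minimal sketch of the automated PowerCenter workflow run noted in the last bullet, using the standard pmcmd startworkflow call; the Integration Service, domain, folder, and workflow names are placeholders.

#!/bin/sh
# Start a PowerCenter workflow and fail the wrapper if the workflow fails.
# INT_SVC_DEV, DEV_DOMAIN, DQ_FOLDER, and the workflow name are placeholders.
pmcmd startworkflow \
    -sv INT_SVC_DEV -d DEV_DOMAIN \
    -u infa_user -p "$PM_PASS" \
    -f DQ_FOLDER -wait wf_load_customer_dim
RC=$?
if [ "$RC" -ne 0 ]; then
  echo "Workflow wf_load_customer_dim failed with return code $RC" >&2
  exit "$RC"
fi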

Environment: Informatica PowerCenter 10.x, Informatica Data Quality 10.x, Erwin, Teradata, Autosys, SQL Assistant, DB2, XML, Oracle 11g, MQ Series, TOAD, and UNIX shell scripts.
