
IDQ Developer/PowerCenter Resume


San Francisco, CA

SUMMARY:

  • 11+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing and Data Integration Solutions using Informatica Data Quality.
  • Compiled statistics from business specification documents, BRS documents, and configuration documents.
  • Understood business rules based on high-level specification documents and implemented the corresponding data transformation methodologies.
  • Designed and developed referential integrity, technical, and business data quality rules using IDQ, and cleansed data in Informatica Data Quality 10.2/9.6.1 environments.
  • Deployed workflows as an application to run them and tuned the mappings for better performance.
  • Extensively worked on IDQ Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and the Repository.
  • Experienced in profiling, analyzing, standardizing, cleansing, integrating, scorecarding, and managing reference data from various source systems using the Informatica Data Quality (IDQ) toolkit.
  • Experience in Creating Expression Rules, Creating Reference Tables from Profile Columns and Creating and Running Scorecards from the Profile Results.
  • Worked extensively with complex mappings using different transformations such as Source Qualifiers, Expressions, Filters, Joiners, Routers, Unions, Unconnected/Connected Lookups, Aggregators, Stored Procedures, and Normalizers.
  • Quick understanding of Relational Source Database Systems and data models for building accurate transformation logic that can be used in Data Migration and Data Integration.
  • Extensive knowledge of Master Data Management with emphasis on Title Data, Customer, and Product Data management and Data Governance.
  • Created landing, base and staging tables according to the data model and number of source systems.
  • Designed, developed, and tested all Hub components, including the data model, cleanse rules, trust settings, roles and data security, and batch processes.
  • Created match paths, match rules, match rule sets, system trust scores, and trust score decay along with validation rules.
  • Worked on end-to-end implementation of an automated data integrity profiling, monitoring, and reporting tool.
  • Experience with SQL*Plus, TOAD, and PL/SQL Developer as database interfaces to analyze, view, and alter data.
  • Expert in all phases of the software development life cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, post-implementation support, and maintenance.
  • Extensively worked on different flavors of UNIX and Windows operating systems, with good experience in UNIX shell scripting.
  • Worked closely with testing teams in performing unit testing, user acceptance testing, and system integration testing.

TECHNICAL SKILLS:

RDBMS: Oracle 11g/10g/9i, SQL Server 2012/2010/2008 R2/2008, MS Access 9.0/7.0, Teradata 12/13, MySQL.

ETL Tools: Informatica 10.2/9.6.1/9.5.1/9.5/9.1/9.0.1/8.6/7.1 products (PowerCenter, MDM 10.2/10.1, Data Explorer, Data Analyst, Data Transformation Studio/Informatica Developer, Workbench, Metadata Manager, and Informatica Lifecycle Management), Proactive Monitoring, SSIS.

BI/OLAP Tools: Cognos 9.0/8.0, Business Objects XI, Hyperion, Tableau.

Operating Systems: Windows 2008/XP/Vista/7.

Data Modeling: Logical/Physical Data Modeling, IBM Rational Rose, Microsoft Visio 2004, 2007, 2010.

Programming Languages: SQL, PL/SQL, C, C++, .NET, C#.

Methodologies: Data Profiling, Data Quality, Data Services, Change-Data-Capture (CDC), Push Down Optimization (PDO).

Version Control Tools: Informatica, Tortoise SVN, Visual SourceSafe, and WinCVS.

Schedulers: Control M, Autosys.

Other Tools: Toad, Oracle SQL Developer, Embarcadero Rapid SQL, Putty, MS Project Planner, Citrix Environments.

PROFESSIONAL EXPERIENCE:

Confidential, San Francisco

IDQ Developer/Power Center

Roles & Responsibilities:

  • Worked with business analysts to analyze data quality issues, find the root cause of each problem, and identify the proper solution to fix it.
  • Documented the resolution process for data quality issues, covering analysis, design, construction, and testing.
  • Worked as Data analyst to analyze the source systems data.
  • Worked with the Informatica Data Quality 10.2 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and data quality reporting and monitoring.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Generated and applied rules to profile flat-file and relational data by creating rules in IDE, and cleansed, parsed, and standardized data through IDQ mappings generated as mapplets in PowerCenter.
  • Checked data accuracy standards for nulls, comparisons, key generation, and labeling using the Reference Table Manager in the profiling warehouse, and applied the data cleansing process based on statistics against threshold values and scorecards.
  • Proficient in creating column- and rule-based profiling with IDQ and logical data objects (LDOs) for deep dives into master tables.
  • Experienced in creating scorecards to project the different checks, such as consistency, completeness, accuracy, and conformity.
  • Designed IDQ mappings that are used as mapplets in PowerCenter.
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Worked on Mapplets and created parameter files wherever necessary to facilitate reusability.
  • Extensively used Informatica Functions LTRIM, RTRIM, DECODE, ISNULL, IIF, INSTR and date functions in Transformations.
  • Performed code migration of mappings and workflows from development to test and production servers through deployment groups across the DEV, TEST, and PROD repositories, retaining shortcuts and dependencies with versioning.
  • Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
  • Wrote queries in TOAD to verify the target data; a sketch of typical verification queries follows this list.
  • Worked with UNIX commands and used UNIX shell scripting to automate jobs. Wrote UNIX scripts to back up the log files in QA and production.
  • Used the pmcmd command to run workflows through shell scripts.
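
A minimal sketch of the kind of target-verification SQL run in TOAD for the point above; the table and column names (SRC_CUSTOMER, TGT_CUSTOMER, customer_id, customer_name, load_date) are illustrative placeholders rather than the actual project schema:

    -- Reconcile row counts between a source table and its target
    SELECT (SELECT COUNT(*) FROM SRC_CUSTOMER) AS src_cnt,
           (SELECT COUNT(*) FROM TGT_CUSTOMER) AS tgt_cnt
    FROM   dual;

    -- Flag loaded rows that are missing mandatory attributes
    SELECT customer_id
    FROM   TGT_CUSTOMER
    WHERE  customer_name IS NULL
       OR  load_date IS NULL;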

Environment: Informatica PowerCenter 10.2, IDQ Developer 10.2, Oracle 11g, Oracle SQL*Loader, Flat Files, UNIX Shell Scripting, TOAD, SQL, PuTTY, WinSCP, Autosys.

Confidential - Charlotte, NC

IDQ Developer/Power Center

Roles & Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Created profiles to analyze the structure of source data. Created and applied rules in profiles.
  • Extensively worked on the Address Validator transformation in IDQ to parse partial addresses and populate the full address in the target table.
  • Created mappings in Informatica Data Quality (IDQ) using Parser, Standardizer, and Address Validator transformations.
  • Designed reference data and data quality rules using IDQ and cleansed data in the Informatica Data Quality 9.1 environment.
  • Captured DQ metrics (such as total record count, distinct record count, count of a distinct field, null counts, and referential checks) on the inbound/raw data (i.e., HAL or Stage) and again on the processed/output data (i.e., STAGE or HYB); a sketch of typical metric queries follows this list.
  • Designed, developed, and implemented Informatica Developer mappings for data cleansing using Address Validator, Labeler, Association, Parser, Expression, Filter, Router, and Lookup transformations.
  • Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and to back up the repository and folders.
  • Used debugger to validate the mappings and gain troubleshooting information about data and error conditions.
  • Experience in development of mappings in IDQ to load the cleansed data into the target table using various IDQ transformations.
  • Worked on the Web Services Hub to process requests and send responses using SOAP.
  • Redesigned some of the mappings in the system to meet new functionality.
  • Used Informatica Power Center to create mappings for extracting data from various systems.
  • Used workflow Monitor to monitor the performance of the Jobs.
  • Extensively worked in the performance tuning of the ETL process at mapping and transformation level.
  • Used SQL tools like TOAD and SQL Assistant to run SQL queries and validate the data in the database.
  • Actively involved in the exception handling process using the IDQ Exception transformation after loading data into MDM, and notified the data stewards of all exceptions.
  • Imported the mappings developed in data quality (IDQ) to Informatica designer.
  • Created scorecards to review data quality.
  • Debugged sessions using session logs.
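
An illustrative sketch of the DQ-metric queries described in the list above; STG_ACCOUNT and STG_CUSTOMER are hypothetical stage tables standing in for the actual HAL/STAGE/HYB objects:

    -- Total, distinct, and null counts for a key field
    SELECT COUNT(*)                                                AS total_records,
           COUNT(DISTINCT account_number)                          AS distinct_accounts,
           SUM(CASE WHEN account_number IS NULL THEN 1 ELSE 0 END) AS null_accounts
    FROM   STG_ACCOUNT;

    -- Referential check: accounts whose customer is missing from the customer stage
    SELECT a.account_number
    FROM   STG_ACCOUNT a
    LEFT JOIN STG_CUSTOMER c ON c.customer_id = a.customer_id
    WHERE  c.customer_id IS NULL;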

Confidential - Branchville, NJ

IDQ Developer

Roles & Responsibilities:

  • Led analysis sessions, gathered requirements, and wrote specification and functional design documents for enhancements and customizations.
  • Understood the existing subject areas, source systems, target system, operational data, jobs, deployment processes, and production support activities.
  • Created the high-level and low-level design documents for the DQ services.
  • Profiled source data using the IDQ tool to understand source system data representation, formats, and data gaps. Created the exception handling process and worked on best practices and standards for exception handling routines.
  • Built profiling, cleansing, and validation plans.
  • Analyzed profiling results and made recommendations for improvements.
  • Established data quality dimensions, such as data completeness, conformance, consistency, validity, and timeliness.
  • Identified and eliminated duplicate datasets and performed column, primary key, and foreign key profiling using IDQ; a SQL sketch of equivalent checks follows this list.
  • Worked on IDQ parsing, IDQ Standardization, matching, IDQ web services.
  • Imported the mappings developed in data quality (IDQ) to Informatica designer.
  • Worked on the Informatica Analyst tool (IDQ) to get scorecard reports for data issues.
  • Worked on scorecards and trend charts for data issues.
  • Used Informatica data quality tool to cleanse and standardize the data while loading to staging tables.
  • Developed mappings to move data loaded from landing to stage using various cleanse functions.
  • Created Web services in IDQ developer tool and generated WSDL file for real time services.
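
A hedged SQL sketch of the duplicate and key-profiling checks referenced above; STG_CLAIM, claim_id, and load_date are hypothetical examples, not the actual project objects:

    -- Candidate primary-key profiling: key values that occur more than once
    SELECT claim_id, COUNT(*) AS occurrences
    FROM   STG_CLAIM
    GROUP  BY claim_id
    HAVING COUNT(*) > 1;

    -- Duplicate elimination: rank rows per key, keeping only the most recent
    SELECT claim_id, customer_id, load_date,
           ROW_NUMBER() OVER (PARTITION BY claim_id ORDER BY load_date DESC) AS rn
    FROM   STG_CLAIM;
    -- rows with rn > 1 are the duplicate suspects removed or routed to exceptions
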
Environment: Informatica Platform 9.6.1/10.1, SQL Server 2012, Windows XP, Agile Methodology, HP Quality Center, SQL Scripting, SOAP UI.

Confidential, Carrollton, TX

IDQ Developer

Roles & Responsibilities:

  • Interacted with subject matter experts and data management team to get information about the business rules for data cleansing.
  • Worked with the project team and data owner to gather requirements.
  • Created the Technical Design Document for the DQ Services.
  • Analyzed profiling results and made recommendations for improvements.
  • Built Physical Data Objects and developed various Mapping, Mapplets/rules using Informatica Data Quality based on requirements to profile, validate and cleanse the data.
  • Identified and eliminated duplicate datasets and performed column, primary key, and foreign key profiling using IDQ.
  • Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created the scorecard to present to the business users for trending analysis (Informatica Analyst).
  • Worked on detailed analysis of master data based on attributes such as completeness, conformity, consistency, accuracy, duplication, and integrity; the analysis involved creating a presentation (a sketch of typical completeness and conformity checks follows this list).
  • Involved in the implementation of DQ services using Informatica Data Quality, including cleansing and matching services.
  • Worked with various developer modules like profiling, standardization and matching.
  • Presented various counts, created scorecards using Informatica Analyst, and produced trend analysis reports.
  • Utilized Informatica Data Quality (IDQ) software to provide name and address cleansing for improving data quality and for customer householding purposes.
  • Assisted in code testing, migration, deployment, enhancements, and bug fixes.
  • Good understanding of relational database management systems such as Oracle, along with PL/SQL and SQL; extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
  • Created scripts and automated them with the Control-M scheduler, which automated the profile and profile-model batch execution for the data quality rules, ETL, and metadata load batches.
  • Involved in enhancement and maintenance activities for the data warehouse, including tuning and modifying stored procedures for code enhancements.
  • Created a weekly process to present cleansing, validation, and duplicate-suspect results to all the distributors.
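
A minimal sketch, with a hypothetical MDM_CUSTOMER table and email_address column, of the kind of completeness and conformity measurements behind the master data analysis above:

    -- Completeness: share of rows with a populated value
    -- Conformity: share of rows matching an expected pattern
    SELECT ROUND(100 * COUNT(email_address) / NULLIF(COUNT(*), 0), 2) AS completeness_pct,
           ROUND(100 * SUM(CASE
                             WHEN REGEXP_LIKE(email_address, '^[^@ ]+@[^@ ]+\.[^@ ]+$')
                             THEN 1 ELSE 0
                           END) / NULLIF(COUNT(*), 0), 2)             AS conformity_pct
    FROM   MDM_CUSTOMER;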

Environment: Informatica Developer (IDQ 9.6.1), Informatica PowerCenter 9.6.1, Oracle 11g, PL/SQL, SQL.

Confidential - Milwaukee, WI

IDQ Developer

Roles & Responsibilities:

  • Designed and documented the functional specifications and prepared the technical design.
  • Worked with data stewards across the enterprise to identify critical data elements and define data quality criteria, including business rules, definitions, and tolerance levels.
  • Involved in creating a Metadata Manager in the DEV/UAT and PROD environments to pull metadata information from various relational databases such as Oracle, Teradata, and DB2.
  • Worked with Data Governance to set standards for how data is used and consumed, and enacted and enforced uniform data entry standards.
  • Established the criteria for how often data quality is checked for accuracy.
  • Established data quality dimensions, such as data completeness, conformance, consistency, validity, and timeliness.
  • Developed data improvement processes to maintain and improve data value.
  • Centralized data quality processes into one data quality program, incorporating data cleansing, standardization, and matching processes handled by external vendors.
  • Completed data quality assessment and data profiling activities, assessing existing data for completeness and accuracy relative to the quality specifications for the data.
  • Worked with Informatica Data Quality (IDQ) Developer/Analyst Tools to remove the noise of data using different transformations like Standardization, Merge and Match, Case Conversion, Consolidation, Parser, Labeler, Address Validation, Key Generator, Lookup, Decision.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Involved in ETL development to fetch the above data quality object results from the profiling warehouse into the IQM data mart.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.
  • Developed workflows with worklets, event waits, assignments, conditional flows, and email and command tasks using Workflow Manager.
  • Designed workflows with many sessions using decision, assignment, event wait, and event raise tasks, and used the Informatica scheduler to schedule jobs.
  • Used Metadata Manager to import and export metadata and promote incremental changes between environments from development to the testing phase.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Identified problems in existing production data and developed one-time scripts to correct them (an illustrative correction script follows this list).
  • Fixed invalid mappings and troubleshot technical problems with the database.
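
An illustrative example of the style of one-time correction script mentioned above; the DW_CUSTOMER_DIM table, the country_code column, and the specific fix are hypothetical, standing in for the actual production issues found:

    -- One-time fix: normalize country codes that profiling showed were stored inconsistently
    UPDATE DW_CUSTOMER_DIM
    SET    country_code = 'US'
    WHERE  UPPER(TRIM(country_code)) IN ('USA', 'U.S.', 'UNITED STATES');

    COMMIT;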

Environment: Informatica 9.5.1, Oracle 11g, SQL Server 2012, HP-UX, IDQ.

Confidential, Rochester, NY

IDQ Developer

Roles & Responsibilities:

  • Worked with Business and IT teams in educating Data Quality process and showcasing IDQ Capabilities in developing Enterprise wide Data Quality Solution.
  • Developed mappings using the needed transformations in the Informatica tool according to technical specifications.
  • Worked with IDQ on data quality for data cleansing, removing unwanted data, and ensuring robust, correct data.
  • Configured Metadata Repository to create the lineage for Erwin data models, Databases consisting of Teradata, Oracle, Sql Server, Power center ETL objects and Business Objects reports.
  • Created complex mappings that involved implementation of Business Logic to load data in to staging area.
  • Performed data profiling based on the functional specifications provided by the business team.
  • Worked with data quality plans and components, and used the data quality report viewer to get reports and dashboards.
  • Worked on IDQ parsing, standardization, matching, and IDQ web services.
  • Imported the mappings developed in data quality (IDQ) to Informatica designer.
  • Worked on the Informatica Analyst tool (IDQ) to get scorecard reports for data issues.
  • Worked with Informatica data quality core components like Informatica Analyst (web based) and Informatica developer.
  • Worked extensively to create reference tables, standardize the data, and perform match analysis.
  • Worked extensively with the Address Validator to cleanse address elements; created input and output templates.
  • Worked on scorecards and trend charts for data issues.
  • Created Metadata Manager Service, Reporting Service and Reference Table Manager Service in ISP Admin Console to effectively use Metadata Manager.
  • Customized the standard Informatica metamodel to accommodate the full range of metadata defined for the collection and storage of enterprise metadata.
  • Integrated Address Doctor with the Address Validator transformation to cleanse addresses.
  • Developed cleansing services to remove noise words, standardize data, validate addresses, and parse data.
  • Developed matching services for de-duplication as well as dual-source matching; a simplified match-key sketch follows this list.
  • Extracted data from Oracle and Teradata then used SQL Server for data warehousing.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Optimized performance through tuning at the source, target, mapping, and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
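
The de-duplication and dual-source matching itself was built with IDQ match transformations; the SQL below is only a simplified, deterministic match-key sketch of the idea, with hypothetical SRC_A_CUSTOMER and SRC_B_CUSTOMER tables:

    -- Standardize both sources to the same match key, then join on it
    SELECT a.customer_id AS source_a_id,
           b.customer_id AS source_b_id
    FROM  (SELECT customer_id,
                  UPPER(TRIM(last_name)) || '|' ||
                  SUBSTR(UPPER(TRIM(first_name)), 1, 1) || '|' ||
                  postal_code AS match_key
           FROM   SRC_A_CUSTOMER) a
    JOIN  (SELECT customer_id,
                  UPPER(TRIM(last_name)) || '|' ||
                  SUBSTR(UPPER(TRIM(first_name)), 1, 1) || '|' ||
                  postal_code AS match_key
           FROM   SRC_B_CUSTOMER) b
      ON  a.match_key = b.match_key;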

Environment: Informatica 9.1/9.5, Oracle 10g, SQL Server 2010, HP-UX.

Confidential, OH

ETL/IDQ Developer

Roles & Responsibilities:

  • Analyzed the data based on requirements, wrote the techno-functional documentation, and developed complex mappings using Informatica Data Quality (IDQ) Developer to remove data noise using Parser, Labeler, Standardization, Merge, Match, Case Conversion, Consolidation, Address Validation, Key Generator, Lookup, and Decision transformations.
  • Created reference tables from profile columns, created a scorecard from the profile results, edited the scorecard, configured thresholds, and performed address cleansing.
  • Created Reference/Master data for profiling using IDQ Analyst tools. Used the Address Doctor Geo-coding table to validate the address and performed exception handling, reporting and monitoring the data.
  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Migrated mappings, sessions, and workflows from the development to the test and then to the UAT environment.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema; a sketch of the surrogate-key lookups in a fact load follows this list.
  • Created sessions, extracted data from various sources, transformed the data according to the requirements, and loaded it into the data warehouse.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ application for matching and merging process.
  • Prepared Technical Design documents and Test cases.
  • Involved in unit testing and resolution of various bottlenecks encountered.
  • Implemented various Performance Tuning techniques.
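
A hedged sketch of the surrogate-key lookups behind a star-schema fact load; the actual loads were built as Informatica mappings, and SALES_FACT, STG_SALES, and the dimension tables here are hypothetical:

    -- Resolve dimension surrogate keys while loading the fact table
    INSERT INTO SALES_FACT (date_key, customer_key, product_key, sale_amount)
    SELECT d.date_key,
           c.customer_key,
           p.product_key,
           s.sale_amount
    FROM   STG_SALES s
    JOIN   DATE_DIM     d ON d.calendar_date = TRUNC(s.sale_date)
    JOIN   CUSTOMER_DIM c ON c.customer_id   = s.customer_id AND c.current_flag = 'Y'
    JOIN   PRODUCT_DIM  p ON p.product_id    = s.product_id;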

Environment: Informatica PowerCenter 8.6, Oracle 10g, SSRS, SQL Server 2008.

Confidential

ETL Developer

Roles & Responsibilities:

  • Involved in design, development and maintenance of database for Data warehouse project.
  • Involved in Business Users Meetings to understand their requirements.
  • Profiling of Source data to measure data quality. Source data involved attributes such as Business Names, Postal addresses etc.
  • Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica.
  • Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
  • Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Used Informatica file watch events to poll the FTP sites for the external mainframe files.
  • Extensively worked with SCD Type-I, Type-II, and Type-III dimensions and data warehousing Change Data Capture (CDC); an illustrative SQL sketch of the Type-II pattern follows this list.
  • Imported source/target tables from the respective databases using the Execute Package task and control flow tasks in SQL Server Integration Services (SSIS).
  • Created SSIS packages using different transformations, such as Slowly Changing Dimension, Lookup, Aggregate, Derived Column, Conditional Split, Fuzzy Lookup, Multicast, and Data Conversion.
  • Migrated data in various formats, such as text-based files and Excel spreadsheets, to SQL Server databases using SQL Server Integration Services (SSIS) to overcome transformation constraints.
  • Used control flow tasks such as the For Loop Container, Foreach Loop Container, Sequence Container, Execute SQL Task, Send Mail Task, and Data Flow Task.
  • Performed unit testing, integration testing, and system testing of Informatica mappings.
  • Coded PL/SQL scripts.
  • Worked with Report Builder and Report Server projects for generating reports.
  • Worked with different parameterized reports.
  • Generating reports using SSRS from SQL Server Database (OLTP) and SQL Server Analysis Services Database (OLAP).
  • Generated weekly and monthly reports as per the client requirements.
  • Wrote stored procedures to meet the requirements.
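
The SCD logic itself was implemented in Informatica mappings; the SQL below is only an illustrative sketch of the Type-II expire-and-insert pattern, with a hypothetical CUSTOMER_DIM dimension, STG_CUSTOMER stage table, and customer_dim_seq sequence:

    -- Step 1: expire the current dimension row when a tracked attribute has changed
    UPDATE CUSTOMER_DIM d
    SET    d.current_flag = 'N',
           d.effective_end_date = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   STG_CUSTOMER s
                   WHERE  s.customer_id = d.customer_id
                     AND  s.address <> d.address);

    -- Step 2: insert a new current row for changed and brand-new customers
    INSERT INTO CUSTOMER_DIM
           (customer_key, customer_id, address,
            effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   CUSTOMER_DIM d
                       WHERE  d.customer_id = s.customer_id
                         AND  d.current_flag = 'Y');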

Environment: Informatica PowerCenter 7.1.3, Oracle 10g, UNIX, SSIS, SSRS, SQL Server.
