
Sr. Informatica Developer/IDQ Developer Resume


Long Beach, CA

BACKGROUND SUMMARY:

  • 8 years of IT Experience in Data Warehousing, Database Design and ETL Processes in various business domains like finance, telecom, manufacturing and health care industries.
  • Experience in designing and implementing data warehouse applications, mainly in ETL processes using Informatica Power Center v10.x/v9.x/v8.6.1/v8.1 using Designer (Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Consultant), Repository Manager, Repository Server, Workflow Manager & Workflow Monitor.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large - scale Data warehouses using Informatica Power Center.
  • Extensively used ETL methodologies for supporting Data Extraction, Transformation and Loading process, in a corporate-wide-ETL solution using Informatica Power Center.
  • Extensively worked on developing Informatica Designer, Workflow manager and Workflow monitor for data loads.
  • Experience working with Cloud Computing on Platform Salesforce.com
  • Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.
  • Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.
  • Experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, analytic functions, constraints, indexes and triggers.
  • Experienced in IDQ (9.x, 9.5.1), handling LDOs and PDOs and using transformations such as Standardizer, Labeler, Parser and Address Validator to cleanse and profile the incoming data; experience using different versions of Oracle database (11g/10g/9i/8i).
  • Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experience with the UNIX command line and Linux.
  • Proficient with Informatica Data Quality (IDQ) for data cleanup and massaging at the staging area.
  • Experience in creating batch scripts in DOS and Perl Scripting.
  • Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.
  • Experience in both Waterfall and Agile SDLC methodologies.
  • Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) - Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.
  • Data modeling experience in creating Conceptual, Logical and Physical Data Models using ERwin Data Modeler.
  • Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Worked on Slowly Changing Dimensions (SCD) Types 1, 2 and 3 to keep track of historical data.
  • Knowledge of change data capture tools like Informatica PowerExchange (PowerConnect) to capture the changed data.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
  • Experience in integration of various data sources like Oracle, DB2, Flat Files and XML Files into ODS and good knowledge on Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
  • Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
  • Hands on experience in MDM development.
  • Involved in the design of Landing, Staging and Base tables in Informatica MDM.
  • Created MDM mappings and configured match and merge rules to integrate the data received from different sources.
  • Optimized the solution using various performance-tuning methods (SQL tuning, ETL tuning (i.e. optimal configuration of transformations, targets, sources, mappings and sessions), and database tuning using indexes, partitioning, materialized views, procedures and functions).
  • Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows (a typical wrapper is sketched after this list).
  • Extensive knowledge in all areas of Project Life Cycle Development.
  • Strong analytical, verbal, written and interpersonal skills.
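
For context on the scheduling and shell-scripting bullets above, the sketch below shows the kind of wrapper a scheduler such as Autosys or Tidal would typically call. It is a minimal illustration only: the Integration Service, domain, folder and workflow names are hypothetical placeholders, and the password is assumed to come from the environment rather than the script.

#!/bin/ksh
# Minimal scheduler wrapper sketch -- all names below are placeholders.
INFA_USER="etl_user"              # hypothetical repository user
INFA_PWD="$INFA_PASSWORD"         # password taken from the environment, not hard-coded
INT_SERVICE="IS_DW_PROD"          # hypothetical Integration Service
DOMAIN="Domain_DW"                # hypothetical Informatica domain
FOLDER="DW_LOADS"                 # hypothetical repository folder
WORKFLOW="wf_daily_load"          # hypothetical workflow

# -wait blocks until the workflow completes, so the exit code reflects
# workflow success/failure and the scheduler (Autosys/Tidal) can alert on it.
pmcmd startworkflow -sv "$INT_SERVICE" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "`date '+%Y-%m-%d %H:%M:%S'` $WORKFLOW failed with return code $rc" >&2
    exit $rc
fi
echo "`date '+%Y-%m-%d %H:%M:%S'` $WORKFLOW completed successfully"
exit 0

Because pmcmd waits for completion, the scheduler can treat the script's exit code as the workflow's own success or failure.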

TECHNICAL SKILLS

Databases: Oracle 11g/10g/9i/8i, SQL Server 2005/2008, MS-Access, DB2 8.2, Teradata

Operating Systems: Unix, Windows

Languages: PL/SQL, T-SQL, UNIX Shell

Tools: Informatica Power center v10.x/v9.x/v8.6.1/v8.1, Informatica IDQ, Informatica Cloud, TOAD, MS Office Tool Set, SQL Server Analysis Services, HP Quality Center 9.2/10, Rapid SQL, MQ Series.

OLAP: OBIEE 10.1.3.X, 11g, Siebel Analytics 7.8/7.7, COGNOS 6.x (Framework Manager, Report Studio)

GUI: Visual Basic 6.0, Oracle FORMS 4.5, Oracle REPORTS 2.5

Scheduling Tools: Autosys, Informatica Scheduler

Methodologies: Star Schema, Snow Flake Schema, Ralph Kimball Methodology

Reporting Tools: Business Objects, COGNOS

PROFESSIONAL EXPERIENCE:

Confidential, Long beach, CA

Sr Informatica Developer/IDQ Developer

Responsibilities:

  • Worked with the Informatica Data Quality 10.0 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 10.0.
  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications and ETL design documents.
  • Identified and eliminated duplicates in datasets through IDQ components such as Edit Distance, Jaro Distance and Mixed Field Matcher; this enables the creation of a single view of customers and helps control costs associated with mailing lists by preventing multiple pieces of mail.
  • Responsible for Unit and Integration testing of Informatica sessions, batches and the target data.
  • Scheduled the workflows to pull data from the source databases at weekly intervals to maintain the most current and consolidated data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, XML, SQL Server and flat files and load it into Oracle.
  • Performed Informatica MDM design and implementation; experience in master data management where Informatica Data Quality (IDQ 9.6.1) was the tool used for data quality measurement.
  • Worked on design and development of Informatica mappings and workflows to load data into the staging area, data warehouse and data marts in Oracle.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Experienced with Informatica PowerExchange for Loading/Retrieving data from mainframe systems.
  • Built cubes using Pentaho Schema Workbench to design analyses and dashboards, and designed interactive reports with Pentaho Report Designer.
  • Identified the existing system of financial planning.
  • Customized the EPM product based on the requirements and existing files.
  • Developed an ETL process using Pentaho PDI to extract the data.
  • Designed the Source-Target mappings and was involved in designing the Selection Criteria document.
  • Wrote BTEQ scripts to transform data and used the Teradata utilities FastLoad, MultiLoad and TPump to load data (see the BTEQ sketch after this list).
  • Responsible for manually starting and monitoring production jobs based on the business users' requests.
  • Responsible for looking into production issues and resolving them in a timely manner.
  • Developed an Informatica process to replace stored procedure functionality and provide a time-effective, high-data-quality application to the client.
  • Analyzed the business requirements and created ETL logic to extract data from flat files coming from Manufacturing at different geographic regions and load the data into the data warehouse.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Created Mappings for Historical and Incremental loads.
  • Checked in and checked out versions of objects using version control.
  • Used Jenkins for deploying in different environments.
  • Configured and maintained source code repositories in Clearcase, GIT.
  • Worked on staging the data into work tables, cleansing it, and loading it further downstream into dimensions using Type 1 and Type 2 logic and into the fact tables which constitute the data warehouse.
  • Worked with PMCMD to interact with the Informatica server from command mode and execute the shell scripts.
  • Project based on the Agile SDLC methodology, with two-week software product releases to the business users.
  • Took part in daily stand-up and scrum meetings to discuss the project lifecycle and progress and to plan accordingly, which is the crux of Agile SDLC.
  • Provided post-release/production support.
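
The BTEQ work called out above would typically run from a shell wrapper such as the one sketched below. It is illustrative only: the TDP id, credentials, database and table names are placeholders, and it assumes the raw extract has already been bulk-loaded into a staging table (for example with FastLoad, MultiLoad or TPump).

#!/bin/sh
# Illustrative BTEQ step only -- TDP id, credentials, database and table
# names are placeholders. The Teradata password is expected in the environment.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}
.SET ERROROUT STDOUT

/* Move cleansed rows from the staging table into the target table */
INSERT INTO dw_db.claims_fact
SELECT  claim_id,
        member_id,
        CAST(service_dt AS DATE),
        claim_amt
FROM    stg_db.claims_stage
WHERE   claim_amt IS NOT NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF
.QUIT 0
EOF
exit $?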

Environment: Informatica Power Center 10.1, IDQ 10.0, Informatica MDM 10.0, Oracle Database 11g, SQL Server, Toad for Oracle, UNIX shell scripts, Teradata.

Confidential ., Atlanta, GA

Sr. Informatica IDQ Developer

Responsibilities:

  • Analyzed and thoroughly studied various data sources and different development environments within the organization.
  • Extensively worked on extracting the data from various flat files (fixed width, delimited), applying the business logic and then loading them to the Oracle databases.
  • Extensively worked with Source qualifier, Filter, Joiner, Expression, Lookups, Aggregator, Router, Sequence Generator, and Update Strategy.
  • Used various Informatica transformations in the development of complex mappings.
  • Extracted data from different heterogeneous source systems, applied business logic using transformations, and loaded the data to the target systems using Informatica Power Center.
  • Used Informatica Data Quality (IDQ) to profile the data and apply rules to the Membership & Provider subject areas to support Master Data Management (MDM).
  • Worked on multiple projects using the Informatica Developer tool (IDQ).
  • Involved in migration of the maps from IDQ to Power Center.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing the data using IDQ.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with the ETL processes.
  • Worked closely with the business for requirement gathering and to understand the project needs.
  • Interacted with the environmental team and data architects in the design and implementation of data models.
  • Designed and developed complex mappings to load the Historical, Weekly and Daily files to the Oracle database.
  • Extensively worked on data conversion and data cleansing.
  • Created different MOVEit jobs to handle inbound/outbound transfers of files between WellCare and different vendors.
  • Coded UNIX shell scripts and created Command and Email tasks to provide the pre-session and post-session requirements for various Informatica jobs (a pre-session check along the lines of the sketch after this list).
  • Provided database coding to support business applications using T-SQL.
  • Worked on automation of Informatica job flows using Autosys boxes/jobs.
  • Extensively worked on basic SQL queries such as creating and altering tables, indexes and views, and also worked with PL/SQL stored procedures. Queried various tables to get resultant datasets as per the business requirements.
  • Prepared ETL mapping documents explaining complete mapping logic.
  • Prepared unit test document and performed unit testing, regression testing.
  • Provided QA/UAT support during code promotion and worked with QA to resolve any defects found.
  • Worked with the Release Management team, the DBA team, and the UNIX team for smooth code promotions to production.
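
The pre-session shell scripting mentioned above usually amounts to a guard like the sketch below, run from a pre-session Command task. It is indicative only: the directory, file pattern and notification address are hypothetical.

#!/bin/sh
# Illustrative pre-session check -- directory, file pattern and mailbox are
# placeholders. A non-zero exit code fails the session before any load starts.

SRC_DIR=/data/inbound/membership          # hypothetical landing directory
FILE_PATTERN="member_*.dat"               # hypothetical source file pattern
ALERT="etl_support@example.com"           # hypothetical support mailbox

# Fail fast if the expected source file has not arrived yet.
if ! ls ${SRC_DIR}/${FILE_PATTERN} >/dev/null 2>&1; then
    echo "Source file ${FILE_PATTERN} not found in ${SRC_DIR}" | \
        mailx -s "Pre-session check failed" "$ALERT"
    exit 1
fi

# Reject zero-byte files so an empty extract does not truncate the target.
for f in ${SRC_DIR}/${FILE_PATTERN}; do
    if [ ! -s "$f" ]; then
        echo "Zero-byte source file: $f" | mailx -s "Pre-session check failed" "$ALERT"
        exit 2
    fi
done

exit 0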

Environment: Informatica PowerCenter 9.6.1/9.5, Informatica Data Quality (IDQ), Oracle 11g, PLSQL Developer, Flat Files, MS Excel 2010, MS Visual Studio 2010, UNIX, WinSCP, MS Access, Autosys.

Confidential, Baltimore, MD

Sr Informatica Developer/MDM

Responsibilities:

  • Interacted with the Data Analysis team to gather requirements and created Business Specification Documents. Analyzed the approach documents/templates and suggested a few fixes. Analyzed the source-to-target mapping document.
  • Worked on data cleansing, data Extraction, Transformation and Loading design, development, source-to-target mappings, data warehouse transformations, performance tuning, testing and documentation.
  • Used Informatica Power Center to extract, transform and load data from different operational data sources such as Oracle, flat files (CSV, XML and COBOL files) and SQL Server, from which customer data is sourced and loaded into the Teradata warehouse.
  • Installed and configured the MDM Hub, Cleanse server and Address Doctor on the Dev, Test/QA and Prod servers.
  • Actively involved in implementing the land process of loading the customer/product dataset into Informatica MDM from various source systems.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Performed the land process to load data into the landing tables of the MDM Hub, using external batch processing for the initial data load into the hub store.
  • Configured JMS message queues and message triggers with the SIF API.
  • Configured web services using the SIF API interface.
  • Worked on complex mappings using transformations like Aggregator, Expression, Joiner, Filter, Sequence Generator, Connected and Unconnected Lookup, Dynamic Lookup, Router, Union, and Update Strategy in Informatica Power Center Designer.
  • Implemented Type 2 Slowly Changing Dimension (SCD) to access the full history of accounts and transaction information.
  • Implemented Change Data Capture (CDC) using the MD5 function.
  • Worked on Informatica command line utilities like PMCMD and PMREP (a status-check example is sketched after this list).
  • Responsible for the performance tuning of the ETL process at source, target, mapping and session levels.
  • Responsible for troubleshooting bottleneck problems by creating the indexes at the database level.
  • Created and configured workflows, worklets, and sessions using Informatica Workflow Manager.
  • Made use of session log files and workflow log files to debug errors in mappings and sessions.
  • Used Mapping Parameters and Variables for reusability of code snippets.
  • Worked with version control tools like SVN and scheduler tools like Autosys.
  • Worked with Teradata utilities like MultiLoad, FastExport, and BTEQ scripts.
  • Coded shell scripts in UNIX and used tools like WinSCP.
  • Supported the testing team, UAT team and production support teams.
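
As a small illustration of the command line utilities mentioned above, the sketch below checks the last run status of a workflow with pmcmd. It is indicative only: the service, domain, folder and workflow names are placeholders, credentials are assumed to come from the environment, and the exact wording of the status line that is grepped can vary between PowerCenter versions.

#!/bin/sh
# Illustrative status check -- all names below are placeholders; INFA_USER and
# INFA_PASSWORD are expected in the environment. Greps the pmcmd output for the
# last run status so an operator or scheduler post-step can verify the load.

INT_SERVICE="IS_DW_PROD"        # hypothetical Integration Service
DOMAIN="Domain_DW"              # hypothetical domain
FOLDER="MDM_LOADS"              # hypothetical folder
WORKFLOW="wf_customer_load"     # hypothetical workflow

STATUS=$(pmcmd getworkflowdetails -sv "$INT_SERVICE" -d "$DOMAIN" \
            -u "$INFA_USER" -p "$INFA_PASSWORD" \
            -f "$FOLDER" "$WORKFLOW" | grep -i "workflow run status")

echo "$STATUS"
case "$STATUS" in
    *Succeeded*) exit 0 ;;
    *)           exit 1 ;;
esac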

Environment: Informatica Power Center 9.1.1, Informatica MDM Multi-Domain, IDD, SIF, Oracle11g, Teradata V2R12, SQL Server 2008/R2, UNIX, Teradata utilities.

Confidential

Informatica Developer

Responsibilities:

  • Performed requirements gathering, developed the Analysis & Design (A&D) document, and developed the project timeline.
  • Designed and developed the ETL mappings for the source systems' data extractions, data transformations, data staging, movement and aggregation.
  • Developed standard mappings using various transformations like Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Filter for weekly and quarterly processes to load heterogeneous data into the data warehouse. Source files include delimited flat files and SQL Server tables.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update strategy, Union, SQL, Java and Sequence generator.
  • Extracted data from flat files, Oracle, SQL and DB2 into the Operational Data Store (ODS); data from the ODS was then extracted, transformed and loaded, with business logic applied, into the Global Data Warehouse using Informatica PowerCenter 9.1.0 tools.
  • Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used other tasks like Event Wait, Event Raise, Email and Command.
  • Used Informatica for loading the historical data from various tables for different departments.
  • Developed Informatica mappings for Slowly Changing Dimensions Type 1 and 2.
  • Created Mapplets for implementing the incremental logic, identifying the fiscal quarter of the transaction and calculating various requirements of business processes.
  • Involved in the Unit & Integration testing of the mappings and workflows developed for the enhancements in the application.
  • Performed code migration of Informatica jobs from Development to Test to Production, with Unit testing, Integration testing, and Job & Environment Parameters testing along the way.
  • Scheduled and ran Extraction and Load processes and monitored tasks and workflows.
  • Tuned the MMW databases (stage, target) for better, faster and more efficient loading and user query performance.
  • Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality into data loads.
  • Extensively worked on performance tuning of mappings, sessions, Target, Source and various ETL processes.

Environment: Informatica 9.0.1, Oracle, ODS, DB2, Erwin, SQL Server, Shell scripting, TOAD.

Confidential

ETL Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirements, design, development, testing, training and rollout to the field users, through support for the production environment.
  • Involved in analysis of the current system and designed solutions to create a centralized data warehouse, as well as planning migration from the current system to the new system.
  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
  • Used Informatica to extract data from different source systems (DB2 and flat files) and then load it into the Oracle target system.
  • Designed and developed mappings, transformation logic and processes in Informatica for implementing business rules and standardization of source data from multiple systems into the data warehouse.
  • Involved in development and data integration of the main components using Informatica Power Center.
  • Used different transformations like Expression, Router, Filter, Sequence Generator, Stored Procedure, and Connected and Unconnected Lookups.
  • Actively involved in the design and implementation of physical and logical data warehouses and data marts, OLAP solutions and multidimensional analysis.
  • Developed logical models, building hierarchies, attributes and facts by extracting tables from the warehouse.
  • Actively involved in the scripting team for shell scripts to automate and migrate data from the ODS to the data warehouse (a hand-off along the lines of the sketch after this list).
  • Used the command line program PMCMD to communicate with the Informatica Server and to monitor the workflows and tasks.
  • Involved in performance tuning, monitoring and optimization of ETL loads.
  • Involved in different team review meetings.
  • Created the mapping and Functional Specifications.
  • Managed software releases after each stage of testing and defect removal, as well as resolving issues from UAT and System testing.
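
The ODS-to-warehouse automation mentioned above is typically a small shell hand-off like the sketch below: export a delta from the ODS to a delimited file that the warehouse load workflow then picks up. It is illustrative only: the connect string, schema, table and directory names are hypothetical, and the password is assumed to come from the environment.

#!/bin/sh
# Illustrative ODS-to-warehouse hand-off -- all names below are placeholders.

ODS_CONN="ods_user/${ODS_PASSWORD}@ODSDB"     # hypothetical ODS connect string
OUT_DIR=/data/outbound/ods                    # hypothetical staging directory
OUT_FILE=${OUT_DIR}/ods_orders_$(date +%Y%m%d).dat

# Export yesterday's delta from the ODS as a pipe-delimited flat file.
sqlplus -s "$ODS_CONN" <<EOF
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SET COLSEP '|'
SPOOL $OUT_FILE
SELECT order_id, customer_id, order_dt, order_amt
FROM   ods.orders
WHERE  load_dt >= TRUNC(SYSDATE) - 1;
SPOOL OFF
EXIT
EOF

# Abort if the extract produced nothing; otherwise the downstream Informatica
# workflow (scheduled separately) loads the file into the warehouse.
[ -s "$OUT_FILE" ] || { echo "Empty ODS extract: $OUT_FILE" >&2; exit 1; }
exit 0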

Environment: Informatica Power Center 8.6, Oracle 9i, Toad, ODS, Quest Central for DB2, UNIX, Windows NT, XML, Teradata, IBM DB2, MS Excel 97, Flat files, SQL, PL/SQL and UC4 Scheduling tool.
