
Sr. Informatica/Teradata Developer Resume

San Jose, CA


  • 8+ years of extensive industry experience with proven expertise in software development, administration, data integration (Informatica PowerCenter 10.1), data quality assurance (IDQ 10.1), and PowerExchange 9.5.
  • Experience in the design and development of ETL methodology supporting data migration, transformation, and processing in a corporate-wide ETL solution using Teradata 14.0/13.0/12.0.
  • Strong experience with the Informatica Data Quality (IDQ) tool: created IDQ mapplets for address, telephone, and SSN cleansing, and worked with most of the IDQ transformations (Standardizer, Parser, Exception, etc.) within mapplets.
  • Experienced in using IDQ for profiling, applying rules, creating scorecards, and developing mappings to move data from source to target systems.
  • Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, client/server, and mainframe applications.


Informatica Tools: Informatica PowerCenter 10.1, Informatica MDM Multi-Domain 9.x/10, Informatica Data Quality 10.1, Informatica Analyst 10.1 (IDE), Business Glossary, Informatica Address Doctor, IDQ cleanse functions, Informatica Metadata Manager 9.6.1, Informatica Data Validation Option (DVO)

Database Technologies: Oracle 9i/10g/11g, Teradata, SQL Server 2005/2008

Programming Languages: SQL, UNIX Shell Scripting.

Web Technologies: CSS3, HTML5, jQuery, JavaScript, Adobe Creative Suite

Scheduling Tools: UC4, Control-M, crontab, AutoSys, Skybot

Tools: Toad, Teradata SQL Assistant, SQL*Plus, SQL Developer, Erwin, PL/SQL Developer, PuTTY, WinSCP, Jira, Rally

Operating Systems: Windows 9x/2000/XP/Vista/7/8, Linux/UNIX, Ubuntu

S/W Methodologies: Agile, Waterfall models


Confidential - San Jose, CA

Sr. Informatica/Teradata Developer

Roles and Responsibilities:

  • Created source-to-target mapping documents for the various mappings and rules developed for the business line.
  • Worked with the Informatica Data Quality (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring capabilities.
  • Worked on data profiling and the development of various data quality rules using Informatica Data Quality.
  • Implemented exception-handling mappings using Data Quality and performed data validation using Informatica Developer.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Parsed high-level design specifications into simple ETL coding and mapping standards.
  • Created rule specifications and applied them to profiles using Informatica Analyst.
  • Developed the Business/Technical Glossaries and loaded them using Informatica Analyst.
  • Extracted the Profile Statistics data from the Informatica repository tables and loaded them into the Fact tables.
  • Supported the ETL inbound processes for the legacy MDM solution.
  • Developed, supported, and maintained ETL (Extract, Transform, Load) processes using Oracle and Informatica PowerCenter.
  • Worked on the Informatica Web Services Consumer transformation to fetch MDM batch-execution data via the SIF API WSDL, passing parameters to the PowerCenter mapping through SoapUI.
  • Involved in creating labels and migrating ETL code between different environments.
  • Built interfaces and automated them with the Informatica ETL tool and UNIX shell scripting.
  • Created technical design documents for the developed Informatica coding.
  • Designed ETL processes and developed source-to-target data mappings, integration workflows, and load processes.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Used the IDQ Address Validator transformation, which can cleanse worldwide address data, and enhanced it with modifications.
  • Worked with PowerCenter team to load data from external source systems to MDM hub.
  • Created the SQL Views, Table pairs and created the DVO jobs for validating the Source and Target Tables data as part of Automation Testing.
  • Created various AutoSys jobs for the scheduling of the underlying ETL flows.
  • Helped establish ETL procedures and standards for the objects, improved performance, and migrated objects from the Development, QA, and Stage environments to Production.
  • Developed mappings using Informatica PowerCenter to load data from various sources, using transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter.
  • Involved in migration projects moving data from data warehouses on Oracle/DB2 to Teradata.
  • Worked on building mappings that populate the MDM landing tables, processing errors as DQ violations and re-processing them.
  • Built mappings that fetch data from the MDM Hub outbound JMS message queues, where published XML messages are processed and fed back to the sources.
  • Worked with the Informatica PowerCenter Workflow Manager to create sessions, workflows, and worklets.
  • Debugged existing MDM outbound views and changed it according to the requirement.
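The FastLoad and BTEQ scripting work above can be sketched as follows — a minimal, hypothetical shell function that generates a Teradata FastLoad control script for a pipe-delimited flat file. All table, column, and file names here are illustrative placeholders, not taken from the actual project:

```shell
#!/bin/sh
# Hypothetical sketch: generate a Teradata FastLoad control script for a
# pipe-delimited flat file. Object names are placeholders; the password
# is left as a $TD_PASS reference to be resolved at run time.

build_fastload_script() {
  # $1 = target table, $2 = input flat file
  cat <<EOF
LOGON tdprod/etl_user,\$TD_PASS;
DROP TABLE ${1}_err1;
DROP TABLE ${1}_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id (VARCHAR(18)), cust_name (VARCHAR(60))
FILE = ${2};
BEGIN LOADING ${1} ERRORFILES ${1}_err1, ${1}_err2;
INSERT INTO ${1} (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF
}

build_fastload_script stg_customer /data/in/customer.dat
```

Generating the control script from parameters keeps one template per load pattern instead of one hand-edited script per table.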

Environment: Teradata R12/R13, Teradata SQL Assistant, SQL, Informatica MDM 10.x, Informatica PowerCenter 9.6.1/10.1, Informatica Data Quality (IDQ) 9.6.1/10.1, Informatica Analyst 9.6.1/10.1, Informatica DVO 9.6.1/10.1, Informatica BDE 9.6.1, Oracle 11g/10g, SQL Developer, Hive, HDFS, UNIX, Shell Scripting, UC4, Rally, Flat Files, HP ALM

Confidential - San Jose, CA

Informatica/ Teradata Developer

Roles and Responsibilities:

  • Coordinated with clients for requirements gathering and analysis.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, and exception handling.
  • Loaded data into the Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.
  • Used reference tables and rules created in Analyst tool.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Used various IDQ transformations such as Standardizer, Address Validator, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, and Expression.
  • Implemented Data Quality rules using IDQ to check the correctness of the source files and performed data cleansing/enrichment.
  • Designed and implemented the inbound and outbound interfaces that move data between the MDM system and the sources.
  • Created Complex ETL Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, and Connected and unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Created the Master Data Management (MDM) Hub Console configurations like Stage Process configuration, Load Process Configuration.
  • Published records to downstream applications from PowerCenter ETL via message queues.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands against the Teradata RDBMS to match the business requirements.
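The source-file correctness rules described above were built in IDQ; a shell-level analogue of one such rule might look like the sketch below — rejecting a pipe-delimited file whose rows do not all have the expected column count before it reaches the load utility (file names and the three-column layout are hypothetical):

```shell
#!/bin/sh
# Sketch only: a pre-load data quality check in the spirit of the IDQ
# rules above. A file passes only if every row has exactly the expected
# number of pipe-delimited columns.

check_columns() {
  # $1 = file, $2 = expected number of pipe-delimited columns
  awk -F'|' -v n="$2" 'NF != n { bad++ } END { exit (bad > 0) }' "$1"
}

printf '1|Alice|San Jose\n2|Bob|Waltham\n' > /tmp/good.dat
printf '1|Alice\n2|Bob|Waltham|extra\n'   > /tmp/bad.dat

check_columns /tmp/good.dat 3 && echo "good.dat: OK to load"
check_columns /tmp/bad.dat 3  || echo "bad.dat: rejected"
```

Failing fast at the file level keeps malformed records out of the error tables and avoids partial loads that would need backing out.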

Environment: Informatica Data Quality (IDQ) 9.5, Informatica PowerCenter 9.5, Informatica MDM, SQL Server 2012, Oracle 11g, UNIX, Citrix, MicroStrategy, JIRA, Flat Files, Skybot

Confidential - Waltham, MA

Informatica/ Teradata Developer

Roles and Responsibilities:

  • Interacted with the Business Team to understand the requirements and prepared technical documents for implementing the solutions as per Business needs.
  • Configured the Hub and created hub objects such as landing tables, staging tables, base objects, queries and packages, mappings, cleanse functions, batch groups, and message triggers.
  • Created the data model based on requirements.
  • Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance
  • Loaded data from flat files to the Hub landing tables using various transformations.
  • Loaded data from landing tables to staging tables during the stage process.
  • Performed cleanse operations while data was being loaded into the stage tables.
  • Enabled delta detection on specific columns.
  • Defined trust and validation rules for base tables while loading data from stage tables.
  • Configured Match and Merge rules.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Performed Hub data modeling along with Hierarchy Manager.
  • Imported and exported the ORS using Metadata Manager.
  • Compared repositories and generated change lists describing the differences between them using Metadata Manager.
  • Validated the metadata in the Informatica Hub repository and generated a report of issues.

Environment: Informatica MDM 9.1, Informatica Power Center 9.1, Oracle 11g, WebSphere, IDD, TOAD, SQL Server 2012, Flat Files.


Informatica Developer


  • Involved in the design and development of the data warehouse environment: gathered requirement-specification documents from business users and technical teams, and identified data sources, targets, and report-generation needs.
  • Worked with the Informatica PowerCenter 8.6/9.0 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, mapplets, and reusable transformations.
  • Designed mappings in Informatica Designer that populated data into the target star schema.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
  • Translated business processes into Informatica mappings for building data marts.
  • Involved in the Migration process from Development, Test and Production Environments.
  • Used SQL tools like SQL Developer to run SQL queries and validate the data.
  • Wrote stored procedures and triggers in Oracle 8i for managing consistency and referential integrity across data mart.
  • Tuned mappings for optimum performance, dependencies, and batch design.
  • Scheduled the batches to be run using the Workflow Manager.
  • Involved in pushing the Informatica components in to production and scheduling the jobs.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
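Scheduling batches through the Workflow Manager and crontab, as described above, is typically wired up with Informatica's pmcmd CLI. The sketch below only composes the command string; the service, domain, folder, workflow, and wrapper-script names are placeholders, and credentials would come from a secured environment file rather than the crontab:

```shell
#!/bin/sh
# Sketch only: composing a pmcmd call to start a PowerCenter workflow
# from a cron-driven wrapper. All names here are placeholders.

build_pmcmd() {
  # $1 = folder, $2 = workflow
  echo "pmcmd startworkflow -sv IS_Dev -d Domain_Dev" \
       "-u \$PM_USER -p \$PM_PASS -f $1 -wait $2"
}

build_pmcmd DM_SALES wf_load_sales_daily
# A crontab entry could then invoke the wrapper nightly, e.g.:
# 30 1 * * * /opt/etl/bin/run_wf.sh DM_SALES wf_load_sales_daily
```

The -wait flag makes pmcmd block until the workflow finishes, so the wrapper's exit code can drive downstream dependencies or alerting.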

Environment: Informatica PowerCenter 8.6/9.0, SQL Server 2008, Oracle 10g, Teradata, Teradata SQL Assistant, UNIX Shell Scripting, crontab, HP Quality Center
