
Senior Teradata / Informatica Developer Resume

Miami, FL


  • Over 9 years of experience as a senior Teradata, Informatica PowerCenter ETL, and database developer.
  • Extensively used Teradata 14/13/12, Informatica PowerCenter 9.6/9.5/9.1/8.6/8.1, and Informatica Data Quality (IDQ) 9.5/9.1 as ETL tools for extracting, transforming, and loading data from various sources to various targets.
  • Designed and developed complex mappings using varied transformation logic, including Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy transformations, both active and passive.
  • Expertise in design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.
  • Expertise in RDBMS, data warehouse architecture, and modeling. Thorough understanding of and experience in data warehouse and data mart design, star schema, snowflake schema, and normalization and denormalization concepts and principles.
  • Experience in working with Mainframe files, COBOL files, XML, and Flat Files.
  • Extensive experience in ETL (Extract Transform Load), Data Integration and Data Warehousing using Informatica Power Center & Oracle PL/SQL technologies.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Working knowledge of data warehousing concepts such as star and snowflake schemas, data marts, and the Kimball methodology, used in relational, dimensional, and multidimensional data modeling.
  • Extensive knowledge on Data Profiling using Informatica Developer tool.
  • Implemented Slowly Changing Dimension (Type 1, 2, and 3) methodologies for accessing the full history of accounts and transaction information; designed and developed Change Data Capture (CDC) solutions for the project that capture and analyze changes from daily feeds to maintain history tables.
  • Involved in Informatica upgrades from PowerCenter 7.1.1 to 8.6 and from PowerCenter 7.1.1 to 8.1.1.
  • Very strong skills in project management, requirements analysis, business analysis, database modeling, design and analysis, issue coordination, and development with Teradata/Oracle/SQL Server relational databases.
  • Proficient in Teradata database design (conceptual and physical), Query optimization, Performance Tuning.
  • Strong hands on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, Tpump, BTEQ and QueryMan).
  • Experience in Master Data Management (MDM) architectures and business processes.
  • Familiar with creating secondary indexes and join indexes in Teradata.
  • Expertise in both normal and bulk loading; involved in initial loads, incremental loads, daily loads, and monthly loads.
  • Expert in troubleshooting, debugging, and improving performance at different stages: database, workflow, mapping, repository, and monitor.
  • Involved in Informatica administration tasks such as creating folders and users and change management; also moved code from DEV to TEST and PROD using deployment groups in Informatica Repository Manager.
  • Experience in handling different data sources ranging from flat files, Excel, Oracle, SQL Server, Teradata, DB2 databases, XML files.
  • Extensive hands-on experience running projects using scheduling tools such as TWS and AutoSys.
  • Strong knowledge and understanding of data modeling (star and snowflake schemas), ER diagrams, and data flow/process diagrams.
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the performance bottlenecks.
  • Proficient in applying Performance tuning concepts to Informatica Mappings, Session Properties and Databases.
  • Experienced with mentoring Teradata Development teams, data modeling, program code development, test plan development, datasets creation, testing and result documentation, analyzing defects, bug fixing.
  • Hands-on experience handling data from various source systems such as flat files, XML sources, Oracle, MS SQL Server, IBM DB2, Teradata, and Excel files.
  • Excellent communication skills and experienced in client interaction while providing technical support and knowledge transfer.
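The Slowly Changing Dimension work summarized above typically follows a close-and-insert pattern for Type 2. A minimal Teradata SQL sketch, with hypothetical table and column names (customer_dim, stg_customer), not taken from any specific project here:

```sql
-- Hypothetical SCD Type 2: expire the current dimension row, then insert the new version.
-- Assumes customer_dim carries effective/expiry dates and a current-row flag.
UPDATE customer_dim
SET expiry_date  = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE customer_id IN (SELECT customer_id FROM stg_customer)
  AND current_flag = 'Y';

INSERT INTO customer_dim
  (customer_id, customer_name, address, effective_date, expiry_date, current_flag)
SELECT s.customer_id, s.customer_name, s.address,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s;
```

In Informatica the same pattern is usually realized with a Lookup on the dimension plus an Update Strategy transformation routing rows to update or insert.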


ETL Tools: Informatica PowerCenter 9.6.1, 9.5, 9.1.0, 8.x, Informatica MDM, IDQ.

Database: Oracle 9i/10g/11g, SQL Server 2014/2008/2005 and Teradata 14.0

Data Modeling Tools: Erwin 4.1

Scheduling Tools: TWS, AutoSys

Languages: SQL, PL/SQL, C, C++

Scripting Languages: UNIX shell scripting (Korn shell, Bash)

Operating Systems: Windows, Red Hat Linux, Sun Solaris


Confidential, Miami, FL

Senior Teradata / Informatica Developer


  • Involved in architecting the efficient ETL technical process flow, i.e., the high-level design documents for implementation across multiple countries.
  • Served as a senior Teradata and Informatica PowerCenter developer on a data warehouse initiative, responsible for requirements gathering, preparing mapping documents, designing the ETL flow, building complex ETL procedures, and developing the strategy to move existing data feeds into the Data Warehouse (DW) and additional target data warehouses.
  • Used Ab Initio components to extract and transfer data from multiple operational data sources such as Teradata, DB2 UDB, SQL Server, and Oracle to destination data marts in Oracle.
  • Understand Information Delivery Network (IDN) Architecture and Data warehouse Transformation end-to-end processes.
  • Migrated Sybase stored procedures, views, and tables into the new Teradata environment, creating Teradata tables, views, and other database objects for the migration to the Teradata Universal Data Warehouse production environment.
  • Developed scripts to migrate data from Sybase to Teradata using Teradata Parallel Transporter, and implemented the extraction, transformation, and loading of data through ETL tools.
  • Developed FastLoad, MultiLoad, FastExport, and BTEQ scripts for loading data from Sybase to Teradata, and used generic Teradata Parallel Transporter (TPT) scripts to extract data from Sybase to UNIX data files and load it into the Teradata staging environment.
  • Implemented Partitioned Primary Indexes (PPI), modified the PPIs of critical tables to improve performance, and identified secondary indexes for various tables.
  • Developed scripts to perform data validation and verification between the parallel production systems.
  • Developed source-to-target mappings; implemented transformations, business rules, and logic in scripts; scheduled jobs through scheduler tools; and monitored, maintained, and fixed issues in production jobs.
  • Performed unit testing, database testing, and data verification and validation of the developed code, comparing results between the Sybase and Teradata systems.
  • Identified performance bottlenecks in the production processes and key places where SQL could be tuned to improve overall performance.
  • Scheduled Jobs using scheduling tools like Autosys.
  • Automated the entire process using UNIX shell scripting.
  • Interacted with lead developers, system analysts, business users, architects, test analysts, project managers, and peer developers to analyze system requirements and design and develop software applications.
  • Created new extracts for external vendors and used Informatica ETLs for new workflows to move data out of multiple data sources.
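The load-and-validate BTEQ scripting described above can be sketched as follows; the logon string, databases, and table names are placeholders, not values from the actual project:

```sql
.LOGON tdpid/username,password;

-- Refresh the staging table from the landed Sybase extract (hypothetical names)
DELETE FROM stg_db.customer_stg;

INSERT INTO stg_db.customer_stg
SELECT * FROM landing_db.customer_extract;

-- Simple row-count validation between the landed extract and staging
SELECT COUNT(*) FROM landing_db.customer_extract;
SELECT COUNT(*) FROM stg_db.customer_stg;

-- Abort with a nonzero return code if any step failed, so the scheduler can react
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```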

Environment: Teradata 14, Informatica Power Center 9.5.1, Sybase IQ and Sybase ASE database, UNIX Shell Scripting, Perl scripting, CVS versioning tool, EngineG Migration scheduling.

Confidential, Glenn View, VA

Senior Informatica Developer


  • Created the design for the extraction process from legacy systems using combined techniques of data replication and Change Data Capture.
  • Completed the Gap Analysis which includes identifying the gaps between the downstream partner requests to the source system files and to fill the gaps either by rejecting the downstream partner's requests or requesting additional files from the source system.
  • Worked extensively with Teradata utilities - FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter (TPT) - to load huge amounts of data from flat files into the Teradata database.
  • Implemented lookups, local lookups, in-memory joins, and rollups to speed up various Ab Initio graphs.
  • Extensively used Fastexport to export data out of Teradata tables.
  • Created BTEQ scripts to invoke various load utilities, transform the data and query against Teradata database.
  • Extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries, and correlated subqueries.
  • Created proper PI taking into consideration of both planned access of data and even distribution of data across all the available AMPS.
  • Responsible for Performance-tuning of Ab Initio graphs.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and QueryMan.
  • Involved in performance analysis, monitoring, and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Worked extensively with Informatica transformations to source, cleanse and parse the data, load into Teradata database.
  • Extensive experience with DB2 load and Ingest external loaders.
  • Created explain plans for long running queries, worked with DBA's to identify and solve the bottlenecks by adding appropriate indexes.
  • Extensively used Informatica web services Consumer transformation to invoke 3rd party web services on Billing, Payments.
  • Created customized MLoad scripts on the UNIX platform for Teradata loads using Ab Initio.
  • Enabled Informatica workflows as web service to be invoked and used by different client systems.
  • Created test Java programs to test the Informatica web services and Try-it option on Web Service Hub.
  • Extensively used XML parser transformations to parse results from Web services output i.e. SOAP messages.
  • Created highly optimized ETL processes to move the data from legacy systems, DB2 and flat files into Oracle database.
  • Extensively used Informatica DVO (Data Validation Option) for data comparison between source and target tables, data validation after workflow completion.
  • Used Informatica DVO to create table pairs, test rules, and SQL view options for various data validation scenarios, and used the RunTests command to execute the table pairs and send out emails with the data validation status.
  • Extensively used Informatica DVO for unit testing in Dev environment.
  • Involved in performance tuning of existing SQL queries.
  • Involved in performance tuning of Informatica mappings.
  • Extensive knowledge on banking products, equities and bonds.
  • Created complex ETL Informatica procedures to load data from legacy systems.
  • Prepared implementation document for moving the code from development to QA to production environments.
  • Worked with QA team and implementation team during different phases of the project.
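The query-tuning loop described above (read the optimizer's plan, then refresh statistics so it can choose better join plans) looks roughly like this in Teradata SQL; the tables and columns (accounts, transactions, acct_id) are illustrative only:

```sql
-- Inspect the optimizer's plan for a long-running query before changing anything
EXPLAIN
SELECT a.acct_id, SUM(t.amount)
FROM accounts a
JOIN transactions t ON a.acct_id = t.acct_id
GROUP BY a.acct_id;

-- Refresh statistics on the join column and the index so the optimizer
-- has accurate demographics (row counts, skew) to plan with
COLLECT STATISTICS ON transactions COLUMN (acct_id);
COLLECT STATISTICS ON accounts INDEX (acct_id);
```

Comparing the EXPLAIN output before and after collecting statistics is the usual way to confirm a product join or full-table scan has been replaced by a cheaper plan.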

Environment: Informatica PowerCenter 9.1, Teradata 12, Oracle 11i, DB2 9.1, Ingest, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), BTEQ, Ab Initio, Teradata SQL Assistant, Web services, Java, Business Objects XI R2, Linux, Windows XP, SQL, PL/SQL, XML, SQL*Loader, TOAD 9.5, Tivoli scheduler, Control-M, UC4, Korn shell, Erwin.

Confidential, West Chester, PA

Senior Teradata Developer /ETL Developer


  • Conducted source system analysis and developed ETL design documents to meet business requirements.
  • Developed Informatica Mappings and Workflows to extract data from PeopleSoft, Oracle, CSV files to load into Teradata staging area using FastLoad/Tpump utilities.
  • Developed Ab Initio XFRs to derive new fields and meet various business requirements.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast, and Merge.
  • Developed ETL processes to load data from source to 3NF, stage to 3NF, stage to work, and work to 3NF, using the Informatica pushdown optimization technique to utilize the database's processing power.
  • Designed and developed custom Data Quality audits to identify and report the data mismatch between source and target systems and alert Operations Team.
  • Tuned Teradata SQL queries and resolved performance issues caused by data skew and spool space problems.
  • Created Uprocs, Sessions, and Management Units to schedule jobs using $U.
  • Developed flat files from Teradata using FastExport and BTEQ to disseminate to downstream dependent systems.
  • Coordinated with the offshore project team members on daily basis for the continuation of tasks and resolving any issues.
  • Acted as a single resource with sole responsibility of Ab Initio - Teradata conversions.
  • Supported System Integration and User acceptance tests to obtain sign off.
  • Used UNIX scripts to access Teradata and Oracle data.
  • Developed UNIX shell scripts for data manipulation.
  • Involved in writing proactive data audit scripts.
  • Involved in writing data quality scripts for new market integration.
  • Developed complex transformation code for derived duration fields.
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
  • Provided post-go-live production support and knowledge transfer to the production support team.
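The FastExport-based dissemination to downstream systems mentioned above follows a standard script shape; a sketch with placeholder credentials, paths, and a hypothetical order_detail table:

```sql
.LOGTABLE work_db.fexp_log;
.LOGON tdpid/username,password;

-- Export in parallel sessions to a flat file for downstream consumers
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/out/order_detail.dat MODE RECORD FORMAT TEXT;

SELECT order_id, order_date, amount
FROM dw_db.order_detail
WHERE order_date >= DATE '2014-01-01';

.END EXPORT;
.LOGOFF;
```

FastExport suits large result sets; for small extracts a plain BTEQ .EXPORT avoids the utility-slot overhead.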

Environment: Teradata V2R6/V2R5, Teradata SQL Assistant, Ab Initio, BTEQ, MLOAD, ARCMAIN, BOXI R3.1, Teradata Manager, Mainframe DB2, Erwin Designer, UNIX, Windows 2000, Control-M, ClearCase, Shell scripts.

Confidential, San Antonio, TX

Senior ETL/Data warehouse Developer/ Teradata Developer


  • Wrote complex SQL queries, stored procedures, triggers, and user-defined functions to implement the business logic.
  • Created and maintained Teradata databases, users, tables, views, macros, and stored procedures using Teradata Administrator (WinDDI), SQL Assistant (QueryMan), SQL DDL/DML/DCL, BTEQ, MLoad, FastLoad, FastExport, TPump, statistics, indexes, and Visual Explain.
  • Wrote UNIX shell scripts to run ETL interfaces (BTEQ, MLoad, FastLoad, and FastExport jobs).
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Used volatile tables and derived queries to break complex queries into simpler ones.
  • Built new and modified existing ad-hoc reports using SSRS per requirements.
  • Used Teradata utilities (FastLoad, FastExport, MultiLoad, TPump) for high-throughput ETL processing of huge data volumes; also involved in implementation and batch monitoring.
  • Responsible for deploying the SSIS packages from the development to the production server.
  • Created UML diagrams, including use case diagrams, activity/state chart diagrams, sequence diagrams, collaboration diagrams, deployment diagrams, and entity-relationship (ER) diagrams.
  • Worked on Teradata SQL, BTEQ, MLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to run BTEQ, FastLoad, and FastExport ETL interfaces. Created numerous Volatile, Global, Set, and MultiSet tables.
  • Involved in Data analysis, Data model designs and Preparation of metadata documents for all source tables.
  • Captured DQ metrics using profiles and created scorecards to review data quality using IDQ.
  • Used various Transformations, Dataflow and Control Flow, Implemented Event Handlers and Error Handling in SSIS.
  • Involved in giving quick resolutions and design changes when there were requirement changes.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Involved in tracking and reviewing defects in all phases of the project using HP Quality Center.
  • Assisted in troubleshooting production support problems related to the Teradata database and Informatica Data Quality.
  • Created profiles and scorecards and added custom rules and filters in the Informatica Analyst and Developer tools.
  • Developed UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Involved in the analysis of low-level designs and provided design documents.
  • Migrated objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.
  • Used the Rational Clearcase for versioning the various objects like the mappings, workflows, SQL scripts and Perl scripts.
  • Involved in tracking and reviewing defects in all phases of the project using ClearQuest.
  • Developed the data mapping document and developed Group Ordering extracts - Order Lines.
  • Involved extensively in Unit testing, integration testing, system testing and UAT.
  • Packaged the developed components for deployment to the production environment using ChangeMan packages (enhancements); fixed production bugs with temporary and permanent fixes within the application's Service Level Agreement.
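Breaking a complex query into steps with a volatile table, as described in the bullets above, can be sketched as follows (all names hypothetical):

```sql
-- Materialize the expensive aggregation once, local to the session
CREATE VOLATILE TABLE vt_daily_totals AS
(
  SELECT acct_id, txn_date, SUM(amount) AS total_amt
  FROM transactions
  GROUP BY acct_id, txn_date
)
WITH DATA
PRIMARY INDEX (acct_id)
ON COMMIT PRESERVE ROWS;

-- The follow-up query is now a simple join against the pre-aggregated step
SELECT a.acct_name, v.txn_date, v.total_amt
FROM vt_daily_totals v
JOIN accounts a ON a.acct_id = v.acct_id
WHERE v.total_amt > 10000;
```

Volatile tables live in spool, vanish at session end, and need no DROP rights, which is why they are the usual vehicle for this decomposition in Teradata.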

Environment: Teradata 14/14.10, Tableau 8.1, Informatica, SQL Assistant 13.0, SQL Server 2014, SSIS, BTEQ, IDQ 9.6.1, FastLoad, FastExport, MultiLoad, TPump, MS Visio, Viewpoint, Data Mover.

Confidential, Phoenix, AZ

ETL Developer


  • Used Informatica 9.5 to load data from Oracle into Netezza tables.
  • Used Jira and Confluence as project tracking tools, where all tasks were created and kept up to date.
  • Used Confluence to store all project documents and decision records.
  • Played a key role in creating the templates for the various status pages and for adding and tracking our tasks in Jira.
  • Although this project followed a Waterfall methodology, we still held daily scrum meetings, and I took the lead in conducting the daily stand-ups.
  • Because this project had a very tight deadline and high priority (the contract with eBay was set to end by Dec 2016), worked hard as a team to keep the work from falling into risk.
  • Prepared all design documents, consisting of the data flow from source to target with every column required for reporting mapped.
  • Handled complex mappings by modifying some of the core tables that hold Confidential customer data, as well as the sales tables involved in the batch load.
  • Created detailed and high-level data model diagrams to explain the flow of the data using the data modeling tool ERwin.
  • Worked extensively on understanding the business requirements and designed the logic to populate the data as expected.
  • Created DDL and DML scripts containing the structure of new tables and modifications to existing tables.
  • Built mappings, worklets, and workflows to load the data into the staging area and then into DW tables.
  • Used pushdown optimization to increase performance.
  • Created Tidal jobs for workflow automation.
  • Took responsibility for creating the implementation plan document and worked closely with the admins during go-live.
  • Provided one month of warranty support, which is a regular process at Confidential.

Environment: Informatica 9.1/9.5, SQL Server, Datastage, Oracle, Netezza, Tidal.


ETL/BI Developer


  • Created different transformations to load data into targets using various transformations such as Source Qualifier, Router, Joiner, Lookup, Expression, Aggregator, and Sequence Generator.
  • Reused mapplets and transformations.
  • Developed and modified mappings for extraction, staging, Slowly Changing Dimensions of Type 1, Type 2, and Type 3, fact tables, and summary tables, duly incorporating the changes to address the performance issues stated in the re-architecture specifications.
  • Developed mappings for Slowly Changing Dimensions of Type 1 and Type 2, fact tables, and summary tables using all kinds of transformations.
  • Scheduled sessions and batches on the Informatica server using Informatica Server Manager/Workflow Manager.
  • Developed workflows and worklets for mappings using Workflow Manager to load data from source to target tables.
  • Implemented performance tuning logic on targets, sources, mappings and sessions to provide maximum efficiency and performance.
  • Designed, defined, and documented test cases and plans for testing and debugging the mappings developed.
  • Used SyncSort to effectively reduce the elapsed time and speed up the ETL process.
  • Involved in creating analytics workflows and on-demand (ad-hoc) and scheduled dashboard reports using PowerAnalyzer.
  • Developed various reports using PowerAnalyzer, such as drill-down and cross-tab reports.
  • Interacted with users in troubleshooting problems reported.
  • Tested data and data integrity among various sources and targets.

Environment: Informatica PowerCenter 7.1.1, HP-Unix 11i, Oracle 9i, Microstrategy 8.1.2, Microsoft VSS, TOAD, CA7 Scheduler.
