
Informatica/Teradata Developer Resume

Atlanta, GA


  • 7+ years of experience in Information Technology with a focus on Data Warehouse/Data Mart development, including strategies for extraction, transformation, and loading (ETL) in Informatica PowerCenter, Informatica Cloud, and Informatica PowerExchange from various database sources.
  • Expertise in Teradata database design, implementation, and maintenance, mainly in large-scale Data Warehouse environments.
  • Experience working with Teradata Parallel Transporter (TPT), FastLoad, MultiLoad, BTEQ, FastExport, SQL Assistant, and DDL and DML commands.
  • Involved in Informatica upgrade projects from one version to another.
  • Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as Star schema and Snowflake schema (fact tables, dimension tables) used in relational, dimensional, and multidimensional modeling.
  • Extensive experience in developing stored procedures, functions, triggers, and complex SQL queries.
  • Experience in error handling and debugging; implemented various performance tuning techniques on sources, targets, mappings, and workflows in Informatica ETL mappings.
  • Created unit test cases and unit test results for the Informatica objects to be tested.
  • Expertise in implementing CDC (Change Data Capture).
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Expertise in query analysis, performance tuning, and testing.
  • Experience in writing UNIX shell scripts to support and automate the ETL process.
  • Strong experience in the ETL audit process (ABC: Audit, Balance & Control) to ensure data quality and integrity. Set up email alerts with thresholds for any data-loading issues in the ETL process.
  • Experience in data extraction using Informatica Cloud with Salesforce as a source.
  • Technical expertise in ETL methodologies, Informatica PowerCenter, PowerMart, client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.
  • Expertise in testing applications and creating test cases for unit testing and integration testing, ensuring the data extracted from the source is loaded into the target in the correct format.
  • Used the Data Masking transformation to change sensitive production data to realistic test data for non-production environments.
  • Highly motivated and goal-oriented.
  • Implemented DWH projects using Agile and Waterfall methodologies.
  • Strong understanding of the Scrum process.
  • Excellent verbal and written communication skills, a clear understanding of business procedures, and the ability to work both independently and as part of a team.
  • Excellent analytical and problem-solving skills.
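As an illustration of the ABC (Audit, Balance & Control) row-count reconciliation described above, a minimal check might look like the following Python sketch; the threshold value and field names are illustrative, not taken from any particular project:

```python
# Minimal sketch of an ETL Audit, Balance & Control (ABC) check:
# compare source and target row counts and flag the load for an
# email alert when the mismatch exceeds a configured threshold.

def abc_check(source_count, target_count, threshold_pct=0.5):
    """Return an audit record; 'alert' is True when the row-count
    difference exceeds threshold_pct percent of the source count."""
    diff = abs(source_count - target_count)
    pct = (diff / source_count * 100) if source_count else 0.0
    return {
        "source_count": source_count,
        "target_count": target_count,
        "diff": diff,
        "diff_pct": round(pct, 2),
        "alert": pct > threshold_pct,
    }

if __name__ == "__main__":
    print(abc_check(100_000, 99_200))  # 0.8% mismatch -> alert
    print(abc_check(100_000, 99_900))  # 0.1% mismatch -> ok
```

In practice the counts would come from the source extract and the target load statistics, and an alerting step would email the record when `alert` is set.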


Databases: Teradata 12.0/13.x/14.10, SQL Server/MySQL, and Oracle

Informatica Tools: Informatica PowerCenter 10.0.1/9.6.1/9.5.1/9.1.1/8.6.1/8.1.1/7.1, Informatica Cloud, PowerExchange, IDQ, GitHub

Programming Skills: SQL, PL/SQL, Teradata SQL, Oracle SQL

Operating Systems: Windows, UNIX, LINUX

Teradata Utilities: BTEQ, SQL Assistant, Database Window, Fast Load, MultiLoad, Fast Export, TPT

Teradata DBA Utilities: VIEWPOINT, Query Monitor

Scheduling tool: Control-M Scheduler

Scripting Languages: UNIX Shell Scripting, BTEQ


Confidential, Atlanta, GA



  • Used Informatica as the ETL tool to pull data from source systems/files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Involved in requirement analysis in support of Data Warehousing efforts along with Business Analysts, and worked with the QA team in a Waterfall methodology.
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Worked with various Informatica PowerCenter objects such as mappings, transformations, mapplets, workflows, and session tasks.
  • Extensively worked with various passive transformations such as Expression, Lookup, Sequence Generator, Mapplet Input, and Mapplet Output.
  • Extensively worked on performance tuning of Teradata SQL, ETL, and other processes to optimize session performance.
  • Involved in initial loads, incremental loads, and daily loads to ensure that data is loaded into the tables in a timely and appropriate manner.
  • Wrote Python scripts for scheduling tasks.
  • Wrote Teradata macros and used various Teradata analytic functions.
  • Created detailed technical specifications for Data Warehouse and ETL processes.
  • Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
  • Developed mappings to load fact and dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings.
  • Successfully upgraded Informatica 9.6.1 to 10.1 and was responsible for validating objects in the new version of Informatica.
  • Knowledge of data integration with Salesforce CRM using Informatica Cloud.
  • Extracted raw data from Salesforce CRM to staging tables using Informatica Cloud.
  • Automated/scheduled the cloud jobs to run daily, with email notifications for any failures.
  • Responsible for unit testing and integration testing, and helped with user acceptance testing.
  • Worked extensively with different caches such as the index cache, data cache, and lookup cache (static, dynamic, and persistent) while developing mappings.
  • Experience in using SVN as version control for migration.
  • Managed enhancements and coordinated Informatica object changes with every release.
  • Provided production support for the data warehouse.
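The SCD Type 2 loading pattern mentioned above can be sketched as follows. This is a minimal illustration only, using sqlite3 in place of Teradata; the table and column names are hypothetical:

```python
# A minimal SCD Type 2 sketch (sqlite3 stands in for Teradata; table
# and column names are hypothetical). A changed attribute expires the
# current dimension row and inserts a new current version.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_from TEXT, eff_to TEXT, is_current INTEGER)""")

def scd2_upsert(cust_id, city, load_date):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,))
    row = cur.fetchone()
    if row is None:
        # brand-new key: insert the first version as current
        conn.execute("INSERT INTO dim_customer VALUES (?,?,?, '9999-12-31', 1)",
                     (cust_id, city, load_date))
    elif row[0] != city:
        # expire the current version, then insert the new one
        conn.execute("""UPDATE dim_customer SET eff_to=?, is_current=0
                        WHERE cust_id=? AND is_current=1""",
                     (load_date, cust_id))
        conn.execute("INSERT INTO dim_customer VALUES (?,?,?, '9999-12-31', 1)",
                     (cust_id, city, load_date))
    # unchanged rows are left alone (SCD Type 1 would update in place)

scd2_upsert(1, "Atlanta", "2024-01-01")
scd2_upsert(1, "Honolulu", "2024-06-01")   # city change -> new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_from").fetchall()
print(rows)  # [('Atlanta', 0), ('Honolulu', 1)]
```

In an Informatica mapping the same logic is typically built with a Lookup on the dimension plus an Update Strategy transformation routing rows to insert or update.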

Environment: Informatica PowerCenter 10.1/9.6.1/9.5.1, Informatica Cloud, Salesforce, Oracle 11g, DB2, Teradata 15/14, Flat Files, SQL Assistant, TOAD, UNIX, Windows, SVN.

Confidential, Honolulu, HI

Informatica/ Teradata Developer


  • Worked with Business Analysts and Data Modelers.
  • Coded, tested, implemented, and maintained medium to highly complex ETL mappings using Informatica.
  • Coded Informatica jobs for long-term reliability and accuracy.
  • Worked with the existing application support team to identify the root cause of issues during workflow runs; after identifying the issues, took corrective actions and implemented long-term prevention plans.
  • Prepared/maintained documentation on all aspects of the ETL processes to support knowledge transfer to other team members.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Data Masking, Union, Stored Procedure, and Sequence Generator.
  • Used the Data Masking transformation to secure confidential claims data.
  • Very good knowledge of and hands-on experience with flat-file processing, and the ability to develop data validation mappings during flat-file loads.
  • Identified and implemented parallel processing capabilities in the workflow sessions, and used those capabilities to reduce workflow load times.
  • Hands-on experience with performance tuning using session partitions, lookup properties, and session properties.
  • Converted HiveQL, Sqoop, and MapReduce processes in Hadoop to Informatica.
  • Extensive experience working with Big Data Edition to extract data from Hadoop systems.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load data from various source systems such as Oracle 10g, flat files, SQL Server, and DB2 into Teradata.
  • Wrote shell scripts based on project requirements.
  • Used Teradata utilities (BTEQ, MultiLoad, FastLoad & TPT) for faster loads into stage and base tables.
  • Expertise in writing MultiLoad and FastLoad scripts.
  • Involved in tuning Teradata BTEQs for recurring production failures.
  • Created and documented ETL test plans, test cases, test scripts, expected results, assumptions, and validations based on the design specifications.
  • Troubleshot and performance-tuned complex ETL mappings to reduce session run time.
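As a rough illustration of the data-masking idea above (not the actual Informatica Data Masking transformation), a deterministic masking routine replaces a sensitive value with a repeatable pseudo-value of the same shape, so non-production data stays realistic and joins across tables still line up. The SSN field is a hypothetical example:

```python
# Sketch of deterministic data masking for non-production environments.
# The same input always masks to the same output, which preserves
# referential integrity across tables; the output never reveals the
# original value. Field names and formats are illustrative.
import hashlib

def mask_ssn(ssn):
    """Replace an SSN with a repeatable pseudo-value in NNN-NN-NNNN shape."""
    digest = hashlib.sha256(ssn.encode()).hexdigest()
    # keep only digits from the hash; pad in the unlikely short case
    digits = "".join(c for c in digest if c.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

masked = mask_ssn("123-45-6789")
print(masked)                              # same shape, not the real value
assert masked == mask_ssn("123-45-6789")   # deterministic
assert masked != "123-45-6789"
```

A production masking tool adds key management and format rules per data type; this sketch only shows the deterministic, format-preserving principle.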

Environment: Informatica PowerCenter 9.6.1/9.5.1, Oracle 11g, DB2, XML, Flat Files, Teradata 14/12, Hadoop, HiveQL, Maestro, UNIX, Windows, TOAD.

Confidential, Milwaukee, WI

Informatica/ Teradata Developer


  • Gathered requirements and created functional and technical design documents covering business rules, data mapping, and workflows.
  • Responsible for creating technical design documents, source-to-target mapping documents, and test case documents to reflect the ETL process.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Created databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.
  • Extensively used Teradata analysis tools such as Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
  • Wrote scripts for data cleansing, data validation, and data transformation for the data coming from different source systems.
  • Implemented data parallelism by using the Multi-File System and partition and de-partition components, and performed repartitioning to improve the overall performance.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move the data from staging into base tables.
  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Loaded flat files into empty tables using FastLoad, then used those tables in queries to perform joins.
  • Scheduled automated daily, weekly, and monthly jobs using UNIX shell scripts with Control-M.
  • Worked on different workflow tasks such as Sessions, Event Raise, Event Wait, Decision, Email, Command, Assignment, Timer, and Worklets, and on scheduling of the workflows.
  • Designed and developed mappings, transformation logic, and processes in Informatica to implement business rules and standardize source data.
  • Worked on UNIX shell scripting for automating Informatica workflows.
  • Developed Informatica mappings and reusable transformations; developed and wrote procedures for moving data from the source systems to staging and on to the Data Warehouse.
  • Performance-tuned the Informatica mappings by studying Explain plans, cutting query costs with Oracle hints, and changing the mapping designs.
  • Responsible for tuning ETL procedures and star schemas to optimize load and query performance.
  • Maintained and deployed source code using GitHub version control.
  • Well versed in all stages of the Software Development Life Cycle (SDLC).
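The FastLoad scripting mentioned above can be illustrated by generating a control script from simple metadata, which keeps the DEFINE and INSERT column lists in sync. This is a sketch only; the table, column, and file names are hypothetical, and a real script would also carry a LOGON line and site-specific settings:

```python
# Sketch: generate a Teradata FastLoad control script from metadata.
# Generating the script keeps the DEFINE fields, the INSERT column
# list, and the VALUES placeholders consistent with one another.

def build_fastload_script(table, columns, data_file="/data/stage/input.txt"):
    defines = ",\n    ".join(f"{c} (VARCHAR(200))" for c in columns)
    col_list = ", ".join(columns)
    values = ", ".join(f":{c}" for c in columns)
    return (
        f"BEGIN LOADING {table} ERRORFILES {table}_err1, {table}_err2;\n"
        'SET RECORD VARTEXT "|";\n'
        f"DEFINE\n    {defines}\n    FILE={data_file};\n"
        f"INSERT INTO {table} ({col_list})\n"
        f"VALUES ({values});\n"
        "END LOADING;\n"
        "LOGOFF;"
    )

script = build_fastload_script("stg_customer", ["cust_id", "cust_name", "city"])
print(script)
```

FastLoad requires an empty target table, which matches the stage-then-join pattern in the bullets above; MultiLoad would be used instead for inserts/updates on populated tables.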

Environment: Teradata 14.10/13.x/12.x, Informatica PowerCenter 9.1/9.5, SAS DI Studio 9.4, Oracle 10g/11g, UNIX, SQL Server 2012/2008, GitHub, Windows, Control-M.


Informatica/ Teradata Developer


  • Developed Informatica mappings for moving data from the source systems to staging and on to the Enterprise Data Warehouse.
  • Extensively used transformations to implement the business logic, such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Lookup (target as well as source), Update Strategy, Source Qualifier, and Joiner.
  • Designed complex mappings involving target load order and constraint-based loading.
  • Created/built and ran/scheduled workflows and worklets using the Workflow Manager.
  • Extensively worked on performance tuning of the programs, ETL procedures, and processes.
  • Helped code shell scripts for various administration activities, such as daily backups.
  • Performance tuning of Oracle and Teradata using Explain.
  • Experienced in using the Teradata utilities BTEQ, MultiLoad, FastLoad & FastExport.
  • Optimized/tuned mappings for better performance and efficiency; created and ran batches and sessions using the Workflow Manager; extensively used UNIX shell scripts for conditional execution of the workflows.
  • Optimized the performance of mappings, workflows, and sessions by identifying and eliminating bottlenecks.
  • Performed unit testing at the development level, source code migration, and documentation.
  • Involved in full life-cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
  • Involved in solving trouble tickets raised by Data Warehouse users.
  • Experience with production on-call support.

Environment: Informatica 8.6.1, Oracle 10g, Teradata, Flat Files, SQL Programming, UNIX, Windows, MS Excel, SQL*Plus.
