
Teradata Developer Resume



  • Over 7 years of experience in data warehousing, with proficiency as a Teradata developer and strong expertise in ETL processes.
  • Extensively worked with Teradata utilities like BTEQ, FastLoad, MultiLoad, and FastExport to export and load data to/from different source systems.
  • Performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed statistics to improve performance.
  • Strong expertise in SQL, including creating macros, triggers, views, stored procedures, and the different types of joins in Teradata.
  • Good knowledge of UNIX shell scripting and hands-on experience in batch programming.
  • Proficiency in data modeling and concepts like normalization, dimensional data modeling, star schema, snowflake schema, fact and dimension tables, slowly changing dimensions, and Change Data Capture (CDC).
  • Expert in analyzing, designing, developing, installing, configuring, and deploying the MS SQL Server suite of products with Business Intelligence in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Worked extensively with huge volumes of data, analyzing record sets for data quality and data validation.
  • High-level understanding of Informatica.
  • Strong exposure to Microsoft .NET application development.
  • Excellent communication and technical writing skills, with hands-on experience translating business requirements into technical specifications and creating mapping documents, test plans, test scripts, and status reports.
  • Ambitious, self-motivated, result-oriented engineering professional with multi-tasking skills and the ability to work independently as well as in teams.
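
The statistics-collection and EXPLAIN workflow mentioned above can be illustrated with a minimal Teradata SQL sketch (the database, table, and column names are hypothetical placeholders, not from any actual project):

```sql
-- Collect statistics on the join/filter columns the optimizer relies on
-- (edw.sales_fact and its columns are placeholder names)
COLLECT STATISTICS ON edw.sales_fact COLUMN (sale_date);
COLLECT STATISTICS ON edw.sales_fact COLUMN (store_id);

-- Inspect the optimizer's plan to confirm estimated row counts
-- and the join strategy improved after collecting statistics
EXPLAIN
SELECT store_id, SUM(sale_amt)
FROM   edw.sales_fact
WHERE  sale_date >= DATE '2012-01-01'
GROUP  BY store_id;
```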


ETL Tools: SSIS, SSRS, Informatica PowerCenter

Databases: Teradata 13.10/12, V2R6, MySQL, SQL Server 2005/2008

Languages: ASP.NET, C#, basics of C and C++

Teradata Utilities: BTEQ, FastLoad, MultiLoad, FastExport, OLE Load, TOAD

Scripting Languages: UNIX shell, VBScript

Version Control: ClearCase, TFS, VSS

Domain: Healthcare, Banking

Others: MS Visio, MS Office


Confidential, Charlotte, NC

Environment: Teradata, SSIS, MS SQL Server 2008, SharePoint, MS Access, Visual Studio 2010, .NET 4, SQL Server Management Studio, TFS


  • Communicated with business users and analysts on business requirements; gathered and documented the technical and business metadata.
  • Developed applications using ASP.NET 4.
  • Provided import/export functionality to Excel from the UI.
  • Optimized a rules engine script to enhance performance and reduce execution time.
  • Performance tuning of Teradata SQL statements using Teradata EXPLAIN.
  • Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimal column definitions.
  • Involved in writing Teradata SQL bulk programs.
  • Loaded data into the Teradata database using load utilities (FastExport, FastLoad, MultiLoad, and TPump).
  • Used AJAX controls (Calendar) and JavaScript for validations.
  • Replaced cursors used in queries with WHILE loops to increase performance.
  • Implemented stored procedures and triggers.
  • Solely responsible for the SSIS package that loaded the flat files in SharePoint to a different directory and verified the count of records loaded into the destination table; built the package using various SSIS components and performance-tuned it.
  • Used a job scheduler to schedule execution of the package and notify the appropriate users after loading, or if the file was absent.
  • Developed the HLD and LLD documents and the unit test case scripts.
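
The table-compression work described above could be sketched as Teradata multi-value compression on frequently occurring values (the table, columns, and compressed values below are illustrative only):

```sql
-- Multi-value compression: the listed frequent values are stored once
-- in the table header rather than in every row, reducing perm space
CREATE TABLE edw.customer_stg
(
  cust_id    INTEGER NOT NULL,
  state_cd   CHAR(2)  COMPRESS ('NC', 'SC', 'VA'),
  status_cd  CHAR(1)  COMPRESS ('A', 'I'),
  create_dt  DATE     COMPRESS (DATE '2010-01-01')
)
UNIQUE PRIMARY INDEX (cust_id);
```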

Confidential, Wilmington, DE


  • Implemented active data warehousing on the Teradata platform, with extensive use of FastLoad, MultiLoad, and BTEQ, and migrated data from SQL Server 2005 to Teradata.
  • Developed an SSIS package to retrieve data from flat files of different sources, and a staging package to transform the data and load it into the final destination.
  • Loaded data from various sources like OLE DB and flat files into the SQL Server database using SSIS packages, and created data mappings to load the data from source to destination.
  • Implemented event handlers and error handling in SSIS packages.
  • Mentored an entry-level trainee about the project and explained his role.
  • Optimized BTEQ queries and streamlined the data flow processes.
  • Prepared all aspects of project documentation, including functional and technical specifications, testing templates, ETL mappings, and report layouts.
  • Extensively developed Teradata queries, macros, and stored procedures.
  • Tasked with developing the ETL process for loading historical data into the data warehouse.
  • Developed reports in SSRS using features like drill-down, charts, and filters.
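
A FastLoad job of the kind used in this SQL Server-to-Teradata migration might look like the following sketch (the logon string, file path, and object names are placeholders):

```sql
/* FastLoad script: bulk-load a pipe-delimited flat file into an
   empty staging table; the error tables capture rejected rows */
.LOGON tdpid/username,password;
DATABASE stg_db;

.SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(100)),
       state_cd  (VARCHAR(2))
FILE = /data/inbound/customer.dat;

BEGIN LOADING stg_db.customer_stg
      ERRORFILES stg_db.customer_e1, stg_db.customer_e2;
INSERT INTO stg_db.customer_stg (cust_id, cust_name, state_cd)
VALUES (:cust_id, :cust_name, :state_cd);
END LOADING;
.LOGOFF;
```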

Environment: SSIS, SSRS, MS SQL Server 2008, Teradata 12, TOAD, BTEQ, Windows batch scripting, MySQL, OLE LOAD, Citrix, VSS

Confidential, Richmond, VA


  • Worked on the Informatica PowerCenter tool for data conversion from source systems; dealt with various transformation logics like Lookup, Aggregator, Joiner, Source Qualifier, Normalizer, Expression, and Update Strategy.
  • Understood source mapping documents and mapping rules, and used them to load target tables.
  • Involved in writing scripts for loading data from landing zone tables to CSA tables, applying the transformation/mapping rules, and then loading the data from CSA tables to EDW tables.

Environment: Teradata 12, TOAD, Citrix, Informatica, Visio.



  • As the sole offshore member, I was responsible for acquiring the required data, masking the critical customer fields, and encrypting them before sending to Argus via flat files.
  • Developed more than 130 scripts (FastExport, MultiLoad, and FastLoad) for different subject areas that retrieve the actual data, store the encrypted data in new tables, and send the bundled flat files to Argus through FTP.
  • Responsible for collecting statistics on all types of tables.
  • Developed scripts in the UNIX vi editor and used ClearCase for version control.
  • Created stored procedures in Teradata to mask critical fields like account numbers and encrypt them using the Unique Primary Index, a row number, and a table-wide unique field.
  • Developed wrapper scripts that executed the Teradata BTEQ, MultiLoad, and FastLoad utilities and UNIX shell scripts.
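
The masking approach described above, replacing a real account number with a deterministic surrogate derived from a row number, could be sketched as follows (all database, table, and column names are hypothetical):

```sql
-- Build a masked copy of the table: the real account number is
-- replaced by a surrogate generated from ROW_NUMBER(), so the
-- original value never leaves the database in the outbound file
INSERT INTO sec_db.account_masked (acct_key, acct_no_masked, balance)
SELECT acct_key,
       'ACCT' || TRIM(CAST(ROW_NUMBER() OVER (ORDER BY acct_key)
                           AS VARCHAR(18))),
       balance
FROM   prod_db.account;
```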

Environment: Teradata 12, Teradata 13, PuTTY, ClearCase, UNIX.



  • Analyzed the data flow diagrams for different subject areas and created queries in the Westpac EDW with multiple joins to retrieve the desired data output.
  • Developed BTEQ jobs involving extensive queries and complex procedures to identify potential customers for campaigns from different portfolios.
  • Executed the BTEQ scripts and verified the appropriateness of the customer list for campaigns by data sampling.
  • Created SAS MA (outbound) and SAS RTDM (inbound) test cases for extensive internal testing and SIT.
  • Performance tuning of Teradata SQL statements using Teradata EXPLAIN.
  • Worked with ClearCase and the UNIX vi editor.
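
A BTEQ campaign-extract job of the kind described above could be sketched as follows (the logon string, file path, and selection logic are placeholders):

```sql
.LOGON tdpid/username,password;

/* Export the campaign candidate list to a flat file */
.EXPORT REPORT FILE = /data/outbound/campaign_list.txt;

SELECT c.cust_id, c.segment_cd
FROM   edw.customer c
JOIN   edw.portfolio p
  ON   c.cust_id = p.cust_id
WHERE  p.portfolio_cd = 'HOME_LOAN'
  AND  c.contactable_flag = 'Y';

.EXPORT RESET;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
```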

Environment: Teradata 12, PuTTY, ClearCase, UNIX, BTEQ, FastLoad, MultiLoad, FastExport.



  • Managed database objects like tables, indexes, triggers, and views, and modified stored procedures, triggers, cursors, and views.
  • Made extensive use of joins and stored procedures to simplify complicated queries involving a large number of tables.
  • Rebuilt the indexes at regular intervals for better performance.
  • Used event handlers for exception handling (OnError, OnPostExecute) and SSIS logging, with success/failure email notifications as well as storing the SSIS error log in tables.
  • Configured a Mail Profile for sending automatic mails to the respective people when a job fails or succeeds.
  • Created jobs, SQL Mail, and alerts, and scheduled SSIS packages using SQL Server Agent.
  • Built tables, UPIs, NUPIs, USIs, NUSIs, macros, and stored procedures.
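
The periodic index rebuilds mentioned above can be sketched in T-SQL (the table and index names are hypothetical; such statements are typically run as a SQL Server Agent job step):

```sql
-- Rebuild a fragmented index and refresh its statistics
ALTER INDEX IX_Orders_OrderDate
    ON dbo.Orders
    REBUILD WITH (FILLFACTOR = 90);

UPDATE STATISTICS dbo.Orders IX_Orders_OrderDate;
```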

Environment: MS SQL Server 2008, T-SQL, SQL Server Integration Services (SSIS)
