
Teradata Developer Resume


SUMMARY

  • 4+ years of IT experience focused on design, development, and documentation in Teradata, data warehousing environments, and application development.
  • Strong knowledge of Teradata architecture and good knowledge of data warehouse concepts and principles.
  • Designed, developed, tested and supported Extract, Transform and Load (ETL) processes necessary to load and validate data warehouse.
  • Experience on ETL tool Informatica Power Center on OLAP and OLTP environment for Health care clients.
  • Extensively used Teradata application utilities like BTEQ, MLOAD, FLOAD, TPUMP, and FASTEXPORT.
  • Good Understanding of Data warehousing concepts and Relational Database.
  • Strong knowledge of relational database concepts, entity-relationship diagrams, and normalization and de-normalization.
  • Worked extensively with slowly changing dimensions; experienced with both 3NF and dimensional models for data warehouses, with a good understanding of OLAP and OLTP systems.
  • Good knowledge of data warehousing concepts such as star and snowflake schemas, data marts, and the Kimball methodology used in relational, dimensional, and multidimensional data modelling.
  • Worked extensively on Teradata Query Submitting and processing tools like BTEQ and Teradata SQL Assistant.
  • Expertise in Developing scripts for loading the source data into the target tables using Teradata Utilities.
  • Identifying long running queries, scripts, Spool space issues and implementing appropriate tuning methods. Involved in building tables, views and Indexes.
  • Good understanding of ANSI SQL and Teradata features such as join types, statistics, EXPLAIN, and PPI.
  • Experienced in performance tuning, PPI concepts, join indexes, secondary indexes, compression techniques and strategies, query optimization, and Teradata SQL.
  • Strong experience in Creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes in Teradata.
  • Strong experience in Teradata development and indexes (PI, SI, partitioning, join indexes).
  • Experience in writing complex SQL to implement business rules, extract and analyze data
  • Worked in a remediation (performance tuning) team improving the performance of user queries and production SQL.
  • Expertise in database programming like writing Stored Procedures (SQL), Functions, Views in Teradata.
  • Advanced SQL skills, including derived tables, unions, and multi-table inner/outer joins.
  • Highly motivated with the ability to work effectively in teams as well as independently.
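As a sketch of the index and compression techniques listed above, a hypothetical Teradata table might be defined with a primary index for even AMP distribution, a partitioned primary index (PPI) for date pruning, and multi-value compression (all object and column names here are illustrative, not from any actual project):

```sql
-- Hypothetical fact table: PI drives hash distribution across AMPs,
-- RANGE_N partitioning prunes by month, COMPRESS stores common
-- status codes once per value instead of once per row.
CREATE MULTISET TABLE edw.sales_fact
(
    sale_id    INTEGER NOT NULL,
    store_id   INTEGER NOT NULL,
    sale_date  DATE NOT NULL,
    status_cd  CHAR(1) COMPRESS ('A', 'C', 'P'),  -- multi-value compression
    sale_amt   DECIMAL(12,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N (sale_date BETWEEN DATE '2015-01-01'
                      AND DATE '2016-12-31' EACH INTERVAL '1' MONTH);

-- Collected statistics help the optimizer estimate partition elimination.
COLLECT STATISTICS ON edw.sales_fact COLUMN (sale_date);
```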

TECHNICAL SKILLS

Programming Languages: C, SQL, PL-SQL

Databases: Teradata, Oracle

Teradata Load and Unload Utilities: FLOAD, FASTEXPORT, MLOAD, TPUMP, TPT

Database Administration Tools: Index Wizard, Query scheduler, Workload analyzer, Viewpoint

Query management tools: BTEQ, SQL assistant

ETL Tools: Informatica

Administrator tools: Teradata Manager, Teradata Administrator

Scheduler: Tivoli

Operating Systems: UNIX, LINUX and Windows

PROFESSIONAL EXPERIENCE

Confidential

Teradata Developer

Responsibilities:

  • Analyzed the low-level design documents based on the business process document.
  • Worked extensively on data extraction, transformation, and loading from source to target systems using Teradata utilities such as MultiLoad and FastLoad.
  • Loaded data from text files into data warehouse landing zone tables using Teradata utilities.
  • As an initial load, the file is MLoad-ed into a Teradata staging table and then archived to an archive folder once the MLoad completes successfully.
  • The actual load process consists mostly of Teradata stored procedures and is scheduled to run daily through a third-party scheduling tool.
  • Writing BTEQ Scripts to load data from landing zone to Work tables in staging area based on the ETL mapping documentation.
  • Involved in Create / Modify / Drop Teradata objects like Tables, Views, Indexes, Procedures, and Databases.
  • Performance tuning of BTEQ scripts.
  • Created proper PIs taking into consideration both planned access paths and even distribution of data across all available AMPs.
  • Extensively used the Teradata utilities like BTEQ, Fastload, Multiload, DDL Commands and DML Commands (SQL).
  • Writing complex SQL queries to perform the analysis on a requirement.
  • Strong knowledge to use Primary, Secondary, PPI, and Join Indexes.
  • Designed complex UNIX scripts and automated them to run the workflows daily, weekly, and monthly.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • Worked with Teradata utilities like BTEQ, FastLoad, and MultiLoad.
  • Worked on Teradata SQL Assistant querying the target tables to validate the BTEQ scripts.
  • Worked with collect statistics and join indexes.
  • Created SQL and BTEQ scripts in Teradata to load the data into the EDW as per the business logic.
  • Involved in unit testing, systems integration and user acceptance testing.
  • Participated in knowledge transfer sessions to Production support team on business rules, Teradata objects and on scheduling jobs.
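A minimal BTEQ script of the kind described above (landing zone to staging work table) might look like this; the logon string, database, and table names are placeholders, not actual project objects:

```sql
.LOGON tdprod/etl_user,password;

-- Move the current day's rows from the landing zone table
-- into the staging work table per the ETL mapping.
INSERT INTO stg_db.customer_work
SELECT cust_id, cust_name, load_dt
FROM   lz_db.customer_landing
WHERE  load_dt = CURRENT_DATE;

-- Abort with a non-zero return code so the scheduler flags the failure.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```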

Environment: Teradata 15.0, SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, Oracle 10g R2, Tivoli, Linux

Confidential

Teradata/ETL developer

Responsibilities:

  • Analyzing the Business requirements and System specifications to understand the Application.
  • Designed Informatica mappings to propagate data from various legacy source systems to Teradata.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • The interfaces were staged in Teradata before loading to the Data warehouse.
  • Writing BTEQ Scripts to load data from landing zone to Work tables in staging area based on the ETL mapping documentation
  • Developed complex SQL and BTEQ scripts for processing the data as per coding standards.
  • Performed Data transformations using various Informatica Transformations like Union, Joiner, Expression, Lookup, Aggregate, Filter, Router, Normalizer, Update Strategy etc.
  • Wrote transformations for data conversions into required form based on the client requirement using Teradata ETL processes.
  • Developed Mload scripts and shell scripts to move data from source systems to staging and from staging to Data warehouse in batch processing mode.
  • Creating, loading and materializing views to extend the usability of data.
  • Worked with Teradata utilities like BTEQ, FastLoad, and MultiLoad.
  • Involved in SQL scripts, stored procedures in Teradata to implement business rules.
  • Performed tuning for highly skewed queries.
  • Handled spool space requests from end users.
  • Created join indexes per architectural strategies and standards, and created PPIs on large tables to improve query performance.
  • Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for several base tables, as a consistency check.
  • Making modifications as required for reporting process by understanding the existing data model and involved in retrieving data from relational databases.
  • Managing queries by creating, deleting, modifying, and viewing, enabling and disabling rules.
  • Loading the data into the warehouse from different flat files.
  • Performed database testing by writing and executing SQL queries to ensure that data has been loaded correctly into the database.
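The join-index and statistics tuning described above could be sketched as follows (object names are illustrative; an aggregate join index like this lets the optimizer answer common summary queries without touching the base table):

```sql
-- Aggregate join index pre-computing a frequent rollup; its own PI on
-- store_id avoids row redistribution when joining on that column.
CREATE JOIN INDEX edw.ji_sales_by_store AS
SELECT store_id, sale_date, SUM(sale_amt) AS total_amt
FROM   edw.sales_fact
GROUP BY store_id, sale_date
PRIMARY INDEX (store_id);

-- Fresh statistics on the join column reduce skewed, spool-heavy plans.
COLLECT STATISTICS ON edw.sales_fact COLUMN (store_id);
```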

Environment: Teradata 15, Informatica PowerCenter 9.1, Oracle 10g, BTEQ, MLoad, PuTTY, Windows 7.

Confidential

Teradata developer

Responsibilities:

  • As an initial load, the file is MLoad-ed into a Teradata staging table and then archived to an archive folder once the MLoad completes successfully.
  • The actual load process consists mostly of Teradata stored procedures and is scheduled to run daily through a third-party scheduling tool.
  • After completion of the load process, a validation process runs to validate the data loaded into target tables against the source.
  • Developed FastLoad scripts to load data from host files into landing zone tables.
  • Involved in writing the ETL specifications and unit test plans for the mappings.
  • Applied the business transformations using BTEQ scripts.
  • Created database tables, views, functions, procedures, and packages, as well as database sequences, triggers, and database links.
  • Wrote SQL queries for retrieving the required data from the database.
  • Tested and Debugged PL/SQL packages.
  • Created BTEQ scripts to load the data from staging tables into target tables.
  • Used standard packages such as UTL_FILE and DBMS_SQL, used PL/SQL collections with BULK binding, and wrote database procedures, functions, and packages.
  • Provided required support in Multiple (SIT, UAT & PROD) stages of the project.
  • Prepared BTEQ import and export scripts for tables.
  • Wrote BTEQ, FastLoad, and MLoad scripts.
  • Involved in unit testing and prepared test cases.
  • Validated the target data against the source data.
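A FastLoad script for the host-file-to-landing-zone load described above might take this shape; the logon string, file path, and table names are placeholders. FastLoad requires an empty target table and two error tables for rejected rows:

```sql
SESSIONS 4;
LOGON tdprod/etl_user,password;

-- Drop leftover error tables from a prior run (ignore if absent).
DROP TABLE lz_db.customer_landing_err1;
DROP TABLE lz_db.customer_landing_err2;

BEGIN LOADING lz_db.customer_landing
      ERRORFILES lz_db.customer_landing_err1, lz_db.customer_landing_err2;

-- Pipe-delimited host file; every input field arrives as VARCHAR.
SET RECORD VARTEXT '|';
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(100)),
       load_dt   (VARCHAR(10))
FILE = /data/inbound/customer.dat;

INSERT INTO lz_db.customer_landing
VALUES (:cust_id, :cust_name, :load_dt);

END LOADING;
LOGOFF;
```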

Environment: Teradata 13.10/14, TPT, BTEQ, Teradata SQL Assistant, FastLoad, MLoad, UNIX, Windows 2000.
