
Teradata/ETL Lead Resume, OR



SUMMARY

  • 9 years of total IT experience and technical proficiency in Data Warehousing, spanning Business Requirements Analysis, Application Design, Data Modeling, Development, Testing and Documentation.
  • 8 years of experience in Teradata database design, implementation and maintenance, mainly in large-scale Data Warehouse environments; hands-on experience with the Teradata utilities FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter, BTEQ and Teradata SQL Assistant.
  • Involved in full lifecycle of various projects, including requirement gathering, system designing, application development, enhancement, deployment, maintenance and support.
  • Certified Teradata consultant with experience in Teradata Physical implementation and Database Tuning, technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
  • Technical expertise in ETL methodologies using Informatica 6.0/6.1/6.2/7.1/8.6 (PowerCenter, PowerMart): client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager) and PowerExchange.
  • Analyzed, Designed and documented requirements for data migration projects between numerous source Legacy/Oracle application feeds and the new Teradata platform.
  • Extensive database experience and strong SQL skills across Oracle, MS SQL Server, Teradata, Sybase and MS Access, as well as mainframe and flat-file sources.
  • Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema (Fact Tables, Dimension Tables) used in relational, dimensional and multidimensional modeling.
  • Expert in coding Teradata SQL, Teradata stored procedures, macros and triggers (see the BTEQ sketch at the end of this summary).
  • Expertise in Query Analyzing, performance tuning and testing.
  • Experience in writing UNIX Korn shell scripts to support and automate the ETL process.
  • Experience with different database architectures, including Shared Nothing and Shared Everything architectures.
  • Very good understanding of SMP and MPP architectures.
  • Experience in Tableau software.
  • Extensive knowledge of identifying user requirements, system design, writing program specifications, and coding and implementing systems.
  • Excellent communication and interpersonal skills; proactive, dedicated, and enjoy learning new technologies and tools.
  • Strong commitment to quality, with experience in ensuring compliance with coding standards and the review process.
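
For illustration, a minimal BTEQ sketch of the kind of scripted Teradata SQL referenced above; the logon string, database and table names are hypothetical, and credentials would normally be supplied by the wrapping Korn shell script rather than hard-coded:

    .LOGON tdpid/etl_user,etl_password;

    /* Staging-to-target insert; edw_db and the table names are placeholders. */
    INSERT INTO edw_db.orders
    SELECT  *
    FROM    edw_db.stg_orders
    WHERE   load_dt = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;

The .IF test returns a non-zero code to the calling shell script when the insert fails, which is how the surrounding load automation detects errors.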

TECHNICAL SKILLS

Operating Systems: MS-DOS, MS Windows 9x/NT/2000/XP, Sun Solaris 2.5/2.6/8/9/10

ETL Tools: Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Informatica PowerCenter 6.0/6.1/6.2/7.1/8.1/8.6, PowerExchange 5.x

Languages: SQL, PL/SQL, HTML, DHTML, Visual Basic, UNIX Shell Scripting

Applications: MS Word, Excel, Outlook, FrontPage, PowerPoint, MS-Visio

Databases: Oracle 8i/9i/10g, Teradata, MS SQL Server, Sybase, Flat Files, IBM Mainframe Systems

Data Modeling: Erwin 4.x

BI Tools: Cognos, MicroStrategy & Tableau

Other Tools: TOAD 7.x, SQL Assistant 7.x, CVS

PROFESSIONAL EXPERIENCE

Confidential, OR

Teradata/ETL Lead

Responsibilities:

  • Involved in requirements gathering, business analysis, design, development, testing and implementation of business rules.
  • Implemented logical and physical data modeling with Star and Snowflake schema techniques using ERwin for the Data Mart.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Responsible for collecting statistics (COLLECT STATISTICS) on FACT tables.
  • Created appropriate Primary Indexes, balancing the planned access paths with even distribution of data across all available AMPs (see the DDL sketch after this list).
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Created tables, views in Teradata, according to the requirements.
  • Provided architecture/development for initial load programs to migrate production databases from Oracle data marts to Teradata data warehouse, as well as ETL framework to supply continuous engineering and manufacturing updates to the data warehouse (Oracle, Teradata, MQ Series, ODBC, HTTP, and HTML).
  • Performed the ongoing delivery, migrating client mini-data warehouses or functional data-marts from Oracle environment to Teradata.
  • Migrated/converted data from Oracle to the Teradata data warehouse using Oracle Data Pump/OWB, OLE DB and DTS.
  • Worked on advanced Informatica concepts, including implementation of Informatica Pushdown Optimization and pipeline partitioning.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, MultiLoad and FastLoad.
  • Used various transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, Update Strategy, Expression, Sorter, Normalizer, Stored Procedure and Union.
  • Used Informatica PowerExchange to handle change data capture (CDC) data from the source and load it into the Data Mart following the Slowly Changing Dimension (SCD) Type II process.
  • Used PowerCenter Workflow Manager to create workflows and sessions, and used tasks such as Command, Event Wait, Event Raise and Email.
  • Designed, created and tuned physical database objects (tables, views, indexes, PPI, UPI, NUPI and USI) to support normalized and dimensional models, and maintained the referential integrity of the database.
  • Created a cleanup process for removing all the Intermediate temp files that were used prior to the loading process.
  • Used volatile tables and derived tables to break complex queries into simpler steps.
  • Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
  • Developed UNIX scripts to automate different tasks involved as part of loading process.
  • Worked on Tableau software for the reporting needs.
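
A minimal sketch, with hypothetical database, table and column names, of the primary index and COLLECT STATISTICS work described above: the NUPI goes on the column used for planned access that also distributes rows evenly across the AMPs, and statistics are collected on the index, partitioning and join columns so the optimizer (verified via EXPLAIN) builds accurate plans.

    CREATE MULTISET TABLE edw_db.sales_fact
    (
        acct_id     INTEGER NOT NULL,
        txn_dt      DATE FORMAT 'YYYY-MM-DD' NOT NULL,
        product_id  INTEGER NOT NULL,
        sale_amt    DECIMAL(18,2)
    )
    PRIMARY INDEX ( acct_id )
    PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2010-01-01'
                                 AND     DATE '2012-12-31'
                                 EACH INTERVAL '1' MONTH);

    COLLECT STATISTICS ON edw_db.sales_fact COLUMN (acct_id);
    COLLECT STATISTICS ON edw_db.sales_fact COLUMN (txn_dt);
    COLLECT STATISTICS ON edw_db.sales_fact COLUMN (product_id);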

Environment: Teradata V2R12 & 13, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Informatica 8.1, Cognos, Tableau, UNIX, Korn Shell scripts.

Confidential, Cincinnati, OH

Sr. Teradata Developer

Responsibilities:

  • Responsible for gathering requirements from business and functional analysts, formatting them according to business needs, and identifying the tables required for the data model.
  • Created Star Schema model with required facts and dimensions.
  • Defined various facts and dimensions in the data mart, including factless facts, aggregate facts and summary facts.
  • Converted the data mart from logical to physical design; defined data types, constraints and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for the database objects.
  • Created and maintained Teradata tables, views, macros, triggers and stored procedures.
  • Coded using Teradata analytical functions and Teradata BTEQ SQL; wrote UNIX scripts to validate, format and execute the SQL in the UNIX environment.
  • Educated the team on performance tuning of complex queries, covering the multi-table concept, Join Indexes, PPI (partitioned primary indexes), the Secondary Index mechanism, backup and recovery, day-to-day requests such as altering a table, capacity planning, general SQL tuning and skew factor.
  • Reduced Teradata space usage by optimizing tables: added compression where appropriate and ensured optimum column definitions (see the compression sketch after this list).
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport and BTEQ (Teradata) and SQL*Plus and SQL*Loader (Oracle).
  • Handled workload management using Priority Scheduler & Teradata Dynamic Query Manager and Teradata Manager.
  • Worked on complex queries to map the data as per the requirements.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad and TPump utilities for user acceptance testing and for loading history data into Teradata.
  • Worked on Teradata Parallel Transporter (TPT) coding.
  • Created UNIX Scripts for triggering the Stored Procedures and Macro.
  • Involved in Performance tuning for the long running queries.
  • Preparing Test Cases and performing Unit Testing.
  • Involved in unit testing & integration testing.
  • Worked at the client site for implementation and provided 24x7 support for the project until it went live.
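
A minimal sketch, with hypothetical names, of the space-optimization work described above: multi-value compression on low-cardinality columns, plus tightened column definitions, reduces perm space without changing query results.

    -- Frequent values are compressed out of the row body, saving space on large tables.
    CREATE MULTISET TABLE edw_db.customer_dim
    (
        customer_id  INTEGER      NOT NULL,
        customer_nm  VARCHAR(100) NOT NULL,
        state_cd     CHAR(2)      COMPRESS ('OH', 'KY', 'IN', 'CA'),
        status_cd    CHAR(1)      COMPRESS ('A', 'I', 'P'),
        segment_cd   SMALLINT     COMPRESS (0, 1, 2, 3)
    )
    PRIMARY INDEX ( customer_id );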

Environment: Teradata V12, Teradata SQL Assistant, BTEQ, FLoad, FExport, MLoad, TPT, TPump, Erwin 4.1.4, Informatica 8.1, Business Objects XI R2, Quest Toad 9.3, UNIX Shell Scripting, SQL*Loader, Smart Putty, SQL Server, Windows XP, UNIX.

Confidential, Atlanta, GA

Sr. Teradata/Informatica Developer

Responsibilities:

  • Designed and developed the ETL process and created UNIX shell scripts to execute Teradata SQL and BTEQ jobs.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad and TPump.
  • Assisted the DBA in creating tables and views in the Teradata and Oracle databases.
  • Dealt with both incremental and migration data loads into Teradata.
  • Enhanced some queries in the other application to run faster and more efficiently.
  • Created data files by using FastExport and have developed a common FTP script to port them on to the client’s server.
  • Used various Teradata Index techniques to improve the query performance.
  • Developed the shell scripts to automate the Call Detail Records - Informatica processes and subsequent concatenation of load ready files.
  • Worked on different subject areas like Product, Billing, Subscription, and Party.
  • Worked on the National Billing Instance (NBI) module, which involved modifying a number of mappings and the corresponding FastLoad and BTEQ scripts.
  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Informatica Workflow Manager.
  • Developed Source to Target Mappings using Informatica PowerCenter Designer from Oracle, Flat files sources to Teradata database, implementing the business rules.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Modified BTEQ scripts to load data from Teradata Staging area to Teradata data mart.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes and completed unit and system test.
  • Created a series of macros for various applications in Teradata SQL Assistant (see the macro sketch after this list).
  • Responsible for loading millions of records into the warehouse from different sources using MultiLoad and FastLoad.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Created several custom tables, views and macros to support reporting and analytic requirements.
  • Supported the MicroStrategy reporting needs.
  • Passed requirements to the offshore team and coordinated with them.
  • Performed Unit testing, System Testing and Integration testing to validate the data being staged.
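
A minimal sketch, with hypothetical object and column names, of the kind of reporting macro created in Teradata SQL Assistant to support reporting needs; the parameter is referenced with a colon inside the macro body.

    -- Parameterized monthly billing summary (hypothetical objects).
    REPLACE MACRO bill_db.monthly_billing_summary (in_bill_month DATE)
    AS
    (
        SELECT  b.account_id
              , COUNT(*)        AS invoice_cnt
              , SUM(b.bill_amt) AS total_billed
        FROM    bill_db.billing_fact b
        WHERE   b.bill_month = :in_bill_month
        GROUP BY b.account_id;
    );

    -- Invocation:
    EXEC bill_db.monthly_billing_summary (DATE '2006-06-01');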

Environment: Teradata V2R5, Teradata SQL Assistant, Teradata Manager, Teradata Administrator, Oracle 8i, Informatica Power Center 7.1, MicroStrategy 8, MS-Access, MS-Excel, TOAD, SQL, UNIX and Windows NT, CVS.

Confidential, Walnut Creek, CA

Teradata Developer

Responsibilities:

  • Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Created a BTEQ script for pre-population of the work tables prior to the main load process.
  • Used volatile tables and derived tables to break complex queries into simpler steps (see the sketch after this list).
  • Involved in loading of data into Teradata from legacy systems and flat files using complex MultiLoad scripts and FastLoad scripts.
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Heavily involved in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
  • Created a shell script that checks the corruption of data file prior to the load.
  • Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad) and worked with loader logs.
  • Created and automated the load process using shell scripts, MultiLoad, Teradata volatile tables and complex SQL statements.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Involved in troubleshooting the production issues and providing production support.
  • Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
  • Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
  • Created a cleanup process for removing all the Intermediate temp files that were used prior to the loading process.
  • Developed unit test plans and involved in system testing.
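
A minimal sketch, with hypothetical names, of the volatile-table technique mentioned above for breaking a complex query into simpler steps; the intermediate rows live only for the session and reduce spool on the final join.

    -- Step 1: materialize the small driving set in a volatile table.
    CREATE VOLATILE TABLE vt_active_accounts AS
    (
        SELECT  a.account_id
              , a.region_cd
        FROM    edw_db.account_dim a
        WHERE   a.status_cd = 'A'
    )
    WITH DATA
    PRIMARY INDEX ( account_id )
    ON COMMIT PRESERVE ROWS;

    -- Step 2: the final aggregation joins against the much smaller volatile table.
    SELECT  v.region_cd
          , SUM(f.sale_amt) AS region_sales
    FROM    edw_db.sales_fact f
    JOIN    vt_active_accounts v
      ON    f.account_id = v.account_id
    GROUP BY v.region_cd;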

Environment: Teradata V2R5, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, ERWin Designer, Quality Center, UNIX, Windows 2000, Shell scripts.
