Teradata Developer Resume
Charlotte, NC
SUMMARY
- Enterprising individual with 8+ years of experience in the IT industry, with extensive experience in Data Warehousing with Teradata (V2R4.x/5.x/6.x/12.0/13.0/14.0/15.10), Oracle PL/SQL, and UNIX environments for DSS.
- Seven years of experience in Teradata database design, implementation, and maintenance, mainly in large-scale Data Warehouse environments.
- Certified Teradata Consultant with experience in Teradata Physical implementation and Database Tuning.
- Extensive experience in administration and maintenance of development, staging, production, and standby databases for DSS and Data Warehousing environments.
- Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
- Solid experience in designing and developing Extraction, Transformation and Loading (ETL) processes using Ab Initio software.
- Experience with different database architectures, including shared-nothing and shared-everything; very good understanding of SMP and MPP architectures.
- Experience in working with various heterogeneous Source Systems like Mainframe OLTP, MySQL, Oracle, SQL Server, DB2, ERP, Flat files and Legacy systems.
- Extensively worked with the BTEQ, FastExport, FastLoad and MultiLoad Teradata utilities to export data to and load data from flat files.
- Expert in Coding Teradata SQL, Teradata Stored Procedures, Macros and Triggers.
- Expertise in query analysis, performance tuning and testing.
- Hands on experience in monitoring and managing varying mixed workload of an active data warehouse using various tools like PMON, Teradata Workload Analyzer and Teradata Visual Explain.
- Extensively worked on query tools like SQL Assistant, TOAD and PL/SQL Developer.
- Good knowledge of logical and physical modeling using Erwin. Hands-on experience in 3NF, Star/Snowflake schema design and de-normalization techniques.
- Proficient in converting logical data models to physical database designs in Data warehousing Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts and Data Analysis.
- Experience in extracting source data from Mainframe OLTP systems by writing several COBOL and JCL scripts.
- Experience in writing UNIX shell and Perl scripts to support and automate the ETL process.
- Experience in Oracle RDBMS Architecture.
- Involved in Unit Testing, Integration Testing and preparing test cases.
- Provided 24/7 on-call production support and resolved database issues.
- Involved in full lifecycle of various projects, including requirement gathering, system designing, application development, enhancement, deployment, maintenance and support.
- Strong problem-solving, analytical, interpersonal and communication skills, with the ability to work both independently and as part of a team.
TECHNICAL SKILLS
Databases: Teradata 12/13/13.10/14/14.10/15.10, Oracle 10g/11g, DB2, MS SQL Server 2000/2005/2008.
DB Tools/Utilities: Teradata SQL Assistant 13.1, BTEQ, Fastload, Multiload, FastExport, TPump, Teradata Visual Explain, Teradata Administrator, SQL Loader, TOAD 8.0.
BI Tools: MicroStrategy reports.
Programming Languages: C, SQL, PL/SQL, UNIX shell and Perl scripting.
ETL Tools: Informatica.
Data Modelling: Logical/Physical/Dimensional, Star/Snowflake, OLAP, ERWIN.
Scheduling Tools: UC4, Autosys, Crontab.
Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX.
PROFESSIONAL EXPERIENCE
Confidential, CHARLOTTE, NC
Teradata Developer
Responsibilities:
- Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Implemented logical and physical data modeling with STAR and SNOWFLAKE techniques using Erwin in Data Mart. Created source-to-target data mappings.
- Extensively used Teradata utilities like BTEQ, FastLoad, MultiLoad and TPump, along with DDL and DML commands (SQL).
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN plans and statistics collection (COLLECT STATISTICS).
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Extensively used Derived Tables, Volatile Tables and Global Temporary Tables in many of the ETL scripts (a combined sketch of these techniques follows this list).
- Developed MultiLoad scripts to load data from Load Ready Files to the Teradata Warehouse (sketched after this list).
- Performed performance tuning of sources, targets, mappings and SQL queries in transformations.
- Created COBOL programs and JCL scripts to extract data from mainframe operational systems, including mainframe DB2 tables.
- Created Primary Indexes (PIs) chosen both for planned access of data and for even distribution of data across all available AMPs; created appropriate Teradata NUPIs for fast, easy access of data (DDL sketched after this list).
- Worked on exporting data to flat files using Teradata FastExport (sketched after this list).
- Applied in-depth expertise in the Teradata cost-based query optimizer to identify potential bottlenecks.
- Worked with PPI (partitioned primary index) Teradata tables and performed Teradata-specific SQL fine-tuning to increase performance of the overall ETL process (PPI DDL sketched after this list).
- Automated the ETL process using UNIX shell scripting (a wrapper sketch follows this list).
- Worked exclusively with Teradata SQL Assistant to interface with the Teradata database.
- Developed a complete ETL framework using PL/SQL to move data from legacy systems.
- Modified or enhanced existing ETL processes to include additional client migration requirements.
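A minimal sketch of the BTEQ pattern described above (volatile work table, statistics refresh, and an EXPLAIN check in one script); every logon, database, table and column name is illustrative, not from an actual engagement:

#!/bin/ksh
# Hypothetical BTEQ step: stage rows in a volatile table, refresh
# optimizer statistics, and review the join plan before the main load.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

CREATE VOLATILE TABLE sales_vt AS (
    SELECT store_id, sale_dt, sale_amt
    FROM   stage_db.sales_stg
) WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

COLLECT STATISTICS ON sales_vt COLUMN (store_id);

EXPLAIN
SELECT v.store_id, d.region_cd, v.sale_amt
FROM   sales_vt v
JOIN   edw_db.store_dim d ON d.store_id = v.store_id;

.LOGOFF;
.QUIT;
EOF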
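A simplified MultiLoad control script of the sort referenced above, applying a pipe-delimited load-ready file to a target table; the file layout and all object names are hypothetical:

#!/bin/ksh
# Hypothetical MultiLoad run: apply a load-ready flat file to the warehouse.
mload <<'EOF'
.LOGTABLE work_db.sales_ml_log;
.LOGON tdprod/etl_user,etl_pwd;

.BEGIN IMPORT MLOAD TABLES edw_db.sales_fct
    WORKTABLES work_db.sales_wt
    ERRORTABLES work_db.sales_et work_db.sales_uv;

.LAYOUT sales_layout;
.FIELD in_store_id * VARCHAR(10);
.FIELD in_sale_dt  * VARCHAR(10);
.FIELD in_sale_amt * VARCHAR(18);

.DML LABEL ins_sales;
INSERT INTO edw_db.sales_fct (store_id, sale_dt, sale_amt)
VALUES (:in_store_id, :in_sale_dt, :in_sale_amt);

.IMPORT INFILE /data/loadready/sales.dat
    FORMAT VARTEXT '|'
    LAYOUT sales_layout
    APPLY ins_sales;

.END MLOAD;
.LOGOFF;
EOF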
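Primary index selection, as in the NUPI bullet above, is a DDL-level decision; a hypothetical example, with a quick per-AMP skew check using Teradata's hash functions:

#!/bin/ksh
# Hypothetical NUPI DDL plus a row-distribution (skew) check.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

/* NUPI on the join column: planned access path over perfect distribution */
CREATE MULTISET TABLE edw_db.sales_fct (
    store_id INTEGER NOT NULL,
    sale_dt  DATE NOT NULL,
    sale_amt DECIMAL(18,2)
) PRIMARY INDEX (store_id);

/* Rows per AMP for the chosen PI; a long tail here means skew */
SELECT HASHAMP(HASHBUCKET(HASHROW(store_id))) AS amp_no,
       COUNT(*) AS row_cnt
FROM   edw_db.sales_fct
GROUP  BY 1
ORDER  BY 2 DESC;

.LOGOFF;
.QUIT;
EOF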
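A bare-bones FastExport job like the flat-file extracts mentioned above; names, file paths and the session count are illustrative:

#!/bin/ksh
# Hypothetical FastExport: unload a delimited extract to a flat file.
fexp <<'EOF'
.LOGTABLE work_db.sales_fx_log;
.LOGON tdprod/etl_user,etl_pwd;

.BEGIN EXPORT SESSIONS 8;
.EXPORT OUTFILE /data/outbound/sales_extract.dat MODE RECORD FORMAT TEXT;

SELECT CAST(store_id AS VARCHAR(10)) || '|' ||
       CAST(sale_dt  AS VARCHAR(10)) || '|' ||
       CAST(sale_amt AS VARCHAR(20))
FROM   edw_db.sales_fct;

.END EXPORT;
.LOGOFF;
EOF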
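The PPI work above comes down to partitioning DDL of this shape; a hypothetical monthly RANGE_N partition on the date column, so that range scans and purges touch only the affected partitions:

#!/bin/ksh
# Hypothetical date-partitioned (PPI) table definition.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

CREATE MULTISET TABLE edw_db.sales_fct_ppi (
    store_id INTEGER NOT NULL,
    sale_dt  DATE NOT NULL,
    sale_amt DECIMAL(18,2)
) PRIMARY INDEX (store_id)
PARTITION BY RANGE_N (
    sale_dt BETWEEN DATE '2015-01-01' AND DATE '2016-12-31'
            EACH INTERVAL '1' MONTH,
    NO RANGE
);

.LOGOFF;
.QUIT;
EOF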
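A minimal sketch of the UNIX automation wrapper pattern mentioned above, assuming a hypothetical BTEQ script name and notification address:

#!/bin/ksh
# Hypothetical wrapper: run a BTEQ step, log it, and alert on failure.
SCRIPT_DIR=/apps/etl/scripts
LOG=/apps/etl/logs/sales_load_$(date +%Y%m%d_%H%M%S).log

bteq < "$SCRIPT_DIR/sales_load.btq" > "$LOG" 2>&1
rc=$?

if [ "$rc" -ne 0 ]; then
    echo "sales_load failed with return code $rc" |
        mailx -s "ETL FAILURE: sales_load" etl-oncall@example.com
    exit "$rc"
fi
echo "sales_load completed, return code 0" >> "$LOG"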
Environment: Teradata 15.10.14, UNIX, PuTTY.
Confidential, Cary, NC
Teradata Developer
Responsibilities:
- Extensively involved in creating detailed design documents, source-to-target mapping documents, test plans and technical design documents for build and implementation per client requirements.
- Wrote clear requirements and bridged the gap between the IT teams and the business teams.
- Responsible for redesign, performance tuning and enhancement of the existing ETL process; also created new ETL processes for reporting requirements.
- Coordinated with users and reporting teams on development efforts and served as the subject matter expert on Data Warehouse and ETL processes.
- Involved in test data creation, test case creation and the QA testing process for data quality and performance.
- Participated in data model (Logical/Physical) discussions with Data Modelers and created both logical and physical data models.
- Investigated and resolved data issues across platforms and applications, including discrepancies of definition, format and function.
- Extensively used Teradata utilities like BTEQ, FastLoad, MultiLoad and TPump, along with DDL and DML commands (SQL). Created various Teradata Macros in SQL Assistant to serve the business analysts (a macro sketch follows this list).
- Heavily involved in writing complex SQL queries based on the given requirements, including complex Teradata Joins, Stored Procedures and Macros.
- Extensively used Teradata SET tables, MULTISET tables, global temporary tables and volatile tables for loading and unloading (SET vs. MULTISET DDL is sketched after this list).
- Worked closely with CA7 schedulers to set up job streams through CA7 to run daily, weekly and monthly process jobs.
- Created UNIX shell wrappers to run the deployed Ab Initio scripts.
- Worked with the EME/sandbox for version control and impact analysis across various Ab Initio projects in the organization.
- Developed and modified UNIX shell scripts as part of the ETL process.
- Monitored data quality; generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
- Worked with pre- and post-session SQL commands to drop and recreate indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter (sketched after this list).
- Provided 24/7 production support for various ETL applications.
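The SQL Assistant macros for business analysts noted above typically wrap a parameterized query; a hypothetical example:

#!/bin/ksh
# Hypothetical Teradata macro deployment for analyst self-service.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

REPLACE MACRO edw_db.daily_sales (in_sale_dt DATE) AS (
    SELECT store_id,
           SUM(sale_amt) AS total_sales
    FROM   edw_db.sales_fct
    WHERE  sale_dt = :in_sale_dt
    GROUP  BY store_id;
);

/* Analysts run it from SQL Assistant as:
   EXEC edw_db.daily_sales (DATE '2016-03-01'); */
.LOGOFF;
.QUIT;
EOF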
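SET versus MULTISET, referenced above, is a one-word DDL choice with load-performance consequences: a SET table rejects full-row duplicates and pays a duplicate-row check, while a MULTISET table skips the check. Hypothetical DDL for both:

#!/bin/ksh
# Hypothetical SET vs. MULTISET table definitions.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

/* SET: duplicate rows silently rejected on INSERT...SELECT */
CREATE SET TABLE edw_db.customer_dim (
    cust_id INTEGER NOT NULL,
    cust_nm VARCHAR(100)
) UNIQUE PRIMARY INDEX (cust_id);

/* MULTISET: duplicates allowed; the usual choice for staging volumes */
CREATE MULTISET TABLE stage_db.customer_stg (
    cust_id INTEGER,
    cust_nm VARCHAR(100)
) PRIMARY INDEX (cust_id);

.LOGOFF;
.QUIT;
EOF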
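The pre-/post-session SQL mentioned above amounts to dropping a secondary index before the bulk load and rebuilding it afterward; a hypothetical pair, shown here in a BTEQ session as it would be tested before being pasted into the session properties:

#!/bin/ksh
# Hypothetical pre-/post-session index maintenance, tested via BTEQ.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

/* Pre-session SQL: drop the NUSI so the load skips index maintenance */
DROP INDEX sales_dt_idx ON edw_db.sales_fct;

/* Post-session SQL: rebuild the NUSI and refresh statistics */
CREATE INDEX sales_dt_idx (sale_dt) ON edw_db.sales_fct;
COLLECT STATISTICS ON edw_db.sales_fct COLUMN (sale_dt);

.LOGOFF;
.QUIT;
EOF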
Environment: Informatica PowerCenter 9.1/8.6, Teradata 14/13.10, UNIX, Oracle 10g, Control-M, fixed-width files, TOAD, SQL Assistant, UNIX shell scripting, Erwin 7.3, Autosys, ETL/ELT.
Confidential, Dallas, TX
Teradata Developer
Responsibilities:
- Extensively involved in creating detailed design documents, source-to-target mapping documents, test plans and technical design documents for build and implementation per client requirements.
- Wrote clear requirements and bridged the gap between the IT teams and the business teams.
- Responsible for redesign, performance tuning and enhancement of the existing ETL process; also created new ETL processes for reporting requirements.
- Coordinated with users and reporting teams on development efforts and served as the subject matter expert on Data Warehouse and ETL processes.
- Involved in test data creation, test case creation and the QA testing process for data quality and performance.
- Participated in data model (Logical/Physical) discussions with Data Modelers and created both logical and physical data models.
- Investigated and resolved data issues across platforms and applications, including discrepancies of definition, format and function.
- Heavily involved in writing complex SQL queries based on the given requirements, including complex Teradata Joins, Stored Procedures and Macros.
- Extensively used Teradata SET tables, MULTISET tables, global temporary tables and volatile tables for loading and unloading.
- Worked closely with CA7 schedulers to set up job streams through CA7 to run daily, weekly and monthly process jobs.
- Created UNIX shell wrappers to run the deployed Ab Initio scripts.
- Worked with the EME/sandbox for version control and impact analysis across various Ab Initio projects in the organization.
- Developed and modified UNIX shell scripts as part of the ETL process.
- Monitored data quality; generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
- Worked with pre- and post-session SQL commands to drop and recreate indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter.
- Provided 24/7 production support for various ETL applications.
Environment: Informatica PowerCenter 9.1/8.6, Teradata 14/13.10, UNIX, Oracle 10g, Control-M, fixed-width files, TOAD, SQL Assistant, UNIX shell scripting, Erwin 7.3, Autosys, ETL/ELT.
Confidential
Teradata/ ETL Developer
Responsibilities:
- Involved in the analysis and implementation of the client's system.
- Analyzed end-user requirements and business rules based on the given documentation, working closely with tech leads and analysts to understand the current system.
- Worked with different Data sources ranging from SAP, MDB, Teradata, flat files, XML, Oracle, and SQL server databases.
- Created ERWIN data dictionaries and logical models for the data warehouse implementation
- Involved heavily in writing complex SQL queries based on the given requirements.
- Created data feeds using Teradata FastExport and FTPed the data files to the Oracle box.
- Involved in developing MultiLoad, FastLoad and BTEQ scripts.
- Created and automated the freight and shrink loading process using shell scripts, MultiLoad, Teradata volatile tables and complex SQL statements.
- Used FastLoad for loading into empty tables (a FastLoad sketch follows this list).
- Used volatile tables and derived queries to break up complex queries into simpler ones.
- Performed performance tuning, monitoring and index selection using PMON, Teradata Dashboard, Statistics Wizard, Index Wizard and Teradata Visual Explain, viewing the flow of SQL queries as icons to make join plans faster and more effective.
- Created a cleanup process to remove all the intermediate temp files used prior to the loading process (sketched after this list).
- Interacted regularly with DBAs; performed table rebuilds to back up the data from the final tables.
- Involved in test data creation, test case creation and the QA testing process for data quality and performance.
- Streamlined the Teradata script and shell script migration process on the UNIX box using Autosys.
- Created a shell script that checks the data file for corruption prior to the load (sketched after this list).
- Created UNIX shell scripts/wrapper scripts used for scheduling jobs.
- Involved in troubleshooting the production issues and providing production support.
- Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
- Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings (an SCD Type 2 sketch follows this list).
- Created mapping documents from Source to Stage and Stage to Target.
- Designed mappings to include restart logic.
- Worked with systems analysts to understand source system data to develop accurate ETL programs.
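A stripped-down FastLoad script like the empty-table loads above; FastLoad requires an empty target, hence the drop/recreate, and all names and the file layout are hypothetical:

#!/bin/ksh
# Hypothetical FastLoad of a pipe-delimited file into an empty stage table.
fastload <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

DROP TABLE stage_db.sales_stg;
CREATE MULTISET TABLE stage_db.sales_stg (
    store_id INTEGER,
    sale_dt  DATE FORMAT 'YYYY-MM-DD',
    sale_amt DECIMAL(18,2)
) PRIMARY INDEX (store_id);

SET RECORD VARTEXT "|";
DEFINE in_store_id (VARCHAR(10)),
       in_sale_dt  (VARCHAR(10)),
       in_sale_amt (VARCHAR(18))
FILE = /data/inbound/sales.dat;

BEGIN LOADING stage_db.sales_stg
    ERRORFILES stage_db.sales_e1, stage_db.sales_e2;

INSERT INTO stage_db.sales_stg
VALUES (:in_store_id, :in_sale_dt, :in_sale_amt);

END LOADING;
.LOGOFF;
EOF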
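A sketch of the intermediate-file cleanup step described above; the directory and file pattern are illustrative:

#!/bin/ksh
# Hypothetical cleanup: purge intermediate temp files before the next cycle.
TMP_DIR=/data/etl/tmp
find "$TMP_DIR" -type f -name '*.tmp' -mtime +0 -exec rm -f {} \;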
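A minimal version of the pre-load file check mentioned above, assuming a pipe-delimited feed with a fixed field count:

#!/bin/ksh
# Hypothetical pre-load validation: feed must exist, be non-empty,
# and carry the expected field count on every record.
FEED=/data/inbound/sales.dat
EXPECTED_FIELDS=3

[ -s "$FEED" ] || { echo "missing or empty feed: $FEED"; exit 1; }

bad=$(awk -F'|' -v n="$EXPECTED_FIELDS" 'NF != n' "$FEED" | wc -l)
if [ "$bad" -gt 0 ]; then
    echo "$bad malformed records in $FEED - aborting load"
    exit 1
fi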
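The SCD Type 2 mappings above implement an expire-then-insert pattern; a hypothetical SQL rendering of that logic (the Informatica mapping expresses the same steps through Lookup and Update Strategy transformations, and all names here are illustrative):

#!/bin/ksh
# Hypothetical SCD Type 2 load: expire changed rows, insert new versions.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

/* Expire the current row when a tracked attribute changed */
UPDATE edw_db.customer_dim
SET    eff_end_dt = CURRENT_DATE - 1,
       curr_flag  = 'N'
WHERE  curr_flag = 'Y'
AND    cust_id IN (
           SELECT s.cust_id
           FROM   stage_db.customer_stg s
           JOIN   edw_db.customer_dim d
             ON   d.cust_id = s.cust_id AND d.curr_flag = 'Y'
           WHERE  COALESCE(s.cust_nm,'') <> COALESCE(d.cust_nm,''));

/* Insert a new current version for new and changed customers */
INSERT INTO edw_db.customer_dim
    (cust_id, cust_nm, eff_start_dt, eff_end_dt, curr_flag)
SELECT s.cust_id, s.cust_nm, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stage_db.customer_stg s
LEFT JOIN edw_db.customer_dim d
  ON   d.cust_id = s.cust_id AND d.curr_flag = 'Y'
WHERE  d.cust_id IS NULL;

.LOGOFF;
.QUIT;
EOF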
Environment: Informatica PowerCenter 9.5.1, Informatica Developer 9.5.1, UNIX, Teradata 13.10/14, fixed-width files, TOAD, Harvest (SCM), Windows XP and MS Office Suite.