
Teradata Application Developer Resume


Irwindale, CA

SUMMARY

  • Over 7 years of IT consulting experience in analysis, design, coding, development, testing, and maintenance of data warehouse systems. Hands-on experience includes developing Data Warehouses/Data Marts/ODS in the Telecom, Banking, Finance, and Insurance domains.
  • Strong experience with the Teradata database in data warehousing environments.
  • Expertise in using Teradata SQL Assistant and data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPump in UNIX/mainframe environments.
  • Experience in Data Extraction/Transformation, (Dimensional) Data Modeling, Reporting, Data Migration, Data Quality, Retail Analytics, and Data Analysis.
  • Proficient in Teradata V2R6.2/12/13.10 database design (conceptual and physical), query optimization, and performance tuning.
  • Strong technical and analytical skills, with a clear understanding of the design goals of ER modeling for OLTP and of star and/or snowflake schemas for OLAP and multidimensional cubes.
  • SQL/database developer experience writing efficient SQL queries and PL/SQL scripts, fine-tuning queries, and writing numerous SQL queries for ad-hoc reporting.
  • Familiarity with Teradata's MPP architecture: shared-nothing design, nodes, AMPs, BYNET, partitioning, primary indexes, secondary indexes, Teradata Explain, etc.
  • Extensively used Oracle SQL*Loader, SQL*Plus, and TOAD. Good understanding of Oracle hash, B-tree, and bitmap indexes.
  • Strong experience with data analysis, data modeling, extraction, loading, creation of tables, views, and Universes, query optimization, and performance tuning.
  • Worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.
  • Proficient in implementing complex business rules by creating reusable transformations, workflows/worklets, and mappings/mapplets.
  • Hands-on experience in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, transformations, and sessions.
  • Developed exception-handling mappings for data quality, data cleansing, and data validation.
  • Developed slowly changing dimension mappings of Type 1, Type 2, and Type 3 (version, flag, and timestamp variants); a Type 2 sketch appears after this list.
  • Experience in developing reusable components and partition sessions across the projects.
  • Experience in developing incremental aggregation mappings to update the values in flat tables.
  • Sound Knowledge of Data Warehouse/Data Mart, Data Modeling Techniques. Very good understanding of Dimensional Modeling.
  • Excellent communication and interpersonal skills, with a strong ability to work as part of a team as well as handle independent responsibilities.
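
For illustration, a minimal Type 2 slowly changing dimension pattern in Teradata SQL. This is a sketch only; the table and column names (customer_dim, customer_stg, current_flag, etc.) are hypothetical, not taken from any project described here:

    /* Expire the current version of rows whose tracked attributes changed */
    UPDATE customer_dim
    SET end_dt = CURRENT_DATE, current_flag = 'N'
    WHERE current_flag = 'Y'
      AND cust_id IN (
          SELECT s.cust_id
          FROM customer_stg s, customer_dim d
          WHERE s.cust_id = d.cust_id
            AND d.current_flag = 'Y'
            AND s.cust_name <> d.cust_name);

    /* Insert a new current version for changed and brand-new customers */
    INSERT INTO customer_dim (cust_id, cust_name, start_dt, end_dt, current_flag)
    SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM customer_stg s
    WHERE NOT EXISTS (
        SELECT 1 FROM customer_dim d
        WHERE d.cust_id = s.cust_id AND d.current_flag = 'Y');

Type 1 simply overwrites the attribute in place; Type 3 keeps the prior value in a dedicated "previous value" column.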

TECHNICAL SKILLS

Hardware: NCR UNIX Servers 4400/5100/5500, NCR Windows 2000 Servers 4850/4470

Operating Systems: UNIX, Windows XP, Windows 2000, Windows NT 4.0, z/OS, MVS (OS/390), HP-UX, AIX 5.0/5.2/5.3.

Databases & DWH: Teradata V2R6.2/V12/V13.10, Oracle 8i/9i, SQL Server 2000, Informatica PowerCenter 9.1, Dimensional Modeling, ERwin 7.2.

Teradata Tools & Utilities: Query facilities: SQL Assistant, BTEQ; Load & export: FastLoad, MultiLoad, FastExport; Other: SQL*Plus

Data Modeling: Erwin

Languages: SAS 9/8, C, COBOL, JCL, SQL, PL/SQL

PROFESSIONAL EXPERIENCE

Confidential, Irwindale, CA

Teradata Application Developer

Responsibilities:

  • Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
  • Developed scripts for loading data into the base tables in the EDW using the Teradata FastLoad, MultiLoad, and BTEQ utilities; a FastLoad sketch appears after this list.
  • Collected multi-column statistics on all the non-indexed columns used during join operations and on all columns used in residual conditions (see the statistics/Explain sketch after this list).
  • Extensively used derived tables, volatile tables, and global temporary (GTT) tables in many of the ETL scripts.
  • Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Used SQL to query the databases, doing as much crunching as possible inside Teradata and applying query optimization (Explain plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
  • Performed space management for perm and spool space.
  • Reviewed the SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Dealt with initial, delta, and incremental data as well as migration data loads into Teradata.
  • Analyzed data and implemented multi-value compression for optimal space usage; a compression sketch appears after this list.
  • Performed query analysis using Explain, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
  • Performed performance tuning, monitoring, and index selection using the Statistics Wizard, Index Wizard, and Teradata Visual Explain to view the flow of SQL queries as icons and make join plans more effective and fast.
  • Very good understanding of database skew, PPI, join methods and join strategies, and join indexes, including sparse, aggregate, and hash join indexes.
  • Loaded flat files into the database using FastLoad and then used them in queries to perform joins.
  • Used PMON and Teradata Manager to monitor the production system during the online day.
  • Excellent experience in performance tuning and query optimization of Teradata SQL.
  • Worked closely with the end users in writing the functional specifications based on the business needs.
  • Developed, documented, and executed unit test plans for the components.
  • Extensively used performance tuning techniques to minimize run time and created pre-session caches with load balanced on the server.
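
As a hedged illustration of the FastLoad scripting mentioned above, a minimal script loading a pipe-delimited flat file into an empty staging table (the logon placeholder, file path, and all object names are hypothetical):

    LOGON tdpid/username,password;
    DATABASE edw_stg;

    /* FastLoad requires an empty target table and two error tables */
    BEGIN LOADING stg_orders
        ERRORFILES stg_orders_err1, stg_orders_err2
        CHECKPOINT 100000;

    SET RECORD VARTEXT "|";
    DEFINE
        in_order_id  (VARCHAR(18)),
        in_cust_id   (VARCHAR(18)),
        in_order_amt (VARCHAR(20))
    FILE = /data/in/orders.dat;

    INSERT INTO stg_orders (order_id, cust_id, order_amt)
    VALUES (:in_order_id, :in_cust_id, :in_order_amt);

    END LOADING;
    LOGOFF;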
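
Likewise, a sketch of the volatile-table, multi-column-statistics, and Explain techniques from the bullets above, again with hypothetical object names:

    /* Session-scoped volatile table for intermediate results */
    CREATE VOLATILE TABLE vt_active_custs AS (
        SELECT cust_id, region_cd
        FROM edw.customer
        WHERE status_cd = 'A'
    ) WITH DATA
    PRIMARY INDEX (cust_id)
    ON COMMIT PRESERVE ROWS;

    /* Multi-column statistics on columns used in joins and residual conditions */
    COLLECT STATISTICS ON edw.sales COLUMN (cust_id, sale_dt);

    /* Inspect the optimizer's plan: join order, redistribution, confidence levels */
    EXPLAIN
    SELECT s.cust_id, SUM(s.sale_amt)
    FROM edw.sales s
    JOIN vt_active_custs v ON s.cust_id = v.cust_id
    GROUP BY 1;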
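
And a sketch of multi-value compression, which stores frequent column values once in the table header rather than in every row (hypothetical table; in practice the value lists come from a frequency analysis of the actual data):

    CREATE TABLE edw.order_detail (
        order_id   INTEGER NOT NULL,
        status_cd  CHAR(1) COMPRESS ('A', 'C', 'X'),
        store_nbr  INTEGER COMPRESS (0, 9999),
        order_amt  DECIMAL(12,2)
    ) PRIMARY INDEX (order_id);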

Environment: Teradata 12/13, Teradata SQL Assistant, Teradata Manager, BTEQ, MultiLoad, FastLoad, DataStage 9.1, Mainframe DB2, Oracle 11.

Confidential, Baltimore, MD

Teradata/ETL Consultant

Responsibilities:

  • Worked with systems analysts to understand source system data to develop accurate ETL programs.
  • Loaded data into the Enterprise Data Warehouse using Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport in both mainframe and UNIX environments.
  • Utilized BTEQ for report generation as well as for running batch jobs; a BTEQ report sketch appears after this list.
  • Reviewed the SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Extensively used the Teradata Analyst Pack: Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
  • Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Used SQL to query the databases, doing as much crunching as possible inside Teradata and applying query optimization (Explain plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Extracted source data from mainframe z/OS into the UNIX environment using JCL scripts and SQL, and created formatted reports for business users using BTEQ scripts.
  • Automated BTEQ report generation using UNIX scheduling tools on a weekly and monthly basis.
  • Developed UNIX shell scripts and used the BTEQ, FastLoad, MultiLoad, and FastExport utilities extensively to load to the target database.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad-hoc extracts for several business users on a scheduled basis.
  • Drawing on in-depth expertise in the Teradata cost-based query optimizer, identified potential bottlenecks in query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI/USI/NUSI/JI, etc.); in-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
  • Worked on the Teradata Priority Scheduler to distribute the workload evenly on the server and reduce downtime.
  • Collected statistics on important tables so the Teradata optimizer could produce better plans.
  • Tuned user queries by analyzing Explain plans, recreating user driver tables with the right primary index (see the sketch after this list), scheduling statistics collection, and adding secondary or join indexes.
  • Created scripts that were run through UNIX shell scripts under batch scheduling.
  • Worked on Informatica (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Transformations).
  • Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, and Sequence Generator.
  • Developed, documented, and executed unit test plans for the components.
  • Documented the developed code and ran the sessions and workflows while keeping track of source and target row counts.
  • Collected performance data for sessions and tuned performance by adjusting Informatica session parameters.
  • Created pre-session and post-session shell scripts and mail-notifications.
  • Extensively worked on the Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager, and Workflow Monitor.
  • Extensively used performance tuning techniques to minimize run time and created pre-session caches with load balanced on the server.
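
A hedged sketch of the BTEQ report generation described above, exporting a formatted weekly summary to a flat file (the logon placeholder, path, and object names are hypothetical):

    .LOGON tdpid/username,password;
    .SET WIDTH 200;
    .EXPORT REPORT FILE = /reports/weekly_sales.txt;

    SELECT region_nm (TITLE 'Region')
         , SUM(sale_amt) (TITLE 'Weekly Sales', FORMAT 'ZZZ,ZZZ,ZZ9.99')
    FROM   edw.sales
    WHERE  sale_dt BETWEEN CURRENT_DATE - 7 AND CURRENT_DATE - 1
    GROUP  BY 1
    ORDER  BY 1;

    .EXPORT RESET;
    .LOGOFF;
    .QUIT;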
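
And a sketch of rebuilding a skewed driver table with a better-distributing primary index, as in the tuning bullet above (names are hypothetical; the new PI column is chosen after checking row distribution across AMPs):

    CREATE TABLE work_db.driver_tbl_new AS (
        SELECT * FROM work_db.driver_tbl
    ) WITH DATA
    PRIMARY INDEX (acct_id);

    COLLECT STATISTICS ON work_db.driver_tbl_new COLUMN (acct_id);

    RENAME TABLE work_db.driver_tbl     TO work_db.driver_tbl_old;
    RENAME TABLE work_db.driver_tbl_new TO work_db.driver_tbl;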

Environment: Teradata 12, Teradata SQL Assistant, Teradata Manager, BTEQ, MultiLoad, Informatica 9.0/7.3.2, Mainframe DB2, ERwin Designer, UNIX, Windows 2000, shell scripts.

Confidential, Menands, NY

ETL developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Translated requirements into business rules and made recommendations for innovative IT solutions.
  • Developed scripts for loading data into the base tables in the EDW using the Teradata FastLoad, MultiLoad, and BTEQ utilities.
  • Assisted the DBA in creating tables and views in the Teradata and Oracle databases.
  • Dealt with incremental data as well as migration data loads into Teradata.
  • Experienced in tuning batch BTEQ queries.
  • Enhanced queries in the other application to run faster and more efficiently.
  • Extracted data from Teradata, processed/transformed it using ksh programs, and loaded it into the data mart.
  • Used various Teradata index techniques to improve query performance.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Modified BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
  • Updated numerous BTEQ/SQL scripts, made appropriate DDL changes, and completed unit and system testing.
  • Created a series of macros for various applications in Teradata SQL Assistant; a macro sketch appears after this list.
  • Responsible for loading millions of records into the warehouse from different sources using MultiLoad and FastLoad; a MultiLoad sketch also appears after this list.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Involved in the design and development of the data mart and populated the data from different data sources using Informatica.
  • Documented data conversion, integration, load and verification specifications.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Worked with the various enterprise groups to document user requirements, translate requirements into system solutions, and produce development, testing, and implementation plans and schedules.
  • Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, and Sequence Generator.
  • Developed, documented, and executed unit test plans for the components.
  • Documented the developed code and ran the sessions and workflows while keeping track of source and target row counts.
  • Collected performance data for sessions and tuned performance by adjusting Informatica session parameters.
  • Extensively used unconnected lookups to minimize run time and improve server performance, while making sure to avoid duplicate data from other source systems.
  • Created pre-session and post-session shell scripts and mail-notifications.
  • Extensively worked on the Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager, and Workflow Monitor.
  • Extensively used performance tuning techniques to minimize run time and created pre-session caches with load balanced on the server.
  • Created workflows containing command, email, session, decision, and a wide variety of other tasks.
  • Scheduled batches and sessions within Informatica using the Informatica scheduler and also wrote shell scripts for job scheduling.
  • Employed performance tuning to improve the performance of the entire system.
  • Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Understood the entire functionality and major algorithms of the project and adhered to the company's testing process.
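
As a hedged illustration of the SQL Assistant macros mentioned above, a parameterized macro wrapping a staging-to-mart insert (all names and the sample date are hypothetical):

    CREATE MACRO dm_db.load_daily_sales (run_dt DATE) AS (
        INSERT INTO dm_db.daily_sales (sale_dt, store_nbr, sale_amt)
        SELECT sale_dt, store_nbr, SUM(sale_amt)
        FROM   stg_db.sales_stg
        WHERE  sale_dt = :run_dt
        GROUP  BY 1, 2;
    );

    /* Invoked from SQL Assistant or BTEQ */
    EXEC dm_db.load_daily_sales (DATE '2012-06-30');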
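
And a minimal MultiLoad sketch for the high-volume loads described above; unlike FastLoad, MultiLoad can load into populated tables (the logon placeholder, file path, and object names are hypothetical):

    .LOGTABLE work_db.ml_orders_log;
    .LOGON tdpid/username,password;

    .BEGIN IMPORT MLOAD TABLES edw_stg.stg_orders
        WORKTABLES  work_db.wt_orders
        ERRORTABLES work_db.et_orders work_db.uv_orders;

    .LAYOUT order_layout;
    .FIELD in_order_id  * VARCHAR(18);
    .FIELD in_cust_id   * VARCHAR(18);
    .FIELD in_order_amt * VARCHAR(20);

    .DML LABEL ins_orders;
    INSERT INTO edw_stg.stg_orders (order_id, cust_id, order_amt)
    VALUES (:in_order_id, :in_cust_id, :in_order_amt);

    .IMPORT INFILE /data/in/orders.dat
        FORMAT VARTEXT '|'
        LAYOUT order_layout
        APPLY ins_orders;

    .END MLOAD;
    .LOGOFF;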

Environment: Informatica PowerCenter 7.1, Oracle 10g, PL/SQL Developer, SQL Server 2005, UNIX, Oracle 8i, MS Access, TOAD, Teradata Administrator, Teradata V2R5, Teradata SQL Assistant, BTEQ, Windows 2003

Confidential, Minneapolis, MN

Teradata Developer

Responsibilities:

  • Analyzed business requirements and prepared the physical design, high-level designs, and technical specifications.
  • Experienced in tuning batch BTEQ queries.
  • Enhanced queries in the other application to run faster and more efficiently.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL).
  • Created .dat files using FastExport and developed a common FTP script to port them to the clients' server; a FastExport sketch appears after this list.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Designed the mappings between sources (external files and databases) and operational staging targets.
  • Performed tuning of Teradata SQL statements using the Teradata Explain command.
  • Extracted data from Teradata, processed/transformed it using ksh programs, and loaded it into the data mart.
  • Used various Teradata index techniques to improve query performance.
  • Arranged meetings on a regular basis to go over the open issues.
  • Monitored database space, identified tables with high skew, and worked with the data modeling team to change the primary index on highly skewed tables; a skew-check sketch appears after this list.
  • Involved in building tables, views and Indexes.
  • Involved in ad-hoc querying, quick deployment, and rapid customization, making it even easier for users to make business decisions.
  • Worked on Cognos report design and suggested best practices to improve performance.
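
A hedged FastExport sketch for the .dat extract described above (the logon placeholder, paths, and object names are hypothetical):

    .LOGTABLE work_db.fe_extract_log;
    .LOGON tdpid/username,password;

    .BEGIN EXPORT SESSIONS 4;

    .EXPORT OUTFILE /data/out/client_extract.dat
        MODE RECORD FORMAT TEXT;

    /* RECORD/TEXT mode expects one character column per row */
    SELECT TRIM(cust_id) || '|' || TRIM(cust_nm) || '|' ||
           TRIM(CAST(balance AS VARCHAR(20)))
    FROM   edw.account
    WHERE  status_cd = 'A';

    .END EXPORT;
    .LOGOFF;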
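
And a sketch of the skew check against the data dictionary, using the DBC.TableSize view; the database filter is a placeholder:

    /* Skew factor per table: 0 = perfectly even distribution across AMPs */
    SELECT DatabaseName
         , TableName
         , SUM(CurrentPerm)                  AS total_perm
         , 100 * (1 - AVG(CurrentPerm) /
                      NULLIFZERO(MAX(CurrentPerm))) AS skew_pct
    FROM   DBC.TableSize
    WHERE  DatabaseName = 'edw'
    GROUP  BY 1, 2
    ORDER  BY skew_pct DESC;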

Environment: Teradata V2R6.2, Mainframe DB2, MVS, Hummingbird BI, Cognos.
