
Sr. Teradata Developer/ETL Developer Resume


NC

PROFESSIONAL SUMMARY

  • 7+ years of experience in data migration and Enterprise Data Warehousing, including Teradata, UNIX, and manual testing.
  • Experience in Teradata database design, development, implementation, and maintenance, mainly in large-scale Data Warehouse environments.
  • Extensive experience in administration and maintenance of Dev, Stage, Prod, and standby databases for DSS and Data Warehousing environments.
  • Involved in the full lifecycle of various projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance, and support.
  • Comfortable with both technical and functional applications of RDBMS, data mapping, data management, data transportation, and data staging.
  • Experience in database architectures such as Shared-Nothing and Shared-Everything; very good understanding of SMP and MPP architectures.
  • Strong Teradata SQL, ANSI SQL coding skills.
  • Extensively worked with the BTEQ, FastExport, FastLoad, and MultiLoad Teradata utilities to export data to and load data from flat files.
  • Expertise in query analysis, performance tuning, and testing.
  • Hands on experience in monitoring and managing varying mixed workload of an active data warehouse using various tools like PMON, Teradata Workload Analyzer, Teradata Dynamic Workload Manager and Teradata Manager.
  • Extensively worked with query tools like SQL Assistant, MS SQL Server, Aginity Workbench, and PL/SQL Developer.
  • Good knowledge of logical and physical modeling using Erwin. Hands-on experience in 3NF, Star/Snowflake schema design, and denormalization techniques.
  • Extensively worked on performance tuning of user queries by analyzing explain plans, recreating user-driven tables with the right Primary Index, scheduling collection of statistics, and adding secondary or join indexes.
  • Proficient in converting logical data models to physical database designs in a Data Warehousing environment, with in-depth understanding of database hierarchy, data integrity concepts, and data analysis.
  • Skillfully used the OLAP analytical power of Teradata, applying functions such as RANK, QUANTILE, CSUM, MSUM, and GROUP BY GROUPING SETS to generate detailed reports for marketing teams (see the first SQL sketch after this list).
  • Extensively used derived tables, volatile tables, and global temporary tables (GTTs) in many BTEQ scripts (see the BTEQ sketch after this list).
  • Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
  • Experience in writing UNIX shell and PERL scripts to support and automate the ETL process.
  • Experience in Oracle RDBMS Architecture.
  • Sourced data from disparate sources like Mainframe Z/OS, UNIX flat files, IBM DB2, Oracle, and SQL Server and loaded into Oracle and Teradata DW.
  • Involved in Unit Testing, Integration Testing and preparing test cases.
  • Involved in 24/7 on-call production support activities and resolved database issues.
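
A minimal sketch of the kind of OLAP reporting query described above, using ANSI window functions as supported by Teradata 13/14; the sales_fact table and its columns are hypothetical:

    -- hypothetical sales_fact table and columns
    SELECT region
         , sales_month
         , SUM(revenue) AS monthly_revenue
         -- rank each month within its region by revenue
         , RANK() OVER (PARTITION BY region ORDER BY SUM(revenue) DESC) AS month_rank
         -- running total per region (ANSI equivalent of CSUM)
         , SUM(SUM(revenue)) OVER (PARTITION BY region
                                   ORDER BY sales_month
                                   ROWS UNBOUNDED PRECEDING) AS running_revenue
    FROM   sales_fact
    GROUP  BY region, sales_month;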
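
And a minimal BTEQ sketch of staging intermediate results in a volatile table, as mentioned above; the logon string and all object names are placeholders:

    .LOGON tdprod/etl_user,password

    CREATE VOLATILE TABLE vt_active_cust AS
    ( SELECT cust_id, region_cd
      FROM   edw.customer               -- hypothetical source table
      WHERE  status_cd = 'A'
    ) WITH DATA
    PRIMARY INDEX (cust_id)
    ON COMMIT PRESERVE ROWS;            -- keep rows for the rest of the session

    SELECT region_cd, COUNT(*)
    FROM   vt_active_cust
    GROUP  BY region_cd;

    .LOGOFF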

TECHNICAL SKILLS

Databases: Teradata 13/14, MS-SQL Server, Netezza, Oracle.

DB Tools/Utilities: Teradata SQL Assistant 13/14, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager, Teradata Query Manager, Teradata Administrator, PMON, SQL*Loader, TOAD 8.0, TeamViewer, Aginity Netezza Workbench.

Programming Languages: C, C++, SQL, PL/SQL, UNIX Shell, and Perl scripting.

ETL Tools: Ab Initio, Informatica, Big Data.

Data Modeling: Logical/Physical/Dimensional, Star/Snowflake, OLAP, Erwin.

Scheduling Tools: Autosys, Crontab (UNIX)

Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX, Z/OS.

PROFESSIONAL EXPERIENCE

Confidential, NC

Sr. Teradata Developer/ETL Developer

Responsibilities:

  • Developed and maintained the existing application portfolio and planned project implementations.
  • Extensively enhanced the data architecture and data models as required; involved in the analysis and design of the system.
  • Actively managed the planning and organization of activities for JIRA tickets.
  • Organized meetings with SMEs of dependent systems when changes were made to the existing system.
  • Translated business requirements into system solutions per Basel views.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor to pull the data from Netezza and load the data into Teradata.
  • Used various transformations like Filter, Expression, Sequence Generator, Joiner, and Union to develop robust mappings in the Informatica Designer.
  • Worked with Netezza utilities (NZSQL and NZLOAD) on a regular basis to load and unload data.
  • Developed and implemented the model changes for business year 2013 and 2014 for a work stream in an application.
  • Managed and took the lead on development tasks for the Enterprise Capital Management (ECM) team.
  • Used BTEQ and load utilities (FastLoad, MultiLoad, TPump) as required to develop month-end release tickets; created data manipulation and definition scripts using the same utilities.
  • Created and modified views and developed scripts to automate processes per business needs.
  • Created UNIX scripts for various purposes such as FTP, archiving files, and creating parameter files.
  • Experience maintaining the code repository using Subversion and handling production deployment with troubleshooting activities.
  • Developed scripts to load high-volume data into empty tables using the FastLoad utility (see the FastLoad sketch after this list).
  • Tuned Teradata performance via EXPLAIN plans, PPI, AJIs, indexes, collecting statistics, or rewriting code.
  • Involved in the analysis of the Issues and proposing the solutions to the client.
  • Involved in the analysis of test results, preparing test cases and test data, documenting test results, and participating in end-to-end testing activities.
  • Developed and altered the front-end GUI for the work stream so that business users can access the process and initiate runs during month end without developer involvement.
  • Involved in process creation for the existing SABER to SABER2 migration, which is based on Quartz, Python, and the Netezza and Teradata databases.
  • Involved in creating the UNIX shell scripts/wrapper scripts used for scheduling jobs.
  • Coordinated work across various applications, operations as necessary to maintain or deliver the business solution.
  • Involved in weekly meetings with the users for decision making for changing the existing programs for special processing.
  • Developed the recommendations for continuous improvement in efficiency and effectiveness by following Bank's CAB Processes.
  • Managed the offshore team, assigning daily tasks and organizing stand-up calls when required.
  • Prepared detailed documents to be shared across the organization.
  • Involved in 24x7 production support.
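
A minimal FastLoad sketch of the empty-table loads referenced above; the logon string, input file, and table/column names are placeholders:

    /* placeholder logon, file path, and object names */
    LOGON tdprod/etl_user,password;
    SET RECORD VARTEXT "|";
    BEGIN LOADING stage_db.customer_stg
          ERRORFILES stage_db.cust_err1, stage_db.cust_err2
          CHECKPOINT 100000;
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60)),
           open_dt   (VARCHAR(10))
    FILE = /data/inbound/customer.dat;
    INSERT INTO stage_db.customer_stg (cust_id, cust_name, open_dt)
    VALUES (:cust_id, :cust_name, :open_dt);
    END LOADING;
    LOGOFF;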

Environment: Teradata RDBMS 13/14 (SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, Teradata Administrator, Power), UNIX (PuTTY, WinSCP 4.2.7, SSH Tectia), Quartz Desktop 2, MS SQL Server Management Studio 10.0.55, Oracle 11.6g, PL/SQL Developer 9.0, Notepad++ v6.3, Aginity Netezza Workbench 2.1, TOAD, Informatica PowerCenter 9.6.

Confidential, GA

Teradata Developer

Responsibilities:

  • Interacted with the Functional Analysts to understand the flow of the business.
  • Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Worked efficiently with Teradata Parallel Transporter and its generated code.
  • Generated custom JCL scripts for processing mainframe flat files and IBM DB2 data.
  • Created various Teradata macros in SQL Assistant to serve the analysts (see the macro sketch after this list).
  • Responsible for troubleshooting, identifying, and resolving data problems.
  • Created proper Primary Indexes taking into consideration both planned access and even distribution of data across all available AMPs (see the DDL/tuning sketch after this list).
  • Developed performance utilization charts, optimized and tuned SQL, and designed physical databases, Teradata load utilities, and SQL.
  • Loaded and transferred large data from different databases into Teradata using MLoad and OLELoad.
  • Created a series of Teradata macros for various applications in Teradata SQL Assistant.
  • Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.
  • Performed performance tuning of Teradata SQL statements using the Teradata EXPLAIN command (also covered in the DDL/tuning sketch after this list).
  • Created several SQL queries and several reports against the data mart for UAT and user reporting.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Analyzed data and implemented multi-value compression for optimal space usage.
  • Excellent experience in performance tuning and query optimization of Teradata SQL.
  • Analyzed queries using EXPLAIN for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
  • Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Created test cases for Unit Test, System Integration Test, and UAT to check the data.
  • Performed scheduling of ETL jobs using scheduling tools and pmcmd commands, based on business requirements.
  • Developed Shell Scripts for getting the data from source systems to load into Data Warehouse.
  • Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
  • Used PMON/Viewpoint to tune the SQL statements consuming the most AMP CPU/IO skew on the Teradata server.
  • Supported the SSRS/SSIS reporting team with data sourced from Oracle.
  • Good exposure to onshore - offshore model.
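
A minimal sketch of an analyst-facing Teradata macro of the kind referenced above; the database, table, and parameter names are hypothetical:

    /* hypothetical database, table, and parameter names */
    REPLACE MACRO edw.monthly_sales_rpt (in_month DATE) AS
    (
      SELECT region
           , SUM(sale_amt) AS total_sales
           , COUNT(*)      AS txn_cnt
      FROM   edw.sales_txn
      WHERE  txn_month = :in_month
      GROUP  BY region;
    );

Analysts then run it with a single call:

    EXEC edw.monthly_sales_rpt (DATE '2013-06-01');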
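
And a minimal DDL/tuning sketch showing Primary Index choice, multi-value compression, statistics collection, and an EXPLAIN check; all object names and compressed values are hypothetical:

    /* hypothetical table; compressed values and PI chosen for illustration */
    CREATE TABLE edw.customer
    ( cust_id    INTEGER NOT NULL
    , region_cd  CHAR(2)     COMPRESS ('NE','SE','MW','SW','WC')  -- multi-value compression
    , status_cd  CHAR(1)     COMPRESS ('A','I')
    , cust_name  VARCHAR(60)
    )
    PRIMARY INDEX (cust_id);   -- high-cardinality PI for even distribution across AMPs

    COLLECT STATISTICS ON edw.customer COLUMN cust_id;
    COLLECT STATISTICS ON edw.customer COLUMN region_cd;

    EXPLAIN
    SELECT * FROM edw.customer WHERE region_cd = 'NE';  -- check the plan and confidence levels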

Environment: NCR Teradata V13, BTEQ, DB2, Teradata SQL, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Viewpoint, PL/SQL, TOAD, Erwin, Oracle SQL, JCL, UNIX, shell scripting.

Confidential, DE OC

Teradata Developer

Responsibilities:

  • Met with business/user groups to understand the business process and gather requirements. Extracted and analyzed sample data from operational (OLTP) systems to validate user requirements. Created high-level design documents.
  • Loaded data into the Enterprise Data Warehouse using Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPump in both mainframe and UNIX environments (see the MultiLoad sketch after this list).
  • Utilized BTEQ for report generation as well as for running batch jobs.
  • Developed Teradata macros that pull data from several sales tables, perform calculations and aggregations, and write the results to a results table.
  • Utilized Teradata utilities to load data into the EDW from DB2 sources using JCL and COBOL.
  • Performed performance tuning of the existing Teradata SQL scripts for the OTL code.
  • Utilized global temporary tables (GTTs) efficiently, reducing job run times.
  • Built tables with UPI, NUPI, USI, and NUSI, along with macros and stored procedures.
  • Well experienced in using Partition components (Partition by Key, Partition by Round-robin) and Departition components (Concatenate, Gather, Interleave, and Merge) to achieve data parallelism.
  • Created common graphs to perform common data conversions that can be reused across applications, using a parameter approach with conditional DMLs.
  • Troubleshot problems by checking sessions and error logs.
  • Interacted with metadata to troubleshoot session related issues and performance issues, tuning of ETL Load process.
  • Very good understanding of several relational databases such as Teradata, Oracle, and DB2. Wrote several complex SQL queries using subqueries, various join types, temporary tables, OLAP functions, etc.
  • Successfully identified problems with the data and produced derived data sets, tables, listings, and figures that analyzed the data to facilitate correction.
  • Manipulated data by merging, appending, concatenating, and sorting datasets.
  • Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, TPump, FastExport, and Teradata Parallel Transporter, along with DDL and DML commands.
  • Performed performance tuning of Teradata SQL statements using the Teradata EXPLAIN command and collected statistics.
  • Created several Teradata SQL queries and several reports against the data mart for UAT and user reporting. Used SQL features such as GROUP BY, ROLLUP, RANK, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
  • Prepared Unit and Integration testing plans. Involved in Unit and Integration testing using the testing plans.
  • Involved in creating the UNIX shell scripts/wrapper scripts used for scheduling jobs.
  • Involved in after implementation support, user training and data models walkthroughs with business/user groups.
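
A minimal MultiLoad (IMPORT task) sketch of the kind of load referenced above; the logon string, file path, and object names are placeholders:

    /* placeholder logon, file path, and object names */
    .LOGTABLE stage_db.sales_ml_log;
    .LOGON tdprod/etl_user,password;
    .BEGIN IMPORT MLOAD TABLES stage_db.daily_sales
           WORKTABLES stage_db.daily_sales_wt
           ERRORTABLES stage_db.daily_sales_et stage_db.daily_sales_uv;
    .LAYOUT sales_layout;
    .FIELD sale_id  * VARCHAR(18);
    .FIELD sale_dt  * VARCHAR(10);
    .FIELD sale_amt * VARCHAR(18);
    .DML LABEL ins_sales;
    INSERT INTO stage_db.daily_sales (sale_id, sale_dt, sale_amt)
    VALUES (:sale_id, :sale_dt, :sale_amt);
    .IMPORT INFILE /data/inbound/daily_sales.dat
            FORMAT VARTEXT '|'
            LAYOUT sales_layout
            APPLY ins_sales;
    .END MLOAD;
    .LOGOFF;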

Environment: Teradata 12, Erwin 7, Teradata Administrator, Teradata SQL Assistant, Teradata Visual Explain, BTEQ, MultiLoad, FastLoad, FastExport, MVS, UNIX shell scripts.

Confidential, CO

Teradata Developer

Responsibilities:

  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Database-to-Database transfer of data (Minimum transformations) using ETL (Ab Initio).
  • Fine tuned the existing mappings and achieved increased performance and reduced load times for faster user query performance.
  • Created customized MultiLoad scripts on the UNIX platform for Teradata loads using Ab Initio.
  • Sorted data files using UNIX Shell scripting.
  • Fine-tuned MultiLoad scripts considering the number of loads scheduled and the volumes of load data.
  • Used a data profiler in ETL processes and data integrators to ensure client requirements were met, including checks on column properties, column values, and referential integrity.
  • Acted as a single resource with sole responsibility of Ab Initio - Teradata conversions.
  • Extensively used derived tables, volatile tables, and global temporary tables in many of the ETL scripts.
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Performed performance tuning of sources, targets, mappings, and SQL queries in transformations.
  • Created COBOL programs and worked on creating JCL scripts to extract data from Mainframes operational systems. Extracted data from mainframe DB2 tables.
  • Created Primary Indexes (PIs) for both planned access of data and even distribution of data across all available AMPs.
  • Created appropriate Teradata NUSIs for fast, easy access to data.
  • Worked on exporting data to flat files using Teradata FastExport (see the FastExport sketch after this list).
  • Analyzed the Data Distribution and Reviewed the Index choices.
  • In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks.
  • Worked with PPI Teradata tables and was involved in Teradata specific SQL fine-tuning to increase performance of the overall ETL process.
  • Extensively used Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, and Denormalize.
  • Prepared Unit and Integration testing plans.
  • Involved in after implementation support, user training and data models walkthroughs with business/user groups.
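
A minimal FastExport sketch of the flat-file exports referenced above; the logon string, output path, and object names are placeholders:

    /* placeholder logon, output path, and object names */
    .LOGTABLE edw.sales_fexp_log;
    .LOGON tdprod/etl_user,password;
    .BEGIN EXPORT SESSIONS 4;
    .EXPORT OUTFILE /data/outbound/sales_extract.dat
            MODE RECORD FORMAT TEXT;
    SELECT TRIM(sale_id) || '|' || TRIM(CAST(sale_amt AS VARCHAR(20)))
    FROM   edw.daily_sales;
    .END EXPORT;
    .LOGOFF;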

Environment: Teradata 12, Ab Initio (GDE 1.15, Co>Operating System 2.15), FastLoad, MultiLoad, FastExport, UNIX, UNIX shell scripts.
