- Over 7 years of experience in information technology, with extensive exposure as a Teradata performance consultant, Teradata production system administrator, and product support engineer, and a keen ability to deliver solutions that meet business needs.
- Proficient in Teradata query and application tuning/optimization.
- Handling system restarts, outages, and slow system performance; identifying root causes and providing resolutions.
- Setting up and reviewing workload management processes, resolving customer concerns about workload management setup, and recommending Teradata best practices.
- Expert in performing TASM and security audits against Teradata best practices and providing recommendations.
- Proficient in giving performance recommendations to suspect queries and long running queries in the system.
- Good knowledge of handling hardware issues, diagnosing slow LUN issues, and coordinating with the GSC.
- Expertise in system monitoring and health checks through Viewpoint, and taking necessary actions.
- Generating and analyzing Teradata PDCR Capacity and Performance reports.
- Workload analysis using Teradata Priority Scheduler, TDWM, and Teradata Active System Management (TASM).
- Experienced in Teradata database management activities such as space management and user management.
- Teradata database backup, archive, and restore using NetBackup/TARA and NetVault.
- Worked on migrating servers from V2R6 to TD13.10.
- Good knowledge of Teradata tools such as Teradata SQL Assistant, PMON, Teradata Administrator, Viewpoint, and NetVault.
- Good understanding of Hadoop ecosystem components and proven knowledge of HDFS, Hive, Sqoop, and Oozie.
- Offloaded several Teradata dimension tables to Hadoop by implementing them in Hive, scheduled through Oozie workflows and coordinators.
- Creating Teradata Roles according to the user groups and granting access to the production databases.
- Creating Teradata Profiles and maintaining them.
- User management (creating users, assigning profiles and roles per user specifications and organizational policies).
- Good experience with Mainframe (MVS) Job Control Language (JCL).
- Worked in several areas of Data Warehouse including Analysis, Requirement Gathering, Design, Development, Testing, and Implementation.
- Experience in designing and setting up BI standards, best practices, and rules for DWBI environments at the enterprise level.
- Well experienced in handling status calls with business and technical teams and delivering quality products with innovative solutions.
- Good knowledge of Dimensional Data Modeling, Star Schema, Snow-Flake schema, FACT and Dimensions Tables.
- Worked in the remediation (performance tuning) team, improving performance of user queries and production SQL.
- Very good experience in customer facing skills and release and change management.
- Proficient in Teradata Database Design (Physical and Logical), Application Support and setting up the Test and Development environments.
- Extensive knowledge on DWBI process in all stages.
- Involved in data migration between Teradata and DB2 (Platinum).
- Expert DBA skills in Teradata DBMS administration, development, and production DBA support, using Teradata Manager, FastLoad, MultiLoad, TSET, TPump, SQL, PDCR, ARCMAIN, and TASM for workload management.
- Strong experience in Teradata development and indexes (PI, SI, PPI, and join indexes).
- Worked on Data Mining techniques for identifying claims on historical data.
- Expertise in database programming, including writing stored procedures (SQL), functions, triggers, and views in Teradata, DB2, and MS Access.
- Experience in writing complex SQL to implement business rules, extract and analyze data.
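The role, profile, and user management described above can be sketched in Teradata SQL; all object names (etl_role, dev_profile, jsmith, prod_db) are hypothetical placeholders, not taken from any actual environment:

```sql
-- Hypothetical names throughout; a minimal sketch of Teradata user management.
CREATE ROLE etl_role;
GRANT SELECT, INSERT ON prod_db TO etl_role;

CREATE PROFILE dev_profile AS
    SPOOL = 50e9,                 -- spool space limit in bytes
    TEMPORARY = 20e9,
    PASSWORD = (EXPIRE = 90);     -- force a password change every 90 days

CREATE USER jsmith FROM dbc AS
    PERM = 0,                     -- no permanent space for an end user
    PASSWORD = "ChangeMe#1",
    PROFILE = dev_profile,
    DEFAULT ROLE = etl_role;

GRANT etl_role TO jsmith;
```

Managing access through roles and profiles, rather than per-user grants, keeps privilege changes to a single GRANT or REVOKE per group.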
Database: Teradata 15, 14, 13, 12, Teradata V2R6, Oracle, SQL Server
Tools & Frameworks: Teradata SQL Assistant, BTEQ, Teradata Viewpoint, Teradata Administrator, Teradata Dynamic Workload Manager, Unity Director, Data Mover, TSET, Priority Scheduler, Teradata Index Wizard, Teradata Statistics Wizard, Teradata Visual Explain, MultiLoad, FastLoad, FastExport, TPump.
ETL Tools: Informatica Power center
Languages: Job Control Language (JCL), Java
Operating Systems: Microsoft Windows, UNIX, Linux
Confidential, Elmhurst, IL
Sr. Teradata Developer
- Involved in project cycle plan for the data warehouse, source data analysis, data extraction process, and load strategy planning.
- Closely interacted with the Data modeling team and business Analyst regarding the Mapping document.
- Identify and Fix the issues raised by Business in production.
- Used Rational Team Concert to check in and check out the different Sprint related ETL code.
- Extensively worked on data extraction, transformation, and loading using Teradata standalone utilities and TPT, moving data from DB2, mainframes (AS/400/iSeries), and delimited and fixed-width flat files into Teradata.
- Created database objects like Tables (Temp/Global/Volatile), Macros, Views, and Procedures.
- Extensively used UPI, PI, PPI, Secondary Index and Join Index for the data application layer tables as part of tuning the high CPU impact queries.
- Involved in DBA activities in test environments, such as creating users and allocating spool, temporary, and permanent space; also involved in performance tuning of existing queries and scripts.
- Extensively used the Teradata standalone utilities FastLoad/MultiLoad/FastExport/TPT to load and export data into and from database objects.
- Worked with all the TPT operators (SQL Inserter, Load, Update, Export, and ODBC) for loading and exporting data.
- Used TPT to load data from Oracle into Teradata using the ODBC operator, especially for CLOB and BLOB data.
- Approved the staff hierarchy files on GDC (Generic Data Capture) uploaded by the business.
- Executed ETL DataStage jobs and monitored them from the DataStage Director.
- Used SQL Assistant to build SQL queries (correlated subqueries, joins, aggregate functions, etc.) to implement business transformation rules and to validate data.
- Frequently created global temporary and volatile tables during code builds in BTEQ scripts.
- Involved in writing a procedure to check the up-to-date statistics on tables.
- Set up SFTP between source and destination Linux servers.
- Worked in production support on a rotation basis, releasing, holding, and force-running jobs through Control-M after fixing failed jobs.
- Extensively used Teradata Viewpoint to monitor UAT and PROD, checking skew and CPU usage for long-running SQL.
- Used UNIX scripts to trigger BTEQ, TPT, MultiLoad, FastLoad, and FastExport jobs.
- Extensively used TPT Easy Loader to load flat files into Teradata tables and to copy data from one server to another for data analysis.
- Performed data reconciliation across various source systems and Teradata.
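A BTEQ load step of the kind described above, with error handling and a non-zero exit code for the scheduler, might look like the following sketch; the TDPID, credentials, and table and column names are all placeholders:

```sql
.LOGON tdpid/etl_user,etl_pass;           -- placeholder TDPID and credentials
.SET ERROROUT STDOUT;

BT;                                       -- explicit transaction (Teradata mode)
DELETE FROM stg_db.customer_stg;
INSERT INTO stg_db.customer_stg (cust_id, cust_nm, load_dt)
SELECT cust_id, cust_nm, CURRENT_DATE
FROM   src_db.customer;
ET;

.IF ERRORCODE <> 0 THEN .QUIT 8;          -- non-zero return code so Control-M flags the job
.LOGOFF;
.QUIT 0;
```

Wrapping the delete/insert in BT/ET keeps the staging table consistent if the insert fails, and the `.QUIT 8` exit code is what a scheduler such as Control-M keys on to hold or rerun the job.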
Environment: Teradata 15.10, DB2 UDB, Oracle 10g, Main Frames (AS400/iSeries), Linux, FastLoad, MultiLoad, FastExport, BTEQ, Teradata SQL Assistant, TPT (Load, Update, Export, ODBC, Easy Loader), Teradata Viewpoint, DataStage 9.1, Control-M, RTC v4, Mercury Quality Centre.
Confidential, VA
- Analyzing the Business requirements and System specifications to understand the Application.
- Importing data from source files such as flat files using Teradata load utilities like FastLoad, MultiLoad, and TPump.
- Creating ad hoc reports using FastExport and BTEQ.
- Designed Informatica mappings to propagate data from various legacy source systems to Oracle.
- The interfaces were staged in Oracle before loading to the data warehouse.
- Performed data transformations using various Informatica transformations such as Union, Joiner, Expression, Lookup, Aggregator, Filter, Router, Normalizer, and Update Strategy.
- Responsible for tuning report queries and ad hoc queries.
- Wrote transformations for data conversions into required form based on the client requirement using Teradata ETL processes.
- Developed MLoad scripts and shell scripts to move data from source systems to staging and from staging to Data warehouse in batch processing mode.
- Exported data from the Teradata database using Teradata FastExport.
- Used UNIX scripts to run Teradata DDL in BTEQ and write to a log table.
- Creating, loading and materializing views to extend the usability of data.
- Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for several base tables, in order to check consistency.
- Making modifications as required for reporting process by understanding the existing data model and involved in retrieving data from relational databases.
- Involved in working with SSA requestor responsibilities which will be assigned for both project and support requests.
- Managing queries by creating, deleting, modifying, and viewing, enabling and disabling rules.
- Loading the data into the warehouse from different flat files .
- Database testing by writing and executing SQL queries to ensure that data entered has been uploaded correctly into the database.
- Transferred files across various platforms using the secure FTP protocol.
- Involved in creating unit test plans and testing the data for various applications.
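An ad hoc FastExport extract of the kind mentioned above can be sketched as follows; the log table, credentials, output path, and source table are all hypothetical:

```sql
.LOGTABLE work_db.fexp_log;               -- restart log table (hypothetical)
.LOGON tdpid/report_user,report_pass;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /tmp/daily_sales.txt MODE RECORD FORMAT TEXT;
SELECT CAST(sales_dt AS CHAR(10)) || '|' ||
       TRIM(CAST(sale_amt AS CHAR(18)))   -- TEXT format wants a single character column
FROM   rpt_db.daily_sales
WHERE  sales_dt = CURRENT_DATE - 1;
.END EXPORT;
.LOGOFF;
```

FastExport's restart log table lets an interrupted export resume from its checkpoint rather than starting over, which matters for large ad hoc pulls.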
Environment: Teradata V12.0, Informatica, Business Objects XI R3.1, Crystal Reports, Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SQL Server 2000, Sybase, DB2, Oracle, FTP, CVS, Windows XP, UNIX, Pentium Server.
Confidential, Atlanta, GA
- Understanding the specification and analyzed data according to client requirement.
- Creating roles and profiles on an as-needed basis; granting privileges to roles and adding users to roles based on requirements.
- Managing database space, allocating new space to databases, and moving space between databases as needed.
- Assisting developers and DBAs in design, architecture, development, and query tuning, including query modification, index selection, and refreshing statistics collection.
- Proactively monitoring bad queries, aborting bad queries using PMON, looking for blocked sessions and working with development teams to resolve blocked sessions.
- Proactively monitoring database space, identifying tables with high skew, and working with the data modeling team to change the primary index on highly skewed tables.
- Worked on moving tables from test to production using fast export and fast load.
- Extensively worked with DBQL data to identify high usage tables and columns.
- Implemented secondary indexes on highly used columns to improve performance
- Worked on exporting data to flat files using Teradata FastExport.
- Worked extensively with Teradata SQL Assistant to interface with Teradata.
- Written several Teradata BTEQ scripts to implement the business logic.
- Populated data into Teradata tables using the FastLoad utility.
- Created complex Teradata macros, views, and stored procedures to be used in reports.
- Performed error handling and performance tuning in Teradata queries and utilities.
- Creating error log tables for bulk loading.
- Actively involved in TASM workload management setup across the organization: defined TASM workloads, developed TASM exceptions, and implemented filters and throttles as needed.
- Worked on capacity planning; reported disk and CPU usage growth using Teradata Manager, DBQL, and ResUsage.
- Used the Teradata Manager collection facility to set up AMP usage collection, canary query response times, spool usage monitoring, etc.
- Developed complex mappings using multiple sources and targets in different databases, flat files.
- Developed Teradata BTEQ scripts; automated workflows and BTEQ scripts.
- Query optimization (explain plans, collect statistics, Primary and Secondary indexes).
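Identifying highly skewed tables, as described above, is typically done against the DBC space views; a minimal sketch, with an illustrative threshold:

```sql
-- Per-table skew: compare the largest AMP's share to a perfectly even spread.
-- HASHAMP() + 1 gives the number of AMPs on the system.
SELECT  DatabaseName,
        TableName,
        SUM(CurrentPerm) AS TotalPerm,
        MAX(CurrentPerm) AS MaxAmpPerm,
        MAX(CurrentPerm) * (HASHAMP() + 1)
            / NULLIFZERO(SUM(CurrentPerm)) AS SkewRatio   -- 1.0 = perfectly even
FROM    DBC.TableSizeV
GROUP BY DatabaseName, TableName
HAVING  MAX(CurrentPerm) * (HASHAMP() + 1)
            / NULLIFZERO(SUM(CurrentPerm)) > 1.5          -- illustrative threshold
ORDER BY TotalPerm DESC;
```

Tables surfacing here are the candidates for the primary index changes mentioned above, since a poorly chosen PI concentrates rows (and work) on a few AMPs.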
Environment: Teradata 12, Viewpoint, TASM, TARAGUI, Symantec NetBackup, Teradata Administrator, Teradata SQL Assistant, BTEQ, Subversion, Informatica.
Confidential
- Involved in the complete Software Development Life Cycle (SDLC), from business analysis to development, testing, deployment, and documentation.
- Used Teradata utilities Fastload, Multiload, Tpump to load data.
- Wrote BTEQ scripts to transform data.
- Wrote Fast export scripts to export data.
- Wrote, tested and implemented Teradata Fastload, Multiload and Bteq scripts, DML and DDL.
- Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts).
- Wrote views based on user and/or reporting requirements.
- Wrote Teradata Macros and used various Teradata analytic functions.
- Involved in migration projects moving data from Oracle/DB2 data warehouses to Teradata.
- Performance tuned and optimized various complex SQL queries.
- Wrote many UNIX scripts.
- Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
- Gathered system design requirements; designed and wrote system specifications.
- Excellent knowledge of ETL tools such as Informatica and SAP BODS, making various connections to load and extract data to and from Teradata efficiently.
- Agile team interaction.
- Worked on data warehouses ranging from 30 to 50 terabytes.
- Coordinated with the business analysts and developers to discuss issues in interpreting the requirements.
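A parameterized Teradata macro of the kind mentioned above (here a hypothetical daily row-count report; all names are placeholders) can be sketched as:

```sql
-- REPLACE MACRO makes redeployment idempotent; all names are hypothetical.
REPLACE MACRO rpt_db.DailyLoadCount (run_dt DATE) AS (
    SELECT load_dt,
           COUNT(*) AS row_cnt
    FROM   stg_db.customer_stg
    WHERE  load_dt = :run_dt     -- macro parameters are referenced with a colon
    GROUP  BY load_dt;
);

EXEC rpt_db.DailyLoadCount (DATE '2016-01-15');
```

A macro runs as a single implicit transaction, which makes it a convenient unit for report consumers who should not need direct table access.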
Environment: Informatica Power Centre 8.6, UNIX, Teradata, Oracle 8i, TOAD, SQLA, Oracle 11g, PLSQL Developer, SQL, PLSQL, UNIX Shell Scripting.