- Over 7 years of experience in information technology, with extensive exposure as a Teradata performance consultant, Teradata production system administrator, and product support engineer, and a keen ability to deliver solutions that meet business needs.
- Proficient in Teradata query and application tuning/optimization.
- Handling system restarts, outages, and slow-performance incidents; identifying root causes and providing resolutions.
- Setting up and reviewing workload management processes, resolving customer concerns about workload management setup, and providing Teradata best practices.
- Expert in performing TASM and security audits against Teradata best practices and providing recommendations.
- Proficient in providing performance recommendations for suspect and long-running queries in the system.
- Good knowledge of handling hardware issues, diagnosing slow-LUN issues, and coordinating with the GSC.
- Expertise in monitoring system health through Viewpoint and taking corrective actions.
- Generating and analyzing Teradata PDCR Capacity and Performance reports.
- Workload analysis using Teradata Priority Scheduler, TDWM, and Teradata Active System Management (TASM).
- Experienced in Teradata database management activities such as space management and user management.
- Teradata database backup, archive, and restore using NetBackup/TARA and NetVault.
- Worked on migrating servers from V2R6 to TD 13.10.
- Good knowledge of Teradata tools such as Teradata SQL Assistant, PMON, Teradata Administrator, Viewpoint, and NetVault.
- Good understanding of Hadoop ecosystem components and proven knowledge of HDFS, Hive, Sqoop, and Oozie.
- Worked on data backup and restore in TARA and DSA environments.
- Developed and offloaded several Teradata dimension tables to Hadoop by implementing them in Hive, scheduled through Oozie workflows and coordinators.
- Creating Teradata Roles according to the user groups and granting access to the production databases.
- Creating Teradata Profiles and maintaining them.
- User management (creating users, assigning profiles and roles per user specifications and organizational policies).
- Good experience with mainframe (MVS) Job Control Language (JCL).
- Worked in several areas of Data Warehouse including Analysis, Requirement Gathering, Design, Development, Testing, and Implementation.
- Experience in designing and setting up BI standards, best practices, and rules for enterprise-level DWBI environments.
- Experienced in handling status calls with business and technical teams and delivering quality products with innovative solutions.
- Good knowledge of Dimensional Data Modeling, Star Schema, Snow-Flake schema, FACT and Dimensions Tables.
- Worked on the remediation (performance tuning) team, improving the performance of user queries and production SQL.
- Very good customer-facing skills and experience in release and change management.
- Proficient in Teradata Database Design (Physical and Logical), Application Support and setting up the Test and Development environments.
- Extensive knowledge of all stages of the DWBI process.
- Involved in data migration between Teradata and DB2 (Platinum).
- Expert DBA skills in Teradata DBMS administration, development, and production DBA support, using Teradata Manager, FastLoad, MultiLoad, TSET, TPump, SQL, PDCR, ARCMAIN, and TASM for workload management.
- Strong experience in Teradata development and indexes (PI, SI, partitioning, join indexes).
- Worked on Data Mining techniques for identifying claims on historical data.
- Expertise in database programming, including writing stored procedures (SQL), functions, triggers, and views in Teradata, DB2, and MS Access.
- Experience in writing complex SQL to implement business rules, extract and analyze data.
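A typical setup for the role, profile, and user management described above can be sketched in Teradata DDL; all object names, limits, and passwords below are hypothetical, and a real setup would follow organizational policy:

```sql
-- Hypothetical names and limits throughout.
CREATE PROFILE etl_profile AS
  SPOOL = 50e9,                  -- per-user spool limit in bytes
  TEMPORARY = 10e9,
  PASSWORD = (EXPIRE = 90);      -- force password change every 90 days

CREATE ROLE rpt_read_role;
GRANT SELECT ON prod_db TO rpt_read_role;

CREATE USER jdoe FROM prod_users AS
  PERM = 0,                      -- no private perm space for end users
  PASSWORD = "TempPass#1",
  PROFILE = etl_profile,
  DEFAULT ROLE = rpt_read_role;

GRANT rpt_read_role TO jdoe;
```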
Databases: Teradata 15, 14, 13, 12, Teradata V2R6, Oracle, SQL Server
Tools & Frameworks: Teradata SQL Assistant, BTEQ, Teradata Viewpoint, Teradata Administrator, Teradata Dynamic Workload Manager, Unity Director, Data Mover, TSET, Priority Scheduler, Teradata Index Wizard, Teradata Statistics Wizard, Teradata Visual Explain, MultiLoad, FastLoad, FastExport, TPump
ETL Tools: Informatica PowerCenter
Data Backup Tools: TARA, DSA
Languages: Job Control Language (JCL), Java
Operating Systems: Microsoft Windows, UNIX, Linux
Confidential, Seattle, WA
Sr. Teradata Developer
- Worked on business requirements, technical requirements, and design documents, and coordinated with the data analyst team on requirements.
- Worked in coordination with the DBA team on remediation activities; created users, databases, and roles.
- Assisted in gathering business requirement from end users.
- Developed Shared Containers for reusability in all the jobs for several projects.
- Used stages such as Transformer, Sequential File, Oracle, Hashed File (for lookups), Aggregator, and Folder, and developed complex SQL and BTEQ scripts to process data per coding standards.
- Invoked shell scripts to execute the BTEQ, FastLoad, and MultiLoad utilities.
- Invoked Korn shell scripts to run reconciliation checks and pass parameter files.
- Created various dashboards and reports using the existing data model, Discoverer Reports, Noetix Views, and Oracle Forms & Reports in the DWBI environment.
- Used core Java for module programming and to build new functionality based on requirements.
- Created semantic-layer views on all base databases, giving end users read access.
- Interacted with various Business users in gathering requirements to build the data models and schemas.
- Responsible for re-architecting and redesigning ETL for data loads to EWD.
- Extensively worked in Mainframe/UNIX and Informatica environments to invoke Teradata utilities and handle files.
- Expertise with the job scheduling console (Tivoli) on GUI and MVS.
- Developed and reviewed code, supported QA, and performed unit and UAT testing.
- Implemented changes in coordination with the infrastructure team and provided warranty support.
- Troubleshot production and database development issues and documented them.
- Improved existing processes by integrating various sources and helping maintain code, ensuring processes remain efficient.
- Installed Teradata drivers for the Teradata utilities; refreshed data using the FastExport and FastLoad utilities.
- Used Teradata Administrator, Teradata Manager, and Viewpoint to monitor and control the system.
- Extracted data from various sources such as Oracle, DB2, and SQL Server and loaded it into Teradata.
- Reviewed statistics and joins to improve the performance of Teradata SQL using DBQL and explain plans.
- Changed the physical model to improve performance by implementing partitioning, compression, and indexes on EDW tables.
Environment: Teradata 14, UNIX, Korn shell, BTEQ, FastLoad, MultiLoad, FastExport, Oracle, SQL Server, Aqua Data Studio, Toad.
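The Korn-shell reconciliation checks mentioned above reduce to a count comparison between source and target. A minimal, self-contained sketch is below; the counts are hard-coded here, whereas the real job read them from BTEQ `SELECT COUNT(*)` output captured via parameter files:

```shell
# Minimal sketch of a reconciliation check (all names hypothetical).
# recon_check <source_count> <target_count>
recon_check() {
    if [ "$1" -eq "$2" ]; then
        echo "RECON OK: $1 rows"
        return 0
    else
        echo "RECON MISMATCH: source=$1 target=$2"
        return 1
    fi
}

# Demo invocation; real counts came from bteq output files.
recon_check 12500 12500   # prints "RECON OK: 12500 rows"
```

A production version would feed the two arguments from the BTEQ export step and route a nonzero exit status to the job scheduler to fail the batch.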
- Analyzed business requirements and system specifications to understand the application.
- Imported data from source files such as flat files using the Teradata load utilities FastLoad, MultiLoad, and TPump.
- Created ad hoc reports using FastExport and BTEQ.
- Designed Informatica mappings to propagate data from various legacy source systems to Oracle.
- The interfaces were staged in Oracle before loading into the data warehouse.
- Performed data transformations using various Informatica transformations such as Union, Joiner, Expression, Lookup, Aggregator, Filter, Router, Normalizer, and Update Strategy.
- Responsible for tuning report queries and ad hoc queries.
- Wrote transformations to convert data into the required form, based on client requirements, using Teradata ETL processes.
- Developed MLoad scripts and shell scripts to move data from source systems to staging and from staging to the data warehouse in batch mode.
- Exported data from the Teradata database using Teradata FastExport.
- Used UNIX scripts to run Teradata DDL in BTEQ and write results to a log table.
- Created, loaded, and materialized views to extend the usability of data.
- Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for several base tables, as a consistency check.
- Made modifications required for the reporting process by understanding the existing data model; involved in retrieving data from relational databases.
- Handled SSA requestor responsibilities assigned for both project and support requests.
- Loaded data from different systems into the Vertica database; performed backup, restore, and recovery of the Vertica database.
- Managed queries by creating, deleting, modifying, viewing, enabling, and disabling rules.
- Loaded data into the warehouse from different flat files.
- Performed database testing by writing and executing SQL queries to ensure that entered data was uploaded correctly into the database.
- Transferred files across various platforms using the secure FTP protocol.
- Created unit test plans and tested data for various applications.
Environment: Teradata V12.0, Informatica, Business Objects XI R3.1, Crystal Reports, Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SQL Server 2000, Sybase, DB2, Oracle, FTP, CVS, Windows XP, UNIX, Pentium server.
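A flat-file load of the kind described above is typically driven by a FastLoad script along these lines; the server, credentials, object names, and layout are all hypothetical:

```
/* Minimal FastLoad sketch (names and credentials hypothetical). */
SESSIONS 8;
LOGON tdprod/loaduser,secret;

/* Error tables from any prior failed run must be dropped first. */
DROP TABLE stage_db.sales_stg_err1;
DROP TABLE stage_db.sales_stg_err2;

BEGIN LOADING stage_db.sales_stg
      ERRORFILES stage_db.sales_stg_err1, stage_db.sales_stg_err2;

SET RECORD VARTEXT "|";
DEFINE sale_id (VARCHAR(18)),
       sale_dt (VARCHAR(10)),
       amount  (VARCHAR(18))
FILE = /data/in/sales.dat;

INSERT INTO stage_db.sales_stg
VALUES (:sale_id, :sale_dt, :amount);

END LOADING;
LOGOFF;
```

FastLoad requires an empty target table and records rejects in the two error tables, which is why the bulk loads above paired each run with error-table cleanup.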
Confidential, Atlanta, GA
- Understood the specifications and analyzed data according to client requirements.
- Created roles and profiles on an as-needed basis; granted privileges to roles and added users to roles based on requirements.
- Managed database space, allocating new space to databases and moving space between databases as needed.
- Assisted developers and DBAs in design, architecture, development, and query tuning for the project, including query modification, index selection, and refreshing statistics collection.
- Proactively monitored and aborted bad queries using PMON, looked for blocked sessions, and worked with development teams to resolve them.
- Proactively monitored database space, identified tables with high skew, and worked with the data modeling team to change the primary index on highly skewed tables.
- Worked on moving tables from test to production using FastExport and FastLoad.
- Extensively worked with DBQL data to identify high-usage tables and columns.
- Implemented secondary indexes on heavily used columns to improve performance.
- Worked on exporting data to flat files using Teradata FastExport.
- Worked exclusively with Teradata SQL Assistant to interface with Teradata.
- Wrote several Teradata BTEQ scripts to implement business logic.
- Populated data into Teradata tables using the FastLoad utility.
- Created complex Teradata macros, views, and stored procedures for use in reports.
- Performed error handling and performance tuning in Teradata queries and utilities.
- Created error log tables for bulk loading.
- Actively involved in the TASM workload management setup across the organization: defined TASM workloads, developed TASM exceptions, and implemented filters and throttles as needed.
- Worked on capacity planning; reported disk and CPU usage growth using Teradata Manager, DBQL, and ResUsage.
- Used the Teradata Manager collection facility to set up AMP usage collection, canary query response, spool usage alerts, etc.
- Developed complex mappings using multiple sources and targets in different databases, flat files.
- Developed Teradata BTEQ scripts; automated workflows and BTEQ scripts.
- Performed query optimization (explain plans, statistics collection, primary and secondary indexes).
Environment: Teradata 12, Viewpoint, TASM, TARAGUI, Symantec Netbackup, Teradata Administrator, Teradata SQL Assistant, BTEQ, Subversion, Informatica.
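The skew monitoring, statistics refresh, and secondary-index work described above usually follows a pattern like the following sketch; the table and column names are hypothetical:

```sql
-- 1. Check how evenly a suspect table is distributed across AMPs:
SELECT HASHAMP(HASHBUCKET(HASHROW(cust_id))) AS amp_no,
       COUNT(*) AS row_cnt
FROM   prod_db.orders
GROUP  BY 1
ORDER  BY 2 DESC;

-- 2. Refresh the statistics the optimizer relies on:
COLLECT STATISTICS ON prod_db.orders COLUMN (cust_id);

-- 3. Compare the plan before and after adding an index:
EXPLAIN SELECT * FROM prod_db.orders WHERE order_dt = DATE '2015-01-01';

-- 4. Add a secondary index on a heavily filtered column:
CREATE INDEX (order_dt) ON prod_db.orders;
```

A large spread in `row_cnt` across AMPs in step 1 is what flagged the high-skew tables that were handed to the data modeling team for a primary index change.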
- Involved in the complete software development lifecycle (SDLC), from business analysis through development, testing, deployment, and documentation.
- Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
- Wrote BTEQ scripts to transform data.
- Wrote FastExport scripts to export data.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts).
- Wrote views based on user and/or reporting requirements.
- Wrote Teradata Macros and used various Teradata analytic functions.
- Involved in migration projects, moving data warehouses from Oracle/DB2 to Teradata.
- Performance tuned and optimized various complex SQL queries.
- Wrote many UNIX scripts.
- Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
- Gathered system design requirements; designed and wrote system specifications.
- Excellent knowledge of ETL tools such as Informatica and SAP BODS, making various connections to load and extract data to and from Teradata efficiently.
- Agile team interaction.
- Worked on data warehouses ranging from 30 to 50 terabytes.
- Coordinated with the business analysts and developers to discuss issues in interpreting the requirements.
Environment: Teradata 12/13, Teradata SQL Assistant, SQL, VSS, Outlook, PuTTY, MultiLoad, TPump, FastLoad, FastExport, TDWM, PMON, DBQL.
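The BTEQ scripting described throughout (running DDL, logging outcomes, branching on errors) commonly takes a form like this sketch; the server, credentials, object names, and log-table layout are all hypothetical:

```
.LOGON tdprod/etluser,secret
.SET ERROROUT STDOUT

/* Run the DDL step. */
CREATE TABLE work_db.daily_stg
( load_dt DATE
, src_cnt INTEGER
) PRIMARY INDEX (load_dt);

/* Branch to the failure label if the statement errored. */
.IF ERRORCODE <> 0 THEN .GOTO FAIL

/* Record success in a hypothetical run-log table. */
INSERT INTO work_db.run_log (step_nm, run_ts, status)
VALUES ('create daily_stg', CURRENT_TIMESTAMP(0), 'OK');

.QUIT 0

.LABEL FAIL
.QUIT 8
```

The nonzero `.QUIT` code is what the invoking UNIX shell scripts tested to decide whether to halt the batch and alert on-call support.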