Teradata Lead Resume
SUMMARY
- 12+ years of experience in Data Warehousing, delivering business solutions on Teradata, MS SQL Server, DB2 and Oracle databases.
- 9+ years of experience as a team member and team lead, managing teams of Teradata Developers, DBAs and Data Modelers / Architects.
- Extensive experience with Teradata tools and utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, TSET, PMON, Teradata Administrator, Teradata Manager, Viewpoint, ARCMAIN, NetVault, NetBackup (via the TARA GUI) and DSA, and with ETL tools such as ODI, DataStage (Server / Parallel Extender) and Informatica, plus Toad and SQL Developer.
- Led a team of 15 members through ODC setup, software installation and the KA/KT process until go-live, then coordinated between onsite and offshore teams.
- Recently trained on Hadoop (Cloudera CDH).
- Worked in Germany for Confidential and in Singapore for Confidential Bank as a Teradata DBA.
- Experience in development, support and maintenance projects, handling very large databases across PROD, DEV and TEST environments under on-call or 24x7 support.
- Developed SQL queries for ETL and semantic structures for the DWH by creating tables in the ETL staging and semantic layers of the EDW.
- Designed and developed the staging tables needed to transform and store data from the OLTP environment prior to export to the data warehouse; also created dimension and fact tables for the DWH.
- Experience in design, implementation and maintenance of OLTP and OLAP data warehouse systems, data and technical architecture, ETL processes, Business Intelligence and enterprise application integration using tools such as ODI, DataStage and Informatica, with GoldenGate replication from source to staging.
- Performance tuning of data warehouse and ETL loads; documentation across requirements, design, development and testing phases; team management; code review; analysis of existing systems; impact analysis; integration testing; defect analysis and fixes.
- Expertise in building the semantic layer that provides the basis for the analytics and reporting needed by the reporting platform.
- Experience with physical data modeling, forward engineering, reverse engineering, Complete Compare and database design using Erwin.
- Experienced in performance tuning, PPI concepts, join indexes, secondary indexes, compression techniques and strategies, query optimization and Teradata SQL.
- Hands-on experience installing and configuring Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie, Hive, Sqoop and Pig.
- Assisted in the design, development and architecture of the Hadoop ecosystem.
- Created and implemented end-to-end software/version upgrades, node additions and hardware relocation activities.
- Extensively worked with the NetVault backup utility: adding tapes, creating pools, allocating drives, creating save sets, etc.
- Extensively worked on Mainframes, UNIX Shell scripts.
- Knowledge of scheduling tools such as Control-M, cron, Autosys and Appworx.
- Followed the ITIL process for handling incidents, change requests, service requests and problem tickets using tools such as BMC Remedy.
- Generated and presented management reports on activities performed, issues faced, CR management, and high-priority incidents resolved within defined service-level goals; handled management activities such as resource shifts, shift handover, leave status, on-call rotation, monthly highlights, and team building through technical trainings and sessions.
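As a concrete illustration of the staging-table design and compression work above, here is a minimal Teradata DDL sketch; the database, table and column names (and the compressed values) are hypothetical, not taken from any project:

```sql
-- Hypothetical staging table illustrating MVC compression and a PPI.
CREATE MULTISET TABLE stg_db.stg_sales
( sale_id   INTEGER NOT NULL,
  store_cd  CHAR(4) COMPRESS ('S001', 'S002'),  -- MVC on frequent values
  sale_dt   DATE FORMAT 'YYYY-MM-DD' NOT NULL,
  amount    DECIMAL(12,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2015-01-01' AND DATE '2015-12-31'
                      EACH INTERVAL '1' MONTH);
```

Partitioning by month lets date-bounded loads and reports touch only the relevant partitions instead of scanning the whole table.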
TECHNICAL SKILLS
RDBMS: Teradata V14x, 13x, 12, V2R6, V2R5, Oracle 8i, 9i
Administrator Tools: Teradata Manager, Administrator, SQL Assistant (Queryman), Performance Monitor (PMON), Viewpoint, Studio Express, TSET
Database Utilities: Lockdisp, Showlock, Vprocmanager, Scandisk, Checktable
System utilities: dbscontrol, ferret, PDCR
Load/ Unload Utilities: BTEQ, FastExport, FastLoad, MultiLoad, Tpump, TPT, Data Mover
Backup Tools: NetVault, NetBackup, TARA/DSA and Teradata ARC
ETL Tools: ODI, DataStage 7.5/PX, Informatica 8.x
Reporting Tools: Cognos 8.x, Business Objects 6.5, SAS 9.1.3
Modeling Tools: Erwin 7.x, 8.2
Scheduling Tools: Control-M, Cron Tab, Appworx.
Replication Tools: GoldenGate
Domains: Manufacturing, Banking, Automobile, Retail, Telecom
Operating Systems: Windows, UNIX, Mainframes
PROFESSIONAL EXPERIENCE
Confidential
Teradata Lead
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Participate in Data Modeling discussions and provide inputs to build new data model.
- Analyse business report requirements and prepare design documents based on the BRDs.
- Analyse source and target systems and prepare data element mapping documents.
- Performed reverse and forward engineering.
- Verified that data models and databases are in sync by running Complete Compare across environments.
- Review the data model with functional and technical team
- Maintain change log for each data model.
- Assist developers, ETL, BI team and end users to understand the data model.
- Responsible for the entire data model related sign offs.
- Co-Ordinate with ETL team for further ETL Jobs development.
- Teradata DBA support to application teams in creating database objects, tuning queries and handling loading jobs.
- Developing jobs for loading and unloading using the DataStage PX Designer.
- Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager and Workflow Manager.
- Used transformations such as Filter, Joiner, Expression, Aggregator, Router, Rank, Lookup and Update Strategy.
- Worked on the Cloudera Distribution, one of the major contributors to Apache Hadoop.
- Handled installation, configuration and maintenance of the Hadoop cluster, cluster monitoring and troubleshooting, and data transfer from RDBMS to HDFS.
- Responsible for implementation and ongoing administration of the Hadoop infrastructure.
- Used the Hadoop cluster as a staging environment for data from heterogeneous sources during the data import process.
- Configured High Availability on the NameNode for the Hadoop cluster as part of the disaster recovery roadmap.
- Develop unit test plans and unit test cases to validate data against test scenarios covering both business and technical requirements.
- Develop generic UNIX scripts to execute BTEQ scripts that load data into the staging and semantic layers.
- Develop BTEQ scripts, supported by UNIX scripts, to extract, transform and load data from multiple source applications into the IDW/ICDW.
- Implement performance tuning on source and target tables to maximize efficiency and performance at the reporting level.
- Create materialized views / join indexes on the summary tables.
- Prepare Audit entries
- Perform Peer Review and Code Walkthrough.
- Complete code test and migrate the code to QA and UAT environments.
- Assist in Deployment and provide Technical & Operational support during Install.
- Post implementation support.
- Coordinate with onshore and offshore teams.
- Review the code developed by the offshore team and validate the test results.
- Ensure overall quality of all the deliverables within the timelines.
- Able to motivate and lead others in a team environment.
- An ability to build rapport and trust quickly with work colleagues.
Environment: Teradata 14x, UNIX, Appworx, Erwin 8x, Informatica 9x, DataStage 8x, GoldenGate
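The BTEQ-driven staging-to-semantic loads listed above can be sketched as a single script; all logon values and object names below are placeholders, not actual project code:

```sql
-- Placeholder logon; real credentials come from a secured logon file.
.LOGON tdpid/etl_user,password;
-- Refresh a hypothetical semantic-layer summary from the staging layer.
DELETE FROM sem_db.fct_sales_daily;
.IF ERRORCODE <> 0 THEN .QUIT 8;          -- surface failures to the UNIX wrapper
INSERT INTO sem_db.fct_sales_daily (store_cd, sale_dt, tot_amount)
SELECT store_cd, sale_dt, SUM(amount)
FROM   stg_db.stg_sales
GROUP BY store_cd, sale_dt;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```

A generic UNIX wrapper would invoke this via `bteq < script.btq` and branch on the exit code to drive restarts and alerting.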
Confidential
Teradata Lead
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Work closely with six Confidential local-market teams (management, application DBA, BI) to identify root causes, provide solutions for critical job failures and respond to escalations.
- Interact with the client in daily/weekly calls and technical discussions for a smooth transition, ensuring timely, defect-free delivery within the SLA.
- Participate in meetings with offshore and onsite groups to share Teradata knowledge whenever needed.
- Designing, creating and tuning physical database objects (tables, views, indexes) with Users and developer's support.
- Working on code and application enhancements like modifying the scripts as per the request both in SQL and in UNIX.
- Tuning application queries, reporting queries, long-running queries and ETL queries using optimization techniques such as SQL code review, secondary indexes (NUSI), join indexes (JIs) and aggregate join indexes (AJIs).
- Data monitoring and production load monitoring via reports; handling load and data-integrity issues.
- Reconciling the data against source-system extracts to ensure the DWH load completed correctly.
- Processing and transforming data for each country per the mapping rules provided; each country is handled separately for processing purposes.
- Preparing the job scheduler list and maintaining dependencies while scheduling jobs in the scheduling tools.
- Creating incidents and coordinating with Teradata GSC during P1/P2 system-unavailability issues such as AMP down, node failures, DBS hangs and other hardware failures, collecting the necessary information with TSET.
- Table design and indexes selection.
- Catering to ad-hoc requests (e.g. creating users and database objects, granting privileges to a user/role/profile, resetting passwords, releasing load locks, space movements).
- Creating and executing DDL/DML as requested by the developers.
- Database maintenance such as space management: monitoring, identifying and troubleshooting problems, and reclaiming space by removing unwanted objects.
- Involved in hardware and software version upgrades of the systems.
- NetVault backup/restore/analyze monitoring; job creation and modification; defining and scheduling backups in the assigned window; working on ad-hoc requests.
- Involved in tape management: assigning tapes first to the blank pool, then creating new pools and moving tapes into them; report management.
- Performing data copies and backup/restore jobs via NetVault; providing support to the development and ETL teams for Teradata-related queries.
- Providing Teradata solutions to end business users and resolving the Teradata login issues.
- Data refreshing from production to development and test boxes.
- Handling situations where Teradata is stuck, e.g. when the system goes into flow control.
- Proactively monitoring the database for blocking issues and system capacity (CPU and IO usage) using Viewpoint, PMON and TD Manager.
- Tuned many queries by fixing skew and reducing spool usage, resulting in millions of impact-CPU savings.
- Developing statistics collection processes; identifying objects, columns and indexes for stats collection using the Statistics Wizard by defining workloads.
- Coordinating and implementing monthly MVC (multi-value compression) analysis and turnover for space savings on production systems.
- Handling Teradata-related failures (spool-outs, perm-space shortfalls, etc.) and providing immediate resolution.
- Reviewing the export, import and ARCMAIN scripts prepared by the groups.
- Installed Teradata utilities on platforms like Windows, Linux etc.
- Refreshed the data by ARC, fast export, Multiload, fast load and Data Mover utilities.
- Identifying suitable objects for archiving in consultation with business users, and writing archive jobs to use the existing space efficiently; this covers both staging and archive data.
- Use of Data Dictionary/Directory tables and views to manage the system
- Assisting developers/users to tune queries.
- Overseeing daily and weekly back-ups.
- Analyse the workloads and configure TDWM rules and throttles using DBQL.
- Configured Alerting for Database health like session blocking, Space threshold, restart.
- Preparation of Production support document.
- Change request management, population of History tables.
- Implementing DBQL data collection and using the query-logging data at the user or account level for tuning code.
- Reviewing the projects in all the phases, Design, development, testing and migrations.
- Housekeeping activities like table splitting, DBQL maintenance, taking backup of core loads.
- Maintenance activities such as CheckTable, Scandisk and PackDisk via the Ferret utility, keeping sufficient free cylinders so the system performs optimally.
- Using AMPUsage, ResUsage and DBQL data to analyse resource usage by account and identify bottlenecks.
Environment: Teradata 14x, 13x, Windows, UNIX, Mainframes
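A DBQL review query of the kind used in the skew and spool tuning above might look like the following; the thresholds are arbitrary examples and exact column availability can vary by Teradata release:

```sql
-- Find today's heavy, skewed queries from the query log (illustrative).
SELECT  QueryID,
        UserName,
        AMPCPUTime,
        SpoolUsage,
        MaxAMPCPUTime * NumOfActiveAMPs / NULLIFZERO(AMPCPUTime) AS CPUSkewRatio
FROM    DBC.DBQLogTbl
WHERE   CAST(StartTime AS DATE) = CURRENT_DATE
  AND   AMPCPUTime > 1000                 -- arbitrary "heavy query" cutoff
ORDER BY CPUSkewRatio DESC;
```

A ratio far above 1 means one AMP did most of the CPU work, which usually points at a skewed primary index or join column.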
Confidential
Assistant Consultant Teradata
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Providing support to Development ETL team for Teradata related queries
- Table design and indexes selection
- Providing Teradata solutions to end business users
- Creating and managing user accounts, databases and data with appropriate security.
- Creating and executing the DDL’s / DML’s as requested by the developers.
- Database and application performance tuning.
- Database maintenance like space management, identifying and releasing unwanted objects
- Performing backup and restore activities using TARA.
- Reviewing the export, import and ARCMAIN scripts prepared by the groups.
- Data refreshing from production to development and test boxes
- Monitoring the system using PMON,TD Manager, Viewpoint
- Handling situations where Teradata is stuck.
- Developing scripts for load and unload utilities such as FastLoad, MultiLoad, FastExport and BTEQ.
Environment: Teradata 14x, 13x, Windows, UNIX, Mainframes
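An ARCMAIN archive script of the kind reviewed above can be sketched as follows; the logon, database and archive file names are placeholders:

```sql
LOGON tdpid/dba_user,password;
ARCHIVE DATA TABLES (prod_db) ALL,   -- all tables in a hypothetical database
  RELEASE LOCK,
  FILE = ARCHFILE;                   -- logical name mapped to the backup device
LOGOFF;
```

A matching RESTORE/COPY script reverses the operation onto the dev or test box.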
Confidential
Assistant Consultant Teradata
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Providing support to Development ETL team for Teradata related queries
- Table design and indexes selection
- Providing Teradata solutions to end business users; interacting with the client in daily/weekly calls and technical discussions for a smooth transition, ensuring timely, defect-free delivery within the SLA.
- Participating in meetings with offshore and onsite groups to share Teradata knowledge whenever needed.
- Involved in tape management: assigning tapes first to the blank pool, then creating new pools and moving tapes into them; report management.
- Creating incidents and coordinating with Teradata GSC during system unavailability P1/P2 issues like AMP down, node failures, DBS hung and other hardware related failures by collecting necessary information from TSET.
- Catering to Adhoc requests (e.g. Creation of users, Database Objects, Granting privileges to a user/role/profile, resetting the password, release of load locks, space movements.)
- Creating and executing the DDL’s / DML’s as requested by the developers.
- Designing, creating and tuning physical database objects (tables, views, indexes) with Users and developer's support.
- Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables.
- Designed and developed process to handle high volumes of data and high volumes of data loading in a given SLA.
- Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Informatica scheduling tool.
- Expertise in creating control files to define job dependencies and for scheduling using Informatica.
- Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
- Database maintenance such as space management: monitoring, identifying and troubleshooting problems, and reclaiming space by removing unwanted objects.
- Involved in hardware and software version upgrades of the systems.
Environment: Teradata 13x, Oracle, Windows, UNIX, Informatica 8x
Confidential
Teradata Team Lead
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Work with the business team, architects and developers to create/modify physical data models from the logical model, applying business rules in accordance with the database architecture.
- Create SQL Scripts from Physical Data Model.
- Generate scripts/ schema and reports from data model.
- Review the data model with functional and technical team
- Maintain change log for each data model.
- Performed reverse and forward engineering.
- Verified that data models and databases are in sync by running Complete Compare across environments.
- Assist developers, ETL, BI team and end users to understand the data model.
- Responsible for the entire data model related sign offs.
- Co-Ordinate with ETL team for further ETL Jobs development.
- Teradata DBA support to application teams in creating database objects, tuning queries and handling loading jobs.
- Providing support to Development ETL team for Teradata related queries
- Table design and indexes selection
- Providing Teradata solutions to end business users
Environment: Erwin 7x, 8x, Teradata 12, Oracle, Windows, UNIX
Confidential
Teradata Team Lead
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Creating and managing user accounts, databases and data with appropriate security.
- Creating and executing the DDL’s / DML’s as requested by the developers.
- Database and application performance tuning.
- Database maintenance like space management, identifying and releasing unwanted objects
- Performing backup and restore activities using TARA.
- Reviewing the export, import and ARCMAIN scripts prepared by the groups.
- Data refreshing from production to dev and test boxes
- Monitoring the system using PMON,TD Manager, Viewpoint
- Developing scripts for load and unload utilities such as FastLoad, MultiLoad, FastExport and BTEQ.
- Handling situations where Teradata is stuck.
- Developing jobs for loading and unloading using the ODI Designer and DataStage PX Designer.
- Preparing models, interfaces and packages in ODI to load data.
- Generating scenarios for the packages and interfaces.
- SQL code review for performance tuning
- Provided support to Development ETL team and Production team for Teradata related queries.
- Working on code and application enhancements like modifying the scripts as per the request both in SQL and in UNIX.
- Reconcile the data with the source system extracts to ensure correct DWH load completed.
- Process and transform the data as per the mapping rules provided.
- Report and handle load errors.
- Preparing job scheduler list.
- Involved in unit testing.
Environment: ODI, DataStage, Cognos 8.4, Oracle 9i, Teradata 6x, UNIX, Windows
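The FastLoad scripts mentioned above follow a standard shape; every object name and the input path below are placeholders (note FastLoad also requires the target table to be empty):

```sql
LOGON tdpid/etl_user,password;
DATABASE stg_db;
DROP TABLE stg_sales_err1;                -- clear previous error tables
DROP TABLE stg_sales_err2;
SET RECORD VARTEXT '|';                   -- pipe-delimited input
DEFINE sale_id  (VARCHAR(10)),
       store_cd (VARCHAR(4)),
       sale_dt  (VARCHAR(10)),
       amount   (VARCHAR(14))
FILE = /data/in/sales.txt;                -- placeholder input path
BEGIN LOADING stg_sales ERRORFILES stg_sales_err1, stg_sales_err2;
INSERT INTO stg_sales VALUES (:sale_id, :store_cd, :sale_dt, :amount);
END LOADING;
LOGOFF;
```

Rows that fail conversion land in the first error table, duplicate-row and index violations in the second, which is where load-error analysis starts.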
Confidential
Teradata Developer
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Database Administration tasks such as creating users, Roles and Profiles
- Creating and executing the DDL’s / DML’s as requested by the developers.
- Database and application performance tuning.
- Database Maintenance - monitoring & troubleshooting problems and fixing them by maintaining space, etc.
- Performing backup and restore activities.
- Developing scripts for load and unload utilities such as FastLoad, MultiLoad, FastExport and BTEQ.
- Reviewing the export, import and ARCMAIN scripts prepared by the groups.
- Handling situations where Teradata is stuck.
- Developed complex mappings using the Informatica PowerCenter Designer to transform and load data from source systems such as Oracle, DB2, SQL Server and flat files into the target EDW.
- Created an error-handling strategy to trap errors in a mapping and route them to an error table.
- Used the Debugger to validate mappings and to gather troubleshooting information about the data by inserting breakpoints.
- Developed and fine-tuned transformations, mappings and mapplets between source systems and warehouse components.
- Worked on code Migration Strategies between Development, Test and Production repositories
- Created and scheduled Workflows using Informatica workflow manager.
- Data Monitoring and Production load monitoring.
- Daily Load analysis and Data Validation.
- Major operation support by solving Priority 1 cases.
- Working on code and application enhancements.
- Maintaining and supporting the production and deploying enhancements into production.
- Participating in meetings with client.
- Handling batch failures in daily, month-end and year-end activities to meet 100% of SLAs.
- Actively participated in performance tuning of queries.
- Delivered internal Teradata sessions to the application support and duty teams.
- Participated in conference calls with onsite coordinators.
- Ensuring timely and defect free delivery
Environment: Informatica, Oracle 9i, Teradata 6x, UNIX, Windows.
Confidential
Data Stage ETL Developer
Responsibilities:
- Understanding the BRD, data requirement documents and mapping documents.
- Involved in gathering and discussing requirements with users and members of the source-owner teams.
- Extracted data from sources such as flat files and Oracle, loading it into target staging and then into the final warehouse, a Teradata database, using DataStage.
- Involved in developing server jobs using the DataStage Designer client tool.
- Prepared ETL specifications for job development.
- Performed aggregation and filtering on the extracted input data using built-in active stages such as Aggregator, Sort and Transformer.
- Used local and shared containers to simplify complex designs.
- Done Performance tuning in Data Stage.
- Used Job Control and the Job Sequencer to run jobs with parameters.
- Involved in compiling, validating, running, Monitoring and scheduling jobs.
- Data Monitoring and Production load monitoring.
- Daily Load analysis and Data Validation.
- Involved in reprocessing incoming aggregate data from mediation CDRs to aggregated MCDRs.
- Involved in implementing BTEQ scripts and bulk-load jobs.
- Population of History tables.
- Loading data for end users using fast load and multi load utilities and exporting data to flat files using fast export.
- Resolve and work on issues across multiple functional areas.
- Effectively monitor and take action to ensure coordination and effectiveness of all components and activities and decide on issues requiring escalation.
- Involvement in optimization.
- Administration of Teradata database in development and production like user, space and session management.
- Monitoring ETL jobs until production jobs are stabilized.
Environment: Oracle 9i, Teradata V2R5/6, Data Stage 7x, BO 6x, UNIX, Windows.
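A FastExport extract of the kind used above for flat-file delivery can be sketched like this; the log table, logon values and output path are placeholders:

```sql
.LOGTABLE utl_db.fexp_sales_log;          -- restart log table (placeholder)
.LOGON tdpid/etl_user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/out/sales_extract.txt
        MODE RECORD FORMAT TEXT;
SELECT  TRIM(store_cd) || '|' ||
        CAST(sale_dt AS VARCHAR(10)) || '|' ||
        CAST(amount AS VARCHAR(20))
FROM    stg_db.stg_sales;
.END EXPORT;
.LOGOFF;
```

Concatenating the columns into one delimited string is a common way to produce a pipe-delimited flat file for downstream consumers.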