Sr. Teradata DBA Resume
MN
SUMMARY:
- A dynamic, energetic, and talented professional with 6+ years of experience in Teradata and SQL Server database administration.
- Expertise in installation, configuration, and maintenance of Teradata and SQL Server.
- Experience in database backup and disaster recovery procedures.
- Experienced in deploying Teradata patches, performing installs and fixes, and working out configuration settings.
- Experienced in archiving, restoring, and recovering data on Teradata.
- Created complex DTS Packages for Data Migration and Transformation.
- Experience in data migration from Oracle/Access/Excel Sheets to SQL Server 2005/2000/7.0
- Excellent knowledge of moving database objects between Teradata systems.
- Migrated Teradata databases from V2R5 to V2R6.
- Experience in MS SQL Server upgrades from 7.0 to MS SQL Server 2000 and from 2000 to MS SQL server 2005.
- Strong cross-database coding and design skills (Oracle, Teradata, and SQL Server).
- Administered Teradata databases using Teradata Manager and Teradata Administrator.
- Created and maintained databases, users, tables, views, macros, and stored procedures using SQL Assistant (Queryman), BTEQ, MultiLoad, and FastLoad.
- Good experience in database administration, including database maintenance, backup/recovery, and performance tuning.
- Trained and experienced in scheduling backups and recovery of entire EDW databases across various geographical locations for business continuity and response time.
- Highly skilled in using tools for managing Teradata resources to minimize impeded performance, maximize throughput, and manage resource consumption.
- Provided 24x7 Production - on call support for all the production databases
- Experienced in reading DBQL and PMON output to identify runaway queries and hot AMPs.
- Skilled in Teradata EXPLAIN, statistics collection, and multi-value compression.
- Skilled on performing BAR activities using NetVault 7
- Worked on MVC and familiar with TASM
- Good coding and tuning skills with Teradata load utilities like FastLoad and Teradata Parallel Transporter.
- Led the team, with responsibility for finishing net change/delta loads on time for customers.
- Developed and deployed solutions that support integration of data in all environments - transactional as well as operational and analytical.
- Provided a solid basis for expediting transactions, streamlining operations, making optimal decisions, and building large data warehouses and data marts.
- Experience working with client managers and developers to understand their needs and provide the necessary support.
- Wrote SQL scripts for backend databases, custom stored procedures, macros, and packages, along with referential-integrity triggers. Enhanced and customized load and backup scripts for newly developed and in-design projects.
- Comfortable working in a team and able to work independently when necessary.
- Active team leader with good presentation, problem solving, communication and interpersonal skills.
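The database, user, and macro creation work summarized above can be sketched as follows; all object names, space sizes, and the password are hypothetical placeholders, not production values:

```sql
-- Hypothetical names and sizes throughout; shown only as a pattern.
CREATE DATABASE sales_db FROM dbc AS
    PERM = 10e9, SPOOL = 20e9;

CREATE USER etl_user FROM sales_db AS
    PASSWORD = temp_pw_1,
    PERM = 0, SPOOL = 5e9,
    DEFAULT DATABASE = sales_db;

-- A simple macro wrapping a frequently run query.
REPLACE MACRO sales_db.daily_rowcount AS (
    SELECT COUNT(*) FROM sales_db.orders
    WHERE order_date = CURRENT_DATE;
);
```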
TECHNICAL SKILLS:
Languages: Confidential -SQL, HTML, Perl, VB, C, Java, JSF, ANT
Applications: Teradata Manager, PMON, Query Manager, Query Scheduler, IIS, Teradata Utilities, Teradata Statistics and Index Wizard
RDBMS: Teradata V2R5/V2R6, SQL Server 2000, 2005, Oracle 9i/10g
Operating Systems: Windows 98/NT/XP, Windows 2003 Server, UNIX, Mainframe
GUI: SQL Query Analyzer, SQL Profiler, Query Optimizer, Oracle SQL Developer, Ascential Datastage 7.5, Informatica, Microstrategy, TOAD, VERITAS NETBACKUP
Skills: Software Installation, Upgrading & Troubleshooting Administration, Patch & Change Management, Performance Tuning, Unix Shell Scripting, Backup & Recovery.
PROFESSIONAL EXPERIENCE:
Confidential, Minneapolis, MN
Sr. Teradata DBA
Responsibilities:
- Creating, maintaining, and supporting Teradata architectural environment
- Worked with Teradata utilities like BTEQ, FastLoad, MultiLoad, FastExport, and TPump.
- Worked with databases and tables in DEV, UAT, SIT, and production environments.
- Strong experience with Teradata tools such as SQL Assistant, PMON, Teradata Administrator, Viewpoint server, and NetVault backup server.
- Supported multiple development teams; handled production support, query performance tuning, and system monitoring.
- Performing backup and recovery operations using Netvault server.
- Developed proactive processes for monitoring capacity and performance.
- Supporting the application development teams for database needs and guidance.
- Used SQL to query the databases, pushing as much of the crunching as possible into Teradata, and applied query optimization (EXPLAIN plans, statistics collection, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
- Designed DDLs and efficient PIs along with Identity Keys for efficient data distribution
- Assisted developers with coding and with resolving inefficient join issues.
- Responsible for backups every night after major loads.
- Developed statistics macros and automated to run based on the frequency.
- Highly successful in testing fail over nodes and vprocs migration.
- Experienced in loading, archiving, and restoring data. Duties also included storage optimization, performance tuning, monitoring, UNIX shell scripting, and physical and logical database design.
- Controlled and tracked access to Teradata Database by granting and revoking privileges.
- Planned the releases, monitored performance and reported to Teradata for further technical issues.
- Worked with EIW User Support for granting access to the user roles and Batch ID’s.
- Worked with the NetVault server to back up all databases that required it.
- Worked with developers and the Ab Initio team to provide access and implement change requests in production and non-production environments.
- Worked on SQL tuning, lock release during backups, and the Viewpoint server to monitor environments such as Dev, SIT, UAT, Prod, and BCP.
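The statistics and EXPLAIN work above follows a pattern like this sketch (database, table, and column names are hypothetical):

```sql
-- Collect statistics so the optimizer has accurate demographics.
COLLECT STATISTICS ON edw_prod.orders COLUMN (cust_id);
COLLECT STATISTICS ON edw_prod.orders COLUMN (order_date);

-- EXPLAIN shows the plan: join order, row redistribution across AMPs,
-- spool usage, and the optimizer's confidence given the stats above.
EXPLAIN
SELECT c.cust_name, SUM(o.order_amt)
FROM edw_prod.customers c
JOIN edw_prod.orders o
  ON c.cust_id = o.cust_id
GROUP BY c.cust_name;
```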
Environment: Teradata 13, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLoad, TPT, PMON, ARCMAIN, TASM, NetVault, UNIX, shell scripts.
Confidential, Saint Louis
Database Administrator/Analyst
Responsibilities:
- Acted as an onsite coordinator working to resolve issues; experienced in production support for Teradata environments.
- Worked with the team to ensure that adequate hardware resources were allocated to the databases for high availability and optimum performance.
- Proactively monitored the database systems to ensure secure service with minimum downtime; responsible for meeting SLAs.
- Capable of supporting and implementing technology projects as the primary DBA in support of business initiatives
- Provided primary maintenance support for production and development databases, monitoring through Teradata Manager and the PMON window and collecting statistics on particular databases when required.
- Checked PERM and SPOOL space usage for each database.
- Performed database installs and upgrades, performance tuning, analyzed performance issues, setup of user access, backup and recovery.
- Implemented database refresh using full export for table level and full database defragmentation, resulting in improved performance and effective space management.
- Developed the archival, backup, and recovery strategy; planned, performed, and monitored database backups. Performed backup and restore operations on Teradata databases using NetVault and ARCMAIN. Provided 24x7 on-call production support.
- Knowledge in Fastload, Multiload, TPUMP, FastExport and BTEQ scripts
- Analyzed job failures in Veritas and checked media drives and tape drives when they were down.
- Superior communication, presentation, analytical, and problem-solving skills; work well with all levels of the business.
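Checking PERM headroom as described above is typically a query against the DBC.DiskSpace view, along these lines:

```sql
-- Current vs. maximum PERM per database, worst-filled first.
SELECT DatabaseName,
       SUM(MaxPerm)     AS max_perm,
       SUM(CurrentPerm) AS current_perm,
       CAST(SUM(CurrentPerm) * 100.0 / NULLIFZERO(SUM(MaxPerm))
            AS DECIMAL(5,2)) AS pct_used
FROM DBC.DiskSpace
GROUP BY 1
ORDER BY pct_used DESC;
```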
Environment: Teradata Manager, PMON, Teradata utilities (BTEQ, SQL Assistant, TPump, FastLoad, MultiLoad, FastExport), Teradata Visual Explain, Index Wizard, Veritas NetBackup, ARCMAIN.
Confidential, Seattle, WA
Teradata Application DBA
Responsibilities:
- Defined account IDs, priority scheduler performance groups, and system date and time substitution variables in user and profile definitions.
- Experienced in loading, archiving, and restoring data. Duties also included storage optimization, performance tuning, monitoring, UNIX shell scripting, and physical and logical database design.
- Controlled and tracked access to Teradata Database by granting and revoking privileges.
- Implemented Teradata protection features; handled table design and index selection; table implementation, maintenance, and backup; problem support; workload monitoring and control; policies, procedures, and guidelines governing the Teradata environment; SQL code review; developer and user support and training; capacity planning; system software testing and benchmarking; and support and coordination during hardware upgrades.
- Designed DDLs and efficient PIs along with Identity Keys for efficient data distribution
- Assisted developers with coding and with resolving inefficient join issues.
- Responsible for backups every night after major loads.
- Allocated space to users, controlled spool space, and assigned table space.
- Planned the releases, monitored performance and reported to Teradata for further technical issues.
- Walked users through their inefficient queries in PMON via WebEx sessions.
- Worked with Teradata to deploy patches, perform installs and fixes, and work out the settings.
- Applied DBQL settings according to business and application standards.
- Defined relationships as the identifying and non-identifying relationships ensuring integrity constraints.
- Designed and Architected ETL Preprocess system to capture delta loads.
- Responsible for troubleshooting and releasing MLoads in different phases and restart scenarios.
- Wrote UNIX shell scripts for initialization process, scheduling and control mechanism.
- Supported the platform along with patches.
- Identified resource-hogging queries and fine-tuned them for performance.
- Excellent team player; complied with security policies.
- Ran DBQL and EXPLAINs to see an in-depth picture of query behavior.
- Tutored developers regarding the PIs and Optimizer behavior about different queries.
- Developed SQL scripts to identify data anomalies and to perform complex data validation and data cleansing.
- Hand-coded Teradata SQL queries in the Teradata CLI stage; wrote macros, stored procedures, and triggers in Teradata. Extracted valid data to avoid overhead, designed test cases and error codes, and tested the DataStage jobs before running them in pre-prod; also helped the ETL team.
- Interacted with Business Units and Analysts to cleanse the data and match the Operating Standards.
- Used rich database-querying experience in performance tuning of ETL jobs and of embedded SQL queries in OCI, CLI, MultiLoad, TPump, and SQL*Loader.
- Imported and Exported Repositories cross projects.
- Prepared functional and technical specifications for the preprocess system.
- Highly competent technically in all phases of application systems development.
- Designed the Utility testing environment for concurrent Loads on the Teradata box.
- Created efficient hash tables for referential-integrity and lookup purposes, used for validation.
- Experienced in writing SQL scripts for populating new fields added to tables on a one-shot basis and for solving the slowly-changing-dimension problem for one of the dimension tables.
- Performed unit testing of the individual jobs and integration testing of the extract-transform-load job sequences.
- Analyzed the data sources to identify data anomalies, patterns, and value ranges; wrote SQL scripts to accomplish the same.
- Compiled and debugged the jobs based on the errors.
- Wrote shell scripts for scheduling the ETL process.
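The DBQL reviews mentioned above boil down to queries against the DBC.DBQLogTbl detail log, along these lines (the CPU threshold is an arbitrary example):

```sql
-- Find heavy and skewed queries in the DBQL detail log.
-- HASHAMP() + 1 = number of AMPs; a skew ratio much greater than 1
-- means one AMP did far more CPU work than average (a hot-AMP candidate).
SELECT UserName,
       QueryID,
       AMPCPUTime,
       TotalIOCount,
       MaxAMPCPUTime * (HASHAMP() + 1)
           / NULLIFZERO(AMPCPUTime) AS cpu_skew
FROM DBC.DBQLogTbl
WHERE CAST(StartTime AS DATE) = CURRENT_DATE
  AND AMPCPUTime > 1000          -- arbitrary example threshold
ORDER BY AMPCPUTime DESC;
```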
Environment: Teradata V2R5, TASM, TPT, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, PMON, MLoad, ARCMAIN, NetVault, Erwin Designer, UNIX, shell scripts.
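The MLoad restart scenarios described above usually involve releasing the load lock and cleaning up the utility tables; a sketch with hypothetical table names:

```sql
-- A failed MultiLoad leaves the target table in the MLoad state.
-- If the job died in the acquisition phase, release the lock with:
RELEASE MLOAD edw_stage.orders_stg;
-- If it died in the apply phase, IN APPLY is required instead:
RELEASE MLOAD edw_stage.orders_stg IN APPLY;

-- Then drop the leftover utility tables before restarting from scratch.
DROP TABLE edw_stage.WT_orders_stg;  -- work table
DROP TABLE edw_stage.ET_orders_stg;  -- acquisition error table
DROP TABLE edw_stage.UV_orders_stg;  -- uniqueness violation table
```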
Confidential, Cincinnati, OH
Teradata / Ab Initio Developer
Responsibilities:
- Extensively used Ab-Initio ETL tool in designing & implementing Extract Transformation & Load processes. Different Ab Initio components were used effectively to develop and maintain the database.
- Understood the business requirements with extensive interaction with Business analysts and reporting teams, and assisted in developing the low level design documents.
- Maintained locks on objects while working in the sandbox to maintain the privacy
- Used inquiry and error functions like is_valid, is_error, and is_defined, and string functions like string_substring, string_concat, and other string_* functions in developing Ab Initio graphs to perform data validation and data cleansing.
- Created several packages to set up and share global variables, types and transforms which were extensively used for many Ab Initio graphs.
- Implemented a 6-way multifile system in the test environment, composed of individual files on different nodes that are partitioned and stored in distributed directories within the multifile system.
- Created COBOL programs and worked on creating JCL scripts to extract data from Mainframes operational systems.
- Partition Components (Partition by Key, by Expression, by round Robin) were used to Partition the large amount of data file into multiple data files.
- Extensively used file-management commands like m_ls, m_wc, m_dump, m_cp, m_mkfs, etc.
- Created proper Teradata primary indexes (PIs), taking into consideration both the planned access of data and even distribution of data across all available AMPs.
- Considering both the business requirements and factors, created appropriate Teradata NUSI for smooth (fast and easy) access of data.
- Extensively used Teradata utilities like BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the analysts.
- Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc extracts, delivering data to several business users on a scheduled basis.
- Collected statistics on important tables so the Teradata Optimizer could produce better plans.
- Tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
- Responsible for deploying Ab Initio graphs and running them through the Co>Operating System's mp shell command language, and for automating the ETL process through scheduling.
- Involved in Comprehensive end-to-end testing.
- Worked on improving the performance of Ab Initio graphs by using Various Ab Initio performance techniques like using lookup Tables, In-Memory Joins and rollups to speed up various Ab Initio Graphs.
- Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures
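A minimal BTEQ script of the kind described above, using a volatile table for an ad hoc extract; the logon string and all object names are placeholders:

```sql
.LOGON tdprod/etl_user,placeholder_pw

/* Stage an aggregate in a volatile table (vanishes at logoff). */
CREATE VOLATILE TABLE vt_top_custs AS (
    SELECT cust_id, SUM(order_amt) AS total_amt
    FROM edw_prod.orders
    GROUP BY cust_id
) WITH DATA
ON COMMIT PRESERVE ROWS;

/* Export the top 100 customers to a flat file for the business users. */
.EXPORT REPORT FILE = top_custs.txt
SELECT cust_id, total_amt
FROM vt_top_custs
QUALIFY RANK() OVER (ORDER BY total_amt DESC) <= 100;
.EXPORT RESET

.LOGOFF
.QUIT
```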
Environment: Ab Initio (GDE 1.13, Co>Op Sys 2.13), UNIX, PL/SQL, Oracle 8i, IBM DB2, COBOL, Windows 2000