
Data Warehousing Resume Profile


PROFESSIONAL SUMMARY

  • Proficient in Teradata database design (physical and logical), application support, performance tuning and optimization, user security administration, data administration, setting up test/development environments, and Teradata Active System Management (TASM)
  • 7 years of experience in Teradata administration, data warehousing (ETL, FSLDM), expert DBA maintenance activities, Archive/Recovery operations, and Teradata user and database creation. Good experience with the Teradata RDBMS using FastLoad, MultiLoad, TPump and Teradata SQL.
  • Hands-on experience with Teradata admin/management/analysis tools: Teradata Manager, Performance Monitor, Priority Scheduler, Teradata Dynamic Workload Manager, Teradata Workload Analyzer, Teradata Index Wizard, Teradata Visual Explain, Teradata Viewpoint, SQL Assistant and Teradata Administrator
  • Proficient in converting logical data models to physical database designs in a data warehousing environment, with an in-depth understanding of database hierarchy, data integrity concepts and data analysis
  • Good understanding of database skew, PPI (Partitioned Primary Index), join methods and join strategies, and join indexes, including sparse, aggregate and hash join indexes
  • Experience in Coding Teradata SQL, Teradata Stored Procedures, Macros and Triggers
  • Worked on full lifecycle of various projects, including requirement gathering, system designing, application development, enhancement, deployment, maintenance and support
  • Experience in Query Analyzing, performance tuning and testing
  • Experience in writing UNIX shell scripts to support and automate the ETL process
  • Good experience in creating and managing user accounts.
  • Hands-on experience with scripts for monitoring the system; automated ARCMAIN backups for the databases.
  • Installed Teradata drivers for the Teradata utilities. Refreshed data using the FastExport and FastLoad utilities. Used Teradata Administrator and Teradata Manager to monitor and control the system.
  • Good experience in performance tuning, including collecting statistics and analyzing EXPLAIN plans to determine which tables needed statistics.
  • Created various database reports for space and CPU usage, table growth, skewed-table information, logon/logoff data, illegal logon information and query duration.
  • Strong in backend testing using SQL queries, generating reports to ensure data Integrity.
  • Efficiency in defect tracking and reporting system.
  • Experience in database stored procedures, functions and writing PL/SQL queries using Oracle.
  • Quick learner and excellent team player with the ability to meet tight deadlines and work under pressure.
  • Good interpersonal skills; committed, result-oriented and hard-working, with a zeal to learn new technologies.
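The tuning workflow mentioned above (collect statistics, then verify the optimizer's plan with EXPLAIN) can be sketched in Teradata SQL. This is a minimal illustration only; the database, table and column names are hypothetical:

```sql
-- Collect statistics on a filter column and an index (hypothetical names)
COLLECT STATISTICS ON sales_db.daily_txn COLUMN (txn_date);
COLLECT STATISTICS ON sales_db.daily_txn INDEX (customer_id);

-- EXPLAIN shows whether the optimizer now produces high-confidence
-- row estimates and a reasonable join/aggregation plan
EXPLAIN
SELECT customer_id, SUM(txn_amt)
FROM   sales_db.daily_txn
WHERE  txn_date BETWEEN DATE '2012-01-01' AND DATE '2012-01-31'
GROUP BY customer_id;
```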

TECHNICAL SKILLS:

  • Primary Tools : Teradata SQL, Teradata Tools and Utilities
  • DBMS/RDBMS : Teradata V2R6.x/12.0/13.x
  • Teradata Utilities : BTEQ, FastLoad, MultiLoad, FastExport, TPump, SQL Assistant, Teradata Manager, Archive/Restore, Configuration/Reconfiguration, Table Rebuild, Teradata Viewpoint, Teradata Parallel Transporter (TPT)
  • ETL : DataStage, Informatica
  • Operating Systems : Windows 95/98/NT/2000/XP, UNIX
  • Scripting Languages : UNIX Shell Scripting
  • Reporting Tools : BO, Cognos
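A minimal BTEQ script of the kind listed above might look like the following sketch; the TDPID, credentials, file name and database name are placeholders, not values from any actual project:

```sql
.LOGON tdpid/dbadmin,password;

-- Export a simple table inventory for one database to a report file
.EXPORT REPORT FILE = table_list.txt;

SELECT TRIM(DatabaseName) AS db_name,
       TRIM(TableName)    AS tbl_name
FROM   DBC.TablesV
WHERE  DatabaseName = 'sales_db'
ORDER BY tbl_name;

.EXPORT RESET;
.LOGOFF;
.QUIT;
```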

PROJECT EXPERIENCE

Confidential

Contributions

  • Worked on collecting statistics, which improves the optimizer's estimates of unique values for partitioning columns, on several types of tables using Teradata V13.0
  • Worked on Teradata Viewpoint: alert setup, workload management, query monitoring, etc.
  • Handled data migration from one system to another using Teradata Data Mover
  • Performance tuning and database maintenance.
  • Worked on PDCR and DBQL
  • Implementation of new PDM releases/versions.
  • Creation and maintenance of the Teradata Databases, Users, Tables, Views, Macros, Stored Procedures and MLOAD, FLOAD, BTEQ scripts.
  • Created common re-usable components and utilities for automating DBA support activities, based on UNIX shell scripts, macros, stored procedures and Teradata User-Defined Functions (UDFs).
  • BAR responsibilities (NetVault, TeraGUI) for daily and weekly Teradata backups.
  • User management, space management and database security (managing access, roles and profiles).
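The user, space and security management tasks above typically combine profiles, roles and user DDL. The sketch below shows the general pattern in Teradata SQL; every object name, space figure and account string is hypothetical:

```sql
-- Profile: shared resource limits and password rules for ETL users
CREATE PROFILE etl_profile AS
  ACCOUNT   = '$M_ETL',
  SPOOL     = 50e9,
  TEMPORARY = 20e9,
  PASSWORD  = (EXPIRE = 90, MAXLOGONATTEMPTS = 3);

-- Role: access rights granted once, then inherited by member users
CREATE ROLE etl_reader;
GRANT SELECT ON edw_views TO etl_reader;

-- User: created from a parent database, tied to the profile and role
CREATE USER etl_batch FROM edw_root AS
  PERM = 0,
  SPOOL = 50e9,
  PASSWORD = "ChangeMe1",
  PROFILE = etl_profile,
  DEFAULT ROLE = etl_reader;
```

Granting rights to roles rather than individual users keeps access administration manageable as the user population grows.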

Project 2

Confidential

Description

  • The Consumer Credit Bureau (CCB) data mart is designed to capture all individual customers' credit card, loan and overdraft accounts and their status, which are reported to the Credit Bureau of Singapore (CBS) every month. The information required to fulfil this requirement is processed via the Enterprise Data Warehouse (EDW) from the respective operational source systems.
  • The scope of the CCB project is migrating the existing CCB data mart from its existing Oracle 8.1.3-based processing to Teradata EDW-based processing. Under the current process, data from the source systems (SG CIF, SG Loans, SG Credit Cards and SG Overdrafts) is extracted and populated into the Operational Data Store (ODS) database, and from the ODS into the CCB data mart for all regular account information; for all charge-off accounts, the data is provided by the collections team and loaded directly into the CCB data mart tables.

Contributions

  • Responsible for setting up access rights and space allocation for Teradata environment
  • Created Teradata Standards and Best practices document for the ETL and Teradata DBA team to follow the process
  • Designed DDLs and efficient PIs along with Identity Keys for efficient data distribution
  • Design, create and regularly tune physical database objects (tables, views, indexes) to support normalized and dimensional models
  • Provide ongoing support by developing processes and executing object migrations, security and access privilege setup, and active performance monitoring
  • Worked on Teradata Viewpoint: alert setup, workload management, query monitoring, etc.
  • Handled data migration from one system to another using Teradata Data Mover
  • Performance tuning and database maintenance.
  • Creation and maintenance of the Teradata Databases, Users, Tables, Views, Macros, Stored Procedures and MLOAD, FLOAD.
  • Created common re-usable components and utilities for automating DBA support activities, based on UNIX shell scripts, macros, stored procedures and Teradata User-Defined Functions (UDFs).
  • BAR responsibilities (NetVault, TeraGUI) for daily and weekly Teradata backups.
  • User management, space management and database security (managing access, roles and profiles).
  • Prepare and get sign-off for functional spec, technical spec
  • Prepare and get sign-off for downstream mapping
  • Fill gaps for new upstream attributes required for enhancement
  • Design and Development of Transformation Views, interfaces, Packages, Scenarios in ODI
  • Testing: conducted SIT and UAT and obtained sign-off
  • Also responsible for overseeing the quality procedures related to the project
  • Reported directly to the client on work progress
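The DDL design work above (efficient PIs with identity keys, plus partitioning for even distribution and planned access) can be illustrated with a sketch like the following; the table, column names and date range are hypothetical:

```sql
-- Hypothetical CCB-style fact table: a system-generated identity key,
-- a NUPI chosen for even hash distribution and planned access paths,
-- and monthly range partitioning on the reporting date
CREATE TABLE ccb_mart.account_status
( account_sk   INTEGER GENERATED ALWAYS AS IDENTITY
                 (START WITH 1 INCREMENT BY 1) NOT NULL,
  account_no   VARCHAR(20) NOT NULL,
  status_cd    CHAR(2),
  report_month DATE NOT NULL
)
PRIMARY INDEX (account_no)
PARTITION BY RANGE_N (
  report_month BETWEEN DATE '2010-01-01' AND DATE '2015-12-31'
  EACH INTERVAL '1' MONTH );
```

With this layout, monthly CBS reporting queries that filter on `report_month` touch only the relevant partition instead of scanning the whole table.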

Project 3

Confidential

Description

Union Central Life Insurance is one of the largest insurance companies. It offers life insurance, disability income insurance and annuities tailored to meet customers' insurance and investment needs. Union Central Life Insurance Company merged with Ameritas and Acacia in 2002 and is commonly known as the UNIFI Company.

Contributions:

  • Actively participated in transforming enhanced logical data model to physical data model
  • Designed the ETL processes using DataStage to load data from Oracle, flat files (fixed width) and XML files into the target Teradata database
  • Created proper Teradata Primary Indexes (PIs) for both planned access of data and even distribution of data across all available AMPs
  • Redesigned and optimized the physical data model by defining the right PIs and PPIs using Erwin; also implemented multi-value compression where necessary
  • Responsible for application-level DBA activities such as creating tables and indexes; monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility
  • Responsible for space management and user management including accounting based on performance group allocation
  • Involved heavily in writing complex SQL queries based on the given requirements and used volatile tables, temporary tables, derived tables for breaking up complex queries into simpler queries
  • Responsible for Tuning the Sql queries in order to improve the performance and also monitor system resource usage at User level
  • Review the Cognos reports and provide join optimizations to improve the performance
  • Responsible for creation and maintenance of Teradata databases, users, tables, views, macros and stored procedures using Teradata Administrator (WinDDI), SQL Assistant (Queryman), SQL DDL, SQL DML, SQL DCL, BTEQ, MultiLoad, FastLoad, FastExport, TPump, the Statistics and Index wizards, and Visual Explain
  • Gathered information from different source systems and loaded it into the warehouse using FastLoad, FastExport, MultiLoad, BTEQ and UNIX shell scripts
  • Responsible for Regular Database cleanup activities based on Database space/access
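Breaking a complex query into simpler steps with a volatile table, as described above, follows a pattern like this sketch; the tables, columns and threshold are hypothetical:

```sql
-- Step 1: materialize an intermediate result as a volatile table
-- (session-local, spool-backed, dropped automatically at logoff)
CREATE VOLATILE TABLE vt_high_value AS
( SELECT policy_no,
         SUM(premium_amt) AS tot_premium
  FROM   ins_db.premiums
  GROUP BY policy_no
  HAVING SUM(premium_amt) > 100000
) WITH DATA
PRIMARY INDEX (policy_no)
ON COMMIT PRESERVE ROWS;

-- Step 2: join the small intermediate set back to the detail table,
-- keeping the final query simple and easy for the optimizer to plan
SELECT h.policy_no, h.holder_name, v.tot_premium
FROM   ins_db.policy_holders h
JOIN   vt_high_value v
  ON   h.policy_no = v.policy_no;
```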

Project 4

Confidential

Description

Teachers Insurance and Annuity Association, College Retirement Equities Fund (TIAA-CREF) is a Fortune 100 financial services organization that is the leading retirement provider for people who work in the academic and research fields. TIAA-CREF is headquartered in New York City in the United States. TIAA was created in 1918 as a stock life insurance company for the purpose of providing retirement income for professors through fixed-premium guaranteed deferred annuity contracts. TIAA-CREF has been helping those in the academic, medical, cultural and research fields plan for their retirement by providing different retirement plans.

Contributions

  • Actively involved in Data Gathering, recognizing and confirming the data sources/elements
  • Interact with business partners for the design based on their Business requirements
  • Worked with Data Modeling team for solving physical model related issues
  • Involved in DBA activities in the tests, such as creation of users, spool, temporary, permanent space.
  • Involved in database design/preparing SQL scripts to support the larger databases that involves terabytes of data.
  • Performed application-level DBA activities like creating tables and indexes; monitored and tuned Teradata BTEQ scripts using the Teradata Viewpoint utility
  • Worked on complex queries to map the data as per the requirements and analyzed the current data movement ETL DataStage process and procedures.
  • Creating and maintaining Database objects and users and review the system utilization for ETL users
  • Loading data by using the Teradata loader connection, writing Teradata utilities scripts Fast Load, Multiload and working with loader logs.
  • Worked extensively with Teradata utilities BTEQ, Multiload, Fastload, TPUMP, and Fast export for data analysis purpose
  • Worked with Teradata Utilities like TPT, TPUMP, MLoad, FastLoad, Fast Export, and BTEQ to export or import the files
  • Reviewed SQL for missing join constraints, data format issues, mismatched aliases and casting errors
  • Query analysis using EXPLAIN plans, checking for unnecessary product joins, confidence factors, join types and the order in which the tables are joined
  • Extensively used Teradata Analyst Pack such as Teradata Visual Explain, Teradata Index Wizard and Teradata Statistics Wizard
  • Tuned long-running Teradata SQL queries using EXPLAIN and by analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Collected statistics on multiple columns: all non-indexed columns used during join operations and all columns used in residual conditions
  • Used UNIX shell scripts for automating tasks for BTEQ and other utilities
  • Used PMON and Teradata Manager to monitor the production system during the online day
  • Supported different application development teams with production support, query performance tuning, system monitoring, database needs and guidance.
  • Responsible for Regular Database cleanup activities based on Database space/access/check disk reports
  • Refresh the data from Production to Development using Load utilities in order to do the testing in lower environments
  • Responsible for implementing security changes, administering backup recovery, capacity management and project implementations
  • Archived and restored data using NetVault, and created business access views
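Analyzing data distribution among AMPs, as mentioned above, is commonly done with queries like the following sketch; the database and table names are hypothetical:

```sql
-- Per-AMP row counts for one table: a heavily loaded AMP indicates a
-- skewed primary index choice
SELECT HASHAMP(HASHBUCKET(HASHROW(participant_id))) AS amp_no,
       COUNT(*) AS row_cnt
FROM   ret_db.contributions
GROUP BY 1
ORDER BY row_cnt DESC;

-- Skew percentage per table from the data dictionary: 0 means perfectly
-- even perm space across AMPs, higher values mean worse skew
SELECT DatabaseName,
       TableName,
       100 - (AVG(CurrentPerm) / MAX(CurrentPerm)) * 100 AS skew_pct
FROM   DBC.TableSizeV
WHERE  DatabaseName = 'ret_db'
GROUP BY 1, 2
ORDER BY skew_pct DESC;
```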
