
Sr Teradata DBA Resume

SUMMARY

  • Almost 9 years of total IT experience, with strong expertise in large-scale and mid-sized data warehouse implementations and in Teradata administration, monitoring, performance tuning, and capacity planning using Teradata, Informatica Power Center/Power Mart, PowerExchange, Oracle, and SAP on UNIX and Windows platforms.
  • Very good experience administering the Teradata database and supporting data modeling, ETL, and BI operations.
  • Solid programming experience in Teradata SQL and UNIX; proficient with Teradata administration tools such as Viewpoint, Teradata Administrator, Teradata Manager, and Performance Monitor.
  • Extensive experience in Teradata database design, application support, tuning and optimization, user and security administration, data administration, and setting up test and development environments.
  • Solid knowledge of Teradata architecture and query tuning.
  • Extensive knowledge of Protegrity data protection on Teradata for securing sensitive PHI/PII data elements in data warehouse and ETL processes.
  • Good experience with the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), and BTEQ utilities.
  • Experience in Backup and Restore (BAR) methodologies; designed, implemented, and coordinated all backup and restore strategies.
  • Worked with Teradata archive/restore and data replication tools: ARCMAIN, TARA GUI, NetBackup, and Data Mover.
  • Hands-on experience monitoring and managing the varying mixed workload of an active data warehouse using tools such as Viewpoint, TASM, Teradata Dynamic Workload Manager, and Teradata Manager.
  • Expertise in implementing complex business rules by creating robust mappings, mapplets, and reusable objects using Informatica Power Center and Power Mart.
  • Experience in performance tuning of sources, mappings, targets, sessions, SQL, and Teradata utility scripts.
  • Extensive business understanding of domains such as food and retail, telecom, banking (cards, loans, customer and consumer banking), financial services, insurance, and healthcare.
  • Extensive understanding of and expertise in dimensional modeling using star and snowflake schemas and the Teradata communication model, and in developing physical and logical data models using the ERwin tool.
  • Excellent skills in data analysis and in maintaining data quality and data integrity.
  • Knowledge of and experience with Big Data Hadoop clusters, integrating them with ETL tools such as Ab Initio and loading target RDBMSs such as Oracle and Teradata.
  • Experience in preparing documentation such as Business Requirement Documents, High-Level Designs, System Requirement Documents, Table-Level Specs, Functional Requirement Documents, System Interface Agreements, system design specifications, UAT Plan Documents, and ETL specification documents.
  • Strong Teradata SQL skills and experience in creating and executing stored procedures (see the sketch after this list).
  • Experience in working with different Databases such as Teradata, Oracle, SQL Server, MS Access and writing efficient and complex SQLs on huge volumes of data.
  • Experienced in working with and troubleshooting the various Teradata load and unload utilities: BTEQ, FastLoad, MultiLoad, FastExport, and TPT (Teradata Parallel Transporter).
  • Expertise in creating jobs that execute both stream and bulk loads.
  • Expertise in developing custom code-generator wrapper scripts that produce FastLoad and FastExport scripts automatically, significantly reducing development time.
  • Strong experience in writing UNIX Shell scripts, SQL Scripts for Development, Automation of ETL process, error handling and auditing purposes.
  • Experience in migrating data warehouse from DB2 to Teradata.
  • Experience in upgrading Teradata from version 13.0 to 14.0.
  • Experience in using Control-M, Cronacle and Dollar Universe scheduling tools to organize, schedule and monitor jobs.
  • Complete knowledge of full life cycle design & development for building Data Warehouse.
  • Responsible for code migrations from Dev to Test to Production, and for scheduling jobs using Control-M.
  • Responsible for monitoring production jobs and resolving production issues.
  • Experience in project management, project estimations and resource management activities.
  • Excellent problem solving skills with strong technical background and good interpersonal skills.
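
As an illustration of the stored-procedure work noted above, a minimal Teradata stored procedure sketch; the database, table, and column names (sales_db, sales, sales_hist, sale_dt) are hypothetical placeholders, not code from any engagement listed here.

    REPLACE PROCEDURE sales_db.archive_old_sales (IN cutoff_dt DATE)
    BEGIN
      -- Copy aged rows into a history table, then remove them from the base table.
      INSERT INTO sales_db.sales_hist
      SELECT * FROM sales_db.sales
      WHERE  sale_dt < cutoff_dt;

      DELETE FROM sales_db.sales
      WHERE  sale_dt < cutoff_dt;
    END;

    -- Example invocation:
    CALL sales_db.archive_old_sales (DATE '2015-01-01');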

TECHNICAL SKILLS

Operating Systems: UNIX, Linux, Windows XP/2000/98/95, Mainframes, HP-UX

Databases: Teradata V14/13/12/V2R6/V2R5, Oracle 10g/9i/8i, DB2, SQL Server 2000/2005, MS Access

Tools & Utilities: TOAD, Teradata Queryman, SQL Navigator, SQL*Loader, Teradata Manager/Viewpoint, PMON, DBQL, mainframe utilities (JCL, File-AID).

Teradata Utilities: BTEQ, FastLoad, FastExport, MultiLoad, TPump, TPT

ETL Tool: Informatica Power Center 9.5.1/9.1/8.6/8.5.1/8.1.1/7.x

BI Reporting Tools: SAP Business Objects XIR2/XIR3/5.1, Cognos 7.0/6.0, OBIEE 10g/11g

Data Modeling Tools: Erwin 4.5.2

Programming Languages: C, C++, HTML, PL/SQL, Core Java

Scripting Languages: UNIX Shell Scripting, AWK

Scheduling Tools: Control-M, Redwood Cronacle, Dollar Universe

Methodology: CMMi, Six Sigma, SOX Compliance

Test Management Tools: HP Quality Center 10.x/9.x, ALM

PROFESSIONAL EXPERIENCE

Confidential

Sr Teradata DBA

Responsibilities:

  • Worked as Teradata database architect, involved in end-to-end system planning, design, mapping, and data delivery for the data protection project.
  • Involved in the end-to-end software development life cycle, from data requirement analysis and data modeling through data architecture and implementation.
  • Initiated and coordinated the analysis and design requirements for data marts, including schemas and ETL scripts.
  • Developed data strategies to load data from ERP systems and sources such as SQL Server and Oracle into Teradata, protecting sensitive data on the fly as it lands in the EDW.
  • Actively involved in designing the EDW 2.0 model and the data flow into the EDW.
  • Designed ETL processes using Informatica, SAS, and Teradata to load data from DB2/Oracle systems and files into the target Teradata database.
  • Interacted closely with key business users to evaluate new business report requirements.
  • Worked closely with data stewards to identify metadata for the new EDW 2.0 environment.
  • Worked with DBAs to identify gaps in data modeling and data mapping between source systems and Teradata, and shared Teradata performance knowledge with users.
  • Developed performance tuning strategies, implemented tuning methodologies, and performed integrity checks on tuned queries.
  • Prepared detailed design and implementation plans and other documents such as test plans and test cases.
  • Involved in Teradata database administration activities such as monitoring, performance tuning, troubleshooting, maintenance, and backup, archive, and recovery (BAR).
  • Executed system, integration, and UAT testing.
  • Coded BTEQ IMPORT, FastLoad, and MultiLoad utility scripts on UNIX for data loading.
  • Created database objects such as tables, views, macros, stored procedures, and indexes in Teradata.
  • Performed research and development on Teradata TPT stream operators and on load operators such as FLOAD, MLOAD, and FEXPORT as part of the ETL process.
  • Responsible for developing various Teradata stored procedures for use in the ETL process and for documenting the best practices and standards to be used across JPMC.
  • Automated validation of key performance indicators (KPIs) such as PJI, UII, spool usage, and utility slot holding time using BTEQ scripts (see the spool-usage sketch after this list).
  • Developed Teradata TPT scripts to move data from the production environment to lower environments for testing.
  • Responsible for debugging performance issues in Informatica mappings and suggesting changes to improve performance.
  • Responsible for tuning Cognos, Tableau, and OBIEE report queries for better performance and optimal use of Teradata resources.
  • Responsible for creating and scheduling loads using Control-M and BTEQ to exploit Teradata's parallelism, and for documenting the standards.
  • Identified and implemented performance improvements such as maintaining correct statistics on tables and indexes, and using partitioned primary indexes (PPIs), secondary indexes, join indexes, and aggregate join indexes to speed up report queries against large tables.
  • Instrumental in resolving Teradata issues, such as space and spool problems, encountered by application development teams.
  • Responsible for creating Ab Initio graphs for ETL processing and loading, and for performance testing and tuning the Hadoop utility/edge nodes and data nodes to avoid CPU, memory, I/O, and load-balancing bottlenecks.
  • Responsible for performance testing the target Oracle and Teradata databases while Ab Initio ETL loads data from Hadoop data nodes, by extracting and reviewing Oracle AWR reports and Teradata DBQL query logs.
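
A hedged sketch of the spool-usage KPI validation mentioned above, as plain Teradata SQL against the standard DBC.DiskSpaceV dictionary view; the 90 percent threshold is an illustrative assumption, and in practice the query would be wrapped in a scheduled BTEQ script with logon handling and report formatting.

    SELECT  DatabaseName,
            SUM(MaxSpool)  AS MaxSpoolBytes,
            SUM(PeakSpool) AS PeakSpoolBytes,
            SUM(PeakSpool) * 100.00 / NULLIF(SUM(MaxSpool), 0) AS PeakSpoolPct
    FROM    DBC.DiskSpaceV
    GROUP BY 1
    HAVING  PeakSpoolPct > 90   -- flag databases or users close to their spool ceiling
    ORDER BY PeakSpoolPct DESC;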

Environment & Software: Informatica Power Center 9.5.1, Teradata 14, BTEQ, MultiLoad, FastLoad, FastExport, TPT, ESA (Enterprise Security Administrator), Protegrity data protection, UNIX (Korn) shell scripting, flat files, XML files, Excel, Control-M.

Confidential

Teradata Modeller/Application DBA

Responsibilities:

  • Supported the Teradata Enterprise Data and Analytics platform, including mentoring, implementing best-practice recommendations, and participating in day-to-day DBA activities.
  • Produced capacity planning reports for management and performed capacity forecasting to plan upgrades.
  • Provided Teradata database administration, monitoring, and maintenance services for the customer's production environments, and performed research and development on Teradata TPT stream operators and load operators such as FLOAD, MLOAD, and FEXPORT as part of the ETL process.
  • Tuned Data Mover job performance and handled server maintenance.
  • Involved in backup/restore: performed backups, recovery, and monitoring, and implemented the Online Archive feature for databases that had ETL conflicts.
  • Managed database space, allocating new space and moving space between databases as needed (see the space-move sketch after this list).
  • Responsible for developing various Teradata stored procedures for use in the ETL process and for documenting the best practices and standards to be used across JPMC.
  • Developed BTEQ scripts to execute SQL in batch mode, replacing a job originally developed in SAS.
  • Developed Teradata TPT scripts to move data from the production environment to lower environments for testing.
  • Heavily involved in system performance activities such as workload management (TASM), job scheduling, priority and resource allocation, and query performance tuning and optimization.
  • Performed bulk data loads from multiple sources (Oracle, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
  • Responsible for tuning Informatica mappings and Cognos, Tableau, and OBIEE report SQL for better performance and optimal use of Teradata resources.
  • Responsible for creating and scheduling loads using Control-M and BTEQ to exploit Teradata's parallelism, and for documenting the standards.
  • Identified and implemented performance improvements such as maintaining correct statistics on tables and indexes, and using partitioned primary indexes (PPIs) and secondary indexes to speed up report queries against large tables.
  • Instrumental in resolving Teradata issues, such as space and spool problems, encountered by application development teams.
  • Responsible for creating Ab Initio graphs for ETL processing and loading, and for performance testing and tuning the Hadoop utility/edge nodes and data nodes to avoid CPU, memory, I/O, and load-balancing bottlenecks.
  • Responsible for performance testing the target Oracle and Teradata databases while Ab Initio ETL loads data from Hadoop data nodes, by extracting and reviewing Oracle AWR reports and Teradata DBQL query logs.
  • Raised tickets through Teradata @ Your Service (T@YS) incident management and coordinated with the GSC to implement fixes and patches.
  • Responsible for PDCR performance reports: daily, weekly, monthly, and quarterly report generation, review, and presentation.
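
The space moves mentioned above typically follow the standard Teradata pattern sketched below, since perm space can only be drawn from a parent: carve the space into a throwaway database, re-parent it, and drop it so the space rolls up to the new owner. The names and the 50 GB figure are placeholders.

    CREATE DATABASE space_mover FROM donor_db AS PERM = 50e9;  -- carve 50 GB out of the donor
    GIVE space_mover TO receiver_db;                           -- re-parent the carved-out database
    DROP DATABASE space_mover;                                 -- its perm space rolls up into receiver_db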

Environment & Software: Teradata 13/14/15, BTEQ, MultiLoad, FastLoad, FastExport, TPT, Informatica Power Center 9.5.1, Tableau, Cognos, PL/SQL, UNIX (Korn) shell scripting, flat files, XML files, Excel, Control-M.

Confidential

Teradata Application DBA

Responsibilities:

  • Developed a complete normalized data warehouse with a tier-4 architecture.
  • Developed logical and physical models using ERwin.
  • Created databases and users in the dev, test, and production environments.
  • Designed, developed, optimized, and maintained database objects such as tables, views, indexes, soft RI, and common procedures and macros; tested and implemented systems.
  • Created roles and profiles as needed; granted privileges to roles and added users to roles based on requirements.
  • Managed database space, allocating new space and moving space between databases as needed.
  • Assisted developers and DBAs with design, architecture, development, and query tuning, including query modification, index selection, and refreshing statistics collection.
  • Proactively monitored and aborted bad queries using Viewpoint and PMON, watched for blocked sessions, and worked with development teams to resolve them.
  • Proactively monitored database space, identified tables with high skew, and worked with the data modeling team to change the primary index on those tables.
  • Moved tables from test to production using FastExport, FastLoad, and TPT.
  • Worked extensively with DBQL data to identify heavily used tables and columns.
  • Implemented secondary indexes on heavily used columns to improve performance.
  • Good experience with query performance tuning: used EXPLAIN plans and Visual Explain to understand optimizer plans, created NUSIs as needed, and used the Statistics Wizard and Index Wizard for collecting statistics and recommending indexes.
  • Implemented single sign-on and LDAP authentication per company norms.
  • Developed various DBQL reports, such as the top 10 queries by CPU and by I/O, using PDCR and SYS MGMT (see the sketch after this list).
  • Implemented various Teradata alerts using the alert facility in Teradata Manager and Viewpoint, including SMS alerts to DBAs for events such as node down, AMP down, too many blocked sessions, and high data skew.
  • Used the Teradata Manager and Viewpoint collection facilities to set up AMP usage collection, canary query response, spool usage monitoring, and so on.
  • Worked on capacity planning and produced disk and CPU usage growth reports using DBQL and ResUsage.
  • Exported data to flat files using Teradata FastExport.
  • Automated statistics collection using shell scripts.
  • Wrote several Teradata BTEQ scripts to implement business logic.
  • Populated Teradata tables using the FastLoad utility.
  • Created complex Teradata macros, views, and stored procedures for use in reports.
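
A representative version of the "top 10 by CPU" DBQL report noted above, written against the standard DBC.QryLogV view (PDCR history table names vary by site, so the base dictionary view is shown):

    SELECT TOP 10
           UserName,
           QueryID,
           AMPCPUTime,
           TotalIOCount,
           SUBSTR(QueryText, 1, 100) AS QuerySnippet
    FROM   DBC.QryLogV
    WHERE  CAST(StartTime AS DATE) = CURRENT_DATE - 1  -- yesterday's workload
    ORDER  BY AMPCPUTime DESC;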

Environment: Teradata 13/13.10, Teradata 12, Viewpoint, Teradata Manager, SUSE Linux, shell scripting, Teradata Administrator, Teradata SQL Assistant, BTEQ, MultiLoad, TPT, PMON, TARA, TASM, TPump, NetBackup 6.5, ERwin.

Confidential

Application Teradata DBA

Responsibilities:

  • Created Teradata objects like Databases, Users, Profiles, Roles, Tables, Views and Macros.
  • Security administration including creating and maintaining user accounts, passwords, profiles, roles and access rights.
  • Capacity planning including review and analysis of usage, capacity, and performance data.
  • Performed Space Management for Perm & Spool Space.
  • Performance trending and analysis including recommending indexing strategies, partitioning, compression opportunities and statistic collection planning.
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Used AMP-based console utilities such as Vprocmanager, Lokdisp, Showlocks, and Qrysessn.
  • Created roles and profiles as needed; granted privileges to roles and added users to roles based on requirements.
  • Performed bulk data loads from multiple sources (legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Used FastLoad to load empty tables.
  • Used TSET to analyze the initial workload and understand the impact of alternative designs before tables were populated.
  • Created a cleanup process to remove all intermediate temp files used prior to the loading process.
  • Involved in troubleshooting production issues and providing production support.
  • Collected statistics on tables weekly to improve performance (see the sample statements after this list).
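
Representative of the weekly statistics collection noted above; the database, table, and column names are hypothetical, and in practice the statements are generated and submitted by a scheduled shell/BTEQ wrapper.

    COLLECT STATISTICS ON edw_db.customer COLUMN (cust_id);
    COLLECT STATISTICS ON edw_db.account  COLUMN (acct_type, open_dt);  -- multi-column statistics
    COLLECT STATISTICS ON edw_db.txn_fact COLUMN (PARTITION);           -- partition stats for a PPI table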

Environment: Teradata 12, NCR Teradata V2R6, Teradata Manager, SUSE Linux, shell scripting, Teradata Administrator, Teradata SQL Assistant, BTEQ, TPT, PMON, ARCMAIN, TASM, TPump, NetBackup 6.5, ERwin.

Confidential

Teradata DBA

Responsibilities:

  • Designed and implemented the semantic model.
  • Contributed to and participated in project reviews, quality reviews, and audits.
  • Resolved Teradata-related inquiries, issues, and requests raised by users and application teams.
  • Reviewed query and table usage periodically and analyzed which objects could be dropped to free space in the system.
  • Used DBC system tables to identify applications with badly performing queries and assisted teams in achieving performance gains.
  • Performed database tuning and optimization.
  • Recommended indexes and statistics for objects.
  • Implemented confidential databases and row-level security (RLS) as a home-grown solution.
  • Handled problem, incident, and change management for Target.
  • Reviewed change requests against the production data warehouse to minimize system impact.
  • Modeled, installed, and configured new databases and users.
  • Created and modified databases and roles, and granted access rights and perm and spool space to users (see the sketch after this list).
  • Monitored for locks and cleared blocking sessions using Teradata Manager.
  • Monitored batch jobs and CPU-intensive, slow-running user queries, and suggested changes to improve query performance.
  • Carried out migrations across the SDLC phases: Dev, SIT, UAT, and Production.
  • Refreshed DDLs from the production database to non-production environments.
  • Performed DDL QA checks on objects involved in migrations to ensure quality.
  • Handled ad hoc requests from application teams for space and access management.
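
A minimal sketch of the database, role, and space setup described above; all names, the password, and the space figures are placeholders.

    CREATE ROLE rpt_reader;
    GRANT SELECT ON edw_db TO rpt_reader;

    CREATE USER jdoe FROM app_users AS
        PERM = 0,            -- no private perm space
        SPOOL = 100e9,       -- 100 GB spool ceiling
        PASSWORD = "TempPass#1";

    GRANT rpt_reader TO jdoe;
    MODIFY USER jdoe AS DEFAULT ROLE = rpt_reader;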

Environment: Teradata 12, NCR Teradata V2R6, Teradata Manager, SUSE Linux, shell scripting, Teradata Administrator, Teradata SQL Assistant, BTEQ, TPT, PMON, ARCMAIN, TASM, TPump.

Confidential

ETL/Teradata Consultant

Responsibilities:

  • Responsible for attending business and design review meetings to understand the design changes.
  • Participated in daily Agile scrum meetings, presenting the status of each team member's work and any impediments.
  • Responsible for converting all Informatica mappings, sessions, and workflows to Teradata standards.
  • Responsible for tuning Informatica mappings to perform both bulk and stream data loads, adhering to standards and best practices.
  • Checked for DB2-based SQL overrides in the Informatica code and converted them into equivalent Teradata SQL (see the example after this list).
  • Responsible for developing various Teradata stored procedures for use in the ETL process.
  • Responsible for implementing Teradata stored procedures in the code for the different ETL subject areas, debugging issues, and helping the team understand the procedure code.
  • Developed UNIX shell scripts to invoke the different ETL processes and automate the overall ETL flow.
  • Developed Teradata TPT scripts to move data from the production environment to lower environments for testing.
  • Responsible for debugging performance issues in Informatica mappings and changing the mappings to improve performance.
  • Involved in tuning sources, mappings, and SQL queries for better performance; prepared performance tuning standards to meet SLAs and maintained coding standards throughout.
  • Responsible for tuning Business Objects report queries for better performance and optimal use of Teradata resources.
  • Identified and implemented performance improvements such as maintaining correct statistics on tables and indexes and using Teradata partitioned primary indexes (PPIs) to speed up report queries against large tables.
  • Instrumental in resolving Teradata issues, such as space and spool problems, by working closely with DBAs.
  • Responsible for thoroughly testing all converted Informatica mappings, sessions, and workflows on the Teradata platform and comparing the results with the DB2 data warehouse.
  • Responsible for migrating code changes from lower environments to production and supporting production job issues.
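
An illustrative (hypothetical) example of the DB2-to-Teradata SQL conversion described above: DB2's FETCH FIRST n ROWS ONLY is rewritten with Teradata's TOP, and DB2's VALUE() becomes COALESCE().

    /* DB2 override:
       SELECT cust_id, VALUE(bal_amt, 0) AS bal_amt
       FROM   stg.customer
       ORDER  BY open_dt DESC
       FETCH FIRST 10 ROWS ONLY;
    */

    -- Teradata equivalent:
    SELECT TOP 10 cust_id, COALESCE(bal_amt, 0) AS bal_amt
    FROM   stg.customer
    ORDER  BY open_dt DESC;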

Environment: Informatica Power Center 9.5.1, Teradata 14, BTEQ, MultiLoad, FastLoad, FastExport, TPT, ERwin 4.5, OBIEE 11g, PL/SQL, UNIX (Korn) Shell Scripting, Flat Files, XML Files, Excel, Control-M and Redwood Cronacle Scheduling tools.

Confidential

Informatica/Teradata Developer

Responsibilities:

  • Responsible for creating stored procedures and database triggers.
  • Wrote SQL scripts to create staging tables and developed SQL*Loader control files to load data from data files into the staging tables (see the control-file sketch after this list).
  • Assisted other project team members with domain and application functionality to complete project design and build phase activities as required.
  • Understood business requirements and estimated the effort for new requirements.
  • Performed trend analysis of defects and logs to identify problem areas and improve productivity.
  • Customized and created ETL components.
  • Responsible for creating robust Informatica mappings involving Lookup, Aggregator, and Union transformations to load data into staging tables.
  • Used technical and functional expertise in the system to help other teams understand and develop new systems that make the business more productive and efficient.
  • Handled change requests (CRs) to migrate Informatica mappings and workflows from one environment to higher environments.
  • Created unit and system test cases for testing the data loaded into the data warehouse.
  • Performed performance tuning of Informatica components of the application wherever required.
  • Prepared estimates for new CRs and bug fixes.
  • Drove system improvement plans wherever required.
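
A minimal SQL*Loader control-file sketch of the staging loads described above; the file, table, and column names are placeholders. A shell wrapper would invoke it as, for example, sqlldr userid=... control=stg_customer.ctl log=stg_customer.log.

    LOAD DATA
    INFILE 'cust_extract.dat'
    APPEND
    INTO TABLE stg_customer
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( cust_id,
      cust_nm,
      load_dt DATE "YYYY-MM-DD"
    )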

Environment: Informatica 8.x, ERwin, SQL, PL/SQL, TOAD, FileZilla, UNIX, Oracle 9i.
