
Sr Teradata Developer Resume


Minneapolis, MN

SUMMARY:

  • Expert in tuning complex Teradata SQL queries, including joins, correlated subqueries, and scalar subqueries.
  • 10+ years of experience in Teradata Enterprise Data Warehouse (EDW) and Data Mart.
  • Teradata Certified Professional in V2R5.
  • Experience working with clients of all sizes in the insurance, financial, retail, healthcare, and manufacturing industries.
  • Good understanding of Data Warehousing concepts such as Third Normal form (3NF) as well as Dimensions, Facts and Star Schema.
  • Expert skills in Teradata DBMS administration, development, and production support, including use of Teradata Manager, FastLoad, MultiLoad, TSET, TPump, SQL, ARCMAIN, and TASM for workload management.
  • Experience working Teradata issues and incidents through Teradata @ Your Service (T@YS) and the Global Support Center (GSC).
  • Worked with the SAP BusinessObjects/OBIEE team to optimize various report run times by applying Teradata performance techniques such as partitioning, JIs, AJIs, USIs, and NUSIs.
  • The existing Finance Transform (FT) physical data mart was built purely for even distribution of tables, with 10-to-20-column Primary Indexes defined without regard to usage. Redesigned and optimized the FT physical data mart by defining the right PIs and PPIs using the ELDM, and implemented multi-value compression and soft RI where necessary.
  • Since the Teradata environment was new to Navistar, mentored Oracle developers transitioning to the Teradata platform.
  • Implemented various Teradata recommended best practices while defining Profiles, Roles, Alerts, Multi-Value Compression, Data Mover, and Backup.
  • Created and scheduled various Teradata-recommended performance, capacity, trending, and activity reports, such as space, usage, access, and security reports.
  • Troubleshot and fine-tuned Teradata SQL scripts provided by analysts and developers.
  • Created proper Teradata Primary Indexes (PIs) taking into consideration both planned data access and even distribution of data across all available AMPs. Considering both the business requirements and these factors, created appropriate Teradata NUSIs for fast, easy access to data.
  • Copied data from Production into Dev, SIT, and UAT systems using ARCMAIN scripts.
  • Created an Extended Logical Data Model (ELDM) from the logical model to derive a better physical data model for the Discover Bank data mart. Identified the correct UPIs and NUPIs from the ELDM, created USIs and NUSIs where usage patterns and performance issues warranted them, and applied Teradata compression to many columns across tables to save space.
  • Created PPIs for some of the monthly and daily tables for good performance on range queries with date constraints, and created a NUSI on the PI column of PPI tables so that queries with PI access also perform well.
  • Implemented ETL-level statistics collection for indexes and for columns involved in constraints or join conditions.
  • Defined account IDs, priority scheduler performance groups, and system date and time substitution variables in user and profile definitions.
  • Configured TDWM and customized it for the Discover business model.
  • Responsible for the design and implementation of the entire DW Dimensional Model architecture using Join Indexes and Partitioned Primary Indexes to achieve and exceed the desired performance.
  • Set up the Teradata Workload Manager with the proper partitions and workloads based on Service Levels. Set up Teradata Query manager to automate many of the tasks such as collection of statistics, monitoring spool space, candidate tables for purge, etc.
  • Monitored the production and development systems on a daily basis to catch trouble queries, monitor access violations, and provide capacity planning.
  • Modified views on databases; performed performance tuning and workload management. Maintained access rights and role rights, priority scheduling, Dynamic Workload Manager, Database Query Log, database administration, Partitioned Primary Indexes (PPI), multi-value compression analysis, and usage collection and reporting of ResUsage and AmpUsage, along with security administration setup. Led a team of developers working with different users on complicated technical issues.
  • Implemented performance features for several subject areas by choosing the right Primary Indexes, ETL-level statistics collection, and column compression. Also implemented partitioning on the periodic monthly and daily tables to speed up range queries, and NUSIs on the PI of PPI tables to improve performance of NUPI access.
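The PPI-plus-NUSI pattern described in these bullets can be sketched as follows; the table, columns, and date range are illustrative placeholders only:

```sql
-- Hypothetical daily table: partitioned by date for range queries,
-- with a NUSI on the PI column so direct PI lookups stay efficient.
CREATE TABLE sales_daily (
    sale_id   INTEGER NOT NULL,
    store_id  INTEGER,
    sale_dt   DATE NOT NULL,
    amount    DECIMAL(12,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N (
    sale_dt BETWEEN DATE '2010-01-01' AND DATE '2015-12-31'
            EACH INTERVAL '1' DAY
);

-- Without partition elimination, a PI access on a PPI table probes every
-- partition; a NUSI on the PI column gives the optimizer a direct path.
CREATE INDEX (sale_id) ON sales_daily;
```

A range query such as `WHERE sale_dt BETWEEN DATE '2012-01-01' AND DATE '2012-01-31'` then touches only 31 partitions instead of the whole table.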

TECHNICAL SKILLS:

Operating Systems: Windows, MPRAS UNIX, LINUX

Languages: SQL,C,SAS,COBOL,JCL

Databases: NCR Teradata V2R6, Oracle, DB2, SQL Server 2000 (also worked with XML data)

TD Tools & Utilities: Teradata Manager, Teradata Database Query Log, Teradata Priority Scheduler, Teradata Visual Explain, MicroStrategy, Informatica, Teradata SQL Assistant

Data modeling: Erwin

TD Load Utilities (ETL Tools): BTEQ, FastLoad, MultiLoad, TPump, FastExport

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Sr Teradata Developer

Responsibilities:

  • Highly experienced in SQL performance tuning, SQL optimization, and debugging of ETL processes.
  • Involved in Teradata query tuning; tuned complex queries and views and implemented macros to reduce parsing time.
  • Handled Teradata performance SQL Tuning, Query optimization (Explain plans, Collect statistics, Primary and Secondary indexes).
  • Involved in Teradata 13.10 upgrade activities and implemented new features such as character-based PPI, timestamp PPI, temporal tables, and columnar tables.
  • Implemented different types of join indexes, which help the optimizer generate better explain plans for complex join conditions.
  • Developed semantic views and implemented different types of indexes such as NOPI, UPI, NUPI, USI, NUSI, PPI, JIs, and AJIs (multilevel join indexes with compression).
  • Designed and built temporary tables through batch jobs to simplify complex views.
  • Performed application-level activities such as creating tables and indexes; monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Tuned and developed semantic views for MicroStrategy report-generated SQL.
  • Experienced in the data modeling phase of data warehouse design: table creation and index selection in Teradata.
  • Performed Space Management for Perm & Spool Space.
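The aggregate join index (AJI) technique listed above can be sketched roughly as follows; the tables, columns, and index name are illustrative placeholders, not from the actual engagement:

```sql
-- An AJI pre-joins and pre-aggregates two tables so the optimizer can
-- satisfy matching report SQL without touching the base tables.
CREATE JOIN INDEX ji_sales_by_region AS
SELECT  st.region_cd,
        s.sale_dt,
        SUM(s.amount) AS tot_amount
FROM    sales_daily s
JOIN    store st
ON      s.store_id = st.store_id
GROUP BY st.region_cd, s.sale_dt
PRIMARY INDEX (region_cd);
```

A semantic view over the base tables can then serve tools like MicroStrategy, while the optimizer silently rewrites qualifying aggregate queries against the join index.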

Environment: Teradata 13.10, SQL, Index, Viewpoint, PDCR DATA/INFO, NetVault (Backup/Restore), MicroStrategy, Semantic Views, (Teradata ELT Utilities - MultiLoad, FastLoad, SQL Assistant, PMON, TSET, TASM, BTEQ, TPump), ArcMain, Join Indexes, Microsoft Products, Java, Erwin.

Confidential, Charlotte, NC

Sr. Teradata Developer

Responsibilities:

  • Involved in the complete re-design of the process flow, starting with design considerations, technical specification and source to target mapping.
  • Responsible for resolving EIW customer-support issues on a 24x7 on-call basis.
  • Involved in logical and physical data model design with the modeling team.
  • Involved in Teradata 13.10 test activities such as creating users and allocating spool, temporary, and permanent space; checked table skewing and compression.
  • Performed performance tuning and query optimization (explain plans, collect statistics, primary and secondary indexes).
  • Debugged Teradata SQL, collected the recommended statistics, and verified the join methods.
  • Strong hands-on experience with Teradata SQL (inner and outer joins), query tuning, optimization, and debugging.
  • Performed application-level activities such as creating tables and indexes; monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Performed Space Management for Perm & Spool Space.
  • Supported BusinessObjects report-generated SQL.
  • Involved in the capacity planning review process; estimated weekly, monthly, and yearly space requirements.
  • Developed Teradata ETL scripts using FastLoad, MultiLoad, FastExport, and BTEQ.
  • Experienced in creating Teradata DDL (compression, skew factor), tables, multi-table views, denormalization, and index selection.
  • Experienced in writing and debugging Teradata macros and stored procedures.
  • Worked on loading data from several flat-file sources to staging using Teradata TPump, MultiLoad, FastLoad, and BTEQ in mortgage applications.
  • Interacted with clients, data modelers, and data analysts to develop the logical and physical design for new tables.
  • Verified table skew factors and performed space analysis using Teradata Administrator.
  • Worked with the TSET tool to capture explain logs and send them to the Teradata team for analysis.
  • Built tables, views, UPIs, NUPIs, USIs, NUSIs, and PPIs.
  • Designed BTEQ scripts with error handling for table creation.
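A BTEQ table-creation script with the kind of error handling described above might be sketched as follows; the logon and object names are placeholders:

```
.LOGON tdpid/etl_user,password;

/* Treat "table already exists" (3803) as non-fatal so reruns succeed */
.SET ERRORLEVEL 3803 SEVERITY 0;

CREATE TABLE stg.customer_stg (
    cust_id  INTEGER NOT NULL,
    cust_nm  VARCHAR(50)
)
PRIMARY INDEX (cust_id);

/* Abort with a non-zero return code on any other failure */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The non-zero `.QUIT` code lets the calling batch scheduler detect the failure and stop downstream steps.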

Environment: Teradata 13.1, UNIX, (Teradata ELT Utilities - MultiLoad, FastLoad, SQL Assistant, PMON, TSET, TASM, Visual Explain, FastExport, BTEQ, TPump), Ab Initio, ArcMain, TD SQL, Performance Monitor, Microsoft Products, Linux, DB2, Erwin, Viewpoint, Teradata Administrator & Manager, PDCR DATA/INFO, DBQL, NetVault (Backup & Restore).

Confidential, Medicaid, NY

Teradata Developer

Responsibilities:

  • Checked table skewing, table compression, and index recommendations.
  • Performed application-level activities such as creating tables and indexes; monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Designed archival jobs in NetVault, considering archival resources such as the number of media servers and tapes and the various limits on transmitting/receiving data to the media.
  • Involved in logical and physical data model design with the Erwin modeling team.
  • Debugged MicroStrategy queries and applied tuning improvements.
  • Monitored the production system as needed and controlled online and batch jobs.
  • Converted DB2 logical data models into Teradata logical and physical database models.
  • Interacted with clients, data modelers, and data analysts to develop the logical and physical design for tables.
  • Debugged Teradata SQL, collected the recommended statistics, and verified the join methods.
  • Strong hands-on experience with Teradata SQL (inner and outer joins).
  • Experienced in creating Teradata DDL (compression, skew factor), tables, multi-table views, denormalization, and index selection.
  • Performed bulk data loads from multiple data sources (SQL Server, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
  • Worked on exporting data to flat files using Teradata FastExport.
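A minimal FastExport script for the flat-file extracts mentioned above might look like this; the logon, paths, and object names are placeholders:

```
.LOGTABLE stg.fexp_restart_log;
.LOGON tdpid/etl_user,password;

.BEGIN EXPORT SESSIONS 4;

.EXPORT OUTFILE /data/out/claim_extract.txt
        MODE RECORD FORMAT TEXT;

/* TEXT format expects character data, so cast the columns explicitly */
SELECT  claim_id (CHAR(12)) || '|' ||
        claim_dt (FORMAT 'YYYY-MM-DD') (CHAR(10))
FROM    dw.claim
WHERE   claim_dt >= DATE '2012-01-01';

.END EXPORT;
.LOGOFF;
```

The `.LOGTABLE` gives the utility a restart log, so an interrupted export can resume rather than start over.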

Environment: Teradata 13/V2R6, Teradata Manager, (ETL tools) MultiLoad, FastLoad, PMON, FastExport, BTEQ, JCL, Viewpoint, TASM, TSET, TD Data Mover, DB2, TPump, MicroStrategy, Priority Scheduler, LDAP, DBQL, PMCP, Visual Explain, Unix-AIX, MVS, TSO/ISPF, TD SQL, Erwin.

Confidential, AR

Data Analyst

Responsibilities:

  • Changed the mainframe Teradata programs whose SQL statements pointed to the old data source to point to the new hierarchy tables.
  • Changed the reports under the SWAS merchandising reporting groups on Retail Link that use the people or product hierarchy to pull hierarchy data from the new data source.
  • Performed Teradata tuning and query optimization (explain plans, collect statistics, primary and secondary indexes).
  • Built Teradata tables, stored procedures, and UPI, NUPI, USI, and NUSI indexes.
  • Worked extensively with Teradata SQL Assistant to interface with the Teradata database.
  • Experienced with Backup and Recovery using Teradata ARCMAIN.
  • Extensively used EXPLAIN in analyzing the SQL query execution flow. Using this analysis tuned SQL queries to use Teradata resources efficiently.
  • Generated the Teradata FastLoad and MultiLoad scripts in the UNIX operating system region.
  • Used normalization and denormalization, considering the merits and demerits for SQL query execution.
  • Collected statistics on the indexes for query optimization.
  • Experienced in using Teradata Query Manager for enabling access logging and query logging of certain users.
  • Wrote Teradata extraction and coding scripts according to client standards.
  • Extensively used ETL to load data from fixed-width and delimited flat files.
  • Experienced in using various Ab Initio components such as Transform, Sort, Reformat, Partition, and De-Partition, and used Ab Initio ETL for complex transformations.
  • Wrote numerous wrapper scripts in UNIX.
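Loading a delimited flat file into a staging table with FastLoad, as described above, follows this general shape; the logon, path, and object names are placeholders:

```
SESSIONS 4;
LOGON tdpid/etl_user,password;

DATABASE stg;

/* FastLoad requires an empty target table and two error tables */
BEGIN LOADING customer_stg
      ERRORFILES customer_err1, customer_err2;

SET RECORD VARTEXT '|';

DEFINE  in_cust_id  (VARCHAR(10)),
        in_name     (VARCHAR(50))
FILE = /data/in/customer.txt;

INSERT INTO customer_stg (cust_id, cust_nm)
VALUES (:in_cust_id, :in_name);

END LOADING;
LOGOFF;
```

Rows that fail conversion land in the first error table and duplicate/constraint violations in the second, which is where load debugging usually starts.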

Environment: TERADATA/V2R6, ORACLE, QUERYMAN, RDBMS, UNIX, ETL TOOLS (FASTLOAD, BTEQ, MULTILOAD, TPUMP, FASTEXPORT), EXPLAIN, TSQL, ABINITIO GDE 1.13, ESSBASE 9.2, JCL, AUTOSYS, ERWIN, VSAM, DB2, MVS, EZTRIEVE.

Confidential, Franklin Lakes, NJ

Data Analyst

Responsibilities:

  • PROCESS-1: Calculated the MCDA AMT (Member Calculated Due Amount) and inserted the data into the True Up table. The process runs twice: once to create 2006 rows, then again after changing the PARM file to create 2007 rows. Used the TMG Subscriber database to confirm whether a member is active or inactive; if the member exists on that table, it is active.
  • Worked on loading data from the EDW and supporting subject areas using Ab Initio.
  • Worked on exporting data to flat files using Teradata FastExport.
  • Performance Tuning, Query optimization (Explain plans, Collect statistics, Primary and Secondary indexes).
  • Built Teradata tables, views, UPIs, NUPIs, USIs, and NUSIs.
  • Generated the Teradata FastLoad and MultiLoad scripts in the UNIX operating system region.
  • Used Ab Initio ETL and MicroStrategy for complex transformations.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Worked extensively with Teradata SQL Assistant to interface with the Teradata database.
  • Used normalization and denormalization in TSQL, considering the merits and demerits for query execution.
  • Designed and implemented the logical and physical data models.
  • Wrote conditional Teradata BTEQ/FastLoad/MultiLoad/TPump/FastExport scripts and created the new tables.
  • Generated accounting adjustment, write-off, summary, and audit files for 2006 and 2007 for both active and inactive members under the Medicare Part D plan using the FastExport utility.
  • Production Implementation and Post Production Support.
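The conditional load scripts mentioned above can be sketched as a MultiLoad upsert; all object names, the file path, and the logon are placeholders:

```
.LOGTABLE stg.ml_member_log;
.LOGON tdpid/etl_user,password;

.BEGIN IMPORT MLOAD TABLES stg.member
       WORKTABLES stg.wt_member
       ERRORTABLES stg.et_member stg.uv_member;

.LAYOUT member_layout;
.FIELD in_member_id * VARCHAR(12);
.FIELD in_status    * VARCHAR(1);

/* Conditional logic: update the row if it exists, insert it if not */
.DML LABEL upsert_member
     DO INSERT FOR MISSING UPDATE ROWS;
UPDATE stg.member
SET    status = :in_status
WHERE  member_id = :in_member_id;
INSERT INTO stg.member (member_id, status)
VALUES (:in_member_id, :in_status);

.IMPORT INFILE /data/in/member.txt
        FORMAT VARTEXT '|'
        LAYOUT member_layout
        APPLY upsert_member;

.END MLOAD;
.LOGOFF;
```

`DO INSERT FOR MISSING UPDATE ROWS` is what makes the script conditional: each input record tries the UPDATE first and falls back to the INSERT.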

Environment: TERADATA/V2R6, EXPLAIN, QUERYMAN, UNIX, INFORMATICA, ORACLE, TSQL, ETL TOOLS (BTEQ, FAST LOAD, TPUMP, MULTI-LOAD), RDBMS, ESSBASE CUBE 9.2, ABINITIO GDE 1.12, AUTOSYS, ERWIN, FILE MANAGER.

Confidential, Overland Park, KS

Data Analyst

Responsibilities:

  • Analyzed the technical design spec and modified the existing code.
  • Interacted with clients, data modelers, and data analysts to develop the logical and physical design for tables.
  • Worked on Teradata ETL utilities TPump, MultiLoad, FastLoad, BTEQ, and FastExport for loading data from several flat-file sources to staging.
  • Performed TSQL tuning and query optimization (explain plans, collect statistics, clustered and non-clustered indexes).
  • Built Teradata views, UPIs, NUPIs, USIs, and NUSIs, and used Informatica ETL for complex transformations.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Worked extensively with Teradata SQL Assistant to interface with the Teradata database.
  • Generated the FastLoad and MultiLoad scripts in the UNIX operating system region.
  • Wrote several Teradata stored procedure scripts to implement the business logic.
  • Worked on loading data from several flat-file sources to staging using Teradata TPump, MultiLoad, FastLoad, and BTEQ.
  • Generated OLAP reports in MicroStrategy for the end users.
  • Converted the T230 & T268 table data from the DB2 region to the Teradata region using the FastLoad, MultiLoad, and Teradata stored procedure utilities.
  • Converted the finance segment (WFIN25SD, WFIN26SD) data from the IMS database to the Teradata region.
  • Developed Teradata web forms, stored procedures, and Crystal Reports in the database.
  • Developed Ab Initio graphs using various transform, normalize, and denormalize components.
  • Experienced in using various Ab Initio components such as Transform, Sort, Reformat, Partition, and De-Partition.
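A Teradata stored procedure carrying a small piece of business logic, as referenced above, might look like this sketch; the procedure, table, and column names are hypothetical:

```sql
-- Purge history rows older than a caller-supplied retention window.
REPLACE PROCEDURE stg.sp_purge_history (IN keep_days INTEGER)
BEGIN
    DELETE FROM stg.txn_history
    WHERE  txn_dt < CURRENT_DATE - keep_days;
END;
```

A batch job would then invoke it with, for example, `CALL stg.sp_purge_history(90);` to keep 90 days of history.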

Environment: TERADATA/V2R6, EXPLAIN, QUERYMAN, TSQL, UNIX, INFORMATICA, (ETL TOOLS) BTEQ, FASTLOAD, MLOAD, ORACLE, ESSBASE CUBE 9.2, CA7, JCL, ABINITIO GDE 1.12, AUTOSYS, ERWIN, FILE MANAGER.

Confidential, MI

System Engineer

Responsibilities:

  • Modified and added new insurance policies per client requirements.
  • Worked on SQL query tuning by reviewing Explain plan, and step print of the query.
  • Collected statistics on the indexes for query optimization.
  • Experienced in using Teradata Query Manager for enabling access logging and query logging of certain users.
  • Worked on Teradata ETL utilities TPump, MultiLoad, FastLoad, BTEQ, and FastExport for loading data from several flat-file sources to staging.
  • Used Teradata SQL Assistant to run queries.
  • Generated the Teradata FastLoad and MultiLoad scripts in the UNIX operating system region.
  • Worked extensively with Teradata SQL Assistant to interface with the Teradata database.
  • Experienced with Backup and Recovery using Teradata ARCMAIN.
  • Extensively used the Teradata Priority Scheduler, one of the system's most popular tools, to control system load by defining performance groups, allocation groups, performance periods, and resource partitions.
  • Generated OLAP reports in MicroStrategy for the end users.
  • Created users with different profiles and roles to control access to the system.
  • Experienced in Teradata Manager, used to create alerts, monitor the system, and view historical reports.
  • Familiar with the Teradata Database Query Manager.
  • Extensively used EXPLAIN to analyze SQL query execution flow, and used that analysis to tune SQL queries to use Teradata resources efficiently.
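The EXPLAIN-driven tuning described above typically pairs the plan output with targeted statistics collection; a sketch with illustrative table and column names:

```sql
-- Inspect the optimizer's plan: join methods, redistribution steps,
-- and confidence levels ("low confidence" often means missing stats).
EXPLAIN
SELECT c.state_cd, COUNT(*)
FROM   dw.customer c
JOIN   dw.policy   p ON c.cust_id = p.cust_id
GROUP BY c.state_cd;

-- Collect statistics on the join and grouping columns, then re-EXPLAIN
-- to confirm the plan improves.
COLLECT STATISTICS ON dw.customer COLUMN (cust_id);
COLLECT STATISTICS ON dw.policy   COLUMN (cust_id);
COLLECT STATISTICS ON dw.customer COLUMN (state_cd);
```

Comparing the plan before and after collection shows whether the optimizer switched from a product or redistribution-heavy join to a cheaper merge or hash join.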

Environment: TERADATA/V2R5, EXPLAIN, UNIX, QUERYMAN, SAS, TSQL, ETL TOOLS (FASTLOAD, BTEQ, TPUMP), ORACLE, INFORMATICA, ESSBASE CUBE 9.2, ELIPS, CA7, OLAP, DSS, ABINITIO GDE 1.12, JCL, AUTOSYS, ERWIN, FILE MANAGER.

Confidential, MI

Consultant

Responsibilities:

  • Involved in the complete re-design of the process flow, starting with design considerations, technical specification and source to target mapping.
  • Worked on loading data from several flat-file sources to staging using Teradata TPump, MultiLoad, FastLoad, and BTEQ.
  • Interacted with clients, data modelers, and data analysts to develop the logical and physical design for tables.
  • Generated OLAP reports in MicroStrategy for the end users.
  • Created users with different profiles and roles to control access to the system.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Generated the Teradata FastLoad and MultiLoad scripts in the UNIX operating system region.
  • Moved database schema changes to the stage/production databases.
  • Extensively worked on performance tuning of mappings, ETL procedures, and processes.
  • Performed bulk data loads from multiple data sources (UNIX, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
  • Experienced in working with Teradata ETL tools such as FastLoad, MultiLoad, TPump, and FastExport.
  • Used the MVC (Multi-Value Compression) analysis tool for database sizing.
  • Requirement gathering.
  • Coordinating communication between the business and the offshore team.
  • Low-level design specification preparation.
  • Analyzing and coding programs.
  • Preparing test scripts.
  • Coding and testing the code.
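Multi-value compression, used above for database sizing, is declared per column in the table DDL; a minimal sketch with illustrative names and value lists:

```sql
-- Frequently recurring values are compressed out of each row; with a
-- COMPRESS list present, NULL is compressed as well by default.
CREATE TABLE dw.policy_holder (
    holder_id  INTEGER NOT NULL,
    state_cd   CHAR(2) COMPRESS ('MI', 'OH', 'IN', 'IL'),
    status_cd  CHAR(1) COMPRESS ('A', 'I', 'P')
)
PRIMARY INDEX (holder_id);
```

The MVC analysis step is about choosing those value lists: candidate values come from frequency counts on the existing data, and the space saved scales with how skewed the distributions are.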

Environment: TERADATA/V2R5, EXPLAIN, BO, QUERYMAN, UNIX, TSQL, ETL TOOLS (FASTLOAD, MLOAD, TPUMP), RDBMS, ESSBASE CUBE 9.2, ELIPS, CA7, OLTP, DSS, ABINITIO GDE 1.12, AUTOSYS, ERWIN, FILE MANAGER.
