
Teradata Semantic Developer Resume


Providence, RI

SUMMARY

  • Over seven (7+) years of comprehensive software experience in Data Warehousing, covering requirements analysis, application design, data modeling, development, testing, and documentation.
  • Very strong skills in project management, requirement analysis, business analysis, database modeling, design and analysis, issue coordination, and development with Teradata/Oracle/SQL Server relational databases.
  • 7+ years of experience using Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad (FLoad), MultiLoad (MLoad), and FastExport, with exposure to TPump, on UNIX/Windows environments; ran batch processes for Teradata CRM and performed data testing.
  • 2+ years of experience with Visual Studio (SSAS/SSIS) and Visual Source Safe (VSS), managing OLAP cubes and performing a wide range of cube operations.
  • Expertise in database programming: writing SQL, stored procedures, functions, triggers, and views in Teradata, Oracle, DB2, and MS Access.
  • Experienced with data warehousing applications in the financial services, banking, insurance, and retail industries.
  • Experienced in creating complex mappings using various transformations and in developing Extraction, Transformation, and Loading (ETL) strategies using Informatica.
  • Experience in writing UNIX shell scripts to check database status, space management, and performance. Extensive use of crontab in the UNIX environment for scheduling routine tasks.
  • Involved in all stages of the software development cycle: design, specification, coding, debugging, testing (test plan and test execution), documentation, and maintenance of programs.
  • Experience in application development using system analysis, design, and modeling techniques such as Unified Modeling Language (UML), sequence diagrams, use case diagrams, and Entity Relationship Diagrams (ERD).
  • Good team player who works efficiently across multiple teams and products; easily adaptable to new systems and environments.

TECHNICAL SKILLS

Databases: Teradata (TD) 14, 13.10, 12; Oracle 11g/10g/9i; SQL Server 2008/2005/2000; DB2

Operating Systems: Windows 2003/2000/XP/NT, UNIX

Languages: SQL, PL/SQL.

TD Utilities: BTEQ, FastLoad (FLoad), MultiLoad (MLoad), FastExport, TPump, Exp/Imp.

Tools: VSS, Arcmain, Viewpoint, BAR, TARA, Teradata Administrator, SQL Assistant, Toad, PuTTY, SQL*Plus, TFS, MS Office, Informatica Power Center 7.X/8.X/9.X, SQL BI Tools.

PROFESSIONAL EXPERIENCE

Confidential, Providence, RI

Teradata Semantic Developer

Responsibilities:

  • Designed Technical Spec Document.
  • Designed flow diagrams explaining the process flow of the semantic-layer development.
  • Interacted with Business System Analysts and the data modeler to gather ETL requirements.
  • Created tables, global temporary tables (GTTs), and views.
  • Chose the right primary index (PI) for each table.
  • Coded BTEQ scripts to meet client requirements.
  • Built complex Teradata SQL code in BTEQ script to populate the data as per business rules.
  • Performed unit testing.
  • Used Control-M to create and schedule jobs.
  • Used Subversion to version-control BTEQ scripts.
  • Troubleshot TEMP space and SPOOL space issues while testing data.
  • Involved in performance tuning by implementing secondary indexes, join indexes, and compression, by collecting statistics, and by checking explain plans and skew factors.
  • Provided support during the system test and user acceptance testing.
  • Developed test plans, test cases, and test scripts for source to target validation.
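A minimal sketch of the kind of BTEQ script described above, with basic error handling so a scheduler such as Control-M sees a non-zero return code on failure. All logon values, database, and table names are hypothetical placeholders:

```sql
.LOGON tdpid/dev_user,password;          -- hypothetical logon

.SET ERROROUT STDOUT;                    -- send error messages to stdout
.SET ERRORLEVEL UNKNOWN SEVERITY 8;      -- treat unexpected errors as severe

-- Populate a semantic-layer table from staging per a business rule
INSERT INTO sem_db.customer_summary
SELECT customer_id,
       SUM(txn_amount) AS total_amount,
       COUNT(*)        AS txn_count
FROM   stg_db.transactions
GROUP  BY customer_id;

-- Abort the batch with return code 8 if the insert failed
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The `.IF ERRORCODE <> 0 THEN .QUIT n` pattern is what lets downstream job scheduling distinguish a clean run from a failed step.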

Environment: Teradata14, Teradata SQL Assistant, Putty, UNIX, Control M, Subversion.

Confidential, Richmond, VA

Teradata Lead Developer

Responsibilities:

  • Wrote the Technical Design Document per the scope document provided by the BA.
  • Designed flow diagrams to present proposed fixes to the client for issues they reported.
  • Interacted with Business System Analysts to gather information on changes to be made to the business data model.
  • Worked in UNIX and Windows environments.
  • Created tables, global temporary tables (GTTs), and views.
  • Chose the right primary index (PI) for each table.
  • Coded BTEQ scripts to meet client requirements.
  • Developed code per the mapping document to fix the data issues the client reported.
  • Collected statistics.
  • Constructed DDLs for staging and target view/table creation with proper usage of UPI/NUPI.
  • Involved in Unit testing, SIT testing and Regression Testing.
  • Worked on Capacity planning document.
  • Worked on complex SQL and DML to map the data per the requirements.
  • Troubleshot TEMP space and SPOOL space issues while testing data.
  • Created workbooks for data object migrations and component migrations.
  • Involved in UNIX scripting.
  • Involved in performance tuning by implementing secondary indexes, join indexes, and compression, by collecting statistics, and by checking explain plans and skew factors.
  • Used WLM tool to run jobs.
  • Provided support during the system test and user acceptance testing.
  • Developed test plans, test cases, and test scripts for source to target validation.
  • Used ClearCase to version-track new and modified scripts.
  • Worked in onsite-offshore model.
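The UPI/NUPI and collect-statistics work above might look like the following sketch; database, table, and column names are hypothetical:

```sql
-- Staging table: NUPI on a frequently joined column for even distribution
CREATE MULTISET TABLE stg_db.orders_stg
(
  order_id     INTEGER NOT NULL,
  customer_id  INTEGER,
  order_date   DATE,
  amount       DECIMAL(18,2)
)
PRIMARY INDEX ( customer_id );      -- NUPI: duplicates allowed, good join locality

-- Target table: UPI enforces uniqueness and gives single-AMP access by order_id
CREATE SET TABLE core_db.orders
(
  order_id     INTEGER NOT NULL,
  customer_id  INTEGER,
  order_date   DATE,
  amount       DECIMAL(18,2)
)
UNIQUE PRIMARY INDEX ( order_id );

-- Statistics so the optimizer can estimate row counts and avoid skewed plans
COLLECT STATISTICS COLUMN (customer_id) ON core_db.orders;
COLLECT STATISTICS COLUMN (order_date)  ON core_db.orders;
```

The PI choice drives row distribution across AMPs, which is why a skewed NUPI column shows up directly in the skew factor checked during tuning.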

Environment: Teradata14, Teradata SQL Assistant, Informatica 9.1, Putty, UNIX, WLM, Clear Case, Clear Quest, TSRM.

Confidential, Chattanooga, TN

Teradata Developer

Responsibilities:

  • Analyzed the functional documents to understand the healthcare domain fields.
  • Created database objects based on the logical models and participated in planning the physical database and data-file mapping.
  • Using Teradata DDLs created Tables, Views, Macros, Triggers and Stored Procedures.
  • Developed the SQL code as per the ETL mapping document to pull data from source to target.
  • Wrote SQL code in BTEQ scripts and transferred the scripts from Windows to the UNIX environment.
  • Created AJIs, PPIs, UPIs, NUPIs, SIs, JIs, and HIs to improve performance when interacting with the data.
  • Worked on optimizing and tuning the Teradata SQLs to improve the performance.
  • Involved in performance tuning by implementing secondary indexes, join indexes, and compression, by collecting statistics, and by checking explain plans and skew factors.
  • Ran ETL jobs weekly through UNIX batch scripts to load data from source to the development database.
  • Prepared run book with status summary reports with details of executed, passed and failed test cases.
  • Worked on utilities such as MLoad, FLoad to load data from flat files to database.
  • Implemented error-handling techniques in BTEQ scripts for reliable execution of multiple queries.
  • Used Viewpoint to review query execution times and explain plans and to evaluate query efficiency; also used Teradata visual tools such as Teradata Manager, the Database Query Log (DBQL), TDQM, and Visual Explain.
  • Recommended better PI choices and other optimization techniques to the team to improve data access.
  • Provided support during the system test and user acceptance testing.
  • Developed test plans, test cases, and test scripts for source to target validation.
  • Performed unit testing and integration testing.
  • Involved in data testing and performance tuning of long-running queries.
  • Tested the source and target row count and validated the results.
  • Performed natural key test on source and target tables.
  • Took backup of data using TARA.
  • Worked on SAP BO to extract reports for analysis purpose.
  • Worked on SAS to do data mining and produce statistical reports.
  • Developed routine SAS Macros to create tables, graphs and listings for inclusion in clinical study reports.
  • Extensive use of PROC SQL to perform queries.
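A sketch of the flat-file loads via FLoad mentioned above, for a pipe-delimited file; the logon, file path, and object names are hypothetical placeholders:

```sql
SESSIONS 4;
LOGON tdpid/etl_user,password;         -- hypothetical logon

-- FastLoad requires an empty target table plus two error tables
DROP TABLE work_db.claims_err1;
DROP TABLE work_db.claims_err2;

BEGIN LOADING stg_db.claims_stg
      ERRORFILES work_db.claims_err1, work_db.claims_err2;

SET RECORD VARTEXT "|";                -- pipe-delimited flat file

DEFINE claim_id   (VARCHAR(18)),
       member_id  (VARCHAR(18)),
       claim_amt  (VARCHAR(20))
FILE = /data/in/claims.dat;            -- hypothetical input path

INSERT INTO stg_db.claims_stg
VALUES ( :claim_id, :member_id, :claim_amt );

END LOADING;
LOGOFF;
```

Rows rejected for conversion or constraint errors land in the two error tables, which is the hook for the error-handling checks described in the bullets.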

Environment: Teradata 13.10, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Teradata SQL, Viewpoint, TARA, SAP BO, SAS, UNIX.

Confidential, Dallas, TX

Teradata Developer

Responsibilities:

  • Involved in Physical Database Design from logical data model (LDM) to physical data model (PDM).
  • Normalized the data to 3NF.
  • Converted the data mart from logical to physical design; defined data types, constraints, and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for database objects.
  • Designed and implemented mapping of Oracle sources to Teradata staging/core tables.
  • Worked with ETL leads to formulate ETL approach.
  • Applied the SCD Type 2 concept when populating target tables of the data warehouse.
  • Designed/created physical database objects (tables, views (DDLs), indexes) to support normalized and dimensional models.
  • Employed ETL best practices including trimming, casting, cleansing, and validation of data.
  • Managed design/development of delta loads (transactional data)/full loads (master data).
  • Performed peer code reviews for local as well as offshore resources.
  • Participated in and led biweekly team status meetings/scrums.
  • Using TD DDLs created Tables, Views, Macros, Triggers and Stored Procedures.
  • Constructed DDL's for staging and target view/table creation with proper usage of UPI/NUPI.
  • Created and maintained user profile definitions to manage performance.
  • Coded using Teradata analytical functions and BTEQ SQL.
  • Executed explain plans to determine optimum query performance.
  • Used Visual Source Safe (VSS) to save the latest versions of modified scripts.
  • Used Visual Studio to manage OLAP cubes and performed a range of cube operations.
  • Created UPIs, NUPIs, SIs, PPIs, and join and hash indexes (JIs/HIs) for good distribution and retrieval of data across AMPs.
  • Worked on complex SQLs to map the data as per the requirements.
  • Populated or refreshed Teradata tables using the FastLoad (FLoad), MultiLoad (MLoad), FastExport, and TPump TD load utilities.
  • Involved in Performance tuning of SQLs and debugging SQL for TD load utilities.
  • Explored ways to optimize existing ETL processes and enhance their performance.
  • Worked on SSAS, processing OLAP cubes and adding dimensions.
  • Worked on SSIS, creating packages for execution of SQL scripts.
  • Developed Test Plans, Test Cases, and Test Scripts for SIT and support for UAT tests.
  • Performed unit testing and integration testing.
  • Provided system integration testing and production support for go-live deployments.
  • Involved in data testing and performance tuning of long-running queries.
  • Tested the source and target row count and validated the results.
  • Prepared status summary reports with details of executed, passed, and failed test cases.
  • Created test cases and developed the Traceability Matrix and test coverage reports.
  • Managed and conducted System testing, Integration testing and Functional testing.
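The SCD Type 2 population mentioned above can be sketched in Teradata SQL as a two-step close-and-insert; the dimension, delta table, and tracked column (`address`) are hypothetical:

```sql
-- Step 1: close out current rows whose tracked attribute changed in the delta
UPDATE tgt
FROM core_db.customer_dim AS tgt, stg_db.customer_delta AS src
SET end_date   = CURRENT_DATE - 1,
    current_fl = 'N'
WHERE tgt.customer_id = src.customer_id
  AND tgt.current_fl  = 'Y'
  AND tgt.address    <> src.address;

-- Step 2: insert a new "current" version for changed and brand-new customers
INSERT INTO core_db.customer_dim
  (customer_id, address, start_date, end_date, current_fl)
SELECT src.customer_id, src.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_db.customer_delta AS src
LEFT JOIN core_db.customer_dim AS tgt
       ON tgt.customer_id = src.customer_id
      AND tgt.current_fl  = 'Y'
WHERE  tgt.customer_id IS NULL;
```

History is preserved because changed rows are end-dated rather than overwritten; the `current_fl` flag keeps point-in-time queries cheap.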

Environment: Teradata 13.10, Teradata SQL Assistant, Viewpoint, Visual Studio, Teradata (TD) Load Utilities (MultiLoad (MLoad), FastLoad (FLoad), FastExport, BTEQ), SSAS, SSIS.

Confidential, Pleasanton, CA

Teradata Lead Developer

Responsibilities:

  • Interacted with business analyst and business team based on the need to understand the business purpose of the requirements.
  • Worked on building a star schema.
  • Performed requirements analysis, data assessment, business process reengineering, and index maintenance and analysis; worked with ETL leads to formulate the ETL approach and appropriate use of Teradata Tools and Utilities.
  • Created database schemas based on the logical models and was involved in planning the physical database and data-file mapping.
  • Used modeling tools for logical and physical database design; maintained and updated database models, generated or modified database schemas, and handled data warehouse loading.
  • Created tables, views, macros, stored procedures using TD DDL
  • Worked on Informatica 8.X using necessary transformations.
  • Highly proficient in writing loader scripts through TD utilities like FastLoad (FLoad), MultiLoad (MLoad), FastExport, Tpump.
  • Worked on optimizing and tuning the Teradata SQLs to improve the performance of batch.
  • Worked on loading of data from several flat files sources using Teradata MLOAD & FLOAD.
  • Responsible for the design and development of the database models with the logical data modeling.
  • Used Visual Source Safe (VSS) to save the latest versions of modified scripts.
  • Used Visual Studio (SSAS) to manage OLAP cubes and performed a range of cube operations.
  • Designed BTEQ scripts with the necessary SQL incorporated in it and ran through batch files.
  • Used SQL Server 2005 to have an automated run of scripts for Monthly loads.
  • Worked on TPT to transfer data from PROD to TEST environment.
  • Optimized data distribution across AMPs through creation of matching PIs across instances.
  • Implemented several purging techniques to free up space in certain Teradata databases and to remove orphan records from collection tables.
  • Provided support during the system test, Product Integration Testing and User Acceptance Testing.
  • Edited data held in Excel per user requests by applying filters directly at the Excel data connection.
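One common way to combine the PPI and purging work described above is to range-partition a fact table by month, so deleting aged data touches only the affected partitions. A minimal sketch with hypothetical names and a hypothetical 24-month retention window:

```sql
-- Partition the fact table by month for partition-at-a-time scans and purges
CREATE MULTISET TABLE core_db.sales_fact
(
  sale_id    INTEGER NOT NULL,
  store_id   INTEGER,
  sale_date  DATE NOT NULL,
  amount     DECIMAL(18,2)
)
PRIMARY INDEX ( sale_id )
PARTITION BY RANGE_N (
  sale_date BETWEEN DATE '2010-01-01' AND DATE '2012-12-31'
            EACH INTERVAL '1' MONTH );

-- Purge: a delete bounded by the partitioning column prunes to whole partitions
DELETE FROM core_db.sales_fact
WHERE sale_date < ADD_MONTHS(CURRENT_DATE, -24);
```

Because the delete predicate is on the partitioning column, the optimizer can eliminate untouched partitions instead of scanning the full table.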

Environment: Teradata 12, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Viewpoint, TASM, Visual Studio, Tableau 4, Teradata SQL, Teradata EDW Roadmaps, Teradata Relationship Manager, SSIS, SSAS, Informatica 8.X.

Confidential, Houston, TX

Teradata Developer

Responsibilities:

  • Implemented BTEQ and bulk-load jobs.
  • Involved in physical database design, database optimization, and performance tuning.
  • Debugged the problems when migrating from SQL Server to Teradata (Conversion of data types, Views, tables etc.).
  • Created complex mappings to compare two databases SQL Server and Teradata.
  • Developed FastLoad (FLoad), MultiLoad (MLoad), TPump, and BTEQ scripts to load data from various data sources and legacy systems into Teradata.
  • Used Workflow Manager for Creating, Validating, Testing and running the workflows and Sessions and scheduling them to run at specified time.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Analyzed production support documents to find feasible windows to run jobs.
  • Made performance changes (plan caching) to allow fast handling of transaction-processing requests.
  • Used Viewpoint to get alerts on long-running queries.
  • Created tables, multitable views, views, macros using TD DDL.
  • Created Logical Data Model from the Source System study according to Business Requirements.
  • Created PIs, SIs, PPIs, and AJIs for efficient database performance.
  • Created and executed test plans for the unit, integration, and system test phases.
  • Automated related tasks by developing UNIX shell scripts used to maintain the core EDW.
  • Worked closely with Project Managers, Business Analysts, and BI Architect, source system owners, Data Management/Data Quality team to ensure timely and accurate delivery of business requirements.
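A stored procedure for automated testing SQL, as mentioned above, might be sketched like this: a source-to-target row-count check that returns the difference through an OUT parameter. Procedure, database, and table names are hypothetical:

```sql
REPLACE PROCEDURE util_db.check_load_counts (OUT diff_cnt INTEGER)
BEGIN
  DECLARE src_cnt INTEGER;
  DECLARE tgt_cnt INTEGER;

  -- Count rows on each side of the load
  SELECT COUNT(*) INTO :src_cnt FROM stg_db.orders_stg;
  SELECT COUNT(*) INTO :tgt_cnt FROM core_db.orders;

  -- 0 means source and target row counts match
  SET diff_cnt = src_cnt - tgt_cnt;
END;
```

Called via `CALL util_db.check_load_counts(diff);`, a non-zero result flags a load discrepancy before user acceptance testing begins.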

Environment: Informatica Power Center 7.1.3, Teradata 12, Viewpoint, TASM, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Teradata SQL, SQL Server, Windows XP.

Confidential, Kansas City, KS

ETL Developer

Responsibilities:

  • Responsible for designing ETL strategy for both Initial and Incremental loads.
  • Responsible for writing program specifications for developing mappings.
  • Worked on collecting the data from pre-paid and post-paid source systems of Sprint Services.
  • Loaded data into Teradata using DataStage, FastLoad, BTEQ, FastExport, MultiLoad, and Korn shell scripts.
  • Analyzed business requirements, transformed data, and mapped source data using the Teradata Financial Services Logical Data Model tool, from the source system to the Teradata Physical Data Model.
  • Worked closely with the source team and users to validate the accuracy of the mapped attributes.
  • Troubleshot issues and created automatic script/SQL generators.
  • Sent statistics reports to senior management using pivot tables.
  • Helped the reporting team by providing Teradata queries.
  • Fulfilled ad-hoc requests from superiors.
  • Analyzed the specifications and identified the source data to be moved to the data warehouse.
  • Involved in extracting, cleansing, and transforming the data.
  • Responsible for writing PL/SQL procedures and functions to load the data marts.
  • Involved in tuning of SQL queries for better performance. Worked on database connections, SQL joins views and aggregate conditions.
  • Applied knowledge of Ab Initio software for data analysis, batch processing, and data manipulation.
  • Performed data quality analysis to determine cleansing requirements.
  • Involved in the performance tuning of Informatica mappings/sessions for large data files by increasing block size, data cache size, and commit interval.
  • Created mappings with different transformations, mapping parameters, and variables.
  • Used pmcmd commands in the UNIX scripts and was responsible for developing test cases for unit testing the Autosys jobs and UNIX scripts.
  • Worked on migration issues from development to testing environments and fixed the same.
  • Worked with the QA team to research on the issues they raised.
  • Responsible for documentation of projects
  • Worked on preparing a table of summarization and aggregation of the fact data.

Environment: Informatica 6.1, Teradata V2R5, SFTP, BTEQ, Queryman, MultiLoad, FastExport, Oracle9i, ERwin, PL/SQL, Perl, SQL*Loader, Visual Basic, SAS, Business Objects, Toad, Data Quality, Control-M, UNIX and Windows, SQL Server.
