Teradata Developer Resume
Professional Summary:
- Over six years of IT experience in Teradata, Informatica, DataStage, SQL, PL/SQL and Unix shell scripting.
- Extensive experience with the Teradata database, analyzing business needs of clients, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.
- Expertise in maintaining data quality, data organization, metadata and data profiling.
- Experience in Business Analysis and Data Analysis, User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
- Extensive experience with development, testing, debugging, implementation, documentation and production support.
- Expert in Dimensional Data Modeling using Star and Snowflake Schema Modeling, Fact & Dimension tables, and Logical & Physical Data Modeling with ERwin and Oracle Designer.
- Proficient in creating reports using Business Objects XI R2 functionalities such as Queries, Master/Detail and Formula, Slice and Dice, Drilling, Cross Tab and Charts.
- Solid expertise in Oracle Stored Procedures, Triggers, Indexes and Table Partitions, and experienced in loading data from sources such as flat files, XML files, Oracle, DB2 and SQL Server into Data Warehouses/Data Marts using Informatica.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support and Unix shell scripting.
- Proficient in coding of optimized Teradata batch processing scripts for data transformation, aggregation and load using BTEQ.
- Expertise in RDBMS, database Normalization and Denormalization concepts and principles.
- Strong experience in Creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes, Triggers, Cursors in Teradata.
- Strong skills in coding and debugging Teradata utilities like FastLoad, FastExport, MultiLoad and TPump for Teradata ETL processing of high data volumes.
- Created tables and views based on the layout sent by Clients.
- Sound knowledge of Data Warehousing concepts, E-R modeling (3NF) & Dimensional modeling like Star Schema and Snowflake Schema, database architecture for OLTP and OLAP applications, Data Analysis and ETL processes.
- Experienced with the Informatica PowerCenter Repository from 6.2 to 9.0.1.
- Strong experience with the ETL tools Informatica 8.1/7.1/6.2, Ab Initio and DataStage 8.0.1.
- Created mapping documents, workflows and data dictionaries.
- Good knowledge of Data Warehouse concepts and principles (Kimball/Inmon) – Star Schema, Snowflake, SCD, Surrogate Keys, Normalization/De-normalization.
- Experienced in data modeling and reverse engineering using ERwin, Microsoft Visio and Oracle Designer.
- Designed and modeled many Data Marts as per business requirements.
- Well versed in Oracle 9i/10g/11g and SQL Server 2005/2008, Analytical Functions, Database Design, Query Optimization and Performance Tuning.
- Experience with IMPORT/EXPORT, Data Pump, SQL*Loader and built-in packages in Oracle.
- Worked extensively in development of large projects with complete end-to-end participation in all areas of the Software Development Life Cycle, and maintained documentation.
- Quick adaptability to new technologies and zeal to improve technical skills.
- Good analytical, programming, problem solving and troubleshooting skills.
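For illustration, a BTEQ batch script of the kind referenced above (optimized transformation, aggregation and load) might be sketched as follows; the logon, database, table and column names are hypothetical placeholders, not from any actual engagement:

```sql
.LOGON tdprod/etl_user,etl_password;

-- Route errors to standard output for the batch log
.SET ERROROUT STDOUT;

-- Aggregate daily transactions into a weekly summary (illustrative names)
INSERT INTO edw.weekly_sales_summary
SELECT store_id
     , item_id
     , SUM(sale_amt) AS total_sale_amt
     , COUNT(*)      AS txn_count
FROM   staging.daily_sales
GROUP  BY store_id, item_id;

-- Abort with a non-zero return code so the scheduler can react to failures
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```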
ETL tools-
Informatica Power Center 9.0.1/8.6.1/7.x/6.x
Programming Language-
Teradata SQL, PL/SQL, ASE isql, C, C++
Databases-
Teradata (V2R12, V2R6, V2R5), SQL Server 2000, Oracle (PL/SQL)
Teradata Tools & Utilities-
TASM, BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant, Teradata Administrator, TSET, Index Wizard, Statistics Wizard.
Data Modeling Tools-
ERWIN
Migration Tools-
Data Mover.
Process/Methodologies-
Waterfall, Agile Methodology
Professional Experience:
Confidential, Quincy, MA Feb 2011 – Present
Teradata Developer
Description:
The project involved migrating data from source systems into the EDW (Enterprise Data Warehouse). This process includes extraction of data from source systems, applying transformations, and loading the data after query tuning and performance checks. The extraction, transformation, comparison and loading of the data from DB2 into the Data Warehouse is done using the Teradata client utilities FastLoad and MultiLoad. Coding of the processes is done using Teradata SQL and BTEQ scripts. These processes are scheduled to run daily, weekly or monthly. Teradata SQL and the client utilities played a significant role in migrating the data to the Data Warehouse and achieving the expected gains.
Responsibilities:
- Performed data analysis and gathered column metadata of source systems for requirement feasibility analysis.
- Created a Logical Data Flow Model from the source system study according to business requirements in MS Visio.
- Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
- Created UML Diagrams including Use Cases Diagrams, Activity Diagrams/State Chart Diagrams, Sequence Diagrams, Collaboration Diagrams and Deployment Diagrams, Data Flow Diagrams (DFDs), ER Diagrams and Web Page Mock-Ups using Smartdraw, MS Visio & Rational Rose.
- Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
- Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
- Worked on optimizing and tuning Teradata views and SQL queries to improve batch performance and response time of data for users.
- Worked closely with analysts to come up with detailed solution approach design documents.
- Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
- Prepared low level technical design document and participated in build/review of the BTEQ Scripts, FastExports, Multiloads and Fast Load scripts, Reviewed Unit Test Plans & System Test cases.
- Provided support during the system test, Product Integration Testing and UAT.
- Verified that implementation was done as expected, i.e., checked that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
- Performed impact assessment in terms of schedule changes, dependency impact and code changes for various change requests on existing Data Warehouse applications running in a production environment.
- Provided quick production fixes and proactively involved in fixing production support issues.
- Strong knowledge of Data Mover for importing and exporting data.
- Created and maintained source-target mapping documents for the ETL development team.
- Provided requirement specifications and guided the ETL team in development of ETL jobs through the Informatica ETL tool.
- Developed test cases and performed testing.
- Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
- Analyzed business requirements and wrote technical specifications to design/redesign solutions.
- Involved in the complete software development life cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.
- Developed technical design documents (HLD and LLD) based on the functional requirements.
- Coordinated with the Configuration Management team on code deployments.
Environment: Teradata 12, Teradata 5500, SQL Assistant 12.0, Business Objects, Hyperion, BTEQ, FastLoad, FastExport, MultiLoad, Korn shell, MS Visio, Remedy, Kintana, Data Mover.
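A FastLoad script along the lines used for the staging loads in this project could look like the sketch below; the file layout, logon and object names are invented for illustration:

```sql
-- FastLoad sketch: bulk-load a pipe-delimited extract into an empty staging table
SESSIONS 4;
.LOGON tdprod/etl_user,etl_password;

DROP TABLE staging.customer_stg;
DROP TABLE staging.customer_err1;
DROP TABLE staging.customer_err2;

CREATE TABLE staging.customer_stg
( cust_id   INTEGER
, cust_name VARCHAR(100)
, cust_city VARCHAR(50)
) PRIMARY INDEX (cust_id);

-- Input file is delimited text; FastLoad reads every field as VARCHAR
SET RECORD VARTEXT "|";

DEFINE in_cust_id   (VARCHAR(11))
     , in_cust_name (VARCHAR(100))
     , in_cust_city (VARCHAR(50))
FILE = /data/in/customer_extract.dat;

BEGIN LOADING staging.customer_stg
      ERRORFILES staging.customer_err1, staging.customer_err2;

INSERT INTO staging.customer_stg
VALUES ( :in_cust_id, :in_cust_name, :in_cust_city );

END LOADING;
.LOGOFF;
```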
Confidential, Stamford, CT Aug 2010 – Feb 2011
Teradata Developer
Description:
Confidential is engaged in the research, development, production, sales, and licensing of prescription and non-prescription (over-the-counter) medicines and hospital products. The company seeks to continually develop and sell innovative, high-quality pharmaceutical preparations, in conjunction with clinical data and medical education, to create more value for patients and the healthcare system.
Responsibilities:
- Created UML Diagrams including Use Cases Diagrams, Activity Diagrams/State Chart Diagrams, Sequence Diagrams, Collaboration Diagrams and Deployment Diagrams, Data Flow Diagrams (DFDs), ER Diagrams and Web Page Mock-Ups using Smartdraw, MS Visio & Rational Rose.
- Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
- Worked closely with analysts to come up with detailed solution approach design documents.
- Prepared low level technical design document and participated in build/review of the BTEQ Scripts, FastExports, Multiloads and Fast Load scripts, Reviewed Unit Test Plans & System Test cases.
- Worked with complex SQL queries to test the data generated by the ETL process against the target database.
- Used SQL Assistant to query Teradata tables.
- Performed impact assessment in terms of schedule changes, dependency impact and code changes for various change requests on existing Data Warehouse applications running in a production environment.
- Provided quick production fixes and proactively involved in fixing production support issues.
- Strong knowledge of Data Mover for importing and exporting data.
- Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
- Analyzed business requirements and wrote technical specifications to design and redesign solutions.
- Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Extensively used transformations in Informatica PowerCenter and DataStage ETL environments to move data.
- Involved in the complete software development life cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.
- Coordinated with the Configuration Management team on code deployments.
Environment: Teradata 12, SQL Assistant 12.0, Business Objects, BTEQ, FastLoad, FastExport, MultiLoad, Korn shell, MS Visio, Data Mover.
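The complex SQL queries mentioned above for testing ETL output against the target database can be illustrated with a simple stage-versus-target reconciliation; the table and column names are hypothetical:

```sql
-- Rows present in stage but missing from the target (should return no rows)
SELECT s.claim_id
FROM   staging.claims_stg s
MINUS
SELECT t.claim_id
FROM   edw.claims t;

-- Row-count comparison between stage and target
SELECT (SELECT COUNT(*) FROM staging.claims_stg) AS stage_cnt
     , (SELECT COUNT(*) FROM edw.claims)         AS target_cnt;
```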
Confidential, Souderton, PA. Apr 2009 – Jul 2010
Teradata Developer
Description:
Confidential specializes in the implementation of interactive trial management solutions. Confidential serves both the pharmaceutical and biotechnology industries, managing multiple facets of clinical trials. Confidential's clinical technology enables clients to expedite clinical trial development on a global scale through drug reconciliation, advanced randomization schemes, real-time enrollment data, and efficient clinical trial material (CTM) management. The project was implemented to integrate the data sources and also involved clinical data samples.
Responsibilities:
- Prepared Business requirement documents, ETL specification documents according to the business requirements from clients.
- Followed the SDLC Agile methodology.
- Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
- Created Informatica Mappings and TPT Scripts to load Medical, Eligibility and Pharmacy claims from flat file to table.
- Worked with TPT wizards to generate the TPT scripts for the Incoming Claims data.
- Worked on Teradata Multi-Load, Teradata Fast-Load utility to load data from Oracle and SQL Server to Teradata.
- Tuned SQL queries to overcome spool space errors and improve performance.
- Worked extensively in tuning queries by eliminating cursor-based stored procedures and using set-based UPDATE statements.
- Worked on the conversion of SQL Server functions into Teradata stored procedures for conforming the data.
- Designed the Data Quality engine using Dynamic SQL execution.
- Designed the unique key combination using various fields and joined the tables for reporting purposes.
- Worked with BTEQ in the UNIX environment and executed TPT scripts from the UNIX platform.
- Also involved in creating views and conformed views and mapping them to the tables.
- Created a TPT script template for all incoming medical, eligibility and pharmacy flat files.
- All Submission files were designed according to HIPAA standards.
- Conducted design review meetings with the senior architect for any new requirement to get the approval and designed the mappings according to the company standards.
- Used MS Visio to explain the architecture and mapping to the clients.
- Worked with complex SQL queries to test the data generated by the ETL process against the target database.
- Used SQL Assistant to query Teradata tables.
- Designed complex mappings involving target load order and constraint-based loading.
- Performance tuning, bug fixing and error handling.
- Involved in creating tables for medical, pharmacy and eligibility claim files based on the requirements from clients and also from the layouts.
- Designed testing procedures and test plans. Provided Production support.
Environment: Teradata 12.0/8.5, MS SQL Server, Informatica 8.6, TPT, FastLoad, MultiLoad, UNIX, Toad, TPT Script Wizard 13.0, Teradata Administrator, BTEQ, EditPlus, UltraEdit.
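A TPT script template of the kind described for the incoming claims files might be sketched as below; the schema, file names and credentials are invented placeholders, and a real template would carry the full claim layout:

```sql
/* TPT job sketch: load a pipe-delimited pharmacy claims extract into staging */
DEFINE JOB load_pharmacy_claims
DESCRIPTION 'Load pharmacy claims flat file into staging'
(
  DEFINE SCHEMA claims_schema
  ( claim_id  VARCHAR(18)
  , member_id VARCHAR(12)
  , claim_amt VARCHAR(15)
  );

  /* Producer: read the delimited flat file */
  DEFINE OPERATOR file_reader
  TYPE DATACONNECTOR PRODUCER
  SCHEMA claims_schema
  ATTRIBUTES
  ( VARCHAR FileName      = 'pharmacy_claims.dat'
  , VARCHAR DirectoryPath = '/data/in/'
  , VARCHAR Format        = 'Delimited'
  , VARCHAR TextDelimiter = '|'
  );

  /* Consumer: FastLoad-style bulk load into the staging table */
  DEFINE OPERATOR loader
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  ( VARCHAR TdpId        = 'tdprod'
  , VARCHAR UserName     = 'etl_user'
  , VARCHAR UserPassword = 'etl_password'
  , VARCHAR TargetTable  = 'staging.pharmacy_claims_stg'
  , VARCHAR ErrorTable1  = 'staging.pc_err1'
  , VARCHAR ErrorTable2  = 'staging.pc_err2'
  );

  APPLY ('INSERT INTO staging.pharmacy_claims_stg
            ( :claim_id, :member_id, :claim_amt );')
  TO OPERATOR (loader)
  SELECT * FROM OPERATOR (file_reader);
);
```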
Confidential, Germantown, MD Oct 2008 - Mar 2009
Teradata Developer
Description:
DMCS (Debt Management and Collections Service)
The Collections Subsystem provides the Department of Education (ED) with a means of inquiring into the current status of accounts and debts. A collector may determine account balances, repayment status, Internal Revenue Service (IRS), Department of Justice (DOJ), and Federal Defaulter status of an account. The subsystem may also be used to send letters, to review the letter history of an account, and to request address data from the Health and Human Services (HHS) National Directory of New Hires (NDNH) database for both Federal Family Education Loan (FFEL) and guaranty agency (GA) accounts.
Responsibilities:
- Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
- Created tables, views, Macros in Teradata, according to the requirements.
- Used Error handling strategy for trapping errors in a mapping and sending errors to an error table.
- Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
- Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, MultiLoad and FastLoad.
- Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views, and to verify data in target tables.
- Performed tuning and optimization of complex SQL queries using Teradata Explain.
- Responsible for collecting statistics on FACT tables.
- Design and development of the complete Decision Support System using Business Objects.
- Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
- Used Teradata Data Mover for overall data management capabilities for copying indexes, global temporary tables.
- Performance tuning of sources, targets, mappings and SQL queries in transformations; designing, creating and tuning physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models.
- Created proper Primary Indexes taking into consideration both planned access of data and even distribution of data across all the available AMPs.
- Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
- Used volatile tables and derived queries for breaking up complex queries into simpler queries.
- Created a cleanup process for removing all the intermediate temp files that were used prior to the loading process.
- Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
- Developed UNIX shell scripts to run batch jobs in production.
- Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
Environment: Teradata V12, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FastExport, UNIX shell scripts, Query Analyzer, TOAD, Windows XP, Control-M scheduling tool, Data Mover.
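Breaking a complex query into simpler steps with a volatile table, as described above, can be illustrated with the sketch below; the database, table and column names are hypothetical:

```sql
-- Pre-aggregate into a volatile table instead of repeating the aggregate inline
CREATE VOLATILE TABLE vt_open_debts AS
(
  SELECT account_id
       , SUM(debt_amt) AS total_debt
  FROM   dmcs.debt
  WHERE  debt_status = 'OPEN'
  GROUP  BY account_id
) WITH DATA
PRIMARY INDEX (account_id)
ON COMMIT PRESERVE ROWS;

COLLECT STATISTICS ON vt_open_debts COLUMN (account_id);

-- The final query joins the simpler intermediate result
SELECT a.account_id
     , a.repayment_status
     , v.total_debt
FROM   dmcs.account  a
JOIN   vt_open_debts v
  ON   a.account_id = v.account_id;
```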
Confidential Sep 2006 – Sep 2008
Teradata Developer
Description:
Confidential's drive to become as good in the Non-Food business as it is in Food has initiated a lot of thinking in management. As part of these initiatives from the business, MIS has played a crucial role in supporting them through IT. Currently the Non-Food business has the best forecast system for management in the UK, called TRAIN (Tesco Reporting Adhoc Information for Non-Food), to cope with demand. Project Scale marks a change in the way the Non-Food business runs in the Republic of Ireland. With Project Scale the Tesco International Non-Food retail business will be operated the same way as in the UK, making ROI an integral part of the UK operations.
Responsibilities:
- Designing flow of data from daily tables to weekly tables.
- Coding of Teradata scripts and main frame jobs.
- Preparing unit and System integration test cases.
- Involved in setting up of data for UAT environment and supporting.
Environment: Teradata V2R5.1, Business Objects 6.5, CA ERwin 4.1, VB6, BTEQ, FastLoad, FastExport, MultiLoad, JCL, TSO/ISPF, SDSF, Rational ClearCase, ClearQuest and Endevor.