Teradata Developer/DBA Resume
CA
SUMMARY:
- Over 11 years of professional IT experience in Teradata, Data Warehousing, RDBMS, Source System Analysis, data warehouse modeling, application design, development, testing and implementation for banking and financial projects.
- Strong experience in Teradata architecture, developing BTEQ scripts, parallel processing and utilities such as MultiLoad, FastExport, TPump and FastLoad.
- Extensive experience with Teradata Database versions 12, 13 and 14.
- ETL and data integration experience in designing and developing ETL mappings and scripts using Informatica PowerCenter 8.1/7.1 and PowerExchange 5.1.
- Experience in design and development of ETL processes from sources such as Teradata, Mainframe/DB2, Oracle, Sybase and flat files.
- Experience working with Teradata Parallel Transporter (TPT) utility and Temporal Feature.
- UNIX shell scripting (Korn and Bash) experience developing wrapper scripts and environment files for ETL jobs, and creating jobs and job streams (schedules) for daily runs.
- Strong proficiency in writing SQL (including ORACLE and Teradata).
- Good exposure to SQL optimization and performance tuning.
- Thorough knowledge of data warehouse concepts such as Star Schema, Snowflake Schema, dimension and fact tables, and Slowly Changing Dimensions.
- Hands-on experience with Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, Teradata Administrator and BTEQ.
- Strong knowledge of Teradata RDBMS Architecture (AMP, Parsing Engine, BYNET and Data distribution).
- Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, training, implementation and post - implementation review.
- Involved in understanding and translating business requirements into high-level and low-level designs for ETL processes in Teradata.
- Performed data validation, data integrity and data quality checks before delivering data to operations, business and financial analysts.
- Extensively worked on monitoring system capacity and growth, and used data compression techniques implemented in Teradata SQL to improve application performance.
- Extensive experience with the ETL tool Informatica in designing Workflows, Worklets and Tasks.
- Experience in optimizing and performance tuning of Mappings, and in implementing complex business rules by creating reusable Transformations, Mapplets and Tasks.
- Extensive knowledge of Teradata architecture and parallel processing
- Experience working with other ETL tools such as Pentaho Kettle and DataStage.
- Responsible for coordinating with multiple teams across vendors for ETL code development, testing, issue resolution and production.
- Developed ETL migration documents and operation manuals for migrating code across various environments.
- Extensive experience in Production Support of Data Warehousing System
- Extensive Knowledge of RDBMS concepts, PL/SQL, Stored Procedure and Normal Forms.
- Familiar with building end-to-end solutions, the onshore-offshore model, and multi-site development, system integration, migration, maintenance and support projects.
- Experience working on client-, customer- and user-facing projects.
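As an illustration of the load-utility work summarized above, a minimal FastLoad script follows this general shape (a hedged sketch only; all logon, file, database, table and column names here are hypothetical):

```sql
/* Hypothetical FastLoad script: bulk-load a pipe-delimited flat file
   into an empty staging table. FastLoad requires an empty target
   table and maintains two error tables of its own. */
LOGON tdprod/etl_user,password;

DROP TABLE stg_db.customer_stg_err1;
DROP TABLE stg_db.customer_stg_err2;

BEGIN LOADING stg_db.customer_stg
      ERRORFILES stg_db.customer_stg_err1, stg_db.customer_stg_err2
      CHECKPOINT 100000;

SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       open_dt   (VARCHAR(10))
FILE = /data/inbound/customer.dat;

INSERT INTO stg_db.customer_stg
VALUES (:cust_id, :cust_name, :open_dt);

END LOADING;
LOGOFF;
```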
TECHNICAL EXPERTISE:
Operating Systems: UNIX, Windows NT/ 2000/ XP
Database: Teradata 12.xx / 13.xx / 14.xx
Languages: Teradata SQL, PL/SQL, Shell Scripting
Teradata Utilities: MultiLoad, FastLoad, BTEQ, FastExport, TPump and TPT
Teradata Tools: Teradata SQL Assistant, Teradata Manager, PMON, Visual Explain
ETL tools: DataStage 8.5, Informatica PowerCenter 9.0/9.1/9.5
Process/Methodologies: Waterfall Methodology, Agile Methodology
Project Planning: Microsoft Project
MS Office Applications: Word, Excel, PowerPoint, Visual Basic, SharePoint
Advanced Excel Skills: Pivot tables, VLookup, HLookup, IF statements, List functions
Testing Tools: ClearQuest, JIRA
PROFESSIONAL EXPERIENCE:
Confidential, CA
Teradata Developer/DBA
Responsibilities:
- Interact with business systems as appropriate and gather requirements.
- Developing MLOAD scripts for loading the data from flat files to the staging tables.
- Developing BTEQ scripts to load the data from the staging tables to the base tables.
- Developed Teradata Macros and Stored Procedures to load data into Incremental/Staging tables, move it from Staging to Journal, and then from Journal into Base tables.
- Working with utilities like BTEQ, MLOAD, FLOAD and TPump.
- Coded BTEQ scripts to meet client requirements.
- Responsible for coordinating with business systems and analysts in the requirement-gathering process and creating technical specification documents.
- Gathering requirements from the various systems business users.
- Responsible for project management and planning.
- Co-ordinate with offshore and onshore on regression testing.
- Responsible for data validation in test data load support; data loads are processed to provide data for downstream testing.
- Involved in performance tuning by implementing secondary indexes, join indexes and compression, collecting statistics, and checking the explain plan and skew factor.
- Provided support during the system test and user acceptance testing.
- Constructed DDL's for staging and target view/table creation with proper usage of UPI/NUPI.
- Suggested better primary index choices and other optimization techniques to the team for faster access to data.
- Made performance changes (plan caching) to allow fast handling of transaction-processing requests.
- Analyzing production support documents and finding feasible windows to run jobs.
- Document requirements and business scenarios
- Develop applications using Teradata, Flume and Manual ETL Jobs
- Test and Roll out application(s).
- Provide daily support to the application and proactively deliver immediate solutions to operational, functional and data-related issues.
- Monitor the daily, weekly, and monthly routines and fix any issues.
- Analyze the data quality issues and discuss solutions with the functional teams.
- Take proactive measures to avoid repetitive issues.
- Worked on data model changes, query tuning, changing primary and secondary indexes, and creating joins and indexes using BTEQ, FastLoad, MultiLoad and FastExport.
Environment: Teradata 13.10, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ, TPT), Viewpoint 15.00, UNIX, ER/Studio.
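The staging-to-base BTEQ loads described in this role typically follow a pattern like the sketch below (database, table and column names are hypothetical):

```sql
.LOGON tdprod/etl_user,password;

/* Stop the script with a non-zero return code if the logon failed */
.IF ERRORCODE <> 0 THEN .QUIT 1;

/* Insert only rows from staging that are not yet in the base table */
INSERT INTO base_db.account
SELECT s.acct_id, s.acct_type, s.open_dt
FROM   stg_db.account_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   base_db.account b
                   WHERE  b.acct_id = s.acct_id);

.IF ERRORCODE <> 0 THEN .QUIT 2;

/* Refresh optimizer statistics after the load */
COLLECT STATISTICS ON base_db.account COLUMN (acct_id);

.LOGOFF;
.QUIT 0;
```

The `.IF ERRORCODE` checks let a wrapper shell script detect a failed step from the BTEQ return code and abort the job stream.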
Confidential, San Rafael, CA
Teradata Developer
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Expertise in writing scripts for Data Extraction, Transformation and Loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad and TPump.
- Defined the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
- Performed Query Optimization with the help of explain plans, collect statistics, Primary and Secondary indexes.
- Used volatile table and derived queries for breaking up complex queries into simpler queries. Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
- Developed Teradata BTEQ scripts to implement the business logic and work on exporting data using Teradata FastExport.
- Extensively used Parameter files to override Mapping Parameters, Mapping Variables, Workflow Variables, Session Parameters, FTP Session Parameters and Source/Target Application Connection parameters.
- Used Constraint Based loading & Target load ordering to efficiently load tables with PK-FK relation in the same mapping.
- Architected and developed FastLoad and MultiLoad scripts; developed Macros and Stored Procedures to extract data, and BTEQ scripts that take a date range from the database for extraction.
- Performed reverse engineering of physical data models from databases and SQL scripts.
- Provided database implementation and database administrative support for custom application development efforts.
- Prepared technical specifications to develop Informatica ETL mappings to load data into various tables conforming to the business rules.
- Involved in comprehensive end-to-end testing- Unit Testing, System Integration Testing, User Acceptance Testing and Regression.
- Wrote Shell scripts and Stored Procedures for regular Maintenance and Production Support to load the warehouse in regular intervals and to perform Pre/Post Session Actions.
- Provided 24/7 on-call production support for various applications, resolved nighttime production job abends, and attended conference calls with business operations and system managers for resolution of issues.
Environment: Informatica PowerCenter (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Oracle 10g/11g, Teradata 14, UNIX, Citrix, Toad, PuTTY.
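The FastExport work mentioned in this role can be sketched as follows (a hedged illustration; log table, file path, table and column names are hypothetical):

```sql
/* Hypothetical FastExport script: extract a date range of
   transactions to a flat file using parallel export sessions. */
.LOGTABLE work_db.txn_fexp_log;
.LOGON tdprod/etl_user,password;

.BEGIN EXPORT SESSIONS 4;

.EXPORT OUTFILE /data/outbound/txn_extract.dat;

SELECT txn_id, acct_id, txn_amt, txn_dt
FROM   base_db.transaction_fact
WHERE  txn_dt BETWEEN DATE '2014-01-01' AND DATE '2014-01-31';

.END EXPORT;
.LOGOFF;
```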
Confidential, CA
Teradata Developer
Responsibilities:
- Involved in requirements gathering, business analysis, design and development, testing and implementation.
- Performed tuning and optimization of complex SQL queries.
- Worked on Teradata Stored Procedures
- Extracted data into Teradata staging from an FTP site using the Informatica ETL tool.
- Implemented Type 2 slowly changing dimensions using dates.
- Worked effectively with volatile and Global Temporary tables
- Used Parameter files to define values for parameter and variable used in the mappings and sessions.
- Created tables, indexes, views and stored procedures according to the requirements.
- Unit tested own and other developers' mappings before deploying the code to different environments.
- Handled UNIX operating system tasks by generating Pre-and Post-Session UNIX Shell Scripts.
- Called stored procedures in PL/SQL for certain key business requirements.
- Effectively utilized PRIMARY INDEX, Secondary Index and PPI.
- Involved in Unit and Integration Testing of the Data Warehouse.
- Wrote, tested and implemented Teradata FastLoad and BTEQ scripts, and DDL and DML statements.
- Prepared Technical design documents based on Functional design documents
- Populated or refreshed Teradata tables using FastLoad, MultiLoad and FastExport utilities for user acceptance testing and for loading history data into Teradata.
- Created Views as part of developing semantic layer.
- Tuned existing queries by creating join indexes and worked on query optimization
- Wrote BTEQ scripts containing complex transformation logic.
- Worked with Volatile tables and global temporary tables to debug issues.
- Worked on production issues and as well as change requests
- Prepared project related and process related functional documents
- Involved in daily Scrum meetings and in technical review meetings.
- Performed unit testing and supported deployment activities.
Environment: Teradata 12.0, Teradata Priority Scheduler, Teradata SQL Assistant, FastLoad, MultiLoad.
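The query-tuning cycle described in this role (explain plan, collect statistics, join indexes) can be illustrated with a short sequence like this (table, column and index names are hypothetical):

```sql
/* 1. Inspect the optimizer's plan for redistribution or product joins */
EXPLAIN
SELECT c.cust_name, SUM(t.txn_amt)
FROM   base_db.customer c
JOIN   base_db.transaction_fact t ON t.cust_id = c.cust_id
GROUP BY c.cust_name;

/* 2. Give the optimizer fresh demographics on the join columns */
COLLECT STATISTICS ON base_db.transaction_fact COLUMN (cust_id);
COLLECT STATISTICS ON base_db.customer COLUMN (cust_id);

/* 3. Pre-join the tables so the optimizer can cover the query
   from the join index instead of the base tables */
CREATE JOIN INDEX base_db.cust_txn_ji AS
SELECT c.cust_id, c.cust_name, t.txn_amt
FROM   base_db.customer c
JOIN   base_db.transaction_fact t ON t.cust_id = c.cust_id
PRIMARY INDEX (cust_id);
```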
Confidential, Union City, CA
Teradata/ETL Developer
Responsibilities:
- Created BTEQ Scripts
- Worked with temporary tables like Global temporary table and Volatile table
- Worked with all major indexes like Secondary index, PPI
- Worked with FLOAD to load data for testing purposes
- Worked with FEXP to export data from different environments
- Created shell scripts
- Created process related documents
- Did Unit testing before migrating the code to different environments.
- Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes.
- Designed and developed ETL Mappings using Informatica to extract data from flat files and Oracle.
- Extensively worked on Teradata SQL using BTEQ scripts and Teradata SQL Assistant
- Worked effectively with Volatile table and Global temporary tables
- Tuned SQL using explain plans, collecting statistics and rewriting existing SQL queries
- Created tables, views, Indexes as per user requirements
- Worked on production issues and change requests, and took care of the deployment process.
- Involved in writing ETL queries to validate the data as per mapping specifications.
- Validation of star schema using SQL
- Validation of Record count between Source and Target
- Involved in writing Regression Queries.
- Involved in writing smoke test cases, PK test cases and Business Test cases.
- Involved in bench marking of policies from production.
- Involved in client status calls and defects logging in Quality center.
Environment: Teradata, SQL Assistant 12.0, Teradata utilities - BTEQ, FastExport, FastLoad, Viewpoint
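The volatile-table work mentioned above usually follows a pattern like this sketch, which materializes an intermediate result to break up a complex query (table and column names are hypothetical):

```sql
/* Hypothetical volatile table holding an intermediate result.
   It lives only for the session and needs no explicit cleanup. */
CREATE VOLATILE TABLE vt_active_policy AS
(
  SELECT policy_id, cust_id, premium_amt
  FROM   base_db.policy
  WHERE  status_cd = 'A'
) WITH DATA
PRIMARY INDEX (policy_id)
ON COMMIT PRESERVE ROWS;

/* The intermediate result can now be joined cheaply
   and inspected directly when debugging */
SELECT c.cust_name, v.premium_amt
FROM   vt_active_policy v
JOIN   base_db.customer c ON c.cust_id = v.cust_id;
```

`ON COMMIT PRESERVE ROWS` is what keeps the rows available across statements within the session; a global temporary table would instead persist its definition (but not its rows) across sessions.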
Confidential, Sturtevant, WI
Teradata/ETL Developer
Responsibilities:
- Analyzed existing stored procedures in DB2 and translated them into Teradata stored procedures, BTEQ scripts and TPT scripts.
- Analyzed existing Informatica code against DB2 and converted the mappings and sessions to the Teradata database.
- Expert in building Red Hat Linux physical and virtual servers; upgraded Linux servers from one OS release to another.
- Good experience in bulk data integration and transformation, real-time data integration and replication, and data quality and governance using Oracle Data Integrator (ODI).
- Experience in implementation and maintenance of VMware, DNS, DHCP, NIS, NFS and SMTP.
- Develop framework for the Data Loads, Audits and Controls.
- Used Teradata Utility tool TARDIS for ETL works from different sources to Teradata.
- Used Teradata utilities FastLoad, Multiload, Tpump to load data from various source systems.
- Created Control-M Jobs for scheduling ETL jobs.
- Utilized Informatica Data Quality (IDQ) for data profiling and matching/removing duplicate data, fixing the bad data, fixing NULL values.
- Responsible for tuning ETL procedures and STAR schemas to optimize load and query performance.
- Extracted data from Flat files and Oracle database and loaded them into Teradata.
- Worked on the deployment process to push code to other environments using Jenkins, Stash, BOB and Perforce.
- Designed, developed, and implemented optimal ETL solutions for automation of client information transformation and movement, using Talend.
- Developed technical Best practices for ETL related activities, including client data movement, quality, and cleansing using Talend.
- Worked on Hive (NoSQL) to transform data for loading into Teradata.
- Worked with Informatica IDQ, applying various data profiling techniques to cleanse and match/remove duplicate data using IBM InfoSphere.
- Provided support in migration of existing ETL framework to Elastic Map Reduce and Hive (Cloud) technology to improve scaling and performance.
- Work closely with DBAs, application, database and ETL developers, change control management for migrating developed mappings to PROD.
Environment: Teradata 15, Informatica IDQ, SQL Server, DB2, Control-M, Hadoop-hive, Business Objects, UNIX, JIRA.
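The DB2-to-Teradata stored procedure conversions in this role can be sketched as a minimal Teradata stored procedure (database, table and parameter names are hypothetical):

```sql
/* Sketch of a DB2 procedure rewritten in Teradata SPL:
   load one day's journal rows and return the row count. */
REPLACE PROCEDURE etl_db.load_journal
  (IN p_load_dt DATE, OUT p_rows INTEGER)
BEGIN
  /* Parameters are referenced with a colon prefix inside DML */
  INSERT INTO base_db.journal (acct_id, txn_amt, load_dt)
  SELECT acct_id, txn_amt, :p_load_dt
  FROM   stg_db.journal_stg
  WHERE  load_dt = :p_load_dt;

  /* ACTIVITY_COUNT holds the row count of the last DML statement */
  SET p_rows = ACTIVITY_COUNT;
END;
```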
Confidential, Boston, MA
ETL Developer
Responsibilities:
- Designed ETL specification documents for all the projects.
- Created Tables, Keys (Unique and Primary) and Indexes in the Database.
- Extracted data from flat files, DB2, SQL Server and Oracle to build an Operational Data Source, and applied business logic to load the data into the Global Data Warehouse.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Created Universes, Classes and Objects for management system based on the warehouse schema.
- Developed Complex Reports (Used multiple data providers, Union, Intersection, minus and Master/Detail, cross tab, Charts).
- Worked with other team members and performed the knowledge transfer.
- Created Ad hoc Reports using Full Client in Business Objects.
- Developed various full client and thin client reports using Business Objects and Web Intelligence.
- Used Informatica designer for developing mappings, transformations, which includes aggregation, Updating, lookup, and summation.
- Provided technical assistance/support to analysts and the business community, and responded to inquiries regarding errors, problems or questions with programs/interfaces.
Environment: Informatica PowerCenter, Business Objects (Full Client, Designer, WebI, Supervisor), InfoView, Web Intelligence, Oracle DBMS, SQL Server, DB2.
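The Slowly Changing Dimension work in this role was done through Informatica mappings; in plain SQL the Type 2 pattern amounts to expiring the current row and inserting a new one (a hedged sketch; table and column names are hypothetical):

```sql
/* Hypothetical SCD Type 2 maintenance: close out the current
   version of each changed customer row... */
UPDATE base_db.customer_dim
SET    end_dt = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  cust_id IN (SELECT cust_id FROM stg_db.customer_stg)
AND    current_flag = 'Y';

/* ...then insert the new version with an open-ended effective range */
INSERT INTO base_db.customer_dim
  (cust_id, cust_name, start_dt, end_dt, current_flag)
SELECT cust_id, cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_db.customer_stg;
```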
