ETL Teradata Developer/Data Warehousing Analyst Resume
Lincolnshire, IL
SUMMARY
- Nine years of experience in information technology, with expertise in the Teradata database in data warehousing environments and in Oracle PL/SQL.
- More than 8 years of experience as a Teradata Developer in data warehousing environments.
- Thorough knowledge of Oracle and Teradata RDBMS architecture.
- Extensive use of data loading and export tools such as FastLoad, MultiLoad, FastExport, and TPump.
- Expertise in report formatting, batch processing, data loading, and export using BTEQ.
- Knowledge of query performance tuning using EXPLAIN, COLLECT STATISTICS, compression, NUSIs, and join indexes, including sparse join indexes.
- Well versed with the Teradata Analyst Pack, including the Statistics Wizard, Index Wizard, and Visual Explain.
- Profound knowledge of and experience in dimensional modeling and 3NF using the modeling tool Erwin.
- Extracted source data from mainframe z/OS into a UNIX environment using JCL scripts and SQL, and created formatted reports for business users with BTEQ scripts.
- Automated BTEQ report generation on weekly and monthly schedules using UNIX scheduling tools.
- Developed UNIX shell scripts and used the BTEQ, FastLoad, MultiLoad, and FastExport utilities extensively to load target databases.
- Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc extracts run for business users on a scheduled basis (a representative sketch follows this list).
- Collected statistics on important tables to get better plans from the Teradata Optimizer.
- Tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
- Used derived tables, volatile tables, and global temporary tables extensively in many BTEQ scripts.
- Well versed in reading EXPLAIN plans and optimizer confidence levels; very good understanding of database skew.
- Involved in designing and building stored procedures and macros for the module.
- Involved in troubleshooting, performance tuning, and performance monitoring.
- Skilled in data warehouse data modeling using star and snowflake schemas.
- Created UNIX shell scripts for Autosys scheduling.
- Sourced data from disparate systems (IBM DB2, Oracle, and SQL Server) and loaded it into Oracle and Teradata data warehouses.
- Involved in various stages of the software development life cycle.
- Experience with operating systems such as UNIX and Windows.
- Developed UNIX shell scripts for batch processing.
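A minimal BTEQ sketch of the kind of scheduled ad hoc extract described above, using a volatile table; the logon placeholders and all database, table, and column names are illustrative assumptions, not actual project code:

    .LOGON tdpid/user,password

    /* Illustrative volatile table holding an intermediate aggregate */
    CREATE VOLATILE TABLE vt_sales_summary AS (
        SELECT region_id, SUM(sale_amt) AS total_sales
        FROM   sales_db.daily_sales
        GROUP  BY region_id
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;

    /* Export a formatted report for business users */
    .EXPORT REPORT FILE = weekly_sales_report.txt
    SELECT region_id, total_sales
    FROM   vt_sales_summary
    ORDER  BY total_sales DESC;
    .EXPORT RESET

    .LOGOFF
    .QUIT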
TECHNICAL SKILLS
Databases: Teradata V2R6/V12/V13/V14, DB2, Oracle 8.x/9.x/10g
ETL Tools: Ab Initio (GDE 1.15, Co>Op 2.15), EME, Informatica
Reporting Tools: Business Objects 6.x / 5.x, Seagate Crystal Reports 6.x / 5.x / 4.x / 3.x, SQR 6.x, Forms 6i, Reports 6i, and MS Access Reports.
Programming: PL/SQL, VBA, UNIX Shell Scripting, Teradata SQL, ANSI SQL, Transact-SQL, C, C++, Java, HTML, XML, COBOL, JCL
O/S: NCR UNIX 00 / 5500, Sun Solaris 2.6 / 2.7, HP - UX, IBM AIX 4.2 / 4.3, MS-DOS 6.22, Novell NetWare 4.11 / 3.61
PROFESSIONAL EXPERIENCE
Confidential, Lincolnshire, IL
ETL Teradata Developer/Data Warehousing Analyst
Responsibilities:
- Interact with current and potential business and IT partners to effectively plan and optimize the porting of their applications and tools to Teradata.
- Create MultiLoad, FastLoad, FastExport, and BTEQ scripts on the mainframe (see the FastLoad sketch after this list).
- Maintained database objects required for the data warehouse application, including tables, indexes, stored procedures, triggers, SQL scripts, and views.
- Fine-tuned SQL to improve query performance.
- Used Teradata Viewpoint to create and maintain sessions, and to monitor, edit, schedule, copy, abort, and delete them.
- Use JCL scripting to design, develop, test, and support the Extract, Transform, and Load (ETL) processes necessary to load and validate the data warehouse.
- Defined, documented, and executed test cases to effectively validate new components of the data warehouse being developed, enhanced, or otherwise maintained.
- Used ETL mapping and transformation techniques to process data from various sources into Teradata target tables.
- Evaluated, planned, managed, tracked, and provided status on system maintenance, enhancement, and support activities at a task level.
- Migrated tables from SQL Server and DB2 databases to Teradata.
- Hands-on use of CA Workstation to monitor the daily load jobs.
- Knowledge of the scripts used to automate workflows through the ESP scheduler.
- Involved in developing and documenting proposals for implementing policies and plans related to optimizing the Enterprise Data Warehouse (EDW) ecosystems and their technical, administrative, and management processes and procedures.
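A hedged sketch of a FastLoad script of the sort listed above; the input file, staging table, delimiter, and error-table names are illustrative assumptions:

    LOGON tdpid/user,password;

    /* Empty staging table; FastLoad also needs two error tables */
    CREATE TABLE stg_db.customer_stg (
        cust_id   INTEGER,
        cust_name VARCHAR(60)
    ) PRIMARY INDEX (cust_id);

    /* Pipe-delimited input; VARTEXT fields must be defined as VARCHAR */
    SET RECORD VARTEXT '|';
    DEFINE cust_id   (VARCHAR(11)),
           cust_name (VARCHAR(60))
    FILE = customer.dat;

    BEGIN LOADING stg_db.customer_stg
          ERRORFILES stg_db.customer_err1, stg_db.customer_err2;

    INSERT INTO stg_db.customer_stg VALUES (:cust_id, :cust_name);

    END LOADING;
    LOGOFF;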
Environment: Teradata V14.00 (FastLoad, MultiLoad, BTEQ, SQL Assistant, FastExport), JCL, Teradata Viewpoint, CA Workstation, SQL Server, DB2, mainframe MVS/OS, UNIX shell scripting.
Confidential, Memphis, TN
Senior Teradata/ETL Developer
Responsibilities:
- Interacted with the business analysts to understand the flow of the business.
- Involved in gathering requirements from the business and in the design of the physical data model.
- Involved in data extraction, transformation, and loading from source systems.
- Involved in writing complex SQL queries based on the given requirements.
- Loaded data into Teradata using FastLoad, BTEQ, FastExport, and MultiLoad.
- Wrote several Teradata BTEQ and TPT scripts for reporting purposes.
- Created test cases for unit test, system integration test, and UAT to check data quality.
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Collected statistics on important tables to get better plans from the Teradata Optimizer.
- Tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
- Developed Teradata macros that pull data from several sales tables, perform calculations and aggregations, and write the results to a results table (see the macro sketch after this list).
- Worked with database administrators to determine the indexes, statistics, and partitioning to add to tables in the data warehouse.
- Created, optimized, reviewed, and executed Teradata SQL test queries to validate the transformation rules used in source-to-target mappings/source views and to verify data in target tables.
- Performed unit testing and integration testing and generated various test cases.
- Used several SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
- Prepared job scheduling documents and job stream lists using Dollar U for code migration to test and production.
- Created proper PIs, taking into consideration both planned access of data and even distribution of data across all the available AMPs.
- Developed various Ab Initio ETL graphs for data extraction from the Oracle database.
- Extracted and loaded monthly data for customers.
- Handled coordination of the Acquisition System process, which is designed for complex, high-volume data loads into tables; it essentially creates a work table using the Teradata FastLoad utility.
- Worked with various XML files, using Ab Initio to load them into tables.
- Worked with the web services team on the interface for communication between ePRS, CE/SE, and Dynamic Profile client requests from the EDW.
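A hedged sketch of such a macro; the database, tables, columns, and calculation are illustrative assumptions:

    CREATE MACRO sales_db.monthly_sales_summary (in_month DATE) AS (
        /* Aggregate the month's sales and write into a results table */
        INSERT INTO sales_db.sales_results (store_id, rpt_month, total_amt, avg_amt)
        SELECT s.store_id,
               :in_month,
               SUM(s.sale_amt),
               AVG(s.sale_amt)
        FROM   sales_db.pos_sales s
        WHERE  s.sale_dt BETWEEN :in_month AND ADD_MONTHS(:in_month, 1) - 1
        GROUP  BY s.store_id;
    );

    /* Invoked as: */
    EXEC sales_db.monthly_sales_summary (DATE '2013-01-01');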
Environment: Teradata V13 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), Oracle, Ab Initio (GDE 3.0.1, Co>Op 3.0.1), EME, PL/SQL, Trillium, UNIX shell scripting
Confidential, Pleasanton, CA
Senior Teradata Developer/Technical Analyst
Responsibilities:
- Involved in requirement gathering, business analysis, design and development, testing, and implementation of business rules.
- Developed mappings to load data from source systems such as Oracle and AS400 to the data warehouse.
- Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
- Dealt with incremental data as well as migration data loaded into Teradata.
- Involved in designing the ETL process to extract, translate, and load data from the OLTP Oracle database system to the Teradata data warehouse.
- Worked efficiently with Teradata Parallel Transporter (TPT) and its generated code.
- Generated custom JCL scripts for processing all mainframe flat files and IBM DB2 data.
- Developed performance utilization charts, optimized and tuned SQL, and designed physical databases with mainframe MVS COBOL, Teradata load utilities, and SQL.
- Responsible for troubleshooting and for identifying and resolving data problems.
- Created proper PIs, taking into consideration both planned access and even distribution of data across all the available AMPs (see the PI sketch after this list).
- Loaded and transferred large volumes of data from different databases into Teradata using MultiLoad and OleLoad.
- Created a series of Teradata macros for various applications in Teradata SQL Assistant.
- Involved in writing complex SQL queries based on the given requirements and for various business tickets.
- Created Teradata models in Erwin.
- Performance-tuned Teradata SQL statements using the Teradata EXPLAIN command.
- Created several SQL queries and built several reports from this data mart for UAT and user reporting.
- Identified and tracked slowly changing dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
- Created test cases for unit test, system integration test, and UAT to check the data.
- Scheduled ETL jobs using scheduling tools and cron jobs through pmcmd commands, based on the business requirement.
- Developed shell scripts for getting data from source systems to load into the data warehouse.
- Used the pmcmd command to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate the process.
- Created UNIX shell scripts and called them as pre-session and post-session commands.
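A brief sketch of the PI consideration mentioned above: the chosen primary index should serve the planned access path while spreading rows evenly over the AMPs. Table and column names are hypothetical:

    /* NUPI on the main access column keeps common lookups single-AMP */
    CREATE TABLE dw_db.orders (
        order_id  INTEGER,
        cust_id   INTEGER,
        order_dt  DATE,
        order_amt DECIMAL(12,2)
    ) PRIMARY INDEX (cust_id);

    /* Rough skew check: row counts per AMP for the chosen PI column */
    SELECT HASHAMP(HASHBUCKET(HASHROW(cust_id))) AS amp_no,
           COUNT(*)                              AS row_cnt
    FROM   dw_db.orders
    GROUP  BY 1
    ORDER  BY 2 DESC;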
Environment: NCR 4800/5100, Teradata V12 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), mainframe MVS/OS, JCL, TSO/ISPF, COBOL, DB2, SAS, Oracle, Ab Initio (GDE 1.15, Co>Op 1.15), EME, PL/SQL, Trillium, Cognos, shell scripting
Confidential, Atlanta, CT
Teradata Developer
Responsibilities:
- Involved in data modeling to identify the gaps with respect to business requirements and in transforming the business rules.
- Prepared data flow diagrams with respect to the source system by referring to the TCLDM.
- Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
- Designed a series of Teradata macros for various applications in Teradata SQL Assistant.
- Extracted operational data into the data store from the legacy systems using COBOL and JCL scripts.
- Created JCL scripts to load data from mainframes.
- Organized the data efficiently in the Teradata system using the Teradata Manufacturing Logical Data Model.
- Extensively used ETL to load data from the Oracle database, XML files, and flat files, and also imported data from IBM mainframes.
- Loaded and transferred large volumes of data from different databases into Teradata using MultiLoad and OleLoad.
- Developed scripts to load data from source to staging and from the staging area to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad (see the MultiLoad sketch after this list).
- Created Teradata models in Erwin.
- Wrote scripts for data cleansing, data validation, and data transformation for the data coming from different source systems.
- Developed and reviewed detailed design documents and technical specification documents for the end-to-end ETL process flow for each source system.
- Worked on multifile systems with extensive parallel processing.
- Extensively used transform components: Aggregator, sorted Join, sorted Denormalize, Reformat, Rollup, and Scan.
- Involved in generating the load-ready files from different databases such as Oracle and AS400.
- Regular interactions with DBAs.
- Prepared the Maestro schedules to deploy the code into production.
- Dealt with initial, delta, and incremental data as well as migration data loaded into Teradata.
- Developed UNIX shell scripts to run batch jobs and loads into production.
- Involved in unit testing, preparing test cases, and peer reviews.
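A hedged MultiLoad sketch of a staging load like the ones described above; the log, work, and error table names, the layout, and the input file are illustrative assumptions:

    .LOGTABLE stg_db.cust_ml_log;
    .LOGON tdpid/user,password;

    .BEGIN IMPORT MLOAD TABLES stg_db.customer_stg
           WORKTABLES stg_db.customer_wt
           ERRORTABLES stg_db.customer_et stg_db.customer_uv;

    .LAYOUT cust_layout;
    .FIELD in_cust_id   * VARCHAR(11);
    .FIELD in_cust_name * VARCHAR(60);

    /* Upsert: update a matching row, insert when none exists */
    .DML LABEL upsert_cust DO INSERT FOR MISSING UPDATE ROWS;
    UPDATE stg_db.customer_stg
    SET    cust_name = :in_cust_name
    WHERE  cust_id   = :in_cust_id;
    INSERT INTO stg_db.customer_stg (cust_id, cust_name)
    VALUES (:in_cust_id, :in_cust_name);

    .IMPORT INFILE customer.dat
            FORMAT VARTEXT '|'
            LAYOUT cust_layout
            APPLY upsert_cust;

    .END MLOAD;
    .LOGOFF;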
Environment: Teradata V12, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Rational ClearQuest, Control-M, UNIX, MQ, NDM, FTP, SAS, Ab Initio (GDE 1.14, Co>Op 1.14), EME, PL/SQL
Confidential, New Berlin, WI
Teradata developer
Responsibilities:
- Involved in creating logical and physical models using Erwin for the ODS, EDW, DM, and ADB, and created DDLs for the DBA to create structures in the Teradata development, staging, and production environments. The modeling was done through JAD sessions with involvement from enterprise architects and business users.
- Worked in a DBA team to ensure implementation of the databases for the physical data models intended for the data marts. Created proper Teradata primary indexes (PIs), taking into consideration both planned access of data and even distribution of data across all the available AMPs.
- Heavily involved in writing complex SQL queries based on the given requirements, including complex Teradata joins, stored procedures, and macros.
- Extensively used several SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling (see the query sketch after this list).
- Involved in walkthroughs of the data models and ETL design documents with ETL developers before each ETL coding iteration. Did the integration testing and was involved in UAT with business users.
- Coordinated with the vendor Fiserv to schedule daily transmission of bank operational data and to have it catalogued in EDS.
- Did detailed profiling of operational data using the Ab Initio Data Profiler/SQL tools to gain a better understanding of the data that business analysts could use for analytical purposes.
- Participated in several JAD sessions with analysts from the business side to arrive at better requirements.
- Created a bank data mart POC at the data mart level in user Teradata space to validate the requirements with users and to produce a better mapping document with the right transformations.
- Participated in an Agile iterative methodology with the help of the BT project manager. The iterations were CD, MM, and integration with card products using customer identification numbers, marketing campaigns, and IVR call and agent response data.
- Implemented bank data marts in the ODS, EDW, DM, and ADB (application database). Coordinated with enterprise warehouse architects to follow the corporate standards for the implementation. Used existing metadata, audit, and ETL frameworks.
- Automated the entire bank data mart process using UNIX shell scripts and scheduled the process using Autosys after dependency analysis.
- Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques.
- Worked extensively to create, schedule, and monitor the workflows and to send notification messages to the concerned personnel in case of process failures.
- With the help of the enterprise metadata team, uploaded the technical and business metadata to the enterprise-level MetaCenter. Defined audit thresholds for the balance-and-control rejections during the ETL process.
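A small query illustrating a few of the SQL features named above (GROUP BY ROLLUP with COALESCE to label subtotal rows); the data mart table and columns are hypothetical:

    /* ROLLUP adds subtotal and grand-total rows with NULL group keys;
       COALESCE relabels them for the report */
    SELECT COALESCE(region_nm, 'ALL REGIONS')   AS region,
           COALESCE(product_nm, 'ALL PRODUCTS') AS product,
           SUM(txn_amt)                         AS total_amt
    FROM   bank_dm.daily_txn
    GROUP  BY ROLLUP (region_nm, product_nm)
    ORDER  BY 1, 2;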
Environment: Red Hat Enterprise Linux 64-bit, Windows XP, Informatica Power Center 7.1, NCR Teradata V2R6, Business Objects 11, Crystal Reports 10, VB.
Confidential, Denver, CO
Teradata developer
Responsibilities:
- Involved in the analysis of issues and proposing solutions to the client.
- Involved in the analysis and documentation of test results.
- Prepared detailed documents to be shared across the organization.
- Created export scripts using the Teradata FastExport utility.
- Prepared test cases and test data.
- Created data manipulation and definition scripts using the Teradata BTEQ utility; involved in the analysis and design of the system.
- Involved in testing of the prototype.
- Created load scripts using the Teradata FastLoad and MultiLoad utilities; performed code reviews and code walkthroughs of the prototype.
- Created procedures in Teradata SQL (see the procedure sketch after this list).
- Attended corporate meetings for the kickoff of major enhancements at the corporate level.
- Organized meetings with the SMEs of the dependent systems when changes were made to the existing system.
- Created UNIX scripts for file manipulation.
- Involved in the preparation of architecture and infrastructure documentation.
- Involved in designing the DR process for the ESB application.
- Involved in the weekly issue meetings with the customer.
- Organized meetings with the department heads when changes were recommended to the existing system for performance improvement.
- Organized meetings with the team on a daily basis.
- Reverse engineered and documented existing code.
- Involved in decision making about changing the existing programs for special processing.
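A hedged sketch of a Teradata SQL procedure of the kind created here; the name, parameter, and retention logic are illustrative assumptions:

    CREATE PROCEDURE ops_db.purge_old_rows (IN in_keep_days INTEGER)
    BEGIN
        /* Delete staging rows older than the retention window */
        DELETE FROM ops_db.stg_events
        WHERE  load_dt < CURRENT_DATE - in_keep_days;
    END;

    /* Invoked as: */
    CALL ops_db.purge_old_rows (90);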
Environment: UNIX, Teradata RDBMS, BTEQ, FastLoad, Teradata Administrator, MultiLoad, TPump, Power, FTP, SFTP
Confidential, Bentonville, AR
Teradata Analyst
Responsibilities:
- Based on the requirements, created functional design documents and technical design specification documents for transformation and loading.
- Translated the client's strategic vision into technical solutions and application identification.
- Designed and reviewed the ETL solutions in Ab Initio.
- Mentored the team in the technology, design, and development areas.
- Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
- Ensured strict adherence to committed deadlines as per the project development/implementation schedule.
- Mapped metadata from the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
- Extracted data from legacy applications into the EDW environments, programming in SQL to perform the data query functions for Teradata.
- Involved in requirement gathering, business analysis, design and development, testing, and implementation of business rules.
- Extensively used Ab Initio to load data from sources involving Oracle, flat files, and SQL Server to the Teradata database.
- Used Ab Initio components like Reformat, Scan, Rollup, Join, Sort, Partition by Key, Normalize, Input Table, Output Table, Update Table, Gather Logs, and Run SQL for developing graphs.
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Involved in complex Ab Initio XFRs and DMLs to derive new fields and solve various business requirements.
- Developed UNIX Korn shell wrappers to run various Ab Initio scripts.
- Modified BTEQ scripts that load data from the Teradata staging area into the data mart (see the BTEQ sketch after this list).
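A hedged BTEQ sketch of the staging-to-mart load referenced above; the table names are illustrative, and the .IF check aborts with a non-zero return code on failure:

    .LOGON tdpid/user,password

    /* Move cleansed staging rows into the mart table */
    INSERT INTO mart_db.customer_dim (cust_id, cust_name, load_dt)
    SELECT cust_id, cust_name, CURRENT_DATE
    FROM   stg_db.customer_stg;

    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0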
Environment: Teradata RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Manager, Teradata SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP
Confidential
SQL Developer
Responsibilities:
- Created database objects including tables, indexes, clusters, sequences, roles, and privileges.
- Calculated and monitored size and space for tables, clusters and indexes.
- Created separate tablespaces for users, indexes, rollback segments, and temporary segments. Renamed and resized redo log files, user data files, rollback files, index files, and temporary files.
- Created and maintained Triggers, Packages, Functions and Procedures.
- Created and maintained roles and profiles for security and limited access of data.
- Designed and implemented Backup & Recovery strategies.
- Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
- Debugged, performed system/application tuning by identifying long running reports.
- Created users and established application security by creating roles.
- Wrote procedures to meet ad hoc requirements to replicate data from one table to another.
Environment: MS SQL Server 6.5, SQL Server 7, MS SQL Server 2000