Sr. Teradata/ETL Developer Resume
SUMMARY
- Over 7 years of experience in analysis, design, development, implementation, modeling, testing, reconciliation, and support of Data Warehousing applications.
- Extensively worked on data modeling, including dimensional data modeling, Star schema/Snowflake modeling, Fact and Dimension tables, and physical and logical data modeling.
- Excellent knowledge of the Data Warehouse/Data Mart development life cycle, Star schema, Snowflake schema, SCD, surrogate keys, normalization/de-normalization, and Dimension and Fact tables.
- Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
- Proficient with Teradata EXPLAIN plans, the Collect Stats option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and Volatile, Global Temporary, and Derived tables.
- Experience in extraction of data from various Heterogeneous sources (Relational database, Flat Files) to load into data warehouse/data mart targets.
- Sound knowledge of data migration from DB2, SQL Server, and Oracle to Teradata using automated UNIX shell scripting, Oracle/Teradata SQL, and Teradata macros and procedures.
- 7 years' experience with Teradata SQL, advanced SQL, macros, triggers, stored procedures, BTEQ, and the loader utilities MultiLoad, TPump, FastLoad, and TPT.
- Extensive experience in designing logical and physical data models using the Erwin tool.
- Proficient in ER and dimensional modeling, identifying Fact and Dimension tables with the data modeling tools Erwin and ER Studio.
- In-depth expertise in the Teradata cost-based query optimizer, identifying potential query bottlenecks from the aspects of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI/USI/NUSI/JI, etc.).
- Extensive experience in implementing Slowly Changing Dimensions to maintain historical data and support Change Data Capture (CDC).
- Extensive experience in writing SQL scripts/Analytical queries as per the Business specifications.
- Excellent knowledge of relational databases through exposure to Teradata, Oracle, SQL Server.
- Built UNIX shell scripts and library wrapper scripts common to all users on the data warehouse team.
- Built UNIX shell scripts to automate scheduled daily loading of Oracle and flat file data.
- Expert with ODBC connections and defining system DSNs; built shell wrappers to support the ETL process (a representative sketch follows this list).
- Proficient in writing Perl/shell scripts and UNIX functions to automate and simplify day-to-day activities.
- Experience in performance tuning of mappings, ETL procedures, and processes.
- Experience in performance tuning of Teradata SQL using EXPLAIN plans and an understanding of joins and data distribution.
- Scheduled automated daily, weekly, and monthly jobs using UNIX shell scripts with Tivoli scheduling.
- Attended daily standup meetings with onsite managers.
- Experience in identifying bottlenecks in ETL processes and performance tuning.
- Heavily involved in performance tuning of ETL to reduce load times and in tuning business reports.
- Experienced in debugging mappings: identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
- Led the EDW/ETL production support team, served in the weekly on-call rotation, and provided on-call support for projects in production during the warranty period.
- Excellent Coordination skills in working with cross functional team.
- Experience with master data management; worked with all types (1, 2, 3) of slowly changing dimension tables.
- Experienced in providing 24x7 on-call support as part of a scheduled rotation with other team members.
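A minimal sketch of the kind of shell wrapper referenced above for running BTEQ steps; the script name, paths, log naming, and mail recipient are illustrative assumptions, not actual project values.

```sh
#!/bin/ksh
# run_bteq.ksh - illustrative wrapper: run a BTEQ script and report failures.
# Paths and the mail recipient below are assumptions for the sketch.

SCRIPT=$1                                   # BTEQ script passed as the only argument
LOGDIR=/data/etl/logs                       # assumed log directory
LOGFILE=$LOGDIR/$(basename $SCRIPT).$(date +%Y%m%d%H%M%S).log

# The BTEQ script is expected to contain its own .LOGON/.LOGOFF statements.
bteq < $SCRIPT > $LOGFILE 2>&1
RC=$?

if [ $RC -ne 0 ]; then
    echo "BTEQ failed (rc=$RC), see $LOGFILE" |
        mailx -s "ETL failure: $(basename $SCRIPT)" dw_support@example.com
    exit $RC
fi
exit 0
```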
TECHNICAL SKILLS
ETL Tools: Teradata load utilities (MultiLoad, FastLoad, TPump, Teradata Parallel Transporter), SQL Assistant, FastExport
RDBMS: Teradata V2R5, Teradata 12/13/14/15, DB2, Oracle
Languages: UNIX Shell Scripting, Perl, SQL
Object Modeling Tools: Erwin 9.7
SQL Tuning: EXPLAIN plans, table partitioning, materialized views
Scheduling Tools: Tivoli
PROFESSIONAL EXPERIENCE
Confidential
Sr. Teradata/ETL Developer
Responsibilities:
- Coordinated with Business Analysts, Data Architects, DMs, and users to understand business rules.
- Designed and developed the ETL application to integrate Ecom data into the data warehouse.
- Developed technical design documents and obtained approvals from the business team.
- Helped developers create MLOAD scripts to load data into staging tables (a representative sketch follows this list).
- Developed scheduling jobs (Teradata and UNIX objects) to run the jobs on a daily/weekly basis depending on business requirements.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
- Built a UNIX wrapper script to extract data from AS/400 into Teradata through MLOAD.
- Modified an existing UNIX wrapper script to remove unwanted characters from the source file before loading it into the respective staging table.
- Built validation scripts to validate data in target tables against source data.
- Performed data analysis and data cleansing to clean source data as per business rules.
- Handled exception data by routing it into an exception table for further analysis.
- Responsible for migrations of the code from Development environment to QA and QA to Production.
- Provided production support on various issues during daily loads.
- Analyzed various production issues and the necessary enhancements required.
- Participated in knowledge transfer sessions with the production support team on business rules, Teradata objects, and job scheduling.
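A minimal sketch of an MLOAD staging load of the kind referenced above; the TDPID, credentials, database/table names, layout, and file path are illustrative assumptions.

```sh
#!/bin/ksh
# load_stg_orders.ksh - illustrative MultiLoad load of a staging table from a pipe-delimited extract.
# The TDPID, credentials, object names, and file path are assumptions for the sketch.

mload <<'EOF'
.LOGTABLE stg_db.ml_stg_orders_log;
.LOGON tdpid/etl_user,password;

.BEGIN IMPORT MLOAD TABLES stg_db.stg_orders;

.LAYOUT order_layout;
.FIELD order_id   * VARCHAR(20);
.FIELD order_date * VARCHAR(10);
.FIELD amount     * VARCHAR(18);

.DML LABEL insert_orders;
INSERT INTO stg_db.stg_orders (order_id, order_date, amount)
VALUES (:order_id, :order_date, :amount);

.IMPORT INFILE /data/etl/in/orders.dat
        FORMAT VARTEXT '|'
        LAYOUT order_layout
        APPLY insert_orders;

.END MLOAD;
.LOGOFF;
EOF
exit $?
```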
Confidential
Sr. Teradata/ETL Developer
Responsibilities:
- Performed data analysis on item-related data coming from the new PDM feed.
- Developed a mapping document relating PDM columns to existing PIM columns.
- Reviewed code developed by the team to load existing PIM staging tables from the PDM feed.
- Worked on code changes to incorporate a version-effective timestamp in existing PIM target tables.
- Led the team in analyzing, coding, and testing the changes in the EDW.
Confidential
Sr. Teradata/ETL Developer
Responsibilities:
- Developed the ELT application to integrate Mexico Retail Data.
- Developed MLOAD scripts to extract source data from the AS/400 system (104 tables) using Teradata utilities.
- Developed mappings to build the application.
- Developed and compiled stored procedures (see the sketch after this list).
- Designed the Tivoli schedules for the application.
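A minimal sketch of how a stored procedure might be compiled and called through BTEQ, as mentioned above; the logon details, database/table names, file paths, and procedure body are assumptions.

```sh
#!/bin/ksh
# compile_sp.ksh - illustrative sketch: compile and call a Teradata stored procedure through BTEQ.
# Logon details, object names, and file paths are assumptions for the sketch.

SPL_FILE=/tmp/load_daily_sales.spl

# Procedure source: move the day's rows from staging into the base table, then clear staging.
cat > $SPL_FILE <<'SPL'
REPLACE PROCEDURE edw_stg.load_daily_sales()
BEGIN
  INSERT INTO edw_base.daily_sales (store_id, sale_date, sale_amt)
  SELECT store_id, sale_date, sale_amt
  FROM   edw_stg.stg_daily_sales;

  DELETE FROM edw_stg.stg_daily_sales;
END;
SPL

bteq <<EOF
.LOGON tdpid/etl_user,password;
.COMPILE FILE = $SPL_FILE;
CALL edw_stg.load_daily_sales();
.LOGOFF;
EOF
exit $?
```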
Confidential
Teradata ETL Developer
Responsibilities:
- Developed the ELT application to integrate PIM data.
- Implemented Slowly Changing Dimensions (SCD), using a Type 2 dimension to keep history of changes (see the sketch after this list).
- Developed scripts to extract source data from SQL Server using Teradata utilities (MLOAD and OleLoad) on a Windows server.
- Performed source data analysis to create the logical/physical data model.
- Developed a mapping document to map source to target columns based on the data model and business rules.
- Developed and compiled stored procedures to load data into target tables (development phase).
- Developed scheduling scripts to run the batch load every day at 4 AM.
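A minimal sketch of the Type 2 pattern described above, using an assumed item dimension; all table/column names and the expiry/flag conventions are assumptions for the example, and NULL-safe comparisons are omitted for brevity.

```sh
#!/bin/ksh
# scd2_item_dim.ksh - illustrative Type 2 load: expire changed rows, then insert new current versions.
# All object and column names are assumptions for the sketch.

bteq <<'EOF'
.LOGON tdpid/etl_user,password;

/* Step 1: expire the current version of items whose attributes changed in today's staging feed */
UPDATE dim
FROM edw.item_dim dim, edw_stg.stg_item stg
SET eff_end_dt   = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.item_id = stg.item_id
  AND dim.current_flag = 'Y'
  AND (dim.item_desc <> stg.item_desc OR dim.item_status <> stg.item_status);

/* Step 2: insert a new current version for new items and for items expired in Step 1 */
INSERT INTO edw.item_dim (item_id, item_desc, item_status, eff_start_dt, eff_end_dt, current_flag)
SELECT stg.item_id, stg.item_desc, stg.item_status, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM edw_stg.stg_item stg
LEFT JOIN edw.item_dim dim
  ON  dim.item_id = stg.item_id
  AND dim.current_flag = 'Y'
WHERE dim.item_id IS NULL;

.LOGOFF;
EOF
exit $?
```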
Confidential
Teradata ETL Developer
Responsibilities:
- Developed the ELT application to integrate Confidential data.
- Developed source data extracts from the AS/400 system (150 tables) using Teradata utilities and shell scripts.
- Developed stored procedures to load Confidential POS sales/inventory data into existing target tables.
- Built validation scripts to validate sales/inventory data after the batch process completes every day at 9 AM (a representative sketch follows this list).
- Tuned and optimized Teradata SQL to minimize waiting time for the reporting team reporting on Confidential sales data.
- Built a process to capture exchange rates for Confidential.
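A minimal sketch of the kind of post-batch validation query referenced above; the database, table, and column names are assumptions.

```sh
#!/bin/ksh
# validate_sales.ksh - illustrative post-batch check: compare staging vs. target row counts and totals.
# Database, table, and column names are assumptions for the sketch.

LOGFILE=/data/etl/logs/validate_sales.$(date +%Y%m%d).log

bteq <<'EOF' > $LOGFILE 2>&1
.LOGON tdpid/etl_user,password;

/* Row counts and sale totals should match between yesterday's staging feed and the base table */
SELECT 'STG'  AS src, COUNT(*) AS row_cnt, SUM(sale_amt) AS sale_total
FROM   edw_stg.stg_pos_sales
UNION ALL
SELECT 'BASE' AS src, COUNT(*) AS row_cnt, SUM(sale_amt) AS sale_total
FROM   edw_base.pos_sales
WHERE  sale_date = CURRENT_DATE - 1;

.LOGOFF;
EOF
exit $?
```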
Confidential
ETL Developer
Responsibilities:
- Developed the ELT application to integrate Confidential data.
- Developed source data extracts from the AS/400 system using Teradata utilities and shell scripts.
- Developed stored procedures to load Confidential sales/inventory data into existing target tables.
- Built validation scripts to validate sales/inventory data after the batch process completes.
- Tuned and optimized Teradata SQL to minimize waiting time for the reporting team reporting on Confidential sales/inventory data (a representative tuning sketch follows this list).
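A minimal sketch of the kind of tuning pass implied above (reviewing the EXPLAIN plan, then refreshing statistics on the join and filter columns); table and column names are assumptions.

```sh
#!/bin/ksh
# tune_sales_query.ksh - illustrative tuning pass: review the plan, then refresh statistics on join columns.
# Table and column names are assumptions for the sketch.

bteq <<'EOF'
.LOGON tdpid/etl_user,password;

/* Review the optimizer's plan for a slow reporting query */
EXPLAIN
SELECT s.store_id, SUM(s.sale_amt) AS sales_amt
FROM   edw_base.pos_sales s
JOIN   edw.item_dim d
  ON   s.item_id = d.item_id
WHERE  d.current_flag = 'Y'
GROUP BY s.store_id;

/* Refresh statistics on the join and filter columns the plan depends on */
COLLECT STATISTICS COLUMN (item_id)   ON edw_base.pos_sales;
COLLECT STATISTICS COLUMN (sale_date) ON edw_base.pos_sales;
COLLECT STATISTICS COLUMN (item_id)   ON edw.item_dim;

.LOGOFF;
EOF
exit $?
```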
Confidential
ETL Developer
Responsibilities:
- Developed the ELT application to integrate Confidential data.
- Developed source data extracts from the AS/400 system using Teradata utilities and shell scripts.
- Built UNIX wrapper scripts to extract source data using the table name as an input parameter (a representative sketch follows this list).
- Developed stored procedures to load Confidential sales/inventory data into existing target tables.
- Built validation scripts to validate sales/inventory data after the batch process completes every day at 9 AM.
- Tuned and optimized Teradata SQL to minimize waiting time for the reporting team reporting on Confidential sales data.
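A minimal sketch of a wrapper driven by a table-name parameter, as described above; the file naming conventions and directory layout are assumptions.

```sh
#!/bin/ksh
# load_table.ksh - illustrative wrapper: load one table's extract file given the table name as a parameter.
# File naming conventions and directory layout are assumptions for the sketch.

TABLE=$1
if [ -z "$TABLE" ]; then
    echo "Usage: $(basename $0) <table_name>"
    exit 1
fi

DATAFILE=/data/etl/in/${TABLE}.dat           # assumed extract file for the table
CTLFILE=/data/etl/ctl/${TABLE}.mload         # assumed per-table MultiLoad control file
LOGFILE=/data/etl/logs/${TABLE}.$(date +%Y%m%d).log

if [ ! -f "$DATAFILE" ]; then
    echo "Extract file $DATAFILE not found" >> $LOGFILE
    exit 2
fi

mload < $CTLFILE > $LOGFILE 2>&1
RC=$?
if [ $RC -ne 0 ]; then
    echo "MultiLoad for $TABLE failed with return code $RC" >> $LOGFILE
fi
exit $RC
```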