Senior Teradata/ETL Consultant Resume
Cary, NC
SUMMARY
- 7+ years of experience in Information Technology as an ETL/Teradata developer, with expertise in ETL development in data warehousing environments.
- 3+ years of strong data warehousing experience using Informatica Power Mart 6.1/5.1/4.7, PowerCenter 9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, Power Exchange, and Power Connect as ETL tools.
- Experienced with different relational databases such as Teradata, Oracle, and SQL Server.
- Developed UNIX shell scripts and used BTEQ, FastLoad, MultiLoad, TPump, TPT, Data Mover, and FastExport utilities extensively to load data into target databases.
- Strong expertise using Informatica PowerCenter client tools (Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Excellent understanding and in-depth knowledge of Hadoop architecture and its various components, such as HDFS, MapReduce programming, and other ecosystem components.
- Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
- Hands-on experience in development of data warehouses/data marts using Ab Initio.
- Expertise in Report formatting, Batch processing, and Data Loading and Export using BTEQ.
- Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for several business users on a scheduled basis.
- Performed data analysis and data profiling using SQL on various source systems, including SQL Server 2008.
- Understanding of Teradata MPP architecture such as Shared Nothing, Nodes, AMPs, BYNET, Partitioning, Primary Indexes etc. and 3+ years of experience in Teradata production support.
- In-depth understanding and usage of Teradata OLAP functions. Proficient in Teradata SQL, stored procedures, macros, views, and indexes (primary, secondary, PPI, join indexes, etc.).
- Worked with collect statistics, join indexes etc.
- Over 2 years' experience in Tableau Desktop, Tableau Server, and Tableau Reader across various versions of Tableau 7 and Tableau 8.
- Expertise in physical modeling with knowledge of primary, secondary, PPI, and join indexes (a brief illustrative sketch follows this summary).
- Experienced in data modeling in star schema and 3NF using Erwin.
- Thorough knowledge of Teradata and Oracle RDBMS architecture.
- Experienced in troubleshooting Teradata scripts, fixing bugs, addressing production issues, and performance tuning.
- Technical expertise in ETL methodologies and Informatica 6.2/7.1/8.6/9.5 - PowerCenter, Power Mart, client tools (Mapping Designer, Workflow Manager/Monitor), and server tools.
- Experienced with major Hadoop ecosystem projects such as Pig, Hive, and HBase.
- Data Integration experience, including EDI, ETL, Data Warehouse, and Data Conversion.
- Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
- Tested the ETL Informatica mappings and other ETL processes (data warehouse testing).
- Expertise in querying and testing RDBMSs such as Oracle and MS SQL Server using SQL for data integrity.
- Complex ETL development using Informatica Power Center Transformations - Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Slowly Changing Dimensions.
- Extensive experience in working with Ab Initio as an ETL tool for implementing transformations and conditions.
- Data modeling experience using Dimensional Data Modeling, Star Schema, Snow-Flake, Fact and Dimension Tables, Physical and Logical Data Modeling using Erwin 3.x/4.x.
- Knowledge in Query performance tuning using Explain, Collect Statistics, Compression, NUSI and Join Indexes including Join and Sparse Indexes.
- Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
- Extensively worked on PMON/Viewpoint for Teradata to look at performance Monitoring and performance tuning.
- Sourced data from disparate data sources IBM DB2, Oracle, and SQL Server and loaded into Oracle and Teradata DW.
- Experience in Testing (Unit testing, Integration Testing and System testing).
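The physical-design items above can be illustrated with a minimal Teradata DDL sketch; the database, table, and column names below are assumed placeholders, not client objects.

```sql
/* Minimal sketch of a primary index, partitioned primary index (PPI),
   a secondary index, and collected statistics; all names are illustrative. */
CREATE SET TABLE demo_db.sales_fact
(
    sale_id   BIGINT NOT NULL,
    store_id  INTEGER NOT NULL,
    sale_dt   DATE FORMAT 'YYYY-MM-DD' NOT NULL,
    sale_amt  DECIMAL(18,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2015-01-01'
                              AND     DATE '2016-12-31'
                              EACH INTERVAL '1' MONTH)
INDEX (store_id);   -- non-unique secondary index (NUSI)

/* Statistics on the PI, the partitioning column, and the NUSI help the
   optimizer choose sensible access and join plans. */
COLLECT STATISTICS ON demo_db.sales_fact COLUMN (sale_id);
COLLECT STATISTICS ON demo_db.sales_fact COLUMN (sale_dt);
COLLECT STATISTICS ON demo_db.sales_fact COLUMN (store_id);
```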
TECHNICAL SKILLS
Operating Systems: UNIX, Windows 7/XP/2000, Windows NT 4.0, z/OS, and MVS (OS/390).
Databases: Teradata V2R5/V2R6, V12, V14, V15, Oracle (8i/9i), SQL Server 2000/2005/2008, DB2.
Teradata Tools & Utilities: query facilities - SQL Assistant, BTEQ; load & export - FastLoad, MultiLoad, TPump, TPT, FastExport, Data Mover.
Data Modeling: Erwin, Visio
ETL tools: Informatica 9.6.1/9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1 (PowerCenter), Hadoop 2.6.2 (HDFS, MapReduce, Hive, Sqoop, Pig).
Programming languages: C, C++, Shell scripting (K-shell, C-Shell), SQL.
Reporting Tools: MicroStrategy 9.2.1/9.0/8i/7i.
Scheduling tools: Control-M, Autosys, UC4.
PROFESSIONAL EXPERIENCE
Confidential, Cary, NC
Senior Teradata/ETL Consultant
Responsibilities:
- The purpose of the project is to load data into different layers (Source Stage, Stage Journal, Journal Base, and Base Semantic) using an existing TPT/BTEQ framework.
- Met with business/user groups to understand the business process, gather requirements, and carry out analysis, design, development, and implementation according to client requirements.
- Extensively worked on data extraction, transformation, and loading from source to target systems using Informatica PowerCenter and Teradata utilities.
- Developed BTEQ import, BTEQ export, FastLoad, MultiLoad, and FastExport scripts and shell scripts to move data from source systems to staging and from staging to the data warehouse in batch processing mode.
- Loaded data from different source systems into the Oracle staging area and then into dimension and fact tables using Oracle stored procedures.
- Copied data from different types of flat files (e.g., .txt files) into staging tables and loaded it into target tables using various ETL logic.
- Loaded data into the staging layer by preparing only a definition file and a parameter file and using existing framework scripts built on TPT.
- Stored data in the Journal layer using initial and incremental load logic.
- Used TPT, BTEQ, and MultiLoad.
- Worked with error handling using ET, UV, and WT tables (see the sketch at the end of this section).
- Created a market-wise data loading framework.
- Created Informatica deployment groups for code deployment from one environment to another.
- Designed and developed a number of complex mappings using various transformations like Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (Connected & unconnected), Filter, Update Strategy, Stored Procedure, Sequence Generator, etc.
- Also used Journals extensively for the disaster recovery process for rollback and roll forward process.
- Performed Teradata and Informatica performance tuning
- Extensively created and used various Teradata Set Tables, Multi-Set table, global temporary tables, and volatile tables.
- Developed various UNIX shell wrappers to run Ab Initio jobs.
- Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks such as Command, Event Wait, Event Raise, and Email.
- Extensively involved in data transformations, validations, extraction and loading process. Implemented various Teradata Join Types like Inner-join, outer-join, self-join and Merge-join.
- Worked with Workflow Manager for the creation of various tasks like Worklets, Sessions, Batches, Event Wait, E-mail notifications, Decision and to Schedule jobs.
- Performance tuned the workflows by identifying bottlenecks in targets, sources, mappings, sessions, and workflows and eliminating them.
- Involved in SQL tuning, optimizer analysis, indexes, table partitions, and clusters.
- Worked exclusively with Teradata SQL Assistant to interface with Teradata.
- Performed data validation testing by writing SQL queries.
- Enhanced technical skills to be able to improve development and implementation of solutions while following defined best practices.
- Performed budgeting, forecasting, financial modeling, and reporting as an essential part of the processes.
- Executed test cases and performed ad hoc testing.
- Performed integration, end-to-end, and system testing.
- Used Slowly Changing Dimension Type II in various data mappings to load dimension tables in the data warehouse.
- Implemented update strategies, incremental loads, CDC maintenance.
- Involved in analyzing source systems and designing the processes for Extracting Transforming and Loading the data to Teradata database.
- Developed mapping parameters and variables to support SQL override.
- Wrote shell scripts to perform pre-session and post-session operations
- Responsible for managing, scheduling and monitoring the workflow sessions
- Developed Oozie workflows and scheduled them through a scheduler.
- Moved data from different sources to HDFS and vice-versa using SQOOP.
- Performed performance tuning at the database, transformation, and job levels.
- Involved in creating unit test plans and testing the data for various applications.
Environment: Teradata V14.0, Teradata load utilities (BTEQ, FastLoad, FastExport, and MultiLoad), Teradata SQL Assistant, Informatica 9.6.0, Hadoop 2.6.2 (HDFS, MapReduce, Hive, Sqoop, Pig), SQL Server, UC4, UNIX, shell scripting.
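As a rough illustration of the ET/UV error handling mentioned above, the following BTEQ sketch checks the utility error tables after a load; the logon string and all object names are assumed placeholders, not the actual project objects.

```sql
.LOGON tdprod/etl_user,password;        -- placeholder logon

/* Acquisition-phase rejects land in the _ET table; uniqueness violations
   land in the _UV table. Each check returns a row only if errors exist,
   then aborts with a non-zero return code so the wrapper script or
   scheduler can flag the job. */
SELECT COUNT(*) FROM stg_db.stg_journal_ET HAVING COUNT(*) > 0;
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

SELECT COUNT(*) FROM stg_db.stg_journal_UV HAVING COUNT(*) > 0;
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```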
Confidential, NY
Sr. Teradata Consultant
Responsibilities:
- Interacted with business users for requirements gathering, analyzed the business requirements, and translated them into functional and technical specifications.
- Developed Teradata utility scripts such as FastLoad, BTEQ, FastExport, and MultiLoad to populate the data into the EDW.
- Massaged the data using BTEQ by applying the business rules for the source data for validation and transformation.
- Extensively used MultiLoad, BTEQ, FastExport, and FastLoad to design and develop data flow paths for loading, transforming, and maintaining the data warehouse.
- Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with loader logs.
- Reduced Teradata space used by optimizing tables - adding compression where appropriate and ensuring optimal column definitions (see the sketch at the end of this section).
- Developed a high-performance semantic/presentation layer for reporting and analytics.
- Defined and maintained metadata and data sources, and set up validation rules.
- Designed and Implemented Tables, Functions, Stored Procedures and Triggers in SQL Server 2008.
- Wrote the SQL queries, Stored Procedures and Views.
- Developed SQL Server stored procedures and tuned SQL queries (using indexes and execution plans).
- Developed user-defined functions and created views.
- Thorough knowledge of Teradata and Oracle RDBMS architecture.
- Experienced in troubleshooting Teradata scripts, fixing bugs, addressing production issues, and performance tuning.
- Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks such as Command, Event Wait, Event Raise, and Email.
- Worked with SET and MULTISET tables for performance evaluation of the scripts.
- Extensively created and used various Teradata SET tables, MULTISET tables, global temporary tables, volatile tables, and temp tables.
- Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
- Created Teradata macros and stored procedures for repeated use across various applications.
- Wrote several Teradata SQL queries and created several reports from the data mart for UAT and user reporting.
- Experienced in developing OLAP reports using Business Objects.
- Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Implement processes and logic to extract, transform, and distribute data across one or more data stores from a wide variety of sources.
- Optimize data integration platform to provide optimal performance under increasing data volumes.
- Upload data into appropriate databases in accurate and timely manner.
- Developed vision and strategy for building the Data Integration/Data Warehouse team. Selected and developed staff to meet plans and objectives. Conducted regular team meetings to facilitate team goals.
- Designed and developed Global Revenue data warehouse that extracts, transforms and loads source data from ERP source systems around the world to provide visibility to revenue data from source.
- Expertise in ETL processes using Informatica.
- Tuned mappings using PowerCenter Designer and used different logic to provide maximum efficiency and performance.
- Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Developed shell scripts for daily and weekly loads and scheduled them using the UNIX Maestro utility.
- Modified the shell/Perl scripts as per the business requirements.
- Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
- Created mapplets to use in different mappings.
- Extensively involved in data transformation, validation, extraction, and loading processes. Implemented various join types such as inner join, outer join, and self join, and various join strategies such as merge join, nested join, and row hash join.
- Imported source/target tables from the respective SAP R3 and BW systems, created reusable transformations (Joiner, Router, Lookup, Rank, Filter, Expression, and Aggregator) inside a mapplet, and created new mappings using the Designer module of Informatica PowerCenter to implement the business logic and to load the customer healthcare data incrementally and in full.
- Created complex mappings using Unconnected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
- Unit tested the ETL workflows and mappings, fixing defects that came out of unit testing and, where needed, updating the documents to keep them current with system functionality.
- Developed Informatica SCD Type-I, Type-II mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, Mapplets and others.
- Exposure to large-scale data integration and performance tuning.
- Involved in 24x7 production support.
Environment: Teradata V13/14, Teradata Utilities, Teradata SQL Assistant 12, Informatica Power Center 9.1.0, Flat files, Oracle 11g/10g, MS SQL Server 2008, Autosys, SQL, Shell Programming, Toad, Excel and Unix scripting, Windows 2002.
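The space optimizations described above (MULTISET tables, multi-value compression, tight column definitions) can be sketched roughly as follows; all object names and compressed values are assumed for illustration only.

```sql
/* Minimal sketch of a MULTISET table with multi-value compression on
   low-cardinality columns; names and compressed values are illustrative. */
CREATE MULTISET TABLE edw_db.acct_balance_hist
(
    acct_id        INTEGER NOT NULL,
    balance_dt     DATE FORMAT 'YYYY-MM-DD' NOT NULL,
    balance_amt    DECIMAL(18,2),
    acct_status_cd CHAR(1) COMPRESS ('A','C','P'),          -- multi-value compression
    region_cd      CHAR(2) COMPRESS ('NE','SE','MW','WC')
)
PRIMARY INDEX (acct_id);
```

Choosing MULTISET avoids the duplicate-row check that SET tables perform on insert, which matters for high-volume loads.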
Confidential, Jacksonville, FL
Teradata/ ETL Developer
Responsibilities:
- Created tables, views in Teradata, according to the requirements.
- Involved in the analysis and implementation of their system.
- Implemented logical and physical data modeling with star and snowflake techniques using Erwin in the data mart.
- Involved heavily in writing complex SQL queries based on the given requirements.
- Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
- Created and automated the loading process using shell scripts, MultiLoad, Teradata volatile tables, and complex SQL statements.
- Performed tuning and optimization of complex SQL queries using Teradata Explain.
- Created a BTEQ script for pre population of the work tables prior to the main load process.
- Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
- Performed performance tuning of sources, targets, mappings, and SQL queries in transformations.
- Used FastLoad for loading data into empty tables.
- Created reports for end users using BTEQ reporting functionality and FastExport.
- Used volatile tables and derived queries to break complex queries into simpler ones (see the sketch at the end of this section).
- Created an archive process that archives the data files and FTPs them to the remote server.
- Created a cleanup process for removing all the intermediate temp files used prior to the loading process.
- Created a shell script that checks data files for corruption prior to the load.
- Created unit test plans to unit test the code prior to the handover process to QA.
- Involved in troubleshooting the production issues and providing production support.
- Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
- Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
- Collected statistics every week on the tables to improve performance.
- Developed unit test plans and involved in system testing.
Environment: Teradata V13, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, Erwin Designer, UNIX, MVS, Shell scripts.
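A minimal sketch of the volatile-table approach mentioned above, assuming illustrative object names rather than actual project tables:

```sql
/* Stage an intermediate result in a volatile table, then join against it;
   this splits one complex query into two simpler, tunable steps. */
CREATE VOLATILE TABLE vt_high_value_cust AS
(
    SELECT cust_id, SUM(sale_amt) AS total_amt
    FROM   mart_db.sales_fact
    GROUP  BY cust_id
    HAVING SUM(sale_amt) > 10000
)
WITH DATA
PRIMARY INDEX (cust_id)
ON COMMIT PRESERVE ROWS;

/* The reporting query now joins the small intermediate set to the dimension. */
SELECT c.cust_id, c.cust_name, v.total_amt
FROM   mart_db.customer_dim c
JOIN   vt_high_value_cust v
  ON   c.cust_id = v.cust_id
ORDER  BY v.total_amt DESC;
```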
Confidential
Teradata/ ETL Developer
Responsibilities:
- Interacted with business users for requirements gathering, analyzed the business requirements, and translated them into functional and technical specifications.
- Developed ETL programs using Informatica to implement the business requirements.
- Developed Teradata utility scripts such as FastLoad, BTEQ, FastExport, and MultiLoad to populate the data into the EDW.
- Massaged the data using BTEQ by applying the business rules for the source data for validation and transformation.
- Developed BTEQ script for pre population of the work tables prior to the main load process and performed the transformation in the later stages.
- Handled several source file formats like flat files, COBOL copy books for loading the data into the EDW.
- Performance tuning was done at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Create new mapping designs using various tools in Informatica Designer like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
- Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
- Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
- Performed application-level activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts.
- Wrote several Teradata BTEQ scripts for reporting purposes.
- Developed BTEQ scripts to load data from the Teradata staging area to the Teradata data mart (see the sketch at the end of this section).
- Actively participated in the requirement discussion from end users, Dashboard design sessions with the data/business analysts and data model sessions with the data team.
- Created dashboards at different levels of data and enabled navigation across them using hyperlinks and report links.
- Implemented different types of widgets (Map, Interactive and Time series) depending on business requirements.
- Ensure accuracy & integrity of data & applications through analysis, coding, writing clear documentation & problem resolution.
- Analyze & translate functional specifications & change requests into technical specifications.
- Generated and implemented Micro Strategy Schema objects and Application objects by creating facts, attributes, reports, dashboards, filters, metrics and templates using Micro Strategy Desktop.
- Created Transformations (table based, formula based) for using in comparative reports like sales this year to sales last year.
- Collected Multi-Column Statistics on all the non-indexed columns used during the join operations & all columns used in the residual conditions.
- Developed and tested the UNIX shell scripts for running the Teradata scripts.
- Used various Teradata Index techniques to improve the query performance.
- Created unit test plans to unit test the code prior to the handover process to QA.
- Helped users by extracting mainframe flat files (fixed-width or CSV) onto the UNIX server and then converting them into Teradata tables using Base SAS programs.
- Used SAS PROC IMPORT, DATA steps, and PROC DOWNLOAD to extract the fixed-format flat files and convert them into Teradata tables for business analysis.
Environment: Teradata V12/V13, Teradata SQL Assistant, Teradata utilities (BTEQ, MLoad, FastLoad, FastExport, and TPT), Informatica 8.6.1, DB2, SQL Server 2005/2008, SAS, Control-M, UNIX.
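As a rough illustration of the staging-to-data-mart BTEQ loads mentioned above; the logon string and all object names are assumed placeholders, not the actual project objects.

```sql
.LOGON tdprod/etl_user,password;        -- placeholder logon

/* Incremental insert/select from the staging table into the mart table. */
INSERT INTO mart_db.order_fact (order_id, cust_id, order_dt, order_amt)
SELECT  s.order_id, s.cust_id, s.order_dt, s.order_amt
FROM    stg_db.order_stg s
WHERE   s.order_dt > (SELECT COALESCE(MAX(order_dt), DATE '1900-01-01')
                      FROM mart_db.order_fact);

/* Exit with a non-zero return code on failure so the scheduler
   (e.g., Control-M) can flag the job. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```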