
Senior Teradata/ETL Consultant Resume

Richmond, VA

SUMMARY:

  • 7+ years of experience as an ETL/Teradata Developer in a data warehousing environment.
  • Experienced with different relational databases such as Teradata, Oracle, and SQL Server.
  • Developed UNIX shell scripts and used BTEQ, FastLoad, MultiLoad, TPump, TPT, Data Mover, and FastExport utilities extensively to load data into target databases.
  • Strong expertise using Informatica PowerCenter client tools - Designer, Repository Manager, Workflow Manager/Monitor - and server tools - Informatica Server and Repository Server Manager.
  • Excellent understanding and in-depth knowledge of Hadoop architecture and its various components, such as HDFS, MapReduce programming, and other ecosystem components.
  • Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
  • Hands-on experience in development of data warehouses/data marts using Ab Initio.
  • Expertise in Report formatting, Batch processing, and Data Loading and Export using BTEQ.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc purposes, to extract data for several business users on a scheduled basis.
  • Performed data analysis and data profiling using SQL on various source systems, including SQL Server 2008.
  • Understanding of Teradata MPP architecture such as Shared Nothing, Nodes, AMPs, BYNET, Partitioning, Primary Indexes etc. and 3+ years of experience in Teradata production support.
  • In-depth understanding and usage of Teradata OLAP functions. Proficient in Teradata SQL, stored procedures, macros, views, and indexes (primary, secondary, PPI, join indexes, etc.).
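As an illustration of the BTEQ extract pattern above, a scheduled ad hoc extract is often wrapped in a shell script like this sketch. The table, column, and file names are hypothetical, and the script assumes a configured Teradata client with a logon script path in $LOGON_FILE; it is not runnable outside a Teradata environment.

```shell
#!/bin/sh
# Hypothetical sketch: BTEQ export using a volatile table to stage an
# ad hoc extract for business users. Assumes Teradata client utilities
# are installed and $LOGON_FILE points to a .RUN-able logon script.

bteq <<EOF
.RUN FILE = $LOGON_FILE;

/* Volatile table lives only for this session -- no cleanup needed */
CREATE VOLATILE TABLE vt_sales_extract AS (
    SELECT store_id, sale_dt, SUM(amount) AS total_amt
    FROM   edw.daily_sales
    GROUP  BY store_id, sale_dt
) WITH DATA
ON COMMIT PRESERVE ROWS;

.EXPORT REPORT FILE = /data/extracts/sales_extract.txt
SELECT * FROM vt_sales_extract ORDER BY store_id, sale_dt;
.EXPORT RESET
.LOGOFF
.QUIT
EOF
```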

TECHNICAL SKILLS:

Operating Systems: Unix, Windows 7/XP/2000, Windows NT 4.0 and Z/OS and MVS (OS/390).

Databases: Teradata V2R5/V2R6, V12, V14, V15, Oracle (8i/9i), SQL Server 2000/2005/2008, DB2.

Teradata Tools & Utilities: Query facilities: SQL Assistant, BTEQ; Load & Export: FastLoad, MultiLoad, TPump, TPT, FastExport, Data Mover.

Data Modeling: Erwin, Visio

ETL tools: Informatica PowerCenter 9.6.1/9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, Hadoop 2.6.2 (HDFS, MapReduce, Hive, Sqoop, Pig).

Programming languages: C, C++, Shell scripting (K-shell, C-Shell), SQL.

Reporting Tools: MicroStrategy 9.2.1/9.0/8i/7i.

Scheduling tools: Control-M, Autosys, UC4.

PROFESSIONAL EXPERIENCE:

Confidential, Richmond, VA

Senior Teradata/ETL Consultant

Responsibilities:

  • Met with business/user groups to understand the business process, gather requirements, and carry out analysis, design, development, and implementation according to client requirements.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using Informatica PowerCenter and Teradata utilities.
  • Developed BTEQ Import, BTEQ Export, FastLoad, MultiLoad, and FastExport scripts and shell scripts to move data from source systems to staging and from staging to the data warehouse in batch processing mode.
  • Loaded data from different source systems into an Oracle staging area and then into dimension and fact tables using Oracle stored procedures.
  • Copied data from various flat file formats (e.g., .txt files) into staging tables and populated target tables using different ETL logic.
  • Loaded data into the staging layer by preparing only definition and parameter files and reusing existing framework scripts built on TPT.
  • Stored data in the journal layer using initial and incremental load logic.
  • Used TPT, BTEQ, and MultiLoad.
  • Implemented error handling using ET, UV, and WT tables.
  • Created a market-wise data loading framework.
  • Created Informatica deployment groups for code deployment from one environment to another.
  • Designed and developed a number of complex mappings using various transformations like Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (Connected & unconnected), Filter, Update Strategy, Stored Procedure, Sequence Generator, etc.
  • Also used journals extensively in the disaster recovery process for rollback and roll-forward operations.
  • Performed Teradata and Informatica performance tuning.
  • Extensively created and used various Teradata SET tables, MULTISET tables, global temporary tables, and volatile tables.
  • Developed various UNIX shell wrappers to run Ab Initio jobs.
  • Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks such as Command, Event Wait, Event Raise, and Email.
  • Extensively involved in data transformations, validations, extraction and loading process. Implemented various Teradata Join Types like Inner-join, outer-join, self-join and Merge-join.
  • Worked with Workflow Manager to create various tasks such as worklets, sessions, batches, Event Wait, e-mail notifications, and Decision, and to schedule jobs.
  • Performance-tuned the workflows by identifying the bottlenecks in targets, sources, mappings, sessions, and workflows, and eliminated them.
  • Involved in SQL tuning, the optimizer, indexes, table partitions, and clusters.
  • Worked exclusively with Teradata SQL Assistant to interface with Teradata.
  • Performed data validation testing by writing SQL queries.
  • Enhanced technical skills to be able to improve development and implementation of solutions while following defined best practices.
  • Performed budgeting, forecasting, financial modeling, and reporting as an essential part of the processes.
  • Performed test case execution and ad hoc testing.
  • Performed integration, end-to-end, and system testing.
  • Used Slowly Changing Dimensions Type II in various data mappings to load dimension tables in the data warehouse.
  • Implemented update strategies, incremental loads, and CDC maintenance.
  • Involved in analyzing source systems and designing the processes for extracting, transforming, and loading the data into the Teradata database.
  • Developed mapping parameters and variables to support SQL override.
  • Wrote shell scripts to perform pre-session and post-session operations.
  • Responsible for managing, scheduling, and monitoring the workflow sessions.
  • Developed Oozie workflows and scheduled them through a scheduler.
  • Moved data from different sources to HDFS and vice versa using Sqoop.
  • Performed performance tuning at the database, transformation, and job levels.
  • Involved in creating unit test plans and testing the data for various applications.
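The shell wrappers and return-code handling described above usually reduce to a pattern like this runnable sketch. The step names are hypothetical, and run_job is a stub standing in for the real pmcmd/BTEQ/MultiLoad invocation.

```shell
#!/bin/sh
# Minimal sketch of a UNIX wrapper controlling ETL flow: run each step,
# check its exit code, and abort on the first failure. run_job is a stub
# standing in for a real call (pmcmd startworkflow, bteq, mload, ...).

run_job() {
    # $1 = step name; a real wrapper would invoke the utility here and
    # pass its exit code back to the caller.
    echo "running $1"
    return 0
}

for step in stage_load journal_load mart_load; do
    if run_job "$step"; then
        echo "$step: OK"
    else
        echo "$step: FAILED" >&2
        exit 1
    fi
done
echo "ETL flow complete"
```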

Confidential, Richmond, VA

Senior Teradata/ETL Consultant

Responsibilities:

  • Interacted with business users for requirement gathering, analyzing the business requirements and translating them into functional and technical specifications.
  • Used Teradata utilities such as FastLoad, BTEQ, FastExport, and MultiLoad to populate data into the EDW.
  • Massaged the data using BTEQ by applying the business rules to the source data for validation and transformation.
  • Extensively used MultiLoad, BTEQ, FastExport, and FastLoad to design and develop data flow paths for loading, transforming, and maintaining the data warehouse.
  • Loaded data using the Teradata loader connection, writing Teradata utility scripts (FastLoad, MultiLoad) and working with loader logs.
  • Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimal column definitions.
  • Developed a high-performance semantic/presentation layer for reporting and analytics.
  • Defined and maintained metadata and data sources, and set up validation rules.
  • Designed and Implemented Tables, Functions, Stored Procedures and Triggers in SQL Server 2008.
  • Wrote SQL queries, stored procedures, and views.
  • Developed SQL Server stored procedures and tuned SQL queries (using indexes and execution plans).
  • Developed user-defined functions and created views.
  • Thorough knowledge of Teradata and Oracle RDBMS architecture.
  • Experienced in troubleshooting Teradata scripts, fixing bugs, addressing production issues, and performance tuning.
  • Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks such as Command, Event Wait, Event Raise, and Email.
  • Worked with SET and MULTISET tables for performance evaluation of the scripts.
  • Extensively created and used various Teradata SET tables, MULTISET tables, global tables, volatile tables, and temp tables.
  • Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or various join indexes.
  • Created Teradata macros and stored procedures for repeated use across various applications.
  • Created several Teradata SQL queries and several reports using the above data mart for UAT and user reporting.
  • Experienced in developing OLAP reports using Business Objects.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Implement processes and logic to extract, transform, and distribute data across one or more data stores from a wide variety of sources.
  • Optimize data integration platform to provide optimal performance under increasing data volumes.
  • Upload data into appropriate databases in accurate and timely manner.
  • Developed the vision and strategy for building the Data Integration/Data Warehouse team. Selected and developed staff to meet plans and objectives. Conducted regular team meetings to facilitate team goals.
  • Designed and developed a Global Revenue data warehouse that extracts, transforms, and loads source data from ERP source systems around the world to provide visibility into revenue data from the source.
  • Expertise in ETL processes using Informatica.
  • Tuned mappings using PowerCenter Designer and used different logic to provide maximum efficiency and performance.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Developed shell scripts for daily and weekly loads and scheduled them using the UNIX Maestro utility.
  • Modified the shell/Perl scripts as per the business requirements.
  • Used PL/SQL procedures in Informatica mappings to truncate the data in target tables at run time.
  • Created mapplets to use in different mappings.
  • Extensively involved in data transformation, validation, extraction, and loading processes. Implemented various join types such as inner join, outer join, and self join, and various join strategies such as merge join, nested join, and row hash join.
  • Imported source/target tables from the respective SAP R/3 and BW systems, created reusable transformations (Joiner, Router, Lookup, Rank, Filter, Expression, and Aggregator) inside a mapplet, and created new mappings using the Designer module of Informatica PowerCenter to implement the business logic and to load the customer healthcare data both incrementally and in full.
  • Created complex mappings using Unconnected Lookup, Aggregator, and Router transformations to populate target tables in an efficient manner.
  • Unit tested the ETL workflows & mappings, fixing the defects that came out of unit testing and, where needed, modifying the documents to ensure that they stay up to date with the functionality in the system.
  • Developed Informatica SCD Type-I and Type-II mappings. Extensively used almost all of the Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, Mapplets, and others.
  • Exposure toLarge Scale Data Integration and Performance Tuning.
  • Involved in 24x7 production support.
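The space-reduction and statistics work described above typically comes down to DDL like the following sketch, run through BTEQ. The table, columns, and compressed value lists are hypothetical examples of Teradata multi-value compression, not taken from the actual warehouse, and the script assumes a configured Teradata client.

```shell
#!/bin/sh
# Hypothetical sketch: multi-value compression plus statistics collection.
# Assumes a Teradata client environment; names are illustrative only.

bteq <<EOF
.RUN FILE = $LOGON_FILE;

/* COMPRESS stores the listed frequent values in the table header,
   shrinking row size for the most common cases */
CREATE MULTISET TABLE edw.customer_dim (
    customer_id   INTEGER NOT NULL,
    state_cd      CHAR(2)  COMPRESS ('CA','TX','NY','FL'),
    status_cd     CHAR(1)  COMPRESS ('A','I'),
    signup_dt     DATE
) PRIMARY INDEX (customer_id);

/* Collected statistics feed the optimizer's join planning */
COLLECT STATISTICS COLUMN (customer_id) ON edw.customer_dim;
COLLECT STATISTICS COLUMN (state_cd)    ON edw.customer_dim;

.LOGOFF
.QUIT
EOF
```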

Environment: Teradata V13/14, Teradata Utilities, Teradata SQL Assistant 12, Informatica Power Center 9.1.0, Flat files, Oracle 11g/10g, MS SQL Server 2008, Autosys, SQL, Shell Programming, Toad, Excel and Unix scripting, Windows 2002.

Confidential

Teradata/ETL Developer

Responsibilities:

  • Interacted with business users for requirement gathering, analyzing the business requirements and translating them into functional and technical specifications.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Used Teradata utilities such as FastLoad, BTEQ, FastExport, and MultiLoad to populate data into the EDW.
  • Massaged the data using BTEQ by applying the business rules to the source data for validation and transformation.
  • Developed BTEQ scripts for pre-population of the work tables prior to the main load process, and performed the transformations in the later stages.
  • Handled several source file formats, such as flat files and COBOL copybooks, for loading the data into the EDW.
  • Performed performance tuning at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Create new mapping designs using various tools in Informatica Designer like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.
  • Performed application-level activities: creating tables and indexes, and monitoring and tuning Teradata BTEQ scripts.
  • Wrote several Teradata BTEQ scripts for reporting purposes.
  • Developed BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
  • Actively participated in requirement discussions with end users, dashboard design sessions with the data/business analysts, and data model sessions with the data team.
  • Created dashboards at different levels of data and enabled navigation across them using hyperlinks and report links.
  • Implemented different types of widgets (Map, Interactive and Time series) depending on business requirements.
  • Ensure accuracy & integrity of data & applications through analysis, coding, writing clear documentation & problem resolution.
  • Analyze & translate functional specifications & change requests into technical specifications.
  • Generated and implemented MicroStrategy schema objects and application objects by creating facts, attributes, reports, dashboards, filters, metrics, and templates using MicroStrategy Desktop.
  • Created transformations (table based, formula based) for use in comparative reports such as sales this year vs. sales last year.
  • Collected multi-column statistics on all the non-indexed columns used during join operations and all columns used in residual conditions.
  • Developed and tested the UNIX shell scripts for running the Teradata scripts.
  • Used various Teradata index techniques to improve query performance.
  • Created unit test plans to unit test the code prior to the handover to QA.
  • Helped users by extracting mainframe flat files (fixed or CSV) onto a UNIX server and then converting them into Teradata tables using Base SAS programs.
  • Used SAS PROC IMPORT, DATA steps, and PROC DOWNLOAD to extract the fixed-format flat files and convert them into Teradata tables for business analysis.

Environment: Teradata V12/V13, Teradata SQL Assistant, Teradata Utilities (BTEQ, Mload, FastLoad, Fast Export and TPT), Informatica 8.6.1,DB2, SQL server 2005/2008, SAS, Control-M, UNIX.
