Sr. Teradata Consultant Resume

PROFESSIONAL SUMMARY

  • 7+ years of experience in data warehousing, with proficiency as a Teradata developer and strong expertise in SQL queries, stored procedures, Teradata macros, and big data ecosystem technologies such as Hadoop, Pig, and Hive.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Expertise in writing large, complex queries using PL/SQL.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.
  • Excellent experience with ETL tools such as Informatica and with implementing Slowly Changing Dimensions.
  • Expert in ETL design/development using Informatica and DataStage.
  • Experience creating dashboards with consistent analytics using the advanced analysis features of Tableau.
  • Loaded data into Teradata from flat files using Informatica dynamic sequence jobs with the schema file concept.
  • Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, and Flume, and knowledge of the MapReduce/HDFS framework.
  • Designed and implemented Hive and Pig UDFs for evaluating, filtering, loading, and storing data.
  • Created Hive tables as internal or external tables per requirements, defined with appropriate static and dynamic partitions for efficiency (see the sketch after this list).
  • Wrote scripts to generate MapReduce jobs and performed ETL procedures on data in HDFS.
  • Developed the semantic layer for reporting tools such as SAP BusinessObjects, MicroStrategy, and Cognos.
  • Built complex denormalized views as part of semantic layer design and development.
  • Tuned complex queries and improved the performance of semantic views.
  • Excellent experience with indexes (PI, SI, JI, AJI, PPI) and COLLECT STATISTICS.
  • Good knowledge of the Teradata Columnar physical design technique.
  • In-depth hands-on experience in database and ETL/ELT design and development, with excellent data analysis skills.
  • Familiar with using set, multiset, derived, volatile, and global temporary tables in Teradata for larger ad hoc SQL requests.
  • Strong data warehousing, data mart, and data analysis experience with RDBMS databases.
  • Extensively worked with BusinessObjects Designer, reporting, and Crystal Reports.
  • Performed data validation, data integrity, and data quality checks using Oracle and Teradata before delivering data to operations, business, and financial analysts.
  • Experience creating design documentation related to system specifications, including user interfaces, security and control, performance requirements, and data conversion.
  • Experienced working with both 3NF and dimensional models for data warehouses, with a good understanding of OLAP/OLTP systems.
  • Proficient in preparing high- and low-level documents such as design and functional specifications.
  • Actively involved in quality processes and release management activities to establish, monitor, and streamline quality processes in the project.
  • Involved in requirements gathering, software documentation, and specification of artifacts in the software development process.
  • Familiar with artifacts such as use cases and UML models for designing and describing software.
  • Involved in the full lifecycle of various projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance, and support.
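
A minimal HiveQL sketch of the internal/external table and partitioning pattern mentioned above; the table, column, and path names are hypothetical examples, not actual project objects:

    -- External table over ORC files in HDFS, partitioned by load date
    CREATE EXTERNAL TABLE IF NOT EXISTS sales_raw (
        order_id    BIGINT,
        customer_id BIGINT,
        amount      DECIMAL(12,2)
    )
    PARTITIONED BY (load_dt STRING)
    STORED AS ORC
    LOCATION '/data/warehouse/sales_raw';

    -- Dynamic-partition insert from a staging table
    SET hive.exec.dynamic.partition.mode=nonstrict;
    INSERT OVERWRITE TABLE sales_raw PARTITION (load_dt)
    SELECT order_id, customer_id, amount, load_dt
    FROM sales_stage;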

TECHNICAL SKILLS

Primary Tools: Informatica, Teradata SQL, Teradata Tools and Utilities, Oracle, MS SQL Server, DB2

Languages: Teradata SQL, PL/SQL, Java, HiveQL

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, Teradata Manager

Databases: Teradata 14/13.10/13/12/V2R6.2, Netezza, Oracle 10g/9i, HDFS

Hadoop ecosystem: HDFS and MapReduce, Sqoop, Hive, Pig

Operating Systems: Windows, UNIX

Reporting tools: Business Objects, Cognos, Crystal Reports, OBIEE

Other tools: TOAD, EditPlus, TestDirector, Mercury Quality Center, MS Office, MS Excel, MS Access, MS Word, Requisite Pro, Live Link, Tivoli, Rational Unified Process (RUP), Waterfall, Agile and FDD methodologies, ClearQuest, ClearCase

PROFESSIONAL EXPERIENCE

Confidential, San Jose, CA

Sr. Teradata Consultant

Responsibilities:

  • Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
  • Implemented logical and physical data modeling for the data mart with star and snowflake schema techniques using Erwin.
  • Involved in developing use cases and workflow diagrams in the analysis, design, and development phases.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Responsible for collecting statistics (COLLECT STATISTICS) on fact tables.
  • Created proper Primary Indexes, taking into consideration both planned access of data and even distribution of data across all available AMPs.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Created tables and views in Teradata according to the requirements.
  • Worked on generating various dashboards in Tableau Server using different data sources, including Teradata.
  • Combined views and reports into interactive dashboards in Tableau Desktop, presented to business users, program managers, and end users.
  • Provided architecture/development for initial load programs to migrate production databases from Oracle data marts to Teradata data warehouse, as well as ETL framework to supply continuous engineering and manufacturing updates to the data warehouse (Oracle, Teradata, MQ Series, ODBC, HTTP, and HTML).
  • Worked with different file formats such as TEXTFILE, AVRO, ORC, and PARQUET for Hive querying and processing.
  • Worked on Hive joins to produce the input data set for the QlikView model.
  • Performed transformations, cleaning, and filtering on imported data using MapReduce, and loaded the final data into HDFS and Hive.
  • Migrated/converted data from Oracle to the Teradata DW using Oracle Data Pump/OWB, OLE DB, and DTS.
  • Worked on advanced Informatica concepts and implemented DataStage pushdown optimization and pipeline partitioning.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) into the Teradata RDBMS using BTEQ, MultiLoad, and FastLoad.
  • Used Informatica Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Responsible for formulating the DW process to load from sources to target tables.
  • Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks such as command, event wait, event raise, and email.
  • Designed, created, and tuned physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models, maintaining the referential integrity of the database (see the sketch after this list).
  • Used volatile tables and derived queries to break up complex queries into simpler queries.
  • Involved in analysis of end-user requirements and business rules based on the given documentation, working closely with tech leads and analysts to understand the current system.
  • Developed UNIX scripts to automate different tasks involved as part of loading process.
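
A minimal Teradata SQL sketch of the physical design work described above (a primary index chosen for even AMP distribution, range partitioning for date access, and collected statistics); all object names are hypothetical examples:

    -- Fact table with a primary index for even distribution and monthly range partitioning (PPI)
    CREATE MULTISET TABLE sales_fact
    (
        sale_id    BIGINT NOT NULL,
        store_id   INTEGER NOT NULL,
        sale_date  DATE FORMAT 'YYYY-MM-DD' NOT NULL,
        sale_amt   DECIMAL(12,2)
    )
    PRIMARY INDEX (sale_id)
    PARTITION BY RANGE_N (sale_date BETWEEN DATE '2013-01-01'
                          AND DATE '2014-12-31' EACH INTERVAL '1' MONTH);

    -- Statistics to support the optimizer's plan choices
    COLLECT STATISTICS ON sales_fact COLUMN (sale_id);
    COLLECT STATISTICS ON sales_fact COLUMN (sale_date);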

Environment: Teradata 13 & 14, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Hadoop, HDFS, Pig, Hive, Erwin Designer, Informatica, Cognos, Tableau Desktop 8, UNIX, Korn Shell scripts.

Confidential, Framingham, MA

Sr. Teradata Consultant

Responsibilities:

  • Involved in the complete software development lifecycle (SDLC), from business analysis to development, testing, deployment, and documentation.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPT to load data.
  • Wrote BTEQ scripts to transform data.
  • Wrote FastExport scripts to export data.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts) and Perl scripts.
  • Conducted Knowledge Transfer (KT) sessions for Business Users and Production Support Team.
  • Involved in creating Report Design Documentation and Universe Design Documentation for Support Team.
  • Created proper Teradata Primary Indexes (PI), taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering the business requirements and these factors, created appropriate Teradata NUSIs for fast and easy access to data (see the sketch after this list).
  • Designed the mappings from sources (external files and databases) to operational staging targets.
  • Performed performance tuning of Teradata SQL statements using the Teradata EXPLAIN command.
  • Extracted data from Teradata, processed/transformed it using KSH programs, and loaded it into the data mart.
  • Used various Teradata Index techniques to improve the query performance.
  • Wrote views based on user and/or reporting requirements.
  • Performance tuned and optimized various complex SQL queries.
  • Wrote many UNIX scripts.
  • Ensured that all applications moved into the production environment met the standards set by the team and had no performance issues.
  • Moved about 40 applications into the production environment after performing the required checks and change requests.
  • Interacted with DBAs about SQL tuning issues and implemented changes in the scripts per their recommendations.
  • Worked on data warehouses ranging from 30 to 50 terabytes in size.
  • Coordinated with business analysts and developers to discuss issues in interpreting the requirements.
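
A minimal Teradata SQL sketch of the NUSI and EXPLAIN-based tuning described above; the table, column, and index names are hypothetical examples:

    -- Non-unique secondary index (NUSI) on a frequently filtered column
    CREATE INDEX idx_claim_status (claim_status) ON claims_stage;

    COLLECT STATISTICS ON claims_stage COLUMN (claim_status);

    -- Verify that the optimizer actually uses the NUSI for this access path
    EXPLAIN
    SELECT claim_id, claim_status, claim_amt
    FROM   claims_stage
    WHERE  claim_status = 'OPEN';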

Environment: Teradata 13, Teradata Studio, UNIX shell scripting, Tivoli, Ab Initio, MS SQL Server, Oracle, DB2, SAP BW and Business Objects

Confidential, Atlanta, GA

Teradata Developer, Test Analyst

Responsibilities:

  • Involved in the complete software development lifecycle (SDLC), from business analysis to development, testing, deployment, and documentation.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPT to load data.
  • Wrote BTEQ scripts to transform data.
  • Wrote FastExport scripts to export data.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts).
  • Conducted Knowledge Transfer (KT) sessions for Business Users and Production Support Team.
  • Involved in creating Report Design Documentation and Universe Design Documentation for Support Team.
  • Created proper Teradata Primary Indexes (PI), taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering the business requirements and these factors, created appropriate Teradata NUSIs for fast and easy access to data.
  • Designed the mappings from sources (external files and databases) to operational staging targets.
  • Performed performance tuning of Teradata SQL statements using the Teradata EXPLAIN command.
  • Extracted data from Teradata, processed/transformed it using KSH programs, and loaded it into the data mart.
  • Used various Teradata Index techniques to improve the query performance.
  • Wrote views based on user and/or reporting requirements (see the sample view after this list).
  • Performance tuned and optimized various complex SQL queries.
  • Wrote many UNIX scripts.
  • Ensured that all applications moved into the production environment met the standards set by the team and had no performance issues.
  • Moved about 40 applications into the production environment after performing the required checks and change requests.
  • Interacted with DBAs about SQL tuning issues and implemented changes in the scripts per their recommendations.
  • Worked on data warehouses ranging from 30 to 50 terabytes in size.
  • Coordinated with business analysts and developers to discuss issues in interpreting the requirements.
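
A minimal Teradata SQL sketch of a reporting view of the kind mentioned above; the database, table, and column names are hypothetical examples:

    -- Denormalized reporting view; LOCKING ROW FOR ACCESS avoids blocking concurrent loads
    REPLACE VIEW rpt.v_order_detail AS
    LOCKING ROW FOR ACCESS
    SELECT  o.order_id,
            o.order_date,
            o.order_amt,
            c.customer_name,
            c.region
    FROM    dw.orders o
    JOIN    dw.customers c
      ON    c.customer_id = o.customer_id;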

Environment: Teradata Utilities, UNIX shell scripting, Tivoli, Ab Initio, MS SQL Server, Oracle, DB2, SAP BW and Business Objects

Confidential, Jacksonville, FL

Sr. Teradata Developer

Responsibilities:

  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and transformation, data loading, SQL, and performance tuning.
  • Worked with different data sources, including SAP, MDB, Teradata, flat files, XML, Oracle, and SQL Server databases.
  • Created PL/SQL stored procedures (dynamic procedures), functions, triggers, packages, and cursors for the Sales Analysis data mart; these objects extract data from SAP and Teradata and store it in the Oracle backend.
  • Worked on lifecycle development including design, ETL strategy, troubleshooting, and reporting; identified facts and dimensions.
  • Proficient in writing MultiLoad, FastLoad, and TPump scripts from Windows, UNIX, and mainframe environments.
  • Expertise in writing Teradata procedures using BTEQ and SQL assistant.
  • To implement the Type 2 process across multiple tables, created a dynamic procedure using a metadata layer that inserts/updates the tables on the fly (see the sketch after this list).
  • Fine-tuned existing Teradata procedures, macros, and queries to improve performance.
  • Redesigned table structures that were skewed onto a single AMP and reorganized the primary index mechanism.
  • Extensively used Data Integrator/Oracle and created mappings using transformations such as Case, Map Operation, Merge, and Pivot, flagging records with the update strategy to populate the desired slowly changing dimension tables.
  • When designing data marts, identified entity types and attributes, applied naming conventions and data model patterns, identified relationships, assigned keys, normalized to reduce data redundancy, and denormalized to improve performance.
  • Utilized the best practices for the creation of mappings and used transformations like Query, Key Generation and Date Generation.
  • Upgraded the Autosys scheduler from 4.0 to 4.5 and upgraded Business Objects Data Integrator directly from 6.1 to 11.5.1.18, which dynamically generates loader scripts such as MLoad, FastLoad, and TPump.
  • Used Teradata Parallel Transporter (TPT) to simultaneously load data into, and extract data from, the Teradata database across multiple, dissimilar data sources.
  • Designed and developed a process that takes quality data sent from around the world and loads the data tables used to ship product, as well as analyzing data to improve manufacturing yields and intervals. This process uses shell scripts and Perl programs that determine the table to be loaded and dynamically generate the MLoad and BTEQ utilities to load the information.
  • Involved in the migration of Oracle to Teradata.
  • Used external tables for flat files for faster processing.
  • Wrote stored procedures for complex calculations and for faster processing of bulk data volumes.
  • Involved in technical writing.
  • Fine-tuned existing SQL queries for reports and ETL jobs to reduce processing time, and reduced the number of procedures by creating dynamic procedures.
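
A minimal Teradata SQL sketch of a single Type 2 step of the kind described above (expire the current dimension row, then insert the new version); the table and column names are hypothetical examples, and the actual implementation generated such statements dynamically from a metadata layer:

    -- Close out the current version of the changed row
    UPDATE dim_customer
    SET    end_dt = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  customer_key = 1001
      AND  current_flag = 'Y';

    -- Insert the new version with an open-ended effective date range
    INSERT INTO dim_customer
        (customer_key, customer_name, region, start_dt, end_dt, current_flag)
    SELECT customer_key, customer_name, region, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer
    WHERE  customer_key = 1001;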

Environment: Teradata V2R5, Oracle 9i/10g, DB2, Business Objects Data Integrator 6.1/XIr2, Teradata Visual Explain, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Rational ClearCase, UNIX, MQ, NDM, FTP, UNIX shell scripts, Windows.

Confidential

Teradata Developer

Responsibilities:

  • Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts).
  • Wrote views based on user and/or reporting requirements.
  • Wrote Teradata macros and used various Teradata analytic functions (see the sketch after this list).
  • Involved in migration projects moving data from data warehouses on Oracle/DB2 to Teradata.
  • Performance tuned and optimized various complex SQL queries.
  • Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
  • Gathered system design requirements and designed and wrote system specifications.
  • Excellent knowledge of ETL tools such as DataStage and SAP BODS, making various connections to load and extract data to and from Teradata efficiently.
  • Backed up all KSH, BTEQ, and TPT data and log files from time to time (support work).
  • Daily duties included transferring and converting data from one platform to another for further analysis.
  • Highly proficient in writing loader scripts such as BTEQ, MultiLoad, FastLoad, TPT, and FastExport.
  • Understood the business point of view to implement coding using Informatica PowerCenter Designer.
  • Experience with high-volume datasets from various sources such as Oracle, text files, Netezza relational tables, and XML targets.
  • Built and automated interfaces with the Informatica ETL tool and UNIX shell scripting.
  • Experienced in using Informatica integrated with shell programming.
  • Integrated all jobs using complex mappings, including mapplets and workflows, with Informatica PowerCenter Designer and Workflow Manager.
  • Performed ad hoc analysis and data pulls for business users whenever needed.
  • Worked with huge data sets of more than 30 million rows.
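
A minimal Teradata SQL sketch of a macro using an analytic (window) function, as mentioned above; the macro, table, and column names are hypothetical examples:

    -- Macro returning the top 10 customers by sales amount for a given region
    REPLACE MACRO mac_top_customers (in_region VARCHAR(10)) AS
    (
        SELECT customer_id,
               sales_amt,
               RANK() OVER (ORDER BY sales_amt DESC) AS sales_rank
        FROM   sales_summary
        WHERE  region = :in_region
        QUALIFY RANK() OVER (ORDER BY sales_amt DESC) <= 10;
    );

    -- Usage
    EXEC mac_top_customers ('WEST');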

Environment: Teradata, Informatica, Oracle, SQL, PL/SQL, Oracle Business Intelligence, UNIX Shell Scripting, Tivoli and CRON jobs.
