Teradata Developer Consultant Resume

Woonsocket, RI

SUMMARY

  • 7+ years of experience designing, developing, implementing, and supporting Data Warehousing/ETL/BI/Business Analytics solutions in the Talent Acquisition, Healthcare Retail, Financial Services, and Manufacturing domains.
  • Strong experience handling large-scale data (70-75 TB) in an MPP architecture using Teradata. Also worked on other databases: Oracle, MySQL, PostgreSQL.
  • Extensive experience in data modeling, writing and tuning complex SQL queries/scripts, and creating ad-hoc reports and daily, monthly, quarterly, and yearly aggregates.
  • Experienced in designing, developing, tuning, and managing ETL pipelines using SQL, Hive SQL, and ETL toolsets (Pentaho Data Integration (Kettle), Informatica PowerCenter 9.x, Informatica Data Explorer) with source systems such as APIs, HDFS, relational tables, SAP, text/Excel files, and XML files.
  • Expertise in the concepts of Data Warehousing, Dimensional Modeling, ER Modeling, Fact and Dimension Tables, error handling, and re-startability of batch jobs.
  • Hands-on scripting experience in Unix Shell Scripting (Bash and Korn) and Python 2 for automation, DML/DDL parsing, retrieving data, FTPing files from remote servers, implementing file-watcher mechanisms, backing up repositories and folders, and merging and polishing files for downstream ETL processes.
  • Experienced in the Tableau 8.x/9.x tool set (Desktop, Server, Reader). Skilled in Desktop for rich data visualization, dashboards, reporting, and analysis: creating calculations, metrics, scorecards, attributes, filters, prompts, drills, search, interactive dashboards, data blending, and formatting.
  • Exposure to all SDLC phases, with experience working in an Agile environment using the Scrum framework with cross-functional teams in an onshore-offshore model.
  • Exposure to the Hadoop ecosystem and technologies such as HDFS, Hive, Hive SQL, and Sqoop.

Areas of Expertise

  • Data Warehousing
  • Dimensional Modeling
  • ETL pipelines design and development
  • Data Analysis
  • Data Management
  • Data Pipeline Management
  • Report Creation
  • Data Visualization
  • Data Cleansing
  • Data Profiling
  • Data Migration
  • Requirement Gathering
  • Quality Assurance
  • Agile Methodologies
  • Documentation
  • Production issues Analysis

TECHNICAL SKILLS

  • SQL
  • Teradata
  • Unix Shell Scripting (Bash and Korn)
  • PostgreSQL
  • MySQL
  • Oracle
  • Tableau
  • Informatica Powercenter
  • Pentaho Data Integration (Kettle)
  • HIVE SQL
  • Informatica Data Explorer
  • Informatica IDQ
  • Python2
  • Pentaho Business Analytics
  • Visio
  • Erwin
  • Airflow
  • Control-M
  • Job Scheduler
  • JIRA
  • GIT
  • HDFS
  • SQOOP

PROFESSIONAL EXPERIENCE

Confidential, Sunnyvale CA

Data-warehouse Engineer Consultant

Responsibilities:

  • Designed and set up a presentation database layer for analytical teams, refreshed daily with the latest data using UNIX shell scripting and the PostgreSQL pg_dump utility.
  • Created data models and daily, monthly, quarterly, and yearly aggregates in PostgreSQL and Teradata to build reports and visualizations.
  • Developed and tuned ETL pipelines to populate the data mart from APIs, Hive tables, relational tables (PostgreSQL, Teradata), text/Excel files, and XML files using SQL, Unix shell scripting, the Pentaho PDI (Kettle) ETL solution, Sqoop scripts, and Python.
  • Worked closely with business users, data scientists, project managers, and SMEs to analyze requirements and build test datasets that answer key business questions.
  • Evaluated datasets for accuracy and quality. Identified and solved data management issues to improve data quality.
  • Improved foundational data procedures, guidelines, and standards.
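The daily refresh described above relied on the pg_dump utility driven from scripts. A minimal Python sketch of how such dump/restore commands could be assembled is shown below; the DSNs, schema name, and file path are hypothetical placeholders, not the actual production configuration:

```python
def build_refresh_cmds(source_dsn, target_dsn, schema):
    """Assemble pg_dump / pg_restore command lines for a daily
    presentation-layer refresh. All names here are illustrative."""
    dump_file = f"/tmp/{schema}.dump"
    # Dump one schema from the source database in custom format
    dump_cmd = ["pg_dump", "--dbname", source_dsn,
                "--schema", schema,
                "--format", "custom",
                "--file", dump_file]
    # Drop and re-create the objects in the target before restoring
    restore_cmd = ["pg_restore", "--dbname", target_dsn,
                   "--clean", "--if-exists", dump_file]
    return dump_cmd, restore_cmd
```

In practice the two command lists would be passed to `subprocess.run` from a scheduled job, with the restore gated on the dump succeeding.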

Confidential, Woonsocket RI

Teradata Developer Consultant

Responsibilities:

  • Worked with cross-functional teams, the Architect, and the client during all stages of the project, covering the complete Software Development Life Cycle: design, development, testing, deployment, and documentation.
  • Created new dimension data models in Teradata and wrote complex Teradata SQL queries using joins, subqueries, indexes, views, join indexes, PPI, and secondary indexes for data access in ETL jobs and regression test scripts.
  • Tuned existing code to handle large datasets efficiently.
  • Worked with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPT, and FastExport for batch processing to load/unload data into/from Teradata tables.
  • Created new, and refactored/tuned existing, ETL jobs using Informatica PowerCenter, SQL, Unix shell scripting, and Informatica IDQ to extract data from flat files, relational tables, and XML files and to transform/load the data into target tables.
  • Created shell routines for process automation: starting/scheduling ETL jobs, FTPing files from remote servers, implementing a file-watcher mechanism, backing up repositories and folders, etc.
  • Involved in deployment rollouts for data validation purposes.
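The file-watcher mechanism mentioned above is, at its core, a polling loop that blocks a batch job until an expected input file lands. The sketch below is a generic illustration of the idea (in Python rather than the original shell), not the production script:

```python
import os
import time

def wait_for_file(path, timeout_s=60, poll_s=1):
    """Poll until `path` exists or the timeout expires.

    Returns True if the file appeared in time, False otherwise.
    A real batch job would fail or page on a False result.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll_s)
    return False
```

A scheduler (e.g. Control-M or cron) would call this before kicking off the load step that consumes the file.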

Confidential, Indianapolis IN

Teradata Developer Consultant

Responsibilities:

  • Involved in the complete Software Development Life Cycle (SDLC), from business analysis through development, testing, deployment, and documentation.
  • Analyzed the region's existing OLTP and BI systems and identified the dimensions and facts required for the EDW. Participated in designing the Dimensional Model for the EDW.
  • Drafted the design in the SDD for the EDW based on the BRD and architectural documents.
  • Created complex ETL flows using Informatica PowerCenter to extract and transform data from flat files, SAP R/3 tables, and relational tables (Oracle and Teradata).
  • Wrote macros, packages, functions, triggers, and SQL queries to support ETL mappings.
  • Wrote, tested, and implemented Teradata BTEQ, FastLoad, and MultiLoad scripts, along with DML and DDL, to load data.
  • Designed and implemented a CDC (Change Data Capture) mechanism using mapping variables.
  • Also implemented SCD1 and SCD2 (Slowly Changing Dimensions Type 1 and Type 2).
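The SCD Type-2 pattern mentioned above (expire the current dimension row on an attribute change and insert a new current version) can be illustrated with a self-contained sketch. SQLite stands in for Teradata here, and the single-attribute table layout is a simplified example, not the actual dimension model:

```python
import sqlite3

def apply_scd2(conn, key, new_attrs, load_date):
    """Type-2 merge: if the tracked attributes changed, close out the
    current row (end_date, is_current=0) and insert a new current row."""
    cur = conn.execute(
        "SELECT attrs FROM dim_customer "
        "WHERE cust_id = ? AND is_current = 1", (key,))
    row = cur.fetchone()
    if row is not None and row[0] == new_attrs:
        return  # nothing changed; keep the current version
    if row is not None:
        conn.execute(
            "UPDATE dim_customer SET is_current = 0, end_date = ? "
            "WHERE cust_id = ? AND is_current = 1", (load_date, key))
    conn.execute(
        "INSERT INTO dim_customer "
        "(cust_id, attrs, start_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (key, new_attrs, load_date))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer ("
             "cust_id INTEGER, attrs TEXT, start_date TEXT, "
             "end_date TEXT, is_current INTEGER)")
apply_scd2(conn, 1, "gold", "2020-01-01")       # first load: one current row
apply_scd2(conn, 1, "platinum", "2020-06-01")   # change: old row expired, new row current
rows = conn.execute("SELECT attrs, is_current FROM dim_customer "
                    "ORDER BY start_date").fetchall()
```

After the two loads, `rows` holds the expired "gold" version and the current "platinum" version, preserving full history.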

Confidential, Akron OH

Data warehouse Developer consultant

Responsibilities:

  • Parsed high-level design specifications and created low-level design documents.
  • Designed ETL jobs to load data into staging tables and then into dimensions and facts.
  • Developed and implemented ETL workflows using Informatica PowerCenter to extract data from flat files, relational databases (Teradata, Oracle), and SAP R/3 tables.
  • Developed ABAP programs for the SAP extracts and performed data cleansing.
  • Worked with BTEQ, FastLoad, and MultiLoad to load data into Teradata target tables.
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Wrote complex Teradata SQL queries using joins, subqueries, indexes, views, join indexes, PPI, and secondary indexes for data access, log manipulation, etc.
  • Analyzed production issues and implemented fixes as enhancements to the system.

Confidential

Oracle Developer

Responsibilities:

  • Coordinated with the Business Analyst and Architect on requirement analysis and translated the requirements into a functional database design.
  • Developed packages, stored procedures, functions, and triggers to perform calculations and implement business logic.
  • Implemented triggers, views, synonyms, hints, table partitioning, global temporary tables, materialized views, collections, and ref cursors.
  • Performed database validations during production deployments and provided post-deployment warranty support.
  • Created shell scripts to automate various processes: FTPing files to and from third-party servers, file watching, checking memory availability on servers, preparing/cleansing flat files for database loads, etc.
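The server-capacity check mentioned above can be sketched with Python's standard library. This mirrors the idea of the original shell checks (alert when free space runs low) rather than reproducing the actual scripts; the threshold is an illustrative value:

```python
import shutil

def free_space_pct(path="/"):
    """Return free disk space on the filesystem holding `path`
    as a percentage of its total capacity."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.free / usage.total

def capacity_ok(path="/", min_free_pct=10.0):
    """True if the filesystem still has at least `min_free_pct`
    percent free; a batch wrapper would abort or alert on False."""
    return free_space_pct(path) >= min_free_pct
```

A load job could call `capacity_ok` on its staging directory before pulling large flat files from the FTP server.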
