
Teradata/Big Data ETL Developer Resume


CT, USA

SUMMARY

  • Overall 8+ years of experience in the IT industry with a strong background in software development and Business Intelligence solutions in data warehousing.
  • Excellent understanding of the full software development life cycle (SDLC - Agile, Waterfall) of the ETL process, including requirement analysis, design, development, support of testing and migration to production.
  • Adept at working on multiple projects across different technologies: Mainframes, Teradata and Hadoop.
  • Experience in the BFS (Confidential Services) domain with a good understanding of core banking.
  • Strong Knowledge of Data Warehousing and Business Analytics.
  • Extensive knowledge of Structured Query Language (SQL) and Teradata.
  • Worked with the Teradata RDBMS and utilities such as TPT, FastLoad, MultiLoad, TPump, BTEQ and FastExport, and tools such as SQL Assistant.
  • Experience with advanced ETL techniques including data validation and Change Data Capture (CDC).
  • Extensive experience in data ingestion from a variety of sources using DMeXpress transformations such as Aggregate, Copy, Join, Merge, Filter, Reformat, Partition, MapReduce, etc., to provision target systems like Teradata/Hive.
  • Experienced in implementing Big Data technologies - the Hadoop ecosystem/HDFS/MapReduce framework, Sqoop and the Hive data warehousing tool.
  • In-depth understanding/knowledge of Hadoop architecture and various components such as HDFS, Application Master, Node Manager, Resource Manager, NameNode, DataNode and MapReduce concepts.
  • Good experience in shell scripts for DMeXpress ETL/MapReduce tasks, Autosys scheduling, and CA-7 setup for mainframe JCL jobs.
  • Experience with code migration between repositories and folders using the TortoiseSVN Subversion client.
  • Extensive experience in IMS, DB2, VSAM, JCL, COBOL, mainframe utilities (Easytrieve, FM, NDM), scheduling tools (ZEKE, CA7), change tools (Changeman, Endevor, Panvalet) and MQ.
  • Well experienced in creating deliverables such as the Business Requirement Document (BRD), Requirement Traceability Document, High-Level Design Document (HLD), test cases, etc.
  • Experienced in managing project delivery from concept to post production validation.
  • Worked as an on-call production specialist supporting applications running on Cloudera Hadoop and Teradata, and analyzed and resolved incidents raised by application users by priority (low, medium and high) through enterprise support tickets.
  • Good experience in Excel, Microsoft Word, PowerPoint and Visio.
  • Quick learner with good analytical and communication skills who ensures synergy in a team environment.
  • Experienced in source data quality analysis, defining of quality criteria and data governance.
  • Extensive experience in data transformation, data mapping from source to target database schemas.
  • Experience in working for regulatory and compliance projects.
  • Experienced in the onshore-offshore model, handling multiple projects in different technologies simultaneously.

TECHNICAL SKILLS

  • ETL Tools: Teradata, Mainframe, Syncsort DMeXpress-H
  • Big Data Ecosystem: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, HBase, ZooKeeper, YARN
  • Bug Tracking Tools: Quality Center, JIRA
  • Versioning Tools: Endevor, Changeman, Subversion, Git (Bitbucket), Panvalet
  • Reporting Tools: Tableau
  • Scheduling Tools: Autosys, CA7, ZEKE
  • Other Tools: SQL Assistant, File Manager, SSH Tectia, SuperPuTTY, WinSCP, MS Office
  • Database: Teradata, IMS, DB2, VSAM
  • Scripting Languages: Unix Shell scripting
  • Programming languages: Python, SQL, JCL, COBOL, Easytrieve, HQL
  • Operating Systems: Windows 9x/NT/XP/Vista/7/8.1/10, UNIX, Confidential z/OS
  • Data Modeling: Microsoft Visio, Erwin

PROFESSIONAL EXPERIENCE

Teradata/Big Data ETL Developer

Confidential, CT, USA

Responsibilities:

  • Analyzing and understanding the business requirements and verifying the Business Requirement Document and Technical Design Document against requirements.
  • Analyzing the requirement documents and preparing the estimates.
  • Involved in preparation of the low-level design document, coding, testing, peer review and code walkthroughs.
  • Analyzing the source data coming from different sources and working with business users and developers to design the DW model.
  • Performed initial data profiling and matched/removed duplicate data.
  • Working closely with user decision makers to develop the transformation logic used in the DMeXpress tool/Teradata.
  • Implemented data quality rules for data cleansing, parsing and standardization processes to improve completeness, conformity and consistency issues identified in the profiling phase.
  • Scheduling the workflows and monitoring them. Provided proactive production support after go-live.
  • Worked extensively with the architect to create and modify the databases, archival databases, tables and views using Erwin as part of data modeling, and created and modified Teradata users, roles, macros and stored procedures along with the DBA in the operational and analytical platforms.
  • Developed the ETL code components using Teradata FastLoad, MultiLoad, TPump, TPT, BTEQ, FastExport, mainframe JCL, COBOL and Endevor.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN plans and COLLECT STATISTICS (a minimal BTEQ sketch follows this list).
  • Generated ad-hoc reports for the business using several complex queries (SQL and HQL) and Easytrieve.
  • Performed several data correction activities to ensure accuracy and consistency of historic data stored in tables.
  • Moved history data from the archive platform to the production platform through the Data Mover utility with the help of the DBA, and ensured that the data in the analytical and operational platforms stayed in sync per the respective retention periods.
  • Performed extensive catch-up of several deposit applications for close to a month following an operational data platform outage.
  • Analyzed user tickets, provided solutions to user queries, and performed root cause analysis in case of issues.
  • Experience in testing Data Marts, Data Warehouse/ETL Applications developed in mainframe/Teradata.
  • Created ETL test data for all transformation rules and covered all the scenarios required for implementing business logic.
  • Reviewed business requirements documents with the test team to provide insights into the data scenarios and test cases, and provided approval on test plan and test script documentation.
  • Interfacing with and supporting QA/UAT groups to validate functionality.
  • Coordinated with upstream and downstream teams to set up the NDM configuration to push/pull files.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs for data cleaning, preprocessing and loading large sets of structured data.
  • Extracted data from shared locations and legacy systems, placed it in HDFS and processed it.
  • Involved in loading data from the UNIX file system to HDFS (see the ingestion sketch after this list).
  • Experienced in managing and reviewing Hadoop log files and supported MapReduce programs running on the cluster.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Involved in creating Hive tables and loading data into them, on which dashboard reports were created using Tableau.
  • Transformed and loaded large sets of structured and semi-structured data using Hive.
  • Involved in designing and creating Hive external tables and optimized query performance through partitioning and bucketing techniques.
  • Worked as an on-call production specialist supporting applications running on Cloudera Hadoop and Teradata, fixing job abends, and analyzed and resolved incidents raised by application users by priority (low, medium and high) through enterprise support tickets.
  • Extensively worked on DMeXpress tool transformations such as Aggregate, Copy, Join, Merge, Filter, Reformat, Partition, MapReduce, etc.
  • Experience in UNIX shell scripting.
  • Used CA-7 and Autosys as scheduling tools in the project.
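
A minimal BTEQ sketch of the COLLECT STATISTICS / EXPLAIN tuning work described above; the logon string and all object names (sandbox_db.acct_stg, acct_id, txn_dt) are hypothetical placeholders, not actual production objects.

#!/bin/sh
# Refresh optimizer statistics after a load, then check the query plan.
# All database/table/column names below are illustrative only.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

COLLECT STATISTICS ON sandbox_db.acct_stg COLUMN (acct_id);
COLLECT STATISTICS ON sandbox_db.acct_stg COLUMN (txn_dt);

-- Inspect the optimizer's plan before promoting the query
EXPLAIN
SELECT acct_id, SUM(txn_amt)
FROM   sandbox_db.acct_stg
WHERE  txn_dt BETWEEN DATE '2016-01-01' AND DATE '2016-01-31'
GROUP  BY acct_id;

.LOGOFF;
.QUIT;
EOF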
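
The ingestion path above (UNIX files into HDFS, relational pulls via Sqoop, Hive external tables with partitioning and bucketing) can be sketched as below; the JDBC URL, paths, schema and table names are assumptions for illustration.

#!/bin/sh
# Land a flat file from the UNIX file system into a dated HDFS directory
hdfs dfs -mkdir -p /data/deposits/raw/2016-01-01
hdfs dfs -put /staging/deposits_20160101.dat /data/deposits/raw/2016-01-01/

# Pull a relational table into HDFS with Sqoop (connection details are placeholders)
sqoop import \
  --connect jdbc:db2://dbhost:50000/CORE \
  --username etl_user --password-file /user/etl_user/.pw \
  --table DEPOSITS \
  --target-dir /data/deposits/sqoop \
  --num-mappers 4

# External table over the landed files, partitioned by load date,
# plus a bucketed ORC table for reporting queries
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS dw.deposits_raw (
  acct_id STRING,
  txn_amt DECIMAL(18,2)
)
PARTITIONED BY (load_dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/deposits/raw';

ALTER TABLE dw.deposits_raw ADD IF NOT EXISTS
  PARTITION (load_dt='2016-01-01') LOCATION '/data/deposits/raw/2016-01-01';

CREATE TABLE IF NOT EXISTS dw.deposits_rpt (
  acct_id STRING,
  txn_amt DECIMAL(18,2)
)
PARTITIONED BY (load_dt STRING)
CLUSTERED BY (acct_id) INTO 8 BUCKETS
STORED AS ORC;
"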

Environment: Hadoop Ecosystem, DMeXpress tool, UNIX, Mainframe, JCL, COBOL, Easytrieve, Teradata, Shell Scripting, SuperPuTTY, WinSCP, SVN, SQL Assistant, NDM, CA-7 and Autosys schedulers, Bitbucket, JIRA, MS Visio, MS Office Suite, HP Quality Centre, Python.

Module Lead and ETL Developer, Consumer Banking Data Warehouse Project

Confidential

Responsibilities:

  • Reviewed business requirements documents with the test team to provide insights into the data scenarios and test cases.
  • Analyzing and understanding the business requirements and verifying the Business Requirement Document and Technical Design Document against requirements.
  • Experience in Extract, Transform, and Load (ETL) design, development and testing.
  • Experience in utilizing Teradata utilities: FastLoad, MultiLoad, BTEQ scripting, TPT and FastExport.
  • Identified and performed field-level compression on Teradata tables (see the compression and masking sketch after this list).
  • Experience in testing Data Marts, Data Warehouse/ETL Applications developed in mainframe/Teradata.
  • Experience in loading from various data sources like Teradata, Oracle, and fixed-width and delimited flat files.
  • Involved in data extraction from Teradata and flat files using SQL Assistant.
  • Wrote several complex SQL queries for validating reports.
  • Tested several stored procedures.
  • Experienced working with the Customer Information Data Mart for business reporting.
  • Attended reviews and status meetings and participated in customer interactions.
  • Debugged the SQL statements and stored procedures for business scenarios.
  • Performed extensive data validation and data verification against the data warehouse (see the reconciliation sketch after this list).
  • Analyzed the bug reports by running SQL queries against the source system(s) to perform root-cause analysis.
  • Created SQL queries to generate ad-hoc reports for the business.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Created clear requirements for data on reports and graphs, based on business user requirements and business rules.
  • Created and validated the test data environment for the staging area, loading the staging area with data from multiple sources.
  • Created data masking rules to mask sensitive data before extracting test data from various sources and loading it into tables.
  • Created ETL test data for all transformation rules and covered all the scenarios required for implementing business logic.
  • Developed and tested various stored procedures as part of process automation in Teradata.
  • Tested the ETL process both before and after the data cleansing process.
  • Validated the data passed to downstream systems.
  • Checked data integrity and referential integrity as part of data validation.
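
A minimal sketch of the field-level (multi-value) compression and masking rules mentioned above; every object name, compressed value list and masking rule here is a hypothetical illustration, not the actual production logic.

#!/bin/sh
# Create a test table with multi-value compression on low-cardinality
# columns, then copy production rows in with sensitive fields masked.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

CREATE TABLE test_db.customer_t (
  cust_id   INTEGER NOT NULL,
  state_cd  CHAR(2)     COMPRESS ('CT','NY','NJ','MA'),
  acct_type VARCHAR(10) COMPRESS ('CHECKING','SAVINGS'),
  ssn       CHAR(11),
  email     VARCHAR(100)
) PRIMARY INDEX (cust_id);

INSERT INTO test_db.customer_t
SELECT cust_id,
       state_cd,
       acct_type,
       'XXX-XX-' || SUBSTR(ssn, 8, 4),        -- keep last 4 digits only
       'user' || TRIM(CAST(cust_id AS VARCHAR(12))) || '@example.com'  -- synthetic address
FROM   prod_db.customer;

.LOGOFF;
.QUIT;
EOF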
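
The data validation and referential integrity checks above typically reduce to reconciliation queries of the following shape; the schemas and keys (stg_db, dw_db, cust_id) are placeholders.

#!/bin/sh
# Source-vs-target reconciliation checks run after a load.
bteq <<'EOF'
.LOGON tdprod/qa_user,password;

-- 1. Row counts should match between staging and warehouse
SELECT 'source' AS side, COUNT(*) FROM stg_db.customer
UNION ALL
SELECT 'target' AS side, COUNT(*) FROM dw_db.customer;

-- 2. Keys present in source but missing from target
SELECT cust_id FROM stg_db.customer
MINUS
SELECT cust_id FROM dw_db.customer;

-- 3. Referential integrity: accounts with no parent customer
SELECT a.acct_id
FROM   dw_db.account a
LEFT JOIN dw_db.customer c
  ON a.cust_id = c.cust_id
WHERE c.cust_id IS NULL;

.LOGOFF;
.QUIT;
EOF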

Environment: Teradata, UNIX, Mainframe, MS Visio, MS Office Suite, MS Outlook, HP Quality Centre.

Module Lead and Mainframe Developer, Data Systems

Confidential

Responsibilities:

  • Coordinating with the onsite counterpart and performing analysis of the business requirement.
  • Extensive analysis of the IMS-Hogan application BOSS and the DB2 application FFD to arrive at the mapping requirements and identify the changes to be made to the existing DB2 system to house the new fields.
  • Preparation of low-level design documents.
  • Preparation of the traceability matrix, test scripts and test plans.
  • Code implementation for proposed changes and unit testing.
  • Performing component integration and system integration testing and executing the scripts in QC.
  • Release lead for the November 2013 integrated release, with the additional responsibility of ensuring smooth functioning of the other projects scheduled for the release and migrating all components from Changeman to Endevor with no issues.

Environment: Confidential mainframes z/OS, COBOL, JCL, DB2, Easytrieve, IMS, HOGAN, NDM, SAR, CA7, QC, FM, FMDB2, SPUFI, MQS, SOAP UI, BMC, FMIMS, Confidential Debug tool, Endevor, Changeman, HP Quality Centre.

Team Member and Mainframe Developer

Confidential

Responsibilities:

  • Production support for the BOSS system.
  • Analysis of client- or user-identified issues and arriving at solutions.
  • Carrying out system enhancements as per client requests.
  • Performing coding and testing at the unit, component and system levels.
  • Implementing cost savings/process improvements.
  • Generating reports for the clients using Easytrieve.

Environment: Confidential mainframes z/OS, COBOL, JCL, Easytrieve, IMS, HOGAN, NDM, SAR, CA7, FM, FMIMS
