
Teradata/ETL Developer Resume


Richardson, TX

SUMMARY

  • Over 6 years of experience in the Information Technology field, with a strong emphasis on Data Architecture, ETL, Data Modeling, Business Intelligence, and Data Analysis projects.
  • Experience with complex assignments in domains such as Healthcare and Communications.
  • Expertise in all stages of the Software Development Life Cycle (SDLC): requirements gathering and analysis, design/redesign, implementation, and testing.
  • Expert working knowledge of RDBMS technologies such as Teradata 16.x/15.x and Oracle 12c/11g.
  • Experience with ETL tools such as BTEQ, FastLoad, MultiLoad, TPT, SAS procedures, and Informatica PowerCenter, as well as Oracle's SQL*Loader and SQL Server utilities such as DTS and BCP, for exporting and loading data to/from a variety of sources, including flat files.
  • Expertise in designing data models using modeling tools such as Erwin.
  • Hands-on experience with query tools such as SQL Assistant, TOAD, SQL Developer, PL/SQL Developer, and Queryman.
  • Expert in preparing documentation for design and functional specifications.
  • Actively involved in quality processes and release-management activities by monitoring and streamlining project tasks.
  • Proficient in performance analysis, tuning, and monitoring using EXPLAIN and COLLECT STATISTICS in Teradata.
  • Expertise in SQL, T-SQL, PL/SQL, MySQL, stored procedures, functions, cursors, packages, views, and indexes.
  • An excellent team member with good interpersonal skills, strong communication skills, and the ability to quickly adapt to different project environments, work in teams, and accomplish difficult tasks independently within the given time frame.
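The EXPLAIN/COLLECT STATISTICS tuning workflow mentioned above can be sketched as follows; the database, table, and column names are placeholders, not from any actual engagement:

```sql
-- Inspect the optimizer's plan for a query (hypothetical tables).
EXPLAIN
SELECT c.customer_id, SUM(o.order_amt)
FROM   sales.orders o
JOIN   sales.customers c ON o.customer_id = c.customer_id
GROUP  BY c.customer_id;

-- If the plan reports "no confidence" row estimates, collect
-- statistics on the join/grouping columns so the optimizer can
-- choose join methods and spool sizes accurately.
COLLECT STATISTICS ON sales.orders COLUMN (customer_id);
COLLECT STATISTICS ON sales.customers COLUMN (customer_id);
```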

TECHNICAL SKILLS

Operating Systems: Windows NT/2000/XP/2000 Server, UNIX

ETL Tools: Teradata BTEQ, FastLoad, MultiLoad, Informatica PowerCenter 10.x/9.x/8.x, Talend, Snowflake

RDBMS: Teradata 16/15/14, Oracle 12c/11g, Vertica

Data-modeling Tools: Erwin

Languages: SQL, T-SQL, PL/SQL

Scripting: Python, SAS, HTML, CSS, JavaScript, XML

SQL Utilities: SQL Assistant, BTEQ, Query Analyzer, Rapid SQL, SQL*Plus, Toad 9.x

Web Technologies: HTML, DHTML, JavaScript, MS FrontPage, Adobe Dreamweaver CS3

Communication Tools: MS Office (Outlook, Excel, Word, Access), QTP, Remedy 7.0, Lotus Notes 5.x/6.x

PROFESSIONAL EXPERIENCE

Confidential, Richardson, TX

Teradata/ETL Developer

Responsibilities:

  • As part of the Production Support team, handled multiple applications, including Pharmacy Claim, RxClaim, QL Client, Drug, CAG and CAGM integration, ABC integration, import data sets, and ad hoc reports.
  • Performed Teradata SQL development; built reusable SQL templates; created tables and views; maintained stored procedures; and delivered ETL solutions using the Teradata utilities BTEQ, FastLoad, MultiLoad, TPT, and FastExport.
  • Analyzed data and queries, collected relevant statistics, and recommended appropriate indexes to resolve production issues; conducted root cause analysis (RCA) and performed advanced query tuning to improve the performance of complex business processes.
  • Used Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor to connect disparate data sources (structured and unstructured) and to create or maintain ETL/ELT/ELTL data pipelines by developing the needed transformations, mappings, mapplets, workflows, sessions, and tasks.
  • Developed metadata-driven batch ETL automation using a UNIX shell framework to schedule, trigger, load, handle errors, sequence, process/reprocess, summarize, and send email notifications; met application Service Level Agreements and built daily success/failure reports for stakeholders.
  • Proactively engaged with end users to enhance applications through BAU changes, performed unit testing, and drove migration activities to production.
  • Environment: Informatica 10.2.0, Teradata 16.20, Teradata SQL Assistant, BTEQ, MultiLoad, FastLoad, FastExport, UNIX, Maestro scheduler, shell scripts.
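A minimal sketch of a BTEQ batch step of the kind used in such production-support jobs; the logon string, credentials, and table names are all placeholders:

```sql
-- BTEQ batch script sketch (placeholder system, user, and tables).
.LOGON prodtd/etl_user,********;

-- Load the day's claims into a staging table; abort with a nonzero
-- return code on failure so the scheduler can alert and rerun.
INSERT INTO stg.pharmacy_claim
SELECT * FROM ldr.pharmacy_claim_daily;
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```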

Confidential, Charlotte, NC

Teradata/ETL Developer

Responsibilities:

  • Analyzed business requirements; designed and coded from specifications; and evaluated, tested, debugged, documented, and implemented programs to achieve the desired results.
  • Worked within an Agile Software Development Lifecycle (SDLC): requirements gathering, analysis, design, development, maintenance, build, code management, and testing of enterprise data warehouse applications and sophisticated ETL processes.
  • Unloaded data from and loaded data into the Teradata RDBMS using core framework scripts, following coding conventions in the relevant programming languages to initiate or enhance program execution and functionality.
  • Supported project implementation, monitored program execution, and tuned SQL to enhance performance.
  • Loaded data into the different layers using project-specific framework scripts built on the TPT utilities.
  • Extracted data from various sources, such as Oracle, Informatica, and flat files, per the requirements.
  • Created and monitored jobs, workflows, and schedules using 1Automation.
  • Provided architecture and design support for business-initiated requests and projects.
  • Performed data validation against the source-to-target mappings to ensure data consistency.
  • Performed unit testing by preparing test cases.
  • Tracked defects and deployments using Jira.
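The source-to-target validation step described above can be sketched as a reconciliation query; the schemas, tables, and measure column are hypothetical:

```sql
-- Compare row counts and amount totals between the source staging
-- table and the target table; nonzero differences flag a load gap.
SELECT 'stg_vs_tgt'            AS check_name,
       s.row_cnt - t.row_cnt   AS row_diff,
       s.amt_sum - t.amt_sum   AS amt_diff
FROM  (SELECT COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_sum
       FROM stg.transactions) s
CROSS JOIN
      (SELECT COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_sum
       FROM edw.transactions) t;
```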

Confidential, Los Angeles, CA

Teradata/ETL Developer

Responsibilities:

  • Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Extracted data from various source systems, such as Oracle, SQL Server, and flat files, per the requirements.
  • Prepared a detailed sizing document based on the data available for each month from the source system.
  • Prepared detailed source-to-target mapping and ETL loading logic documents.
  • Designed, created, and tuned physical database objects (tables, views, and indexes, including PPI, UPI, NUPI, and USI) to support normalized and dimensional models; maintained the referential integrity of the database.
  • Created appropriate Primary Indexes, taking into consideration both the planned access paths and the even distribution of data across all available AMPs.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Created a BTEQ script to pre-populate the work tables prior to the main load process.
  • Used volatile tables and derived queries to break complex queries into simpler ones.
  • Created a shell script that checks a data file for corruption prior to the load.
  • Created and automated the load process using shell scripts, MultiLoad, Teradata volatile tables, and complex SQL statements.
  • Created and enhanced Teradata stored procedures to generate automated testing SQL.
  • Performed bulk data loads from multiple data sources (SFDC, Oracle 8i, legacy systems) into the Teradata RDBMS using BTEQ, MultiLoad, and FastLoad.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate the transformation rules used in source-to-target mappings and source views, and to verify data in the target tables.
  • Responsible for applying the right collected statistics on the FACT tables.
  • Involved in troubleshooting production issues and providing production support.
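One way the volatile-table decomposition described above might look; all object names and dates are illustrative only:

```sql
-- Break a complex query into steps via a volatile table, which is
-- private to the session and dropped automatically at logoff.
-- ON COMMIT PRESERVE ROWS keeps the rows across statements.
CREATE VOLATILE TABLE vt_monthly_sales AS
( SELECT store_id, SUM(sale_amt) AS month_amt
  FROM   edw.sales_fact
  WHERE  sale_dt BETWEEN DATE '2020-01-01' AND DATE '2020-01-31'
  GROUP  BY store_id
) WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

-- The simplified final query joins against the pre-aggregated rows.
SELECT d.region, SUM(v.month_amt)
FROM   vt_monthly_sales v
JOIN   edw.store_dim d ON v.store_id = d.store_id
GROUP  BY d.region;
```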

Confidential, Charlotte, NC

ETL Developer

Responsibilities:

  • Hands-on experience with the CSG and ICOMS billing systems, including CPP, Work Order, Equipment, Pay-Per-View, and Credit Adjustments.
  • Analyzed business requirements; designed and coded from specifications; and evaluated, tested, debugged, documented, and implemented programs to achieve the desired results.
  • Worked within an Agile Software Development Lifecycle (SDLC): requirements gathering, analysis, design, development, maintenance, build, code management, and testing of enterprise data warehouse applications and sophisticated ETL processes.
  • Supported project implementation, monitored program execution, and tuned SQL to enhance performance.
  • Performed data analysis and data profiling using SQL on the CSG and ICOMS billing systems.
  • Loaded data into the different layers using project-specific framework scripts built on the TPT utilities.
  • Worked on error handling using the ET, UV, and WT tables.
  • Extracted data from various sources, such as Oracle, Informatica, and flat files, per the requirements.
  • Executed jobs using framework scripts and the job scheduler, and automated processes to speed up delivery.
  • Created and monitored jobs, workflows, and schedules using 1Automation.
  • Provided design recommendations to improve review processes and resolved technical problems.
  • Worked with the SVN version control tool.
  • Addressed BI issues and queries.
  • Worked on SQL tuning and optimization of the Business Objects reports.
  • Created UNIX shell scripts per the business requirements.
  • Developed simple and complex mappings using Designer to load dimension and fact tables per star-schema techniques.
  • Implemented Informatica transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, and Sequence Generator.
  • Designed the flow of the CDM process.
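The ET/UV/WT error handling mentioned above might be checked along these lines after a MultiLoad run; the schema and table names are placeholders, and error-table column layouts vary by utility version:

```sql
-- ET_ holds acquisition-phase errors (e.g. data conversion failures);
-- count them by error code to decide whether the load is acceptable.
SELECT ErrorCode, COUNT(*) AS err_cnt
FROM   etl_wrk.ET_billing_load
GROUP  BY ErrorCode;

-- UV_ holds application-phase uniqueness violations (duplicate keys).
SELECT COUNT(*) AS dup_rows
FROM   etl_wrk.UV_billing_load;

-- A leftover WT_ work table usually means the job ended mid-run and
-- must be restarted or released before the target can be reloaded.
```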

Confidential

Database Developer

Responsibilities:

  • Gathered the needed requirements from the Business Analysts for the required database development.
  • Followed the department's SDLC methodology and development procedures.
  • Supported the data warehousing architecture by writing SQL query code based on the detailed requirements provided.
  • Involved in developing reports from different data sources using Business Objects.
  • Provided reliable, timely support of the integration, performance, and user acceptance testing processes.
  • Interacted with the business analysts to understand and gather the business requirements.
  • Analyzed the SQL Server stored procedures written to generate the complex Business Objects reports.
  • Converted the SQL Server stored procedures to PL/SQL procedures in Oracle.
  • Debugged the converted PL/SQL procedures.
  • Compared the "before" and "after" versions of the stored procedures to make sure they produced the same results.
  • Worked on SQL tuning and optimization of the converted procedures.
  • Checked data quality by performing query balancing before moving to the production environment.
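The "before vs. after" comparison of converted procedures could be done with a balancing query of roughly this shape (Oracle syntax; the staged result tables are hypothetical):

```sql
-- Query balancing: assuming both procedures' outputs have been staged
-- into comparison tables, count rows present in one result set but
-- not the other (symmetric difference). Zero means the rewrite
-- reproduces the original results exactly.
SELECT COUNT(*) AS mismatched_rows
FROM (
    (SELECT * FROM qa.results_sqlserver
     MINUS
     SELECT * FROM qa.results_oracle)
    UNION ALL
    (SELECT * FROM qa.results_oracle
     MINUS
     SELECT * FROM qa.results_sqlserver)
);
```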
