
Teradata/ETL Developer Resume


Bothell, WA

SUMMARY

  • Around 8 years of IT experience in the telecommunications and healthcare industries using Teradata, Informatica, and SAS technologies in UNIX and Windows environments.
  • Strong knowledge in Teradata Architecture and its Utilities
  • Experience in developing data warehouse applications using Teradata/Informatica in telecommunication industry.
  • Detailed Analysis of Business Requirement Document
  • Good knowledge of requirements analysis, use cases, metadata, and conceptual, logical, and physical data modeling using ERwin
  • Low Level Design Document Preparation (Transformation Specifications)
  • Proficient in Teradata Database Design, Application Support, Performance Tuning, Optimization, User & Security Administration, Data Administration and setting up the Test and Development environments.
  • Involved in restarting MultiLoad, FastLoad, and BTEQ scripts when they fail during data loads
  • Creating Unit test case Documents and documenting unit test results
  • Experience in various source systems like Relational, Flat Files and XML
  • Designed and developed Transformation Specifications, Teradata BTEQ Scripts.
  • Review of Analysis documents, Design documents, Test Case documents and Coding
  • Logged the different issues raised during the development phase
  • Created mapping documents and workflows in Informatica PowerCenter versions 8.5/6.0.
  • Proficient in data warehousing concepts including data marts and data mining using dimensional modeling (star schema and snowflake schema design)
  • Performed debugging, troubleshooting, monitoring, and performance tuning for DataStage jobs (Server and Parallel).
  • Monitoring bad queries, aborting bad queries using PMON, looking for blocked sessions and working with development teams to resolve blocked sessions.
  • Proficient in using Teradata utilities such as TD Manager, Administrator, SQL Assistant, TSET, Visual Explain, Arcmain, TASM.
  • Sound knowledge of Teradata utilities such as FLoad, MLoad, and BTEQ scripts
  • Handled productionization of Teradata components (scripts, automation…)
  • Integrated Teradata with 3rd party BI tools
  • Good exposure to Data Warehousing applications.
  • Extensively worked on BTEQ and Teradata utilities including MultiLoad, FastExport, FastLoad, TPump, and TPT load scripts
  • Used UNIX shell scripts for automating tasks for BTEQ and other utilities
  • Experience in supporting ETL strategies using Informatica PowerCenter 8.1/8.5 tools such as Designer and Workflow Manager.
  • Efficiently used Workflow Manager and Workflow monitor to run Sessions and Workflows.
  • Experienced in writing SQL queries based on the requirement.
  • Expertise in SAS/BASE, SAS/MACROS, SAS/SQL, Oracle SQL/PLSQL, MS SQL.
  • Modification of existing SAS programs and creation of new programs using SAS Macros to improve ease and speed of modification as well as consistency of results.
  • Experienced in producing RTF-, HTML-, and PDF-formatted files using SAS ODS to produce ad hoc analysis reports.
  • Expert at scheduling ETL jobs on Windows and UNIX machines.
  • Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.
  • Working experience with crontab scheduling and TWS.
  • Experienced in SDLC starting from understanding the requirements, preparing HLD, DLD, development, testing to support and maintenance.
  • Cleaning datasets using Base SAS procedures and DATA steps.
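The cron-based scheduling of ETL jobs mentioned above typically takes the shape of a crontab entry like the following; this is a hedged sketch, with the script path, schedule, and log location invented for illustration:

```shell
# Illustrative crontab entry (path, schedule, and log file are assumptions):
# run a BTEQ monthly-upload wrapper at 02:00 on the 1st of each month and
# append both stdout and stderr to a log for later review.
0 2 1 * * /home/etl/scripts/monthly_upload.sh >> /home/etl/logs/monthly_upload.log 2>&1
```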

TECHNICAL SKILLS

Operating Systems: Z/OS, OS/390, MVS; WINDOWS; UNIX; XENIX

Database: Teradata 13.10/12/V2R6; Informatica PowerCenter 8.1/8.5; DB2; IMS DB; VSAM Files; MS Access; Oracle 8i/9i/10g

Utilities/Tools: Teradata FastLoad; MultiLoad; TPump; FastExport; BTEQ; Teradata Parallel Transporter (TPT) Load & Export; SQL Assistant; DB2 UNLOAD; DB2 LOAD; IBM dataset utilities such as IDCAMS, IEBGENER, etc.; JCL; FILEAID; SPUFI; QMF; XPEDITOR; ENDEVOR; INSYNC; TSO/ISPF; TOAD; SQL*Plus; HTML; XML

Languages: COBOL; SQL; CICS; JCL; Cognos; SAS; PL/SQL; C; C++; UNIX shell scripts

Academic Exposure: MQ Series; Oracle PL/SQL; HTML; XML; Java; JavaScript; ETL Tools

PROFESSIONAL EXPERIENCE

Confidential, Bothell, WA

Teradata/ETL Developer

Responsibilities:

  • Studied the project specifications, requirements, and existing programs, and identified the solutions required to extract data from Oracle into the Teradata database.
  • Analyzing requirements documents and existing system to ensure the technical feasibility of the requirements.
  • Worked on loading of data from several flat files sources using Teradata FastLoad and MultiLoad.
  • Writing Teradata SQL queries for table joins and other table modifications.
  • Transfer of large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Creation of customized MLoad scripts on the UNIX platform for Teradata loads using Informatica.
  • Worked on Teradata macros and Stored procedure while developing a job.
  • Fine tuning of Mload scripts considering the number of loads scheduled and volumes of load data.
  • Create, run, and schedule reports and jobs using Cognos Connection.
  • Involved with Scheduling and Distributing Reports through Schedule Management in Cognos Connection.
  • Designing, creating, and tuning physical database objects (tables, views, indexes) to support normalized and dimensional models.
  • Design and instantiation of the physical data model using ERWIN in the data warehouse / data marts.
  • Involved in analyzing existing logical and physical data modeling
  • Designed and implemented ETL code using Teradata SQL stored procedures
  • Authored complex BASH shell program which included database calls to retrieve report metadata, copy and FTP files, migrate files to archive directory, and capture, log, and report errors
  • Successfully and seamlessly transitioned to the primary provider of database development support in three weeks with no prior exposure to Teradata development or administration.
  • Database-to-database transfer of data (minimal transformations) using ETL tools (Informatica and Cognos).
  • Written scripts to extract the data from Oracle and load into Teradata.
  • Prepared datasets for analysis
  • Written Teradata BTEQ scripts to implement the business logic.
  • Used UNIX scripts to access Teradata & Oracle Data
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
  • Worked on bug fixes in Informatica mappings to produce the correct output.
  • Documented the purpose of each mapping to help personnel understand the process and incorporate changes as and when necessary.
  • Sorted data files using UNIX Shell scripting.
  • Worked on scheduling jobs using crontab and automated the monthly upload from Oracle 11g to Teradata using BTEQ.
  • Extensively involved in different Team review meetings and conferences with offshore team.
  • Extensively worked on Troubleshooting problems and fixing bugs.
  • Responsible for all project documentation.
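The crontab-scheduled BTEQ uploads described above are typically driven by a shell wrapper that checks return codes and retries failed loads. A minimal, hedged sketch of that pattern; the function name, retry count, and file names are illustrative assumptions, not taken from the actual project:

```shell
#!/bin/sh
# Hypothetical retry wrapper for Teradata load utilities (bteq, mload, fload).
# Names and retry counts are invented for illustration.

run_with_retry() {
    max_attempts=$1
    shift
    attempt=1
    while [ "$attempt" -le "$max_attempts" ]; do
        "$@"
        rc=$?
        if [ "$rc" -eq 0 ]; then
            echo "succeeded on attempt $attempt"
            return 0
        fi
        echo "attempt $attempt failed (rc=$rc)" >&2
        attempt=$((attempt + 1))
    done
    echo "giving up after $max_attempts attempts" >&2
    return 1
}

# In production this would wrap the real load, e.g.:
#   run_with_retry 3 bteq < monthly_upload.btq
# Demonstrated here with a stand-in command:
run_with_retry 2 true
```

A crontab entry would then invoke this wrapper, so a transient database or network error does not abort the whole monthly upload.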

Environment: Teradata 13.10, Informatica PowerCenter 8.5, Data Modeling, ERwin, Sun Solaris 10, Oracle 10g, SQL, UNIX

Confidential, Cary, NC

Teradata Developer

Responsibilities:

  • Analyzing requirements documents and existing system to ensure the technical feasibility of the requirements.
  • Communicated with business users and analysts on business requirements.
  • Worked on loading of data from several flat files sources using Teradata FastLoad and MultiLoad.
  • Writing Teradata SQL queries for table joins and other table modifications.
  • Transfer of large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Creation of customized MLoad scripts on the UNIX platform for Teradata loads using Informatica.
  • Provided a highly scalable technical architecture (on Informatica and Cognos) and set up the security architecture for Abbott’s enterprise data warehouse and reporting
  • Responsible for developing, support and maintenance for the ETL processes using Informatica Power Center.
  • Involved in analyzing existing logical and physical data models (ERwin) using star schema and snowflake schema techniques
  • Create, run, and schedule reports and jobs using Cognos Connection.
  • Involved with Scheduling and Distributing Reports through Schedule Management in Cognos Connection.
  • Written Teradata BTEQ scripts to implement the business logic.
  • Used UNIX scripts to access Teradata & Oracle Data
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
  • Writing SQL queries based on the requirement.
  • Worked on Informatica tool - Source Analyzer, Data warehousing designer, Mapping Designer & Mapplets, and Transformations
  • Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
  • Involved in performance tuning of SQL Queries, Sources, Targets and sessions.
  • Experienced with SQL Assistant and PMON tools.
  • Involved in ETL process under development, test and production environments.
  • Responsible for all project documentation.
  • Participated in the identification, understanding, and documentation of business requirements, including the applications capable of supporting those requirements.
  • Collected and documented business processes as well as business rules.

Environment: Teradata 13.10, Informatica PowerCenter 8.5, ERwin, SAS/Base, SAS/STAT, SAS/SQL, SQL, Sun Solaris 9.8.03, Oracle 9i (SQL/PLSQL)

Confidential

Responsibilities:

  • Communicated with business users and analysts on business requirements.
  • Designed the ETLs and conducted review meets for better understanding of the design.
  • Involved in implementation of BTEQ and Bulk load jobs.
  • Developed processes on Teradata using shell scripting and RDBMS utilities such as Multi Load, Fast Load, Fast Export, BTEQ (Teradata).
  • Involved in preparing the production support document.
  • Successfully Integrated data across multiple and high volumes of data sources and target applications.
  • Documented best practices and code for complex functionalities for the benefit of new developers.
  • Experienced in writing test cases, test plans, and test scenarios for all phases of rollout.
  • Strictly followed change control methodologies while deploying code across DEV, QA, and production.
  • Responsible for identifying and documenting business rules and creating detailed Use Cases.
  • Develop the test plan, test conditions and test cases to be used in testing based on business requirements, technical specifications and/or product knowledge.
  • Acted as a resource in understanding how these systems carry out business functions and assists in ensuring the timely and effective implementation.
  • Responsible for all project documentation.
  • Participated in the identification, understanding, and documentation of business requirements, including the applications capable of supporting those requirements.
  • Collected and documented business processes as well as business rules.
  • Documented and delivered Functional Specification Document to the project team.

Environment: Teradata 13, Informatica PowerCenter 8.5, SAS Enterprise Guide, SAS/Base, SAS/STAT, SAS/SQL, SQL, Sun Solaris 8.2.02, Oracle 9i (SQL/PLSQL)

Confidential

Responsibilities:

  • Worked on loading of data from several flat files sources using Teradata FastLoad and MultiLoad.
  • Writing Teradata SQL queries for table joins and other table modifications.
  • Transfer of large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Creation of customized MLoad scripts on the UNIX platform for Teradata loads using Informatica.
  • Written Teradata BTEQ scripts to implement the business logic.
  • Used UNIX scripts to access Teradata & Oracle Data
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
  • Involved in several Business meetings to understand the requirements of Business analysts and User community.
  • Participated in the entire life cycle of the project, which involved understanding scope of the project, functionality, technical design and complete development.
  • Involved in extraction of data from various heterogeneous sources like flat files and Databases.
  • Extensively created SQL scripts for pre- and post-load processing of data.
  • Writing SQL queries based on the requirement.
  • Generating descriptive statistics with SAS/BASE procedures such as PROC TABULATE and PROC MEANS.
  • Merging datasets using the MERGE statement.
  • Cleaning datasets using Base SAS procedures and DATA steps.
  • Producing the required output formats using PROC TRANSPOSE, PROC SORT, PROC TABULATE, etc.
  • Creating the final report format using PROC REPORT.
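Loading flat-file sources with FastLoad/MultiLoad, and the pre-load scripts mentioned above, usually start with a shell-level sanity check of the input file. A hedged sketch of such a check; the file name, delimiter, and field count are invented for illustration:

```shell
#!/bin/sh
# Hypothetical pre-load sanity check for a pipe-delimited flat file before a
# FastLoad/MultiLoad run. File name, delimiter, and field count are assumptions.

DATA_FILE=/tmp/customers.csv
EXPECTED_FIELDS=3

# Create a small sample file so the sketch is self-contained.
printf '%s\n' 'id|name|state' '1|Alice|WA' '2|Bob|NC' > "$DATA_FILE"

# Data-row count, excluding the header (loaders usually consume data rows only).
data_rows=$(( $(wc -l < "$DATA_FILE") - 1 ))
echo "data rows: $data_rows"

# Count rows whose pipe-delimited field count does not match the layout.
bad_rows=$(( $(awk -F'|' -v n="$EXPECTED_FIELDS" 'NF != n' "$DATA_FILE" | wc -l) ))
echo "malformed rows: $bad_rows"

# Strip the header and sort on the numeric key into the file the load job
# (e.g. a FastLoad job definition) would read.
tail -n +2 "$DATA_FILE" | sort -t'|' -k1,1n > /tmp/customers.sorted
```

The row counts produced here would typically be compared against the target table's post-load row count as a basic reconciliation step.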

Environment: Teradata 12, Informatica PowerCenter 8.1, SAS 8.1, SAS Enterprise Guide, SAS/Base, SAS/STAT, SAS/SQL, SQL, Sun Solaris 8.2.02, Oracle 9i (SQL/PLSQL)

Confidential

Programmer

Responsibilities:

  • Created views and altered some of the Dimensional tables to satisfy their Reporting needs.
  • Created reports such as reports by period, demographic reports, and comparative reports
  • Writing SQL queries based on the requirement.
  • Writing Programming Plan for Efficacy Endpoints.
  • Creating Value Added Datasets (VAD) based on Primary and Secondary Endpoints from Statistical Analysis Plan (SAP).
  • Generating descriptive statistics with SAS/BASE procedures such as PROC TABULATE and PROC MEANS.
  • Generating safety and efficacy tables and figures using SAS programming.
  • Merging datasets using the MERGE statement.
  • Cleaning datasets using Base SAS procedures and DATA steps.
  • Producing the required output formats using PROC TRANSPOSE, PROC SORT, PROC TABULATE, etc.
  • Creating the final report using PROC REPORT.

Environment: SAS 8.1, SAS/Base, SAS/STAT, SAS/GRAPH, SAS/SQL, Windows XP, Sun Solaris 8.2, Oracle 9i (SQL/PLSQL)

Confidential

System Analyst

Responsibilities:

  • Performed data analysis on the source data coming from legacy systems.
  • Writing SQL queries based on the requirement.
  • Writing Programming Plan for Efficacy Endpoints.
  • Creating Value Added Datasets (VAD) based on Primary and Secondary Endpoints from Statistical Analysis Plan (SAP).
  • Generating descriptive statistics with SAS/BASE procedures such as PROC TABULATE and PROC MEANS.
  • Generating safety and efficacy tables and figures using SAS programming.
  • Merging datasets using the MERGE statement.
  • Cleaning datasets using Base SAS procedures and DATA steps.
  • Producing the required output formats using PROC TRANSPOSE, PROC SORT, PROC TABULATE, etc.
  • Creating the final report using PROC REPORT.

Environment: SAS 8.1, SAS/Base, SAS/STAT, SAS/GRAPH, Sun Solaris 8.2.02, Oracle 8i (SQL/PLSQL)

Confidential

Programmer

Responsibilities:

  • Involved in understanding the protocol and Statistical Analysis Plan (SAP)
  • Extracting data from Oracle Clinical
  • Generating safety and efficacy tables and figures using SAS programming.
  • Writing QC code and performing quality checks on generated tables
  • Coordinating with the Sr. Associate.

Environment: SAS 8.1, SAS/Base, SAS/STAT, Windows XP
