Teradata/ETL Developer Resume
Bothell, WA
SUMMARY
- Around 8 years of IT experience in the telecommunication and healthcare domains using Teradata, Informatica, and SAS technologies in UNIX and Windows environments.
- Experience developing data warehouse applications with Teradata and Informatica in the telecommunication industry.
- Detailed analysis of business requirement documents
- High-level design document preparation
- Low-level design document preparation (transformation specifications)
- Proficient in application support, performance tuning, and optimization.
- Involved in restarting MultiLoad, FastLoad, and BTEQ scripts when they fail during data loads (see the wrapper sketch after this list).
- Created unit test case documents and documented unit test results.
- Designed and developed transformation specifications and Teradata BTEQ scripts.
- Reviewed analysis documents, design documents, test case documents, and code.
- Logged issues arising during the development phase.
- Created mapping documents and workflows in Informatica PowerCenter versions 8.5/6.0.
- Proficient in data warehousing concepts, including data marts and data mining, using dimensional modeling (star schema and snowflake schema design).
- Performed debugging, troubleshooting, monitoring, and performance tuning for DataStage jobs (Server and Parallel).
- Made the required code changes in response to requirement changes.
- Sound knowledge of Teradata utilities such as FastLoad, MultiLoad, and BTEQ scripting.
- Hands-on data integration experience.
- Able to handle productionization of Teradata components (scripts, automation, etc.).
- Able to integrate Teradata with third-party BI tools.
- Assisted in the development of guidelines and processes for implementation.
- Collaborated with project teams to develop strategies to enhance user effectiveness.
- Good Exposure to Teradata SQL and BTEQ Scripting
- Good exposure to Data Warehousing applications.
- Extensively worked on BTEQ, with good knowledge of utilities including MultiLoad, FastExport, FastLoad, and TPump.
- Used UNIX shell scripts for automating tasks for BTEQ and other utilities
- Experience supporting ETL strategies using Informatica PowerCenter 8.1/8.5 tools such as Designer and Workflow Manager.
- Used Workflow Manager and Workflow Monitor to run sessions and workflows efficiently.
- Experienced in writing SQL queries based on the requirement.
- Expertise in SAS/BASE, SAS/MACROS, SAS/SQL, Oracle SQL /PLSQL, MS SQL.
- Modified existing SAS programs and created new ones using SAS macros to improve ease and speed of modification as well as consistency of results.
- Experienced in producing RTF, HTML, and PDF files using SAS/ODS for ad hoc analysis reports.
- Expert in scheduling ETL jobs on Windows and UNIX machines.
- Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.
- Working experience with crontab scheduling and TWS.
- Experienced in SDLC starting from understanding the requirements, preparing HLD, DLD, development, testing to support and maintenance.
- Cleaned datasets using Base SAS procedures and DATA steps.
- Applied client specifications to final report datasets.
- Checked final output against QC.
- Productive, results-oriented self-starter with a strong work ethic, consistently achieving goals.
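Below is a minimal sketch of the restartable BTEQ wrapper pattern referenced above. It assumes bteq is on the PATH; the script and log paths are hypothetical placeholders, not actual project artifacts.

#!/bin/sh
# run_bteq_load.sh -- run a BTEQ load step and restart it once on failure.
# All paths below are illustrative placeholders.
BTEQ_SCRIPT=/etl/scripts/load_subscriber.bteq   # hypothetical BTEQ script
LOG=/etl/logs/load_subscriber.log               # hypothetical log file

run_load() {
    bteq < "$BTEQ_SCRIPT" >> "$LOG" 2>&1        # BTEQ reads its commands from stdin
}

if ! run_load; then
    echo "BTEQ load failed; restarting once" >> "$LOG"
    run_load || { echo "load failed after restart" >> "$LOG"; exit 1; }
fi
exit 0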
TECHNICAL SKILLS
Programming: Teradata 13.0/12.0, Informatica PowerCenter 8.1/8.5, SAS, SQL, PL/SQL, C, C++.
Teradata tools & Utilities: SQL Assistant, BTEQ
Load & Export: FastLoad, MultiLoad, FastExport, BTEQ
Databases: MS Access, Oracle 8i/9i/10g
Data Base Tools: TOAD, SQL*PLUS
Scripting Languages: Unix Shell scripts
Web Servers: Weblogic 11.3
Web Tools: HTML, XML
Office Applications: MS-Office.
Operating Systems: Sun Solaris 8.x/9.x/10.x, UNIX, Linux, Windows XP/2000/NT
PROFESSIONAL EXPERIENCE
Confidential, Bothell, WA
Teradata/ETL Developer
Responsibilities:
- Studied the project specifications, requirements, and existing programs, and identified the solutions needed to extract data from Oracle into the Teradata database.
- Analyzed requirements documents and the existing system to ensure the technical feasibility of the requirements.
- Worked on loading data from several flat-file sources using Teradata FastLoad and MultiLoad.
- Wrote Teradata SQL queries for table joins and table modifications.
- Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
- Created customized MultiLoad scripts on the UNIX platform for Teradata loads using Informatica.
- Worked with Teradata macros and stored procedures while developing jobs.
- Fine-tuned MultiLoad scripts based on the number of scheduled loads and the volumes of load data.
- Performed database-to-database data transfers (minimal transformations) using Informatica ETL.
- Wrote scripts to extract data from Oracle and load it into Teradata.
- Worked on datasets for analysis.
- Wrote Teradata BTEQ scripts to implement the business logic.
- Used UNIX scripts to access Teradata and Oracle data.
- Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
- Worked on bug fixes in Informatica mappings to produce the correct output.
- Documented the purpose of each mapping so that personnel could understand the process and incorporate changes as needed.
- Sorted data files using UNIX Shell scripting.
- Scheduled jobs using crontab and automated the monthly upload from Oracle 11g to Teradata using BTEQ (see the crontab sketch after this list).
- Extensively involved in different Team review meetings and conferences with offshore team.
- Extensively worked on Troubleshooting problems and fixing bugs.
- Responsible for all project documentations.
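A sketch of the kind of crontab entry behind the automated monthly upload mentioned above; the 02:00 first-of-month schedule and the script path are assumptions for illustration.

# minute hour day-of-month month day-of-week command
# Run the (hypothetical) monthly Oracle-to-Teradata upload script at
# 02:00 on the 1st of each month, appending all output to a log file.
0 2 1 * * /etl/scripts/monthly_oracle_to_td.sh >> /etl/logs/monthly_upload.log 2>&1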
Environment: Teradata 13.10, Informatica PowerCenter 8.5, Sun Solaris 10, Oracle 10g, SQL, UNIX
Confidential, Cary, NC
Statistical Programmer I
Responsibilities:
- Analyzed requirements documents and the existing system to ensure the technical feasibility of the requirements.
- Communicated with business users and analysts on business requirements.
- Worked on loading data from several flat-file sources using Teradata FastLoad and MultiLoad (see the FastLoad sketch after this list).
- Wrote Teradata SQL queries for table joins and table modifications.
- Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
- Created customized MultiLoad scripts on the UNIX platform for Teradata loads using Informatica.
- Responsible for development, support, and maintenance of ETL processes using Informatica PowerCenter.
- Wrote Teradata BTEQ scripts to implement the business logic.
- Used UNIX scripts to access Teradata and Oracle data.
- Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
- Wrote SQL queries based on requirements.
- Worked with Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.
- Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
- Involved in performance tuning of SQL Queries, Sources, Targets and sessions.
- Involved in ETL process under development, test and production environments.
- Responsible for all project documentations.
- Participated in the identification, understanding, and documentation of business requirements, including the applications capable of supporting those requirements.
- Collected and documented business processes as well as business rules.
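A minimal sketch of a shell-wrapped FastLoad job of the kind described above, loading a pipe-delimited flat file into an empty staging table; the logon string, table, and file names are placeholders.

#!/bin/sh
# Illustrative FastLoad job; all names and credentials are placeholders.
fastload <<'EOF'
.LOGON tdpid/etl_user,password;
.SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50))
FILE = /etl/data/customers.dat;
BEGIN LOADING stg.customers
      ERRORFILES stg.customers_err1, stg.customers_err2;
INSERT INTO stg.customers (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
END LOADING;
.LOGOFF;
EOF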
Environment: Teradata 13.10, Informatica PowerCenter 8.5, SAS/Base, SAS/STAT, SAS/SQL, SQL, Sun Solaris 9.8.03, Oracle 9i (SQL/PLSQL)
Confidential
Responsibilities:
- Communicated with business users and analysts on business requirements.
- Designed the ETLs and conducted review meetings for better understanding of the design.
- Involved in the implementation of BTEQ and bulk-load jobs.
- Developed processes on Teradata using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport, and BTEQ (see the MultiLoad sketch after this list).
- Involved in preparing the production support document.
- Successfully integrated high volumes of data across multiple sources and target applications.
- Documented best practices and code for complex functionalities for the benefit of new developers.
- Wrote test cases, test plans, and test scenarios for all phases of the rollout.
- Strictly followed change control methodologies while deploying code across DEV, QA, and Production.
- Responsible for identifying and documenting business rules and creating detailed Use Cases.
- Developed test plans, test conditions, and test cases based on business requirements, technical specifications, and/or product knowledge.
- Acted as a resource in understanding how these systems carry out business functions and assisted in ensuring timely and effective implementation.
- Responsible for all project documentations.
- Participated in the identification, understanding, and documentation of business requirements, including the applications capable of supporting those requirements.
- Collected and documented business processes as well as business rules.
- Documented and delivered Functional Specification Document to the project team.
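A minimal sketch of the MultiLoad upsert pattern used in jobs like those described above; the log table, layout, target table, and file names are hypothetical.

#!/bin/sh
# Illustrative MultiLoad upsert; all object names are placeholders.
mload <<'EOF'
.LOGTABLE etl_wrk.acct_ml_log;
.LOGON tdpid/etl_user,password;
.BEGIN IMPORT MLOAD TABLES edw.account;
.LAYOUT acct_layout;
.FIELD acct_id * VARCHAR(10);
.FIELD balance * VARCHAR(18);
.DML LABEL upsert_acct DO INSERT FOR MISSING UPDATE ROWS;
UPDATE edw.account SET balance = :balance WHERE acct_id = :acct_id;
INSERT INTO edw.account (acct_id, balance) VALUES (:acct_id, :balance);
.IMPORT INFILE /etl/data/account.dat
  FORMAT VARTEXT '|'
  LAYOUT acct_layout
  APPLY upsert_acct;
.END MLOAD;
.LOGOFF;
EOF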
Environment: Teradata 13, Informatica PowerCenter 8.5, SAS Enterprise Guide, SAS/Base, SAS/STAT, SAS/SQL, SQL, Sun Solaris 8.2.02, Oracle 9i (SQL/PLSQL)
Confidential
Responsibilities:
- Worked on loading data from several flat-file sources using Teradata FastLoad and MultiLoad.
- Wrote Teradata SQL queries for table joins and table modifications.
- Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
- Created customized MultiLoad scripts on the UNIX platform for Teradata loads using Informatica.
- Wrote Teradata BTEQ scripts to implement the business logic.
- Used UNIX scripts to access Teradata and Oracle data.
- Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
- Involved in several Business meetings to understand the requirements of Business analysts and User community.
- Participated in the entire life cycle of the project, which involved understanding scope of the project, functionality, technical design and complete development.
- Involved in the extraction of data from various heterogeneous sources such as flat files and databases.
- Extensively created SQL scripts for pre- and post-load data checks (see the validation sketch after this list).
- Wrote SQL queries based on requirements.
- Generated descriptive statistics with Base SAS procedures such as PROC TABULATE and PROC MEANS.
- Merged datasets using the MERGE statement.
- Cleaned datasets using Base SAS procedures and DATA steps.
- Produced the required output formats using PROC TRANSPOSE, PROC SORT, PROC TABULATE, etc.
- Created the final report format using PROC REPORT.
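A sketch of a post-load validation of the sort referenced above, written as a BTEQ step that fails when staging and target row counts disagree; the table names are placeholders.

#!/bin/sh
# Illustrative post-load row-count check; table names are placeholders.
bteq <<'EOF'
.LOGON tdpid/etl_user,password;
SELECT 1 FROM stg.sales         /* returns a row only when counts differ */
HAVING COUNT(*) <> (SELECT COUNT(*) FROM edw.sales);
.IF ACTIVITYCOUNT > 0 THEN .QUIT 1
.LOGOFF;
.QUIT 0
EOF
rc=$?                            # bteq exits with the .QUIT code
exit $rc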
Environment: Teradata 12, Informatica PowerCenter 8.1, SAS 8.1, SAS Enterprise Guide, SAS/Base, SAS/STAT, SAS/SQL, SQL, Sun Solaris 8.2.02, Oracle 9i (SQL/PLSQL)
Confidential
Programmer
Responsibilities:
- Created views and altered some of the dimensional tables to satisfy reporting needs.
- Created reports such as reports by period, demographic reports, and comparative reports.
- Wrote SQL queries based on requirements.
- Wrote the programming plan for efficacy endpoints.
- Created value-added datasets (VADs) based on primary and secondary endpoints from the Statistical Analysis Plan (SAP).
- Generated descriptive statistics with Base SAS procedures such as PROC TABULATE and PROC MEANS.
- Generated safety and efficacy tables and figures using SAS programming.
- Merged datasets using the MERGE statement.
- Cleaned datasets using Base SAS procedures and DATA steps.
- Produced the required output formats using PROC TRANSPOSE, PROC SORT, PROC TABULATE, etc.
- Created the final report using PROC REPORT.
Environment: SAS 8.1, SAS/Base, SAS/STAT, SAS/GRAPH, SAS/SQL, Windows XP, Sun Solaris 8.2, Oracle 9i (SQL/PLSQL)
Confidential
System Analyst
Responsibilities:
- Performed data analysis on the source data coming from legacy systems.
- Wrote SQL queries based on requirements.
- Wrote the programming plan for efficacy endpoints.
- Created value-added datasets (VADs) based on primary and secondary endpoints from the Statistical Analysis Plan (SAP).
- Generated descriptive statistics with Base SAS procedures such as PROC TABULATE and PROC MEANS.
- Generated safety and efficacy tables and figures using SAS programming.
- Merged datasets using the MERGE statement.
- Cleaned datasets using Base SAS procedures and DATA steps.
- Produced the required output formats using PROC TRANSPOSE, PROC SORT, PROC TABULATE, etc.
- Created the final report using PROC REPORT.
Environment: SAS 8.1, SAS/Base, SAS/STAT, SAS/GRAPH, Sun Solaris 8.2.02, Oracle 8i (SQL/PLSQL)
Confidential
Programmer
Responsibilities:
- Involved in understanding the protocol and the Statistical Analysis Plan (SAP).
- Extracted data from Oracle Clinical.
- Generated safety and efficacy tables and figures using SAS programming.
- Wrote QC code and performed quality checks on the generated tables.
- Coordinated with the senior associate.
Environment: SAS 8.1, SAS/Base, SAS/STAT, Windows XP