Teradata ETL Developer/Analyst Resume
San Leandro, CA
SUMMARY:
- 6+ years of IT experience in Data Warehousing and Business Intelligence, covering business requirements analysis, application design, development, analysis and support
- 4+ years of experience using Teradata (V2R5 through 14.10) and Teradata ETL tools/utilities
- Experienced in designing and setting up ETL solutions to meet business needs using the Informatica PowerCenter platform
- Strong understanding of Teradata architecture, RDBMS and data modeling concepts; specialist in SQL tuning and performance optimization techniques to achieve target SLAs in OLTP and OLAP environments
- Strong programming skills in Teradata SQL and PL/SQL, with hands-on experience in SQL Server and DB2 databases
- Experience in automating batch jobs using tools like Tivoli Workload Scheduler (TWS/Conman) and Crontab
- Experience in creating reporting solutions using Teradata views and macros together with Excel functions, graphs and pivot tables (a brief sketch of the view-plus-macro pattern follows this list)
- Experience in scripting with Unix shell (ksh/bash) and PowerShell
- Ability to deliver quality improvements and streamline processes in a cross-functional environment
- Experienced in leading teams of various sizes with offshore and onsite combinations
- Results-oriented attitude and willingness to independently learn new technologies when required
- Very good understanding of reporting tools such as Cognos, Business Objects, MicroStrategy and Tableau
- Provided 24/7 on-call production support and resolved database issues
- Strong problem-solving, analytical, interpersonal and communication skills, with the ability to work both independently and in a team
- Highly enthusiastic and self-motivated; rapidly assimilates new concepts and technologies
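For illustration, a minimal sketch of the view-plus-macro reporting pattern mentioned above; the database, view, table and column names are hypothetical placeholders, not from an actual engagement:

```sql
/* Hypothetical reporting view aggregating a sales table. */
CREATE VIEW rpt.v_monthly_sales AS
SELECT region_id,
       EXTRACT(YEAR FROM sale_date)  AS sale_year,
       EXTRACT(MONTH FROM sale_date) AS sale_month,
       SUM(sale_amt)                 AS total_sales
FROM   edw.sales
GROUP  BY 1, 2, 3;

/* Parameterized macro so report users can pull one month without writing SQL. */
CREATE MACRO rpt.get_monthly_sales (yr INTEGER, mth INTEGER) AS (
    SELECT *
    FROM   rpt.v_monthly_sales
    WHERE  sale_year = :yr
    AND    sale_month = :mth;
);

/* Usage: EXEC rpt.get_monthly_sales (2016, 3); */
```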
TECHNICAL SKILLS:
Operating Systems: Windows 7/Vista/XP/NT/2000, HP-UX 10.2, IBM AIX 6.1, Mac OS X
Languages: Teradata SQL, PL/SQL, C, C++, HTML
DBMS: Teradata 15.10/14.10/13/12/V2R5, Oracle 9i, SQL Server 2000, DB2
BI Tools: Tableau, Excel Pivots
Utilities/Tools: BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata SQL Assistant, SQL Developer, TOAD, Tivoli Workload Scheduler, Conman, Crontab, UC4, GitHub
ETL Platforms: Informatica 7.x/8.x (Workflow Manager, Workflow Monitor, Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer)
Scripting: UNIX shell (ksh), PowerShell
Others: PuTTY, WinSCP, Attachmate, MS Office Suite
WORK EXPERIENCE:
Confidential, San Leandro, CA
Teradata ETL Developer/Analyst
Responsibilities:
- Confidential is a line of business within the Wholesale group at Wells Fargo; the main objective of this project is to prepare “Customer/Entity” data for consumption by ECM (Enterprise Content Management).
- Analyzed the systems and met with end users and business units to define requirements.
- Worked with business analysts and users to understand the Wholesale Business Banking group.
- Extensively worked with Teradata utilities such as TPT, FastLoad, FastExport and BTEQ to export and load data to and from various source and target systems.
- Worked with the business analyst team and documented the BRD, FSD, TDD, mapping documents and test cases, which gained immediate client sign-off.
- Worked extensively on SQL and PowerShell scripting; performed data profiling for data quality purposes.
- Created and automated a PowerShell process to build and call TPT/BTEQ scripts, generate logs, and upload data and logs to SharePoint.
- Confidential: as part of this project, GE Capital’s finance data is being integrated with Wells Fargo’s wholesale data.
- Work with business users/architects to understand Wells Fargo’s wholesale data warehouse.
- Worked extensively on SQL, PL/SQL and PowerShell scripting; performed data profiling for data quality purposes.
- Import GE Capital Finance’s document metadata into Wells Fargo systems and integrate it with existing wholesale customer data.
- Extensively work with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad and TPT to export and load data to/from different source systems, including flat files.
- Create and automate PowerShell scripts to call BTEQ scripts, generate logs and upload data and logs to SharePoint (a sketch of the kind of BTEQ script such a wrapper drives follows this list).
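For illustration, a minimal BTEQ export script of the kind the PowerShell wrapper would build and invoke; the TDPID, credentials, database, table and file paths are hypothetical placeholders:

```
/* Placeholder TDPID and credentials; the wrapper substitutes real values. */
.LOGON tdprod/etl_user,password;

.SET WIDTH 500
.EXPORT REPORT FILE = C:\exports\customer_entity.txt;

/* Hypothetical extract of customer/entity rows staged for ECM consumption. */
SELECT CAST(customer_id AS VARCHAR(18)) || '|' ||
       entity_nm                        || '|' ||
       CAST(updt_ts AS VARCHAR(26))     (TITLE '')
FROM   wholesale.customer_entity
WHERE  CAST(updt_ts AS DATE) >= CURRENT_DATE - 1;

/* Non-zero exit code so the wrapper can flag a failure in its log. */
.IF ERRORCODE <> 0 THEN .QUIT 8;
.EXPORT RESET;
.LOGOFF;
.QUIT 0;
```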
Environment: Teradata SQL Assistant, Teradata Database, Flat Files, CSV files, MS Excel, Windows 8, SQL, PL/SQL, Stored Procedures, PowerShell Scripting, Scrum, Rally, SVN.
Confidential, San Jose, CA
Sr. Teradata ETL Developer
Responsibilities:
- Supporting the Teradata Enterprise Data and Analytics platform including mentoring, implementing best practice recommendations, and participating in the day-to-day activities
- Analyze complex SQL queries and create mapping sheets for a Big Data migration project
- Migrated a financial application from Teradata to Hadoop using Hive and Spark SQL
- Used various Talend Hadoop components, such as Hive and Spark.
- Loaded and transformed large sets of structured data into HDFS using Talend.
- Performance tuning of existing Teradata Applications (TFC and TLD)
- Developed strict SLA-based regulatory reports using Teradata SQL and delivered them to end users using ROOT
- Coordination with clients, requirements gathering and impact analysis.
- Enhancement of existing applications using Teradata SQL scripts.
- Designed and scheduled batch jobs in UC4 and committed code using GitHub.
- Supported and tested new functionality with the Teradata 15.10 upgrade.
- Developed Informatica mappings to move source data to the target Teradata database.
- Scheduled Teradata scripts through an internal framework and cron for development purposes.
- Used DBC dictionary tables for performance measurement and space calculations (see the space-usage sketch after this list).
- Worked on complex ad-hoc queries to support user help requests within short time frames.
- Involved heavily in writing complex SQL queries based on the given requirements.
- Extensively worked with Teradata utilities such as TPT, FastLoad, FastExport and BTEQ to export and load data to and from various source and target systems
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS
- Responsible for managing the data warehouse in the Teradata database, Tableau report development, and managing ETL environments for code development and code promotion
- Responsible for converting business requirements from business analysts into technical design definition and implementation
- Responsible for providing effective system solutions for technical issues identified during the post-development phase
- Responsible for analysis, design, development and testing of backend database code
- Responsible for managing technical deliverables and meeting project timelines
- Performed tests, validated all data flows and prepared all ETL processes according to business requirements.
- Used Python to automate the ETL process, notify the status of backend jobs through email notifications, and generate log files.
- Used Python to pick up Teradata credentials from a specified folder and generate text-file logs.
- Performed quality checks during extraction from Teradata using Python.
- Responsible for Tableau report development for business users
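For illustration, the kind of DBC dictionary query used for the space calculations mentioned above; DBC.DiskSpaceV is the standard system view, and the grouping is illustrative:

```sql
/* Current vs. maximum permanent space by database, from the data dictionary. */
SELECT DatabaseName,
       SUM(MaxPerm)     / 1024 ** 3 AS max_perm_gb,
       SUM(CurrentPerm) / 1024 ** 3 AS current_perm_gb,
       100 * SUM(CurrentPerm) / NULLIFZERO(SUM(MaxPerm)) AS pct_used
FROM   DBC.DiskSpaceV
GROUP  BY 1
ORDER  BY pct_used DESC;
```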
Environment: Teradata 14/15.10 (BTEQ, FastExport, MultiLoad, FastLoad, TPump), TPT, Hive, Spark SQL, UNIX, UC4, Informatica, GitHub, Task Scheduler, Eclipse.
Confidential, El Segundo, CA
Teradata Developer
Responsibilities:
- Supporting the Teradata Enterprise Data and Analytics platform including mentoring, implementing best practice recommendations, and participating in the DBA day-to-day activities.
- Analyzed long-running ETL jobs and optimized them through performance tuning, converting some long ETL processes to ELT for better performance.
- Participated in client interactions, meetings and presentations on database performance for the Teradata platform.
- Proficient in writing and troubleshooting Teradata utility jobs (FastLoad, MultiLoad, FastExport, TPT, TPump).
- Involved heavily in writing complex SQL queries based on the given requirements.
- Performed bulk data loads from multiple data sources (Oracle, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad and TPump.
- Used various OLAP functions such as RANK, CSUM and QUANTILE (see the example after this list).
- Wrote UNIX scripts to support applications and database administration functions.
- Extracted data from Teradata, processed/transformed it using ksh programs and loaded it into the data mart.
- Used various Teradata index techniques to improve query performance
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Modified BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
- Updated numerous BTEQ/SQL scripts, made appropriate DDL changes and completed unit and system tests.
- Worked with the crontab scheduler.
- Used the ETL tool Ab Initio to extract, transform and load data.
- Created a series of macros for various applications in Teradata SQL Assistant.
- Responsible for loading millions of records into the warehouse from different sources using MultiLoad and FastLoad.
- Designed the execution order of the jobs and scheduled them.
- Created stored procedures to transform data; worked extensively on SQL and PL/SQL for various transformation needs.
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
- Used query banding to track user activity (a snippet illustrating EXPLAIN and query banding follows the OLAP example below).
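For illustration, a small example of the Teradata ordered-analytic (OLAP) functions named above, in the legacy Teradata syntax; the table and columns are hypothetical:

```sql
/* Legacy Teradata OLAP syntax; GROUP BY defines the partition, not aggregation. */
SELECT store_id,
       sale_date,
       sales_amt,
       RANK(sales_amt DESC)       AS sales_rank,    /* rank within each store */
       CSUM(sales_amt, sale_date) AS running_total  /* cumulative sum by date */
FROM   sales.daily_sales
GROUP  BY store_id;

/* QUANTILE buckets rows into percentiles (0-99 here). */
SELECT store_id, sales_amt,
       QUANTILE(100, sales_amt) AS pct_bucket
FROM   sales.daily_sales;
```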
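And a sketch of the tuning/tracking pair: EXPLAIN to inspect the optimizer plan, and a session query band so DBQL can attribute the activity (names and values are illustrative):

```sql
/* Tag the session so DBQL activity can be attributed to this job. */
SET QUERY_BAND = 'ApplicationName=ETL_Nightly;JobStep=LoadSales;' FOR SESSION;

/* Inspect the optimizer plan for a candidate query before tuning it. */
EXPLAIN
SELECT store_id, SUM(sales_amt)
FROM   sales.daily_sales
GROUP  BY 1;

/* Later, trace the activity back through the query band in DBQL. */
SELECT QueryBand, QueryText
FROM   DBC.QryLogV
WHERE  QueryBand LIKE '%ETL_Nightly%';
```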
Environment: Teradata V2R6/12/13 (BTEQ, FastExport, MultiLoad, FastLoad, TPump), TPT, Oracle 10g/9i, DB2, webMethods, MS SQL Server 2000, Ab Initio, MS Excel 2003/2007/2010, crontab scheduler, Perl, PuTTY, IBM AIX 6.2, Erwin, Windows XP, Windows 7.
Confidential, Alpharetta, GA
Teradata Developer
Responsibilities:
- Work with the business to gather requirements, translate them into technical requirements and participate in the application design process
- Review and optimize code developed by the Teradata team, and create FastExport, FastLoad, MultiLoad, BTEQ, TPT and UNIX scripts for loading/unloading files and database transfers on the Teradata database, taking full advantage of Teradata technology
- Create Standard Operating Procedures for team members and approve/disapprove new application coding standards.
- Troubleshoot high severity issues within SLAs and take leadership in the war room process
- Build large tables and views, ensuring appropriate indexing and partitioning on tables with frequent inserts, deletes and updates to reduce contention (see the DDL sketch after this list)
- Write code on peripheral databases like Oracle, SQL Server and DB2
- Create UNIX shell wrappers that call BTEQ and PL/SQL code and automate batch jobs using Tivoli Workload Scheduler.
- Work with DBAs for transition from Development to Testing and Testing to Production.
- Created an archive process that archives data files and FTPs them to a remote server
- Created a cleanup process for removing all intermediate temporary files used prior to the loading process.
- Worked on the ETL tool Informatica for massively parallel processing
- Used the Cognos reporting tool to extract corporate data, analyze it and assemble reports
- Created a shell script that checks data files for corruption prior to the load
- Created unit test plans and unit tested the code prior to handover to QA
- Involved in troubleshooting the production issues and providing production support
- Streamlined the Teradata scripts and Perl scripts migration process on the UNIX box
- Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system
- Collected statistics on the tables weekly to improve performance (included in the DDL sketch after this list)
- Developed unit test plans and involved in system testing.
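For illustration, a hypothetical partitioned-table DDL paired with the weekly statistics collection mentioned above; all names, ranges and columns are placeholders:

```sql
/* Hypothetical fact table: primary index for row distribution, plus a
   partitioned primary index (PPI) on the date column to prune scans. */
CREATE MULTISET TABLE edw.txn_fact
(
    txn_id   BIGINT        NOT NULL,
    acct_id  INTEGER       NOT NULL,
    txn_date DATE          NOT NULL,
    txn_amt  DECIMAL(18,2)
)
PRIMARY INDEX (acct_id)
PARTITION BY RANGE_N (txn_date BETWEEN DATE '2010-01-01'
                                   AND DATE '2014-12-31'
                      EACH INTERVAL '1' MONTH);

/* Weekly statistics refresh so the optimizer keeps choosing good plans. */
COLLECT STATISTICS ON edw.txn_fact COLUMN (acct_id);
COLLECT STATISTICS ON edw.txn_fact COLUMN (txn_date);
COLLECT STATISTICS ON edw.txn_fact COLUMN (PARTITION);
```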
Environment: Teradata 13 (BTEQ, FastExport, MultiLoad, FastLoad, TPump), TPT, Oracle 10g/9i, DB2, MS Excel 2003, Tivoli Workload Scheduler 8.3, Perl, PuTTY, IBM AIX 6.2, Erwin, Windows XP, Windows 7.
Confidential, Atlanta, GA
ETL Developer
Responsibilities:
- Analyzed business requirements, transformed data and mapped source-system data to the Teradata physical data model using the Teradata Financial Services Logical Data Model (FSLDM).
- Supported other ongoing projects: Software Upgrade project from Teradata V2R5.1 to V2R6.1.
- Automated related tasks by developing UNIX shell scripts used to maintain the Core EDW.
- Used EXPORT/IMPORT to perform table-level and full-database defragmentation.
- Created/enhanced Teradata stored procedures to generate automated testing SQL (a sketch follows this list).
- Built the data movement process that loads data from DB2 into Teradata by developing KornShell scripts using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, MultiLoad and Queryman; reviewed and improved the design of extract and load specifications
- Used the ETL tool Ab Initio to extract, transform and load data.
- Troubleshot and created automated script/SQL generators
- Helped the reporting team by providing Teradata queries
- Fulfilled ad-hoc requests from management
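For illustration, a minimal stored procedure in the spirit of the automated testing SQL described above; the QA schema, table and column names are hypothetical:

```sql
/* Records a row-count check so test runs can be compared over time. */
CREATE PROCEDURE qa.log_row_count (IN p_run_dt DATE)
BEGIN
    DECLARE v_cnt BIGINT;

    /* Count the rows loaded for the given run date. */
    SELECT COUNT(*) INTO :v_cnt
    FROM   edw.txn_fact
    WHERE  txn_date = :p_run_dt;

    /* Persist the result for later comparison. */
    INSERT INTO qa.test_results (test_nm, run_dt, metric_val, run_ts)
    VALUES ('txn_fact_row_count', :p_run_dt, :v_cnt, CURRENT_TIMESTAMP);
END;
```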
Environment: Teradata, SQL, UNIX, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Queryman, Ab Initio, Windows XP/2000.
Confidential
Database Analyst
Responsibilities:
- Maintained the university library database (Oracle 9i)
- Used SQL Developer frequently to query the library database
- Created tables in the database to organize books into their respective categories (an illustrative DDL sketch follows this list)
- Assisted university library website users in searching for required books
- Maintained and troubleshot the printer network
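For illustration, ANSI-style DDL for the kind of category/book structure this involved; the table and column names are hypothetical:

```sql
/* Categories that books are organized into. */
CREATE TABLE categories (
    category_id   INTEGER      PRIMARY KEY,
    category_name VARCHAR(100) NOT NULL
);

/* Each book references its category. */
CREATE TABLE books (
    book_id     INTEGER      PRIMARY KEY,
    title       VARCHAR(255) NOT NULL,
    category_id INTEGER      REFERENCES categories (category_id)
);
```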
Environment: Oracle 9i, SQL, MS Office 2003, HP-UX 10.2, Windows XP
Confidential
Teradata Developer
Responsibilities:
- Used external loaders such as MultiLoad, TPump and FastLoad to load data into the Teradata database (a FastLoad sketch follows this list)
- Wrote Teradata BTEQ and FastExport scripts to export data to various data marts
- Wrote hundreds of DDL scripts to create tables, views and indexes in the company Data Warehouse
- Designed ETL process flows for the entire DWH application and developed data mapping spreadsheets to define transformation rules for each stage of the ETL process
- Developed Informatica ETL code using various mappings and transformations for transport of data from legacy extract files to the data mart as per the business requirements
- Developed reusable transformations and mapplets
- Used pmcmd to interact with the Informatica server
- Developed shell scripts for job automation
- Responsible for migration of Informatica mappings and workflows between different environments
- Developed Windows batch scripts for customized commands
- Developed BTEQ/FastLoad/MultiLoad scripts for loading purposes
- Extracted data from Oracle, SQL Server, Flat files, and DB2 source systems
- Involved in Unit testing and Integration Testing of the developed code
- Responsible for four types of performance tuning: mapping level, session level, source level and target level
- Coordinate with the Actuate reporting team in modifying the existing Informatica ETL code as per the new business requirements.
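For illustration, a minimal FastLoad script of the kind used for such loads; the TDPID, credentials, file path and table names are hypothetical placeholders:

```
/* Placeholder logon; FastLoad requires an empty target table and
   no pre-existing error tables. */
LOGON tdprod/etl_user,password;

DATABASE stage_db;

/* Pipe-delimited input file. */
SET RECORD VARTEXT "|";

DEFINE cust_id (VARCHAR(18)),
       cust_nm (VARCHAR(60)),
       open_dt (VARCHAR(10))
FILE = /data/in/cust_extract.dat;

BEGIN LOADING stage_db.cust_stage
      ERRORFILES stage_db.err1_cust, stage_db.err2_cust
      CHECKPOINT 100000;

INSERT INTO stage_db.cust_stage (cust_id, cust_nm, open_dt)
VALUES (:cust_id, :cust_nm, :open_dt);

END LOADING;
LOGOFF;
```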
Environment: Teradata V2R5, FastLoad, MultiLoad, TPump, FastExport, BTEQ, MS Office 2003, UNIX, PuTTY, HP-UX 10.2, Windows XP