Teradata ETL Developer Resume
Charlotte
SUMMARY
- Certified Teradata 12 Technical Specialist with 11+ years of IT experience in the implementation of Data Warehousing and BI projects.
- Hands-on experience with SDLC processes such as Waterfall and Agile, using Teradata utilities, UNIX, Teradata SQL Assistant 13.0, COBOL, JCL, CICS, DB2, VSAM, SQL, TSO/ISPF, and mainframe tools such as ENDEVOR, File-AID, Xpeditor, Platinum, Easytrieve, SPUFI, QMF, NDM, SDSF, CA7, and Control-M.
- Extensive experience in Business Analysis, Application Design, Development, and Implementation for Banking and Financial Services.
- Extensive experience in tracing requirements throughout the development and testing process using a Requirement Traceability Matrix (RTM).
- Strong knowledge of the entire Software Development Life Cycle (SDLC) and its phases: Scope Initiation, Planning, Requirements Elicitation, Analysis, Design, Development, and Change Management.
- Experience in discovery sessions, interviews, prototyping, document reviews, brainstorming sessions, and walkthroughs with customers.
- Experienced in conducting different types of analysis as part of requirements gathering, such as GAP analysis, Interface Analysis, Feasibility Study, Impact Analysis, Risk Analysis, and As-Is/To-Be analysis.
- Knowledge and experience in preparing artifacts such as Functional, Non-Functional, and High-Level Requirements (HLR), Business Requirement Documents (BRD), and Use-Case specifications.
- Worked on the migration from Teradata 12 to Teradata 13.
- Worked on the migration from Teradata 13.10 to Teradata 14.
- Exposure to Python functions, modules, Pandas, NumPy, and other libraries.
- Strong hands-on experience using Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump).
- Conducted code walk-throughs and internal and external quality assurance reviews of applications; debugged and fixed identified defects and compared test results with production results to confirm changes were effective.
- Proficient in developing strategies for extracting, transforming, and loading data using Informatica PowerCenter 9.1/10.2.
- Experience in designing and developing stored procedures, functions, tables, views, indexes, triggers, and user-defined data types.
- Implemented data strategies, built data flows, and developed conceptual, logical, and physical data models to support new and existing projects.
- Extensive experience in end-to-end implementation of data warehouses and a strong understanding of data warehouse concepts and methodologies.
- Developed Test Scripts, Test Cases, and SQL QA Scripts to perform Unit Testing, System Testing and Load Testing.
- Proven track record in delivering effective design documents, code deliverables, test strategies and business value to the customer.
- Experience with One Automation and Tidal for creating JIL files and monitoring jobs.
- Expert knowledge of Teradata, with good exposure to other databases such as mainframe DB2, Oracle, and SQL Server.
- Expertise in setting up test environments and in defining, creating, documenting, verifying, and executing test cases, test scenarios, and test plans.
- Quick learner and keen observer who determines external and internal customer needs; well versed in internal and external relationship building. Good team player who enjoys being part of a team that thinks outside the box.
- Knowledge of Informatica Power Exchange.
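
As a hedged illustration of the Teradata utility scripting referenced above, a minimal BTEQ load script might look like the following sketch; the host, database, table, and column names are hypothetical, not taken from any actual engagement:

```sql
.LOGON tdprod/etl_user,password;   -- hypothetical TDPID and credentials

-- Copy staged customer rows into the target table inside one transaction
BEGIN TRANSACTION;

INSERT INTO edw.customer_dim (customer_id, customer_name, load_dt)
SELECT src_customer_id, src_customer_name, CURRENT_DATE
FROM   stg.customer_stage;

END TRANSACTION;

-- Abort with a non-zero return code if the insert failed
.IF ERRORCODE <> 0 THEN .QUIT 8;

-- Simple row-count check for the load
SELECT COUNT(*) FROM edw.customer_dim;

.LOGOFF;
.QUIT 0;
```

In practice a script like this would be driven from a UNIX shell wrapper and scheduled via One Automation or Tidal, with the return code used to signal job success or failure.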
TECHNICAL SKILLS
Operating Systems: UNIX, Linux, Windows XP/7/10, IBM Mainframe z/OS
Languages: Teradata SQL, Advanced Teradata SQL, Hive, Python (data manipulation with NumPy and Pandas)
Database: Teradata V2R12 /V2R13.01/ V15.10.1.4, Oracle 10g, SQL Server, Hive
ETL/BI Tool: Informatica 8.x/9.x/10.2, SSRS, Talend, Tableau
Teradata Tools & Utilities: BTEQ, FastLoad, FastExport, TPump, MultiLoad, SQL Assistant, Teradata Studio, Jupyter Notebook, Python IDLE, PyCharm
Scheduler: One Automation, Tidal
Project Management and Supporting Software Tools: Agile-Scrum, Waterfall, JIRA, SharePoint, Rational RequisitePro, ClearQuest, Quality Center, HP ALM, PAC2000, Voltage (3rd-party encryption), Putty
PROFESSIONAL EXPERIENCE
Confidential, Charlotte
Teradata ETL Developer
Responsibilities:
- Worked on a Sandbox utility, created for sandbox users, that generates encryption routine SQL scripts for PII data.
- Developed a generic Teradata stored procedure that performs encryption using the Voltage-Teradata user-defined functions (UDFs).
- Analyzed model input and output data to validate Voltage encryption and decryption UDF results for each category of Personally Identifiable Information (PII)/confidential data across applications sharing the encryption key.
- Created encryption validation scan reports via Teradata SQL and stored procedures for tracking and monitoring encryption compliance.
- Analyzed the Database Query Logs (DBQL) to identify end users and helped define user-adoption workflow strategies for consuming the encrypted fields.
- Coordinated problem and issue reviews, with follow-up on assigned actions, in the Incident and Problem Management tool PAC2000.
- Provided pre/post-implementation data analysis and reporting using UDF wrapper scripts and complex Teradata SQL, publishing the data to Tableau dashboards for visualization.
- Coordinated with Sandbox users to implement Teradata Voltage stored procedures for enhancements to the existing process according to required standards.
- Created JIRA tickets for each enhancement and documented all technical and system specifications for all ETL processes.
- Analyzed and interpreted complex data on target systems, provided resolutions for data issues, and coordinated with Business Analysts to validate all requirements.
- Created test cases, testing strategies, UAT plans, and production validation approaches.
- Interacted with different system groups for systems analysis.
- Analyzed applications to be changed for particular business requirements.
- Developed SQL join indexes to optimize strategic and tactical queries.
- Populated FastLoad and MultiLoad tables using Teradata's data load and unload utilities.
- Used MULTISET tables to load bulk data in the form of inserts and deletes.
- Created indexes and joins on tables as per requirements.
- Created various types of temporary tables, such as volatile and global temporary tables.
- Worked with collect statistics and join indexes.
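
The generic encryption stored procedure described above could be sketched roughly as follows. This is a minimal illustration only: the Voltage protect UDF name (`voltage_udf.ProtectStr`) and the procedure, database, and column names are hypothetical stand-ins, since the actual Voltage-Teradata UDF signatures are site-specific:

```sql
-- Hypothetical generic procedure: wraps one PII column of any table
-- in a Voltage protect UDF via dynamic SQL.
REPLACE PROCEDURE sandbox.encrypt_pii_column
  (IN db_name  VARCHAR(128),
   IN tbl_name VARCHAR(128),
   IN col_name VARCHAR(128))
BEGIN
  DECLARE sql_stmt VARCHAR(4000);

  -- Build an UPDATE that replaces the plaintext column value with
  -- its encrypted form (UDF name is a placeholder).
  SET sql_stmt =
      'UPDATE ' || db_name || '.' || tbl_name ||
      ' SET '   || col_name ||
      ' = voltage_udf.ProtectStr(' || col_name || ')';

  -- Execute the dynamic statement from within the procedure
  CALL DBC.SysExecSQL(:sql_stmt);
END;
```

A wrapper like this lets sandbox users encrypt PII fields consistently without hand-writing the UDF call for every table; production versions would also need to validate the input identifiers before concatenating them into dynamic SQL.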
Confidential, Charlotte, NC
Teradata ETL Developer
Responsibilities:
- Coordinated with the ETL team to implement ETL stored procedures for enhancements to the existing Dim and Fact tables, and maintained effective awareness of all production activities according to required standards.
- Created JIRA tickets for each enhancement and documented all technical and system specifications for all ETL processes.
- Performed unit tests on all processes and prepared required programs and scripts.
- Analyzed and interpreted complex data on target systems, provided resolutions for data issues, and coordinated with Business Analysts to validate all requirements.
- Expertise in Teradata, BTEQ scripts, ETL (Informatica), UNIX, and the One Automation scheduler.
- Experienced in automating and scheduling ETL applications and Teradata BTEQ/SQL scripts in UNIX via job schedulers (One Automation and Tidal).
- Experienced in data profiling, data analysis, UAT data loads, and end-to-end production implementation.
- Provided project-level analysis, producing required project analysis documentation: business requirements, future-state proposals, and UAT plans.
- Experienced in creating test cases, testing strategies, UAT plans, and production validation approaches.
- Experienced in data movement via ETL and Teradata utilities such as FASTEXPORT, FASTLOAD, MULTILOAD, and IMPORT.
- Well acquainted with dimensional modeling and slowly changing dimension concepts.
- Interacted with different system groups for systems analysis.
- Analyzed applications to be changed for particular business requirements.
- Developed SQL join indexes to optimize strategic and tactical queries.
- Implemented SCD Type 4 logic for capturing data changes.
- Involved in unit testing, systems integration testing, and user acceptance testing.
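
The SCD Type 4 logic mentioned above keeps a current table alongside a separate history table. A minimal sketch of the pattern, with hypothetical table and column names, might look like this:

```sql
-- Step 1: archive the prior version of each changed row into the
-- history table before overwriting it (tables are hypothetical).
INSERT INTO edw.customer_hist (customer_id, addr, eff_dt, end_dt)
SELECT c.customer_id, c.addr, c.eff_dt, CURRENT_DATE
FROM   edw.customer_curr c
JOIN   stg.customer_delta d
  ON   c.customer_id = d.customer_id
WHERE  c.addr <> d.addr;

-- Step 2: overwrite the current table with the latest values,
-- using Teradata's UPDATE ... FROM join-update syntax.
UPDATE c
FROM   edw.customer_curr AS c, stg.customer_delta AS d
SET    addr   = d.addr,
       eff_dt = CURRENT_DATE
WHERE  c.customer_id = d.customer_id
  AND  c.addr <> d.addr;
```

Unlike SCD Type 2, which versions rows inside a single table, this keeps the current table small for tactical queries while the full change history remains available in the companion table.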