Teradata Developer Resume
Atlanta, GA
SUMMARY
- Over 5 years of experience in the IT industry, with extensive experience in Data Warehousing, Teradata administration, monitoring, performance tuning, capacity planning, upgrade/migration activities, and ETL/ELT.
- Strong experience administering Teradata databases, data modeling, and ETL.
- Solid programming experience in Teradata SQL and UNIX; proficient with Teradata administration tools including Viewpoint, Teradata Administrator, Teradata Manager, and Performance Monitor.
- Extensive experience in Teradata database design, application support, tuning and optimization, user and security administration, data administration, and setting up test and development environments.
- Solid knowledge of Teradata DBA work and query tuning.
- Experience with Teradata utilities including FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter, and BTEQ.
- Experience in Backup and Restore (BAR) methodologies; designed, implemented, and coordinated backup and restore strategies.
- Worked on Teradata archive/restore and data replication tools: ARCMAIN, TARA GUI, NetBackup, and Data Mover.
- Very good working knowledge of system (DBC) tables, ResUsage tables, DBQL, and PDCR tables.
- Expertise in using secondary indexes, partitioned primary indexes (PPI), join indexes, and hash indexes.
- Expertise in Teradata Parallel Transporter and scripting on Windows, UNIX, and mainframe servers.
- Expert in coding Teradata SQL, stored procedures, macros, and triggers.
- Involved in full lifecycle of various projects, including requirement gathering, system designing, application development, enhancement, deployment, maintenance and support.
- Expertise in query analysis, performance tuning, and testing.
- Hands-on experience monitoring and managing the varied, mixed workload of an active data warehouse using tools such as Viewpoint TASM, Teradata Dynamic Workload Manager, and Teradata Manager.
- Knowledgeable in designing, deploying, and operating highly available, scalable, and fault-tolerant systems using Amazon Web Services (AWS).
- Worked extensively with AWS services, with a broad and in-depth understanding of each.
- Highly skilled in deployment, data security, and troubleshooting of applications using AWS services.
- Experienced in implementing organizational DevOps strategy across Linux and Windows server environments, along with AWS cloud strategies.
- Proficient in writing CloudFormation templates (CFT) in YAML and JSON to build AWS services following the Infrastructure as Code paradigm.
- Experienced in deploying applications to their respective environments using Elastic Beanstalk.
- Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.
- Practical exposure to Continuous Integration/Continuous Delivery tools such as Jenkins, Bamboo, and AnthillPro to merge development with testing through pipelines.
- Implemented automation using configuration management tools such as Ansible, Chef, Puppet, and SaltStack.
- Worked with Docker container infrastructure to encapsulate code into a file system with abstraction and automation.
- Working experience with crontab scheduling.
- Experience in writing UNIX shell scripts to support and automate the ETL process.
- Experience in different database architectures like Shared Nothing and Shared Everything architectures. Very good understanding of SMP and MPP architectures.
- Involved in Unit Testing, Integration Testing and preparing test cases.
- Proficient in analyzing business processes requirements and translating them into technical requirements.
- Knowledge of the Teradata Unity system in the environment.
- Provided 24/7 on-call production support and resolved database issues.
- Strong problem-solving, analytical, interpersonal, and communication skills; able to work both independently and in a team.
- Highly enthusiastic and self-motivated; rapidly assimilates new concepts and technologies.
- Good at analyzing explain plans and tuning SQL queries.
- Good understanding of ETL tools such as Informatica, DataStage, and mainframe-based ETL, and the BI tool Tableau.
- Worked in Financial and healthcare domains.
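As an illustration of the crontab scheduling and UNIX shell automation listed above, a minimal ETL wrapper sketch (the job name, paths, schedule, and step commands are all hypothetical placeholders, not an actual production script):

```shell
#!/bin/sh
# run_etl.sh - minimal ETL driver sketch (hypothetical paths and commands).
# A crontab entry such as the following would run it nightly at 2 AM:
#   0 2 * * * /opt/etl/run_etl.sh >> /var/log/etl/run_etl.log 2>&1

LOG_PREFIX="[etl]"

run_step() {
    # Run one ETL step, log its outcome, and stop the pipeline on failure.
    step_name=$1
    shift
    echo "$LOG_PREFIX starting: $step_name"
    if "$@"; then
        echo "$LOG_PREFIX ok: $step_name"
    else
        echo "$LOG_PREFIX FAILED: $step_name" >&2
        exit 1
    fi
}

# Placeholder commands stand in for real extract/load utilities
# (e.g. a FastExport or BTEQ invocation in practice).
run_step "extract" true
run_step "load" true
echo "$LOG_PREFIX pipeline complete"
```

Each step's exit status is checked so a failed extract never triggers a load; cron captures the echoed log lines via the redirection in the crontab entry.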
TECHNICAL SKILLS
Data Warehousing: Teradata V15/V14/V13/V2R12/V2R6, Oracle 11g/10g/9i, SQL Server 2000, MS Access
Languages: SQL, PL/SQL, C, C++, UNIX shell scripting, PERL
Operating Systems: HP-UX, Sun Solaris, Red Hat Linux, Windows 2003/2000/XP/NT, IBM AIX
DB Utilities: BTEQ, Viewpoint, FastLoad, MultiLoad, FastExport, TPump, SQL*Loader, Exp/Imp, TD Administrator, TD Manager, TSET, SQL Assistant, Visual Explain, TASM
Tools: CVS, VSS, ARCMAIN, Teradata Administrator, AWS (EC2, ELB, S3, CloudWatch, CloudFormation), Visual Explain, SQL Assistant, Toad, PuTTY, WinSCP, Cygwin, Oracle Developer 2000, SQL*Plus, Crystal Reports, Remedy.
Others: MS Office, Telnet, Adobe Acrobat Reader/Writer, XML, HTML, UML, SFTP, FTP, SCP, TCP/IP
PROFESSIONAL EXPERIENCE
Confidential, Atlanta, GA
Teradata Developer
Responsibilities:
- Analyze business requirements and provide solutions to accurately migrate data/content from different source platforms (such as file shares, documents, and SQL databases) to the target Veeva Vault system.
- Prepare requirements documents: technical summaries of the migration process detailing the amount of metadata, the number of files processed, and the data inventory.
- Create PowerShell scripts and DQL queries to analyze the structure of the data source.
- Assist in planning and documenting the migration specification and value-mapping specification, including the code used in the process, the tool versions, and an explanation of the files stored in the target system.
- Create queries, procedures, modules, or macros using SQL or VBA to manipulate and cleanse data exported from the source.
- Use regular expressions in Notepad++ to write WinSCP scripts for FTP file copying from the source.
- Design and develop migration packages using TRUmigrate between various source and target systems.
- Create complex SQL/DQL scripts joining primary and multiple supplemental tables to retrieve the attribute values required for migration.
- Work with the following database systems/platforms and software: Veeva, file shares, Documentum, RaeDocs, Microsoft Access, VBA, Microsoft SQL Server, FileZilla FTP, Postman API client, ODBC, Oracle, Argus, TRUmigrate, TRUcompare.
- Create WinSCP scripts to transfer files from the source system to the FTP server.
- Use TRUcompare for data scrubbing, including data validation checks during staging after migration of metadata, content, and renditions.
- Worked in Oracle PL/SQL to create stored procedures, packages, and various functions.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, along with DML and DDL.
- Constructed Korn shell driver routines (wrote, tested, and implemented UNIX scripts).
- Wrote views based on user and/or reporting requirements.
- Wrote Teradata macros and used various Teradata analytic functions.
- Involved in migration projects moving data from Oracle and DB2 data warehouses to Teradata.
- Performance-tuned and optimized various complex SQL queries.
- Wrote many UNIX shell scripts in support of these processes.
- Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
- Gathered system design requirements; designed and wrote system specifications.
- Developed design documentation and an implementation proposal for extracting data from the W data warehouse.
- Developed modules to extract, process, and transfer customer data using Teradata utilities.
- Monitor migration/compare jobs to ensure they run smoothly and debug any errors that occur; analyze migration logs to verify migration accuracy after each job completes.
- Work with the QA team to meet project-specific compliance requirements.
- Take responsibility for debugging and exercise judgment in technical assessments.
- Create business migration summary reports for the managing director and end users, with detailed discrepancy explanations and analysis.
- Created FastExport scripts for extracting and formatting customer data from the data warehouse into a mainframe file.
- Created FTP IDs to set up a transmission path between the mainframe and UNIX machines.
- Developed an environment to receive an external source file for data missing from the W tables.
- Created a portable fully automated test tool, allowing 24/7 integration support for two development sites around the world and decreasing code turnaround time from 4 hours to 1 hour
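The FastExport scripting described above might look roughly like the following sketch, which generates a control file from a shell wrapper. The logon string, database, table, and file names are placeholders, not values from an actual environment:

```shell
#!/bin/sh
# build_fexp.sh - sketch of generating a FastExport control file.
# All identifiers (tdpid, etl_user, edw.customer, file names) are hypothetical.
OUTFILE=customer_extract.fx

cat > "$OUTFILE" <<'EOF'
.LOGTABLE work_db.fexp_log;
.LOGON tdpid/etl_user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE customer_extract.dat MODE RECORD FORMAT TEXT;
SELECT cust_id, cust_name
FROM edw.customer;
.END EXPORT;
.LOGOFF;
EOF

echo "wrote $OUTFILE ($(wc -l < "$OUTFILE") lines)"
```

In practice the generated file would be fed to the `fexp` utility and the resulting data file transmitted to the mainframe over the FTP path mentioned above.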
Confidential, Richmond, VA
Teradata Developer/Data Analyst
Responsibilities:
- Involved in gathering requirements from business and IT managers and created documentation template and documented project requirements.
- Created SSIS packages to extract data from diverse locations and load it into SQL Server for further tuning
- Used T-SQL to develop Stored Procedures, User-Defined Functions (UDFs), Indexed Views, and DML Triggers to facilitate ETL processes in SQL Server 2008
- Developed ETL packages in SSIS to ensure data quality and data consistency using Lookup, Execute SQL Task, Sequence Container, Breakpoint, and Checkpoint
- Used various SSIS transformations, such as Conditional Split and Derived Column, for data scrubbing, including data validation checks during staging before loading data into the data warehouse
- Developed SQL Server Integration Services (SSIS) ETL packages and other back-end processes to support system integration and ETL processes to facilitate the creation and population of the Enterprise Data Warehouse
- Exported the data mart to the SQL Server Analysis Services (SSAS) server to make resources available to the users and created reports containing all the information regarding the publishers, distributors and retailers using SSRS and Tableau for internal and external use
- Designed the new Credit Risk report using Transact-SQL (T-SQL) and SSRS, illustrated by interactive dashboards, linked reports, sub-reports, parameterized reports and drill-down, drill-through reports using Tableau
- Completed a lift-and-shift migration from Teradata to Snowflake
- Validated and compared data to confirm it matched between Snowflake and Teradata
- Worked on daily, weekly, and monthly processes to insert new data into Snowflake
- Converted Excel macros from Teradata to Snowflake
- Worked on various macros that required data to be inserted and run manually
- Converted SAS scripts to Databricks. Created test cases during two-week sprints using Agile methodology
- Designed data visualizations to present current impact and growth. Developed natural language processing and machine learning systems
- Used GitHub and Crescendo for scheduling jobs. Worked on processes such as WMSC, QBR, and Calls Monitoring.
- Defined database alerts using tools such as Viewpoint and UNIX scripting to define, set up, respond to, and act on alert conditions.
- Involved in system maintenance and database upgrades: planning, scheduling, coordination, communications, implementation, and testing.
- Managed Data Mover job performance and server maintenance. Involved in backup/restore: performing backups, recovery, and monitoring; implemented the Online Archive feature for certain databases that had ETL conflicts.
- Managed database space, allocating new space to databases and moving space between databases as needed. Performed performance tuning and query optimization (explain plans, collecting statistics, primary and secondary indexes).
- Converted SISA Customer Resiliency scripts from Teradata to Snowflake and scheduled those jobs in Crescendo.
- Designed SSIS packages to retrieve data files from FTP and SFTP servers, decrypt PGP files, batch-process incoming data, and purge old files. Created the high-level design of the ETL process.
- Developed ETL processes using SSIS with various Control Flow and Data Flow tasks and stored procedures for the Work Order validation process.
- Excellent report-creation skills using Microsoft SQL Server Reporting Services (SSRS), with proficiency in both Report Designer and Report Builder.
- Developed custom reports and deployed them to the server using SSRS.
- Generated periodic reports based on statistical analysis of the data using SSRS.
- Well versed in defining, creating, and handling data sources, data source views, and parameterized reports in SSRS 2008 R2.
- Designed and created cubes in SSAS.
- Worked on SSAS storage, partitions, and aggregations; calculated queries with MDX; built data mining models; and developed reports using MDX and SQL.
- Created sophisticated calculated members using MDX queries; designed and created aggregations to speed up queries and improve performance in SSAS.
- Involved in creating dimensions, KPIs, measures, and calculations in SSAS 2008 R2.
- Worked with different logging methods in SSIS. Generated reports from the cubes by connecting to the Analysis Services server from SSRS.
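The Teradata-to-Snowflake validation work above can be sketched as a row-count comparison. In practice the two count files would come from BTEQ and SnowSQL exports; here they are created inline with made-up table names and counts so the comparison logic is self-contained:

```shell
#!/bin/sh
# compare_counts.sh - sketch of post-migration row-count validation.
# Table names and counts below are illustrative placeholders.

cat > td_counts.txt <<'EOF'
customer 10250
orders 88341
EOF

cat > sf_counts.txt <<'EOF'
customer 10250
orders 88341
EOF

mismatches=0
while read -r table td_count; do
    # Look up the same table's count on the Snowflake side.
    sf_count=$(awk -v t="$table" '$1 == t { print $2 }' sf_counts.txt)
    if [ "$td_count" = "$sf_count" ]; then
        echo "MATCH    $table ($td_count rows)"
    else
        echo "MISMATCH $table (td=$td_count sf=$sf_count)"
        mismatches=$((mismatches + 1))
    fi
done < td_counts.txt

echo "mismatched tables: $mismatches"
```

A scheduler such as Crescendo could run this after each load and alert on a nonzero mismatch count; fuller validation would also compare checksums or sampled rows, not just counts.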
Environment: Teradata, Snowflake, Teradata SQL Assistant 16.00, SQL Workbench, Windows XP, DataStage, Databricks.