
Teradata Developer/Analyst Resume


Charlotte, NC

SUMMARY

  • Over 11 years of technical experience in Data Warehousing, covering business requirements analysis, data modeling, design, development, testing, and full life cycle implementation of data warehouses.
  • Worked on data warehousing projects using Informatica PowerCenter, Teradata, Oracle SQL, Netezza, DB2, PL/SQL, SAS, Python, AWS, UNIX, Tableau, AutoSys, the UC4 scheduler, Harvest, Bitbucket, etc.
  • Worked on migrating data to Teradata from heterogeneous sources including, but not limited to, Teradata, Sybase, Oracle, MS SQL Server, and flat and COBOL files.
  • Strong knowledge of Teradata RDBMS architecture (AMPs, PEs, BYNET, data distribution by index/hash, cylinder index, master index, blocks, and sectors).
  • Established experience and expertise in providing simple to complex BI solutions using the various Teradata utilities and database features (BTEQ, FastLoad, MultiLoad, FastExport, TPT, TPump, stored procedures, macros, and triggers).
  • Extensive experience in designing solutions using the various components of the Informatica PowerCenter suite (Designer, Manager, and Monitor).
  • Expertise in RDBMS, data warehouse architecture, and modeling, with thorough understanding and experience in data warehouse and data mart design, star schema, snowflake schema, Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3; an illustrative Type 2 load pattern is sketched after this list), and normalization and denormalization techniques.
  • Well versed with Data Warehousing concepts, design techniques and Dimensional modeling.
  • Thorough understanding of Teradata’s MPP, shared-nothing architecture, including nodes, AMPs, BYNET, partitioning, primary indexes, etc.
  • Experience in producing technical and design specification documents from business/user specifications.
  • Experience designing and implementing ETL solutions using various transformations from the Informatica Designer suite.
  • Experience in tuning existing objects (Informatica and Teradata scripts) to improve performance.
  • Worked on reviewing and implementing various DML and DDL scripts/commands, indexes, and triggers in Teradata.
  • Proficient in mapping data from existing source systems to the target system per requirements and applying transformation and conversion rules based on user-specified business specs.
  • Demonstrated experience in documenting ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, functional, and regression testing, as well as preparing test data and performing error handling and analysis.
  • Expert in creating ETL mapping documents and data dictionaries, data flow diagrams and system context flows.
  • Experience automating and scheduling enterprise DW jobs using UNIX shell scripting, AutoSys, Control-M, Tivoli Workload Scheduler, and UC4.
  • Experience using version control systems: Harvest, Endevor, GitHub, TortoiseSVN, and Bitbucket.
  • Proven work experience in a multi-mode (onsite/offshore) development model.
  • Well versed in reading explain plans and confidence levels, with a very good understanding of database skew.
  • Experience in Agile methodologies; Certified Scrum Master.
  • Demonstrated ability to easily grasp new ideas, concepts, methods and technologies.
  • Experienced in providing 24x7 on-call support as part of a scheduled rotation with other team members.
  • Excellent communication and interpersonal skills.
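
The Slowly Changing Dimension work noted above typically follows a two-step expire-and-insert pattern. A minimal Teradata SQL sketch of an SCD Type 2 load is shown below; dw_db.dim_customer, stg_db.stg_customer, and their columns are hypothetical names used only to illustrate the technique, not details from any specific project.

    /* SCD Type 2 sketch; all object and column names are illustrative */

    /* Step 1: expire the current dimension row when a tracked attribute changes */
    UPDATE tgt
    FROM dw_db.dim_customer tgt, stg_db.stg_customer stg
    SET end_dt = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE tgt.cust_id = stg.cust_id
      AND tgt.current_flag = 'Y'
      AND (tgt.cust_nm <> stg.cust_nm OR tgt.addr_txt <> stg.addr_txt);

    /* Step 2: insert a new current version for new customers and for the
       changed customers whose old row was just expired above */
    INSERT INTO dw_db.dim_customer
      (cust_id, cust_nm, addr_txt, start_dt, end_dt, current_flag)
    SELECT stg.cust_id, stg.cust_nm, stg.addr_txt,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_db.stg_customer stg
    LEFT JOIN dw_db.dim_customer tgt
      ON  tgt.cust_id = stg.cust_id
      AND tgt.current_flag = 'Y'
    WHERE tgt.cust_id IS NULL;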

TECHNICAL SKILLS

Teradata Tools: BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, Teradata SQL Assistant, and Teradata Viewpoint

ETL Tool/Reporting Tool: Informatica PowerCenter

Databases: Teradata V2R6/12/14/15/16, Oracle 11g/10g/9i, MS SQL Server 2008/2005, Netezza, DB2

Scheduling Tools: AutoSys, Control-M, TWS (Tivoli Workload Scheduler), UC4

Version Control Tools: Harvest, Endevor, GitHub, Bitbucket, and TortoiseSVN

Programming Languages: Teradata SQL, Python and UNIX Shell Scripting

Data Modeling/Methodologies: Logical/Physical/Dimensional, Star/Snowflake, ETL, OLAP, Complete Software Development Life Cycle

Operating Systems: Windows and Linux/UNIX

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Teradata Developer/Analyst

Responsibilities:

  • Gather business requirements, design wireframes, and prepare mapping documents.
  • Perform risk analysis and impact analysis.
  • Responsible for developing the ETL (Extract, Transform, and Load) processes using Teradata utilities for data transformation activities.
  • Use Teradata SQL and PL/SQL.
  • Create, optimize, review, and execute Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Create several BTEQ scripts involving derived tables and volatile/global temporary tables to extract ad hoc data for several business users on a scheduled basis (a representative sketch follows this list).
  • Tune user queries by analyzing explain plans, recreating the user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes to improve performance.
  • Use volatile tables and derived queries to break complex queries into simpler ones.
  • Create test cases to unit test the code prior to handover to QA, perform SQL tuning, and provide support for UAT testing.
  • Demonstrate the final project to the business team.
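
A representative BTEQ sketch of the kind of scheduled, ad hoc extract described above; the logon, database, table, and file names are placeholders rather than details from this engagement.

    .LOGON tdprod/etl_user,password

    /* Stage the driver set in a volatile table so the final join stays small */
    CREATE VOLATILE TABLE vt_active_accts AS
    (
      SELECT acct_id
      FROM   dw_db.acct_dim
      WHERE  acct_status_cd = 'A'
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    .IF ERRORCODE <> 0 THEN .QUIT 1

    /* Write the ad hoc extract to a flat file for the business users */
    .EXPORT REPORT FILE = /data/out/active_acct_balances.txt

    SELECT v.acct_id, b.balance_amt, b.as_of_dt
    FROM   vt_active_accts v
    JOIN   dw_db.acct_balance_fact b
      ON   b.acct_id = v.acct_id
    WHERE  b.as_of_dt = CURRENT_DATE - 1;

    .EXPORT RESET
    .LOGOFF
    .QUIT 0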

Environment: Teradata, MicroStrategy, PL/SQL, Linux, AutoSys, Jira

Confidential, Charlotte, NC

Teradata Developer

Responsibilities:

  • Participate in the full SDLC, including gathering of business requirements, architecture, design documentation, planning, development, deployment, and maintenance.
  • Perform risk analysis and impact analysis in order to be prepared for potential business and technical risks to the system.
  • Responsible for developing the ETL (Extract, Transform, and Load) processes using Teradata utilities for data transformation activities.
  • Use Teradata SQL and BTEQ.
  • Create, optimize, review, and execute Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Create several BTEQ scripts involving derived tables and volatile/global temporary tables to extract ad hoc data for several business users on a scheduled basis.
  • Tune user queries by analyzing explain plans, recreating the user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes to improve performance (an illustrative tuning sketch follows this list).
  • Create a cleanup process for removing all the intermediate temp files used prior to the loading process.
  • Use volatile tables and derived queries to break complex queries into simpler ones.
  • Create test cases to unit test the code prior to handover to QA, perform SQL tuning, and provide the testing approach for unit and SIT testing.
  • Streamline the migration process for Teradata and shell scripts on the UNIX box.
  • Perform unit testing of the developed mappings and ETL processes to check that the transformations execute as expected.
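
An illustrative sequence for the query tuning described above; dw_db.txn_drv and dw_db.txn_fact are hypothetical tables used only to show the explain plan, primary index, and statistics steps.

    /* 1. Review the optimizer's plan and confidence levels before changing anything */
    EXPLAIN
    SELECT d.cust_id, SUM(f.txn_amt)
    FROM   dw_db.txn_drv  d
    JOIN   dw_db.txn_fact f
      ON   f.cust_id = d.cust_id
    GROUP BY d.cust_id;

    /* 2. Recreate the driver table on the join column so its rows hash to the
          same AMPs as the fact table and the join can be done AMP-locally */
    CREATE TABLE dw_db.txn_drv_new AS
    (
      SELECT * FROM dw_db.txn_drv
    ) WITH DATA
    PRIMARY INDEX (cust_id);

    /* 3. Collect the statistics the optimizer needs for accurate join planning */
    COLLECT STATISTICS ON dw_db.txn_drv_new COLUMN (cust_id);
    COLLECT STATISTICS ON dw_db.txn_fact    COLUMN (cust_id);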

Environment: Teradata, Informatica, SAS, Oracle 11g, PL/SQL, Linux, AutoSys, Jira, Bitbucket

Confidential, Lafayette, LA

Teradata/Informatica Developer

Responsibilities:

  • Gather requirements, perform analysis, and prepare technical design documents.
  • Create SQL scripts using the Teradata utilities BTEQ, MultiLoad, FastLoad, and TPump (a sample FastLoad sketch follows this list).
  • Responsible for developing the ETL (Extract, Transform, and Load) processes using Teradata utilities for data transformation activities.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter 10.2 up to the staging area, and using Teradata utilities for further data transformation activities.
  • Use Informatica Workflow Manager to create, execute, and monitor sessions and workflows.
  • Create, optimize, review, and execute Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Create several BTEQ scripts involving derived tables and volatile/global temporary tables to extract ad hoc data for several business users on a scheduled basis.
  • Tune user queries by analyzing explain plans, recreating the user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes to improve performance.
  • Create a cleanup process for removing all the intermediate temp files used prior to the loading process.
  • Use volatile tables and derived queries to break complex queries into simpler ones.
  • Create test cases to unit test the code prior to handover to QA, perform SQL tuning, and provide the testing approach for unit and SIT testing.
  • Streamline the migration process for Teradata and shell scripts on the UNIX box.
  • Perform unit testing of the developed mappings and ETL processes to check that the transformations execute as expected.
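
A minimal FastLoad sketch of the kind of utility script referenced above; the TDPID, credentials, database and table names, and file path are illustrative placeholders, and the staging table is assumed to already exist with compatible column types.

    LOGON tdprod/etl_user,password;

    /* FastLoad requires an empty target table with no secondary indexes */
    DELETE FROM stg_db.customer_stg;

    /* Pipe-delimited input file; with VARTEXT every input field is VARCHAR */
    SET RECORD VARTEXT "|";

    DEFINE
      in_cust_id (VARCHAR(10)),
      in_cust_nm (VARCHAR(100)),
      in_open_dt (VARCHAR(10))
    FILE = /data/in/customer.dat;

    BEGIN LOADING stg_db.customer_stg
      ERRORFILES stg_db.customer_err1, stg_db.customer_err2
      CHECKPOINT 100000;

    INSERT INTO stg_db.customer_stg (cust_id, cust_nm, open_dt)
    VALUES (:in_cust_id, :in_cust_nm, :in_open_dt);

    END LOADING;
    LOGOFF;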

Environment: Teradata, Informatica, Oracle, SQL Server, Pandas, AWS, UNIX, GitHub

Confidential

Teradata/Informatica Developer

Responsibilities:

  • Analyze business requirements and provide ETL solutions.
  • Create SQL scripts using the Teradata utilities BTEQ, MultiLoad, FastLoad, and TPump.
  • Responsible for developing the ETL (Extract, Transform, and Load) processes using Teradata utilities for data transformation activities.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter 10.2 up to the staging area, and using Teradata utilities for further data transformation activities.
  • Use Informatica Workflow Manager to create, execute, and monitor sessions and workflows.
  • Create, optimize, review, and execute Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Analyze business requirements and translate them into functional and technical design specifications.
  • Participate in daily Scrum calls and interact directly with clients to gather requirements.
  • Provide extended support for SIT and migration activities.

Environment: Teradata, Informatica, Oracle, Tableau, UNIX, Harvest.

Confidential

Informatica Developer

Responsibilities:

  • Perform application maintenance and issue resolution.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter 10.2 up to the staging area, and using Teradata utilities for further data transformation activities.
  • Use Informatica Workflow Manager to create, execute, and monitor sessions and workflows.
  • Fix production bugs in Informatica code.
  • Deliver service requests, bug fixes, and enhancements.
  • Provide production support and handle upgrades and migration activities.
  • Handle tickets and incident management.
  • Provide on-call production support for resolving high-priority tickets, and assist with project management.

Environment: Informatica, Oracle 10g, Netezza, UC4 scheduling tool

Confidential

ETL Developer/Analyst

Responsibilities:

  • Designed and developed ETL processes using Informatica PowerCenter 8.6.2.
  • Created Informatica mappings for the history load and the data mart.
  • Involved in creating complex mappings using various transformations.
  • Reviewed the mappings and prepared query logs and implementation plans.
  • Provided extended support to SIT.
  • Integrated the DWH setup for other subsidiaries across 17 countries.
  • Provided technical guidance to team members and mentored junior developers.

Environment: Teradata, Informatica, Tableau, UNIX, PL/SQL, and Oracle 10g.

Confidential 

ETL Developer/Tester

Responsibilities:

  • Understood the requirements and created mapping documents, ETL design documents, etc.
  • Created the mappings/workflows per the ETL documents for Source to Hub, Hub to Data Mart dimension tables, and Hub to Data Mart fact tables, and loaded history and monthly data into the tables.
  • Wrote logical test cases for the modules built above and performed unit testing. Executed test cases against the TEST/production environments, verified the code for defects, and provided solutions.
  • Prepared the migration checklist for code migration, wrote the project process documents, and shared them with clients.

Environment: Informatica PowerCenter 8.6.2, Teradata, Netezza, UNIX Shell Scripts, UNIX Server.
