Teradata Developer Resume

Columbus, OH

SUMMARY

  • Over 8 years of IT experience in Teradata, Informatica, SQL, PL/SQL and UNIX shell scripting as a developer, with strong expertise in SQL queries, stored procedures and Teradata macros.
  • Extensive experience in ETL (Extract Transform Load), Data Integration and Data Warehousing using Informatica Power Center 9.5/8.1/7.1/6.2.
  • Expertise in maintaining data quality, data organization, metadata and data profiling.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, join indexes and hash indexes in the Teradata database.
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support and UNIX shell scripting.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle.
  • Excellent experience with different index types (PI, SI, JI, AJI, and PPI including MLPPI and SLPPI) and COLLECT STATISTICS; a partitioned-table sketch follows this summary.
  • Complete domain and development life-cycle knowledge of data warehousing and client/server concepts, plus knowledge of basic data modeling.
  • Experience working with both 3NF and dimensional models such as star schema and snowflake schema for data warehouses, and a good understanding of OLAP/OLTP systems.
  • Proficient in preparing high- and low-level documents such as design and functional specifications; performed various kinds of testing, including unit testing.
  • Actively involved in quality processes and release management activities to establish, monitor and streamline quality processes in the project.
  • Proficient in creating reports using Business Objects XI R2 functionalities such as Queries, Master/Detail and Formula, Slice and Dice, Drilling, Cross Tab and Charts.
  • Good knowledge on Agile Methodology and the scrum process.
  • Proficient in coding optimized Teradata batch processing scripts for data transformation, aggregation and load using BTEQ (see the BTEQ sketch after this summary).
  • Expertise in RDBMS, database normalization and denormalization concepts and principles.
  • Good Experience in Functions, Database Design, Query Optimization and Performance tuning.
  • Experience with Import/Export, Data Pump, SQL*Loader and built-in packages in Oracle.
  • Worked extensively on development of large projects with complete end-to-end participation in all areas of the Software Development Life Cycle, and maintained documentation.
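
The BTEQ batch scripting cited above is illustrated by the minimal sketch below; it is a generic example, not code from any specific engagement, and the logon values and table names (stg_sales, fact_sales_daily) are hypothetical placeholders.

    /* Placeholder TDPID and credentials */
    .LOGON tdprod/etl_user,password;
    .SET ERROROUT STDOUT;

    /* Aggregate staged sales into a daily fact table (hypothetical names) */
    INSERT INTO fact_sales_daily (store_id, sale_date, total_amt)
    SELECT store_id, sale_date, SUM(sale_amt)
    FROM   stg_sales
    GROUP  BY store_id, sale_date;

    /* Fail the batch with a non-zero return code if the insert errored */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    COLLECT STATISTICS ON fact_sales_daily COLUMN (store_id);
    .LOGOFF;
    .QUIT 0;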
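
The summary also cites PPI/MLPPI experience; a minimal multi-level partitioned DDL sketch follows, with all names and date ranges invented for illustration.

    /* Hypothetical MLPPI table: row-partitioned by month, sub-partitioned by region */
    CREATE MULTISET TABLE sales_txn
    (
      txn_id    BIGINT NOT NULL,
      store_id  INTEGER NOT NULL,
      region_cd BYTEINT NOT NULL,
      txn_date  DATE NOT NULL,
      txn_amt   DECIMAL(12,2)
    )
    PRIMARY INDEX (txn_id)
    PARTITION BY (
      RANGE_N(txn_date BETWEEN DATE '2010-01-01' AND DATE '2014-12-31'
              EACH INTERVAL '1' MONTH),
      RANGE_N(region_cd BETWEEN 1 AND 50 EACH 1)
    );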

TECHNICAL SKILLS

ETL tools: Informatica Power Center 9.5.1/9.0.1/8.6.1/7.x/6.x

Programming Languages: Teradata SQL, PL/SQL, C, C++

Databases: Teradata (V2R12, V2R6, V2R5), SQL Server 2000, Oracle (PL/SQL)

Teradata Tools & Utilities: TASM, BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant, Teradata Administrator, TSET, Index Wizard, Statistics Wizard.

Data Modeling Tools: PowerDesigner.

Migration Tools: Jenkins

Process/Methodologies: Waterfall, Agile Methodology

Operating Systems: MS-DOS, UNIX, Windows NT/2000/XP and Linux

Languages: SQL, PL/SQL, Teradata Macros, BTEQ, MLOAD, FASTLOAD, FASTEXPORT, Shell scripting

RDBMS: Oracle 8i/9i/10g, MS SQL Server, Teradata V2R5/R6/R12/R13

Testing Tools: HP QuickTest Professional (QTP), Quality Center, Rational Suite (RequisitePro, Rose, Robot, Test Manager, ClearQuest, ClearCase), Selenium

Tools: Teradata SQL Assistant, Teradata Manager, PMON, PuTTY, Informatica

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Teradata Developer

Responsibilities:

  • Responsible for gathering requirements from Business Analysts and Operational Analysts and identifying the data sources required for the requests
  • Developed Informatica mappings using various transformations, sessions and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Teradata and Excel files.
  • Involved in creating stored procedures to support recovery (a sketch follows this list).
  • Responsible for working closely with the Informatica administrator to migrate Source and Target definitions, Mappings, Workflows, and Flat Files from development environment to the production environment.
  • Extensively used the Lookup and Update Strategy Transformations for implementing the Slowly Changing Dimensions.
  • Used different workflow tasks such as Email and Command tasks.
  • Involved with the DBA in performance tuning of the Informatica sessions and workflows. Created the reusable transformations for better performance.
  • Used Informatica Data Explorer and Informatica Data Quality for Data Quality Management and Data profiling purposes.
  • Performed Data Integration, Data Standardization using IDQ.
  • Hands-on experience with the Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant and BTEQ.
  • Used TOAD and FTP processes to move files to and from source systems.
  • Scheduled jobs in Workflow Manager to trigger tasks on a daily, weekly and monthly basis.
  • Involved in the mirroring of the staging environment to production.
  • Created and reviewed the Design and Code Review Templates.
  • Involved in conducting the unit tests, System tests.
  • Scheduled jobs using Autosys to automate the Informatica sessions.
  • Optimized the Autosys batch flow.
  • Developed control files and stored procedures to manipulate and load data into the Oracle database.
  • Optimized queries using SQL Navigator.
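
As a flavor of the recovery stored procedures mentioned above, here is a minimal Teradata SPL sketch. The procedure, table and column names (etl_recover_load, fact_orders, stg_orders, batch_id) are hypothetical placeholders, not the project's actual objects.

    /* Re-runs one batch window after a failed load (illustrative only) */
    CREATE PROCEDURE etl_recover_load (IN p_batch_id INTEGER)
    BEGIN
      /* Remove whatever the failed batch left behind */
      DELETE FROM fact_orders WHERE batch_id = :p_batch_id;

      /* Reload the window from the staging table */
      INSERT INTO fact_orders (order_id, order_dt, order_amt, batch_id)
      SELECT order_id, order_dt, order_amt, :p_batch_id
      FROM   stg_orders
      WHERE  batch_id = :p_batch_id;
    END;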

Environment: Informatica Power Center 9.5.1, IDQ 8.6.0, Teradata V14, BTEQ, MLOAD, FLOAD, Oracle, SQL, UNIX, Windows XP.

Confidential, AZ

Teradata/Informatica Developer

Responsibilities:

  • Interacted with Business Analysts for requirement gathering, understanding the requirements, and explaining technical feasibility and application flow.
  • Developed ETL mappings, transformations using Informatica Power Center 9.5.1
  • Extracted data from flat files (provided by disparate ERP systems) and loaded the data into Teradata staging using Informatica Power Center.
  • Analyzed Source Data to resolve post-production issues. Used MS Access to analyze source data from flat files.
  • Designed and created complex source to target mapping using various transformations inclusive of but not limited to Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
  • Extensively used Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions.
  • Used Mapping Parameters and Mapping Variables based on business rules provided.
  • Scheduled workflows on a daily basis for incremental data loading.
  • Wrote PL/SQL Procedures for data extractions, transformation and loading.
  • Assisted in Data Modeling and Dimensional Data Modeling.
  • Used Informatica Data Quality Tool (Developer) to scrub, standardize and match customer Address against the reference table.
  • Used different IDQ transformations in the developer and created mapping to meet business rules.
  • Wrote BTEQ scripts to load data into dimension tables with their hierarchies (an SCD-style sketch follows this list).
  • Involved in performance tuning by determining bottlenecks at various points such as targets, sources, mappings, sessions and the system, which led to better session performance.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
  • Scheduling jobs using Autosys to automate the Informatica Sessions.
  • Used TOAD and FTP processes to move files to and from source systems.
  • Performed Unit testing for all the interfaces.
  • Used Test Director to log and keep track of defects.
  • Provided Production Support at the end of every release.
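
The SCD and BTEQ bullets above describe Type 2 dimension maintenance; below is a minimal sketch of the SQL such a BTEQ script might run. The dimension and staging names (dim_customer, stg_customer) and the tracked attribute (cust_addr) are hypothetical.

    /* Step 1: expire the current row when the tracked attribute changed */
    UPDATE d
    FROM dim_customer AS d, stg_customer AS s
    SET end_dt = CURRENT_DATE, current_flag = 'N'
    WHERE d.cust_id = s.cust_id
      AND d.current_flag = 'Y'
      AND d.cust_addr <> s.cust_addr;

    /* Step 2: insert new and changed customers as the current version */
    INSERT INTO dim_customer (cust_id, cust_addr, start_dt, end_dt, current_flag)
    SELECT s.cust_id, s.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
      ON   d.cust_id = s.cust_id AND d.current_flag = 'Y'
    WHERE  d.cust_id IS NULL;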

Environment: Informatica Power Center 9.5.1, Teradata V12, BTEQ, MLOAD, FLOAD, Oracle, SQL, PL/SQL, UNIX, Windows XP.

Confidential, MI

Teradata/Informatica Developer

Responsibilities:

  • Experience in relational database theory and design, including logical and physical structures and data normalization techniques.
  • Involved in Designing and Development of logical and physical data models of systems that hold Terabytes of data.
  • Used Informatica 9.5.1 for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Identified and removed duplicate records using IDQ matching components such as Jaro Distance and Edit Distance.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Designing and developing Informatica mappings including Type-I, Type-II, Type-III slowly changing dimensions (SCD).
  • Designed several reusable components on Informatica Data Quality using Parsers, Standardizers and Reference tables which can be applied directly to standardize and enrich Address information
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated mappings, sessions and workflows from the Development to the Test and then to the UAT environment.
  • Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
  • Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.
  • Created a generic BTEQ script to load data into various target tables from flat files using a control-table mechanism.
  • Optimized high volume tables (Including collection tables) in Teradata using various join index techniques, secondary indexes, join strategies and hash distribution methods.
  • Highly proficient in writing loader scripts such as BTEQ, MultiLoad, FastLoad and FastExport.
  • Built tables with UPI, NUPI, USI and NUSI (a DDL sketch follows this list).
  • Involved in performing statistical analysis using SAS system and then collecting results and generating custom reports through a tool called the Document Builder
  • Worked exclusively with Teradata SQL Assistant to interface with the Teradata database.
  • Designed and developed mapping, transformation logic and processes in Informatica for implementing business rules and standardization of source data from multiple systems into the data warehouse.
  • Performed query optimization using EXPLAIN plans, COLLECT STATISTICS, and primary and secondary indexes.
  • Worked with DBAs on the transition from Development to Testing and from Testing to Production.
  • Used the Informatica Designer to develop processes for extraction, cleaning, transforming, and integrating.
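
The indexing bullets above (UPI/NUSI, join indexes, COLLECT STATISTICS) are illustrated by the minimal DDL sketch below; every object name is an invented placeholder rather than a table from this project.

    /* UPI gives even hash distribution and enforces uniqueness */
    CREATE MULTISET TABLE order_detail
    (
      order_id  BIGINT NOT NULL,
      cust_id   INTEGER NOT NULL,
      order_dt  DATE NOT NULL,
      order_amt DECIMAL(12,2)
    )
    UNIQUE PRIMARY INDEX (order_id);

    /* NUSI to support frequent customer lookups */
    CREATE INDEX idx_cust (cust_id) ON order_detail;

    /* Aggregate join index covering a common rollup query */
    CREATE JOIN INDEX ji_cust_amt AS
    SELECT cust_id, SUM(order_amt) AS tot_amt
    FROM   order_detail
    GROUP  BY cust_id;

    COLLECT STATISTICS ON order_detail COLUMN (order_id);
    COLLECT STATISTICS ON order_detail COLUMN (cust_id);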

Environment: Informatica 9.5.1, Teradata SQL Assistant 12, Teradata Administrator 6.0, Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, ARC), Teradata Manager, ASF, Teradata Priority Scheduler, JSC.

Confidential

Teradata Developer

Responsibilities:

  • Involved in writing BTEQ, FastLoad and MultiLoad scripts to load data into the target data warehouse.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Developed MLOAD scripts to load history data from the Hadoop dispatcher file path and to load Load Ready Files into the Teradata warehouse, capturing real-time data from the database redo logs (a MultiLoad sketch follows this list).
  • Tuning SQL Queries to overcome spool space errors and improve performance.
  • Involved in the Analysis and design of mapping.
  • Carrying out data reconciliation through queries in various source systems.
  • Created appropriate indexes based on each table's characteristics and requirements.
  • Worked with MultiLoad error tables (ET, UV and WT) and performed tuning around them.
  • Involved in unit testing and preparing test cases.
  • Created a shell script that checks the corruption of data file prior to the load.
  • Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad) and worked with loader logs.
  • Used volatile tables and derived tables to break complex queries into simpler ones.
  • Developed Teradata macros to implement business requirements (a macro sketch follows this list).
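
The MLOAD work above is illustrated by the minimal MultiLoad sketch below, which also shows where the ET, UV and WT tables come from. The logon values, file path and all object names are hypothetical placeholders.

    .LOGTABLE etl_db.ml_orders_log;
    .LOGON tdprod/etl_user,password;

    /* ET = acquisition error table, UV = uniqueness violations, WT = work table */
    .BEGIN IMPORT MLOAD
      TABLES      etl_db.orders
      WORKTABLES  etl_db.orders_wt
      ERRORTABLES etl_db.orders_et etl_db.orders_uv;

    .LAYOUT order_layout;
    .FIELD order_id  * VARCHAR(18);
    .FIELD order_amt * VARCHAR(14);

    .DML LABEL ins_orders;
    INSERT INTO etl_db.orders (order_id, order_amt)
    VALUES (:order_id, :order_amt);

    .IMPORT INFILE /data/orders.dat
      LAYOUT order_layout
      APPLY ins_orders;

    .END MLOAD;
    .LOGOFF;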
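
Likewise, a minimal Teradata macro sketch in the spirit of the last bullet; the macro name, table and columns are invented for illustration.

    /* Parameterized daily sales lookup (hypothetical objects) */
    CREATE MACRO sales_by_store (p_store INTEGER, p_date DATE) AS (
      SELECT store_id, sale_date, SUM(sale_amt) AS tot_amt
      FROM   daily_sales
      WHERE  store_id = :p_store
      AND    sale_date = :p_date
      GROUP  BY store_id, sale_date;
    );

    /* Execute it like any macro */
    EXEC sales_by_store (101, DATE '2013-06-30');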

Environment: Teradata 12, Teradata SQL Assistant, Teradata Administrator, Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, ARC), Teradata Manager.

Confidential

Teradata Developer

Responsibilities:

  • Proficient in importing/exporting large amounts of data from files to Teradata and vice versa.
  • Developed mappings to load data from source systems such as Oracle and AS/400 into the data warehouse; highly experienced in performance tuning and optimization to increase the efficiency of scripts.
  • Used joins such as inner and outer joins while creating tables from multiple tables.
  • Developed reports using advanced Teradata techniques such as RANK and ROW_NUMBER; communicated with business users and analysts on business requirements, and gathered and documented technical and business metadata about the data.
  • Worked on data verifications and validations to ensure that the data generated according to the requirements was appropriate and consistent.
  • Tested the database for field-size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
  • Experienced in using OLAP functions such as SUM, COUNT and CSUM (a query sketch follows this list).
  • Proficient in working with SET, MULTISET, derived and volatile temporary tables.
  • Proficient in loading data into staging tables via views.
  • Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL.
  • Experience in PowerPoint presentations (financial data, charts).
  • Expertise in generating graphs using MS Excel pivot tables; developed and executed departmental reports for performance and response purposes using Oracle SQL and MS Excel.
  • Extracted data from existing data source and performed ad-hoc queries.
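
The OLAP-function reporting above is sketched below against a hypothetical daily_sales table; the windowed SUM shown is the modern equivalent of Teradata's legacy CSUM.

    /* Ranking and running totals per store (illustrative names) */
    SELECT store_id,
           sale_date,
           sale_amt,
           RANK()        OVER (PARTITION BY store_id ORDER BY sale_amt DESC) AS amt_rank,
           ROW_NUMBER()  OVER (PARTITION BY store_id ORDER BY sale_date)     AS day_seq,
           SUM(sale_amt) OVER (PARTITION BY store_id ORDER BY sale_date
                               ROWS UNBOUNDED PRECEDING)                     AS running_amt
    FROM   daily_sales;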

Environment: Teradata 12, MS Excel, UNIX, Windows NT Server/2000/XP.
