
Teradata/Informatica Developer Resume

Columbia, MD

SUMMARY:

  • A certified professional with around 9 years of IT experience and a strong background in Data Warehousing, with emphasis on Business Requirements Analysis, Application Design, Development, Testing and Implementation.
  • Involved in all stages of the Software Development Life Cycle (SDLC), from analysis and planning to development and deployment, following the Agile methodology and Scrum process.
  • Experience with the Teradata database: analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.
  • Extensively created and used Teradata Set Tables, Multiset Tables, Global Temporary Tables, Volatile Tables and Temp Tables.
  • Very good understanding of Teradata UPI and NUPI, Secondary Indexes and Join Indexes.
  • Data warehousing experience specializing in RDBMS, ETL Concepts to load data from different sources into data warehouse using Teradata Utilities (BTEQ, FLOAD, MLOAD, FEXPORT and TPUMP) and Informatica Power Center.
  • Conversant in writing queries using Joins, Views, Sub Queries and Stored Procedures, and proficient in performance analysis and query tuning using Explain Plan, Collect Statistics, NUSIs and Join Indexes (including Sparse Join Indexes).
  • Proficient in the ETL (Extract, Transform and Load) process for analyzing, designing and developing Data Warehousing projects using Informatica PowerCenter.
  • Expertise in designing and developing complex ETL mappings and reusable transformations, scheduling workflows and worklets using Informatica Workflow Manager, and monitoring them using Informatica Workflow Monitor.
  • Vast experience in designing and developing Informatica PowerCenter mappings using varied transformations such as Unconnected & Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Sequence Generator, Rank and Normalizer.
  • Extensively used Mapping Variables, Mapping Parameters and Dynamic Parameter Files for improved performance and increased flexibility, and also worked with XML Sources & Targets.
  • Identified bottlenecks in ETL Processes and improved the performance using Partitioning, Index Usage, Aggregate Tables, and Normalization/De-normalization strategies.
  • Knowledge of pushdown optimization concepts and tuning Informatica objects for optimum execution times.
  • Developed UNIX shell scripts to automate applications, schedule jobs and develop interfaces, and created pre-session and post-session shell scripts and mail notifications.
  • Knowledge of the Oracle database.
  • Implemented Slowly Changing Dimensions (SCD Type 1 and SCD Type 2) to retain the full history of account and transaction information (a minimal SQL sketch follows this list).
  • Versed in OLTP modeling (2NF, 3NF) and OLAP dimensional modeling (Star and Snowflake).
  • Good team player with excellent technical and interpersonal skills; pro-active, self-motivated and able to work independently as well as in a team.
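
For reference, a minimal sketch of the SCD Type 2 pattern mentioned above, written as Teradata SQL. The CUSTOMER_DIM and STG_CUSTOMER tables and their columns are hypothetical placeholders, not taken from any actual project.

    /* Step 1: close out the current dimension row when a tracked attribute changes */
    UPDATE DIM
    FROM CUSTOMER_DIM DIM, STG_CUSTOMER STG
    SET EFF_END_DT = CURRENT_DATE - 1,
        CURRENT_FLAG = 'N'
    WHERE DIM.CUSTOMER_ID = STG.CUSTOMER_ID
      AND DIM.CURRENT_FLAG = 'Y'
      AND DIM.ADDRESS <> STG.ADDRESS;

    /* Step 2: insert the new version (or a brand-new customer) as the current row */
    INSERT INTO CUSTOMER_DIM (CUSTOMER_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    SELECT STG.CUSTOMER_ID, STG.ADDRESS, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM STG_CUSTOMER STG
    LEFT JOIN CUSTOMER_DIM DIM
      ON DIM.CUSTOMER_ID = STG.CUSTOMER_ID
     AND DIM.CURRENT_FLAG = 'Y'
    WHERE DIM.CUSTOMER_ID IS NULL
       OR DIM.ADDRESS <> STG.ADDRESS;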

TECHNICAL SKILLS:

RDBMS: Teradata 15/14/13/12, Oracle 11g, 10g

Languages: SQL, PL/SQL, UNIX shell scripting

TD Utilities: SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump

ETL Tools: Informatica PowerCenter 9.x

Scheduling Tools: AutoSys, Control-M

Others: MS Office, MS Visio, Tectia, Toad, FTP, SFTP, SCP, Telnet

PROFESSIONAL EXPERIENCE:

Confidential, Columbia, MD

Teradata/Informatica Developer

Responsibilities:

  • Understood business requirements and converted them into technical specifications; reverse engineered existing systems and processes for documentation.
  • Involved in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant, and wrote Macros for various applications in SQL Assistant.
  • Wrote scripts for loading huge volumes of data from legacy systems into the target data warehouse using BTEQ, FLoad and MLoad, and developed TPump scripts to load low-volume data into Teradata (see the BTEQ sketch after this list).
  • Used BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Performed performance analysis and query tuning using EXPLAIN plans and Collect Statistics.
  • Developed UNIX shell scripts to perform load tasks with complex Teradata SQL.
  • Handled millions of records and ran queries against the database effectively with no performance issues.
  • Created new Teradata tables and a data replication process from Oracle to Teradata.
  • Wrote SQL scripts to extract data from the database and for testing purposes.
  • Extensively worked on extraction, transformation and loading of data from Oracle and flat files.
  • Created Informatica mappings to transform the data according to the business rules.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Created mappings, tasks, workflows and sessions to move data at specific intervals using Workflow Manager, and modified existing mappings for new business requirements to load data into staging tables and then into target tables in the EDW.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the enterprise data warehouse.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Worked with pre- and post-session SQL commands to drop and recreate indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter.
  • Tested and debugged all ETL and Teradata objects to evaluate performance and verify that the code met the business requirements.
  • Involved in performance tuning of mappings, transformations and workflow sessions to optimize session performance.
  • Involved in peer reviews of Teradata and Informatica objects.
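
As referenced above, a minimal BTEQ sketch of the staging-to-target load pattern. The logon string, database names and ORDERS tables are hypothetical placeholders used only for illustration.

    .LOGON tdprod/etl_user,password;            -- placeholder TDPID and credentials
    .SET ERROROUT STDOUT;

    /* Insert only rows that are not already present in the target */
    INSERT INTO EDW.ORDERS (ORDER_ID, ORDER_DT, AMOUNT)
    SELECT STG.ORDER_ID, STG.ORDER_DT, STG.AMOUNT
    FROM STG.ORDERS STG
    LEFT JOIN EDW.ORDERS TGT
      ON TGT.ORDER_ID = STG.ORDER_ID
    WHERE TGT.ORDER_ID IS NULL;

    /* Fail the script with a non-zero return code if the insert errored */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    COLLECT STATISTICS ON EDW.ORDERS COLUMN (ORDER_ID);

    .LOGOFF;
    .QUIT 0;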

Environment: Teradata 15/14, SQL Assistant, BTEQ, MLOAD, FLOAD, TPump, UNIX Shell scripts, Oracle 11g, SQL, PL/SQL, Informatica 9.6.1

Confidential, Collierville, TN

Teradata/Informatica Developer

Responsibilities:

  • Involved in requirement gathering, business analysis, design & development, testing and implementation of business rules.
  • Developed a UNIX shell script with BTEQ to dynamically generate and run SQL that drops the oldest range on Partitioned Primary Index tables, using derived tables and parsing the data dictionary table dbc.IndexConstraints (a sketch of the range-maintenance SQL follows this list).
  • Loaded data from the staging tables to the base tables using BTEQ scripts.
  • Wrote MLOAD scripts for loading data from flat files into the staging tables for the new shipper item feeds.
  • Collected data source information from all the legacy systems and existing data stores.
  • Added ranges to Partitioned Primary Index tables that were about to expire.
  • Fixed issues with the existing FLoad/MLoad scripts so data loads into the warehouse more smoothly and effectively.
  • Worked on data verification and validation to ensure the data generated was appropriate, consistent and in line with the requirements.
  • Performed query optimization with the help of Explain Plan, Collect Statistics, and Primary and Secondary Indexes; used volatile tables and derived tables to break complex queries into simpler ones.
  • Worked with PPI Teradata tables and was involved in Teradata specific SQL fine-tuning to increase performance of the overall ETL process.
  • Involved in scheduling Backups and archives and restoring data when required.
  • Analyzed the functional specs provided by the data architect and created technical specs documents for the mappings.
  • Worked on Informatica PowerCenter tools - Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected Lookups, Joiner, Update Strategy, Expression and Aggregator to pipeline data to the data mart.
  • Developed common routine mappings; made use of mapping variables, mapping parameters and variable functions, and used Workflow Manager to create, schedule, execute and monitor workflows.
  • Performed performance tuning at the source, target, mapping, session and system levels.
  • Analyzed complex ETL tasks and provided estimates.
  • Automated and scheduled ETL jobs using Control-M.
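
A simplified sketch of the partition-maintenance SQL referenced above. The EDW.SALES_FACT table and the specific date boundaries are hypothetical; in practice the shell script derived the oldest boundary by parsing the partitioning expression returned from dbc.IndexConstraints.

    /* The shell script reads the RANGE_N expression to find the oldest range */
    SELECT ConstraintText
    FROM dbc.IndexConstraints
    WHERE DatabaseName = 'EDW'
      AND TableName = 'SALES_FACT';

    /* Roll the partition window forward: drop the oldest month, add a new one */
    ALTER TABLE EDW.SALES_FACT
    MODIFY PRIMARY INDEX
    DROP RANGE BETWEEN DATE '2013-01-01' AND DATE '2013-01-31'
    ADD RANGE BETWEEN DATE '2014-01-01' AND DATE '2014-01-31'
    WITH DELETE;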

Environment: Teradata 13, SQL Assistant, BTEQ, MLOAD, FLOAD, TPump, UNIX Shell scripts, Oracle 10g, SQL, PL/SQL, Informatica

Confidential, Phoenix, AZ

Teradata Developer

Responsibilities:

  • Created Tables, Views, Stored Procedures in Teradata, according to the business requirements.
  • Wrote queries to pull the required information from the database using SQL Assistant.
  • Involved in loading millions of records into warehouse from legacy system using MLoad / FLoad and BTEQ utilities of Teradata.
  • Wrote scripts for data cleansing, data validation and data transformation for the data coming from different source systems.
  • Wrote TPT (Teradata Parallel Transporter) scripts to load data from different source systems into the staging and target tables.
  • Used the FastExport utility to extract large volumes of data at high speed from the Teradata RDBMS.
  • Interpreted explain plans and tuned queries to improve the performance.
  • Tuned various queries by adding Secondary indexes.
  • Participated in database testing, including checking constraints, data validation and stored procedures.
  • Wrote Macros in Teradata to reduce channel traffic, simplify execution of frequently used SQL operations and improve performance (a minimal macro sketch follows this list).
  • Created appropriate Primary Indexes, taking into consideration both the planned access paths and even distribution of data across all available AMPs.
  • Created and executed test plans for Unit, Integration, and System test phases.
  • Interacted closely with DBAs to improve existing long-running jobs in production and to resolve production issues.
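
A minimal sketch of the macro pattern referenced above. The EDW database, GET_DAILY_ORDERS macro and ORDERS table are hypothetical placeholders; the point is that a macro stores parameterized SQL on the server, so the client submits one short EXEC request instead of the full statement text.

    /* Create a parameterized macro for a frequently run query */
    CREATE MACRO EDW.GET_DAILY_ORDERS (run_dt DATE) AS (
      SELECT ORDER_ID, CUSTOMER_ID, AMOUNT
      FROM EDW.ORDERS
      WHERE ORDER_DT = :run_dt;
    );

    /* Execute it with a single, short request */
    EXEC EDW.GET_DAILY_ORDERS (DATE '2015-06-30');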

Environment: Teradata 13, BTEQ, FLOAD, MLOAD, FEXPORT, UNIX Shell Scripting, Oracle 10g, SQL, PL/SQL.

Confidential, San Francisco, CA

Teradata Developer

Responsibilities:

  • Analyzed and understood the end-user requirements and business rules.
  • Worked closely with Data architects to design the logical and Physical model of the database.
  • Used Teradata Utilities to ensure High System performance as well as High Availability.
  • Developed scripts for BTEQ, OLE DB, FLOAD, MLOAD and other load utilities.
  • Created Teradata objects like Databases, Users, Tables, Views and Macros.
  • Worked as a team with database administrators to determine indexes and statistics and to perform table partitioning in the data warehouse, in order to improve performance for end users.
  • Expertise in Query Analyzing, Performance Tuning and Testing.
  • Interpreted Explain plans and collected statistics to improve the performance of long-running queries.
  • Partitioned primary indexes (PPI) to increase performance (see the table sketch after this list).
  • Designed data feeds to load Teradata tables from Oracle tables or from flat files, using FastLoad, MultiLoad and SQL*Plus; designed a physical data mart schema optimized for Teradata.
  • Utilized MultiLoad to insert, update and delete records, and TPump for real-time data loads.
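
A minimal DDL sketch of the partitioned-primary-index approach referenced above. The EDW.SALES table, its columns and the date range are hypothetical; the primary index drives row distribution across AMPs, while RANGE_N partitioning lets date-bounded queries scan only the relevant partitions.

    CREATE MULTISET TABLE EDW.SALES
    (
      SALE_ID   INTEGER NOT NULL,
      STORE_ID  INTEGER NOT NULL,
      SALE_DT   DATE    NOT NULL,
      AMOUNT    DECIMAL(12,2)
    )
    PRIMARY INDEX (SALE_ID)                 /* even distribution across AMPs */
    PARTITION BY RANGE_N (
      SALE_DT BETWEEN DATE '2011-01-01' AND DATE '2012-12-31'
              EACH INTERVAL '1' MONTH,
      NO RANGE                              /* catch-all for out-of-range dates */
    );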

Environment: Teradata 13/12, SQL Assistant, BTEQ, FLOAD, MLOAD, FEXPORT, UNIX Shell Scripting, Oracle 10g and PL/SQL
