
Teradata/Informatica Developer Resume


Cupertino, CA

PROFESSIONAL SUMMARY:

  • Over 8 years of IT experience, including 7+ years in the design and development of ETL methodology supporting data transformation and processing in corporate-wide ETL solutions using Teradata V2R5/V2R6.2/12/13.
  • Experience with the Ab Initio Co>Operating System, application tuning, and debugging strategies.
  • Proficient with various Ab Initio parallelism and Multi File System techniques.
  • Used Ab Initio as an ETL tool to extract data from source systems, cleanse and transform it, and load it into target databases.
  • Technical and functional experience in data warehouse implementations and ETL methodology using Informatica PowerCenter 9.0.1/8.6/8.1/7.1, Teradata, Oracle 10g/9i/8i, and MS SQL Server 2008/2005/2000 in the Finance, Health Insurance, and Pharmacy domains.
  • Very strong skills in project management, requirements analysis, business analysis, database modeling, design and analysis, issue coordination, and development with Teradata/Oracle/SQL Server relational databases.
  • Implemented Slowly Changing Dimension (SCD) Type 1, Type 2, and Type 3 methodology (a Type 2 SQL sketch follows this summary).
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the performance bottlenecks.
  • Proficient in applying Performance tuning concepts to Informatica Mappings, Session Properties and Databases.
  • Hands-on experience handling data from various source systems such as flat files, XML sources, Oracle, MS SQL Server, IBM DB2, Teradata, and Excel files.
  • Experience in designing and implementing Ab Initio EME Metadata projects
  • Proven record in both technical and functional applications of RDBMSs, data mapping, data management, and data transport.
  • Expert developer skills in the Teradata RDBMS, including initial Teradata environment setup and development.
  • Experience with a large data warehouse of approximately 75 TB.
  • In-depth working knowledge of Teradata Parallel Transporter (TPT) coding and Teradata SQL performance tuning.
  • Hands on experience in Teradata SQL, Tools & Utilities (FastExport, MultiLoad, FastLoad, Tpump, BTEQ and QueryMan).
  • Expertise in Query Analyzing, performance tuning and testing.
  • Loaded and transferred large volumes of data from Oracle into Teradata using MultiLoad and OleLoad.
  • Experience in writing UNIX shell scripts to support and automate the ETL process.
  • Exceptional skills in writing complex SQL queries and procedures involving multiple tables, constraints, and relationships for efficient data retrieval and data validation in relational databases using SQL and TOAD.
  • Created and updated Teradata models in Erwin.
  • Worked on the ETL Design documents, Mapping Documents and ETL Testing Documents.
  • Organized the data within the Teradata system using the Teradata Manufacturing Logical Data Model.
  • Well versed with various Ab Initio components such as Partition by Round Robin, Partition by Key, Join, Rollup, Gather, Merge, Interleave, Dedup Sorted, Scan, Validate, and FTP.
  • Expertise in testing the Ab Initio graphs by running the scripts in Autosys.
  • Profound knowledge of data modeling, including Star Schema and Snowflake dimensional data modeling and 3NF normalized data modeling; very good understanding of logical modeling and fine-tuned physical data modeling.
  • Expertise in fine tuning of Physical data Models for Oracle and Teradata DB systems.
  • Expertise in writing UNIX shell scripts (wrapper scripts) for AUTOSYS scheduling.
  • Actively involved in Performance Tuning, Error and Exception handling on various ETL processes.
  • Extensive experience in developing Unit, Integration, and UAT test plans and cases, as well as generating and executing SQL test scripts and test results.
  • Involved in creation of logical and physical data models using Erwin 4.1.
  • Excellent analytical and presentation skills.
  • Highly motivated with the ability to work effectively in teams as well as independently.
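For illustration only, a minimal Teradata SQL sketch of the SCD Type 2 pattern referenced above; the DIM_CUSTOMER/STG_CUSTOMER tables and their columns are hypothetical names, not taken from any of the projects below.

    /* Hypothetical tables: expire the current dimension row when tracked
       attributes change, then insert the new version (SCD Type 2). */
    UPDATE dim
    FROM EDW.DIM_CUSTOMER AS dim, STG.STG_CUSTOMER AS stg
    SET EFF_END_DT   = CURRENT_DATE - 1,
        CURRENT_FLAG = 'N'
    WHERE dim.CUSTOMER_ID  = stg.CUSTOMER_ID
      AND dim.CURRENT_FLAG = 'Y'
      AND (dim.ADDRESS <> stg.ADDRESS OR dim.SEGMENT <> stg.SEGMENT);

    /* New and changed customers now have no current row, so insert one. */
    INSERT INTO EDW.DIM_CUSTOMER
        (CUSTOMER_ID, ADDRESS, SEGMENT, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    SELECT stg.CUSTOMER_ID, stg.ADDRESS, stg.SEGMENT,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM STG.STG_CUSTOMER stg
    LEFT JOIN EDW.DIM_CUSTOMER dim
           ON dim.CUSTOMER_ID  = stg.CUSTOMER_ID
          AND dim.CURRENT_FLAG = 'Y'
    WHERE dim.CUSTOMER_ID IS NULL;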

TECHNICAL SKILLS

Skillset: Teradata 13.10/13/12, Teradata ARCMAIN, BTEQ, Teradata SQL Assistant, Teradata Manager, PMON, Teradata Administrator, Informatica PowerCenter 9.0.1/8.6/8.1/7.1, MS SQL Server 2000/2005/2008, T-SQL, ODBC, DTS, SSAS, SSIS, SSRS, COZYROC, Crystal Reports 8.5/9/10, Oracle 8i/9i/10g, MS Access 97/2000, Excel, flat files, MS Visio, Erwin Platinum, HTML, XML, Windows NT/2000/XP/2003 Server.

PROFESSIONAL EXPERIENCE

Confidential, Cupertino, CA

Teradata/Informatica Developer

Responsibilities:

  • Used Ab Initio to extract, transform, and load data from multiple input sources such as flat files and Oracle sources into the target database.
  • Involved in understanding the requirements of end users/business analysts and developed strategies for ETL processes.
  • Analyzed business requirements and developed metadata mappings; involved in preparing high-level and detailed design documents.
  • Developed/modified subject area graphs based on business requirements using various Ab Initio components such as Filter by Expression, Partition by Expression, Reformat, Join, Gather, Merge, Rollup, Normalize, Denormalize, Scan, and Replicate.
  • Involved in the migration of a mainframe DB2 database to Teradata; created several macros, stored procedures, and JCL scripts to migrate data from mainframe DB2 to Teradata.
  • Developed Complex Ab Initio XFR’s to derive new fields and solve various business requirements.
  • Implemented 8-way and 12-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories, utilizing Ab Initio's parallelism techniques.
  • Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Worked with different Data sources ranging from SAP, MDB, Teradata, flat files, XML, Oracle, and SQL server databases.
  • Created PL/SQL stored procedures (dynamic procedures), functions, triggers, packages, and cursors for the Sales Analysis data mart; these objects extract data from SAP and Teradata and store the results in the Oracle back end.
  • Worked across the full development life cycle, including design, ETL strategy, troubleshooting, and reporting; identified facts and dimensions.
  • Proficient in writing MultiLoad, FastLoad, and TPump scripts in Windows, UNIX, and mainframe environments.
  • Experience in Unix scripting.
  • Expertise in writing Teradata procedures using BTEQ and SQL Assistant.
  • To implement the Type 2 process across multiple tables, created a dynamic procedure driven by a metadata layer that inserts/updates the tables on the fly (see the sketch after this list).
  • Fine-tuned existing Teradata procedures, macros, and queries to increase performance.
  • Redesigned table structures that were skewed toward a single AMP and reorganized the primary index choices.
  • Extensively used Data Integrator/Oracle and created mappings using transformations like Case, Map Operation, Merge, and Pivot, and flagged records using Update Strategy to populate the desired slowly changing dimension tables.
  • When designing data marts, identified entity types and attributes, applied naming conventions and data model patterns, identified relationships, assigned keys, normalized to reduce data redundancy, and denormalized to improve performance.
  • Utilized the best practices for the creation of mappings and used transformations like Query, Key Generation and Date Generation.
  • Designed and developed a process that takes quality data sent from around the world and loads the data tables used to ship product, as well as analyzing data to improve manufacturing yields and intervals. This process uses shell scripts and Perl programs that determine the table to be loaded and dynamically generate the MultiLoad and BTEQ jobs that load the information.
  • Involved in the migration of Oracle to Teradata.
  • Used External tables for flat files in order for faster processing.
  • Wrote stored Procedure for complex calculation and for faster processing of bulk volume of the data.
  • Involved in technical writing.
  • Performance-tuned existing SQL queries for reports and ETL jobs to reduce processing time, and reduced the number of procedures by creating dynamic procedures.
  • Extensively worked under the Unix Environment using Shell Scripts and Wrapper Scripts. Responsible for writing the wrapper scripts to invoke the deployed Ab Initio Graphs.
  • Performed potential transformations at the staging area, such as cleansing the data (dealing with missing elements, parsing into standard formats) combining data from multiple sources, de-duping the data and assigning surrogate keys.
  • Worked on developing various parameterized graphs in GDE.
  • Extensively used the Ab Initio tool’s feature of Component, Data and Pipeline parallelism.
  • Expertise in SQL queries for cross verification of data.
  • Used phases and checkpoints in the graphs to avoid the deadlocks, improve the performance and recover the graphs from the last successful checkpoint.
  • Used Enterprise Meta Environment (EME) for version control, Control-M for scheduling purposes.
  • Used the Ab Initio Web Interface to Navigate the EME to view graphs, files and datasets and examine the dependencies among objects.
  • Extensively used Ab Initio built in string, math, and date functions.
  • Provided 24x7 extended support during the production rollout.
  • Worked on database connections, SQL joins, loops, materialized views, indexes, aggregate conditions, and parsing of objects, and wrote PL/SQL procedures and functions for processing business logic in the database.
  • Used Teradata utilities (FastLoad, MultiLoad) to load data into the target data warehouse and used Teradata SQL Assistant to query data in the target Teradata data warehouse.
  • Involved in Unit testing, System testing and debugging during testing phase.
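A minimal sketch of the metadata-driven dynamic procedure idea mentioned above, not the project's actual code; the procedure, parameter, and column names are hypothetical, and the generated statements are run through DBC.SysExecSQL (dynamic SQL that returns no result set).

    /* Hypothetical generic Type 2 loader: builds the UPDATE/INSERT for
       whichever target/staging table pair is passed in and runs it
       through dynamic SQL. */
    REPLACE PROCEDURE EDW.LOAD_TYPE2_TABLE (IN p_tgt_table VARCHAR(128),
                                            IN p_stg_table VARCHAR(128))
    BEGIN
       DECLARE v_sql VARCHAR(10000);

       /* Expire current target rows that have a matching staging row. */
       SET v_sql = 'UPDATE ' || p_tgt_table ||
                   ' SET current_flag = ''N'', eff_end_dt = CURRENT_DATE - 1' ||
                   ' WHERE current_flag = ''Y'' AND EXISTS (SELECT 1 FROM ' ||
                   p_stg_table || ' s WHERE s.src_key = ' ||
                   p_tgt_table || '.src_key)';
       CALL DBC.SysExecSQL(v_sql);

       /* Insert the new versions from staging. */
       SET v_sql = 'INSERT INTO ' || p_tgt_table ||
                   ' SELECT s.*, CURRENT_DATE, DATE ''9999-12-31'', ''Y'' FROM ' ||
                   p_stg_table || ' s';
       CALL DBC.SysExecSQL(v_sql);
    END;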

Confidential, Memphis, TN

Sr.Teradata Developer

Responsibilities:

  • Interacted with the business Analysts to understand the flow of the business.
  • Involved in gathering requirements from the business and design of physical data model.
  • Involved in Data Extraction, Transformation and Loading from source systems.
  • Involved in writing complex SQL queries based on the given requirements.
  • Loaded data into Teradata using FastLoad, BTEQ, FastExport, MultiLoad.
  • Wrote several Teradata BTEQ scripts for reporting purposes.
  • Created test cases for unit testing, system integration testing, and UAT to check data quality.
  • Used the BTEQ and SQL Assistant (QueryMan) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Involved in the collection of statistics on important tables to get better plans from the Teradata Optimizer.
  • Performance-tuned user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling collection of statistics, and adding secondary or join indexes.
  • Developed Teradata macros that pull data from several sales tables, perform calculations and aggregations, and write the results into a summary table (an illustrative macro follows this list).
  • Worked with database administrators to determine indexes, statistics, and partitioning to add to tables in the data warehouse
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables
  • Performed Unit testing, Integration testing and generated various Test Cases.
  • Used several of SQL features such as GROUP BY, ROLLUP, CASE, UNION, Subqueries, EXISTS, COALESCE, NULL etc.
  • Prepared Job scheduling docs and Job Stream List Using Dollar U for code migration to test and production.
  • Created proper PIs, taking into consideration both planned data access and even distribution of data across all available AMPs.
  • Considering both the business requirements and the relevant data access factors, created appropriate NUSIs for smooth (fast and easy) access to data.
  • Used TPT scripts to load data from one environment to other.
  • Created UNIX shell scripts for FTP transfers, file merges, and success/failure email notifications.
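For illustration, a hedged sketch of the kind of reporting macro described above; the EDW.SALES_SUMMARY, RETAIL_SALES, and STORE_REGION objects are hypothetical names chosen for the example, and the follow-up COLLECT STATISTICS reflects the statistics-collection practice noted earlier in this list.

    /* Illustrative macro: aggregates sales for one load date and writes
       the results to a summary table used by reports. */
    REPLACE MACRO EDW.LOAD_SALES_SUMMARY (p_load_dt DATE) AS (
        INSERT INTO EDW.SALES_SUMMARY (sale_dt, region_cd, total_qty, total_amt)
        SELECT s.sale_dt, r.region_cd, SUM(s.qty), SUM(s.amt)
        FROM   EDW.RETAIL_SALES  s
        JOIN   EDW.STORE_REGION  r ON r.store_id = s.store_id
        WHERE  s.sale_dt = :p_load_dt
        GROUP  BY s.sale_dt, r.region_cd;
    );

    EXEC EDW.LOAD_SALES_SUMMARY (DATE '2013-06-30');

    /* Keep optimizer plans healthy after the load. */
    COLLECT STATISTICS ON EDW.SALES_SUMMARY COLUMN (sale_dt);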

Confidential, Riverwoods, IL

Sr. Teradata Developer

Responsibilities:

  • Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Extensively used the Teradata utilities like BTEQ, Fastload, Multiload, TPump, DDL Commands and DML Commands (SQL).
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Used BTEQ and SQL Assistant (Query man) front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
  • Performance-tuned Teradata SQL statements using the Teradata EXPLAIN command.
  • Monitored database space, identified tables with high skew, and worked with the data modeling team to change the Primary Index on tables with high skew.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Loaded flat files into the database using FastLoad and then used them in queries to perform joins.
  • Used Teradata SQL with BTEQ scripts to get the data needed.
  • Used Teradata utilities (FastLoad, MultiLoad) to load data into the target data warehouse and used Teradata SQL Workbench to query data in the target Teradata data warehouse.
  • Used SQL to query the databases and did as much crunching as possible in Teradata, using very complex SQL.
  • Development of BTEQ /FastLoad /MultiLoad scripts for loading purpose.
  • Used volatile tables and derived queries to break complex queries into simpler ones (see the volatile-table sketch after this list).
  • Worked on exporting data to flat files using Teradata Fast Export.
  • Extensively involved in data transformation, validation, extraction, and loading; implemented various Teradata join types (inner join, outer join, self join, cross join) and join strategies (merge join, product join, nested join, row hash join).
  • Involved in writing complex SQL queries based on the given requirements and for various business tickets.
  • Created several Teradata SQL queries and created several reports using the above data mart for UAT and user reports. Used several of SQL features such as Group By, Rollup, Rank, Case, Union, Subqueries, Exists, Coalesce, Null etc.
  • Used FastLoad for loading into the empty tables.
  • Experience using UNIX/Linux scripting and Teradata utilities.
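A small sketch, with made-up table names, of the volatile-table technique mentioned above: stage an intermediate aggregate once, then keep the final query simple.

    /* Hypothetical example: pre-aggregate statement balances into a
       session-scoped volatile table, then join the small result. */
    CREATE VOLATILE TABLE vt_acct_bal AS (
        SELECT acct_id,
               MAX(stmt_dt) AS last_stmt_dt,
               SUM(balance) AS tot_balance
        FROM   EDW.ACCT_STATEMENT
        GROUP  BY acct_id
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    SELECT c.cust_nm, v.tot_balance
    FROM   EDW.CUSTOMER  c
    JOIN   vt_acct_bal   v ON v.acct_id = c.acct_id
    WHERE  v.last_stmt_dt >= CURRENT_DATE - 30;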

Confidential, CA

Sr. Teradata Developer

Responsibilities:

  • Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
  • Developed mappings to load data from source systems such as Oracle and AS/400 into the data warehouse.
  • Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Dealt with both incremental and migration data loads into Teradata.
  • Involved in designing the ETL process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Experienced in SQL performance tuning and in writing complex SQL.
  • Worked efficiently with Teradata Parallel Transporter (TPT) and generated TPT code.
  • Generated custom JCL scripts for processing all mainframes flat files, IBM DB2.
  • Monitored the performance of the existing system running a realistic mix of applications with Teradata performance tuning.
  • Developed performance utilization charts, optimized and tuned SQL and designed physical databases with Mainframes/MVS COBOL, Teradata load utilities, SQL.
  • Extracted source data from mainframe OLTP systems by writing COBOL and JCL scripts.
  • Responsible for trouble shooting, identifying and resolving data problems.
  • Created proper PIs, taking into consideration both planned data access and even distribution of data across all available AMPs (see the PI sketch after this list).
  • Loaded and transferred large volumes of data from different databases into Teradata using MultiLoad and OleLoad.
  • Created series of Teradata Macros for various applications in Teradata SQL Assistant.
  • Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.
  • Created Teradata models in Erwin.
  • Performance tuning for Teradata SQL statements using Teradata Explain command.
  • Created several SQL queries and created several reports using the above data mart for UAT and user reports.
  • Organized the data efficiently in the Teradata system using Teradata Manufacturing Logical Data Model.
  • Used several of SQL features such as GROUP BY, ROLLUP, CASE, UNION, Subqueries, EXISTS, COALESCE, NULL etc.
  • Developed and deployed several Ab Initio graphs for the orders mart process to load the data from the source tables and also for loading into the existing marts for data analysis
  • Generated configuration files, DML files, and XFR files that specify the record format, which are used in components for building graphs in Ab Initio.
  • Involved in creating Flat files using dataset components like Input file, Output file, Intermediate file in Ab Initio graphs.
  • Worked on enhancements to BTEQ scripts and Ab Initio graphs that validated the performance tables in the Teradata environment.
  • Developed graphs using multistage components.
  • Extensively Used Transform Components: Reformat, Rollup and Scan Components.
  • Implemented the component level, pipeline and Data parallelism in Ab Initio for ETL process for Data warehouse.
  • Worked on various UNIX/Linux commands and scripting.
  • Extensively used partitioning components like Broadcast, Partition by Key, Partition by Range, and Partition by Round Robin, and departitioning components like Concatenate, Gather, and Merge in Ab Initio.
  • Responsible for the automation of Ab Initio graphs using korn shell scripts.
  • Developed Ab Initio scripts for data conditioning, transformation, validation and loading.
  • Extensively used EME for Version Control System and for Code Promotion
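A brief, illustrative sketch of the PI reasoning mentioned above; the ORDER_FACT table is a hypothetical example, and the DBC.TableSize query is one common way to eyeball per-AMP skew after a load.

    /* Pick a high-cardinality PI so rows hash evenly across AMPs. */
    CREATE TABLE EDW.ORDER_FACT (
        order_id   INTEGER NOT NULL,
        order_dt   DATE,
        store_id   INTEGER,
        order_amt  DECIMAL(18,2)
    )
    PRIMARY INDEX (order_id);

    /* Quick skew check: per-AMP permanent space for the table. */
    SELECT Vproc, CurrentPerm
    FROM   DBC.TableSize
    WHERE  DatabaseName = 'EDW'
      AND  TableName    = 'ORDER_FACT'
    ORDER  BY CurrentPerm DESC;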

Confidential, OR

Teradata/Informatica Developer

Responsibilities:

  • Gathered the Requirements from the business users and designed the Structure for the Data Warehouse.
  • Created new tables and designed the databases; created new indexes on tables to speed up database access.
  • Involved in gathering the business requirements from the Business Analyst.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Loaded flat files into the databases using FastLoad and then used them in queries to perform joins.
  • Used SQL to query the databases and did as much crunching as possible in Teradata, applying query optimization techniques (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Created proper Teradata Primary Indexes (PIs), taking into consideration both planned data access and even distribution of data across all available AMPs; considering both the business requirements and relevant factors, created appropriate Teradata NUSIs for smooth (fast and easy) access to data.
  • Involved in the analysis and implementation of their system.
  • Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
  • Worked with different Data sources ranging from SAP, MDB, Teradata, flat files, XML, Oracle, and SQL server databases.
  • Created ERWIN data dictionaries and logical models for the data warehouse implementation
  • Involved heavily in writing complex SQL queries based on the given requirements.
  • Created data feed using Teradata Fast Export and FTP the data files on Oracle box.
  • Involved in developing Multi load, Fast Load and BTEQ scripts.
  • Created and automated the freight and shrink loading process using shell scripts, MultiLoad, Teradata volatile tables, and complex SQL statements.
  • Created a generic email notification program in UNIX that sends emails to the production support team if there are any duplicate records or errors in the load process.
  • Created a BTEQ script for pre-population of the work tables prior to the main load process.
  • Used FastLoad for loading into the empty tables.
  • Used volatile table and derived queries for breaking up complex queries into simpler queries.
  • Created an archive process that archives the data files and FTP to the remote server.
  • Created a Cleanup process for removing all the Intermediate temp files that were used prior to the loading process.
  • Created a shell script that checks the corruption of data file prior to the load.
  • Created unit test plans to unit test the code prior to the handover process to QA.
  • Involved in troubleshooting the production issues and providing production support.
  • Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
  • Worked on exporting data to flat files using Teradata Fast Export
  • Analyzed the Data Distribution and Reviewed the Index choices
  • In-depth expertise in the Teradata cost based query optimizer, identified potential bottlenecks
  • Worked with PPI Teradata tables and was involved in Teradata-specific SQL fine-tuning to increase performance of the overall ETL process (see the PPI sketch after this list).
  • Developed Informatica mappings to identify SKUs with full name and product description combinations by writing conditional statements and using transformations such as Expression, Aggregator, Update Strategy, Lookup, Router, etc.
  • Extensively worked on Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter and Aggregator Transformations in Informatica.
  • Debug the Informatica mappings and validate the data in the target tables once it is loaded with mappings.
  • Transform and Load data into Enterprise Data Warehouse tables using Informatica from the legacy systems and load the data into targets by ETL process through scheduling the workflows.
  • Developed Informatica Objects - Mappings, sessions, Workflows based on the design documents.
  • Debug the Informatica Mappings and validate the data in the target tables once it was loaded with mappings.
  • Developed Informatica SCD Type 1, Type 2, and Type 3 mappings and tuned them for better performance.
  • Used Perl and UNIX/Linux scripting with Teradata utilities such as FastExport and FastLoad.
  • Created Web forms which are used by many departments.
  • Developed many stored procedures that act as data source for all the web-based and reporting applications here in this project.
  • Wrote many ad hoc queries daily to get the data needed by management.
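For illustration only, a sketch of a partitioned primary index (PPI) table of the kind referenced above; the TXN_FACT table, its columns, and the date range are hypothetical.

    /* Range-partition by month so date-bounded queries scan only the
       relevant partitions; the NUSI supports SKU lookups. */
    CREATE TABLE EDW.TXN_FACT (
        txn_id   BIGINT  NOT NULL,
        txn_dt   DATE    NOT NULL,
        sku_id   INTEGER,
        txn_amt  DECIMAL(18,2)
    )
    PRIMARY INDEX (txn_id)
    PARTITION BY RANGE_N (
        txn_dt BETWEEN DATE '2010-01-01' AND DATE '2013-12-31'
               EACH INTERVAL '1' MONTH,
        NO RANGE, UNKNOWN);

    CREATE INDEX idx_txn_sku (sku_id) ON EDW.TXN_FACT;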
