
Teradata Developer Resume


Atlanta, GA

SUMMARY

  • 7+ years of experience in Ab Initio as a developer, designer, and analyst, designing and developing ETL processes.
  • Expertise in all GDE components of Ab Initio for creating, executing, testing, updating, and maintaining graphs, as well as experience with the Ab Initio Co>Operating System for application tuning and debugging strategies.
  • Sound knowledge of dimensional modeling concepts and data warehouse design using star and snowflake schemas.
  • Experience architecting, designing, and deploying ETL solutions for large-scale OLAP and OLTP instances using Ab Initio and custom scripts.
  • Sound knowledge of Teradata architecture and hands-on experience writing Teradata SQL, BTEQ scripts, and macros, tuning Teradata queries, and using Teradata utilities such as FastLoad, FastExport, MultiLoad, and TPump (see the BTEQ sketch after this list).
  • Experience designing and developing complex mappings with varied transformation logic such as lookups, Scan, Filter, Expression, Aggregator, Joiner, and Update, and in metadata version control using EME and sandboxes.
  • Expertise in extracting data from multiple sources and in data cleansing and validation based on business requirements.
  • Good experience working with heterogeneous source systems such as Oracle, DB2 UDB, Teradata, Netezza, MS SQL Server, flat files, and legacy systems.
  • Very good understanding of Teradata’s MPP architecture, including shared-nothing design, nodes, AMPs, BYNET, partitioning, and primary indexes.
  • Proficient in Teradata TD12.0/TD13.0 database design (conceptual and physical), query optimization, and performance tuning.
  • Experience providing production support for Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs.
  • Extensive knowledge of dimensional data modeling, star/snowflake schemas, and fact and dimension tables.
  • Worked extensively on several Ab Initio ETL assignments to extract, transform, and load data into tables as part of data warehouse development with highly complex relational, star, and snowflake data models.
  • Expert knowledge of Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and the partition and departition components.
  • Extensive experience in Korn shell scripting and Multi File System (MFS) techniques to maximize Ab Initio data parallelism.
  • Excellent experience designing and implementing ETL processes, transformation mappings, and loads using Ab Initio Co>Op 2.15/2.14/2.13, GDE 3.0.4/1.15/1.14/1.13, and Data Profiler 1.15.
  • Excellent programming skills with the ability to automate routine tasks using UNIX Korn shell scripting, and good experience with SQL for manipulating and transforming data.
  • Experience troubleshooting and improving the performance of Ab Initio graphs, EME check-ins and check-outs, and sandbox creation.
  • Wrote setup scripts that define environment variables and wrappers to run deployed Ab Initio scripts.
  • Well-organized, quick-learning, self-motivated team player with experience in all phases of the SDLC.
  • Extensive experience writing quality SQL, Oracle PL/SQL procedures and functions, and SQL*Loader scripts.
  • Extensive experience with the scheduling tools Autosys and Control-M and with Mercury Quality Center.
  • Experience with SQL, PL/SQL, SQL*Loader, and UNIX shell (ksh, csh) scripts for OLTP and OLAP data warehouse instances.
  • Highly motivated team player with strong communication, organizational, and analytical skills, a passion for working in challenging environments, and adaptability to varying circumstances.
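
For illustration, a minimal BTEQ sketch of the kind of batch SQL work listed above. The TDPID, logon, database, and table names (tdprod, etl_user, dw_stg.customer_stg, dw_base.customer) are hypothetical placeholders, not taken from any actual engagement; the .IF line simply aborts the batch with a non-zero return code if the load fails.

    .LOGON tdprod/etl_user,etl_pwd
    .SET ERROROUT STDOUT

    -- Load yesterday's delta from a hypothetical staging table into the base table
    INSERT INTO dw_base.customer (customer_id, customer_name, load_dt)
    SELECT s.customer_id, s.customer_name, s.load_dt
    FROM   dw_stg.customer_stg s
    WHERE  s.load_dt >= CURRENT_DATE - 1;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0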

TECHNICAL SKILLS

Teradata Tools: ARCMAIN, BTEQ, Teradata SQL Assistant, Teradata Manager, PMON, Teradata Administrator.

Data Warehousing Tools: Ab Initio (GDE 3.1.1/3.0.5/3.0.4/3.0.2/1.15/1.14, Co>Operating System 3.1.1/3.0.5/2.15/2.14), Informatica 6.1/7.1x, SSIS

Data Modeling: Star-Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio

RDBMS: Teradata 13.10/13.0/12.0, Oracle 10g/9i/8i, Netezza 4.6.2, DB2, MS SQL Server 2000/2005/2008

Programming: UNIX Shell Scripting, C/C++, Java, Korn Shell, T-SQL, SQL*Plus, PL/SQL, HTML, ASP.NET

Operating Systems: Windows NT/XP/2000, UNIX, Linux (Red Hat)

BI tools: OBIEE 10.1.3.x, Crystal Reports 8.0/8.5, Business Objects

Scheduling Tools: Autosys, OPC, Tivoli Maestro

PROFESSIONAL EXPERIENCE

Confidential, Seattle WA

Sr. Ab Initio/Teradata Developer

Responsibilities:

  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Wrote complex SQL using joins, subqueries, and correlated subqueries; used SQL queries for cross-verification of data.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables (see the macro sketch after this list).
  • Performed space management for perm and spool space.
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Involved in unit testing and documentation of Ab Initio graphs.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Handled initial, delta, and incremental data as well as migration data loaded into Teradata.
  • Analyzed data and implemented multi-value compression for optimal space usage (see the compression sketch after this list).
  • Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
  • Very good understanding of database skew, PPI, join methods, and join strategies.
  • Used derived tables, volatile tables, and global temporary tables (GTTs) extensively in many of the ETL scripts.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, and using hash functions (see the tuning sketch after this list).
  • Modified Ab Initio component parameters to exploit data parallelism, improving overall performance and fine-tuning execution times.
  • Loaded flat files into the database using FastLoad and then used the loaded tables in join queries.
  • Used SQL to query the databases and push as much processing as possible into Teradata, applying query optimization (EXPLAIN plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Used PMON and Teradata Manager to monitor the production system during the online day.
  • Excellent experience in performance tuning and query optimization of Teradata SQL.
  • Created Ab Initio graphs that transfer data from various sources such as Teradata, flat files, and CSV files to the Teradata database and flat files.
  • Worked on multifile systems with extensive parallel processing.
  • Implemented lookups instead of joins and in-memory sorts to minimize execution times while dealing with huge volumes of data.
  • Extensively used partition components (Broadcast, Partition by Key, Partition by Range, Partition by Round-robin) and departition components (Concatenate, Gather, Merge) in Ab Initio.
  • Implemented transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup, and Scan, and created the appropriate XFRs and DMLs.
  • Automated load processes using Autosys.
  • Used the Lookup transformation in validating the warehouse customer data.
  • Prepared logical/physical diagrams of the data warehouse and presented them to business leaders; used Erwin for model design.
  • Used the Teradata SQL Assistant front-end tool to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Coded and unit tested Ab Initio graphs to extract data from Teradata tables and MVS files.
  • Worked on profiling of operational data using the Ab Initio Data Profiler and SQL tools to gain a better understanding of the data used for analytical purposes by business analysts.
  • Extensively used UNIX shell scripting to write SQL execution scripts in the data loading process.
  • Produced the mapping document and ETL design document.
  • Worked closely with the end users in writing the functional specifications based on the business needs.
  • Participated in project review meetings.
  • Used phases and checkpoints to avoid deadlocks, used multifiles in graphs, and used the Run Program and Run SQL components to run UNIX and SQL commands.
  • Excellent understanding of the System Development Life Cycle, with a clear and thorough understanding of business processes and workflow; involved in all four phases (planning, analysis, design, and implementation) and experienced in testing, documentation, and requirements gathering.
  • Worked extensively with stored procedures and functions and created triggers to implement business rules and validations.
  • Responsible for performance tuning of Ab Initio graphs.
  • Ran scripts through UNIX shell scripts in batch scheduling.
  • Responsible for preparing interface specifications and complete documentation of graphs and their components.
  • Responsible for unit testing graphs for data validation and preparing the test reports.
  • Prepared unit and integration testing plans; involved in SIT/UAT with user groups.
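
A minimal sketch of the staging-to-base macro pattern described above; the database, macro, and table names (stg_db.load_orders, stg_db.order_stg, base_db.orders) are hypothetical.

    -- Hypothetical macro: append staged rows to the base table, then clear staging.
    -- Both statements execute as a single implicit transaction.
    CREATE MACRO stg_db.load_orders AS (
        INSERT INTO base_db.orders (order_id, customer_id, order_amt, load_dt)
        SELECT order_id, customer_id, order_amt, CURRENT_DATE
        FROM   stg_db.order_stg;

        DELETE FROM stg_db.order_stg ALL;
    );

    -- Typically invoked from a BTEQ script or shell wrapper:
    EXEC stg_db.load_orders;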
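
A sketch of the multi-value compression work mentioned above. The table and the compressed value lists are hypothetical; in practice the values come from analysing each column's actual frequency distribution.

    -- Hypothetical transaction table: compress the most frequent values of
    -- low-cardinality columns to reduce perm space and I/O.
    CREATE TABLE dw_base.txn_detail (
        txn_id      DECIMAL(18,0) NOT NULL,
        txn_dt      DATE FORMAT 'YYYY-MM-DD',
        txn_type    CHAR(2)       COMPRESS ('PU', 'RF', 'AD'),
        store_state CHAR(2)       COMPRESS ('GA', 'WA', 'AZ', 'VA'),
        txn_amt     DECIMAL(12,2) COMPRESS (0.00)
    )
    PRIMARY INDEX (txn_id);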
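
A sketch of the EXPLAIN, statistics, and skew-check tuning loop described above, with hypothetical table and column names (dw_base.customer, dw_base.sales_fact, customer_id).

    -- 1. Inspect the optimizer plan: join order, join type, confidence levels,
    --    spool estimates, and any unintended product joins.
    EXPLAIN
    SELECT c.customer_id, SUM(f.sale_amt)
    FROM   dw_base.customer   c
    JOIN   dw_base.sales_fact f
           ON f.customer_id = c.customer_id
    GROUP BY c.customer_id;

    -- 2. Refresh demographics on the join columns so the optimizer can plan well.
    COLLECT STATISTICS ON dw_base.sales_fact COLUMN (customer_id);
    COLLECT STATISTICS ON dw_base.customer   COLUMN (customer_id);

    -- 3. Check row distribution (skew) across AMPs using the hash functions.
    SELECT HASHAMP(HASHBUCKET(HASHROW(customer_id))) AS amp_no,
           COUNT(*)                                  AS row_cnt
    FROM   dw_base.sales_fact
    GROUP BY 1
    ORDER BY 2 DESC;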

Environment: Ab Initio (Co>Operating System 3.1.1/2.15/2.14, GDE 3.0.4/3.0.2/1.15/1.14), Erwin 4.0, UNIX, MVS, SQL, PL/SQL, Teradata 13.0/12.0, DB2, COBOL, Perl, Autosys, OPC

Confidential, Richmond, VA

Ab Initio/Teradata Developer

Responsibilities:

  • Involved in all phases of the Software Development Life Cycle: analysis, business modeling, and data modeling.
  • Used Ab Initio to extract, transform, and load data from multiple input sources such as flat files and Oracle into the database.
  • Designed, developed, and unit tested a relational data warehouse project at Confidential using the Ab Initio ETL tool.
  • Created various Ab Initio graphs with 8-way Multi File Systems (MFS) composed of individual files on different nodes, partitioned and stored in distributed directories (using multidirectories).
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Involved in preparing graph design documents.
  • Involved in developing UNIX Korn shell wrappers to run various Ab Initio scripts.
  • Developed Ab Initio XFRs to derive new fields and solve various business requirements.
  • Developed a number of Ab Initio graphs based on business requirements using components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast, and Merge.
  • Developed several partition-based Ab Initio graphs for a high-volume data warehouse.
  • Developed various parameterized graphs in GDE.
  • Extensively used Ab Initio tool’s feature of component, data and pipeline parallelism.
  • Extensively used aggregating components like Rollup, Scan and Scan with Rollup in transformations to consolidate the data.
  • Involved in tuning the graphs by creating Lookup files and by minimizing the number of components in the flows and also by implementing MFS.
  • Developed/modified subject area graphs based on business requirements using Ab Initio components such as Filter by Expression, Partition by Expression, Reformat, Join, Gather, Merge, Rollup, Normalize, Denormalize, Scan, and Replicate.
  • Involved in writing shell scripts.
  • Involved in performance tuning of SQL queries, views using TOAD.
  • Implemented data parallelism in graphs by dividing data into segments with Ab Initio partition components and operating on each segment simultaneously.
  • Developed database objects using SQL, including indexes and constraints; good experience with SQL optimization, primarily on Oracle (see the DDL sketch after this list).
  • Good knowledge of development methodologies.
  • Used phases and checkpoints in the graphs to avoid the deadlocks, improve the performance and recover the graphs from the last successful checkpoint.
  • Extensively used Ab Initio built in string, math and date functions.
  • Used Ab Initio web interface to navigate the EME to view graphs, files and datasets and examine the dependencies among objects.
  • Efficiently used Graph level parameters in building and executing graphs.
  • Experienced using the Autosys scheduler, including scheduling, updating, creating, and querying box/command jobs in a multi-server environment.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, and using hash functions.
  • Created UNIX shell scripts (wrapper scripts) for Autosys scheduling.
  • Performed various data cleansing and data quality exercises using various Ab Initio functions.
  • Understood the current design, reviewed the code, and suggested design improvements in Ab Initio.
  • Provided Level 2 production support for the application.
  • Scheduled the development team to walk through their process with the DTOPS and test/acceptance teams.
  • Involved in creating level-of-effort (LOE) estimates and presenting them to the application manager.
  • Worked with the PM assigned to the application to provide input to the project plan.
  • Performed the tasks outlined in the plan.
  • Worked with the application manager and PM to add dates to the dashboard.
  • Involved in evaluating the code to determine what needed to change to conform to operational requirements.
  • Provided status updates to the assigned PM on the tasks assigned to the team.
  • Working knowledge of logical and physical data modeling.
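
A minimal Oracle DDL sketch of the index-and-constraint work referred to above; the tables and columns (acct_dim, cust_dim, acct_id, cust_id) are hypothetical, and cust_dim is assumed to already exist.

    -- Hypothetical dimension table with primary and foreign key constraints.
    CREATE TABLE acct_dim (
        acct_id     NUMBER(12)   NOT NULL,
        cust_id     NUMBER(12)   NOT NULL,
        acct_status VARCHAR2(10),
        open_dt     DATE,
        CONSTRAINT pk_acct_dim  PRIMARY KEY (acct_id),
        CONSTRAINT fk_acct_cust FOREIGN KEY (cust_id)
                   REFERENCES cust_dim (cust_id)   -- cust_dim assumed to exist
    );

    -- Index the foreign key so the ETL join queries avoid full scans.
    CREATE INDEX ix_acct_dim_cust ON acct_dim (cust_id);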

Environment: Ab Initio GDE 1.15/1.14, Co>Op 2.15/2.14.1, EME, Autosys, UNIX, ClearQuest, Oracle 10g, TOAD, UNIX shell scripting (ksh).

Confidential, Phoenix, AZ

Ab Initio/Teradata Developer

Responsibilities:

  • Gathered business requirements and mapping documents.
  • Prepared and implemented data verification and testing methods for the data warehouse, designed and implemented data staging methods, and stress tested ETL routines to make sure they do not break under heavy loads.
  • Played a prominent role in the maintenance, support, and development of the feeds using Ab Initio graph applications.
  • Involved in requirements gathering and in the design and implementation of a star-schema/dimensional data warehouse using Erwin; used reverse engineering to connect to the existing database and create a graphical representation (E-R diagram).
  • Used the Ab Initio Graphical Development Environment and Shell Development Environment to develop and test the functionality of the feed system.
  • Extensively used the Teradata utilities BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL).
  • Involved in writing complex SQL queries based on the given requirements, created a series of Teradata macros for various applications in Teradata SQL Assistant, and performed tuning of Teradata SQL statements using the Teradata EXPLAIN command.
  • Implemented several error-checking and validation techniques within the graph applications to ensure data validity (see the validation sketch after this list).
  • Validated the graphs and performed unit testing.
  • Extensively used the Ab Initio tool’s features of component, data, and pipeline parallelism.
  • Developed Perl and UNIX shell scripts for file manipulation and to automate batch jobs through Autosys.
  • Played a prominent role in standardizing Ab Initio code for the current feed system.
  • Helped optimize the Ab Initio graph applications.
  • Analyzed data discrepancies and worked with Subject Matter Experts (SMEs) to ensure that they were fixed.
  • Interacted and worked constantly with the deployment team to determine whether the feed system was ready to be deployed into the testing and production environments.
  • Reviewed mapping documents provided by the business team, implemented the business logic embedded in them into Ab Initio graphs, and loaded the tables needed for data validation.
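
A sketch of the kind of SQL validation checks referred to above, run against hypothetical staging and target tables (dw_stg.acct_stg, dw_base.acct_fact) after each load.

    -- Duplicate-key check: any key appearing more than once fails validation.
    SELECT acct_id, COUNT(*) AS dup_cnt
    FROM   dw_base.acct_fact
    GROUP BY acct_id
    HAVING COUNT(*) > 1;

    -- Mandatory-column and domain checks.
    SELECT COUNT(*) AS bad_rows
    FROM   dw_base.acct_fact
    WHERE  acct_id IS NULL
       OR  txn_amt < 0;

    -- Source-to-target row-count reconciliation.
    SELECT 'source' AS side, COUNT(*) AS row_cnt FROM dw_stg.acct_stg
    UNION ALL
    SELECT 'target', COUNT(*) FROM dw_base.acct_fact;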

Environment: Ab Initio GDE 1.15/1.14/1.13, DB2, Oracle, Autosys, UNIX Korn shell

Confidential, Atlanta, GA

Teradata Developer

Responsibilities:

  • Performed metadata mapping from legacy source systems to target database fields and was involved in creating Ab Initio DMLs.
  • Involved in developing UNIX Korn shell wrappers to run various Ab Initio scripts.
  • Experienced in Automated and manual testing.
  • Experienced in writing test plans and test cases.
  • Developed Ab Initio graphs for Data validation using validate components.
  • Developed Complex Ab Initio XFRs to derive new fields and solve various business requirements.
  • Improved the performance of Ab Initio graphs using various Ab Initio performance techniques, such as using lookups instead of joins.
  • Created various Ab Initio Multi File Systems (MFS) to run graphs in parallel.

Environment: Ab Initio Co>Operating System 2.14, DB2, Oracle 8.1, UNIX, vi editor
