
Teradata Developer Resume


Chicago, IL

SUMMARY:

  • 6+ years of Technical and Functional experience in Data warehouse implementations and ETL methodology using Informatica Power Center 9.5.1/9.0.1/8.6/8.1/7.1, Teradata 12/13.10/14, Oracle 10g/9i/8i and MS SQL Server 2008/2005/2000 in the Finance, Health Insurance and Pharmacy domains.
  • Expertise in Informatica PowerCenter 7.x/8.x/9.1 Designer tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Designed and developed complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner and Update Strategy transformations, covering both active and passive transformations.
  • Expertise in design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2 and Type 3.
  • Expertise in RDBMS, Data Warehouse Architecture and Modeling. Thorough understanding of and experience in data warehouse and data mart design, Star schema, Snowflake schema, and Normalization and Denormalization concepts and principles.
  • Experience in working with Mainframe files, COBOL files, XML, and Flat Files.
  • Extensive experience in ETL (Extract, Transform, Load), Data Integration and Data Warehousing using Informatica Power Center & Oracle PL/SQL technologies.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Working Knowledge of Data warehousing concepts like Star Schema and Snowflake Schema, Data Marts, and the Kimball Methodology used in Relational, Dimensional and Multidimensional data modeling.
  • Extensive knowledge on Data Profiling using Informatica Developer tool.
  • Implemented Slowly Changing Dimension (Type I, II and III) methodologies for accessing the full history of accounts and transaction information; designed and developed Change Data Capture (CDC) solutions that capture and analyze changes from daily feeds to maintain history tables (a sketch of the pattern follows this summary).
  • Involved in Informatica upgrades from Informatica Power Center 7.1.1 to Informatica Power Center 8.6 and from Informatica Power Center 7.1.1 to Informatica Power Center 8.1.1.
  • Very strong skills in project management, requirement analysis, business analysis, database modeling, design and analysis, issue coordination and development with Teradata/Oracle/SQL Server based relational databases.
  • Proficient in Teradata TD12.0/TD13.10/14 database design (conceptual and physical), Query optimization, Performance Tuning.
  • Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ and QueryMan).
  • Familiar with creating secondary indexes and join indexes in Teradata.
  • Expertise in different types of loading, including normal and bulk loading and the challenges of each. Involved in initial loads, incremental loads, daily loads and monthly loads.
  • Expert in troubleshooting, debugging and improving performance at different stages: database, workflows, mappings, repository and monitoring.
  • Involved in Informatica administration such as creating folders and users and change management, and in moving code from DEV to TEST and PROD using deployment groups in Informatica Repository Manager.
  • Experience in handling different data sources ranging from flat files, Excel, Oracle, SQL Server, Teradata, DB2 databases, XML files.
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the performance bottlenecks.
  • Proficient in applying Performance tuning concepts to Informatica Mappings, Session Properties and Databases.
  • Experienced in mentoring Teradata development teams, data modeling, program code development, test plan development, dataset creation, testing and result documentation, defect analysis and bug fixing.
  • Hands-on experience in handling data from various source systems such as flat files, XML sources, Oracle, MS SQL Server, IBM DB2, Teradata and Excel files.
  • Excellent communication skills and experienced in client interaction while providing technical support and knowledge transfer.
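
Illustrative sketch (hypothetical, not tied to any engagement below): a minimal shell-wrapped BTEQ step showing the SCD Type 2 / CDC pattern referenced above - expire changed dimension rows, then insert new versions from the daily feed. All database, table, column and path names are assumed for illustration.

#!/bin/ksh
# Hypothetical SCD Type 2 sketch: expire changed rows, then insert new versions.
# The .tdlogon file is assumed to hold the .LOGON command; all names are illustrative.

bteq <<'EOF'
.RUN FILE = /home/etl/.tdlogon;

/* Close out current dimension rows whose tracked attribute changed in today's feed */
UPDATE dim
FROM DW_DB.CUSTOMER_DIM AS dim, STG_DB.CUSTOMER_STG AS stg
SET end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.customer_id = stg.customer_id
  AND dim.current_flag = 'Y'
  AND dim.customer_name <> stg.customer_name;

/* Insert a new current version for changed or brand-new customers.
   The join relies on the UPDATE above having already expired the changed rows. */
INSERT INTO DW_DB.CUSTOMER_DIM
  (customer_id, customer_name, eff_dt, end_dt, current_flag)
SELECT stg.customer_id, stg.customer_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM STG_DB.CUSTOMER_STG stg
LEFT JOIN DW_DB.CUSTOMER_DIM dim
  ON dim.customer_id = stg.customer_id
 AND dim.current_flag = 'Y'
WHERE dim.customer_id IS NULL
   OR dim.customer_name <> stg.customer_name;

.QUIT 0;
EOF

In the projects below this logic lived inside Informatica SCD mappings; the SQL above only illustrates the equivalent set-based pattern.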

TECHNICAL EXPERTISE:

Databases: Oracle 10g/9i/8i, Teradata 14/13.10/13/12/V2R6.2/V2R5, DB2, MS-SQL Server, MS-Access.

DB Tools/Utilities: Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager, Teradata Query Manager, Teradata Administrator, PMON, SQL*Loader, TOAD 8.0.

SAS: SAS 8/9, SAS/BASE, SAS/SQL, SAS/GRAPH, SAS/STAT, SAS/MACRO, SAS/ODS, SAS/ACCESS, SAS/QC, SAS/CONNECT, SAS/INTRNET, SAS/LAB, SAS/IML

ETL Tools: PL/SQL, Informatica PowerCenter 9.1.5/8.x/7.x/6.x/5.1, Informatica PowerExchange, Ab Initio (GDE 1.15/1.14/1.13, Co-Op 2.15/2.14/2.13, EME).

Data Modeling: Erwin 7.3/9, ER Studio, Sybase Power Designer, Logical/Physical/Dimensional, Star/Snowflake/Extended-star schema, OLAP.

Scheduling Tools: Autosys, Tivoli Maestro.

Version Control Tools: Clear Case.

Operating Systems: Sun Solaris 5.0/2.6/2.7/2.8/8.0, Linux, Windows, UNIX.

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Teradata Developer

Responsibilities:
  • Interacted with the business team to understand business needs and gather requirements.
  • Prepared requirements documents to achieve business goals and meet end-user expectations.
  • Created mapping documents for source-to-stage and stage-to-target mappings.
  • Performed Unit testing and created Unix Shell Scripts and provided on call support.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Wrote UNIX shell scripts to process the data received from the source system on a daily basis.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Partitioned the fact tables and materialized views to enhance the performance.
  • Created records, tables, collections (nested tables and arrays) for improving Query performance by reducing context switching.
  • Worked with TPT wizards to generate the TPT scripts for the Incoming Claims data.
  • Implemented pipeline partitioning concepts like Hash-key, Round-Robin, Key-Range, Pass Through techniques in mapping transformations. Used Control-M for Scheduling.
  • Wrote scripts for data extraction, transformation and loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad and TPump.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN plans and statistics collection.
  • Created a BTEQ script for pre-population of the work tables prior to the main load process (a sketch follows this list).
  • Extensively used Derived Tables, Volatile Tables and Global Temporary Tables in many of the ETL scripts.
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Performance Tuning of sources, Targets, mappings and SQL queries in transformations.
  • Worked on exporting data to flat files using Teradata FastExport.
  • Analyzed the Data Distribution and Reviewed the Index choices.
  • Applied in-depth expertise in the Teradata cost-based query optimizer and identified potential bottlenecks.
  • Worked with PPI Teradata tables and was involved in Teradata specific SQL fine-tuning to increase performance of the overall ETL process.
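
Illustrative sketch of the pre-load BTEQ step mentioned above: rebuild a work table from staging and re-collect statistics before the main load. All database, table, column and path names are assumed for illustration.

#!/bin/ksh
# Hypothetical pre-load step: refresh the work table and the optimizer statistics.
# The .tdlogon file is assumed to hold the .LOGON command; all names are illustrative.

bteq <<'EOF'
.RUN FILE = /home/etl/.tdlogon;
.SET ERROROUT STDOUT;

DELETE FROM WORK_DB.CLAIMS_WRK;

INSERT INTO WORK_DB.CLAIMS_WRK (claim_id, member_id, service_dt, claim_amt)
SELECT claim_id, member_id, service_dt, claim_amt
FROM STG_DB.CLAIMS_STG
WHERE load_dt = CURRENT_DATE;

/* Keep the optimizer's estimates current for the downstream joins in the main load */
COLLECT STATISTICS ON WORK_DB.CLAIMS_WRK COLUMN (claim_id);

/* Abort with a non-zero return code if the last statement failed */
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF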

Environment: Teradata 14.0 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, TPT and TPT scripts, fixed-width files.

Confidential, San Jose, CA

Informatica/ Teradata Developer

Responsibilities:
  • Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
  • Interacted with business team to understand business needs and to gather requirements.
  • Prepared requirements documents to achieve business goals and meet end-user expectations.
  • Involved in creating data models using Erwin.
  • Worked with Designer tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Designed mappings that include restart logic.
  • Performed data profiling and data analysis using SQL queries, looking for data issues and anomalies.
  • Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
  • Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
  • Involved in tuning the mappings, sessions and the Source Qualifier query.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Changed existing data models using Erwin for enhancements to the existing data warehouse projects.
  • Managed all technical aspects of the ETL mapping process with other team members.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Worked with the Statisticians, Data Managers to provide SAS programming in analyzing Clinical Trial Data.
  • Created sessions and workflows to run with the logic embedded in the mappings.
  • Extensively used SQL, PL/SQL code to develop custom ETL solutions and load data into data warehouse system.
  • Wrote UNIX shell scripts to process the data received from the source system on a daily basis.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created and reviewed scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Created PL/SQL scripts to extract the data from the operational database into simple flat text files using UTL FILE package.
  • Partitioned the fact tables and materialized views to enhance the performance.
  • Created records, tables, collections (nested tables and arrays) for improving Query performance by reducing context switching.
  • Created Informatica Mappings and TPT Scripts to load Medical, Eligibility and Pharmacy claims from flat files to tables (a load-wrapper sketch follows this list).
  • Worked with TPT wizards to generate the TPT scripts for the Incoming Claims data.
  • Implemented pipeline partitioning concepts like Hash-key, Round-Robin, Key-Range and Pass-Through techniques in mapping transformations. Used Autosys for scheduling.
  • Wrote scripts for data extraction, transformation and loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad and TPump.
  • Performance tuning of sources, targets, mappings and SQL queries in transformations.
  • Worked on exporting data to flat files using Teradata FastExport.
  • Implemented project using Agile software methodologies (scrum).
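
Illustrative sketch of a wrapper that submits a wizard-generated TPT script for the daily claims flat file, as referenced above. The TPT script, job-variables file, paths and job name are all assumed for illustration.

#!/bin/ksh
# Hypothetical wrapper: submit a wizard-generated TPT script for the daily claims file.
# Paths, file names and the job name are illustrative.

SCRIPT_DIR=/etl/tpt/scripts
JOBVARS=/etl/tpt/vars/claims_jobvars.txt        # TdpId, UserName, target table, etc.
DATA_FILE=/etl/inbound/claims_$(date +%Y%m%d).dat

if [ ! -s "$DATA_FILE" ]; then
    echo "No claims file found for today: $DATA_FILE" >&2
    exit 1
fi

# -f names the TPT script, -v the job-variables file; the last argument is the job name
tbuild -f "$SCRIPT_DIR/load_claims.tpt" -v "$JOBVARS" claims_daily_load
rc=$?

if [ $rc -ne 0 ]; then
    echo "TPT claims load failed with return code $rc" >&2
fi
exit $rc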

Environment: Informatica Developer 9.5.1, UNIX, Oracle 10g, Teradata 14.0 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, TPT and TPT scripts, fixed-width files, TOAD, Harvest (SCM), Windows XP and MS Office Suite.

Confidential, Atlanta, GA

Informatica/ Teradata Developer

Responsibilities:
  • Analyzed, designed and developed ETL strategies and processes; wrote ETL specifications; and performed Informatica development and administration while mentoring other team members.
  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
  • Worked on Teradata and its utilities - TPump and FastLoad - through Informatica. Also created complex Teradata Macros.
  • Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
  • Involved in performance tuning at the source, target, mapping, session and system levels.
  • Moved data from source systems to different schemas based on the dimension and fact tables, using Slowly Changing Dimensions Type 2 and Type 1.
  • Changed Informatica sessions to point to Teradata TPT connections using Stream, Update and Load operators.
  • Mentored Abbott ETL teams on the right usage of Teradata utilities such as TPT operators, FastLoad, MultiLoad, etc.
  • Mostly worked on Dimensional Data Modeling, Star Schema and Snowflake schema modeling.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer, WorkFlow Manager, Workflow Monitor.
  • Worked on Change data Capture (CDC) using CHKSUM to handle any change in the data if there is no flag or date column present to represent the changed row.
  • Worked on reusable code known as tie-outs to maintain data consistency: compared the source and target after the ETL loading is complete to validate that no data is lost during the ETL process.
  • Worked on the PowerExchange bulk data movement process using the PowerExchange Change Data Capture (CDC) method, PowerExchange Navigator and PowerExchange bulk data movement. PowerExchange CDC can retrieve updates at user-defined intervals or in near real time.
  • Worked independently on the critical milestone of the project interfaces by designing a completely parameterized code to be used across the interfaces and delivered them on time in spite of several hurdles like requirement changes, business rules changes, source data issues and complex business functionality.
  • Analyzed the source systems to detect the data patterns and designed ETL strategy to process the data.
  • Wrote UNIX shell scripts to process the data received from the source system on a daily basis.
  • Involved in the continuous enhancements and fixing of production problems.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Supported the development and production support group in identifying and resolving production issues.
  • Created and managed different PowerExchange directories such as the condense files directory and the checkpoint directory.
  • Developed wrapper shell scripts for calling Informatica workflows using the pmcmd command, and created shell scripts to fine-tune the ETL flow of the Informatica workflows (a wrapper sketch follows this list).
  • Responsible for performance tuning of Ab Initio graphs. Wrote UNIX shell scripts for batch scheduling.
  • Implemented Teradata MERGE statements to update huge tables, thereby improving the performance of the application (a sketch appears at the end of this section).
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Worked with PowerCenter versioning (check-in, check-out), querying to retrieve specific objects and maintaining the history of objects.
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
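
Illustrative sketch of a pmcmd wrapper shell script of the kind described above: it starts a workflow, waits for completion, and passes the return code back to the scheduler. Service, domain, folder, user and workflow names are assumed for illustration.

#!/bin/ksh
# Hypothetical pmcmd wrapper: start an Informatica workflow, wait for it, and
# propagate its status to the scheduler. All names below are illustrative.

INT_SERVICE=IS_EDW
DOMAIN=Domain_EDW
FOLDER=FIN_DW
WORKFLOW=wf_load_fin_dw
INFA_USER=etl_batch
PWD_ENV_VAR=INFA_PASSWD          # password is read from this environment variable (-pv)

pmcmd startworkflow \
    -sv "$INT_SERVICE" -d "$DOMAIN" \
    -u "$INFA_USER" -pv "$PWD_ENV_VAR" \
    -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed (pmcmd return code $rc)" >&2
fi
exit $rc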

Environment: Informatica 9.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8.x, Oracle 10g, Teradata 14.10/13.x, UNIX, Citrix, Toad, Putty, PL/SQL Developer, PowerExchange 9.2.1/8.6.1.
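
Illustrative sketch of the MERGE-based update pattern referenced above: upsert a large target table from a staged delta in a single pass. Table and column names are assumed, and the ON clause is assumed to cover the target's primary index (and partitioning column, if any).

#!/bin/ksh
# Hypothetical Teradata MERGE upsert from a staged delta; object names are illustrative.

bteq <<'EOF'
.RUN FILE = /home/etl/.tdlogon;

MERGE INTO DW_DB.ACCOUNT_FACT AS tgt
USING STG_DB.ACCOUNT_DELTA AS src
   ON tgt.account_id = src.account_id
  AND tgt.snapshot_dt = src.snapshot_dt
WHEN MATCHED THEN UPDATE
     SET balance_amt = src.balance_amt,
         status_cd = src.status_cd
WHEN NOT MATCHED THEN INSERT
     (account_id, snapshot_dt, balance_amt, status_cd)
     VALUES (src.account_id, src.snapshot_dt, src.balance_amt, src.status_cd);

.QUIT 0;
EOF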

Confidential, Atlanta, GA

Informatica/ Teradata developer

Responsibilities:
  • Gathered the requirements from the business users and designed the structure for the Data Warehouse.
  • Created new tables and designed the databases. Created new indexes on tables to speed up database access.
  • Involved in gathering the business requirements from the Business Analyst.
  • Tuning of Teradata SQL statements using Explain plan, analyzing the data distribution among AMPs and index usage, collecting statistics, definition of indexes, revision of correlated sub queries, usage of Hash functions, etc.
  • Loaded flat files into the database using FastLoad; the resulting tables were then used in queries to perform joins.
  • Used SQL to query the databases and do as much crunching as possible in Teradata, applying SQL query optimization techniques (explain plans, collecting statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Extensively worked on Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter and Aggregator Transformations in Informatica.
  • Debug the Informatica mappings and validate the data in the target tables once it is loaded with mappings.
  • Transform and Load data into Enterprise Data Warehouse tables using Informatica from the legacy systems and load the data into targets by ETL process through scheduling the workflows.
  • Developed Informatica Objects - Mappings, sessions, Workflows based on the design documents.
  • Debug the Informatica Mappings and validate the data in the target tables once it is loaded with mappings.
  • Developed Informatica SCD Type-I, Type-II and Type III mappings and tuned them for better performance.
  • Created Web forms which are used by many departments.
  • Developed many stored procedures that act as data sources for all the web-based and reporting applications in this project.
  • Wrote many ad hoc queries daily to get the data needed by management.
  • Developed new batch programs in mainframes using COBOL, DB2 and JCL.
  • Created proper Teradata Primary Indexes (PI), taking into consideration both the planned access of data and even distribution of data across all the available AMPs. Considering both the business requirements and these factors, created appropriate Teradata NUSIs for smooth (fast and easy) access to data (a DDL sketch follows this list).
  • Worked on Ab Initio graphs that transfer data from various sources like DB2, legacy systems, flat files and CSV files to the Oracle and flat files.
  • Optimized Ab initio graphs by looking at the CPU runtime, Skew% available in the logs during the runtime and made necessary changes in the components.
  • Monitored AI production jobs like the processing time for different phases and total time and the run flow on WEB-EME. This was very helpful for new code implementations to understand the behavior of new jobs in production environment.
  • Worked on exporting data to flat files using Teradata Fast Export.
  • Analyzed the Data Distribution and Reviewed the Index choices.
  • Applied in-depth expertise in the Teradata cost-based query optimizer and identified potential bottlenecks.
  • Worked with PPI Teradata tables and was involved in Teradata specific SQL fine-tuning to increase performance of the overall ETL process.
  • Developed Informatica mappings to identify SKUs with full name and product description combinations by writing conditional statements and using transformations such as Expression, Aggregator, Update Strategy, Lookup, Router, etc.
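
Illustrative DDL sketch of the index choices described above: a primary index chosen for even distribution across the AMPs, a NUSI for the common access path, and statistics collection for the optimizer. All object names are assumed for illustration.

#!/bin/ksh
# Hypothetical DDL: PI for even AMP distribution, NUSI for the frequent access path,
# plus statistics for the optimizer. Object names are illustrative.

bteq <<'EOF'
.RUN FILE = /home/etl/.tdlogon;

/* High-cardinality PI column keeps the rows evenly distributed across the AMPs */
CREATE MULTISET TABLE DW_DB.ORDER_FACT
(
    order_id    DECIMAL(18,0) NOT NULL,
    customer_id INTEGER NOT NULL,
    order_dt    DATE NOT NULL,
    order_amt   DECIMAL(15,2)
)
PRIMARY INDEX (order_id);

/* NUSI supporting the frequent customer-level lookups */
CREATE INDEX cust_nusi (customer_id) ON DW_DB.ORDER_FACT;

COLLECT STATISTICS ON DW_DB.ORDER_FACT COLUMN (order_id);
COLLECT STATISTICS ON DW_DB.ORDER_FACT COLUMN (customer_id);

.QUIT 0;
EOF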

Environment: Teradata 12/13, Informatica Power Center 8.6.1, Oracle 11g, MS SQL Server 2008, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Erwin, Unix, IBM Mainframes, Oracle Applications 11i, Sun Solaris.

Confidential, Overland Park, KS

ETL/ Teradata consultant

Responsibilities:
  • Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
  • Extracted data from the DB2 database on Mainframes and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad and TPump (a FastLoad sketch follows this list).
  • Architected and developed FastLoad and MultiLoad scripts; developed macros and stored procedures to extract data, and BTEQ scripts that take the date range from the database to drive the extracts.
  • Created JCL scripts for calling and executing BTEQ, FastExport, FastLoad and MultiLoad scripts.
  • Developed Teradata BTEQ scripts to implement the business logic and work on exporting data using Teradata FastExport.
  • Wrote highly complex SQL to pull data from the Teradata EDW and create AdHoc reports for key business personnel within the organization.
  • Created data models for information systems by applying formal data modeling techniques.
  • Strong expertise in physical modeling with knowledge of Primary, Secondary, PPI and Join Indexes.
  • Provided 24/7 on-call production support for various applications, provided resolution for night-time production job abends, and attended conference calls with business operations and system managers to resolve issues.
  • Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
  • Performed reverse engineering of physical data models from databases and SQL scripts.
  • Provided database implementation and database administrative support for custom application development efforts.
  • Performance tuning and optimization of database configuration and application SQL by using Explain plans and Statistics collection based on UPI, NUPI, USI, and NUSI.
  • Developed OLAP reports and Dashboards using the Business intelligence tool - OBIEE .
  • Involved in comprehensive end-to-end testing: Unit Testing, System Integration Testing, User Acceptance Testing and Regression Testing.
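
Illustrative sketch of a FastLoad wrapper of the kind described above, loading a pipe-delimited mainframe extract into an empty Teradata staging table. Logon details come from environment variables; all names, paths and the delimiter are assumed for illustration.

#!/bin/ksh
# Hypothetical FastLoad wrapper: load a delimited mainframe extract into an empty
# staging table. TDPID/TDUSER/TDPASS are assumed environment variables; all other
# names and paths are illustrative. FastLoad requires the target table to be empty.

fastload <<EOF
SESSIONS 4;
LOGON ${TDPID}/${TDUSER},${TDPASS};

SET RECORD VARTEXT "|";

DEFINE acct_id   (VARCHAR(18)),
       acct_type (VARCHAR(4)),
       open_dt   (VARCHAR(10)),
       balance   (VARCHAR(20))
FILE = /etl/inbound/db2_accounts.dat;

BEGIN LOADING STG_DB.ACCT_STG
    ERRORFILES STG_DB.ACCT_ERR1, STG_DB.ACCT_ERR2
    CHECKPOINT 100000;

INSERT INTO STG_DB.ACCT_STG (acct_id, acct_type, open_dt, balance)
VALUES (:acct_id, :acct_type, :open_dt, :balance);

END LOADING;
LOGOFF;
EOF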

Environment: Teradata 12, BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, OBIEE 11g/10g, DB2, ERwin r7.3, IBM Mainframes MVS/OS, JCL, TSO/ISPF, COBOL, ZEKE, UNIX, FTP.
