
Informatica/ Teradata Developer Resume


Strongsville, Ohio

SUMMARY

  • 8 years of technical and functional experience in data warehouse implementations and ETL methodology using Informatica PowerCenter 9.5.1/9.0.1/8.6/8.1/7.1, Teradata 12/13.10/14.10, Oracle 10g/9i/8i and MS SQL Server 2008/2005/2000 in the Finance, Health Insurance and Pharmacy domains.
  • More than 5 years of experience in Data Warehousing/ETL testing in Oracle, SQL Server, Informatica PowerCenter 9.1/8.6/8.1/7.1 and PowerMart 6.2/5.1/4.7.x environments (Repository Manager, Server Manager, Mapping Designer, Mapplet Designer, Transformation Developer and Warehouse Designer).
  • Knowledge of data warehousing concepts such as Star and Snowflake schemas, data marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
  • Extensive experience in testing and reviewing dimensional models (Star and Snowflake) of data warehouses using Erwin.
  • Hands - on experience and strong understanding of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Good experience with SDLC methodologies such as Waterfall, Agile, and Scrum.
  • Experience in designing, documenting, and executing test plans, test harnesses, test scenarios/scripts and test cases for manual and automated testing, using bug-tracking tools.
  • Strong Teradata SQL experience developing ETL with complex, tuned queries, including analytical functions and BTEQ scripts.
  • Extensively used mapping variables, mapping parameters, and dynamic parameter files for improved performance and increased flexibility; also worked with XML sources and targets.
  • Data processing experience designing and implementing data mart applications, mainly the transformation process using Informatica.
  • Developed workflows with worklets, event waits, assignments, conditional flows, and email and command tasks using Workflow Manager.
  • Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
  • Involved in various Data Warehouse implementations involving ODS, EDW and Data Mart layers.
  • Very good understanding of Teradata UPI and NUPI, secondary indexes and join indexes.
  • Work experience reviewing and testing data maps between various legacy systems and relational databases.
  • Very strong skills in project management, requirement analysis, business analysis, database modeling, design and analysis, issue coordination, and development with Teradata/Oracle/SQL Server relational databases.
  • Proficient in Teradata 12/13.10/14 database design (conceptual and physical), query optimization, and performance tuning.
  • Tested UNIX shell scripts written for ETL Processes to schedule workflows on Autosys.
  • Strong hands on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, Tpump, BTEQ and QueryMan).
  • Automated BTEQ report generation on a weekly and monthly basis using UNIX scheduling tools. Well versed in reading Explain plans and confidence levels, with a very good understanding of database skew. Knowledgeable in query performance tuning using Explain, Collect Statistics, compression, NUSIs, and Join Indexes, including Sparse Indexes.
  • Experienced with mentoring Teradata Development teams, data modeling, program code development, test plan development, datasets creation, testing and result documentation, analyzing defects, bug fixing.
  • Hands on experience in handling data from various source systems such as Flat Files, XML Source, Oracle, MS SQL Server, IBM DB2, Teradata and Excel Files
  • Excellent communication skills and experienced in client interaction while providing technical support and knowledge transfer.
  • Experience in data modeling, both logical and physical, using the data modeling tools Erwin and ER Studio. Created data models for both 3NF and dimensional models, including Star schemas.
  • Proficient with the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, Fact and Dimensional Tables with data modeling tools ERWIN and ER Studio.
  • Experience in Data Analysis, Data Profiling and Data Mapping. Used SQL and Data Profiler tools.
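The query-tuning items above (Explain plans, Collect Statistics) can be sketched in Teradata SQL. This is a minimal illustration only; the table and column names are hypothetical, and exact statistics syntax varies by Teradata release:

```sql
-- Collect statistics so the optimizer has accurate demographics
-- (hypothetical table/columns, shown for illustration)
COLLECT STATISTICS ON sales_fact COLUMN (sale_date);
COLLECT STATISTICS ON sales_fact COLUMN (customer_id);

-- Inspect the optimizer plan and its confidence levels before tuning
EXPLAIN
SELECT customer_id, SUM(sale_amt)
FROM   sales_fact
WHERE  sale_date BETWEEN DATE '2014-01-01' AND DATE '2014-03-31'
GROUP  BY customer_id;
```

Reading the EXPLAIN output for "high confidence" steps and product-join warnings is typically the first step before adding indexes or compression.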

Areas of Expertise

  • Data Warehousing & ETL Testing
  • Data Transfer & Data Migration
  • Functional Requirements
  • Scripting & Documentation
  • Data Processing & Development
  • Performance Testing
  • Query Optimization
  • Technical & User Documentation
  • Informatica 8.6.1
  • ERWIN/ER Studio
  • Teradata - SQL and Advanced SQL

TECHNICAL SKILLS

Primary Tools: Informatica PowerCenter 9.0.1/8.6/8.1, Ab Initio (Co>Op 3.0.3.9/2.15/2.14, GDE 3.0.4/1.15/1.14), Teradata SQL, Teradata Tools and Utilities, TOAD

Languages: SQL (ANSI and Teradata), COBOL, JCL, REXX

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, Teradata Manager

Databases: Teradata 14.10/13.10/13/12, Oracle 10g/9i, DB2/UDB, SQL Server

Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX

Data Modeling: Erwin, ER Studio

Scheduling tools: Control M, Autosys

Reporting Tools: Tableau

PROFESSIONAL EXPERIENCE

Confidential — Strongsville, Ohio

Informatica/ Teradata Developer

Responsibilities:

  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Working on different workflow tasks such as sessions, event raise, event wait, e-mail, command tasks, and worklets, as well as scheduling of the workflows.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Running and monitoring daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
  • Architected and developed FastLoad and MultiLoad scripts; developed macros and stored procedures to extract data, and BTEQ scripts that take a date range from the database to drive the extract.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
  • Worked on Teradata and its utilities - TPump and FastLoad through Informatica. Also created complex Teradata macros.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Worked Extensively on Teradata SQL Assistant to analyze the existing data and implemented new business rules to handle various source data anomalies.
  • Designed Mappings by including the logic of restart.
  • Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
  • Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
  • Involved in tuning the mappings, sessions and the Source Qualifier query.
  • Created shell scripts to fine-tune the ETL flow of the Informatica workflows, and wrote several DB2 stored procedure scripts to implement the business logic.
  • Analyzing, designing and developing ETL strategies and processes; writing ETL specifications; Informatica development and administration; and mentoring other team members.
  • Reviewing programs for QA and Testing.
  • Writing SQL Scripts to extract the data from Database and for Testing Purposes.
  • Developed Teradata BTEQ scripts to implement the business logic and work on exporting data using Teradata FastExport.
  • Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
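A minimal sketch of the date-driven BTEQ extract pattern mentioned above, where a control table supplies the extract window. All names (logon string, tables, columns, file paths) are hypothetical:

```sql
.LOGON tdpid/etl_user,password;

-- Write the result set to a flat file for downstream processing
.EXPORT REPORT FILE = /data/extract/claims_extract.txt;

-- Hypothetical control table etl_control supplies the date range
SELECT c.claim_id, c.claim_amt, c.service_date
FROM   claims c, etl_control ctl
WHERE  ctl.job_name = 'CLAIMS_EXTRACT'
  AND  c.service_date BETWEEN ctl.start_dt AND ctl.end_dt;

.EXPORT RESET;
.LOGOFF;
```

For very large result sets, the same query would typically be run through FastExport rather than BTEQ.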

Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8x, Oracle 10G, Teradata, UNIX, Citrix, Toad, Putty, PL/SQL Developer

Confidential — Cincinnati, OH

ETL/Teradata Developer

Responsibilities:

  • Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
  • Extracted data from a DB2 database on the mainframe and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Architected and developed FastLoad and MultiLoad scripts; developed macros and stored procedures to extract data, and BTEQ scripts that take a date range from the database to drive the extract.
  • Created JCL scripts for calling and executing BTEQ, FastExport, FastLoad, and MultiLoad scripts.
  • Developed Teradata BTEQ scripts to implement the business logic and worked on exporting data using Teradata FastExport.
  • Loaded large datasets into Teradata using utilities such as FastLoad and BTEQ, and into Vertica using the COPY command.
  • Worked on Teradata and its utilities - TPump and FastLoad through Informatica. Also created complex Teradata macros.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Moving the data from source systems to different schemas based on the dimensions and fact tables by using the slowly changing dimensions type 2 and type 1.
  • Wrote highly complex SQL to pull data from the Teradata EDW and create AdHoc reports for key business personnel within the organization.
  • Created data models for information systems by applying formal data modeling techniques.
  • Strong expertise in physical modeling, with knowledge of Primary, Secondary, PPI, and Join Indexes.
  • Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
  • Performed reverse engineering of physical data models from databases and SQL scripts.
  • Provided database implementation and database administrative support for custom application development efforts.
  • Performance tuning and optimization of database configuration and application SQL by using Explain plans and Statistics collection based on UPI, NUPI, USI, and NUSI.
  • Developed OLAP reports and Dashboards using the Business intelligence tool - OBIEE.
  • Involved in comprehensive end-to-end testing- Unit Testing, System Integration Testing, User Acceptance Testing and Regression.
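A FastLoad script of the kind described above might look like the following sketch. FastLoad requires an empty target table; all names and the input layout here are hypothetical:

```sql
SESSIONS 4;
.LOGON tdpid/etl_user,password;

-- FastLoad requires its error tables to be absent before a fresh run
DROP TABLE stg_policy_err1;
DROP TABLE stg_policy_err2;

-- Describe the flat-file input fields (hypothetical layout)
DEFINE policy_id  (VARCHAR(18)),
       policy_amt (VARCHAR(18)),
       eff_date   (VARCHAR(10))
FILE = /data/inbound/policy.dat;

BEGIN LOADING stg_policy
      ERRORFILES stg_policy_err1, stg_policy_err2;

INSERT INTO stg_policy
VALUES (:policy_id, :policy_amt, :eff_date);

END LOADING;
.LOGOFF;
```

MultiLoad follows a similar shape but can target populated tables and apply updates and deletes as well as inserts.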

Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8x, Oracle 10G, Teradata, UNIX, Citrix, Toad, Putty, PL/SQL Developer

Confidential — Detroit, MI

Informatica/Teradata Developer

Responsibilities:

  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Worked on Change Data Capture (CDC) using CHKSUM to detect changed rows when no flag or date column is present to identify them.
  • Worked on reusable code known as tie-outs to maintain data consistency; a tie-out compares the source and target after the ETL load completes to validate that no data was lost during the ETL process.
  • Worked on Ab Initio in order to replicate the existing code to Informatica.
  • Implemented Teradata MERGE statements in order to update huge tables thereby improving the performance of the application.
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Working with Power Center Versioning (Check-in, Check-out), Querying to retrieve specific objects, maintaining the history of objects.
  • Worked on Teradata and its utilities - TPump and FastLoad through Informatica. Also created complex Teradata macros.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components
  • Analyzing, designing and developing ETL strategies and processes; writing ETL specifications; Informatica development and administration; and mentoring other team members.
  • Developed mapping parameters and variables to support SQL override.
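The MERGE-based upsert pattern referenced above can be sketched as follows. This is an illustration with hypothetical table and column names, not the actual project code:

```sql
-- Apply the day's delta from a staging table to a large target
-- in a single pass instead of separate UPDATE and INSERT steps
MERGE INTO customer_dim AS tgt
USING customer_stage AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE
  SET customer_name = src.customer_name,
      status_cd     = src.status_cd
WHEN NOT MATCHED THEN INSERT
  (customer_id, customer_name, status_cd)
  VALUES (src.customer_id, src.customer_name, src.status_cd);
```

In Teradata the ON clause should cover the target's primary index so the merge stays AMP-local, which is where the performance gain over row-by-row updates comes from.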

Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8x, Oracle 10G, Teradata, UNIX, Citrix, Toad, Putty, PL/SQL Developer

Confidential — Pleasanton, CA

Teradata Developer

Responsibilities:

  • Analyzing, designing and developing ETL strategies and processes; writing ETL specifications; Informatica development and administration; and mentoring other team members.
  • Developed mapping parameters and variables to support SQL override.
  • Worked on Teradata and its utilities - TPump and FastLoad through Informatica. Also created complex Teradata macros.
  • Worked on reusable code known as tie-outs to maintain data consistency; a tie-out compares the source and target after the ETL load completes to validate that no data was lost during the ETL process.
  • Worked on Ab Initio in order to replicate the existing code to Informatica.
  • Implemented Teradata MERGE statements in order to update huge tables thereby improving the performance of the application.
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Worked on Change Data Capture (CDC) using CHKSUM to detect changed rows when no flag or date column is present to identify them.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Working with Power Center Versioning (Check-in, Check-out), Querying to retrieve specific objects, maintaining the history of objects.
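The tie-out checks described above usually reduce to a reconciliation query comparing row counts and control totals between source and target. A minimal sketch with hypothetical table and column names:

```sql
-- Tie-out: compare row counts and amount totals between the
-- source staging table and the loaded target for today's load
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
FROM   stg_transactions
UNION ALL
SELECT 'TARGET', COUNT(*), SUM(txn_amt)
FROM   dw_transactions
WHERE  load_dt = CURRENT_DATE;
```

Any difference between the two rows flags a load discrepancy to investigate before the data is released downstream.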

Environment: Teradata 12, BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, OBIEE 11g/10g, DB2, ERwin r7.3, IBM Mainframes MVS/OS, JCL, TSO/ISPF, COBOL, ZEKE, UNIX, FTP.
