
Sr. Informatica/ Teradata Developer Resume

Union, NJ

PROFESSIONAL SUMMARY:

  • Around 5 years of experience in ETL (Extract, Transform, Load), Data Integration and Data Warehousing using Informatica, Teradata and Oracle technologies.
  • Good experience in Data warehousing applications, responsible for Extraction, cleansing, Transformation and Loading of data from various sources to the data warehouse.
  • Strong experience with large and midsize data warehousing implementations using Informatica PowerCenter/PowerMart, Oracle, SQL Server, UNIX and Windows platforms.
  • Proficient in data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Expertise in implementing complex business rules by creating robust mappings and reusable transformations, using transformations such as Unconnected Lookup, Connected Lookup, Joiner, Router, Expression, Aggregator, Filter and Update Strategy.
  • Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
  • Proficient in Teradata EXPLAIN plans, the Collect Statistics option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and Volatile, Global Temporary and Derived tables.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Working knowledge of data warehousing concepts such as Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
  • Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, volatile tables and temp tables.
  • Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands. Very good understanding of Teradata UPI and NUPI, secondary indexes and join indexes.
  • Strong experience with Informatica tools - Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Hands on experience in monitoring and managing varying mixed workload of an active data warehouse using various tools like Teradata Workload Analyzer, Teradata Dynamic Workload Manager and Teradata Manager
  • Expertise in business model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series and Cache Management.
  • Worked with push down optimization in Informatica.
  • Developed comprehensive models for managing and communicating the relationships between base objects, and performed the ETL process for loading data into the hub.
  • Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data Management, Data transportation and Data Staging.
  • Experience in creating Tables, Views, Indexes, Stored Procedures, Triggers, Cursors, Functions and Packages in SQL Server.
  • Good knowledge of generating various complex reports using Business Objects, MicroStrategy and Excel.
  • Good knowledge of Full life Cycle Design and development for building data warehouse.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Experience with different indexes (PI, SI, JI, AJIs, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Experience in working with both 3NF and dimensional models for data warehouse and good understanding of OLAP/OLTP systems.
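
The indexing and compression techniques listed above can be illustrated with a short Teradata DDL sketch; all database, table and column names here are hypothetical placeholders, not taken from any actual project:

```sql
-- Illustrative only: a MULTISET fact table with a NUPI for even data
-- distribution, a month-based PPI for partition elimination, and
-- multi-value compression on a low-cardinality status column.
CREATE MULTISET TABLE edw.sales_fact
(
    sale_id   INTEGER NOT NULL,
    store_id  INTEGER NOT NULL,
    sale_dt   DATE NOT NULL,
    status_cd CHAR(1) COMPRESS ('A', 'C', 'R'),
    amount    DECIMAL(12,2)
)
PRIMARY INDEX (store_id)
PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2015-01-01'
                      AND     DATE '2017-12-31'
                      EACH INTERVAL '1' MONTH);

-- Statistics on the PI and the partitioning column help the optimizer
-- choose join plans; usage can be verified with EXPLAIN.
COLLECT STATISTICS COLUMN (store_id), COLUMN (sale_dt)
    ON edw.sales_fact;
```

An EXPLAIN of a query filtered on sale_dt would then show partition elimination rather than a full-table scan.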

TECHNICAL SKILLS:

Teradata Utilities: BTEQ, FastLoad, Multiload, TPT, TPump, SQL Assistant, Viewpoint, Query Monitor

ETL Tools: Informatica Power Center 9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager) and Informatica Data Quality (IDQ).

Databases: Teradata 15.10/14.X/13.X/12.X, Oracle 11g/10g/8i, DB2, SQL Server

Languages: SQL, PL/SQL, UNIX Shell Scripting

Operating Systems: Windows, UNIX, Linux.

Tools/Utilities: PLSQL Developer, TOAD, SQL Developer.

PROFESSIONAL EXPERIENCE:

Confidential, Union, NJ

Sr. Informatica/ Teradata Developer

Responsibilities:

  • Responsible for creating Technical Design documents, Source to Target mapping documents and Test Case documents to reflect the ELT process.
  • Involved in the complete Software Development Life Cycle (SDLC), from business analysis to development, testing, deployment and documentation.
  • Designed, developed and built Informatica Power Center mappings and workflows using Teradata external loaders.
  • Gathered requirements and created functional and technical design documents covering requirements study, business rules, data mapping and workflows.
  • Extracted data from various source systems such as Oracle, SQL Server and flat files as per the requirements.
  • Developed scripts to load data into the base tables in the EDW, moving data from source to staging and from staging to target tables using the FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Wrote scripts for data cleansing, data validation and data transformation of data coming from different source systems.
  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Wrote complex SQL using joins, subqueries and correlated subqueries, and used SQL queries for cross-verification of data.
  • Developed the Teradata Macros, Stored Procedures to load data into Incremental/Staging tables and then move data from staging into Base tables.
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases and casting errors.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Loaded initial, delta and incremental data, as well as migration data, into Teradata.
  • Analyzed data and implemented multi-value compression for optimal use of space.
  • Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types and the order in which the tables are joined.
  • Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
  • Extensively used the Teradata Analyst Pack: Teradata Visual Explain, Teradata Index Wizard and Teradata Statistics Wizard.
  • Extensively used Derived Tables, Volatile Tables and Global Temporary Tables in many of the ETL scripts.
  • Used Informatica tool for extracting data from landing to stage and Teradata utilities for loading data from stage to target tables.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Loaded flat files into the database using FastLoad and then used them in queries to perform joins.
  • Used SQL to query the databases, pushing as much processing as possible into Teradata and applying query optimization techniques (EXPLAIN plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Excellent experience in performance tuning and query optimization of the Teradata SQLs.
  • Created checkpoints and phases to avoid deadlocks, tested the graphs with sample data, then committed the graphs and related files into the repository from the sandbox environment.
  • Scheduled the graphs using Autosys and loaded the data into target tables from the staging area using SQL*Loader.
  • Implemented data parallelism using Multi-File System, Partition and De-partition components, and performed repartitioning to improve overall performance.
  • Involved in unit test plans and system testing.
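
A staging-to-base BTEQ load of the kind described above might follow the sketch below; the logon string, databases, tables and columns are placeholders, not actual project objects:

```sql
.LOGON tdprod/etl_user,password;

-- Insert only rows not already present in the base table.
INSERT INTO edw.customer_base (cust_id, cust_name, load_dt)
SELECT s.cust_id, s.cust_name, CURRENT_DATE
FROM   stg.customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   edw.customer_base b
                   WHERE  b.cust_id = s.cust_id);

-- Exit with a non-zero return code on failure so the scheduler
-- (e.g. Autosys or UC4) can detect the failed step.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```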

Environment: Teradata 14.10, Informatica Power Center 9.6, Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Informatica Cloud, Informatica Data Quality (IDQ), UC4, Control-M, UNIX, SSH (secure shell).

Confidential, Richardson, TX

Sr. Informatica/ Teradata Developer

Responsibilities:

  • Responsible for requirements gathering for an enhancement requested by client. Involved in analysis and implementation.
  • Extensively used transformations to implement the business logic, such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Connected and Unconnected Lookup (against target as well as source), Update Strategy, Source Qualifier and Joiner; designed complex mappings involving target load order and constraint-based loading.
  • Developed Informatica mappings, Reusable transformations. Developed and wrote procedures for getting the data from the Source systems to the Staging and to Data Warehouse system.
  • Extensively used the Teradata utilities like BTEQ, Fastload, Multiload, DDL Commands and DML Commands (SQL).
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN plans and collected statistics.
  • Created a BTEQ script for pre-population of the work tables prior to the main load process.
  • Extensively used Derived Tables, Volatile Table and Global Temporary tables in many of the ETL scripts.
  • Created Primary Indexes (PI) for both planned access of data and even distribution of data across all available AMPs, and created appropriate Teradata NUPIs for fast and easy access of data.
  • Worked on exporting data to flat files using Teradata Fast Export.
  • In-depth expertise in the Teradata cost based query optimizer, identified potential bottlenecks.
  • Responsible for designing ETL strategy for both Initial and Incremental loads.
  • Developed Teradata Macros and Stored Procedures to load data into Incremental/Staging tables, then move data from Staging into Journal tables and from Journal into Base tables.
  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the DataMart.
  • Provided scalable, high speed, parallel data extraction, loading and updating using TPT.
  • Developed UNIX scripts to transfer the data from operational data sources to the target warehouse.
  • Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
  • Extracted data from various source systems such as Oracle, SQL Server and flat files as per the requirements.
  • Implemented full Pushdown Optimization (PDO) for the semantic layer for some of the complex aggregate/summary tables instead of using the ELT approach. Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager and Workflow Monitor.
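
An incremental staging-to-base move of the macro-based kind mentioned above could be sketched as follows; the macro, table and column names are invented for illustration, and the MERGE assumes cust_id is the base table's primary index (Teradata requires the ON clause to cover the target PI):

```sql
-- Hypothetical macro: merge one day's delta from staging into base.
REPLACE MACRO edw.load_customer_base (p_load_dt DATE) AS (
    MERGE INTO edw.customer_base AS b
    USING (SELECT cust_id, cust_name, load_dt
           FROM   stg.customer_stg
           WHERE  load_dt = :p_load_dt) AS s
    ON (b.cust_id = s.cust_id)
    WHEN MATCHED THEN
        UPDATE SET cust_name = s.cust_name
    WHEN NOT MATCHED THEN
        INSERT (cust_id, cust_name, load_dt)
        VALUES (s.cust_id, s.cust_name, s.load_dt);
);

-- Invoked once per load date:
EXECUTE edw.load_customer_base (DATE '2017-06-01');
```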

Environment: Informatica Power Center 9.6, Teradata 15.10, Oracle 11g, SQL Server 2012/2008, UNIX, Flat Files, Toad

Confidential, Scottsdale

Informatica/ Teradata Developer

Responsibilities:

  • Extensively used ETL to load data from Oracle and Flat files to Data Warehouse
  • Extensively worked in data Extraction, Transformation and Loading from source to target system using power center of Informatica.
  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Worked on Teradata RDBMS using the FASTLOAD, MULTILOAD, TPUMP and FASTEXPORT utilities, Teradata SQL and BTEQ.
  • Involved in Performance Tuning at various levels including Target, Source, Mapping, and Session for large data files.
  • Worked on Migration Strategies between Development, Test and Production Repositories.
  • Supported the Quality Assurance team in testing and validating the Informatica workflows.
  • Extensively involved in development of mappings using various transformations of Informatica according to business logic.
  • Created and scheduled Sessions and Batches using Server Manager.
  • Created and monitored sessions using Workflow Manager and Workflow Monitor.
  • Conducted unit testing.
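
A FastLoad job of the kind listed above typically has the shape below; the host, credentials, file path and table names are placeholders, and FastLoad requires the target table to be empty:

```sql
-- Illustrative FastLoad script: bulk-load a pipe-delimited flat file
-- into an empty staging table, with two error tables.
LOGON tdprod/etl_user,password;

DROP TABLE stg.customer_err1;
DROP TABLE stg.customer_err2;

BEGIN LOADING stg.customer_stg
      ERRORFILES stg.customer_err1, stg.customer_err2
      CHECKPOINT 100000;

SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50))
FILE = /data/in/customer.dat;

INSERT INTO stg.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

END LOADING;
LOGOFF;
```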

Environment: Informatica Power Center 8.6, UNIX, Teradata, Oracle 8i, TOAD.
