
Informatica/Teradata Developer Resume


Union, NJ

SUMMARY

  • Around 5 years of experience in ETL (Extract, Transform, Load), data integration, and data warehousing using Informatica PowerCenter 9.6.2/8.6, Teradata, and Oracle technologies.
  • Good experience in data warehousing applications using Informatica, including designing workflows and worklets, configuring the Informatica server, and scheduling workflows and sessions using Informatica PowerCenter 9.x/8.x/7.x.
  • Responsible for extraction, cleansing, transformation, and loading of data from various sources to the data warehouse.
  • Strong experience with large and midsize data warehousing implementations using Informatica PowerCenter/PowerMart, Oracle, SQL Server, UNIX, and Windows platforms.
  • Proficient in data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Expertise in implementing complex business rules by creating robust mappings and reusable transformations, using transformations such as Unconnected Lookup, Connected Lookup, Joiner, Router, Expression, Aggregator, Filter, and Update Strategy.
  • Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
  • Proficient in Teradata EXPLAIN plans, the COLLECT STATISTICS option, Primary Indexes (UPI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and volatile, global temporary, and derived tables (see the DDL sketch after this list).
  • Extensive knowledge of Business Intelligence and data warehousing concepts, with emphasis on ETL and the System Development Life Cycle (SDLC).
  • Working knowledge of data warehousing concepts such as Star and Snowflake schemas, data marts, and the Kimball methodology as used in relational, dimensional, and multidimensional data modeling.
  • Created DataStage jobs (ETL processes) to continually populate the data warehouse from different source systems such as ODS and flat files, and scheduled them with the DataStage Sequencer for system integration testing.
  • Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, volatile tables, and temporary tables.
  • Strong experience with the Informatica tools: Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Hands-on experience monitoring and managing the varying mixed workload of an active data warehouse using tools such as Teradata Workload Analyzer, Teradata Dynamic Workload Manager, and Teradata Manager.
  • Worked with pushdown optimization in Informatica.
  • Comfortable with both technical and functional applications of RDBMS, data mapping, data management, data transportation, and data staging.
  • Involved in issue tracking and agile project management using JIRA.
  • Experience creating tables, views, indexes, stored procedures, triggers, cursors, functions, and packages in SQL Server.
  • Good knowledge of generating complex reports using Business Objects, MicroStrategy, and Excel.
  • Hands-on experience handling large volumes of data in production environments.
  • Development and implementation of a data warehousing project, plus production support for enhancements and maintenance.
  • Hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and COLLECT STATISTICS.
  • Experience working with both 3NF and dimensional models for data warehouses, and a good understanding of OLAP/OLTP systems.
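
To illustrate the index and statistics concepts above, here is a minimal Teradata DDL sketch; all database, table, and column names are hypothetical placeholders, not actual project objects:

    -- Hypothetical fact table showing a NUPI, a monthly PPI, and a USI
    CREATE MULTISET TABLE edw.sales_fact (
        sale_id    INTEGER NOT NULL,
        store_id   INTEGER NOT NULL,
        sale_date  DATE    NOT NULL,
        sale_amt   DECIMAL(12,2)
    )
    PRIMARY INDEX (sale_id)   -- NUPI: drives row distribution across AMPs
    PARTITION BY RANGE_N (sale_date BETWEEN DATE '2015-01-01'
                          AND DATE '2016-12-31' EACH INTERVAL '1' MONTH);

    -- A unique secondary index (USI) provides a second, direct access path
    CREATE UNIQUE INDEX (sale_id, sale_date) ON edw.sales_fact;

    -- Collected statistics feed the optimizer's EXPLAIN plans
    COLLECT STATISTICS ON edw.sales_fact COLUMN (sale_date);
    COLLECT STATISTICS ON edw.sales_fact INDEX (sale_id);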

TECHNICAL SKILLS

Operating Systems: Windows, Unix, Linux.

ETL Tools: Informatica PowerCenter 9.6.2/9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Informatica Server), Informatica PowerExchange, and Informatica Data Quality (IDQ).

Teradata Tools & Utilities: BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant, TSET, Index Wizard, Statistics Wizard.

Databases: Teradata 15.10/14.x/13.x/12.x, Oracle 12c/11g/10g/8i, DB2, SQL Server.

Software Engineering: Agile, Scrum, SDLC, UML

Languages: SQL, PL/SQL, UNIX shell scripting, C++, Java/J2EE.

Scheduling Tools: Autosys, Control-M.

PROFESSIONAL EXPERIENCE

Confidential, Union, NJ

Informatica/Teradata Developer

Responsibilities:

  • Responsible for creating technical design documents, source-to-target mapping documents, and test case documents to reflect the ELT process.
  • Involved in the complete Software Development Life Cycle (SDLC), from business analysis through development, testing, deployment, and documentation.
  • Designed, developed, and built Informatica PowerCenter mappings and workflows using Teradata external loaders.
  • Gathered requirements and created functional and technical design documents covering the study, business rules, data mappings, and workflows.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.
  • Involved in creating Informatica mappings to implement the business rules for loading data, using Source Qualifier, Expression, Aggregator, Lookup, Filter, Router, Update Strategy, Normalizer, Java, Stored Procedure, and Sequence Generator transformations.
  • Developed scripts to load data into the base tables in the EDW, from source to staging and from the staging area to the target tables, using the FastLoad, MultiLoad, and BTEQ utilities of Teradata (see the BTEQ sketch after this list).
  • Wrote complex SQL using joins, subqueries, and correlated subqueries; expert in SQL queries for cross-verification of data.
  • Tuned SQL queries to overcome spool space errors and improve performance.
  • Designed the data quality engine using dynamic SQL execution.
  • Designed unique key combinations from various fields and joined the tables for reporting purposes.
  • Worked with BTEQ in the UNIX environment and executed TPT scripts from the UNIX platform.
  • Also involved in creating views and conformed views and mapping them to the underlying tables.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Handled initial, delta, and incremental data, as well as migration data, loaded into Teradata.
  • Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
  • Very good understanding of database skew, PPI, join methods and join strategies, and join indexes, including sparse, aggregate, and hash join indexes.
  • Made extensive use of the Teradata Analyst Pack: Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
  • As part of monitoring the production system, aborted many queries and rebalanced the workload to even out the total load on the system.
  • Made extensive use of derived tables, volatile tables, and global temporary (GTT) tables in many of the ETL scripts.
  • Tuned Teradata SQL statements using EXPLAIN: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc. (see the tuning sketch after this list).
  • Loaded flat files into the database using FastLoad, then used the loaded tables in queries for joins.
  • Used SQL to query the databases and do as much of the crunching as possible inside Teradata, applying query optimization (EXPLAIN plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Maintained and tuned Teradata production system queries.
  • Supported jobs running in production, resolving failures, tracking failure reasons, and providing resolutions in a timely manner.
  • Designed system alerts through Teradata Manager to automate production system monitoring; built tables and views using UPI, NUPI, USI, NUSI, and PPI.
  • Created checkpoints and phases to avoid deadlocks, tested the graphs with sample data, then committed the graphs and related files into the repository from the sandbox environment.
  • Scheduled the graphs using Autosys and loaded data into the target tables from the staging area using SQL*Loader.
  • Used Viewpoint, Teradata Manager, and PMON to monitor system performance and the load on production systems.
  • Estimated and planned development work using Agile software development.
  • Implemented data parallelism using multi-file system, partition, and departition components, and performed repartitioning to improve overall performance.
  • Involved in unit test plans and system testing.
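
As referenced in the list above, a minimal BTEQ sketch of the staging-to-base load pattern; logon values and all object names are placeholders, not the actual project objects:

    .LOGON tdprod/etl_user,password;

    -- Apply the day's staged delta to the base table as one transaction
    BEGIN TRANSACTION;

    DELETE FROM edw.customer_base
    WHERE  cust_id IN (SELECT cust_id FROM stg.customer_stg);

    INSERT INTO edw.customer_base (cust_id, cust_name, load_dt)
    SELECT cust_id, cust_name, CURRENT_DATE
    FROM   stg.customer_stg;

    END TRANSACTION;

    -- Fail the job with a non-zero return code if the load errored
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;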
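
And a sketch of the EXPLAIN-driven tuning loop described above; the objects are again illustrative only:

    -- Inspect the plan for product joins, low-confidence estimates, and redistribution
    EXPLAIN
    SELECT s.store_id, SUM(s.sale_amt)
    FROM   edw.sales_fact s
    JOIN   edw.store_dim  d ON s.store_id = d.store_id
    GROUP BY s.store_id;

    -- "no confidence" steps usually mean missing statistics on join/filter columns
    COLLECT STATISTICS ON edw.sales_fact COLUMN (store_id);
    COLLECT STATISTICS ON edw.store_dim  COLUMN (store_id);

    -- Check row distribution across AMPs; heavy skew points to a poor primary index
    SELECT HASHAMP(HASHBUCKET(HASHROW(store_id))) AS amp_no, COUNT(*) AS row_cnt
    FROM   edw.sales_fact
    GROUP BY 1
    ORDER BY 2 DESC;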

Environment: Teradata 15.10/14.10, Informatica PowerCenter 9.6.2, Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Informatica Data Quality (IDQ), Oracle 12c, UC4, Control-M, UNIX, SSH (Secure Shell).

Confidential, Lisle, IL

Informatica/Teradata Developer

Responsibilities:

  • Responsible for gathering requirements for an enhancement requested by the client; involved in the analysis and implementation.
  • Extensively used transformations such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Connected and Unconnected Lookup (target as well as source), Update Strategy, Source Qualifier, and Joiner to implement the business logic; designed complex mappings involving target load order and constraint-based loading.
  • Developed Informatica mappings and reusable transformations; developed and wrote procedures for moving data from the source systems to staging and on to the data warehouse.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL).
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN and statistics collection.
  • Created a BTEQ script to pre-populate the work tables prior to the main load process.
  • Made extensive use of derived tables, volatile tables, and global temporary tables in many of the ETL scripts.
  • Identified and continuously acted to improve individual and team knowledge of new technologies, business processes, and project management skills.
  • Also involved in creating views and conformed views and mapping them to the underlying tables.
  • Created a TPT script template for all incoming medical, eligibility, and pharmacy flat files (see the TPT sketch after this list).
  • Worked on exporting data to flat files using Teradata FastExport.
  • Designed system alerts through Teradata Manager to automate production system monitoring.
  • In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks.
  • Responsible for designing the ETL strategy for both initial and incremental loads.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables, then move data from staging to journal tables and from journal into base tables (see the macro sketch after this list).
  • Interacted with the business community and gathered requirements based on changing needs; incorporated the identified factors into Informatica mappings to build the data mart.
  • Provided scalable, high-speed, parallel data extraction, loading, and updating using TPT.
  • Developed UNIX scripts to transfer data from operational data sources to the target warehouse.
  • Supported different application development teams with production support, query performance tuning, system monitoring, database needs, and guidance.
  • Very good understanding of database skew, PPI, join methods and join strategies, and join indexes, including sparse, aggregate, and hash join indexes.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.
  • Built tables and views using UPI, NUPI, USI, NUSI, and PPI.
  • Worked with ETL users to provide access and to create objects in the production environment.
  • Implemented full pushdown optimization (PDO) for the semantic layer for some of the complex aggregate/summary tables instead of using the ELT approach; worked with the Informatica PowerCenter tools Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
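
As referenced in the list above, a much-simplified sketch of what such a TPT load template can look like; the schema, file, credential, and table names are all hypothetical placeholders:

    DEFINE JOB load_medical_file
    DESCRIPTION 'Load a delimited medical flat file into a staging table'
    (
      DEFINE SCHEMA medical_schema
      (
        member_id VARCHAR(18),
        claim_id  VARCHAR(18),
        claim_amt VARCHAR(18)
      );

      -- Producer: reads the pipe-delimited flat file
      DEFINE OPERATOR file_reader
      TYPE DATACONNECTOR PRODUCER
      SCHEMA medical_schema
      ATTRIBUTES
      (
        VARCHAR DirectoryPath = '/data/in/',
        VARCHAR FileName      = 'medical.txt',
        VARCHAR Format        = 'Delimited',
        VARCHAR TextDelimiter = '|'
      );

      -- Consumer: FastLoad-style bulk load into the staging table
      DEFINE OPERATOR td_load
      TYPE LOAD
      SCHEMA *
      ATTRIBUTES
      (
        VARCHAR TdpId        = 'tdprod',
        VARCHAR UserName     = 'etl_user',
        VARCHAR UserPassword = 'password',
        VARCHAR LogTable     = 'stg.medical_log',
        VARCHAR TargetTable  = 'stg.medical_stg'
      );

      APPLY ('INSERT INTO stg.medical_stg (member_id, claim_id, claim_amt)
              VALUES (:member_id, :claim_id, :claim_amt);')
      TO OPERATOR (td_load)
      SELECT * FROM OPERATOR (file_reader);
    );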
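
And a minimal sketch of the staging-to-journal-to-base macro pattern described above; the object names are again illustrative only:

    REPLACE MACRO edw.mv_claims_to_base AS (
      -- Journal the staged rows with a load timestamp
      INSERT INTO edw.claims_jrnl
      SELECT c.*, CURRENT_TIMESTAMP
      FROM   stg.claims_stg c;

      -- Apply the same rows to the base table
      INSERT INTO edw.claims_base (claim_id, member_id, claim_amt)
      SELECT claim_id, member_id, claim_amt
      FROM   stg.claims_stg;

      -- Clear staging for the next cycle; the macro body runs as one transaction
      DELETE FROM stg.claims_stg;
    );

    EXEC edw.mv_claims_to_base;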

Environment: Teradata 15.10, Informatica PowerCenter 9.6, Oracle 12c/11g, SQL Server 2012/2008, UNIX, flat files, TOAD

Confidential

Informatica/Teradata Developer

Responsibilities:

  • Extensively used ETL to load data from Oracle and flat files into the data warehouse.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using Informatica PowerCenter.
  • Developed complex mappings in Informatica to load data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Applied various optimization techniques in the Aggregator, Lookup, and Joiner transformations.
  • Used the pmcmd command-line program to communicate with the server: starting and stopping sessions and batches, stopping the Informatica server, and recovering sessions.
  • Parameterized the mappings to increase reusability.
  • Used the Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.
  • Created procedures to truncate data in the target before the session run.
  • Extensively used the TOAD utility to execute SQL scripts, and worked on SQL to enhance the performance of the conversion mappings.
  • Used PL/SQL procedures in Informatica mappings to truncate data in the target tables at run time.
  • Worked on the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL, and BTEQ utilities.
  • Involved in performance tuning at various levels, including target, source, mapping, and session, for large data files.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) into the Teradata RDBMS using BTEQ, MultiLoad, and FastLoad (see the FastLoad sketch after this list).
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate the transformation rules used in source-to-target mappings and source views, and to verify data in the target tables.
  • Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimal column definitions.
  • Developed Teradata BTEQ scripts to implement the business logic and worked on exporting data using Teradata FastExport.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Responsible for collecting statistics on FACT tables.
  • Designed and developed the complete decision support system using Business Objects.
  • Worked on migration strategies between the development, test, and production repositories.
  • Extensively involved in developing mappings using various Informatica transformations according to the business logic.
  • Handled initial, delta, and incremental data, as well as migration data, loaded into Teradata.
  • Analyzed the data and implemented multi-value compression for optimal use of space (see the compression sketch after this list).
  • Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
  • Very good understanding of database skew, PPI, join methods and join strategies, and join indexes, including sparse, aggregate, and hash join indexes.
  • Performed database testing and report-level testing per the requirements, with excellent knowledge of the data workflow gained from the FSDs (Functional Specification Documents).
  • Excellent understanding of the mapping between source and target by reference to the mapping document.
  • Created and scheduled sessions and batches using Server Manager.
  • Created and monitored sessions using Workflow Manager and Workflow Monitor.
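
As referenced in the list above, a minimal FastLoad sketch of a flat-file bulk load; logon values, file paths, and object names are placeholders:

    LOGON tdprod/etl_user,password;
    DATABASE stg;

    -- FastLoad needs an empty target table and two fresh error tables
    DROP TABLE stg.sales_err1;
    DROP TABLE stg.sales_err2;

    BEGIN LOADING stg.sales_stg
      ERRORFILES stg.sales_err1, stg.sales_err2
      CHECKPOINT 100000;

    SET RECORD VARTEXT "|";

    DEFINE
      sale_id  (VARCHAR(18)),
      sale_amt (VARCHAR(18))
    FILE = /data/in/sales.txt;

    INSERT INTO stg.sales_stg (sale_id, sale_amt)
    VALUES (:sale_id, :sale_amt);

    END LOADING;
    LOGOFF;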
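
And a minimal sketch of the multi-value compression technique mentioned above; the compressed values here are illustrative and would normally come from data profiling:

    -- COMPRESS lists the most frequent values so they cost no row space
    CREATE MULTISET TABLE edw.member_dim (
        member_id INTEGER NOT NULL,
        state_cd  CHAR(2)  COMPRESS ('NJ', 'NY', 'PA', 'CA'),
        status_cd CHAR(1)  COMPRESS ('A', 'T'),
        member_nm VARCHAR(60)
    )
    PRIMARY INDEX (member_id);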

Environment: Teradata 14.10/13.x, Oracle 11g, SQL Assistant, Informatica PowerCenter 8.6.1, Workflow Manager, Workflow Monitor, Target Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, BTEQ, TPT, FastLoad, MultiLoad, Autosys, UNIX, SSH (Secure Shell), TOAD
