
Informatica Developer Resume

San Jose

SUMMARY:

  • 5+ years of IT experience in Data Warehouse Development and Production Support
  • Proficient in all phases of the Software Development Life Cycle (SDLC), including requirements gathering, design, development, system testing, acceptance testing, and production support
  • Expertise in requirement analysis, design, coding, testing, and implementation of ETL/DWH projects using Informatica PowerCenter 10.x/9.x/8.x, SQL, Oracle, and Unix shell scripts
  • Extensive experience with the Informatica ETL tool, designing Workflows, Worklets, Mapplets, and Mappings, and scheduling workflows and sessions using scheduler tools
  • Experience in documenting application use cases and providing source to target mapping requirements
  • Good experience with Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Unconnected and Connected Lookup, Rank, and Sorter transformations
  • Experience integrating various data sources such as Oracle, Microsoft SQL Server, Teradata, and flat files using ETL tools
  • Experience in error handling and troubleshooting using various log files
  • Worked on performance tuning, identifying and resolving performance bottlenecks at various levels, including sources, targets, mappings, and sessions
  • Willingness to learn new concepts and ability to articulate alternative solutions.
  • Proficiency with DW concepts and practice of relational and dimensional data modeling with extensive experience in development and support
  • Worked extensively with a QA Team in understanding test cases, test plans, User Acceptance Testing (UAT) and ensuring that the software meets the system requirements specifications
  • Experience in preparing documentation such as HLD, LLD, and test case documents
  • 1+ year of experience with the Teradata database and its utilities, including FastLoad, MultiLoad, and BTEQ scripting
  • Good understanding of end-to-end data warehouse implementation and strong understanding of business process analysis
  • Experienced with DWH concepts, ETL, star schema, and data modeling, including normalization, re-engineering, dimensional modeling, and fact and dimension tables
  • Good communication skills, interpersonal skills, self-motivated, quick learner
  • Experience working with Windows and Unix operating systems
  • Experience in reviewing design documents and code
  • Efficiently led teams in previous and current projects
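
To illustrate the log-based error handling and troubleshooting mentioned above, here is a minimal shell sketch that scans a session log for error markers. The log file name and message format are made-up examples; real PowerCenter session-log layouts vary by version.

```shell
#!/bin/sh
# Illustrative sketch: scan an ETL session log for error markers.
LOG=sess_log_sample.txt

# Create a small sample log so the scan can be demonstrated
# (in a real job this file would come from the session run).
cat > "$LOG" <<'EOF'
INFO  : (1234) : Session task instance [s_m_load_orders]: started
ERROR : (1234) : Database errors occurred: ORA-00001 unique constraint violated
INFO  : (1234) : Session run completed with [1] failed rows
EOF

# Report any ERROR/FATAL lines so the support team can triage.
if grep -E '^(ERROR|FATAL)' "$LOG"; then
    echo "errors found in $LOG"
else
    echo "no errors in $LOG"
fi
```

A wrapper like this is typically run by the scheduler after each session so failures surface immediately instead of during manual review.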

TECHNICAL SKILLS:

ETL & BI Tools: Informatica PowerCenter 8.6/9.6/10.1.0

Databases: SQL Server 2008, Oracle 11gR2/10g, Teradata

Development Tools: Toad, SQL Developer, Teradata SQL Assistant, MSSQL Studio

Operating Systems: Windows XP/2007/2003, UNIX (Putty, Winscp)

Programming/Languages: C, SQL, PL/SQL, UNIX Shell Scripting

Scheduling Tools: Tidal and Autosys

Methodologies: Star/Snowflake schema, ETL, OLAP, complete software development life cycle

Modeling: Dimensional/ER data modeling, logical/physical modeling

PROFESSIONAL EXPERIENCE:

Confidential, San Jose

Informatica Developer

Environment: Informatica PowerCenter 10.1, SQL Server, Oracle, Tidal, Unix, Windows

Responsibilities:

  • Worked with the business on the requirements, understanding them thoroughly to deliver the complete project outcome
  • Analyzed the business requirements, technical specification and physical data models for ETL mapping and process flow.
  • Implemented the business rules, extracted data from various sources such as SQL Server, Oracle, and flat files, and loaded the required data into the Oracle database
  • Worked extensively with Informatica transformations such as Source Qualifier, Rank, Router, Filter, Joiner, Lookup, Aggregator, Union, and Sorter
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Expertise in using different tasks (Session, Command, Decision, Email, Event-Raise, Event- Wait, Control)
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Implemented and documented all the best practices used for the data warehouse.
  • Improved ETL performance through indexing and caching
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Worked on and coordinated the resolution of data availability issues
  • Worked with the Admin team to migrate Informatica PowerCenter mappings and code/folders from one environment to another as part of release management
  • Implemented various performance tuning techniques on sources, targets, mappings, and workflows, along with database tuning
  • Extracted data from different source databases; created a staging area to cleanse and validate the data
  • Imported data from various sources, then transformed and loaded it into data warehouse targets using Informatica
  • Created complex Aggregator, Expression, Joiner, Filter, Router, Lookup, and Update Strategy transformations
  • Handled Type 2 slowly changing dimensions to populate current and historical data into dimension and fact tables in the data warehouse
  • Designed mappings to populate target tables for both one-time and incremental loads
  • Extracted, transformed, and loaded data from flat-file sources to targets using transformations in the mappings
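
The one-time vs. incremental load distinction above can be sketched with a watermark file, a common shell pattern for this kind of job. The file name and the ORDER_DATE column are illustrative placeholders, not details from the original project.

```shell
#!/bin/sh
# Hedged sketch of an incremental-load watermark kept in a flat file.
WM_FILE=last_run_date.txt

# Seed the watermark on the very first (one-time) load.
[ -f "$WM_FILE" ] || echo "1970-01-01" > "$WM_FILE"

LAST_RUN=$(cat "$WM_FILE")
TODAY=$(date +%Y-%m-%d)

# Filter a source qualifier (or generated SQL) could use to pick up
# only rows changed since the previous run.
FILTER="ORDER_DATE > DATE '$LAST_RUN' AND ORDER_DATE <= DATE '$TODAY'"
echo "incremental filter: $FILTER"

# Advance the watermark; in a real job this happens only after the
# load is confirmed successful.
echo "$TODAY" > "$WM_FILE"
```

Seeding the watermark with an early date makes the first run behave as the one-time full load, after which the same script handles every incremental run.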

Confidential, New Jersey

Informatica Developer/Production Support

Environment: Informatica PowerCenter 9.6, Windows 2008, Unix, Oracle, SQL Server, Autosys

Responsibilities:

  • Performed data analysis for each requirement and provided source-to-target mapping rule documents
  • Designed and developed complex Aggregator, Joiner, and Lookup transformations to generate and consolidate (fact and summary) data using Informatica PowerCenter
  • Used the Slowly Changing Dimensions (SCD type 2) to update the data in the target dimension tables.
  • Walked through the Informatica and Oracle code to identify references to protected-information columns such as SSN, last name, and first name
  • Knowledge of best practices in Data Warehousing and Business Intelligence
  • Designed the dimensional data model of the data warehouse; confirmed source data layouts and needs
  • Involved in Creating Fact and Dimension tables using Star schema
  • Created sessions, database connections and batches using Informatica Server Manager/Workflow Manager.
  • Involved in monitoring the sessions, workflows and worklets using Workflow Monitor to ensure the data is properly loaded into the Enterprise Data Warehouse.
  • Created, configured, and scheduled sessions and batches for different mappings using Workflow Manager and UNIX scripts
  • Extensively used Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches that run the logic implemented in the mappings
  • Developed complex mappings in Informatica to load data from various sources.
  • Checked sessions and error logs to troubleshoot problems and also used debugger for complex problem trouble shooting.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability.
  • Created procedures to truncate data in the target before the session run.
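
The parameterized-mapping work above typically relies on per-run parameter files. A minimal shell sketch follows; the folder, workflow, session, and parameter names are made-up placeholders, though the [folder.WF:workflow.ST:session] header is the standard PowerCenter parameter-file layout.

```shell
#!/bin/sh
# Sketch: generate a PowerCenter parameter file for today's run.
RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=wf_load_dim_customer.param

# $$ prefixes denote mapping/workflow parameters; names are illustrative.
cat > "$PARAM_FILE" <<EOF
[DW_FOLDER.WF:wf_load_dim_customer.ST:s_m_load_dim_customer]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SCHEMA=STAGE
\$\$TGT_SCHEMA=EDW
EOF

echo "wrote $PARAM_FILE"
```

Regenerating the file before each run is what makes the same mapping reusable across dates and environments without code changes.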

Confidential, New Jersey

ETL Developer

Environment: Informatica PowerCenter 9.6, Oracle 10g, SQL Server 2005, Linux and Windows 2008, Autosys

Responsibilities:

  • Scheduled ETL processes on daily, weekly, and monthly bases
  • Responsible for design and development of the rating data mart for the financial data warehouse
  • Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Implemented and documented all the best practices used for the data warehouse.
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
  • Used the ETL process to extract, transform, and load data into the staging area and data warehouse
  • Used transformations to cleanse the data in the staging area per data warehouse requirements
  • Created mapplets and used them in Mapping Designer
  • Created mappings, reusable transformations in Mapping Designer.
  • Worked with different Sources such as Oracle, SQL Server and Flat files.
  • Designed interfaces, fixed bugs, and performed unit testing to check for data discrepancies
  • Prepared program specifications and created database connections
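
The daily/weekly/monthly scheduling above can be sketched as a small dispatch function a scheduler wrapper might use; the frequency rules (monthly on the 1st, weekly on Sundays) are illustrative assumptions, not the original project's calendar.

```shell
#!/bin/sh
# Sketch: decide which load to run based on the calendar.
load_type() {
    # $1 = day of month (01-31), $2 = day of week (1-7, Monday=1)
    if [ "$1" = "01" ]; then
        echo monthly            # month start wins over day-of-week
    elif [ "$2" = "7" ]; then
        echo weekly             # Sundays get the weekly load
    else
        echo daily
    fi
}

# In the real wrapper the arguments would come from: date +%d / date +%u
echo "day 01, Sun -> $(load_type 01 7)"
echo "day 08, Sun -> $(load_type 08 7)"
echo "day 10, Tue -> $(load_type 10 2)"
```

The scheduler (Autosys here) would then start the matching workflow for whichever frequency the function returns.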

Confidential

Informatica Developer

Environment: Informatica PowerCenter 8.6, Oracle, SQL Server, Teradata, BTEQ, FastLoad, MultiLoad, Unix, Windows XP

Responsibilities:

  • Mainly involved in ETL development
  • Involved in developing mappings for stage, dimension, and fact tables using Expression, Update Strategy, Filter, Aggregator, Joiner, and Lookup transformations
  • Involved in fine-tuning mappings as a performance activity
  • Used the ETL process to extract, transform, and load data into the staging area and data warehouse
  • Used transformations to cleanse the data in the staging area per data warehouse requirements
  • Created mappings, reusable transformations in Mapping Designer
  • Accomplished data movement processes that load data from databases using Teradata SQL and utilities such as BTEQ, FastLoad, and MultiLoad
  • Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica
  • Experience with Teradata as the target for the data marts; worked with BTEQ, FastLoad, and MultiLoad
  • Worked with different Sources such as Oracle, SQL Server, Flat files and Teradata
  • Prepared program specifications and created database connections
  • Designed Informatica specs (stage, dimension, and fact tables) for the current system
  • Used the ETL process to extract, transform, and load data into the staging area and data warehouse; created mapplets and used them in mappings
  • Used Teradata as a source and a target for a few mappings; worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions
  • Used Shortcuts to reuse objects across folders without creating multiple objects in the repository.
  • Monitored scheduled daily, weekly, and monthly workflows and executed manual weekly and monthly jobs
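
The FastLoad work above usually runs from a generated control script. A hedged sketch follows: the table, column, and file names are illustrative, and the logon credentials are placeholders that would never be hardcoded in a real job.

```shell
#!/bin/sh
# Sketch: generate a Teradata FastLoad control script for a
# pipe-delimited staging load.
FL_SCRIPT=load_stg_orders.fl

cat > "$FL_SCRIPT" <<'EOF'
LOGON tdprod/etl_user,etl_password;
BEGIN LOADING stg_orders ERRORFILES stg_orders_err1, stg_orders_err2;
SET RECORD VARTEXT "|";
DEFINE order_id (VARCHAR(10)),
       order_date (VARCHAR(10))
FILE=orders.dat;
INSERT INTO stg_orders (order_id, order_date)
VALUES (:order_id, :order_date);
END LOADING;
LOGOFF;
EOF

echo "generated $FL_SCRIPT (run with: fastload < $FL_SCRIPT)"
```

FastLoad requires an empty target table and the two error tables named in ERRORFILES, which is why it suits staging loads ahead of MultiLoad or BTEQ steps.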
