
Informatica Developer Resume Profile

TX

Professional Summary:

  • Experience in designing and implementing Extract, Transform and Load (ETL) processes, programs and scripts.
  • Experience in UNIX shell scripting, CRON, FTP and file management in various UNIX environments.
  • Experience in using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments (a sample wrapper script is sketched after this summary).
  • Experienced in creating ER diagrams in Erwin and documenting existing entities.
  • Analyzed Business and Functional Specifications and assisted the QA team in writing a comprehensive test plan and test cases.
  • Exposure to Large Scale Data Integration and Performance Tuning.
  • 8 years of experience in Information Technology with a strong background in Database development and Data warehousing and experience in ETL process using Informatica PowerCenter 9.x/ 8.x/7.x/6.x.
  • Experience in providing Business Intelligence solutions in Data Warehousing using the Informatica ETL tool (Extraction, Transformation and Loading).
  • Excellent Analytical and Problem Solving Skills. Proven Experience in Full Life Cycle Implementation of Data warehouses.
  • Strong experience in developing strategies for Extraction, Transformation and Loading (ETL) mechanisms using Informatica PowerCenter.
  • Expert in the ETL process using Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, Informatica Server and Repository Manager. Experience in tuning sources, targets, mappings, transformations and sessions.
  • Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
  • Strong experience and knowledge in creating Stored Procedures, Packages and Triggers using Oracle 11g/10g.
  • Data Modeling using Star Schema and Snowflake Schema. Strong in source-to-target data mapping and CDC (Change Data Capture) using Slowly Changing Dimension mappings, incremental aggregation, etc. Highly skilled in the integration of various data sources such as Oracle 11g/10g, DB2, SQL Server and Flat Files.
  • Performed database design and development using Oracle 11g/10g, SQL Server and Teradata V2R6/V2R5.
  • Extensive knowledge in developing Teradata FastExport, FastLoad, MultiLoad and BTEQ scripts.
  • Experience in Informatica Data Quality (IDQ).
  • Experience in data conversion or data integration software components using Informatica.
  • Skilled in SQL tuning. Good knowledge of and experience with different load types such as MultiLoad, writing SQL queries, creating tables and views, and importing them into Informatica. Experienced in ETL automation.
  • Experience in writing custom java methods to extract the data.
  • Good experience in Development, Production and Maintenance Support projects. Skilled in Unit Testing, System Integration Testing and UAT. Excellent communication, client interaction and problem-solving skills.
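
As a minimal illustration of the pmcmd usage noted above, the sketch below shows a UNIX wrapper that starts a workflow, checks the return code and can be scheduled via CRON. The service, domain, folder, workflow and path names are hypothetical placeholders, not objects from any of the projects below.

#!/bin/ksh
# Minimal sketch of running a PowerCenter workflow with pmcmd from UNIX.
# Service, domain, folder, workflow and paths are hypothetical placeholders.
INFA_USER=etl_user
INFA_PWD="********"          # normally sourced from a secured file, never hard-coded

pmcmd startworkflow -sv IS_DEV -d Domain_DEV \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f SALES_FOLDER -paramfile /etl/param/wf_load_sales.par \
      -wait wf_load_sales
RC=$?

if [ $RC -ne 0 ]; then
    # notify support and propagate the failure to the scheduler
    echo "wf_load_sales failed with return code $RC" | mailx -s "ETL failure" etl_support@example.com
    exit $RC
fi

# Example crontab entry running the wrapper nightly at 2:00 AM:
# 0 2 * * * /etl/scripts/run_wf_load_sales.ksh >> /etl/logs/wf_load_sales.log 2>&1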

Technical Skills:

  • ETL Tools
  • Informatica PowerCenter 9.x/8.x/7.x/6.x, Informatica PowerExchange 9.1, Informatica Data Quality 9.1, Informatica B2B DT, Informatica CDC.
  • DBMS
  • Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2, Teradata V2R6/V2R5.
  • Data Modeling
  • Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Erwin 7.2/4.0, Normalization, Denormalization.
  • Tools
  • Toad, AQT, Quest Central for DB2, SQL Developer, ClearCase, Rapid SQL, IBM Rational DOORS, HP ALM.
  • Reporting Tools
  • OBIEE 11g/10g, Oracle BI Apps 7.9, OBIA, Business Objects 4.0, MicroStrategy 8.x/7.x.
  • Languages
  • SQL, PL/SQL, UNIX Shell Scripting, Visual Basic, JAVA, XML, Perl.
  • Operating Systems
  • Windows 9x/XP/2000/NT, UNIX, LINUX.
  • Scheduling Tools
  • TIDAL, AutoSys, Tivoli.

Confidential

Description: Confidential is an operational data store and staging area for the EDW, consisting of the Channel Sales data for Worldwide Sales. I was involved in the APAD and P1 NET projects, where enhancements were made to PODS to retire the regional reporting assets and provide a single reporting layer, Trident (Business Objects).

Responsibilities:

  • Took a role in Requirement Analysis, ETL Design and Development.
  • Involved in converting functional requirements to technical requirements and creating High-Level Design and Product Design documents.
  • Involved in the code review process and code migration (MTP).
  • Implemented a view-stack approach to derive the columns dynamically instead of overwriting the physical values in the database.
  • Experience in Data migration across databases.
  • Worked on PowerCenter Designer client tools like Source Analyzer, Target Designer, Mapping Designer and Mapplet Designer.
  • Modified and created UNIX scripts to accommodate the changes.
  • Participated in Performance Tuning of ETL maps at the Mapping, Session, Source and Target level, as well as writing complex SQL queries from the abstract data model.
  • Tuned SQL Queries in Source qualifier Transformation for better performance.
  • Created and executed unit test cases for various scenarios for all the mappings, workflows and scripts.
  • Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
  • Defect tracking and change management were done using HP ALM.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.

Environment: Informatica 9.5/8.6, Flat Files, Oracle 11g/10g, TOAD, SQL Server 2012, HP-AIX, HP ALM, Tidal, Business Objects.

Confidential

Role: Informatica Developer

Description: The Extend Single Family Reporting 2.1 project leverages the Single Family Credit Data Mart (SFCDM) platform to create synergies across Freddie Mac's financial and management reporting business areas. The extension of the SFCDM creates a consolidated, centralized reporting environment and provides an opportunity for the re-use of business rules and data elements across all users of Single Family data.

Responsibilities:

  • Took a major role in understanding the business requirements, designing, and loading the data into the data mart.
  • Used Informatica ETL to load data from the CDW (DB2) into the Oracle data mart (SFCDM).
  • Worked on PowerCenter Designer client tools like Source Analyzer, Target Designer, Mapping Designer and Mapplet Designer.
  • Moved data from source systems into different schemas based on the dimension and fact tables, using Slowly Changing Dimensions Type 2 and Type 1.
  • Used Informatica Workflow Manager to create, schedule and execute Sessions, Worklets, Command and Email tasks, and Workflows. Performed validation and loading of flat files received from business users. Wrote UNIX shell scripts for the Informatica ETL tool.
  • Participated in Performance Tuning of ETL maps at the Mapping, Session, Source and Target level, as well as writing complex SQL queries from the abstract data model.
  • Tuned SQL Queries in Source qualifier Transformation for better performance.
  • Created and executed unit test cases for various scenarios for all the mappings, workflows and scripts.
  • Used Parameter files to reuse the mapping with different criteria to decrease the maintenance.
  • Extensively used Informatica Debugger for testing the mapping logic during Unit Testing.
  • Provided data to the reporting team for their daily, weekly and monthly reports.
  • Used Autosys to schedule Informatica, SQL script and shell script jobs (a hypothetical parameter-file wrapper for such a scheduled run is sketched after this list).
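
A rough sketch of the parameter-file-driven, Autosys-scheduled runs described above: the hypothetical wrapper below generates a run-specific Informatica parameter file before the workflow is launched. The folder, workflow, connection and variable names are assumptions for illustration only.

#!/bin/ksh
# Hypothetical sketch: build a run-specific parameter file for an Autosys-triggered load.
# Folder, workflow, connection and variable names are illustrative, not actual project objects.
RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=/etl/param/wf_load_sfcdm.par

cat > "$PARAM_FILE" <<EOF
[SFCDM_FOLDER.WF:wf_load_sfcdm]
\$\$LOAD_DATE=$RUN_DATE
\$DBConnection_SRC=CDW_DB2
\$DBConnection_TGT=SFCDM_ORACLE
EOF

# The Autosys job command then calls the usual pmcmd wrapper,
# passing this file via -paramfile "$PARAM_FILE".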

Environment: Informatica 9.5/9.1, PowerExchange, Erwin 7.2/4.0, Flat Files, Oracle 11g/10g, Rapid SQL, DB2, SQL Server 2008/2005, PL/SQL, UNIX, ClearCase, IBM Rational DOORS, HP ALM, Autosys, OBIEE 11g.

Confidential

Role: Informatica Developer

Description: Confidential is an IT solutions firm specializing in Data Management, Software Engineering, and Business Intelligence solutions. The project extracted data from different source systems and loaded it into the Enterprise Data Warehouse. The data covered several subject areas: Drug, Service Provider, Practitioner, Patient, Authorization and Claims. Implemented the Global and Deploy tables for traceability, and data was loaded into different target tables using SCDs.

Responsibilities:

  • Extracted data from several source systems such as Oracle, flat files and DB2 and loaded it into the Enterprise Data Warehouse.
  • Worked on PowerCenter Designer client tools like Source Analyzer, Target Designer, Mapping Designer and Mapplet Designer.
  • Used Data Exchange for transforming structured and unstructured data.
  • Created Mapplets with the help of Mapplet Designer and used those Mapplets in the Mappings. Created reusable transformations and mapplets by using Lookup, Aggregator, Normalizer, Update strategy, Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer and Mapplet Designer respectively.
  • Experience in loading unstructured data using Informatica B2B DT.
  • Experience in creating Custom and Auto profiles using Informatica Data Quality tool set.
  • Experience in using Teradata utilities such as FastExport, FastLoad, MultiLoad and BTEQ scripts (a sample BTEQ wrapper is sketched after this list).
  • Created Teradata external loader connections such as MLoad Upsert, MLoad Update, FastLoad and TPump in Informatica Workflow Manager.
  • Moved data from source systems into different schemas based on the dimension and fact tables, using Slowly Changing Dimensions Type 2 and Type 1.
  • Raised change requests, performed incident management, analyzed and coordinated resolution of program flaws in the development environment, and hot-fixed them in the QA, Pre-Prod and Prod environments during the runs.
  • Used Informatica Workflow Manager to create, schedule and execute Sessions, Worklets, Command and Email tasks, and Workflows. Performed validation and loading of flat files received from business users. Wrote UNIX shell scripts for the Informatica ETL tool. Developed complex procedures, functions and triggers to implement the ETL solution in PL/SQL.
  • Performed SQL tuning using Explain Plan, TKPROF, hints and indexes. Session performance was improved with pipeline partitioning and by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Created an infrastructure to ensure data quality and appropriate data linkage, and delivered self-service reporting to the business, speeding up and improving decision making.
  • Worked with the data modeler in creating the dimension and fact tables in different schemas.
  • Used Parameter files to reuse the mapping with different criteria to decrease the maintenance.
  • Extensively used Informatica Debugger for testing the mapping logic during Unit Testing.
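
As a minimal sketch of the Teradata utility work mentioned above, the wrapper below drives BTEQ from a shell script; the TDPID, logon credentials and table names are placeholders, not actual project objects.

#!/bin/ksh
# Hypothetical sketch: refresh a staging table through BTEQ from a shell wrapper.
# TDPID, logon credentials and table names are illustrative placeholders.
bteq <<EOF
.LOGON tdprod/etl_user,etl_password
DELETE FROM stage_db.claims_stg;
INSERT INTO stage_db.claims_stg
SELECT * FROM landing_db.claims_raw;
.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF

if [ $? -ne 0 ]; then
    echo "BTEQ staging load failed" >&2
    exit 1
fi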

Environment: Informatica 9.x/8.x, PowerExchange, Erwin 7.2/4.0, Flat Files, XML, Oracle 11g/10g, TOAD, DB2, SQL Server 2008/2005, Teradata V2R6, PL/SQL, UNIX, OBIEE 11g, Oracle CRM, IDQ 9.1, OBIA.

Confidential

Role: ETL/Informatica Consultant

Description: Confidential has been a leader in document technology and IT services. The objective of this project was to develop a new data mart called FDM. This data mart features a Star Schema design with fact tables, conformed dimensions (Type 1 and Type 2 included), link dimensions (factless dimensions) and hierarchy tables. The primary source was the EDW.

Responsibilities:

  • Took a major role in understanding the business requirements, designing, and loading the data into the enterprise data warehouse.
  • Used Informatica ETL to load data from different sources into the Oracle DWH.
  • Participated in Performance Tuning of ETL maps at the Mapping, Session, Source and Target level, as well as writing complex SQL queries from the abstract data model.
  • Used techniques like source query tuning, single pass reading and caching lookups to achieve optimized performance in the existing sessions.
  • Used Informatica client tools - Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer - for defining source and target definitions, and coded the process of data flow from the source systems to the data warehouse.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
  • Used Informatica Data Migration to decrease the risk and minimize the errors.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements.
  • Experience in data conversion or data integration software components using Informatica.
  • Responsible for performance tuning for several ETL mappings, Mapplets, workflow session executions.
  • Performed optimization of SQL queries.
  • Used UNIX shell scripting to automate the process, invoking PL/SQL procedures and Informatica sessions (a sample wrapper combining SQL*Plus and pmcmd is sketched after this list).
  • Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Experience in Enterprise Reporting, Cube Analysis (MOLAP), Ad-hoc Query (ROLAP) analysis, Statistical Analysis (Mining) and Report Delivery/Alerting using MicroStrategy.
  • Extensively worked on creating and integrating MicroStrategy reports and objects (Attributes, Filters, Metrics, Facts, Prompts, Templates, Consolidations and Custom Groups).
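
A minimal sketch of the shell automation described above, combining a PL/SQL procedure call through SQL*Plus with a pmcmd workflow start; the connect string, package, integration service, folder and workflow names are hypothetical.

#!/bin/ksh
# Hypothetical sketch: run a PL/SQL pre-load procedure, then start the PowerCenter workflow.
# Connect string, package/procedure, service, folder and workflow names are placeholders.
# ORA_PWD, INFA_USER and INFA_PWD are assumed to be exported by the calling environment.
sqlplus -s etl_user/"$ORA_PWD"@FDMPRD <<EOF
WHENEVER SQLERROR EXIT FAILURE
EXEC fdm_etl_pkg.refresh_dim_customer
EXIT
EOF

if [ $? -ne 0 ]; then
    echo "PL/SQL pre-load step failed" >&2
    exit 1
fi

pmcmd startworkflow -sv IS_FDM -d Domain_FDM \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FDM_FOLDER -wait wf_load_fdm_facts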

Environment: Informatica PowerCenter 8.1/8.0, IDQ 9.1, SQL, PL/SQL, Sybase 12.5, Oracle 11g/10g, MS SQL Server 2005, DB2, TOAD 8.5, Erwin 7.2/4.0, UNIX Scripting, Flat Files, Windows NT/2000, Autosys, MicroStrategy.

Confidential

Role: ETL Analyst/Developer

Description: Confidential is a vertically led global Information Technology (IT) services organization and a leading provider of consulting, business optimization, and outsourcing solutions. Data arrives daily from different source systems into the staging-area tables, where it is transformed according to the business logic and finally loaded into the CDW. Data from the CDW is extracted for analysis and renewals by the Marketing team.

Responsibilities:

  • Involved in Requirement Analysis, ETL Design and Development for extracting data stored in different sources such as Oracle and SQL Server.
  • Analyzed, designed and developed an environment that would facilitate the use of Informatica to transform data coming from the Igrasp OLTP system into the data marts.
  • Developed complex Informatica B2B mappings with transformations such as Lookup, Router, Aggregator, Expression, Update Strategy and Joiner.
  • Created Connected, Unconnected and Dynamic Lookup transformations for better performance.
  • Created sessions, Worklets, workflows for the mapping to run daily, biweekly based on the business requirements.
  • Extensively used Parameter files, mapping variables in the process of development of the IDE, IDQ mappings for all the dimension tables.
  • Developed and implemented UNIX shell scripts for the start and stop procedures of the sessions (a sample start/stop wrapper is sketched after this list).
  • Implemented Slowly Changing Dimension Type 2 methodology in Informatica.
  • Worked on handling performance issues of Informatica Mappings, evaluating current logic for tuning possibilities.
  • Tuned SQL Queries in Source qualifier Transformation for better performance.
  • Created and executed unit test cases for various scenarios for all the mappings, workflows and scripts.
  • Provided data to the reporting team for their daily, weekly and monthly reports.
  • Involved in weekly and bi-monthly team status meetings.
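
The start/stop scripts mentioned above can be sketched roughly as follows; the integration service, domain, folder and workflow names are made-up placeholders, and INFA_USER/INFA_PWD are assumed to be exported by the calling environment.

#!/bin/ksh
# Hypothetical sketch of a start/stop wrapper for a PowerCenter workflow.
# Integration service, domain, folder and workflow names are illustrative placeholders.
ACTION=$1
WF=wf_cdw_daily_load

case "$ACTION" in
  start)
    pmcmd startworkflow -sv IS_CDW -d Domain_CDW -u "$INFA_USER" -p "$INFA_PWD" \
          -f CDW_FOLDER -wait "$WF"
    ;;
  stop)
    pmcmd stopworkflow -sv IS_CDW -d Domain_CDW -u "$INFA_USER" -p "$INFA_PWD" \
          -f CDW_FOLDER "$WF"
    ;;
  *)
    echo "Usage: $0 {start|stop}" >&2
    exit 1
    ;;
esac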

Environment: Informatica PowerCenter 8.5, Erwin 4.0, Oracle 10g, Toad, T-SQL, PL/SQL (Stored Procedures, Triggers, Packages), Windows 2000, UNIX Shell scripts.

Confidential

Role: ETL/Informatica Developer

Description: Confidential is an ISO 9001:2008 certified IT solutions company, established in 2000. The scope of the project was to build data marts for business analysis departments such as generation, transmission, distribution and customer satisfaction. The project involved extracting data from the different sources and loading it into the data mart.

Responsibilities:

  • Involved in the study of existing operational systems, data modeling and analysis, and translated business requirements into the data mart design.
  • Defined the entity-relationship (ER) diagrams and designed the physical databases for the OLTP and OLAP data warehouse.
  • Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Developed mappings to extract data from SQL Server, Oracle, flat files, DB2 and Mainframes, and load it into the data warehouse using PowerCenter and PowerExchange.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Expression, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Router, Filter, Aggregator and Sequence Generator transformations.
  • Created reusable transformations and mapplets based on the business rules to ease the development.
  • Designed and developed UNIX shell scripts to schedule jobs, and wrote pre-session and post-session shell scripts (a sample pre/post-session script is sketched after this list).
  • Collaborated with the Informatica Admin during the Informatica upgrade from PowerCenter 7.1 to PowerCenter 8.1.
  • Developed workflow tasks like reusable Email, Event Wait, Timer, Command and Decision tasks. Used various debugging techniques to debug the mappings.
  • Created test cases for Unit Test, System Integration Test and UAT to check the data quality.
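
As a minimal sketch of the pre-session and post-session scripts noted above: the script below verifies that the inbound flat file exists before the session runs and archives it afterwards, assuming hypothetical directory and file names.

#!/bin/ksh
# Hypothetical pre/post-session script: verify the inbound flat file before the session
# and archive it after a successful load. Paths and file names are placeholders.
SRC_DIR=/etl/inbound
ARCH_DIR=/etl/archive
FILE=customer_feed_$(date +%Y%m%d).dat

case "$1" in
  pre)
    if [ ! -s "$SRC_DIR/$FILE" ]; then
        echo "Source file $FILE is missing or empty" >&2
        exit 1            # non-zero status fails the session before the load starts
    fi
    ;;
  post)
    mv "$SRC_DIR/$FILE" "$ARCH_DIR/" && gzip "$ARCH_DIR/$FILE"
    ;;
esac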

Environment: PowerCenter 7.x, Oracle 9i, SQL Server 2000, SQL, PL/SQL, shell scripts, Toad 8.0, Windows 2000, UNIX, Tivoli.
