
Sr. Informatica Developer Resume


FL

SUMMARY

  • 8+ years of IT experience in the analysis, architecture, design, development, testing and implementation of database applications and Business Intelligence solutions using data warehousing & ETL tools.
  • 2+ years of experience with the ILM Data Archiving tool 6.1, FAS, the Data Validation tool and the Data Discovery tool.
  • Excellent knowledge of the Software Development Life Cycle (SDLC) and industry-standard methodologies such as Waterfall and Agile, including requirement analysis, design, development, testing, support and implementation.
  • Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter in Windows.
  • Experience in database programming in PL/SQL (Stored Procedures, Triggers and Packages).
  • Demonstrated expertise with ETL tools, including SQL Server Integration Services (SSIS) and Informatica, ETL package design, and RDBMSs such as SQL Server and Oracle.
  • Highly skilled in creating dynamic and customized packages for ETL from various data sources using SSIS.
  • Experienced in performing incremental loads and data cleansing in SSIS; expert in error handling and logging using SSIS event handlers.
  • Expertise in utilizing Oracle utility tool SQL Developer and expertise in Toad for developing Oracle applications.
  • Responsible for the design, development and implementation of dynamic SSIS packages for ETL (extract, transform, and load) development following company standards and conventions.
  • Experience in Report Design, Report Model, report configuration and deployment using SSRS.
  • Experience in designing and developing reports on SSRS (SQL Server Reporting Services)
  • Generated various reports such as parameterized, drill-down, drill-through and sub-reports using SQL Server Reporting Services (SSRS) for various projects.
  • Expertise in developing parameterized, chart, graph, linked, dashboard and scorecard reports on SSAS cubes using MDX, and drill-down and drill-through reports using SSRS.
  • Knowledge on implementing hierarchies, relationships types, packages and profiles for hierarchy management in MDM Hub implementation.
  • Expertise and hands-on experience in RDBMS and data warehouse architecture, including the Inmon and Kimball methodologies, with a thorough understanding of and experience in data warehouse and data mart design.
  • Good understanding of Data Models (Dimensional & Transactional), Conceptual/Logical & Physical Data Models, DWH Concepts, ER diagrams, Data Flow Diagrams/Process Diagrams.
  • Knowledge in designing and developing data marts and data warehouses using multi-dimensional models such as the Snowflake and Star schemas with fact & dimension tables.
  • Expertise in development and design of ETL methodology for supporting data transformations and processing, in a corporate wide ETL Solution using Informatica Power Center 9.5/9.1/8.6/8.5/8.1.1/7.1.3/7.1.1/7.0 and Power Exchange (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation developer).
  • Experienced in creation of reusable objects Mapplets & Transformations using Mapplet Designer and Transformation Developer.
  • Experience in performance tuning of sources, targets and mappings using Push Down Optimization and session partitioning techniques such as Round Robin, Hash Key, Range & Pass-Through.
  • Experience in implementing the Informatica Web Services Hub setup and event-based (SOAP request/response) triggering of workflows imported through WSDL.
  • Business Intelligence experience using Cognos, and working knowledge of Business Objects XI R2.
  • Experience in setting up Workflows for Concurrent execution.
  • Experience in UNIX Shell scripting for high volume data warehouse instances.
  • Proficient in Oracle 11g/10g/9i/8i, Teradata, Netezza, SQL Server, DB2, SQL and PL/SQL.
  • Experience in Netezza extract, load and transformation work, taking data from various sources and loading it into the Netezza DB using the nzsql and nzload utilities (a sample nzload wrapper appears after this list).
  • Expert in tuning Netezza query performance with techniques such as CBTs, collocation and collocated joins.
  • Involved in setting up the standards for Architecture, design and development of database applications
  • Expertise in Database development skills using SQL, PL/SQL, T-SQL, Stored Procedures, Functions, Views, Triggers and complex SQL queries. Proficient using TOAD, SQL Developer for system testing of the reports.
  • Experienced working with job schedulers (Autosys, WLM, Maestro) and version control tools (ClearCase & SVN).
  • Trained in and working knowledge of DataStage.
  • Used Informatica B2B to extract data from unstructured source files and PowerExchange to extract real time data.
  • Designed Mappings using B2B Data Transformation Studio with the help of UDT (Unstructured Data Transformation).
  • The developed process takes input requirements from an Excel file and outputs multiple data feed files at multiple schedules, with the specified file names and with/without control files.
  • Worked on a POC to implement Hadoop with Cloudera & Hortonworks technology.
  • Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
  • Executed SQL queries, stored procedures and performed data validation as a part of backend testing.
  • Presented Data Cleansing Results and IDQ plans results to the OpCos SMEs.
  • Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Documented Cleansing Rules discovered from data cleansing and profiling.
  • Expertise in Oracle BI Server, Oracle BI Answers, BI Interactive Dashboards and Oracle BI Publisher.
  • Experience in developing the OBIEE Repository (.rpd) at three layers (Physical Layer, Business Model and Mapping Layer & Presentation Layer), Time Series Objects, configuring metadata, Answers, Delivers / Interactive Dashboards / Reports with drill-down and drill-across capabilities using global & local filters, security setup (groups and access privileges), Web Catalog objects and scheduling iBots.
  • Worked in 24/7 production support of ETL and BI applications for large Life Sciences & Healthcare Data warehouses for monitoring, troubleshooting, resolving issues.
  • Experience reviewing Test plans, Test cases and Test case execution. Understanding business requirement documents and functional specs and then writing test cases using Quality Center. Also played an active role in User Acceptance Testing (UAT) and unit, system & integration testing.
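
A minimal sketch of the nzload wrapper referenced above, assuming a hypothetical host, database, staging table and file path; the flags shown are standard nzload options but should be adapted to the actual environment:

    #!/bin/ksh
    # Sketch: bulk-load a pipe-delimited flat file into a Netezza staging
    # table with nzload (host, database and table names are hypothetical).
    nzload -host nzhost01 -db SALESDW -u "$NZ_USER" -pw "$NZ_PASS" \
           -t STG_ORDERS -df /data/inbound/orders.dat -delim '|' \
           -maxErrors 10
    if [ $? -ne 0 ]; then
        echo "nzload failed for STG_ORDERS" >&2
        exit 1
    fi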

TECHNICAL SKILLS

Environment: UNIX, SSIS/SSRS, Sun Solaris 5.8/5.6, AIX 5.3/4.3, HP-UX, DOS, Linux, Windows 98/NT/2000/XP/Vista/7, SOAP.

Database: Oracle 11g/10g/9i/8i, SQL Server, Netezza, IBM DB2.

Languages: C, C++, Java, XML, UML, UNIX Shell Scripting (Bourne, Korn), SQL, PL/SQL, T-SQL

Tools: Putty, SQL*Plus, TOAD, SQL*Loader, Data Archiving Tool, Data Discovery Tool, Data Validation Tool, SQL Server

ETL Tools: Informatica PC 9/8.x/7.x, Informatica Power Exchange

Reporting Tools: Business Objects, Cognos

Data Modeling: ERwin, MS Visio.

PROFESSIONAL EXPERIENCE

Confidential, FL

Sr. Informatica Developer

Responsibilities:

  • Responsible for requirement gathering from the client and analysis of the same.
  • Responsible for converting Functional Requirements into Technical Specifications
  • Interacted actively with Business Analysts on mapping documents and design process for various sources and targets.
  • Configured and installed the Informatica MDM Hub Server, Cleanse Server and Resource Kit in the Development, QA, Pre-Prod and Prod environments.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Performed land process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store.
  • Validated the archived jobs using DVO (the Data Validation tool).
  • Used the Data Discovery tool to view reports in ILM.
  • Verified data integrity between the various source tables and relationships.
  • Performed data analysis to evaluate data quality and resolve data-related issues.
  • Involved in writing SQL scripts, stored procedures and functions and debugging them.
  • Designed SSIS packages using several transformations to perform data profiling, data cleansing and data transformation.
  • Built efficient SSIS packages for processing fact and dimension tables with complex transforms and Type 1 and Type 2 changes.
  • Generated reports using SQL server reporting services (SSRS).
  • Used SQL Server Reporting Services (SSRS) for creating matrix and tabular, drill down, drill through, and parameterized reports based on the requirements.
  • Deployed reports and set up subscriptions using SSRS to generate all daily, weekly and monthly reports.
  • Design and implementation of a Metadata Repository.
  • As a Sr. ETL developer, provided suggestions and improvements to adhere to the standards.
  • Prepared the Code Review Checklist, Standards document and Reusable components to be used across multiple projects
  • Used Push Down Optimization and Partitioning to improve the performance on Informatica.
  • Developed mappings using Filter, Router, Expression, Source Qualifier, Joiner and Connected & Unconnected Look up, Update Strategy, Stored Procedure, Sequence Generator and Aggregator Transformations.
  • Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving the performance of the mapping.
  • Implemented complex business rules in Informatica Power Center by creating Re-usable Transformations, and working with Mapplets.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks.
  • Preparing the documents for test data loading.
  • Developed Re-Usable Transformations and Re-Usable Mapplets
  • Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values (a sample pmcmd launch with a parameter file appears after this list).
  • Used Session Logs, and Workflow Logs for Error handling and Troubleshooting in the development environment
  • Good understanding of various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache. Responsible for Unit Testing of mappings and workflows.
  • Experienced in loading data into Netezza using the nzload utility.
  • Experienced in loading data between Netezza tables using the nzsql utility.
  • Worked on the Netezza Admin Console when the issues were not solved at the session/workflow level. Involved in quality assurance of data, automation of processes.
  • Worked with SQL Override in the Source Qualifier and Lookup transformation.
  • Extensively worked on UNIX shell scripting and BTEQs to extract data from the warehouse. Involved in design review, code review and performance analysis.
  • Experienced in using Workload Manager (WLM) for scheduling and running the on-demand or scheduled jobs.
  • Involved in performance tuning (both database and Informatica), thereby decreasing the load time.
  • Extensively used various functions such as LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, DECODE and the IIF function.
  • Effectively communicated problems and their expected resolution times to business partners and team members.
  • Responsible for providing timely feedback and necessary help/cooperation to meet client expectations
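
As a minimal sketch of the parameter-file-driven workflow runs mentioned above, the script below launches a PowerCenter workflow through the standard pmcmd CLI; the service, domain, folder, workflow and parameter file names are hypothetical placeholders:

    #!/bin/ksh
    # Sketch: launch a PowerCenter workflow with a parameter file via pmcmd
    # (service, domain, folder and workflow names are hypothetical).
    pmcmd startworkflow -sv INT_SVC_DEV -d Domain_Dev \
          -u "$INFA_USER" -p "$INFA_PASS" \
          -f FLD_SALES -paramfile /opt/infa/param/wf_load_sales.parm \
          -wait wf_load_sales
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "Workflow wf_load_sales failed with return code $rc" >&2
    fi
    exit $rc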

Environment: Informatica PowerCenter (v9.1), SSIS/SSRS, SQL Server, Teradata 13.1, Oracle 11g, UNIX AIX-V6, Workload Manager (WLM)

Confidential, Dublin, OH

Informatica Developer

Responsibilities:

  • Developed and supported the Extraction, Transformation, and load process (ETL) for data migration using Informatica power center.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Extensively used Joins, Triggers, Stored Procedures and Functions in Interaction with backend database using PL/SQL.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Designed and developed SSIS packages, designed stored procedures, configuration files, tables, views, and functions.
  • Created Configuration files with XML documents to support the SSIS packages in different environments
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Migrated Informatica mappings to SQL Server Integration Services (SSIS) packages to transform data from SQL Server 2000 to MS SQL Server 2005.
  • Responsible for Requirement Gathering Analysis and End user Meetings.
  • Responsible for converting Functional Requirements into Technical Specifications.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2 and Flat Files.
  • Profiled the data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
  • Worked on Dimension/Fact tables to implement the business rules and get required results. Developed Re-usable Transformations and Re-Usable Mapplets.
  • Used various transformations like Lookup, Filter, Normalizer, Joiner, Aggregator, Expression, Router, Update strategy, Sequence generator and XML Generator Transformations in the mappings.
  • Used XML spy tool to validate the input source XML files.
  • Worked on XML Parser transformation to read the XSD file and build the source definition and accordingly to read the XML source file.
  • Worked with Shortcuts across Shared and Non Shared Folders.
  • Used Netezza SQL to maintain the ETL frameworks and methodologies in use at the company, and accessed the Netezza environment to implement ETL solutions.
  • Tuned Netezza query performance with techniques such as CBTs, collocation and collocated joins.
  • Involved in loading data into Netezza from legacy systems and flat files using UNIX scripting and Netezza's nzsql & nzload utilities.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
  • Developed Mappings for Type 1 and Type 2 Slowly Changing Dimensions.
  • Responsible for Performance Tuning at the Source, Target, Mapping and Session Level.
  • Created pre- and post-session scripts for checking file existence (a sample pre-session check appears after this list).
  • Responsible for Unit testing and Integration testing of mappings and workflows.
  • Used Rational ClearCase to control versions of all files & folders (check-out, check-in); defect tracking and reporting were done with Rational ClearQuest.
  • Documented existing mappings as per standards and developed template for mapping specification document.
  • Formulating the QA plan for black box testing of the application including Functional, Regression, Integration, Systems and User Acceptance Testing.
  • Provided excellent customer service to the internal functional team by pro-actively following up with the issues on hand (through detailed emails and by setting up short meetings).
  • Bugs found during testing were analyzed beyond their obvious causes (root cause analysis) to anticipate various errors that could occur in the future.
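
A minimal sketch of the pre-session file-existence check mentioned above, assuming the session is configured to fail when the pre-session command returns a nonzero exit code; the path and file name are hypothetical:

    #!/bin/ksh
    # Sketch: pre-session check that aborts the session when the expected
    # source file is missing or empty (path is hypothetical).
    SRC_FILE=/data/inbound/customers.dat
    if [ ! -s "$SRC_FILE" ]; then
        echo "Source file $SRC_FILE missing or empty; aborting." >&2
        exit 1
    fi
    exit 0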

Environment: Informatica PowerCenter 9.5.1, SSIS/SSRS, SQL Server 2012/2008, Oracle 11g, Shell Scripts, UNIX, Quality Center, IDQ tool, Autosys scheduling tool, Rational ClearQuest

Confidential, Memphis, TN

Informatica Developer

Responsibilities:

  • Worked on complex mappings and always guided the team when stuck and ensured timely delivery of the ETL components.
  • Worked with Informatica team members to design, document and configure the Informatica MDM Hub to support loading, cleansing, matching, merging and publication of MDM data.
  • Created STMs (Source to Target Mappings) for data files into the PTY model.
  • Worked with Subversion (SVN) for maintaining documents and code. Performed performance tuning via session partitions, dynamic cache memory and index cache.
  • Followed all SSIS standards to maintain reliability and scalability in the extraction.
  • Used SQL Server Reporting Services (SSRS) for creating drill down, drill through, and parameterized reports based on the requirements.
  • Worked extensively with PL/SQL as part of the process, developing several scripts to handle different scenarios (a sample validation wrapper appears after this list).
  • Implemented update strategies, incremental loads, data capture and incremental aggregation.
  • Involved in the design and implementation of the data quality framework for error handling and file validations.
  • Troubleshot and debugged Excel macros that run SQL queries against Oracle to check validations before the loads.
  • Cleansed and migrated HCO & HCP data; integrated Medpro licensing data to enable sampling processes and compliance reporting; processed sales data from WKHLTH and IMS (prescriber master, customer master, payor master and DDD outlet master) through Informatica.
  • Built reusable mappings, transformations using Informatica Designer
  • Involved in performance tuning of Informatica, optimizing performance by identifying and eliminating target, source, mapping and session bottlenecks while loading into Salesforce.
  • Involved in Design review, code review, Performance analysis.
  • Involved in migration of the Informatica components for multiple releases
  • Involved in testing all the interfaces with large volumes of data and fixed bugs accordingly within the mappings/workflows.
  • Used Redwood to schedule UNIX shell scripts and Informatica jobs
  • Tested all the mappings and sessions in the Development and UAT environments, and migrated them into the Production environment after everything ran successfully.
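
A minimal sketch of a shell wrapper around SQL*Plus for the kind of pre-load validation described above; the connect string, SQL file path, table and column names are all hypothetical:

    #!/bin/ksh
    # Sketch: pre-load validation wrapper. The referenced SQL file
    # (hypothetical) would contain, e.g.:
    #   SET HEADING OFF FEEDBACK OFF PAGESIZE 0
    #   SELECT COUNT(*) FROM stg_sales WHERE processed_flag = 'N';
    #   EXIT
    CNT=`sqlplus -s "$ORA_USER/$ORA_PASS@ORCL" @/opt/etl/sql/count_unprocessed.sql | tr -d ' \t\n'`
    case "$CNT" in
        0) exit 0 ;;                                   # nothing pending; proceed
        ''|*[!0-9]*) echo "Validation query failed: '$CNT'" >&2; exit 2 ;;
        *) echo "$CNT unprocessed rows remain; skipping load." >&2; exit 1 ;;
    esac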

Environment: Informatica 8.6, SSIS/SSRS, Oracle 11g, SQL Server, Redwood scheduling, Subversion version control, Mercury Quality Center, Salesforce.com

Confidential

Informatica Developer

Responsibilities:

  • Involved in analysis of database schemas and design of star schema models for Complaints, Sales and Inventory modules.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in the MDM Hub implementation.
  • Efficient in creating SSIS packages: created packages to extract data from flat files, Teradata, Oracle and DB2, transform the data according to the business requirements and load the data into SQL Server tables.
  • Worked on data cleansing by creating SSIS packages against the flat files.
  • Maintained documentation for every process including the Mapping and Transformation Rules, Source and Target definitions, Output Flat file locations, Shell Scripts and Template files directories
  • Created a test plan and a test suite to validate the data extraction, data transformation and data load and used SQL and Microsoft Excel.
  • Imported source/target tables from the respective databases and created reusable transformations and mappings using Informatica's Designer toolset.
  • Developed Informatica mappings using aggregators, SQL overrides in lookups, source filters and Source Qualifiers, managing data flow into multiple targets with Router transformations; created sessions and workflows for the Informatica mappings.
  • Used various Informatica transformations, such as Source Qualifier, Expression, Lookup, Update Strategy, Filter, Router and Joiner, to develop Informatica mappings.
  • Loaded data to the staging area.
  • Created Informatica mappings to load the data from staging to dimensions and fact tables.
  • Created new shell scripts for file renaming, moving files from one directory to another, etc.; these scripts were used in the post-session command within the Informatica workflow (a sample appears after this list).
  • Used pmcmd to run Informatica workflows and automated the processes using UNIX shell scripts.
  • Interacted with the business clients to resolve their issues/concerns post production implementation
  • Created and maintained Migration documentation and Process Flow for mappings and sessions.
  • Involved in migrating Power Center folders from Development to Production Repository using Repository Manager.
  • Troubleshot production defects and fixed Cognos reports.
  • Used Maestro to schedule UNIX shell scripts and Informatica jobs
  • Changed the load schedules after thorough testing of changes to job dependencies, reducing the daily load duration from 18 hours to 12 hours.
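
A minimal sketch of the post-session file-handling script mentioned above, which renames processed feed files with a timestamp and moves them to an archive directory; all paths and the .dat extension are hypothetical:

    #!/bin/ksh
    # Sketch: post-session command that archives processed feed files
    # with a timestamp suffix (directories and extension are hypothetical).
    SRC_DIR=/data/outbound
    ARC_DIR=/data/archive
    STAMP=`date +%Y%m%d%H%M%S`
    for f in "$SRC_DIR"/*.dat; do
        [ -f "$f" ] || continue              # no matching files; skip
        base=`basename "$f" .dat`
        mv "$f" "$ARC_DIR/${base}_$STAMP.dat"
    done
    exit 0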

Environment: Kalido 8.2/8.3, Oracle 9i, SSIS, Informatica 7.1.3, Maestro scheduling, Mercury Quality Center, Cognos, PHP
