
Sr. Informatica Developer Resume


FL

SUMMARY

  • 8+ years of IT experience in the Analysis, Architecture, Design, Development, Testing and Implementation of Database Applications and Business Intelligence solutions using Data Warehousing & ETL tools.
  • 2+ years of experience with the ILM Data Archiving tool 6.1, FAS, the Data Validation tool and the Data Discovery tool.
  • Excellent knowledge of Software Development Life Cycle (SDLC) with industry standard methodologies like Waterfall and Agile including Requirement analysis, Design, Development, Testing, Support and Implementation.
  • Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.
  • Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter in Windows.
  • Experience in database programming in PL/SQL (Stored Procedures, Triggers and Packages).
  • Demonstrated expertise utilizing ETL tools, including SQL Server Integration Services (SSIS) and Informatica, ETL package design, and RDBMS systems such as SQL Server and Oracle.
  • Highly skilled in creating dynamic and customized packages for ETL from various data sources using SSIS.
  • Experienced in performing Incremental Loads and Data Cleaning in SSIS. Expert in error handling and logging using Event Handlers in SSIS.
  • Expertise in utilizing the Oracle utilities SQL Developer and Toad for developing Oracle applications.
  • Responsible for the design, development and implementation of dynamic SSIS packages for ETL (extract, transform, and load) development following company standards and conventions.
  • Experience in Report Design, Report Model, report configuration and deployment using SSRS.
  • Experience in designing and developing reports on SSRS (SQL Server Reporting Services).
  • Generated various reports such as parameterized, drill-down, drill-through and sub-reports using SQL Server Reporting Services (SSRS) for various projects.
  • Expertise in developing Parameterized, Chart, Graph, Linked, Dashboard and Scorecard reports on SSAS Cubes using MDX, and Drill-down and Drill-through reports using SSRS.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.
  • Expertise and hands-on experience in RDBMS and Data Warehouse Architecture, including the Inmon and Kimball methodologies, with a thorough understanding of and experience in data warehouse and data mart design.
  • Good understanding of Data Models (Dimensional & Transactional), Conceptual/Logical & Physical Data Models, DWH Concepts, ER diagrams, Data Flow Diagrams/Process Diagrams.
  • Knowledge in designing and developing Data Marts and Data Warehouses using multi-dimensional models such as Snowflake Schema and Star Schema with Facts & Dimension tables.
  • Expertise in development and design of ETL methodology for supporting data transformations and processing, in a corporate wide ETL Solution using Informatica Power Center 9.5/9.1/8.6/8.5/8.1.1/7.1.3/7.1.1/7.0 and Power Exchange (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation developer).
  • Experienced in creation of reusable objects Mapplets & Transformations using Mapplet Designer and Transformation Developer.
  • Experience in Performance Tuning of Sources, Targets and mappings, using Push Down Optimization and different Session Partitioning techniques like Round Robin, Hash-Key, Range & Pass-Through.
  • Experience in implementation of the Informatica Web Services Hub setup and event-based (SOAP request/response) triggering of workflows imported through WSDL.
  • Business Intelligence experience using Cognos, and Working knowledge of Business Objects XI R2.
  • Experience in setting up Workflows for Concurrent execution.
  • Experience in UNIX Shell scripting for high volume data warehouse instances.
  • Proficiently rendered services on Oracle 11g/10g/9i/8i, Teradata, Netezza, SQL Server, DB2, SQL and PL/SQL.
  • Experience in the Netezza extract, load and transformation mechanism, taking data from various sources and loading it into a Netezza DB using the NZ SQL and NZ LOAD utilities (see the sketch following this list).
  • Expert in Netezza query performance tuning using techniques such as CBT (clustered base tables), collocation and collocated joins.
  • Involved in setting up the standards for architecture, design and development of database applications.
  • Expertise in database development using SQL, PL/SQL, T-SQL, Stored Procedures, Functions, Views, Triggers and complex SQL queries. Proficient using TOAD and SQL Developer for system testing of the reports.
  • Experienced working with the job schedulers Autosys, WLM and Maestro, and the version control tools ClearCase & SVN.
  • Trained in and working knowledge of DataStage.
  • Used Informatica B2B to extract data from unstructured source files and PowerExchange to extract real time data.
  • Designed Mappings using B2B Data Transformation Studio with the help of UDT (Unstructured Data Transformation).
  • The process developed takes input requirements from an Excel file and outputs multiple data feed files at multiple schedules and file names, with or without control files, as specified.
  • Worked on a POC to implement Hadoop with Cloudera & Hortonworks technology.
  • Worked with business SMEs on developing the business rules for cleansing. Applied business rules using the Informatica Data Quality (IDQ) tool to cleanse data.
  • Executed SQL queries, stored procedures and performed data validation as a part of backend testing.
  • Presented Data Cleansing Results and IDQ plan results to the OpCo SMEs.
  • Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
  • Documented Cleansing Rules discovered from data cleansing and profiling.
  • Expertise in Oracle BI Server, Oracle BI Answers, BI Interactive Dashboards and Oracle BI Publisher.
  • Experience in developing OBIEE Repository (.rpd) at three layers (Physical Layer, Business Model and Mapping Layer & Presentation Layer), Time Series Objects, configuring metadata, Answers, Delivers / Interactive Dashboards / Reports with drill-down and drill-across capabilities using global & local Filters, Security Setup (Groups and Access Privileges), Web Catalog Objects and scheduling iBots.
  • Worked in 24/7 production support of ETL and BI applications for large Life Sciences & Healthcare Data warehouses for monitoring, troubleshooting, resolving issues.
  • Experience reviewing Test plans, Test cases and Test case execution. Understanding business requirement documents and functional specs and then writing test cases using Quality Center. Also played an active role in User Acceptance Testing (UAT) and unit, system & integration testing.
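
A minimal sketch of the Netezza flat-file load mechanism referenced in the summary above, assuming a pipe-delimited file; the database, table, file path and error threshold are hypothetical placeholders, not values from any engagement:

    #!/bin/ksh
    # Sketch: load a pipe-delimited flat file into a Netezza staging table
    # with the nzload utility. DWDB, STG_CUSTOMER and the file path below
    # are illustrative placeholders only.
    DB=DWDB
    TABLE=STG_CUSTOMER
    DATAFILE=/data/inbound/customer.dat

    nzload -db "$DB" -u "$NZ_USER" -pw "$NZ_PASSWORD" \
           -t "$TABLE" -df "$DATAFILE" -delim '|' \
           -lf "$TABLE.log" -bf "$TABLE.bad" -maxErrors 10

    if [ $? -ne 0 ]; then
        echo "nzload failed for $TABLE" >&2
        exit 1
    fi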

TECHNICAL SKILLS

Environment: UNIX, SSIS/SSRS, Sun Solaris 5.8/5.6, AIX 5.3/4.3, HP-UX, DOS, Linux, Windows 98/NT/2000/XP/Vista/7, SOAP.

Database: Oracle 11g/10g/9i/8i, SQL Server, Netezza, IBM DB2.

Languages: C, C++, Java, XML, UML, UNIX Shell Scripting (Bourne, Korn), SQL, PL/SQL, T-SQL

Tools: PuTTY, SQL*Plus, TOAD, SQL*Loader, Data Archiving Tool, Data Discovery Tool, Data Validation Tool, SQL Server

ETL Tools: Informatica PowerCenter 9/8.x/7.x, Informatica PowerExchange

Reporting Tools: Business Objects, Cognos

Data Modeling: ERwin, MS Visio.

PROFESSIONAL EXPERIENCE

Confidential, FL

Sr.Informatica Developer

Responsibilities:

  • Responsible for Requirement Gathering from the client and analysis of the same.
  • Responsible for converting Functional Requirements into Technical Specifications
  • Interacted actively with Business Analysts on mapping documents and design process for various sources and targets.
  • Configured and installed Informatica MDM Hub server, cleanse Server, resource kit in Development, QA, Pre-Prod and Prod Environments.
  • Involved in implementing the Land Process of loading the customer/product data set into Informatica MDM from various source systems.
  • Performed land process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store.
  • Validated the archived jobs using DVO (the data validation tool).
  • Used the data discovery tool to view reports in ILM.
  • Ensured data integrity between the various source tables and relationships.
  • Performed data analysis to evaluate data quality and resolve data-related issues.
  • Involved in writing SQL scripts, stored procedures and functions and debugging them.
  • Designed SSIS packages using several transformations to perform Data Profiling, Data Cleansing and Data Transformation.
  • Built efficient SSIS packages for processing fact and dimension tables with complex transforms and Type 1 and Type 2 changes.
  • Generated reports using SQL server reporting services (SSRS).
  • Used SQL Server Reporting Services (SSRS) for creating matrix, tabular, drill-down, drill-through and parameterized reports based on the requirements.
  • Deployed and subscribed reports using SSRS to generate all daily, weekly and monthly reports.
  • Design and implementation of a Metadata Repository.
  • As a Sr. ETL developer, provided suggestions and improvements to adhere to the standards.
  • Prepared the Code Review Checklist, the Standards document and reusable components to be used across multiple projects.
  • Used Push Down Optimization and Partitioning to improve performance in Informatica.
  • Developed mappings using Filter, Router, Expression, Source Qualifier, Joiner and Connected & Unconnected Look up, Update Strategy, Stored Procedure, Sequence Generator and Aggregator Transformations.
  • Used PL/SQL procedures in Informatica mappings to truncate the data in target tables at run time (a sketch follows this list).
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving the performance of the mapping.
  • Implemented complex business rules in Informatica Power Center by creating Re-usable Transformations, and working with Mapplets.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks.
  • Prepared the documents for test data loading.
  • Developed Re-Usable Transformations and Re-Usable Mapplets
  • Used Session parameters and Mapping variables/parameters, and created Parameter files to enable flexible runs of workflows based on changing variable values.
  • Used Session Logs and Workflow Logs for error handling and troubleshooting in the development environment.
  • Good understanding of various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache. Responsible for Unit Testing of mappings and workflows.
  • Experienced in loading data into Netezza using NZ Load utility.
  • Experienced in loading data between Netezza tables using NZ SQL utility.
  • Worked on the Netezza Admin Console when issues were not solved at the session/workflow level. Involved in quality assurance of data and automation of processes.
  • Worked with SQL Override in teh Source Qualifier and Lookup transformation.
  • Extensively worked on UNIX Shell scripting and BTEQs to extract the data from the warehouse. Involved in design review, code review and performance analysis.
  • Experienced in using Workload Manager (WLM) for scheduling and running on-demand or scheduled jobs.
  • Involved in Performance Tuning (both database and Informatica) and thereby decreased the load time.
  • Extensively used various functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, DECODE and IIF.
  • Effectively communicated problems and expected resolution times to the business partners and team members.
  • Responsible for providing timely feedback and the necessary help/cooperation to meet client expectations.
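
A minimal sketch of the run-time truncate approach from the PL/SQL bullet above, as it might be called from an Informatica pre-session command; the connection variables and dynamic-SQL wrapper are assumptions for illustration:

    #!/bin/ksh
    # Sketch: truncate a target table before the load via SQL*Plus.
    # $1 is the table to truncate; credentials come from the environment.
    # All names here are hypothetical.
    TABLE=$1

    sqlplus -s "$ORA_USER/$ORA_PASSWORD@$ORA_SID" <<EOF
    WHENEVER SQLERROR EXIT FAILURE
    BEGIN
      -- DDL such as TRUNCATE requires dynamic SQL inside PL/SQL
      EXECUTE IMMEDIATE 'TRUNCATE TABLE $TABLE';
    END;
    /
    EXIT
    EOF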

Environment: Informatica PowerCenter (v9.1), SSIS/SSRS, SQL Server, Teradata 13.1, Oracle 11g, UNIX AIX-V6, Workload Manager (WLM)

Confidential, Dublin, OH

Informatica Developer

Responsibilities:

  • Developed and supported the Extraction, Transformation and Load (ETL) process for data migration using Informatica PowerCenter.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Extensively used Joins, Triggers, Stored Procedures and Functions in Interaction with backend database using PL/SQL.
  • Performed match/merge and ran match rules to check the effectiveness of the MDM process on data.
  • Designed and developed SSIS packages, designed stored procedures, configuration files, tables, views, and functions.
  • Created configuration files with XML documents to support the SSIS packages in different environments.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Migrated Informatica mappings to SQL Server Integration Services (SSIS) packages to transform data from SQL Server 2000 to MS SQL Server 2005.
  • Responsible for Requirement Gathering, Analysis and end-user meetings.
  • Responsible for converting Functional Requirements into Technical Specifications.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2 and Flat Files.
  • Profiled the data using Informatica Data Explorer (IDE) and performed a Proof of Concept for Informatica Data Quality (IDQ).
  • Worked on Dimension/Fact tables to implement the business rules and get the required results. Developed reusable Transformations and reusable Mapplets.
  • Used various transformations like Lookup, Filter, Normalizer, Joiner, Aggregator, Expression, Router, Update Strategy, Sequence Generator and XML Generator in the mappings.
  • Used the XMLSpy tool to validate the input source XML files.
  • Worked on the XML Parser transformation to read the XSD file, build the source definition and accordingly read the XML source file.
  • Worked with Shortcuts across shared and non-shared folders.
  • Used Netezza SQL to maintain the ETL frameworks and methodologies in use at the company, and accessed the Netezza environment for implementation of ETL solutions.
  • Expert in Netezza query performance tuning using techniques such as CBT, collocation and collocated joins.
  • Involved in loading the data into Netezza from legacy systems and flat files using UNIX scripting, with the NZ SQL & NZ LOAD utilities of Netezza.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
  • Developed Mappings for Type 1 and Type 2 Slowly Changing Dimensions.
  • Responsible for Performance Tuning at teh Source, Target, Mapping and Session Level.
  • Created pre- and post-session scripts for checking file existence (see the sketch following this list).
  • Responsible for Unit testing and Integration testing of mappings and workflows.
  • Used Rational ClearCase to control versions of all files & folders (check-out, check-in). Defect tracking and reporting were done in Rational ClearQuest.
  • Documented existing mappings as per standards and developed template for mapping specification document.
  • Formulated the QA plan for black-box testing of the application, including Functional, Regression, Integration, System and User Acceptance Testing.
  • Provided excellent customer service to the internal functional team by proactively following up on the issues at hand (through detailed emails and by setting up short meetings).
  • Bugs found during testing were analyzed (root cause analysis) beyond their obvious reason to extrapolate various errors that can occur in the future.
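
A minimal sketch of the pre-session file-existence check mentioned above; the directory, file name and date pattern are hypothetical:

    #!/bin/ksh
    # Sketch: pre-session script that fails the workflow when the expected
    # source file is missing or empty. Path and name are placeholders.
    SRCFILE=/data/inbound/orders_$(date +%Y%m%d).dat

    if [ ! -s "$SRCFILE" ]; then
        echo "Source file $SRCFILE is missing or empty" >&2
        exit 1    # a non-zero exit status fails the pre-session command
    fi
    exit 0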

Environment: Informatica PowerCenter 9.5.1, SSIS/SSRS, SQL Server 2012/2008, Oracle 11g, Shell Scripts, UNIX, Quality Center, IDQ tool, Autosys scheduling tool, Rational ClearQuest

Confidential, Memphis, TN

Informatica Developer

Responsibilities:

  • Worked on complex mappings, guided the team whenever it was stuck and ensured timely delivery of the ETL components.
  • Worked with Informatica team members to design, document and configure the Informatica MDM Hub to support loading, cleansing, matching, merging and publication of MDM data.
  • Created STMs (Source-to-Target Mappings) for data files into the PTY model.
  • Worked with Subversion (SVN) for maintaining documents and code. Performed performance tuning via session partitions, dynamic cache memory and index cache.
  • Followed all SSIS standards to maintain reliability and scalability in the extraction.
  • Used SQL Server Reporting Services (SSRS) for creating drill-down, drill-through and parameterized reports based on the requirements.
  • Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
  • Implemented update strategies, incremental loads, data capture and Incremental Aggregation.
  • Involved in the design and implementation of the Data Quality framework for error handling and file validations (see the sketch following this list).
  • Troubleshot and debugged Excel macros that run SQL queries against Oracle to check validations before the loads.
  • Cleansed and migrated HCO & HCP data, integrated Medpro licensing data to enable sampling processes and compliance reporting, and processed sales data from WKHLTH and IMS (prescriber master, customer master, payor master and DDD outlet master) through Informatica.
  • Built reusable mappings and transformations using Informatica Designer.
  • Involved in performance tuning of Informatica, optimizing performance by identifying and eliminating Target, Source, Mapping and Session bottlenecks while loading into Salesforce.
  • Involved in Design review, code review, Performance analysis.
  • Involved in migration of the Informatica components for multiple releases.
  • Involved in testing all the interfaces with huge amounts of data and fixed bugs within the Mappings/Workflows accordingly.
  • Used Redwood to schedule UNIX shell scripts and Informatica jobs
  • Tested all the mappings and sessions in the Development and UAT environments, and migrated them into the Production environment after everything ran successfully.
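
One way the file-validation step of the error-handling framework above might look; the control-file layout (expected row count on its first line) is an assumption for illustration:

    #!/bin/ksh
    # Sketch: compare the record count of a data file against the count
    # declared in its control file. The file layout is illustrative only.
    DATAFILE=$1
    CTLFILE=$2

    expected=$(head -1 "$CTLFILE")
    actual=$(wc -l < "$DATAFILE")

    if [ "$expected" -ne "$actual" ]; then
        echo "Validation failed: expected $expected rows, found $actual" >&2
        exit 1
    fi
    exit 0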

Environment: Informatica 8.6 ETL, SSIS/SSRS, Oracle 11g Database, SQL Server, Redwood Scheduling, Subversion code control, Mercury Quality Center, Salesforce.com

Confidential

Informatica Developer

Responsibilities:

  • Involved in analysis of database schemas and design of star schema models for Complaints, Sales and Inventory modules.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.
  • Efficient in creating SSIS packages: created packages to extract data from flat files, Teradata, Oracle and DB2, transform the data according to the business requirements and load the data into SQL Server tables.
  • Worked on data cleansing by creating SSIS packages against the flat files.
  • Maintained documentation for every process, including the Mapping and Transformation Rules, Source and Target definitions, output flat file locations, Shell Scripts and Template file directories.
  • Created a test plan and a test suite to validate the data extraction, data transformation and data load, using SQL and Microsoft Excel.
  • Imported source/target tables from the respective databases and created reusable transformations and mappings using the Designer tool set of Informatica.
  • Developed Informatica mappings using Aggregator, SQL overrides in Lookups, source filters and Source Qualifier, with data flow management into multiple targets using Router transformations. Created sessions and workflows for the Informatica mappings.
  • Used various Informatica transformations, such as Source Qualifier, Expression, Lookup, Update Strategy, Filter, Router and Joiner, for developing Informatica mappings.
  • Loaded data to the Staging Area.
  • Created Informatica mappings to load the data from staging to the dimension and fact tables.
  • Created new Shell Scripts for file renaming, moving files from one directory to another, etc. All these scripts were used in the Post-Session Command within the Informatica Workflow.
  • Used pmcmd to run Informatica workflows and automated the processes using UNIX Shell scripts (see the sketch following this list).
  • Interacted with the business clients to resolve their issues/concerns post production implementation.
  • Created and maintained Migration documentation and Process Flow for mappings and sessions.
  • Involved in migrating Power Center folders from Development to Production Repository using Repository Manager.
  • Troubleshot production defects and fixed Cognos reports.
  • Used Maestro to schedule UNIX shell scripts and Informatica jobs
  • Changed the load schedules after thorough testing of changes to job dependencies, reducing the daily load duration from 18 hours to 12 hours.
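
A minimal sketch of starting an Informatica workflow with pmcmd from a scheduler-invoked shell script, as referenced above; the service, domain, folder and workflow names are hypothetical placeholders:

    #!/bin/ksh
    # Sketch: start a workflow and propagate its exit status back to the
    # scheduler. All names below are illustrative.
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
          -u "$INFA_USER" -p "$INFA_PASSWORD" \
          -f FOLDER_SALES -wait wf_load_sales_daily
    rc=$?

    if [ $rc -ne 0 ]; then
        echo "Workflow wf_load_sales_daily failed with status $rc" >&2
    fi
    exit $rc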

Environment: Kalido 8.2/8.3, Oracle 9i, SSIS, Informatica 7.1.3, Maestro Scheduling, Mercury Quality Center, Cognos, PHP
