
Sr. Informatica Developer Resume

Nevada

SUMMARY

  • Overall 16 years of IT experience as an ETL architect, technical lead, data architect, and senior business analyst, working on data migration, data analysis, data profiling, data integration, and data warehousing.
  • Proposed and implemented business process improvement solutions across the enterprise.
  • Responsible for gathering requirements from stakeholders.
  • Responsible for creating requirement artifacts such as the Business Requirements Document (BRD) and Functional Requirement Document (FRD).
  • Performed feasibility analysis to evaluate multiple options and determine the best-fit solution.
  • Performed GAP analysis to evaluate solution scope through an as-is to to-be comparative study.
  • Proficient in integrating various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, and Teradata, along with web services, VSAM files, XML/XSD files, PL/SQL, and flat files, into the staging area, ODS, data warehouse, and data marts.
  • Strong ETL experience using Informatica PowerCenter 10.1/9.5/9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Strong experience in the analysis, design, development, testing, and implementation of business intelligence solutions using data warehouse/data mart design, ETL, OLAP, BI, and client/server applications.
  • Experience working with the data quality tools Informatica IDQ 9.1 and Informatica MDM 9.1.
  • Worked with Informatica PowerExchange and Informatica Cloud to integrate and load data into Oracle databases.
  • Strong experience working with the ETL tools Informatica and OWB.
  • Worked on scheduling tools such as Autosys and DAC (Data Warehouse Administration Console).
  • Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and in physical and logical data modeling using ERwin and ER/Studio.
  • Strong experience with reporting and analytical tools such as Business Objects, Siebel Analytics/OBIEE, and Cognos.
  • Utilized AUTOTRACE and EXPLAIN PLAN for monitoring SQL query performance.
  • Extensive knowledge of Microsoft BI/Azure BI solutions such as Azure Data Factory, Power BI, Azure Databricks, and Azure Analysis Services.
  • Good understanding of Azure big data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, and Azure Data Factory; created a POC moving data from flat files and SQL Server using U-SQL jobs.
  • Knowledge of SnapLogic, using Snap integration connectors to move data from on-premise systems to the cloud.
  • Good communication skills for working with all levels of management.
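
To make the data-profiling and AWK scripting claims above concrete, here is a minimal sketch of pre-load column profiling on a delimited flat file; the file name, delimiter, and columns are invented for the example:

```shell
# Hypothetical profiling sketch: count empty values per column of a
# pipe-delimited flat file before it is loaded to staging.
# sample.dat and its columns are invented for this example.
cat > sample.dat <<'EOF'
id|name|city
1|Alice|Reno
2||Las Vegas
3|Carol|
EOF

awk -F'|' '
NR == 1 { cols = NF; for (i = 1; i <= NF; i++) hdr[i] = $i; next }
        { for (i = 1; i <= cols; i++) if ($i == "") nulls[i]++ }
END     { for (i = 1; i <= cols; i++) printf "%s: %d empty\n", hdr[i], nulls[i] + 0 }
' sample.dat > profile.txt

cat profile.txt
```

For the sample file above this reports 0 empty ids, 1 empty name, and 1 empty city; the same pattern scales to wide staging files.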

TECHNICAL SKILLS

RDBMS/Data Access: Oracle, Teradata, MS SQL Server 2000, MS Access

Data Warehouse Tools: Informatica 5/6.1/6.2/8.0.1/8.6/9.1/9.5/10, Informatica MDM, Informatica IDQ, Informatica Cloud, Informatica PowerExchange, OWB (Oracle Warehouse Builder), Azure Data Lake, Power BI

Programming Languages: SQL, PL/SQL, BTEQ, Unix Shell Scripts, Batch Scripts, AWK Scripts

Reporting Tools: Cognos, Business Objects, Crystal Reports, SAP BW, Spotfire, MSBI, MSTR, Power BI

Data Modeling: Erwin, Power Designer, Tibco, Toad

PROFESSIONAL EXPERIENCE

Confidential, Nevada

Sr. Informatica Developer

Responsibilities:

  • Performed initial analysis of database-related business requirements for the Work and Asset Management for Telecom (WAMTEL) application, Phase 2.
  • Proposed and implemented business process improvement solutions across the enterprise
  • Responsible for gathering requirements from stakeholders.
  • Responsible for creating requirement artifacts such as the Business Requirements Document (BRD) and Functional Requirement Document (FRD).
  • Identified areas for data quality improvement, helped resolve data quality problems, and recommended measures to prevent recurrence; participated in the data governance team.
  • Provided data management and data quality analysis and drove data quality metrics.
  • Determine business impact level for data quality issues.
  • Work to correct data quality errors.
  • Implementing ETL (Extract, Transform and Load) solutions using Informatica tool and Solution designing of data migration and ETL requirements.
  • Created the mappings, workflows, and loading strategy for the different source systems (PPM, P6, OMS, Maximo, Ventxy, ERP).
  • Loaded data from the SQL Server, PS, and Oracle sources into the Oracle data warehouse through Informatica.
  • Determined performance bottlenecks in the mappings and implemented solutions to improve Informatica performance.
  • Determine root cause for data quality errors and make recommendations for long-term solutions.
  • Researched and determined the scope and complexity of issues in order to identify the steps to fix them.
  • Develop process improvements to enhance overall data quality.
  • Performed feasibility analysis to evaluate multiple options and determine the best-fit solution.
  • Performed GAP analysis to evaluate solution scope through an as-is to to-be comparative study.
  • Designed WAMTEL application Phase 1 enhancements based on user requirements.
  • Define the XML and XSD target files required by the Maximo system.
  • Create the data objects and load data into the Maximo system.
  • Worked with the blueprint of CU.
  • Designed and developed detailed ETL specifications based on the business requirements.
  • Translate business needs into the technical solutions by working with business and IT stakeholders.
  • Developed MSBI and BIRT reports over large amounts of data; wrote the complex queries for the BIRT reports built in the OLTP system (Maximo).

Confidential, Baltimore

ETL Developer

Responsibilities:

  • Led design, development, and implementation of ETL projects end to end.
  • Developed solutions in a highly demanding environment and provided hands-on guidance to team members.
  • Worked on data profiling, including assessing whether metadata accurately described the actual values in the source database.
  • Worked on cross-platform integration initiatives and performed data mapping and data profiling between distinct systems.
  • Created test plans, test cases, and test scripts for all testing events, such as System Integration Testing (SIT) and User Acceptance Testing (UAT).
  • Performed data profiling, data quality checks, and data validation.
  • Worked as a liaison between stakeholders and the technology team to interpret business needs.
  • Performed enterprise analysis to determine operational objectives.
  • The project dealt with migrating data from a legacy system to an Oracle database; data from flat files was loaded into the Oracle inbound staging and outbound areas using Informatica.
  • Understood the business requirements, identified gaps in different processes, and implemented process improvement initiatives across the business improvement model.
  • Developed process mappings of current and future business processes.
  • Translated business user concepts and ideas into comprehensive business requirements and design documents.
  • Drafted and maintained business requirements and aligned them with functional and technical requirements.
  • Worked with Informatica Cloud using transformations such as Aggregator, Lookup, Joiner, Filter, Router, Update Strategy, Union, Normalizer, and SQL in ETL development.
  • Worked with pushdown optimization to improve performance.
  • Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
  • Worked with Event Wait and Event Raise tasks, mapplets, and reusable transformations.
  • Worked with parameter files for ease of managing connections across the Dev/QA/Prod environments.
  • Implemented an Informatica-based ETL solution fulfilling stringent performance requirements.
  • Conducted impact assessments and determined the size of effort based on requirements.
  • Worked with Informatica Cloud to synchronize CDO data into MS SQL for BI/analytics.
  • Developed stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL.
  • Created transaction data objects to produce the source-to-target mapping and data conversion design.
  • Worked on all design/development artifacts and gave detailed visibility to the client lead on data delivery for the blueprint program.
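
The parameter-file bullet above can be sketched as a small script that emits one PowerCenter parameter file per environment; the folder, workflow, and connection names below are invented placeholders, not taken from the project:

```shell
# Hypothetical sketch: generate an Informatica PowerCenter parameter file
# for the requested environment so the same workflow points at the right
# connection in Dev/QA/Prod. All names here are illustrative.
ENV=${1:-DEV}
case "$ENV" in
  DEV)  DB_CONN=ORA_DEV  ;;
  QA)   DB_CONN=ORA_QA   ;;
  PROD) DB_CONN=ORA_PROD ;;
  *)    echo "unknown environment: $ENV" >&2; exit 1 ;;
esac

PARAM_FILE="wf_load_sales_${ENV}.par"
cat > "$PARAM_FILE" <<EOF
[Folder_Sales.WF:wf_load_sales]
\$DBConnection_Target=$DB_CONN
\$\$LoadDate=$(date +%Y-%m-%d)
EOF
echo "wrote $PARAM_FILE"
```

Pointing each session's parameter-file path at the generated file keeps the mappings themselves identical across environments.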

Confidential

ETL Architect

Responsibilities:

  • Led design, development, and implementation of ETL projects end to end.
  • Developed Informatica workflows/worklets/sessions associated with the mappings across various source systems such as Maximo, FRM, FMIS, CMS, MS Access, and Customer Billing.
  • Worked with cleansing, parsing, standardization, and validation.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management to migrate developed mappings across the Dev, QA, and PROD environments.
  • Developed system modification specifications; mapped data; established interfaces; developed and modified functions, programs, routines, and stored procedures to export, transform, and load data; met performance parameters; and resolved and escalated integration issues.
  • Validated data integration by developing and executing test plans and scenarios covering data design, tool design, and data extract/transform.

Confidential

Informatica Lead

Responsibilities:

  • Worked in the integration of 16 registries.
  • Worked with the functional team to understand clinadmin and design its integration with the iSearch data model.
  • Responsible for ETL technical design discussions and prepared the ETL high-level technical design document.
  • Created the architecture of the iSearch Integrated Data Repository by bringing the normalized data into a single format.
  • Defined the ETL logic, data cleansing, and harmonization rules for the 16 different sources.
  • Led a team of ETL (PL/SQL) and reporting developers and coordinated onsite/offshore activities.
  • Created a batch script that runs the stored procedure from the Windows scheduler.
  • Created data mappings between the source and target systems.
  • Worked intensively in the new reporting tool Spotfire to deliver the functionality of NGIS, NGSS, and NGCA.
  • Defined the architecture of the reports for NGIS, NGSS, and NGCA.
  • Documented the ETL process for each flow.
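
The scheduler-driven stored-procedure bullet above can be sketched as follows, shown here as a POSIX shell analogue (suitable for cron) of the original Windows batch script; the procedure name, credentials, and TNS alias are invented placeholders:

```shell
# Hedged sketch: run a PL/SQL stored procedure from a scheduled job.
# load_isearch_staging, DB_USER/DB_PASS/DB_TNS are invented placeholders.
PROC_CALL="BEGIN load_isearch_staging; END;"
LOG=run_proc.log

if command -v sqlplus >/dev/null 2>&1; then
    # Real run: execute the PL/SQL block and fail the job on SQL errors,
    # so the scheduler can alert on a non-zero exit code.
    sqlplus -s "${DB_USER}/${DB_PASS}@${DB_TNS}" > "$LOG" 2>&1 <<EOF
WHENEVER SQLERROR EXIT FAILURE
$PROC_CALL
/
EOF
else
    # sqlplus is not installed here; record what would have been executed.
    echo "DRY RUN: $PROC_CALL" > "$LOG"
fi
cat "$LOG"
```

The `WHENEVER SQLERROR EXIT FAILURE` directive is what turns an ORA- error inside the procedure into a non-zero exit code the scheduler can see.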

Confidential

Techno Functional Lead

Responsibilities:

  • Led a team of ETL and reporting developers and coordinated onsite/offshore activities.
  • Performed initial analysis of the existing system and developed a new system to enhance Fate CN/JP into the DWH.
  • Performed performance tuning of mappings and Oracle queries.
  • SME for Informatica and DAC on the USL project.
  • Created a generic UNIX script that runs Informatica workflows.
  • Worked on the Data net (client-specific) ETL tool within the stipulated time and delivered the USL (four-way recon module) project.
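
The generic workflow-runner script mentioned above can be sketched with Informatica's `pmcmd startworkflow` command; the domain, integration service, and folder/workflow names are invented, and the command is only echoed when `pmcmd` is not installed:

```shell
# Hypothetical sketch of a generic Informatica workflow runner.
# IS_Dev, Domain_Dev, and the folder/workflow names are placeholders.
run_workflow() {
    folder=$1
    workflow=$2
    cmd="pmcmd startworkflow -sv IS_Dev -d Domain_Dev -f $folder -wait $workflow"
    if command -v pmcmd >/dev/null 2>&1; then
        # Real run: -wait blocks until the workflow finishes, so the
        # script's exit code reflects the workflow's success or failure.
        $cmd
    else
        # pmcmd unavailable here: show what would have been executed.
        echo "DRY RUN: $cmd"
    fi
}

run_workflow Folder_USL wf_four_way_recon | tee run_workflow.log
```

Passing the folder and workflow as arguments is what makes the script "generic": one wrapper serves every workflow the scheduler needs to launch.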

Confidential

Tech Lead

Responsibilities:

  • Worked extensively on forward and reverse engineering processes; created DDL scripts for implementing data modeling changes.
  • Involved in data profiling and analysis.
  • Implemented standard naming conventions for the fact and dimension entities and attributes of the logical and physical data models.
  • Performed data gap analysis on the current data warehouse.
  • Created data mappings between the source and target systems.
  • Worked closely with the source systems (ECOMM, Infinity, BASIS, Cash vault, Cost Center) to integrate them into the BIDW.
  • Worked with the functional team to map the source system data.
  • Involved in OBIEE reporting, preparing the repository.

Confidential

Consultant

Responsibilities:

  • Interacted with business analysts to understand the business requirements.
  • Designed the ETL mappings with the data flow from various sources to targets and implemented SCDs.
  • Developed test plans, test cases, and test scripts with a complete understanding of each domain's functionality.
  • Developed transformation logic and designed various complex mappings and mapplets using the Designer.
  • Created the mappings implementing the suspend-handling and error-handling logic.
  • Involved in the successful development, testing, deployment, and maintenance of edw3&6.
  • Created and executed UNIX scripts as pre/post-session commands to FTP files from the landing area to the Informatica work area and to archive the files after loading the data.
  • Created and executed PL/SQL packages to handle delta processing, exchange partitions, and metadata information.
  • Worked in Business Objects to build the universe.
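
The post-session archiving step described above can be sketched as follows; the directory and file names are invented for the example (the FTP pull from the landing area would be a separate pre-session step):

```shell
# Hedged sketch of a post-session command: move processed flat files from
# the Informatica work area to a dated archive directory after loading.
# SrcFiles, Archive, and customers.dat are invented for this example.
WORK_DIR=./SrcFiles
ARCHIVE_DIR=./Archive/$(date +%Y%m%d)

mkdir -p "$WORK_DIR" "$ARCHIVE_DIR"
touch "$WORK_DIR/customers.dat"        # stand-in for a file already loaded

for f in "$WORK_DIR"/*.dat; do
    [ -f "$f" ] || continue            # skip when no .dat files matched
    mv "$f" "$ARCHIVE_DIR/"
done

ls "$ARCHIVE_DIR"
```

Dating the archive directory keeps each day's source files separate, which makes reloads and audits of a specific run straightforward.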

Confidential

IT Analyst

Responsibilities:

  • Worked on the BW architecture, creating the different dimensions and facts.
  • Worked closely with the source systems (Orbit, SAP LAN, and Bancs) to integrate them into the enterprise reporting system.
  • Worked with the functional team to map the source system data.
  • Developed transformation logic and designed various complex mappings and mapplets using the Designer.
  • Used mapplets, parameters, and variables to apply object-orientation techniques and facilitate code reusability.
  • Developed and executed UNIX scripts as pre/post-session commands to schedule loads.
  • Created the universe and business views in Business Objects and Crystal Reports, respectively.
  • Developed test plans, test cases, and test scripts with a complete understanding of each domain's functionality.
  • Provided the architecture solution to users for the MIS and BW reports.
