
Informatica Consultant Resume

CA

PROFESSIONAL SUMMARY:

  • Experienced professional with 12 years of expertise in requirements analysis, development, testing, and production support, with a comprehensive background in database and data warehousing technologies.
  • Expertise in Metadata Management, Data Quality, Master Data Management, Data Governance, Data Integration, Data Migration, and Data Analysis.
  • Proficient in real-time integrations using Informatica on-premises and cloud-based technologies.
  • Hands-on experience with Informatica cloud-based data movements using Data Services technologies.
  • Good knowledge of implementing integration solutions by consuming Informatica web services.
  • Configured connections to extract and load data from metadata systems, and set up encryption and compression values to encrypt and compress data during PowerCenter sessions.
  • Hands-on experience in scheduling workflows and sessions and in generating metadata reports on load time and system performance.
  • Hands-on experience in configuring and managing Informatica Integration Services to run on a grid and in assigning services to a single node or to different sets of nodes.
  • Good knowledge of configuring Informatica workflows for massive parallel processing across different Informatica domain repository services.
  • Good experience in installation, upgrades, hot fixes, configuration, and repository migration in enterprise data warehouse environments.
  • Worked on backing up repositories and restoring them in different locations by synchronizing the properties of the target environment.
  • Hands-on experience in configuring Informatica PowerCenter code pages and data movement in single-byte and multi-byte modes (ASCII and Unicode).
  • Worked on version-control labels during migration of Informatica workflows, resolved security issues, and managed the repository, including user connections and user locks on workflows and sessions.
  • Hands-on experience in moving logical data models to physical metadata models and in implementing data warehouses and data marts.
  • Experience in Erwin database programming for data warehouses (schemas); proficient in dimensional modeling: Star Schema, Snowflake, and hierarchy modeling.
  • Good understanding of configuring Informatica PowerExchange and PowerCenter Integration Services to extract and load relational, non-relational, and changed data on a batch, change, and real-time basis.
  • Extensive database experience using Oracle, Teradata, DB2, MS SQL Server, MS Access, MS Excel, Flat Files and XML
  • Good knowledge of Informatica Big Data Edition, HDFS file processing, and writing jobs using Hive.
  • Worked extensively with the Siebel Admin tool in creating the Informatica repository (.rep), Analytics repository (.rpd), and Web Catalog.
  • Hands-on experience performing administrative tasks on large databases and migrating OBIEE, Informatica, and DAC repositories between environments.
  • Experience in performance tuning bottlenecks in source databases, mappings, target databases, and Informatica workflows and sessions.
  • Effective interaction with team members from Business Engineering, Quality Assurance, users, and other teams involved in the system development life cycle.
  • Strong technical exposure and a good degree of competence in business systems such as Finance, Healthcare, Insurance, Telecom, TrueComp Callidus, Salesforce CRM, Manufacturing, Energy Solutions, Pharmaceutical, Food & Beverages, and e-commerce Retail.
  • Worked on a shift and 24/7 rotation basis to support and troubleshoot issues and tickets.
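
The repository backup and restore experience above can be illustrated with a small wrapper around Informatica's pmrep utility. This is a minimal sketch: the repository, domain, user, and path names are placeholders, and the script only assembles and prints the commands (a dry run), since pmrep itself is available only on an Informatica server.

```shell
#!/bin/sh
# Sketch of a PowerCenter repository backup wrapper (dry run).
# Repository/domain/user names below are placeholders, not real values.
REPO="REP_DEV"
DOMAIN="Domain_ETL"
REPUSER="admin"
BACKUP_DIR="/opt/infa/backups"
STAMP=$(date +%Y%m%d)

# Build the pmrep command sequence; -X reads the password from the
# named environment variable so it never appears on the command line.
CONNECT_CMD="pmrep connect -r $REPO -d $DOMAIN -n $REPUSER -X INFA_PASSWD"
# -o names the backup file; -f overwrites an existing backup of the same name.
BACKUP_CMD="pmrep backup -o $BACKUP_DIR/${REPO}_${STAMP}.rep -f"

echo "$CONNECT_CMD"
echo "$BACKUP_CMD"
```

On a real server the echoed commands would be executed in order, and the dated `.rep` file restored into another environment with `pmrep restore` after synchronizing that environment's properties.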

TECHNICAL SKILLS:

ETL Technologies: Informatica (PowerCenter, Data Quality, Cloud Real Time, Master Data Management, Metadata Manager, ILM, DAC & Web Services), DataStage, SAP BODS, OWB, ODI, MuleSoft ESB

Reporting: Business Objects, OBIEE, SAP BI, Tableau

Data Modeling: Logical and Physical data modeling, Star-Schema, Snowflake and Hierarchical Modeling, FACT and Dimension tables modeling using Erwin.

Databases: Oracle, SQL Server, Teradata, DB2, HANA

Operating Systems: UNIX, Sun Solaris, Linux, MS Windows, Secure Agents

Languages: PL/SQL, XML, UNIX Shell Script, Perl, Java, JavaScript

Tools & Utilities: AWS, WSDL, SOAP, TIDAL, Hive, ESP, Control-M, UC4 Dynamic Scheduler, Autosys, BTEQ, SharePoint, Visio, Toad, web services, SOAP-UI, Cloud Real Time utilities

Business Systems: Callidus, CCAR, SFDC-CRM, Oracle EBS (CRM, OMS, Procure to Pay, HRMS), SAP MM, Siebel CRM, Data Governance, Oracle ATG, SAP NGOM (Quote to Cash)

PROFESSIONAL EXPERIENCE:

Confidential, CA

Informatica Consultant

Responsibilities:

  • Designed the required data structures and dataflow diagrams for CORE applications with consideration of existing business structures.
  • Designed ETL dataflow points to analyze the impact on downstream systems of changes across multiple systems, and applied data lineage rules for multiple mappings and data transformations
  • Worked on optimizing and tuning the UCM views and SQLs to improve batch performance and data response time for users.
  • Provided XML output files to downstream applications by processing the Retail and Lease contracts from the HOST and Carlos applications.
  • Designed the dataflow process from Salesforce to UCM for a one-time scrub of phone numbers to send reminder SMS messages to customers.
  • Designed and developed Data mask mappings for processing customer masked test data into user environments.
  • Verified that implementation was done as expected, i.e., checked that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
  • Performed impact assessments in terms of schedule changes, dependency impact, and code changes for various change requests on existing Data Warehouse applications running in a production environment.
  • Provided quick production fixes and proactively involved in fixing production support issues.
  • Provided support during the system test, Product Integration Testing, UAT and raised RFC’s to migrate code.
  • Designed the framework for multiple ETL data flows into downstream applications including Master Data.
  • Participated in Business meetings and technical meetings for data process changes from different ETL’s and Reference data systems.
  • Worked with multiple teams for configuring the workflows with dependent workflows and applications.
  • Worked with production support team in processing the backup and restore operations of repositories
  • Worked with IT operations and release teams to configure the metadata repositories in multiple environments
  • Worked with technical teams and ETL job scheduling teams to take regular backups and sync ETL changes in metadata repositories

Environment: Informatica Power Center/TDM/Cloud 10.1.1, mainframes, salesforce, Oracle DB, MFT, Autosys

Confidential - Sanjose, CA

Data Governance Consultant

Responsibilities:

  • Architected the required infrastructure for IDQ/Metadata Manager installation with consideration of existing applications
  • Configured Informatica Data Quality/MetaData Manager to analyze the impact on downstream systems for the changes on multiple systems and applied Data Lineage rules for multiple mapping and data transformations
  • Created IDQ mappings/workflows for processing error records processed by multiple ETL process.
  • Designed IDQ workflows by including Human Tasks to edit the error records by SME’s and approvers.
  • Created separate resources for Data Lineage, Business Terms and MetaData Catalogs for the data sources
  • Created Business Glossaries as per business terms for each subject area and Multiple ETL applications
  • Configured RBL (rule-based linking) for custom models and enumerated links to create links for multiple files in each Business Glossary
  • Designed the metadata custom models for populating data lineage between SAP Bank Analyzer calc views and BOBJ field-level reports.
  • Developed B-ETL mappings for processing data into SAP HANA database as per business requirement.
  • Developed ETL mappings by utilizing the SAP HANA RFC function modules and IDOC procedures
  • Configured MM Repository for processing daily, weekly and monthly data from multiple source systems.
  • Designed the Data processing documents and timings of processing the data from Multiple ETL’s into Downstream applications
  • Designed the framework for multiple ETL data flows into downstream applications including Master Data.
  • Participated in Business meetings and technical meetings for data process changes from different ETL’s and Reference data systems.
  • Worked with SAP Security team for configuring the IDQ DL groups/users for editing and approving the error records and consumed the DL’s into IDQ workflows.
  • Worked with production support team in processing the backup and restore operations of metadata repositories
  • Worked with IT operations and release teams to configure the IDQ/metadata repositories in multiple environments
  • Worked with technical teams and ETL job scheduling teams to take regular backups and sync ETL changes in metadata repositories

Environment: Informatica Power Center/Data Quality/Metadata Manager 10.1.1, SAP HANA, Oracle DB, Teradata, UNIX, Control-M

Confidential - Fremont, CA

Metadata Manager Consultant

Responsibilities:

  • Configured Informatica MetaData Manager to analyze the impact on downstream systems for the changes on multiple systems and applied Data Lineage rules for multiple mapping and data transformations
  • Created separate resources for Data Lineage, Business Terms and MetaData Catalogs for the data sources
  • Created Business Glossaries as per business terms for each subject area and Multiple System of Records
  • Configured enumerated links to create links for multiple files in each Business Glossary
  • Designed ETL POC mappings for processing Credit and Non-Credit SOR’s as per business requirement.
  • Designed Relationship Profitability Matrix table to process data from Demand Deposit and Term Deposits.
  • Developed ETL mappings for processing daily, weekly and monthly data from multiple source systems.
  • Designed the Data processing documents and timings for processing the data from Multiple SOR’s into Public and Private Environments of consumption layer.
  • Designed the ETL specification documents to gather workflows information from offshore and shared with Integration and production maintenance team.
  • Designed the ETL runs performance and security tracking sheet in different phases of the project and shared with Production team.
  • Utilized the IT framework to populate data in matrix format for DQS and Horizon for generating officer IDs for Master Data.
  • Interacted with business stakeholders to apply Data Quality rules and cleanse the data per Data Governance principles.
  • Participated in Business meetings and JAD calls for data process changes from different SOR’s and Reference data systems.
  • Extensively used Informatica Metadata manager to link the Business Terms with related catalogs
  • Worked exclusively with different technical teams and business users to gather information for processing SOR's
  • Worked with production support team in resolving processing difficulties with backup and restore operations into multiple environments.
  • Validated the counts and Business As of Dates for multiple SOR’s after the data is restored in Consumption layer.
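
The count validation described above (comparing source-of-record counts against the restored consumption layer) can be sketched as a small shell helper. The table names and counts below are illustrative, assuming the source and restored counts have already been extracted by SQL queries upstream.

```shell
#!/bin/sh
# Sketch of a post-restore count validation check.
# In practice the counts would come from SQL queries against the SOR
# and the consumption layer; here they are stubbed for illustration.

validate_counts() {
    # $1 = table name, $2 = source count, $3 = restored count
    if [ "$2" -eq "$3" ]; then
        echo "PASS: $1 source=$2 restored=$3"
        return 0
    else
        echo "FAIL: $1 source=$2 restored=$3"
        return 1
    fi
}

# Illustrative calls with placeholder table names and counts.
validate_counts CUST_DEPOSITS 10482 10482
validate_counts TERM_DEPOSITS 5210 5209 || echo "mismatch logged for review"
```

A real run would loop over every SOR table for the Business As-of Date and collect the FAIL lines into a report for the production support team.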

Environment: Informatica Power Center, MetaData Manager 9.6.1, Oracle DB, SQL server, DB2, CCAR, UNIX, Autosys

Confidential - Sunnyvale, CA

Data Integration Architect

Responsibilities:

  • Integrated Multi-dimensional business data into Callidus cloud application
  • Configured Drop-box locations to process the files into cloud environments and back to on-Premises
  • Configured the environment file with new variables such as database connection strings, and included multiple paths such as INPUTBASE, OUTPUTBASE, ARCBASE, SQLDIR, LOGDIR, etc.
  • Configured Informatica services to read data from multiple source systems like SAP, Relational Database systems.
  • Configured Informatica Integration services to process cloud return files and created required connection strings with Informatica Repository
  • Wrote shell-script functions such as GetSQLValue that use environment-file variable paths, and called these functions from multiple scripts to connect to the database and execute/read SQL files
  • Designed the Data Flow Templates for processing Positions, participant, territory, HR, goal and compensation plan information
  • Designed the ETL execution Plan to extract required data from multiple sources systems like EDW into Landing and staging areas.
  • Designed ETL mappings to read sequencing and data flags to process manually generated files from the business.
  • Designed the output Data format templates to provide the ETL processed data for on-demand applications
  • Utilized Oracle database import techniques to get data from drop-box locations and keep the applications in sync
  • Designed validation tables and control tables to keep track of processed files based on file types and file names
  • Designed custom mappings as per business requirements and utilized to standardize the source systems data in power center
  • Created backups of Informatica repository on schedule basis and migrated the required workflows in different phases.
  • Extensively used Informatica to extract data from source system and load into OD-Operations tables after processing batch jobs
  • Exclusively worked with different Technical teams and Business users and gathered Information in Integration and development phases.
  • Supported production team in resolving processing difficulties with migrations, parameter files, Database tables and shared objects.
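
The environment-file and GetSQLValue pattern above can be sketched roughly as follows. The variable names (INPUTBASE, SQLDIR, etc.) come from the text, while the connect string and SQL file name are placeholders; the sqlplus call is echoed rather than executed so the sketch stays self-contained.

```shell
#!/bin/sh
# Sketch of a shared environment file plus a GetSQLValue-style helper.
# Paths and the connect string are placeholders.

# Create a sample environment file (normally maintained separately).
cat > /tmp/etl_env.sh <<'EOF'
INPUTBASE=/data/etl/input
OUTPUTBASE=/data/etl/output
ARCBASE=/data/etl/archive
SQLDIR=/data/etl/sql
LOGDIR=/data/etl/log
DB_CONN="etl_user@ORCL"
EOF

# Source it so every script shares the same paths and connect string.
. /tmp/etl_env.sh

# GetSQLValue-style helper: on a real server it would run the SQL file
# via sqlplus and return its single value; here it prints the command.
GetSQLValue() {
    sql_file="$SQLDIR/$1"
    echo "sqlplus -s $DB_CONN @$sql_file"
}

GetSQLValue get_batch_id.sql
```

Because every script sources the same file, changing a path or connect string in one place propagates to all the jobs that call these helpers.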

Environment: Informatica PC 9.1.1, TC SaaS, Callidus Cloud, EDW, Oracle, Business Objects, UNIX shell scripting

Confidential, Sunnyvale, CA

Tech Lead/Architect

Responsibilities:

  • Configured the Informatica data services like Data quality, Master Data management (MDM) and web services
  • Configured the Model Repository, Data Integration Service (DIS), and the required security authentications.
  • Created different database schemas to process data in LAND-MDM, STG-MDM, BO-MDM, and ORS-MDM, and utilized Informatica metadata while processing data in repositories.
  • Designed the ETL flow from multiple source systems and defined the MDM Data Trust values for multiple sources
  • Designed the Match rules, Merge rules to consolidate data from multiple data sources on MDM Base objects
  • Designed the Informatica MDM process models for multiple Modules like product, Customer and Contracts
  • Designed the order fulfillment and SaaS templates for processing the active contracts from Autonomy databases
  • Designed the ETL execution plan to extract required data from Autonomy source systems (Epicore, Oracle, Softrax, NetSuite, and NIBS) into multiple levels of staging areas.
  • Configured the Data Profiling, Data Match, Organizing the record Groups, Data Standardization, Association rules and Address validations Mapplets of IDQ.
  • Applied the Data Quality rules to parse xml source data into required format for downstream systems.
  • Configured custom IDQ mapplets as per business requirements and utilized to standardize the source systems data in power center
  • The ETL plan included consolidating, profiling, cleansing, enriching, de-duplicating, and integrating the data per target systems.
  • Configured ETL mappings to prepare X-Ref matching data between different Autonomy source systems to match the requirements of Order Management Transactions methods.
  • Configured HiveQL MapReduce jobs for processing data from third-party applications using web services
  • Integrated SAP NGOM generated real time orders with SEMS application to generate Entitlement ID by using TIBCO web services.
  • Exclusively used TIBCO web services, to process entitlement id back to SAP after fulfilling the order with other applications.
  • Executed DDLs for the required staging tables and X-Ref tables to stage the data from multiple source systems, and provided the DDL scripts to the database team.
  • Processed large sets of unstructured data files into VERTICA database by standardizing as per reports requirements.
  • Wrote shell scripts to trigger Informatica workflows on Informatica servers for on-the-fly and on-demand job runs
  • Configured the TIDAL scripts to execute the ETL jobs in required order.
  • Coordinated with the operations team to schedule the workflows and processed the flat files date-wise.
  • Created backups of Informatica repository on schedule basis and migrated the required workflows in different phases.
  • Extensively used Informatica to extract data from HPP and load into SEMS IDM tables after processing batch jobs
  • Exclusively worked with different Technical teams and Business users and gathered Information in Integration and development phases.
  • Supported production team in resolving processing difficulties with migrations, parameter files, Database tables and shared objects.
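
The on-demand trigger scripts mentioned above typically wrap Informatica's pmcmd utility. This is a minimal sketch: the service, domain, folder, and workflow names are placeholders, credentials are read from environment variables, and the command is echoed (dry run) rather than executed.

```shell
#!/bin/sh
# Sketch of an on-demand Informatica workflow trigger via pmcmd.
# All service/domain/folder/workflow names are placeholders.
INT_SVC="IS_ETL"
DOMAIN="Domain_ETL"
FOLDER="MDM_LOAD"

run_workflow() {
    wf="$1"
    # -uv/-pv read the user and password from environment variables;
    # -wait blocks until the workflow finishes so the caller gets its status.
    CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -uv INFA_USER -pv INFA_PASSWD -f $FOLDER -wait $wf"
    echo "$CMD"   # dry run; on a real server this would be executed instead
}

run_workflow wf_load_customer_xref
```

Wrapping the call in a function lets one script trigger several dependent workflows in order and stop on the first non-zero exit status.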

Environment: Informatica PC 9.5.1/9.6.1, IDQ, MDM, Oracle, SQL Server DB, VERTICA 7, TIBCO 5.7, SFDC,JAVA, WSDL, SOAP, TIDAL scheduler

Confidential, Santa Clara, CA

ETL Architect Consultant

Responsibilities:

  • Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
  • Created a Logical Data Flow Model from the source system study according to business requirements in MS Visio.
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
  • Created UML Diagrams including Use Cases Diagrams, Activity Diagrams/State Chart Diagrams, Sequence Diagrams, Collaboration Diagrams and Deployment Diagrams, Data Flow Diagrams (DFDs), ER Diagrams and Web Page Mock-Ups using Smartdraw, MS Visio & Rational Rose.
  • Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Worked on optimizing and tuning the Teradata views and SQLs to improve batch performance and data response time for users.
  • Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
  • Prepared low level technical design document and participated in build/review of the BTEQ Scripts, FastExports, Multiloads and Fast Load scripts, Reviewed Unit Test Plans & System Test cases.
  • Provided support during the system test, Product Integration Testing and UAT.
  • Verified that implementation was done as expected, i.e., checked that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
  • Performed impact assessments in terms of schedule changes, dependency impact, and code changes for various change requests on existing Data Warehouse applications running in a production environment.
  • Provided quick production fixes and proactively involved in fixing production support issues.
  • Processed and Integrated Product description Data files by using Mulesoft Java messaging services and API calls.
  • Designed Mulesoft mappings with consideration of logging methods to observe the data flows
  • Provided requirement specifications and guided the ETL team in developing the ETL jobs with the Informatica ETL tool.
  • Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.
  • Generated archive-based reports from multiple unused database systems using Informatica ILM Retention
  • Extensively used Informatica to extract and integrate data with soft-launch tables and load it into the target Oracle ATG
  • Developed data-masking workflows for credit card transactional data using the Test Data Management workbench of Informatica Information Lifecycle Management (ILM)
  • Analyzed business requirements, created designs, and wrote technical specifications to design/redesign solutions.
  • Involved in complete software development life-cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.
  • Developed technical design documents (HLD and LLD) based on the functional requirements.
  • Coordinated with configuration management and production support teams on code deployments.
  • Worked on a shift and 24/7 rotation basis to support and troubleshoot issues and tickets.
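
BTEQ scripts of the kind built and reviewed above usually follow a fixed skeleton: logon, DML, an error check, and an exit code for the scheduler. The sketch below generates such a script; the tdpid, user, and table names are placeholders, and the script is only written to disk, not submitted to bteq.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script skeleton of the kind built/reviewed above.
# The tdpid, user, and table names are placeholders.
BTEQ_SCRIPT=/tmp/load_summary.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user
.SET ERROROUT STDOUT

/* Placeholder tables: refresh the monthly summary from daily transactions */
DELETE FROM edw.monthly_summary;

INSERT INTO edw.monthly_summary
SELECT acct_id, SUM(txn_amt), CURRENT_DATE
FROM edw.daily_txn
GROUP BY acct_id;

/* Return a non-zero exit code to the scheduler on any SQL failure */
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF

echo "wrote $BTEQ_SCRIPT"
```

A scheduler job (ESP, Autosys, etc.) would then run `bteq < /tmp/load_summary.bteq` and treat exit code 8 as a load failure.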

Environment: Informatica PC 9.1.1, Cloud Apps, ILM, Oracle, MuleSoft Anypoint Studio, Teradata 12, SQL Assistant 12.0, DB2 (Price & Promotions), Blue Martini, Oracle ATG, Salesforce, Hadoop 0.20, MapReduce, Talend, Hive 0.10, OneLogin, ESP scheduler, StarTeam 2008, AnthillPro, Java, Akamai, Mingle (story board)

Confidential, Redwood City, CA

ETL Lead/Architect

Responsibilities:

  • Was responsible for analyzing the source systems Demantra DM, EBS 11i, Vision, Demantra PTP, and ODS, and designed the ETL process flow with control tables. Also interacted with the data integration team to design the ETL database schema for OBIEE reporting and worked with the offshore team to develop the Informatica jobs.

Environment: Informatica Power Center 8.6.1, Oracle Data Integrator 10.1.3.4, Oracle R12 (General Ledger GL, Account Payables AP, Account Receivables AR, Purchasing PO, Order Management OM), OBIEE 11.1.1.3, OBIA/BI Apps 7.9.6/7.9.5, Oracle

Confidential, Irving, TX

ETL Lead Engineer

Responsibilities:

  • Was responsible for developing the Informatica jobs that extract PEGA application XML data into COAMGR staging tables and load it into the existing COAMGR Data Mart, and developed Informatica mappings and workflows to extract operational data from different source databases such as DB2 and Oracle.

Environment: Informatica Power Center 8.6.1, INFORMATICA Power Exchange, DB2 for AS400, VSAM, Teradata, PERL, Oracle 10g, SQL, XML, Erwin

Confidential

ETL Consultant, San Ramon, CA

Responsibilities:

  • ETO (Energy Trading & Optimization): this project gathered information from different source systems such as CMRI (CAISO web services), OASIS (CAISO web services), FBS, MDS, RTSM, PNODE, and PGE-PRICE; based on this information, the data was moved into the related master and transaction tables of the warehouse.

Environment: Informatica Power Center 8.1.1, Oracle 10g, SQL, PL/SQL, XML, XSD, UNIX, Erwin, UC4 Dynamic scheduler

Confidential

ETL Consultant, South San Francisco, CA

Responsibilities:

  • Was responsible for developing the Informatica jobs moving data from different stages and different database systems into the existing Oracle warehouse system, and worked with the Data Intelligence team to develop data flow templates and Informatica mappings from the CDM data mart to TrueComp reporting.

Environment: Informatica Power Center 8.1.1, Oracle 10g, PL/SQL, SQL, Business Objects R3, OLAP Services.

Confidential

Informatica Consultant, Santa Clara, CA

Responsibilities:

  • BT (Business Transformation) involved migrating business data from SAP 4.7 and non-SAP systems (Oracle Apps, PeopleSoft, Matron, Navision, Trend, Priority, Excel, MS Access, and Lotus Notes) to SAP ECC. Informatica PowerCenter was used to perform the ETL part of BT.

Environment: Informatica Power Center, Informatica Meta Data Reports, OWB, Teradata, SAP 4.7, SAP ECC 6.0, Oracle APPS, SCM, PeopleSoft - HRMS, HP Quality Center, UNIX Shell Scripting, Sun Solaris server.
