Informatica Lead Consultant Resume

Toyota, CA

PROFESSIONAL SUMMARY:

  • Around 10 years of experience in technical consulting, with expertise in Data Warehousing projects using Big Data, Informatica, Business Objects, Oracle, Teradata and OBIEE. Demonstrated experience in all aspects of the Data Warehouse life cycle across different business domains, including Dimensional Modeling, Extraction, Transformation and Loading (ETL/ELT), query optimization, ETL/BI tool deployment and administration, Decision Support (DSS) and Reporting.
  • Most recently implemented a Data Lake on the Cloudera Hadoop Big Data platform with Informatica 10.1 Big Data Edition (BDE).
  • Summary of Data Warehousing and Business Intelligence experience:
  • 15+ years of experience implementing various Data Warehousing solutions on Oracle / Teradata
  • 2+ years of experience with Hadoop Big Data systems using the Cloudera platform
  • 6½ years of experience as Technical Lead for various Data Warehousing implementations
  • 10+ years of experience with Informatica Data Integration tools
  • 6½ years of experience with BI reporting tools such as Business Objects / OBIEE
  • ETL & ELT architecture and development expertise using Hadoop / HIVE technologies
  • Scripting expertise in Unix Korn Shell, Python, Java & Perl
  • Development & Implementation experience on Informatica 10.1 Big Data Edition & Data Quality product suites
  • Assess, Design, Upgrade and Implement Informatica 10.1/9.x/8.5/7.x/6.x/5.x/4.7 Power Center Architecture for various EDW ETL platforms based on Teradata and Oracle
  • Demonstrated strong technical experience integrating the Informatica ETL platform with multi-terabyte databases using Hadoop, HIVE, Oracle and Teradata
  • Developed and provided best practices, guidelines & standards for the Informatica ETL platform
  • Expertise in implementing advanced transformations such as Informatica Web Service Consumer, Salesforce Application Source Qualifier, XML & Unstructured Data transformations
  • Strong expertise in Informatica ETL process Performance Tuning and optimization
  • Strong implementation experience with the Informatica 8.x full Pushdown Optimization feature
  • Solid understanding of Dimensional Modeling; provided multi-dimensional data models
  • Implemented the Informatica 8 High Availability and Server Grid options to scale as larger volumes of data come in and more teams share the ETL infrastructure
  • Expertise in Administration and Configuration of Informatica 10.x/9.0/8.x/7.x/6.x ETL platforms
  • Good experience making use of Oracle / Teradata Data Warehousing-specific features

TECHNICAL SKILLS:

Data Warehouse & BI: Data Warehouse Architecture, Design, Development and Deployment

Big Data: Cloudera Hadoop Enterprise 5.1 (HIVE, Spark, Python, NoSQL)

Databases: Oracle 7.3/8i/9i/10g, Teradata V2R6, Hyperion Essbase, Oracle Express

DW/BI Tools: Informatica Big Data Edition & Data Quality 10.1, Informatica PowerCenter 9.5/8.1.1, Informatica Complex Data Exchange, Teradata load utilities (BTEQ, MLOAD & FLOAD), Business Objects 5i & XIR2

Analytics: OBIEE 10.x / Siebel 7.x Analytics, Oracle BI Apps (Pre-configured Informatica ETL for ERP & CRM systems), Oracle Financial Analyzer (OFA), Hyperion Essbase

PROFESSIONAL EXPERIENCE:

Confidential, Toyota, CA

Informatica Lead Consultant

Responsibilities:

  • Data Ingestion into Hadoop Platform using HDFS / HIVE / Spark.
  • Apply data quality & validation rules to all incoming data in Hadoop using the Informatica Data Quality tool, Unix Korn Shell and Python scripting
  • Develop a dimension data model in HIVE which can act as a source feed for Hyperion EPM.
  • Extract Hadoop/HIVE data into the Oracle reporting database using data services.
  • Using Python and Unix shell scripting, parse data file formats such as XML, JSON and flat files (see the sketch after this list).
  • Parse data services JSON file formats using the Informatica Unstructured Data transformation.
  • Implement automated Informatica ETL solution which can trigger downstream data feed for Hyperion systems.
  • Develop Shell scripts which supports ETL automation & data load activities.
  • Develop Informatica mappings to integrate dimensionality across the fact data
  • Customize the OBIEE RPD & Reports to back feed Hyperion forecasting data into OBIA Financial data warehouse.
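
A minimal sketch of the kind of Python parsing used for the incoming feeds; file names, tags and field names here are hypothetical placeholders, and the production parsing ran through a mix of Python, Korn shell and Informatica transformations:

    #!/usr/bin/env python
    # Minimal sketch: normalize incoming XML/JSON feeds into pipe-delimited
    # files staged for HDFS / HIVE loading. File names, tags and fields are
    # hypothetical placeholders.
    import json
    import xml.etree.ElementTree as ET

    def parse_json_feed(path):
        # Yield (account_id, period, amount) tuples from a JSON array of records.
        with open(path) as f:
            for record in json.load(f):
                yield record["account_id"], record["period"], record["amount"]

    def parse_xml_feed(path):
        # Yield the same tuples from an XML feed of <record> elements.
        for rec in ET.parse(path).getroot().iter("record"):
            yield (rec.findtext("account_id"), rec.findtext("period"),
                   rec.findtext("amount"))

    def write_staging(rows, out_path):
        # Write rows as pipe-delimited text, ready for 'hdfs dfs -put'.
        with open(out_path, "w") as out:
            for row in rows:
                out.write("|".join(str(c) for c in row) + "\n")

    if __name__ == "__main__":
        write_staging(parse_json_feed("feed.json"), "feed_json.psv")
        write_staging(parse_xml_feed("feed.xml"), "feed_xml.psv")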

Environment: Cloudera Hadoop Enterprise 5.1 (HDFS, HIVE, Spark, Python), Informatica 9.5/10.1, Shell Scripts, OBIA 7.9.6.3, OBIEE 11.1.6, Informatica DAC

Confidential, San Francisco, CA

Informatica Lead Consultant

Responsibilities:

  • Help develop the ETL/data mart model design by analyzing various source systems such as SFDC, the proposal system, the contract database, the meter database and other on-demand ERP cloud systems.
  • Extract and transform data into the Hadoop platform from Salesforce & Oracle ERP cloud.
  • Use Informatica Cloud ETL / Web Service Consumer for connecting Salesforce with Oracle ERP (an illustrative extract sketch follows this list).
  • Development and Deployment of OBIEE solution
  • Converting reporting data model into OBIEE RPD using Administrator tool
  • Building Dashboards, ad-hoc reports and detailed drill-down reports
  • Working with OBIEE Agents (iBots) for distributing various reports
  • Developed dashboards / reports with key usability features: Analytics views (Pivot Table, Chart, Tabular and View Selector), Alerts, Guided Navigation and dynamic / interactive dashboards with drill-down capabilities
  • Building OBIEE scorecards using new 11g performance management features
  • Worked with Oracle pre-built Financial & Analytics for the ERP reporting solution using Informatica DAC
  • Integration of OBIEE & Informatica real-time web services with SOA applications (Salesforce and MDM)
  • Consolidating a core set of asset, contract, Salesforce, ERP and proposal information using Informatica ETL
  • Training BI end-users and power users on OBIEE tool capabilities and documenting the same
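
The actual Salesforce extraction ran through Informatica Cloud and the Web Service Consumer transformation; the Python sketch below, using the simple-salesforce library, only illustrates the equivalent SOQL pull. Credentials, object and field names are placeholders:

    # Illustrative only: the production extract used Informatica Cloud /
    # Web Service Consumer. Credentials and field names are placeholders.
    from simple_salesforce import Salesforce

    sf = Salesforce(username="user@example.com", password="secret",
                    security_token="token")

    # Pull a small slice of account data, the kind of feed that was joined
    # with Oracle ERP cloud data downstream.
    result = sf.query_all("SELECT Id, Name, BillingCountry FROM Account")
    for rec in result["records"]:
        print(rec["Id"], rec["Name"], rec["BillingCountry"])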

Environment: Salesforce CRM, Informatica Cloud/Web Services, Cloudera Hadoop Enterprise (HDFS, HIVE, Python), Korn Shell / Python scripting, OBIEE 11.1, Oracle Financials.

Confidential, San Jose, CA

Informatica Lead Consultant

Responsibilities:

  • Responsible for implementing Brocade's master data management (customer data hub) solution using the Informatica ETL & Data Quality tool set
  • Enrich the customer profile by integrating Informatica with the Dun & Bradstreet (D&B) world marketing SOA toolkit using the Informatica Web Service Consumer transformation
  • Extract, transform & cleanse customer data from various sources such as Salesforce, Partner Point of Sale, OTM, BMI and Oracle ERP
  • Cleanse & validate customer address data using the Address Doctor SOA within Informatica
  • Implement various validation checks and Data Steward interventions using the Informatica Data Quality product
  • Profile various customer data sources to arrive at the authoritative source for data, and design the ETL logic required to clean up and load the data into the MDM customer data hub
  • Used Informatica data quality and transformation techniques to scan data for violations of business rules, missing values, incorrect values, duplicate records and other data-quality issues (a sketch of such a rule scan follows this list)
  • Integrated the clean customer master with the existing Data Warehouse and Oracle EBS systems
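
A minimal sketch of that kind of rule scan, assuming a flat CSV extract with hypothetical columns; the production rules ran inside Informatica Data Quality:

    # Sketch of a data-quality rule scan over a customer extract. Columns,
    # rules and file name are hypothetical; production rules ran in IDQ.
    import csv
    from collections import Counter

    REQUIRED = ("customer_id", "name", "country")

    def scan(path):
        missing = bad_country = 0
        seen = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if any(not (row.get(col) or "").strip() for col in REQUIRED):
                    missing += 1          # missing mandatory values
                if len(row.get("country") or "") != 2:
                    bad_country += 1      # not a 2-letter country code
                seen[row.get("customer_id")] += 1
        dupes = sum(n - 1 for n in seen.values() if n > 1)
        print("missing=%d bad_country=%d duplicates=%d"
              % (missing, bad_country, dupes))

    if __name__ == "__main__":
        scan("customer_extract.csv")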

Environment: Informatica 8.6 (PowerCenter & Data Quality), Informatica Unstructured Data transformation, Oracle 10g, Oracle EBS & CDH 11i

Confidential

Informatica Lead Consultant

Responsibilities:

  • Define and implement ETL architecture & strategy for the Informatica-Teradata world
  • Provided source system analysis and data architecture for the Confidential data warehouse
  • Designed a multi-dimensional model best suited for both ETL and reporting data
  • Develop Informatica-Teradata mapping templates which act as guidelines for the development team
  • Implement various data acquisition and data loading strategies for successful data movement into the DW
  • Integrate Informatica-Teradata features that speed up ETL solution development
  • Implement the Informatica 8 Pushdown Optimization feature, which complements the BTEQ utility (see the sketch after this list)
  • Informatica mapping, session & workflow design, development, unit testing and performance tuning
  • Provide support in administration and maintenance of the Informatica architecture for Dev / QA / UAT / Production environments
  • Performance tuning of stage-to-reporting ETL/Oracle load processing using Informatica features such as full Pushdown Optimization
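
Pushdown optimization turns mapping logic into set-based SQL that executes inside Teradata rather than streaming rows through the Informatica server. The sketch below illustrates the idea by submitting equivalent SQL through the BTEQ utility from Python; the logon string, database and table names are placeholders:

    # Sketch: the kind of set-based INSERT...SELECT that pushdown generates,
    # submitted here through the BTEQ utility. Logon, database and table
    # names are placeholders.
    import subprocess

    BTEQ_SCRIPT = """
    .LOGON tdprod/etl_user,etl_password;
    INSERT INTO dw.sales_fact (sale_id, cust_key, amount)
    SELECT s.sale_id, d.cust_key, s.amount
    FROM   stage.sales s
    JOIN   dw.customer_dim d ON d.cust_id = s.cust_id;
    .LOGOFF;
    .QUIT;
    """

    subprocess.run(["bteq"], input=BTEQ_SCRIPT.encode(), check=True)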

Environment: Informatica 8.1.1, Informatica Pushdown Optimization, Informatica B2B Complex Data Exchange, Teradata V2R6, Teradata utilities (MLOAD, FLOAD, TPUMP & BTEQ), Sun Enterprise 15000

Confidential, San Jose, CA

ETL (Informatica) Architect

Responsibilities:

  • Designing data Architecture for Enterprise Performance Management Dashboard and Scorecard systems
  • Identify data elements and develop dimension data models for enterprise level dashboard reporting
  • Data analysis of various Confidential data marts and source systems like Oracle ERP, Siebel CRM
  • Identify data gaps between existing reporting and dashboard/score card systems
  • Detailed source systems analysis for identifying various load mechanisms
  • Design & develop ETL interfaces for various SAN health data ingestion feeds using Informatica / Oracle
  • Ingest, validate and integrate various source data feeds from different 3rd-party content providers
  • Configure ETL workflows to run on event-based and scheduled mechanisms
  • Develop templates and guidelines across the ETL flow from source systems to data marts
  • Develop best practices for developing mappings using Informatica 8.6
  • Create and enhance executive dashboards using Siebel Analytics (OBIEE)
  • Developed dashboards / reports with OBIEE (Siebel Analytics) views (Pivot Table, Chart, Tabular and View Selector), Alerts, Guided Navigation and dynamic / interactive dashboards with drill-down capabilities
  • Used Siebel Analytics Delivers (iBots) to refresh dashboards into cache and broadcast reports to users
  • Performance tuning of the Oracle 10g environment for faster data analysis, using Oracle Data Warehouse features such as bitmap indexes, partitioning, pivot table functions and transportable tablespaces

Environment: Informatica 8.5, Informatica Complex Data Exchange 8.5, Unix, Oracle 10g, Sun Solaris, Erwin, Visio and OBIEE (Siebel Analytics) 10.x

Confidential, San Jose

Responsibilities:

  • Assess, design and implement the Informatica 8 architecture for Confidential's EDW ETL platform based on Teradata
  • Implementation of High Availability and Server Grid options to scale as larger volumes of data come in and more teams share the ETL infrastructure
  • Linked 2 HP Superdome machines together via Informatica 8 Grid technology to take advantage of idle CPU time on the additional machine and, most importantly, to provide failover capability
  • Configured a heterogeneous grid environment with a 64-bit Informatica server for the Oracle EDW and a 32-bit Informatica server for the to-be Teradata EDW
  • Provide assistance in administration of the Informatica 8 server grid
  • Introduce policies, processes and procedures to keep Informatica repositories lean
  • Successful upgrade of the existing Informatica 7.1.3 EDW code base, comprising mappings across some 20 folders, to Informatica 8.1.1
  • Developed Unix shell scripts as part of the balancing process: counting records in XML files, adding specific entity numbers to the XML files, and splitting files (see the sketch after this list).
  • Eliminate shortcuts and objects introduced by the v7 upgrade process
  • Identify and develop various PCMR reports to better manage EDW Informatica repositories
  • Develop guidelines & standards for the Informatica-Teradata-based EDW to-be architecture
  • Design standard templates for using the Teradata utilities FLOAD, MLOAD and TPUMP through Informatica for external data movement
  • Design templates for using the Informatica 8 Pushdown feature for faster data movement within the Teradata world
  • Also developed complex queries to process time-variant data within the Teradata EDW using the Pushdown feature
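
A Python sketch of the balancing checks mentioned above (the originals were Unix shell scripts); the tag name and chunk size are hypothetical:

    # Sketch of the XML balancing checks: count records and split a large
    # XML extract into fixed-size chunks. Tag name and chunk size are
    # hypothetical; the originals were Unix shell scripts.
    import xml.etree.ElementTree as ET

    def record_count(path, tag="record"):
        # Stream through the file so huge extracts never load fully in memory.
        return sum(1 for _, el in ET.iterparse(path) if el.tag == tag)

    def split_records(path, tag="record", chunk=10000):
        # Yield batches of up to `chunk` serialized records for part files.
        batch = []
        for _, el in ET.iterparse(path):
            if el.tag == tag:
                batch.append(ET.tostring(el))
                el.clear()
                if len(batch) == chunk:
                    yield batch
                    batch = []
        if batch:
            yield batch

    if __name__ == "__main__":
        print(record_count("extract.xml"))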

Environment: Informatica 8.1.1, Oracle 10g, Teradata V2R6, Teradata utilities (MLOAD, FLOAD, TPUMP & BTEQ), HP Superdome server, Unix and Red Hat Linux

Confidential

Informatica Lead Consultant

Responsibilities:

  • Defining ODS Application Performance Tuning Approach, Scope and Performance benchmarks
  • Reducing the current ETL load window timings by optimizing the current ETL load process
  • Provide a tuning assessment by correctly identifying performance bottlenecks and exploring tuning opportunities in the current ETL environment
  • Monitoring CPU utilization levels, CPU wait times and I/O activity for the Informatica and Oracle servers (a monitoring sketch follows this list)
  • Tuning Informatica mappings, sessions, SQL queries, lookup, aggregator, sorter and joiner caches
  • Optimizing Informatica sessions by effectively applying session partitioning and partition point features
  • Redesigning some of the Data Load Process with minimal changes
  • Enabling the system to process more ETL jobs by minimizing system resource utilization and CPU utilization times
  • Implementing tuning recommendations by coordinating with Application owners, Oracle DBA team, Source system administrators and Informatica administrator
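
A sketch of the kind of periodic CPU / I/O sampling described above, using the psutil library as an assumed stand-in; the engagement itself relied on native Unix tools such as sar, vmstat and iostat:

    # Sketch: periodic CPU and disk I/O sampling. psutil is an assumed
    # stand-in for the native Unix tools (sar, vmstat, iostat) used then.
    import psutil

    def sample(interval=5, samples=12):
        for _ in range(samples):
            cpu = psutil.cpu_times_percent(interval=interval)
            io = psutil.disk_io_counters()
            print("user=%.1f%% system=%.1f%% idle=%.1f%% reads=%d writes=%d"
                  % (cpu.user, cpu.system, cpu.idle,
                     io.read_count, io.write_count))

    if __name__ == "__main__":
        sample()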

Environment: Informatica 6.2.2 & 7.1, Erwin, Oracle 9i & 10g, Unix, Sun Enterprise Fire servers, Sun Solaris 5.8

Confidential, SFO, CA

ETL Lead

Responsibilities:

  • Business Requirements Analysis
  • Near Real Time Data Warehousing Technical Architecture
  • Design of Dimension Model, which supports 24/7 data loading & data access requirements and analytical reporting requirements of users
  • Teradata Database Architecture, Database Sizing and Index Strategies
  • Teradata Logical and Physical Data Model development
  • ETL process design for WAP and MMS product integration to accommodate 24x7 data loads every 15 minutes (see the sketch after this list)
  • Informatica Design, development, unit testing and Performance Tuning
  • Implemented Downstream Applications Dependency
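
A sketch of a 15-minute micro-batch trigger; pmcmd is Informatica's command-line workflow starter, but the service, domain, folder and workflow names here are placeholders, and the production schedule ran as Informatica workflows against Teradata:

    # Sketch: trigger an Informatica workflow every 15 minutes via pmcmd.
    # Service / domain / folder / workflow names are placeholders.
    import subprocess
    import time

    INTERVAL = 15 * 60  # seconds between load cycles

    def run_cycle():
        subprocess.run(["pmcmd", "startworkflow",
                        "-sv", "int_svc", "-d", "etl_domain",
                        "-f", "WAP_MMS", "wf_load_15min"],
                       check=False)

    if __name__ == "__main__":
        while True:
            start = time.time()
            run_cycle()
            time.sleep(max(0, INTERVAL - (time.time() - start)))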

Environment: Teradata V2R5 (BTEQ, MultiLoad, FastLoad), Informatica PowerCenter 6.2 (Designer, Workflow Manager, Repository Manager, Workflow Monitor), MicroStrategy 7i, Shell Scripting, Erwin 4.0
