
Sr. Informatica Big Data Analyst/Developer Resume

Lake Forest, CA

SUMMARY

  • Over 15 years of focused industry experience in IT with a strong background in Hadoop, Informatica, Big Data, Teradata, Oracle database development, and data warehousing, including about 12 years in ETL processes using Hadoop and Informatica PowerCenter 9.5/8.6.1 and earlier versions.
  • Expertise in designing tables in Spark, HBase, Hive, and MySQL using Sqoop, and in processing data by importing and exporting databases to and from HDFS.
  • Expertise in building MapReduce and Spark jobs that create structured files from unstructured input (see the Spark sketch after this list).
  • Expertise in Core Java concepts including Collections, Exception Handling, Serialization and Deserialization.
  • Expert Oracle Business Intelligence 10g/11g and Applications consultant with a successful track record of gathering user requirements and designing, developing, and supporting different business applications.
  • Extensively worked on business intelligence and data warehousing with the Kimball and Inmon methodologies.
  • Expert in installation and configuration of Financials Analytics, Supply Chain and Order Management Analytics, and Procurement and Spend Analytics.
  • Hands-on experience with Kafka and Spark Streaming as a messaging system.
  • Expertise in YARN (MapReduce 2.0) architecture and components such as the ResourceManager, NodeManager, Container, and ApplicationMaster, and in the execution of MapReduce jobs.
  • Experienced with Analytics metadata objects, web catalog objects (dashboards, pages, folders, reports), iBot scheduling, and DAC.
  • Proficient in understanding Business processes requirements and translating them into technical requirements.
  • Solid experience in Informatica PowerCenter mappings, mapplets, transformations, Workflow Manager, Workflow Monitor, Repository Manager, star and snowflake schemas, OLTP, OLAP, and data reports.
  • Extensive experience with data extraction, transformation, and loading (ETL) from disparate sources, including relational databases such as Teradata, Oracle, and DB2 UDB; integrated data from flat files, CSV files, and XML files into a common reporting and analytical data model using Erwin.
  • Extensively worked on ETL processes, data mining, and web reporting features for data warehouses using Business Objects.
  • Designed and created Oracle Database objects: tables, Indices, Views, Procedures, Packages, and Functions.
  • Experience in developing SQL and PL/SQL code, migrating it to test environments, and performing unit and integration testing.
  • Experience in tuning stored procedures and queries, for example by changing join orders.
  • Experience in Pig, Hive, Spark, Scala, Python, and SQL for the Hadoop ecosystem.
  • Strong experience with UNIX Korn shell scripting.
  • Extensively worked on job scheduling.
  • Excellent understanding of client/server architecture, with strong logical, analytical, communication, and organizational skills.
  • Excellent team player, able to work on both the development and maintenance phases of a project.
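
The MapReduce/Spark bullet above describes turning unstructured files into structured output. A minimal Scala sketch of that pattern, assuming a hypothetical "timestamp userId action" line layout and hypothetical HDFS paths (not the actual project code):

    import org.apache.spark.sql.SparkSession

    object LogStructurer {
      // Hypothetical log layout: "<timestamp> <userId> <action>"
      private val LogLine = """^(\S+)\s+(\S+)\s+(.+)$""".r

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("LogStructurer").getOrCreate()
        import spark.implicits._

        // Read raw, unstructured text from HDFS (path is an assumption)
        val structured = spark.read.textFile("hdfs:///raw/events.log")
          .flatMap {
            case LogLine(ts, user, action) => Some((ts, user, action))
            case _                         => None // drop malformed lines
          }
          .toDF("timestamp", "user_id", "action")

        // Persist the structured result as Parquet
        structured.write.mode("overwrite").parquet("hdfs:///structured/events")
        spark.stop()
      }
    }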

TECHNICAL SKILLS

BI Tools: OBIEE 11g/10g, OBIEE Financial App, Sales App, Business Objects

ETL Tools: Informatica PowerCenter/PowerMart 9.x/8.x/7.x/6.x, Spark

Applications: Oracle R12, Siebel, SFDC, OBIEE Financial, Supply Chain, Procure/spend Apps

Tools/Utilities: SQL*Loader, TOAD 7.x/8.x/9.x, Erwin 4.5

Languages: SQL, PL/SQL, UNIX shell scripting, Java 1.4, HTML 4+, CSS, JavaScript, C, C++, TSO, ISPF, COBOL, DB2, JCL, Pig, Hive, MapReduce, Python

Databases: Oracle 10g/9i/8i/7.3, Teradata V2R4/V2R3, DB2 UDB 7.1, MySQL, MS Access, VSAM, NoSQL, HBase

PROFESSIONAL EXPERIENCE

Sr. Informatica Big Data Analyst/Developer

Confidential

Responsibilities:

  • Created detailed design documentation describing program development, logic, coding, testing, changes, and corrections.
  • Created ETL Specifications for Development and conversion projects.
  • Created shortcuts for reusable source/target definitions, reusable transformations, and mapplets in a shared folder.
  • Created requirement definition and analysis in support of Data Warehouse.
  • Worked extensively with different transformation types: Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Sequence Generator, and others.
  • Worked with XSD and XML file generation through the ETL process.
  • Defined and worked with mapping parameters and variables.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Performed performance evaluation of the ETL for the full load cycle; checked sessions and error logs to troubleshoot problems and used the Debugger for complex mappings.
  • Parameterized all variables and connections at all levels in UNIX.
  • Created test cases for unit testing and functional testing.
  • Coordinated with the testing team to help them understand the business and transformation rules used throughout the ETL process.
  • Drove and managed the development and application of data algorithms, models, and corresponding documentation.
  • Used Spark RDDs and DataFrames for transformations, event joins, bot-traffic filtering, and some pre-aggregations before storing the data on HDFS (see the DataFrame sketch after this list).
  • Participated in the requirement gathering and analysis phase of the project, documenting the business requirements by conducting workshops/meetings with various business users.
  • Used Sqoop to transfer data between Oracle and HDFS.
  • Developed Pig Latin scripts for the analysis of semi-structured data.
  • Converted the Informatica SQL logic to run on the Spark cluster using Spark DataFrames and RDDs.
  • Created Hive and Spark SQL queries for faster data processing.
  • Created DataFrames and RDDs from Parquet and JSON files.
  • Created OBIEE reports, dashboards, the RPD, and the Presentation layer from a Hive connection.
  • Created Shell scripts to dump the data from MySQL to HDFS.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Wrote Java code for file reading and writing, with extensive use of the ArrayList and HashMap data structures.
  • Executed Python programs and used the Spark API to join various RDDs.
  • Loaded the data into Teradata using FastLoad.
  • Tracked testing issues using Jira.
  • Used GitHub to store the latest code that needed to be migrated.
  • Implemented the K-means algorithm in Scala (see the K-means sketch after this list).
  • Wrote Scala code for file reading and writing, with extensive use of the ArrayList and HashMap data structures.
  • Used SBT to package the Scala program and executed it through Eclipse on the Spark framework.
  • Created various graphs for Confidential monthly transaction metrics using Excel and Power BI.
  • Designed and modified ETL mappings in Informatica from scratch to load the data from the source system (Oracle EBS) to the staging system (SDE mappings) and on to the target Analytics Warehouse.
  • Designed and managed the execution of extract, load, and transformation processes from source data to target using Informatica.
  • Created, applied, and validated appropriate data algorithms/models to produce clear, actionable business results; managed and reviewed Hadoop log files; and developed Pig UDFs and Hive UDFs to pre-process the data for analysis.
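
As referenced above, a minimal Scala sketch of the DataFrame work (reading Parquet and JSON events, filtering bot traffic, and pre-aggregating before writing back to HDFS). The paths and column names are assumptions, as is the premise that both sources share one schema:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object EventAggregator {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("EventAggregator").getOrCreate()

        // Load events from Parquet and JSON (hypothetical paths; assumes identical columns)
        val parquetEvents = spark.read.parquet("hdfs:///events/parquet")
        val jsonEvents    = spark.read.json("hdfs:///events/json")
        val events        = parquetEvents.unionByName(jsonEvents)

        // Filter bot traffic by user agent, then pre-aggregate per user and day
        val daily = events
          .filter(!lower(col("user_agent")).contains("bot"))
          .groupBy(col("user_id"), col("event_date"))
          .agg(count("*").as("event_count"))

        daily.write.mode("overwrite").parquet("hdfs:///events/daily_counts")
        spark.stop()
      }
    }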
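
For the K-means bullet, a minimal Scala sketch using Spark MLlib's KMeans rather than a from-scratch implementation; the input path and feature column names are hypothetical:

    import org.apache.spark.ml.clustering.KMeans
    import org.apache.spark.ml.feature.VectorAssembler
    import org.apache.spark.sql.SparkSession

    object KMeansSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("KMeansSketch").getOrCreate()

        // Hypothetical input: numeric transaction features stored as Parquet
        val df = spark.read.parquet("hdfs:///metrics/transactions")

        // Assemble feature columns (names are assumptions) into a single vector
        val assembler = new VectorAssembler()
          .setInputCols(Array("amount", "frequency"))
          .setOutputCol("features")
        val features = assembler.transform(df)

        // Fit K-means with k=5 clusters and label each row
        val model   = new KMeans().setK(5).setSeed(42L).fit(features)
        val labeled = model.transform(features) // adds a "prediction" column
        labeled.show(10)

        spark.stop()
      }
    }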

Sr Business Intelligence Analyst

Confidential, Lake Forest, CA

Responsibilities:

  • Performed the setup, installation, and configuration of the complete analytics platform environment (OBIEE, Informatica, and DAC) and the required connectivity for seamless integration with the data warehouse.
  • Extracted the data from the source system (EBS R12), transformed it, and loaded it into the OBIEE analytics warehouse using Informatica.
  • Implemented Financial Analytics, Procurement and Spend Analytics, integrated data from Finance and other enterprise systems and transformed data using Informatica (ETL).
  • Migrated OBIEE 10g to OBIEE 11g environment.
  • Analyzed, designed, and developed the OBIEE metadata repository (RPD), which consists of the Physical layer, Business Model and Mapping layer, and Presentation layer.
  • Interacted with SMEs, business analysts, and users to gather and analyze the business report requirements.
  • Gathered requirements from the client for gap analysis, translated them into technical design documents, and worked with team members on recommendations to close the gaps.
  • Designed, implemented and tested security privileges on Dashboards and Reports.
  • Used Catalog Manager to migrate the web catalog among instances.
  • Built reports using Answers, including drill-down objects, union-based reports, and formatted functions within the reports.
  • Performed debugging for verifying the prebuilt ETL mappings for Oracle BI Applications.
  • Designed and modified ETL mappings in Informatica from scratch to load the data from the source system (EBS) to the staging system (SDE mappings) and on to the target Analytics Warehouse.
  • Using universal adapters, created new folders and mappings in Informatica to bring in the data from other source systems.
  • Developed Mappings using corresponding Source, Targets and Transformations like Source Qualifier, Sequence Generator, Filter, Router, Joiner, Lookup, Expression, Update Strategy and Aggregator.
  • Developed and debugged many intelligence dashboards using different Analytics views (pivot table, pie/bar chart, view/column selector, narrative view) and dynamic/interactive dashboards with drill-down capabilities using global and local filters; integrated sources and legacy source systems in Exadata.
  • Customized out-of-the-box BI Apps dashboard pages such as Sourcing Trends, Total Spend Overview, Exceptions, Purchase Requisitions, Purchase Orders, and Employee Expenses in Supply Chain Analytics and Procurement and Spend Analytics.
  • Created Hive and Spark SQL queries for faster data processing.
  • Created DataFrames and RDDs from Parquet and JSON files.
  • Created Shell scripts to dump the data from MySQL to HDFS.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Wrote Java code for file reading and writing, with extensive use of the ArrayList and HashMap data structures.
  • Executed Scala programs and used the Spark API to join various RDDs.
  • Created a Kafka consumer to process the data (see the sketch after this list).
  • Implemented the K-means algorithm in Scala.
  • Wrote Scala code for file reading and writing, with extensive use of the ArrayList and HashMap data structures.
  • Used SBT to package the Scala program and executed it through Eclipse on the Spark framework.
  • Designed and managed the execution of extract, load, and transformation processes from source data to target using Informatica.
  • Implemented time comparison and calculation measures in the business model using the Time Series wizard, and modeled slowly changing dimension data.
  • Troubleshot issues related to repository models and report development.
  • Developed many complex full/incremental Informatica objects (workflows/sessions, mappings/mapplets) with various transformations.
  • Imported and exported execution plans in DAC from one instance to another.
  • Customized and enhanced standard Oracle Applications reports in the Accounts Payable module, such as the Invoice Aging, Invoice Validation, and Cash Requirement reports, using Oracle BI Publisher.
  • Developed an AP Supplier Contact report, using Oracle Reports 10g, that provides detailed contact information about vendors/suppliers for communication purposes.
  • Created new requests and opened up Answers for ad hoc reporting by end users.
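
A minimal Scala sketch of the Kafka consumer mentioned above, using the standard Kafka Java client; the broker address, group id, and topic name are assumptions:

    import java.time.Duration
    import java.util.{Collections, Properties}
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import scala.jdk.CollectionConverters._ // Scala 2.13; use scala.collection.JavaConverters on 2.12

    object EventConsumer {
      def main(args: Array[String]): Unit = {
        // Broker address, group id, and topic name are hypothetical
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092")
        props.put("group.id", "analytics-consumers")
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(Collections.singletonList("events"))

        try {
          while (true) {
            // Poll for new records and process each message
            val records = consumer.poll(Duration.ofMillis(500))
            for (record <- records.asScala) {
              println(s"offset=${record.offset()} key=${record.key()} value=${record.value()}")
            }
          }
        } finally {
          consumer.close()
        }
      }
    }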

Environment: Informatica 9.5, Linux, Cloudera Hadoop 2.x, Hive, MapReduce, Pig, Spark, OBIEE 11g, Teradata, Oracle 11g

Sr Business Intelligence Analyst

Confidential

Responsibilities:

  • Full project life cycle implementation for Oracle Business Analytics Warehouse (OBAW)
  • Analyzed the source system to understand the various attributes.
  • Created high-level and low-level design documents for the Service Request dashboard.
  • Designed various Dimensions and Facts for sales performance and Support data mart.
  • Created the Presentation layer attributes for User in OBIEE.
  • Upgraded OBIEE 10g to OBIEE 11g.
  • Created the sales performance plan in the DAC scheduling tool and associated the workflows with it.
  • Upgraded Informatica from 8.6 to 9.1.
  • Partitioned tables and modified long-running queries to improve ETL and dashboard performance.
  • Developed the Physical and BMM Layer in RPD.
  • Designed Schema/Diagrams using Fact, Dimensions, Physical, Logical, Alias and Extension tables.
  • Created detailed specification document to develop the ETL.
  • Coordinated with the offshore team on development.
  • Reviewed the development activities from offshore.
  • Developed the Presentation Layer as per the Business requirement in OBIEE.
  • Coordinated with various teams to carry out integration testing.
  • Assembled subject areas and created execution plan and accommodated the new Informatica mappings.
  • Performance-tuned SQL queries and mappings for better performance.
  • Deployed the Informatica ETL mappings to production.
  • Created the training documentation for users.
  • Supported the Mappings and resolved issues in Production.
  • Responsible for installing, configuring, developing, supporting, and maintaining the BI Apps.

Environment: Informatica 9.5, Linux, OBIEE 11g, Oracle 11g, Python

ETL Informatica Developer

Confidential, Irvine, CA

Responsibilities:

  • Involved in gathering and analyzing the requirements and preparing business rules.
  • Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic.
  • Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.
  • Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
  • Involved in creating new table structures and modifying existing tables to fit into the existing data model.
  • Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 9.1.0.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.
  • Involved in Performance Tuning of mappings in Informatica.
  • Good understanding of source-to-target data mapping and the business rules associated with the ETL processes.

Environment: Informatica 9.5, Linux, Cloudera Hadoop 2.x, Hive, MapReduce, Pig, Spark, OBIEE 11g, Oracle 11g, Python

Senior Informatica ETL/OBIEE Analyst

Confidential, Los Angeles, CA

Responsibilities:

  • Analyzed the current requirements and created the design and specification documents for the SAP integration project for the finance system.
  • Designed the Functional and Technical specification documents for the combined data users.
  • Created the new mappings and modified existing mappings as per the requirement.
  • Created Teradata Tables, Views, and indexes to accomplish the new structure.
  • Designed the new SCD2 dimension structures and loaded them using Informatica 8.6.
  • Created FastLoad and MultiLoad scripts to load tables based on specifications.
  • Generated the SAP ABAP code from mappings to pull the data from SAP systems into the staging environment.
  • Developed versatile reusable mapplets, transformations and tasks for data cleansing and audit purposes across the team.
  • Analyzed, designed, and developed the OBIEE metadata repository (RPD), which consists of the Physical layer, Business Model and Mapping layer, and Presentation layer.
  • Developed custom reports/Ad-hoc queries using Answers and assigned them to application specific dashboards.
  • Designed and developed reports and dashboards.
  • Created a usage analytics subject area and reports on usage.
  • Analyzed performance bottlenecks in OBIEE and implemented performance improvement techniques.
  • Migrated the Mapping, Session and Workflows to Integration Environment.
  • Configured and created the OBIEE repository; involved in modifications of the Physical, BMM, and Presentation layers of the metadata repository using the OBIEE Administration Tool.
  • Wrote complex SQL queries to update the dimensions and facts in post-session SQL.
  • Created UNIX scripts to handle errors and send notifications to the support team on failure and success; coordinated with an external team to set up the environment.
  • Created the application support guide for the Level 2 team to support the application.
  • Resolved system test and Production issues.
  • Created proper documentation to describe program development, logic, coding, testing, changes and corrections.

Sr. Programmer Analyst

Confidential, Los Angeles, CA

Responsibilities:

  • Designed the Data Warehousing ETL procedures for extracting the data from all source systems to the target system.
  • Extensively used Transformations like Router, Lookup (connected and unconnected), Update Strategy, Source Qualifier, Joiner, Expression, Aggregator and Sequence generator Transformations.
  • Worked extensively with dynamic caches in connected Lookup transformations.
  • Created, scheduled, and monitored workflow sessions on a run-on-demand and run-on-time basis using the Informatica PowerCenter Workflow Manager.
  • Designed Siebel web components, including OBIEE Answers and Intelligence Dashboards.
  • Used DAC client to load the custom mapping to the warehouse.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Configured sessions so that the PowerCenter Server sends an email when a session completes or fails.
  • Debugged and sorted out the errors and problems encountered in the production environment.
  • Identified various bottlenecks and eliminated them to a great extent.
  • Extensively worked on PL/SQL to write Stored Procedures to increase the performance and tuning the programs, ETL Procedures and processes.
  • Created various BI reports for Canadian receipts and inventory using Business Objects.
  • Wrote and modified UNIX Korn shell scripts to handle dependencies between workflows and log failure information.
  • Actively took part in the post implementation production support.
  • Mentoring junior team members and fostering a learning environment.
  • Loaded the Teradata tables using FastLoad and MultiLoad through mainframe JCL.
  • Executed BTEQ scripts via JCL to update the Teradata tables.

Senior ETL Informatica /Teradata Developer

Confidential, Wilkesboro, NC

Responsibilities:

  • Involved in Analysis, Requirements Gathering and documenting Functional & Technical specifications.
  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Partitioned many tables that had frequent inserts, deletes, and updates to reduce contention and improve performance.
  • Designed the Data Warehousing ETL procedures for extracting the data from all source systems to the target system.
  • Extensively used Transformations like Router, Lookup (connected and unconnected), Update Strategy, Source Qualifier, Joiner, Expression, Aggregator and Sequence generator Transformations.
  • Worked extensively with dynamic caches in connected Lookup transformations.
  • Designed and Optimized Power Center CDC and Load Mappings to load the data in slowly changing dimension.
  • Created, scheduled, and monitored workflow sessions on the basis of run on demand, run on time, using Informatica Power Center workflow manager.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Managed offshore staff to deliver tasks and scheduled training as needed to get the work done.
