
Sr. ODI Developer Resume


Irving, TX

SUMMARY:

  • Over 14 years of professional IT experience, including 8+ years designing and developing end-to-end BI solutions using ODI 12c/11g, Informatica 9.x, and OBIEE 11g/12c, and 6 years in Mainframe technologies.
  • Expertise in working with relational databases such as Oracle, DB2, Netezza, Teradata, and SQL Server.
  • Experience with SOA and integration technologies in the areas of Web Services, SOAP, WSDL, XML, XSD, XPath, XSLT, Workday, Hadoop, Essbase, JSON, and flat files.
  • Configured and set up ODI, the Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Interfaces, Scenarios, Load Plans, and Metadata.
  • Strong understanding of data warehousing principles, fact tables, dimension tables, and star and snowflake schema modeling (see the schema sketch after this list).
  • Worked on ODI 12c migration into an AWS environment.
  • Worked on cloud-to-cloud and cloud-to-on-premises integration projects.
  • Good knowledge of cloud infrastructure models (IaaS, PaaS, SaaS).
  • Used Oracle GoldenGate replication in utility applications.
  • Experience providing end-to-end business intelligence solutions through dimensional modeling design, developing and configuring the OBIEE repository (RPD), interactive dashboards, OBIEE Answers, security implementation, analytics metadata objects, and web catalog objects (dashboards, pages, folders, reports).
  • Experience architecting data-related solutions, developing data warehouses, developing ELT/ETL jobs, performance tuning, and identifying bottlenecks in the process flow.
  • Experience with dimensional data modeling: star join schema and snowflake modeling, normalization, fact and dimension tables, and physical and logical data modeling.
  • Extensively worked with large databases in development, testing, and production environments.
  • Extensively used Teradata utilities (MultiLoad, BTEQ, FastExport, and FastLoad) to design and develop data flow paths for loading, transforming, and maintaining the data warehouse.
  • Good knowledge of Oracle Cloud services and database options.
  • Worked on all phases of the data warehouse development life cycle, ETL design and implementation, and support of new and existing applications.
  • Solid understanding of ETL design principles and good knowledge of the ETL design process in Informatica.
  • Worked on ongoing maintenance issues and bug fixes, monitored Informatica sessions, and performance-tuned mappings and sessions.
  • Experience using automation and scheduling tools such as AutoSys, WLA, and Control-M.
  • Used version control tools Git and SVN.
  • OBIEE load/performance testing using OATS.
  • Used JIRA for bug and issue tracking.
  • Experience in domains including Education, Healthcare, Telecom, Insurance, Retail, and Manufacturing.
  • Good knowledge of middleware components: SOA, OSB, and ADF.
  • Knowledge of the Hadoop ecosystem and components such as HDFS, MapReduce, Hive, HBase, Pig, and Sqoop.
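
A minimal, hypothetical sketch of the star-schema structures referenced above; the table and column names (dim_customer, fact_sales, and their keys) are illustrative only and not taken from any specific engagement:

    -- Conformed dimension keyed by a surrogate key (illustrative names)
    CREATE TABLE dim_customer (
        customer_key   NUMBER        PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR2(30)  NOT NULL,      -- natural/business key
        customer_name  VARCHAR2(100),
        region         VARCHAR2(50)
    );

    -- Fact table referencing the dimension by its surrogate key
    CREATE TABLE fact_sales (
        sale_date_key  NUMBER       NOT NULL,       -- FK to a date dimension
        customer_key   NUMBER       NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount   NUMBER(12,2),
        quantity       NUMBER(10)
    );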

PROFESSIONAL EXPERIENCE:

Confidential, LA

Sr. ODI Developer

  • Analyzed the business requirements for Oracle Data Integrator, mapped the architecture, and used ODI reverse engineering to retrieve metadata from data stores and load it into the repository.
  • Gathered requirements from different application teams to develop interfaces, packages, and scenarios.
  • Analyzed session log files in the Operator Navigator to resolve mapping errors and managed session configuration.
  • Created numerous ODI interfaces (mappings in ODI 12c).
  • Used interfaces to load data from flat files, JSON, and CSV files into the Oracle staging area and then into the Oracle data warehouse.
  • Performed reverse engineering of databases using ODI Designer tool
  • Created, executed and managed ETL processes using ODI.
  • Implemented Type 2 Slowly Changing Dimensions (SCD) and incremental loads for mappings (see the SCD sketch after this list).
  • Integrated cloud SQL Server with on-premises SQL Server data loads to keep the data in sync.
  • Developed complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL; created tables, indexes, views, and triggers. Created Oracle packages with advanced SQL queries (analytic and aggregate functions) to process and load daily enrollment files.
  • Customized ODI Knowledge Modules to tune the Knowledge Module code based on requirements.
  • Provided recommendations on data conversion best practices that were implemented.
  • Wrote data validation test scripts to verify the data at the sources and the target.
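
A minimal sketch of the Type 2 SCD / incremental-load pattern referenced above. The staging table (stg_customer), dimension (dim_customer), sequence (dim_customer_seq), and columns are hypothetical; an ODI SCD IKM generates comparable logic, but the two-step pattern is the same:

    -- Step 1: expire the current dimension row when a changed record arrives
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_to = SYSDATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.region <> d.region));

    -- Step 2: insert new and changed records as the new current version
    INSERT INTO dim_customer
           (customer_key, customer_id, customer_name, region,
            effective_from, effective_to, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region,
           SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');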

Environment: ODI 12c, SQL, PL/SQL, SQL Server, OBIEE 12c, JSON, flat files, XML, Linux

Confidential

Sr. ODI Developer

  • Translated business requirements into technical specifications to build the Enterprise Data Warehouse.
  • Worked on data lineage to trace data end to end from dashboards back to the source systems.
  • Installed and configured Oracle Utilities Analytics components (ODI, OGG, APEX Admin).
  • Configured ODI and OGG using the Administration Tool.
  • Configured job parameters, entities, and jobs, and monitored job executions.
  • Responsible for designing, developing, and testing the ELT strategy to populate data from various source systems (NMS, CCB, MDM, ODM) using ODI.
  • Set up and ran the initial sync between the sources and the OUA application layer.
  • Hands-on experience working with ODI Knowledge Modules, including the GoldenGate knowledge modules (LKM, IKM, JKM, and CKM).
  • Worked in ODI Designer to design interfaces, define data stores and packages, and modify the ODI Knowledge Modules (Reverse-Engineering, Journalizing, Loading, Check, Integration, Service) to create interfaces that cleanse, load, and transform data from source to target databases; created mappings and configured multiple agents per project requirements.
  • Developed complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL; created tables, indexes, views, and triggers. Created Oracle packages with advanced SQL queries (analytic and aggregate functions) to process and load daily enrollment files.
  • Customized ODI Knowledge Modules to tune the Knowledge Module code based on requirements.
  • Provided recommendations on data conversion best practices that were implemented.
  • Wrote data validation test scripts to verify the data at the sources and the target.
  • Designed and developed ELT logic to cleanse and transform data from various sources into the target using canonical tables.
  • Used ODI reverse engineering to retrieve metadata from data stores and load it into the repository.
  • Enhanced performance by creating aggregate tables, table partitions, and indexes, and by managing cache (see the aggregation sketch after this list).
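
A minimal sketch of the aggregate-table and partitioning approach mentioned in the last bullet; fact_usage, its columns, and the partition boundaries are hypothetical stand-ins for the utility data:

    -- Monthly aggregate built from a detailed fact table to speed up reporting
    CREATE TABLE agg_usage_monthly
    PARTITION BY RANGE (month_start)
    (
        PARTITION p_hist VALUES LESS THAN (DATE '2020-01-01'),
        PARTITION p_max  VALUES LESS THAN (MAXVALUE)
    )
    AS
    SELECT TRUNC(usage_date, 'MM') AS month_start,
           meter_key,
           SUM(usage_kwh)          AS total_kwh,
           COUNT(*)                AS reading_cnt
      FROM fact_usage
     GROUP BY TRUNC(usage_date, 'MM'), meter_key;

    -- Local index on the aggregate's join key
    CREATE INDEX agg_usage_monthly_ix1 ON agg_usage_monthly (meter_key) LOCAL;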

Environment: ODI 12c, SQL, PL/SQL, OBIEE 12c, OGG, OUA (NMS, CCB, MDM, ODM), TOAD, Linux

Confidential, Irving, TX

Sr. ODI Consultant

  • Translated business requirements into technical specifications to build the Enterprise Data Warehouse.
  • Designed and developed ODI interface flows for file upload/download.
  • Installed and configured Oracle Data Integrator (ODI); responsible for configuring the Master and Work Repositories on Oracle.
  • Responsible for designing, developing, and testing the ELT strategy to populate data from various source system feeds (flat files, Oracle) using ODI.
  • Hands-on experience working with ODI Knowledge Modules, including the GoldenGate knowledge modules (LKM, IKM, JKM, and CKM).
  • Worked in ODI Designer to design interfaces, define data stores and packages, and modify the ODI Knowledge Modules (Reverse-Engineering, Journalizing, Loading, Check, Integration, Service) to create interfaces that cleanse, load, and transform data from source to target databases; created mappings and configured multiple agents per project requirements.
  • Developed complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL; created tables, indexes, views, and triggers. Created Oracle packages with advanced SQL queries (analytic and aggregate functions) to process and load daily enrollment files (see the PL/SQL sketch after this list).
  • Customized ODI Knowledge Modules to tune the Knowledge Module code based on requirements.
  • Provided recommendations on data conversion best practices that were implemented.
  • Wrote data validation test scripts to verify the data at the sources and the target.
  • Designed and developed ELT logic to cleanse and transform data from various sources into the target using canonical tables.
  • Used ODI reverse engineering to retrieve metadata from data stores and load it into the repository.
  • Enhanced performance by creating aggregate tables, table partitions, and indexes, and by managing cache.
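
A minimal sketch of the analytic-function logic behind the daily enrollment loads referenced above. In the project this kind of logic sat inside a package; it is shown here as a standalone procedure, and the table and column names (stg_enrollment, enrollment_dw) are hypothetical:

    -- Deduplicate the day's enrollment file, keeping the latest row per member
    CREATE OR REPLACE PROCEDURE load_daily_enrollment IS
    BEGIN
      INSERT INTO enrollment_dw (member_id, plan_code, enroll_date)
      SELECT member_id, plan_code, enroll_date
        FROM (SELECT s.member_id, s.plan_code, s.enroll_date,
                     ROW_NUMBER() OVER (PARTITION BY s.member_id
                                        ORDER BY s.file_timestamp DESC) AS rn
                FROM stg_enrollment s)
       WHERE rn = 1;

      COMMIT;
    END load_daily_enrollment;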

Environment: ODI 12c, SQL, PL/SQL, TOAD, flat files, UNIX shell scripting, Java

Confidential, Foster City, CA

Sr. ODI Consultant

  • This project integrated Workday with a Hyperion Planning budgeting and forecasting application to be used starting the next fiscal year.
  • Gathered ELT requirements from the business team and prepared the high-level design documents.
  • Worked as an ODI integration consultant to integrate the Workday system with Oracle Hyperion Planning.
  • Imported data from heterogeneous data sources such as databases and flat files.
  • Designed interfaces to load data from flat files and CSV files into the Oracle staging area and then into the Oracle data warehouse.
  • Prepared source-to-staging and staging-to-target mapping specifications and developed the corresponding ODI ELT code.
  • Worked with ODI contexts to load data into Hyperion target systems in different environments from a single ERP source system.
  • Developed custom ODI interfaces to load metadata and data into Hyperion; built custom views to extract data from the ERP system into the staging area (see the staging-view sketch after this list).
  • Performed ODI tasks such as establishing ODI development standards; creating ODI repositories, agents, contexts, data servers, and physical and logical schemas in Topology Manager; and creating source and target model folders, models, and data stores.
  • Created projects, project folders, and packages, and imported the required Knowledge Modules.
  • Created interfaces; defined the optimization context and staging area; mapped source and target data stores and set their properties; defined LKM and IKM options and values for the flow; defined filters and constraints as required; ran executions and worked in Operator when errors occurred; and migrated ODI repositories.
  • Created interfaces to integrate metadata and data from an external application to refresh the planning hierarchy inside Hyperion Planning.
  • Developed batch and rules files to load data from the financial system's database into the Essbase cube.
  • Used the Knowledge Module adapter to build the planning application structure from Oracle GL, load metadata and data, and load smart lists into the Hyperion Planning application.
  • Worked with the ODI OS command tool (OdiOSCommand) to call Windows and UNIX batch scripts.
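
A minimal sketch of the kind of custom staging view used to expose ERP data to the Hyperion Planning metadata load; erp_gl_accounts, its columns, and the parent/child layout are hypothetical, and the actual member properties depend on the Planning dimension being refreshed:

    -- Flatten ERP account data into a parent/child layout for the metadata load
    CREATE OR REPLACE VIEW stg_plan_account_v AS
    SELECT a.account_code                     AS member_name,
           NVL(a.parent_account, 'Account')   AS parent_name,
           a.account_description              AS alias,
           CASE a.account_type
                WHEN 'EXP' THEN 'Expense'
                WHEN 'REV' THEN 'Revenue'
                ELSE 'Saved Assumption'
           END                                AS account_type
      FROM erp_gl_accounts a
     WHERE a.active_flag = 'Y';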

Environment: ODI 11g, Hyperion Essbase, Hyperion Planning, SQL Server 2008, Oracle 11g, flat files, PL/SQL

Confidential, Irvine, CA

Sr. ODI Consultant

Responsibilities:

  • Interacted with the business to gather requirements and design data warehouse and data mart entities.
  • Used ODI Designer to import tables from the database, perform reverse engineering, develop projects, and release scenarios.
  • Used Knowledge Modules to meet client requirements such as connecting to a specific technology, extracting data from it, and transforming, checking, and integrating the data.
  • Used ODI Operator to monitor sessions.
  • Involved in Production Support and Monitoring of daily load plans and Data Validations
  • Developed complex Interfaces and packages using ODI.
  • Extensively used Topology Manager to create database connections and physical and logical schemas to set up the overall infrastructure.
  • Designed and developed an error handling mechanism.
  • Managed user privileges using Security Manager.
  • Analyzed and designed interface pipelines in ODI.
  • Used Topology Manager to schedule scenarios using an agent.
  • Customized Knowledge Modules to adapt data extraction from the sources.
  • Developed various ODI interfaces to load data from flat file and relational sources into the Oracle data mart.
  • Implemented Change Data Capture (CDC) (see the journalizing sketch after this list).
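
A minimal sketch of what trigger-based journalizing amounts to under a simple ODI JKM; the J$ table and trigger below illustrate the concept rather than the code a JKM actually generates, and the orders table is hypothetical:

    -- Journal table capturing changed primary keys (simplified J$ layout)
    CREATE TABLE j$orders (
        jrn_flag  CHAR(1) DEFAULT 'I',   -- I = insert/update, D = delete
        jrn_date  DATE    DEFAULT SYSDATE,
        order_id  NUMBER  NOT NULL       -- PK of the journalized table
    );

    -- Trigger recording every change to the source table
    CREATE OR REPLACE TRIGGER trg_orders_cdc
    AFTER INSERT OR UPDATE OR DELETE ON orders
    FOR EACH ROW
    BEGIN
      IF DELETING THEN
        INSERT INTO j$orders (jrn_flag, order_id) VALUES ('D', :OLD.order_id);
      ELSE
        INSERT INTO j$orders (jrn_flag, order_id) VALUES ('I', :NEW.order_id);
      END IF;
    END;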

Environment: ODI 11g, Oracle 11g, flat files, Siebel, DB2, Endeca, WebLogic Server, Red Hat Linux 6.4

Confidential, LA

Sr. ODI 11g/OBIEE Consultant

Responsibilities:

  • Interacted with the business to gather requirements and design data warehouse entities.
  • Involved in Business requirements sessions and Technical Validation sessions.
  • Understood the existing business model and customer requirements and prepared technical design documents for various entities.
  • Used ODI Designer to import tables from the database, perform reverse engineering, develop projects, and release scenarios.
  • Used Knowledge Modules to meet client requirements such as connecting to a specific technology, extracting data from it, and transforming, checking, and integrating the data.
  • Created ETL mappings as well as incremental loads
  • Used ODI Operator to monitor sessions
  • Used Topology Manager to schedule scenarios using an agent.
  • Developed various ODI interfaces to load data from flat file and relational sources into the Oracle data mart.
  • The existing fact and dimension tables and ODS were migrated using ODI ETL.
  • Reviewed ODI mappings and Scheduled Jobs.
  • Worked with Hyperion data model BQYs.
  • Created ad hoc reports in OBIEE and performed data validation against the BQY reports (see the reconciliation sketch after this list).
  • Collaborated with DBA teams to tune heavy queries and optimize workflows.
  • Communicated with multiple teams to initiate new data loads.
  • Worked with the data quality team to validate the data being loaded.
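
A minimal sketch of the kind of reconciliation query used when validating loaded data against the source reports; the table and column names (src_transactions, fact_transactions, amount) are hypothetical:

    -- Compare row counts and totals between the source extract and the target fact
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(amount) AS total_amt
      FROM src_transactions
     WHERE txn_date >= TRUNC(SYSDATE) - 1
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(amount)
      FROM fact_transactions
     WHERE txn_date >= TRUNC(SYSDATE) - 1;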

Environment: ODI 11g, OBIEE 11g, Hyperion BI, Oracle, flat files, XML files, DB2, Linux 6.4

Confidential, Omaha, NE

ODI/Informatica 9.x/OBIEE 11g Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business process, dimensions, and measured facts (see the fact-load sketch after this list).
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
  • Worked extensively with Teradata load utilities such as MultiLoad, TPump, and FastExport.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
  • Handled End to End application implementation.
  • Analyzed the data sources and mappings in Informatica and developed interfaces in Oracle.
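
A minimal sketch of the star-schema fact load implied by the dimensional modeling bullet above; the staging and dimension table names and their keys are hypothetical:

    -- Resolve staging natural keys to dimension surrogate keys, then load the fact
    INSERT INTO fact_orders (date_key, customer_key, product_key, order_amount)
    SELECT d.date_key,
           c.customer_key,
           p.product_key,
           s.order_amount
      FROM stg_orders   s
      JOIN dim_date     d ON d.calendar_date = s.order_date
      JOIN dim_customer c ON c.customer_id   = s.customer_id
                         AND c.current_flag  = 'Y'
      JOIN dim_product  p ON p.product_code  = s.product_code;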

Environment: Informatica 9.x, OBIEE 11g, Oracle, flat files, SQL Server, DB2, Netezza, Teradata, shell scripting, Red Hat Linux 6.4

Confidential, Hartford

Mainframe Developer

Responsibilities:

  • Responsible for preparation of Technical Design document to meet the requirements of functional requirement documents.
  • Responsible for development of Code for application changes as required/specified by technical design documents.
  • Involved in PGS/UTC and code reviews, code walkthroughs, and unit testing using the test data prepared by Confidential.
  • Involved in knowledge acquisition/transition of the Proclaim pre-processing domain to newly inducted team members and attended the DP and weekly status meetings.
  • Assisted developers in the technical design, construction, and unit-testing phases.
  • Used File-AID to browse, edit, and view record layouts, and to reformat and compare records.
  • Coded IMS/DC functions and IMS/DC online programs; developed screens using Basic Mapping Support macros, physical and symbolic maps, and I/O operations.

Environment: COBOL, JCL, DB2, VSAM, CICS, IMS-DB, FILE-AID, PANVALET, XPEDITOR

Confidential, Atlanta, GA

Mainframes Developer

Responsibilities:

  • Enhancements to the existing system and solving new request problems by analyzing, coding and testing.
  • Review of Design documents.
  • Preparation of Test plan and Test result documents.
  • Using Panvalet for configuration management.
  • Using the CA7 scheduler for job monitoring and production support activities.
  • Involved in providing support for system and integration testing.
  • Responsible for Requirement Analysis, understanding the user requirements
  • Responsible for creation of External & Internal Designs, Test plan, Test case
  • Responsible for coding, testing and user acceptance
  • Using Xpeditor for debugging.
  • Responsible for production support for batch jobs (24x7)
  • Analysis, design and development of Application based on business requirements
  • Involved in writing test plans and test cases targeted at regression and functional testing of these applications.
  • Onsite-offshore coordination activities.

Environment: COBOL, JCL, DB2, IMS DB/DC, VSAM, FILE-AID, ENDEVOR, OS/390, XPEDITOR, IDCAMS, IEBGENER, QMF

Confidential

Mainframes Developer

Responsibilities:

  • Analyzing impacted programs.
  • Preparing design documents. Coding and testing.
  • Review of Design documents.
  • Coding as per the business logic.
  • Preparation of Test plan and Test result documents.
  • Using Endevor for configuration management.
  • Using CA7 scheduler for job monitoring and production support activities.
  • Involved in providing support for system and integration testing.
  • Using Xpeditor for debugging online and batch programs.

Environment: COBOL, JCL, DB2, VSAM, CICS, FILE-AID, ENDEVOR, OS/390, XPEDITOR, IDCAMS, IEBGENER, QMF, COMPAREX
