
Senior ETL Architect/Data Modeler/Developer Resume

SUMMARY

  • 10+ years of experience as an application, ETL, database, Business Intelligence, and data warehouse developer/data modeler on various ETL/BI products, and around 4+ years as a Data Architect.
  • Co-author of the ODI blog ODIExperts.com and developer of the open-source ODI drivers ODI JDBC Excel, ODI JDBC Access, ODI Parallel, and ODI HTML Email.
  • Expert in the installation, configuration, and development of Oracle Data Integrator (ODI) objects, and expert in the ODI SDK, Java, and other programming languages.
  • Expert in developing ODI Knowledge Modules beyond the out-of-the-box set, across multiple versions of ODI.
  • Extensive experience and expertise in data modeling and architecture. Designed and architected various scalable OLAP and OLTP systems, ranging from an XML data model to a 3-billion-row dimensional model.
  • Extensive experience with database and non-database technologies such as Oracle (including Exadata), SQL Server, DB2, Informix, MySQL, JSON, XML, and REST web services.
  • Extensive experience with Big Data tools and languages such as Hadoop, Hive, Pig, Impala, and HBase, including a few group projects.
  • Experience with other BI tools such as Informatica, DataStage, SSIS, OBIEE & RPD, OWB, and others.
  • Expert in RDBMS concepts, with experience in creating, maintaining, and tuning databases and database objects, and in database architecture, process flow diagrams, data flow diagrams, and ER diagrams.
  • Extensive database development experience using SQL (DML, DDL, DCL, TCL) and PL/SQL, including cursors, ref cursors, bulk-processing techniques, collections (nested tables), partitioned tables, anonymous blocks, stored procedures, functions, triggers, packages, materialized views, dimensions, cubes, bitmap indexes, index-organized tables (IOT), external tables, and dynamic SQL; performed query optimization and database performance tuning.

TECHNICAL SKILLS

RDBMS: Oracle 12c, 11g, 10g, 9i, 8i, SQL Server 2000/2005/2008, Informix, DB2, MySQL, Teradata, PostgreSQL

Data Modeling Tools: Visio, ER-studio, Oracle Data Modeler

Oracle/SW Tools: Oracle Data Integrator 12c, 11g, 10g, Informatica PowerCenter 4.8, 6.2, Hyperion EPM 11 Suite, OBIEE, SAP BI, FI, MM, SSIS, and more

Languages: SQL, PL/SQL, T-SQL, Shell Scripting, ODI API, Java, Python, Groovy, Hive, Pig, and more

PROFESSIONAL EXPERIENCE

Confidential

Senior ETL Architect/Data Modeler/Developer

Responsibilities:

  • The client maintains Tower Service Tickets in a web-service tool that sends data in JSON format; the feed is processed cyclically once a token is passed. ODI failed to read the web service directly because of an authentication issue. To resolve it, custom Groovy code passes the token and authentication with each request and stores the returned JSON in an Oracle CLOB column; the Oracle JSON parser then parses the data and loads it into Oracle tables.
  • The client maintains CRM data in Salesforce, and ODI does not support reading data directly from Salesforce. To resolve this, a custom Technology was created to reverse-engineer the tables: the Jython RKM was modified to read the table metadata and extract the data into Oracle, and a custom IKM was created to insert back into Salesforce.
  • Created an LDAP connection to extract Users and Groups data from Active Directory environments for the US and global markets. The data is then processed with SCD Type 2 logic, which OBIEE reads to determine users' report permissions.
  • The client maintains a custom application that extracts data from multiple source systems for auditing equipment and towers. The resulting information is processed and stored in an Oracle CLOB column containing, on average, 700-1,500 lines of simple to multiply grouped hierarchical XML, which is parsed and populated into a dimensional model.
  • An extension of the above application was created to re-audit the inspected data and filter out data-quality issues. This complex extension handles and coordinates data from the various audits for weekly and auditor-to-auditor comparison.
  • The client purchased a cloud project-management solution for internal processes. The application stores its data in flat structures in Oracle tables; designed the process that reads the flat data into a dimensional model and loads it into Oracle on a 10-15 minute cycle.
  • Upgraded the ODI repository from ODI 11 to ODI 12 during the initial part of the project, working with the administration team to help and guide them through the upgrade process.
  • The client recently upgraded the Oracle BIAPPS environment to R12, which also required moving the tool from Informatica to ODI 11; custom jobs were created to handle the Informatica-to-ODI 11 migration.
  • Implemented many smaller projects and complex ETL functionalities as well, including SSIS-to-ODI conversion, SharePoint data extraction, KM creation for the logic above, and data modeling of transactional and star-schema designs.
  • In this position I played the roles of architect, data modeler, and developer; all development other than BIAPPS used ODI 12c.
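The token-authenticated JSON extraction described above can be sketched roughly as follows. This is an illustrative Python version of the pattern (the actual implementation was Groovy inside ODI, staging the raw payload in an Oracle CLOB before parsing); the endpoint, header format, and field names are hypothetical.

```python
import json
import urllib.request

def fetch_tickets(base_url, token):
    """Call the ticket web service with an auth token and return the raw
    JSON text. The 'Authorization' header format is an assumption; in the
    real flow this raw payload was staged in an Oracle CLOB column."""
    req = urllib.request.Request(
        base_url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

def flatten_ticket(ticket):
    """Flatten one parsed JSON ticket into a row dict ready for a
    relational insert (the role the Oracle JSON parser played)."""
    return {
        "ticket_id": ticket.get("id"),
        "tower": ticket.get("tower"),
        "status": ticket.get("status"),
    }

# Parsing a staged payload into a relational row:
raw = '{"id": 1, "tower": "T-100", "status": "OPEN"}'
row = flatten_ticket(json.loads(raw))
```

The two-step shape (stage raw JSON first, parse second) keeps the original payload available for replay when the parsing logic changes.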

Confidential

Senior ETL (ODI) Architect/Data Modeler/Developer

Responsibilities:

  • Helped create a data warehouse design that uses Exadata's power and performance to extract and load data, with support for international characters, from various AS400 systems across the globe, along with an optimized partitioning strategy to support global reporting.
  • Global Orders is one of the client's most important ETL processes, but the existing ETL had complex steps that took many hours and, at month end, sometimes close to a day. Redeveloped the complete order job with fewer interface steps, reducing execution time from 2 hours per million rows to 20 minutes per million.
  • The client supports a hierarchical marketing structure in Customer Support that includes bonuses for the upper part of the chain. Created a multi-merge loop process that handles recursive looping over 100 million rows of hierarchical data with a load time of 30-50 minutes, where CONNECT BY LEVEL or any other recursive process would have taken several hours.
  • Developed a master KM that handles multiple requirements in one (IKM merge / incremental update / multi-insert / exchange partition) with support for further logic. Also developed a PL/SQL package that handles exchange partitioning for any partition or subpartition combination and can be called independently or through the master KM.
  • Created a PL/SQL package that handles SCD Type 2 requirements for each unique column present in a table, providing detailed column-level history.
  • Created a Java validation program that validates data between the source AS400 and target Oracle systems, reducing validation hours and catching unmatched rows and data issues with 99.9% accuracy.
  • Automated some of the repetitive data-extract processes through the ODI SDK libraries, drastically reducing ETL development time and cost.
  • In addition to the tasks above, developed hundreds of interfaces for the daily ETL logic and client requirements. Data for the ODS is sourced from AS400, Oracle, and SQL Server through ODI 11. In this role I served as architect, data modeler, and ETL developer.
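The column-level SCD Type 2 idea above can be sketched like this. This is a minimal Python stand-in for the comparison logic (the original was a PL/SQL package); the table columns and date handling are hypothetical.

```python
def scd2_changes(current_row, incoming_row, tracked_cols, as_of):
    """Compare an incoming row to the current active row and emit one
    history record per tracked column whose value changed, i.e. SCD
    Type 2 at column granularity rather than whole-row granularity."""
    changes = []
    for col in tracked_cols:
        old, new = current_row.get(col), incoming_row.get(col)
        if old != new:
            changes.append({
                "column": col,
                "old_value": old,
                "new_value": new,
                "effective_from": as_of,  # new version starts here
            })
    return changes

# Only the changed column gets a history record; unchanged ones do not.
current = {"cust_id": 7, "city": "Atlanta", "tier": "GOLD"}
incoming = {"cust_id": 7, "city": "Macon", "tier": "GOLD"}
rows = scd2_changes(current, incoming, ["city", "tier"], "2015-01-01")
```

Tracking per-column versions yields finer history than row-level SCD Type 2, at the cost of more history rows to manage.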

Confidential

Senior ETL (ODI) Developer/Analyst

Responsibilities:

  • Developed and architected a complete data warehouse to identify and track students across all universities in the State of Georgia.
  • Created a master scenario and error-capturing mechanism using the Sunopsis API and Java BeanShell; this scenario is called by all other scenarios so that any kind of ODI error is captured.
  • Data is fetched from various Oracle transaction systems, processed through various ETL logic, and populated into the facts and dimensions of the Oracle data warehouse.
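The master error-capturing pattern above can be sketched roughly as one wrapper that every child step runs through, so any failure is recorded centrally. The actual mechanism used the Sunopsis API and Java BeanShell inside ODI; the names here are illustrative.

```python
error_log = []

def run_with_capture(step_name, step_fn, *args):
    """Run one ETL step; on failure, record the step name and error
    centrally (the role the master scenario plays for child scenarios)
    instead of letting the error vanish with the failed step."""
    try:
        return step_fn(*args)
    except Exception as exc:
        error_log.append({"step": step_name, "error": str(exc)})
        return None

def load_dim(rows):
    # Hypothetical child step: fails on an empty extract.
    if not rows:
        raise ValueError("empty source extract")
    return len(rows)

run_with_capture("LOAD_DIM_STUDENT", load_dim, [])
```

Routing every scenario through one capture point gives a single place to query for failures across the whole load.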

Confidential

Senior ETL (ODI) Developer/Analyst

Responsibilities:

  • Sourced data from JD Edwards and Oracle systems and loaded it into a hub for temporary storage, applying the required transformation and validation.
  • Set up Topology and user profiles.
  • Created the interfaces for the required data loads and modified KMs to support the PARTITION BY clause and no-grouping loads.
  • Packaged the above interfaces and added support for logging failed ODI objects to a file, later retrieving them to handle errored interfaces without going into Operator.
  • Wrote Jython scripts to capture all errors to a file by retrieving the information from the SNP_CHECK_TAB table.
  • Created wrapper scenarios that run the other interface scenarios and catalog errors and failures into their respective files; the wrapper can be called for any interface scenario.
  • Wrote ODI scheduler scripts triggered by the external scheduler UC4, and supported them in Development, UAT, and Production.
  • Transferred scenarios from Development to UAT and on to Production.
  • Performed detailed knowledge transfer to the client through verbal communication and written documentation.
  • Sourced data from AS400 systems and loaded it into Oracle targets.
  • Looped through each country's AS400 source, loaded the data into the target, and automated the whole flow as a single scenario.
  • Gained experience with inventory-optimization supply-chain data and its design and processing.
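The per-country looping with error cataloging described above can be sketched as follows. This is a Python stand-in for what was done with ODI packages and Jython; the country codes are hypothetical, and a simple in-memory error list replaces the SNP_CHECK_TAB-driven file.

```python
def load_country(country, extract_fn, errors):
    """Extract one country's AS400 data and 'load' it; a failure is
    cataloged against that country rather than stopping the whole run."""
    try:
        rows = extract_fn(country)
        return country, len(rows)
    except Exception as exc:
        errors.append(f"{country}: {exc}")
        return country, 0

def run_all(countries, extract_fn):
    """Loop through every country source and return per-country row
    counts plus the catalog of failures."""
    errors = []
    results = dict(load_country(c, extract_fn, errors) for c in countries)
    return results, errors

def fake_extract(country):
    # Hypothetical stand-in for the per-country AS400 extract.
    data = {"US": [1, 2, 3], "DE": [4]}
    if country not in data:
        raise ValueError(f"no AS400 source for {country}")
    return data[country]

results, errors = run_all(["US", "DE", "XX"], fake_extract)
```

Cataloging failures instead of aborting lets the single combined scenario finish the remaining countries and report all problems at the end.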

Confidential

Senior ODI - ETL Specialist

Responsibilities:

  • Fetched data from the ODS (Operational Data Store) into forecasting application tables based on simple and complex transformations and logic.
  • Designed and developed the logic and the related interfaces and packages based on the requirements.
  • Carried out parallel design and development of interfaces, based on the data source, across multiple work repositories (Development).
  • Provided support in the Test and Production environments.

Confidential

ETL Developer

Responsibilities:

  • Migrated Informatica mappings into ODI interfaces and packages that update the Oracle target.
  • Integrated ODI services with a Service-Oriented Architecture (SOA).
  • Integrated the ETL process flow from ODI into the Oracle Business Activity Monitoring (BAM) tool.
  • Created ODI interfaces of various complexities using lookups, conditions, and constraints.
  • Created around 300 interfaces and packages of various complexities.
  • Loaded data from multiple flat-file, PeopleSoft, and SAP sources into Oracle targets.
  • Optimized existing Informatica process flows by redefining the same data transformations in fewer steps using ODI.
  • Performed row-by-row data matching between the Informatica and ODI outputs for testing.
  • Integrated Unix scripts for pre- and post-load processing with the BPEL process and retrieved the logs in BAM.
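The row-by-row Informatica-versus-ODI matching above amounts to a keyed diff of two result sets. A minimal Python sketch of that idea, not the actual test harness; the key and column names are hypothetical.

```python
def diff_rows(informatica_rows, odi_rows, key):
    """Compare two result sets row by row on a key column and report
    keys missing on either side plus keys whose rows differ in value."""
    a = {r[key]: r for r in informatica_rows}
    b = {r[key]: r for r in odi_rows}
    return {
        "missing_in_odi": sorted(a.keys() - b.keys()),
        "missing_in_infa": sorted(b.keys() - a.keys()),
        "mismatched": sorted(k for k in a.keys() & b.keys() if a[k] != b[k]),
    }

# Hypothetical outputs from the two tools for the same mapping:
infa = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
odi = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}, {"id": 3, "amt": 30}]
report = diff_rows(infa, odi, "id")
```

An empty report on all three buckets is the signal that the migrated ODI flow reproduces the Informatica output exactly.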
