Data Engineer Resume
SUMMARY
- More than 14 years of IT experience in planning, designing, developing, testing, and integrating large-scale data warehouse applications, with a focus on the core banking domain.
- 8+ years of strong experience in Unix shell scripting and Oracle (PL/SQL and performance tuning).
- 6+ years of strong experience in the design and development of ETL in ODI (Oracle Data Integrator).
- Around 4 years of experience with AWS services (Redshift, S3, EC2, Redshift Spectrum).
- Experienced in ODI 10g, 11g, and 12c.
- Experience with the Netezza data warehouse.
- Strong domain knowledge in retail and core banking financial services.
- Experience in managing complex production and development environments.
- Strong skills in tools and technologies relevant to enterprise data management, including: conceptual, logical, and physical data modeling (3NF and dimensional), physical database design, data integration, data mapping and lineage maintenance, data stewardship, metadata maintenance, performance tuning/volume customization, and capacity estimation.
- Expertise includes analysis, design, development, and testing of data warehousing applications using Oracle Data Integrator (ODI) (Topology Manager, Security Manager, Designer, Knowledge Modules, and Operator).
- Analyzed business requirements for Oracle Data Integrator, mapped the architecture, and used ODI reverse engineering to retrieve metadata from data stores and load it into the repository.
- Used Oracle Data Integrator (ODI) Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
- Worked on slowly changing dimensions (SCD) and change data capture (CDC) as part of data warehousing concepts.
- Expertise in creating stored procedures, functions, packages, collections, and bulk collect operations using PL/SQL (see the sketch at the end of this section).
- Extensively involved in SQL tuning to enhance the performance of complex queries for management reports and other business logic.
- Thorough knowledge of Software Development Life Cycle (SDLC) with deep understanding of various phases like Requirements gathering, Analysis, Design, Development and Testing.
- Performed Unit, Functional, Regression, Integration, Database and User Acceptance Testing (UAT).
- Experience in writing technical design documents and developing system and unit test documents.
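The following is a minimal PL/SQL sketch of the bulk collect pattern referenced above. The procedure and table names (load_dim_customer, stg_customers, dim_customer) are hypothetical placeholders, not objects from an actual engagement.

    -- Sketch only: stg_customers and dim_customer are assumed to share the same columns.
    CREATE OR REPLACE PROCEDURE load_dim_customer IS
      TYPE t_cust_tab IS TABLE OF stg_customers%ROWTYPE;
      l_rows  t_cust_tab;
      CURSOR c_src IS SELECT * FROM stg_customers;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in batches of 1000 rows
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT                     -- one bulk-bound insert per batch
          INSERT INTO dim_customer VALUES l_rows(i);
        COMMIT;
      END LOOP;
      CLOSE c_src;
    END load_dim_customer;
    /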
TECHNICAL SKILLS
Operating Systems: Linux/Unix/AIX, Windows
Programming Languages: PL/SQL, Unix Shell scripts, AWK & COBOL
Database: Oracle 9i and 11g, Redshift, Netezza, Essbase cubes
Database modeling tools: ER Studio, Erwin Data Modeler
ETL Tools: Oracle Data Integrator 10G/11G/12C
Reporting Tools: OBIEE, MicroStrategy
Version Control Tools: Subversion
Scheduling Tools: Unix cron/crontab & Control-M
Other Tools: PL/SQL Developer, SQL Developer, TOAD
PROFESSIONAL EXPERIENCE
Confidential
Data Engineer
Responsibilities:
- Design, develop, implement, and enhance ODI real-time and batch jobs.
- Customize knowledge modules to improve performance.
- Create SQL and PL/SQL scripts for sourcing data, including creating tables, materialized views, and stored procedures, and loading data into the tables.
- Implement slowly changing dimensions (SCD) and change data capture (CDC) using Oracle Data Integrator (see the SCD sketch after this entry).
Environment: ODI 12.2.1, Oracle Database, Erwin Data Modeler, Control-M
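A minimal hand-written SQL sketch of the SCD Type 2 logic referenced in this entry; in ODI this logic would normally be generated by a slowly changing dimension knowledge module rather than coded by hand, and the stg_account/dim_account tables and tracked columns below are hypothetical.

    -- Step 1: close out current rows whose tracked attributes have changed.
    UPDATE dim_account d
       SET d.current_flag = 'N',
           d.effective_to = SYSDATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_account s
                    WHERE s.account_id = d.account_id
                      AND (s.branch_code <> d.branch_code   -- tracked attributes
                           OR s.status <> d.status));

    -- Step 2: insert a new current row for new accounts and for the changed ones closed above.
    INSERT INTO dim_account (account_id, branch_code, status,
                             effective_from, effective_to, current_flag)
    SELECT s.account_id, s.branch_code, s.status, SYSDATE, NULL, 'Y'
      FROM stg_account s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_account d
                        WHERE d.account_id = s.account_id
                          AND d.current_flag = 'Y');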
Confidential
Sr. ODI Developer / Data Modeler
Responsibilities:
- Design, develop, implement, and enhance real-time synchronization between Salesforce.com, OMB (Order Maintenance and Billing), JDE, and BigMachines using ODI.
- Create and configure ODI master, work, and execution repositories.
- Create and configure the standalone ODI agent as a Windows service using the YAJSW method.
- Migrate ETL mappings and procedures across environments, from development to UAT/SIT and eventually to production, and build migration documents (transport documents).
- Install and configure the Oracle Data Integrator (ODI) standalone edition.
- Create SQL and PL/SQL scripts for sourcing data, including creating tables, materialized views, and stored procedures, and loading data into the tables (see the materialized view sketch after this entry).
- Develop, implement, and enhance the data warehouse data model and perform data loads using Oracle Data Integrator.
- Implement slowly changing dimensions (SCD) and change data capture (CDC) using Oracle Data Integrator.
Environment: ODI 12.1.3, Oracle 12c, Salesforce.com, Jira, JDE, Erwin Data Modeler, Redshift
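A minimal sketch of the kind of materialized view referenced in this entry; the mv_daily_sales view and the underlying sales_fact and dim_date tables are hypothetical placeholders.

    -- Precompute a daily sales aggregate for reporting queries.
    CREATE MATERIALIZED VIEW mv_daily_sales
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT d.calendar_date,
           SUM(f.net_amount) AS total_sales,
           COUNT(*)          AS order_count
      FROM sales_fact f
      JOIN dim_date   d ON d.date_key = f.date_key
     GROUP BY d.calendar_date;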
Confidential
Redshift Data modeler/Data Mart developer
Responsibilities:
- Developing, implementing, enhancing, and evolving the overall Enterprise Data Warehouse (EDW) BI solutions, reports/dashboards, semantic layer, and supporting technologies (including SQL view creation).
- Accountable for developing, documenting, and communicating BI best practices, naming conventions, promotion processes, and solution patterns.
- Validate data mappings and data model changes; work with production support, ETL, and DBA teams to create DDL scripts and facilitate their testing and deployment.
- Investigate data reconciliation and data processing issues and drive them to closure.
- Build a subject-area-independent, target-state data warehouse using our standard BI platform (MicroStrategy).
- Worked on the database migration from Netezza to AWS Redshift. Worked closely with the vendor on the plan and scripts needed to migrate the data.
- Extensively used S3 and Glacier to archive data from Redshift (see the Spectrum/UNLOAD sketch after this entry).
- Extensively used Redshift Spectrum to read data from the S3 data lake and combine it with Redshift data.
- Accountable for driving new applications within the BI landscape, including the functionality and/or performance of EDW BI analytics/reports.
- Provide input to, and serve as the SME for, new enhancements, upgrades, and capacity planning related to the BI platform.
- Size and scope future projects and enhancements to our EDW solution.
- Work directly with our internal database design, data integration, and solution support teams to address critical incidents, as well as our vendors (both service and software) to improve the stability and performance of our EDW.
- Ensure that promoted BI objects (reports, metadata, database views, etc.) are in keeping with the specified BI design standards.
- Create tools and utilities to manage all environments and automate data model deployments.
- Mentor the team in data modeling techniques and domain understanding.
Environment: AWS, Redshift, S3, EC2, Redshift Spectrum, Glue, Netezza, Erwin Data Modeler, Informatica, MicroStrategy, Oracle 11g, HP Quality Center, SAP, Hyperion
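A minimal sketch of the Redshift Spectrum and S3 archiving patterns referenced in this entry. The schema, Glue catalog database, IAM role, bucket, and table names are placeholders, and the external table is assumed to already be defined in the data catalog.

    -- Register an external schema backed by the Glue data catalog.
    CREATE EXTERNAL SCHEMA spectrum_sales
    FROM DATA CATALOG
    DATABASE 'sales_lake'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS;

    -- Combine S3 data-lake history (external table) with local Redshift data.
    SELECT r.region_name, SUM(s.amount) AS total_amount
      FROM spectrum_sales.sales_history s
      JOIN dim_region r ON r.region_id = s.region_id
     GROUP BY r.region_name;

    -- Archive aged rows from Redshift back to S3 before purging them.
    UNLOAD ('SELECT * FROM sales_fact WHERE sale_date < ''2015-01-01''')
    TO 's3://example-archive-bucket/sales_fact/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
    GZIP ALLOWOVERWRITE;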
Confidential, Irving, TX
Sr. ODI Developer
Responsibilities:
- Create and deploy ETL mappings and develop PL/SQL procedures to execute complex data transformations.
- Develop and deploy ETL logic with Oracle Data Integrator.
- Modified existing knowledge modules to meet client requirements.
- Involved in migrating ETL mappings and procedures across environments, from development to UAT/SIT and eventually to production, and in building migration documents (transport documents).
- Extensive use of Oracle analytic functions to meet customer requirements (see the analytic-function sketch after this entry).
- Expertise in configuring Oracle Data Integrator.
- Assisted in writing custom Essbase calculations scripts, MDX scripts.
- Provided technical support in the design and development of numerous Essbase cubes and various OLAP Application Models using Hyperion Essbase.
- Experience in loading Essbase metadata with Oracle Data Integrator.
Environment: Oracle Data Integrator (ODI) 11g, Oracle 11g, SQL*Plus, Essbase cubes and AIX.
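A minimal sketch of the Oracle analytic functions referenced in this entry; the account_txns table and its columns are hypothetical.

    -- Running balance and recency rank per account without collapsing rows.
    SELECT account_id,
           txn_date,
           txn_amount,
           SUM(txn_amount) OVER (PARTITION BY account_id
                                 ORDER BY txn_date)       AS running_balance,
           ROW_NUMBER()    OVER (PARTITION BY account_id
                                 ORDER BY txn_date DESC)  AS recency_rank
      FROM account_txns;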
Confidential
Sr. ODI Developer
Responsibilities:
- Primary responsibilities include installing and configuring the Oracle Data Integrator (ODI) software in a three-tier environment, performing periodic upgrades, performing source-to-target mappings, storage capacity planning, and developing ETL.
- Experience in running RCC (repository consistency check) to verify that the master and work repositories are consistent.
- Experience with mappings, a concept newly introduced in ODI 12c; created mappings joining multiple sources and loading multiple tables using IN and OUT connectors.
- Managed the ODI development team in an onsite/offshore delivery model.
- Gathered requirements from users to develop interfaces, packages, and load plans.
- Resolved day-to-day production issues.
- Performed calculation of delta (incremental) data through UNIX scripts.
- Increased load performance by applying appropriate hints and indexes, analyzing master tables, and reviewing explain plans to check cost.
- Set up crontab entries on UNIX servers for scheduling jobs.
- Used file system (DSV) files as sources and loaded the data into Oracle target tables.
- Used the Oracle loader utility to load data from files into tables via the external table concept (see the external-table sketch at the end of this entry).
- Experience in dimensional modeling, including star and snowflake schemas and creating facts, dimensions, and measures.
- Created SQL scripts for sourcing data, including creating tables, materialized views, and stored procedures, and loading data into the tables.
- Used ODI Designer to import tables from the database through reverse engineering, develop projects, and release scenarios.
- Used user functions to convert date formats from AS400 to Oracle.
- Implemented Synchronous and Asynchronous Change Data Capture (CDC) techniques.
- Used knowledge modules to meet client requirements such as connecting to a specific technology, extracting data from it, and transforming, checking, and integrating the data.
- Experience with business systems such as Sales, Inventory, and Order Management.
- Used interfaces to load data from flat files and CSV files into the Oracle staging area and then into the Oracle data warehouse.
- Analyzed business requirements for Oracle Data Integrator, mapped the architecture, and used ODI reverse engineering to retrieve metadata from data storage and load it into the repository.
- Used the odiparams.bat file to update the encoded master repository password and the startscen.bat file to start scenarios through the agent; these scenarios were then deployed to production.
- Applied slowly changing dimensions (SCD) in various mappings to load data from source to target and maintained current and historical data year-wise in all brand dimensions.
- Analyzed session log files in the Operator navigator to resolve mapping errors and managed session configuration.
Environment: Oracle Data Integrator (ODI) 11g, Oracle 11g, SQL*Plus, TOAD, and AIX.
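A minimal sketch of the external-table file load referenced in the entry above; the directory object, file name, and table definitions are placeholders for illustration only.

    -- Expose a pipe-delimited (DSV) file as a queryable external table.
    CREATE OR REPLACE DIRECTORY src_files AS '/data/incoming';

    CREATE TABLE ext_orders (
      order_id    NUMBER,
      customer_id NUMBER,
      order_date  VARCHAR2(10),
      amount      NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY src_files
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('orders.dsv')
    )
    REJECT LIMIT UNLIMITED;

    -- Load from the file into the Oracle staging table.
    INSERT INTO orders_stg (order_id, customer_id, order_date, amount)
    SELECT order_id, customer_id, TO_DATE(order_date, 'YYYY-MM-DD'), amount
      FROM ext_orders;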