
Sr. Oracle / ETL Developer Resume

Indianapolis, IN

SUMMARY

  • 11+ years of extensive experience in Oracle Database Development, Data Warehousing, and Data Modeling.
  • Excellent project management, scoping, development life cycle, and implementation experience.
  • Expert knowledge of development methodologies, including Waterfall, Agile, and Scrum practices.
  • Strong in Database Design, System Architecture, and Development of Relational/Multi-Dimensional Schemas for Data Warehouses; Data Modeling, Data Migration, development, enhancement, implementation, maintenance, and support of applications developed under Oracle 11g/10g/9i/8i.
  • Worked on Oracle XML DB, parsing and loading XML feeds into Oracle 10g using DOM parsers and XMLTABLE.
  • Experience in Data Modeling (Entity Relationship), including logical, physical, and dimensional data models, using ERwin / Oracle SQL Data Modeler.
  • Expertise in developing advanced PL/SQL code through various stored procedures, functions, cursors, triggers, packages and materialized views.
  • Expertise in Oracle, SQL, PL/SQL, and UNIX shell scripting. Created Korn shell and Perl scripts on UNIX, invoked by crontab and AutoSys, to format and transfer data across platforms and databases.
  • Good experience in Data Warehousing, custom ETLs, and Business Intelligence using Crystal Reports and Crystal Enterprise.
  • Proficient in advanced Oracle PL/SQL features such as cursor variables, REF CURSORs, VARRAYs, nested tables, and Dynamic SQL.
  • Experience in query performance tuning and troubleshooting Oracle database issues, using table analysis, hints, DBMS Profiler, TKPROF, AWR, SQL Trace, Explain Plan, the BULK COLLECT feature, and FORALL statements. Used Oracle performance features such as optimizer hints, indexes, partitioning, temporary tables, and v$ views for performance analysis.
  • Expertise in Data Migration, data/table mapping, and data loading and exporting using Import/Export, SQL*Loader, and expdp/impdp with Data Pump.
  • Experience in implementing different options in Oracle like Parallel Query, Partitioning and Analytical functions.
  • Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.
  • Strong knowledge of Oracle PL/SQL Wrapper to protect PL/SQL Procedures and Packages.
  • Designed and developed OLTP and Data Warehouse systems, application schemas, and reports using PL/SQL stored procedures, ETL tools, packages, triggers, functions, global temporary tables, external tables, analytical functions, BULK COLLECT, and table functions.
  • Strong knowledge of Oracle cursor management and exception handling, with experience in PL/SQL performance tuning.
  • Experience with handling very large data files using export & import and SQL*Loader.
  • Experience in design and implementation of ETL (Extract, Transform, Load) using Informatica Cloud/PowerCenter and SAS DI Studio.
  • Expertise in writing and debugging Stored Procedures, Functions, Triggers, Views, Indexes, and Sequences.
  • Extensive hands-on experience in Oracle development: SQL, PL/SQL, and shell scripting (bash/ksh/csh).
  • Strong foundation in Documentation, Quality Process, Hardware/Software Evaluation, and Estimation. Exposure to high-level design and building efficient databases.
  • Experienced in building classic reports, interactive reports, and dynamic PL/SQL reports with enhanced performance.
  • Experienced in using JavaScript, HTML, and CSS; built standard templates to meet corporate themes for the application.
  • Strong Analytical, Organizational, Inter-personal and Communication Skills.
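The bulk-processing features cited above (BULK COLLECT with a LIMIT clause feeding FORALL) follow a standard PL/SQL pattern, sketched below. This is illustrative only: the table names `src_orders` and `stg_orders` are hypothetical, not from any engagement described in this resume.

```sql
-- Sketch: batch-move rows from a source table to a staging table.
-- src_orders / stg_orders are hypothetical, identically shaped tables.
DECLARE
  TYPE t_order_tab IS TABLE OF src_orders%ROWTYPE;
  l_orders t_order_tab;
  CURSOR c_src IS SELECT * FROM src_orders;
BEGIN
  OPEN c_src;
  LOOP
    -- Fetch in batches to cap PGA memory while still minimizing
    -- context switches between the PL/SQL and SQL engines.
    FETCH c_src BULK COLLECT INTO l_orders LIMIT 1000;
    EXIT WHEN l_orders.COUNT = 0;

    -- FORALL sends the whole batch to the SQL engine in one round trip.
    FORALL i IN 1 .. l_orders.COUNT
      INSERT INTO stg_orders VALUES l_orders(i);
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```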

TECHNICAL SKILLS

RDBMS: Oracle 12c/11g/10g/9i/8i, SQL Server 2000/2005, DB2 UDB 7.8, MS Access; Oracle APEX 3.0/3.2/4.0/4.2/5.0/5.1

Oracle Utilities/Tools: SQL*NET, IMPORT/EXPORT, SQL*LOADER, TKPROF, TOAD.

Design Tools: Toad Data Modeler, Embarcadero ER/Studio, ERwin 3.5.2/4.0, Oracle SQL Data Modeler

ETL Tools: SAS DI Studio 4.903, Informatica 9.6.1, Informatica Cloud

BI Tools: Crystal Reports 8.0/8.5/9.0/10.0, Business Objects 6.x/XI R2/R3

Application Server: Oracle 10g Application Server, Oracle9i Application Server Release 2 (9.0.2.0.1)/1.0.2.2, Apache, Windows 2000 Server / Windows 2003 Server / IIS Plus

Web servers: IBM Web Sphere, IIS, Apache Tomcat.

Operating Systems: HP-UX, Solaris 2.x, IBM AIX 5.1, Linux, Windows XP, NT, 2003/2000

Languages: SQL, PL/SQL, Perl and UNIX Shell scripting (ksh/bash/csh)

PROFESSIONAL EXPERIENCE

Confidential, Indianapolis, IN

Sr. Oracle / ETL Developer

Responsibilities:

  • Worked on HCM conversion project to pull SAP HR data to be migrated to Workday HCM Cloud. This project is currently live.
  • Responsible for migration of Contingent Worker data, using PL/SQL procedures, packages, functions, analytic functions, BULK COLLECT, views, and regular expressions to implement business logic that denormalizes tables from various source systems into staging tables and creates CSV extract files for loading into Beeline, a cloud vendor management system.
  • Wrote complex PL/SQL stored procedures and functions to create data extract files that are imported into Workday.
  • Worked on data cleansing of phone numbers, addresses, suppliers, emails, and dates in both SAP and affiliate files using regular expressions.
  • Worked with business users to understand the logic, and coded complex ad hoc PL/SQL scripts (procedures and packages) to create EIB loads for import into Workday.
  • Extensively used the advanced features of PL/SQL Collections like Records, Nested Tables, Object types and Dynamic SQL
  • Worked on different phases of data validation process with affiliates around the world and input affiliate validated data to be imported to Workday.
  • Created complex SQL procedures to create certification files that validate data between SAP and Workday
  • Worked with offshore and onshore Workday team to coordinate data files to be loaded into workday
  • Worked with data dictionary views to create scripts, check indexes, and review procedure performance.
  • Worked with Collections to reduce the overhead of cursor looping: bulk-fetched data from the SQL engine into collections, and handled errors using SAVE EXCEPTIONS and bulk exception handling.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Worked with the Lilly team to determine workers ineligible for rehire, developing dynamic SQL (EXECUTE IMMEDIATE) that uses BULK COLLECT, table functions, pipelined functions, collections, PRAGMA AUTONOMOUS_TRANSACTION, user-defined exceptions and exception handling, and implicit, explicit, and REF cursors to pull historical employee and contingent-worker data and apply logic to determine rehires.
  • Developed materialized views with fast, incremental refreshes.
  • Worked on extracting reports from Workday and validating data against SAP for all the contingent workers in delta period and created various adhoc reports to determine contract end dates, extension dates, entitlements and various other data points.
  • Created batch files to automate SQL script runs and scheduled them using Windows Scheduler.
  • Handled errors using system-defined exceptions such as NO_DATA_FOUND and user-defined exceptions declared with PRAGMA EXCEPTION_INIT.
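The bulk error handling described above (SAVE EXCEPTIONS with SQL%BULK_EXCEPTIONS) can be sketched as follows; the tables `stg_workers` and `workers` and the flag column are hypothetical placeholders, not names from the actual project:

```sql
-- Sketch: batch UPDATE that records per-row failures instead of
-- aborting the whole batch. Table/column names are assumptions.
DECLARE
  TYPE t_ids IS TABLE OF stg_workers.worker_id%TYPE;
  l_ids       t_ids;
  bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_errors, -24381);  -- ORA-24381: error(s) in array DML
BEGIN
  SELECT worker_id BULK COLLECT INTO l_ids FROM stg_workers;

  BEGIN
    FORALL i IN 1 .. l_ids.COUNT SAVE EXCEPTIONS
      UPDATE workers SET active_flag = 'N' WHERE worker_id = l_ids(i);
  EXCEPTION
    WHEN bulk_errors THEN
      -- SQL%BULK_EXCEPTIONS holds one record per failed row.
      FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
        DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                             ' failed: ORA-' || SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
      END LOOP;
  END;
END;
/
```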

Confidential, Richmond, VA

Sr. Oracle /ETL Developer

Responsibilities:

  • Responsible for gathering requirements and report needs from business users for the initial sprints.
  • Worked on SQL scripts that pull PeopleSoft CRM data into CSV files, which act as a source for SAS Data Integration Studio.
  • Involved in planning and modeling the data migration from the PeopleSoft CRM legacy system to the Oracle Cloud Data Warehouse, and in creating Dimensions and Facts in the Data Mart for OLAP analytics.
  • Creating SAS DI studio ETL Workflow Mappings that map Data hub (DWH) to Datamarts in Oracle Cloud and deploying SAS DI jobs.
  • Creating SAS Macros to automate insert and update tables based on metadata information.
  • Developing Oracle objects such as Tables, Views, Indexes, Stored Procedures and Functions in PL/SQL, Packages in PL/SQL, Materialized Views, and Dynamic SQL.
  • Creating complex Stored Procedures to check Referential Integrity and Data Quality.
  • Developing and unit testing PL/SQL scripts involved in moving data from PeopleSoft Legacy data to Oracle Cloud Data Warehouse.
  • Wrote PL/SQL Database triggers to implement the business rules in the application.
  • Understanding key concepts of underwriting such as written premium, earned premium, unearned premium reserve to develop logic for new underwriting datamart.
  • Create SAS DI studio ETL Workflow Mappings that map PeopleSoft data and other flat files into Oracle Exadata Cloud database.
  • Worked as part of the Oracle GoldenGate change data capture process for daily delta loads.
  • Developed data lineage that maps source to target database, including transformations; generated data counts for the mapped fields; and updated the database with the latest mappings.
  • Reconciled gross premium amounts and claim payment amounts, covering both the policy holders and the claimants on the policy, using PL/SQL views, cursors, procedures, and packages.
  • Involved in Database Application Development, Query Optimization, Performance Tuning and DBA solutions and implementation experience in complete System Development Life Cycle.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
  • Carried out User Acceptance Testing (UAT), go-live deployment activities, and post-implementation support.
  • ETL development using PL/SQL: analysis of the existing database, bulk loading, and data partitioning to load data into the Oracle database.
  • Extensively used the advanced features of PL/SQL Collections like Records, Nested Tables, Object types and Dynamic SQL
  • Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.
  • Involved in developing UNIX shell scripts for Loading Database Tables.
  • Worked in JIRA to write epics and user stories and to complete sprints using the Scrum methodology.
  • Worked on creating documentation for Data Migration from PeopleSoft to Oracle Cloud DWH.
  • Worked on creating adhoc reports using Microstrategy Reporting.
  • Writing scripts for Data Staging, Loading and cleanup process.
  • Worked with the Germany team and organized daily meetings for data modeling of the Data Warehouse tables.
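The business-rule triggers and underwriting concepts mentioned above (written, earned, and unearned premium) can be illustrated with a minimal sketch; the `policy` table, its columns, and the rules themselves are hypothetical examples, not the client's actual schema or logic:

```sql
-- Sketch: a BEFORE trigger enforcing simple underwriting rules.
-- Table and column names (policy, written_premium, unearned_premium,
-- policy_id) are assumptions for illustration only.
CREATE OR REPLACE TRIGGER trg_policy_premium_check
BEFORE INSERT OR UPDATE OF written_premium ON policy
FOR EACH ROW
BEGIN
  -- Example rule: written premium may not be negative.
  IF :NEW.written_premium < 0 THEN
    RAISE_APPLICATION_ERROR(-20001,
      'Written premium cannot be negative for policy ' || :NEW.policy_id);
  END IF;

  -- Example rule: at issue, the unearned premium reserve defaults to
  -- the full written premium (nothing has been earned yet).
  IF INSERTING AND :NEW.unearned_premium IS NULL THEN
    :NEW.unearned_premium := :NEW.written_premium;
  END IF;
END;
/
```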

Environment: Oracle 12c, SAS 9.3, SAS Data Integration Studio, MicroStrategy Reporting, ADAPT, UNIX Shell Scripting, Toad for Oracle, JIRA, PuTTY

Confidential, Richmond, VA

Sr. Oracle/ETL Developer

Responsibilities:

  • Worked on a data migration project in which data from an Oracle legacy database was migrated to Salesforce.
  • Creating PL/SQL Scripts to stage required data from several tables into a denormalized staging area.
  • Responsible for creating adhoc scripts that stage data from the staging tables into view area as required for Salesforce tables.
  • Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Involved in the continuous enhancements and fixing of production issues.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools such as Toad, PL/SQL Developer, and SQL*Plus.
  • Handled errors using Exception Handling for the ease of debugging and displaying the error messages in the application.
  • Performed performance tuning of complex queries and procedures by analyzing run-time statistics generated with utilities such as Explain Plan, SQL Trace, TKPROF, and AWR reports.
  • Worked with DBMS_SCHEDULER to execute stored procedures at a repeat interval to calculate business logic over the data.
  • Working on the ETL side of the process to load data into database from different servers.
  • Worked with SQL*Loader to load data from flat files received daily from various facilities. Used standard packages such as UTL_FILE and DBMS_SQL, and used bulk binding in writing database procedures, functions, and packages for the front-end module.
  • Experienced and skilled Agile developer with a strong record of teamwork and successful coding-project delivery; specializes in problem identification and proposing alternative solutions.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Responsible for creating views that act as source for Informatica Cloud Mappings which are mapped to Salesforce Cloud
  • Responsible for scripts that reconcile data from source to target.
  • Responsible for creating documentation for data migration
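The DBMS_SCHEDULER usage described above (a stored procedure run at a repeat interval) can be sketched as follows; the job name and the procedure `refresh_business_calcs` are hypothetical stand-ins for the actual objects:

```sql
-- Sketch: schedule a stored procedure to run hourly.
-- Job name and procedure name are illustrative assumptions.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'RECALC_BUSINESS_LOGIC',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'refresh_business_calcs',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=HOURLY; INTERVAL=1',  -- calendaring syntax: every hour
    enabled         => TRUE);
END;
/
```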

Environment: Oracle 10g/11g, Oracle Enterprise Manager, SQL*Loader, SQL*Plus, PL/SQL Developer, UNIX Shell Scripting, Toad for Oracle
