
Senior Oracle Data Warehousing Data Architect Resume Profile


SUMMARY

  • A senior IT consultant and Oracle Certified Associate with over 12 years of experience leading small teams and performing systems analysis, design and development of various Client/Server and n-tier projects for large investment banking firms, using state-of-the-art technologies such as Oracle SQL and PL/SQL. Strong experience, knowledge and ability to learn and adapt to the rapidly changing finance domain.
  • Undergoing Training at American Technical Institute for PMP Certification.
  • Strong query tuning experience with large (500+ GB) production systems in financial application development with banking databases.
  • Hands-on working experience in leading market products such as Charles River and Eagle PACE.
  • Sound knowledge of programming languages such as C, C++ and Java in the context of building database connectivity and troubleshooting production issues.
  • Implementation experience in Capital Market investment tools such as Equity, Fixed Income, Money Market and Derivatives Interest Rate Swaps module for an investment banking client.
  • Sound understanding of Compliance area in Investment Banking domain through project experience.
  • Experienced in performance tuning and optimization using Explain Plan, Trace Plans, hints, Toad, Enterprise Manager and Grid control.
  • Proficient working experience in SQL and PL/SQL (client- and server-side) Procedures, Functions, Packages, Objects and Triggers for application development.
  • Expert experience in developing generic API programs for database maintenance operations such as Partitioning, Index Rebuilds and Analyze/Gather Stats on objects.
  • Strong experience in Oracle 11g RAC environments and sound working experience on Business Intelligence BI using Business Objects.
  • Strong experience in Database Architecture standardization
  • Good knowledge of Oracle ETL features such as Merge Statement, SQL Loader, External Tables, Pipelined/Table functions and Multi Table Inserts.
  • Strong working experience with CLOB and XML data types for automating the load of manual user feeds, and with Materialized Views.
  • Implemented RDBMS concepts and Oracle 10g OLAP features in building the Zurich/KAM/CTS data mart and ODS Oracle data source, and enhanced the Client Data Warehouse (CDW) coupled with Business Intelligence.
  • Freelance performance tuning consulting for the Credit Suisse PCDW Data Warehouse project.
  • Experience in UNIX Shell Scripts and Perl DB routines for database connectivity.
  • Strong experience in implementing Change Data Capture methodologies in Data Warehousing.
  • Sound experience in database capacity planning, providing database and operating system space estimates to DBA and other infrastructure teams.
  • Strong experience in database performance tuning and root cause analysis of performance issues.

PROFESSIONAL EXPERIENCE

Confidential

  • Worked on Database Performance Troubleshooting issues
  • Design, develop and maintain an ETL framework with atomic jobs for loading flat-file data from multiple sources into Staging, ODS and Fact layers using Informatica PowerCenter.
  • Perform performance benchmarking of existing legacy processes.
  • Build dynamic Stored Procedure interfaces to source the data to Business Objects
  • Build generic PL/SQL interfaces for Data Cleansing, Validation and Transformations.
  • Implement Database Deployment infrastructure including Release Shell Scripts, Batch scripts, Build All and Upgrade SQL Scripts along with maintainable SVN folder structure to store Database Objects, DML scripts and Release scripts mentioned above.
  • Develop generic Workflow Data Model used across all Global Compliance Practice areas.
  • Develop Conceptual and Logical Data Model for Compliance Risk Assessment Practice area which includes Subject Areas such as Risk Assessment Units and Survey Questionnaires
  • Involved in Master Data Management effort to understand inconsistencies, redundancies and gaps in existing Reference data.
  • Estimate, design and develop an offline data migration strategy from a legacy SQL Server database to Oracle, and develop a customized ETL framework to enable it using tools such as BCP, SQL*Loader and Oracle PL/SQL.
  • Data modelling for bitemporal versioning (see the sketch after this list).
  • As part of Global Compliance Risk Management, involved in building a strategic reporting and tracking platform for different practice areas in Operational Compliance Risk such as Regulation and Obligation Management, Policy and Procedure Management, Training and Awareness, Monitoring, and Risk Assessment. Data is sourced from various internal legacy systems.
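
A minimal sketch of one way the bitemporal model could look (table and column names here are hypothetical, not taken from the project): the VALID_* columns track business effectivity while the TX_* columns track when the row was recorded, so the model can answer "what did we believe on date X about date Y".

    -- Hypothetical bitemporal reference table.
    CREATE TABLE risk_unit_bt (
        unit_id    NUMBER        NOT NULL,
        unit_name  VARCHAR2(100) NOT NULL,
        valid_from DATE          NOT NULL,                           -- business effectivity
        valid_to   DATE          DEFAULT DATE '9999-12-31' NOT NULL,
        tx_from    TIMESTAMP     DEFAULT SYSTIMESTAMP NOT NULL,      -- database knowledge
        tx_to      TIMESTAMP     DEFAULT TIMESTAMP '9999-12-31 00:00:00' NOT NULL,
        CONSTRAINT pk_risk_unit_bt PRIMARY KEY (unit_id, valid_from, tx_from)
    );

    -- "As of" query: the state of unit 42 effective on one business date,
    -- as it was known at one transaction time.
    SELECT unit_name
      FROM risk_unit_bt
     WHERE unit_id = 42
       AND DATE '2013-06-30' BETWEEN valid_from AND valid_to
       AND TIMESTAMP '2013-07-15 00:00:00' BETWEEN tx_from AND tx_to;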

Environment: Oracle 11gR2 with RAC, Business Objects XI Web Intelligence, Perforce, Control M, Spring JDBC, JQuery, Hibernate OAM Framework, SQL Server 2008 R2

Senior Oracle Data warehousing Data Architect, ETL Lead and Data Modeler

  • Design, review and presentation of Data Flow Diagrams, Conceptual and Logical Data Models according to Subject areas identified focused on PHASE 2 Strategic release.
  • Actively involved in design decisions for Spring JDBC vs. Hibernate ORM as the middle-tier framework. The evaluation focused on two main points: architectural scaling and performance scaling.
  • Identify and create new domains as necessary across each subject area.
  • Worked on Migration of Legacy SQL Server Database to new Server.
  • For the PHASE 1 tactical release, reverse engineered the existing SQL Server database for Legal Obligation Tracking to create a logical model and understand relationships and referential integrity.
  • Based on the model and the BRD document, created a low-level Use Case document with multiple action flows.
  • Translated business use cases into technical test cases. Based on the use cases above, designed, developed and unit tested stored procedures; also performed end-to-end integration testing of these use cases with UI and middle-tier components.
  • Developed Modular and Generic PL/SQL interfaces for Admin Utility based on Entitlements Data Model.
  • Design, develop and implement a process logging framework for the Regulation Library in Oracle (see the sketch after this list).
  • As part of consolidation effort carried out Gap Analysis at attribute level for legacy systems.
  • Code reviews from the point of view of manageability, readability, reusability and performance scalability.
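
A minimal sketch of such a logging framework, assuming hypothetical object names; the autonomous transaction lets log rows survive a rollback of the failing business transaction:

    CREATE SEQUENCE process_log_seq;

    CREATE TABLE process_log (
        log_id    NUMBER        NOT NULL,
        proc_name VARCHAR2(128),
        step_name VARCHAR2(128),
        message   VARCHAR2(4000),
        logged_at TIMESTAMP     DEFAULT SYSTIMESTAMP
    );

    CREATE OR REPLACE PROCEDURE log_step (
        p_proc_name IN VARCHAR2,
        p_step_name IN VARCHAR2,
        p_message   IN VARCHAR2
    ) AS
        PRAGMA AUTONOMOUS_TRANSACTION;  -- commit log rows independently of the caller
    BEGIN
        INSERT INTO process_log (log_id, proc_name, step_name, message)
        VALUES (process_log_seq.NEXTVAL, p_proc_name, p_step_name, p_message);
        COMMIT;
    END log_step;
    /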

Confidential

  • As part of this new initiative, various time-critical regulatory reports are being developed for data work streams such as Inventory Data and ARM Data. Users have added flexibility to trigger workflows that seamlessly regenerate the reports as a result of actions such as Security Overrides, Security Adjustments and lookup value modifications.
  • As part of the Finance Control Regulatory Reporting data warehousing team, the CASTLE application consolidates Position and TETB data from upstream Settlement systems, plus reference data such as Security Master from external market sources like Bloomberg. This data is then loaded, cleansed and transformed into a reportable format so reports can be generated per regulatory guidelines for consumption by the Finance Control business.

Environment: Oracle 11gR2 with RAC, Business Objects XI Web Intelligence, Perforce, Autosys, Microsoft .Net based RFPF framework for reporting.

Senior Oracle Data warehousing Data Architect/Developer/Data Modeler

  • Designing Data Model for Hierarchical Reference Data Taxonomies such as Legal Entities, Line of Business etc.
  • Design Application level Entitlements Data Model
  • Develop a data model that includes new FACT tables with version id and run id to support intraday versioning initiated by business workflows as mentioned above (see the sketch after this list).
  • Designed the Security Override process for overriding static security attributes using identifiers such as ISIN, CUSIP and SEDOL.
  • Designed Optimized Load Processes to load FACT tables which will cater to Base load and subsequent intraday runs.
  • Developed Database Performance Diagnostic Utility to troubleshoot Performance issues and identify bottlenecks.
  • Designed Autosys jobs considering upstream and downstream dependencies and concurrent processing for Optimized Batch timings.
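
A minimal sketch of the versioning idea (names hypothetical): each intraday run appends a new (version_id, run_id) slice rather than updating rows in place, and reports pin the latest slice per position.

    CREATE TABLE position_fact (
        position_id  NUMBER       NOT NULL,
        version_id   NUMBER       NOT NULL,  -- bumped by business workflows
        run_id       NUMBER       NOT NULL,  -- bumped per intraday run
        market_value NUMBER(18,2),
        load_ts      TIMESTAMP    DEFAULT SYSTIMESTAMP
    );

    -- Latest version of each position across the day's runs.
    SELECT position_id, market_value
      FROM (SELECT f.*,
                   ROW_NUMBER() OVER (PARTITION BY position_id
                                      ORDER BY version_id DESC, run_id DESC) rn
              FROM position_fact f)
     WHERE rn = 1;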

Confidential

Senior Oracle Data warehousing ETL Developer and Data Modeller

Environment: Oracle 11gR2 ExaData with RAC, Informatica Powercenter 9, Business Objects XI Web Intelligence, Subversion (TortoiseSVN 1.6.16), IBM Tivoli Enterprise Scheduler

  • The Asset Management (AM) work stream includes a number of EVM components relating to the investments held by Swiss Re. The data that flows into the system is primarily the assets held by Swiss Re in US GAAP form (both balance sheet and income statement). This gets translated into EVM format, generating balance sheets for different times which can be reconciled back to each other via a corresponding generated income statement. This work primarily addresses the reporting requirements not only of Asset Management but also of the Legacy, Treasury, Group and P&C Investments groups in relation to GFD carry-over positions and subsequent EVM calculations.
  • To this end, the EVM Auditability project caters to the Business need for full-year EVM figures that are fully audit compliant on publication. The key to auditability is demonstrating that the process is consistent and that the figures cannot be manipulated without controls being in place. The other stream of the project covers reconciliation of balance sheets and PnL accounts from multiple sources and providing an exception report for the same.
  • Design, Develop and maintain ETL framework with atomic jobs for loading data from multiple sources flat file data each for Staging, ODS and Fact loads using Informatica Powercenter.
  • Design, develop and maintain an ETL framework with atomic jobs for loading data from the mainframe-based Global Accounting System using dynamically generated workflow parameter files.
  • Develop Excel-based report mockups to be reviewed by the Business, using advanced Excel features such as Pivot Tables and Pivot Graphs.
  • Implement Persistent Data Masking routines using Informatica to hide Sensitive Business data in Test environment.
  • Design and develop a generic PL/SQL pipelined function to support dynamic input of report filters and prompts, with the ability to pass one or multiple values for a given filter (see the sketch after this list).
  • Design and develop a generic PL/SQL pipelined function that provides all the static attributes at the lowest grain of the data, to facilitate ad-hoc slicing and dicing of data in BO at report level.
  • Design and develop a generic PL/SQL pipelined function for static US GAAP to EVM walk reports, where the actual EVM number is a translation based on various mappings and filters with the GAAP number as a starting point. From a data model perspective, each column of the report is derived from an aggregated Fact table.
  • Design and develop a generic PL/SQL pipelined function for a stacked graph report with multiple reset points, where, reading left to right, numbers are displayed relative to the first bar.
  • Design and develop a generic PL/SQL pipelined function to generate pivot results using the advanced PIVOT and CUBE features in Oracle 11g.
  • Designing Dimensional Data Model keeping in mind Historical and Ad-hoc reporting requirements.
  • Design Aggregated Materialized Views to transparently improve report performance using Query Rewrite
  • Optimize Database maintenance operations like MV refresh, Gather Statistics and Index build for performance.
  • Driving the effort to migrate and test ETL Oracle 11g to Oracle 11g ExaData for legacy as well as strategic Data warehouse.
  • Extensive performance tuning of reported performance issues using tools including but not limited to EXPLAIN PLAN, TKPROF, SQL Profiles, SQL Tuning Advisor and SQL Tuning Sets.
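
A minimal sketch of the multi-value filter pattern, with hypothetical type and table names: the report passes a collection of entity codes, and BO selects from the pipelined function as if it were a table.

    CREATE TABLE evm_fact (entity_code VARCHAR2(30), amount NUMBER);  -- stand-in fact table

    CREATE OR REPLACE TYPE t_code_tab IS TABLE OF VARCHAR2(30);
    /
    CREATE OR REPLACE TYPE t_evm_row IS OBJECT (entity_code VARCHAR2(30), evm_amount NUMBER);
    /
    CREATE OR REPLACE TYPE t_evm_tab IS TABLE OF t_evm_row;
    /
    CREATE OR REPLACE FUNCTION f_evm_by_entity (p_entities IN t_code_tab)
        RETURN t_evm_tab PIPELINED
    AS
    BEGIN
        FOR r IN (SELECT entity_code, SUM(amount) AS evm_amount
                    FROM evm_fact
                   WHERE entity_code IN (SELECT COLUMN_VALUE FROM TABLE(p_entities))
                   GROUP BY entity_code)
        LOOP
            PIPE ROW (t_evm_row(r.entity_code, r.evm_amount));
        END LOOP;
        RETURN;
    END f_evm_by_entity;
    /

    -- One value or many, same interface:
    -- SELECT * FROM TABLE(f_evm_by_entity(t_code_tab('CH01', 'US02')));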

Confidential

Senior Oracle Data warehousing and ETL Developer

Environment: Oracle 11gR2 ExaData, Datastage 8.5, Business Objects 3.1, Perforce, TIDAL Enterprise Scheduler

  • Touchpoint is used to report all communication between Lord Abbett and its clients. Examples include emails sent to an FA, RM visits to an FA, documents ordered by an FA, and FA visits to the LA website. This data is received from multiple source applications. The typical types of client in this context are Financial Advisor, Home Office Contact and Center of Influence.
  • The database consists of two schemas: Reference, a star schema used for reporting, and ODS, a normalized schema. The Reference schema stores dimensional data for Touchpoint, RR, Employee, Clearing Branch, Territory, Broker Dealer and Campaign, along with more generic dimensions like Time and Date. The ETL process puts data in a Staging area before processing it into the ODS; Staging data is periodically purged to maintain 3 months of running data.
  • The BO reporting Universe is based on data in the ODS and Reference schemas and is refreshed daily for canned and custom user reports.
  • Design Flexible ETL framework with atomic jobs for each of Staging, ODS and Fact loads.
  • Develop, Enhance, Tune and Maintain ETL processes to load external source data into Star Schema
  • Historical data correction by customizing existing ETL jobs.
  • Optimize Database maintenance operations like MV refresh, Gather Statistics and Index build for performance.
  • Driving the effort to migrate and test ETL post Datastage 7.5 to Datastage 8.5 and Oracle 11g to Oracle 11g ExaData for legacy as well as strategic Data warehouse.
  • Writing PL/SQL routines for high-volume data correction using BULK COLLECT features, as in the sketch below.
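
A minimal sketch of the batch-correction pattern (table, column and predicate are hypothetical): keys are fetched in 10,000-row batches with BULK COLLECT and corrected with FORALL, committing per batch to keep undo small.

    DECLARE
        CURSOR c_bad IS
            SELECT touchpoint_id FROM touchpoint_fact WHERE channel_cd = 'EMAIl';
        TYPE t_ids IS TABLE OF touchpoint_fact.touchpoint_id%TYPE;
        l_ids t_ids;
    BEGIN
        OPEN c_bad;
        LOOP
            FETCH c_bad BULK COLLECT INTO l_ids LIMIT 10000;
            EXIT WHEN l_ids.COUNT = 0;
            FORALL i IN 1 .. l_ids.COUNT
                UPDATE touchpoint_fact
                   SET channel_cd = 'EMAIL'
                 WHERE touchpoint_id = l_ids(i);
            COMMIT;  -- per-batch commit for multi-million-row corrections
        END LOOP;
        CLOSE c_bad;
    END;
    /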

Confidential

Senior Oracle Data warehousing and ETL Developer

Environment: Oracle 11gR2 ExaData, Informatica, MS Team Foundation Server

  • Develop an asynchronous multi-threaded processing framework to process and send data downstream from upstream systems using the PL/SQL DBMS_SCHEDULER API (see the sketch after this list).
  • Design and Develop Recon Process for static and transaction data from Upstream Source systems.
  • Working on Designing and development of Dimensional Hierarchies.
  • Design and develop FX Exposure reports for Investment Accounting.
  • Designed and Developed generic ETL framework using Perl.
  • Performance tuning for batch jobs and underlying Oracle Stored Procedures
  • Worked on Performance tuning of OBIEE reports.
  • Provide Design Pattern documents for new development Book Of Work.
  • Detailed technical specs to retrieve reporting data in header-detail format with drilldowns and pivots using analytical functions like RANK, CUBE, ROLLUP, GROUPING SETS and PIVOT.
  • As part of Phase II loads for JDS designed and developed generic stored packages to load the data into Transaction, Trade, Journal tables of normalized Data Model using PL/SQL in the most generic way possible.
  • Using CAST MULTISET to send Object Type Collections to external systems.
  • Batch load Performance tuning
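
A minimal sketch of the fan-out idea, with hypothetical table and procedure names: one run-once DBMS_SCHEDULER job is submitted per downstream feed, so extracts run concurrently instead of serially.

    BEGIN
        FOR r IN (SELECT feed_id FROM feed_config WHERE enabled = 'Y') LOOP
            DBMS_SCHEDULER.CREATE_JOB(
                job_name            => 'SEND_FEED_' || r.feed_id,
                job_type            => 'STORED_PROCEDURE',
                job_action          => 'PKG_FEEDS.SEND_FEED',  -- hypothetical worker proc
                number_of_arguments => 1,
                enabled             => FALSE,
                auto_drop           => TRUE);
            DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE(
                job_name          => 'SEND_FEED_' || r.feed_id,
                argument_position => 1,
                argument_value    => TO_CHAR(r.feed_id));
            DBMS_SCHEDULER.ENABLE('SEND_FEED_' || r.feed_id);  -- job starts asynchronously
        END LOOP;
    END;
    /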

Confidential

Senior Oracle Developer and Subject Matter Expert

  • Upstream front office systems and vendors send daily Positions, Transactions and Journal Postings data, which is staged into a central data warehouse called JDS using an ETL process and then into a normalized data model. From here this data is loaded into CAPRI, the enterprise-level sub-ledger system, which does reporting for the back office as well as various data reconciliations and trial balances.
  • Write technical specifications for P&L Attribution report systems for Product Control.
  • Designed Star Schema Data Models for P/L Trial Balance and Reconciliation gaps.
  • Designed and Developed generic ETL framework using Perl.
  • Write generic PL/SQL routines to convert non-partitioned tables to partitioned ones (see the sketch after this list).
  • Performance tuning of views used for CAPRI reporting.
  • OTC and non-OTC positions were loaded in ETL using different Perl scripts; worked on refactoring these to load OTC positions into the general position infrastructure.
  • Developed Shell Scripts to pre-process source files wherein multiple files were received for a single load.
  • Designed Data Model for Meta data tables required for the ETL process.
  • Developed a database based Test Automation for Regression Testing.
  • Detailed documentation for Position vs. Trade reconciliation, which will be the basis of the development.
  • Currently working on planning and preparing the strategy for the Oracle 10g to 11g migration of the JDS and CAPRI databases.
  • Performance tuning for batch jobs and underlying Oracle Stored Procedures which loads from Staging to Fact table in the Data mart by eliminating redundancies and other techniques.
  • Worked extensively on Event and Rule Mapping Data Model which will trigger generation of Journal postings from the raw Journal data received from JPMorgan.
  • Worked on PnL Attribution reports for Product Control group.
  • Using Excel VBA Scripts to generate reports.
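
One common way to do this conversion is online redefinition; a minimal sketch under hypothetical names, assuming an interim table JOURNAL_P has already been created with the target range partitioning:

    DECLARE
        l_errors PLS_INTEGER;
    BEGIN
        DBMS_REDEFINITION.CAN_REDEF_TABLE('JDS', 'JOURNAL');        -- sanity check
        DBMS_REDEFINITION.START_REDEF_TABLE('JDS', 'JOURNAL', 'JOURNAL_P');
        DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS(                    -- indexes, grants, etc.
            uname      => 'JDS',
            orig_table => 'JOURNAL',
            int_table  => 'JOURNAL_P',
            num_errors => l_errors);
        DBMS_REDEFINITION.FINISH_REDEF_TABLE('JDS', 'JOURNAL', 'JOURNAL_P');
    END;
    /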

Environment: Oracle 10g/11g, PL/SQL, Perl and UNIX Shell Scripts, OEM 10g, ERWIN, SVN, IBM Tivoli

Confidential

Senior ETL and Oracle Data Warehousing Developer

  • The upstream Risk System stores data in the Result DB (RDB), where it is transformed and used for the BI application and for sending downstream extracts.
  • Designed Informatica workflows to send data downstream to Product Control Datawarehouse.
  • Working on Design and Development for new CRO Credit Risk Office Regulatory project to add new Risk Scenarios to system.
  • Developed Perl script to automate Data discrepancies between Production and QA environments.
  • Worked on standardizing data extracts; as part of this effort converted Shell Scripts into Informatica workflows.
  • Develop and maintain Business Object reports and configure Universe.
  • Handling day-to-day database query performance issues.
  • Analyze and transform Listed Options and Convertible Bond data received from upstream Risk Systems.
  • Worked on building historical data snapshots using PL/SQL (see the sketch after this list).
  • Working on performance tuning of ETL process based on STAR schemas.
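
A minimal sketch of such a snapshot routine (table and column names hypothetical): the delete makes the load rerunnable for the same close-of-business date.

    CREATE OR REPLACE PROCEDURE snap_risk_results (p_cob_date IN DATE) AS
    BEGIN
        DELETE FROM risk_result_hist WHERE snapshot_dt = p_cob_date;  -- rerunnable
        INSERT INTO risk_result_hist (snapshot_dt, scenario_id, book_id, pnl)
        SELECT p_cob_date, scenario_id, book_id, pnl
          FROM risk_result;
        COMMIT;
    END snap_risk_results;
    /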

Environment: Oracle 10g/11g, PL/SQL, Perl and UNIX Shell Scripts, OEM 10g, ERWIN

Confidential

Senior Oracle Consultant and Analyst

  • As part of the Client Services and Research group, the MIS application provided analytical and reporting capabilities for the Business on Fixed Income and FX data such as Sales Credits, Client Interactions and Sales Revenue. This system received data from multiple sources, which was extracted, transformed and loaded into a centralized Hub database in normalized format. This data was then transformed and loaded into the Datamart, which is de-normalized for query performance and analytics by the Business.
  • Implement Data Masking routines using Informatica to hide Sensitive Business data in Test environment using Encryption methodology.
  • Designed and developed new consolidated FICC Datamart which will have both Fixed Income and FX data for Business reporting.
  • Developed a new data masking process to mask critical client data using PL/SQL for the Swiss UAT environment (see the sketch after this list).
  • Modifying existing Shell infrastructure to accommodate new data masking process.
  • Peer code reviews.
  • Coordinating application build and deployment in liaison with other teams such as UI and Service team.
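
A minimal sketch of one masking approach (table and column names hypothetical; requires EXECUTE on DBMS_CRYPTO): names are replaced with a deterministic hash-based token so joins across tables still line up, and emails with a formatted dummy value.

    DECLARE
        l_hash VARCHAR2(40);
    BEGIN
        FOR r IN (SELECT client_id, client_name FROM client_dim) LOOP
            l_hash := RAWTOHEX(DBMS_CRYPTO.HASH(
                          UTL_RAW.CAST_TO_RAW(r.client_name),
                          DBMS_CRYPTO.HASH_SH1));       -- deterministic per name
            UPDATE client_dim
               SET client_name = 'CLIENT_' || SUBSTR(l_hash, 1, 16),
                   email_addr  = 'masked.' || r.client_id || '@example.com'
             WHERE client_id = r.client_id;
        END LOOP;
        COMMIT;
    END;
    /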

Environment: Oracle 10g, Pl/SQL, Java J2EE, Solaris, Perl and UNIX Shell Scripts, Sybase Adaptive Server Enterprise 12.5.3, OEM 10g, ERWIN

Confidential

Senior Oracle Consultant

  • As part of the Fixed Income and Equities Client Technology Solutions group, the CDW and CRM applications provided analytical and reporting capabilities for the Business on various FACT data such as Client Meetings, Client Events, Credits and Client Interactions. This system received data from multiple sources, which was extracted, transformed and loaded into a Staging area database in normalized format. This data was then transformed and loaded into the Presentation area, which is de-normalized for query performance and analytics by the Business.
  • Developed the testing Shell script to bypass environment dependent sourcing of Environmental variables.
  • Worked closely with SCM to migrate Source Code base from Clearcase to Subversion and verified the correctness of the migration.
  • Evaluated various client tools which can be used for Subversion based on the features provided and efficiency of their use.
  • Worked on complex BAU enhancement which requires using Oracle Collections.
  • Worked on extensive database performance tuning for Equity Intelligence reports developed using Business Objects, reducing execution from over 2 hours to under 10 seconds and saving an additional USD 5,000 by avoiding a vendor-side consultant engagement.
  • Designed and developed efficient ETL processes for end-to-end loading of the data using Oracle.
  • Performance tuning of existing Oracle ETL procedures.
  • Devised generic Oracle Pl/Sql procedure to copy data from Production database to either of the test databases.
  • Putting up new mappings and transformations in Informatica for new data sourcing.
  • Assisting other teams in coming up with more efficient and optimized Oracle PL/SQL code.
  • Developing new techniques for loading high-volume CLOB data with optimal performance (see the sketch after this list).
  • Monitoring and Enhancing ControlM job streams for run time improvement.
  • Developing Unix Shell Scripts for batch processing.
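
A minimal sketch of one such technique (object names hypothetical): insert an empty LOB locator, then stream the document into it in chunks with DBMS_LOB rather than binding one huge VARCHAR2.

    DECLARE
        l_doc   CLOB;
        l_chunk VARCHAR2(32767);
    BEGIN
        INSERT INTO research_doc (doc_id, doc_body)
        VALUES (research_doc_seq.NEXTVAL, EMPTY_CLOB())
        RETURNING doc_body INTO l_doc;                 -- locator for chunked writes

        FOR i IN 1 .. 10 LOOP                          -- chunks would come from the feed
            l_chunk := RPAD('x', 32767, 'x');
            DBMS_LOB.WRITEAPPEND(l_doc, LENGTH(l_chunk), l_chunk);
        END LOOP;
        COMMIT;
    END;
    /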

Environment: Oracle 10g, Pl/SQL, UNIX Shell Scripts, Clearcase, Subversion

UBS Investment Bank Client Services and Research, Jersey City, NJ September 2007 to May 2009

Confidential

Senior Oracle Consultant

  • As part of the Fixed Income Client Services and Research group, the MIS application provided analytical and reporting capabilities for the Business on various FACT data such as Sales Credits and Client Interactions. This system received data from multiple sources, which was extracted, transformed and loaded into a centralized Hub database in normalized format. This data was then transformed and loaded into the Datamart, which is de-normalized for query performance and analytics by the Business.
  • Developed the testing utility using ORACLE PL/SQL to test out newly designed ETL process for data accuracy.
  • Developed Database Monitoring Utility to validate ORACLE DB objects such as Indexes, Materialized views and Business Data Sanity checks before and after Datamart Loads.
  • Liaise with the Data Modeler on strategic data model changes for the FIBEE application.
  • Implemented Data Model changes to add Surrogate keys for the application in order to isolate it from any changes to Natural Keys from the Source Data in Sybase SQL Server.
  • Designed and created tables and other objects. Defined Referential Integrity Constraints and Check constraints for these tables.
  • Enhanced PL/SQL Stored Procedures, Packages and functions for effective Data Transformation and developed PL/SQL program units for Database Maintenance operations for Partitions, Indexes and Materialized Views.
  • Identified changes to Materialized Views to make use of Query Rewrite Capabilities extensively for Application Performance.
  • Extensively applied PL/SQL Analytical functions and CASE statements for performance optimization.
  • Enhanced PL/SQL program units by applying Oracle Optimizer Hints to boost the performance and optimize resource consumption.
  • Developed program units to handle large volume of data using Oracle PL/SQL Bulk Processing features BULK COLLECT and FORALL.
  • Designed object using PL/SQL collection construct, Nested Table to load large volume of Dimension data.
  • Designed and developed partitioned Fact tables using range partitioning with local indexes on each partition (see the DDL sketch after this list).
  • Designed and developed SQL Loader control files and wrapper shells scripts to invoke the SQL Loader.
  • Exercised SQL Performance tuning using Oracle tools EXPLAIN PLAN, SQL Trace, TKPROF and Automatic Tuning features.
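
A minimal DDL sketch in the spirit of that design (names hypothetical): the fact is range-partitioned by month, and the local index gives each partition its own index segment, so a month can be rebuilt or exchanged without touching the others.

    CREATE TABLE sales_credit_fact (
        trade_date  DATE         NOT NULL,
        client_key  NUMBER       NOT NULL,
        product_key NUMBER       NOT NULL,
        credit_amt  NUMBER(18,2)
    )
    PARTITION BY RANGE (trade_date) (
        PARTITION p_2008_01 VALUES LESS THAN (DATE '2008-02-01'),
        PARTITION p_2008_02 VALUES LESS THAN (DATE '2008-03-01'),
        PARTITION p_max     VALUES LESS THAN (MAXVALUE)
    );

    CREATE INDEX ix_scf_client ON sales_credit_fact (client_key) LOCAL;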

Environment: Oracle 10g, Pl/SQL, Java J2EE, Solaris, Perl and UNIX Shell Scripts, Sybase Adaptive Server Enterprise 12.5.3, OEM 10g, ERWIN

Confidential

Technical Lead

  • As part of State Street Managed Account Services, developing a business service model that includes standardization on a single set of financial data feeds from the source portfolio management system to a centralized middleware data warehouse (Data Hub) and reporting platform, and the subsequent dissemination to downstream client and vendor systems.
  • Involved in Requirement analysis and documented Technical Specifications.
  • Designed the core Data Model using Erwin. Interacted with Business Users to confirm sources into Facts and Dimensions of the Star Schema Model
  • Designed and created tables and other objects. Defined Referential Integrity Constraints and Check constraints for these tables.
  • Evaluated, Designed and developed various approaches for ETL processes including Data Staging and Loading Procedures according to coding standards using SQL Loader.
  • Effectively used OEM 10g Grid Control to proactively monitor Oracle database for any performance bottlenecks.
  • Developed PL/SQL Stored Procedures, Packages and functions for effective Data Transformation.
  • Extensively applied PL/SQL Analytical functions and CASE statements for performance optimization.
  • Created multiple Oracle Sequences to generate Surrogate keys for the Datawarehouse in order to isolate it from any changes to Natural Keys from the Source Data.
  • Enhanced PL/SQL program units by applying Oracle Optimizer Hints to boost the performance and optimize resource consumption.
  • Developed program units to handle large volume of data using Oracle PL/SQL Bulk Processing features BULK COLLECT and FORALL.
  • Designed object using PL/SQL collection construct, Nested Table to load large volume of Dimension data.
  • Designed and Developed Partitioned Fact tables using Range Partitioning and Local Indexes for each partition.
  • Designed User Defined Types using PL/SQL Object Oriented Programming Principle of Inheritance to reduce storage space and avoid Data redundancy.
  • Created Materialized Views with Fast Refresh Capability to implement Slowly Changing Dimensions of Type 2 and client Data Marts for the Data Hub.
  • Worked to achieve effective PL/SQL error handling using RAISE_APPLICATION_ERROR, SAVE EXCEPTIONS and the LOG ERRORS clause of multi-table insert (see the sketch after this list).
  • Designed and developed SQL Loader control files and wrapper shells scripts to invoke the SQL Loader.
  • Developed Shell Scripts to call Java RMI Class client.
  • Designed Oracle XML DB Schemas for Business Meta Data Translation using XSD templates.
  • Develop a Test Engine using Perl which will be used as a tool for Black Box Testing of File Extracts and prepare the Test Plan document.
  • Developed Shell Scripts to move files to and from the FTP server by polling them for a certain amount of time.
  • Developed Java programs which do file transfers to and from the FTP server.
  • Design and implement Database User Access Framework using Oracle Roles and Privileges.
  • Define Directory structure for Database Source Control using SVN.
  • Develop a Sizing Tool to estimate storage of Historical Data and for Future Data Growth using Microsoft Excel.
  • Exercised SQL Performance tuning using Oracle tools EXPLAIN PLAN, SQL Trace, TKPROF and Automatic Tuning features.
  • Worked on technical documentation for database components, the Conceptual Model and Conceptual-to-Physical modelling. Updating existing FSD documents.
  • Interact with Business Users and Source System IT teams to define, agree and document incoming data feed specifications.
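
A minimal sketch of the LOG ERRORS part (table names hypothetical): bad rows are diverted to an error table created once with DBMS_ERRLOG instead of failing the whole load.

    -- One-time setup: creates ERR$_POSITION_FACT.
    BEGIN
        DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'POSITION_FACT');
    END;
    /

    INSERT INTO position_fact (position_id, qty)
    SELECT position_id, qty
      FROM position_stg
       LOG ERRORS INTO err$_position_fact ('daily load') REJECT LIMIT UNLIMITED;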

Environment: Oracle 10g, Pl/SQL, Java J2EE, Solaris, Perl and UNIX Shell Scripts, Informatica Powercenter 8.1.1, OEM 10g

Confidential

Technical Lead

  • GPC has one of the largest bases of private clients spread across the globe. GPC has engaged TATA Consultancy Services (TCS) to enhance and maintain its trading applications CDCP and LPS in the Alternative Investments business group and to provide offshore development services.
  • Track and Communicate Project status to upper management.
  • Involved in Proposal Writing for new Prospective contracts.
  • Understanding Data Model using Erwin and reverse engineering it for further enhancements.
  • Designed, Enhanced and deployed Client/Server Systems using Weblogic, Shell Scripts and Oracle PL/SQL
  • Designed and Developed Oracle PL/SQL Stored Procedures, Packages, functions and Triggers.
  • Extensive SQL Query tuning for Oracle Databases using Oracle tools EXPLAIN PLAN, SQL Trace, and TKPROF.
  • Applying coding practices using special features of PL/SQL such as autonomous transactions in triggers.
  • Designed object using PL/SQL collection construct, Varrays using FORALL and BULK COLLECT to do Bulk Data processing.
  • Developed PL/SQL stored procedures for data loads using dynamic SQL (EXECUTE IMMEDIATE).
  • Developed file load and extraction procedures using the PL/SQL UTL_FILE package (see the sketch after this list).
  • Performed tuning of PL/SQL program units to identify performance bottlenecks and reduce processing time using DBMS_PROFILER.
  • Designed and Defined indexes and constraints for all the tables.
  • Involved in System Integration initiatives with internal Authentication Systems.
  • Develop Shell Scripts for Bulk file Transfer by using UNIX Regular Expressions and awk, sed utilities.
  • Automated existing process for data updates and data audits using Perl.
  • Designed Autosys scheduler jobs for nightly Batch jobs and enhanced existing Job streams to reduce runtime for the batch to meet the SLA.
  • Developed Java programs which do file transfers to and from the FTP server and call Oracle packages and procedures to load the data using JDBC.
  • Developed small utility Web Services using MS Visual Studio .NET 2003 Framework
  • Enhanced MS SQL Server DTS Packages using Enterprise Manager
  • Application GUI enhancements using ASP and XML in MS Visual Studio .NET 2003 Framework
  • Designing and Developing ETL processes for End-to-End Data Transfer using Informatica.
  • Configured Database connectivity for Powercenter Domain and ODBC Connection for Source and Target Databases

Environment: Oracle 9i/10g, PL/SQL, Perl, Shell Scripts, Java/J2EE, Linux, BEA WebLogic 8.1, Autosys 3.5, Informatica Powercenter 8.1.1, PowerBuilder 7

Confidential

  • Designed and Developed Oracle PL/SQL Stored Procedures, Packages, functions and Triggers.
  • Designed object using PL/SQL collection construct, Varrays using FORALL and BULK COLLECT to do Bulk Data processing.
  • Extensive SQL Query tuning for Oracle Databases using Oracle tools EXPLAIN PLAN, SQL Trace, and TKPROF.
  • Worked on iterative Oracle instance tuning by modifying Performance initialization parameters.
  • Developed PL/SQL Stored Procedures for Data Load using Dynamic SQL.
  • Develop PL/SQL programs to send email alerts using the UTL_SMTP package (see the sketch after this list).
  • Performed tuning for PL/SQL Program units to identify the performance bottlenecks and reduce processing time.
  • Designed and Defined indexes and constraints for all the tables.
  • Involved in System Integration initiatives with internal Authentication Systems.
  • Developed Perl Scripts for batch data load from the Source Vendor Systems.
  • Designed Autosys scheduler jobs for nightly Batch jobs and enhanced existing Job streams to reduce runtime for the batch to meet the SLA.
  • Implemented scheduling functionality using the Oracle 9i PL/SQL package DBMS_JOB.
  • Develop Perl and Shell Scripts to load files received from the Accounting System and extract files for nightly transfer.
  • Developed multiple Informatica Transformations for a file data load.
  • Enhanced Java programs which do file transfers to and from the FTP server and load the data using JDBC.
  • Designing and Developing ETL processes for End-to-End Data Transfer using Informatica.
  • Created Informatica Powercenter Repository Users, groups and assigned privileges for them.
  • Created Informatica Source Definitions, tables and Target Definitions, tables.
  • Modified PowerBuilder code to enhance GUI functionality.
  • Designed the core Data Model using Erwin.
  • Developed PL/SQL Stored Procedures, Packages and functions for effective Data Transformation.
  • Enhanced PL/SQL program units by applying Oracle Optimizer Hints to boost the performance and optimize resource consumption.
  • Designed object using PL/SQL collection construct, Nested Table to load large volume of data.
  • Worked to achieve effective PL/SQL error handling packages using RAISE_APPLICATION_ERROR, SAVE EXCEPTIONS and the LOG ERRORS clause of multi-table insert.
  • Designed and developed SQL Loader control files and wrapper shells scripts to invoke the SQL Loader.
  • Developed Shell Scripts to move files to and from the FTP server by polling them for a certain amount of time.
  • Developed file load and extraction procedures using the PL/SQL UTL_FILE package.
  • Configured Database connectivity for Powercenter Domain and ODBC Connection for Source and Target Databases
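
A minimal sketch of such an alert routine (the mail host and addresses are placeholders):

    CREATE OR REPLACE PROCEDURE send_alert (p_subject IN VARCHAR2,
                                            p_body    IN VARCHAR2) AS
        l_conn UTL_SMTP.CONNECTION;
    BEGIN
        l_conn := UTL_SMTP.OPEN_CONNECTION('mailhost.example.com', 25);
        UTL_SMTP.HELO(l_conn, 'mailhost.example.com');
        UTL_SMTP.MAIL(l_conn, 'batch@example.com');       -- sender
        UTL_SMTP.RCPT(l_conn, 'support@example.com');     -- recipient
        UTL_SMTP.DATA(l_conn,
            'Subject: ' || p_subject || UTL_TCP.CRLF || UTL_TCP.CRLF || p_body);
        UTL_SMTP.QUIT(l_conn);
    END send_alert;
    /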

Environment: Oracle 9i/10g, Oracle PL/SQL, Java/J2EE, Sun Solaris, UNIX Shell Scripts, Perl, Autosys 3.5, Informatica Powercenter 7, PowerBuilder 6.5
