
Oracle Datawarehouse Lead Resume Profile



  • 14 years of experience in architecting, designing, developing and implementing database applications and data warehouses, including 12 years of experience in Oracle 11g/10g/9i/8i/7, PL/SQL, SQL*Loader, Bulk SQL and performance tuning, and 2 years of experience in C/Pro*C applications.
  • Extensive experience in data transformations, data loading, migration, conversion, data cleansing, extraction and building interfaces.
  • Experience in all dimensions of Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using ETL tools such as Pervasive/Data Junction Data Integrator and Informatica PowerCenter (Repository Manager, Designer, Server Manager, Workflow Manager and Workflow Monitor).
  • Extensively worked on packages, procedures, functions and triggers in PL/SQL, including Oracle collections and batch/bulk processing.
  • Extensive experience in loading high-volume data and in performance tuning with hints, indexes and partitions; tuned and optimized SQL statements using EXPLAIN PLAN, TKPROF, 10046 traces and AWR reports.
  • Experience in automating ETL processes with Korn shell scripts; developed heavily parallel, CPU-bound ETL jobs using background processing in UNIX; scheduled batch jobs using AutoSys.
  • Extensively worked with dimensional modeling and implemented data marts, facts and dimensions using Star and Snowflake schemas.
  • Experience with data analysis and data modeling using ERwin, open-source data modeling tools (SQL Power Suite, DBDesigner) and MS Visio.
  • Proficient in Oracle utilities: SQL*Loader, Import/Export and Data Pump.
  • Experience in setting up materialized views/snapshots for data replication across databases.
  • Experience with various technologies including IBM MQ Series, WBI-MB, MS SQL Server, SQL Navigator, Cognos, JavaScript, Perl, UNIX shell scripting, Java and HTML.
  • Developed reports using Crystal Reports, Business Objects XI, SQL Server Reporting Services and iDashboards from OLAP cubes.
  • Excellent leadership, analytical and interpersonal skills with the drive to lead high-performance teams; excellent communication skills, a fast and enthusiastic learner, strong work ethic and a good team player.
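As a rough illustration of the Bulk SQL pattern referenced above, bulk-collecting source rows in batches and inserting them with FORALL minimizes SQL/PL-SQL context switches; a minimal sketch, with hypothetical table and column names:

```sql
-- Illustrative sketch only; stg_orders and dw_orders are hypothetical tables.
DECLARE
  CURSOR c_src IS SELECT * FROM stg_orders;
  TYPE t_rows IS TABLE OF c_src%ROWTYPE INDEX BY PLS_INTEGER;
  l_rows t_rows;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- batch size caps PGA use
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT                     -- one context switch per batch
      INSERT INTO dw_orders VALUES l_rows(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```

The LIMIT clause keeps memory bounded on high-volume loads; without it, a single BULK COLLECT would fetch the whole result set into the collection at once.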


  • Languages: C/C++, SQL, SQL*Plus, PL/SQL, WebSphere MQ (MQSeries) 6.0/5.3 and WMQI (WebSphere MQ Integrator) v2.1, WBI-MB v6/5, JMS, Visual Basic 6.0
  • RDBMS: Oracle 11g / 10g / 9i / 8x / 7, Oracle EBS R12, DB2, SQL Server 2008, MS-Access
  • Modeling Tools: ERWin 4.0, SQL Power Suite/ SQL Power Architect, Rational Rose, DB Designer
  • Scripting: UNIX Shell Scripts, PERL, Lotus Notes Script
  • Operating Systems: Windows, UNIX (Sun Solaris 9), MS-DOS, Mainframe (CA-7)
  • Reporting Tools: iDashboards 6.5, SQL Server Reporting Services, Jasper Reports, iReport, Crystal Reports, Cognos, Oracle Reports 6i/2.5, Business Objects XI
  • Version Control Tools: PVCS, SCM, Visual Source Safe, Serena Dimensions
  • ETL Tools: Pervasive Cosmos 8.12 and Informatica PowerCenter 7.1.2/7.1.1/6.2 (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplets, Transformations, Workflow Monitor, Workflow Manager)
  • Oracle Utilities: SQL Loader, External Tables, TKPROF, Oracle Import/ Export/ DataPump
  • Other: Microsoft Project, TOAD, SQL Developer, AutoSys, Remedy User 7.5 / 6.03, J2EE - JSP, Servlets, MySQL, PERL, Web Services, Oracle Enterprise Manager, Python, MongoDB



Oracle Datawarehouse Lead

Technologies: Oracle 11g/10g/9i/8i, Oracle Applications R12, HP-UX, TOAD, ERwin/SQL Power Architect, MS SQL Server 2008, iDashboards, Oracle Reports 6i

The project comprises designing and developing the Reynolds Global Data Warehouse using Oracle 11g/10g/9i/8i and Oracle E-Business Suite R12, and enhancing existing interfaces with ALCOA programs to extract, transform and load data from the EBS system into the warehouse. It includes developing ETL programs to combine SMART (replacement North American data warehouse), FAST (replacement European data warehouse), PBIS requirements, and data from legacy systems in the Europe, South America and Asia Pacific regions, and developing reports using SQL Server Reporting Services and iDashboards from OLAP cubes (Sales, Invoice Lines, Financial Report Repository, GL Balances, GL History, GL Interco) built in SQL Server Analysis Services.

Responsibilities included:

  • Provided technical leadership on the Global Data Warehouse project by facilitating requirements capture, design, data mapping and development using Oracle 11g/10g/9i/8i, PL/SQL, SQL Server (SSRS, SSIS, SSAS) and iDashboards for reporting.
  • Led the effort to develop cross-mapping of data flow between various external systems and Oracle tables; generated data conversion scripts using SQL*Loader and PL/SQL procedures, functions and packages with Oracle collections, Bulk SQL features, ref cursors and analytic functions; developed testing strategies and Korn shell scripts.
  • Presented technical research findings to team members and stakeholders and worked closely with them on the principles and ideas behind the proposed solutions.
  • Provided technical and functional expertise to development and functional teams, including developing a data strategy to support business needs.
  • Designed, developed and tuned Oracle applications and provided 24/7 on-call support.
  • Mentored junior team members, led development efforts, and worked closely with the team on developing test plans and scripts, tracking progress and conducting team status meetings.
  • Scheduled ETL jobs using the Oracle job scheduler and Oracle Application Server Concurrent Manager to create request sets for the OLAP/OLTP processes.
  • Built and automated Korn shell scripts to FTP and load delimited flat files from external sources into Reynolds Global Data Warehouse staging and dimension tables using SQL*Loader.
  • Used analytic functions, hierarchical queries and Oracle regular expressions to transform data and calculate aggregations and averages of Total Amount, Net Savings, Amount Due and revenue by operating unit, business unit and customer group (Global and High-Level Customers, customer groups, market codes).
  • Set up materialized views/snapshots for data replication across databases and created B-tree indexes for faster data retrieval.
  • Tuned long-running SQL queries using indexes, CBO hints, EXPLAIN PLAN, TKPROF and Autotrace; used the DBMS_STATS package to gather table and index statistics.
  • Generated DDL scripts for creating new database objects (tables, views, indexes, synonyms, sequences, roles and grants) for code migrations.
  • Used the open-source SQL Power Architect data modeling tool to reverse engineer the database and create ERD diagrams for the system.
  • Developed stored procedures using ref cursors to return large result sets to the SQL Server reports.
  • Responsible for the logical and physical model of the data warehouse; created partitioned tables and partitioned indexes for manageability and scalability of the application.
  • Established data processes and ETL routines for a new automated supply chain management system.
  • Administered user access to iDashboards and used Multidimensional Expressions (MDX) to query and manipulate multidimensional data from the OLAP cube, delivering performance management, scorecards and business alerts in iDashboards reports.
  • Used pivoting functions in MS Access to build a prototype for business users before building the drill-down reports in iDashboards and SQL Server Reporting Services.
  • Developed and enhanced Oracle Reports 6i for several projects.
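The analytic-function technique mentioned above computes group-level aggregates without collapsing detail rows; a minimal sketch, with a hypothetical fact table and columns:

```sql
-- Illustrative sketch only; dw_invoice_lines and its columns are hypothetical.
-- Each detail row carries its operating-unit total and customer-group average.
SELECT operating_unit,
       customer_group,
       invoice_amount,
       SUM(invoice_amount) OVER (PARTITION BY operating_unit) AS ou_total,
       AVG(invoice_amount) OVER (PARTITION BY customer_group) AS grp_avg
FROM   dw_invoice_lines;
```

Unlike GROUP BY, the OVER clause preserves every row, which is what makes drill-down reporting from the same result set possible.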


Senior Oracle PL/SQL/Datawarehouse Developer

Technologies: Oracle 9i/10g, Teradata, UNIX, SQL*Loader, TOAD, SQL Developer, Job Control, SQL Power Architect, DBDesigner, ERwin, EAI Transfer/WinSCP/FileZilla for file transfer processes, Business Objects XI/Crystal Reports, GoldenGate

XOHM Data Warehouse (XDW) is a strategic, enterprise-wide resource that enables the business to generate reports, analyze data, and perform predictive modeling in support of the XOHM service. XDW supports XOHM by creating a single repository for all 4G/WiMAX-related data, sourced directly from both Amdocs and Sprint 4G application platforms such as Enabler, Clarify, OMS, OM/IM and OMA/DM. Data marts and Business Objects universes enable canned and ad hoc reporting across the various XOHM business functions; they feed information into executive dashboards, business KPIs, standardized operating metrics and predictive analytics supporting the Sales and Marketing teams. XDW also provides data feeds to other systems inside Sprint Nextel and to outside vendors who provide content or services to XOHM subscribers, and caters to automated data loads from multiple sources both internal and external to Sprint Nextel/XOHM. XDW uses Transactional Data Management technology from GoldenGate Systems to provide continuous, real-time capture and delivery of data from source to target between various 4G systems.

Responsibilities included:

  • Developed a Business Intelligence (BI) and data warehouse reporting system for Sprint to launch their new XOHM fourth-generation wireless communication system, a $5 billion project.
  • Conducted interviews and application development meetings with technical staff and business users, and mapped business needs to technical requirements across several business units.
  • Led the effort to develop cross-mapping of data flow between various external systems and Oracle tables; generated data conversion scripts using SQL*Loader and PL/SQL scripts in the UNIX environment, and developed testing strategies for Sprint's XDW database.
  • Led efforts to provide technical evaluations of various methods of implementing BI solutions using packages and tools such as Oracle, Business Objects, open-source data modeling tools (SQL Power Architect and DBDesigner, used to reverse engineer the database and create ERD diagrams for the source systems) and open-source BI tools such as Pentaho. Results of these evaluations were presented to stakeholder management orally and in writing.
  • Defined, developed and documented BO XI/Crystal reports and dashboards, with key performance indicators and user-interactive features, together with the business analysts, for implementation in the Business Objects automated reporting system.


Senior Developer

Technologies: Oracle 9i/10g, Data Junction 7.5.5, Pervasive Cosmos 8.12/8.14, Informatica 7.1.2, SQL*Loader, XML, TOAD, SQL Developer, SCM, Mercury Quality Center

Provided technical expertise supporting the architecture, design, development and implementation of Electronic Data Interchange (EDI) projects. Multiplan receives claims for re-pricing from various clients such as Cigna, Aetna, United Health Care, Humana and FirstHealth, in the file format appropriate to each client's implementation. These files are decrypted as appropriate and loaded into Multiplan's proprietary EDP claim transaction system tables.

Responsibilities included:

  • Designed and developed healthcare data applications on the Electronic Data Interchange (EDI) team using Oracle PL/SQL and the Data Junction/Pervasive Cosmos/Informatica 7.1.2 ETL tools to extract, transform and load claims data into the database according to the business specifications.
  • Installed and configured the Informatica client and Informatica Server.
  • Worked with tables of 50+ million records and built complex data transformations (approximately 15 transformations per mapping) in mapplets. Used transformations such as Lookup, Update Strategy, Router, Filter, Sequence Generator, Source Qualifier and Joiner to extract data according to the business rules and technical specifications, and extensively used unconnected Lookup transformations to fetch decode values for code-list fields.
  • Improved mapping performance by moving Filter transformations early in the transformation pipeline, performing filtering at the Source Qualifier level for relational sources, and selecting the table with fewer rows as the master table in Joiner transformations.
  • Developed packages, procedures and functions using Oracle collections and Bulk SQL features to implement changes to EDI processes and improve performance. Used analytic functions to calculate aggregations and averages of Total Amount, Net Savings, Amount Due, revenue by client, total number of re-priced claims per client, etc.
  • Developed shell scripts to automate daily activities such as loading files into the database with SQL*Loader, FTPing files to or from remote servers, encrypting and decrypting files with PGP software, compressing and decompressing files, executing SQL scripts and PL/SQL procedures, running jobs in parallel, job scheduling, and spooling CSV files using Automate.
  • Evaluated and provided a comparative analysis of the Pervasive, Pentaho and Informatica tools to the client. Also evaluated iReport and JasperReports, part of the Jaspersoft BI Suite, for designing and generating reports.
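The parallel-job pattern behind the shell automation described above can be sketched as follows; a minimal illustration only, where load_file and the file names are hypothetical stand-ins for the real SQL*Loader/FTP/PGP steps:

```shell
# Illustrative sketch: run independent load steps as background jobs and
# wait for all of them. A real script would call sqlldr, ftp, pgp, etc.
load_file() {
    # hypothetical placeholder for the per-file load work
    echo "loaded $1"
}

for f in claims_a claims_b claims_c; do
    load_file "$f" &      # launch each load in the background
done
wait                      # block until every background job finishes
echo "all loads complete"
```

Backgrounding each step with `&` and then issuing a single `wait` is what lets CPU-bound ETL jobs overlap instead of running serially.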


Senior Programmer Analyst

Technologies: Oracle 9i/8i, WebSphere Business Integration Message Broker (WBI-MB)/MQSeries 5.3/6.0, Pro*C, COBOL, HTML, Perl script, Sun Solaris, C/C++, JMS, Java, UNIX shell scripting, SQL*Loader, SQL Import and Export utilities, XML, TOAD, DCOM, JavaScript, JSP/Servlets, WebLogic Server 5.0, Oracle Forms Data Format, RightFax, Cognos Impromptu v6 and v7, Delphi, TIF2BMP/TIF2MODCA utility program (VB6 and VisImage), Send2Documerge program (VB6, Visual Info and VisImage)

Responsibilities included:

  • Designed, developed, tuned and monitored Oracle applications and MQ client-server applications in Java, JMS, C/C++ and Perl script; provided 24x7 on-call production support and technical and functional expertise to various development and functional teams on the Middleware team.
  • WebSphere Business Integration Message Broker 6.0 (WBI-MB)/WebSphere MQ Integrator 2.1: took the lead in broker configuration and Configuration Manager setup, building queue managers and queues via QPasa, UNIX shell scripts and WBI-MB ESQL; imported DTDs and COBOL copybooks into message sets; wrote ESQL to map the outgoing Interface Receive Module (IRM) COBOL copybooks to the incoming Interface Send Module (ISM) COBOL copybooks with data transformations (MRM to XML, XML to XML and XML to MRM) in the message flows for over 50 interface modules. Deployed the packaged message sets, message flows and ESQL files as broker files to the execution groups of the brokers, with Oracle 9i/8i and DB2 as backend databases; monitored queues via the QPasa MQ monitoring tool.
  • Mentored team members and coordinated development efforts, leading the development of test plans and scripts, tracking progress and conducting team status meetings.
  • Developed and reviewed test plans using Mercury Quality Center/Test Director to verify the functionality, performance and stability of the application against system requirements; automated the PolicyLink GUI application using WinRunner.
  • Led the effort to gather business requirements, functional specs and user interface documentation for the RightFax Online Customer Service Forms project using ColdFusion.
  • Led development of the Middleware web project to give end users information on queue manager attributes and queue attributes (e.g., queue depth, queue triggers) and to browse messages on the queues, using the Java MQ APIs and IBM PCF commands with a GUI built in JSP/Servlets for management.
  • Initiated, developed, documented and supported the NAIC and Guntherscan client-server application using VB6.0, VC++, the mainframe and MQSeries. The VC++ and VB6.0 driver programs are triggered by MQ queue messages sent by the mainframe; the VC++ programs receive an EBCDIC message from the queue, translate it to an ASCII-formatted text file, and spawn a batch file that calls a Delphi program, which processes the text file to generate an output TIF image.
  • Developed, documented and supported the Lead Tracking System client-server application using VB6.0 and Oracle 8i to track the number of lead agencies exported, and generated Cognos reports for end users.
