Senior Database Consultant Resume
Princeton, NJ
SUMMARY:
Over a decade of experience in database software development and programming, with a focus on Oracle back-end development. Outstanding problem-solving and communication skills. Team player. Consistently completed assignments on schedule.
TECHNICAL PROFICIENCIES:
Database Languages: PL/SQL, T-SQL
Databases: Oracle 11g and 10g, SQL Server 2012, PostgreSQL
Cloud Infrastructure: Amazon Web Services (AWS)
Other Languages: Java, UNIX Korn Shell, Visual Basic 6
System Lifecycle: Requirements, Design, Release documents
Communication Skills: Taught at small colleges from 1982-1999 and for the Irish Government from 2002-2004.
Performance Tuning: Both SQL queries and PL/SQL code tuned for speed.
Reporting Systems: SQL Server Reporting Services (SSRS)
PROFESSIONAL CHRONOLOGY:
Senior Database Consultant
Confidential, Princeton, NJ
Responsibilities:
- PL/SQL and MS SSRS development with Oracle 11g and SSRS 2008 R2
- Monitored the accuracy of code deployment for 25 schemas across 7 environments using 4 TortoiseSVN keywords embedded in the DDL (see the sketch below).
- Wrote a package of functions that queried the data dictionary for the source of PL/SQL code, views, and materialized views; the last two store their source in LONG columns. A materialized view used these functions to keep the version information up to date.
- Built an SSRS report that let the user choose one schema in any environment as the desired code base and compare it to any of the others.
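A minimal sketch of the keyword-extraction idea, assuming a $Revision$ keyword and hypothetical names (the production package also read view and materialized view source from their LONG columns):

    -- Hypothetical sketch: pull the embedded Subversion $Revision$ keyword
    -- out of stored PL/SQL source so deployed code can be compared across
    -- schemas and environments.
    CREATE OR REPLACE FUNCTION get_svn_revision (
        p_owner IN VARCHAR2,
        p_name  IN VARCHAR2
    ) RETURN VARCHAR2
    IS
        l_rev VARCHAR2(200);
    BEGIN
        -- DBA_SOURCE stores PL/SQL source one line per row.
        SELECT MAX(TRIM(SUBSTR(text, INSTR(text, '$Revision:'))))
          INTO l_rev
          FROM dba_source
         WHERE owner = p_owner
           AND name  = p_name
           AND text LIKE '%$Revision:%';
        RETURN l_rev;
    END get_svn_revision;
    /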
SQL Server Reporting Services Developer
Confidential, Easton, PA
Responsibilities:
- Developed Microsoft SQL Server Reporting Services (SSRS) reports against SQL Server 2012.
- Designed new reports against a Navision ERP database.
- Enhanced existing reports for speed.
Senior Database Consultant
Confidential, Princeton, NJ
Responsibilities:
- PL/SQL and MS SSRS development with Oracle 11g and SSRS 2008 R2
- Spread out the processing load: wrote a procedure to be called by Autosys with an ID number (1, 2, ...). Autosys was set up to run several of these jobs in parallel, each with its own ID. Each process read its ID and used it to pull, from a configuration table, the schema names it was to process. The schemas were ordered by a sequence column in the table and could be marked Inactive; this way the load was spread out and the schemas that were important at any given time ran first (see the sketch after this list).
- Code releases were a problem: each DDL script was run by hand in each schema, which held up 20 people during the night.
- Wrote a Korn shell script that read a file listing the Programs (schemas), looped through them, and called a child script that read a file listing the DDL scripts to run in the proper order. Output from the parent and child scripts was sent to a log file, and the Oracle output was sent to an error file.
- Support tailed these files during the run. The key is that the child scripts were sent to the background and so ran in parallel. To control the number of parallel jobs as the company grew, a limit was read from a configuration file; the script tracked the child PIDs and, when one no longer existed in /proc, started another, so the limit was never exceeded while as many jobs as possible ran at once.
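A minimal sketch of the parallel-load driver described above, with table, column, and worker names assumed:

    -- Illustrative sketch: each parallel Autosys job passes its own ID;
    -- the procedure processes only the active schemas assigned to that ID,
    -- in priority order.
    CREATE OR REPLACE PROCEDURE run_schema_load (p_job_id IN NUMBER)
    IS
    BEGIN
        FOR r IN (SELECT schema_name
                    FROM load_config              -- configuration table (assumed)
                   WHERE job_id      = p_job_id
                     AND active_flag = 'Y'        -- Inactive schemas skipped
                   ORDER BY run_sequence)         -- most important first
        LOOP
            process_schema(r.schema_name);        -- assumed worker procedure
        END LOOP;
    END run_schema_load;
    /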
Senior Database Consultant
Confidential, Fort Washington, PA
Responsibilities:
- PL/SQL and DataStage development
- Converted a slow DataStage process to PL/SQL for efficiency.
- Cut the run times for several parts of the process from hours to minutes using Bulk operations.
- Migrated the system from 10g to 11g.
- Redesigned the schema to reflect all new data sources.
- Planned and implemented the separation of data and code as one company split into two.
- Solved a space issue on the Production box that was stopping the DBAs from moving data to the new infrastructure: wrote PL/SQL scripts to export and import only the subset of the data that the new company owned (see the sketch below).
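A minimal sketch of the subset copy, assuming a database link and an ownership predicate; the real scripts and criteria differed:

    -- Hypothetical sketch: copy only the rows the new company owns across
    -- a database link, avoiding a full export of the source schema.
    BEGIN
        INSERT INTO orders@new_co_db              -- link and table assumed
        SELECT *
          FROM orders
         WHERE company_id = 'NEWCO';              -- ownership predicate (assumed)
        COMMIT;
    END;
    /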
Senior Database Consultant
Confidential, Malvern, PA
Responsibilities:
- General development after a project was mothballed.
- Supported a main application using Agile practices, Oracle 10g, and PL/SQL.
Senior Database Consultant
Confidential, Horsham, PA
Responsibilities:
- Handled the refresh and update of the Performance database.
- Resolved hundreds of issues in the process.
- Used Native Dynamic SQL (NDS), data dictionary views, and a parameter table containing table names, schema names, and tablespace names for the tasks below.
- Used DBMS_SCHEDULER to fork off parts of a process on tables with a very large number of affected rows (see the first sketch after this list).
- Used the analytic function NTILE to split a set of primary keys among the forked jobs, DBMS_ALERT to register those jobs, and DBMS_ALERT.WAITANY to wait for all of them to finish so that the indexes could be rebuilt.
- For a given tablespace, schema, and table set (see the second sketch after this list):
- Used NDS and ALL_TAB_PARTITIONS to get a handle to each partition or subpartition of type interval, list, or range-list.
- Used NDS to exchange each partition into a temporary non-partitioned table and then into the target partitioned table.
- Used NDS to create a new partition after using the data dictionary to determine that it did not already exist.
- Used NDS to move each table's partitions to the correct tablespace.
- Used NDS to drop the now-empty partition.
- After all jobs doing the above had finished:
- Used NDS and ALL_INDEXES to get an array of index names.
- For each of these, used NDS to rebuild the index.
- Used BULK COLLECT and FORALL to move data between tables (see the third sketch after this list):
- BULK COLLECTed into an associative array.
- Used FORALL to insert.
- Saved any bulk exceptions and continued.
- On a bulk error, processed the error array and logged it.
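Three minimal sketches of the techniques above; every object, job, and procedure name is assumed for illustration.

First, forking DBMS_SCHEDULER jobs over NTILE buckets and waiting on DBMS_ALERT before rebuilding indexes:

    -- The coordinator registers one alert per forked job; each worker
    -- processes its NTILE bucket of primary keys and signals when done;
    -- the coordinator waits for every alert before the index rebuilds.
    DECLARE
        c_jobs    CONSTANT PLS_INTEGER := 4;  -- degree of parallelism (assumed)
        l_name    VARCHAR2(128);
        l_message VARCHAR2(1800);
        l_status  INTEGER;
        l_pending PLS_INTEGER := 0;
    BEGIN
        FOR b IN 1 .. c_jobs LOOP
            DBMS_ALERT.REGISTER('BUCKET_DONE_' || b);
            -- Each worker selects its keys roughly like this:
            --   SELECT pk FROM (SELECT pk, NTILE(4) OVER (ORDER BY pk) bkt
            --                     FROM big_table) WHERE bkt = :b;
            DBMS_SCHEDULER.CREATE_JOB(
                job_name   => 'BUCKET_JOB_' || b,
                job_type   => 'PLSQL_BLOCK',
                job_action => 'BEGIN process_bucket(' || b || ');'
                           || ' DBMS_ALERT.SIGNAL(''BUCKET_DONE_' || b
                           || ''', ''ok''); COMMIT; END;',
                enabled    => TRUE);
            l_pending := l_pending + 1;
        END LOOP;
        WHILE l_pending > 0 LOOP
            DBMS_ALERT.WAITANY(l_name, l_message, l_status, timeout => 3600);
            EXIT WHEN l_status = 1;           -- 1 = timed out, 0 = alert fired
            l_pending := l_pending - 1;
        END LOOP;
        -- All buckets finished: the indexes can now be rebuilt.
    END;
    /

Second, walking ALL_TAB_PARTITIONS with NDS and exchanging each partition through a non-partitioned staging table:

    -- Swap each partition's segment into a plain staging table, then into
    -- the target partitioned table (the target partition is created first
    -- via NDS if the data dictionary shows it missing).
    BEGIN
        FOR p IN (SELECT partition_name
                    FROM all_tab_partitions
                   WHERE table_owner = 'PERF'          -- schema (assumed)
                     AND table_name  = 'FACT_DAILY')   -- table (assumed)
        LOOP
            EXECUTE IMMEDIATE
                'ALTER TABLE perf.fact_daily EXCHANGE PARTITION '
             || p.partition_name
             || ' WITH TABLE perf.fact_stage WITHOUT VALIDATION';
            EXECUTE IMMEDIATE
                'ALTER TABLE perf.fact_daily_new EXCHANGE PARTITION '
             || p.partition_name
             || ' WITH TABLE perf.fact_stage WITHOUT VALIDATION';
        END LOOP;
    END;
    /

Third, BULK COLLECT and FORALL with SAVE EXCEPTIONS and error logging:

    -- Batch rows from source to target; failed rows are caught by SAVE
    -- EXCEPTIONS and logged instead of aborting the whole load.
    DECLARE
        TYPE t_rows IS TABLE OF src_table%ROWTYPE INDEX BY PLS_INTEGER;
        l_rows      t_rows;
        bulk_errors EXCEPTION;
        PRAGMA EXCEPTION_INIT(bulk_errors, -24381);    -- ORA-24381
        CURSOR c IS SELECT * FROM src_table;
    BEGIN
        OPEN c;
        LOOP
            FETCH c BULK COLLECT INTO l_rows LIMIT 1000;
            EXIT WHEN l_rows.COUNT = 0;
            BEGIN
                FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
                    INSERT INTO tgt_table VALUES l_rows(i);
            EXCEPTION
                WHEN bulk_errors THEN
                    FOR e IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
                        log_error(SQL%BULK_EXCEPTIONS(e).ERROR_INDEX,  -- assumed logger
                                  SQL%BULK_EXCEPTIONS(e).ERROR_CODE);
                    END LOOP;
            END;
        END LOOP;
        CLOSE c;
        COMMIT;
    END;
    /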
Independent Consultant
Confidential, Easton, PA
Responsibilities:
- Developed with Java, PHP, and MySQL stored procedures; took training for Oracle 11g.
- Used the Oracle 10g database via PL/SQL to import, process, and extract (ETL) many large data sources.
- The data arrived as non-normalized flat files, which were imported, cleaned, and stored as non-normalized data.
- Wrote complex SQL queries using inline views in both the SELECT list and the FROM clause, with GROUP BY roll-ups of this non-normalized data; subqueries could be nested three deep (see the sketch below).
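A minimal sketch of the roll-up pattern, with feed table and column names assumed:

    -- Hypothetical sketch: an inline view in the FROM clause cleans the raw
    -- feed, and the outer query rolls it up with GROUP BY; in practice such
    -- subqueries were nested up to three deep.
    SELECT r.region,
           r.product,
           SUM(r.amount) AS total_amount
      FROM (SELECT region,
                   product,
                   TO_NUMBER(amount_txt) AS amount    -- raw feed column (assumed)
              FROM feed_raw
             WHERE load_status = 'CLEAN') r
     GROUP BY r.region, r.product
     ORDER BY r.region, r.product;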
Consultant/Software Developer
Confidential, Lansdale, PA
Responsibilities:
- Automated the manual returns process: extracted and read EDI data and found the proper invoice and amount to ensure the data matched and the proper amount was credited to the buyer.
- The program processed tens of millions of dollars without issue and saved valuable man-hours.
- Used PL/SQL to create an automated program for processing returned goods. The process was driven by EDI data transfers from third parties; the file was read and the order reassembled from invoices, shipping data was reassembled from packing slips, and the order was compared to ensure completeness.
- The program checked the conditions that had to be satisfied to credit the full price; the price was calculated accurately, and adjustments to the original order were credited (or not) depending on those conditions.
- BULK COLLECTed data into associative arrays, obtained subsidiary data with CURSOR FOR loops from keys in the arrays, used Native Dynamic SQL to get invoices and shipping lists, merged the order and shipping arrays so they appeared as one order and one shipment for comparing returned goods against those ordered and sent, and wrote a log file recording successes and rejection reasons.
- Automated the MOV process and the calculation of minimum order requirements under various conditions. Used complex data to make the process data driven and to calculate the MOV: for a given line item the lowest applicable minimum value was applied, while for the whole order the highest was.
- Read an e-mail distribution list from a UNIX flat file and passed it into a PL/SQL script. Redirected SQL*Plus output to a log file and grepped the log for ORA- errors.
Consultant
Confidential
Responsibilities:
- Created PL/SQL procedures, including a program to do a large select and then pivot an unknown amount of data from vertical to horizontal (see the sketch after this list).
- Cut run times from 40 minutes to one minute in one case, and from 20 hours to 15 minutes in another, by rewriting SQL cursors.
- Expedited an old process (a Korn shell script that ran "cat" 19,000 times against a 60 MB flat file on a UNIX box) that usually took 24 hours or longer: developed a combination of Korn and PL/SQL scripts to load the 60 MB file into an Oracle table with SQL*Loader (built dynamically) and query that table to filter the desired data. The new process ran in about 20 minutes; array sorts and binary searches further improved performance.
- Wrote several Korn shell scripts to move data between servers.
- Used FTP for UNIX-to-UNIX transfers and Samba for Windows-to-UNIX. Redirected output and grepped it for errors. Encrypted important data using GnuPG.
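A minimal sketch of the vertical-to-horizontal pivot, with table and column names assumed; the SELECT list is built dynamically because the number of attributes is unknown:

    -- Hypothetical sketch: one MAX(DECODE(...)) column is generated per
    -- distinct attribute, then a ref cursor is opened over the pivot query.
    DECLARE
        l_cols VARCHAR2(32767);
        l_cur  SYS_REFCURSOR;
    BEGIN
        FOR a IN (SELECT DISTINCT attr_name
                    FROM attr_rows
                   ORDER BY attr_name)
        LOOP
            l_cols := l_cols
                   || ', MAX(DECODE(attr_name, ''' || a.attr_name
                   || ''', attr_value)) AS "' || a.attr_name || '"';
        END LOOP;
        OPEN l_cur FOR
            'SELECT entity_id' || l_cols
         || ' FROM attr_rows GROUP BY entity_id';
        -- Fetch from l_cur and write the pivoted rows out.
    END;
    /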