Database Developer Resume
SUMMARY:
Twenty-five plus years of experience in database development, including logical and physical modeling, E/R diagramming, source-to-target mapping and data analysis, ETL development (Ab Initio, Informix, UNIX Shell, AWS Data Pipeline), database administration, and project management.
AREAS OF EXPERIENCE:
Software: AWS (Schema Conversion Tool, Database Migration Service, Data Pipeline, S3), PostgreSQL, Sybase IQ v15.5, Oracle 10g, SQL Server, Sybase Interactive, Informix IDS v11.5, Informix XPS, Informix OnLine, Ab Initio, SSIS, Talend, UNIX Shell, Python, Perl, AWK, SED, Informix 4GL, Business Objects Data Integrator, SQL, SQL*Plus, Confidential, NDM, and Service Center (Peregrine)
Operating Systems: Linux, AIX, Windows 2000, SCO UNIX, Sun Solaris, Interactive UNIX, DG/UX, and XENIX
COTS Packages: pgAdmin III, Toad Data Point, Toad Data Modeler, Oracle SQL Developer, GitHub, Confluence, Rally, HipChat, PowerDesigner, Microsoft desktop tools, Lotus Notes, Microsoft Outlook, Vantive, Confidential, Visio, IBM Maximo, Borland StarTeam, Microsoft SharePoint, Harvest, Crystal Reports, Rational ClearCase, Rational ClearQuest
PROFESSIONAL EXPERIENCE:
Confidential
Database Developer
Responsibilities:
- Working on an Agile development project, performed data migration from several on-premises Oracle databases to RDS Oracle instances and RDS PostgreSQL databases.
- Data migration of an on-premises Oracle database to an RDS PostgreSQL database; independently created and executed the entire process. The process used Toad Data Point, SQL Developer, and UNIX shell scripts to extract the data, load it to AWS S3, and then load it into an RDS PostgreSQL database (a minimal sketch of this flow follows this list). The final step was to ETL the data from that RDS PostgreSQL database to another RDS PostgreSQL database using only UNIX shell scripts. The initial load will later be transformed into an AWS Redshift data warehouse for data analysis and reporting.
- Data migration of an on-premises Oracle database to an RDS PostgreSQL database; independently created and executed the entire process. The process included Oracle Export, Oracle Import, UNIX shell and Python scripts for ETL work, a Perl script to transfer data from EC2 to the RDS back end, AWS Database Migration Service, and the AWS Schema Conversion Tool.
- Data migration included data analysis, data profiling and construction of data requirements.
- Conducted data profiling using the database schemas, the front-end application, and the actual data in the database. Identified logical data entities and presented them to the customer in an effort to determine data requirements for the ETL phase of the data migration.
- Creation, maintenance, and enhancement of Talend jobs that import data from a secure FTP server into an RDS PostgreSQL database.
- Create on-demand custom reports as requested by the client using pgAdmin III and Excel.
- Administrator of the RDS database for routine tasks, including user accounts and ad-hoc reports requested by the client.
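A minimal sketch of the extract-to-S3-to-RDS flow referenced above, assuming a CSV extract and hypothetical bucket, connection, and table names; the actual migration was driven by Toad Data Point, SQL Developer, and UNIX shell scripts rather than this exact Python script:

```python
# Sketch only: stage an Oracle CSV extract in S3, then bulk-load it into RDS PostgreSQL.
# All names (bucket, host, credentials, table) are hypothetical placeholders.
import boto3
import psycopg2

S3_BUCKET = "example-migration-staging"
EXTRACT_FILE = "customers_extract.csv"   # CSV produced by the Oracle extract step

# Stage the extract in S3 so it is available to the AWS-side load process.
s3 = boto3.client("s3")
s3.upload_file(EXTRACT_FILE, S3_BUCKET, f"incoming/{EXTRACT_FILE}")

# Bulk-load the file into the RDS PostgreSQL target using COPY.
conn = psycopg2.connect(host="example-rds.amazonaws.com", dbname="target",
                        user="etl_user", password="***")
with conn, conn.cursor() as cur, open(EXTRACT_FILE) as f:
    cur.copy_expert("COPY stage.customers FROM STDIN WITH CSV HEADER", f)
conn.close()
```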
Confidential
Project Manager / Senior Consultant
Responsibilities:
- Basic project management for a contract at the IRS. PM responsibilities include Monthly client/stakeholder Status Reports, team meetings, employee performance reviews, project plan development and maintenance, lessons learned, change request creation and tracking, user and technical documentation, configuration management implementation and active use of and adherence to Capability Maturity Model Integration (CMMI) established procedures.
- Training, document development, and implementation of CMMI-required processes in order to achieve a CMMI Maturity Level 2 rating.
- Overseeing and assisting in the installation of Oracle 11g.
- Overall team management of 3 employees, including 2 members building metadata for 2,000+ tables and 160,000+ columns. Metadata was collected in Excel spreadsheets and uploaded to a SQL Server 2008 database using SAS macros. Team development of a new, user-friendly UNIX shell script and MS Access front end for creating more accurate metadata.
- Database modeling of current and ‘merged’ databases using SAP’s PowerDesigner.
- Independent and team development of UNIX shell, Python, Perl, and AWK scripts to load historical mainframe flat-file data to a 10+ TB Corporate Data Warehouse (CDW) running in Sybase IQ version 15.5 on a UNIX Solaris server. Scripts are developed to load and transform Tax Return Database (TRDB) data to client specifications. For each year of data, program scripts are developed, in coordination with a team of CDW database administrators, based on existing standards and methods. Some pre-processing is executed to combine multiple input files into a single table (to denormalize the data), derive standard columns, and mask confidential data using functions and algorithms provided by the Government (a simplified sketch of this pattern follows this list). Data processing and load scripts are developed following standard SDLC methodologies.
- Production support of installed TRDB and TRDBM Database to include new data loads on a weekly, quarterly and yearly basis.
- Respond to customer inquiries on suspicious data returns from client-developed SAS extract processes. Analysis ranges from a simple invalid query join to a complex query requiring a modified ETL process, new aggregated tables, or escalation to the metadata research team for in-depth data analysis at the mainframe level.
- Create and track client- and developer-initiated Change Requests (CR) from start to implementation following standard SDLC processes. This includes initiation in an in-house change request tracking system, gathering and documenting business and technical requirements, peer review of application code changes (UNIX shell scripts supported with AWK, SED, and Perl scripts), unit testing, systems testing, implementation, user acceptance testing, and quality assurance testing. Source code is stored via Concurrent Versions System (Confidential) and implemented in scheduled releases to Production.
- Development of a complete Technical Project Document to capture all business requirements, technical solutions, and user guide steps.
- Adherence to all CMMI requirements / documents.
- Participate in software demonstrations and evaluation for possible solutions to current and predicted needs.
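A simplified Python sketch of the flat-file pre-processing pattern described in the CDW load bullet above: combine multiple extract files into one output, derive a standard column, and mask a confidential field. The file names, record layout, and masking rule are hypothetical; the production work used Government-provided masking functions and UNIX shell, Python, Perl, and AWK scripts.

```python
# Sketch only: combine multiple pipe-delimited input files, derive a column, mask a field.
# The layout (ssn, return_date, amount) and the masking rule are hypothetical placeholders.
import csv
import glob
import hashlib

def mask(value: str) -> str:
    # Placeholder one-way mask; the real algorithms were supplied by the Government.
    return hashlib.sha256(value.encode()).hexdigest()[:9]

with open("trdb_combined.txt", "w", newline="") as out:
    writer = csv.writer(out, delimiter="|")
    for path in sorted(glob.glob("trdb_input_*.txt")):   # many input files -> one table
        with open(path, newline="") as f:
            for row in csv.reader(f, delimiter="|"):
                if not row:
                    continue
                ssn, return_date, amount = row            # hypothetical 3-field layout
                tax_year = return_date[:4]                # derived standard column
                writer.writerow([mask(ssn), tax_year, amount])
```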
Confidential
Informix Developer DBA
Responsibilities:
- Back-end support to developers, including schema and data changes, across several hundred Informix instances on multiple servers.
- Changes are implemented using flat files and UNIX shell / T-SQL scripts.
- Software configuration management handled using Borland StarTeam.
- Release management handled via IBM’s Maximo software.
- Metadata work on a new system containing over 5,000 fields. Work includes normalization, compliance with metadata standards, and source-to-target mapping.
- Documentation maintained through Microsoft SharePoint.
Confidential
Informix DBA
Responsibilities:
- Maintenance and production support of 4 database servers, including the planning and execution of recreating 30 tables to eliminate a high number of data extents.
- Implementation of a disaster recovery environment using Informix’s Enterprise Replication with high-availability clusters.
- Design, development, and installation of a reporting database in Informix. The database was refreshed daily with aggregated data and published to end users (a sketch of this type of refresh appears below). The reporting database was implemented to reduce database contention and expedite report execution time.
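A sketch of the kind of daily refresh behind the reporting database above, assuming the aggregates are rebuilt with plain SQL executed through Informix's dbaccess utility; the database, table, and column names are hypothetical:

```python
# Sketch only: rebuild a reporting aggregate and run it via dbaccess.
# Database, table, and column names are hypothetical placeholders.
import subprocess
import tempfile

REFRESH_SQL = """
BEGIN WORK;
DELETE FROM rpt_case_summary;
INSERT INTO rpt_case_summary (case_type, case_count, total_amount)
    SELECT case_type, COUNT(*), SUM(amount) FROM cases GROUP BY case_type;
COMMIT WORK;
"""

# Write the refresh statements to a temporary script file.
with tempfile.NamedTemporaryFile("w", suffix=".sql", delete=False) as f:
    f.write(REFRESH_SQL)
    sql_file = f.name

# dbaccess <database> <script.sql> runs the script against the reporting database.
subprocess.run(["dbaccess", "report_db", sql_file], check=True)
```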
Confidential, Washington, DC
ETL Application Developer
Responsibilities:
- Implementation and data migration of historical Informix data into a SQL Server database.
- Creation and maintenance of SSIS processes designed to transform historical bankruptcy data into data cubes for reporting.
- Support and enhancements to existing application code written in a combination of Informix stored procedures and Linux shell scripts. Data is received via XML files from the mainframe and cleansed through complex ETL business-rule processes; cleansed data is loaded to the applicable data store, and notification reports of invalid data are returned to the client as PDF reports.
- Complete system development of a new application to report on Judge Activity, including input to and review of business requirements, data store (Informix star schema) development, and the entire ETL application, written with only one additional ETL developer. The application was designed to extract, transform, and load Judge activity data to the main data warehouse and then extract, transform, and load the data to a report-level data store.
- Independently, without assignment, created new Linux/SQL KSH and Python scripts that automated the daily processing of the XML data through the ETL application, reviewed the logs for success or failure, and emailed the results to appropriate personnel (see the sketch after this list).
- Executed analysis to complete source to target documents.
- Solely responsible for initiation, development and implementation of SDLC processes and applicable documentation around implemented application code changes.
- Conversion of Informix stored procedures to GUI ETL Data Integrator (Business Objects) processes.
- Solely responsible for the daily processing of XML data files loaded and processed daily to the Bankruptcy and Adversary developed data stores.
- Development of technical requirements in support of customer requested changes.
- Review of approved application code changes with client to ensure desired results are achieved.
- Enhance/develop application code as directed through change requests via Clear Quest.
- Maintain Configuration Management practices with use of Clear Case for software version control.
- Execute the SDLC changes as required through the code development/maintenance.
- Validate Business Objects (BOE) generated SQL against backend Informix SQL to validate report design and data results.
- Development, maintenance, and execution of Quality Assurance scripts/processes to validate processed data.
- Development and maintenance of documents that support the processes of regenerating and reloading data to the data warehouse.
- Development of applicable SDLC processes and documentation in support of the project.
- Administration of adherence to implemented SDLC processes.
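A minimal Python sketch of the daily automation described above (run the XML ETL step, scan its log, and email the outcome). The paths, addresses, and ETL command are hypothetical stand-ins for the actual KSH and Python jobs.

```python
# Sketch only: run the daily XML ETL step, check its log, and email the result.
# The script path, log path, and email addresses are hypothetical placeholders.
import smtplib
import subprocess
from email.message import EmailMessage

LOG_FILE = "/var/log/etl/daily_xml_load.log"

result = subprocess.run(["/opt/etl/bin/daily_xml_load.sh"],
                        capture_output=True, text=True)

with open(LOG_FILE) as f:
    log_text = f.read()
failed = result.returncode != 0 or "ERROR" in log_text

msg = EmailMessage()
msg["Subject"] = "Daily XML ETL: " + ("FAILED" if failed else "succeeded")
msg["From"] = "etl@example.gov"
msg["To"] = "support-team@example.gov"
msg.set_content(log_text[-2000:])            # tail of the log as the message body

with smtplib.SMTP("localhost") as smtp:
    smtp.send_message(msg)
```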
Informix DBA
Confidential
Responsibilities:
- Support and continued development of a 500 GB star-schema data warehouse for the Administrative Office of the US Courts, designed to collect court case statistics.
- Tuning of the test environment. Efforts include query and stored procedure (SPROC) benchmarking, engine/onconfig customization, and environment variable tuning.
- Optimizing query execution: sqexplain analysis, table and index fragmentation variances, creation of SPROCs where applicable, addition and deletion of optimizer directives, implementation and execution of parallel data query (PDQ) where applicable, and caching options.
- Loading of flat file data per customer request using Informix’s High Performance Loader (HPL) tool, and Informix’s dbLoad utility.
- Monitoring query performance, identify possible bottlenecks and propose possible solutions to improve the performance on the production instance of Informix.
- Configuring and implementing table fragmentation to increase query performance.
- Upgrades and patches: Informix 11.10 to 11.5.
Confidential
IT Developer
Responsibilities:
- Review of analyst/user submitted Business Requirements for accuracy.
- Independent and team development of corresponding Technical Requirements.
- Design and develop/maintain code for various applications per Business and Technical requirements. Application code is developed to take advantage of the MPP infrastructure to ensure query return time is optimized. Applications are in Confidential’s Corporate Data Warehouse (8 TB) and Data Mart (2 TB and under) environments. Development is done using Ab Initio (GDE) in the Enterprise Metadata Environment (EME) with anticipated/required exception handling and rollback logic, appropriate parallelism considerations, and K-shell scripts. The business applications extract, transform/cleanse, and load data from the Corporate Data Warehouse into Informix XPS (similar to Teradata) Data Marts. The Data Warehouse and Data Marts run on an AIX MPP platform.
- K-shell scripts developed to execute set-up steps such as unloading data from the data mart for use as look-up tables in Ab Initio, copy user input data from server to server, run as wrapper scripts for repeatable Ab Initio code, check error logs, and perform any other required utility work not deemed suitable to build in Ab Initio (a simplified wrapper sketch appears at the end of this list).
- Primary point of contact for Confidential jobs. Create, implement and maintain back-ups of all Confidential jobs. Responsible for manually implementing job steps in the Integrated Test Region and responsible for completing Confidential job requests to implement the jobs in the Production Region.
- Conversion of SAS code to either Informix stored procedures or Ab Initio code as requested by users.
- Develop and maintain application code using SDLC and Capability Maturity Model (CMM) methodologies to meet Sarbanes-Oxley requirements.
- Develop Unit test plans using required methodologies.
- Work with the testing team and DBA in development and planning of required System tests.
- Develop Risk Mitigation, Implementation, Verification and Back-out plans.
- Design and development of job definitions and job flow dependencies for various applications using the Confidential job scheduler tool.
- Maintenance and enhancements to an application used to store production support calls/pages. Application is written in C.
- Management of projects using Peregrine’s Service Center software (IT Infrastructure Library Change and Problem Management).
- Participation in root-cause analysis sessions in a continuous effort to better the application code.
- Execute and document results of unit testing in the Development Region. Unit testing is conducted by manually executing applicable K-shell and Ab Initio scripts from the command line.
- Package and synchronize source code from the Development Region to the Integrated Test Region. Execute the established processes as required to complete the Integrated Test in the Integrated Test Region. Integration testing is conducted by executing applicable K-shell scripts and Ab Initio graphs using implemented Confidential jobs, which ensures both the application code and the scheduled jobs are tested. The application code is executed and the data is loaded to the Integrated Test Region for DQ/UAT validation before it is packaged again and synced to the Production Region.
- On-call support of Confidential jobs that execute daily, weekly, monthly, and quarterly Ab Initio and K-shell script processes. The monthly refresh extracts, transforms, and loads approximately 150 GB of data from the data warehouse to the data mart.
- Execution of monthly processes to extract data from the Data Warehouse and load the data to a DQ/UAT database on the production server. Load any additional data requested for QA or UAT testing. Once validated, the data is moved by the DBA from the staging area to the production DB.
- Research and resolve assigned problem tickets.
- Quality analysis of data extraction and FTP processes to ensure data received by external customers is accurate. Internal customer data extract is loaded to a campaign mailing Oracle 10g DB.
- Production of NDM requests for users as required.
- Assist MicroStrategy users with connectivity, query response time and analysis when confronted with unexpected data results.
- Analysis of back-end data issues (Oracle 10g DB) using SQL*Plus per customer request.
- Installation and configuration of ODBC drivers for DQ and UAT access to the several Data Marts.
- Complete project estimates as assigned. Estimates include all steps required to implement the project: development, testing, documentation, and implementation. Estimates are used in the completion of the project plan (MS Project) and in the scheduling of the integrated test period on the Integrated Test Region/server.
- Responsible for the creation of, or modifications to, the project plan (MS Project) depicting milestones and steps required to implement the project. The plan includes Development, Integrated Testing, and Implementation to Production.
- Completion of all required System Development Life Cycle (SDLC) and Source and Executable Code Development Management (SECDM) documents under tight Sarbanes-Oxley (SOX) compliance regulations. Documents are stored in both hard copy and on the department portal and include documents such as: Technical Requirements, Test Plans, Code Reviews, Implementation Plans, Back-out Plans, and Verification Plans.
- Primary POC for all SOX issues for the team. Attend all SOX compliance meetings and ensure documents (existing and new) are SOX compliant. Ensure the team is following SOX-developed practices and is aware of SOX compliance changes.
- Review of created documents for SOX compliance before close of the project.
- Creation of documentation templates where applicable. Templates increased the speed of documentation development time, and decreased the developer’s frustration and confusion on compliance requirements.
- Creation and maintenance of required team technical documentation.
- Primary point of contact for the Metadata team to the Sigma and CDW data marts. Responsibilities included field level mapping from copybook to database.
- Confidential data modeler for 4 data marts.
- Basic Informix DBA responsibilities for the Sigma Data Mart.
- Assistance with the development and maintenance of an Informix I-Spy web-based application used for database monitoring and reporting. The application is written in Perl and Ab Initio.
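An illustrative Python wrapper for the set-up/wrapper pattern described in the K-shell bullet earlier in this list: stage a lookup extract, run a deployed Ab Initio graph, then scan its log. All paths and script names are hypothetical, and the real wrappers were written in K-shell rather than Python.

```python
# Sketch only: stage lookup data, run a deployed graph, and check its log for errors.
# Every path and command name here is a hypothetical placeholder.
import subprocess
import sys

GRAPH_SCRIPT = "/apps/abinitio/run/load_datamart.ksh"   # deployed graph run script
LOG_FILE = "/apps/logs/load_datamart.log"

# Step 1: unload lookup data from the data mart for use in the graph (hypothetical command).
subprocess.run(["/apps/bin/unload_lookup.sh", "prod_datamart", "ref_codes"], check=True)

# Step 2: run the deployed graph and capture its output in the log file.
with open(LOG_FILE, "w") as log:
    rc = subprocess.run(["ksh", GRAPH_SCRIPT],
                        stdout=log, stderr=subprocess.STDOUT).returncode

# Step 3: fail the wrapper if the graph or its log reports an error.
with open(LOG_FILE) as log:
    if rc != 0 or "ERROR" in log.read():
        sys.exit(f"Graph failed (rc={rc}); see {LOG_FILE}")
```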
Confidential
Informix DBA
Responsibilities:
- Support and maintenance of over 50 databases residing on approximately 20 Sun servers running Informix versions 7.24 and 7.30.
- Creation and implementation of new Informix instances as requested by the user.
- Development of UNIX shell scripts to minimize the time required for user access maintenance (see the sketch after this list).
- Customer on-call support for three servers used primarily for development and quality assurance.
- Participation in the development of a migration plan for the purpose of migrating all Informix systems from Rockville, Maryland, to Charlotte, North Carolina.
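A sketch of the user-access maintenance idea referenced above: generate the repetitive GRANT statements for a batch of accounts and run them through Informix's dbaccess utility. The database name, privilege level, and user list are hypothetical, and the original scripts were plain UNIX shell rather than Python.

```python
# Sketch only: generate GRANT statements for new accounts and apply them with dbaccess.
# Database name and user list are hypothetical placeholders.
import subprocess
import tempfile

DATABASE = "app_db"
NEW_USERS = ["jsmith", "mlee", "tgarcia"]    # accounts requested by the customer

grants = "".join(f"GRANT CONNECT TO {user};\n" for user in NEW_USERS)

with tempfile.NamedTemporaryFile("w", suffix=".sql", delete=False) as f:
    f.write(grants)
    sql_file = f.name

# Apply the generated grants against the target database.
subprocess.run(["dbaccess", DATABASE, sql_file], check=True)
```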
Confidential
IT Manager / Informix DBA / Informix 4GL Programmer
Responsibilities:
- Installation and configuration of an Informix 7.3 engine on a DG AV3650 server. The installation included the Informix software and the database conversion from 5.05 to 7.3.
- Maintenance of all user access and securities.
- Creation and implementation of backup, startup and shutdown scripts, and procedures.
- Conversion of the application code, and development and execution of a complete unit and system test plan. The application code conversion was required due to an OS upgrade from DG/UX 3.10 (Motorola) to DG/UX 4.2 (Intel) and an upgrade from Informix OnLine 5.05 to Informix IDS 7.3.
- Creation and maintenance of Confidential diagrams for the purpose of documentation.
- Maintenance of existing Warehouse Management System (WMS) application code. The core of the WMS is Haushahn’s Viaware software (Informix 4GL and C code). Additional Informix 4GL code was developed and layered over the core application to provide Radio Frequency (RF) capabilities for receiving, picking & packing, and shipping.
- Creation and maintenance of EDI programs and scripts used to transmit order/shipment information to customers.
- Overall responsibility for all hardware and software used in the operation of the distribution center.
- General management of all aspects relating to the IT department and IT staff. This includes timecards, performance reviews, team meetings, management staff meetings, budget planning, salary increases and plans, job titles and responsibilities, training and other general responsibilities.
- Planning and direction of future IT enhancements and maintenance. Responsible for developing, and ensuring adherence to, the plans for Y2K compatibility of all warehouse hardware and software.