- I have over twenty years of diversified data processing experience with a strong technical background in systems analysis, functional design and specifications, test schemes, and implementation. I functioned as a Senior Systems Analyst/Programmer with extensive application programming supporting batch and online systems. Possess leadership abilities as well as being a team player.
- I am able to communicate effectively with management, technical staff, and end users. Have full life-cycle experience on each project; use CASE tools and Microsoft project management tools for tracking and design.
- IRS, Social Security Administration, Auto Insurance, Federal Deposit Insurance Commission, Student Loan Processing, Banking/Financial, Telecommunications, Purchasing, Direct Marketing, Forecasting, Retail Sales, Cash Disbursements, Inventory Control, Biomedical, Store Ordering, Fixed Assets, Order Entry, Contracts
Primary Skills: COBOL2, CICS, DB2/PLATINUM, VSAM, ENDEVOR, JCL, TSO, ISPF, INTERTEST, SPUFI, WINDOWS 10, MVS, FILEAID, SYNCSORT, HTML, RUP, MICROSOFT OFFICE, ASP.NET, CA7, AQT, QMF, APS, RequisitePro, QUICK JOB, EXPEDITOR, MQ SERIES, STAR TEAM, SQL, Xpeditor, UNISYS ECL, Quickpro, Stored Procedures
Mainframe Cobol/Architect Engineer
- Performed an analysis to determine whether ACSWEB’s dependency on Verastream Bridge Integrator (VBI) could be removed; as part of the analysis, the contractor was to recommend alternatives to VBI. Worked on the Automated Collection System (ACS), which was developed to allow the IRS to collect taxes from delinquent taxpayers. Collection Representatives (CRs) use ACS case management capabilities to contact taxpayers, review case histories, and resolve cases.
- Supported the Automated Collection System Web (ACSWEB), a web-based application that serves as a front-end Graphical User Interface (GUI) for ACS. There are three components: ACSWEB, NLSWEB, and ACSWEB Manager. ACSWEB contains taxpayers’ telephone numbers, case histories, notices, liens, and levies used to resolve cases. It allows CRs to access specific cases via Taxpayer Identification Number (TIN) or based on their inventory. ACSWEB contains screens that are updated and reviewed by the CR.
- Supported the National Levy Source Web (NLSWEB) application, which functions in conjunction with ACSWEB to provide perfected lien and levy actions. NLSWEB is an electronic database of centralized levy source addresses. The Levy Source Coordinator performs the necessary research to verify the correct levy source address. NLSWEB contains screens used to verify and update levy information.
- Supported ACSWEB Manager, an application that works in conjunction with ACSWEB for use by personnel in the ACS manager role.
- Supported the analysis of ACSWEB, NLSWEB, and ACSWEB Manager, which use Attachmate's Verastream Bridge Integrator (VBI) platform, an interface to the IBM Transaction Bridge facility that allows high-speed interaction with Customer Information Control System (CICS) regions. ACSWEB is a component of the Desktop Integration (DI) and Account Management Services (AMS) desktops for call center support. ACS is made up of applications running on workstations (desktops and laptops) and the IBM mainframe in the Tennessee Computing Center (TCC).
- GMF: The Generalized Mainline Framework (GMF) validates and perfects data from a variety of input sources - tax returns, remittances, information returns, and adjustments - and controls, validates, and corrects update transactions.
- ERS: The Error Submission System (ERS) provides for correction of errors associated with input submissions. The error inventory is managed on an ERS database, and corrected documents are validated by GMF modules.
- Worked with the Test Automation team and the BMF business section to determine which processes could be automated in the initial phase. Ran the previous IBM JCL to create a demo showing how the process would work.
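The validate-then-correct flow described for GMF and ERS can be sketched as follows. This is an illustrative sketch only: the field names (`tin`, `amount`), error codes, and validation rules are hypothetical, not the actual IRS record layouts or edits.

```python
# Sketch of a mainline validation pass that routes failing records to an
# error inventory (as ERS does), so corrected records can be re-validated
# by the same rules. Field names and error codes are hypothetical.

def validate(record):
    """Return a list of error codes for one input record."""
    errors = []
    tin = record.get("tin", "")
    if not tin.isdigit() or len(tin) != 9:
        errors.append("BAD_TIN")
    if record.get("amount", -1) < 0:
        errors.append("BAD_AMOUNT")
    return errors

def run_mainline(records):
    """Split records into accepted ones and an error inventory."""
    accepted, inventory = [], []
    for rec in records:
        errs = validate(rec)
        (inventory if errs else accepted).append((rec, errs))
    return accepted, inventory

batch = [{"tin": "123456789", "amount": 500}, {"tin": "12X", "amount": -1}]
accepted, inventory = run_mainline(batch)
print(len(accepted), "accepted;", len(inventory), "in error inventory")
```

Records pulled from the inventory would be corrected and fed back through `run_mainline`, mirroring how ERS-corrected documents are re-validated by GMF modules.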
Mainframe Developer/Conversion Engineer
- Currently working for the Office of Disability Programs Management Information (ODPMI), which developed the Disability Research File (DRF) to provide longitudinal data on the Social Security Administration’s (SSA’s) title II (t2) and title XVI (t16) disability programs. This file tracks disability claims from the filing of disability applications through full adjudication, including appeals to the federal court system. The file contains ten years of disability claims, organized by filing-year cohorts.
- SSA and the Office of Program Development and Research (OPDR) required us to perform the following tasks:
- Create a document that summarizes the changes made to the DRF, since the 2009 DRF build process;
- Identify, validate, and reconcile the best sources of data for the Programmatic Longitudinal Disability Data Repository (PLDDR);
- Develop and test DRF flat files that are representative of the data that will be found in the PLDDR; and
- Reconcile and resolve any differences between the test DRF files and the actual DRF files for ten years of data.
- Assisted with the extraction and analysis of data elements from multiple sources in varied formats, including mainframe flat files and RDBMSs. Read COBOL and DB2 code, translated logic into pseudo-code, and contributed to the development of analysis documentation.
- Worked with senior Agency technical managers and staff to provide expert-level support for mainframe application design (including systems requirements), development, testing, and maintenance efforts. Assisted in identifying and recommending best practices.
- Served as a key coordinator among multiple project teams and components to ensure enterprise-wide consistency of application development efforts. Worked without technical oversight and supervised a team of specialists. Directed the development and maintenance of applications and provided technical oversight for major projects. Oversaw the entire systems development life cycle, including systems requirements, coding, testing, and implementation of proposed systems; provided critical recommendations and solutions for complex problems; and identified and addressed barriers and risks to successful implementation.
- The repository encompassed over 43 million claims, 200 COBOL programs, and data from numerous federal and state legacy mainframes. This task is a foundational step toward implementing a robust data warehouse and business intelligence capability for the program, and just one element of SSA’s emerging data management initiatives.
- The team analyzed and proposed candidate data sources and short- and long-term data enhancements, analyzed options to streamline and update processes, and recommended solutions (e.g., data standards, coding techniques, and tools to automate business rule extraction and the ETL process).
- Reverse-engineered business rules and a data dictionary from over 200 COBOL and FOCUS programs and JCL streams.
- Conducted detailed joint requirements and design sessions with client and user groups.
- Discovered and documented defects in existing file development process and data.
- Developed high-level process flows for the As-Is and To-Be ETL processes, based on approved data sources.
- Used an Agile approach to managing the lifecycle.
- Coded and tested (COBOL, FOCUS) the extract, transform, and load process and wrote new and restructured/streamlined existing job streams (JCL).
- Created test cases and test plans; conducted system and integration testing.
- Used SAS Enterprise Guide to validate data and compare modernized file results with current files.
- Documented and resolved differences between the two files and modified code to improve data matching. The team achieved more than a 99.8 percent match between the old and the new datasets.
- Developed an implementation plan and transitioned the finished dataset to the owning organization. The plan included weekly meetings to increase buy-in and share knowledge, a SharePoint site, full life-cycle documentation and artifacts, a full data dictionary, and orienting the owning organization to the new files and programs.
- The team routinely achieved Very Good or Exceptional on monthly Performance Assessment Reports and successfully delivered the completed files and artifacts on time or ahead of schedule.
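The old-versus-new file comparison described above (validating the modernized files against the legacy DRF files and reporting a match rate) can be sketched like this. The fixed-width record layout, 9-byte key, and sample data are all hypothetical, not the actual DRF layout.

```python
# Sketch of comparing a legacy flat file against a modernized rebuild of it,
# record by record, and reporting the match rate. The record layout (a key
# field followed by data fields) is hypothetical, not the actual DRF layout.

def load_records(lines, key_len=9):
    """Index fixed-width records by their leading key field."""
    return {line[:key_len]: line[key_len:].rstrip("\n") for line in lines}

def match_rate(old_lines, new_lines, key_len=9):
    """Return (percent of old records matched exactly, mismatched keys)."""
    old = load_records(old_lines, key_len)
    new = load_records(new_lines, key_len)
    mismatched = [k for k, v in old.items() if new.get(k) != v]
    pct = 100.0 * (len(old) - len(mismatched)) / len(old)
    return pct, mismatched

old_file = ["000000001JONES  1999", "000000002SMITH  2001", "000000003DOE    2003"]
new_file = ["000000001JONES  1999", "000000002SMYTH  2001", "000000003DOE    2003"]
pct, bad = match_rate(old_file, new_file)
print(f"{pct:.1f}% matched; mismatched keys: {bad}")
```

In practice the mismatched keys would feed the difference-resolution step, with the code modified and the comparison rerun until the match rate is acceptable.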
- Developed applications that receive, stage and organize policy events that require underwriting.
- Coded mainframe programs to load policies into the Regional Queue.
- Worked on the front end in VB.NET and C++, using IMS-DC to access the back end.
- Provided a view of the work in queue which helps the resource planning process. Established relationships among underwriting reasons, decisions and follow-up letters.
- Distributed Database processing - each region has its own SQL server and File server. The application is loaded only on each of the 8 regional servers.
- Retrieved data from IDMS databases.
- EPSS works with Oasis policies; decisions made in EPSS can update a limited number of tables and fields for the policy in the Oasis database.
- EPSS then schedules Issue to get the updated rows issued.
- EPSS also schedules cancellation through the Cancellation process.
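The distributed design described above (one SQL Server and file server per region, with the application loaded on each of the 8 regional servers) implies routing each policy to its region's server. A minimal sketch, in which the region codes and server names are entirely hypothetical:

```python
# Sketch of routing a policy to the SQL Server instance for its region,
# per the one-server-per-region design. Region codes and server names
# are hypothetical.

REGION_SERVERS = {f"R{n}": f"sqlsrv-region{n}" for n in range(1, 9)}

def server_for(policy):
    """Pick the regional server that owns a policy's data."""
    try:
        return REGION_SERVERS[policy["region"]]
    except KeyError:
        raise ValueError(f"unknown region {policy.get('region')!r}")

print(server_for({"policy_no": 1001, "region": "R3"}))
```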
- The Structure Process Center constitutes the primary database supporting data collection at FDIC. FDIC personnel in the regional offices populate and retrieve structure data using the SIMS front-end application.
- This database contains current bank structure information. The Structure Distribution Center serves as the primary database for distributing data to other FDIC systems and organizations. The SDC contains data specific to the needs of various structure user communities.
- Overall responsibility was to support the SIMS (Structure Information Management Systems) back-end processing and Web/Visual Basic processing at FDIC. My responsibilities on this project included analysis using RequisitePro, which manages requirements and use cases, improves traceability, strengthens collaboration, reduces project risk, and increases quality; design; development; and testing using TestDirector, which helps deploy high-quality applications quickly and effectively by providing a consistent, repeatable process for gathering requirements, planning and scheduling tests, analyzing results, and managing defects and issues. We used StarTeam, which provided integrated requirements management, change and configuration management, project and task management, defect tracking, file versioning, threaded discussions, and implementation of multiple applications.
- These projects included development of new applications and enhancement of existing applications. Maintained timely views of structure data as collected and/or maintained by the SPC; the SDC data is updated nightly. Views have been created for easy access to SDC data. Created a DB2 table of a Financial Institution's history, including name changes, class changes, address changes, and mergers. Provided a web-based system for maintaining the SIMS tables.
- Improved the job run times for the creation of the SDC tables. Worked with Production Support in researching and fixing production problems.
- The application was written in Cobol2 using DB2 on an IBM Mainframe.
- These systems used COBOL, COBOL2, VSAM, CICS, and DB2. These applications were tested with SPUFI, QMF, INTERTEST, Endevor, and various mainframe debugging tools. Provided on-call support and technical guidance to subordinates; have worked in a 3-tier client/server architecture (desktop, server, and IBM mainframe) environment. Have a testing background and supported regression and load/performance testing in a 3-tier environment. Worked closely with the automated testing group in determining, building, and maintaining the environment and processes in support of same. Provided strategic planning/direction for the team and coordinated with interfacing application and user groups.
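A history table like the one described for Financial Institution changes (name, class, address, mergers) can be sketched in SQL. SQLite stands in for DB2 here, and the table, column names, and sample data are illustrative, not the actual SIMS/SDC schema.

```python
import sqlite3

# Illustrative sketch of an institution-history table: one row per change
# event, queried in event order. SQLite stands in for DB2; schema and data
# are hypothetical, not the actual SIMS/SDC design.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE inst_history (
        cert_no     INTEGER NOT NULL,   -- institution certificate number
        change_type TEXT    NOT NULL,   -- NAME / CLASS / ADDRESS / MERGER
        old_value   TEXT,
        new_value   TEXT,
        change_date TEXT    NOT NULL    -- ISO date of the change event
    )
""")
conn.executemany(
    "INSERT INTO inst_history VALUES (?, ?, ?, ?, ?)",
    [
        (628, "NAME",    "First Bank", "First National Bank", "1996-03-01"),
        (628, "ADDRESS", "12 Main St", "40 Oak Ave",          "1997-07-15"),
        (628, "MERGER",  None,         "cert 941",            "1999-01-04"),
    ],
)
rows = conn.execute(
    "SELECT change_type, new_value FROM inst_history "
    "WHERE cert_no = ? ORDER BY change_date", (628,)
).fetchall()
for change_type, new_value in rows:
    print(change_type, new_value)
```

Nightly refreshes of the SDC views would append new change rows, so a single query per institution reconstructs its full history in date order.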
- Responsibilities on the FUSION project included analysis, design, development, and implementation of the Student Loan application process. This project was to modify existing programs and create new programs for the Credit Services and Application Entry process for Borrowers, Students, and Schools. I worked with Production Support in researching and fixing production problems and with the EFT group modifying existing programs for the Fusion project. The application was written in Cobol2 using DB2 and CICS on an IBM mainframe.
Confidential, Arlington, VA
- I was responsible for providing all analysis, programming, technical guidance for subordinates, and on-call support for the ExpressTrak Billing System. Duties included investigating, identifying, and correcting the ‘Hot List’ billing errors and developing the software solutions (SQL) to correct them. Served as Lead Support/Analyst for the Enterprise Business Group, handling all emergency fixes for clients, assisting clients in understanding how to correct specific problems through the ExpressTrak Billing online (GUI) and batch database interface, and introducing new hires to production submissions.
- The ExpressTrak Billing system was developed using Smalltalk for online processing and Cobol2 for batch processing, with DB2 as the database.
Confidential, Greensboro, NC
- Responsibilities included system analysis, design and development of the online Employee/Union Contractual Systems. This involved analysis of the existing Employee/Union System and identifying enhancements; accommodated multiple tasks by combining several screens into one functional online screen.
- Provided user documentation and identified the new functionality. The project was written in COBOL, CICS, and DB2. Screens were modified from BMS to SDF.
Confidential, Winston-Salem, NC
- Programs had to be Y2K compliant before being put into production. Analysis and design documents had to be written and sent to users in Tulsa, OK; in short, applications had to be made Confidential Group compliant to run in the Confidential Group formats. The team built the applications using Oracle PL/SQL.
Confidential, Winston-Salem, NC
- Worked on the year 2000 project; duties were coordinating and testing complete bank applications comprising over 600 programs, both batch and online. I coordinated testing of programs and JCL to simulate actual production runs.
- This consisted of four phases: (1) baseline testing, (2) year 2001 testing, (3) turn-of-the-century testing, and (4) testing 28 years into the twenty-first century.
- These tests were done with various augmentation tools and CA7 processes; a clone of the production environment was used for the online and batch processing, and results were signed off by users when completed. These applications were written in COBOL, CICS, and DB2 for the IBM mainframe.
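A common Y2K remediation the four test phases above would exercise is two-digit-year "windowing". A minimal sketch, in which the pivot year (50) is illustrative rather than the value any particular bank application used:

```python
# Sketch of the two-digit-year windowing fix commonly applied during Y2K
# remediation, exercised at dates matching the four test phases: baseline
# (1999), 2001, turn of the century (2000), and 2028. Pivot is hypothetical.

PIVOT = 50  # two-digit years below the pivot expand to 20xx, others to 19xx

def expand_year(yy):
    """Expand a two-digit year using a fixed window around the pivot."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

for yy in (99, 1, 0, 28):
    print(f"{yy:02d} -> {expand_year(yy)}")
```

Running the same program suite at each phase's simulated system date (via date-augmentation tools and a cloned CA7 schedule, as described above) confirms the window behaves correctly on both sides of the century boundary.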
Confidential, Burlington, NC
- Responsibilities for the Warehouse Inventory Applications Enhancement project included design, analysis, development, and implementation. The re-design of the Warehouse Inventory system provided a 50% reduction in the time the previous system took to add items to the inventory.
- Provided the ability to uniquely identify each of the 65 warehouse locations, allowing each location to receive its own inventory information and reports.
Confidential, Charlotte, NC
- Responsibilities on this project included analysis, design, development, and implementation of the customer detail/summary account information online banking application. The application was written in Cobol2 using SQL and CICS on an IBM mainframe, using an APS compiler.
Confidential, Burlington, NC
- Responsibilities on this project included design, analysis, testing, debugging, and implementing applications. Primary duties were payroll, accounts receivable, accounts payable, general ledger, fixed assets, and purchasing applications.
Confidential, Greensboro, NC
- Responsibilities on this project included coding, design, analysis, testing, debugging and implementing new applications and modifications to existing applications.
- Primary duties included the conversion of a single purchase-order part system to a global, multiple-purchasing part-order system. I converted all plants’ online programs to receive these purchase orders. All programs were written in COBOL, CICS, and DB2 on an IBM mainframe; these were new applications.
Confidential, Winston-Salem, NC
- Responsibilities on this project included analysis, design, development, and implementation of multiple applications. These included development of new applications and enhancement of existing applications. These systems used COBOL, COBOL2, CICS, and DB2. These applications were tested with SPUFI, QMF, INTERTEST, and various mainframe debugging tools. I provided on-call support for 12 years.