- Over twenty-five years of diversified data-processing experience with a strong technical background in systems analysis, functional design and specifications, test schemes, and implementation.
- Functioned as a Senior Systems Analyst/Programmer with extensive application programming experience supporting batch and online systems.
- Possess leadership abilities as well as being a strong team player; able to communicate effectively with management, technical staff, and end users.
- Have full life-cycle experience on each project; use CASE tools and Microsoft project-management tools for tracking and design.
Primary Skills: COBOL (COBOL 74, COBOL II), CICS, DB2/PLATINUM, VSAM, Endevor, JCL, TSO, ISPF, INTERTEST, SPUFI, Java, Windows XP, MVS, FILE-AID, SYNCSORT, HTML, RUP, Microsoft Office, ASP.NET, CA-7, AQT, QMF, APS, RequisitePro, QUICK JOB, stored procedures, Xpediter, StarTeam, TestDirector
System Conversion Engineer
- Currently working for the Office of Disability Programs Management Information (ODPMI), which developed the Disability Research File (DRF) to provide longitudinal data on the Social Security Administration’s (SSA’s) title II (t2) and title XVI (t16) disability programs. This file tracks disability claims from the filing of disability applications through full adjudication, including appeals to the federal court system. The file contains ten years of disability claims, organized by filing-year cohorts.
- SSA and the Office of Program Development and Research (OPDR) required us to perform the following tasks:
- Create a document that summarizes the changes made to the DRF since the 2009 DRF build process;
- Identify, validate, and reconcile the best sources of data for the Programmatic Longitudinal Disability Data Repository (PLDDR);
- Develop and test DRF flat files that are representative of the data that will be found in the PLDDR; and
- Reconcile and resolve any differences between the test DRF files and the actual DRF files for ten years of data.
- Assist with the extraction and analysis of data elements from multiple sources in varied formats, including mainframe flat files and RDBMSs. Read COBOL code, translate logic into pseudo-code, and contribute to the development of analysis documentation.
- Work with senior Agency technical managers and staff to provide expert-level support for mainframe application design (including systems requirements), development, testing, and maintenance efforts; assisted in identifying and recommending best practices.
- Serve as one of the key coordinators among multiple project teams and/or components to ensure enterprise-wide consistency of application development efforts. Worked without technical oversight and supervised a team of specialists. Directed the development and maintenance of applications and provided technical oversight for major projects. Oversaw the entire systems development life cycle, including systems requirements, coding, testing, and implementation of proposed systems; provided critical recommendations and solutions for complex problems; and identified and addressed barriers and risks to successful implementation.
- Confidential provided a team of nine, including managers, analysts, programmers, and data architects, to the SSA to modernize one of its largest repositories of longitudinal data for a large benefits program. The repository encompassed over 43 million claims, 200 COBOL programs, and data from numerous federal and state legacy mainframes. This task is a foundational step toward implementing a robust data warehouse and business intelligence capability for the program and just one element of SSA’s emerging data management initiatives. The team:
- Analyzed and proposed candidate data sources, short and long-term data enhancements, and mapped them to current data elements. The team improved the data layout and incorporated enhanced data and coding standards, while expanding the available data set for end customers.
- Analyzed options to streamline and update processes and recommended solutions (e.g., data standards, coding techniques, tools to automate business rule extraction or the ETL process).
- Reverse-engineered business rules and a data dictionary from over 200 COBOL and FOCUS programs and JCL streams.
- Conducted detailed joint requirements and design sessions with client and user groups.
- Discovered and documented defects in existing file development process and data.
- Developed high-level process flows for the As-Is and To-Be ETL process, based on approved data sources.
- Used an Agile approach to managing the lifecycle.
- Coded and tested (COBOL, FOCUS) the extract, transform, and load process and wrote new and restructured/streamlined existing job streams (JCL).
- Created test cases and test plans; conducted system and integration testing.
- Used SAS Enterprise Guide to validate data and compare modernized file results with current files.
- Documented and resolved differences between the two files and modified code to improve data matching. The team achieved more than a 99.8 percent match between the old and the new datasets.
- Developed an implementation plan and transitioned the finished dataset to the owning organization. The plan included weekly meetings to increase buy-in and share knowledge, a SharePoint site, full life-cycle documentation and artifacts, a full data dictionary, and orientation of the owning organization to the new files and programs.
- The team routinely achieved Very Good or Exceptional on monthly Performance Assessment Reports and successfully delivered the completed files and artifacts on time or ahead of schedule.
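As an illustration only (not code from the project), the reverse-engineering of business rules from COBOL programs described above can be sketched as a scan that pulls paragraph names and IF conditions out of source text as rule candidates. The sample COBOL fragment and all field names below are invented for this sketch:

```python
import re

# Hypothetical COBOL fragment, invented for illustration only.
# Real programs would be read from Endevor-managed source members.
SAMPLE = """\
       PROCEDURE DIVISION.
       0100-VALIDATE-CLAIM.
           IF CLAIM-TYPE = 'T2' AND FILING-YEAR > 1999
               PERFORM 0200-LOAD-COHORT.
       0200-LOAD-COHORT.
           IF APPEAL-FLAG = 'Y'
               MOVE 'FEDERAL-COURT' TO ROUTE-CODE.
"""

def extract_rules(source: str):
    """Return (paragraph names, IF conditions) found in COBOL source text."""
    # Paragraph labels start in Area A (column 8 in this fixed-format sketch)
    # and end with a period on their own line.
    paragraphs = re.findall(r"^ {7}([0-9A-Z][0-9A-Z-]*)\.\s*$", source, re.M)
    # IF conditions are raw business-rule candidates for the data dictionary.
    conditions = [c.strip() for c in re.findall(r"\bIF\s+(.+)", source)]
    return paragraphs, conditions

paragraphs, conditions = extract_rules(SAMPLE)
print(paragraphs)
print(conditions)
```

In practice this kind of scan only surfaces candidates; each extracted condition still has to be validated against the program logic and documented by an analyst, as the bullets above describe.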
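The file-reconciliation step above (comparing modernized output against the current files and reporting a match percentage) can be sketched as follows; the claim keys and field values are invented for illustration, and the real comparison was done with SAS Enterprise Guide:

```python
# Illustrative sketch only: compare a legacy record set with a modernized
# one, keyed on a claim identifier, and compute the exact-match rate.
old_file = {
    "C001": ("T2", "2001"),
    "C002": ("T16", "2003"),
    "C003": ("T2", "2005"),
}
new_file = {
    "C001": ("T2", "2001"),
    "C002": ("T16", "2003"),
    "C003": ("T2", "2006"),   # a mismatch to be documented and resolved
}

def match_rate(old: dict, new: dict) -> float:
    """Percentage of legacy records reproduced exactly in the new file."""
    matched = sum(1 for key, rec in old.items() if new.get(key) == rec)
    return 100.0 * matched / len(old)

def mismatched_keys(old: dict, new: dict) -> list:
    """Keys whose records differ; these drive the defect-resolution loop."""
    return sorted(k for k, rec in old.items() if new.get(k) != rec)

print(f"{match_rate(old_file, new_file):.1f}% match")
print(mismatched_keys(old_file, new_file))
```

Iterating on the mismatched keys, fixing the conversion code, and re-running the comparison is what drives the match rate toward the 99.8 percent figure cited above.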
- Developed applications that receive, stage and organize policy events that require underwriting.
- Coded mainframe programs to load policies into the Regional Queue.
- Worked on the front end (VB.NET and C++) to access back-end data.
- Provided a view of the work in queue, which supports the resource-planning process. Established relationships among underwriting reasons, decisions, and follow-up letters.
- Distributed Database processing - each region has its own SQL server and File server. The application is loaded only on each of the 8 regional servers.
- EPSS works with Oasis policies; decisions made in EPSS can update a limited number of tables and fields for the policy in the Oasis database.
- EPSS then schedules Issue to get the updated rows issued.
- EPSS also schedules cancellation through the Cancellation process.
- Policies are selected as meeting the criteria for re-underwriting by a variety of sources. Inputs include Claims Loss files, Claims referrals, PLOG referrals, WIPS by UDS, Renewals, Conversions, Point Calc, and Oasis Service.
- Improved the job run times for the creation of the SDC tables. Worked with Production Support in researching and fixing production problems.
- The application was written in COBOL II using DB2 on an IBM mainframe.
- These systems used COBOL, COBOL II, CICS, and DB2. The applications were tested with SPUFI, QMF, INTERTEST, Endevor, and various mainframe debugging tools. Provided on-call support, training, and technical guidance to subordinates. Worked in a three-tier client/server architecture (desktop, server, and IBM mainframe). Supported regression and load/performance testing in that three-tier environment, working closely with the automated testing group to determine, build, and maintain the supporting environment and processes. Provided strategic planning and direction for the team and coordinated with interfacing application and user groups.