
Senior Application Systems Engineer Resume


Charlotte, North Carolina

SUMMARY:

  • Experienced IT technical and business consultant, recognized for broad-based skills and the ability to wear many hats.
  • Strong detail and analytical skills for solving problems and performing the research needed to determine the root cause of issues.
  • Adapts readily to ever-changing technologies, with a focus on building long-term relationships and driving results to completion.
  • IT Business Analyst/ Data Analytics & Execution Analyst
  • Mainframe Programming & Development (JCL, Modeling, Coding, Troubleshooting, Analyzing, & Fixing Errors)
  • Application Design and Support
  • Software, System, SIT, UAT, Regression, Integration, Functional & Load Performance Testing
  • Data Management (Data Mapping, Processing, Extraction & Data Mining)
  • Release, Change, & Project Management (Project Planning, Tracking, Requirements, Risk, & Execution Management)
  • Configuration Management & Quality Assurance

TECHNICAL SKILLS:

Methodologies:  (SDLC) Systems and Software Development Life Cycle, (DMAIC) Define, Measure, Analyze, Improve, & Control process, (PTLC) Product Test Life Cycle, Waterfall, Agile

Projects: Visa Digital Wallet (V.me by Visa), BankAmeriDeals Gen2, OLB/MB Card Replacement, Scheduled Transfer Risk Phase 3, and eCommunication Phase 2 Confirmation Email

Programming Languages:  COBOL, SQL, JCL

Software Application:  WEAS (Legacy), BORNEO, MDA, VIPAA, PIPAD, (AO) Account Overview, (BOSS) Branch Online Service System - Non Financials, (FST) IMS Fast Authorization, (KTC) Know-The-Customer, (COIN) etc., MS Office Suite (Excel), Visio, WordPerfect, Formula, Omegamon XE, WebMV, MQ Series, Test Director, Director Control Center, Payments Director Image Review, CMVC, Install Shield, HP, Mercury QC 9.0 & Above, Tivoli (TEC) Enterprise Console, VTAM/VSAM, Peregrine, Remedy, SAS, JCL, TSO, CA-1, CA-7, CA-11, OPCA, NDM, QW, SAR, TPX, JES2, SDSF, VPS, ISPF, CPSM, Changeman, Netview, Natural, Java, CPCS, APView, Netman, HP Openview, Endevor, RMDS, RACF, SAP, EA Expert Advisor 5.0, Netview 6000, User Manager, Job Scan, ISTT, Clearquest, TME10, ICCF, TMON, Fileaid, ZMF, Prodex, VPN, Lotus Notes

Hardware & Technologies:  IBM - 3090/3900, Compaq Proliant (Servers), Payments Director Transaction Server, WebSphere Application Server, AIX, Payments Director Export Server, Desktops, VOIP- Conference, Net or Live Meeting, PC, IBM Servers, Tape, HMC Boxes, Remote Consoles

Databases & Technologies:  IMS, DB2, Sybase, MS Access, ASG Mobius Products, Quickbase, Teradata RDBMS, CMDB, ADABAS, IMF, IOF, RMF, QMF, CICS-(Terminal, Application, and File) Owning Regions/ DDR- (Dynamic Routing Regions), Oracle, (EDW)- Enterprise Data Warehouse, ActiveX, MS Transaction Server, MS SQL Server, Environment Controller, Metadata, ODBC

Operating Systems:  MS-DOS, z/OS, MS Windows 3.11, 95, 2000, NT/XP, NT 4.0 Workstation, VTS, Linux, Sun Solaris, Mainframe, SunGard Data System, MVS-ESA/VSE-ESA, OS/390, Websphere 6.1, HP-UX, Novell Netware 5.0, Band 2000, ES9000, Gateway, (EMS) Event Management System, Networker 3270 emulator, QWS3270, Candlenet Portal V180

EXPERIENCE:

Confidential

Senior Application Systems Engineer

  • As an ACT on-board Applications Systems Engineer 5 in Enterprise Risk Management Technology, Access Management Solutions, worked with multiple teams (averaging 10-15 different groups) to help prepare their application data and drive the ACT on-boarding process to a state appropriate for certification and compliance with policy and security. With every release, maintained a successful state for each application moving from test into production.
  • Worked with technical teams from all lines of business within the bank to establish new transmissions of data used in the certification of user access to applications and systems.
  • Excellent written and verbal communication skills - facilitated conference call meetings and project reporting on applications via MS Lync and MS Outlook email
  • Mainframe batch ETL experience - used IPSwitch FTP to move files to the mainframe and check data integrity
  • Mainframe SAS - understood how to locate abends and fix issues
  • Mainframe security (Top Secret) - used TSO
  • Built JCL scripts in ACT to define ACT-to-ART mapping in personal libraries and Test/Production environments, running batch processing on application files to prepare the environment to send files to QA
  • Used RMS (Resource Management System) to research application data - compared file data against the ICD (Internal Control Document) to analyze the quality, accuracy, and linkage of files
  • Performed DAT (Data Acceptance Testing) of application files for Users, Resource, UserAttr, RsrcAttr, and Config
  • Created ART requests for applications based on review of NDM information coming from the ACT Preparatory Questionnaire Surveys Report Manager
  • Created documentation, checklists, and flow charts for the ACT on-board process, Endevor, QA batch for service accounts, NDM service request setup, and more
  • Created database queries in DB2 and MS SQL Server
  • Determined passed or failed transmissions in IBM NDM
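The data acceptance checks described above boil down to verifying that each application file feed carries the required columns with no empty values before it is sent on to QA. A minimal sketch, assuming a pipe-delimited feed and hypothetical field names (the real ACT feed layouts are not shown in this resume):

```python
import csv
import io

# Hypothetical field layout for a "Users" feed; the actual ACT/ICD field
# names are assumptions for illustration only.
REQUIRED_FIELDS = ["user_id", "resource_id", "last_used_date"]

def data_acceptance_check(feed_text):
    """Return (row_number, problem) tuples for a pipe-delimited feed."""
    problems = []
    reader = csv.DictReader(io.StringIO(feed_text), delimiter="|")
    missing = [f for f in REQUIRED_FIELDS if f not in (reader.fieldnames or [])]
    if missing:
        return [(0, f"missing columns: {missing}")]
    for n, row in enumerate(reader, start=1):
        for field in REQUIRED_FIELDS:
            if not row[field].strip():
                problems.append((n, f"empty {field}"))
    return problems

feed = "user_id|resource_id|last_used_date\njdoe|APP01|2016-01-15\n|APP02|2016-02-01\n"
print(data_acceptance_check(feed))   # row 2 is missing its user_id
```

A feed that passes with an empty list would be cleared for transmission; anything else would be sent back to the application team.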

Environment: SnagIT 13, Mainframe (Top Secret), Web Apps, MS Office, IPSwitch, MS SQL Server, ACT (Access Certification Tool)

Confidential, Charlotte, North Carolina

GIS/IAM (Identity Access Management) - IT Security Specialist/BA

  • As a UIOLI project team member, provided support for the IAM4210 and IAM0101-08 Disable Account requirements, under which controls must be in place to disable user accounts after no more than 90 days of inactivity. Implemented controls to monitor and enforce Dormancy Requirement policies for applications on-boarded to the entitlement repository (CSDB).
  • These AITs (applications) used an automated dormancy detection process initially built for applications with internal security stores on-boarded to CSDB (Centralized Security Database) and sending correct LUDs (Last Used or Login Dates) in their daily file feeds. The process was extended to applications using Basic SiteMinder SSO with a Splunk solution for Last Used Dates, with entitlements coming from either an internal security store or CDSN (Consolidated Directory Server Network).
  • Performed impact research and analysis on applications for the UIOLI (Use It or Lose It) project to determine which applications were ready for enablement to the dormancy detection process - used quality, entitlement, AIT status, dormancy, threshold, revocation, de-provisioning, mainframe, ARM, AIT system, ROCK, CSDB, and batch reports, along with the UIOLI setting, then enabled applications for dormancy detection once verified
  • Executed dormancy detection QC (Quality Control) using a generated Data Quality Statistics report to find failures or fallout from applications enabled for dormancy detection that failed due to Last Used or Login Date errors - created a process to distinguish new and/or fixed failures in order to update automated UIOLI settings for reporting
  • Coordinated, captured evidence for, reviewed, and approved STORM (Standardized Technology & Operational Risk Management) artifact PDFs and emails for GIS, covering AITs out of compliance with policy for not providing Last Used or Login Dates and holding a business justification to either Risk Accept or Risk Mitigate the issue - engaged and responded to LOB partners such as SDMs, Risk Execution Partners, and Application Managers (and their managers) to provide feedback on AITs or applications in question or out of compliance
  • Reviewed NEW and Existing ARM (Access Request Management) ticket request for Application Threshold or AIT Exclusion from Dormancy Detection - granted access to requestors having an ARM request ticket to receive an Exception of either Extend (Threshold to 180 or 365 days) or Exclude (from running dormancy detection).
  • Tracked, captured for Audit and maintained the integrity and availability of AIT or application readiness response information stored in an Excel spreadsheet - helped in the automation of the tracker process. Also, all metrics are captured on the IAM Weekly Quality Control Initiative CIO Scorecard.
  • Created UIOLI Analyst and User Stories based-off bug findings and to better develop reports in order to provide adequate information for research and analysis of AITs or Applications
  • Created UIOLI AITs - Enablement for Dormancy Detection Research & Analysis Overview and Process
  • Tested UIOLI analyst and user story reports in DEV, UAT and PROD
  • Trained new and existing teammates throughout the assignment on various tasks and tools
  • Maintained all SharePoint team site pages - held weekly meetings to address updates, additions, and deletions
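The dormancy rule at the heart of this role (disable after more than 90 days of inactivity, with approved Extend exceptions to 180/365 days and Exclude exceptions) can be sketched in a few lines. This is an illustrative model only; the account IDs, field names, and per-AIT threshold mechanism are assumptions, not the bank's actual implementation:

```python
from datetime import date, timedelta

DEFAULT_THRESHOLD = 90  # days of inactivity before an account is disabled

def dormancy_check(accounts, today, thresholds=None, excluded=frozenset()):
    """Return IDs of accounts to disable, given {id: last_used_date}."""
    thresholds = thresholds or {}
    to_disable = []
    for acct_id, last_used in accounts.items():
        if acct_id in excluded:
            continue  # AIT excluded from dormancy detection by exception
        limit = thresholds.get(acct_id, DEFAULT_THRESHOLD)  # Extend exception
        if (today - last_used) > timedelta(days=limit):
            to_disable.append(acct_id)
    return sorted(to_disable)

accounts = {"u1": date(2016, 1, 1), "u2": date(2016, 5, 1), "u3": date(2016, 1, 1)}
print(dormancy_check(accounts, today=date(2016, 6, 1),
                     thresholds={"u3": 180}))   # only u1 is past its threshold
```

An account with a 180- or 365-day threshold stays active longer; an excluded AIT never appears in the disable list, matching the ARM exception flow described above.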

Environment: Agile, SiteMinder SSO, Splunk, CDSN (Consolidated Directory Server Network), Data Mining and Analytics, V-Lookup, Pivot Tables, SharePoint, WebEx, Visio, MS Access, STORM (Standardized Technology & Operational Risk Management), ROCK (Risk & Operational Controls Knowledgeable), Mainframe, CSDB (Central Security Database), AIT System, CCM (Control Center Monitoring), ARM (Access Request Management), Communicator, MS Office (CSV or Excel), and Outlook

Confidential, Charlotte, North Carolina

Teradata BA

  • (DS) DataStage to (AI) Ab Initio Conversion Project 2014 - Application: Deposits - TDA, DDA, and RPM
  • Developed a downstream tracking sheet to track external downstream dependencies, both current and future, so that any changes or differences introduced while converting from DataStage to Ab Initio would be found, by having downstream teams run their dependent jobs, before the SIT tables were released for testing
  • Coordinated downstream testing and signoffs - sent notifications to downstream of impacts and when to test their jobs – collected approvals, signoffs and tracked any issues or comments
  • Reworked the EIW Deposits Forecast Funding by Project spreadsheet workbook to resolve many discrepancies from the previous creator caused by formula and calculation issues
  • Ran queries against databases to determine data field columns in tables and views - used EIW Teradata to run SQL; roles included EIWP, EIWU, EIWS, and EIWD (mainly used P and S)
  • Created SLAs and OLAs based on FSDs - used the Agreements Monitoring tool to store these agreements and working agreements, tracking reviews, renewals, and approvals
  • Used Documentum (DMS) Document Management Services to upload new, check out old, or update SLAs and OLAs by versioning, checking in, and promoting, then copied the URL into the Agreements Monitoring tool
  • Used Heatmap to determine users of tables or views based on the database; Tableau to determine users and their roles – the process was to identify the user, find out what the table is being used for and if needed or not, in order to establish decommissioning
  • Analyzed old and new code from the developers to determine how to map
  • Created EMR Source to Target Mapping from SHUB to (LR) load ready, from LR to (STG) stage, from STG to Base (01) tables, then views from 01 to (E0, V0 and V0 to V1, V2) - based off of the FSD and developer code document
  • Data analysis of source to target data model and requirements; determine transformation rules from field element within the code
  • Set up meetings to walk through source-to-target mappings with the developer and team
  • Set up meetings to walk through SLAs with customers
  • Met project deliverables and tasks in a timely manner
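The SHUB -> LR -> STG -> Base mapping work above amounts to a table of source fields, target fields, and transformation rules taken from the FSD, applied hop by hop. A minimal sketch of one hop; the field names and rules here are illustrative assumptions, not the actual EMR mapping:

```python
# Each entry: (source_field, target_field, transformation rule from the FSD).
MAPPING = [
    ("ACCT_NBR", "account_number", str.strip),
    ("BAL_AMT",  "balance_amount", lambda v: round(float(v), 2)),
    ("OPEN_DT",  "open_date",      lambda v: v.replace("/", "-")),
]

def apply_mapping(source_row):
    """Produce a target row from a source row using the mapping table."""
    return {tgt: fn(source_row[src]) for src, tgt, fn in MAPPING}

src = {"ACCT_NBR": " 0012345 ", "BAL_AMT": "100.5", "OPEN_DT": "2014/03/01"}
print(apply_mapping(src))
# {'account_number': '0012345', 'balance_amount': 100.5, 'open_date': '2014-03-01'}
```

Keeping the mapping as data rather than code is what makes the walk-through meetings practical: the spreadsheet the team reviews and the transform the developer codes are the same table.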

Environment: Sharedrive, MS Access, Data Analytics, Teradata SQL Assistant, Tableau & Heatmap for View and Tables, Agreements Monitoring, Documentum, Lync, MS Office, Outlook

Confidential, Charlotte, North Carolina

IT Business Analyst/Project Coordinator

  • Business and Gap analysis/ Process monitoring and improvement/ Multi-Client Coordination
  • CRK Sunset Legacy application effort for 2013 Sunset program - Analyzed applications to identify key components, interfaces, dependencies; Documented current state of an application; Emailed application owners and tracked their responses; Updated Sunset program 4 blocker with status of sprint goals weekly
  • Gathered summary requirements using task IDs in order to review and update story and task status in (RTC) Rational Team Concert; closed items and/or moved them to the product backlog
  • Strategic and operational planning and execution of the (CSU) Calculation Support Unit - Documentation: reviewed/updated gap analysis documents
  • Created technical documents and documented processes; researched data; created charts and graphs based on stats; built and adjusted metrics based on current and future state
  • Maintained reports/spreadsheets such as the weekly status update (distributed), top 10 issues, and master workaround; updated the high-touch client deck; created the "Weekly Status Update" template to provide communication and visibility for both clients; production support - performed file validation in STAC for several workaround issues
  • Automated daily and weekly Service-Now reports; Created charts and graphs based on stats; Built and adjusted metrics based on current and future state
  • Set up, opened, and facilitated meetings; distributed minutes, agendas, notes, and follow-ups; tracked tasks and action items; attended Scrum of Scrums to give status of application tasks; facilitated WBS (Wireless Business Solutions) estimation and task-tracking updates
  • Tracked and updated status of incidents, defects, workarounds and problems

Environment: MS Word, Excel, PowerPoint, Snag It, SharePoint, Share M drive, WebEx, MS Office Communicator, HP ALM Explorer, Notepad++, RTC (Rational Team Concert), Data Analytics, CSU (Calculation Support Unit), MVAS (Mobile Value Added Services)

Confidential, Charlotte, North Carolina

IT Business Analyst/ Execution Data Analyst

  • Participate in online and mobile banking projects to validate data capture (business event) records both in test and production.
  • Extract and analyze data from existing data stores and perform ad-hoc queries to validate data (business event and transactional data) using Teradata or web extraction tools.
  • Track defects and re-validate after fixes made.
  • Collaborate across e-commerce business and technical units on cross-functional teams
  • Ability to write SQL queries against relational databases (such as Teradata) and to use ETL tools such as My CTO and Environment Control
  • Strong analytical, detail, quantitative, problem solving, and conceptual data skills
  • Adjust quickly to changing project needs and requirements
  • Work with business event, technology, testing channel, project management and data warehouse teams during the testing cycle.
  • Communicate effectively for requirements gathering of each project
  • Verify business event records meet specifications
  • Detect, Log and track defects found in data using Mercury Quality Center Control tool
  • Attend defect meetings. Work with developer on necessary changes, notify test teams when to re-execute scripts, generate records, and re-verify records.
  • Keep the team apprised of the status of testing and defects.
  • Verify transactional data sourcing records, etc.
  • Work with the 'W' developer to determine the source of defects and necessary changes.
  • Library maintenance and updates: Add business event and associated details into collection tables and verify records are loaded into W tables.
  • SAS connects to the Teradata DBMS using Teradata ODBC (Open Database Connectivity)
  • Create reporting requirements for our self-service reports for all new business events added and update changes for existing business events.
  • Documentation: Develop and maintain technical documentation
  • Research requests from business partners and project teams
  • Write standards-compliant SQL to verify data in SIT/UAT and production (read, write, debug)
  • Ability to design, build, test, and execute test plans through deployment (for library updates and testing projects)
  • Ability to look at test data and find processing errors
  • Data Modeling skills
  • Experience working with eCommerce/eChannels Internet Marketing projects
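The validation work above (writing standards-compliant SQL to verify business-event records meet spec, then logging failures as defects) follows one recurring pattern: query for records that violate the specification and treat any hits as defects. A hedged sketch using sqlite3 in place of Teradata, with an assumed `events` table and columns:

```python
import sqlite3

# In-memory stand-in for the warehouse; table layout is an assumption
# modeled on the business-event validation described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id TEXT, channel TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("E1", "online", 25.00),
    ("E2", "mobile", None),      # defect: record is missing its amount
    ("E3", "online", 10.50),
])

# Spec: every business event must carry an amount. Violations become
# defect entries in the tracking tool.
defects = conn.execute(
    "SELECT event_id FROM events WHERE amount IS NULL"
).fetchall()
print(defects)   # [('E2',)]
```

After a fix is deployed, re-running the same query against the regenerated records is the re-validation step; an empty result set closes the defect.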

Environment: Data Warehouse The ‘W’, Teradata - SQL Client Assistant 3.11, MS Word, Excel, Environment Controller, SQL , ODIN, Snag It 11, Discovery/SharePoint, Share drive, WebEx, NetMeeting, Office Communicator, HP Quality Center 9.2, Metadata Management tool, Business Events Data Management tool, Data Analytics, DBMS, SAS, ODBC

Confidential, Charlotte, North Carolina

IT Business Analyst/ Data Analyst

  • Participated in the full project lifecycle from conception through post-deployment for OLB and MB
  • Experienced in working on complex, cross functional projects and the creation of detailed requirement documents
  • Experienced consolidating research and data into coherent and persuasive presentation materials to present findings and recommendations to senior management for review and approval
  • Managed 3 to 4 large and small projects simultaneously
  • Managed daily activities, working sessions, issue resolution, and communication across project and support teams in order to deliver projects and enhancements on time, as requested by the customer
  • Reviewed HLD, LLD, and BRD documents to determine if there were any Assumptions and/or Functional Requirements
  • Reviewed reporting analytic requirement impacts
  • Signed off on SIT, UAT and PROD Test validation
  • Monthly project reporting - created monthly PowerPoint project presentation slides for each deployment or implementation
  • Weekly project reporting - reported weekly project status, such as testing (and in which cycle), closed, specification submitted and waiting to test, and SIT/UAT/PROD signoff
  • Logged and tracked defects in CIT, SIT, UAT and PROD from backend systems
  • Submitted deliverables for the testing phase such as, the test plan, schedule, procedures and results, incidents/defect report, deferred defects and resolution plan, testing approval signoff and project deployment deck for eComm IMRA release review meeting
  • Quantitative data analysis - gathered and compared numerical variables
  • Qualitative data analysis - pilot testing, such as BankAmeriDeals Gen2, implementing several enhancements based on pilot feedback
  • Analytical and problem-solving skills; adjusted quickly to changing project needs and requirements
  • Created test plan, test scripts, test validation sheet and SQL table queries
  • Ability to write basic SQL data queries, execute and troubleshoot against relational databases
  • Gathered requirements and workflow analysis
  • Understand automated testing, requirements analysis and process model tools
  • Served as business liaison between business areas, senior leadership and IT development and support teams
  • Designed Business Event specification sheet for developers to code and project team to review
  • Self-directed and results driven with demonstrated ability to multi-task, prioritize and execute on executive strategies
  • Maintained project schedules and tracking life cycle phases; provide timely project status by reporting back to stakeholders and project team
  • Developed the functional specifications and review use cases and other documents with IT and the business to ensure all requirements are adequately reflected
  • Translated high-level business requirements into detailed functional specifications
  • Collaborated with business process owners to identify reporting needs and to improve product, service and business processes
  • Connected SAS to the Teradata DBMS using Teradata ODBC (Open Database Connectivity)
  • Analyzed and evaluate data gathered from multiple sources and distinguish user requests from underlying true needs
  • Collaborated across eCommerce business and technical units on cross-functional teams for full life cycle
  • Designed online banking and mobile banking data capture (business event) records to capture customer activity
  • Designed and extracted data capture and report requirements from AGILE and SIX SIGMA project documentation
  • Validated data capture (business event) records both in development, test and production
  • Extracted and analyzed data from existing data stores and perform ad-hoc queries to validate data (business event and transactional data) using Teradata SQL or web extraction tools
  • Demonstrated to project business partners how to use self-service reports to pull production business event data

Environment: Data Warehouse The ‘W’, Teradata - SQL Client Assistant 3.11, MS Word, Excel, Environment Controller, SQL , ODIN, Snag It 11, Discovery/SharePoint, Share drive, WebEx, NetMeeting, Office Communicator, HP Quality Center 9.2, Metadata Management tool, Data Analytics, Business Events Data Management tool, Maximo, DBMS, SAS, ODBC

Confidential, Charlotte, North Carolina

Business Systems Data Analyst

  • Queried portfolio reports using Planview, then modified and re-created those reports for Key Infrastructure, Release, and Wave projects using Excel with VLOOKUP as a source of reference
  • Mapped, filtered, manipulated, researched, extracted, populated, analyzed and linked data spreadsheets
  • Created scorecard and edited Pivot Tables with percentages, calculations, and formulas to track, update, and report progress
  • Identified and extracted data within the reports that needed to be changed
  • Used SharePoint for updating, uploading data and files, tracking changes, documentation, retrieving project package data and monitoring project packages
  • Gathered reports for all projects that were red and yellow and separately identified those with FSDs (Functional Specification Documents) and reasons for outstanding delays, such as pending CR (Change Request) and BRDs (Business Requirement Documents)
  • Analyzed and reached out to project managers on Go-to-Green dates and data discrepancies on project packages
  • Participated in dress rehearsals and conversions to support vendor mainframe applications by monitoring, tracking, and logging the start and completion of tasks
  • Created documentation and timeline for data reporting process
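The VLOOKUP-and-pivot reporting above has a simple shape: look up each project's attribute in a reference sheet, then pivot counts by that attribute. A plain-Python sketch; the project names, categories, and statuses are made up for illustration:

```python
from collections import defaultdict

# Reference sheet (the VLOOKUP source) and the raw report rows.
reference = {"P1": "Release", "P2": "Wave", "P3": "Release"}
report = [("P1", "Red"), ("P2", "Green"), ("P3", "Red")]

# Build a pivot table: category -> status -> count.
pivot = defaultdict(lambda: defaultdict(int))
for project, status in report:
    category = reference.get(project, "Unknown")   # the "lookup" step
    pivot[category][status] += 1                   # the "pivot" cell

print({k: dict(v) for k, v in pivot.items()})
# {'Release': {'Red': 2}, 'Wave': {'Green': 1}}
```

In the scorecard this becomes the red/yellow rollup per category, with percentages and formulas layered on top in Excel.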

Environment: Data Management, Web, SQL, MS Access, Excel (V-lookup, Pivot Tables, Data Mapping, Data Mining), SharePoint, Planview

Confidential, Charlotte, North Carolina

Production Change/ Release Manager

  • Project managed the implementation of 13 applications for CIS initiatives and production validation support
  • Planned and defined requirements, designed I-plans, timelines, and PowerPoint project presentations
  • Captured information on changes, monitored the test change status to completion
  • Engaged all lines of business for cross-functional teams through emails
  • Planned and facilitated CIS daily checkpoint call meetings
  • Captured and tracked defects, tested progress, tracked and logged testing metrics (provided status reports)
  • Analyzed implementation plans to determine who may be impacted and need to be engaged in the implementation reviews
  • Coordinated, monitored and executed multiple application Weekend and Independent releases
  • Gathered installation specifics for each CIS change to create implementation walk-through
  • Advised application managers and teams on change tickets being extended to avoid missing their implementation window
  • Escalated and troubleshot CIS issues with business partners
  • Collaborated with user experience team to track, monitor and manage issues and action items
  • Control room operations management activities
  • Used SharePoint to manage permissions to libraries and items, update/upload data and files, track and monitor I-plans, test phases and stored documentation
  • Used PowerPoint to build release timeline decks for CIS changes going into production, using Excel (Shapes and Graphics)
  • Used MS Project to track project events and PowerPoint to create flowchart timeline diagrams
  • Identified defects, created SQL queries and reports using Mercury Quality Center (Test Director) for Test Lab & Defects
  • Identified tickets using Peregrine Service Center- understanding of Change Request and Change Advisory Board (CAB) Process
  • Developed & communicated testing requirements with (TCoE) Test Center of Excellence and others
  • Other duties and responsibilities of the position included attending the SIT Model Defect meetings to represent CIS and address defect statuses
  • Attended CAB meetings to capture changes that pertained to CIS
  • Attended weekly integrated release management meetings to plan and communicate readiness activities for CIS teams, and ensure compliance with Event Management
  • Facilitated CIS integrated release management meetings to present PowerPoint presentations on CIS changes

Environment: Mercury Quality Center 9.0 & above (Test Director), Quickbase, SQL, MS Office (Excel, Project & PowerPoint), Clarity, Timeshare, ClearQuest, Peregrine, TSO, Project Management, Change Management and Coordination, Implementation Release Coordination efforts

Confidential, Charlotte, North Carolina

Sr. QA Test/Support Environment Administrator

  • Defined requirements, designed JCL, modeled/coded PROC, PARM and PROGRAM, tested scripts and jobs
  • Performed quality assurance testing of job schedules
  • Developed the stage environment for JCL, PROCs, PARMs, and PROGs to test conditions and expected results based on the application requirements
  • Set up schedules for TEST environments (Load Performance, System Integration (SIT), Regression, and User Acceptance (UAT) testing) in Scheduler and CA-7
  • Arranged and led meetings with application managers and (TDL/TDM) on specific projects
  • Designed, coded, tested and implemented small enhancements to existing code
  • Provided subject matter expertise and project management on numerous projects
  • Collaborated with user experience team to track, monitor and manage issues and action items for projects
  • Interviewed key stakeholders and reviewed current state business and systems documentation
  • Captured specifications for record balancing processes, including dates, record counts, and financial amounts
  • Verified proper execution of all system components
  • Knowledge of environment components relevant to end- to-end testing
  • Reviewed and executed (HLD) high level designs, detailed business requirements and business process flows
  • Project managed the implementation of software applications for initiatives and testing
  • Managed Mobius Reports for the entire Charlotte team along with NY and Denver
  • Developed documentation for new technology applications
  • Documented and tracked errors throughout the system life cycle
  • Siebel and Omni script testing
  • Batch execution; troubleshot and fixed jobs in error
  • Researched and fixed Legacy/OmniPlus and OmniTrade application problems
  • OmniPlus and OmniTrade batch processing and testing

Environment: Regression, ADABAS, Cobol, JCL, CICS, IMS, NDM, JES2, SDSF, TSO, ISPF, CA-7, DB2, Netman, PDSMAN, SAS, CA-Scheduler, Natural, Service Center, ASG Mobius Product Suite, (Storage Management- TAPE and DASD), Endeavor, SQL, Oracle, Siebel, Scripts, Legacy, Agile/Waterfall, OMNI-Plus, VSAM, JAVA Mainframe, z/OS, OS/390

Confidential, Charlotte, North Carolina

QA Software Test Specialist / Mainframe & Server

  • Defined requirements; designed screens, databases, objects, and test plans; coded UI, logic, reports, and configuration modifications; tested scripts, defects, features, and documentation; then implemented the documentation and installation of the new fix-level code
  • Test phases included planning, preparation, execution, report defect processes, and case scenarios
  • Installed and Configured servers to run check processing software for Payments Organization products
  • Windows and Linux - setup and connectivity
  • Installed and Configured software
  • Created Test Scripts and Test Cases
  • Tested Defects and Created Test Plans for Features
  • Verified user exit modifications, execution of matching logic, configuration modifications and error message processing
  • Created SQL queries and reports using Mercury Test Director /QC and CMVC for files
  • Modified Readme.Files and Change History
  • Uninstalled the old code (base and fix), then installed the base level code first, followed by the new fix level
  • Installed and verified installation of fix level code to confirm it could be performed over the base level code and over a previously installed fix level code; checked application logs and restarted servers
  • Developed, tested, supported, documented, and resolved typical system related issues
  • Verified New components displayed on the install shield screen
  • Verified New components and samples were included in the proper folders
  • Verified Shippable updates
  • Created CDs from the burner PC and printed labels for the CDs
  • Deployed and Updated (.Ear) file using Websphere 6.1
  • Knowledge of multiple payment processing applications, such as Check21 and CPCS (check processing applications)
  • Excellent problem diagnosis and resolution skills

Environment: Agile, Mercury Test Director and Quality Center 9.0 & Above, (CMVC) Configuration Management Version Control, Director Control Center, Payments Director Image Review, CMVC, Install Shield, Patching, CPCS, CPSM, Payments Director Transaction Server, Payments Director Export Server, DB2, MS Transaction Server, SQL, SAS, Web Applications, Unix, MS DOS, MS SQL Server, Websphere MQ, Oracle, SQL, Linux- Sun Solaris, Windows, Mainframe, Gateway, (EMS) Event Management System, Gateway, IBM Websphere Application Server, IBM Rational Application Developer, z/OS, OS390

Confidential, Charlotte, North Carolina

Applications Developer Support

  • Defined requirements, designed JCL, modeled/coded PROC, PARM and PROGRAM, tested jobs and implemented documentation
  • Migrated data for Fleet conversion, worked on (CRM) for KTC and BOSS applications support
  • Developed environment JCL, PROCs, PARMs, and PROGs to test conditions and expected results based on the application requirements
  • Designed, coded, tested and implemented small enhancements to existing code
  • Built schedule flows, modeled, coded and tested applications
  • Developed applications (job name creation) and created flow charts to build and execute design
  • JCL creation, JCL manipulation and regression tested
  • Created SQL queries and reports using ISTT ticketing tool for user request
  • Installed and Implemented from Stress, Tec Test and SIT to Production
  • Facilitated weekly meetings, reported on project status and coordinated with clients
  • Involved in the (DMAIC) process for the Six Sigma roll-out of the ISTT ticketing system
  • Worked assigned tickets, projects, and provided remote/onsite support for test and production on-call
  • Accepted, tracked, and coordinated the resolution of user-initiated application support requests
  • Resolved application support issues by providing a technical solution, or assigning responsibility to the appropriate person or team member
  • Promoted code, staged, and deployed changes within test and production environments using Changeman
  • Monitored, fixed and reported on application abends
  • Worked with Confidential's CORE model applications (some of the 'W' Teradata concerning KTC)

Environment: Cobol, (BOSS) Branch Online Service System – Non Financials, (FST) IMS Fast Authorization, (KTC) Know-The-Customer, (COIN), MS Office Suite (Excel), CICS, VTAM/VSAM, SQL, SAS, Oracle, Peregrine, JCL, TSO, CA-1, CA-7, CA-11, OPCA, NDM, QW, SAR, TPX, JES2, SDSF, ISPF, Changeman, Netview, Netman, RACF, Teradata, Job Scan, Prodex, ISTT, Clearquest, Agile/Waterfall, VPN, TMON, Fileaid, Networker 3270 emulator, QWS3270, DB2, CICS-(Terminal, Application, and File) Owning Regions/ DDR- (Dynamic Routing Regions), Mainframe, z/OS, OS/390

Confidential, Richmond, VA

Sr. Mid/Main (Systems) Operations Analyst

  • Senior Level Analyst, provided technical support to lines of business and system staff
  • Performed (IPLs) initial program loads to implement upgrades to the systems (manually and automated)
  • Checked application logs, restarted server services
  • Developed, tested, supported, documented, and resolved typical system related issues
  • Was responsible for the management of multiple sysplexed logical partitions (LPARs) that supported system and production application resources
  • Created SQL queries and reports using Service Center (Peregrine) and Tivoli (TEC)...etc.
  • Ability to quickly recover LPARs
  • Communicated both verbally and written to management, business partners and technical support
  • Facilitated Bridge line calls and wrote documentation for IPLs
  • Routed over to CICS and IMS database regions to check connections to terminals, files and financial transactions

Environment: Data Center, Informix, Formula, Omegamon XE, WebMV, Netmaster, MQ Series, Tivoli (TEC) Enterprise Console, Cobol, Storage Management (Tape Devices and DASD), VTAM/VSAM, IPL, Oracle, (Service Center) Peregrine, Web Applications, Remedy, JCL, TSO, CA-1, CA-7, CA-11, NDM, SQL, SAS, CICS, TPX, JES2, SDSF, VPS, ISPF, SNA Manager, Netview, APView, HP Openview, EA Expert Advisor 5.0, Netview 6000, User Manager, TME10, ICCF, TMON, IMS, DB2, MS Access, IMF, IOF, RMF, QMF etc.
