
Senior Analyst & Assistant Team Lead Resume


Madison, WI

SUMMARY:

  • 16+ years of IT experience in design, development and maintenance, conversion, production support, user acceptance testing, change migration support and DB2 DBA work. Domain experience spans Student Loan Processing, Health Care, Mutual Funds, Retail, Commercial & Investment Banking, Cards, Insurance, in-house product development, state (DHS & DWD) and federal (GL) projects.
  • Expertise in mainframe technologies: COBOL, DB2, JCL, VSAM, CICS, IMS.
  • Worked predominantly on IBM mainframe technologies while also supporting distributed systems.
  • Worked with both Waterfall and Agile methodologies (Scrum and Kanban).

As a Senior Analyst & Assistant Team Lead

  • Assisted the project lead in a feasibility study of two approaches for converting from one scanning software to another: an alternate DB approach and a reengineering option using an MQ service. Estimated both approaches, created a proposal document for higher management, and got the alternate DB approach approved.
  • Analysed, coordinated, designed, assigned, developed, tested, reviewed and implemented the conversion.
  • Created business-level, process and test plan documentation and obtained client approvals.
  • Generated reports using complex SQLs and Excel macros, generated extracts for the data warehouse team, provided nightly on-call support and assisted with Changeman version upgrades.
  • Worked on extensive COBOL-DB2 batch application design, development and enhancements, involving table design, table creation requests using normalization techniques, SPUFI executions, SQL tuning using Visual Explain and mainframe EXPLAIN (see the EXPLAIN sketch at the end of this list), and support for the DB2 version upgrade from 9.1 to 10.
  • Interacted with the mid-tier (TIBCO), front end (.NET and Java) and external agencies through stored procedures (COBOL and native SQL), MQ services and SWIFT messaging.
  • Applied Lean Six Sigma quality principles to the DB2 refresh process (unload and load of tables) and earned certification.
  • Individually worked end to end, from business requirement gathering through post-implementation, on various individual requests; performed large retrofitting efforts for the entire project and supported periodic implementations.
  • Researched and enhanced CICS programs and analysed IMS DB/DC modules. Knowledgeable in IMS data structures and embedded IMS programming, both batch and online.
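
A minimal sketch of the kind of EXPLAIN run used for that SQL tuning, assuming a DB2P subsystem, a DSNTEP12 plan and an illustrative LOAN_MASTER table; every name, library and the query itself are assumptions rather than actual project objects.

//SQLTUNE  JOB (ACCT),'SQL TUNING',CLASS=A,MSGCLASS=X
//* Run EXPLAIN through DSNTEP2, then read PLAN_TABLE to check the
//* access path (index usage, join method) before and after tuning.
//EXPLAIN  EXEC PGM=IKJEFT01
//STEPLIB  DD DISP=SHR,DSN=DSNP.SDSNLOAD
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DB2P)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP12)
  END
//SYSIN    DD *
  EXPLAIN PLAN SET QUERYNO = 100 FOR
    SELECT BORROWER_ID, LOAN_ID
      FROM LOAN_MASTER
     WHERE LOAN_STATUS = 'SKIP';
  SELECT QUERYNO, QBLOCKNO, PLANNO, METHOD, ACCESSTYPE,
         MATCHCOLS, INDEXONLY
    FROM PLAN_TABLE
   WHERE QUERYNO = 100
   ORDER BY QBLOCKNO, PLANNO;
/*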

As a test region DBA

  • Created, modified and synchronized table structures and performed load/unload and data refreshes from production to QA regions.
  • Performed quarterly data refreshes of 1,400 tables and 4,400 files across all QA applications, ensuring data integrity.
  • Performed data object authorizations, SPUFI executions, image copies, and database maintenance activities such as REORG, RUNSTATS, REBIND, CHECK DATA and backups (a maintenance-job sketch follows this list).
  • Monitored and fixed QA DB2 job abends, assisted with administrative aspects of non-DB2 issues (dump and restore of files, migration of change packages to QA, batch allocate/deallocate of files in the CICS region) and ensured job completion within timelines.
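
A minimal sketch of one such maintenance job, assuming a DB2T QA subsystem, a QAMAINT utility ID and an illustrative QADB01.TSLOAN tablespace; REORG, REBIND and CHECK DATA steps would be set up along the same lines.

//QADBMNT  JOB (ACCT),'QA DB MAINT',CLASS=A,MSGCLASS=X
//* Gather fresh statistics and take a full image copy of a QA tablespace.
//* Subsystem, utility ID, tablespace and dataset names are illustrative.
//MAINT    EXEC PGM=DSNUTILB,PARM='DB2T,QAMAINT'
//STEPLIB  DD DISP=SHR,DSN=DSNT.SDSNLOAD
//SYSPRINT DD SYSOUT=*
//SYSCOPY  DD DSN=QA.IMAGCOPY.TSLOAN,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(50,10),RLSE)
//SYSIN    DD *
  RUNSTATS TABLESPACE QADB01.TSLOAN TABLE(ALL) INDEX(ALL)
  COPY TABLESPACE QADB01.TSLOAN COPYDDN(SYSCOPY)
       FULL YES SHRLEVEL REFERENCE
/*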

As a Project Lead

  • Acquired tasks from onsite and assigned them to the offshore team.
  • Analysed allocated tasks and created estimates and technical specifications.
  • Monitored, assisted and led smooth execution of enhancements, meeting deadlines.
  • Performed reviews of code, unit test plans and results.
  • Reviewed packaging for certification, created certification test plans and implementation plans for releases, and provided implementation support.
  • Maintained timesheets and performed quality mandates: IQA, EQA, final inspection, metrics, audit and PMR.
  • Managed and maintained project details on the server and prepared SOWs for agreement extensions.
  • Wrote unsolicited proposals and shared them with higher management.

As a Developer, UAT Tester & Migration Specialist

  • Extensively developed ad hoc programs in COBOL and VSAM, using file, table, character-handling and report-writing techniques. Handled FB and VB record formats on both DASD and tape.
  • Extensively used File-AID and IDCAMS for VSAM file creation and deletion: base clusters, AIX and path builds, and GDG manipulation (see the IDCAMS sketch at the end of this list).
  • Extensively debugged CICS and batch programs with Xpeditor for production severity-3 tickets.
  • Extensively worked on background transaction processing in CICS and provided online on-call support.
  • Provided batch on-call support for production, QA and model office jobs.
  • Worked on retiring legacy applications.
  • As a UAT tester, created test plans and test scripts and executed manual testing.
  • As a change migration specialist, performed Endevor migration activities by reviewing and approving compliant client packages and performing quality assurance review of JCL/procs.
  • As an in-house tool developer, used REXX and ISPF utilities (panels, skeletons).
  • Individually strategized and delivered $0.6M in cost savings for ING through various mainframe cost-saving techniques.
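
A minimal IDCAMS sketch of the VSAM setup work described above; the cluster names, key positions and GDG limit are illustrative assumptions, not actual production objects.

//DEFVSAM  JOB (ACCT),'IDCAMS SETUP',CLASS=A,MSGCLASS=X
//* Define a KSDS base cluster, an alternate index with its path, and a
//* GDG base. BLDINDEX assumes the base cluster has already been loaded.
//STEP01   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE CLUSTER (NAME(PROD.LOAN.MASTER) -
         INDEXED KEYS(10 0) RECORDSIZE(200 400) -
         CYLINDERS(50 10) SHAREOPTIONS(2 3))
  DEFINE AIX (NAME(PROD.LOAN.MASTER.AIX) -
         RELATE(PROD.LOAN.MASTER) KEYS(9 10) -
         NONUNIQUEKEY UPGRADE CYLINDERS(10 5))
  DEFINE PATH (NAME(PROD.LOAN.MASTER.PATH) -
         PATHENTRY(PROD.LOAN.MASTER.AIX))
  BLDINDEX INDATASET(PROD.LOAN.MASTER) -
         OUTDATASET(PROD.LOAN.MASTER.AIX)
  DEFINE GDG (NAME(PROD.LOAN.EXTRACT) LIMIT(7) NOEMPTY SCRATCH)
/*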

TECHNICAL SKILLS:

Operating Systems: MVS/OS-390, z/OS, Windows 95/98/2000/NT/XP/Vista/7, UNIX

Languages: COBOL, SQL, JCL, CICS, REXX

Database/Database Tools: DB2 (Versions 6 through 11), IMS, VSAM, SPUFI, InSync, IBM Data Studio, IBM Command Editor, QMF, BMC, IBM DB2 Admin, Platinum, SQL Explorer, STROBE

Special Software: Version Manager, Changeman, Endevor, SCLM, Librarian

Tools: MQ messaging, XML, SORT, DTCN, ISPF Services, EditPlus

File Editors: File-AID, StarTool, InSync

Program Analyzers: Mainframe Express, REVOLVE, CASEPAC, Projcl/Infox

Debugging Tools: Xpeditor (batch and online), DTCN, InterTest, Abend-AID, Fault Analyzer

Job Output Managers: EOS, $AVRS, TOM, SAR

Report Managers: EOS, Infodevl View Direct, Control-D

Job Schedulers: Control-M, TWS, CA-7, Control-M Enterprise Manager

Tape Managers: RMM, TLMS

CICS/Batch Monitors: TMON for CICS, OMEGAMON

ISPF Services: Panels, Skeletons, Message Libraries

Reporting Tools: TRMS, DYL-280, Easytrieve

IBM Utilities: IEBGENER, IEFBR14, IKJEFT01, IDCAMS, SYNCSORT, IGYWCL, CSQUTIL, XMITIP

Testing Tools: HP ALM Explorer (QC), HP Service Center

Business Tools: LIDP, InvestOne, CRD, Anchor, Vista, TOM (Trade Order Management), CARES

Others: PREPALL, JOBSCAN, PROJCL, JEM, XML, Rational Asset Analyzer, Embarcadero, Confluence, JIRA, Link, SmartTS, Serena Business Manager (production change ticket process), Citrix Client for remote connection, Host on Demand 11

PROFESSIONAL EXPERIENCE:

Senior Analyst & Assistant Team Lead

Confidential, Madison, WI

Responsibilities:

  • Assisted the lead in a feasibility study of 150+ components (COBOL, JCL/control libs, SAS). Analyzed the modules for an abstraction-layer approach versus reengineering using an MQ service, estimated both approaches and created a proposal document for higher management; the alternate DB (abstraction layer) approach was approved.
  • Analysed the imaging impact on operational tables and finalised 50+ impacted components.
  • Coordinated with SMEs and technical leads to finalize impacted components.
  • Researched the image print process, both batch and online, to replace it with a Java process; designed new DB2 structures for the print-process replacement and worked with DBAs to create them.
  • Created technical specification documents along with the team lead and assigned them to self and team.
  • Made SQL and job updates to point to the new imaging abstraction-layer tables when identifying borrowers in skip status with a bad address or phone number; the updates were applied to skip-trace reporting and distributed to clients.
  • Updated the SQL in the job that identifies military reject letters that were not processed and sent to borrowers, searching for the military-reject text in the new document activity table instead of the imaging event table.
  • Modified the SQL that creates the EA80 Note file (borrowers, loans and imaged Note forms associated with the loans) to read the new tables, and created an audit/error report detailing what was processed and any errors encountered.
  • Modified 4 SQL unloads: Total & Permanent Disability form *2833* imaging metadata, servicing history documents, the Borrower 1098 History Report and consolidation documents.
  • Modified the sort OUTREC and merge cards that sort the unloads into an index file and an imaging file, to accommodate the new increased-length document key (a sort-card sketch follows this list).
  • Pushed the index file containing the imaging pointers to be printed to MQ for the print process.
  • Modified 1 existing COBOL module that consumes the pre-print process output file of borrower documents, validates the form number for each document, and loads each document to the print image table.
  • Coded a new job and unload card that unloads records matching the operational table to the cross-reference table.
  • Coded a new COBOL program that reads the unloaded file and updates the related rows with the new document key field.
  • Performed the activities below for all of the processes mentioned above:
  • Production-parallel tested the above changes and documented the test results.
  • Created process documents and test plan documents.
  • Executed Visual Explain for the SQL changes and fine-tuned SQLs for better service units.
  • Obtained client approvals for the process and test plan documents and implemented the changes.
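
A minimal sketch of the kind of sort-card change involved in widening the document key; record positions, lengths and dataset names are illustrative assumptions rather than the actual file layouts.

//SRTINDEX JOB (ACCT),'DOC KEY SORT',CLASS=A,MSGCLASS=X
//* Sort the unload into the index file, padding the old 20-byte document
//* key out to an assumed new 32-byte length before carrying the rest.
//SORT01   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=PROD.IMAGING.UNLOAD
//SORTOUT  DD DSN=PROD.IMAGING.INDEX,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(20,5),RLSE),
//            DCB=(RECFM=FB,LRECL=80)
//SYSIN    DD *
  SORT FIELDS=(1,20,CH,A)
  OUTREC FIELDS=(1,20,21:12X,33:21,48)
/*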

Environment: COBOL, JCL, DB2, CICS, DB2 Data Studio, InSync, SPUFI, Changeman, MQ messaging services, File-AID for files and DB2, SAS, REXX, PREPALL, SYNCSORT, $AVRS, TRMS, CA-7, test management tools, DMR (masks PII data during test copies), automatic restart feature, QUICK START and BYPASS routines, file I/O handler that checkpoints for rollbacks and commits, CSQUTIL, XMITIP, Rational Asset Analyzer, Embarcadero, Confluence, JIRA, Link, Agile

Senior Mainframe Analyst

Confidential, Madison, WI

Responsibilities:

  • For report requirements created in CATS/JIRA by clients, analysed the system and table data and prepared high-level criteria documents (MS Word for SQL queries, BLDs for program changes using Visio).
  • Generation of non-ad hoc reports:
  • As required, coordinated with DBAs to create temp tables and load data, and generated reports comparing the temp tables with CARES/ACCESS tables.
  • For program changes, coordinated with the production support team to request UT programs and JCLs, and generated reports through the request (RQJ) cycle.
  • Ran queries for selected partitions and unit tested the query results against approved criteria.
  • Documented the unit test results and other artifacts in CATS/JIRA and worked closely with the system and UAT (client) teams to get the reports tested.
  • Once UAT approved, generated reports for all partitions, formatted the output using various Excel macros and distributed them to clients (a partition-extract job sketch follows this list).
  • Generation of ad hoc reports:
  • Generated routine ad hoc reports (using SQL queries and Excel macros) on a weekly/monthly basis and distributed them to county and consortia clients.
  • Enhanced the prisoner data match response process for the data exchange subsystem.
  • Created prisoner verification details and data match discrepancies for different health program areas.
  • This involved COBOL, DB2, MQ and JCL changes.
  • Coded one-time SQL queries to convert history records in the discrepancy table and the work item table.
  • Supported the transformation of ECF documents to CARES.
  • Fixed routine and ad hoc issues through SPUFI.
  • Provided system testing, acceptance testing and production implementation support.
  • For production issues raised by DHS clients, analysed standard CARES eligibility tables (case-level or individual tables) and other tables, fixed issues using SPUFI and/or recommended CWW online fixes, and tracked resolution in JIRA.
  • Provided pager support by fixing mainframe production nightly job abends on a rotation basis, using Control-M to monitor the jobs.
  • Provided application support for the Changeman version upgrade to 8.1.
  • Tested all major Changeman functions with the new version in DEV and INT for various module types (DB2, IMS, stored procedures, etc.), resolved issues with the DHS production support team, and provided implementation support to roll out version 8.1.
  • Helped application teams schedule test-cycle jobs in Control-M: manual order, hold, force complete, rerun, etc.
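
A minimal sketch of the kind of batch extract behind those reports, assuming a DB2T subsystem, the DSNTIAUL sample plan and illustrative CARES table and column names; the real selection criteria came from the approved criteria documents.

//CARESRPT JOB (ACCT),'CARES EXTRACT',CLASS=A,MSGCLASS=X
//* Pull report rows for a selected county range with DSNTIAUL; the flat
//* file is then formatted with Excel macros before distribution.
//UNLOAD   EXEC PGM=IKJEFT01
//STEPLIB  DD DISP=SHR,DSN=DSNT.SDSNLOAD
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSREC00 DD DSN=CARES.RPT.EXTRACT,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(100,20),RLSE)
//SYSPUNCH DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DB2T)
  RUN PROGRAM(DSNTIAUL) PLAN(DSNTIAUL) PARMS('SQL')
  END
//SYSIN    DD *
  SELECT CASE_NBR, PGM_CD, ELIG_STAT_CD, BENEFIT_MTH
    FROM CARES.ELIGIBILITY
   WHERE COUNTY_CD BETWEEN '01' AND '12'
   ORDER BY CASE_NBR;
/*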

Environment: COBOL, JCL, DB2, IBM Command Editor, IBM Data Studio, Platinum, SPUFI, Changeman, IMS DC, WebSphere MQ messaging, File-AID, JJ SCAN, HOD 11, EOS, Control-M, JIRA, CATS, CWW (CARES Worker Web), ACCESS

Team Lead

Confidential

Responsibilities:

  • Underwent domain certification in cards and payments and self-studied DNB application documents.
  • Brainstormed, raised queries to enhance understanding and engaged in knowledge-sharing sessions with DNB SMEs.
  • Understood each application's objectives, functionality, interfaces and conventions, along with its online/batch processes and flow diagrams.
  • Analysed IMS DB/DC programs using Xpeditor and extensively documented the functional and technical understanding of each application.
  • Performed playback to the SMEs using the documents and obtained approvals.
  • Finalised the approved understanding documents into the Application Maintenance Manual.
  • Worked on severity-3 tickets: made the necessary changes based on the understanding of the application and migrated them using SCLM.

Environment: COBOL, JCL, DB2, IMS, VSAM, BMC, DB2 Admin, SPUFI, SQL, IMS DC, CICS, Xpeditor, SCLM, TWS, HOD, Tivoli Output Manager

Senior Developer

Confidential, PA

Responsibilities:

  • Coded a COBOL program to read current and deactivated data for a given INS ID and mnemonic, unstring (parse) it, load it into COBOL arrays, compare the two and write the changes to a report.
  • Joined with other tables for ISIN, CUSIP and other details, and printed new rows when there were multiple changes.
  • Created jobs for program execution and FTP.
  • Made changes to use the application timestamp table so jobs execute without overlapping securities processing.
  • Loaded 8 new BTRMs and attributes to the EIP database.
  • Coded and executed a one-time SPUFI to load all AVTs to the EIP database.
  • Made backend table and job changes and transmitted XMLs for the IDM UI to receive and display ICB classification data in the Raw and Gold SRM UIs.
  • Modified daily and monthly 4-packs: coded COBOL changes to exclude a duplicate ZL account, included new BTRMs for distribution, changed the nightly delta processing job, and made a parm change to allow ZL trade and margin. The CCB was recalled, so worked on retrofitting.
  • The project included distributing domestic and offshore fund holdings data (marketed for sale in Chile) to the Chilean government, delivered via Compass; EIP distributes this holdings report on a quarterly basis with a 30-day lag.
  • Made changes to source and store new data attributes from Bloomberg, namely TICKER and EXCHG CODE.
  • Coded a COBOL module and associated components to distribute holdings for Confidential's five domestic funds to the international team and then to Credit Suisse (client).
  • Coded COBOL and associated components to generate and drop the Barclays message to MQ (see the CSQUTIL sketch at the end of this list). Performed a one-time job change/SPUFI setup to load tech data serv, updated a stored procedure to send non-region job statistics to the monitoring UI, and fixed a CAT defect.
  • Analyzed and fixed data not populating in Gold through the SRM refresh.
  • Attended scrum project planning sessions and provided input (estimation, functional/technical) to set project timelines.
  • Performed job setup and CAT region abend fixes.
  • Used the Micro Focus Mainframe Express tool to debug programs.
  • Created new shell scripts and modified existing ones.
  • Worked with Control-M on the mainframe and the Enterprise Manager scheduling tool to monitor QA jobs.
  • Worked with IBM optimization tooling for Db2 for z/OS and EXPLAIN of SQLs.
  • Modified Easytrieve reporting programs per requirements.
  • Assisted SAT testers in testing a few CCBs.
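
A minimal sketch of staging a test message onto a queue with CSQUTIL for that MQ work, assuming an MQT1 queue manager; the queue, dataset and library names are illustrative assumptions.

//MQLOAD   JOB (ACCT),'MQ TEST LOAD',CLASS=A,MSGCLASS=X
//* Load prepared test messages from a sequential dataset onto a queue so
//* the downstream consumer can be exercised end to end.
//LOAD     EXEC PGM=CSQUTIL,PARM='MQT1'
//STEPLIB  DD DISP=SHR,DSN=MQM.SCSQANLE
//         DD DISP=SHR,DSN=MQM.SCSQAUTH
//SYSPRINT DD SYSOUT=*
//MSGIN    DD DISP=SHR,DSN=TEST.MQ.MESSAGES
//SYSIN    DD *
  LOAD QUEUE(TEST.BARCLAYS.OUTBOUND) FROM(MSGIN)
/*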

Environment: COBOL, DB2, stored procedures, JCL, UNIX, VSAM, BMC, XML, SPUFI, SQL, Endevor, Xpeditor, Micro Focus Mainframe Express, InSync, REVOLVE, $AVRS, Infodevl View Direct, Control-M, Control-M Enterprise Manager, Team Tracker, Quality Center, HOD

Senior Developer

Confidential, PA

Responsibilities:

  • Modified 4 COBOL-DB2 programs as part of the FSDR normalization project to add the settlement date to the ledger entry table.
  • Coded, and later enhanced, 8 COBOL stored procedures to manage specific plan info, event info and adjustments from COMPASS data structures.
  • Coded 1 stored procedure to manage expense ratios and 1 to manage fund eligibility and holdings eligibility.
  • Coded 4 new native SQL stored procedures to fetch accrued and manual expense amounts and to manage fee setup, data elements functionality and admin fee net expense (a deployment sketch follows this list).
  • Coded 1 COBOL-DB2 program that extracts and populates a report with all audit records for the period from the prior audit execution through the current time, or from the last rollover if there was no prior audit run.
  • Modified the process to receive DTCC dividend data (realized and pending) for dividends, splits, stock dividends and reverse splits from EIP (FIS) instead of InvestOne.
  • Provided an ongoing monthly feed for all funds/accounts/sub-accounts showing the monthly cash flow to/from the advisor.
  • Provided a one-time feed with 10 years' worth of monthly cash flow for all funds/accounts/sub-accounts.
  • Coded 2 COBOL batch programs: one to create date parms for all the FOMI extracts (transaction analysis and fund trend data) and another to get the cash value of a given month for all funds from InvestOne.
  • IPA-to-QEG trade communication: creation of a new TIBCO process to communicate requested and executed trades.
  • Coded 2 native SQL stored procedures to retrieve all orders for a passed-in batch ID and to get all trade orders, dropping them to MQ.
  • IPA end-of-day process: changes to support new sourcing and posting strategies:
  • Modified the program that loads the executed trade file from CRD OMS to Anchor, to support new file format changes.
  • Coded a new program to load the executed trades file to Anchor on demand.
  • Created InvestOne reports:
  • Created the Trade Match, Trade Exception and Trade Impact reports.
  • Modified a VB file read to a table read and created the PTP Calculation and Exception reports.
  • Performed scheduling changes to disable the Trade Interface Summary Report, create the extract from Anchor and SWIFT to DTC, transmit SWIFT to DTC and validate the transmission.
  • Coded 2 COBOL programs to insert cleared and uncleared history data into the history conversion table.
  • Assisted in designing dividend management data structures.
  • Static data load: coded a stored procedure to insert and delete DIV data in EIP and load AVT dividend types.
  • Gathered InvestOne data:
  • Generated the fund list from IDR based on ex-date.
  • Ran the FOMI process to extract data from InvestOne.
  • Created jobs to dynamically build FOMI request parms to fetch the share ending balance and fund units amount.
  • Loaded data to IDR using the InvestOne extract file.
  • Validated data from InvestOne, read the file and updated dividend management data.
  • Coded a program to persist the record date and reinvest date while calculating dividends.
  • Coded stored procedures to retrieve assigned, unassigned and modified dividend details and update manual attributes.
  • Modified 5 online programs to add, change, delete and inquire on the yield tolerance percentage code value in the COMPASS account extension table.
  • Created associated components (JCL, PROC, OPT, parm, DOC), performed unit testing, and supported SAT and UAT testing.
  • Interacted with IT support teams for DBA activities (table structure changes, static loads, AIDRP access), BAM warranty access, Endevor migration tasks, etc.
  • Created change records for production implementation and represented these changes in the inspection meeting.
  • Post-certified elevations and performed post-implementation support.
  • Analysed and estimated the mainframe impact of IPA/TOM IDR Phase II and provided on-call support for implementation.
  • Fixed intermittent issues with IPA test cycles and supported testers by executing test cycles.
  • Created and ran refreshes of test tables to support production parallel testing.
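
A minimal sketch of deploying one such native SQL stored procedure through DSNTEP2, assuming a DB2P subsystem, a DSNTEP12 plan and illustrative EIP table and column names; the actual procedures, parameters and logic were project specific.

//CRTPROC  JOB (ACCT),'DEPLOY NATIVE SP',CLASS=A,MSGCLASS=X
//* Create a native SQL stored procedure that returns accrued and manual
//* expense amounts for a fund. The statement terminator is switched to @
//* so semicolons inside the procedure body are not treated as delimiters.
//DEPLOY   EXEC PGM=IKJEFT01
//STEPLIB  DD DISP=SHR,DSN=DSNP.SDSNLOAD
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DB2P)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP12)
  END
//SYSIN    DD *
  --#SET TERMINATOR @
  CREATE PROCEDURE EIP.GET_EXPENSE_AMT
         (IN  P_FUND_ID  CHAR(8),
          OUT P_ACCRUED  DECIMAL(15,2),
          OUT P_MANUAL   DECIMAL(15,2))
    LANGUAGE SQL
    BEGIN
      SELECT COALESCE(SUM(ACCRUED_AMT), 0),
             COALESCE(SUM(MANUAL_AMT), 0)
        INTO P_ACCRUED, P_MANUAL
        FROM EIP.FUND_EXPENSE
       WHERE FUND_ID = P_FUND_ID;
    END @
/*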

Environment: COBOL, stored procedures, native SQL, JCL, CICS, IBM Database Admin tool, DB2, VSAM, BMC, SPUFI, SQL, Endevor, Xpeditor, DTCN, Micro Focus, InSync, REVOLVE, Infodevl View Direct, Control-M, HOD, InvestOne, CRD
