To obtain a challenging position as a QA Analyst / Data Analyst in a fast-paced technology company, applying my proven technical, interpersonal, analytical, and documentation skills and my diversified Information Technology experience to support the company's sustained growth and continued success.
- Over 7 years of progressive experience as a QA Analyst / Data Analyst spanning Data Validation, Data Reporting, Data Extraction, Data Mapping, Data Modeling, and QA/Testing across diverse projects, clients, and industries; over 10 years in Information Technology overall, with the added advantage of an applications development background on IBM Mainframe / AS400 / UNIX.
- Expertise in the Systems Development Life Cycle (SDLC), including Agile and Waterfall.
- Extensive knowledge of MDM, Data Profiling, Data Quality, and Data Integration.
- Developed Use Case models and Activity, Sequence, and other UML diagrams.
- Experience with Data Analysis, Data preparation, Graphical Presentation, Reporting, Data Validation and Documentation.
- Experienced with data modeling, database design, construction, and tuning.
- Extensive application development experience, from business process analysis through coding, testing, and implementation.
- Created Data Models and documented the Data Dictionary (Metadata).
- Extensively used SQL for analysis, drilling down to determine the root cause of problems and suggest solutions.
- Extensively used TOAD to create and run SQL queries against different databases, performed backend testing, and checked data integrity and data consistency.
- Expertise in translating business requirements documents (BRD/FRD/SRD/TDD/Use Cases) covering business, functional, and technical requirements into testing documents with strategies/plans, test scenarios, and test cases, including appropriate SQL scripts to execute the test cases.
- Expertise in the Software Testing Life Cycle (STLC) methodology; used HP QC/ALM as the defect-management tool for logging and tracking defects.
- Planned and developed Test Plans, Test Cases, and Test Scenarios to meet products' business requirements; experienced in both manual and automated testing, using QTP (QuickTest Professional) and LoadRunner for automated testing.
- Conducted and supervised testing including Unit, Sanity, Performance, Smoke, Functional, White/Black Box, Positive and Negative, Security, Stress, Integration, System, Volume, Load, and Regression Testing.
- Validated Medical, Hospital, and Dental claims in FACETS; verified codes against descriptions and requirements; verified the claims adjudication process in FACETS.
- Expertise in HIPAA-EDI and Privacy testing, with exposure to multiple transactions such as Inbound/Outbound 834 (Membership Enrollment), 837-Institutional and 837-Professional (Claims), 270/271 (Eligibility Benefit Inquiry/Response), 276/277 (Claim Status Inquiry/Response), 835 (Claim Payment/Remittance Advice), and 820 (Premium Payments), testing in client/server systems and Mainframe applications.
- Complete understanding of the ICD-10-CM diagnosis and ICD-10-PCS hospital procedure code sets and the conversion from ICD-9 to ICD-10.
- Extensive experience in Mainframe systems and applications analysis, programming, testing, and implementation; supported new and existing batch/online applications and maintained legacy applications per the company's application architecture in a Mainframe environment using TSO/ISPF, COBOL I/COBOL II, CICS, MVS JCL, DB2, SPUFI/QMF, SQL, VSAM/GDG files, File-AID, Endevor, SyncSort, Easytrieve, Xpediter, Abend-AID, MS Office suite, Windows, etc.
- Strong knowledge of OLTP and OLAP Systems, Dimensional Strategies, Surrogate Key, Star Schema, Snowflake Schema.
- Extensive knowledge of the DWH process, including data modeling (Conceptual, Logical, and Physical), ETL processes and tools, and B.I. processes and tools.
- Excellent knowledge of DataStage/Informatica as ETL (Extract, Transform, Load) tools and of Business Objects/COGNOS as B.I. tools used for data mining, analysis, and reporting to support management decision-making.
- Excellent organizational, interpersonal, communication, and documentation skills, with good process-management skills and a remarkable ability to gather requirements to deliver a quality product.
- Ability to successfully manage multiple deadlines and multiple projects effectively through a combination of business and technical skills.
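Several bullets above mention SQL-based backend testing for data integrity and consistency; the checks can be sketched minimally as below, using SQLite as a lightweight stand-in for the Oracle/DB2 databases named above (all table and column names are hypothetical, invented for illustration):

```python
import sqlite3

# Hypothetical source and target tables standing in for a real
# source-system-to-warehouse load; names and data are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_member (member_id INTEGER PRIMARY KEY, plan_code TEXT);
CREATE TABLE tgt_member (member_id INTEGER PRIMARY KEY, plan_code TEXT);
INSERT INTO src_member VALUES (1, 'HMO'), (2, 'PPO'), (3, 'PPO');
INSERT INTO tgt_member VALUES (1, 'HMO'), (2, 'PPO'), (3, 'PPO');
""")

# Check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_member").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_member").fetchone()[0]
assert src_count == tgt_count, "row-count mismatch"

# Check 2: no source rows missing from the target (completeness).
missing = cur.execute("""
    SELECT COUNT(*) FROM src_member s
    LEFT JOIN tgt_member t ON s.member_id = t.member_id
    WHERE t.member_id IS NULL
""").fetchone()[0]
assert missing == 0, f"{missing} rows not loaded"

# Check 3: column values agree row by row (consistency).
mismatched = cur.execute("""
    SELECT COUNT(*) FROM src_member s
    JOIN tgt_member t ON s.member_id = t.member_id
    WHERE s.plan_code <> t.plan_code
""").fetchone()[0]
print("rows:", src_count, "missing:", missing, "mismatched:", mismatched)
```

In practice the same three queries would be run in TOAD or SPUFI against the actual source and target schemas.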
Data Analyst / QA Analyst
- Working on the Pricing Solution application, which deals with the DW (Data Warehouse).
- Working at Nationwide while representing IBM, a world-renowned company in the computing world.
- Performing a dual role, working as both a Data Analyst and a Quality Assurance Analyst.
- Working with a team designing and preparing ETL mappings in Informatica to load data from Source to Staging and from Staging to Target for Dimension and Fact tables.
- The Pricing application receives business-related problem reports that arise while data is loaded into the DW.
- All problems are scheduled for rectification each month under the monthly release.
- Performing testing for all problems listed in the requirements document for the monthly release.
- Documenting all testing efforts under the testing section of each monthly release document.
- Working extensively with SQL in Oracle TOAD, creating and running queries to analyze data, find the root cause of problems, and suggest solutions.
- Working with the B.I. tool MicroStrategy to generate different reports filtered by State and date.
- Working on MARI (Measuring Actuarial Rate Impact), part of the Actuarial work on the Mainframe.
- Using Rate Control software to modify rates, then moving them to the Mainframe and running the job.
- Providing daily status reports on problems and testing efforts to the Application and Iteration Manager.
Technical Environment: UNIX, Teradata, Oracle, TOAD, Informatica, MicroStrategy, IBM Mainframe, OS/390, COBOL, JCL, DB2, MS Office, Windows, etc.
Lead QA Analyst
- Working with 3-4 Quality Assurance Analysts.
- BCBS KC uses DB2 as the database, maintained on IBM Mainframe.
- Create and run queries on DB2 for each Parallel job once it has completed successfully and been tested by the DataStage developer.
- The EDW contains all HIPAA-EDI transaction data.
- Using an Excel tracker for task and time management, covering both jobs and QA.
- BCBS KC has developed its own standard for this project, called CDMA (Codes and Data Mapping Application).
- Participated in development of an MS Access SQL utility model to create and run queries for verification of DataStage jobs.
- About 15 DataStage developers worked to convert DataStage Server jobs into DataStage Parallel jobs.
- Using Daptiv, a third-party tool, for defect management and for communicating with developers to fix defects.
- Closely working with BCBS KC staff in Kansas City and the Project Manager/Project Architect; involved in planning the Test Strategy for the whole testing period.
- Prepared Test Strategy, Test Plans, Test Cases, and Scenarios per requirements and executed them as needed.
- Provided daily status reports about problems to the Project Manager.
Technical Environment: MS Access, Mainframe, OS/390, DB2, Daptiv, MS Office suite, Windows.
IT Testing Analyst
MVP Health Care's corporate office, based in Schenectady, NY, has sub-offices in Rochester and Syracuse, NY. MVP Health Care has a trading business partnership with Cigna Health Care and uses a Sybase database to meet its data-processing and data-storage challenges.
- Worked on EDW FRDM (Facets Reporting Data Mart), a project under the Enterprise Data Warehouse; MVP is migrating from Sagent (a Pitney Bowes ETL tool) to Informatica and from Business Objects to COGNOS (a B.I. tool).
- Worked on the project to upgrade EDI software from 4010 to 5010 and ensure compliance.
- Worked on System Testing and Functional Testing of NYS 837I (Institutional Claims) and NYS 820 (Medicaid Premium Payment).
- Closely worked with SMEs, Business Analysts, and Systems Analysts of the Mainframe-based EDI application.
- Moved data from production to the test environment (a "refresh") on a bi-weekly basis.
- Performed Data Analysis, using Oracle SQL Developer to execute queries verifying data correctness and completeness in the test environment.
- Performed Performance and Load Testing using LoadRunner to verify system performance and load.
- Prepared Test Strategy, Test Plans, Test Cases, and Test Scenarios per requirements and executed them as needed.
- Worked with the development team to modify existing MVS COBOL programs and MVS JCL to meet HIPAA-EDI 5010 compliance, using SPUFI, TSO/ISPF, MVS COBOL, MVS JCL, Serena ChangeMan/Endevor, SQL, SyncSort, Easytrieve, etc.
- Using Facets to check claims and benefits after data was successfully moved from the source database to the Facets database (based on Oracle).
- Executing test cases in the Test Lab; on finding defects, reporting them to the Test Lead/Test Manager, creating defects in the Defects Module of MQC version 11 (ALM), and assigning them to the development team.
Technical Environment: Mainframe (z/OS/390), TSO/ISPF, DB2, SPUFI, MVS JCL, HP QC (ALM), LoadRunner, FACETS, Oracle SQL Developer, UBE, MS Office suite, Windows.
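The 4010-to-5010 work above deals with X12 EDI transactions, whose structure (segments terminated by "~", elements separated by "*") can be illustrated with a simplified parsing sketch; the sample segment string is invented and far simpler than a real 837 file, which carries ISA/GS/ST envelopes and many more segments:

```python
# Simplified X12 EDI segment parsing; the sample data is invented and
# omits the envelope segments a real 5010 file would contain.
raw = "ST*837*0001~BHT*0019*00*REF123*20230101*1200*CH~CLM*PATID1*125.50~SE*4*0001~"

segments = [s for s in raw.split("~") if s]       # split into segments
parsed = [seg.split("*") for seg in segments]     # split into elements

# Index segments by their ID (the first element) for quick lookup.
by_id = {}
for elements in parsed:
    by_id.setdefault(elements[0], []).append(elements[1:])

claim = by_id["CLM"][0]   # CLM segment: claim ID and charge amount
print("transaction:", by_id["ST"][0][0])
print("claim id:", claim[0], "charge:", claim[1])
```

Testing a 5010 upgrade largely amounts to asserting that segments, elements, and code values like these conform to the new implementation guides.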
Application Analyst/Data Analyst Consultant
AMERIGROUP Corporation is a multi-state managed healthcare company focused on serving people who receive healthcare benefits through publicly funded healthcare programs, including Medicaid, the Children's Health Insurance Program (CHIP), Medicaid expansion programs, and Medicare Advantage. AMERIGROUP uses the FACETS software package to process claims and MDE (Medical Data Express), a query-generation software package, to verify claims.
- Directly involved in downloading data from the Production environment to different Test environments defined in the Oracle database.
- Performed data analysis to verify data correctness and completeness in the test environment.
- Worked on new business as well as business expansion in different States, including Louisiana (new business), Washington (new business), Texas (TX-JP expansion), and NY (NY Homeless, NY Health Plus expansion).
- Assisted and coordinated with other teams while involved with multiple work streams.
- Attended JAD sessions and participated in discussions to resolve issues.
- Attended weekly meetings to keep up to date with issues and business changing elements.
- Extensively executed SQL to download data from DB2 on the Mainframe to the test environment in the Facets databases (Oracle).
- Performed Data Analysis to check and verify data completeness and correctness.
- Documented the results of all SQL executions and submitted them to the Manager weekly and as required.
- Wrote new and modified existing UNIX shell scripts to move data from source to Facets.
- Prepared Test Strategy; translated BRD documents into Test Plans, Test Cases, and Test Scenarios; closely worked with the development and testing teams.
- Closely worked with SMEs, Business Analysts, and Systems Analysts at every stage of the application.
- Provided daily status reports about problems to the Project Manager and to Business and Systems.
- Performed back-end testing using SQL queries (SPUFI) on DB2 to ensure data consistency and data integrity, and documented the results.
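The source-to-Facets data move described above can be sketched as below; Python stands in for the UNIX shell scripts mentioned, and the directory and file names are hypothetical:

```python
import csv
import shutil
import tempfile
from pathlib import Path

# Sketch of a source-to-staging data move with a record-count check,
# mirroring the shell-script logic described above. Paths are invented.
source_dir = Path(tempfile.mkdtemp()) / "source_drop"
staging_dir = Path(tempfile.mkdtemp()) / "facets_staging"
source_dir.mkdir(parents=True)
staging_dir.mkdir(parents=True)

# Create a small sample extract file (invented data).
extract = source_dir / "member_extract.csv"
with extract.open("w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows([["member_id", "plan"], ["1", "HMO"], ["2", "PPO"]])

def count_records(path: Path) -> int:
    """Count data rows, excluding the header line."""
    with path.open() as f:
        return sum(1 for _ in f) - 1

moved = []
for src_file in source_dir.glob("*.csv"):
    before = count_records(src_file)
    dest = staging_dir / src_file.name
    shutil.copy2(src_file, dest)          # copy, preserving metadata
    after = count_records(dest)
    assert before == after, f"record count changed for {src_file.name}"
    moved.append((dest.name, after))

print(moved)
```

The record-count comparison before and after the move is the same completeness check the back-end SQL testing performs inside the database.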
Technical Environment: IBM Mainframe (OS/390), TSO/ISPF, DB2, SPUFI, MVS JCL, SyncSort, FACETS, MDE 4.2.3 (Claim Test Pro), Oracle, SQL Server, HP MQC, MS Office, Windows.
Lead QA Analyst / Tester
The Palo Alto Medical Foundation for Health Care, Research and Education is a not-for-profit health care organization and a pioneer in both multispecialty group practice and outpatient medicine.
- Leading a team of 6-7 software testers; distributed work among them per their strengths and knowledge and resolved issues for each tester.
- Performed task management and time management for the project/team.
- Participated in developing Test Plans, Test Cases for System Testing.
- Developed the test cases per the HIPAA regulations (270, 271, 275, 276, 278, 834, and 837).
- Performed manual testing considering both positive and negative scenarios.
- Conducted Regression testing, identified software errors and interacted with developers to resolve technical issues.
- Performed Performance and Load testing on the Claims inquiry system using LoadRunner.
- Performed Functional and Regression Testing of FACETS.
- Involved in creating test cases and test scripts in QTP.
- Performed Functionality and GUI testing using QTP.
- Defined and executed test strategies and associated scripts for the verification and validation of the application, ensuring that it meets all defined business requirements and associated functionality.
- Provided management with metrics, reports, and schedules as necessary and participated in design walkthroughs and meetings.
- Worked with EDI transactions 837 and 835 Remittances and validated the Claims in FACETS.
- Involved in processing Claims in FACETS and validating the full-cycle process to make sure the checks are generated.
- Worked on the continuous improvement of QA Process by reviewing and evaluating existing practices with standard testing guidelines.
- Worked in Agile methodology to incorporate the changes in different phases.
- Used Quality Center to track and report bugs and defects and was also responsible for communicating the status to everyone involved.
- Performed Back End Testing for data validation to ensure the accuracy of data.
Technical Environment: Quality Center, LoadRunner, QTP, MS SQL Server, Windows, MS Office suite, MS Excel.
ETL Analyst /Data Analyst
- Involved in requirements gathering, source data analysis, and identified business rules for data migration and for developing Data warehouse / Data Mart.
- Closely worked with SME/Business Analyst/Systems Analyst to understand Business requirement and technical solution.
- Worked on MDM for integrated Credit Card data and on the MDM Hub for RBC Retail Banking data.
- Prepared ER Diagrams to resolve issues in Data warehouse.
- Involved in identifying and defining data inputs and captured Metadata and associated rules from various data sources for the ETL process for the Data Warehouse.
- The ETL (Extraction, Transformation, and Loading) project included design and development of DataStage DBs with a series of scheduled DTS packages invoked through DTSRUN with parameters, driven by configuration files.
- Created source-to-target mapping documents for DataStage (ETL) to load data from the staging area to the Data Warehouse.
- Worked on documentation of the data model using an extensive set of ERwin conceptual, logical and physical diagrams.
- Worked with DataStage to cleanse data from IMS/VSAM/SAM/QSAM per project requirements and then uploaded it into the Data Warehouse (based on Teradata).
- Extensively used Teradata SQL Assistant and Teradata Manager to run queries, the utilities BTEQ, FastLoad, and MultiLoad to load bulk data into the Data Warehouse, and FastExport to unload data from the Data Warehouse in the required format.
- Used DB2, VSAM/GDG/SAM/QSAM as a source to upload data in Data Warehouse, used SyncSort utility to sort data.
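The source-to-target mapping documents described above pair each staging column with a warehouse column; one way such a mapping can drive automated column-level checks is sketched below, with SQLite standing in for Teradata and all table, column, and mapping names invented:

```python
import sqlite3

# Invented source-to-target mapping, modeled on the mapping documents
# described above: staging column -> warehouse (target) column.
mapping = {
    "cust_nm": "customer_name",
    "acct_no": "account_number",
}

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_account (cust_nm TEXT, acct_no TEXT);
CREATE TABLE dw_account (customer_name TEXT, account_number TEXT);
INSERT INTO stg_account VALUES ('Alice', 'A-100'), ('Bob', 'A-200');
INSERT INTO dw_account VALUES ('Alice', 'A-100'), ('Bob', 'A-200');
""")

# For each mapped column pair, compare the distinct value sets between
# staging and target; a difference indicates a load or transform defect.
failures = []
for src_col, tgt_col in mapping.items():
    src_vals = {r[0] for r in cur.execute(
        f"SELECT DISTINCT {src_col} FROM stg_account")}
    tgt_vals = {r[0] for r in cur.execute(
        f"SELECT DISTINCT {tgt_col} FROM dw_account")}
    if src_vals != tgt_vals:
        failures.append((src_col, tgt_col))

print("column mismatches:", failures)
```

Driving the checks from the mapping document itself keeps the validation in step with the ETL design as mappings change.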
Technical Environment: IBM z/OS, TSO/ISPF, COBOL, VSAM, JCL, DB2, SyncSort, SQL Server 2005, Erwin DM, DataStage 7.5, Teradata and utility, MS office suite, Windows.
- Involved in planning the strategy for the testing efforts.
- Prepared Test Plans, Test Cases, and Scenarios per requirements and executed them.
- Worked through the Software Testing Life Cycle (STLC); used Mercury Quality Center (Mercury Test Director) as the defect-management tool for defect/bug tracking.
- Tested new and modified programs to verify business functionality as well as end-user requirements.
- Conducted Unit testing and Integration testing and verified the impact manually.
- Worked and provided production support, maintenance and testing. Developed new programs and modified the existing programs to incorporate new changes.
- Supported the Transportation and Utility Billing Project using TSO/ISPF, COBOL, COBOL II, MVS JCL, CICS, VSAM, DB2, SQL, SPUFI/QMF, File-Aid, Endevor, XPEDITER, etc.
Technical Environment: Mercury Test Director 8.0, IBM OS/390, TSO/ISPF, COBOL, COBOL II, JCL, VSAM, CICS, DB2, Easytrieve, SyncSort, Endeavor, Xpediter, FTP, File-AID, Windows 2000/XP, etc.
Sr. System Developer Mainframe
- Worked as a team member to launch credit cards for clients, using COBOL, JCL, VSAM/GDG, SyncSort, TSO/ISPF, Easytrieve, FTP, NDM, CA-7, File-Aid, and Endevor.
- Redeveloped the RMS application running on AS/400 using RPG IV, COBOL/400, CLP/400, DB2/400, SQL, Embedded SQL, Query/400, PDM, SEU, SDA, RLU, and the Interactive Source Debugger, reducing processing time by 40%.
- Worked as an effective team member in the development and implementation of Visa credit cards for CIBC and for clients AMEX and Shoppers Drug Mart in a Mainframe environment.
Technical Environment: IBM OS/390, TSO/ISPF, COBOL, JCL, VSAM/GDG's, MQ Series, SyncSort, Easytrieve, FTP, NDM, SAR, CA-7, COBOL/400, CLP/400, DB2/400, Query/400, PDM, RLU, MS Office, Windows, etc.
Programmer Analyst Mainframe
- Worked on EDI (Electronic Data Interchange): transferring inbound data from the broker system to the host system (Mainframe), uploading it into the system, formatting it as data files, validating it, uploading valid data into applications, running the policy application, and sending data that still contained errors back to the broker for further rectification.
- Worked on the HUON insurance package (Personal and Commercial), customizing it to the Canadian insurance standard (CSIO) as well as company standards.
Technical Environment: IBM OS/390, MVS/VSAM, COBOL, DB2, CICS, VSAM, GDG, TSO/ISPF, MVS JCL, Endevor, SPUFI, File-Aid, FTP, SyncSort, Easytrieve, XPEDITER, MVS CEDF.
Programmer Analyst Mainframe
- Worked with COBOL, JCL, Easytrieve, and CMS/XEDIT to maintain insurance applications for auto and home (Property and Casualty).
- Actively participated in the conversion of the application and data from a legacy Mainframe application to HP-UX (Micro Focus COBOL and an Oracle database).
Technical Environment: IBM 3090, HP/9000, VM/ESA, COBOL, Micro Focus COBOL, UNIX, CMS/XEdit, REXX, JCL, IEBGENER, Easytrieve.