QA Lead/Management (Offshore/Onshore Client) Resume
Chicago, IL
PROFESSIONAL SUMMARY:
- Over 8 years as an IT Quality Assurance Analyst/Tester with experience testing web and client-server applications. Created Test Strategies, Test Plans, Test Scripts, Test Cases, and Test Metrics; excel in bug tracking, analyzing defects, and reviewing test cases.
- Excel in various testing methodologies and types of testing. Exceptionally organized, efficient, and precise, with strong liaison skills; strong verbal, written, and interpersonal communication skills; and analytical thinking with good business acumen.
- Expert in black-box testing, including Functional, Integration, Smoke, Regression, System, Unit, and Acceptance testing; also highly skilled in executing test cases and analyzing BRDs (Business Requirement Documents).
- Experienced in producing high-quality documentation such as Test Plans, BRDs, and Reports, and in conducting/facilitating JAD sessions with superior organization and presentation skills.
- Gathered business requirements by conducting detailed meetings with business users and stakeholders.
- Excel in manual testing of web and e-commerce applications.
- Expertise in QC/ALM version 11.0.
- Extensive interaction with end users during UAT.
PROFESSIONAL EXPERIENCE
Lead/Management (Offshore/Onshore Client)
Confidential, Chicago, IL
Responsibilities:
- Managed the testing effort and offshore test leads/teams to plan, construct, and execute release tests for the mobile application.
- Applied Agile testing methodologies by preparing sprint work plans in MS Project, and defined and implemented key metrics to manage and assess the testing process, including test execution and defect resolution.
- Applied business and functional knowledge to meet the team's overall test objectives.
- Gained an understanding of the eAgent application (3 platforms: Mobile/CRM/Legacy).
- Developed the RTM, work plans, and test plan for each platform, as well as one master work plan covering all three platforms.
- Created ADM processes and methods using IBM RRC/RQM/RTC.
- Trained offshore teams on the Appcelerator (cloud) tool.
- Conducted interviews, reviewed offshore resource resumes, and selected highly skilled consultants from the Accenture QA Excellence Resource.
Environment: Mainframe, MS Project, Appcelerator, IBM tools (RRC/RQM/RTC).
Confidential, Tempe, AZ
Lead (UAT Product Management - FCS/MAS Applications)
Responsibilities:
- Conducted meetings to analyze the documents/process overview of the project.
- Created Test Plans and Test Data by gathering information from the documents.
- Managed numerous team members by coordinating work, and created spreadsheets breaking down the documents for a more efficient workflow and successful project execution.
- Led the UAT team by providing direction and support in implementing SDLC processes and standards, and participated in Office of Quality Control review meetings.
- Conducted reviews of XML files transferred from the EXPRESS platform.
- Validated test case data in XML data files.
- Provided leadership, test framework and actively identified needs to resolve them immediately. Conducted team status meetings regularly and facilitated test issue resolutions.
- Gathered and analyzed data from files and performed research to identify problems and resolve discrepancies.
- Created and performed SoapUI testing by executing test scripts after verifying the functionality of the application.
- Created a project process flow in an Excel sheet to help the team understand the minor/major details of the project.
- Collected data from third-party vendors and customers and relayed the updated information to the appropriate teams.
- Communicated effectively with project team members and stakeholders, and made recommendations to management on process and test approach.
- Gathering and documenting project deliverables.
- Participated in and conducted daily Agile Scrum project meetings and planning.
- Worked on the back-end and front-end mainframe (CBOS).
Environment: Mainframe - CBOS, XML, Word, Excel, ALM (QC - 11.0) and DB2
Confidential, Baltimore, MD
Subject Matter Expert/Sub-Lead/Business Analyst
Responsibilities:
- Participated in project initiation meetings to assess the purpose, scope, and vision of the project by interacting with the client at SSA.
- Analyzed Business Requirements documentation and developed a test plan for end-to-end testing of the Telephonic Application.
- Extracted test cases from the Functional Requirement Specification, Technical Requirement Specification, and UI documents, and wrote detailed test procedures in Excel to cover the entire functionality of the application.
- Provided defect and test status reports to the project management team periodically through Quality Center.
- Communicated and interacted on a regular basis with project manager and development team during different stages of the project.
- Involved in tracing test cases to new enhancement specifications through the RTM.
- Managed schedules and collaborated with different teams working on the project.
- Assisted with required data analysis, interpretations, correction and document presentation for various assignments.
- The scope of the project was to test the set of requirements expected to move to production as the production release, ensuring that the release met user expectations.
- Designed requirements traceability matrix to trace/manage business requirements.
- Prepared 1000+ test scripts for the CARE 2020 project.
- Updated the Scripts related to Telephonic Application.
Environment: Quality Center and MS Office Suite
Confidential, Birmingham, AL
Business System/Coordinator/Execution Analyst/Sr. QA
Responsibilities:
- Expertise in problem solving and tracking bug reports using ALM.
- Communicated OLB application downtime to peers.
- Held extensive discussions with the Compass Bank team to gather information related to the BBVA Bank merger/migration.
- Created test requirements to integrate both applications and prepared a single set of integration test scripts covering the Alnova and Legacy applications.
- Participated in project initiation meetings to assess the purpose, scope, and vision of the project by talking to key stakeholders in the organization.
- Created Test Plans
- Analyzed test requirements and reviewed the documentation.
- Reviewed test cases before sign-off.
- Created use cases and validated them with development using the SDLC approach.
- Created UI documents.
- Gathered and documented project deliverables.
Environment: Mainframe, Yodlee, Word, Excel, PowerPoint, ALM (11.0), JIRA and DB2
Confidential, Charlotte, NC
Sr. QA Lead Execution Analyst
Responsibilities:
- Expertise in Problem solving and Tracking Bug Reports using ALM.
- Communicated OLB application downtime to peers.
- Created new customer profiles in the BOSS application and linked them to accounts.
- Created new online IDs for testing purposes.
- Planned and scheduled functional and performance QA work for new and ongoing Atlas applications used by BOA.
- Worked across multiple teams (Off shore, near shore, on shore teams) in coordinating, educating and implementing / standardizing QA processes.
- Communicated with and directed the team regarding new updates, and forwarded information related to Bill Pay. Worked with numerous third-party tools such as YODLEE, LEAN, FISERV, and Mainframe.
- Transferred scripts from Quality Center to QTP.
- Manage test case development and execution.
- Helped the offshore team verify the same results on particular scripts/steps of the application.
- Monitored test execution of application work flow modules after deploying them.
- Manually conducted Positive and Negative tests. Analyzed system specifications.
- Analyzed the results of defects in ALM and re-tested the show stoppers.
- Trained new testers and conducted knowledge transfer (KT) sessions for new groups of analysts covering the SDLC and project details, and set up their data under their respective ID numbers.
- Created reports on how to document and report bugs found during each phase of the testing cycle using ALM.
- Took responsibility for consolidating the User Acceptance Testing results from the entire team and provided the documentation to the test manager.
- Interacted extensively with customers/end users during User Acceptance Testing, both for knowledge transfer and to resolve customer issues.
- Involved in test strategy creation for UAT by identifying the set of test cases related to the release of the application.
- Identified and recorded User Acceptance Testing defects in ALM.
- Helped establish the configuration management process for the release as part of the QA team.
- Helped establish trust between the configuration team and the testing team to build the environment baseline, which was updated with the release baseline.
- Interacted with Teams, Manager and clients to Identify process flow and narrative for our vendor management process.
- Identified the required e-Bills test data and executed manual test procedures.
- Tracked Defects in ALM.
- Prepared status summary reports with details of executed, passed, and failed test cases using Test Manager.
- Made extensive use of MS Office tools to create and maintain documents such as test plans, test execution and test results documents
- Initiated and conducted triage calls related to deployment issues and defects.
- Oversaw UAT black-box (functional) and end-to-end testing of various releases.
- Interacted with the UAT team and Line of Business to resolve issues.
- Supervised and trained the team and kept them updated with project information.
- Complete knowledge of front-end and back-end testing using black-box testing methodology.
Environment: JAVA, Mainframe, Yodlee, Lean, Windows, Quality Center, Word, Excel, Target, Discovery, WCC, Weas less, and Dartnet.
Confidential, Fairfax, VA
QA Analyst
Responsibilities:
- Tested the software using Manual Testing Techniques.
- Prepared and executed test cases and test data. Involved in testing different modules.
- Prepared test cases for User Define Enhancement, Imaging and Commissions Sales Fee.
- Worked toward current project goals by resolving PMRs (bugs) and performing ad hoc testing.
- Responsible for executing and maintaining test scripts/cases in QC
- Performed Functional testing, Acceptance Testing, Regression testing and System testing.
- Detected bugs and classified them based on severity and priority in the JIRA bug tracking system.
- Created Graphic and Slide Presentation to convey the process of the application before creating scripts.
- Prepared test cases using a GUI spreadsheet before review/update/discussion with the BA/Developer.
- Creating Automation Test Plan and getting approvals
- Creating, Storing, Organizing, and managing Test Automation Resources
- Interacted with developers regarding new updates and resolved issues prone to becoming show-stoppers.
- Responsible for the development and configuration of Mercury Quality Center and Quick Test Pro.
- Attended meetings and interacted with Business Analysts and developers to acquire knowledge of new requirements. Helped out and performed as required to meet deadlines.
- Creating, Enhancing, debugging and Running Tests
- Organizing, monitoring defect management process
- Handling changes and performing Regression Testing
- Finding solutions for Object Identification issues and error handling issues
- Coordinated with team members and the development team to solve issues.
- Interacted with client-side contacts to solve issues and provide status updates.
- Worked on metadata that uploads images to Fedora (a third-party tool) to transform the image files in the application.
Environment: JIRA, Windows, Quick Rules, QC, Metadata, Fedora, J2EE, MS Word, and Excel.
Confidential
Quality Assurance Analyst
Responsibilities:
- Developed Manual Test cases for specific functional requirements.
- Made recommendations for manual and automated testing of varied functionality.
- Performed regression testing on weekly builds.
- Used PVCS Version Manager to check out the latest build versions for testing and to periodically check in updated test cases and test documentation.
- Participated in status meetings to report issues.
- Communicated with developers through all phases of testing to prioritize bug resolution.
- Understanding Test Requirements and analyzing the Application.
- Creating Tests/Test Scripts using keyword driven methodology.
- Creating reusable components.
- Performed system and regression testing for new releases.
- Worked with developers to reproduce errors and resolve software issues.
- Debugging Tests and Fixing errors
- Executing/Running Tests
- Performed navigation testing and cross-browser testing with IE and Netscape to test application behavior using QuickTest.
Environment: Test Director, PVCS Version Manager, MS Office, Windows XP.
Confidential, New York, NY
Quality Assurance Tester
Responsibilities:
- Knowledge of Software Development Life Cycle (SDLC) and Test Methodology
- Expertise in problem solving and tracking bug reports using Test Director 7.6.
- Manually conducted the Sanity test.
- Manually conducted Positive and Negative tests
- Complete knowledge of Front End testing and Back End testing using Black Box testing Methodology
- Designed test cases and test procedures for the AUT (Application Under Test).
- Analyzed system specifications.
- Excelled in writing test plans and implemented numerous test cases in Test Director.
- Analyzed the results of manual and automated tests.
- Tracking and reporting defects, using Test Director.
- Documented and reported bugs found during each phase of the testing cycle using Test Director.
- Prepared test plans based on analysis of use cases, business process specification documents, and User Interface documents.
- Identified the required Test data to execute Test Cases.
- Execution of Manual Test procedures.
- Tracked Defects in PVCS Tracker.
- Performed End-to-End testing manually
- Performed functional testing of web pages.
- Prepared status summary reports with details of executed, passed, and failed test cases using Test Manager during training.
- Made extensive use of MS Office tools to create and maintain documents such as test plans, test execution and test results documents
- Performed smoke testing on older applications for training purposes.
- Learned to analyze system specifications and develop detailed system test plans.
- Developed Test Procedures and Test Cases.
- Learned manual testing for the whole system.
- Acquired knowledge in Functionality Testing, Security Testing, Parallel Testing, and Regression Testing.