QA Test Lead Resume
Columbus, Ohio
PROFESSIONAL SUMMARY:
- 14 years of diversified experience at Confidential in Quality Assurance and Software Performance Testing
- Hands-on experience in performance benchmarking of the Confidential Sterling Multi-Channel Fulfillment product (OMS)
- Knowledge of the Sterling Service Definition Framework (SDF) and the Agent and Integration Framework
- Hands-on experience in performance benchmarking of Warehouse Management Open System (WMOS), the Manhattan Warehouse Management System
- Superior analytical and problem-solving skills
- Ability to work independently as well as within a team environment
- In depth understanding of Software Development Life Cycle (SDLC)
- Confidential with business users to review and update Business Process Requirements (BPR) and Software System Requirements (SSR). Responsible for requirements gathering, creating the test plan, high-level test case documentation and complete end-to-end performance testing
- Implemented various QA methodologies and Best Practices involving preparation of Test Plans, writing Test cases
- Skilled in manual and automated testing techniques using HP-Mercury Interactive tools (LoadRunner, Test Director, Quality Center) and Borland Silk Performer
- Executed performance test scripts using LoadRunner/Silk Performer and analyzed the results
- Conducted load, stress, configuration, isolation, capacity, failover, disaster recovery (DR) and endurance tests using LoadRunner/Silk Performer
- Expert at creating effective load models that mimic production/real-life load on the system
- Experienced in testing Stand-Alone, Client-Server, Web-Based and Web Services (SOA) Applications
- Proficient in different phases of Scripting: Record, Enhance, Configure and Dry run
- Performed Back End Testing by executing SQL queries and Data Driven Testing
- Ability to effectively analyze a variety of Performance Reports and Summary Reports
- Built tools that help gather and present results from raw server statistics
- Experienced in reporting bugs using Bug reporting tools such as HP Quality Center. Ability to analyze and debug with Root Cause Analysis
- Able to make critical decisions on time-versus-quality tradeoffs
- Led onsite and offshore teams to achieve desired goals
- Exceptional team player with the ability to lead, manage and work independently in a fast paced environment
- Driven to meet aggressive milestones. Excellent organizational, communication and interpersonal skills
- Open to learning new technologies
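The server-statistics tooling mentioned above can be illustrated with a small sketch. This is a hypothetical example, not the actual tool; it assumes the standard Linux `vmstat` column layout (us/sy/id at field positions 12-14):

```python
# Illustrative sketch: summarize raw `vmstat` samples into CPU averages.
# Column positions follow the common Linux vmstat layout; adjust per platform.

def summarize_vmstat(lines):
    """Average the user/system/idle CPU columns from vmstat output."""
    samples = []
    for line in lines:
        fields = line.split()
        # Skip the two header rows; data rows start with a number.
        if not fields or not fields[0].isdigit():
            continue
        us, sy, idle = int(fields[12]), int(fields[13]), int(fields[14])
        samples.append((us, sy, idle))
    n = len(samples)
    return {
        "samples": n,
        "avg_user": sum(s[0] for s in samples) / n,
        "avg_sys": sum(s[1] for s in samples) / n,
        "avg_idle": sum(s[2] for s in samples) / n,
    }

# Hypothetical captured output from `vmstat 5` during a test run
raw = [
    "procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----",
    " r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st",
    " 1  0      0 812340  20480 512000    0    0    10    20  150  300 20  5 70  5  0",
    " 2  0      0 810100  20480 512300    0    0    12    25  160  320 30 10 55  5  0",
]
print(summarize_vmstat(raw))
```

A summary like this feeds directly into the comparative report templates described in the experience sections below.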
TECHNICAL PROFICIENCY:
Testing Tools: Silk Performer 2008/2009/2010/2011 & Version 9.0, LoadRunner 7.6/8.0/8.2/9.0
Bug Reporting Tools: MS Excel, MS Word, HP Quality Center
Scripting Languages: JavaScript, TSL, and VBScript
Web Technologies: HTML, XML, EJB, and JSP
Database: Oracle, MS SQL Server and MS Access
Platforms: Windows XP/NT/2008, UNIX, and Red Hat Linux
Monitoring Tools: SiteScope, Perfmon, SAR (UNIX), LoadRunner Online Monitoring and Diagnostics, NMon, VMStat & Silk SAM
PROFESSIONAL EXPERIENCE:
Confidential
QA Test Lead
Environment: Borland Silk Performer 7.1/8.1 (2010)/2011/9.0, UNIX, Oracle 10g, Exadata, Manhattan Warehouse Management System (WMOS), Confidential Sterling Order Management System
Responsibilities:
- Conducted formal and informal meetings for Requirements Gathering prior to the start of the performance testing cycle
- Confidential with business users to review and update Business Process Requirements (BPR) and Software System Requirements (SSR)
- Worked closely with developers, SMEs and functional test leads to understand the functionality of the application and processes
- Coordinated with Business for gathering business requirements and worked with infrastructure team for environment setup and data preparation in the testing environments
- Estimated test effort and test schedules and worked with the project managers to support the project schedule
- Involved in functional testing of Manhattan Associates' WMOS and EIS applications
- Created a process, SQL scripts and Excel tools that helped create the required amount of test data in a short span of time
- Implemented and documented scripting standards across all performance testing scripts, ensuring the required business processes were captured and enabling script reuse during each year's peak season and quarterly/monthly SND testing
- Because the application is data driven, the performance test scripts read their data from a database, enabling executions lasting 16 hours
- Created performance monitoring standards across the projects
- Created a standard report template presenting predictive and comparative analysis based on the captured data points
- Used tools such as Silk Performer SAM, VMStat, NMon and SQL queries to capture thresholds, applied load and bottlenecks
- Created load models that mimic production-like load on the system during performance test execution
- Conducted result analysis and interacted with developers, QA Leads and Sr. Architect to resolve bugs and defects
- Identified bottlenecks
- Increased efficiency with measurable and repeatable load generation
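The load-model work described above typically derives per-user pacing from a target hourly transaction rate. A minimal sketch, with hypothetical numbers:

```python
# Illustrative load-model calculation: derive the pacing (seconds between
# iterations per virtual user) needed to hit a target hourly transaction
# rate. The rates and vuser counts below are hypothetical.

def pacing_seconds(target_tx_per_hour, vusers):
    """Seconds each vuser must wait between iterations to hit the target."""
    tx_per_user_per_hour = target_tx_per_hour / vusers
    return 3600.0 / tx_per_user_per_hour

# e.g. 18,000 orders/hour spread across 100 vusers -> one iteration every 20 s
print(pacing_seconds(18000, 100))
```

In LoadRunner or Silk Performer this value would be configured as the scenario's pacing interval so the applied load matches the production profile regardless of individual response times.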
Confidential, Houston, Texas
Performance Test Lead
Environment: LoadRunner 8.1 FP4, UNIX, MySAP ECC6.0, Oracle 10g
Responsibilities:
- Responsible for requirements gathering, creating the test plan, high-level test case documentation and complete end-to-end performance testing
- Reviewed business process and functional requirements. Confidential with SMEs to prepare BPR documents
- Analyzed 13 months of production usage history to identify the most heavily used T-Codes, batch jobs and reports
- Performance Tested various modules: HCMS - Human Capital Management System, FICO - Finance & Controlling, WM - Warehouse Management, MM - Materials Management, IM - Inventory Management, QM - Quality Management, PM - Plant Maintenance and PP - Production Planning
- Identified Batch Jobs: Day/Night, Inbound/Outbound and High Volume
- Used SAP T-Codes to monitor key performance parameters of the application
- Contributed to project delivery including project progress reporting and other governance processes
- Designed and ran the load test scenarios and generated reports for performance analysis
- Maintained historical records of test results and identified problematic trends
- Conducted result analysis and interacted with developers, QA Leads and BASIS team to resolve bugs and defects
- Participated in the project meetings
Confidential, Columbus, Ohio
Performance Test Lead
Environment: HP-Mercury LoadRunner 9.0, Sitescope, Mercury Quality Center, UNIX, Vitria Order Queue
Responsibilities:
- Responsible for requirements gathering, creating the test plan, high-level test case documentation and complete end-to-end performance testing
- Analyzed the requirement documents and test case documentation
- Confidential with business users to review and update Business Process Requirements (BPR) and Software System Requirements (SSR)
- Assisted in designing performance testing scripts for eFCs per the business requirements
- Developed use cases, workflows and screen mock-ups per business requirements
- Worked closely with the business owner (Director), application manager and SMEs to gather the requirements
- Created a Level of Effort (LOE) document for various projects, projecting the performance testing effort, helping the business plan and budget.
- Analyzed performance statistics (response time, throughput, network load, etc.) of the production application (VOQ) to confirm performance projections and application usage trends and to baseline the application
- Created various Business Process Scripts
- Ran the Load tests with various configurations, analyzed test results and reports.
- Maintained historical records of test results and identified problematic trends
- Conducted result analysis and interacted with developers, QA Leads and Sr. Architect to resolve bugs and defects
Confidential, Richmond, VA
Performance Test Lead
Environment: HP-Mercury LoadRunner 7.8/8.1 (FP4), Diagnostics, Sitescope, Red Hat Linux 2.4.21-40, Weblogic 8.1 SP6, XML, HTML, MQ Series, UNIX, Oracle 8i
Responsibilities:
- One of the three members of the core team responsible for setting up and stabilizing the account
- As a senior member of the QA team, responsible for requirements gathering, creating the test plan, high-level test case documentation and complete end-to-end testing
- Developed resource plans, ensured that project plans support business schedules and have clear requirements
- Analyzed the requirement documents and test case documentation
- Validated the test data population in the back end
- Scripted using the Web Services protocol, parameterized data values and used client-side secured certificates; monitored CPU, memory, network and WebLogic parameters while running baseline, load, stress and endurance tests
- Created various Business Process Scripts and performed Data driven testing
- Created a Level Of Effort (LOE) document for various projects, projecting the performance testing effort, helping the business plan and budget.
- Established a successful onsite-offshore model that proved to be a significant gain for the client
- Created Vuser scripts, designed and ran the load test scenarios, and generated reports for performance analysis
- Created load models for various projects, obtaining client sign-off before proceeding
- Created a Tool for generating a Web Services script for LoadRunner (SOAP).
- Created a Tool to generate report from the Application Server Logs
- Performed parameterization and correlation using various LR functions to enhance the VuGen scripts; created load tests by uploading scripts from VuGen and assigning Vusers and load generation machines
- Maintained the LoadRunner environment for the team, growing it from 38 machines to 61
- Performed Back-End Testing using SQL queries and UNIX scripts for a couple of projects
- Used SiteScope to monitor possible bottlenecks in the application
- Ran the Load tests with various configurations, analyzed test results and reports.
- Monitored the online graphs during the scenario run and analyzed the graphs.
- Analyzed the result files generated after each load test run and exported the data to Excel sheets
- Conducted result analysis and interacted with developers, QA Leads to resolve bugs and defects
- Participated in the go/no go project meetings
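The application-server log reporting tool mentioned above could look roughly like the sketch below. The log format, regular expression and field names here are hypothetical, shown only to illustrate the approach of extracting response times and reporting summary statistics:

```python
# Illustrative sketch of a log-report tool: extract response times from
# (hypothetical) app-server access-log lines and report count / average /
# 90th-percentile response time using the nearest-rank method.
import math
import re

LINE_RE = re.compile(r"\"(?P<method>GET|POST) (?P<path>\S+).*\" \d{3} (?P<ms>\d+)ms")

def report(lines):
    """Summarize response times (in ms) found in the given log lines."""
    times = [int(m.group("ms")) for line in lines if (m := LINE_RE.search(line))]
    times.sort()
    # Nearest-rank 90th percentile
    p90 = times[min(len(times) - 1, math.ceil(0.9 * len(times)) - 1)]
    return {"count": len(times), "avg_ms": sum(times) / len(times), "p90_ms": p90}

# Hypothetical access-log excerpt
sample = [
    '10.0.0.1 - - [01/Jan/2008:10:00:00] "GET /order/status HTTP/1.1" 200 120ms',
    '10.0.0.2 - - [01/Jan/2008:10:00:01] "POST /order/create HTTP/1.1" 200 480ms',
    '10.0.0.3 - - [01/Jan/2008:10:00:02] "GET /order/status HTTP/1.1" 200 95ms',
]
print(report(sample))
```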