
QA Test Manager Resume


Orlando, FL

SUMMARY

  • Over 10 years of experience in the Information Technology industry, including over 9 years in Software Quality Assurance, with expertise in manual and automated testing and extensive experience in automated test tools such as HP ALM, Performance Center 9.5/ 11, LoadRunner, Quick Test Professional (QTP), WinRunner, Test Center, Quality Center, and TestDirector
  • Led Onsite and Offshore teams on several end-to-end QA engagements, including specialization projects in performance testing, automation/regression testing, and functional, integration, and manual testing
  • Experienced in developing different performance test scenarios such as load, stress, endurance, smoke, scalability, and regression tests
  • Excellent knowledge of the Software Development Life Cycle (SDLC) and the iterative development process per the Rational Unified Process (RUP) for business analysis; worked in Agile-oriented environments; experienced in Object-Oriented Analysis/ Design (OOA/D), data analysis, and specialized, detailed requirements analysis
  • Expertise and experience in performing different kinds of performance tests (load, stress, and endurance) using HP ALM 11.0, LoadRunner 11, Performance Center 9.5/ 11.0, and Test Center 8.0 on various types of applications (client/server and web) using different kinds of protocols such as HTTP/Web, RTE, SOAP UI, Siebel, WinSock, .NET, Oracle 2-tier, Oracle NCA, Web Services, Java, SAP (Web), and SAP GUI (Click and Script), using multiple Load Generators and Controllers
  • Created and executed automated regression tests (structural, functional, GUI, and database) using the regression testing tool HP Quick Test Pro (QTP) and Quality Center for applications developed on different platforms. Also reported defects in defect-tracking systems (TestDirector, Quality Center, Remedy, and HP Service Desk)
  • Sound knowledge and experience providing quality assurance for web-based and client/server applications, including GUI testing; expertise working in Unix and Windows environments, with a sound understanding of the SDLC and the role of SQA within software development processes in environments such as J2EE, .NET, and Web.
  • Solid experience with the Java Virtual Machine (JVM), including garbage collection, heap sizing, and threading
  • Used Java profiling tools such as JProbe and JProfiler to aid performance analysis and ongoing performance engineering efforts. Activities included, but were not limited to, isolating code bottlenecks, monitoring running threads, discovering deadlocks, and pinpointing inefficient SQL; identifying the most frequently executed sections of source code, as well as those that account for the majority of execution time. Also very familiar with designing strategies to analyze network traces.
  • Expert knowledge and understanding of the basic principles of manual and automation testing methodologies and techniques, with expertise in creating and executing test plans and test cases for various types of testing (black-box, integration, performance, security, stress, and regression) and in security, compatibility, and comparison testing for GUI and web applications.
  • Handled installing and configuring software as well as tuning memory usage and performance of existing applications; in-depth knowledge of analyzing systems and evaluating system performance and responses
  • Predictive performance modeling using HyPerformix Strategizer, Microsoft Excel, Crystal Ball, Best/1, OPNET, or equivalents
  • Extensive experience in bug tracking tools such as TestDirector, Quality Center, TestManager, and ClearQuest
  • Experienced in SQL, PL/SQL and Oracle development with good understanding of developing schemas and stored procedures and packages
  • Proven expertise in elicitation techniques such as interviewing, questionnaires, brainstorming, focus groups, prototyping, cost/ benefit analysis, and risk analysis
  • Thorough knowledge and understanding of preparing various documents required in testing environment with expertise in interacting with business users, Development team, Application Support team as and when required for updating QA test environment, analyzing defects, and following-up on testing
  • Self-motivated, detail-oriented, time-bound, and responsible team player with excellent communication skills with a distinguished ability to coordinate and lead an Onsite/ Offshore team
  • Ability to communicate effectively with strong technical and troubleshooting skills

TECHNICAL SKILLS

OPERATING SYSTEMS: Unix, HP Linux, Windows 95/ 98/ 2000/ NT/ XP/ 7

ERP: SAP (ECC, BW, BPC, BI, MM, SD, OTC), ORACLE FINANCIALS, PeopleSoft - HCM

LANG/ TECH: C, C++, Java, EJB, J2EE, JSP, .NET, VB.NET, HTML, XML, ASP, JavaScript, Visual Basic 5.0/ 6.0, TSL, SQL, PL/SQL, VBScript

DATABASES: Oracle 8.0/ 8i/ 9i, MS SQL Server 2000/ 2005, MS Access

SERVERS: IIS 4.0/ 6.0, WebSphere

TOOLS: HP ALM (Application Lifecycle Management), Performance Center 9.5, LoadRunner 7.8, 8.0, 8.1, 9.1, 9.5, 11; Quality Center 8.2 (QTP 8.x, WinRunner 7.0/ 8.2; TestDirector 6.0/ 7.5/ 8.x), JUnit, HttpUnit, Remedy, MS FrontPage

JAVA PROFILING TOOLS: JProbe, JProfiler

MONITORING TOOLS: SiteScope, Wireshark, NMON, PerfMon, MobiLink (SAP-Sybase)

PROFESSIONAL EXPERIENCE

Confidential - Orlando, FL

QA Test Manager

Responsibilities:

  • Test Manager, supervising overall testing activities of the Wyndham Voyager program. Led an Onsite/Offshore QA Testing team of 20 QA resources
  • The aim of the Voyager program is to make vacations accessible to Wyndham Owners more frequently and more effectively. This required significant changes to the existing systems and their alignment, as well as the retirement of some legacy and redundant systems. The underlying foundation for achieving the Voyager objective lies in transforming the current reservation and inventory systems and processes and consolidating the property management systems
  • Plan, organize, direct, control, and evaluate all the activities and operations related to Quality Assurance
  • Consult and negotiate with Wyndham leadership to prepare specifications, explain proposals and test implementation strategies, stabilization initiatives and present high level but comprehensive reports and findings to them
  • Developed and implemented policies, standards, and procedures for the quality and technical work performed across all QA platforms at Wyndham, using the implementation partner's best-practice guidelines
  • Created and reviewed Master Test plans for Manual/Regression/Automation and Performance Testing
  • Successfully handled multiple QA projects in different phases of testing and maintained testing integrity and high-quality test execution
  • Helped the client create an Enterprise data team to handle complex SQL queries and data requests across the different client server applications, namely ‘Agent Desktop’ and ‘Web’
  • Involved in providing capacity planning expertise; also implemented workload models to size the application and project demand for the resources required to meet business needs
  • Conducted training programs related to different QA techniques including use of performance testing best practices, performance monitoring, performance engineering, hardware workload and optimization
  • Defined and designed performance Test Scope, Test Execution Plans and Monitoring plans
  • Responsible for generating new strategies and plans for improving the presentation of testing artifacts and reporting, namely status reports, test results, and performance monitoring reports (CPU and memory utilization, heap size, stack overflow, etc.)
  • Helped develop SLA requirements and got approval from project partners, Architects, Directors, Project managers, Associates for base line and acceptable criteria for performance test results comparison and metrics.
  • Designed execution strategies and monitoring strategies to monitor different components of the environment like Tomcat batch grids, Application Servers, SQL databases and JVM performance
  • Performed hands on troubleshooting and guided embedded performance engineering teams in debugging application code problems, verifying fixes, reporting and tracking bugs, and updating regression test suites
  • Created cost estimates of the project and resource forecast for onshore and offshore resources and teams
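The workload modeling mentioned above is commonly grounded in Little's Law (N = X · R): concurrent users equal throughput times the per-user cycle time. A minimal sketch with hypothetical numbers (not figures from this engagement), written in Python purely for illustration:

```python
# Capacity-planning sketch using Little's Law: N = X * R, where N is the
# number of concurrent users, X is throughput (requests/sec), and R is the
# average response time plus think time per user (seconds).
# All figures below are hypothetical.
def concurrent_users(throughput_per_sec: float,
                     response_time_s: float,
                     think_time_s: float) -> float:
    """Concurrent users needed to sustain a target throughput."""
    return throughput_per_sec * (response_time_s + think_time_s)

# e.g. to sustain 50 req/s with a 2 s response time and 8 s think time:
users = concurrent_users(50, 2, 8)
print(users)  # 500.0
```

Inverting the same relation (X = N / R) gives the throughput a fixed user population can generate, which is how scenario sizes are sanity-checked against SLA targets.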

Confidential, Atlanta, GA

Test Lead

Responsibilities:

  • Test Lead, leading an Onsite/Offshore Performance Testing team
  • The Confidential brand operates 2,248 big-box format stores across the United States (including all 50 U.S. states, the District of Columbia, Puerto Rico, the Virgin Islands, and Guam), Canada, Mexico, and China, where it operates a 12-store chain. Confidential is headquartered at the Atlanta Store Support Center in Cobb County, Georgia, in Greater Atlanta. In terms of overall revenue, Confidential is the largest home improvement retailer in the United States and the fourth-largest general retailer. The stores operate out of large warehouse-style buildings, with megastores operating in larger facilities
  • Gathered performance requirements from different SBU (Service Business Unit) Managers, Tech Team leads, and stakeholders
  • Was dedicated to and oversaw performance testing for almost the entire Supply Chain and SAP domain within the QAS group for Confidential.
  • Defined and designed performance test scope, test plan and strategy documents
  • Defined performance Test Execution Plans and Monitoring plans
  • Developed SLA requirements and got approval from project partners, Architects, Directors, Project managers, Associates for base line and acceptable criteria for performance test results comparison and metrics.
  • Developed complex and detailed business workflows into LoadRunner scripts to generate data to be used in performance tests, especially for black-box components such as Tibco BW and BE, WMS, and SAP BI components
  • Developed Load Runner scripts using Web (html/http), RTE (remote terminal emulation), Web Services, WinSock, Mobile Protocol, SAP GUI, Citrix, Java, .Net and Ajax protocols.
  • Developed different LoadRunner scenarios to stress different modules of the application, such as Work Management, Supply Chain, SAP, Unix, Java/J2EE, and 60 different interfaces and middleware components.
  • Developed execution strategies and monitoring strategies to monitor different components of the environment like Tomcat batch grids, Message Queue Servers, Application Servers, Oracle databases and JVM performance.
  • Executed several performance test scenarios using HP ALM/Performance Center 11.0, analyzed results, and helped development tune the application for better performance
  • Analyzed the results and generated final results documents for senior management
  • Led a very large Onshore/Offshore team in a very dynamic and complex technology environment
  • Developed customized performance reports and reported the various performance bottlenecks in the application back to the application development team
  • Used HP Quality Center to track and report system defects and bug fixes
  • Used tools such as JUnit and HTTPUnit for setting up unit testing frameworks for the aid of the development and QA teams
  • Worked with senior technical architects to generate models of the current architecture to improve system performance
  • Performed responsibilities of developing and maintaining the models of system capacity
  • Used Apache Subversion (often abbreviated SVN) to maintain current and historical versions of Home Depot proprietary source code files, web pages, and documentation. Our goal was to follow up and continue the tradition of the widely used Concurrent Versions System (CVS) previously used at THD
  • Created cost estimates of the project and resource forecast for onshore and offshore, execution plans for different phases of the project
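The unit-testing frameworks named above (JUnit, HttpUnit) follow a test-case/assertion pattern. Those are Java tools, so the sketch below uses Python's built-in unittest as an analogous stand-in, with a hypothetical function under test (not code from the engagement):

```python
import unittest

# Hypothetical function under test -- a stand-in for application code.
def discount(price: float, pct: float) -> float:
    if not 0 <= pct <= 100:
        raise ValueError("pct must be between 0 and 100")
    return round(price * (1 - pct / 100), 2)

class DiscountTest(unittest.TestCase):
    """Structured like a JUnit test case: one focused method per behavior."""

    def test_basic_discount(self):
        self.assertEqual(discount(100.0, 25), 75.0)

    def test_rejects_bad_percentage(self):
        # Invalid inputs should fail loudly, not return a wrong price.
        with self.assertRaises(ValueError):
            discount(100.0, 150)

if __name__ == "__main__":
    unittest.main(argv=["prog"], exit=False, verbosity=0)
```

The same shape carries over to JUnit directly: each `test_*` method becomes an `@Test` method, and `assertEqual`/`assertRaises` map to `assertEquals`/`assertThrows`.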

Confidential

Responsibilities:

  • Worked as Team Lead in performance test execution of the WMS 4.4 upgrade. Gathered performance requirements and created the performance Test Scope and Test Plan covering various kinds of tests: load, stress, and endurance.
  • Involved in reviewing existing scripts and creating new scripts for various critical business processes such as Disposition Auto Link, RF Detail Receive, Close Carton, Anchor Carton, Load Carton, and Close Trailer using the RTE protocol.
  • Designed and developed complex test scenarios to cover end to end processes. Also collaborated and synergized with the TMS and LMS development and testing teams.
  • Responsible for conducting initial research on the usage patterns of customers to identify the requirements of system performance
  • Executed several performance test scenarios and analyzed results and helped development in tuning the application for better performance using HP Performance Center 9.5 and ALM/ Performance Center 11
  • Ran detailed and exhaustive tests for this release. This was a high-visibility release at Confidential; 4.5 w/LMS and R4.6 in particular had many customizations and modifications added since the last test cycle. We ran load, stress, endurance (18-hour), and failover tests.
  • Managed team of 4 performance test engineers specific to this project both onshore and offshore.
  • Ran detailed and exhaustive tests for this release, as Tibco was a black box at THD; being a major, game-changing pilot, it needed to be thoroughly performance tested.
  • Collaborated with onsite Tibco architects, designers, DBA’s, actual users, and data analysts to come out with innovative solutions to test the application.
  • Responsible for reviewing the changes of system architecture and benchmarking the performance of system
  • Created LoadRunner scripts using the HTTP/Flex protocol and executed several load tests (load/stress/endurance).
  • Developed different Load Runner scenarios to stress different modules of the application like IPR systems, batch frame work, Tibco BE and MQ Queues.
  • Managed team of 4 performance test engineers specific to this project both onshore and offshore.
  • Cutting edge mobile technology introduced for the Canadian stores had never been performance tested
  • Collaborated with onsite SUP/SAP architects, designers, developers, project managers, pilot store users and data analysts to come out with innovative solutions to test the application.
  • Recorded the scripts/business workflows with the .NET protocol in VuGen and tried to decode the proprietary Sybase code (using specific filters in the recording options), replicating traffic from the First Phone to the SUP/SAP servers and back.
  • On my directive, the project team engaged their in-house application development architect (.NET Architect) to review and work closely with the Onshore/Offshore performance testing team to come up with a testing solution, which then helped build a testing framework for future releases.
  • Developed different Load Runner scenarios to stress key modules of the application like Article Look up/SRI/Cycle Count/OPW
  • Managed team of 4 performance test engineers specific to this project both onshore and offshore.

Confidential, Salisbury, MD

Test Architect

Responsibilities:

  • Test Architect, leading an Onsite QA team
  • The Perdue brand is the number-one brand of premium chicken in the Eastern US and the third largest poultry company in the US—synonymous with quality products around the globe, providing food and agricultural products and services to customers in more than 100 countries, with annual sales of $4.6 billion
  • Managing a team of Performance Test Analysts/ Engineers and overseeing the entire execution of the project from a scripting, execution, and analysis process perspective
  • Participating in the creation of the test plan in accordance with the functional requirements for the client's re-implementation from SAP 4.6 to ECC 6.0
  • Preparing high-level and detailed system design and capacity planning requirements including acceptance criteria
  • Collaboratively developing and maintaining templates, QA checklists, SOPs, and user training materials
  • Executing performance, stress and load tests for PP, MM, Finance, Costing, Inventory Procurement, Sales and Delivery and CO-PA modules under SAP 6.0
  • Working with different setup requirements, such as enabling scripting on the SAP R/3 server side and on the SAPGUI client side, to ensure availability of the scripting API while recording LoadRunner scripts using the SAPGUI protocol
  • Guiding the team in performing manual correlation by using SAP status bar functions to capture output data of one business transaction that is needed as input for another business transaction
  • Leading the team in recording and modifying testing scripts and in the creation of test scenarios using HP LoadRunner 9.52
  • Utilizing Performance Center project dashboard features such as setting up project criteria, setting performance targets, publishing performance targets, and publishing load test runs, which provide an overview of all performance tests with drill-down into each load test
  • Using LoadRunner SAPGUI monitors to measure resource usage such as average CPU time, average response time, and the number of logical ABAP requests for data in the database
  • Installing and configuring SiteScope 7.9.5 for the SAP CCMS monitor, which gathers resource usage of all SAP R/3 servers and displays it via a web interface during a scenario run
  • Using LoadRunner SAP CCMS monitors to measure resource usage, such as SAP Portal monitor measurements and SAPGUI monitor measurements
  • Utilizing performance monitoring transaction codes in SAPGUI to find possible performance bottlenecks
  • Providing statistics on the buffers, workload processing, CPU and memory utilization, database activity, system errors, buffer swaps, table locks, and ABAP dumps
  • Analyzing the graphs and reports created during performance testing using report utilities in HP LoadRunner 9.52
  • Preparing detailed and drilled-down results for the ABAP Developers, Network Engineers, SAP Basis team, and Functional/ Business teams to review, understand, and implement before the Go-Live date in the following months
  • Configuring standard and major alerts with pre-defined threshold values; if a threshold value is exceeded, LoadRunner Analysis generates an alert to pinpoint problems that arise during a load test scenario run
  • Using HP Quality Center to track and report system defects and bug fixes
  • Developing customized performance reports and reported the various performance bottlenecks in the application back to the Development team
  • Updating the project plan and providing testing-related progress reports to the Technical Directors and Leadership team at Confidential on a weekly basis
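The threshold-based alerting described above reduces to a simple check: flag every sample that exceeds a pre-defined limit. A minimal Python sketch, offered as an analogy to LoadRunner Analysis alerts (the thresholds and response times below are hypothetical, not measurements from this engagement):

```python
# Threshold-alert sketch: flag any transaction sample whose response time
# exceeds a pre-defined threshold, as LoadRunner Analysis alerts do.
def find_alerts(samples, threshold_s):
    """Return (index, value) pairs for samples exceeding the threshold."""
    return [(i, v) for i, v in enumerate(samples) if v > threshold_s]

response_times = [0.8, 1.1, 3.9, 0.9, 5.2]  # seconds, hypothetical
alerts = find_alerts(response_times, threshold_s=3.0)
print(alerts)  # [(2, 3.9), (4, 5.2)]
```

Keeping the index alongside the value lets an analyst map each violation back to the point in the scenario timeline where it occurred.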

Confidential, Chicago, IL

Test Architect

Responsibilities:

  • Participated in meetings with the client in planning strategies and designing test plans
  • Worked on troubleshooting the scripts that were not running for multiple iterations and re-scripted a few of them
  • Worked on scripting the Oracle R12 ERP solution—the R12 implementation was called ‘Merlin’ and was a very challenging project completed in record time; widely used Vugen protocols Web-HTTP, RTE (Remote Terminal Emulation for the WMS package used at Zebra) and Oracle NCA
  • Introduced a new scripting methodology to reduce the scripting time
  • Created correlation libraries
  • Performed application benchmark testing and shake-out testing
  • Demonstrated expertise in creating intensive and advanced LoadRunner scripts, using custom functions and programs to support performance-testing efforts
  • Moved all the scripts from development environment to production environment
  • Analyzed both the environments for the changes to be made in the scripts
  • Ran all the scripts on the controller and coordinated with the Database Administrators and System Administrators during the test run
  • Analyzed the test results and generated reports

Confidential, Plano, TX

QA Lead/Solutions Architect

Responsibilities:

  • Designed the data warehouse from the various data marts and legacy systems
  • Modified existing SSIS packages and stored procedures—changing and adding new connections, adding new variables and data flow tasks, and changing logic according to business rules
  • Created and modified Windows scripts to run multiple stored procedures
  • Implemented Extraction, Transformation, and Load (ETL) activities through DTS/ SSIS packages from different OLEDB data sources, and scheduled nightly data loads to ensure updated data
  • Developed and tested various ETL methods and performed data mapping
  • Created landing, staging, ETL, and presentation tables and populated them with data from the data warehouse using SSIS and linked servers
  • Created and modified stored procedures to populate and accommodate new fields for the claim center
  • Made the ETL process faster by creating indexes on columns used by joins within stored procedures
  • Converted some long scripted functions to stored procedures to reduce network traffic
  • Created scripts for consistency and integrity checks for data in staging phase
  • Worked with the Configuration team to come up with best strategy to implement data structures
  • Worked closely with Business Analysts to understand and implement business logic
  • Implemented backup strategy for legacy replicas, staging phases, and core tables
  • Provided solutions for performance issues and solutions where PIDs had limited column lengths
  • Formulated test plan, test scripts, and test cases for functional, system, integration, and performance analysis, based on the design document and user requirement document, for the functional, security, and performance testing
  • Performed unit testing and performance testing on issues resolved by other Developers
  • Participated in data analysis for the data warehouse and data mart system
  • Worked with the Development team to provide detailed data model understanding
  • Implemented workflow and personalization features
  • Reviewed quality assurance and technical documentation to ensure they satisfy the business requirements
  • Utilized SVN for version control
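Two of the ETL techniques above, indexing join columns and running staging-phase consistency checks, can be sketched together. The work described used SQL Server; the sketch below substitutes SQLite via Python's sqlite3 so it is self-contained, and the table and column names are hypothetical:

```python
import sqlite3

# (Sketch) Stand-ins for a staging table and its target table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging_claims (claim_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE claims (claim_id INTEGER, amount REAL)")
rows = [(i, i * 10.0) for i in range(1000)]
cur.executemany("INSERT INTO staging_claims VALUES (?, ?)", rows)
cur.executemany("INSERT INTO claims VALUES (?, ?)", rows)

# (1) Index the join column so the optimizer can seek instead of scan.
cur.execute("CREATE INDEX idx_claims_id ON claims (claim_id)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM staging_claims s "
    "JOIN claims c ON c.claim_id = s.claim_id"
).fetchall()
print(any("idx_claims_id" in str(row) for row in plan))  # True: join uses the index

# (2) Consistency check: staging and target row counts must reconcile.
staging_count = cur.execute("SELECT COUNT(*) FROM staging_claims").fetchone()[0]
target_count = cur.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
print(staging_count == target_count)  # True
```

On SQL Server the same two steps would be a `CREATE INDEX` plus an execution-plan review, and a row-count (or checksum) reconciliation query between staging and core tables.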

Confidential, Pittsburgh, PA

Performance Testing

Responsibilities:

  • Formulated test plan, test scripts and test cases for functional, system, integration, and regression testing based on the design document and user requirement document for the functional, security, and performance testing
  • Developed Vuser scripts using the LoadRunner 8.1 PeopleSoft (HTTP/ HTML) protocol based on the user workflows
  • Enhanced Vuser scripts by introducing the timer blocks, by parameterizing user IDs to run the script for multiple users
  • Manually correlated the Manager/ Employee IDs, saving the dynamically changing Manager/ Employee IDs into a parameter by going to the body of the server response in LoadRunner
  • Used the web_reg_find function to search for text to verify that the desired pages were returned during replay in the Virtual User Generator (VuGen)
  • Changed the runtime settings such as pacing, think-time, log settings, browser emulation, and timeout settings in LoadRunner VUGen and Controller to simulate the real scenario.
  • Created various scenarios in LoadRunner controller for performing baseline, benchmark, stress, and endurance tests
  • Performed a baseline test with one user and five iterations and a benchmark test under a load of 100 users using the LoadRunner Controller
  • Used Scenario-by-Schedule in the Controller to change the ramp-up, duration, and ramp-down settings
  • Executed stress tests with a load of 225 users to see the breakpoint of the application
  • Monitored the metrics such as response times, throughput, and server resources such as CPU utilized, available bytes, and process bytes by using LoadRunner Monitors for IIS
  • Analyzed server resources such as total processor time, available bytes, process bytes, and heap usages to look for performance bottlenecks
  • Worked with the Analytics team and built reports for the team, which were used for developing business models for the different releases of the application such as the Manager Self Service (MSS) Portal, Employee Self Service (ESS) Portal, eBenefits, and Employee Enrollment
  • Provided traceability across the project by analysis and review of project elements involved in traceability including requirements, testing artifacts, end-user support documentation, and training materials
  • Worked with the Database Administrator and Database Developers to convert the functional specification to technical specification; identified bottleneck and performance issues of the ODS population and worked together to resolve the issues
  • Primary point of contact for all testing needs of the project and coordinated with different team members and Project personnel for the execution and completion of test effort to release defect-free systems to production environment
  • Created MIS reports, PARS reports, and Weekly Work Progress schedules for the management
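The manual correlation described above pairs two LoadRunner functions: web_reg_save_param captures a dynamically changing value from the server-response body between left/right boundaries, and web_reg_find checks that expected page text came back. Those are C functions tied to the LoadRunner runtime, so the sketch below mimics them in Python; the HTML response and the EMPLID field are hypothetical, not an actual PeopleSoft page:

```python
import re

# Hypothetical server-response body containing a dynamic employee ID.
response_body = '<input name="EMPLID" value="EMP-48213"/><h1>Personal Details</h1>'

def save_param(body: str, left: str, right: str) -> str:
    """Capture text between boundaries, like web_reg_save_param's LB=/RB=."""
    match = re.search(re.escape(left) + r"(.*?)" + re.escape(right), body)
    if match is None:
        raise ValueError("boundaries not found in response body")
    return match.group(1)

def reg_find(body: str, text: str) -> bool:
    """Verify expected page text was returned, like web_reg_find."""
    return text in body

emp_id = save_param(response_body, 'value="', '"/>')
print(emp_id)  # EMP-48213
print(reg_find(response_body, "Personal Details"))  # True
```

In a real VuGen script the captured parameter would then be substituted (e.g. `{EmpID}`) into subsequent requests so each virtual user carries its own dynamic ID through the workflow.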
