
QA Performance Test Lead Resume


IL

Masters in Computer Engineering with over eight years of experience in quality assurance of client/server and web-based applications. Possess exemplary professional skills in testing methodologies and software development and have been involved in all phases of the Software Development Life Cycle. Performed different dimensions of testing: functionality, usability, reliability, performance, and regression testing. Experienced in both manual and automated testing techniques. Proficient in the use of Mercury testing tools such as Quality Center, QTP, and LoadRunner. Extensively used LoadRunner to create VuGen scripts to test web, database, and client/server applications. Created VuGen scripts and updated them using parameterization and correlation. Executed test scenarios and performed application tuning to get the desired results. A dynamic and assertive team player with strong commitment.
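
A minimal VuGen (Web - HTTP/HTML protocol) sketch of the kind of script described above is shown below; the URL, transaction name, and {UserName} parameter are illustrative placeholders only, not code from any engagement listed here.

    Action()
    {
        // Time the login step as its own transaction
        lr_start_transaction("Login");

        // {UserName} is assumed to be a file-based VuGen parameter defined in the parameter list
        web_url("LoginPage",
                "URL=https://example.com/login?user={UserName}",
                LAST);

        lr_end_transaction("Login", LR_AUTO);

        // Pace iterations with think time between user actions
        lr_think_time(5);

        return 0;
    }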

EDUCATION:
M.S. (Computer Engineering)

Certifications:
HP Certified - HP Performance Center v9 - Accredited Integration Specialist (AIS):
  • HP Virtual User Generator Software
  • HP LoadRunner Software

HP Certified - HP Quality Center 9.2 Software

Brainbench Certified

  • Software Quality Assurance Certificate
  • Software Testing Certificate
  • Business Communication Certificate
  • Time Management (U.S.) Certificate

SKILLSET:

GUI/Tools:

Visual Basic 6.0

Technologies:

ODBC, SQL Server, Weblogic Server, WebSphere, Apache, J2EE

Automation Tools:

WinRunner, LoadRunner, Performance Center, QTP, SilkPerformer, SiteScope, HP Diagnostics, BAC

Defect Tracking Tools:

Rational ClearQuest, Test Director, Quality Center

Protocols:

HTTP, Web Services, Siebel-Web, Citrix Multi Protocol, COM/DCOM, Java, Winsock

Packages:

Microsoft Office 2000, MS Visio, MS Project.

Confidential, IL  August 2007 - Current
QA Performance Test Lead, TSS - GCA

The project involved testing the Strategic Know Your Customer (KYC) system and Pega Rules Process Commander. The main feature of the application is that it generates a package with the required templates and products. The application was also used to create KYCs, which were created based on the customer entitlements and requirements.
The Process Commander system was mainly used to create cases depending upon the customer needs and accounts. The system creates messages depending upon the cases being created. Customers can create correspondence and adjustments as required and can also set SLAs for the cases. External customers can transfer cases to different groups and can also amend, cancel, and recall their cases.

Responsibilities:

  • Responsible for defining the scope of the project, gathering business requirements, and documenting them textually or within models.
  • Interviewing business area experts, asking detailed questions and carefully recording the requirements in a format that can be reviewed and understood by both business people and technical people.
  • Coordinated testing activities with different teams and ensured that the project schedule was met without any issues.
  • Defined input requirements to document the Business Requirements Specifications and developed the Requirements Traceability Matrix.
  • Worked on the improvement of QA Process by reviewing and evaluating existing practices with standard testing guidelines.
  • Responsible for coordinating with the offshore team to complete the project within the deadlines.
  • Provided management with metrics, reports, and schedules as necessary and participated in design walkthroughs and meetings.
  • Planned the load by analyzing the task distribution diagram, transaction profile, and user profile, and executed performance testing using LoadRunner.
  • Involved in studying the server statistics and determining the capacity of the system.
  • Implemented workload models sizing application or project demand for the resources required to meet business needs.
  • Identified hot spots in the application, conducted component testing for each component, and fine-tuned them to meet the acceptance criteria.
  • Using LoadRunner, created scenarios for load and performance testing with different host systems and configured the test environment. Generated graphs and studied them to monitor software performance.
  • Verified the VuGen scripts and scenarios created by team members before test execution.
  • Created LoadRunner test scripts in C and updated them with conditional logic and randomization functions (a minimal sketch of this pattern follows this list).
  • Worked on creating scripts for different protocols such as Web, Web Services (XML, SOAP), Java, ODBC, and Siebel.
  • Worked with the development team on MQ Series projects to release messages during performance test execution.
  • Responsible for setting up monitors in the Controller, SiteScope, and Introscope to monitor CPU and memory utilization.
  • Used SecureCRT and Perfmon to set up monitoring of UNIX resources.
  • Used HP Diagnostics to identify the root cause of the transactions that caused slowness in the system.
  • Conducted Regression and functional testing for different applications.
  • Involved in capacity planning for enterprise release applications to calculate future growth in transaction volume and concurrent users on the system.
  • Conducted discussions with the development team to decide on the hardware required, based on the performance tests, to support future user growth.
  • Wrote SQL commands to verify backend data and monitored CPU and memory usage on SQL Server and Oracle servers.
  • Analyzed various graphs generated by the LoadRunner Analyzer, including database monitors, network monitor graphs, user graphs, error graphs, transaction graphs, and web server resource graphs.
  • Used SiteScope to gather server information such as memory and CPU usage.
  • Wrote and tracked the defects using Quality Center and communicated with the developers.
  • Tested the pre-production servers to make sure production performance was similar to that of the performance test environment.
  • Conducted weekly meetings with the project head, business, and development teams.
  • Executed the scenarios, analyzed the graphs, and coordinated with the DBAs and network administrators to ensure optimum performance.
  • Provided weekly updates to the client and application team based on the test results and analysis.
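
A hedged sketch of the conditional-logic and randomization pattern referenced in the list above; the transaction name, URLs, and case types are hypothetical placeholders, not code from this project.

    Action()
    {
        // Vary the workload mix per iteration by picking a case type at random
        int choice = rand() % 3;

        lr_start_transaction("Create_Case");

        if (choice == 0)
            web_url("Case_Correspondence",
                    "URL=https://example.com/case?type=correspondence", LAST);
        else if (choice == 1)
            web_url("Case_Adjustment",
                    "URL=https://example.com/case?type=adjustment", LAST);
        else
            web_url("Case_Transfer",
                    "URL=https://example.com/case?type=transfer", LAST);

        lr_end_transaction("Create_Case", LR_AUTO);

        return 0;
    }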

Environment: Windows NT, IIS, Citrix, ODBC, Java, C, C++, J2EE, WebLogic, WebSphere, Apache, Oracle 10g, 11i, .NET, XML, Quality Center, LoadRunner 8.1/9.4, HP Diagnostics, SiteScope, QTP, UNIX, SQL Server, MS Office, MS Access, MS Visio, MS Project, VB, J2EE analysis, HTML.

Confidential, IL  January 2006 - September 2007
QA Performance Test Lead/Tester

The project involved testing a client/server and web-based application developed for the Home Loans Department.
The project was a web-based application that can be accessed by brokers, loan officers, and users from different locations. Users can request a loan online, and the application consisted of four modules: funding, pay down, transfer, and shipping. Loans can be transferred and paid down online using this application. Also worked on risk- and collateral-related applications.

Responsibilities:

  • Studied the URS document and created the Functional Requirement Specification document.
  • Worked according to the activities laid down in each phase of the Software Development Life Cycle and coordinated the environment setup for testing.
  • Met with client groups to determine performance requirements and goals, and determined test strategies based on requirements and architecture.
  • Created the strategy document that defines the test environment, the phases of testing, the timelines of the different phases, the entrance and exit criteria for each phase, and the resources required to conduct the effort.
  • Using LoadRunner, created scenarios for load and performance testing with different host systems and configured the test environment. Generated graphs and studied them to monitor software performance.
  • Created VuGen scripts in C and used certain C++ functions to update the baseline scripts.
  • Verified the VuGen scripts and scenarios created by team members before test execution.
  • Wrote TSL scripts for test automation of Ajax web-based applications.
  • Used correlation to handle dynamically changing values such as session IDs (see the correlation sketch after this list).
  • Monitored resources to identify performance bottlenecks, analyze test results and report the findings to the clients, and provide recommendation for performance improvements as needed.
  • Identified functionality and performance issues, including: deadlock conditions, database connectivity problems, and system crashes under load.
  • Created SQL queries to verify the database response and to confirm that the response time was within the SLA.
  • Used TOAD to create DB commands and validate the SQL database.
  • Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application. Vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.
  • Confirmed the scalability of the new servers and application under test after the architecture redesign.
  • Tested the pre-production servers to make sure production performance was similar to that of the performance test environment.
  • Conducted weekly meetings with the project head, business, and development teams.
  • Executed the scenarios, analyzed the graphs, and coordinated with the DBAs and network administrators to ensure optimum performance.
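
The session-ID correlation mentioned above typically follows the pattern sketched here; the left/right boundaries, URLs, and parameter name are assumptions for illustration, not values from the actual application.

    Action()
    {
        // Register the capture BEFORE the request whose response returns the value
        web_reg_save_param("SessionID",
                           "LB=sessionId=",    // assumed left boundary
                           "RB=\"",            // assumed right boundary
                           "NotFound=warning",
                           LAST);

        web_url("LoginPage", "URL=https://example.com/login", LAST);

        // Reuse the captured value instead of the recorded, hard-coded session ID
        web_url("LoanSummary",
                "URL=https://example.com/loans/summary?sessionId={SessionID}",
                LAST);

        lr_output_message("Correlated session id: %s", lr_eval_string("{SessionID}"));

        return 0;
    }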

Environment: Windows NT, Citrix, Oracle 2-tier, LoadRunner 8.0/8.1, Quality Center, QTP, XML, SOAP, C, C++, COM/DCOM, SQL, ODBC, Mainframe, WebLogic, UNIX, WebSphere, JBoss, Apache, SQL Server, Ajax, MS Office, MS Access, MS Visio, MS Project, VB, J2EE analysis, HTML.

Confidential, NJ  January 2004 - December 2005
QA Performance Test Lead/Tester

The project involved testing client/server and web-based applications developed for a claim management system and an online mortgage and insurance application.
The web-based application can be accessed by brokers, loan officers, and users from different locations. This online application gives users the ability to use various tools such as mortgage qualification, mortgage payment, refinance, debt consolidation, and monthly payment calculators. It also allows customers to view their account status, asset allocation chart, and current market indices, and to trade equities online.
The client/server application, accessed through Citrix, was an insurance claim management system where users can log in, check the status of logged claims, add new claims, and track and manage claim information based on the claim ID.

Responsibilities:

  • Studied the URS document and created the Functional Requirement Specification document.
  • Worked according to the activities laid down in each phase of the Software Development Life Cycle and coordinated the environment setup for testing.
  • Met with client groups to determine performance requirements and goals, and determined test strategies based on requirements and architecture.
  • Identified and classified Manual and Automated test cases by isolating the repetitive actions.
  • Developed detailed Manual Test cases and Scenarios. Studied Requirements and designed manual test cases accordingly.
  • Identified the functional test cases for regression testing and automated these test scripts using QTP.
  • Installed the Citrix client to talk with the Citrix server and record the traffic going back and forth.
  • Created Database Vuser scripts to simulate client activities and performed Load, Stress and Performance test using LoadRunner/Performance Center
  • Generated VuGen/Vuser scripts using the Virtual User Generator and created scenarios in the LoadRunner Controller.
  • Analyzed LoadRunner/Performance Center test results. Involved in preparing the test plan and test cases based on analysis of the business requirements.
  • Used QTP to develop scripts to perform Functionality and GUI testing.
  • Tested PeopleSoft Payables, which streamlines accounts payable operations and enhances supplier relationships, against response time requirements.
  • Inserted rendezvous points in order to simulate heavy concurrent loads when conducting load testing (a sketch of this pattern follows this list).
  • Used ramp-up and ramp-down to simulate real-time scenarios.
  • Analyzed the results using LoadRunner’s Online Monitors and Graphs to identify bottlenecks in the server resources.
  • Identified functionality and performance issues, including: deadlock conditions, database connectivity problems, and system crashes under load.
  • Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application. Vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.
  • Confirmed the scalability of the new servers and application under test after the architecture redesign.
  • Conducted weekly meetings with the project head, business, and development teams.
  • Executed the scenarios, analyzed the graphs, and coordinated with the DBAs and network administrators to ensure optimum performance.
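
A short, illustrative sketch of the rendezvous-point usage referenced above; the rendezvous name, transaction name, and {ClaimID} parameter are hypothetical, and the release policy, ramp-up, and ramp-down are configured in the Controller scenario rather than in the script.

    Action()
    {
        // Every Vuser waits here until the Controller's rendezvous policy releases
        // the group together, creating a concurrent spike on the claim-search step
        lr_rendezvous("claim_search_peak");

        lr_start_transaction("Claim_Search");
        web_url("ClaimSearch",
                "URL=https://example.com/claims/search?id={ClaimID}",
                LAST);
        lr_end_transaction("Claim_Search", LR_AUTO);

        return 0;
    }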

Environment: Windows NT, Citrix, QALoad, WinRunner, LoadRunner 8.1, Quality Center, Performance Center, PeopleSoft Payables, J2EE analysis, Oracle DB, QTP, SiteScope, MS Office, MS Access, MS Visio, MS Project.

Confidential, OH  June 2001 - December 2003
Sr. Quality Assurance Analyst

Project was testing the online insurance handling, finance and Banking module. It helps people find the right insurance for their individual needs. Online Insurance requires the user to login with a valid user id and password to authenticate the person with a secure web log-in. After authentication, they can learn about insurance, get a quote, retrieve a quote or buy an online insurance policy. Customers can access their account information online, get details of previous transactions, view statements for the last 2 years and they can also file a claim or pay their monthly payments online.

Responsibilities:

  • Participated in all phases of the SDLC: requirements, design, development, testing, and implementation.
  • Supervised the testing process from post development through build, system and integration testing.
  • Created Test Plans and test cases using Test Director. Organized test cases into test sets for the purpose of execution.
  • Worked according to the activities laid down in each phase of the Software Development Life Cycle and coordinated the environment setup for testing.
  • Gathered and created test data like pre-test, input, regression, actual and expected data.
  • Wrote test cases for functional, integration, and regression testing covering positive, negative, and boundary scenarios.
  • Extensively created data-driven test scripts that read data from text files for testing (a parameterization sketch follows this list). Checked for data boundaries/limits and incorrect data input.
  • Performed performance testing using LoadRunner by planning the load (analyzing the task distribution diagram, transaction profile, and user profile), creating virtual users, and analyzing the different reports.
  • Involved in creating load test scenarios using LoadRunner to bombard the server with virtual user requests and test performance under stress conditions.
  • Performed extensive parameterization and correlation of the VuGen scripts to reproduce real-world load conditions. Used rendezvous points to load test specific transactions.
  • Used LoadRunner's Analysis module to drill down to the specific source of bottlenecks within the application architecture and generate actionable reports.
  • Monitored performance measurements such as end-to-end response time, network and server response time, and middleware-to-server response time.
  • Executed the scenarios, analyzed the graphs, and coordinated with the DBAs and network administrators to ensure optimum performance.
  • Followed up on the bugs logged with developers and retested the test cases.
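
The data-driven pattern referenced above is sketched below; {PolicyNumber} is assumed to be a file-based VuGen parameter (e.g. backed by a .dat text file in the parameter list), and the form fields and URL are placeholders rather than actual project values.

    Action()
    {
        lr_start_transaction("Get_Quote");

        // Each iteration pulls the next row from the data file according to the
        // parameter's "Select next row" setting (Sequential / Random / Unique)
        web_submit_data("QuoteForm",
                        "Action=https://example.com/quote",
                        "Method=POST",
                        ITEMDATA,
                        "Name=policyNumber", "Value={PolicyNumber}", ENDITEM,
                        "Name=coverage",     "Value=standard",       ENDITEM,
                        LAST);

        lr_end_transaction("Get_Quote", LR_AUTO);

        return 0;
    }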

Environment: Windows NT, XML, SOAP, Test Director, WinRunner, QTP, LoadRunner, FlexPLM, MS Office, MS Visio, MS Project.

Confidential, Chennai  March 1999 - January 2000
System Analyst

The project included various modules related to customer history maintenance, plan configurations, automatic and manual invoice generation, and customer service.
Responsibilities:

  • Responsible for defining the scope of the project, gathering business requirements, and documenting them textually or within models.
  • Interviewing business area experts, asking detailed questions and carefully recording the requirements in a format that can be reviewed and understood by both business people and technical people.
  • Defined input requirements to document the Business Requirements Specifications and developed the Requirements Traceability Matrix.
  • Conducted walkthroughs with stakeholders to design complete and detailed test plans and test cases.
  • Analyzed the system requirements and developed detailed test plan, test cases and test scripts for functionality and GUI testing
  • Implemented and monitored Individual Development Plans focusing on total performance, including both quality and productivity.
  • Interfaced with the developers to provide all kinds of support and resources needed to resolve technical issues.

Environment: Windows NT 4.0, Test Director, WinRunner, MS Excel, Visual Basic, MS Access, Oracle.
