Sr. Performance Test Lead Resume
Concord, CA
SUMMARY
- Nine years of experience in the field of Information Technology (IT) with emphasis on Software Quality Assurance and support activities.
- Working knowledge of standard software testing methodologies across the application development life cycle.
- Experienced in black-box testing of stand-alone and client/server applications, including regression, front-end, back-end, load, stress, performance, and functional testing.
- In-depth experience in software quality assurance and testing (manual and automated, using Mercury products), test plan formulation, and functionality, GUI, performance, and regression testing of web-based applications across multiple environments and browsers.
- Extensive experience in performance testing of applications developed in C, C++, Java, Visual Basic, HTML, JSP, servlets, applets, UNIX, DB2, and Oracle.
- Experience in creating test plan, test scenarios and test cases using requirements documents.
- Extensive healthcare-domain experience with FACETS, HIPAA, and health claims processing.
- Experience in FACETS with applications spanning sales and enrollment, claims processing, customer service, and billing.
- Strong skills using, installing, and configuring all components of Load Runner including VuGen, Controller, Analysis, and Load Generators.
- Proficient with the Oracle, PeopleSoft, SAP Web, Web HTTP/HTML, Flex, Ajax TruClient, and Winsock protocols in LoadRunner.
- Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
- Analyzed test results (TPS, hits/second, transaction response times, CPU utilization, etc.) using LoadRunner Analysis and various monitoring tools, and prepared test reports.
- Expertise in automated testing tools including Quality Center, LoadRunner, Performance Center, and QuickTest Professional.
- Expertise in performance testing, including load, stress, baseline, and scalability testing.
- Adept at developing detailed Functional, Integration and System test specs.
- Converting high-level test requirements into reliable, repeatable, and maintainable test cases.
- Experience in monitoring servers using tools like SiteScope, Wily Introscope, TeamQuest and Tivoli Performance Viewer.
- Proven ability to set priorities, schedule work, and meet critical deadlines within budgetary guidelines.
TECHNICAL SKILLS
Automated Test Tools: WinRunner 7.x/6.5, TestDirector 8/7.x/6, Quality Center 8.0/9.0/9.2/10.0, Performance Center 11.0/12.0, LoadRunner 7.0/8.0/9.0/9.5/11.0/11.52/12, QTP 8.0/9.2/9.5, HP Service Center, SoapUI
Languages: TSL, C, C++, PL/SQL, Java
Configuration Mgmt Tools: ClearCase, PVCS, Visual SourceSafe
RDBMS: Oracle 8i/7.x, MS SQL Server 2000/6.x, MS Access 2000/9.x, DB2, Sybase
Front-End: Visual Basic 6.0, Developer 2000
Internet: HTML, DHTML, VBScript, JavaScript
Others: MS-Office, MS-Excel, XML, Novell, J2EE
Operating Systems: UNIX, MS-DOS 6.22, Windows XP/2000/98/95/3.1, Windows NT 4.0/3.51
PROFESSIONAL EXPERIENCE
Confidential - Concord, CA
Sr. Performance Test Lead
Responsibilities:
- Gathered non-functional requirements by participating in PTR walkthroughs for each phase, prepared the test approach, reviewed it with the workgroup, and obtained business sign-off.
- Identified and defined scope for various phases of Testing and prepared the workload/performance model based on the production volumes.
- Strong knowledge of single and multiple protocols in LoadRunner VuGen, including Web HTTP, Web Services, and Ajax TruClient.
- Ability to identify root causes and derive corrective actions to meet short and long term business requirements using resourceful approaches.
- Exceptional ability to build productive relationships with business users, test teams, development teams and clients across functional and technical disciplines and thus generating accurate and detailed business requirements.
- Ability to successfully manage multiple deadlines and multiple projects effectively through a combination of business and technical skills.
- Identified long-running queries and optimized them to improve performance.
- Produced accurate, regular project status reports for senior management to ensure on-time project launch.
- Independently developed LoadRunner test scripts according to test specifications and requirements.
- Executed duration, stress, load, rendezvous, and data-center failover scenarios along with regression testing for various operations, produced detailed test analysis reports, and performed disaster recovery exercises.
- Identified memory leaks in different components; analyzed and reported protocol-level response times, web page breakdowns, and component sizes using Wily Introscope.
- Customized parameterization in data files via LoadRunner to test the application with different data sets.
- Inserted rendezvous points to create intense load on the server and thereby measure server performance.
- Analyzed, interpreted, and summarized meaningful, relevant results in a complete performance test report.
- Created dashboards in the Wily console to monitor method-level response-time statistics of the application.
- Ran performance tests simulating the activity of 10,000 concurrent users to generate production volumes.
- Created final go/no-go reports and published them to a wider audience.
- Managed the offshore team to ensure testing completed on time and within budget; resolved offshore data-security issues by engaging the appropriate security team.
- Ensured entry/exit criteria were met and communicated them to the required stakeholders and workgroup members.
- Executed performance tests using Performance Center, working with developers, DBAs, the server team, and architects while monitoring key components during each test.
- Used LoadRunner to view load test results and drill down to identify possible bottlenecks.
- Monitored graphs and measures such as transaction response times and transaction rates against the published requirements for the system and verified that performance requirements were achieved under normal projected user loads.
Confidential - Irvine, CA
Sr. Performance Test Lead
Responsibilities:
- Created test strategy and test approach documents for each release, had them reviewed by stakeholders and project sponsors, and kept them updated as changes were required.
- Gathered non-functional requirements for each phase, reviewed them with the workgroup, and obtained business sign-off.
- Prepared test plan based on business requirement document and prepared test cases and test procedures.
- Identified and defined scope for various phases of Testing
- Reviewed documentation to understand the Eagle application and attended review sessions covering the day-in-the-life cycle.
- Worked with project team members and end users to evaluate and review vendor-created test artifacts and validate them against non-functional requirements and design documents.
- Worked with application teams, business analysts, technical architects, and other project stakeholders to estimate, plan, and execute performance tests.
- Worked with Configuration team to setup test environment, test data and data refresh/restore process to repeat tests.
- Defined and prepared the workload model for performance test execution.
- Ran performance tests simulating the activity of 400 concurrent users while daily, weekly, and month-end batch processes executed simultaneously.
- Prepared and executed test plans, automated tests, and test data.
- Review test deliverables, defects, test results created by team members
- Created weekly status reports and testing metrics as requested by senior management.
- Managed the offshore team to ensure testing completed on time and within budget; resolved offshore data-security issues by engaging the appropriate security team.
- Ensured entry/exit criteria were met and communicated them to the required stakeholders and workgroup members.
- Participated in QA reviews and provided support and clarification to reviewers as needed.
- Ensured all testing artifacts and reports were version-controlled and accessible to the testing team and other privileged users.
- Executed performance tests using Performance Center, working with developers, DBAs, the server team, and architects while monitoring key components during each test.
- Coordinated sessions with various teams to address defects, facilitate application tuning, and repeat tests.
- Created worst case load scenarios using LR Controller to test the system stability and processes
- Used LR tool to view Load test results and drill down to identify possible bottlenecks
- Monitor different graphs/measures like Transaction response time, transaction rates, and other time-sensitive requirements against the published requirements for the system/application and verified whether the performance requirements have been achieved during normal projected user loads.
- Used OVPM to identify memory leaks, high CPU usage, network issues, and poor application response times.
- Involved in writing SQL queries in Oracle 10g to validate the reports creation
- Analyze test results, prepare final reports and present to the key stake holders
Environment: Quality Center 11.00, Performance Center 11.00, LoadRunner 11.00, HP Performance Manager (OVPM), Oracle Client 10g.
Confidential - Folsom, CA
Sr. Performance Tester
Responsibilities:
- Prepared test plan based on business requirement document and prepared test cases and test procedures.
- Identified and defined scope for Integration and System Testing
- Reviewed documentation to understand the retail business logic and attended review sessions covering the retail life cycle.
- Maintain Traceability Matrix throughout the SDLC to ensure complete and comprehensive test coverage and delivery.
- Ran performance tests simulating the activity of 650 concurrent PeopleSoft users while daily, weekly, and month-end batch processes executed simultaneously.
- Participated in release-planning meetings (design reviews, test execution timelines, etc.) to gather technical automation requirements.
- Prepared and executed test plans, automated tests, and test data.
- Developed the load model and calculated pacing using data gathered from the previous release.
- Responsible for creating test scenarios using the Load Runner tool and generated Vuser Scripts using Vugen
- Used the Web (HTTP/HTML) and PeopleSoft protocols to record the application.
- Wrote VuGen code to verify that transaction statuses were updated in the database.
- Used API functions to confirm that scheduled PeopleSoft jobs reached the posted state.
- Used the Controller to launch 659 concurrent Vusers across 12 load generators.
- Created worst case load scenarios using LR Controller to test the system stability and processes
- Used LR tool to view Load test results and drill down to identify possible bottlenecks
- Monitor different graphs/measures like Transaction response time, transaction rates, and other time-sensitive requirements against the published requirements for the system/application and verified whether the performance requirements have been achieved during normal projected user loads.
- Used Wily Introscope to identify memory leaks and poor application response times.
- Automated daily-build installation using PTF (PeopleSoft Test Framework) and implemented regular-expression verification of the application.
- Involved in writing SQL queries in Oracle SQL developer to validate the reports creation
- Documented and reported the Performance results (Response Times) to the whole application team
Environment: Quality Center 11.00, LoadRunner 11.00, PTF 8.51.12, Wily Introscope, Oracle SQL Developer 3.0.04.
Confidential - Phoenix, AZ
Sr. Performance Tester
Responsibilities:
- Prepared test plan based on business requirement document and prepared test cases and test procedures.
- Developed test plans and automated new-feature test suites across the product line.
- Identified and defined scope for Integration and System Testing
- Automated daily-build installation using QTP and implemented regular-expression verification of the application.
- Participated in release-planning meetings (design reviews, test execution timelines, etc.) to gather technical automation requirements.
- Helped architect the automation framework, using a modular approach for automation.
- Prepared and executed test plans, automated tests, and test data.
- Prepared automated tests within a keyword-driven automation framework.
- Strong expertise in VBScript and automation infrastructure development.
- Coordinated with manual testers to set expectations on what the automation team would cover.
- Modified, updated and improved existing automated test script routines and management programs, created and executed automated scripts
- Analyzed system processes to develop strategic System Integration Test Plan
- Developed the load model and calculated pacing using data gathered from the previous release.
- Responsible for creating test scenarios using the Load Runner tool and generated Vuser Scripts using Vugen
- Used the Web (HTTP/HTML) protocol to record the QRT application.
- Used the Controller to launch 759 concurrent Vusers across 8 load generators.
- Worked on scripts to meet the SLA.
- Created worst-case load scenarios using the LoadRunner Controller to test system stability and processes.
- Conducted endurance tests lasting eight business hours.
- Used LR Analysis tool to view Load test results and drill down to identify possible bottlenecks
- Monitored the application performance from Wily perspective
- Extensively used Controller to perform and execute Baseline, Ramp-Up, Endurance, and Stress test cycles.
- Involved in writing SQL queries in Oracle SQL developer to validate the quotes creation
- Documented and reported the Performance results (Response Times) to the whole application team
Environment: Quality Center 11.00, QTP 11.00, LoadRunner 11.00, Oracle SQL Developer 3.0.04, Wily.
Confidential - Buffalo, NY
Sr. Performance Tester
Responsibilities:
- Requirements Analysis of various components of FACETS 4.71 with developers/architects
- Test Plan/Test Strategy preparation involving the test team for testing individual Web Services
- Use Cases analysis of the following business flows in FACETS 4.71
- Subscriber and member enrollment in Facets
- Class, plan, products, group and sub-groups
- Provider, Utilization Management
- Registration process of common practitioners in Facets
- Test Cases/Test Script review for all business scenarios
- Co-ordination of Test Design/ Test Execution efforts across teams and with business
- Defect tracking and management using MQC
- Involved in discussions with senior management to procure software testing tools for the organization.
- Coordinated data conversion efforts to Trizetto FACETS 4.71
- Presenting Test Metrics to management for tracking status of the project and deciding Go/No-Go for each scheduled release of the application
- Used Load Generators to generate load on multiple machines
- Prepared Load test plan based on business requirement document and prepared test cases and test procedures
- Developed test plans and automated new-feature test suites across the product line and multiple operating systems.
- Identified and defined scope for Integration and System Testing
- Created Load Scenarios in LR Controller and Scheduled the Virtual Users and Parameterized Vuser Scripts to generate realistic load on the Server
- Used the Winsock protocol for recording Facets and the Web HTTP protocol for CWS scripting.
- Created worst case load scenarios using LR Controller to test the system stability and processes
- Used LR Analysis tool to view Load test results and drill down to identify possible bottlenecks
- Extensively used Controller to perform and execute Baseline, Ramp-Up, Endurance, and Stress test cycles.
- Involved in writing SQL queries in Sybase to validate the payment files.
- Documented business process properties and developed test data for use in data validation
- Used the Quality Center dashboard and MS Excel to report project status and defect status/closure information, ensuring data was correctly recorded when defects were resolved or closed.
Environment: Quality Center 10.0, HP Load Runner 11.0, HP Quick Test Professional, FACETS 4.41/4.71, MS SharePoint, Sybase.
Confidential, Mason, OH
SAP Functional / Performance Tester
Responsibilities:
- Developed and Documented Test scenarios and Test cases in accordance with the Business Requirements Documents and the Functional Requirements Specification documents.
- Used Quality Center to develop test cases, and executed them in test lab.
- Used Quality Center for tracking and reporting, and followed up with the development team to verify bug fixes and update bug statuses.
- Developed test plans and automated new-feature test suites across the product line.
- Ensured the appropriate parties review and sign-off on test cases prior to test execution
- Involved in functional, Integration and Regression testing after release of each sprint.
- Performed test on the application in different scenarios after release of each sprint.
- Involved in end-to-end testing of the whole application (start of day, register open, till open, minor challenge, scan, tender).
- Involved in testing special functionality: dual till, manager access, and admin access.
- Checked data flow from the front end to the back end and used SQL queries to extract data from the database.
- Developed test cases for Black box testing like GUI, Functionality Testing, System Testing and User Acceptance Testing.
- Developed automated test scripts using VB Scripts to perform Functional and Regression Testing
- Extensively used QTP checkpoints (bitmap and database), shared object repositories, and function libraries during functional testing of the application.
- Imported and exported data between Excel and QTP.
- Identified and defined scope for Integration and System Testing
- Performed load and performance testing on the web-application server using LoadRunner
- Created Vuser scripts for multiple protocols like HTTP and SAPGUI
- Created load scenarios, scheduled 200 virtual users, and parameterized Vuser scripts to generate realistic load on the server.
- Used VuGen to create load test scripts (Web, database, and multi-protocol).
- Used the Controller to launch loads of 100 and 200 concurrent users.
- Used Load Generators to generate load on multiple machines.
- Automated performance test scripts and verified the response time under different load conditions using LoadRunner
- Created worst case load scenarios to test the system stability and processes
- Worked with the network delay monitor to identify network bottlenecks within distributed environments.
- Generated Server Monitor statistics like CPU usage, I/O, Memory, Processes, Thread Counts, Bytes/Sec etc
- Wrote custom functions and programs to support the load testing efforts, monitor resources to identify performance bottlenecks, analyze test results and report the findings to the clients, and provide recommendation for performance improvements as needed
- Generated performance graphs, session reports and other related documentation required for validation and analysis
- Compared and analyzed actual to expected results and reported all bugs to the development team
Environment: Quality Center 9.2, QuickTest Pro, LoadRunner, Java, VBScript, SQL*Plus, PL/SQL, Oracle Database 10g, Windows 2000
Confidential, Irvine, CA
Sr.QA Analyst
Responsibilities:
- Developed and Documented Test scenarios and Test cases in Quality Center in accordance with the Business Requirements Documents and the Functional Requirements Specification documents.
- Tested New Features and their impact on existing functionality before each release.
- Used Quality Center to develop test cases, and executed them in test lab.
- Developed test cases for Black box testing like GUI, Functionality Testing, System Testing and User Acceptance Testing.
- Extensively used Quality Center to upload requirements and to write and execute test cases.
- Reported defects using Quality Center; verified fixes and closed bugs during regression testing.
- Performed manual testing, executing all test cases in Quality Center, before switching to automation testing.
- Designed a customized application in Microsoft Excel, which provides tools to assist other developers of new data analysis systems
- Imported and exported Excel Sheets
- Produced a Requirements Traceability Matrix to support the application development.
- Created and executed data-driven test scripts in QTP.
- Created automated test scripts and verified and validated them using QTP.
- Performed regression testing for each new build using QTP.
- Conducted functionality testing during various phases of the application using QTP.
- Inserted checkpoints to check for broken links, text, and standard object properties using QTP.
- Checked data flow from the front end to the back end and used SQL queries to extract data from the database.
- Involved in the User Acceptance testing (UAT) to check the reliability for end users.
- Performed Testing on various web parts, Key Performance Indicator’s and InfoPath forms in SharePoint Sites.
- Developed BPT test cases, tested the whole application, and isolated defects.
- Wrote SQL queries using inner, outer, left, and right joins to verify existing data.
- Performed data validation and verification to meet all test condition requirements.
- Performed extensive back-end data validation on the BRM and ATG databases, verifying users' order data, financial data, tax calculations, and price calculations.
- Analyzed Results with Business Users.
Environment: QuickTest Pro, Quality Center, UNIX, VBScript, HTML, ATG, XML, .NET Framework, C#, SharePoint Server 2003, JavaScript, SQL, Microsoft Project, Windows 2000.
Confidential, Alpharetta, GA
Web Services Performance & SOA Tester
Responsibilities:
- Prepared Load test plan based on business requirement document and prepared test cases and test procedures
- Tested service-oriented architectures (SOA) procedures suitable for secure, reliable, and high-performance deployment.
- Developed test plans and automated new-feature test suites across the product line and multiple operating systems.
- Responsible for creating, executing, enhancing, and maintaining performance test scripts (HTML-based and URL-based) using HP LoadRunner.
- Responsible for running VuGen scripts in multi-user mode using the LoadRunner Controller.
- Responsible for planning the entire load test process based on the performance requirements document.
- Identified and defined scope for Integration and System Testing
- Created Vuser scripts for multiple protocols like HTTP and FLEX
- Involved in setting up the Automated Testing Environment for creating, and running automated tests using QTP.
- Met with client groups to determine user requirements and goals. Drafted test strategies, test cases and test plans based on functional specifications.
- Created and implemented functionality tests using Mercury Interactive QTP to automate the Regression Testing of the application.
- Manual Testing was done to perform functional testing on the interface.
- Data Driven Testing was used to play back tests against new, untested versions of the application to identify application defects.
- Created and reported defects using Quality Center.
- QTP was used to perform Backend testing, to test the module functionality during user data retrieval from the database to mobile switching center.
Environment: QTP, Quality Center, SQL, Oracle, Windows XP/2000, HTML and XML, HP Service Center, SOA Test.
Confidential, Birmingham, AL
QA Automation and Performance Tester
Responsibilities:
- Analyzed the business requirements and documented them into test cases and imported them to QC
- Worked independently on creating, debugging, and executing LoadRunner scenarios while assisting with tuning.
- Mentored a QA analyst in creating Vuser scripts with VuGen and building Controller scenarios.
- Performed load testing based on actual production usage of the application, documented in the workload model
- Created worst-case scenarios to determine the performance of TMW applications in accordance with the approved Business test Cases
- Monitored system resources applicable to the assessment of system performance
- Created Baseline transactions and resource utilization for comparative analysis against future releases
- Created load test scripts using VuGen and enhanced them using C.
- Monitored load balancer to analyze the load sharing between servers and optimized them to handle bulk data to avoid bottlenecks in the network
- Created Web Vuser scripts to load test the middleware of the architecture mainly WebLogic and the complete transaction
- Created Vuser scripts (Web HTTP and Oracle protocols) using VuGen for all three applications to simulate actual users and their actions.
- Configured and ran manual as well as goal oriented scenarios in Controller using scripts created in Vu Generator
- Analyzed Performance Bottlenecks using Load Runner Monitors and Graphs
- Created a baseline and ran sanity, scalability and stability tests in staging and production environments
Environment: QTP, Quality Center, .NET, SQL, Oracle, LoadRunner, Java/J2EE, WebLogic, applets, Java servlets, TOAD