QA Analyst Resume
Sacramento, CA
SUMMARY
- 8 years of IT experience with various versions of LoadRunner, with an emphasis on Performance Testing.
- Seeking a challenging assignment in Quality Assurance/Performance Testing that provides an opportunity to enhance my software testing skills and to apply my analytical, documentation, and testing expertise for the organization in the most efficient manner.
- Selecting the correct protocol required for successful recording of an application.
- Recording, developing, and enhancing Vuser scripts.
- Handled correlation of dynamic data, including boundaries with different patterns.
- Well versed with all functionality of the Virtual User Generator.
- Checking web pages for expected keyword(s) and redirecting script replay accordingly.
- Handling error conditions generated by the application.
- Proficient in adding loops to LoadRunner scripts to run them for multiple iterations.
- Configuring run-time settings (actions, think time, etc.) in VuGen and the Controller.
- Conducted performance, stress, and endurance testing using LoadRunner.
- Monitoring Vuser status.
- Working with different Vuser types and groups.
- Configuring, activating, and monitoring performance counters online.
- Performed IP spoofing using LoadRunner.
- Analyzing scenario performance, graphs, and reports.
- Analyzing cross results, cross scenarios, and merged graphs.
- Filtering and sorting information.
- Used Quality Center for tracking and reporting bugs.
- Knowledge of Java Virtual Machine internals including connection pooling, threads, synchronization, and garbage collection
- Activating/configuring monitors and adding desired performance counters to the graphs.
- Utilized various performance tools such as Oracle Enterprise Manager, pmon, nmon, top and WebLogic console for monitoring database cluster contention, I/O, User, CPU activities and overall server(s) performance
- Utilized Database, Network, Application server and WebLogic Monitors during the execution to identify bottlenecks, bandwidth problems, infrastructure problems, and scalability and reliability benchmarks
- Created different scenario types to isolate bottlenecks: smoke tests, scalability testing, reliability testing, stress testing, failover testing, performance regression testing, etc.
- Worked extensively with the Web, Web Services, SAPGUI, Citrix, and Oracle protocols.
- Installing, maintaining and administering LoadRunner software.
- Hands-on experience with multiple versions of LoadRunner.
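The correlation, keyword-check, loop, and think-time techniques listed above can be sketched in a minimal VuGen Action (a hedged illustration only: the URL, boundary strings, and parameter names are hypothetical, and the script runs only inside the LoadRunner runtime, not as standalone C):

```c
Action()
{
    int i;

    // Keyword check: fail the step if the expected text is missing
    // (registered before the request it applies to).
    web_reg_find("Text=Welcome", "Fail=NotFound", LAST);

    // Correlation of a dynamic value via left/right boundaries
    // (boundary strings here are hypothetical examples).
    web_reg_save_param("SessionID",
                       "LB=sessionid=", "RB=\"",
                       "Ord=1", LAST);

    web_url("Home", "URL=http://example.com/home", LAST);

    // Loop to repeat a business step several times per iteration.
    for (i = 0; i < 3; i++) {
        lr_think_time(5);  // think time between user actions
        web_url("Search",
                "URL=http://example.com/search?sid={SessionID}",
                LAST);
    }
    return 0;
}
```
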
TECHNICAL SKILLS
Load Testing tools: LoadRunner 11.5, 11, 9.5, 9, 8; HP Performance Center 9.52; HP ALM 11
Monitoring tools: SiteScope, Perfmon, pmon, nmon, top, WebLogic console, Tivoli, Zenoss, Java VisualVM
Diagnostics tools: Dynatrace 5.5, HP Diagnostics probe for .NET and J2EE, Wily Introscope
Operating systems: AIX, UNIX, Solaris, Windows 2003, 2005, 2008 R2
Environment: Web servers (Apache, IHS, IIS 6.0), app servers (WebLogic, WebSphere, IIS 7.0, WebSphere Process Server, WebSphere ESB Server), IBM MQ Series, CICS, DataPower, Solaris E2900, P7 AIX
Databases: MS SQL Server, Oracle, LDAP, Access, DB2, Sybase
Languages: Java, JSP, HTML, Visual Basic, Oracle, C, C++, SQL, XML, C#/.NET, ASP
Other: Quality Center, Shell Programming
Methodologies: Agile, Waterfall, RUP
PROFESSIONAL EXPERIENCE
Confidential, Sacramento, CA
QA Analyst
Responsibilities:
- Assessed performance testing needs, designed test plans, and coordinated the performance testing strategy for the release.
- Designed workload profiles based on the production volumes and planned for LoadRunner scenarios.
- Developed LoadRunner scripts using the Web and Web Services protocols, implementing X.509 certificates using web_service_set_security.
- Conducted Baseline Test, Endurance test and Stress test.
- Ensure performance test scripts are documented, reported and tracked in HP ALM 11.5.
- Performed peer review for deliverables of other Performance Testing Analysts.
- Analyzed performance issues identified during the testing cycle, escalated them as required to the appropriate support teams for resolution, and ensured that issues were closed within the required timeframe.
- Worked closely with developers and DBAs to remediate bottlenecks.
- Provided the proper level of upward communication regarding status and accomplishments, as well as, escalation of issues and recommendations for issue resolution.
Environment: LoadRunner 11.52, Agile Methodology, Web Services, SOA Test, X509 Certificates, QTP 11, HP ALM 11.5.
Confidential, Columbus, OH
Performance Test Lead
Responsibilities:
- Defined the performance goals and objectives based on the inputs (Architecture, Hardware and Production statistics) and interacting with End users.
- Analyzed production volumes using the Dynatrace data-mining tool and planned LoadRunner scenarios accordingly.
- Worked closely with Development team and Functional Test Lead to discuss the Testing aspects of the application and design the Performance Test plans.
- Used VUGen to develop scripts for both Web Services and frontend.
- Developed Web Services LR scripts using web_custom_request instead of soap_request.
- Developed frontend LR scripts in URL mode to simulate end users accurately, and used blocks.
- Checked web pages for expected keyword(s).
- Used randomized think times to reflect a variety of users.
- Used LR Controller 11.52 to execute tests, and maintain scripts in the Quality Center 10.
- Coordinated with other mainframe maintenance teams to avoid resource contention and scheduled tests only when the environment was quiet (e.g., weekends).
- Conducted Load Test, Stress test, Capacity test and Endurance test for the Web Services along with other regression services before release.
- Logged errors to a local drive so that error responses during load tests could be stored and shown to various teams during analysis (e.g., identified mainframe abends using those logs).
- Used Dynatrace for monitoring of Application Servers and Database Servers.
- Used Zenoss for monitoring Datapower appliance.
- Integrated LoadRunner with Dynatrace, analyzed load test results, identified bottlenecks, and logged them as defects in Quality Center.
- Interacted with developers (SOA and UI) and admins/monitoring during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Recommended solutions for best performance to expected loads in the production.
Environment: LoadRunner 11.52, Dynatrace 5.5, Agile Methodology, WESB 7.5, DataPower XI50 appliance, WebSphere Application Server (WAS 8.5), WebSphere Process Server (WPS), Tomcat, PEGA/PRPC, DB2, QMF for Windows, CICS, CTG, Oracle, SoapUI, DOJO, Zenoss Monitoring.
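The web_custom_request approach mentioned above (posting the SOAP envelope directly rather than going through soap_request) can be sketched as follows; the endpoint, SOAPAction header, and payload are hypothetical placeholders, and the fragment runs only inside the LoadRunner runtime:

```c
Action()
{
    // Send the SOAP envelope as a raw POST instead of using the
    // Web Services protocol's soap_request; this gives full control
    // over headers and body. Endpoint and operation are placeholders.
    web_add_header("SOAPAction", "\"getAccount\"");
    web_add_header("Content-Type", "text/xml; charset=utf-8");

    web_custom_request("getAccount",
        "URL=http://example.com/services/AccountService",
        "Method=POST",
        "EncType=text/xml",
        "Body=<soapenv:Envelope xmlns:soapenv="
             "\"http://schemas.xmlsoap.org/soap/envelope/\">"
             "<soapenv:Body><getAccount><id>{AccountID}</id>"
             "</getAccount></soapenv:Body></soapenv:Envelope>",
        LAST);
    return 0;
}
```
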
Confidential - Franklin, TN
Performance Test Lead
Responsibilities:
- Defined performance goals and objectives based on inputs (architecture and production statistics). Created test schedules and laid out project strategies that are understandable to others.
- Ensure the compatibility of all application platform components, configurations and their upgrade levels in production and make necessary changes to the Preprod(Performance Testing) environment to match production.
- Worked closely with Development team to discuss the Design and Testing aspects of the applications to design the Performance Test plans.
- Used Virtual User Generator to generate VuGen Scripts for web protocol. Ensure that quality issues are appropriately identified, analyzed, documented.
- Used loops in LoadRunner scripts to run them for multiple iterations.
- Configure and set up monitors in SiteScope.
- Involved in upgrading Performance Center from v .
- Used Performance Center to execute tests, and maintain scripts.
- Conducted Baseline test, Capacity test, Peak Load Test and Endurance test.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Implemented and maintained an effective performance test environment.
- Presented Reports and Graphs generated by Performance Center to all the stakeholders in a simplified manner.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Investigate and troubleshoot performance problems in Preprod (Performance Testing) environment. This will also include analysis of performance problems in a production environment.
- Worked closely with developers, project managers, Manual testers and administrators (IIS and SQL) in the development life cycle.
Environment: LoadRunner, Performance Center 9.52, ALM 11, IIS 6/7, SQL Server, MS Office, MS Visio, Windows, and .NET
Confidential
Technical Team Lead, Performance Testing
Responsibilities:
- Defining the performance goals and objectives based on the client requirements and inputs
- Responsible for performance, stress, and regression testing of enterprise-wide PeopleSoft ERP systems and web applications.
- Worked extensively with the Web, Web Services, Click and Script, and ODBC protocols in LoadRunner.
- Ensure the compatibility of all application platform components, configurations and their upgrade levels in production and make necessary changes to the lab environment to match production
- Responsible for developing and executing performance and volume tests
- Develop test scenarios to properly load / stress the system in a lab environment and monitor / debug performance & stability problems.
- Partner with the vendor organization to analyze system components and performance to identify needed changes in the application design
- Configure and set up monitors in Sitescope.
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards
- Used Virtual User Generator to generate VuGen Scripts for web protocol. Ensure that quality issues are appropriately identified, analyzed, documented, tracked and resolved in Quality Center.
- Developed and deployed test Load scripts to do end to end performance testing using Load Runner.
- Implemented and maintained an effective performance test environment.
- Identify and eliminate performance bottlenecks during the development lifecycle.
- Accurately produce regular project status reports to senior management to ensure on-time project launch.
- Conducted Duration test, Stress test, Baseline test
- Verify that new or upgraded applications meet specified performance requirements.
- Used Performance Center to execute tests, and maintain scripts.
- Identified queries that were taking too long and optimized them to improve performance.
- Investigate and troubleshoot performance problems in a lab environment. This will also include analysis of performance problems in a production environment.
- Created Test Schedules.
- Worked closely with clients
- Interfaced with developers, project managers, and management in the development, execution, and reporting of performance test results.
- Manage Performance Center servers, monitor the server status, edit server information, and check server performance.
- Manage timeslots, view user reservations, and monitor availability of time and resources.
Environment: LoadRunner, Performance Center, IIS, HP Diagnostics, HP SiteScope, WebLogic, Oracle, QC, QTP, SOA Test, MS Office, MS Visio, Java, Windows, Linux, LAN, WAN, .NET.
Confidential, McLean, VA
Senior Performance Tester\Performance Center Admin
Responsibilities:
- Involved in gathering business requirements, studying the application, and collecting information from developers and the business.
- Created Vuser scripts that contain tasks performed by each Vuser, tasks performed by Vusers as a whole, and tasks measured as transactions.
- Developed Vuser Scripts in web and Citrix Protocols
- Designed tests for Benchmark and Stress testing
- Parameterized large and complex test data to accurately depict production trends.
- Validated the scripts to make sure they have been executed correctly and meets the scenario description
- Created Single User, Baseline, and Soak test scenarios. Introduced random pacing between iterations to achieve the desired transactions per hour.
- Added performance measurements for Oracle, Web Logic, IIS in LoadRunner TestCenter.
- Analyzed results using LoadRunner Analysis tool and analyzed Oracle database connections, sessions, Web Logic log files.
- Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations
- Maintained test matrix and bug database and generated monthly reports.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels
- Used LoadRunner for testing and monitoring; actively participated in enhancement meetings focused on making the website more intuitive and interesting.
Environment: LoadRunner 9, Performance Center 9, Citrix, QuickTest Pro, LDAP, Oracle, MS SQL Server, WebLogic 8, WebSphere, Load Balancer, Java, TestDirector, J2EE Diagnostic Tool, web, Windows 2000/XP, Solaris, AIX, IE
Confidential, San Bruno, CA
Senior Performance Tester
Responsibilities:
- Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
- Worked as an independent consultant for performance testing and coordinated with multiple vendors.
- Worked closely with Development team to discuss the Design and Testing aspects of the applications to design the Test plans.
- Reviewed BRD, SRS to prepare Performance acceptance criteria and Test Plan.
- Actively participated in the daily project meetings and walkthroughs.
- Involved in preparation of estimation, capacity matrix, performance strategy docs and conducted assessments and data modeling using excel.
- Developed Vuser scripts using Web (HTTP/HTML), Web Services, Microsoft .Net, ODBC and Oracle NCA.
- Created customized LoadRunner VuGen scripts at API level with manual correlation, user defined functions, development libraries (classes and methods), and error handling.
- Worked extensively in LoadRunner; created scripts based on prioritized/critical scenarios and distributed the peak load according to a production-like distribution ratio.
- Enhanced Vuser scripts by adding correlations, parameters, condition controls, and checking/validation functions
- Worked on web, client-server, mainframe, SOA, J2EE, .NET, and legacy applications.
- Designed performance test suites by creating VU test scripts, workload scenarios, setting transactions, rendezvous points and inserting them into suites using Load Runner.
- Responsible for testing the applications in different scenarios like Average load test, Spike test, Endurance test, Volume test and Peak Load test.
- Configured Offline & Online Diagnostics like J2EE/.NET Diagnostics through performance center.
- Used SiteScope to monitor server metrics and performed in-depth analysis to isolate points of failure in the application.
- Monitored system resources such as CPU Usage, % of Memory Occupied, VM Stat, I/O Stat using UNIX commands like top, vmstat, svmon and netstat.
- Analyzed JVM Heap and GC logs in Web Sphere during test execution.
- Conducted result analysis and communicated technical issues with developers and architects
- Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.
- Created comprehensive test results report.
Environment: LoadRunner, Performance Center, SiteScope, SAP, Oracle, PeopleSoft, MS SQL Server, WebLogic, Load Balancer, Java, Quality Center, J2EE Diagnostic Tool, web, Windows 2000/XP, HP-AIX
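The customized-script style described above (manual correlation, user-defined checks, and error handling around a transaction) can be sketched like this; the transaction name, form fields, and boundaries are hypothetical, and the fragment runs only inside the LoadRunner runtime:

```c
Action()
{
    lr_start_transaction("Login");

    // Manual correlation; Notfound=warning lets the script decide
    // how to handle a miss instead of aborting immediately.
    web_reg_save_param("AuthToken",
                       "LB=token=", "RB=&",
                       "Notfound=warning", LAST);

    web_submit_data("Login",
        "Action=http://example.com/login",
        "Method=POST",
        ITEMDATA,
        "Name=user", "Value={UserName}", ENDITEM,  // parameterized from a data file
        "Name=pass", "Value={Password}", ENDITEM,
        LAST);

    // Error handling: if the token was never captured, lr_eval_string
    // returns the literal parameter name, so fail the transaction.
    if (strcmp(lr_eval_string("{AuthToken}"), "{AuthToken}") == 0) {
        lr_error_message("Login failed for user %s",
                         lr_eval_string("{UserName}"));
        lr_end_transaction("Login", LR_FAIL);
        return -1;
    }
    lr_end_transaction("Login", LR_AUTO);
    return 0;
}
```
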
Confidential, Cleveland OH
Performance Tester
Responsibilities:
- Gathered business requirement, studied the application and collected the information from Analysts.
- Developed and deployed test Load scripts to do end to end performance testing using LoadRunner
- Developed Virtual User scripts for the Web (HTTP/HTML) and Java protocols.
- Used the MQ Client protocol to test WebSphere MQ.
- Developed LoadRunner scripts by using Virtual User generator for Base Line, Soak (Endurance test) and Stress test scenarios by storing dynamically varying object IDs in parameters and validating correct downloads of HTML pages by checking the content in sources.
- Simulated multiple Vuser scenarios in Loadrunner.
- Defined Rendezvous point to create intense load on the server to measure the server performance under load.
- Monitored software and hardware behavior during test runs using Perfmon and LoadRunner online monitors.
- Responsible for monitoring Oracle Database performance for Indexes, Sessions, Connections, poorly written SQL queries and deadlocks for each component of application.
- Identified and analyzed memory leaks at each component level.
- Database stored procedure executions, Indexes and deadlocks with load were analyzed.
- Ensure that defects are appropriately identified, analyzed, documented, tracked and resolved in Quality Center.
Environment: WebSphere, RUP, Windows 2000 Advanced Server, IIS 5, IBM AIX, DB2, PL/SQL, MQ Series (IBM and MS), Mainframe, Performance Center, LoadRunner, Quality Center, WinRunner, QuickTest Professional, IBM Rational, SiteScope, HP J2EE Diagnostics, HP Business Availability Center.
Confidential
Performance Tester
Responsibilities:
- Experience in requirement gathering and analysis, data analysis, data mapping, functional design, quality assurance/Testing and documentation for Performance goals
- Developed Test Plans, Test Scenarios, Test Cases, Traceability Matrix, Test Summary Reports and Test Execution Metrics.
- Developed Vuser Scripts in web, web services Protocols.
- Tested web services applications using SOAP Client as well as by using WSDL Files.
- Developed and Executed the Test cases & scripts for Functional, System, Regression, Integration, Performance and UAT.
- Involved in writing Complex queries using SQL for Data Integrity checks.
- Participation in requirement / Use Case analysis, risk analysis and configuration management.
- Validated data through the various stages of data movement, from staging to the data store to data warehouse tables.
- Created Test Cases using the SDLC procedures and reviewed them with the Test lead.
- Executed all test cases in the test environment and maintained them, documenting test queries and results for future reference.
- Testing included unit, system testing, regression testing, Business Objects reports testing.
- Provide flexible & high quality support to wider Testing strategies on key regular deliverables & ad hoc reporting issues.
- Created scripts for web applications using web/http protocol and client server applications using web services protocol.
- Validated client application interaction with middleware application, which is via HTTP/SOAP
- Performed HP Business Process Testing using SoapUI and Quality Center for a .COM web application.
- Developed Web Service Vuser scripts for a Web Service Call using Soap UI.
- Involved in performance testing and governance validation.
- Retested the modifications, once bugs are fixed after reinstalling the application.
- Developed automated scripts using WinRunner for functional and regression testing.
- Reported the bugs through email notifications to developers using Quality Center.
- Generated Problem Reports for the defects found during execution of the Test Cases and reviewed them with the Developers. Worked with Developers to identify and resolve problems.
- Led and scheduled QA project status meetings and published meeting minutes.
- Developed presentation and testing implementation learning to other testing resources for cross functional training.
- Involved in Performance testing of Web Applications with Load Runner.
- Developed Load Runner Scripts using Vuser Generator
- Created scenarios for performance testing using the LoadRunner Controller and analyzed performance via server monitoring.
- Validated XML files and verified data accuracy.
- Well versed with UNIX shell scripting.
Environment: LoadRunner, Quality Center, Java, XML, Oracle, Business Objects, SQL Server 2000, Windows XP, Telnet, WebSphere, Lotus Notes, and UNIX