
Performance Tester Resume

Baltimore, MD

SUMMARY

  • To obtain a Performance Tester position where I can utilize my technical skills and education, with the opportunity to grow and be challenged in new areas.
  • Expertise in implementing performance validation and optimization solutions and testing using HP LoadRunner / Performance Center in web and client/server environments.
  • Expert knowledge of LoadRunner scripting using the Web HTTP/HTML, Web Services, and Ajax TruClient protocols (an illustrative scripting sketch appears after this list).
  • Good knowledge of analyzing LoadRunner results and working closely with cross-functional technical teams to fix performance bottlenecks.
  • Coordinated with the testing services and tools team to install Dynatrace agents in the performance environments to identify bottlenecks.
  • Working knowledge of analyzing heap dumps and thread dumps.
  • Knowledge of working with JProfiler and HttpWatch.
  • Strong knowledge of the entire QA life cycle, including designing, developing, and executing the QA process and documenting Test Plans, Test Cases, Test Procedures and Test Scripts.
  • Experience with the onsite/offshore delivery model and related communications.
  • Highly experienced with the SDLC and various testing methodologies.
  • Testing experience across multiple platforms, web/application servers, databases, web technologies, and object-oriented programming languages.
  • Experience working in Agile development and Scrum.
  • Working knowledge of performance tools such as HP Diagnostics.
  • Proficient in manual and automated testing tools such as the Mercury test suite (QTP, Quality Center & TestDirector) for testing ERP, Web and Client/Server applications.
  • Experienced in building Performance Test Strategies and Frameworks based on the planned scope of Performance Testing.
  • Knowledge of UML and Design Patterns, full understanding of OO principles, and experience with OO modeling tools.
  • Knowledge of the JAWS tool for Section 508 compliance testing.
  • Experience with writing SQL/PL-SQL scripts for test automation.
  • Evaluate Non-Functional Requirement documents and identify performance test needs: types of tests, scripting scenarios, volume projections, etc.
  • Develop a Performance Test Plan/Strategy document that includes short- and long-term test objectives, types of tests required, workload matrix, scenarios, etc.
  • Develop test execution scenarios for various types of tests such as load, stress, endurance, and run tests.
  • Perform heap and thread analysis, database deadlock detection, understand and detect resource contention, etc.
  • Document test results and develop custom summary reports to satisfy both technical and non-technical stakeholders.
  • Analyzed and monitored server performance with SiteScope and CA Wily Introscope, generating reports for CPU utilization, memory usage, load average, etc.
  • Used DynaTrace to diagnose and troubleshoot application performance issues.
  • Experience performance testing in an SAP environment covering modules for SAP Portal, SD, and MM.
  • Expert Knowledge in LoadRunner scripting using SAP GUI, SAP Web, Web Services, Flex and ODBC protocols.
  • Extensive testing knowledge of SAP R/3 SD and MM modules.
  • Understanding of SD functionality from the developer's and client's perspectives to extend business content based on requirements.
  • Experienced in monitoring infrastructure and applications using SiteScope while the system is under load.
  • Performed Web Performance, Load, Stress, and Endurance Testing using Visual Studio Ultimate 2012 (VSTS).
  • Experienced in monitoring infrastructure and applications using Splunk and Nagios XI.
  • Experienced in analyzing application performance using LoadRunner with varying numbers of virtual users.
  • Used SOAPUI for API, Web service, Functional and Load Testing.
  • Used JMeter for performance testing.
  • Knowledge of the IBM Rational tool for performance testing.
  • Knowledge of scripting with the automated regression testing tool Selenium.
  • Expert in the Windows Typeperf and Perfmon utilities to create custom config files, collect Windows resource statistics remotely, and generate reports with PAL.
  • Experienced in using the vmstat, sar, and topas utilities and System Monitor on UNIX systems to measure system performance under load.
  • Experienced in identifying Memory Leak issue, Java Heap, Garbage collection issues in WebSphere Web Application Servers.
  • Extensively experienced in creating and executing batch files and UNIX shell scripts.
  • Expert in scheduling jobs on both Windows and UNIX systems through Task Scheduler and crontab.
  • Experienced in using monitoring tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor, and Data Warehouse Monitor on Windows systems, JConsole to monitor Java-based applications, and System Monitor and topas on UNIX systems.
  • Knowledge of testing front-end web technologies based on JavaScript, DHTML, CSS, and HTML.
  • Experience in performing System, Functional, Regression, Integration and Acceptance testing on Win/Web based applications.
  • Highly experienced with test management, defect tracking, and automation tools (Quality Center, QuickTest Pro/UFT, and LoadRunner).
  • Proficient in identifying performance bottlenecks and end-to-end performance measures (server response time, throughput, network latency, etc.).
  • Experienced in creating and supporting Performance Test Scripting and execution for the testing organization by serving as a member of a dedicated Performance Test Team that focuses entirely on Load, Stress and Performance Testing.
  • Experienced in virtualization technologies such as Microsoft Hyper-V, Oracle VMware and VMware Workstation.
  • Experienced in using utilities such as RDP, PuTTY, Process Explorer, FileMon, Process Monitor, PoolMon, and Fiddler.
  • Experienced in Load testing, Scenario creation and execution, and measuring Throughput, Hits per second, Response time, and Transaction time using LoadRunner Analysis.
  • Experienced in developing and executing test scripts using automation tools, i.e., Quality Center/TestDirector, LoadRunner, JMeter, and Performance Center.
  • Experienced in using JMeter for database back-end testing over JDBC and ODBC connections and for load testing of LDAP, FTP, and SOAP.
  • Experienced in bug reporting and bug tracking using test management tools like TestDirector, Quality Center/ALM, JIRA, Spira Team, Bugzilla and TFS.
  • Experience with multiple network services and protocols such as: DNS, DHCP, VPN, TCP/IP, FTP, SMTP, POP3, and IMAP.
  • Experienced in using MS SharePoint for business collaboration.
  • Extensively experienced in the MS Office suite to create and analyze reports and graphs.
  • Experienced in peer review of test cases and preparation of Test Reports.
  • Experienced in collecting and analyzing database performance using SQL Profiler, Activity Monitor.
  • Expert in creating SQL queries against Oracle and MS SQL server.
  • Experienced in Service Oriented Architecture (SOA) testing and Web Services testing.
  • Self-motivated, able to work independently with moderate/minimal guidance.
  • Ability to work with minimum documentation and minimum supervision.
  • Proven ability to meet deadlines; ability to form, build and manage effective teams.
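
For illustration only, a minimal LoadRunner VuGen (C) sketch of the Web HTTP/HTML scripting and manual-correlation technique referenced above; the URL, boundaries, parameter names, and form fields are hypothetical, not taken from any engagement described in this resume.

    /* Minimal VuGen (C) sketch of manual correlation in a Web HTTP/HTML script.
       The URL, left/right boundaries, parameter names, and form fields below
       are hypothetical and shown only to illustrate the technique. */
    Action()
    {
        /* Capture a dynamic token from the next server response. */
        web_reg_save_param("SessionToken",
                           "LB=name=\"token\" value=\"",
                           "RB=\"",
                           "Ord=1",
                           LAST);

        lr_start_transaction("Login_Page");
        web_url("Login_Page",
                "URL=https://app.example.com/login",
                "Resource=0",
                "Mode=HTML",
                LAST);
        lr_end_transaction("Login_Page", LR_AUTO);

        /* Replay the captured token on the follow-up request. */
        lr_start_transaction("Submit_Login");
        web_submit_data("Submit_Login",
                        "Action=https://app.example.com/login",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=token", "Value={SessionToken}", ENDITEM,
                        "Name=user",  "Value=perf_user01",    ENDITEM,
                        LAST);
        lr_end_transaction("Submit_Login", LR_AUTO);

        return 0;
    }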

PROFESSIONAL EXPERIENCE

Confidential, Baltimore, MD

Performance Tester

Environment: HP ALM, VuGen 12.0, Performance Center 12.0, UFT 12.0, Java, JavaScript, VBScript, HTML, Internet Explorer, Chrome, WebSphere, iPlanet, Aternity 7.0, MS Office, Oracle 11g, Windows Server 2010, Solaris, Windows 7.

Responsibilities:

  • Developed VuGen scripts, used manual and automatic correlation techniques, set up run-time settings, created scenarios, and analyzed the results to find bottlenecks and root causes using HP VuGen and Performance Center (an illustrative content-check sketch appears after this list).
  • Planned, created, modified and executed performance test scripts as needed.
  • Executed various kinds of tests based on requirements from business (baseline, Load, and Soak tests).
  • Experience in developing Test Strategies, Test Cases and Test Procedures from System/Software Requirement specifications and Business Requirements.
  • Experience in performing System testing, Integration testing, Module testing, Sanity testing and Regression testing on Win/Web based applications.
  • Performed Load testing using JMeter.
  • Used DynaTrace to diagnose and troubleshoot application performance issues.
  • Experience working in the agile development methodology.
  • Helped and mentored other testers in testing and Best Practices.
  • Designed, defined and documented unit and application test plans.
  • Transformed test plans into test scripts and executed those scripts.
  • Involved in test optimization through systemic improvements in process and quality.
  • Worked with operations, DBAs and developers to help solve performance issues in the application, network, and environment.
  • Provided analysis of performance test results, bottlenecks, sizing guidelines, and recommend improvements.
  • Identify functionality and performance bottlenecks, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Generated automated Test Scripts in VBScript using HP QuickTest Professional.
  • Developed QuickTest Pro VBScript test scripts in programming mode; administered and maintained the scripts.
  • Created automation test framework using UFT and maintained Test Scripts.
  • Developed automation script using UFT and executed in UAT environment for end user performance testing using Aternity monitoring tool.
  • Used InfoVista network monitoring tool for network utilization.
  • Used ALM/QC to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.
  • Created and performed maintenance of object repository using UFT for functional, sanity, and regression testing.
  • Developed SQL Queries and procedures to perform database testing.
  • Developed automation scripts using VB script.
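
As a purely illustrative sketch of the VuGen (C) scripting and results analysis described above, the following adds a content check so a transaction is marked failed when expected text is missing, even if the server returns HTTP 200; the transaction name, URL, and checked text are hypothetical.

    /* Illustrative VuGen (C) content check: fail the transaction when the
       expected text is missing so the failure shows up in Analysis results.
       Names, URL, and checked text are hypothetical. */
    Action()
    {
        int match_count;

        /* Register the text check before the request and save the hit count. */
        web_reg_find("Text=Order Confirmation",
                     "SaveCount=ConfirmCount",
                     LAST);

        lr_start_transaction("Place_Order");
        web_url("Place_Order",
                "URL=https://app.example.com/order/submit",
                "Resource=0",
                "Mode=HTML",
                LAST);

        match_count = atoi(lr_eval_string("{ConfirmCount}"));
        if (match_count > 0) {
            lr_end_transaction("Place_Order", LR_PASS);
        } else {
            lr_error_message("Place_Order: confirmation text not found");
            lr_end_transaction("Place_Order", LR_FAIL);
        }

        return 0;
    }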

Confidential, Seattle, WA

Software Test Engineer

Environment: Quality Center 10.0, Load Runner 11.0, Performance Center 11.0, MS Office, Oracle 11g, Windows Server 2010, Weblogic, Solaris, Java, C#, .NET, ASP.NET, ADO.NET, SOAP, WCF, WPF, Windows 7, Visual Studio Ultimate 2012, MS SQL Server, .NET Framework, MSMQ, FrontPage, PHP, SQL, PL/SQL, Splunk, Nagios XI.

Responsibilities:

  • Interfaced with business analysts to define performance requirements.
  • Identified and troubleshot performance bottlenecks using LoadRunner.
  • Proficient in LoadRunner VuGen scripting with extensive use of the correlation library.
  • Wrote custom LoadRunner code, including error handling, to meet client requirements.
  • Executed various kinds of tests based on requirements from business (baseline, Load, and Soak tests).
  • Created various Scenarios to be run, used rendezvous points to create intense Virtual User (Vuser) load on the server, and configured the scenarios before executing them in LoadRunner (a rendezvous and error-handling sketch appears after this list).
  • Performed Web Performance, Load, Stress, and Endurance Testing using Visual Studio Ultimate 2012 (VSTS).
  • Extensive experience testing web applications and web services using Visual Studio 2012 Ultimate.
  • Experience writing and executing queries in SQL (MS SQL Server 2012).
  • Identified and troubleshot performance bottlenecks using VSTS.
  • Experience troubleshooting environment issues using Splunk and Nagios XI and creating custom reports.
  • Performed Service oriented architecture testing and Web Services testing using Visual Studio 2012 (VSTS).
  • Designed, built, tested and released data visualizations, reports and security metrics for enterprise security and operations using Splunk.
  • Analyzed test results to assist in the identification of system defects, bottlenecks and breaking points using LoadRunner.
  • Conducted End-to-End, Manual testing of the system and prepared and managed test cases using Quality Center.
  • Used Quality Center and Spira Team to track, report, and manage defects throughout the test cycle and attended Defect Status Meetings on a daily basis during the testing cycle.
  • Identified performance issues and performed tuning in collaboration with Dev/DBA partners.
  • Created and executed SQL query scripts for Database Testing.
  • Involved in test case walkthrough and assessment meetings.
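
For illustration, a minimal VuGen (C) sketch combining a rendezvous point with simple error handling of the kind referenced above; the rendezvous name, transaction name, and URL are hypothetical.

    /* Illustrative VuGen (C) sketch of a rendezvous point plus basic error
       handling; rendezvous name, transaction name, and URL are hypothetical. */
    Action()
    {
        /* All Vusers wait here until the Controller releases them together,
           producing a synchronized spike of load on the server. */
        lr_rendezvous("checkout_spike");

        lr_start_transaction("Checkout");
        web_url("Checkout",
                "URL=https://app.example.com/checkout",
                "Resource=0",
                "Mode=HTML",
                LAST);

        /* Fail the transaction and end the iteration on an HTTP error. */
        if (web_get_int_property(HTTP_INFO_RETURN_CODE) >= 400) {
            lr_error_message("Checkout returned HTTP %d",
                             web_get_int_property(HTTP_INFO_RETURN_CODE));
            lr_end_transaction("Checkout", LR_FAIL);
            lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
        }
        lr_end_transaction("Checkout", LR_PASS);

        return 0;
    }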

Confidential, Inwood, New York

Performance Tester

Environment: Oracle 10g, Load Runner 11.0, Quality Center 10.0, MS Office, MS Excel, Windows Server 2008, Java, J2EE, VBScript, JavaScript, WebSphere, AIX, SiteScope, CA Wily IntroScope, Performance Center, JIRA, SAP R/3 4.7 and SAP ECC 6.0.

Responsibilities:

  • Gathered all the performance test requirements from various stakeholders and other business personnel.
  • Collaborate with business analysts, system engineers, network, development, and other teams to understand and validate configuration settings to ensure optimal product performance, scalability and availability for production environments.
  • Understood the application architecture and the scope of performance testing.
  • Allocated work amongst resources both at offshore & onsite.
  • Developed and updated test cases in Quality Center.
  • Tested enhancements and bug fixes for software releases and patches and documented findings using Quality Center.
  • Used the CA Wily Introscope and Precise monitoring tools to monitor server resource utilization.
  • Used the SiteScope monitoring tool to monitor server resource utilization.
  • Responsible for definition, documentation, and execution of all performance test plans necessary to ensure the release of a fully validated system.
  • Performed all activities in the performance testing life cycle.
  • Responsible for designing and implementing performance tests for highly scalable web-based systems.
  • Used JIRA to track, report, and manage defects throughout the test cycle and attended Defect Status Meetings on a daily basis during the testing cycle.
  • Worked with an ETL tool to extract data from Oracle, ASCII files, XML and MS Excel.
  • Worked closely with product management and development engineers to conduct performance, Load and stress tests.
  • Created performance scripts & scenarios using LoadRunner.
  • Developed VuGen scripts, used manual and automatic correlation techniques, set up run-time settings, created scenarios, and analyzed the results to find bottlenecks and root causes using HP LoadRunner (a data-driven scripting sketch appears after this list).
  • Involved in tuning application to improve response times, queues and overall performance using LoadRunner.
  • Created and executed SQL query scripts for Database Testing.
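
As an illustration of the data-driven VuGen (C) scripting mentioned above, the sketch below assumes a file parameter named {AccountId} defined in the script's parameter list; the parameter name and URL are hypothetical.

    /* Illustrative data-driven VuGen (C) action. {AccountId} is assumed to be
       a file parameter defined in the script's parameter list; the parameter
       name and URL are hypothetical. */
    Action()
    {
        /* Each iteration pulls the next row from the parameter file, so the
           load test exercises many records instead of one cached one. */
        lr_think_time(5);

        lr_start_transaction("View_Account");
        web_url("View_Account",
                "URL=https://app.example.com/accounts/{AccountId}",
                "Resource=0",
                "Mode=HTML",
                LAST);
        lr_end_transaction("View_Account", LR_AUTO);

        lr_output_message("Fetched account %s", lr_eval_string("{AccountId}"));
        return 0;
    }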

Confidential, Hartford, CT

QA Tester

Environment: QuickTest Pro, Load Runner, Quality Center, Oracle, SQL Server, Visual Basic, Windows 7/XP/Vista, HTML, Internet Explorer.

Responsibilities:

  • Generated test scenarios and procedures based on architecture, use cases, requirements and documentation working closely with Test Lead and System Engineer.
  • Performed Service oriented architecture (SOA) testing and Web Services testing using SOAPUI.
  • Identified system defects and effectively documented and communicated defects to testing lead.
  • Responsible for the development of test data to be used in performing the required tests.
  • Used Quality Center for test scenarios as per the functionality and business requirements specified in the use cases and design specifications.
  • Performed complex functional, integration, and regression tests on multiple software products / product areas to validate links, objects, images and text on Web pages using QTP.
  • Created and performed maintenance of object repository using QTP for functional, sanity, and regression testing.
  • Created, enhanced and maintained high-end test scripts for various functional and regression tests using QTP.
  • Generated automated Test Scripts in VBScript using HP QuickTest Professional.
  • Developed QuickTest Pro VBScript test scripts in programming mode; administered and maintained the scripts.
  • Created Configuration folders on the Server for QTP Environment Setup.
  • Created Libraries, Test Scripts, and Database Connectivity Scripts using QTP, and used different types of actions like reusable actions and external actions within the scripts.
  • Configured settings, record and run settings, options and web event record configurations.
  • Maintained and administered the objects in a common repository.
  • Created automation test framework using QTP and maintained Test Scripts.
  • Created connections between Quality Center and QuickTest Pro.
  • Created automated test labs in Quality Center and selected scripts from QuickTest Pro.
  • Ran QTP automated scripts every night and stored the results in Quality Center.
  • Developed some of the automated test scripts using Descriptive Programming techniques.
  • Generated data-driven scripts that access the back-end database.
  • Created various Scenarios to be run, used rendezvous points to create intense Virtual User (Vuser) load on the server, and configured the scenarios before executing them in LoadRunner.
  • Developed automated test scripts for load, performance, and stress testing using LoadRunner (a web-service load-script sketch appears after this list).
  • Created and executed SQL query scripts for Database Testing.
  • Provided daily status to Test Lead.
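
For illustration, a minimal VuGen (C) sketch of driving a SOAP web service under load, shown as one way to script the web-service testing mentioned above (the resume names SOAPUI for that work); the endpoint, SOAPAction, and payload are hypothetical.

    /* Illustrative VuGen (C) sketch that posts a SOAP envelope with
       web_custom_request to put a web service under load; the endpoint,
       SOAPAction, and payload are hypothetical. */
    Action()
    {
        web_add_header("Content-Type", "text/xml; charset=utf-8");
        web_add_header("SOAPAction", "\"urn:GetQuote\"");

        lr_start_transaction("GetQuote_SOAP");
        web_custom_request("GetQuote_SOAP",
            "URL=https://services.example.com/quote",
            "Method=POST",
            "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soapenv:Body><GetQuote><symbol>ABC</symbol></GetQuote></soapenv:Body>"
                 "</soapenv:Envelope>",
            LAST);
        lr_end_transaction("GetQuote_SOAP", LR_AUTO);

        return 0;
    }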

Confidential, New York, NY

Test Engineer

Environment: QuickTest Pro, Load Runner, SQL Server, Java, Visual Basic, JavaScript, VBScript, XML, ASP, Windows XP, Visio, HTML, Internet Explorer.

Responsibilities:

  • Designed and planned test tasks to ensure products developed by the development team met the company's quality standards.
  • Accountable for the transformation of business needs into requirements artifacts such as specifications, conceptual designs, and detailed requirement specifications from which applications and solutions were developed.
  • Documented the functional and data requirements based upon the business process description to ensure the final technology deliverables coincided with the context of the business process.
  • Led and contributed to the analysis of test data requirements, identification and preparation of test data.
  • Produced detailed metrics to track and manage testing progress.
  • Planned, created, modified and executed performance test scripts as needed.
  • Created various Scenarios to be run, used rendezvous points to create intense Virtual User (Vuser) load on the server, and configured the scenarios before executing them in LoadRunner.
  • Assisted the team in design and implementation of an automated functional/regression test architecture.
  • Helped and mentored other testers in testing and Best Practices.
  • Designed, defined and documented unit and application test plans.
  • Transformed test plans into test scripts and executed those scripts.
  • Involved in test optimization through systemic improvements in process and quality.
  • Worked with operations, DBAs and developers to help solve performance issues in the application, network, and environment.
  • Used TestDirector to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.
  • Created and performed maintenance of object repository using QTP for functional, sanity, and regression testing.
  • Developed SQL Queries and procedures to perform database testing.
  • Developed automation scripts using VB script.
  • Basic understanding of MS Visual Studio Test Professional and MS Team Foundation Server.
