
Sr. Performance Test Engineer / Quality Analyst Resume


Pittsburgh, PA

SUMMARY

  • US Citizen with over six years (6+) of diverse information technology experience, with emphasis on performance testing and software QA analysis for web-based and client/server applications
  • Experienced in successful completion of software projects by executing on software quality activities throughout the software development life cycle
  • Expert in test plan development, test case development, test case execution, test results analysis, and defect reporting
  • Experienced with testing at different levels (smoke, functional, regression, integration, GUI, system, mobile, and performance testing)
  • Extensive experience in developing and supporting test automation using HP LoadRunner, Apache JMeter, and HP Quick Test Pro (QTP)
  • Expert in performance, stress, and volume testing using LoadRunner and JMeter
  • Experience in analyzing application performance requirements and designing performance and load test scenarios, test environments, test data, etc. Extensive experience with different LoadRunner protocols: Web (HTTP/HTML), Web Services, TruClient AJAX, TruClient Web/Mobile, Oracle NCA, and Citrix
  • Hands-on experience in using automated tools like Performance Center and test management using HP Quality Center.
  • Experienced with diagnostic tools such as Dynatrace, SiteScope, and HP Diagnostics to identify root causes. Hands-on experience with performance testing JVM-based applications.
  • Experienced with designing scenarios in Controller for performing load, stress, endurance, and standalone tests along with analyzing results and reporting.
  • Working knowledge of cloud testing using BlazeMeter with JMeter, covering REST, web/HTTP services, JSON, messaging, real-time, and authentication/authorization
  • Experienced with iOS and Android mobile performance testing using the TruClient Web/Mobile and C Vuser protocols, and with DeviceAnywhere, mobiReady, and SeeTest mobile test automation tools
  • Knowledge of automated build tools and continuous integration (Maven, Jenkins)
  • Experience in VuGen scripting using C, VB, Java, and JavaScript; experienced in writing complex SQL queries for back-end testing in Oracle and MS SQL Server
  • Excellent knowledge of and experience with cloud testing tools such as AWS, SOASTA CloudTest, and BlazeMeter
  • Experienced in Relational Database Management Systems and back-end database testing.
  • A motivated self-starter with exceptional team building, leadership, project management and interpersonal skills

TECHNICAL SKILLS

Test Tools: HP LoadRunner, Performance Center/ALM, HP Quality Center, JMeter, BlazeMeter, Quick Test Pro (QTP), Fiddler, Dynatrace, JIRA, PuTTY, DeviceAnywhere, and mobiReady

Databases: MySQL, MS Access, Oracle, SQL Server, and DB2

Web Performance: Fiddler, Webpagetest, PageSpeed Insights and HttpWatch

APM tools/Profiling: Dynatrace, SiteScope, HP Diagnostics, Site24x7, Standard Set

Languages: C, UNIX, SQL, XML, WSDL, HTML, CSS, SOAP/REST, JSON, Java

Server: WebLogic, WebSphere, Tomcat, Apache, JBoss, and FTP

PROFESSIONAL EXPERIENCE

Confidential, Pittsburgh, PA

Sr. Performance Test Engineer / Quality Analyst

Responsibilities:

  • Work closely with the product development team and participate in daily scrums and sprint planning with the agile team
  • Design, develop and maintain test framework which measures and reports on the performance of the application suite.
  • Prepare test plan based on understanding of Business requirements and the design of system architecture
  • Perform Functional, Integration, End-to-End, Regression, Performance and UAT testing
  • Develop performance test strategies and design/define performance test scenarios in LoadRunner 12.01/12.50 and Performance Center 12.01/12.50
  • Extensively used HP LoadRunner for developing Vuser scripts in enterprise applications
  • Expert in using various protocols, for example Web HTTP/HTML, TruClient Web, TruClient Mobile, TruClient AJAX, and Web Services
  • Enhanced Vuser scripts with parameterization, correlation, rendezvous points, and transactions
  • Develop custom C functions for string manipulation using variables, conditional statements, loops, and LR functions
  • Managed performance test scripts in Performance Center by uploading, executing, and analyzing them
  • Performed SOA testing by creating SOAP requests using LoadRunner and JMeter
  • Configured scenarios and set up monitors to capture the performance of the application servers, web servers, and database servers using Performance Center
  • Identify potential bottleneck issues per defined process, escalate the risks, and mitigate the issues
  • Monitored online graphs such as transactions per second, throughput, and client-side response time; after test completion, analyzed results and reviewed code to identify any code errors
  • Analyzed various graphs generated including Database Monitors, Network Monitor graphs, User Graphs, Error Graphs, Transaction graphs and Web Server Resource Graphs
  • Extensively experienced in Dynatrace PurePath technology; monitored response times in the PurePath dashboard across different servers (web, database, and application) for real-time performance
  • Mentor, coach, and guide team members (2 offshore and 1 onsite)
  • Performed baseline, load, and high-volume user tests using JMeter and monitored the system's performance during the load tests
  • Tested functionality and performance with SoapUI, verifying the web services; worked with the HTTP monitor HttpWatch
  • Communicated and collaborated closely with developers, business analysts, and internal stakeholders to provide guidance toward resolving issues

Environment: LoadRunner, JMeter/BlazeMeter, Application Lifecycle Management (ALM), Performance Center, Dynatrace, TruClient Mobile protocol, Quality Center, J2EE, WebSphere, SQL, VMware, Apache, Tomcat.

Confidential, East Lansing, MI

Performance Tester / FunctionalQA Analyst

Responsibilities:

  • Actively participated in agile scrum software development environment including daily scrum, and sprint planning
  • Developed a standard performance test plan and communicated with the Homebrew team members to discuss test reports and follow-up stories in the JIRA/Agile backlog
  • Executed load, volume, and performance tests for a Java-based platform
  • Built performance tests for web and cloud-based applications on Linux
  • Monitored application and multi-web-server metrics and analyzed PerfMon metrics
  • Analyzed memory load, CPU, threads, response codes, and network I/O load to triage performance bottleneck issues, using commands such as top, PerfMon, wget, sar, and vmstat
  • Used FTP program to upload files for offline testing and viewed server log files
  • Ran SQL query performance tests with JMeter using the JDBC protocol under a given load, captured the impact of performance issues, and shared results
  • Developed embedded scripts using Badboy and integrated them with JMeter
  • Participated in the development of the project plans by outlining QA tasks, deliverables,deadlines, and time estimates
  • Develop and maintain Key Performance Indicators (KPIs) for the various applications, including response time, failover, time to failure, and recovery
  • Worked extensively with the Web/HTTP, Mobile TruClient, and Web Services/SOAP protocols and developed scripts using HP LoadRunner
  • Built test automation for UI and WCF/REST services and created reusable, shareable components using JMeter on the Linux platform
  • Assisted QA performance team members in developing scripts with JMeter and running tests in the BlazeMeter cloud
  • Executed different scenarios using JMeter samplers, controllers, and listeners, such as benchmarking, increasing load, and stress
  • Performed application server testing using the DynaTrace Ajax Edition and analyzed results through the HotSpot, PurePath, and Timeline views
  • Installed and configured AppDynamics and monitored business transactions and response times

Environment: LoadRunner, Spring Boot, JMeter, J2EE, DynaTrace, MySQL, Tomcat, WebLogic, ActiveMQ, JIRA, iOS 7, Mobile Safari, VMware

Confidential, Minneapolis, MN

Performance Test Engineer& Functional Tester

Responsibilities:

  • Performed smoke, functional, systems integration, regression, database, and performance testing at various phases of the development and test cycles
  • Extensively worked with the Web, Mobile/TruClient, and Web Services (SOAP) protocols in LoadRunner to simulate virtual users for load, stress, and volume testing
  • Configured the LoadRunner Controller and load generators and executed performance tests for multiple cycles of test scripts
  • Uploaded scripts, created timeslots, created scenarios, maintained scripts, and ran the load tests in Performance Center
  • Analyzed test results via response time, transactions per second, and throughput graphs
  • Monitored response time and TPS/throughput under load through a LAN connection for mobile application performance
  • Profiled the application with tools such as VisualVM and JConsole
  • Used Wily Introscope to monitor J2EE applications and tracked resource metrics to find performance bottlenecks
  • Designed and developed automated test scripts and executed them to perform verification tests on the application using Quick Test Pro
  • Developed complex SQL queries to perform back-end testing in the MS SQL Server RDBMS
  • Worked closely with developers, QA engineers, network engineers, and management team

Environment: LoadRunner, Performance Center/ALM, JMeter, J2EE, UNIX, WebLogic, WebSphere, Oracle, QC, SoapUI, VMware

Confidential, VA

Software QA Analyst

Responsibilities:

  • Analyzed the Functional Requirements and Design Specification documents to ensure that the system met all of the technical and business requirements of the applications
  • Manually generated and implemented templates for test plans, test cases, and test scripts, and performed manual testing on the entire application
  • Performed Integration, System, User Acceptance, Functional, and Regression testing
  • Used Quality Center to develop test cases and test scripts and to execute the scripts
  • Worked closely with software developers, business analysts, system administrators, and other project management personnel involved in the Software Development Life Cycle (SDLC)
  • Performed back-end testing by verifying the data in the Oracle database
  • Participated in team meetings with representatives from Development, Database Management, Configuration Management, and Requirements Management to identify and correct defects
