
Sr Performance Engineer Resume

NYC

SUMMARY:

  • Over nine years of diverse experience in Performance and QA Automation Testing
  • Expert in performance testing using LoadRunner and JMeter
  • Experience working in Windows and UNIX environments
  • Experience in monitoring servers using tools like SiteScope, Wily Introscope, Dynatrace, TeamQuest, and Tivoli Performance Viewer.
  • Analyze the CPU Utilization, Memory usage, Garbage Collection and DB connections to verify the performance of the applications.
  • Develop the Test Recommendations for each test pass and Test Results.
  • Analyze the network connections between the servers
  • Execute load tests and work with Product Management and Development to determine the number of virtual users to use during the performance test.
  • Coordinate web application performance testing pre-tasks: ensure performance test requirements are received and develop product-specific performance test plans.
  • Supervise script recording, ensuring correctness and quality with an understanding of monitoring requirements based on test objectives and test execution tasks. Coordinate test windows and lab utilization.
  • Secure technical support for monitoring of infrastructure and to qualify observations.
  • Work with other technical team members (Architects, DBA) to support the test execution to ensure correct environment configuration just prior to execution.
  • Execute performance / load / stress and other non-functional tests. Monitor application logs to determine system behavior. Address all technical issues, facilitate the resolution and necessary follow up with PM, IS/IT, Development, and other cross-functional departments.
  • Wrote manual test cases in Jira.
  • Successfully created UFT/QTP automated test scripts for smoke and regression testing within the scheduled time
  • Performed system and regression testing.
  • Prepare and review test cases and test scripts to test the application, ensuring compliance with requirements, and manage action items
  • Logged and tracked defects using Quality Center.
  • Performed testing including functional, user acceptance, integration, system, exception, and compatibility testing of client/server and web applications.
  • Developed and executed automated test scripts by using VBScripts, Descriptive Programming and Function Libraries in QTP for DAO and IRA applications
  • Conducted backend testing by querying databases to synchronize testing databases and checked for data integrity and proper routing based on work flow rules at each step.
  • Developed automated QTP test scripts for functional, regression and GUI testing
  • Worked extensively in a very fast-paced Agile environment.

TECHNICAL SKILLS:

Domain knowledge: Banking, Accounting, Communications, Insurance

Web Technologies: HTML, XML, XHTML.

Languages: SQL, C, C++, Java, Shell, Python.

RDBMS: MS Access, SQL Server 2014, MySQL

Methodologies: RUP, CMMI

Operating Systems: Windows Server 2003/2008/2012, Windows XP, UNIX, macOS

Networking: TCP/IP, UDP, HTTP, FTP, FTPS, SFTP.

Automation Tools: LoadRunner 12.50, JMeter 5.2, QTP, Tosca, Performance Center, VuGen, BlazeMeter, Controller

Defect Tracking Tools: TestDirector, ClearQuest, Quality Center, JIRA

Monitoring Tools: TWS, SiteScope, Wily Introscope, Dynatrace, Tivoli

PROFESSIONAL EXPERIENCE:

Sr Performance Engineer

Confidential, NYC

Responsibilities:

  • Worked with the business to gather non-functional performance requirements and reviewed them for missing information that would delay testing.
  • Worked with the Functional test team to understand the application and to identify the Performance test cases.
  • Created scripts for a Java-built application in HTTP, TruClient, and Web Services protocols using LoadRunner VuGen.
  • Worked on complex web services that required unique types of verification.
  • Validated the documents created on the file server using SoapUI with the final token ID generated.
  • Enhanced the scripts with parameterization, correlation, and other functions to emulate real-time load conditions using VuGen.
  • Created performance test scenarios in the LoadRunner Controller, JMeter, and BlazeMeter using techniques such as Schedule by Scenario, Schedule by Group, Ramp Up, and Ramp Down for the planned workload.
  • Worked with performance test concepts such as rendezvous points, pacing, and run-time settings.
  • Executed load tests to find the system's performance under load and to identify performance bottlenecks.
  • Monitored application and server utilization, response time, throughput, and hits per second using Dynatrace to understand server behavior under load.
  • Analyzed the test results with LoadRunner Analysis and Wily Introscope to find bottlenecks.
  • Prepared the final test report with the test results, testing approach, observations, and recommendations.

Environment: HP LoadRunner, JMeter, BlazeMeter, Kafka, AWS, VuGen, Controller, Analysis, Performance Center, Web HTTP/HTML, ANSI C, XML, SOAP, JSON, REST, SoapUI, Postman, Web Services, QC, CA Wily, IBM MQ, Dynatrace, Load Balancer, Web Page Diagnostics, Windows.

Sr. Performance Engineer

Confidential, NC

Responsibilities:

  • Planned, designed, implemented, and executed stress/load/performance testing efforts. Analyzed system usage by reviewing user profiles, transaction profiles, and system resource usage diagrams, and designed performance, stress, and endurance tests.
  • Created virtual users in VuGen and configured scenarios to meet the load testing requirements in Performance Center.
  • Extensively used Web (HTTP/HTML) and Web Services protocols in LoadRunner.
  • Configured UNIX and database resources for performance monitoring in Performance Center.
  • Automated the major functionalities of the application and the reported bug scenarios using JMeter
  • Developed and deployed test Load scripts to do end to end performance testing using LoadRunner.
  • Developed test cases for the application after analyzing the Business Requirements in Quality Center
  • Assisted in Test plan development
  • Extensively used Quality Center for reporting, tracking bugs and for document control
  • Developed baseline scripts for future regression testing of the application using Quick Test Pro
  • Script enhancements and Data Driven Testing were implemented with UFT QTP
  • Logged results and Defects using Quality Center
  • Responsible for executing the test cases and test automation scripts as per the plan for the application testing
  • Conducted Functional and Regression testing
  • Coordinated with business users for the User Acceptance Testing
  • Monitored Quality Center to track defects
  • Retested bug fixes and closed them when the tests passed.
  • Developed test scripts for Functional, Regression and data driven tests using UFT QTP
  • Developed test cases for the ratings & Quote of the different insurances provided by the company.
  • Interacted with developers, discussing the spec provided by the Analysts and also involved in identifying the changes and the discrepancies in the application
  • Took part in implementation meetings with the project team before migrating the release into production.
  • Worked closely with software developers to isolate, track, debug, and troubleshoot defects.
  • Performed endurance tests by executing tests over long durations to find any memory leaks or slow resource-consumption problems.
  • Researched on the usability of JMeter for the Products Performance Testing
  • Designed the manual goal-oriented (performance model) tests as per the load distribution diagram (uneven distribution of load) for each of the normal, average, and heavy volume load tests.
  • Monitored executions of the test scripts using Performance center and monitored the online graphs results.
  • Performed the testing in step-wise manner using Performance Center.
  • Worked with numerous online monitoring graphs/monitors in LoadRunner to see performance issues and identify bottleneck areas.

Environment: LoadRunner, JMeter, Controller, Performance Center, ALM Quality Center, JIRA, QTP, Tosca, Dynatrace, .NET, Java, Mainframes, HTTP, Windows

Sr. Performance Engineer (Lead)

Confidential, Long Island City, NY

Responsibilities:

  • Worked according to the activities laid down in each phase of the software development life cycle and coordinated the environment setup for testing.
  • Led the onsite and offshore performance team.
  • Met with client groups to determine performance requirements and goals, and determined test strategies based on the requirements and architecture.
  • Identified bottlenecks by isolating the repetitive actions.
  • Developed detailed manual test cases and scenarios.
  • Studied Requirements and designed manual test cases accordingly.
  • Created GUI and database Vuser scripts to simulate client activities and performed load, stress, and performance tests using LoadRunner/Performance Center
  • Used QTP to develop scripts to perform functionality and GUI testing
  • Researched the usability of JMeter for the products' performance testing
  • Automated the major functionalities of the application and the reported bug scenarios using JMeter
  • Produced documentation for performance team on set up of Jmeter test environment, and assisted with research on distributed testing best practices
  • Inserted rendezvous points in order to simulate heavy loads for conducting Load Testing.
  • Used ramp-up and ramp-down to simulate real-time scenarios.
  • Analyzed the results using LoadRunner online monitors and graphs to identify bottlenecks in the server resources.
  • Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application. Vertical scaling and garbage-collection tuning were performed. Communicated with the vendor to resolve issues.
  • Confirmed the scalability of the new servers and application under test after the architecture redesign.
  • Conducted weekly meetings with Project Head, Business and development teams.
  • Executed the scenarios, analyzed the graphs, and coordinated with the DBAs and network admins to ensure optimum performance.
  • Understood the testing effort by analyzing the project requirements
  • Organized the testing kick-off meeting
  • Built a testing team of professionals with appropriate skills, attitudes, and motivation
  • Identified requirements and forwarded them to the project manager (technical and soft skills)
  • Arranged hardware and software requirements for the test setup
  • Communicated with the client (if required)
  • Reviewed various reports prepared by test engineers
  • Ensured the timely delivery of the different testing milestones
  • Prepared/updated the metrics dashboard at the end of each phase and at project completion.

Environment: LoadRunner 12.1, JMeter, Eggplant, Quality Center, Performance Center, Wily Introscope, HP Diagnostics, ClearQuest, ClearCase, J2EE Analysis, Oracle 11g, MS Office, MS Access, MS Visio, MS Project

Sr. Performance Engineer

Confidential, Rancho Cordova, CA

Responsibilities:

  • Analyzed the Business and Technical requirements and developed performance Test approach.
  • Created Performance Work load model to create scenarios for various tests.
  • Participated in design, Use case, Test Approach and Result reviews.
  • Involved in defect tracking and customized reports on defects using Quality Center.
  • Created single- and multi-protocol scripts using Web (HTTP/HTML), Web Services, and Tuxedo recording protocols.
  • Parameterized and correlated unique and dynamic content in the VuGen scripts to emulate real-time scenarios.
  • Created test data by inserting C statements to save the data to parameters and write it to an output file.
  • Customized the LoadRunner VuGen scripts in C to meet the business requirements.
  • Manually tested the application after every weekly code deployment into the Pre-prod environment and wrote defects in Quality Center.
  • Configured Load generator machines by setting up the environment variables, installing client software and creating configuration files.
  • Created Load test scenarios in Performance Center and executed the tests.
  • Used LoadRunner Analysis to analyze the results and prepare customized reports.
  • Monitored the CPU utilization and memory heap of the application servers, middleware servers, database servers, and web servers using Mercury SiteScope and Diagnostics.
  • Monitored the Server requests, GC, JDBC Connections using Diagnostics and Introscope.
  • Analyzed various graphs generated by LoadRunner Analysis, including database monitors, network monitor graphs, user graphs, error graphs, transaction graphs, and web server resource graphs.
  • Created Results report from the Analysis data and monitoring data of various Performance Metrics from the Monitoring tools.
  • Co-ordinated with the offshore team in test execution and results reporting.

Environment: Java, J2EE, C, LoadRunner 12.02, QC 11, JIRA, SQL, VBScript, Windows Server 2012, Windows 7, CA Wily Introscope, Dynatrace, UNIX, Hybris, Endeca, SoapUI, VuGen, Web Services, VMware, HP Diagnostics, HP Analysis

Performance Engineer

Confidential, Mclean, VA

Responsibilities:

  • Monitored different graphs such as transaction response time, and analyzed server performance status, hits per second, throughput, Windows resources, and database server resources.
  • Found performance degradation issues such as out-of-memory errors and memory leaks, and improved thread pool utilization, JDBC connection pool sizing, and transaction rollback handling.
  • Analyzed Load pattern and created test scenarios to emulate the real life stress conditions.
  • Created Test Metrics, Bug Database and generated weekly reports for senior management.
  • Conducted meetings with developers, the application team, and the business team to analyze the defects and evaluate the test executions.
  • Involved in the decision making with the management for final applications releases.
  • Interacted with the Business Analyst and application teams to discuss the performance requirements and load test strategy.
  • Developed the performance Test Plans and Load Test Strategies.
  • Developed Vuser scripts using Web (HTTP/HTML), Ajax (Click and Script), Web Services, Microsoft .Net, ODBC, Oracle NCA, PeopleSoft Enterprise protocols.
  • Created customized LoadRunner VuGen scripts at API level with manual correlation, user defined functions, development libraries (classes and methods), and error handling.
  • Enhanced Vuser scripts by adding correlations, parameters, condition controls, and checking/validation functions.
  • Used SiteScope and Introscope to monitor the databases, application servers, and web servers (at the OS and application level) for performance bottlenecks while conducting load, stress, volume, and memory tests.

Environment: HP LoadRunner, SiteScope, HP Performance Center, Quality Center, PeopleSoft ERP Systems, ClearQuest, ClearCase, WinSCP, TWS, Web Services, Java, IHS, IIS, DB2, SQL Server, WinSQL, Windows XP, WebSphere, ITM, AIX, UNIX, Wily Introscope.

Performance Engineer

Confidential, Erie, PA

Responsibilities:

  • Used SiteScope and Introscope to monitor the databases, application servers, and web servers (at the OS and application level) for performance bottlenecks while conducting load, stress, volume, and memory tests.
  • Assisted QA Manager in coordinating/leading the testing efforts.
  • Gathered business requirement, studied the application and collected the information from Analysts.
  • Created LoadRunner Scenarios and Scheduled the Virtual user to generate realistic load on the server using LoadRunner.
  • Created and implemented Performance tests using Mercury Interactive LoadRunner.
  • Involved in developing the Test Plan Strategy, build the test client and test environment.
  • Enhanced the scripts using VuGen and performed parameterization and correlation to meet the requirements.
  • Wrote test plans, test scenarios, and test scripts following CMM Level 2 standards.
  • As an automation engineer, carried out testing to check whether the application was functioning as per the design documentation and functional requirements.
  • Involved in performing load and stress tests on the application and server by configuring LoadRunner to simulate hundreds of virtual users, and provided key metrics to management.
  • Configured and used SiteScope Performance Monitor to monitor and analyze the performance of the server by generating various reports from CPU utilization, Memory Usage to load average etc.
  • Conducted all tests in the Controller by creating 100, 200, 400 virtual users for load, stress and steady state test respectively.
  • Performed Data Driven and Security Testing.
  • Involved in conducting stress tests and volume tests against the application using Load Runner.
  • Helped resident DBAs identify and resolve bottlenecks.
  • Wrote and executed UNIX shell scripts to check the output.
  • Used Test Director to invoke the scripts, initially performed baseline testing, organized all the scripts systematically, and generated reports.
  • Extensively used Test Director for test planning, maintain test cases and test scripts for test execution as well as bug reporting.
  • Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.

Environment: Windows NT Server, SQL, WebLogic, IIS, C#, Java, WinRunner, LoadRunner, JMeter, SiteScope, Test Director.
