
Sr. Performance Test Engineer Resume


Boston, MA

SUMMARY:

  • Competent IT professional with over 9 years of experience as a Sr. Performance Test Engineer, with extensive knowledge of test strategies and quality assurance schedule management. Communicates well with clients throughout the testing process
  • Worked on deliverables under both the Waterfall and Agile methodologies of the SDLC
  • Broad experience in the software testing lifecycle and defect lifecycle, including test data analysis and setup
  • Understands the importance of client-facing roles; conceptually strong, with an innovative and analytical approach and an understanding of the place of creativity in the testing and mitigation process
  • Good verbal and written communication skills, coupled with good presentation skills
  • Good team member with an understanding of wrap-around development and testing processes
  • Self-motivated and goal-oriented with a high degree of flexibility, creativity, commitment and optimism
  • Have worked on various Windows and UNIX platforms, with good knowledge of defect tracking using tools such as Micro Focus Quality Center and JIRA
  • Basic knowledge of Perfecto and Eggplant
  • Providing performance testing, analysis, and management expertise to execute one of the largest and most complex projects for a leading US telecom provider
  • Good knowledge of automated test tools: Micro Focus LoadRunner 12.55 (Web HTTP/HTML, Java Vuser, SOAP Web Services, Ajax/TruClient), Micro Focus Performance Center 12.55, Quality Center, Dynatrace 6.5, and CA Wily Introscope 10.1
  • Involved in performance tuning of the JVM by analyzing garbage collection metrics; used JConsole to capture and analyze heap dumps and thread dumps (see the monitoring sketch after this list)
  • Experience configuring JVM settings, web container thread pools, and connection pools in the WAS Administration Console
  • Experience with performance monitors to analyze system bottlenecks, GC heap size, CPU utilization, thread details, pool size, and session details
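A minimal sketch of the JVM-side monitoring described above, using only the standard java.lang.management API to read the same vitals JConsole exposes (heap usage, GC activity, thread counts). The class name and output format are illustrative, not from any specific engagement:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;
    import java.lang.management.ThreadMXBean;

    // Illustrative monitor: prints heap usage, per-collector GC activity,
    // and live thread counts for the current JVM.
    public class JvmVitals {
        public static void main(String[] args) {
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            MemoryUsage heap = memory.getHeapMemoryUsage();
            // Note: getMax() may return -1 if no maximum is defined.
            System.out.printf("Heap used: %d MB of %d MB max%n",
                    heap.getUsed() >> 20, heap.getMax() >> 20);

            // One line per garbage collector: run count and total pause time.
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.printf("GC %s: %d collections, %d ms%n",
                        gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
            }

            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            System.out.printf("Live threads: %d (peak %d)%n",
                    threads.getThreadCount(), threads.getPeakThreadCount());
        }
    }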

TECHNICAL SKILLS:

Operating Systems: Windows XP, UNIX

Languages: SQL, C, C++, Java, UNIX shell scripting

Databases: Oracle 11g, MySQL, Mainframe

Testing Tools: Micro Focus LoadRunner 12.55, SOASTA, Micro Focus Quality Center 12.60, JMeter, CA Wily Introscope 10.1, Dynatrace, Fiddler, JConsole, Micro Focus Performance Center 12.60, Nmon, WebLogic Admin Console, JIRA, Application Performance Analyzer, HP UFT 14.03

Web Technologies: HTML, XML, REST APIs, SoapUI, WSDL, and messaging queue APIs

PROFESSIONAL EXPERIENCE:

Confidential, Boston, MA

Sr. Performance Test Engineer

Responsibilities:

  • Initially worked on a POC covering performance testing of one distributed and one mainframe application
  • Since this was a POC, first had to document the performance testing procedures and practices
  • Currently working on performance testing delivery for multiple distributed and mainframe applications
  • As the onshore lead, interact with the client and gather application requirements in order to proceed with performance testing
  • Allocate projects to team members, follow up for regular updates, help them if needed, and review test plans once done
  • Work closely with functional teams on environment availability and any other issues blocking performance test execution
  • Review team members' LoadRunner scripts to make sure proper coding standards are followed and provide necessary comments (see the Java Vuser sketch after this list)
  • Plan test executions such as baseline, high availability, and latency tests, depending on the application
  • During test execution, monitor the servers in Dynatrace for memory, CPU utilization, and GC heap as part of distributed application testing
  • For mainframe test executions, analyze SMF/RMF reports and, if necessary, APA reports
  • Prepare the performance test report from Analysis, present it to the client, and recommend performance tuning if needed
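A minimal sketch of the standard reviewed for in those scripts, written as a LoadRunner Java Vuser Actions class using the lrapi.lr API; the transaction name and think time are illustrative:

    import lrapi.lr;

    public class Actions {

        public int init() throws Throwable {
            return 0; // one-time setup (logins, connections) goes here
        }

        public int action() throws Throwable {
            // Reviewed-for standard: each business step wrapped in a named
            // transaction with an explicit pass/fail status.
            lr.start_transaction("T01_Login");
            // ... protocol-specific request code goes here ...
            lr.end_transaction("T01_Login", lr.PASS);

            lr.think_time(3); // pacing between iterations
            return 0;
        }

        public int end() throws Throwable {
            return 0; // one-time teardown goes here
        }
    }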

Technology: Micro Focus LoadRunner v12.55 (Controller, with Analysis for report generation), Micro Focus Performance Center v12.60, IE Developer Toolbar, Dynatrace v6.5, Application Performance Analyzer, mainframe applications

Confidential, Pittsburgh, PA

Performance Test Lead

Responsibilities:

  • Gathering non-functional testing requirements from project teams across the retail area
  • Analyze requirements, prioritize release items based on timelines provided by the teams, and proceed with planning/delivery
  • Allocate finalized projects to team members, follow up with the appropriate teams for workflow documents, system designs, and data requirements, and prepare the performance test plan accordingly
  • Work closely with functional teams on environment availability, then start scripting in Micro Focus LoadRunner v12.55 with protocols such as Web HTTP/HTML, Web Services, and REST API requests (see the timed REST-call sketch after this list)
  • Plan test executions such as baseline, high availability, and latency tests, depending on the application
  • During test execution, monitor the servers in Dynatrace for memory, CPU utilization, and GC heap
  • Prepare the performance test report from Analysis, present it to the project team, and recommend performance tuning if needed
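Not a LoadRunner script itself, but a minimal Java sketch of the kind of timed REST call such scripts exercise, using the standard java.net.http client; the endpoint URL is a hypothetical placeholder:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Illustrative timed GET, measuring the response time a LoadRunner
    // transaction around the same request would report.
    public class TimedRestCall {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.com/api/orders/123")) // hypothetical endpoint
                    .GET()
                    .build();

            long start = System.nanoTime();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            System.out.printf("HTTP %d in %d ms%n", response.statusCode(), elapsedMs);
        }
    }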

Technology: Micro Focus LoadRunner v12.55, Micro Focus Performance Center v12.60, SoapUI, Fiddler, IE Developer Toolbar, Dynatrace v6.5

Confidential, Chicago, IL

Team Lead

Responsibilities:

  • Interacting with clients for walkthroughs, analysis, and finalization of functional and non-functional specifications
  • Verified the reliability and necessary requirements of the system by coordinating with senior staff
  • Worked closely with development teams to investigate and correct software bugs in client/server and web applications
  • Conducted case/system/process studies for project tracking and managed team members working on requirement mapping, scenario design, etc.
  • Integrated individual software modules into groups to perform integration testing
  • Executed end-to-end testing for each module, component, and subsystem of the application
  • Tested, troubleshot, and debugged applications and provided post-deployment support to the client
  • Trained new joiners on the project and mentored them to work independently and produce quality deliveries on their own

Technology: Selenium framework, Micro Focus LoadRunner, SoapUI web service requests, SQL, Java, Micro Focus ALM Quality Center, CA Wily Introscope, Fiddler, JConsole

Confidential, O’Fallon, MO

Performance Engineer

Responsibilities:

  • Responsible for planning, testing, and execution for enterprise software used by the client's front-line agents, partners, and end customers
  • Tested the applications according to the prepared test plan and test cases and verified that the test cases passed
  • Planned and executed regression testing and automation of regression testing processes
  • Opened defects for issues, tracked them in Micro Focus ALM QC, and followed up with the corresponding teams to get fixes implemented and retested
  • Documented and analyzed all test results, made recommendations, and published test summary reports
  • After functional testing was done, started performance testing by gathering the necessary requirements
  • Identified suitable tests to conduct on the applications to determine their performance
  • Developed test scripts using LoadRunner 11.5/12.0 with protocols such as Web HTTP/HTML and SOAP Web Services
  • Monitored the performance of key transactions and hardware vitals such as CPU, memory, and available connections using Dynatrace to build a comprehensive view of transactions with performance issues
  • Ensured that quality issues were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center

Technology: Micro Focus LoadRunner 11.5/12.0, SoapUI, Micro Focus Performance Center 12.02, Micro Focus ALM Quality Center 12, Dynatrace, Splunk, Web Services, JConsole

Confidential, Schaumburg, IL

Performance Engineer

Responsibilities:

  • Analyze requirements and prioritize release items based on input from the planning/delivery teams
  • Identify suitable tests to conduct on the applications to determine their performance
  • Prepare and execute test scripts using JMeter to perform application Web Services testing
  • Monitor metrics on all the servers using the Dynatrace monitoring tool
  • Perform spike testing in JMeter using a Synchronizing Timer, which blocks all the threads and then releases them at once (see the sketch after this list)
  • Assessed and supported the review of developed test scripts and the design of load/stress test scenarios
  • Configured different test plans and analyzed the results using JMeter
  • Identifying key business processes in the application, preparing the call flows, executing performance testing of the application, and making sure the environment is stable
  • Attend customer calls to update project status and to track and review performance test activities
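JMeter's Synchronizing Timer handles this inside the tool; the following is a minimal Java sketch of the same idea using java.util.concurrent.CyclicBarrier, where every thread blocks at a barrier and all are released at once to create the spike. The thread count and the simulated request are illustrative:

    import java.util.concurrent.CyclicBarrier;

    // Illustrative spike: N threads park at a barrier and fire together,
    // mimicking what a Synchronizing Timer does in front of a sampler.
    public class SpikeSketch {
        public static void main(String[] args) {
            int users = 50; // illustrative spike size
            CyclicBarrier barrier = new CyclicBarrier(users,
                    () -> System.out.println("All threads released together"));

            for (int i = 0; i < users; i++) {
                final int id = i;
                new Thread(() -> {
                    try {
                        barrier.await(); // block until all users arrive
                        // ... the real request would be fired here ...
                        System.out.println("user " + id + " fired");
                    } catch (Exception e) {
                        Thread.currentThread().interrupt();
                    }
                }).start();
            }
        }
    }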

Technology: JMeter, Dynatrace, PuTTY, WebLogic, Oracle, Web Services

Confidential, Chicago, IL

Software Test Engineer

Responsibilities:

  • Gathering the Functional testing requirements for the project
  • Involved in test plan preparation and test case design
  • Developed scripts using Selenium WebDriver (Java framework) according to the test cases and made them ready for execution (see the sketch after this list)
  • Executed the scripts for all test cases and uploaded the results to Micro Focus QC
  • Reported and opened defects for any issues faced while testing and tracked them through to the fix
  • Analyzed the required test data and processed it through various distributed upstream systems
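A minimal sketch of such a Selenium WebDriver (Java) script; the URL, locators, and expected title are hypothetical placeholders, not taken from the actual project:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // Illustrative login test case: open the page, submit credentials,
    // and check the landing-page title before recording pass/fail in QC.
    public class LoginTest {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://example.com/login"); // hypothetical URL
                driver.findElement(By.id("username")).sendKeys("testuser");
                driver.findElement(By.id("password")).sendKeys("secret");
                driver.findElement(By.id("loginButton")).click();

                boolean passed = driver.getTitle().contains("Dashboard");
                System.out.println(passed ? "Test passed" : "Test failed");
            } finally {
                driver.quit();
            }
        }
    }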

Technology: Selenium WebDriver, Micro Focus ALM Quality Center, SQL, Java

Confidential

Software Test Engineer

Responsibilities:

  • Requirements: Understanding business and functional requirements and ensuring they are covered in test cases with zero slippage
  • Test plan design: Providing input on test plans for features and designs
  • Quality validation: Ensuring proper execution of designed test cases
  • Automation: Creating automation scripts for features and executing them through the Selenium WebDriver framework
  • Defect Reporting: Reporting Defects during execution and tracking them
  • Test Results: Uploading Test Results in QC
  • Coordination: Communicated with developers and helped them in getting fixes on time
  • Test data analysis and design: Analysis of the test data required and processing it through various distributed upstream systems
  • KT Sessions: Transferring knowledge to team members, making them productive and self-dependent
  • Client Interaction: Interacted with the client at the time of walkthroughs and for specific requirement changes during execution
  • Tool Development: Gathering problem statements from across the team and automating them with innovative ideas
  • Performance Testing: Designing scripts and executing them to check the performance of the application

Technology: SQL, Selenium, Java, JMeter, Micro Focus Quality Center
