
Lead Performance Tester Resume


Malvern, PA

SUMMARY:

  • Over 9 years of experience in the IT industry, specializing in performance testing of client-server, web services (SOA), web-based, and mobile applications, as well as User Acceptance Testing and manual Quality Assurance.
  • Excellent understanding of the role of QA in the development, enhancement, and maintenance of software applications and of the Software Development Life Cycle (SDLC).
  • Expertise in automated testing across methodologies such as Agile, Scrum, and Waterfall.
  • Extensive experience using LoadRunner, SOASTA CloudTest, and JMeter for performance testing.
  • Used the HP tools Quality Center (QC), LoadRunner, QTP, and Performance Center, as well as the open-source tool JMeter, for performance testing.
  • Experience with the LoadRunner components VuGen, Controller, Analysis, and Load Generator, and with the corresponding JMeter components.
  • Experience running performance tests in HP Performance Center, a standalone Controller, and JMeter.
  • Extensively used UNIX commands to retrieve server-side results from Windows and Linux hosts.
  • Experience with GAPI load testing across different locations and bandwidth profiles (UK, Australia, Singapore, US).
  • Tuned JVM garbage collection, a key aspect of Java server performance, and adjusted JVM options.
  • Revised JVM heap sizes based on analysis of application performance.
  • Captured thread dumps and heap dumps to find and analyze bottleneck areas.
  • Expertise in testing Web/J2EE, .NET, middleware, web services, SAP GUI, and SAP ECC customer-facing applications.
  • Worked with protocols including Web HTTP/HTML, Ajax TruClient, .NET, Web Services (SOA, Service-Oriented Architecture), Flex, Click & Script, RTE, RDP, Java Vuser, Oracle (2-tier), Siebel-Web (Siebel CRM and OUI), Sybase CTlib (database), Sybase DBlib (database), and Multi Protocol.
  • Developed test cases, test plans, and automation test scripts using HP (Mercury) LoadRunner.
  • Expertise in developing test plans and automation scripts, designing and executing test cases, creating test scenarios, analyzing test results, reporting bugs/defects, and documenting test results.
  • Expertise in executing VuGen scripts for performance, load, and stress testing using the Controller in LoadRunner/Performance Center, and in generating reports with the LoadRunner Analysis tool.
  • Expertise in setting up performance test environments.
  • Experience recording and coding VuGen scripts with different protocols in client/server, e-business, and Enterprise Resource Planning/Management (ERP/ERM) environments.
  • Hands-on experience with performance bottlenecks, end-to-end performance, and web performance measures such as server response time, throughput, and network latency.
  • Experience analyzing performance bottlenecks, root causes, and server configuration problems using LoadRunner monitors, Analysis, SiteScope, and J2EE Diagnostics.
  • Experience with commercial monitoring tools such as HP SiteScope, BAC (Business Availability Center), HP Diagnostics, and DynaTrace, along with built-in monitors, to watch database, application, and web servers (at the OS and application levels) for performance bottlenecks during load, stress, volume, and memory tests.
  • Expertise in parameterization, manual correlation, and run-time settings.
  • Extensive experience analyzing performance bottlenecks such as very high CPU usage and memory leaks.
  • Knowledge of functional testing using HP QuickTest Pro and Quality Center.
  • Good working experience in UNIX and Windows.
  • Extensive experience using SQL to create test data and validate it against the database.
  • Good analytical, interpersonal, and communication skills. Driven, committed, and hard-working, with a quest to learn new technologies and undertake challenging assignments.
  • Ability to manage multiple project tasks with changing priorities and tight deadlines; have worked with multiple management levels.
  • Expertise in analyzing business, technical, functional, and non-functional requirements.
  • A self-starter, able to work independently and collaboratively within a diverse team environment.
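
The JVM tuning work above (GC tuning, heap sizing, thread and heap dumps) typically comes down to a handful of flags and JDK commands. The following is an illustrative sketch only: the heap sizes, log paths, jar name, and PID are hypothetical, not values from any specific engagement.

```shell
# Illustrative only: sizes, paths, and PID below are hypothetical.
# Heap sizing and GC logging flags passed to an application-server JVM:
java -Xms2g -Xmx4g \
     -verbose:gc -XX:+PrintGCDetails -Xloggc:/var/log/app/gc.log \
     -jar app.jar

# Capturing a thread dump and a heap dump from a running JVM (PID 1234)
# to analyze bottleneck areas:
jstack 1234 > /tmp/threaddump.txt
jmap -dump:live,format=b,file=/tmp/heapdump.hprof 1234
```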

TECHNICAL SKILLS:

Testing Tools: SOASTA CloudTest, LoadRunner 6.5/7.8/8.0/9.1/9.5/11.04/11.52/12.01/12.53, Performance Center 9.5/11.0/12.01/12.53, JMeter 2.5/2.7/2.8/2.9/2.10, Rational ClearQuest 7.1.1.0, QTP 10.0/9.5, UFT 12.53

Bug Tracking tools: HP Quality Center, JIRA

Monitoring Tools: DynaTrace, SiteScope, HP Diagnostics Suite, Wily Introscope, Windows Performance Monitor, AppDynamics, NMON, and built-in monitoring tools

Scripting: C, VBScript, JavaScript

Database: Oracle, DB2, MySQL, Sybase, and MS Access.

Methodologies: Waterfall, Iterative, Agile

Servers: IBM WebSphere, JBoss, Apache, WebLogic

Web Related: XML, HTML, CSS

Others: Rapid SQL, Toad, MS Project, SharePoint, Crystal Reports, SQL Server Management Studio, MS Visio, MS Access, and Enterprise Architect

Operating System: Windows, UNIX, and Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Malvern, PA 

Lead Performance Tester

Responsibilities: 

  • Gathered business requirements and collected Service Level Agreement information from business analysts and developers.
  • Managed a team of three offshore and seven onsite resources as Lead.
  • Responsible for performance testing using Performance Center and HP LoadRunner.
  • Developed Vuser scripts using the HTTP/HTML, Citrix ICA, Ajax Click and Script, TruClient Web, .NET, Web, Java, Web Services, and database protocols.
  • Developed UFT scripts to capture user response times along with load-test servlet response times.
  • Performance testing of client-server, web services, web-based applications, and a Documentum application.
  • Planned testing and designed a variety of scenarios for baseline, benchmark, load, batch, regression, stress, and endurance testing.
  • Parameterized large, complex data sets to build realistic tests, obtain accurate performance results, and execute tests in the performance test environment.
  • Used HP ALM Performance Center 12.53 and standalone Controllers to create scenarios and run load tests.
  • Tested in the GAPI environment with bandwidth profiles for different locations around the world (UK, Australia, and Singapore) along with the USA.
  • Retrieved results from servers using UNIX.
  • Analyzed results using the SkyNet report generator for servlet response times, JVM heap memory, and CPU utilization.
  • Analyzed results with the LoadRunner Analysis tool based on transactions per second, average response times, and resource usage against the SLAs (Service Level Agreements).
  • Analyzed, interpreted, and summarized meaningful, relevant results in a complete performance test report.
  • Monitored server CPU utilization and heap memory usage with monitoring tools (AppDynamics, Dynatrace, PerfMon, NMON, Wily) and performed in-depth analysis to isolate points of failure in the application.
  • Developed and implemented GAPI load and stress tests with LoadRunner, presented performance statistics to application teams, and provided recommendations on issues impacting performance.
  • Executed tests using virtual machines in different countries.
  • Generated AWR reports for various applications.
  • Installed and ran Process Monitor on performance environment servers to capture servlet response times.
  • Identified issues in the performance environment by executing tests in other environments, then made the corresponding configuration changes.
  • Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
  • Performed online monitoring of disk, CPU, memory, and network usage while running load tests.
  • Assisted in producing testing and capacity certification reports.
  • Investigated and troubleshot performance problems in the performance test and production environments.
  • Responsible for analyzing application and component behavior and optimizing server configurations.
  • Maintained defect status and reported testing status weekly and monthly using defect tracking tools.
  • Interacted with developers during testing to identify and fix bugs and optimize server settings at the web, app, and database tiers.

Environment: Performance Center, HP LoadRunner, Unix, Java, .Net, MS SQL Server, MS SQL, IIS, Quality Center, Wily Introscope, SiteScope, DynaTrace, Web Services, Mobile, Web applications.

Confidential, Indianapolis, IN 

Lead Performance Engineer

Responsibilities: 

  • Gathered business requirements and collected Service Level Agreement information from business analysts and developers.
  • Managed a team of four offshore and two onsite resources as Lead.
  • Responsible for performance testing using Performance Center and HP LoadRunner.
  • Developed Vuser scripts using the .NET, Web, Java, Web Services, and database protocols.
  • Performance testing of client-server, web services, web-based applications, and a Documentum application.
  • Designed a variety of scenarios for baseline, benchmark, load, regression, stress, and endurance testing.
  • Parameterized large, complex data sets to build realistic tests, obtain accurate performance results, and execute tests in the performance test environment.
  • Validated the scripts to make sure they executed correctly and met the scenario descriptions.
  • Used HP ALM Performance Center 12.01 and standalone Controllers to create scenarios and run load tests.
  • Analyzed results with the LoadRunner Analysis tool based on transactions per second, average response times, and resource usage against the SLAs (Service Level Agreements).
  • Analyzed, interpreted, and summarized meaningful, relevant results in a complete performance test report.
  • Monitored server CPU utilization and heap memory usage with monitoring tools (AppDynamics, Dynatrace, NMON, Wily) and performed in-depth analysis to isolate points of failure in the application.
  • Developed and implemented load and stress tests with LoadRunner, presented performance statistics to application teams, and provided recommendations on issues impacting performance.
  • Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
  • Performed online monitoring of disk, CPU, memory, and network usage while running load tests.
  • Assisted in producing testing and capacity certification reports.
  • Investigated and troubleshot performance problems in the performance test and production environments.
  • Responsible for analyzing application and component behavior and optimizing server configurations.
  • Maintained defect status and reported testing status weekly and monthly using defect tracking tools.
  • Interacted with developers during testing to identify and fix bugs and optimize server settings at the web, app, and database tiers.

Environment: Performance Center, HP LoadRunner, Java, .Net, MS SQL Server, MS SQL, IIS, Quality Center, Wily Introscope, SiteScope, DynaTrace, Web Services, Mobile, Web applications.

Confidential, Pittsburgh, PA

Lead Performance Engineer

Responsibilities: 

  • Gathered business requirements and collected Service Level Agreement information from business analysts and developers.
  • Responsible for performance testing using Performance Center and HP LoadRunner.
  • Developed Vuser scripts using the Web, Java, .NET, Web Services, and database protocols.
  • Performance testing of client-server, web services, web-based applications, and mobile applications.
  • Designed a variety of scenarios for baseline, benchmark, load, regression, stress, and endurance testing.
  • Parameterized large, complex data sets to build realistic tests, obtain accurate performance results, and execute tests in the performance test environment.
  • Extensively used the LoadRunner Java Vuser protocol for SOA, web services, and JMS, and wrote a custom scripting framework.
  • Validated the scripts to make sure they executed correctly and met the scenario descriptions.
  • Built web services test scripts, test plans, JMS test plans, and web test plans.
  • Used HP ALM Performance Center 12.01 and standalone Controllers to create scenarios and run load tests.
  • Analyzed results with the LoadRunner Analysis tool based on transactions per second, average response times, and resource usage against the SLAs (Service Level Agreements).
  • Analyzed, interpreted, and summarized meaningful, relevant results in a complete performance test report.
  • Developed and implemented load and stress tests with LoadRunner, presented performance statistics to application teams, and provided recommendations on issues impacting performance.
  • Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
  • Implemented Agile and Scrum testing methodologies.
  • Performed online monitoring of disk, CPU, memory, and network usage while running load tests.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Assisted in producing testing and capacity certification reports.
  • Investigated and troubleshot performance problems in the performance test and production environments.
  • Responsible for analyzing application and component behavior and optimizing server configurations.
  • Maintained defect status and reported testing status weekly and monthly using defect tracking tools.
  • Interacted with developers during testing to identify and fix bugs and optimize server settings at the web, app, and database tiers.

Environment: Performance Center, HP LoadRunner, Java, .Net, MS SQL Server, MS SQL, IIS, Quality Center, Wily Introscope, SiteScope, DynaTrace, Web Services, Mobile, Web applications.

Confidential, Minneapolis, MN 

Lead Performance Engineer

Responsibilities:

  • Worked as Performance Analyst and Engineer; executed various performance test conditions and created suitable scenarios based on the list of critical transactions.
  • Executed various performance test conditions by developing test scripts with the performance test tool SOASTA CloudTest.
  • Created workload models based on the Vuser count and the number of transactions to be achieved in a given duration.
  • Used SOASTA CloudTest compositions to create scenarios and run load tests.
  • Evaluated the compliance of systems or components with specified performance requirements.
  • Coordinated with AT&T support on the performance test run process.
  • Recorded and baselined the average response times of key business transactions that users would experience during peak loads.
  • Determined response times for placing orders of 100 to 140K items, browse and search transactions, guest account management, Ship to Store, and Pick Up in Store transactions.
  • Monitored system resources of application servers and databases using Dynatrace during tests and identified performance bottlenecks.
  • Accommodated testing windows and executions for production issues within regular release testing activities.
  • Coordinated the team and submitted deliverables based on timelines.
  • Monitored server CPU utilization and heap memory usage with monitoring tools (Dynatrace, NMON, Wily) and performed in-depth analysis to isolate points of failure in the application.
  • Used a combination of Perl and Python to extract and summarize log data: web server logs (request types, average response times, invocation counts, passed and failed requests), app server logs (exceptions, filtered by type), NMON logs, memory, disk I/O, heat graphs, and garbage collection analysis; created AWR reports for every test run and reviewed them with the DBAs.
  • Analyzed the results using SOASTA Analysis and prepared result reports with customized graphs.
  • Collaborated with support teams, architects, and developers on analyzing performance test results.
  • Provided feedback on development through the entire lifecycle of the release so that we knew about performance issues before customers did.
  • Archived release documents for future review and comparison.
  • Archived Dynatrace sessions after each test for future review.
  • Prepared a detailed test schedule and test metrics on a daily basis so project members knew the status of testing.
  • Updated stakeholders on performance results by generating reports with the Analysis tool.
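
The Perl/Python log extraction described above can be sketched as follows. This is a minimal, hypothetical example: the log format, paths, and field order are assumptions for illustration, not the actual production log layout.

```python
# Minimal sketch of web-server log summarization: per request type, report
# invocation count, average response time, and passed/failed counts.
# The simplified log format below (method, path, status, seconds) is assumed.
from collections import defaultdict

SAMPLE_LOG = """\
GET /checkout 200 0.120
GET /checkout 500 0.480
POST /search 200 0.250
POST /search 200 0.150
"""

def summarize(log_text):
    stats = defaultdict(lambda: {"count": 0, "total_rt": 0.0, "passed": 0, "failed": 0})
    for line in log_text.splitlines():
        method, path, status, rt = line.split()
        s = stats[f"{method} {path}"]
        s["count"] += 1
        s["total_rt"] += float(rt)
        if status.startswith("2"):   # 2xx counted as passed, everything else as failed
            s["passed"] += 1
        else:
            s["failed"] += 1
    # Derive the average response time per request type.
    return {k: {**v, "avg_rt": v["total_rt"] / v["count"]} for k, v in stats.items()}

if __name__ == "__main__":
    for req, s in summarize(SAMPLE_LOG).items():
        print(f"{req}: count={s['count']} avg={s['avg_rt']:.3f}s "
              f"passed={s['passed']} failed={s['failed']}")
```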

Environment: SOASTA CloudTest, Java, Oracle, Dynatrace, NMON, Wily, J2EE, MS Office, Web Services, XML, HTML; Servers: WXS, WCS, and TWS.

Confidential, Washington, D.C.

Lead Performance Engineer

Responsibilities:

  • Created suitable scenarios based on the list of critical transactions.
  • Executed various performance test conditions by developing test scripts with the performance test tool HP LoadRunner 12.
  • Evaluated the compliance of systems or components with specified performance requirements.
  • Created workload models based on the Vuser count and the number of transactions to be achieved in a given duration.
  • Used HP ALM Performance Center 11.50 and standalone Controllers to create scenarios and run load tests.
  • Used the Scheduler to run scripts at particular times.
  • Recorded and baselined the average response times of key business transactions that users would experience during peak loads.
  • Determined the response times for the retrieval of reports using Jasper Server.
  • Monitored system resources of application servers and databases during tests and identified performance bottlenecks.
  • Accommodated testing windows and executions for production issues within regular release testing activities.
  • Coordinated the team and submitted deliverables based on timelines.
  • Monitored server CPU utilization and heap memory usage with Wily and performed in-depth analysis to isolate points of failure in the application.
  • Created AWR reports for every test run and reviewed them with the DBAs.
  • Analyzed the results by using LoadRunner Analysis and prepared the result report by customizing the graphs.
  • Collaborated with support teams, architects, and developers on analyzing performance test results.
  • Provided feedback on development through the entire lifecycle of the release so that we knew about performance issues before customers did.
  • Archived release documents for future review and comparison.
  • Prepared a detailed test schedule and test metrics on a weekly basis so project members knew the status of testing.
  • Updated stakeholders on performance results by generating reports with the Analysis tool.

Environment: LoadRunner 12, HP ALM-PC 11.0, HP ALM-QC 11.0, Java, Wily, J2EE, MS Office, Web Services, XML, HTML, Jasper Server.

Confidential, Auburn Hills - MI 

Senior Performance Engineer

Responsibilities:

  • Executed various performance test conditions by writing scripts with the performance test tools HP LoadRunner 9.52/11.00/11.52 and JMeter.
  • Designed the test plan for each release of the Lending QA project.
  • Evaluated the compliance of systems or components with specified performance requirements.
  • Extensively used the LoadRunner and JMeter Java Vuser protocols for SOA, web services, MU, MQ, and JMS, and wrote a custom scripting framework.
  • Developed Vuser scripts for the SAP ECC (GUI) and SAP SRM (Web) protocols based on the user workflows.
  • Wrote unit tests in Java to test the framework I developed to test web services.
  • Wrote LoadRunner scripts, enhanced them with C functions, parameterized them, and stored dynamic content using LoadRunner functions.
  • Fixed all the existing scripts according to the changed business requirements.
  • Designed the single-user, load, stress, and endurance scenarios with the production volumes provided by the business.
  • Carried out extensive automated testing with different test scripts reflecting various real-time business situations.
  • Modeled the environment under construction and the user profiles that leverage the environment.
  • Collected the environmental stats and compared them to production for accurate projection.
  • Used HP ALM Performance Center 11.00, standalone Controllers, and JMeter test plans to create scenarios and run load tests.
  • Used the Scheduler to run scripts at particular times.
  • Extensively used JMeter for performance testing of GUI, SOA, and web services.
  • Built web services test scripts, test plans, JMS test plans, and web test plans using JMeter.
  • Analyzed application and component behavior under heavier loads and optimized server configurations.
  • Used Google Analytics to understand production traffic from different regions, different operating systems, and different transactions.
  • Tested the performance of Tomcat, WebSphere, WebLogic, F5, and DataPower.
  • Accommodated testing windows and executions for production issues within regular release testing activities.
  • Coordinated onshore and offshore teams and submitted deliverables based on timelines.
  • Monitored server CPU utilization and heap memory usage with monitoring tools such as SiteScope, Wily, and HP Diagnostics, and performed in-depth analysis to isolate points of failure in the application.
  • Performed performance monitoring, bottleneck identification, and tuning using Java VisualVM and AppDynamics; monitored JBoss, Java memory profiles, and Apache logs (HTTP & HTTPS); monitored the network with Wireshark; and used NMON, UNIX/Linux vmstat and iostat, and PerfMon during performance testing.
  • Involved in tuning the JVM.
  • Created AWR reports for every test run and reviewed them with the DBAs.
  • Monitored system resources such as CPU usage, percentage of memory occupied, and I/O statistics using UNIX commands such as top, vmstat, svmon, and netstat.
  • Analyzed the results by using LoadRunner Analysis and prepared the result report by customizing the graphs.
  • Collaborated with support teams, architects, and developers on analyzing performance test results.
  • Provided feedback on development through the entire lifecycle of the release so that we knew about performance issues before customers did.
  • Archived release documents for future review and comparison.
  • Prepared a detailed test schedule and test metrics on a weekly basis so project members knew the status of testing.
  • Updated stakeholders on performance results by generating reports with the Analysis tool.
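
The UNIX monitoring above usually amounts to capturing command output during a test run and reducing it to a few numbers. The sketch below uses a made-up vmstat capture and assumes the standard Linux vmstat column layout (column 15 is the idle-CPU percentage); the file path and sample values are hypothetical.

```shell
# Hypothetical capture of `vmstat 5 3` output from a Linux load generator.
cat > /tmp/vmstat_sample.txt <<'EOF'
procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812345  40000 900000    0    0     5    10  200  400 20  5 70  5  0
 3  0      0 798123  40000 901000    0    0     8    12  250  450 30 10 55  5  0
EOF
# Average the idle-CPU column ($15 = "id") across samples, skipping the headers.
awk 'NR>2 { id += $15; n++ } END { printf "avg idle %%CPU: %.1f\n", id/n }' /tmp/vmstat_sample.txt
```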

Environment: LoadRunner 9.52/11.0/11.5, JMeter 2.9, HP ALM-PC 11.0, HP ALM-QC 11.0, .Net, Web Services, MQ, MU, JMS, XML, HTML, MS SQL Server, MS IIS Server, WebLogic, DataPower

Confidential

Performance Team Lead

Responsibilities:

  • Led a team of six engineers handling a single project with multiple applications on multiple platforms.
  • Implemented the performance testing process in the organization.
  • Procured different machines and configured them to act as standalone Controllers and standalone Load Generators.
  • Worked with the infrastructure team to add a virtual network interface to the VM hosts for assigning IP addresses in a subnet to implement IP spoofing in LoadRunner.
  • Coordinated with infrastructure teams to set up the L&P environment.
  • Coordinated with application teams on deployments, backups, and restores.
  • Coordinated with test environment teams to bring the L&P environment up and running, verifying all the environment variables.
  • Worked with the Web HTTP/HTML protocol for .NET and Java applications, and with AMF/Flex for applications involving Flex communication.
  • Designed a simple framework for the engineers to test XMLs and web services.
  • Extensively used JMeter for web services testing and other unit testing.
  • Designed a custom framework for efficient traceability of transaction timings, user number, HTTP download size, HTTP request size, etc., across different testing criteria.
  • Used HttpWatch to closely monitor the calls made by all the business transactions.
  • Developed a performance test framework to automate and validate product interfaces.
  • Designed test plans covering scope, test strategies, test scenarios, BVM, test types, approach, and deliverables.
  • Designed scenarios for performance testing, generated scripts, and handled correlation and parameterization using LoadRunner VuGen; executed scenarios using the Controller and analyzed the results with LoadRunner Analysis.
  • Created and coded a very flexible LoadRunner script that allowed fast configuration changes during testing.
  • Used web_reg_save_param functions to correlate the scripts manually.
  • Enhanced scripts by inserting checkpoints to verify that virtual users were accessing the pages they were supposed to access.
  • Helped implement best practices and set higher coding standards for performance test scripts.
  • Designed Performance test scripts using LoadRunner, ran Stress test, analyzed the results.
  • Held several meetings with stakeholders to understand the goal of the project, Requirements, Test Plan, Test strategies, Results review and Test Signoff.
  • Collected the environmental stats and compared them to production for accurate projection.
  • Coordinated with the functional automation (QTP) team to run functional scripts in parallel with our scheduled testing windows to ensure the page content returned as expected under load.
  • Installed and configured dynaTrace agents on the application servers to collect server stats and communicate with the dynaTrace client.
  • Used the monitoring tool dynaTrace for in-depth analysis to isolate points of failure in the application.
  • Also used dynaTrace to collect business volume metrics from production.
  • Used Ignite monitoring software for database monitoring.
  • Used HP Performance Center 9.52/11.00 to create scenarios and run load tests.
  • Used HP LoadRunner 9.52/11.04 for writing Vuser scripts.
  • Collaborated with dev teams to reproduce, verify, and fix performance issues, reproducing production workloads for stress tests.
  • Worked closely with QA and development teams to establish an automated performance strategy, follow up on issues, document results, and articulate issues.
  • Held weekly status meetings so project members knew the status of testing.
  • Involved in many presentations to stakeholders.
  • Updated stakeholders on test results by generating reports with the Analysis tool.
  • Archived documents for review in future releases.
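
Manual correlation with web_reg_save_param, as described above, follows a standard pattern: register a capture before the request that returns a dynamic value, then replay the captured parameter in a later request. The VuGen-style C sketch below requires the LoadRunner runtime to execute; the URLs, boundaries, and parameter names are hypothetical placeholders.

```c
// Action.c -- minimal VuGen sketch of manual correlation (LoadRunner runtime
// required; endpoints and boundary strings below are hypothetical).
Action()
{
    // Register the capture BEFORE the request that returns the dynamic value.
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",   // left boundary
                       "RB=\"",                        // right boundary
                       "NotFound=error",
                       LAST);

    lr_start_transaction("Login");
    web_url("Login", "URL=https://app.example.com/login", LAST);
    lr_end_transaction("Login", LR_AUTO);

    // Replay the captured value instead of the recorded literal.
    web_submit_data("Submit",
                    "Action=https://app.example.com/submit",
                    ITEMDATA,
                    "Name=token", "Value={SessionToken}", ENDITEM,
                    LAST);
    return 0;
}
```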

Environment: LoadRunner 9.5/11.0, JMeter, Performance Center 11.00/9.52, Quality Center 10.0, .Net, Web Services, MQ, XML, HTML, MS SQL Server, MS IIS Server.

Confidential

Performance Tester

Responsibilities:                                                                                    

  • Developed Performance testing plan: collecting requirements from Application Owners, Business Analysts, Business Leads, Architects and Developers.
  • Participated in various phases of product review, recommendation and evaluation process by working closely with Architect, Business Lead, Project Manager, Business Analyst and Windows/Unix Infrastructure SMEs to determine Business impact based on number of customers and transactions.
  • Conducted test plan walkthroughs, reviewed system requirements and test scenarios, and obtained sign-offs.
  • Developed LoadRunner scripts using the Virtual User Generator for baseline, soak (endurance), and stress test scenarios, storing dynamically varying object IDs in parameters and validating correct downloads of HTML pages by checking the content in the sources.
  • Defined rendezvous points to create intense load on the server and measure server performance under load.
  • Monitored software and hardware behavior during test runs using PerfMon and the LoadRunner online monitors.
  • Responsible for monitoring Oracle database performance for indexes, sessions, connections, poorly written SQL queries, and deadlocks for each component of the application.
  • Identified and analyzed memory leaks at each component level.
  • Performed Performance testing to resolve Production issues and validate maintenance fixes.
  • Performance testing of web applications, Web Services (SOA), and Siebel using HP LoadRunner.
  • Used HP LoadRunner (VuGen, Controller, and Analysis) for portal/web applications, Web Services (SOA), Siebel, and XML.
  • Developed automated performance tests using VuGen, created scenarios, and ran and coordinated performance testing; integrated performance testing with various applications as well as within a cloud environment.
  • Produced status reports, test results, analyses, and recommendations, identified risks where applicable, and published metrics used in stakeholder decisions.
  • Involved in troubleshooting application performance and performance tuning.
  • Participated in monitoring with HP Business Availability Center, HP SiteScope, NMON, and Windows Performance Monitor.
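
The rendezvous points mentioned above are how LoadRunner synchronizes Vusers to produce a load spike. A minimal VuGen-style sketch (LoadRunner runtime required; the transaction and URL are hypothetical):

```c
// Minimal VuGen sketch of a rendezvous point. All Vusers reaching
// lr_rendezvous() wait until the configured number arrive, then are
// released together to hit the server at the same moment.
Action()
{
    lr_rendezvous("checkout_spike");

    lr_start_transaction("Checkout");
    web_url("Checkout", "URL=https://app.example.com/checkout", LAST);
    lr_end_transaction("Checkout", LR_AUTO);
    return 0;
}
```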

Environment: HP Business Availability Center (BAC), QC, WinRunner, LoadRunner, IBM Rational, SiteScope, Performance Center, HP J2EE Diagnostics, Windows, IIS 5, JMeter, IBM AIX, SQL, DB2, SQL Server, Oracle, UNIX, Siebel, SOA, WebSphere, J2EE.
