
Lead Performance Engineer Resume


Providence, RI

SUMMARY:

  • Over nine years of diversified experience as a Senior Quality Assurance Analyst with a core focus on performance testing and engineering. Experience includes requirement gathering, planning, execution, and analysis of client/server, web-based, and SOA applications. Extensive experience using automated tools such as HP LoadRunner, JMeter, and other commercial and non-commercial tools.
  • Experience working in different domains, including the banking, healthcare, financial, and retail sectors, with a unique combination of skills in solving complex quality assurance challenges and implementing solutions that work.
  • Expertise in testing web, Java, .Net, middleware, web services, APIs, and customer-facing applications.
  • Excellent understanding of the functionalities of QA in Development, Enhancement and Maintenance of software applications and Software Development Life Cycle (SDLC).
  • Expertise in different testing methodologies such as Agile, Scrum, and Waterfall.
  • Experience in understanding application performance requirements, developing performance engineering strategies, wide-ranging exposure to complete performance testing using protocols and usage of performance monitoring tools.
  • Expertise with Performance assurance from scratch (tool selection to process implementation).
  • Expertise in unit, smoke, sanity, functional, regression, integration, system, network, load, performance, stress, endurance, volume, spike, failover, configuration, and UAT testing.
  • Used Automation tools like HP LoadRunner, Performance Center, ALM, and JMeter for performance testing and QTP for Functional Automation Testing.
  • Extensive experience with LoadRunner components (VuGen, Controller, Analysis, Load Generator) and with JMeter.
  • Experience in testing SOA components (web services, APIs, XML, SOAP, MU, MQ, JMS, etc.).
  • Experience with programming languages such as C, C++, Java, .Net, and SQL to debug and execute LoadRunner scripts.
  • Experience with protocols such as Web HTTP/HTML, Web Services, Citrix, Siebel-Web, Mainframe (RTE), FLEX, Click & Script, RDP, Mobile Web, Database, and multi-protocol for performance testing.
  • Experience in enhancing VuGen and JMeter scripts through manual/automatic correlation techniques and parameterization.
  • Skilled in debugging scripts by replaying them within VuGen, adjusting runtime settings, and performing IP spoofing with LoadRunner to replicate real-world, production-like scenarios.
  • Good skills in SQL statements, database connectivity, Oracle 10g, configuring TNS files, and connecting through TOAD.
  • Hands on experiences in analyzing performance bottlenecks, root cause analysis, monitoring end-to-end performance and fixing performance issues.
  • Expertise with monitoring tools such as CA Wily Introscope, HP SiteScope, Foglight, Splunk, AppDynamics, Windows Performance Monitor, NMon, vmstat, iostat, HP Diagnostics, and Dynatrace.
  • Excellent knowledge and skills in performance monitoring CPU, Memory, Network, Web connections, throughput, transaction response times, web/app server metrics (Windows / Linux / AIX), Database metric and J2EE Performance while running Baseline, Performance, Load, Stress and Soak testing.
  • Expertise with JVM tuning (thread dumps, heap dumps, etc.), garbage collection tuning, Java server performance tuning, revising JVM heap sizes, and analyzing application performance using JVisualVM, JConsole, etc.
  • Experience in using Bug Tracking Tools like Quality Center (QC) and Jira etc.
  • Expert in deliverables like Test Report and Test Results Analysis Reports.
  • Good experience in engaging with business contacts and stakeholders for requirements gathering, architecture review and results analysis.

TECHNICAL SKILLS:

Testing Tools: HP LoadRunner 8.1/9.5/11.0/11.50/11.52/12.0/12.02/12.50/12.53, HP Performance Center 9.5/11.0/11.5/12, HP ALM 12.53, HP Quality Center, JMeter 2.5/2.7/2.8/2.9/2.10/2.13/3.0/3.3, SOAP UI, and QTP

Languages: Microsoft C#, C, C++, .Net, Java

Markup/Scripting Languages: DHTML, CSS, JQuery, JavaScript, XML, HTML

RDBMS: Microsoft SQL Server, Microsoft Access, Oracle Database

Operating Systems: AIX, HP-UX, Solaris, UNIX, Windows XP/2003/2000/Vista, Windows NT, and Linux

Monitoring Tools: Performance Center, CA Wily Introscope, HP SiteScope, Dynatrace, AppDynamics, Nagios, HP Diagnostics, Transaction Viewer, Splunk, Windows Performance Monitor, NMon, Ganglia

PROFESSIONAL EXPERIENCE:

Confidential, Providence, RI

Lead Performance Engineer

Responsibilities:

  • Involved in gathering business requirements, studying the application and collecting the information from developers, functional testers and business to proceed with Test strategy, Performance Test plan and Test Case design document.
  • Assist in the development of the software quality assurance (SQA) test strategy for moderately to highly complex IT initiatives across multiple SQA domains by analyzing business and technology requirements to ensure testability and traceability.
  • Performance testing of web applications, Siebel web applications, Web (HTTP/HTML), TruClient, web services (SOAP, SOA), and WebSphere MQ using HP LoadRunner 12.
  • Gathering non-functional requirements for testability and participating in the review of the technical acceptance criteria.
  • Extrapolated volumes for holiday readiness and created the workload model (WLM) for performance testing.
  • Developed VUser scripts for Business Middleware Services webMethods, Datapower using the HTTP/HTML, Web services protocols and developed scripts for MF Terminal Emulator screens (green screens used by advisors) using RTE protocol in VUGen.
  • Request - response testing of Web Services calls using SOAP UI tool and exporting code to VUGen, thereby enhancing the scripts for load test.
  • Developed load test scripts using Apache JMeter and performed heavy-load testing against individual servers, server groups, and the network to assess their capacity and analyze overall performance under different load types.
  • Used Jenkins jobs for test data setup in Performance and QA Testing Environments.
  • Correlated and parameterized scripts as well as configured the RunTime settings in Virtual User Generator.
  • Provides end-to-end quality assurance throughout the entire Software Development Life Cycle (SDLC) by developing and executing front- and/or back-end test cases and scenarios in accordance with the company testing methodology.
  • Creation of different load models for volume distribution across multi transactions like Step, Burst, Peak season scenarios for Holiday Readiness and execution of tests in Performance Center.
  • Working with Business, SME’s on the production volumes and designing/executing different kinds of tests like dry run, load tests, break point tests, Scalability test, failover, Volume and Long-Duration/Soak Test Scenarios
  • Closely monitoring the systems response times, transactions per second, CPU utilization, Memory usage using different monitoring tools such as Native Load Runner monitoring, Wily Introscope, Dynatrace.
  • Configured and used Nagios, Ganglia, and Dynatrace dashboards to monitor the application under test and analyze the performance of the servers and database by generating various reports.
  • Monitored Different kinds of Graphs including Throughput, Hits/Sec, Transaction Response time, Windows Resources (Memory Utilization, CPU Utilization, Threads, etc), JVM heap Size, database connection pool size while executing the scripts from Performance Center.
  • Used Splunk to monitor long running transactions, transactions by error codes and generating monitoring reports.
  • Tracking software quality metrics across testing phases (e.g., SIT, Performance, UAT, Automation, and Production Validation).
  • Analyzed test results and prepare detailed Performance Test Reports including the recommendations for process improvement.
  • Participating in root cause analysis of identified defects and enforcing entry and exit criteria throughout the SDLC; Identifying and escalating quality risks, issues, conflicts, and/or resistance across teams.
  • Preparation of Daily and Weekly status reports. Attending weekly defect report meetings and presented progress updates.
  • Participated in QA reviews and provided required support and clarification as needed for the reviewers.
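The workload-model extrapolation above can be sketched as a small calculation: convert an hourly business volume into a target transaction rate, apply a peak multiplier, and derive per-user pacing. All figures below are illustrative placeholders, not numbers from any actual engagement.

```java
// Sketch of a workload model (WLM) calculation. Hypothetical numbers only.
class WorkloadModel {

    // Target transactions/sec once a peak (e.g. holiday) multiplier is applied.
    static double targetTps(long txPerHour, double peakMultiplier) {
        return txPerHour * peakMultiplier / 3600.0;
    }

    // Pacing: seconds between iterations each virtual user needs so that
    // `vusers` users together produce `tps` transactions/sec.
    static double pacingSeconds(double tps, int vusers) {
        return vusers / tps;
    }

    public static void main(String[] args) {
        double tps = targetTps(36_000, 2.0);      // 36k/hour at a 2x peak -> 20 tx/s
        double pacing = pacingSeconds(tps, 100);  // 100 vusers -> 5 s pacing
        System.out.printf("target %.1f tx/s, pacing %.1f s%n", tps, pacing);
    }
}
```

The same arithmetic also works in reverse when sizing a scenario: fix the pacing a script can realistically sustain and solve for the vuser count.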

Environment: HP ALM, Performance Center, HP LoadRunner 12.50/12.53, JMeter 3.3, Java, SOA, Mainframes, IBM Sterling, ESB, MS SQL, webMethods, DataPower, Quality Center, Splunk, Nagios, Dynatrace 7.1, Jenkins, Wily Introscope, Ganglia, Apache Tomcat, TDA, MQ, JIRA.

Confidential, Chicago, IL

Lead Performance Engineer

Responsibilities:

  • Extensively involved in all Phases of Performance Test Life Cycle in Implementing New Process for Performance testing as well as Following the Existing Process for Multiple Applications.
  • Extensively involved in Performance testing of client-server, web services, web-based applications, and Mobile applications.
  • Gathered Requirements by collecting the information from Stakeholders. Determined performance requirements and goals based on requirements and architecture.
  • Responsible for performance testing using HP LoadRunner 12.53, JMeter 3.3 and Performance test Engineering using Multiple Monitoring Tools Available.
  • Used HP ALM Performance Center 12.53 and standalone Controllers to create scenarios and run performance tests.
  • Developed VUsers Scripts in Web (HTTP/HTML), Java Vuser, .Net, Web Services protocol and Database Protocols.
  • Extensively used the performance monitoring tools Dynatrace, AppDynamics, Wily Introscope, and SiteScope to analyze system resource bottlenecks such as memory leaks, CPU and network bottlenecks, and problematic application and DB components.
  • Designed varieties of Scenarios for Baseline, Benchmark, Load, Regression, Stress, Steady state and Endurance Testing.
  • Performed load, stress testing and other needed performance testing to assess and improve the application performance.
  • Developed and updated performance Test cases matching the requirements and updating Use Cases for current projects.
  • Analyzed results using LoadRunner Analysis tool based on Transaction per Second, Average Response times and resource usage to meet the SLA (Service Level Agreements)
  • Created custom Java code in LoadRunner to simulate load on MQ messages.
  • Imported jar files to create Java Scripts for doing the Performance tests such as load test, scalability test and soak test.
  • Created and Analyzed the Heap & Thread dumps and helping the development team to fine-tune the application performance.
  • Extensively Worked on Java memory management (JVM), analyzing garbage collection and thread analysis.
  • Coordinated with Middleware, legacy and database teams to monitor data while Performance Testing.
  • Highly involved in monitoring middleware application server's performance metrics like Thread count, JVM heap size, queue size etc.
  • Analyzed the system resource graphs, network monitor graphs and error graphs to identify transaction performance, network problems and scenario results respectively.
  • Monitored and administrated hardware capabilities to ensure the necessary resources are available for all the tests.
  • Performed online monitoring of Disk, CPU, Memory and Network usage while running the load test.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Involved in creating Dynatrace & App Dynamics dashboard, reports using built-in and/or custom measures to present testing and analysis results effectively
  • Analyzed performance critical transactions using tagged web requests, Pure paths and Stack Trace of the Transactions to trace bottlenecks
  • Analyzed Heap behavior, throughputs and pauses in Garbage collections as well as tracking down memory leaks.
  • Responsible for Setting up user profiles, configuring and adding application servers on Dynatrace tool
  • Investigated and troubleshoot performance problems in Performance Tests and production Environments.
  • Responsible for analyzing application and components behavior and optimizing server configurations.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete Performance Test report.
  • Presented performance statistics to application teams, and provided recommendations on the issues that impact performance.
  • Interacted with developers during testing for identifying and fixing bugs for optimizing server settings at web, app and database levels.
  • Worked with developers, business analysts, and release managers to discuss ways to fix the defects.
  • Discussed the analysis with the Stakeholders and presented the risks before the release went to production
  • Maintained defect status and reported testing status weekly and monthly using defect tracking tools.
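The results analysis above (transactions per second and average response times checked against SLAs) boils down to aggregating sampled response times; a rough sketch using the nearest-rank percentile method, with invented sample values:

```java
import java.util.Arrays;

// Rough sketch of offline response-time analysis against an SLA:
// average and nearest-rank 90th percentile over a set of samples.
class SlaCheck {

    // Nearest-rank percentile: value at position ceil(p/100 * n) in the sorted samples.
    static double percentile(double[] samples, double p) {
        double[] s = samples.clone();
        Arrays.sort(s);
        int rank = (int) Math.ceil(p / 100.0 * s.length);
        return s[Math.max(0, rank - 1)];
    }

    static double average(double[] samples) {
        return Arrays.stream(samples).average().orElse(0.0);
    }

    public static void main(String[] args) {
        // Invented response times in seconds.
        double[] rtSeconds = {0.8, 1.2, 0.9, 3.5, 1.1, 0.7, 2.0, 1.4, 0.6, 1.0};
        System.out.printf("avg %.2fs, p90 %.2fs%n",
                average(rtSeconds), percentile(rtSeconds, 90));
    }
}
```

Reporting a percentile alongside the average matters because one slow outlier (like the 3.5 s sample here) can inflate the mean while most users see sub-second responses.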

Environment: HP ALM, Performance Center, HP LoadRunner 12.50/12.53, JMeter 3.3, Java, .Net, MS SQL Server, IIS, Quality Center, Axman, ClearQuest, AppDynamics, Dynatrace 6.1, Jenkins, Wily Introscope, Apache Tomcat, MQ, SiteScope, JIRA, JVisualVM, JConsole.

Confidential, Minneapolis, MN

Sr. Performance Engineer

Responsibilities:

  • Involved in Requirements Gathering, Test Plan creation, Analysis and Reporting.
  • Responsible for monitoring production, finding performance issues, and replicating them in the performance testing environment for a fix.
  • Performance testing of web applications, Siebel web applications, AJAX, Oracle EBS, and mobile native and mobile web applications using HP LoadRunner 11.50/12.02 and JMeter.
  • Web services testing using JMeter and LoadRunner.
  • Worked with the SME/Tech team to get the test data necessary for generating and running Vuser scripts.
  • Enhanced Load Runner scripts to test the new builds of the application.
  • Used CA Wily Introscope, NMon, and HP SiteScope to monitor the production and performance testing environments.
  • Responsible for monitoring the performance of the applications, finding bottlenecks, and tuning.
  • Monitored mobile native apps in the production and performance testing environments using Crittercism.
  • Created AWR reports for every test run and held review meetings with the DBAs.
  • Responsible for determining the room for Performance improvement for any Application or a Service while testing, Implementation and retesting.
  • Used JIRA for reporting the Performance Bottleneck Found.
  • Responsible for setting up the environments for performance testing and coordinating with HP Customer Service.
  • Used Rabbit and Jenkins for deploying the code into the performance testing environments.
  • Involved in creating Dynatrace dashboard and reports using built-in and/or custom measures to present testing and analysis results effectively
  • Analyzing performance-critical transactions using tagged web requests and PurePaths to trace bottlenecks.
  • Analyze Heap behavior, throughputs and pauses in Garbage collections as well as tracking down memory leaks.
  • Tuned number of full GC and its CPU spikes at high memory conditions by increasing heap size and thereby eliminating JVM abnormalities
  • Capacity planning/sizing recommendations (e.g., increased JVM heap memory, JVM database connections, additional JVMs, additional hosts) based on current production metrics and capacity baselines.
  • Suggested configuration and tuning parameters to project teams for the JVM, server.xml, timeouts, logs, and other areas of improvement.
  • Monitored Java GC overhead to see if it was approaching 100%, which would indicate the process was spending all its cycles on GC because the heap was full.
  • Monitored JVM metrics using JProfiler and Java Diagnostics: memory (heap/non-heap usage, garbage collection graph), threads (active and idle thread counts per pool, number of active threads in the JVM), HTTP sessions (active, expired, and rejected session counts), and app server transactions (active, top-level, nested, aborted, and committed transactions).
  • Monitored average number of executions that happens during the execution of every SQL Statement in Oracle db including Read, write, block size, Buffer Cache size, SQL area size.
  • Monitored the memory usage when running the long duration Soak tests to ensure no memory leaks were observed.
  • Reviewing the deployments folders and tuning to improve the Performance.
  • Monitored and analyzed the server logs to fix any bugs or issues while executing the required tests.
  • Created a strategy for isolating performance bottlenecks: measuring end-to-end time with breakdowns (client, app server, gateway, F5, and network time) for every project to determine where performance improvements could be implemented.
  • Responsible for load, stress, volume, spike, scalability, failover, and other tests to determine the performance of the application.
  • Architecture review with the Project Teams along with the PE manager.
  • Analyzed test results using the HP Analysis tool based on Average response times, Throughput, Hits per seconds and Resource usage to ensure they meet the defined SLA’s.
  • Responsible for giving accurate project status reports to managers and business partners to ensure the deadlines are met.
  • Updated stakeholders on performance results by generating LoadRunner Analysis reports, JMeter reports, and manual reports covering tests, findings, and tuning implementations.
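The GC-overhead check described above (watching whether the JVM spends nearly all its cycles collecting) can be sketched with the standard `java.lang.management` beans. The ratio helper is deterministic; the live readings in `main` naturally vary per JVM and collector.

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Sketch of a GC-overhead check: the fraction of JVM uptime spent in GC.
// A value approaching 1.0 (100%) suggests the heap is effectively full.
class GcOverhead {

    static double overhead(long gcMillis, long uptimeMillis) {
        return uptimeMillis <= 0 ? 0.0 : (double) gcMillis / uptimeMillis;
    }

    public static void main(String[] args) {
        long gcMillis = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            long t = gc.getCollectionTime();   // -1 if the collector does not report it
            if (t > 0) gcMillis += t;
        }
        long uptime = ManagementFactory.getRuntimeMXBean().getUptime();
        System.out.printf("GC overhead: %.2f%%%n", 100 * overhead(gcMillis, uptime));
    }
}
```

During a soak test, sampling this ratio periodically and watching it trend upward is a cheap first signal of a memory leak before heap dumps are taken.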

Environment: HP LoadRunner 11.50/12.02, JMeter 2.90, CA Wily Introscope, NMon, HP SiteScope, Java/J2EE, SOA, Web Services, REST, SOAP, API, XML, HTML, Oracle, Apache, JBoss, F5, VMware, Unix, Linux.

Confidential, Mooresville, NC

Performance Test Lead

Responsibilities:

  • Responsible for Test Design (Test case/ Test Data generation), Defect tracking, Reporting and Reviews of Test Execution.
  • Managed multiple stakeholders in Onsite-Offshore setup, Involved in all Performance engineering activities
  • Actively participated in the creation of Performance Test artifacts including test strategy and test plans, entry and exit criteria, test execution reports, defect/issues reports, action/item reports, and project plans.
  • Analyzed requirements and product specifications to determine the test objectives and the appropriate level and type of testing needed, executed automatic test scripts for Performance and Load Testing using HP LoadRunner 11.0/11.52 /12.0, HP ALM-Performance Center 11.0/11.52, and JMeter 2.8/2.9
  • Participated in requirements and design reviews to identify test scenarios/cases to be executed for Performance and Load Testing.
  • Designed the Performance Test Environment for accurate projection by capturing the details of Production Environment.
  • Used Load Runner Protocol Java VUsers, web Services and Web HTTP/HTML Protocols for Service Oriented Architecture, Web Services, API’s, and TIBCO Environments.
  • Depending on the production volumes captured from the Business, Designed the Load tests, Performance, Stress tests, Volume and Long-Duration/Soak Test Scenarios
  • Extensively used JMeter for Performance testing SOA, Web services and API’s.
  • Analyzed applications test Results and Environment behavior under high Loads and optimized server configurations.
  • Tested performance of J2EE, J2SE, SOA, and Apache Tomcat, Web sphere App Server, F5 and IBM Data Power Appliances.
  • Analyzed CPU Utilization, Memory usage, thread usage, Garbage collection, and DB connection to verify the performance of the Application.
  • Analyzed the network connections and logs to troubleshoot any network issues.
  • Used monitoring tools such as Transaction Viewer, HP Performance Center 12, and HP Diagnostics.
  • Generated performance graphs using tools such as Splunk and Foglight.
  • Used LoadRunner Analysis to Analyze the LoadRunner Performance results.
  • Ensure sufficient level of stakeholders’ participation in all phases of the Performance Testing life cycle.
  • Ensured appropriate stakeholders’ signoff is obtained where required on test artifacts and exceptions.
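Designing load scenarios from production volumes, as above, typically relies on Little's Law, N = X(R + Z): concurrent users equal throughput times response time plus think time. A minimal sketch with made-up figures:

```java
// Little's Law sketch: virtual users needed to sustain a target rate.
// N = X * (R + Z): users = throughput * (response time + think time).
class LittlesLaw {

    static double requiredVusers(double tps, double respSeconds, double thinkSeconds) {
        return tps * (respSeconds + thinkSeconds);
    }

    public static void main(String[] args) {
        // 20 tx/s with 2 s responses and 8 s think time -> 200 vusers (made-up figures).
        System.out.printf("%.0f vusers%n", requiredVusers(20.0, 2.0, 8.0));
    }
}
```

The same relation flags saturation during a test: if vusers are added but throughput stops rising, response time R must be absorbing the difference.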

Environment: LoadRunner 11.0/11.5/12.0, JMeter 2.8/2.9, HP ALM-PC 11.0/11.52, JIRA, HP Diagnostics, Transaction Viewer, Splunk, J2EE, Siebel, .Net, Web Services, TIBCO, XML, HTML, Oracle, WAS, MS IIS Server, WebSphere, DataPower, F5.

Confidential, Chandler, AZ & SFO, CA

Performance Engineer

Responsibilities:

  • Led multiple projects/efforts across multiple platforms, working with onshore and offshore resources, leading a team of eight engineers, and producing efficient results.
  • Assisted in exploring new software as escalated and produced the hand-off/exit report for application support teams.
  • Mentored systems engineers, architects, and other team members.
  • Provided subject matter expertise in core performance and analysis tools such as HP LoadRunner, HP Performance Center, Dynatrace, and Splunk.
  • Maintained cross-business responsibilities by providing end-user support within the company when issues arise.
  • Frequent interaction with the business to integrate knowledge of the business and Application Performance priorities.
  • Designed the Performance test environment coordinating with the infrastructure teams and Installed Open source (JMeter), Commercial Tools (LoadRunner) for Performance testing.
  • Used HP Performance Center 9.52 to create scenarios and run load tests, and HP LoadRunner 9.5/11.0 and JMeter to write Vuser scripts.
  • Assisted the team on Scripting using Web HTTP/HTML, .NET Applications, Siebel Web, RTE, AJAX True Client and Citrix ICA protocols for testing different applications.
  • Assisted the team to use Firebug, HTTP watch and other Developer Tools for efficient scripting.
  • Developed, maintained, recommended, documented, and suggested support tools and backend utilities for performance and capacity planning management.
  • Designed scenarios for Performance Testing, generating scripts and handling Correlation as well as parameterization using LoadRunner VUGen.
  • Developed scripts and scenarios for automated testing new and enhanced web based products using Load Runner
  • Created and coded a very flexible LoadRunner script that allowed for fast configuration changes during testing
  • Enhanced script by inserting Checkpoints to check if Virtual users are accessing the correct page which they are supposed to be accessing.
  • Utilized performance/monitoring tools, analyzing results, resolving performance related issues to include optimization and tuning recommendations.
  • Extensively monitored hardware (CPU, memory, disk I/O, network I/O) and memory (heap utilization, garbage collection time spent, minor collections, major collections).
  • Extensively monitored memory pools: CMS Old Gen, CMS Perm Gen, Code Cache, PS Eden Space, PS Old Gen, PS Perm Gen, PS Survivor Space, Par Eden Space, and Par Survivor Space.
  • Extensively monitored JVM properties, JVM startup options, JVM system options, environmental properties, JMX metrics, application EARs, JDBC connection pools, queues, web container runtime (HTTPS or AJP), JVM classes, GC, memory, and threads.
  • Supported performance design patterns, architecture reviews, capacity planning, code profiling, and root cause analysis.
  • Worked closely with development on the design and implementation of enhancements based on the tuning recommendations.
  • Validate the code provided by development is efficient and accurately addresses performance issues reported.
  • Wrote clear and concise performance reports for review with the stakeholders.
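The pool-by-pool monitoring listed above (Eden, Survivor, Old Gen, and so on) can be sketched with the standard `MemoryPoolMXBean` API; the pool names printed depend on which collector the JVM is running.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;

// Sketch: enumerate the JVM's memory pools and report how much of each
// pool's configured max is in use. Pool names vary by garbage collector.
class PoolMonitor {

    // Percent of max in use; pools with an undefined max (-1) report 0.
    static double usedPercent(long used, long max) {
        return max <= 0 ? 0.0 : 100.0 * used / max;
    }

    public static void main(String[] args) {
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            MemoryUsage u = pool.getUsage();
            System.out.printf("%-25s %6.1f%% of max%n",
                    pool.getName(), usedPercent(u.getUsed(), u.getMax()));
        }
    }
}
```

A rising Old Gen percentage across test iterations, while young pools stay flat, is the classic shape of objects surviving collections and accumulating.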

Environment: LoadRunner 9.5, JMeter 2.5/2.7, Performance Center 11.00/9.52, QC, Dynatrace, Splunk, Windows Performance Monitor, .Net, Java, Siebel, Web Services, API, XML, HTML, DB2, Oracle, MS SQL Server, MS IIS Server, WAS 6.0, WAS 7.0.

Confidential

Sr. Performance Tester

Responsibilities:

  • Served as a technical expert to IT groups in planning the resource requirements for systems under development.
  • Presented statistical availability and trend analysis and recommendations to IT management, IT leadership, and the business, as needed.
  • Served as a technical expert resolving critical and complex application performance issues. Identifying and driving optimization changes in the application design to improve customer experience for mission/business critical IT applications
  • Identified the testing objectives, planned Load Runner implementation and performed the simulation.
  • Created LoadRunner scripts to load test high traffic end user processes for performance and reliability.
  • Developed test scripts in VuGen. Enhanced the scripts by adding checkpoints, functions in C Language, transactions, rendezvous points, created parameters, and performed manual correlation to enhance recorded scripts.
  • Analyzed Throughput Graph, Hits/Second graph, Transactions per second graph and Rendezvous graphs using LR Analysis tool.
  • Designed performance test scenarios using HP Performance Center, ran stress tests and analyzed the results.
  • Conducted load and reliability testing on website’s workflows to identify and report performance bottlenecks.
  • Extensively developed various scenarios and performed performance and volume tests using Performance Center
  • Utilized HP Performance Center to synchronize LoadRunner Controller usage among the teammates in order to meet the software testing goals under tight deadlines
  • Infrastructure Monitoring using Dynatrace & Application Monitoring using End User Management & Business Availability Center BAC.
  • Monitored the DB server, mainly CPU usage, memory usage, reads/sec, writes/sec, locking, queries, average response times, concurrent invocations, connection counts, errors per interval, responses per interval, JDBC concurrent invocations, Oracle active connection count, and Oracle waiting-for-connection count.
  • Developed various reports and metrics to measure and track testing effort.
  • Attending weekly defect report meetings and presented progress updates.
  • Attending conference calls with offshore team to discuss the Testing status and to assign the defects to the concerned developers.
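The throughput and transactions-per-second graphs analyzed above come from driving many concurrent virtual users against the system; a deliberately stripped-down sketch of that loop (not LoadRunner itself, just the underlying idea, with a sleep standing in for a real request):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Stripped-down load driver: run `vusers` workers for `seconds`,
// count completed transactions, and report transactions/sec.
class TinyLoadDriver {

    static double run(int vusers, int seconds, Runnable tx) {
        ExecutorService pool = Executors.newFixedThreadPool(vusers);
        AtomicLong completed = new AtomicLong();
        long deadline = System.nanoTime() + seconds * 1_000_000_000L;
        for (int i = 0; i < vusers; i++) {
            pool.execute(() -> {
                while (System.nanoTime() < deadline) {
                    tx.run();                     // one "transaction"
                    completed.incrementAndGet();
                }
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(seconds + 5L, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return completed.get() / (double) seconds;
    }

    public static void main(String[] args) {
        // Stand-in transaction: a 10 ms sleep instead of a real request.
        double tps = run(4, 2, () -> {
            try { Thread.sleep(10); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        System.out.printf("~%.0f tx/s%n", tps);
    }
}
```

Real tools add pacing, ramp-up, and per-transaction timing on top of this loop; the counter-over-interval shape of the hits/sec graph is the same.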

Environment: LoadRunner 8.1/9.5, HP Quality Center, Dynatrace, IBM OS version 4690, POS version 6, Java, J2EE, VBScript, Oracle, SQL, Unix, Shell scripting, HTML, WebSphere.

Confidential

Performance Tester

Responsibilities:

  • Worked closely with the project team in planning coordination and implementing Performance Testing Methodology.
  • Gathered, consolidated requirements for generating performance goals and test plans.
  • Analyzed business requirements to better understand business logic and process flow.
  • Facilitating meetings with the development, project and business user’s teams to discuss issues and suggest resolution.
  • Prepared test plans for the performance testing activities to be conducted, and prepared a detailed test design for the various load tests to be run based on the requirements gathered.
  • Primarily responsible for the setup and execution of the test cases, and capture data related to a performance test
  • Performing Back end testing with extensive use of SQL Queries and UNIX commands.
  • Created Scripts and performed Test Execution in HP Performance center and Load runner.
  • Performed data parameterization in order to facilitate the Data-driven tests.
  • Wrote application specific functions using C language to add logic to load testing scripts
  • Collected performance test results and metrics and did detailed Analysis of logs.
  • Performed CPU and Memory monitoring and performance metrics extraction of web, application and database servers.
  • Detected performance bottlenecks related to Memory leaks in application server.
  • Closely worked with development team and guided them in finding and fixing the performance defects
  • Collaboratively worked with Capacity Planning to obtain the performance expectations of performance scenarios to be executed, provided results to validate current forecast models, and for future infrastructure architecture planning.
  • Walk through Load and Performance report with client for optimization and escalate issues when needed.
  • Prepared detailed test reports and summary report highlighting the different load tests conducted and the performance achievements made from the engagement.
  • Skilled at converting raw data into meaningful charts and graphs that show the pertinent results of the test in graphical context.
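The data parameterization mentioned above is essentially a shared feeder that rotates test-data values across virtual users. A minimal sketch, modeled loosely on how a rotating parameter file behaves; the user IDs are invented:

```java
import java.util.List;

// Minimal sketch of data-driven parameterization: hand out test-data
// values round-robin across virtual users, like a rotating parameter file.
class ParamFeeder {

    private final List<String> values;
    private int next = 0;

    ParamFeeder(List<String> values) { this.values = values; }

    // Each call returns the next value, wrapping back to the first.
    synchronized String next() {
        String v = values.get(next);
        next = (next + 1) % values.size();
        return v;
    }

    public static void main(String[] args) {
        ParamFeeder users = new ParamFeeder(List.of("user01", "user02", "user03")); // invented IDs
        for (int i = 0; i < 4; i++) System.out.println(users.next()); // 4th call wraps to user01
    }
}
```

The `synchronized` accessor matters under load: without it, two virtual-user threads could read the same row, which is exactly the duplicate-data failure parameter files are meant to prevent.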

Environment: HP LoadRunner 9.10, HP QTP, Performance Center, CA Wily Introscope, SharePoint, IBM Mainframe, HP SiteScope, Java, C, VBScript, TSL, XML, HTML, MS Office, SQL, PL/SQL, Siebel, WSDL, SOA, API, Web Services, XML Marker, PuTTY, WebSphere, IBM HTTP Server (IHS), Unix, and Windows.
