Sr. Performance Expert Resume
South Carolina
SUMMARY
- Specialized in performance testing applications using load-testing tools such as LoadRunner, Performance Center, VuGen and LoadRunner Analysis.
- Prepared test scripts, load tests, test data, test plans and test cases; executed tests, validated results, managed defects and reported results.
- Worked with LoadRunner Web (HTTP/HTML), SAP Web/GUI, Siebel, Web Services, Citrix and RTE protocols.
- Identified Real World Scenarios and Day in Life Performance Tests
- Complex Usage Pattern Analysis
- Developed Complex ‘C’ Libraries and Utility Functions for Code Reusability and Modularity
- Independently develop LoadRunner test scripts according to test specifications/requirements.
- Used LoadRunner to execute multi-user performance tests, leveraging online monitors, real-time output messages and other features of the LoadRunner Controller.
- Performance tested SAP Web, PeopleSoft and Documentum applications.
- Performed in-depth analysis to isolate points of failure in the application
- Proficient in complex C programming and VBScript.
- Uploaded test cases into HP Quality Center (HPQC) from Excel sheets.
- Feature testing in Agile methodology.
- Actively conducted and participated in monthly Sprint Planning and Review meetings for various streams.
- Participated in daily stand up meetings as part of Agile methodology to discuss work accomplished, planned work and raise issues.
- Extensively worked with batch jobs in UNIX
- Looked at the log files and manipulated files in UNIX
- Gave Sprint Feature Demos to business/product owners.
- Test activities included documenting test scripts from use cases in the Quality Center test case repository.
- Experience in Agile with Scrum
- Developed user stories, facilitated Scrum meetings and defined acceptance criteria.
- Discussed requirement and functionality queries with the BA and development teams across offshore and onshore.
- Strong SQL skills for verifying data.
- Reviewed project status updates provided by the offshore team and held daily conference calls to discuss progress.
- Performed manual testing involving data validation, static testing, system testing, SIT, end-to-end testing and UAT.
- Report, track, and monitor defects in the defect tracking system
- Tested application with different user roles like user, admin, manager and approver
- Tested all workflow tasks with appropriate data and permissions.
- Involved in business requirement meetings to understand client business rules/strategy.
- Worked through builds, releases and patches as planned in a cloud environment.
- Performed integration, regression, end-to-end, functionality, UI, migration, database/backend, data mapping, product verification, build verification and fix verification testing.
- Worked closely with the Business Analyst and QA management for requirements overview and clarification.
- Worked with developers to determine which aspects of the design and code were testable.
- Participated in Test Strategies Review meetings.
- Analyzed Functional Requirement Documents (FRDs) at the beginning of each enhancement cycle to understand all enhancement requirements.
- Wrote Test Cases and Test Scenarios using business requirements, Use Cases, Wireframes and UI Screenshots.
- Worked on production to find the current and projected user volume and transaction density.
- Conducted duration (soak) tests to assess system stability and detect memory leaks.
- Prepared Automation Test Plans and Test Data for Web Testing.
- Involved in root Cause Analysis for the problems in proposed architecture.
- Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Report.
- Monitored the Resources metrics to find the performance bottlenecks.
- Developed and implemented load and stress tests with Mercury LoadRunner, presented performance statistics to application teams and provided recommendations on how and where performance could be improved.
- Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
- Expertise in Automation testing using Quick Test Pro (QTP).
- Developed Driver Script, Startup Script and Utility Functions in QTP
- Expertise in developing automation frameworks such as data-driven, keyword-driven and reusable business component approaches.
- Experience in running various kinds of tests including Unit Testing, System and Integration Testing, Regression Testing for Web and Client/Server Applications.
- Expertise in problem solving and bug tracking using bug tracking tools.
- Very strong knowledge and experience in relational databases (RDBMS) such as Oracle, SQL Server and MS Access.
- Tested applications using SOAPUI
- Wrote Performance Test Plan and Test Case design document with the inputs from developers and functional testers.
- Conducted Data Driven testing using parameterization in UFT to test with different sets of data.
- Created UFT advance automation scripts using descriptive programming, modular approach, custom functions, logics, and looping for Regression testing.
- Used UFT checkpoints to automatically capture and verify properties such as the number of links.
- Wrote complex queries for database validation and conducted User Acceptance Testing (UAT) and regression testing on various builds of the application using UFT.
- Developed performance testing processes and procedures; responsible for performance testing of various Spansion third-party systems.
- Hands on experience with designing, developing and executing Test Strategies (Test cases/scenarios/scripts) for Load Testing the application using Loadrunner.
- Experienced in developing different test scenarios like Smoke Test, Scalability testing, Reliability testing, Stress testing, fault tolerance testing, Performance regression testing etc.
- Responsible for application and components behavior with heavier loads and optimizing server configurations
- Worked extensively with LoadRunner. Created Scenarios that ran tests with IP Spoofing with several processes and in multithreaded environments, and analyzed and generated performance reports
- Hands on experience configuring and using SiteScope Performance Monitor to monitor and analyze the performance of the server by generating various reports from CPU utilization, Memory Usage to load average etc.
- Extensively involved in performance tuning of application servers such as Weblogic and Tuxedo
- Extensively involved in performance tuning of web servers such as WebSphere.
- Involved in Performance tuning of database Servers.
- Generated load in the system using multiple controllers, Thin and Thick client at the same time.
- Well versed with the behavior of online monitors and the techniques to fix the monitoring issues and monitoring Vuser status.
- Excellent knowledge of programming languages such as C, C++, Java and SQL to debug and execute LoadRunner scripts.
- Experienced in all phases of SDLC, including identification of functional requirements, administration of test projects and defect tracking to ensure successful application delivery on time and within budget.
- Experienced in the use of Agile and RUP approaches, including Extreme Programming, Test-Driven Development.
- Strong technical expertise implementing various test phases, ranging from Smoke Testing, Unit testing, Integration testing, Functional testing, System testing, User Acceptance Testing, to Regression testing.
- Specialized in analyzing the functional specifications and creation of automated test scripts using Mercury Interactive Tools such as WinRunner and QTP.
- Knowledge of installation and administration of application servers such as WebSphere and WebLogic, and version control tools such as Rational ClearCase, CVS, VSS and StarTeam.
- Knowledge of installation and configuration of WebSphere application server monitoring solutions using the IBM ITCAM suite.
- Extensive experience using defect-tracking tools such as IBM Rational ClearQuest and Test Director.
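Several of the analysis bullets above come down to summary statistics, such as the 90th-percentile response time that a LoadRunner Analysis summary report shows. A minimal sketch of the nearest-rank percentile calculation, in plain Python with hypothetical sample data:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of transaction response
    times: a simple approximation of the percentile figures a
    LoadRunner Analysis summary report presents."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# hypothetical transaction response times in seconds
times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.9, 1.2, 3.1]
p90 = percentile(times, 90)   # 2.4 with this sample set
```

The nearest-rank method is only one convention; interpolating methods give slightly different figures on small samples.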
TECHNICAL SKILLS
Operating Systems: AIX, HP-UX, Linux, Solaris, Windows 2000
Languages: C, C++, TSL, JAVA, SQL and PERL.
Databases: Oracle, MS Access, DB2 and MS SQL server
Testing Tools: WinRunner, LoadRunner, Test Director and Quick Test Pro.
Web / Application Servers: Tomcat, IIS, Jboss, WebSphere and WebLogic.
Web Technologies: Struts Framework, Java Servlets, Java Beans, J2EE, JMS, JDBC, RMI, EJB, Swing and JSP.
XML and Web services: XML, DOM, SAX, XSLT, XPATH, XSD, WSDL and SOAP.
Defect tracking tools: PVCS Tracker, Test Director and Rational ClearQuest.
PROFESSIONAL EXPERIENCE
Confidential, South Carolina
Sr. Performance Expert
Responsibilities:
- Worked as an independent consultant for performance testing and coordinated with multiple vendors.
- Involved in preparing estimation, capacity matrix, test plan and details, capacity plan and performance strategy documents; conducted assessments and data modeling using Excel.
- Recorded and scripted tests, introduced dynamic navigation and parameterization, and executed the scripts.
- Designed and implemented performance test frameworks for improving test efficiency.
- Performed testing for No load, Medium Load and Full Load and analyzed the system response.
- Responsible for performance monitoring and analysis of response time & memory leaks using throughput graphs.
- Cooperated with Basis Team to understand requirements and issues around test execution environment.
- Validate testing infrastructure (connectivity, scripting/protocol compatibility with SAPGUI, SAP Web and virtual user capacity of hardware)
- Experience in Agile with Scrum
- Developed user stories, facilitated Scrum meetings and defined acceptance criteria.
- Used the MQ Client protocol to test WebSphere MQ.
- Feature testing in Agile methodology.
- Actively conducted and participated in monthly Sprint Planning and Review meetings for various streams.
- Participated in daily stand up meetings as part of Agile methodology to discuss work accomplished, planned work and raise issues.
- Gave Sprint Feature Demos to business/product owners.
- Test activities included documenting test scripts from use cases in the Quality Center test case repository.
- Developed and executed Test Scripts for Smoke Test prior to Build deployment.
- Interact with business analyst, system staff and developers.
- Conducted Data Driven testing using parameterization in UFT to test with different sets of data.
- Created UFT advance automation scripts using descriptive programming, modular approach, custom functions, logics, and looping for Regression testing.
- Used UFT checkpoints to automatically capture and verify properties such as the number of links.
- Wrote complex queries for database validation and conducted User Acceptance Testing (UAT) and regression testing on various builds of the application using UFT.
- Ran Excel macros from UFT and imported the results back into UFT.
- Used Custom Checkpoints, Data Driven, and Regular Expression in UFT.
- Involved in Keyword Driven automation framework demonstrations to all the stakeholders using UFT.
- Involved in documentation as to how to maintain and run scripts for future enhancements.
- Participate in weekly meeting with the management team and walkthroughs.
- Developing testing schedules and monitoring progress against the schedules.
- Maintained and documented operational procedures for each build during regression testing.
- Involved in manual test case execution using Quality Center and analyzed process of application and prepared framework for Automation
- Worked with mainframe based applications using green screens
- Tested applications using SOAPUI
- Reviewed, developed and executed automatic Test Scripts using QTP to perform regression testing.
- Managed resources and process of performance testing (like Load, Stress, Volume, Endurance and Failover) using LoadRunner (Controller, Virtual User Generator, Analysis) and Protocols used Web, Web Services
- Tested websites and web applications in Python using Django.
- Worked with SQL for database testing
- Extensively worked on UNIX to change the database connections, tracing logs, monitor resources of the machines, create users and execute batch jobs.
- Coordinated with tools team to Install Mercury Diagnostics, Wily Introscope and Sitescope on the performance environments for triage calls to identify the bottlenecks.
- Managed near-shore and off-shore team to develop test harness, execute performance scenarios during nights and weekends and report generation.
- Presented results of the performance testing along with Project Management team to the clients mainly senior management.
- Also involved with project management team to schedule the testing activities for the TST space and resource allocation.
- Expertise in implementing automation tool Mercury QTP
- Developed and enhanced scripts using LoadRunner VuGen and designed scenarios using Performance Center to generate realistic load on application under test.
- Uploaded test cases into HPQC from Excel sheets.
- Discussed requirement and functionality queries with the BA and development teams across offshore and onshore.
- Reviewed project status updates provided by the offshore team and held daily conference calls to discuss progress.
- Performed manual testing involving data validation, static testing, system testing, SIT, end-to-end testing and UAT.
- Report, track, and monitor defects in the defect tracking system
- Tested application with different user roles like user, admin, manager and approver
- Tested all workflow tasks with appropriate data and permissions.
- Involved in business requirement meetings to understand client business rules/strategy.
- Worked through builds, releases and patches as planned in a cloud environment.
- Performed integration, regression, end-to-end, functionality, UI, migration, database/backend, data mapping, product verification, build verification and fix verification testing.
- Worked closely with the Business Analyst and QA management for requirements overview and clarification.
- Worked with developers to determine which aspects of the design and code were testable.
- Participated in Test Strategies Review meetings.
- Analyzed Functional Requirement Documents (FRDs) at the beginning of each enhancement cycle to understand all enhancement requirements.
- Wrote Test Cases and Test Scenarios using business requirements, Use Cases, Wireframes and UI Screenshots.
- Extensive advanced programming of LoadRunner VuGen scripts for dynamic navigation.
- Creating the Test Scenarios, executing and generating reports using LoadRunner
- Performance Testing - Load testing, stress testing and soak testing of the application.
- Installed, customized and administered Performance Center, LoadRunner and QTP; troubleshot issues encountered and evaluated and performed upgrades of the tools in the Mercury suite.
- Configured Application Performance Analyzer for monitoring system resources and activity.
- Used SiteScope to monitor server metrics.
- Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards.
- Provide support to the development team in identifying real world use cases and appropriate workflows
- Performed in-depth analysis to isolate points of failure in the application
- Assist in production of testing and capacity reports.
- Created comprehensive analysis and test results report.
- Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in production.
- Created Test Schedules.
- Worked closely with clients
- Interfaced with developers, project managers and management in the development, execution and reporting of performance test results.
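The dynamic navigation work above hinges on correlation: capturing a server-generated value from one response and replaying it in later requests. A plain-Python sketch of the left/right-boundary capture that VuGen's web_reg_save_param performs (response body and boundaries here are hypothetical):

```python
def extract_between(body, left, right):
    """Capture the text between a left and right boundary in a
    server response: the boundary-based capture that VuGen's
    web_reg_save_param performs during correlation."""
    start = body.find(left)
    if start == -1:
        return None            # left boundary missing: correlation fails
    start += len(left)
    end = body.find(right, start)
    if end == -1:
        return None            # right boundary missing
    return body[start:end]

# hypothetical server response carrying a dynamic session ID
response = '<input name="sessionId" value="A1B2C3D4">'
session_id = extract_between(response, 'value="', '"')
```

In a real script the captured value would be substituted into subsequent requests in place of the recorded, now-stale value.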
Environment: LoadRunner, VuGen, Agile, Sprint, ALM, UAT, QTP, SAP Web, Android, iPhone, Performance Center, IIS, AIX, .NET, SQL, mainframe, WebLogic 8.1, MQ Series, MS Office, MS Visio, Java, Windows, Linux, HP Tivoli, LDAP, Shell, LAN, WAN
Confidential
Performance Analyst
Responsibilities:
- Manually correlated opportunity IDs, saving the dynamically changing IDs into a parameter by inspecting the body of the server response in LoadRunner.
- Used the web_reg_find function to search for text and verify that the desired pages were returned during replay in Virtual User Generator.
- Changed runtime settings such as pacing, think time, log settings, browser emulation and timeout settings in LoadRunner VuGen and the Controller to simulate real-world scenarios.
- Created various scenarios in LoadRunner controller for performing baseline, benchmark, stress tests and endurance tests
- Responsible for developing and executing performance and volume tests
- Develop test scenarios to properly load / stress the system in a lab environment and monitor / debug performance & stability problems.
- Extensively worked with batch jobs in UNIX
- Looked at the log files and manipulated files in UNIX
- Expertise in implementing automation tool Mercury QTP
- Feature testing in Agile methodology.
- Actively conducted and participated in monthly Sprint Planning and Review meetings for various streams.
- Participated in daily stand up meetings as part of Agile methodology to discuss work accomplished, planned work and raise issues.
- Gave Sprint Feature Demos to business/product owners.
- Test activities included documenting test scripts from use cases in the Quality Center test case repository.
- Developed and executed Test Scripts for Smoke Test prior to Build deployment.
- Strong SQL skills for verifying data.
- Strong knowledge in QTP Object repository creation and maintenance, regular expression, re-usable actions, data table, checkpoints and recovery scenarios
- Created TestPlan for testing effort and developed test cases from functional requirements and technical specification and use cases.
- Performed Manual testing to know the AUT well and also executed test cases manually to verify the expected results.
- Involved in writing complex SQL queries using TOAD to extract the data from Oracle database to conduct Backend Testing.
- Used extensive SQL queries to perform database testing, hence validated tests by cross checking data.
- Wrote smoke test plan for the Application under Test, which includes the basic test cases, which ensure that the application is stable enough to start functional testing.
- Involved in Smoke testing on all environments after every deployment.
- Used Rational ClearQuest for version control, tracking defects, enhancement requests, and assess the real status of project throughout the life cycle.
- Preparation of test data for various levels of testing.
- Performed GUI testing and functionality testing for front end screens.
- Partner with the Software development organization to analyze system components and performance to identify needed changes in the application design
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Tested web services applications using SOAPUI
- Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards
- Diagnosed performance bottle-necks, performed tuning (OS and the applications), retesting, and system configuration changes for application performance optimization.
- Performed baseline test with 1 user and 5 iterations and benchmark test under a load of 100 users using LoadRunner controller
- Used Scenario By Schedule in the controller to change the Ramp Up, Duration and Ramp Down settings
- Executed stress tests with a load of 225 users to find the breaking point of the application.
- Involved in development of Test Plan based on Functional Requirement Specifications and Use Cases in Test Director.
- Used ALM to create and maintain Test Requirements, bug tracking and reporting.
- Developed and executed Test Scripts for Smoke Test prior to Build deployment.
- Interact with business analyst, system staff and developers.
- Involved in documentation as to how to maintain and run scripts for future enhancements.
- Participate in weekly meeting with the management team and walkthroughs.
- Developing testing schedules and monitoring progress against the schedules.
- Maintained and documented operational procedures for each build during regression testing.
- Involved in manual test case execution using Quality Center and analyzed process of application and prepared framework for Automation.
- Analyzed the Transaction Summary Report and graphs generated in a LoadRunner Analysis session
- Created templates in the Analysis session and analyzed web page diagnostics to determine whether the server or the network was the bottleneck.
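The Scenario-by-Schedule settings mentioned above (ramp-up, duration, ramp-down) reduce to simple arithmetic; a sketch with hypothetical numbers:

```python
import math

def ramp_up_time(total_users, users_per_step, step_interval_s):
    """Seconds for a scenario to reach full load when the Controller
    starts `users_per_step` Vusers every `step_interval_s` seconds:
    the arithmetic behind a Scenario-by-Schedule ramp-up."""
    steps = math.ceil(total_users / users_per_step)
    return steps * step_interval_s

# hypothetical schedule: 100 Vusers, starting 5 every 30 seconds
t = ramp_up_time(100, 5, 30)   # 20 steps * 30 s = 600 s
```

The same calculation, run backwards, tells you what step size keeps the ramp-up inside a target window.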
Environment: QTP, LoadRunner, Performance Center, ALM, UNIX, SQL, PeopleSoft Web, Rational ClearQuest, SOAPUI, Quality Center, Test Director, HP BAC, VBScript, .NET, JavaScript, Java, JUnit, J2EE, JSP, IIS, XML/XSLT, Oracle
Confidential
Sr. Performance Expert
Responsibilities:
- Worked as an independent consultant for performance testing and coordinated with multiple vendors.
- Involved in preparing estimation, capacity matrix, test plan and details, capacity plan and performance strategy documents; conducted assessments and data modeling using Excel.
- Recorded and scripted tests, introduced dynamic navigation and parameterization, and executed the scripts.
- Designed and implemented performance test frameworks for improving test efficiency.
- Extensive advanced programming of LoadRunner VuGen scripts for dynamic navigation.
- Creating the Test Scenarios, executing and generating reports using LoadRunner
- Performance Testing - Load testing, stress testing and soak testing of the application.
- Installed, customized and administered Performance Center, LoadRunner and QTP; troubleshot issues encountered and evaluated and performed upgrades of the tools in the Mercury suite.
- Developed automated test scripts for functional testing using Mercury QTP and wrote external library functions to create Test Data.
- Used VBScript to enhance recorded scripts; parameterized fixed values to perform data-driven testing.
- Developed Descriptive Programming and Maintaining Local and Global Object Repository in QTP
- Wrote Performance Test Plan and Test Case design document with the inputs from developers and functional testers.
- Extensively used LoadRunner using Virtual User Generator to script and customize performance test harness Web Protocol.
- Extensively used Controller to generate load and define load generators.
- Used Test Results to provide summary reports, response times and monitor averages.
- Dealt with business team to get the performance requirements for the Load Testing, Stress Testing and Capacity Planning.
- Extensive familiarity with protocols like HTTP / HTML.
- Extensively used other features like parameterization, correlation and configured monitors for WebSphere, MQ Series and Database.
- Responsible for analyzing requirements, designing, debugging, executing and generating reports for the existing legacy system and the new Panama application.
- Responsible for creating a base line and executing Performance, endurance testing.
- Measured the web based applications for Transaction Response Time for Business critical transactions.
- Created Test Cases and scenarios for Unit, Regression, Integration as well as Back - end and System testing.
- Developed Quick Test Pro scripts for performing Functional and Regression testing.
- In-depth testing of the application enhanced scripts using Checkpoints and Synchronization points.
- Debugged the application to identify and check that the scripts run smoothly.
- Wrote SQL statements to extract data from Tables.
- Able to effectively contribute in quality inspection activities such as requirements and design reviews
- Configured Application Performance Analyzer for monitoring system resources and activity.
- Used SiteScope to monitor server metrics.
- Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards.
- Assist in production of testing and capacity reports.
- Created comprehensive analysis and test results report.
- Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in production.
- Created Test Schedules.
- Worked closely with clients
- Interfaced with developers, project managers and management in the development, execution and reporting of performance test results.
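Parameterization, as used in the scripting work above, amounts to substituting each recorded fixed value with a row from a data table on every iteration; a minimal sketch (the CSV contents are hypothetical, standing in for a VuGen .dat parameter file):

```python
import csv
import io

# hypothetical parameter file, as VuGen would read from a .dat file
DATA = """username,password
user01,pass01
user02,pass02
"""

def iterate_logins(raw):
    """Yield one (username, password) pair per iteration: the
    per-iteration substitution that data-driven parameterization
    performs in place of a single recorded login."""
    for row in csv.DictReader(io.StringIO(raw)):
        yield row["username"], row["password"]

logins = list(iterate_logins(DATA))
```

VuGen additionally offers selection policies (sequential, random, unique); this sketch shows only the sequential case.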
Environment: LoadRunner, VuGen, PeopleSoft Web, QTP, Performance Center, IIS, AIX, .NET
Confidential
Lead Performance Test Engineer
Responsibilities:
- Identified the Test Cases for Performance as per the Client requirements and created the Test Cases in Quality Center
- Performance Tested SOA Based application using Web Services Protocol
- Involved in conducting benchmark, stress and volume tests against the application using LoadRunner.
- Interacted with the Business Analyst to discuss gaps in the performance requirements.
- Developed the Performance Test Plan and Load Test Strategies.
- Conducted Performance Testing to achieve Performance Testing Objectives like bottleneck identification, measuring reliability and capacity of the applications and finding out optimal hardware configuration
- Installed LoadRunner and Sitescope on Windows machines.
- Worked with LoadRunner Web (HTTP/HTML), SAP Web/GUI, Web Services and Citrix protocols.
- Developed Requirement Traceability Matrix (RTM) to track requirement during QA Testing phase.
- Constantly verified the Requirement Traceability Matrix to confirm that every functional requirement was covered.
- Conducted GAP Analysis to identify modifications and business requirement.
- Expertise in implementing automation tool Mercury QTP
- Strong knowledge in QTP Object repository creation and maintenance, regular expression, re-usable actions, data table, checkpoints and recovery scenarios
- Created TestPlan for testing effort and developed test cases from functional requirements and technical specification and use cases.
- Performed Manual testing to know the AUT well and also executed test cases manually to verify the expected results.
- Involved in writing complex SQL queries using TOAD to extract the data from Oracle database to conduct Backend Testing.
- Used extensive SQL queries to perform database testing, hence validated tests by cross checking data.
- Wrote smoke test plan for the Application under Test, which includes the basic test cases, which ensure that the application is stable enough to start functional testing.
- Involved in Smoke testing on all environments after every deployment.
- Used Rational ClearQuest for version control, tracking defects, enhancement requests, and assess the real status of project throughout the life cycle.
- Preparation of test data for various levels of testing.
- Performed GUI testing and functionality testing for front end screens.
- Tested the Functionality and Performance of the application using automated test tools as well as by manual testing.
- Responsible for load testing Oracle Application.
- Identified system capacity, scalability and stability under steady load as well as under peak load.
- Developed LoadRunner scripts for Data Creation and Functionality of SAP CRM, ERP via the SAP GUI.
- Used LoadRunner to simulate multiple Vuser scenarios; defined rendezvous points to create intense load on the server and thereby measure server performance under load.
- Verified the connectivity from Controller to the Load Generator. Utilized the IP address of Load Generators to add them to the Controller.
- Traced deadlocks and expensive SQL queries and tested procedures (MS SQL Profiler, Oracle Performance Manager).
- Responsible for developing baseline Scenarios and Load Testing Harnesses for load/performance testing of the application.
- Performed testing for No load, Medium Load and Full Load and analyzed the system response.
- Responsible for performance monitoring and analysis of response time & memory leaks using throughput graphs.
- Developed and enhanced scripts using LoadRunner VuGen and designed scenarios using Performance Center to generate realistic load on application under test.
- Developed the performance test plan as well as detailed performance analysis reports and graphs (including LoadRunner built-in graphs and custom MS Excel graphs).
- Coordinated creation of stress environments to conduct stress/load testing.
- Conducted Load Test for multiple users using Load Runner.
- Extensively used LoadRunner monitors to identify bottlenecks.
- Enhanced Vuser scripts by introducing timer blocks and parameterizing data values to run the scripts for multiple users.
- Analyzed the load test results including transactions by drilling down, merged graphs (overlay graphs, correlate graphs), cross result graphs and auto correlating measurements and thus focusing on behavior patterns and identifying problematic elements using the LoadRunner Analysis tool.
- Manually correlated session IDs and database primary keys, saving the dynamically changing values into parameters by inspecting the body of the server response.
- Created Scenarios by using the Scenario by Schedule and Scenario by Group in the LoadRunner Controller.
- Interacted with the Management during the Load Test period to discuss CPU Usage, Memory, Network Speed and Load Balancing issues.
- Created Templates to generate Reports in Sitescope and sent the Reports to the Management.
- Analyzed the Transaction Summary Report and Graphs generated in a LoadRunner Analysis session.
- Developed performance and scalability characteristics of commonly used SQL queries, decreased Shared, Updated, Exclusive Locks by isolating objects being locked and increased efficiently handling of transactions.
- Prepared the Detailed Performance Test Report and sent it to the Management.
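Sizing the load behind scenarios like those above (how many Vusers sustain a target transaction rate) follows Little's Law, N = X * R, where R here is the full iteration time including think time; a sketch with hypothetical targets:

```python
import math

def required_vusers(target_tps, avg_response_s, think_time_s):
    """Little's Law (N = X * R): concurrent Vusers needed to sustain
    `target_tps` transactions per second when each iteration takes
    avg_response_s + think_time_s seconds end to end."""
    return math.ceil(target_tps * (avg_response_s + think_time_s))

# hypothetical target: 50 tx/s, 1.5 s response time, 8.5 s think time
n = required_vusers(50, 1.5, 8.5)   # 50 * 10 s = 500 Vusers
```

The same relation explains why cutting think time (or adding pacing) changes the Vuser count needed for a given throughput.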
Environment: LoadRunner, SAP, QTP, Citrix, Web HTTP/HTML, Windows 2000/NT, UNIX, SOA, Oracle, DB2 and XML/SOAP
Confidential
Sr. Performance Test Engineer
Responsibilities:
- Responsible for implementing LoadRunner based infrastructure, including: Architecting the load testing infrastructure, hardware & software integration with LoadRunner.
- Prepared test scripts, load tests, test data, test plans and test cases; executed tests, validated results, managed defects and reported results.
- Identified Real World Scenarios and Day in Life Performance Tests
- Complex Usage Pattern Analysis
- Developed Complex ‘C’ Libraries and Utility Functions for Code Reusability and Modularity
- Independently develop LoadRunner test scripts according to test specifications/requirements.
- Used LoadRunner to execute multi-user performance tests, leveraging online monitors, real-time output messages and other features of the LoadRunner Controller.
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards
- Diagnosed performance bottlenecks; performed tuning (OS and applications), retesting and system configuration changes for application performance optimization.
- Manually correlated opportunity IDs, saving the dynamically changing IDs into a parameter by inspecting the body of the server response in LoadRunner.
- Used the web_reg_find function to search for text and verify that the desired pages were returned during replay in Virtual User Generator.
- Changed runtime settings such as pacing, think time, log settings, browser emulation and timeout settings in LoadRunner VuGen and the Controller to simulate real-world scenarios.
- Created various scenarios in LoadRunner controller for performing baseline, benchmark, stress tests and endurance tests
- Performed baseline test with 1 user and 5 iterations and benchmark test under a load of 100 users using LoadRunner controller
- Used Scenario By Schedule in the controller to change the Ramp Up, Duration and Ramp Down settings
- Performed in-depth analysis to isolate points of failure in the application
- Analyzed production data to determine current and projected user volume and transaction density.
- Conducted duration (soak) tests to assess system stability and detect memory leaks.
- Prepared automation test plans and test data for web testing.
- Involved in root-cause analysis of problems in the proposed architecture.
- Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report.
- Monitored resource metrics to identify performance bottlenecks.
- Developed and implemented load and stress tests with Mercury LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
- Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
- Developed performance-testing processes and procedures; responsible for performance testing of various Spansion third-party systems.
- Designed, created, and executed business-realistic scenarios across multiple LOBs.
- Acted as the LoadRunner expert; met with SDC engineers to determine performance requirements and goals, and defined test strategies based on requirements and architecture.
- Performed performance testing using LoadRunner; developed test scripts and scenarios.
- Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Developed performance test plans for new application releases and coordinated the performance engineering team through completion of performance-testing projects.
- Executed performance test scenarios and analyzed results.
- Wrote custom LoadRunner functions and programs to support load-testing efforts, monitored resources to identify performance bottlenecks, analyzed test results, reported findings to clients, and provided recommendations for performance improvements as needed.
- Logged and prioritized performance bugs and worked with the engineering and program management teams on timely resolutions.
- Interacted with SDC engineers to resolve issues and defects.
Environment: LoadRunner, BAC EUM, Windows 2000/NT, UNIX, SOA, Oracle 10g, DB2 and XML/SOAP.
Confidential
Sr. Performance Engineer
Responsibilities:
- Responsible for capacity planning for PLIQ application to meet changing demands.
- Used Wily Introscope for performance data, problem solving, trend analysis, and capacity planning.
- Responsible for writing the performance test plan and performance strategy document.
- Prepared test scripts, test data, test plans, and test cases; executed load tests, validated results, managed defects, and reported results.
- Analyzed production data to determine current and projected user volume and transaction density.
- Conducted duration (soak) tests to assess system stability and detect memory leaks.
- Prepared automation test plans and test data for web testing.
- Involved in root-cause analysis of problems in the proposed architecture.
- Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report.
- Monitored resource metrics to identify performance bottlenecks.
- Developed and implemented load and stress tests with Mercury LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
- Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
- Developed performance-testing processes and procedures; responsible for performance testing of various Spansion third-party systems.
- Developed VuGen test scripts in LoadRunner for Oracle Forms and JSP pages using the NCA and HTTP protocols.
- Tested web services applications using a SOAP client as well as WSDL files.
- Developed Web Services Vuser scripts for web service calls using SoapUI.
- Interacted directly with developers and project managers on the development, execution, and reporting of all testing efforts.
- Executed tests through the Controller, including ramp-up and ramp-down of several hundred virtual users and run-time configuration.
- Established tests for applications with fully developed user interfaces as well as those without.
- Reported test results and their interpretation to management.
- Actively took ownership of defects and coordinated with different groups from initial discovery through final resolution.
- Analyzed LoadRunner online graphs and reports to identify network/client delays, CPU/memory usage, I/O delays, database locking, and other server-level issues.
- Responsible for analyzing results, reports, and charts to assess the response times of individual transactions relative to the application as a whole.
- Coordinated a daily status call for technical and non-technical audiences on test progress.
- Responsible for creating LoadRunner scripts; made the scripts more dynamic through parameterization and correlation.
- Acted as the LoadRunner expert; trained and mentored junior team members.
- Determined test strategies based on non-functional requirements and architecture.
- Designed and developed performance test scenarios and test data for the company's applications, APIs, and data-processing engine.
- Baselined the performance tests.
- Executed performance test scenarios, analyzed results, and reported findings to the project manager.
- Profiled slow-performing areas of the application and system resources, identifying bottlenecks and opportunities for performance improvement using the Wily Introscope tool.
- Studied application performance and maximum scalability via critical parameters such as number of users, response times, hits per second (HPS), and throughput using LoadRunner.
- Analyzed various graphs generated by LoadRunner Analysis, including database monitor, network monitor, user, error, transaction, and web server resource graphs.
- Identified performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Logged and prioritized performance bugs and worked with the engineering and program management teams on timely resolutions.
- Tuned systems for optimal performance and characterized systems on multiple platform and configuration combinations.
- Worked closely with the application architecture, development, and management teams to resolve issues and defects.
Environment: Performance Center, LoadRunner, TestDirector, Windows 2000/NT, UNIX, Java, J2EE, JUnit, XML, RUP, Oracle 10g, DB2, WebSphere, SOA, Struts, EJB, IIS and XML/SOAP.