Sr. Performance Engineer Resume
Boston, MA
PROFESSIONAL SUMMARY:
- 9 years of professional experience in software testing and quality assurance, with exposure to all phases of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC). Strong analytical, verbal, and interpersonal skills, with the ability to work independently.
- Experience in testing Web-based, Client/Server, and SOA applications.
- Diverse experience in information technology, with an emphasis on performance testing and performance engineering using various performance-testing and monitoring tools.
- Experience in performance testing on multi-tier environments across different software platforms.
- Expertise in developing Test Plans, Test Functional /Non-Functional automation scripts, creating test scenarios, analyzing test results, reporting Bugs/Defects, and documenting test results.
- Expertise in different methodologies such as Agile, Scrum, and Waterfall, as well as mobile testing.
- Experience collaborating with key stakeholders: business representatives, product/project managers, developers, DBAs, infrastructure leads, architects, middleware teams, etc.
- Strong experience in designing, documenting, and executing test cases using Mercury Quality Center.
- Created performance scenarios and scripts for various types of tests (load, stress, baseline/benchmark/capacity).
- Good experience in Unit, Smoke, Sanity, Functional, Regression, Integration, System, Load, Performance, Stress and UAT testing methodologies.
- Sound knowledge of load, stress, and performance testing using tools such as LoadRunner, Performance Center, ALM Performance Center, QALoad, and JMeter.
- Expertise in recording and coding VuGen scripts using different protocols in all types of environments.
- Expertise in parameterization, manual correlation, run-time settings, and C.
- Excellent knowledge and skills in test monitoring: transaction response times, web server metrics, Windows/Linux/AIX system resources, web and application server metrics, database metrics, and J2EE performance.
- Experience in analyzing performance bottlenecks, root-cause analysis, and server configuration problems using LoadRunner monitors, Analysis, SiteScope, and J2EE Diagnostics.
- Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection.
- Monitored and analyzed the performance of web applications and database servers.
- Excellent knowledge of programming languages such as C, C++, Java, .NET, and SQL, used to debug and execute LoadRunner scripts.
- Worked on various LoadRunner protocols: Web (HTTP/HTML), Web Services, Siebel Web, Flex, Citrix, SAP-GUI/Web, .NET, Java Vuser, Java over HTTP, Sybase CTlib/DBlib, Oracle 2-Tier, Ajax, and multi-protocol.
- Sound knowledge of WAN emulation testing using Shunra and Wireshark.
- Proficient in monitoring all tiers (web, app, network, DB, mainframes) in QA, Pre-Prod (performance), and Prod environments using various monitoring tools such as SiteScope, HP Diagnostics, Wily Introscope, Dynatrace, AppDynamics, TeamQuest, Perfmon, Nmon, etc.
- Good knowledge of SOA, DB, TIBCO, Siebel, PeopleSoft, SAP, Java, J2EE, and .NET-based applications.
- Well experienced in installing, configuring, and maintaining the performance-test tool environment (hands-on experience with all versions of LoadRunner and JMeter).
- Prioritizes tasks and accomplishes deliverables on time while working independently with minimal supervision.
- Well versed in different defect-tracking tools (Test Director, Quality Center, ALM, JIRA, Rational ClearQuest, etc.) for managing defects.
- Used Toad to run SQL queries with joins and sub-queries for data creation and validation; used SQL Server Profiler, Oracle 10g database connectivity (TNS file configuration), and AWR reports for optimal performance and scalability.
- Experience in Tuning and resolving performance bottlenecks
- Extensive work experience in ETL processes consisting of data sourcing, data transformation, mapping, and loading of data from multiple source systems into a data warehouse; worked with stakeholders on requirements gathering, architecture review, and results analysis.
- Good analytical, interpersonal, and communication skills. Driven, committed, and hardworking, with a quest to learn new technologies and undertake challenging assignments.
TECHNICAL SKILLS:
Testing Tools: HP LoadRunner 8.1/9.5/11.0/11.50/12.02/12.50, HP Performance Center 9.5/11.0/11.5/12, ALM 12.50, HP Quality Center, JMeter 2.5/2.7/2.8/2.9/2.10, SoapUI, UFT, Fiddler.
Languages: Microsoft C#, C, C++, .Net, Java
Markup/Scripting Languages: DHTML, CSS, jQuery, JavaScript, XML, HTML
RDBMS: MS SQL, Microsoft Access, SQL Server, Oracle Database
Operating Systems: AIX, HP-UX, Solaris, UNIX, Windows XP/2003/2000/Vista, Windows NT, and Linux
Monitoring Tools: Performance Center, Wily Introscope, SiteScope, Dynatrace, AppDynamics, HP Diagnostics, Transaction Viewer, Splunk, Windows Performance Monitor, Nmon, Fiddler.
PROFESSIONAL EXPERIENCE:
Confidential, Boston, MA
Sr. Performance Engineer
Responsibilities:
- Involved in requirements gathering and developing performance test plans.
- Working on the PeopleSoft HR/CMS application, v9.2.21.
- Migrated the PeopleSoft application to the AWS cloud.
- Created, executed, and analyzed performance test scripts using LoadRunner 12.53/12.55 and collected performance metrics for fine-tuning.
- Involved in the creation of the performance-test (PT) calendar, which gives the details of the load-test schedules.
- Developed and executed the test scripts for baseline, load, stress, and endurance (soak) testing.
- Parameterized test data to accurately depict production trends.
- Performed baseline, load, stress, and endurance (soak) tests by simulating multiple Vusers using LoadRunner.
- Inserted transaction points to measure the performance of the application in LoadRunner.
- Validated the scripts by running multiple iterations to make sure they executed correctly and met the scenario description.
- Configured SiteScope to capture runtime metrics from web servers and application servers.
- Executed test scenarios and monitored web, app, and DB servers' CPU usage, heap, throughput, hit ratio, transaction response times, and garbage collections.
- Prepared test analysis and test result summaries as HTML reports, and presented all runtime metrics with graphs in Excel.
Environment: HP LoadRunner 12.0/12.55, Performance Center 11.5/ALM, Dynatrace, AppDynamics, Windows XP, VuGen, integration servers, Windows 2008, Windows Vista, web applications, XML files, JConsole, Java VisualVM, SoapUI.
Confidential, Boston, MA
Sr. Performance Test Analyst
Responsibilities:
- Involved in requirements gathering and developing performance test plans.
- Created, executed, and analyzed performance test scripts using LoadRunner 12.53 and UFT 12.5, and collected performance metrics for fine-tuning.
- Involved in the creation of the performance-test (PT) calendar, which gives the details of the load-test schedules.
- Developed and executed the test scripts for baseline, load, stress, and endurance (soak) testing.
- Performed load, stress, and smoke tests for the scripts created in UFT by simulating multiple users in LoadRunner.
- Validated scripts by running multiple iterations in UFT, importing data from the Excel sheet into the Global sheet.
- Parameterized test data to accurately depict production trends.
- Performed baseline, load, stress, and endurance (soak) tests by simulating multiple Vusers using LoadRunner.
- Inserted transaction points to measure the performance of the application in LoadRunner and UFT.
- Validated the scripts by running multiple iterations to make sure they executed correctly and met the scenario description.
- Configured SiteScope to capture runtime metrics from web servers and application servers.
- Executed test scenarios and monitored web, app, and DB servers' CPU usage, heap, throughput, hit ratio, transaction response times, and garbage collections.
- Prepared test analysis and test result summaries as HTML reports, and presented all runtime metrics with graphs in Excel.
- Monitored application behavior with AppDynamics/Dynatrace by creating custom dashboards and reports.
- Wrote LoadRunner 12.5 scripts, enhanced scripts with C functions, parameterized cookies, and stored dynamic content using LoadRunner functions.
- Wrote text checks and created scenarios for concurrent users; configured run-time settings for HTTP iterations and used maximum bandwidth speed to bring the testing scenario closer to the real world.
- Mainly monitored CPU, memory, transaction response time, deadlocks, thread count, hogging thread count, queue length, and throughput while running performance baseline, load, stress, and soak tests.
- Experience using Fiddler to monitor bottlenecks and identify slow requests.
- Used Fiddler to see at a glance how the app was using the network.
- Wrote high-level LoadRunner scripts for a single user, storing dynamically varying object IDs in variables and validating correct download of HTML pages by checking the content in the page source.
- Parameterized unique IDs, stored dynamic content in variables, and passed the values to web submits under HTTP protocols.
- Used ALM and DIMS as defect-tracking tools to track and report defects and to coordinate with various groups from the initial finding of defects to final resolution.
Environment: HP LoadRunner 12.0/12.5, HP Unified Functional Testing (UFT), Performance Center 11.5/ALM, Dynatrace, AppDynamics, Windows XP, VuGen, integration servers, Windows 2008, Windows Vista, web applications, XML files, JConsole, Java VisualVM, SoapUI.
Confidential, Dallas, TX
Sr. Performance Test Analyst
Responsibilities:
- Analyzed business requirements and use-case documents and developed performance test cases, test scenarios, and a test plan accordingly.
- Created the test strategy, test plan, test closure reports, and metric reports for all performance-testing efforts.
- Handled projects independently and led projects with an offshore team.
- Planned, designed, implemented, and executed stress/load/performance testing.
- Coordinated, scheduled, and implemented the testing phase for new applications as well as upgrades to existing applications.
- Responsible for scripting and analyzing results using JMeter 2.8/2.10 and HP LoadRunner 12.0/12.50.
- Extensively used the LoadRunner Web (HTTP/HTML), Siebel Web, and Mobile Web (HTTP/HTML) protocols.
- Responsible for creating scripts and running performance tests for Android applications as part of mobile performance testing.
- Performed endurance tests by executing tests for longer hours in order to record the memory footprint and find memory leaks, slow resource-consumption problems, etc.
- Analyzed the results of the tests that were used to assist in the identification of system defects, bottlenecks and breaking points.
- Involved in working with development teams and concerned teams to fix performance issues.
- Reported Defects in Defect Tracking Tool (JIRA).
- Simulated hundreds of concurrent users using Load Generator while monitoring both end-user response times and detailed infrastructure component performance (Servers, Databases, and Networks etc.)
- Monitored application behavior with AppDynamics/Dynatrace by creating custom dashboards and reports.
- Used Splunk to monitor the logs in depth for all the servers of the application.
- Used Dynatrace and JProbe to profile the application and find where the performance issues were.
- Performed deep diagnostics using Dynatrace; monitored DB and application servers to troubleshoot the root cause of problems.
- Carried out deep diagnostics using Dynatrace to capture memory leaks in the application by carrying out longevity tests.
- Used Dynatrace to measure website performance in the test environment and capture performance metrics of key product features.
- Monitored application health using Dynatrace, reviewing the performance of different components of web pages and comparing daily, weekly, and monthly trends for deeper analysis.
- Set up user profiles and configured and added application servers in the Dynatrace tool.
- Reviewed web components using the Dynatrace client, analyzing and giving feedback to improve performance.
- Analyzed heap behavior, throughput, and pauses in garbage collections, as well as tracked down memory leaks.
- Provided tuning support with metrics from AWR reports and explain plans.
- Worked closely with the development and business teams to get an understanding of the system architecture, system component interactions, application load patterns, and the performance SLAs.
- Developed and maintained a performance library with a focus on potential reuse.
Environment: HP LoadRunner 12.0/12.5, Performance Center 11.5/ALM, Dynatrace, AppDynamics, Splunk, Android/iOS, mobile performance testing, Windows XP, VuGen, integration servers, Windows 2008, Windows Vista, web applications, XML files, JConsole, Java VisualVM, SoapUI.
Confidential, Atlanta, GA
Performance Engineer
Responsibilities:
- Responsible for the full life cycle of performance testing and led a team of 6 performance testers.
- Involved in business discussions with clients/business heads and technical discussions with designers and developers to gather the performance requirements, KPIs, and SLAs.
- Analyzed requirements and product specifications to determine the test objectives and the appropriate level and type of testing needed; executed automated test scripts for performance and load testing using HP LoadRunner 11.00/11.52, HP ALM Performance Center 11.00, and JMeter.
- Created the workload model based on scaled-down production volumes in terms of transactions/page views and infrastructure.
- Ensuring the new business scenarios are mapped to test cases.
- Set standards and techniques for scripting, data preparation, and test reports, and prepared checklists.
- Created complex business-process performance test scripts with LoadRunner using the Web (HTTP/HTML), Web Services, and Ajax TruClient (IE & Firefox) protocols.
- Enhanced the scripts by employing manual/automatic correlation, parameterization techniques, transactions, run-time settings for call flows, and LR-specific functions, using LoadRunner VuGen.
- Captured updated volumetric information from the business for the performance business scenarios and mapped it to the scenarios in the HP Controller, which drives the virtual users against the application.
- Tested the performance of J2EE, J2SE, SOA, Apache Tomcat, WebSphere Application Server, F5, and IBM DataPower appliances.
- Responsible for monitoring the web, application, and database servers' performance during and after performance test execution.
- Analyzed app-server performance through the Dynatrace tool and Wily Introscope, and database-server performance through API analysis and Oracle AWR reports.
- Involved in creating DynaTrace dashboard and reports using built-in and/or custom measures to present testing and analysis results effectively
- Analyzed performance-critical transactions using tagged web requests and PurePaths to trace bottlenecks.
- Analyzed heap behavior, throughput, and pauses in garbage collections, as well as tracked down memory leaks.
- Tuned abnormally long GC pauses by breaking them down into smaller incremental pauses.
- Tuned the number of full GCs and their CPU spikes at high-memory conditions by increasing heap size, thereby eliminating JVM abnormalities.
- Made capacity-planning/sizing recommendations (e.g., increased JVM heap memory, JVM database connections, additional JVMs, additional hosts) based on current production metrics and capacity baselines.
- Provided DB performance-tuning support using Dynatrace response-time/execution hotspots and pool usage with leak snapshots, Foglight, and Wily Introscope reports.
- Worked with DBA's to help tune the queries.
- Used Splunk to monitor the logs in depth for all the servers of the application.
- Experience working with Splunk authentication and permissions.
- Prepared test summary reports and the final test completion report.
- Reported performance issues (if any) to the concerned stakeholders and tracked the defects through HP Quality Center and JIRA.
- Prepared the performance test reports and gave a Go/No-Go recommendation on the performance of the application in stakeholder meetings before go-live.
- Archived all the documents every release for every project.
Environment: HP LoadRunner 11.5, HP ALM 11.00, Java, JSP, WebLogic, Unix/Linux and Sun Solaris, Oracle, ATG services, Akamai, CSI, Dynatrace, Wily Introscope, JVM, Foglight, JConsole, JBoss, mainframes.
Confidential, Hartford, CT
Lead Performance Test Engineer
Responsibilities:
- Scheduled several meetings with the stakeholders as part of the requirements-gathering phase every release to initiate release testing for multiple applications.
- Defining the performance goals and objectives based on the client requirements and input.
- Involved in building a Performance Test Environment.
- Scheduled many test-plan sign-off, Go/No-Go, test results review, and bug-fix meetings every release on an as-needed basis.
- Prepared Test Strategies, Test Plan and Test Cases as per the business requirements and Use Cases.
- Worked closely with Development and Business team to get an understanding of the system architecture, system component interactions, application load pattern and the Performance SLA.
- Used HP Performance Center 9.52 to create scenarios and run load tests, and HP LoadRunner 9.52 and JMeter for writing Vuser scripts.
- Responsible for generating the key virtual-user scripts using the LoadRunner VuGen utility for the Web (HTTP/HTML), Siebel Web, Sybase CTlib/DBlib, and Oracle 2-Tier protocols.
- Responsible for running the LoadRunner scenarios for the Vusers using the LoadRunner Controller and monitoring server response times, throughput, hits/sec, transactions/sec, transaction response under load, web server monitors, app server monitors, system monitors such as Java processes, and a host of other performance metrics.
- Made many enhancements to the recorded scripts, including correlation, parameterization, inserting debugging messages, string manipulation, and any other script enhancements as and when needed.
- Implemented IP Spoofing techniques to simulate unique users' requests while running the tests.
- Created quantifiable load with test-scenarios for various applications (both standalone and integration) using Load Runner's Controller.
- Responsible for monitoring the application's performance under load using the key web server monitors, web application server monitors for WebSphere, IIS 5.0, and Apache, and NT performance monitors.
- Configured various WebSphere monitors for WAS applications to figure out which of the several servlets/JSPs caused a problem.
- Primarily used HP SiteScope and Splunk for monitoring the applications.
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Worked closely with software developers and took an active role in ensuring that the software components met the highest quality standards.
- Implemented and maintained an effective performance test environment.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Conducted Duration test, Stress test, Baseline test and several other performance tests.
- Involved in communicating with the vendor teams to resolve issues related to memory leaks
- Used SoapUI Pro to perform functional web service tests.
- Used various servers and ran SQL queries in SQL Server 7.0 on the back end to ensure the proper transaction of data during various tests.
- Ensured that quality issues were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center.
Environment: LoadRunner 9.5, Performance Center, JMeter, SiteScope, Splunk, Oracle, Citrix, MS SQL Server, WebLogic, load balancer, Java, Quality Center 10, J2EE Diagnostic Tool, web, Windows 2000/XP.