
Sr. Performance Engineer Resume


Irving, TX

SUMMARY:

  • Analytical, talented and multi-skilled professional with over 8 years of IT experience as a Performance Test Engineer.
  • Experience in the software testing life cycle, including Unit Testing, Functional Testing, Integration Testing, Regression Testing, Performance/Load Testing, System Testing, Smoke, Sanity and User Acceptance Testing, along with experience in methodologies like Waterfall, Iterative and Agile.
  • Experience in applying testing methodologies, creating Test Plans, executing and automating Test Cases, Bug Tracking and Report Generation.
  • Expert knowledge of performance bottlenecks, web throughput, server response time and network latency.
  • Expertise in implementing testing strategies for the entire Oracle EBS implementation.
  • Simulated different levels of virtual user (VUser) load, including simultaneous and concurrent user loads.
  • Practiced in testing web applications behind load balancers using different types of logic for distributing incoming HTTP requests (round robin and sticky sessions, both IP-based and cookie-based).
  • Strong problem-solving, communication and coordination skills, coupled with a keen analytical aptitude and a great sense of responsibility in execution.
  • Expertise in Performance tools like Load Runner, Silk Performer, NeoLoad, IBM Green Hat Tester, and JMeter
  • Experienced in using the Windows Perfmon utility and the open-source UNIX Nmon utility, as well as commercial monitoring tools like Dynatrace, AppDynamics, HP Diagnostics, SiteScope, Wily Introscope, Splunk and New Relic to monitor the whole infrastructure.
  • Expertise in installing and fine-tuning the Dynatrace server in production, test and development environments.
  • Good experience in collecting Windows and Linux counters for web servers and application servers, such as memory, CPU, network and heap usage.
  • Performed tests and analysis such as load, stress, endurance, performance bottleneck, benchmark and baseline tests using HP LoadRunner against web, application and database servers at different levels and loads.
  • Experienced in configuration management using Visual Studio Team System (VSTS) Team Foundation Server, VSS and Subversion.
  • Experienced in white-box testing by authoring Python code.
  • Experienced in Virtual User Generator (VuGen) scripting for performance/load testing across multiple protocols, in the Controller and Analysis tools, and in report generation.
  • Experience with system performance testing methodologies (peak, stress and endurance tests).
  • Extensive experience with Web (HTTP/HTML), Web Services, Oracle and Citrix protocols.
  • Knowledge of testing front end web technologies based on JavaScript, DHTML, CSS, and HTML.
  • Proficient in Structured Query Language (SQL), Joins, PL/SQL stored procedures & triggers.
  • Experience in preparing Test data by retrieving data from Relational Databases Oracle, MS SQL Server.
  • Expert in using open source/enterprise tools such as JMeter and SoapUI to perform various types of tests on Linux-based servers.
  • Good experience with run time settings/recording options and general options in IBM Rational Performance Tester.
  • Proficient in plotting and implementing scenarios and loading IBM Rational Performance Tester scripts into a controller.
  • Experience working with JMeter for load and performance testing.
  • Proven ability to check Network Bottlenecks using Vuser Graphs.
  • Experience in Installation and Configuration of Software and Hardware in testing environment.
  • Experience in working as a team member as well as independently to resolve technical issues.
  • Excellent Written/Verbal communication, highly motivated, self-starter able to work independently and collaboratively within a diverse technical team.
  • Expert in finding the reasons for High Response Time.
  • Experience testing web applications in open-source Linux and UNIX environments.
  • Experience in writing and executing SQL queries to perform Data Validation and back-end testing of databases to check the integrity of data (a minimal sketch follows this list).
  • Professional in testing a wide array of applications, from client/server to multi-tier architectures, developed in .NET and J2EE technologies.
  • Strong understanding of data layer testing for different databases such as MySQL and MongoDB, with the ability to write different queries.
  • Experience using Fiddler as an HTTP debugging proxy.
  • Excellent inter-personal abilities and a self-starter with good communication & presentation skills, problem solving skills, analytical skills and leadership qualities.
  • Expert in finding performance bottlenecks on both the client side and the server side and making recommendations for performance profiling or tuning.
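
A minimal sketch of the kind of SQL-based back-end data validation referenced in this list, assuming a MySQL test database reachable through PyMySQL; the host, credentials and the orders/customers schema are illustrative, not from an actual engagement:

    # Illustrative back-end validation: a row-count check and a referential-
    # integrity check against a hypothetical "orders"/"customers" schema.
    import pymysql  # assumes the PyMySQL driver is installed

    conn = pymysql.connect(host="testdb.example.com", user="qa",
                           password="secret", database="store")
    try:
        with conn.cursor() as cur:
            # Volume check: how many orders were loaded for the test date (hypothetical)
            cur.execute("SELECT COUNT(*) FROM orders WHERE order_date = %s",
                        ("2017-01-31",))
            (order_count,) = cur.fetchone()
            print("orders loaded:", order_count)

            # Integrity check: orders whose customer_id has no matching customer row
            cur.execute("""
                SELECT COUNT(*) FROM orders o
                LEFT JOIN customers c ON c.customer_id = o.customer_id
                WHERE c.customer_id IS NULL
            """)
            (orphans,) = cur.fetchone()
            assert orphans == 0, "%d orders reference missing customers" % orphans
    finally:
        conn.close()

The same pattern applies to Oracle or SQL Server with the corresponding driver; only the connection call changes.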

TECHNICAL SKILLS:

Operating Systems: MS-DOS, UNIX, Windows, Mac OS X and LINUX.

Defect Tracking Tool: Quality Center, JIRA

Performance Testing Tool: HP LoadRunner, JMeter, Silk Performer, HP Performance Center 12.5, HP Diagnostics, SiteScope, IBM Green Hat, Wily Introscope, Microsoft Visual Studio, NeoLoad

Scripting Language: JavaScript, VBScript

Programming Languages: C, C++, VB, PL/SQL, Python, .NET, Java and J2EE

Database Tools: SQL Server, Oracle, MS-Access, MySQL, DB2

Web Technologies: HTML, XML, JavaScript, VBScript

Web & App Server: WebLogic, WebSphere, Apache, Tomcat and IIS

PROFESSIONAL EXPERIENCE:

Confidential, Irving, TX

Sr. Performance Engineer

Responsibilities:

  • Designed test performance plans as required by the customer to provide necessary support, and ensured that the development process was carried out in accordance with the strategy.
  • Prepared paper scripts (manual workflows) to make sure the scripting activity was well planned and to avoid any discrepancy in the flows.
  • Created code to insert test data into the test database (MySQL) using the MySQL DB Python module (see the test-data sketch after this list).
  • Mobile application scripting with Perfecto Mobile for iPhone and Android devices.
  • Capturing HTTP(S) traffic using Fiddler for performance script development.
  • Executing different kinds of tests (dry runs, baselines, soak tests, peak loads, breakpoint tests and failure tests) using Performance Center.
  • Gathering non-functional requirements for testability and participating in the review of the technical acceptance criteria.
  • Discussion of the scope with the Solution Architect to identify the Performance test Scenarios and Volumes for all the DCTS applications.
  • Creation of the Workload Model (WLM) for Performance Testing.
  • Test Case creation and execution in the Pre-PAT environment.
  • Identified environment/infrastructure issues early by testing the application with lower volumes in a scaled-down environment and got them fixed up front, rather than consuming a lot of time during the PAT phase.
  • Reviewed PCoE logs and reports for performance test executions in the PAT environment.
  • Validating the environment for Performance testing.
  • Creation of Test Data to be used for scripting and execution.
  • LoadRunner script creation using Web (HTTP/HTML) and Web Services protocols.
  • Collected and maintained PBDs metrics using Wily Introscope.
  • Providing recommendations to improve the environment after the PCoE test execution is completed in the PAT environment.
  • Extensively used Wily Introscope and HP Diagnostic to analyze the system resources bottlenecks like Memory Leaks, CPU and Network Bottlenecks as well as problematic application and DB components.
  • Analysis of results by monitoring graphs from different teams for WAS, WPS, DB2 and Report Preparation Via LR Analysis.
  • Performed detailed workload model (WLM) analysis and log analysis on production data to identify peak workload business requirements, and analyzed/estimated the increase in future users to assess application capacity (see the workload-model sketch after this list).
  • Monitored to identify memory leaks, top CPU-consuming code and SQLs, and shared the performance test results with the business team, technical team and senior leadership.
  • Execute various kinds of performance tests (baseline, load, stress) with various transaction mix settings.
  • Analyze the client side and server side metrics and provide a detailed report with observations and recommendations.
  • Simultaneously worked on different projects using JMeter for designing and developing test scenarios, test scripts and integration solutions.
  • Developed load test scripts using Apache JMeter and performed load testing in BlazeMeter (cloud).
  • Used JMeter to place heavy load on a server, group of servers, network or object to test its strength and analyze overall performance under different load types.
  • Worked with JMeter to simulate load on the servers and check performance under different load types.
  • Developed Shell/Batch/Python scripts for automation purposes.
  • Developed Python and shell scripts to automate the build and release process.
  • Used profiling/debugging tools like AppDynamics and Fiddler (for SOAP & REST web service traffic), along with analytical tools.
  • Performed monitoring and bottleneck analysis using Dynatrace and Wily Introscope.
  • Created business transactions, dashboards and reports in Dynatrace; created and maintained profiles in Dynatrace; and configured Dynatrace agents, collectors and alerting.
  • Monitored performance and reliability of servers and workstations using system management and monitoring tools such as AppDynamics.
  • Captured/recorded transactions with AppDynamics for service calls and Adobe Scout for UI, analyzing them top-down and bottom-up.
  • Used AppDynamics performance tools for monitoring and tuning the WebSphere environment (JVM heap size, database connection pool size, etc.).
  • Actively participated in Defect Review meetings on a daily basis.
  • Preparation of Daily and Weekly status reports. Attending weekly defect report meetings and presented progress updates.
  • Attending conference calls with offshore team to discuss the Testing status and to assign the defects to the concerned developers.
  • Participated in QA reviews and provided required support and clarification as needed for the reviewers.
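
A minimal sketch of the test-data loading step mentioned above, shown here with PyMySQL as a stand-in for the MySQL Python module named in the bullet; the host, the customers table and the row volume are hypothetical:

    # Illustrative test-data seeding: bulk-insert synthetic rows into a
    # hypothetical "customers" table before a performance test run.
    import pymysql

    ROW_COUNT = 10000  # hypothetical data volume for the run

    conn = pymysql.connect(host="testdb.example.com", user="qa",
                           password="secret", database="store")
    try:
        rows = [("user%d" % i, "user%d@example.com" % i) for i in range(ROW_COUNT)]
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO customers (username, email) VALUES (%s, %s)", rows)
        conn.commit()
        print("inserted %d test customers" % len(rows))
    finally:
        conn.close()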
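
The workload-model (WLM) analysis referenced above usually reduces to Little's Law arithmetic, concurrency = throughput x (response time + think time). A minimal sketch with hypothetical peak-hour numbers:

    # Workload-model sketch (all numbers hypothetical): derive the concurrent
    # VUser target from peak-hour production volumes using Little's Law.
    peak_transactions_per_hour = 36000   # from production log analysis (assumed)
    avg_response_time_s = 2.0            # average transaction response time (assumed)
    think_time_s = 8.0                   # scripted think time between steps (assumed)
    growth_factor = 1.25                 # allowance for future user growth (assumed)

    throughput_tps = peak_transactions_per_hour / 3600.0
    concurrent_vusers = throughput_tps * (avg_response_time_s + think_time_s)
    target_vusers = concurrent_vusers * growth_factor

    print("throughput: %.1f TPS" % throughput_tps)
    print("concurrent VUsers at peak: %.0f" % concurrent_vusers)
    print("VUser target with growth: %.0f" % target_vusers)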

Environment: LoadRunner 12.5 (VuGen, Controller, LR Analysis), AppDynamics, Dynatrace, JMeter, HP ALM/Performance Center 12.5, HP QTP, Java, Python, SQL Server, Fiddler, Web Services.

Confidential, Houston, TX

Sr. Performance Tester

Responsibilities:

  • Assisted the team lead in the preparation of the Test Plan and Test Strategy documents.
  • Responsible for Load Testing Co-ordination with various other projects involved in load testing activity.
  • Created and maintained Test Scripts and Test Cases based on the High Level Functional Requirements Document (FRD), utilizing Visual Studio Team System (VSTS) and Team Foundation Server (TFS), for manual, automated (functional & regression) and performance/stress testing.
  • Performance testing of customer-end applications using NeoLoad.
  • Interacted with the Business Analyst and application teams to discuss the performance requirements and test strategy; also developed the performance Test Plans and Load Test Strategies and interacted with the end client.
  • Managed the team throughout the software testing process, ensuring that all parameters, timelines and performance benchmarks were met.
  • Multi-tasked by testing concurrently on multiple projects.
  • Analyzed the required effort in terms of resources needed, script complexity, scenario design challenges and overall man-hours, and documented it before project execution.
  • Planned and generated VuGen scripts using LoadRunner 11.5 and the TruClient protocol. Modified the scripts for parameterization, pacing and think times.
  • Worked closely with the offshore team by assigning them work on a daily basis and validating their work.
  • Executed load, endurance, regression and performance tests on a web-based e-commerce website with 1,000 to 10,000 user loads.
  • Multiple protocol environments: Web (HTTP/HTML), AJAX TruClient, Web (Click & Script), Web services on LoadRunner VuGen 11.5.
  • Creation of the Workload Model (WLM) for Performance Testing and capture of dynamic values using manual correlation.
  • Prepared analysis reports with important graphs in Excel sheets and HTML reports: added new graphs to the Analysis reports, compared results with SLAs, merged two or more graphs to compare results, exported HTML reports, and calculated TPS and response times across different user loads in one test (see the sketch after this list).
  • Interacted with the capacity team during the test runs to monitor the Apache and JBoss servers involved in the project for CPU utilization, memory and disk usage.
  • Extensively involved in performance tuning of servers such as Apache and JBoss.
  • Identified bottlenecks and performance issues using multiple-user test results, online monitors, real-time output messages and the LoadRunner Analysis tool.
  • Participated in defect review meetings conducted by the PM to discuss the status of defects in the application.
  • Always met the deadlines when required to do load testing.
  • Recorded videos of functionality flow and the replay of scripts using HP screen recorder.
  • Interpreted different performance counters, participated in analysis of load test results and mentor others in troubleshooting issues with Visual Studio Team Test.
  • Set up the environment for developers to dock Quality Center within Visual Studio Team System (VSTS).
  • Analyzed, designed, developed and maintained the web-based and automated accounting systems and various other applications using Visual Basic and SQL Server.
  • Used SoapUI for web services to submit input requests and capture the responses, which were later scripted in LoadRunner with the proper SSL certificates for secured endpoints.
  • Monitoring and bottleneck analysis using Dynatrace.
  • Used Splunk to analyze bottlenecks like memory leaks, CPU and network bottlenecks as well as problematic application and DB components.
  • Prepared daily task sheets for the offshore team and updated performance status sheets for the onshore manager. Scheduled several meetings with the business teams, application teams and development team to discuss the daily status of the project.
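
A minimal sketch of the TPS and response-time calculations mentioned in the reporting bullet above, assuming a hypothetical CSV export of transaction samples with timestamp (epoch seconds) and elapsed_ms columns; the file and column names are illustrative:

    # Illustrative post-processing: compute TPS, average and 90th-percentile
    # response time from a hypothetical CSV export of transaction samples.
    import csv

    timestamps, elapsed = [], []
    with open("results.csv", newline="") as f:          # hypothetical export file
        for row in csv.DictReader(f):
            timestamps.append(float(row["timestamp"]))  # epoch seconds
            elapsed.append(float(row["elapsed_ms"]))    # response time in ms

    duration_s = (max(timestamps) - min(timestamps)) or 1.0
    tps = len(elapsed) / duration_s

    def percentile(values, pct):
        # Nearest-rank percentile of the response-time samples.
        ordered = sorted(values)
        k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
        return ordered[k]

    print("samples: %d  duration: %.0fs  TPS: %.2f" % (len(elapsed), duration_s, tps))
    print("avg: %.0f ms  90th percentile: %.0f ms"
          % (sum(elapsed) / len(elapsed), percentile(elapsed, 90)))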

Environment: LoadRunner 11.5, ALM/Performance Center 11.5, ALM/Quality Center 11.5, Microsoft Visual Studio, SoapUI, IBM Green Hat, Dynatrace, NeoLoad, Splunk, HP screen recorder.

Confidential, Denver, Colorado

Performance Tester

Responsibilities:

  • Designed test performance plans as required by the customer to provide necessary support, and ensured that the development process was carried out in accordance with the strategy.
  • Responsible for performance testing (Load, Stress and Volume) using HP LoadRunner (Controller, Virtual User Generator, Analysis)
  • Involved in the installation and setup of HP LoadRunner Performance Center.
  • Developed automated tests, measured and validated system performance against requirements, and maintained the automated test environment and scripts for performance tests.
  • Monitored CPU usage, Idle Thread Counts, GC Heap, Open Session Current Counts by using Introscope and WebLogic Console.
  • Developed Work Load Model (WLM) for every release.
  • Collected performance monitoring statistics and coordinated with tech architects and business analysts to analyze the performance bottlenecks and provide recommendations to improve the performance of the application.
  • Created scripts for Regression, Security, GUI, Integration and Database testing.
  • Prepared and executed test scripts using the JMeter and SoapUI tools to perform web services testing, and performed load testing in BlazeMeter.
  • Used JMeter to test the performance of both static and dynamic resources, determine the maximum number of concurrent users the website can handle, and provide graphical analysis of performance reports.
  • Configured different test plans and analyzed the results using JMeter.
  • Conducting WSDL review meetings to understand the requirement of each Web Service.
  • Execute each Web Service manually by testing each operation in the WSDL.
  • Performance tested SOA based application using Web Services Protocol.
  • Designed and developed automated scripts using LoadRunner based on business use cases for the application.
  • Designed and conducted Smoke, Load, Soak, Stress, and Scalability tests using Performance Center.
  • Used VisualVM to monitor the JVM for CPU, heap, non-heap, GC logs and thread behavior, and monitored I/O stats using UNIX commands like top, vmstat, nmon and netstat while the system was under test.
  • Worked in Test management for Oracle EBS Financial and Supply chain modules.
  • Assisted with the integrated design, build, test and deploy phases of Oracle EBS modules to ensure that all application-related transactions are appropriately captured, tracked and accounted for.
  • Created Base Line test, Stability test, Stress test and Load test scenarios.
  • Used random pacing between iterations to achieve the desired transactions per hour (see the pacing sketch after this list).
  • Extensively used SQL queries to check the business transaction flows, editing existing batch jobs.
  • Created VuGen scripts using different protocols like Web-HTTP, Web services.
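
A minimal sketch of the pacing arithmetic behind the random-pacing bullet above; the target rate, VUser count and the +/-20% window are hypothetical:

    # Pacing sketch (hypothetical numbers): derive the iteration pacing a VUser
    # group needs to hit a target transaction rate, then a random pacing window.
    target_tph = 18000     # desired transactions per hour (assumed)
    vusers = 100           # VUsers assigned to the script (assumed)

    # Each VUser must complete target_tph / vusers iterations per hour, so the
    # start-to-start pacing interval in seconds is:
    pacing_s = 3600.0 * vusers / target_tph          # 20 s with these numbers

    # Random pacing is then a band around that value, e.g. +/- 20 percent.
    pacing_min, pacing_max = 0.8 * pacing_s, 1.2 * pacing_s

    print("fixed pacing: %.1f s per iteration" % pacing_s)
    print("random pacing window: %.1f - %.1f s" % (pacing_min, pacing_max))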

Environment: LoadRunner 11.0, JMeter, Agile methodology, Quality Center 11.0, SOA, Remote Desktop, Java, Load Testing tool, Wily Introscope.

Confidential

Performance Tester

Responsibilities:

  • Responsible for reviewing and analyzing the requirements of the new system and identifying discrepancies that can hinder System, Regression, and User Acceptance Testing (UAT).
  • Responsible for performance testing (Load, Stress and Volume) using HP LoadRunner (Controller, Virtual User Generator, Analysis)
  • Worked with manual and automation testers and conducted testing using HTTP Watch, Selenium tools.
  • Involved in installation and Setup of Performance Center HP LoadRunner.
  • Created customized Load Runner VuGen scripts at the API level with manual correlation, user defined functions, development libraries (classes and methods), and error handling.
  • Defined transactions to measure server performance under load by creating rendezvous points to simulate heavy load on the server.
  • Performed result analysis using online monitors and graphs in LoadRunner.
  • Assisted in tracking, analyzing and documenting bugs/defects using Test Director.
  • Performed API test sequencing manually by editing the corresponding XML file.
  • Wrote SQL using QMF to query the DB2 database on the mainframe.
  • Hands-on development experience in Java technologies on Linux and Windows platforms.
  • Responsible for executing and maintaining scripts in mainframe applications.
  • Conducted performance testing of Oracle EBS by using HP Load Runner.
  • Worked on different PeopleSoft modules HR, FIN and SCM.
  • Analysis, design and development to connect legacy systems with Oracle EBS.
  • Interaction with EBS business users and cross functional teams during Requirement gathering phase.
  • Performed data retrieval from DB2 and back-end testing using SQL and PL/SQL; verified data loads using ETL source-to-target mappings.
  • Arranged schedules and notifications for the QA Team, Development team, Middleware team and Business Banking team to meet and discuss status of the project.
  • Coordinated with Technical Teams to monitor Database Query, CPU Utilization and Memory.
  • Coordinated with Functional Teams to Identify the Business Process to be Performance Tested.
  • Designed Test Case documents for Performance testing in Quality Center and report defects.
  • Worked closely with the functional team to test the application functionality in the development and QA environments using the QTP testing tool, extensively using descriptive programming and the checkpoints feature.
  • Performed extensive analysis using the HP .NET diagnostics tool and Wily Introscope. Used HP Quality Center 10 for defect tracking.
  • Analyzed server resources such as Available Bytes and Process Bytes for memory leaks (see the sketch after this list).
  • Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Report.
  • Worked closely with engineering team to discuss the design and testing aspects of the applications to design the test plans and to isolate the root cause of defects.
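
A minimal sketch of the memory-leak analysis described above, assuming memory-counter samples (converted to MB) collected at a fixed interval during a soak test; the readings, interval and threshold are all hypothetical:

    # Illustrative leak check: fit a least-squares trend line to memory samples
    # taken during a soak test and flag steady growth under constant load.
    samples_mb = [512, 530, 547, 569, 588, 610, 633, 655]  # hypothetical readings
    interval_min = 15                                       # sampling interval (assumed)

    n = len(samples_mb)
    mean_x = (n - 1) / 2.0
    mean_y = sum(samples_mb) / float(n)

    # Least-squares slope: MB of growth per sampling interval.
    slope = (sum((i - mean_x) * (y - mean_y) for i, y in enumerate(samples_mb))
             / sum((i - mean_x) ** 2 for i in range(n)))
    growth_per_hour = slope * (60.0 / interval_min)

    print("trend: %.1f MB per sample, ~%.0f MB/hour" % (slope, growth_per_hour))
    if growth_per_hour > 20:   # illustrative threshold
        print("possible memory leak: usage grows steadily and never levels off")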

Environment: LoadRunner 8.0, Java, WebSphere, Linux, AIX, HTML, DHTML, JSP, Toad, Oracle, UML, Mainframe, Informatica, MQ Series, XML, QTP 8.0, Test Director 7.6, Agile.

Confidential

Performance Tester

Responsibilities:

  • Created the load test scenarios using the LoadRunner Controller from scratch, which included creating the VuGen scripts and assigning VUsers to each script.
  • Correlated all dynamic values within the scripts generated by LoadRunner and enhanced them (adding transactions and text/content checks) according to the scenario.
  • Developed scripts using Web (HTTP/HTML), Web services and JAVA.
  • Enhanced the Load Runner scripts by parameterization, check points, correlation and by keeping Rendezvous points.
  • Scheduling the scenarios using the Load Runner's Controller and analyzing the results using Analyzer.
  • Performed smoke testing by checking the build release from the developers.
  • Performed Regression testing after logging defects.
  • Involved in database testing by writing SQL queries and also using database functions for automation.
  • Worked on throughput, hits per second, network delays, latency, and capacity measurements and reliability tests on a multi-layer architecture.
  • Worked on performance testing report and made recommendation for system/application performance improvement.
  • Worked with developers, business and release managers in bug fix issues and in meeting project deadlines.
  • Reported the bugs, Email notifications to the developers using QC.

Environment: LoadRunner v10, Quality Center, VBScript, UNIX, XML, Shell Scripting, WebSphere, WebLogic, Oracle, SiteScope, Tivoli.
