Performance Engineer Resume
Columbus, OH
SUMMARY:
- Performance test engineer with 7 years of experience in all phases of the Software Development Life Cycle (SDLC), including developing test cases and test plans, thorough test execution, and analytical reporting of results to stakeholders.
- Experienced in both manual and automated functional testing of Web and Client/Server applications using automated tools such as HP/Mercury tools (WinRunner 7.5/8.0/8.2, QuickTest Pro 8.0/8.2/9.2, TestDirector/Quality Center 7.5/8.0/8.2/9.2/10.0) and SOASTA.
- Excellent understanding of the Software Development Life Cycle (Agile and Waterfall) and of the performance tester's role in the development, enhancement, and maintenance of software applications.
- Experience in working with various protocols like Web HTTP/HTML, SAP GUI and SAP WEB Protocols.
- Designed and executed automated test scripts using the performance testing tool HP LoadRunner for Client/Server, Windows, UNIX/Linux, Web Services, and Web-based applications.
- Developed automation test frameworks in HP QuickTest Professional (QTP) using extensive descriptive programming in VBScript.
- Expert knowledge of Keyword-Driven, Data-Driven, and Hybrid frameworks in HP QTP.
- Designed, executed, and analyzed automated functional and performance tests in JMeter.
- Analyzed and ran performance and load-condition testing using HP LoadRunner and HP Performance Center against web and software applications in virtual environments. Ran load test scenarios and monitored performance. Used VuGen to record and run automated test scripts that mimic an end user's interaction with the application. Defined end-user behavior using runtime settings. Implemented parameterization and data-driven methods for input data.
- Experienced in performance and load testing using tools such as JMeter, NeoLoad, and LoadUI.
- Experienced in monitoring web servers such as Apache and Tomcat, application servers such as Microsoft IIS, WebSphere, WebLogic, and GlassFish, and database servers such as SQL Server, PostgreSQL, and Oracle during different types of performance testing.
- Proficient in SQL queries and PL/SQL stored procedures and triggers. Experienced in validating test data by retrieving data from relational databases (Oracle, SQL Server, Teradata, MySQL, PostgreSQL) using database clients such as Toad, SQuirreL, and Oracle SQL Developer.
- Strong knowledge of single and multiple protocols in LoadRunner VuGen, including Web HTTP/HTML, Web Services, Ajax TruClient, SAP GUI, SAP Click and Script, Web Click and Script, Citrix ICA, ODBC, Flex, and Oracle NCA.
- Expertise in developing complex automated script frameworks and utility functions manually for all protocols per CMMI Level 5 standards, using network sniffers such as HttpWatch, HttpFox, Fiddler, and Firebug. Very experienced in batch testing scripts, LoadRunner Vuser/VuGen scripting, writing reusable C functions for the HTTP, HTTPS, Citrix, Web Services (SoapUI and RESTful), FTPS, SFTP, SWIFTNet, RTE, TruClient, JMS, Java, and STOMP protocols, and designing scenarios based on production volumes.
- Involved in Planning and Translation of Software Business Requirements into test conditions; execution of all types of tests; and identification as well as logging of Software bugs for business process improvement.
- Expert in working with protocols such as Web HTTP/HTML, Web Services, Citrix, Siebel-Web, Mainframe (RTE), Flex, Click and Script, RDP, Mobile Web, Database, SAP, and Multi Protocols for performance testing.
- Performed the full performance testing life cycle: scripting, scenario design, execution, performance monitoring, and performance analysis.
- Expertise in developing test automation scripts, creating test scenarios, analyzing test results, reporting bugs/defects, and documenting test results.
- Expertise in executing VuGen scripts for performance, load, and stress testing using the LoadRunner Controller, and in generating reports using the LoadRunner Analysis tool. Extensively worked on Web (HTTP/HTML) and Web Services (SOAP and REST).
- Good analytical, interpersonal, and written/verbal communication skills. Result-oriented, committed, and hard-working, with a quest to learn new technologies and undertake challenging tasks. Experienced in working as a team member as well as independently to resolve a project's technical issues.
TECHNICAL SKILLS:
Testing Tools: HP LoadRunner 8.0/9.5/11.0/11.50/12.0, HP Performance Center 11.0/11.5/12, HP ALM/Quality Center, JMeter 2.5/2.7/2.8/2.9/2.10, SoapUI, Siebel 7.7.3, QTP, TestDirector, WinRunner
Languages: C, C++, SQL, PL/SQL, HTML, XML, Java, JavaScript, VB Script, ASP, JSP, TSL, PERL, VB.Net, Shell Scripting
Markup/Scripting: DHTML, CSS, jQuery, JavaScript, XML, HTML
Web/Application: HTML, CSS, jQuery, WordPress; web/application servers: IBM WebSphere, BEA WebLogic, Apache Tomcat, IIS, and JBoss EAP 5/6
RDBMS: Oracle, MS SQL Server, IBM DB2, MS Access, and MySQL
DBMS: PL/SQL+, Forms, PostgreSQL
Packages: MS Office, MS Excel, MS PowerPoint, MS Project, MS Visio, Outlook, Lotus Notes, TOAD, SQL Developer, WinSCP, FileZilla, WinSQL, CuteFTP, Adobe Photoshop CS5, Dreamweaver, Flash, Illustrator, and InDesign
Monitoring Tools: Performance Center, Wily Introscope, SiteScope, Dynatrace, AppDynamics, HP Diagnostics, Transaction Viewer, Splunk
Others: Data Flow Diagrams, ER Diagrams, Relational Data Modeling and UML, ETL, AWS, MS Dynamics 365 CRM
Operating Systems: MS-DOS, Windows 95/98/2000/NT 4.0/XP/Vista, UNIX, Linux, Sun Solaris, Mac OS X, IBM Mainframes, HP-UX
PROFESSIONAL EXPERIENCE:
Confidential, Columbus, OH
Performance Engineer
Responsibilities:
- Managed the resources and process of performance testing (load, stress, volume, endurance, and failover) using LoadRunner (Virtual User Generator, Analysis) with the Web and Web Services protocols.
- Performance engineer and individual contributor on the Wholesale QA team.
- Extensively worked on UNIX to change database connections, trace logs, monitor machine resources, create users, and execute batch jobs.
- Experienced in test scripting using the Web HTTP/HTML, Ajax TruClient, and Web Services protocols.
- Performance/load testing using HP ALM 12.0 and 12.5.
- Used Splunk to identify errors in server logs.
- Developed and implemented performance test plans in accordance with agreed strategies and protocols.
- Coordinated with the tools team to install CA Introscope and Dynatrace in the performance environment to support triage calls with users and to execute batch jobs.
- Presented performance testing results, along with the project management team, to the clients, including senior management.
- Interacted with developers for identifying memory leaks, fixing bugs and for optimizing server settings.
- Analyzed JVM garbage collection, thread dumps, thread behavior, and DB response times.
- Managed the end-to-end process for multiple applications simultaneously.
- Performed monitoring and bottleneck analysis using Dynatrace and Introscope for different applications.
- Captured HTTP(S) traffic using Fiddler for performance test script development.
- Developed scripts based on the Ajax TruClient, Web Services, Web HTTP/HTML, Java, and Citrix protocols.
- Performance testing of applications like payments processing using Unix shell scripts.
- Developed automated web services scripts to verify RESTful web service calls using XML and JSON formats.
- Interacted with business analysts to write acceptance test cases based on business transactions.
- Worked closely with the development team and guided them in finding and fixing performance defects.
- Conveyed results analysis reports and performance tuning areas directly to client.
- Analyzed the results of the Load test using LoadRunner Analysis tools, looking at the online monitors and the graphs and identified the bottlenecks in the system.
- Identified and analyzed memory leaks at each component level.
- Ensured that defects were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center.
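The Splunk-based error triage above amounts to counting and ranking error signatures in server logs. A minimal command-line sketch with an invented log sample (in Splunk itself this would be a search along the lines of `index=app ERROR | stats count by message`):

```shell
# Hypothetical application-server log sample (stands in for Splunk-indexed data).
cat > /tmp/server_sample.log <<'EOF'
2019-03-01 10:00:01 INFO starting PaymentService
2019-03-01 10:00:05 ERROR java.net.SocketTimeoutException: Read timed out
2019-03-01 10:00:09 WARN connection pool at 80% capacity
2019-03-01 10:00:12 ERROR java.net.SocketTimeoutException: Read timed out
2019-03-01 10:00:15 ERROR java.sql.SQLException: ORA-00060 deadlock detected
EOF

# Count ERROR lines, then rank the distinct error messages by frequency,
# which is the core of the triage during a load test.
error_count=$(grep -c ' ERROR ' /tmp/server_sample.log)
echo "Total errors: $error_count"
grep ' ERROR ' /tmp/server_sample.log | cut -d' ' -f4- | sort | uniq -c | sort -rn
```

The ranked list points straight at the dominant failure mode (here, the repeated socket timeout) before any deeper Dynatrace/Introscope drill-down.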
Environment: LoadRunner, Performance Center, Dynatrace, CA Introscope, SiteScope, Citrix, WebLogic, Java, Quality Center, Web Services and Unix
Confidential, Birmingham, AL
Sr. Performance Engineer
Responsibilities:
- Involved in and coordinated the creation of performance test environments to conduct stress/load testing.
- Coordinated with Project Teams to gather Performance Testing Requirements.
- Gathered performance test requirements from the client and designed performance tests for benchmark, baseline, stress, endurance, network, and component testing.
- Followed the standard software testing life cycle (STLC) to perform different types of testing for the projects; analyzed requirements, use cases, and test cases.
- Created scripts using the LoadRunner protocols Web HTTP/HTML, Siebel Web, Mobile Web, and Web Services for testing multiple applications.
- Designed and developed automated scripts using Load Runner based on business use cases for the application.
- Used manual and automated correlation concepts for capturing dynamically generated values.
- Modified existing Load Runner scripts to replicate new builds of the application.
- Created and executed different performance test scenarios, such as load, stress, volume, and endurance tests, according to the Test Plan document.
- Extensively used JMeter for web services testing as well as web application testing.
- Used custom JMeter plug-ins for report generation and analysis.
- Responsible for monitoring infrastructure behavior using AppDynamics during load test execution to identify any performance bottlenecks.
- Used AppDynamics to monitor end-user experience, overall application performance, business transaction performance, and application infrastructure performance.
- Used AppDynamics to monitor memory pools, transactions, stack traces, and other performance counters for all tiers in the architecture.
- Used JVisualVM to monitor the JVM for CPU, heap, GC, and thread behavior, and monitored I/O with UNIX commands such as top, vmstat, nmon, and netstat while the system was under test.
- Analyzed load test results using the LoadRunner Analysis tool, examining the online monitors and graphs; analyzed the response times of various business transactions and login times under load, and developed reports and graphs to present the test results.
- Responsible to Monitor production and identify any Performance related Issues, Replicate it in Performance testing Environment, Tune and Retest.
- Scheduled test results review meetings with project teams to walk through test reports and discuss the performance bottlenecks identified.
- Involved in defect tracking, reporting, and coordination with various groups, from initial finding of defects to final resolution, using the JIRA defect tracking tool.
- Prepared the final performance report by consolidating all the data gathered from the performance tests.
- Created progress reports on projects and reported status to the project manager in a timely manner as part of the team's process.
- Published the final reports to all stakeholders of the project.
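The vmstat/nmon captures taken while the system is under test are usually post-processed with a few lines of awk. A minimal sketch over a hypothetical, column-trimmed `vmstat 5` capture (real vmstat output has more columns):

```shell
# Hypothetical capture from `vmstat 5` on an app-server tier during a load test
# (columns trimmed to the ones used below; real vmstat prints more).
cat > /tmp/vmstat_sample.txt <<'EOF'
 r  b   swpd    free  us  sy  id
 2  0      0  812344  35  10  55
 5  0      0  798120  62  15  23
 7  1      0  760448  78  18   4
 3  0      0  801990  40  12  48
EOF

# Average the idle-CPU column (the last field, skipping the header row);
# sustained idle near zero under load points to a CPU bottleneck on that tier.
avg_idle=$(awk 'NR > 1 { sum += $NF; n++ } END { printf "%d", sum / n }' /tmp/vmstat_sample.txt)
echo "Average %idle: $avg_idle"
```

The same pattern works for the run-queue (`r`) and swap columns, which is how a raw capture turns into a per-tier utilization summary for the test report.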
Environment: LoadRunner 11.52/12.0, JMeter 2.9/2.10, HP Performance Center, Quality Center, HP ALM 12, JVM, AppDynamics, UNIX, vmstat, nmon, netstat, Application Servers, Tomcat Servers, WebLogic, Web Servers, Oracle 11g, Toad, SQL Developer, Message Queue Servers.
Confidential, CT
Sr. Performance Engineer
Responsibilities:
- Performed the GAP analysis, to understand the gap between the 4010 and 5010 conversions.
- Analyzed the Business Requirements Document (BRD), created Test Plans and prepared detailed Test Cases.
- Responsible for creating the test strategy and test plan for performance testing, risk assessment, automated test scripting using LoadRunner and SOASTA, and designing the workload model based on business demand and forecasts.
- Worked closely with Development and Business Team to get an understanding of the system architecture, system component interactions, application load pattern and the Performance SLA.
- Prepared performance scripts for applications such as Oracle Forms, Oracle Hyperion, Appian, SOAP/REST web services etc. I also worked in setting up virtual scenarios for our executions under the supervision of my lead.
- Good experience using the APM tools Dynatrace and AppDynamics to monitor business transactions across all tiers (web/app/DB) of the applications; added Dynatrace headers to the VuGen scripts to closely monitor response times.
- Defined performance goals and objectives based on the Client requirements and inputs.
- Developed scripts in the Web HTTP/HTML, SAP GUI, and SAP Web protocols in LoadRunner.
- Supported Performance Center, Quality Center, ITKO LISA, and UFT.
- Created and coded a very flexible LoadRunner script that allowed fast configuration changes during testing, executed the same from multiple Load Generators in Controller.
- Mostly involved in performance testing .NET and Java applications.
- Defined transactions to measure server performance under load by creating rendezvous points to simulate heavy load on the server
- Monitored the performance of the Web and Database (SQL) servers during Stress test execution.
- Used SiteScope and HP Diagnostics to monitor the load test and to identify bottlenecks.
- Configured offline and online J2EE/.NET diagnostics through Performance Center.
- Performed CPU and Memory monitoring and performance metrics extraction of web, application and database servers.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Produced a test report correlating poor performance to bottlenecks and made recommendations for improvements.
- Collaboratively worked with Capacity Planning to obtain the performance expectations of performance scenarios to be executed, provided results to validate current forecast models, and for future infrastructure architecture planning.
- Validated test results through the UI and through the analysis of various system/ application error logs as well as database queries.
- Updated test matrices, test plans, and documentation at each major release, and performed regression testing after making the needed changes and adding the new functionality to be tested.
- Traced the bugs and reported to the developers using HP Quality Center.
- Developed automation scripts using Selenium WebDriver.
- Worked closely with developers to recreate defects found and to verify fixes.
- Worked closely with software developers and took an active role in ensuring that the software components meet the highest quality standards.
Environment: LoadRunner 9/9.5/11, Rational ClearQuest, HP Quality Center/ALM, IBM OS version 4690, POS version 6, .NET, Java, Dynatrace, AppDynamics, J2EE, VBScript, Oracle, SQL, UNIX, Shell scripting, HTML, WebSphere
Confidential, Columbus, OH
Sr. Performance Test Engineer
Responsibilities:
- Conducted Smoke, GUI, Functional, Integration and Regression testing during the various phases of the Development.
- Performed Functional and Integration testing of the application and prioritized test cases to meet project specific deadlines.
- Developed and maintained Selenium Script on Jenkins to support regression testing whenever a Change Request is completed.
- Developed test cases from business use cases, test data, and Selenium scripts for UI testing.
- Executed automation scripts on different browsers/environments & reported defects/results to the team
- Maintained the Selenium and Java automation code and resources in source control systems such as CVS and SVN over time for improvements and new features.
- Performed regression testing via automation using Selenium WebDriver, Java, and Cucumber.
- Wrote test cases for Cucumber automation (Groovy, Java).
- Demonstrated the advantages of using Selenium with Cucumber for automated testing.
- Configured Maven for Java automation projects and developed the Maven project object model (POM).
- Extensively automated regression and functional test suites by developing over 150 test cases and 6 test suites using Selenium WebDriver, Java, and JUnit.
- Achieved customer satisfaction and high quality through effective automation testing and Agile processes.
- Performed back-end testing with extensive use of SQL queries and UNIX commands.
- Utilized UNIX and SQL to create test data and perform back-end validation of UI-based applications; reported and verified bugs in Jira.
- Involved in application analysis and testing, which included claims processing per the EDI/ANSI X12 transaction standard.
- Worked extensively on EDI 834, 837, 835, 270/271, and 276/277 transactions for different business users.
- Wrote test cases and assisted in writing test cases for the 276/277, 270/271, and 278 transactions.
- Participated in defect review meetings with team members and coordinated with the project development team.
- Used Quality Center/ALM for document tracking and version management.
- Gathered and analyzed test logs and defects and implemented the final test summary report.
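The back-end validation of UI-based applications described above can be sketched as a sort-and-diff of two extracts: what the UI shows versus what a SQL query returns. The file names, claim IDs, and statuses below are invented for illustration:

```shell
# Hypothetical extracts: rows displayed in the UI vs. rows returned by a SQL
# query against the backend (all values below are made up).
cat > /tmp/ui_rows.csv <<'EOF'
CLM001,PAID
CLM002,DENIED
CLM003,PENDING
EOF
cat > /tmp/db_rows.csv <<'EOF'
CLM001,PAID
CLM003,PENDING
CLM002,DENIED
EOF

# Sort both extracts so row order does not matter, then diff them;
# zero differing lines means the UI agrees with the database.
sort /tmp/ui_rows.csv > /tmp/ui_sorted.csv
sort /tmp/db_rows.csv > /tmp/db_sorted.csv
mismatches=$(diff /tmp/ui_sorted.csv /tmp/db_sorted.csv | wc -l)
echo "Mismatched lines: $mismatches"
```

Any non-empty diff is a candidate defect to log in Jira, with the offending rows already isolated.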
Environment: Selenium WebDriver (Java), Java, LoadRunner, C++, AppDynamics, Informatica, SQL Server, Jira, Jenkins Continuous Integration, Agile Scrum, Windows 7 Professional, VB.Net, StarTeam.
Confidential, Thousand Oaks, California
Performance Tester
Responsibilities:
- As the performance testing lead, reviewed the business requirements documents and technical specifications to understand the SAP GUI and SAP Web application across different workflows
- Attended meetings with business analysts, financial advisors, developers, managers, and other project related personnel to understand more facts about the product and user interface issues
- Involved in writing test plans, test cases, and test scenarios necessary to plan the testing processes
- Participated in the daily status meeting with the project manager and offshore team members about the progress of the project
- Responsible for setting up and executing the test cases and capturing data related to performance testing
- Worked closely with the project team in planning, coordinating, and implementing the testing methodology, and wrote business functions for specific tests
- Gathered, consolidated requirements for generating performance goals and test plans
- Responsible for designing of test plans, test procedures and test cases and execution of test cases
- Generated VuGen scripts and test scenarios, ran the scripts, and analyzed the results using LoadRunner
- Used rendezvous points, start and end transactions, parameterization, and correlation features in LoadRunner's Virtual User Generator
- Designed scenarios for performance testing, generated scripts, and handled correlation and parameterization using the VuGen generator; executed scenarios using the Controller and analyzed the results using the Analysis tool and Performance Center
- Performed stress/load testing on web and Windows-based applications using LoadRunner
- Enhanced scripts by inserting checkpoints to verify that Vusers were accessing the pages they were supposed to access
- Created scenarios with a given number of Vusers, configuring ramp-up, ramp-down, and run time in the Performance Center Controller
- Analyzed the LoadRunner reports to calculate response times, transactions per minute, hits per second, and throughput
- Met with managers, business analysts, workstream leads, and developers to discuss the performance test analysis report, real-world use cases, and appropriate workflows
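The per-transaction response-time figures pulled from LoadRunner reports are averages over all samples of each named transaction. As an illustrative sketch over a hypothetical raw transaction log (LoadRunner computes this inside its Analysis tool; the names and timings below are invented):

```shell
# Hypothetical raw transaction log (transaction name, response time in seconds),
# standing in for the results data the LoadRunner Analysis tool aggregates.
cat > /tmp/transactions.log <<'EOF'
Login 1.8
Search 0.9
Search 1.1
Checkout 2.6
Login 2.2
EOF

# Average response time per transaction name, the figure behind the
# "Average Transaction Response Time" graph in Analysis.
report=$(LC_ALL=C awk '{ sum[$1] += $2; n[$1]++ } END { for (t in sum) printf "%s %.2f\n", t, sum[t] / n[t] }' /tmp/transactions.log | sort)
echo "$report"
```

Dividing total samples by elapsed test time gives transactions per minute; the same one-pass awk aggregation extends to hits per second and throughput.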
Environment: SAP GUI, SAP NetWeaver, ECC, LoadRunner, Performance Center, QTP, QC, HP SiteScope, JIRA, SharePoint, C, MS Office, HTTP/HTML, Web Services, XML, Windows.
Confidential
Performance Engineer
Responsibilities:
- Created load test scenarios using the LoadRunner Controller from scratch, including creating the VuGen scripts and assigning Vusers to each script.
- Correlated all dynamic values within the scripts generated by LoadRunner and enhanced them (added transactions, text/content checks) according to the scenario.
- Developed scripts using the Web (HTTP/HTML) and Web Services protocols.
- Enhanced the LoadRunner scripts with parameterization, checkpoints, correlation, and rendezvous points.
- Scheduled scenarios using the LoadRunner Controller and analyzed the results using the Analysis tool.
- Performed smoke testing by checking the build release from the developers.
- Performed Regression testing after logging defects.
- Involved in database testing by writing SQL queries and using database functions for automation.
- Worked on throughput, hits per second, network delays, latency, capacity measurements, and reliability tests on multi-layer architectures.
- Worked on the performance testing report and made recommendations for application performance improvement.
- Worked with developers, business and release managers in bug fix issues and in meeting project deadlines.
- Reported bugs and sent email notifications to developers using QC.
Environment: LoadRunner 9.10, Quality Center, VBScript, UNIX, XML, Shell Scripting, WebSphere, WebLogic, Oracle, SiteScope, Tivoli.