Performance Engineer Resume
Chicago, IL
Summary
- Extensive experience managing and leading QA efforts at various stages, including functional automation and performance evaluation of software applications.
- Proficient with the models and methodologies associated with the software development life cycle (SDLC) and software testing life cycle (STLC), with knowledge of ISO and CMM-level standards.
- Brainbench-certified Advanced LoadRunner engineer with expertise in utilizing and managing LoadRunner/Performance Center environments, along with an in-depth understanding of VuGen-based scripting using numerous protocols such as Web (HTTP/HTML), Web Services, Click and Script (Ajax/Web), and Siebel.
- Expertise in creating automation scripts using HP QuickTest Professional (QTP) and integrating them with Quality Center.
- Quality assurance, automation, testing, analysis, and documentation of web-based and client/server applications.
- Extensive experience in reviewing and analyzing business requirements and writing detailed test plans, test cases, and test scripts.
- Exposure to all stages of the software development life cycle; excellent verbal and written communication; pragmatic team player with a combination of business acumen and technical skills.
PERFORMANCE TESTING SUMMARY:
- Experienced in leading performance engineering efforts utilizing HP LoadRunner/Performance Center.
- Brainbench-certified LoadRunner specialist.
- Proficient with the VuGen (scripting), Controller (test design and execution), and Analysis (result analysis) modules of LoadRunner.
- Extensive experience in pre-test architectural overviews, load-contribution analysis, planning, test execution, and post-test result analysis.
- Defined scope and strategy, and scaled the load distribution between production and test environments.
- Experienced in working with various application architectures, utilizing suitable LoadRunner protocols to test numerous application types including web, client/server, web-service-based, and Ajax-based AUTs.
- Proficient in the pre-test analysis, test planning, test design/setup, test execution and monitoring, and post-test analysis phases critical to performance testing software applications.
- Experience leading performance testing efforts in challenging environments involving complex architectures and narrow test schedules, prioritizing business- and resource-critical scenarios without compromising the load-distribution ratios needed to drive realistic tests that generate conclusive artifacts.
- Worked on various application types, using the multiple test protocols provided by LoadRunner (HTTP/HTML, Winsock, Web Click and Script, GUI, web services) to derive tailor-made scripting solutions for web, client/server, and real-time applications.
- Proficient in C and VB programming, which complements the scripting and enhancement requirements of VuGen scripting.
- Proficient with the customization, randomization, and verification techniques demanded by load test scenarios; simulated a unique effect for each user and utilized rendezvous points.
- Utilized IP spoofing and geographically dispersed load generators to mimic real-world load characteristics when testing applications involving routing/load balancing.
- Worked with SiteScope, Tivoli Performance Viewer, and dynaTrace, and also used custom monitoring of system metrics in Windows (Perfmon) and UNIX (vmstat/netstat/rstat scripts) environments.
- Proficient in administering and utilizing HP Performance Center, the web-based version of LoadRunner running over IIS.
- Proficient in configuring and utilizing a standalone Windows-hosted LoadRunner Controller.
- Expertise in conducting tests with globally distributed load generators, configuring over-firewall LGs and MOFW agents to investigate response-time behavior when the application is accessed from diverse geographical locations.
- Evaluated the Shunra WAN emulator and the dynaTrace root-cause-analysis tool to study their compatibility and efficiency in enhancing performance testing efforts across the organization.
- Efficient in designing modular VuGen scripts that aid code reusability, utilizing randomization, parameterization, and correlation rules as applicable to the AUT.
- Proficient in creating scripts that utilize system functions to execute batch and standalone executable processes in sync with GUI-based load to conduct end-to-end testing.
- Experienced in automation utilizing QTP and in integrating QTP with the LoadRunner GUI user protocol.
- Utilized increasing-load, dynamic-load, and stability-testing workloads to predict the performance durability of applications.
- Utilized release notes to reflect changes and additions in the scripts as needed; conducted regression and performance tests to identify the impact of release modifications.
- Proficient in delta identification and overall performance analysis of existing applications across newer version releases.
FUNCTIONAL/REGRESSION TESTING SUMMARY:
- Developed functional test plans and automated test cases using HP QuickTest Professional.
- Proficient in designing and implementing automated regression script suites utilizing HP/Mercury's QTP.
- Created a centralized, object-oriented test automation framework that could be referenced for new QTP script development across the organization.
- Proficient in creating data-driven test scripts that exercise application functionality over combinations of data points.
- Utilized hands-on technical automation skills and the ability to communicate and interact effectively with the technical teams involved, gathering and analyzing functional requirements and developing automation scripts with functional, business-critical, and resource-critical scenarios set as the highest priority.
- Proficient with Visual Basic and with the logical enhancement of QTP scripts.
- Introduced object-property and data checkpoints as demanded by the test cases.
- Utilized global function files where needed to replicate activities demanded by test scenarios.
- Customized the scripts to generate bug-snapshot reports for each test run.
- Introduced error handling and default base-state methods to ensure scheduled batch test runs.
- Employed data-driven, priority-based batch testing.
- Expertise in GUI Map management and custom object declaration.
- Utilized functional testing tools in conjunction with load testing tools to identify functional bugs that arise only under peak loads.
- Proficient in manual and automated testing methodologies, including data-driven test design.
- Knowledge of Quality Center for test management as well as requirement management.
- Exposure to Software Configuration Management (SCM).
- Performed Sanity, Functional, Integration, Reliability, Compatibility, System, Regression, and User Acceptance Testing (UAT) to validate environment stability and readiness for load testing and overall certification.
- Extensive experience in data validation and manipulation in Oracle, DB2, and SQL Server.
- Proficient in writing UNIX shell scripts for monitoring and scheduling batch jobs.
- Strong experience in writing SQL queries to extract data from databases, using tools such as TOAD.
- Experience working across various platforms, with good exposure to client/server architecture, object-oriented approaches, and Internet technologies.
Experience
Confidential, Chicago, IL Jun 09 - Present
Sr. Performance Engineer
Confidential is a global leader in the retail and commercial banking sector, providing financial services to private, corporate, and institutional clients. With global expertise in investment banking and asset management, UBS provides international wealth management and banking services.
The Performance Center of Excellence (PCOE) team provides expert solutions for all performance testing operations, serving as a single-stop source of the performance testing methodologies and tools needed by the various testing teams. The team also handles installation of Performance Center/LoadRunner licenses, deployment of over-firewall agents, and script solutions and strategies for all applications based on the technologies and architectures involved. In addition, the team maintains a SharePoint site to address ongoing issues reported by other performance testing teams, covering scripting and tool-guidance POC sessions, protocol identification, best practices for pre- and post-test analysis, and troubleshooting of all Performance Center issues, along with testing internal projects such as the time management and defect management applications used by different QA teams.
Responsibilities
- In the role of Senior Performance Engineer, two of my key responsibilities were to provide performance test solutions for the projects I was involved with and to act as a representative of the Performance COE team, providing mentoring and troubleshooting guidance on the issues presented.
- As part of the Performance COE team at UBS, was responsible for conducting architectural analysis and designing creative testing strategies and solutions to capture conclusive artifacts for all performance testing projects involved.
- Worked with Performance Center/LoadRunner 9.51 and utilized VuGen over various protocols such as Web (HTTP), Web Services, Winsock, and .NET.
- Led initial discussion and requirement-gathering meetings with the teams involved to obtain a navigational and architectural breakdown of the AUT; conducted in-depth pre-test analysis to outline the scope and strategy required to certify application performance against the Performance Acceptance Criteria (PAC).
- Designed a test plan to provide clarity on the phases involved from the inception to the conclusion of performance testing, and obtained sign-offs on the test schedule and commitments described in the plan.
- Utilized LoadRunner for scripting and test execution.
- Utilized tools such as SiteScope 9.5, Perfmon, and UNIX performance logging, as applicable to the AUT architecture, to capture resource and process snapshots during test execution.
- Created periodic and historical performance test result reports utilizing LR Analysis and monitoring data to portray application performance trends over varying user loads, evolving software versions, and proposed hardware changes, aiding tune-up and performance improvement efforts as well as commissioned capacity planning practices.
- Configured over-the-firewall (OTF) load generators and Monitor over Firewall (MOFW) agents to conduct testing across firewalls and geographical locations, utilizing LR Controller licenses from the test lab.
- Conducted in-depth post-test analysis and published test results.
Environment/Tools: HTTP, Web services with SAML authentication, HP-UX and Solaris, Oracle, SQL (TOAD), Performance Center (VuGen, Analysis, and Online Trending), MOFW and OTF agents in sync with LoadRunner, HTTP Analyzer, Wireshark
Confidential, Tampa, FL Nov 08 – Jun 09
Performance Engineer
Confidential is one of the largest banking and financial services organizations in the world. Through an international network linked by advanced technology, including a rapidly growing e-commerce capability, HSBC provides a comprehensive range of financial services: personal financial services; commercial banking; corporate, investment banking and markets; private banking; and other activities.
Played a vital role in the Testing Centre of Excellence (TCOE), a group of 8 test engineers responsible for performance-certifying the various software applications across HSBC's retail and banking sectors. The team is also responsible for pre- and post-test analysis and for recommendations to enhance the overall performance of these applications.
The QA objective of the TCOE is to reduce the risk of poor performance after production deployment by testing systems under "production-like" conditions while simulating peak and above-peak workloads. Performance objectives and requirements are validated through test strategies developed from the system architecture and through the use of load generators and monitors. The team utilizes Performance Center 9.0 for testing.
Automated regression testing focuses on tools that capture, validate, and replay user interactions automatically to identify defects and ensure that business processes work flawlessly upon production deployment. The principal automation tool used for this purpose is Mercury QuickTest Professional.
Responsibilities:
- Responsible for deploying offshore projects, delivering test deliverables, and updating the end user.
- Led an 8-member offshore project team and maintained script results per the requirements of each release.
- Organized meetings to update upper management on the status and outcomes of the analyses for the application under test (AUT).
- Reviewed and analyzed new user requirements, program design, coding, and unit testing.
- Interacted with developers to resolve defects that would affect income-generating functionality in production.
- Developed high-level test plans incorporating user profiles, configurations, environments, risks, test scenarios, schedules, and the analysis and presentation of results.
- Worked with development, users, and support groups to understand the application architecture and simulate realistic production scenarios for load and stress testing.
- Extensively worked with Mercury Performance Center and LoadRunner; created scripts based on prioritized/critical scenarios and distributed the peak load according to a production-like distribution ratio.
- Configured the VuGen, Analysis, and Controller modules when working with LoadRunner, and communicated with the test database to store results.
- Worked with Mercury Performance Center and designed tests to work with both the LoadRunner Controller and Performance Center web portals.
- Analyzed the tests using monitoring data obtained from Introscope.
- Utilized MQ Tester to generate MQ load from interfacing applications.
- Utilized the Analysis component to drill down into test results.
- Utilized rendezvous points and checkpoints to further enhance performance scripts and test the application without scripting errors.
- Utilized smoke, load, and stress testing approaches to predict overall application performance.
- Interacted with stakeholders during testing, isolated bottlenecks at different levels, and suggested tune-up methodologies.
- Maintained a master GUI Map in a central repository and updated it for version changes.
- Attended meetings to outline and present performance testing strategies and objectives.
- Performed functional/regression testing and performance/stress/load testing.
- Developed scenarios for regression/functional and performance testing, covering more than 90% of the application's critical scenarios.
- Analyzed average CPU usage, response times, number of transactions, throughput, HTTP hits, and average page times for probable scenarios, and created Performance Explorer graphs to analyze CPU and memory utilization for different load tests.
- Performed data-driven testing and script enhancement.
- Generated test data for various types of testing: valid data, invalid data, partial data, and bad data.
Environment: Mercury Performance Center/LoadRunner, Mercury QTP, Win NT/2000, Java, UNIX, J2EE, EJB, WebSphere, Servlets, JSP, JavaScript, XML, HTML, DHTML, VBScript, Oracle 9.x, SQL, C, C++, and JDBC.
Confidential, St Louis, MO Oct 06 - Nov 08
Performance Engineer
Confidential is the largest provider of local and long-distance telephone services in the United States, and also sells digital subscriber line (DSL) Internet access and digital television. AT&T is the second-largest provider of wireless service in the United States.
Played a vital role in AT&T's performance and regression testing team, which is responsible for providing automated solutions for performance and regression testing. The objective of the performance testing effort is to reduce the risk of poor performance after production deployment by testing systems under "production-like" conditions while simulating peak and above-peak workloads. Performance objectives and requirements are validated through test strategies developed from the system architecture and through the use of load generators and monitors.
Automated regression testing focuses on the utilization of tools that capture, validate, and replay user interactions automatically to identify defects and ensure that business processes work flawlessly upon production deployment.
Responsibilities:
As part of this team, my job responsibilities included, but were not limited to:
- Understanding the application under test, writing the test plan, defining the test strategy, developing the test matrix, and guiding/working with the test engineers on scripting.
- Working with development, users, and support groups to understand the application architecture, usage, and current production issues in order to simulate the best possible real-world scenarios for load and stress testing.
- Defining the test scenarios and making sure that scripts work according to the planned scenarios.
- Attending meetings with other groups to explain performance objectives, strategy, and progress.
- Analyzing and recommending the best-suited tools for scripting and reporting, with recommendations normally based on the application under test.
- Creating reports on performance test results for higher management.
- Configuring the LoadRunner agents to enable running virtual users over firewalls.
- Configuring Performance Center monitoring profiles and SiteScope-based monitoring needed to observe the AUT under stress and analyze application behavior over varying loads.
- Configuring Monitoring over Firewall (MOFW).
- Worked with HP Performance Center/LoadRunner for performance testing and QTP for functional test activities.
- Tested web, client/server, and web-service-based applications, using various workloads as demanded by the application.
- Executed tests and analyzed the results, compared response times under load, and correlated resource statistics to identify any bottlenecks involved.
Environment: J2EE, Java Server Pages (JSP), Servlets, XML, LoadRunner, QuickTest Professional, Quality Center, WebSphere, Oracle, SQL Server, UNIX, Windows NT/2000/XP, AIX, PVCS, Attachmate, TN host emulator, Tivoli, Indepth I-3
Confidential, Greenville, NC Oct 05 - Sep 06
QA Tester
Confidential is a leading independent leaf tobacco merchant serving the world's largest cigarette manufacturers. Alliance One selects, purchases, processes, packs, stores, and ships leaf tobacco. The software development team at Alliance One is responsible for creating software solutions to aid order management, inventory assignment and management, and human resource management for SAP implementation and usage.
Responsibilities:
The responsibilities at Alliance One as a QA tester included leading manual and automation testing efforts and mentoring a team of 5 in QA methodologies. Participated in proof-of-concept sessions to evaluate automation and performance test tools for SAP and Siebel testing. Participated and presented in weekly brainstorming sessions with the offshore team to promote standardization and code-reusability standards and to enhance the uniformity and clarity of QA testing efforts.
- Led a team of 5 QA testers and was involved in test case development and execution.
- Played an important role in SAP testing, where the QA tools were chosen based on POC sessions with tool vendors; this helped in understanding and comparing SilkTest with QTP, and Silk Performer with LoadRunner, to determine the right tools for SAP testing.
- Developed test plans and obtained sign-offs. Created test cases and automated the regression test cases for verification of minor and major enhancements to the application.
- Utilized TestDirector as a repository for various QA efforts; maintained and administered Mercury TestDirector (TD).
- Responsible for identifying bugs and logging them in the internal bug-tracking portal; periodically reviewed and verified fixed issues through regression testing.
- Conducted ongoing analysis of emerging test tools and QA best practices, and conducted guidance sessions for counterpart QA teams.
- Evaluated Mercury and Segue test tools and conducted POCs for the implementation of SAP testing.
Testing Tools Used : Manual testing, QTP, LoadRunner, Segue SilkTest, and Silk Performer
Technologies : VB, Java, WebLogic, HP-UX, Oracle
Confidential, Hyderabad, India Apr 04 - Sep 05
Software Quality Assurance Engineer
Confidential is a rapidly growing medical software and services company that also offers healthcare software services and consulting. iMedx provides software solutions in areas such as Internet-based medical transcription, Internet-based compliant health record solutions, and online prescription services, to mention a few.
Responsibilities:
The responsibilities at iMedx as a Software QA Engineer included, but were not limited to:
- Planning, designing, developing, and executing standard testing and quality assurance methodology.
- Developing manual test cases, validating them, and reporting defects.
- Confirming enhancements and bug fixes.
- Developing regression scripts in Visual Basic utilizing Mercury QTP.
- Maintaining and administering Mercury TestDirector (TD).
- Utilizing TD for maintaining test cases, defect tracking, and reporting.
- Participating in a Mercury LoadRunner training program and evaluating LoadRunner 8.0; worked on a POC for performance testing utilizing LoadRunner.
Projects handled:
- iMedx TurboScribe: lets doctors rapidly access and manage their transcriptions over the Internet. The transcribed documents are secure, portable, and easily searchable.
Test tools Used : Manual testing, TestDirector, QuickTest Professional, SQL, Unix shell scripting
Technologies : Java, UNIX, Oracle, WebSphere
- iMedx TurboRecord: an Internet-based compliant electronic health record (EHR) solution.
Test tools Used : Manual testing, TestDirector, TOAD (SQL), shell scripting
Technologies : Java, UNIX, Oracle, WebSphere
- iMedx TurboRx: an online e-prescribing service.
Test tools Used : QTP, HP TestDirector, IBM Functional Tester, LoadRunner
Technologies : Java, UNIX, Oracle, WebSphere
Confidential, Hyderabad, India Oct 03 - Mar 04
Software Tester
Mahaveer Infoway provides software solutions to clients in areas such as banking and finance, insurance, telecom, manufacturing, and real estate.
Responsibilities:
- Developing manual test cases and use cases utilizing software specification documents and business requirement documents.
- Creating test plans, and highlighting and prioritizing the software modules to be tested based on availability and their business- and resource-critical nature.
- Creating and validating test cases.
- Identifying and reporting bugs using Segue Silk Radar.
- Confirming enhancements and bug fixes.
- Developing regression scripts utilizing Segue SilkTest and QTP.
Testing Tools Used : SilkTest, QTP (GUI testing), SQL queries, and shell scripts
Technologies : VB, Java, WebLogic, HP-UX, Oracle
Education
BS in Computer Science
Master's in Computer Applications