Sr. Performance Test Engineer Resume
Rochester, NY
SUMMARY:
- 8 years of professional experience in Software Performance Quality Assurance Testing.
- Expertise in Quality Assurance (QA), manual testing, performance testing, automation testing, functional and non-functional testing, the Software Development Life Cycle (SDLC), test case/test set development, test scripting, setup, execution, results and reporting, and bug/defect analysis, logging, and tracking.
- Adept in formulating test plans, test cases, test scenarios, and test approaches, and in setting up testing environments. Experienced in writing training documents and traceability matrices.
- Performed regular testing and analysis of progress, defects, risk assessments, and impact reports.
- Expert creator of comprehensive performance test results and analytical performance status reports.
- Expert in Automated Testing in different testing and Software Development Life Cycle (SDLC) methodologies like Waterfall, Agile, Scrum.
- Strong analytical, interpersonal, and communication skills. Result-oriented, committed, and hardworking, with a quest to learn new technologies and undertake challenging tasks.
- Team player and quick learner; early adopter of new tools/technologies and assessor of their test applicability.
- Excellent written/verbal communication; highly motivated self-starter able to work independently and collaboratively within a diverse technical team.
TECHNICAL SKILLS:
Automated Testing Tools: HP Quality Center, HP UFT, HP Performance Center/ALM, SoapUI, JMeter, Selenium, Jenkins, Appium
Operating Systems: Windows 98/2000/NT/XP, Windows Server 2003/2008, Unix, Linux, Solaris, iOS, Android, macOS
MS Office Suite: Word, Excel, Visio, PowerPoint, Access, SharePoint, Project
Bug Tracking Tools: Rally, Jira, Bugzilla
Languages: Shell, HTML, CSS, XML, C, Python, SQL/T-SQL, JavaScript, AJAX, VBScript
Servers: IIS, WebSphere, WebLogic, Tomcat, Apache
Databases: RDBMS, DB2, MS SQL Server, Oracle, MongoDB, PostgreSQL
Browsers: MS Internet Explorer, Mozilla Firefox, Chrome
SDLC: Agile Iterative/Incremental, Waterfall, Scrum, RUP
Others: APM, Experitest SeeTest, MobileLabs, Xcode/Instruments, Azure Management Studio, Git, SiteScope, Dynatrace, AppDynamics, New Relic, Postman, Web services, WSDL, SOAP/REST
PROFESSIONAL EXPERIENCE:
Confidential, Rochester, NY
Sr. Performance Test Engineer
Responsibilities:
- Worked in the payroll-generating line of business, including 401(k) applications, serving as a Senior Performance Test Engineer.
- Worked on a project called Mobility, through which Confidential released its very first smartphone application for Confidential customers. Confidential clients and their employees could access their Confidential accounts and pay details from a smartphone with a valid user account. The first version of Mobility offered retirement, health and benefits, payroll information, and time-off accrual.
- Reviewed project requirements and designs for risks.
- Responsible for giving feedback to the development team on possible improvements and discovered performance issues/benchmarks.
- Involved in pre-testing, which included reviewing and studying the functional and non-functional requirements documents to develop test plans and test cases.
- Participated in production release validation to ensure successful deployment of products without performance issues.
- Responsible for planning the entire load test process based on the performance requirements document; performed load testing with the SOASTA cloud testing tool, LoadRunner, HP performance monitoring tools, HP SiteScope, QTP (for running scripts to create 800 user accounts), and SoapUI (for web service calls).
- Responsible for developing analytical reports with multiple parameter fields, multiple expressions, and multiple grouping hierarchy levels, and published them using Business Objects Enterprise.
- Executed SQL queries to ensure data was populated into the correct tables and data integrity was maintained.
- Conducted walkthroughs with the team, reviewing the test plans and test cases for team input and baselining the test plan using the Agile test methodology.
- Formulated Test Plan and Test Cases in Quality Center as per the project milestones.
- Performed manual testing executing all the test cases in Quality Center before switching to automation testing.
- Involved in both forms of testing, i.e., black-box and white-box testing.
- Participated in unit testing using JUnit.
- Developed & updated automation scripts using Quick Test Pro on different functionalities of the application for GMS and Rhythmyx (Web Content Management)
- Learned the GUI objects of the application, mapped custom objects to standard objects where necessary, and inserted GUI, bitmap, and text checkpoints where needed to compare the current behavior of the application under test against its behavior in the earlier version using QTP.
- Parameterized the fixed values in checkpoint statements, created data tables for the parameters, and wrote functions to read new data from the table on each iteration - performed data-driven testing.
- Prepared a detailed test schedule and test metrics on a weekly basis so project members knew the status of the QA process.
- Used Quality Center to report application bugs and enhancement requests, and discussed with developers to resolve technical issues.
- Responsible for release management (deploying new code builds to the performance testing environment), as the Confidential performance team had to deploy new code in its corresponding testing environments. Code was staged with Jenkins, and deployments were executed through Tidal job clients.
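The data-driven pattern described above (reading fresh parameter values from a table on each iteration) can be sketched in Python; the CSV columns and the iteration step are hypothetical stand-ins for the QTP data table and the action under test:

```python
import csv
import io

# Stand-in for the QTP data table: one row of parameter values per iteration.
data_table = io.StringIO(
    "username,password,expected_title\n"
    "user1,pass1,Home\n"
    "user2,pass2,Home\n"
)

def run_iteration(row):
    # Placeholder for the parameterized test step; a real run would drive
    # the application and compare a checkpoint against row["expected_title"].
    return {"user": row["username"], "checkpoint": row["expected_title"]}

# One iteration per data-table row, as in data-driven testing.
results = [run_iteration(row) for row in csv.DictReader(data_table)]
print(len(results))  # 2
```

The point of the pattern is that adding test coverage means adding rows to the table, not changing the script.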
Environment: WebSphere, RUP, Citrix, Windows 2000 Advanced Server, Doc Harbor Mid-tier, IBM AIX, Oracle Database, SQL Server 7.0, PL/SQL, MQ Series (IBM and MS), ClearQuest, Ab Initio, Cognos, Optimal Application Expert, LoadRunner, TestDirector.
Confidential, Charlotte, NC
QA Performance Analyst
Responsibilities:
- Identified bottlenecks and points of degradation: monitored critical bottlenecks and resource consumption (front-end, middle-tier, and back-end servers: CPU, memory, disk I/O, network, etc.) along with their associated load levels. Tested up to 10,000 virtual users inside the firewall (intranet) and outside it (extranet).
- Scripted and performed ASP and JavaScript page response-time measurements and established benchmarks (tests performed on an array of Apache, WebLogic, and Oracle servers).
- Designed several LoadRunner scripts (VuGen) using protocols such as Web (HTTP/HTML), .NET, Siebel, Web Services (SOAP/WSDL/XML, Web Custom Request), SAP-Web, SAP GUI, Citrix, and PeopleSoft for load testing different applications.
- Established performance/load table to keep track of front-end and back-end system history and benchmarks
- Implemented and established large-scale load volume testing for front-end, middleware and back-end systems
- Analyzed online and batch transactions to test and measure response times and other performance metrics.
- Identified performance issues and drove tuning efforts, working with DBAs and developers.
- Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application.
- Monitored resources to identify performance bottlenecks and tuned the JVM.
- Planned and implemented server component-level performance testing and monitoring.
- Certified pre-release versus post-release build improvements and degradations
- Certified new build versions to verify that they can handle the expected loads
- Established software/system performance and load benchmark measurements for capacity, scalability, and breakpoints.
- Responsible for gathering and analyzing business and technical requirements.
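Benchmark tables like those described above reduce to response-time statistics per transaction. A minimal sketch of computing them from captured samples, where the timing values are made-up illustrations rather than real measurements:

```python
# Illustrative response-time samples, in seconds, for one transaction.
samples = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.9, 1.2, 1.0]

def percentile(values, pct):
    """Nearest-rank percentile, a common convention for reporting the
    90th-percentile response time in load-test summaries."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

avg = sum(samples) / len(samples)
p90 = percentile(samples, 90)
print(round(avg, 2), p90)  # 1.23 1.9
```

Tracking the same average and 90th percentile per build is what makes a benchmark history table useful for spotting degradations between releases.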
Environment: LoadRunner, Performance Center, Quality Center, Quick Test Pro, ClearQuest, Oracle, MS SQL Server, WebLogic, WebSphere, Load Balancer, Java, UNIX.
Confidential, Pleasanton, CA
Senior Performance Engineer
Responsibilities:
- Analyzed new business requirements and the performance of Kaiser applications.
- Worked extensively on Citrix and web-based applications.
- Used Citrix ICA protocol in LoadRunner
- Installed and configured LoadRunner for Citrix-based scripting.
- Debugged and edited LoadRunner scripts with advanced synchronization techniques.
- Created scripts to generate load using the LoadRunner VuGen tool.
- Set up SiteScope counters and monitored them in the LoadRunner Controller and Performance Center.
- Measured application scalability through stress tests and analyzed the impact of peak loads on application availability with LoadRunner Analysis.
- Performed post-production monitoring of applications and server responses for performance-related defects introduced by new releases.
- Worked on replicating production problems in load environment for Applications.
- Created UNIX scripts for collecting statistics from NMON and VMSTAT counters during the execution of performance tests.
- Assisted the development team with performance results and recommendations that helped in performance tuning
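A statistics-collection script like the one above can be approximated in Python. The sample line mimics common Linux vmstat output; the column positions assumed here differ on AIX and other Unix flavors, so treat the field layout as an assumption:

```python
# Parse one captured vmstat sample line into named metrics.
# Column layout assumed here follows common Linux vmstat output:
# r b swpd free buff cache si so bi bo in cs us sy id wa st
SAMPLE = " 1  0  0 812345 102400 523456   0   0     5    12  250  480  7  2 90  1  0"

def parse_vmstat_line(line):
    fields = line.split()
    return {
        "free_kb":  int(fields[3]),   # free memory (KB)
        "us_cpu":   int(fields[12]),  # user CPU %
        "sy_cpu":   int(fields[13]),  # system CPU %
        "idle_cpu": int(fields[14]),  # idle CPU %
    }

stats = parse_vmstat_line(SAMPLE)
print(stats["idle_cpu"])  # 90
```

In practice such a parser would run over every interval line captured during a test and feed the series into the results analysis alongside the LoadRunner metrics.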
Environment: Citrix, WebSphere, LoadRunner, WinRunner, Windows Advanced Server, IIS, IBM AIX, Oracle, SQL Server 7.0, PL/SQL, Selenium, Quality Center, Performance Center, Shell Scripting, Web services.
Confidential, San Francisco, CA
Sr. Performance Engineer
Responsibilities:
- Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
- Interacted with the business community and end users to gather requirements and develop User Requirement Specification (URS) documents.
- Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Followed the Rational Unified Process (RUP) in executing different processes.
- Used WinRunner for functional and regression testing.
- Created TSL Scripts and performed Data driven testing.
- Extensively used LoadRunner for performance and stress testing.
- Wrote LoadRunner scripts, enhanced scripts with functions, parameterized cookies, stored dynamic content in LoadRunner functions, and used client-side secure certificates.
- Wrote high-level LoadRunner scripts using the Virtual User Generator for single-user, baseline, and soak (endurance) scenarios, storing dynamically varying object IDs in parameters and validating correct downloads of HTML pages by checking page-source content.
- Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application.
- Parameterized unique IDs, stored dynamic content in variables, and passed the values to web submit steps under the HTTP protocol.
- Handled cookies properly, used proxy servers, and recorded need-based header information into the scripts.
- Monitored Oracle database performance for indexes, sessions, connections, poorly written SQL queries, and deadlocks for each application component.
- Optimized SQL queries.
- Improved performance by adding indexes, thereby indirectly minimizing deadlocks.
- Monitored database for sessions, connection pool and Memory issues.
- Wrote shell scripts and edited configuration files in the vi editor for connection management and session management.
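The correlation pattern described above, capturing a dynamic value from one response and passing it into the next request, can be sketched with a regular expression. The response body, parameter name, and follow-up request here are hypothetical, mimicking what LoadRunner's web_reg_save_param does with left/right boundaries:

```python
import re

# Hypothetical server response containing a dynamically generated session ID.
response_body = '<input type="hidden" name="sessionId" value="A1B2C3D4">'

# Capture the dynamic value between known boundaries, the way
# web_reg_save_param's LB/RB attributes bracket the token to save.
match = re.search(r'name="sessionId" value="([^"]+)"', response_body)
session_id = match.group(1)

# Feed the captured value into the next (simulated) web submit step.
next_request = f"POST /checkout?sessionId={session_id}"
print(next_request)  # POST /checkout?sessionId=A1B2C3D4
```

Without this correlation step, a recorded script replays the stale ID captured at record time and the server rejects every virtual user after the first.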
Environment: WebSphere, RUP, Citrix, Windows 2000 Advanced Server, IIS 5, Doc Harbor Mid-tier, IBM AIX, Oracle Database, SQL Server 7.0, PL/SQL, MQ Series (IBM and MS), ClearQuest, CISCO Switch, Ab Initio, Cognos, Optimal Application Expert, LoadRunner, TestDirector, WinRunner
Confidential, Charlotte, NC
Sr. Performance Engineer
Responsibilities:
- Analyzed the requirement and design documents.
- Involved in writing Test Plans by incorporating Performance Testing Objectives, Testing Environment, User Profiles, Risks, Test Scenarios, Explanation about the Tools used, Schedules and Analysis, Monitors and Presentation of results.
- Wrote LoadRunner scripts and enhanced them with C functions.
- Parameterized users, stored dynamic content in LoadRunner functions, and used client-side secure certificates.
- Wrote text checks and created scenarios for concurrent (rendezvous) and sequential users.
- Configured run-time settings for HTTP and iterations; simulated modem speeds to bring the testing scenario closer to real-world conditions.
- Monitored CPU, memory, ASP requests, network, web connections, and throughput while running the various scenarios in LoadRunner TestCenter.
- Used VTS (Virtual Table Server, a LoadRunner component) for communication between scripts.
- Created single-user, baseline, and soak test scenarios; introduced random pacing between iterations to achieve the desired transactions per hour.
- Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner TestCenter.
- Analyzed results using the LoadRunner Analysis tool and examined Oracle database connections, sessions, and WebLogic log files.
- Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
- Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Used LoadRunner tool for testing and monitoring.
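The pacing calculation behind the random-pacing bullet above, randomizing the interval between iterations so a scenario hits a target transaction rate, can be sketched as follows. The target rate, user count, and jitter range are hypothetical examples, not values from the source:

```python
import random

def pacing_interval(target_tph, vusers, jitter=0.2, rng=random.random):
    """Seconds each virtual user should wait between iterations so the
    scenario as a whole produces target_tph transactions per hour, with
    +/- jitter randomization (analogous to LoadRunner's random pacing)."""
    base = 3600.0 * vusers / target_tph  # mean seconds per iteration per vuser
    return base * (1 - jitter + 2 * jitter * rng())

# Example: 100 vusers targeting 36,000 transactions/hour -> ~10 s mean pacing,
# with each interval drawn uniformly from [8.0, 12.0] seconds.
interval = pacing_interval(target_tph=36000, vusers=100)
print(8.0 <= interval <= 12.0)  # True
```

Randomizing the interval this way avoids every virtual user hitting the server in lockstep while keeping the average throughput at the target rate.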
Environment: LoadRunner 7.8, LoadRunner TestCenter 7.8, VTS (Virtual Table Server), Windows 2000 Advanced Server, Apache, IIS 5, Livelink 9.2, BEA WebLogic 8.1 SP1, Servlets, EJB, Solaris 5.8, Oracle Database
Confidential, Wichita, KS
Performance Engineer
Responsibilities:
- Analyzed the requirement and design documents.
- Involved in writing Test Plans by incorporating Performance Testing Objectives, Testing Environment, User Profiles, Risks, Test Scenarios, Explanation about the Tools used, Schedules and Analysis, Monitors and Presentation of results.
- Wrote LoadRunner scripts and enhanced them with C functions; parameterized users, stored dynamic content in LoadRunner functions, and used client-side secure certificates.
- Wrote text checks and created scenarios for concurrent (rendezvous) and sequential users.
- Configured run-time settings for HTTP and iterations; simulated modem speeds to bring the testing scenario closer to real-world conditions.
- Monitored CPU, memory, ASP requests, network, web connections, and throughput while running the various scenarios in LoadRunner TestCenter.
- Created single-user, baseline, and soak test scenarios; introduced random pacing between iterations to achieve the desired transactions per hour.
- Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner.
- Analyzed results using the LoadRunner Analysis tool and examined Oracle database connections, sessions, and WebLogic log files.
- Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
- Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Used LoadRunner tool for testing and monitoring.
Environment: LoadRunner, LoadRunner TestCenter, VTS (Virtual Table Server), Windows 2000 Advanced Server, Apache, IIS 5, Livelink, BEA WebLogic, Servlets, EJB, Solaris, Oracle Database