Quality Assurance Engineer Resume
Austin, TX
SUMMARY
- Quality Assurance Engineer with more than 4 years of experience, strong attention to detail, and a commitment to developing and implementing continuous improvement initiatives by collecting and analyzing data and converting it into actionable information. A self-starter and motivated team player with excellent communication and interpersonal skills, open to acquiring knowledge and learning new technologies.
- Experience in software testing (manual, performance, and automation), with a solid understanding of test planning, designing and implementing performance-related tests, and working with automation testing frameworks.
- Excellent working knowledge of SQL and RDBMS such as MySQL, Oracle, etc.
- Proven expertise and success with QA automation tools, both commercial and open source, including QuickTest Pro, LoadRunner, Selenium, and JMeter.
- Excellent working knowledge of designing and implementing test strategy plans and automated test solutions for client/server and web applications with LoadRunner.
- Well acquainted with all phases of SDLC and STLC.
- Hands-on experience designing, developing, and executing different types of load tests for web applications and mobile iOS/Android applications.
- Strong experience in automating web and .NET application testing using Selenium WebDriver with the TestNG and JUnit frameworks.
- Worked in Agile and Waterfall Testing methodologies.
- Experience with web services, especially SOAP and RESTful APIs.
- Worked extensively with LoadRunner on enhancing scripts, assembling test scenarios, executions, and test reports.
- Handled performance test environment readiness for code deployments, including restarting test servers/services before validation and test runs.
- Experienced in developing different test scenarios such as single-user/sanity tests, load tests, scalability/stress testing, reliability/endurance testing, and performance regression testing.
- Hands-on experience configuring and using MOFW, Dynatrace, HP Diagnostics, and AppDynamics to set up performance monitors for monitoring and analyzing server stats during test executions.
TECHNICAL SKILLS
Testing tools: LoadRunner, JMeter, Selenium WebDriver/IDE, ALM Performance Center
Web services: SoapUI, REST
Scripting languages: C, C++, JavaScript, SQL scripting, Shell basics, Unix basics, PHP
IDEs: NetBeans 6.0, Eclipse, IBM Rational Application Developer (RAD)
Platforms: Windows NT 10.0/2019/08/10, Unix, JBoss, Amazon EC2 instance, Tomcat, WebLogic, Amazon AWS Cloud, Linux, Ecommerce, Active & Passive Directories, IIS
Databases: MS SQL Server 2016/2017, DB2, Oracle, MySQL, Retail, TDM, Data warehouse
Web technologies and monitoring tools: HTML, CSS, XML, VBScript, Dynatrace 5.5 & 6.1, VisualVM, AppDynamics, HP Diagnostics, Splunk, Datadog, Perfmon, Wireshark, Database Trace, RabbitMQ, ESB Simulator, Graphite, Grafana, Fiddler, MOFW
Tools: Office client, Excel, PowerPoint, Project, Visio, Outlook, WinSQL, Spooky, R10 Web Client, Postman, SCCM
Defect tracking Tools: Quality Center (9, 10, 11.0), Jira, Team Foundation Server (TFS 2010, 2012), qTest (1.1,1.2), QuickBase
PROFESSIONAL EXPERIENCE
Confidential, Austin, TX
Quality Assurance Engineer
Responsibilities:
- Gathered requirements from business teams and stakeholders.
- Organized meetings with the Product Owner, business, architecture, and development teams to establish the scope of automation, functional, and non-functional requirements for the application under test.
- Developed non-functional/performance tests against documented APIs in the test environment and aligned requirements between Jira, Quality Center (QC), and qTest.
- Designed workload model and identified feasibility for the test cases.
- Prepared test suites in Quality Center and migrated the data to another test management tool, qTest.
- Developed scripts using Web (HTTP/HTML) & Web Services protocols through LoadRunner tool and enhanced scripts to support error/data handling.
- Test execution includes: Smoke test for Environment, Sanity Test, Capacity Test, Load Test, Stress Test, Endurance Test, Regression test and HTTP Watch Test - Client-Side Page analysis.
- Performed cross-browser testing to verify that the application provides accurate information in different browsers.
- Developed test frameworks in Selenium for UI regression test automation and, when necessary, unit test automation.
- Automated scripts and performed functionality testing during the various phases of the application development using TestNG framework.
- Wrote and executed SQL query statements to retrieve data from the back end.
- Handled performance-related issues and reproduced production issues in the internal lab environment; raised defects for tracking when needed.
- Collaborated with development and product teams to validate defects.
- Responsible for Point of Sale Profiling and Self-Checkout Profiling for new release of R10 version.
- Executed an initial test scenario that populates the DB with transaction logs, new product prices, taxes, and product promotions.
- Worked extensively with Amazon Virtual Private Cloud alongside on-premises systems to ensure application functionality.
- Drove test-driven and behavior-driven development (TDD & BDD) by working hand in hand with the development and product management teams.
- Raised tickets with third-party companies such as Micro Focus (formerly HP) to facilitate critical test executions.
- Supported R&D projects and helped the client with next steps in project development.
- Served as focal point for operational and certificate security tests designated for execution in technology labs.
- Identified bottlenecks, bugs, project risks, dependencies and provided alternatives to avoid risks.
- Configured the logging level to Debug to capture every trend and took DB traces during execution.
- Analyzed metrics including response times, hits, and throughput from the analysis file using Splunk, Perfmon, MOFW, and various server logs after each test execution.
- Analyzed Unix server CPU and memory usage at the box level.
- Prepared the consolidated test report for each executed load test in MS Word, PowerPoint, and email formats after completing analysis and shared it with all stakeholders and the team.
- Worked with the offshore team on a daily basis to avoid issues that might impact production, allocating tasks to the team when required.
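The post-test metrics analysis described above (response times, throughput) can be sketched in code. This is an illustrative Java snippet with sample timings, not actual test results, showing a nearest-rank percentile and a throughput calculation of the kind run over load-test analysis data:

```java
import java.util.Arrays;

// Illustrative post-test analysis: percentile response time and throughput
// from raw per-transaction timings (sample data, not real results).
public class RtAnalysis {
    // Nearest-rank percentile: smallest sample such that at least p% of samples are <= it.
    static double percentile(double[] samplesMs, double p) {
        double[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length); // 1-based rank
        return sorted[rank - 1];
    }

    // Throughput = completed transactions / test duration.
    static double throughputTps(int transactions, double durationSeconds) {
        return transactions / durationSeconds;
    }

    public static void main(String[] args) {
        double[] rt = {120, 95, 210, 480, 150, 99, 310, 175, 130, 260}; // ms
        System.out.println("p90 (ms): " + percentile(rt, 90)); // 310.0
        System.out.println("TPS: " + throughputTps(rt.length, 5.0)); // 2.0
    }
}
```

In practice these figures come from the tool's analysis file (LoadRunner Analysis, Splunk, Perfmon); the snippet only shows the arithmetic applied to them.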
Environment: Web Servers, App Servers (Windows, Linux), Java, .NET, JIRA, Spooky client, UNIX, SQL Server, TDM database, Retail Database, LoadRunner, Load Generator, Selenium WebDriver, Quality Center, Performance Center in Cloud, MOFW, Dynatrace Ajax Client, SSMS (SQL Server Management Studio), RDM, Perfmon, RabbitMQ, Wireshark, DB Trace, Splunk, SiteScope
Confidential
Performance Engineer
Responsibilities:
- Gathered requirements from business teams and stakeholders, prepared test plans.
- Used Splunk tool queries to pull the peak hours’ load volumes for the business-critical transactions to be performance tested.
- Prepared the test plan/strategy and shared it with key stakeholders, the product manager, the business lead, and the development team for sign-off.
- Tested at the API level (web services, JMS queues, and other back-end services) and executed tests at various levels including DB, logging, and UI validation.
- Prepared and documented performance test cases and the test data setup process before the start of testing.
- Developed scripts using the Java, .NET, and Web Services protocols in LoadRunner and enhanced the scripts to support error/data handling.
- Evaluated think time and pacing calculations when preparing scenario designs for loads ranging from 5 TPS to 600 TPS.
- Test execution includes: Smoke Test, Capacity Test, Stress Test, Endurance Test, and HTTP Watch Test - Client-Side Page analysis.
- Analyzed metrics and shared them with customers for a better understanding of issues before releases.
- Analyzed Tomcat server metrics - JVM heap usage (used heap after collection, % time spent in GC, young generation usage, old generation usage, perm gen usage) and process CPU.
- Prepared the consolidated test report for each executed load test in MS Word, MS Excel, and email formats after completing analysis.
- Tracked and tabulated defects for key performance indicators that did not meet historical baseline results/SLAs and for functionality not working as expected during manual validation after code deployment.
- Supported testing in the production region to replicate prod issues (P1 tickets) in the test region, then validated the fix and provided sign-off. The P1 tickets were mostly about:
- Unhealthy heap usage - heap memory not fully reclaimed after GC; the fix involved enabling multi-threading on the JRuby single container, which utilized thread-local storage effectively after invocations and GC.
- Connection termination with CPU reaching the 85% threshold; the fix involved optimizing the search Java code to avoid duplicate calls hitting the DB.
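The think-time and pacing arithmetic behind scenario designs like those above can be sketched as follows. The figures are illustrative, not from the actual engagement; the standard relationship is pacing = virtual users / target TPS:

```java
// Illustrative pacing/think-time arithmetic for load scenario design.
public class PacingCalc {
    // Pacing (seconds between iteration starts per virtual user) needed so that
    // `vusers` users together generate `targetTps` transactions per second.
    static double pacingSeconds(int vusers, double targetTps) {
        return vusers / targetTps;
    }

    // Virtual users required to sustain a target TPS when each iteration takes
    // `iterationSeconds` (script time + think time) end to end.
    static int vusersNeeded(double targetTps, double iterationSeconds) {
        return (int) Math.ceil(targetTps * iterationSeconds);
    }

    public static void main(String[] args) {
        // 600 TPS with 1200 virtual users -> each user starts an iteration every 2 s.
        System.out.println(pacingSeconds(1200, 600.0)); // 2.0
        // 5 TPS with a 12 s iteration -> 60 virtual users.
        System.out.println(vusersNeeded(5.0, 12.0));    // 60
    }
}
```

In LoadRunner these values are entered in the script's runtime settings (pacing and think time), with the arithmetic above determining the scenario's user count.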
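The JVM heap figures analyzed above (used heap, heap capacity) can also be read programmatically from a running JVM. This is a minimal sketch using the standard java.lang.management API; the 85% threshold is illustrative, echoing the thresholds mentioned above:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Minimal heap-usage snapshot of the current JVM via the standard MemoryMXBean.
public class HeapSnapshot {
    static double heapUsedPercent() {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        // Percentage of the maximum heap currently in use.
        return 100.0 * heap.getUsed() / heap.getMax();
    }

    public static void main(String[] args) {
        double pct = heapUsedPercent();
        System.out.println("Heap used: " + pct + "%");
        // Flag an unhealthy-heap condition, e.g. sustained usage above an 85% threshold.
        System.out.println(pct > 85.0 ? "ALERT: heap above threshold" : "Heap OK");
    }
}
```

Tools like VisualVM and Dynatrace expose the same MemoryMXBean data remotely over JMX; this snippet only shows the local read.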
Environment: MAUI - Mainframes, Stubs Server, Load Balancer, IBM Support Assistant 4.1, Splunk, LoadRunner 12.50 & 12.53, Quality Center 11, Dynatrace Ajax Client, VisualVM, Graphite, Web Server, App Server (Tomcat), Unix, DB2 database, Virtualization, Linux