
Qa Performance Test Analyst Resume


Dallas, Texas

SUMMARY

  • Over 15 years of experience in Software Analysis, Performance, and Automation initiatives for Client Server, Web Applications, and Web Services with Government, Veterans Affairs, Point of Sale (POS), Financial, Airline, Cyber Security, Mobile, Hardware, Software, Manufacturing, and private-sector industries
  • Expertise in using Rational Team Concert (RTC), Rational Requirements Composer (RRC), Rational Quality Manager (RQM), HP LoadRunner, IBM Rational Performance Tester, and JMeter/BlazeMeter for performance testing; QTP, UFT, and Rational Functional Tester for automation testing.
  • Worked on end-to-end performance testing: designed, developed, and executed performance test plans to validate that applications meet load, stability, scalability, and reliability standards, using all three components of the LoadRunner tool - VuGen for script development, Controller to run LoadRunner scenarios, and Analysis for graphical representation of application performance - and provided performance analysis of APIs.
  • Worked as part of the team to analyze performance test results, identify root causes of issues, and propose solutions to enhance application performance.
  • Tested APIs using SoapUI for web services testing, adding assertions such as SOAP Response, SOAP Fault, Not SOAP Fault, Schema Compliance, WS-Security Status, and WS-Addressing Response, and working with messages, to name a few.
  • Used Compuware dynaTrace as a performance monitoring tool (User Experience, All Tiers, Code-Level detail) for analyzing test results.
  • Hands-on with JMeter 5.1.1, BlazeMeter v4.6.0, Katalon Studio 6.2.0 for Windows 64-bit, Appium Studio, Android Studio SDK, Fiddler 4, and NetScaler Load Balancer.
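The SoapUI assertion types listed above (SOAP Fault, Schema Compliance, and so on) amount to checks against the response envelope. A minimal Python sketch of a "SOAP Fault" assertion; the response envelope below is a hypothetical example, not any real service's output:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def is_soap_fault(envelope_xml: str) -> bool:
    """Return True if the SOAP envelope's Body contains a Fault element."""
    root = ET.fromstring(envelope_xml)
    body = root.find(f"{{{SOAP_NS}}}Body")
    return body is not None and body.find(f"{{{SOAP_NS}}}Fault") is not None

# Hypothetical fault response, for illustration only.
fault_response = f"""
<soap:Envelope xmlns:soap="{SOAP_NS}">
  <soap:Body>
    <soap:Fault>
      <faultcode>soap:Server</faultcode>
      <faultstring>Internal error</faultstring>
    </soap:Fault>
  </soap:Body>
</soap:Envelope>"""

assert is_soap_fault(fault_response)  # the "SOAP Fault" assertion passes
```

A "Not SOAP Fault" assertion is simply the negation of the same check.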

TECHNICAL SKILLS

Operating Systems: Windows 95/98/2000/2003/ NT/XP, RedHat Advanced Server Linux 2.4.x, Sun Solaris 2.6/2.8, MS-DOS 6.22

Languages: Visual Basic, C/C++, SQL, PL/SQL, HTML

RDBMS: SQL Server, MS-Access

Testing Tools: Win Runner 7.0/7.5/8.2, Quick Test Pro 6.5/8.0/9.0/9.2, Load Runner 6.0/7.0/7.5/8.0/9.0/9.5/11.04, Test Director 7.0/7.6/8.0/9.0, Quality Center 10.0/9.0, Rational Test Manager, BPT 10.0, Software Planner 9.4.2, Rally Software Development, Bugzilla, Jmeter 5.1.1, Blazemeter v.4.6.0, Katalon Studio for Windows 64-6.2.0, Appium Studio, Android Studio SDK

Protocols: HTTP/HTML, Ajax TruClient, Ajax Click & Script, Win Socket in Load Runner

Rational Tools: Rational Test Suite 2002.00.05 (Rational Robot, Rational Test Manager, Rational Clear Quest, Rational Clear Case, Rational Requisite Pro), Rational Functional Tester (v7), Rational Functional tester 8.5

Methodology: Agile Development Methodology with Scrum, RUP Methodology

Web Services: Service Test 11.10, 9.5, Soap UI testing on SOA/OSB

Professional Training: Oracle, Oracle Financials 11i - FA, GL, AP, AR, OM, OTC

PROFESSIONAL EXPERIENCE

Confidential, Dallas, Texas

QA Performance Test Analyst

Responsibilities:

  • Worked as a performance test architect for E2E testing for Financial Flows scoped in the current phase
  • Cost Rollup, Shop Costing Post Close, Post Inventory to GL & Supply Chain Management - Flow Trac & Customer Order Creation
  • Developed the test plan document, developed scripts, and executed them for baseline, load, performance, and stress tests at the determined load.
  • Used Jmeter/Blazemeter, Selenium for testing purposes.
  • Used Katalon Studio, Appium Studio, and Android Studio SDK for emulator-based testing of FlowTrac, a handheld-device application used by the business team on the plant floor.
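The JMeter/BlazeMeter runs above produce result files that must be reduced to pass/fail numbers. A minimal sketch of that reduction in Python, assuming JMeter's standard CSV (JTL) result columns `elapsed` and `success`; the sample rows are hypothetical:

```python
import csv
import io

def summarize_jtl(jtl_csv: str):
    """Return (average elapsed ms, error rate) from JMeter CSV results."""
    rows = list(csv.DictReader(io.StringIO(jtl_csv)))
    avg_ms = sum(int(r["elapsed"]) for r in rows) / len(rows)
    errors = sum(1 for r in rows if r["success"] != "true")
    return avg_ms, errors / len(rows)

# Hypothetical three-sample result file.
sample = """timeStamp,elapsed,label,success
1000,120,Login,true
2000,340,Login,true
3000,900,Login,false
"""
avg, err_rate = summarize_jtl(sample)  # ~453 ms average, 1 of 3 samples failed
```

The same two numbers (average response time, error rate) are what the baseline, load, and stress runs are judged against.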

Environment: Jmeter 5.1.1, Blazemeter v.4.6.0, Katalon Studio for Windows 64-6.2.0, Appium Studio, Android Studio SDK, Fiddler 4, NetScaler Load Balancer, DB2 9.0 database, Windows Server 2016, WebSphere and IBM HTTP Server on an AS/400 server, IBM RPG (HLL) programming language for business applications.

Confidential, Dallas, Texas

Sr. QA Test Engineer / Automation Tester

Responsibilities:

  • Managed Risks, CLIN deliverables, Application Change Requests (ACRs), Web Service Change Requests (WSCRs), Requirement Worksheets (RWs), Functional Line Item (FLI) requirements, operational characteristics, product designs, and test artifacts.
  • Developed and/or supported UC diagrams using UML, Requirement Specification Documents, System Design Documents, Master Test Plans, Requirements Traceability Matrices, etc.
  • Worked on end-to-end performance testing: designed, developed, and executed performance test plans to validate that applications meet load, stability, scalability, and reliability standards, using all three components of the LoadRunner tool - VuGen for script development, Controller to run LoadRunner scenarios, and Analysis for graphical representation of application performance - and provided performance analysis of APIs.
  • Worked as part of the team to analyze performance test results, identify root causes of issues, and propose solutions to enhance application performance.
  • Tested APIs using SoapUI for web services testing, adding assertions such as SOAP Response, SOAP Fault, Not SOAP Fault, Schema Compliance, WS-Security Status, and WS-Addressing Response, and working with messages, to name a few.
  • Used Compuware dynaTrace as a performance monitoring tool (User Experience, All Tiers, Code-Level detail) for analyzing test results.

Environment: Automation testing tools - RTC, RRC, RQM, Oracle Database XE Server, Eclipse, Maven, BDN - SharePoint, Confluence, Slack, JIRA, Jmeter, Groovy scripting.

Confidential, Dallas, Texas

Sr. QA Lead Analyst/ Performance Tester / Automation Tester

Responsibilities:

  • Managed Risks, CLIN deliverables, Functional/Technical Requirements, etc.
  • Worked as a part of the team to analyze performance test results, find solutions for issues to find the root causes as an effort to enhance the performance of the application.
  • Tested Threat Protection (TP) and Advanced Threat Protection (ATP) in various releases (Mater1, Mater2, Nemo & Dusty): functionality, interoperability, integration, smoke, and sanity testing; Software Data Encryption (SDE); Self-Encrypting Drive (SED); enrollments for sign-in and access authentication; BitLocker encryption with fingerprints and smart cards; and cloud-based testing by sending policies from various servers (EE, VE, remote servers) and verifying the policies received on the client systems.
  • Verified that events flow from user-interactive systems to the servers as alerts and notifications, using simulated unsafe and abnormal threats as well as real threats in a virtual environment, and that threats are quarantined per the set policies using appropriate data.
  • Tested antivirus products - McAfee Endpoint Security tools, Cylance Protect software, and Threat Defense products - using Windows commands with child extracted components, via GUI, PsExec, and silent command-line installation methods, as both an Admin and a regular user.

Environment: .Net Framework 3, 3.5, 4.5, SharePoint Office 365, McAfee Endpoint Security v10.0, Atlassian JIRA Project Management Software v6.4.12#64027, Qmetry v 6.7.04.15, Slack tool, Dell Test Machines running Win 10, 8.1, and 7, both 64- and 32-bit, with different hardware, drivers, and configurations, Agile Methodology.

Confidential, Dallas, Texas

Team Lead and QA Tester (Automation/Performance/Web Services)

Responsibilities:

  • Designed all project documents relating to requirements and testing framework - functional, regression, automation, web services and performance testing initiatives.
  • Worked as a part of the team to analyze performance test results, find solutions for issues to find the root causes as an effort to enhance the performance of the application.
  • Developed POC for identifying automation testing tools, well suited for the project needs.
  • Developed requirements, defined environments, and authored testing framework documents.
  • Tested and validated Web Application - Consumer Web, Agent Web, VRU testing, Portal testing and Treasury, Gateway (Pay Leap) and Processor (TSYS) functionality.
  • Tested Consumer Web functionalities relating to Verify Account, Pay My Bill, Verify Payment, and Thank You from the UI, middle layer, and back end, for both UI and mobile responsiveness, for a couple of consumers (CE & TF) with different merchant configurations.
  • Tested Portal functionalities relating to Take a Payment, Find a Payment, User Management for managing roles and responsibilities, Reporting, and serving as Agent Web.
  • Tested the VRU (Voice Response Unit) to validate Dual Tone Multi Frequency (DTMF), Audio Speech Recognition (ASR), and Blended technology features. A Talk to Audio (TTA) recorded file is passed to CXP Voxeo, and in Text to Speech (TTS) the actual prompts are customized and used in the application.
  • Developed Requirements Management Plan, Test Plan and other strategic documents to implement requirements and testing phases - Functional, System, Integration, Regression and Performance validations.

Environment: .Net Framework, C# programming language, Microsoft Visual Studio 2013, Microsoft SQL Server 2014 Management Studio (SSMS), Automation testing tools - Parasoft SOAtest, SmartBear TestComplete, Telerik Test Studio, Soap UI for REST services, Subversion for source control, Redmine project management/issue tracking system, SharePoint Office 365, CenturyLink cloud services.

Confidential

Sr. Systems Analyst and Test Engineer

Responsibilities:

  • Supported Release Planning and managed Requirements via RSD, BRD, RTM, etc.
  • Performed automation and performance testing on the EVSS (eBenefits, VDC) application for the VA; developed test scripts using ObjectMap/objmapEBenefits.rftmap, the HelpUtil function library, and sequential datapool selection order for the eBenefits and VDC applications; created objects in the object repository for both objMap and VDCobjMap - buttons, checkboxes, links, lists, radio buttons, tables, text, etc.
  • Created test scripts for basic, alternative, and exception flows; updated test data as the application changed across releases and executed the scripts with zero errors; updated test config files to toggle between environments (pint, pre-prod, QA). Debugged the scripts for Java errors and performed projEbenefits SVN commit and SVN update to submit and access the most recent versions of the scripts.

Environment: Rational Functional Testing tool, Java, Eclipse, IBM Rational Functional Testing 8.5

Sr. Analyst and Automation Tester (IBM Services)

Confidential, Irving Texas

Responsibilities:

  • Led highly dynamic sprint team using Agile methodology.
  • Created, executed, and maintained over 200 Test Cases for Functional, UI, Regression, Database, and Automation testing for the MS SharePoint project on reporting tools.
  • Built a detailed, reusable, script-free Keyword + Data-Driven Automation Framework for over 25% of Test Cases using QTP + Excel, with no Record and Run.
  • Reusable scripts handle dynamic pages using QTP Descriptive Programming in VBScript.
  • Performed mobile device and network testing for iOS, Android, Windows Phone, and other smartphones.
  • Facilitated and coordinated the Learning Services UAT for new versions of tools - Captivate Simulations 6, Adobe Presenter 8, Adobe Connect, and other stand-alone applications.
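The keyword + data-driven framework described above maps rows of keywords and data (read from Excel) onto reusable actions. A minimal sketch of the dispatch idea in Python; the keywords, steps, and returned strings are hypothetical stand-ins for the QTP/VBScript implementation:

```python
# Each keyword maps to a reusable action; in the real framework these
# would drive the application under test rather than return strings.
def do_open(target): return f"opened {target}"
def do_type(target, value): return f"typed {value} into {target}"
def do_click(target): return f"clicked {target}"

KEYWORDS = {"open": do_open, "type": do_type, "click": do_click}

def run(steps):
    """Execute (keyword, *args) rows, as if read from an Excel data sheet."""
    return [KEYWORDS[kw](*args) for kw, *args in steps]

# Hypothetical test case expressed as data, not script.
steps = [("open", "login_page"), ("type", "user_field", "alice"), ("click", "submit")]
log = run(steps)
```

The point of the design is that new test cases are added as data rows, with no new scripting.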

Environment: Microsoft Project Server 2010, Share Point 2010, QTP 11.0, Windows Server 2008, Content & Custom Database, Microsoft Office 2010, Q Enterprise Manager v.4.8.5, Adobe Captivate tools and Presenter tools.

Sr. Performance Analyst and Engineer

Confidential, Seattle WA

Responsibilities:

  • Tested performance characteristics of the new release of the MOW Booking user interface, which lets the customer make a booking through to receiving the booking confirmation (PNR) on a mobile device.
  • Performed analysis to determine the typical user activities of concurrent users using Confidential system.
  • Simulated user experience and activities, using a Load Runner Vugen scripts.
  • Set up monitors and added measurements for Windows Resources - Web server, API server, MS IIS, Network Delay monitors, MS Active Server Pages, etc.
  • Assigned load generators to the scripts in groups after evaluating available memory, assuming each Vuser uses about 50 MB of memory.
  • Calculated the pacing and number of Vusers using baseline test results and historical data.
  • Conducted baseline, load, and endurance tests using the Controller at various load levels.
  • Used the Analysis tool extensively to analyze test results - adding new graphs, merging graphs, adjusting scales and granularity, and gathering important information from the default Windows Resources measurements: % Total Processor Time, % Processor Time, File Data Operations/sec, Processor Queue Length, Page Faults/sec, % Disk Time, Pool Nonpaged Bytes, Pages/sec, Total Interrupts/sec, Threads, and Private Bytes.
  • Developed new graphs and reporting - Errors per Second, Errors per Second (by Description), Transactions per Second, Windows Resources, MS IIS, and Network Delay Time graphs - in addition to the default Running Vusers, Hits per Second, Throughput, Transaction Summary, and Average Transaction Response Time graphs.
  • Prepared detailed Performance Test Reports outlining the Executive Summary for the test run for management review.
  • Prepared additional documents - Performance Test Analysis step by step guidelines, Power point presentation on Performance Test analysis approach, Load test & Controller setting step by step guidelines.
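The sizing arithmetic above (load generators from the ~50 MB-per-Vuser rule of thumb, pacing from a target transaction rate) can be sketched directly. The input numbers below are hypothetical:

```python
import math

VUSER_MB = 50  # rule of thumb used above: each Vuser needs about 50 MB

def generators_needed(vusers: int, free_mb_per_lg: int) -> int:
    """Load generators required so each LG stays within its free memory."""
    return math.ceil(vusers * VUSER_MB / free_mb_per_lg)

def pacing_seconds(vusers: int, target_tph: float) -> float:
    """Pacing so that `vusers` users together hit the target transactions/hour."""
    return vusers * 3600.0 / target_tph

# Hypothetical sizing: 200 Vusers on LGs with 4 GB free -> 3 generators;
# 100 Vusers targeting 12,000 transactions/hour -> one iteration every 30 s.
lgs = generators_needed(vusers=200, free_mb_per_lg=4000)
pace = pacing_seconds(vusers=100, target_tph=12000)
```

Baseline results then refine both numbers, which is why pacing is recalculated from each run's historical data.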

Environment: Load Runner 11.00, Ajax TruClient (Firefox, IE9.0), HTTP/HTML, First Class for Mailbox, Web Server, Mobile QA DNA server, Windows Server 2003, Mantis for logging defects.

Team Lead, Sr. Performance & OSB/SOA Web Services Tester

Confidential, Dallas Texas

Responsibilities:

  • Involved in the Performance Test Approach - planning, recording, execution & monitoring, analysis, documenting defects, and executing defect retests.
  • Performance testing comprised baseline/benchmark testing, peak load testing, stress/breakpoint testing, endurance testing, capacity planning, and handset UX checkpoints.
  • Involved in analyzing performance test results from device, web services, Perfecto-based, Android Monkey, and Android Debug Bridge (adb) testing to validate that the system can adequately support the projected user base and maintain an acceptable level of system performance without degrading the end-user experience.
  • Used the cloud-based Perfecto tool to perform latency tests on mobile devices - Android, UICC, iPhone, BlackBerry.
  • Used Oracle Enterprise Manager to manage, monitor the SOA Infrastructure which has been deployed to various environments. Also used Vordel Monitoring tool for analyzing test logs during device activation phase.
  • Tested over 45 standalone APIs/services using SoapUI for web services testing, both individually and in combination with various Isis flows for Portals, BPEL, and device-based tests (Android Micro SD, UICC, iPhone).
  • Used the NXP JCShell program to clean the Secure Element after activation was complete; followed the remaining steps - terminated the activated device in the database, cleaned the SE, deleted associated users from the TSM database, and uninstalled the application or cleared its data from the device before attempting a new device activation.
  • Worked on Ztracker for analyzing the mobile based application for tracking purpose.
  • Performed load & performance, benchmark, stress, and endurance tests with up to 10K virtual users over runs of up to 24 hours; analyzed the graphs, compared results across test waves, and submitted the results to management.
  • SharePoint was used as a document repository for managing the documents on the share drive.
  • Extensively used SQL developer for various database updates and validation.

Environment: Web Logic, Oracle 10g, OSB, Bugzilla, IBM Rational Performance Tester 8.2.1 (RPT tool), Web service wif SOA Extension, Soap UI (4.0.1), Vordel, Savvis, SharePoint, Microsoft Outlook 10.0, Mobile Devices, Stop Watch, Oracle Enterprise Manager 11g (Fusion Middleware Control), JCShell for cleaning the SE, SQL developer, Wallet Simulator, TSM Simulator for load testing. Used Oracle Application Server 10g.

Team Lead, Sr. QA and Performance Test Analyst

Confidential, Dallas Texas

Responsibilities:

  • Validated the web services using Service Test 9.5 - Front Plane (Front End) development done using Oracle ADF, AJAX and Java Script and deployed on Web logic, Cross Plane (Mid layer) developed using Web Services, core java, J2EE, Back Plane and Point Solutions (Back end) developed using JMS, Web-services and J2EE
  • Composed the test cases to validate and understand various Business Process Management (BPM) levels, Business Flows, Transactions for Itinerary Management and Distribution Management.
  • Participated in Agile Sprint Planning sessions, analyzing requirements, working closely with Development teams before the development process, planning poker, and estimating tasks for the ongoing 2-week sprints.
  • Worked with JMeter to build scripts for Web Services testing and assisted the team in analyzing the JMeter test results.
  • Worked with the Web Services Manager on the Architecture team to perform a Proof of Concept (POC) to procure an efficient XML gateway product supporting the existing production environment and the high volumes expected for the Agilaire environment.
  • Involved in planning and implementing the performance test strategy to evaluate the three scoped products - developing scripts using VuGen with the WinSocket protocol, executing scripts with 100+ users at different loads (CPU 1k, 100k) for 10- and 30-minute durations, and analyzing the test results using the Analysis tool.
  • Used PuTTY, which acts as a client for the SSH, Telnet, and raw TCP protocols, to capture raw data for each connection - both sent and received responses, to the socket, to the XML gateway product under test. This process was repeated for all three gateway products in scope for the POC analysis.
  • Used the FileZilla client to collect the vmstats generated through PuTTY for further analysis, along with the graphs produced by the LoadRunner Analysis tool.
  • Analyzed the results using vmstats and Analysis test result graphs - Summary, Running Vusers, Throughput, Transaction Summary, Average Response Time, and Total Transactions per Second - for each individual XML gateway product in the Edge Security evaluation.
  • Involved in assisting the team in compilation of the product report for the management review for selection of the gateway product.
  • SharePoint was used as a document repository for managing the documents on the share drive.
  • Involved in the testing, validation of the migration effort in co-ordination of the Application Testing team Prior to handing it over to the Production team.
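The vmstat samples collected over PuTTY have to be parsed before they can sit alongside the Analysis graphs in a gateway comparison. A minimal Python sketch, assuming the standard `vmstat` column layout (header names on the second line, including the idle-CPU column `id`); the sample output is hypothetical:

```python
def parse_vmstat(text: str):
    """Return the idle-CPU ('id') column value for each sample line."""
    lines = text.strip().splitlines()
    header = lines[1].split()          # second line holds the column names
    idle_col = header.index("id")
    return [int(line.split()[idle_col]) for line in lines[2:]]

# Hypothetical vmstat capture with two samples.
sample = """procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 1  0      0 812340   1032  40320    0    0     5     3   55   80 10  5 83  2  0
 4  0      0 790112   1032  40330    0    0     9     6  210  340 70 18 10  2  0
"""
idle = parse_vmstat(sample)
busy = [100 - i for i in idle]   # % CPU busy per sample, for the comparison report
```

The per-sample busy percentages can then be charted next to the Throughput and Response Time graphs for each gateway product.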

Environment: Java, J2EE JSP, JSF, Jax-ws, Oracle Application Development Framework (ADF), GIT, Virtual Machine, Red Hat Enterprise Linux Release 6.0, Eclipse, JDeveloper, Web Services, Web logic 10g, JBoss 5.0.2, Tomcat, XML, XSL, AJAX, JMS, JMockit, Maven, Enterprise Linux 5.0, Service Test 9.5, QTP 11.0, Load Runner 11.0, Jmeter, Soap UI, Agile Methodology Scrum, Rally Software Development Lifecycle Management System.

Confidential, Olympia WA

Sr. QA Analyst

Responsibilities:

  • Developed Test Plan, Test Approach, Test Scope and Identify Risks, Test Case writing framework as per the standards approved by the Pre-Panel and Board authorities.
  • Created Test Cases for Functional, Integration, System, Regression and User Acceptance Testing
  • Created naming standards and documents for both Test Case Development and Defect tracking process. Created Functional Coverage Metrics for test case development
  • Performed smoke, functional, UI, and regression testing for every build deployed - new code, patches, and defect fixes.
  • Extensively used SQL Server 2000 for executing queries for database validation.
  • Core applications were thoroughly validated by writing Test Cases for UI, database, and application-level validations. Good understanding of SOAP/XML, enabling reading the code while analyzing defects.
  • SharePoint was used as a document repository for managing the documents on the share drive. Requirements were stored in the SharePoint and defects were logged and updated.
  • Functionalities tested are - Work Order, Protocol, Study, Envelope, Data Entry, Pre-Panel, Panel, Post Panel, Signature, Transmitting, etc.
  • Provided mentorship to the entire QA team in developing Test Cases, writing Test Plan, executing the Test Cases and execution of SQL scripts for database validation
  • Developed an extensive QTP framework to automate the Regression testing to validate the IRIS application for every build and upgrade. QTP framework was developed for Core and Standalone functionality using QTP functionalities – Parameterization, variables, wait functions, check points etc.
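The database validation described above boils down to executing a SQL query and comparing its result against what the application shows. A minimal sketch using Python's built-in sqlite3 as a stand-in for SQL Server 2000; the `work_order` table and its rows are hypothetical:

```python
import sqlite3

# In-memory stand-in for the application database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE work_order (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO work_order (status) VALUES (?)",
                 [("Pre-Panel",), ("Panel",), ("Post Panel",)])

def count_by_status(status: str) -> int:
    """Back-end check: row count that the UI total should match."""
    (n,) = conn.execute(
        "SELECT COUNT(*) FROM work_order WHERE status = ?", (status,)
    ).fetchone()
    return n

assert count_by_status("Panel") == 1  # compare against the count shown in the UI
```

Parameterized queries like this one are also what the mentored SQL validation scripts would run against each build.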

Environment: Visual Studio 2003, .Net Framework 2.0, SQL Server 2000, Visual SourceSafe 6.0, Institutional Review Information System (IRIS application Version 1.25), SharePoint 2007, Windows XP, QTP 10.0, Agile Methodology Scrum.
