
L&p Test Specialist Resume

MI

SUMMARY:

  • 7 years of experience in Performance Testing, Quality Assurance, and Automation Testing across domains such as Health Care, Banking, and E-commerce.
  • Worked with SoapUI testing involving WSDL files, request/response XMLs, and HTTP, SOAP, and REST web services.
  • Hands-on experience with automated tools such as LoadRunner, Performance Center, QTP, TestDirector, and Quality Center.
  • Used monitoring tools such as HP SiteScope, HP Diagnostics, and Dynatrace to track test performance and identify bottlenecks.
  • Hands-on experience in all phases of the project development lifecycle and the Software Development Life Cycle (SDLC), from inception through design, development, and implementation.
  • Strong theoretical and practical experience with various Agile approaches and implementation in a large organization
  • Worked on Agile projects ensuring close, daily cooperation between business people and developers.
  • Experience in performance testing of Java and .NET applications on IIS servers.
  • Experience in infrastructure testing for enterprise wide applications.
  • Extensive experience with baseline, benchmark, load, stress, endurance, and capacity testing for performance.
  • Proficiency in testing the applications compatibility on UNIX and Windows platforms
  • System testing skills include Black Box, Smoke, Regression, Integration, and User Acceptance Testing.
  • Expert in writing and executing test cases, working with various databases, and using bug-tracking tools to generate reports.
  • Worked closely with the developers and business customers to understand the business requirements and overall strategies
  • Strong process and documentation skills for performance testing/engineering.
  • Experience in Performance testing of Web applications and Client/Server by using Load Runner
  • Expertise in manual and automated correlation to parameterize dynamically changing parameter values.
  • Tested and coordinated mobile applications in both native and web environments; validated mobile application functionality on physical smartphone hardware, on virtual device emulators such as Device Anywhere and Perfecto Mobile, and on simulators such as the Android SDK and iPhone SDK.
  • Monitored system resources such as CPU usage and memory utilization via vmstat and iostat.
  • Collected JVM heap and garbage-collection frequency in WebSphere and WebLogic during tests.
  • Strong knowledge in database/SQL (DB2, Oracle, SQL Server) queries.
  • Expertise in test documentation, manual and automation testing, and execution on Client/Server, integrated intranet, UNIX, Linux, Mainframe, and Internet applications.
  • Strong experience in automated web application testing using Selenium WebDriver with Java and C# language bindings and the TestNG framework.
  • Wrote test cases using element locators, WebDriver methods, Java programming features, and TestNG annotations.
  • Excellent knowledge of executing Selenium test cases and reporting defects.
  • Experience in Insurance, Financial, Trading, Retail and Pharmaceutical industries
  • Have an ability to handle multiple projects with competing priorities
  • Individual with good analytical, interpersonal, and problem-solving skills
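
The resource-monitoring skill above (tracking CPU usage via vmstat) can be sketched in Python; a minimal illustration, assuming the standard Linux vmstat column layout (field positions differ on AIX/HP-UX):

```python
# Minimal sketch (illustrative): deriving CPU utilization from a captured
# vmstat line. Assumes standard Linux vmstat output, where the 'id'
# (idle %) column is at index 14; adjust per platform.

def cpu_busy_percent(vmstat_line: str) -> int:
    """Return CPU busy % (100 - idle) from one vmstat data line."""
    fields = vmstat_line.split()
    idle = int(fields[14])  # 'id' column on Linux vmstat
    return 100 - idle

sample = "1  0      0 804356 129072 508764    0    0     5    11   33   44  7  3 89  1  0"
print(cpu_busy_percent(sample))  # 100 - 89 = 11
```

In practice the line would come from periodically sampling `vmstat` output during a test run rather than a hard-coded string.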

TECHNICAL SKILLS:

Operating Systems: AIX, HP-UX, Solaris, UNIX, Windows XP, 2003, 2010, Vista, Windows NT, and Linux.

Languages: C, C++, Java/J2EE, VBScript, Perl, Python, XML, Shell Scripting, Ruby.

Databases: Oracle, DB2, SQL Server, MS Access, MySQL

Web Related: DHTML, XML, HTML

Testing Tools: LoadRunner, WinRunner, QuickTest Pro, TOAD, Selenium

Web / Application Servers: Apache, Tomcat, WebLogic, WebSphere, IIS

Methodologies: RUP, Agile, Performance Testing

Project Management /Analysis: MS Project, MS Visio, Clear Case, Clear Quest, Rational Requisite Pro and UML

Other: Performance Center, Wily Introscope, SiteScope, TeamQuest, Quality Center, Diagnostics

PROFESSIONAL EXPERIENCE:

Confidential, MI

L&P Test Specialist

Environment: LoadRunner, ALM, Rally, SoapUI, JMeter, C#, MS Office, Windows and Linux, LAN, WAN, Java, REST API, SOAP web services, SQL, DB, Teradata, Hadoop, DataStage (ETL), Agile, Jenkins, BDD.

Responsibilities:

  • Created Vuser scripts containing tasks performed by each Vuser, tasks performed by Vusers as a whole, and tasks measured as transactions.
  • Developed Vuser scripts in Web (HTTP/HTML), REST, and SOAP protocols for various web services.
  • Designed tests for benchmark, load, and stress testing.
  • Parameterized large, complex test data to accurately depict production trends.
  • Validated the scripts to make sure they executed correctly and met the scenario description.
  • Created test scenarios in the Controller based on business requirements; introduced random pacing between iterations to achieve the desired transactions per hour.
  • Analyzed load test results using the LoadRunner Analysis tool; examined SQL/database server connections, sessions, and WebLogic log files.
  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
  • Created a load test approach document, summary report, and consolidated test report for each release, and reviewed them with business teams.
  • Introduced JMeter as a performance testing tool so that developers could create their own test scripts and test their applications and services.
  • Worked on integrating performance testing tool with Jenkins to create a Continuous Integration and Continuous Deployment pipeline.
  • Interact closely with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Monitored system resources such as CPU usage and memory utilization using UNIX commands such as top, vmstat, svmon, and netstat.
  • Resolve performance tuning related issues and queries.
  • Analyzed JVM heap and GC logs in WebSphere during test execution.
  • Monitored PurePaths for the 3-tier architecture using Dynatrace.
  • Conducted result analysis and communicated technical issues with developers and architects
  • Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution. Worked in an agile environment.
  • Created comprehensive test results report.
  • Managed Performance Center servers: monitored server status, edited server information, and checked server performance.
  • Managed timeslots, viewed user reservations, and monitored availability of time and resources.
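
The pacing approach described above (random pacing between iterations to reach a target transactions-per-hour rate) can be sketched in Python; the function names and the ±20% jitter are illustrative assumptions, mirroring LoadRunner's random-interval pacing setting:

```python
import random

def pacing_seconds(target_tph: int, vusers: int) -> float:
    """Fixed pacing (seconds between iteration starts) so that `vusers`
    concurrent users together generate `target_tph` transactions per hour."""
    return 3600.0 * vusers / target_tph

def random_pacing(base: float, jitter: float = 0.2) -> float:
    """Random pacing around the base (here +/-20%), analogous to
    LoadRunner's random-interval pacing option."""
    return random.uniform(base * (1 - jitter), base * (1 + jitter))

base = pacing_seconds(target_tph=3600, vusers=10)
print(base)  # 10 Vusers each need 360 iterations/hour -> one start every 10.0 s
```

Randomizing around the computed base spreads iteration starts so the load does not arrive in synchronized waves.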

Confidential, Saint Louis, MO

Senior QA Analyst

Environment: Selenium, Java, SoapUI, LoadRunner, NeoLoad, C#, Performance Center, Dynatrace, IIS, WebLogic, Oracle, QC, SOAtest, MS Office, MS Visio, Windows and Linux, LAN, WAN, .NET.

Responsibilities:

  • Involved in gathering business requirements, studying the application, and collecting information from developers and the business.
  • Designed test plans covering scope, test strategies, test scenarios, and the types of tests to be executed.
  • Review and interpret business requirement documents, and apply the functionality to testing models
  • Involved from the outset in the architecture design of the automation framework using Selenium WebDriver.
  • Used Selenium Grid, JUnit test scripts to run automated test cases in parallel on 6 environments.
  • Creation and maintenance of Automation Test scripts.
  • Worked on SoapUI, RESTful API, HTML, HTTP, and external data source testing; analyzed, recorded, and modified client-server traffic using the HTTP monitor, then created and ran functional and load tests on web services prior to implementation.
  • Configured Maven for JAVA automation projects and developed Maven project object model (POM).
  • Developed Vuser scripts using LoadRunner in Web, Web Services, Ajax TruClient, ODBC, and Click & Script protocols.
  • Designed tests for benchmark, stress, and endurance testing.
  • Parameterized large and complex test data to accurately depict production trends.
  • Validated the scripts to make sure they executed correctly and met the scenario description.
  • Involved in SQL query tuning and provided tuning recommendations for ERP jobs and time/CPU-consuming queries.
  • Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner Test Center.
  • Analyzed results using the LoadRunner Analysis tool and examined Oracle database connections, sessions, and WebLogic log files.
  • Used Wily Introscope to monitor Java systems as part of regular activity, covering CPU utilization, file system usage, OS memory utilization, Java thread monitoring, Java memory utilization, and HTTP session monitoring.
  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations
  • Maintained test matrix and bug database and generated monthly reports.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Used LoadRunner for testing and monitoring; actively participated in enhancement meetings focused on making the website more intuitive and engaging.
  • Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards.
  • Provide support to the development team in identifying real world use cases and appropriate workflows
  • Worked with business & Infrastructure teams and developed business & Infrastructure workload models for several applications.
  • Measured performance metrics such as response times and throughput for web system optimization; built servers based on the system performance cycle and metrics.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Assist in production of testing and capacity certification reports.
  • Configured offline & online diagnostics (J2EE/.NET Diagnostics) through Performance Center.
  • Used SiteScope to monitor server metrics and performed in-depth analysis to isolate points of failure in the application.
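
The workload-modeling work described above can be illustrated with a small Python sketch that splits a total hourly volume across business transactions by percentage mix; the transaction names and numbers are hypothetical:

```python
def workload_model(total_tph: int, mix: dict) -> dict:
    """Split a total hourly transaction volume across business
    transactions according to a percentage mix."""
    assert abs(sum(mix.values()) - 100) < 1e-9, "mix must sum to 100%"
    return {name: total_tph * pct / 100 for name, pct in mix.items()}

# Hypothetical mix for illustration only.
model = workload_model(10000, {"login": 20, "search": 50, "checkout": 30})
print(model)  # {'login': 2000.0, 'search': 5000.0, 'checkout': 3000.0}
```

The resulting per-transaction volumes then drive scenario design (Vuser counts and pacing per script).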

Confidential, Minneapolis

Performance Engineer

Environment: LoadRunner, Performance Center, SiteScope, SAP, Oracle, PeopleSoft, MS SQL Server, WebLogic, Load Balancer, Dynatrace, JMeter, Java, Quality Center, J2EE Diagnostic Tool, Web, Windows 2000/XP, HP-UX, AIX

Responsibilities:

  • Defining the performance goals and objectives based on the client requirements and inputs
  • Responsible for performance, stress & regression testing of enterprise-wide PeopleSoft ERP systems and web applications.
  • Worked extensively with Web, PeopleSoft, Oracle 2-Tier, and Web Services protocols in LoadRunner.
  • Ensure the compatibility of all application platform components, configurations and their upgrade levels in production and make necessary changes to the lab environment to match production
  • Responsible for developing and executing performance and volume tests
  • Develop test scenarios to properly load / stress the system in a lab environment and monitor / debug performance & stability problems.
  • Partner with the vendor organization to analyze system components and performance to identify needed changes in the application design
  • Configured and set up monitors in SiteScope.
  • Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards
  • Used the Virtual User Generator to create VuGen scripts for the Web protocol; ensured that quality issues were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center.
  • Experienced in mobile application testing on iPhone, Android, BlackBerry, and feature phones using simulators such as Device Anywhere and Perfecto, as well as real physical devices.
  • Experience in functional testing of cloud-based web applications to improve service efficiency.
  • Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
  • Implemented and maintained an effective performance test environment.
  • Identify and eliminate performance bottlenecks during the development lifecycle.
  • Accurately produce regular project status reports to senior management to ensure on-time project launch.
  • Conducted duration, stress, and baseline tests.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Used Performance Center to execute tests, and maintain scripts.
  • Identified long-running queries and optimized them to improve performance.
  • Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in production.
  • Created Test Schedules.
  • Worked closely with clients in an Agile environment.
  • Interfaced with developers, project managers, and management in the development, execution, and reporting of performance test results.
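
Result analysis of the kind described above (e.g. a 90th-percentile response time and throughput summary) can be sketched in Python; this uses the nearest-rank percentile method, one common convention, with hypothetical sample data:

```python
import math

def percentile(samples, p):
    """p-th percentile by the nearest-rank method, the kind of figure
    (e.g. 90th-percentile response time) reported in test summaries."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical response times (seconds) from a 60 s measurement window.
response_times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.2, 3.1, 1.0]
print(percentile(response_times, 90))    # -> 2.4
throughput = len(response_times) / 60.0  # transactions per second
```

Note that tools may use slightly different percentile interpolation rules, so figures can differ at small sample sizes.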

Confidential, Kansas City, MO

Performance Tester

Environment: LoadRunner, QTP, Quality Center, Informatica, JMeter, Java, XML, Oracle, Business Objects, SQL Server 2000/2005, Windows XP, Telnet, WebSphere, Lotus Notes, and UNIX

Responsibilities:

  • Experience in requirement gathering and analysis, data analysis, data mapping, functional design, quality assurance/testing, and documentation for performance goals.
  • Developed Test Plans, Test Scenarios, Test Cases, Traceability Matrix, Test Summary Reports and Test Execution Metrics.
  • Developed Vuser Scripts in web, web services, and Citrix Protocols.
  • Tested web services applications using SOAP Client as well as by using WSDL Files.
  • Developed and Executed the Test cases & scripts for Functional, System, Regression, Integration, Performance and UAT.
  • Involved in writing Complex queries using SQL for Data Integrity checks.
  • Participation in requirement / Use Case analysis, risk analysis and configuration management.
  • Validated data through the various stages of data movement, from staging to the data store to data warehouse tables.
  • Created Test Cases using the SDLC procedures and reviewed them with the Test lead.
  • Executed all test cases in the test environment and maintained them, documenting test queries and results for future reference.
  • Testing included unit, system, regression, and Business Objects reports testing.
  • Provided flexible, high-quality support for wider testing strategies on key regular deliverables and ad hoc reporting issues.
  • Created scripts for web applications using web/http protocol and client server applications using web services protocol.
  • Validated client application interaction with the middleware application via HTTP/SOAP.
  • Performed HP Business Process Testing using SoapUI and Quality Center for the .COM web application.
  • Developed Web Service Vuser scripts for web service calls using SoapUI.
  • Involved in performance testing and governance validation.
  • Retested the modifications, once bugs are fixed after reinstalling the application.
  • Developed automated scripts using WinRunner for functional and regression testing.
  • Reported the bugs through email notifications to developers using Quality Center.
  • Generated Problem Reports for the defects found during execution of the Test Cases and reviewed them with the Developers. Worked with Developers to identify and resolve problems.
  • Lead and Schedule QA project status meetings and publish meeting minutes.
  • Developed presentations and shared testing implementation learnings with other testing resources for cross-functional training.
  • Involved in performance testing of web applications with LoadRunner.
  • Developed LoadRunner scripts using the Vuser Generator.
  • Created scenarios for performance testing using the LoadRunner Controller and analyzed performance via server monitoring.
  • Generated Reports and Graphs using Quality Center.
  • Validated XML files and verified data accuracy.
  • Well versed in UNIX shell scripting.
  • Assisted team members in knowledge transfer of the application and Quality Center.
  • Involved in defect review meetings with the Business Core Team and developed use-case analyses.
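
The staging-to-warehouse validation described above can be sketched with a small Python example using sqlite3; the table and column names are hypothetical, standing in for the actual staging and warehouse schemas:

```python
import sqlite3

# Row-count and sum-style data-integrity check between a staging table
# and a warehouse table (table/column names are hypothetical).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def table_profile(table: str):
    # Table name comes from a fixed internal list, not user input.
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

src, tgt = table_profile("stg_orders"), table_profile("dw_orders")
print(src == tgt)  # True when row counts and amount totals match
```

In a real ETL validation the same count/sum profiles would be run against each stage (staging, data store, warehouse) and compared.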
