SUMMARY:
- 5+ years of experience in the software development life cycle (SDLC), focused on QA and automated Performance Testing covering all test types and levels of the test methodology.
- Strong experience in QA & Performance Testing of web-based applications, SOA applications, Trizetto Facets, TIBCO applications, and ETL batch jobs.
- Strong experience in gathering and analyzing functional & non-functional requirements.
- Strong experience in the Performance Testing tools HP LoadRunner and HP Performance Center.
- Strong experience in developing Performance Test Plans/Test strategies.
- Strong experience in developing Performance Test Scripts and executing Performance Tests.
- Strong experience in System Testing, Performance Testing, Benchmark Testing, Stress Testing, and Endurance Testing.
- Strong in LoadRunner scripting concepts: correlation, parameterization, checkpoints, and script customization (a brief sketch appears at the end of this summary).
- Strong experience in the Web/HTTP, Web Services, Oracle 2-Tier, Citrix, and Ajax TruClient protocols.
- Experience in using monitoring tools Compuware DynaTrace, Wily IntroScope, and TeamQuest.
- Monitored system resources using the vmstat and iostat commands, collecting metrics such as CPU utilization, memory usage, disk I/O rates, paging space, and paging rates.
- Experience with the automation tools HP QTP/UFT, Selenium, and Rational Functional Tester.
- Experience with the web services testing tools SOAP UI, Parasoft SOAtest, and JMeter.
- Experience in using UNIX commands and writing UNIX shell scripts.
- Experience in working with XML, SOA and Messaging Queues.
- Experience in writing SQL queries for test data preparation and test result validation.
- Involved in Performance Tuning of Oracle, DB2, and SQL Server databases.
- Experience in defect management tools HP Quality Center/ALM.
- Experience in working with Waterfall, V-Model, and Agile SDLC methodologies.
- Excellent communication, coordination, and interpersonal skills; a self-starter who performs well with little or no supervision.
- Ability to learn new technologies and challenging concepts quickly and implement them.
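To illustrate the LoadRunner scripting concepts listed above (correlation, parameterization, and checkpoints), here is a minimal VuGen-style sketch in C. The URL, boundary strings, and parameter names are placeholders chosen for illustration only, not details of any project described below.

    Action()
    {
        // Checkpoint: verify the expected text appears in the next response
        web_reg_find("Text=Welcome", LAST);

        // Correlation: capture a dynamically generated session id from the
        // next response (left/right boundaries are illustrative placeholders)
        web_reg_save_param("SessionId", "LB=sessionId=", "RB=&", "Ord=1", LAST);

        web_url("Login_Page",
                "URL=http://example-app/login",
                "Resource=0",
                "Mode=HTML",
                LAST);

        // Parameterization: {UserName} is fed from a data file so each Vuser
        // iteration runs with a different data set
        lr_output_message("Logged in as %s with session %s",
                          lr_eval_string("{UserName}"),
                          lr_eval_string("{SessionId}"));

        return 0;
    }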
TECHNICAL SKILLS:
QA/Testing Tools: LoadRunner, Performance Center, Parasoft SOAtest, Selenium IDE, SOAP UI, JMeter, Postman
Monitoring Tools: Compuware DynaTrace, Wily IntroScope, TeamQuest, Splunk, BWPM
Defect Management Tools: HP Quality Center/ HP ALM
Databases: Oracle 10g/9i/8i, DB2, MS SQL Server 2008
Web Servers/Application Servers: Apache Tomcat, IBM WebSphere
EAI: TIBCO BC, TIBCO BW
Operating Systems: Windows 98/2000/XP, Windows Server, IBM AIX, Red Hat Linux
Languages: C, SQL, concepts of Java
PROFESSIONAL EXPERIENCE:
Confidential, Owings Mills, MD
Performance Test Engineer
Responsibilities:
- Gathered the Performance requirements and identified the Performance Test scenarios.
- Analyzed Business Requirements and Use Cases, and developed the Test Plan and Test Cases for complete end-to-end testing.
- Developing Performance Test Plan, Performance Test Cases, and Performance Test strategies.
- Interacting with project team to make sure the Performance Test environment is ready for Performance Tests.
- Developed workload model for various applications based on the application usage.
- Identified various Performance monitoring points based on the application architecture.
- Developing Performance Test scripts using LoadRunner's VuGen with the Web/HTTP, Web Services, and Ajax TruClient protocols.
- Executed Web Services Performance Tests using LoadRunner, JMeter, and Parasoft SOAtest.
- Used Ajax TruClient to capture the response times of Ajax applications.
- Gathering the test data required to execute Performance Tests by interacting with Developers and Business Analysts.
- Enhancing the Performance Test scripts using correlation and parameterization.
- Implementing the content checks and error handling in Performance Test scripts.
- Used if conditions, for loops, and C functions to customize the scripts wherever required (a brief sketch follows this project's Environment line).
- Inserting rendezvous points so that Vusers perform a task simultaneously.
- Confirming that the Performance Test scripts are working by checking the database for the expected updates or inserts.
- Designed the scenarios for Benchmark Testing, Stress Testing and Endurance testing using Load Runner Controller.
- Executed Performance Tests for different web technologies such as J2EE, Ajax, and PEGA.
- Monitoring the Performance metrics such as Response Times, Hits Per Second, Throughput during Performance Tests.
- Used Compuware DynaTrace to identify which layer contributed most to the response times.
- Monitoring Heap usage, Web Container threads, JSP/Portlet response times through Wily IntroScope.
- Developing the Performance Test reports and reviewing the reports with project team.
- Identified various bottlenecks such as slow-performing SQL queries and load-balancing issues.
- Analyzed the server logs and heap dumps to find hung threads and memory leaks.
- Developed UNIX shell scripts to extract the logs after Performance Tests were executed.
- Interacted with developers, DBAs, System Admins to resolve performance issues.
- Used Ajax Truclient protocol for browser Performance Testing of web 2.0 applications.
- Monitored Queue depth while doing Performance Testing of TIBCO applications.
- Executed Performance Tests of batch jobs, monitored the duration of batch jobs and compared against the SLAs.
- Performed Performance Tests of ETL (Informatica) batch jobs and collected Performance metrics.
- Executed Performance Tests of Trizetto Facets health care claims processing application.
- Used Firebug and HTTP Watch to monitor the requests of web applications.
- Used Selenium IDE to automate the scripts for data creation.
Environment: LoadRunner, JMeter, SOAtest, J2EE, TIBCO, BWPM, IBM Portals, WebSphere, Wily IntroScope, SiteScope, Splunk, Selenium, DynaTrace, PEGA, Oracle, SQL, HP ALM & QTP.
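As referenced in the responsibilities above, the following is a hedged sketch of the script customization pattern described there (a content check with error handling, an if condition, a for loop, and a rendezvous point). The transaction name, URLs, checked text, and page count are assumptions made for illustration, not details of the Confidential engagement.

    Action()
    {
        int i, found;
        char page_no[16];

        // Rendezvous point: Vusers wait here and submit simultaneously
        lr_rendezvous("submit_step");

        // Content check with SaveCount so the result can be tested in C
        web_reg_find("Text=Request accepted", "SaveCount=ok_count", LAST);

        lr_start_transaction("Submit_Request");
        web_url("Submit_Request",
                "URL=http://example-app/requests/submit?id={RequestId}",
                "Resource=0",
                "Mode=HTML",
                LAST);
        found = atoi(lr_eval_string("{ok_count}"));

        // if condition + error handling: fail the transaction when the
        // expected content is missing
        if (found > 0) {
            lr_end_transaction("Submit_Request", LR_PASS);
        } else {
            lr_error_message("Request %s was not accepted",
                             lr_eval_string("{RequestId}"));
            lr_end_transaction("Submit_Request", LR_FAIL);
        }

        // for loop: walk through the first three result pages
        for (i = 1; i <= 3; i++) {
            sprintf(page_no, "%d", i);
            lr_save_string(page_no, "PageNo");
            web_url("Results_Page",
                    "URL=http://example-app/requests/search?page={PageNo}",
                    "Resource=0",
                    "Mode=HTML",
                    LAST);
        }

        return 0;
    }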
Confidential
Systems Engineer
Responsibilities:
- Reviewing the technical design documents of the application to understand the architecture.
- Gathering the Performance Requirements and identifying the Performance Test Scenarios.
- Developed Performance Test Scripts using HP LoadRunner (Virtual User Generator).
- Used Web/HTTP protocol for developing the Performance Test Scripts.
- Correlating the Performance Test scripts by capturing the dynamically changing data.
- Parameterizing the Performance Test scripts to use various sets of data.
- Inserting text checkpoints to make sure the required pages are downloaded (a brief sketch follows this project's Environment line).
- Enhancing the Performance Test scripts using C functions wherever required.
- Designing the scenarios using HP LoadRunner (Controller) for Load Tests and Stress Tests.
- Executing the Load Tests and Stress Tests using Controller.
- Monitoring the average response times, throughput, hits per second graphs using Controller during Performance Tests.
- Analyzing the Performance Test results using LoadRunner Analysis by merging various graphs.
- Interacting with technical team members to find and resolve the bottlenecks.
- Monitoring Server CPU and Memory usage using UNIX commands.
- Preparing Performance Test Report once the performance tests are completed.
- Reviewing Performance Test Reports with the project team members.
- Reviewed the business requirements and identified various test cases.
- Developed System Test Cases as per the requirements.
- Performed Smoke testing, System Testing, Regression Testing.
- Identified defects and tracked them using HP Quality Center.
- Identified the various test cases which need to be automated.
- Executed automation test cases using HP Quick Test Professional.
Environment: LoadRunner, Oracle 10g, J2EE, MS SQL Server 2000, WebLogic, IBM AIX 4.3, MQ Series 4.0, Quality Center 9.0, Quick Test Professional 9.1.
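A minimal sketch, under assumed page names and URLs, of the text checkpoints and C-function enhancements mentioned in the responsibilities above: the checkpoint result drives the transaction status, so a page that fails to download is reported as a failed transaction in the Controller and Analysis graphs.

    Action()
    {
        // Text checkpoint: confirm the confirmation page actually downloads
        web_reg_find("Text=Order Confirmation", "SaveCount=conf_count", LAST);

        lr_start_transaction("Place_Order");
        web_url("Place_Order",
                "URL=http://example-app/order/confirm",
                "Resource=0",
                "Mode=HTML",
                LAST);

        // C enhancement: pass or fail the transaction based on the checkpoint
        // result rather than on the HTTP status alone
        if (atoi(lr_eval_string("{conf_count}")) == 0) {
            lr_error_message("Confirmation page text not found");
            lr_end_transaction("Place_Order", LR_FAIL);
        } else {
            lr_end_transaction("Place_Order", LR_PASS);
        }

        // Think time between iterations (value is illustrative)
        lr_think_time(5);

        return 0;
    }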