Software Test Engineer Resume
Washington, DC
SUMMARY:
- Over 10 years of experience in Information Technology spanning project management, quality assurance, quality control, load testing, performance testing, stress testing, scalability testing, test environment/application tuning, and capacity management.
- Experienced in implementing QA processes, introducing new tools, leading multiple projects with multi-site, multi-cultural teams, and developing test strategies, metrics, and measurements.
- Experienced working with customers, users, project teams, and senior management, and in managing the onsite-offshore model.
- Expert knowledge of different types of testing: integration, functional, system/end-to-end, sanity, regression, performance/volume, and fault-tolerance testing.
- Successfully completed Unified Billing Platform (UBP) R1-R8 engagements for the Sprint Nextel merger.
- Actively participated in the Software Development Life Cycle (SDLC), IA walkthroughs, CR meetings, and the QA life cycle.
- Worked directly with the ITPM, business owners, the development team, performance test engineers, and management for every minor/major release.
- Coordinated with and supported the testing and development teams on test execution and application troubleshooting.
- Developed documentation including business requirements analyses, software requirements specifications, test plans, test strategies, and capacity plans.
- HP Performance Center: extensive experience with the performance testing process, creating LoadRunner test scripts, capacity planning, performance test load modeling, and test execution.
- Extensive experience with bug-tracking tools such as TestDirector/Quality Center, ChangeSynergy, ClearQuest, Rational RTC, and Rational RQM.
- A self-starter with a clear, goal-oriented approach to problem solving, strong organizational skills, and excellent communication skills.
- Experience with HP Performance Center, QTP/UFT, Ruby Watir, and Selenium for automation frameworks and UAT testing.
TECHNICAL SKILLS:
Languages: C, C#, C++, SQL, and Java.
Test Tools: QTP, WinRunner, LoadRunner, Performance Center, Selenium, SoapUI, JMeter
Bug Reporting: TestDirector, Quality Center, JIRA
Operating Systems: Windows 95/98/NT/2000/XP/ME/Vista, Windows Server 2003, and Linux.
Web Languages: HTML, DHTML, ColdFusion, ASP, and XML
Front Ends: Visual Basic.
RDBMS: Oracle, SQL Server, MS Access, Sybase.
Scripting: TSL, VBScript, JavaScript, SQL, PL/SQL, C#, C++
Web Servers: WebSphere MQ, WebLogic
App Servers: IIS, WebLogic
Network: FTP, HTTP, TCP/IP, Telnet
Software: PVCS, MS Word, MS Excel, MS PowerPoint, MS FrontPage, MS Outlook, Adobe Photoshop, Dreamweaver, Norton, Symantec, VSS, and AutoCAD.
PROFESSIONAL EXPERIENCE:
Confidential, Washington DC
Software Test Engineer
Responsibilities:
- Strong exposure to performance testing using LoadRunner and JMeter.
- Executed automated test scripts in JMeter based on business/functional specifications.
- Converted LoadRunner scripts to JMeter through a proxy setup in LoadRunner.
- Used regular expressions to handle dynamic values in JMeter (see the correlation sketch at the end of this section).
- Performed monthly Microsoft security patch (client and server) testing in the Alpha environment.
- Checked data flow through the front end and back end, and extensively used SQL queries to extract data from the database.
- Excellent knowledge of Agile/Scrum development environments, multiple web browsers, and automated testing suites.
- Prepared sample data sheets for the different test cycles of the test plans.
- Gathered and analyzed performance requirements.
- Prepared business flow documents and test scripts for the identified business transactions.
- Executed test scripts with various load patterns and monitored the systems during test runs.
- Identified key performance bottlenecks in the web servers, app servers, and database servers using HP Diagnostics, and performed end-to-end root cause analysis to pinpoint their causes.
- Prepared test reports, along with observations, to be shared with the client.
- Performed business requirement analysis based on client requirements and inputs.
- Designed and developed the performance test plan and workload model according to the client's requirements.
- Coordinated with the application team to understand the business process and identified test data for the load testing process.
- Coordinated with the application, server, and DB teams during and after load tests.
- Developed the analysis results document and communicated performance bottlenecks to all concerned teams, management, and the project manager.
Environment: HP Performance Center 12, Dynatrace, JMeter, IBM RQM, ClearCase, ClearQuest, Mainframe, Web Services, J2EE, Java EJB, web server, BEA WebLogic 8.1 App Server
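Below is a minimal Java sketch of the correlation idea referenced above: capturing a dynamic value with a regular expression and reusing it in a later request, as JMeter's Regular Expression Extractor (or LoadRunner correlation) does. The response body, token name, and follow-up URL are hypothetical placeholders, not taken from the actual application under test.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustration of correlating a dynamic value: capture it from one response
// and substitute it into the next request. All values are placeholders.
public class CorrelationSketch {
    public static void main(String[] args) {
        String loginResponse =
            "<input type=\"hidden\" name=\"sessionId\" value=\"A1B2C3D4\"/>";

        // Same style of expression one would configure in the extractor:
        // name="sessionId" value="(.+?)"
        Pattern p = Pattern.compile("name=\"sessionId\" value=\"(.+?)\"");
        Matcher m = p.matcher(loginResponse);

        if (m.find()) {
            String sessionId = m.group(1);                      // captured dynamic value
            String nextRequest = "/account/summary?sessionId=" + sessionId;
            System.out.println("Correlated request: " + nextRequest);
        } else {
            System.out.println("Token not found - the correlated request would fail.");
        }
    }
}
```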
Confidential, Baltimore, MD
Software Test Engineer
Responsibilities:
- Interacted directly with developers, project managers, and stakeholders; elaborated performance test executions and reported performance test results at both a low and a high level.
- Determined the capacity management program goals, requirements, and workflow, and ensured their successful execution.
- Performed server health checks and patch validation on Dynatrace servers.
- Coordinated work with the distribution analyst to ensure appropriate allocation of resources to meet capacity demands per the set goals and plans.
- Researched past application response-time metrics and business transactions from the production support team to develop realistic test scenarios and load models.
- Reviewed test plans and test cases and communicated persistently to obtain timely approval/sign-off from application owners and stakeholders.
- Designed the load model based on current volume and the projected percentage increase in volume derived from production metrics.
- Developed the test plan strategy and was involved in the test client and test environment builds.
- Reviewed and enhanced LoadRunner scripts in VuGen when necessary to accelerate delayed projects from teams in multiple locations.
- Developed and debugged custom LoadRunner VuGen scripts for a distributed environment.
- Developed a performance metrics worksheet to test and track optimal parameter and configuration settings for Windows IIS servers.
- Designed and executed test scenarios using Performance Center.
- Set up SiteScope monitors to collect application performance metrics during test execution in Performance Center.
- Provided guidelines to test engineers on configuring Performance Center test scenarios and Vusers according to the load model, so that load distributed across various geographies is reflected.
- Configured and used SiteScope performance monitors to monitor and analyze server performance, generating reports covering CPU utilization, memory usage, load average, pages/sec, and more.
- Analyzed web servers, app servers, and database servers while performing load tests.
- Performed in-depth analysis to isolate points of failure in the application
- Actively took ownership of defects and coordinated with different groups from initial discovery to final resolution.
- Used HP Diagnostics for web page diagnostics and J2EE/.NET diagnostics to identify and pinpoint performance problems in web, J2EE, and .NET applications.
- Provided input based on test analysis to the tuning team for environment/application optimization, and maintained and documented all configuration and parameter changes as recommendations for the production team.
- Coordinated daily status calls for technical and non-technical audiences.
- Created an automation framework from test cases using Selenium and mapped the automated scripts to the corresponding test cases in the Test Manager for test coverage (see the Selenium sketch at the end of this section).
- Mentored new team members and assigned tasks to each team member with accountability and responsibility.
- Worked within Agile/Scrum development practices and testing methodologies; facilitated the daily stand-up Scrum meeting.
Environment: Quality Center, QTP, Performance Center, Dynatrace, AWS, JMeter, DB2, JCL, SoapUI, Web Services, J2EE, Java EJB, web server, Unix, BEA WebLogic 8.1 App Server
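Below is a minimal Selenium WebDriver (Java) sketch of the style of check the automation framework bullet above refers to, mapped to a test-case ID as it would be in the test manager. The URL, locator, and test-case number are hypothetical placeholders, not the actual application under test.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Minimal Selenium check reported against a (hypothetical) test case ID,
// in the spirit of the automation framework described above.
public class TC1001LoginPageTitle {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com/login");                 // placeholder URL

            // Locator and input value are illustrative only.
            driver.findElement(By.id("username")).sendKeys("testuser");

            // Simple verification step that would be logged against TC-1001.
            String title = driver.getTitle();
            System.out.println("TC-1001 " + (title.contains("Login") ? "PASS" : "FAIL"));
        } finally {
            driver.quit();   // always release the browser session
        }
    }
}
```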
Confidential, Annapolis, MD
Software Tester
Responsibilities:
- Used Fiddler's WCAT extension for load testing of the web application.
- Used Fiddler to inspect web traffic on Windows systems.
- Studied the existing and proposed architectures, identified the differences in software versions, configuration changes, and scaling factors, and documented them.
- Identified key business scenarios with input from application specialists and business analysts.
- Interacted directly with developers, project managers, and stakeholders; elaborated performance test executions and reported performance test results at both a low and a high level.
- Researched past application response-time metrics and business transactions from the production support team to develop realistic test scenarios and load models.
- Developed test plans and test cases and communicated persistently to obtain timely approval/sign-off from application owners and stakeholders.
- Designed the load model based on current volume and the projected percentage increase in volume derived from production metrics.
- Developed the test plan strategy and was involved in the test client and test environment builds.
- Reviewed and enhanced LoadRunner scripts in VuGen when necessary to accelerate delayed projects from teams in multiple locations.
- Provided guidelines to test engineers on configuring Performance Center test scenarios and Vusers according to the load model, so that load distributed across various geographies is reflected.
- Configured and used SiteScope performance monitors to monitor and analyze server performance, generating reports covering CPU utilization, memory usage, load average, pages/sec, and more.
- Analyzed web servers, app servers, and database servers while performing load tests.
- Performed in-depth analysis to isolate points of failure in the application
- Actively took ownership of defects and coordinated with different groups from initial discovery to final resolution.
- Used HP Diagnostics for web page diagnostics and J2EE/.NET diagnostics to identify and pinpoint performance problems in web, J2EE, and .NET applications.
- Provided input based on test analysis to the tuning team for environment/application optimization, and maintained and documented all configuration and parameter changes as recommendations for the production team.
- Coordinated daily status calls for technical and non-technical audiences.
- Mentored new team members and assigned tasks to each team member with accountability and responsibility.
Environment: HP LoadRunner, HP Quality Center, QuickTest Pro, AWS, JMeter, Windows Server, SQL Server, .NET, Web Services
Confidential, Washington DC
QA Tester
Responsibilities:
- Wrote test conditions, test data, and test cases based on functional specifications.
- Detected bugs and classified them based on severity.
- Translated use-case information into test cases and test procedures.
- Performed high-level design document reviews; participated in feature design review meetings and presented test case reviews, strategy, and feature functionality.
- Identified and established test requirements and documented them in TestDirector for requirements management.
- Documented test cases and conducted manual testing using Quality Center.
- Created detailed periodic status reports for senior management to keep them posted on the progress of implementation.
- Attended periodic meetings, teleconferences and led discussions on problem resolution.
- Determined system project line readiness and performed regression analysis.
- Extensively performed system testing to validate user interface, workflow and overall functionality using simulators and on real hardware.
- Performed integration testing between various modules and with hardware interfaces.
- Performed GUI, functional, and performance testing, and used QTP and LoadRunner to automate the testing process.
- Worked within a team to resolve Relational Database problems.
- Implemented a full transaction engine using XML messaging/dialog from scratch.
- Updated XML documents for new user requirements.
- Created automation test scripts using QTP tool and performed interface, functionality, and regression testing on new builds of the software.
- Monitored daily and scheduled reports generated by UAT analysts and by the system as a Vuser.
- Analyzed load and generation reports from scheduled runs against the online reports.
- Created Vuser scripts using VuGen and used the Controller to generate and execute LoadRunner scenarios.
- Inserted transactions and rendezvous points into web Vuser scripts.
- Simulated scenarios with multiple Vusers; defined rendezvous points to create intense load on the server and thereby measured server performance under load (see the conceptual sketch at the end of this section).
- Performed performance testing to simulate more than 1,000 users logging in at the same time and observed system behavior using LoadRunner.
- Created scripts enabling the Controller to measure web server performance under various load conditions.
- Extensive knowledge of load-balancing theory on both the networking and software (application) sides, as well as fault tolerance and failover.
- Performed security audits, load balancing and fault tolerance solutions.
Environment: Manual Testing, LoadRunner, QTP, Quality Center, Web Services, JMeter, Windows Server, .NET, C#, HTTP, XML, SQL, Oracle, Java.
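Below is a small plain-Java sketch (not LoadRunner's own API) illustrating what the rendezvous points and transaction timings above accomplish: a barrier releases all simulated users at once to create a load spike, and each user measures its own elapsed time like a transaction. The Vuser count and simulated work are illustrative assumptions.

```java
import java.util.concurrent.CyclicBarrier;

// Conceptual illustration of a rendezvous point plus transaction timing:
// all virtual users block at the barrier, hit the "server" together, and
// each records its own response time. Plain Java, not the LoadRunner API.
public class RendezvousSketch {
    public static void main(String[] args) {
        int vusers = 10;   // illustrative Vuser count
        CyclicBarrier rendezvous = new CyclicBarrier(vusers,
            () -> System.out.println("All Vusers released - peak load begins"));

        for (int i = 1; i <= vusers; i++) {
            final int id = i;
            new Thread(() -> {
                try {
                    rendezvous.await();                 // rendezvous point
                    long start = System.nanoTime();     // start "transaction"
                    Thread.sleep(50 + id);              // stand-in for the real request
                    long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                    System.out.println("Vuser " + id + " response time: " + elapsedMs + " ms");
                } catch (Exception e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }
    }
}
```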
Confidential, New York, NY
Test Engineer
Responsibilities:
- Responsible for creating test plans and test cases by analyzing business requirements from the business analyst, and for storing them in ALM.
- Responsible for automating these test cases into test scripts using WinRunner 8.2, UFT 8.0, and LoadRunner 8.0.
- Responsible for creating LoadRunner scripts through VuGen (Virtual User Generator) using the Web (HTTP/HTML) protocol.
- Responsible for creating the workload profile for the load test (see the worked sizing example at the end of this section).
- Responsible for creating LoadRunner scenarios in the Controller and analyzing results with LoadRunner Analysis in the GM Tech Center Lab.
- Monitored the web, application, and database servers and gathered CPU utilization metrics.
- Responsible for running baseline, block point, and black-box tests covering both functional and performance aspects, capturing timings at no load and under load, and comparing results across the tests, using a maximum of 2,100 Vusers for the load test.
- Analyzed failed test results, submitted defects through ALM (Application Lifecycle Management), and maintained the complete defect status life cycle.
- Responsible for creating final reports and recommending application tuning to the application team.
- Responsible for creating and updating the activity log.
Environment: Mercury TestDirector, QTP, Linux, WebLogic, SOAP, Windows, .NET, C++, VBScript, JavaScript, Oracle, SQL, and PL/SQL.
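Below is a short worked example of the arithmetic behind a workload profile, using Little's Law (concurrent Vusers ≈ throughput × (response time + think time)). The target volume, response time, and think time figures are illustrative assumptions, not numbers from this engagement.

```java
// Workload-profile sizing via Little's Law:
//   concurrent Vusers ≈ throughput (tx/sec) × (response time + think time, in sec)
// All input figures below are illustrative assumptions.
public class WorkloadModelSketch {
    public static void main(String[] args) {
        double targetTxPerHour = 36_000;   // assumed peak-hour transaction volume
        double avgResponseSec  = 2.0;      // assumed average response time
        double avgThinkSec     = 8.0;      // assumed user think time

        double txPerSec = targetTxPerHour / 3600.0;                  // 10 tx/sec
        double vusers   = txPerSec * (avgResponseSec + avgThinkSec); // 10 * 10 = 100 Vusers

        System.out.printf("Target throughput : %.1f tx/sec%n", txPerSec);
        System.out.printf("Required Vusers   : %.0f%n", vusers);
        System.out.printf("Pacing per Vuser  : %.1f sec/iteration%n", vusers / txPerSec);
    }
}
```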
