Test Engineer Resume
Chicago, IL
OBJECTIVE
Seeking a challenging position in a forward-looking organization that offers professional growth, where I can apply my testing skills while remaining resourceful, innovative, and flexible.
CAREER SUMMARY:
- Over 7 years of diverse experience in the field of Quality Assurance and Software testing.
- Good knowledge of structured testing methodologies, testing tools and full testing life cycle including design and development of test plans, test cases and test objectives.
- System specification analysis, testing methodology and test plan formulation.
- Involvement in various stages of Software Development Life Cycle (SDLC).
- Follow the agreed methodology to prepare individual test plans and approach documents covering test scripts, test cases, results, schedules, the complete test strategy, acceptance criteria, risk management, and final reports.
- Ensure that all deliverables meet quality standards, verified through frequent audits on all projects.
- Experience with automated testing tools including LoadRunner, WinRunner, QuickTest Professional (QTP), Mercury Diagnostics, TestDirector, and Quality Center.
- Proficient in debugging and executing LoadRunner scripts.
- Strong knowledge of C, Java, and SQL, applied to debugging and extending LoadRunner scripts.
- Excellent with parameterization and correlation (see the sketch following this summary).
- Well versed in analyzing results with LoadRunner Analysis.
- Proficient in designing and implementing scenarios and loading LoadRunner scripts into the Controller.
- Experience using monitoring tools such as Wily Introscope, Spotlight, and JProbe.
- Experience installing and configuring J2EE Diagnostics to obtain transaction breakdown down to the method level.
- Experience analyzing application complexity with developers and identifying the business functionality in scope for performance testing.
- Ability to work in a team environment. Strong communication and interpersonal skills. Ability to interact with other teams and team members with ease and professionalism.
- Ability to quickly master new concepts and applications.
- Ability to work on tight schedules and on multiple applications concurrently.
- Prepare a detailed, comprehensive report at the end of performance testing describing the concerns identified and the fixes applied to improve application performance.
- People Management and Risk Management.
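For illustration, a minimal VuGen-style C sketch of the parameterization and correlation pattern referenced above; the parameter names, boundaries, and URLs are hypothetical placeholders rather than details of any specific engagement:

    Action()
    {
        /* Correlation: capture a server-generated session ID from the
           next response into {sessionId}; the boundaries are assumed. */
        web_reg_save_param("sessionId",
                           "LB=name=\"sessionId\" value=\"",
                           "RB=\"",
                           LAST);

        /* Parameterization: {username} and {password} are drawn from a
           VuGen parameter file so each Vuser submits different data. */
        web_url("login",
                "URL=http://app.example.com/login?user={username}&pass={password}",
                LAST);

        /* Replay the captured value instead of a hard-coded session ID. */
        web_url("account",
                "URL=http://app.example.com/account?session={sessionId}",
                LAST);

        return 0;
    }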
TECHNICAL SKILLS SUMMARY:
- Testing Tools: LoadRunner, WinRunner, QuickTest Professional, Quality Center
- Languages: C, C++, and SQL.
- Platforms: Windows 9x/NT/2000/XP, and UNIX.
- Database: MS Access, Oracle
- Scripting Languages: TSL
- Web Technologies: HTML, DHTML, and JavaScript.
WORK EXPERIENCE:
Confidential, CHICAGO, IL
Performance Test Engineer, Oct 2007 to Present
Responsibilities:
- Involved in gathering critical business requirements from Business Analysts and development team.
- Calculated the scaling factor of the performance environment, which compares the performance environment with production; an accurate scaling factor is essential for precise performance test results.
- Designed load model of application based on peak load numbers provided by Business Analyst and Capacity Planning team.
- Actively participated in defining test strategy and test scope.
- Analyzed the application design document and prepared test cases and a test plan.
- Analyzed test plans and wrote the test standards and procedures used in developing all LoadRunner test scripts.
- Extensively used VuGen for LoadRunner scripting across different protocols.
- Captured session variables dynamically and replaced hard-coded values with correlated parameters so that scripts exercise the application realistically in all scenarios (see the sketch after this list).
- Performed data-driven tests to verify the application responds correctly across a wide variety of data.
- Monitored CPU, memory, and other system resources while running load tests.
- Analyzed LoadRunner reports and graphs, including hits per second, throughput, and 90th-percentile response times, and worked closely with the development and DBA teams to resolve high transaction response times and improve application performance.
- Responsible for checking system logs to investigate exceptions and error messages.
- Assigned and tracked all defects from performance test execution in Quality Center.
- Verified that each code drop reduced the bugs already logged in Quality Center, and prepared a summary document for upper management.
- Regularly followed up with the development team to discuss discrepancies identified during testing and performance tuning.
- Prepared an issue log and test execution log to track the performance testing level of effort (LOE).
- Prepared a close-out document for each application summarizing the performance test plan, test scope, and results with tester comments, helping upper management assess risk.
- Produced daily status reports showing the progress of the automation testing effort.
- Involved in estimation, planning & scheduling for Performance testing.
- Generated and prepared effective scripts for a variety of scenarios using multiple protocols.
- Involved in various types of testing, including load, stress, failover, endurance, and volume.
- Created a long-running (100-hour) endurance test scenario in which realistic, modulating load is simulated throughout the test duration.
- Played a key role in process improvement and innovative idea generation.
- Involved in detailed test analysis and summary report preparation.
- Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.
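As referenced in the correlation and data-driven bullets above, a hedged sketch of the transaction-timed, data-driven scripting pattern; the transaction name, the parameter-file field {orderId}, and the URL are illustrative assumptions:

    Action()
    {
        int status;

        /* {orderId} comes from a .dat parameter file, so each iteration
           submits different data (data-driven testing). */
        lr_start_transaction("submit_order");

        web_url("submit_order",
                "URL=http://app.example.com/orders/{orderId}",
                LAST);

        /* Pass or fail the timed transaction based on the HTTP status,
           so percentile response times reflect only good responses. */
        status = web_get_int_property(HTTP_INFO_RETURN_CODE);
        lr_end_transaction("submit_order", status == 200 ? LR_PASS : LR_FAIL);

        /* Think time approximates real user pacing between requests. */
        lr_think_time(5);

        return 0;
    }

Timed transactions like this feed the hits-per-second, throughput, and 90th-percentile graphs analyzed in LoadRunner Analysis.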
Environment: UNIX, WebSEAL, WebSphere, J2EE (EJB, Java, JSP), Oracle 9i
Project Type: Web-based, client-server, PeopleSoft
Protocol: Web HTML/HTTP, Oracle 2-tier
Tools Used: LoadRunner 8.1/9.0/9.1, Performance Center
Responsibilities:
- Involved in gathering requirements from Business Analysts.
- Analyzed the requirements document and prepared test cases and a test plan.
- Created LoadRunner scenarios and scheduled virtual users to generate realistic load on the server.
- Involved in preparation of detailed performance test plans for internal projects.
- Involved in estimation, planning & scheduling for Performance testing.
- Generated and prepared effective scripts for a variety of scenarios using multiple protocols (a Winsock sketch follows this list).
- Designed efficient scenarios, tuned scenario configurations and settings, and executed tests.
- Involved in various types of testing, including load, stress, destructive, endurance, and volume.
- Played a key role in process improvement and innovative idea generation.
- Involved in detailed test analysis and summary report preparation.
- Produced weekly status reports showing the progress of the automation testing effort.
- Regularly followed up with the development team to discuss discrepancies identified during testing and performance tuning.
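Since Winsock is among the protocols listed for this project, a minimal assumed sketch of a Winsock Vuser action follows; the host, port, and buffer names are placeholders, and buf0/buf1 would refer to buffers recorded in the script's data.ws file:

    Action()
    {
        /* Open a TCP connection; the remote host and port are hypothetical. */
        lrs_create_socket("socket0", "TCP",
                          "RemoteHost=app.example.com:7001", LrsLastArg);

        /* Send and receive buffers recorded in data.ws. */
        lrs_send("socket0", "buf0", LrsLastArg);
        lrs_receive("socket0", "buf1", LrsLastArg);

        lrs_close_socket("socket0");
        return 0;
    }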
Environment: Solaris, UNIX, iPlanet, WebLogic, CORBA, J2EE (EJB, Java, JSP), DB2, Mainframes, Oracle
Project Type: Web Based, Client server, Service layer
Protocol: Web HTML/HTTP, Java Vuser, Winsock, Web Services
Tools Used: LoadRunner 8.1, 9.0, Quality Center
Monitoring Tools: Wily Introscope, Spotlight
Responsibilities:
- Analyzing application complexity with developers
- Identifying the business functionality in scope for performance testing
- Recording performance scripts: adding parameters, correlating, and applying error checks (see the sketch after this list)
- Designing a comprehensive scenario for running performance tests on the product
- Observing the application server's CPU and memory utilization and identifying bottlenecks and choke points
- Analyzing performance reports to assess scalability, bandwidth utilization, and the risk involved in terms of stress on the application
- Preparing a detailed, comprehensive report at the end of performance testing describing the concerns identified and the fixes applied to improve application performance
- Interacting with the development team on fixes and issues
- Capturing and customizing QTP scripts for sanity testing
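A minimal sketch of the error-check pattern mentioned above: web_reg_find registers a text check before the request and fails the step if the expected content is missing. The page text and URL are illustrative:

    Action()
    {
        /* Register a content check before the request; the step fails
           if "Order Confirmation" is absent from the response. */
        web_reg_find("Text=Order Confirmation",
                     "Fail=NotFound",
                     LAST);

        web_url("confirm",
                "URL=http://app.example.com/confirm",
                LAST);

        return 0;
    }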
Environment: Solaris, iPlanet Web Server, WebLogic App Server, TIBCO, Tuxedo, CORBA, J2EE (EJB, Java, JSP), DB2, Mainframes, Oracle
Project Type: Web-based, telecom domain application
Protocol: HTTP/HTTPS, TCP/IP
Tools Used: LoadRunner 8.0, J2EE Diagnostics, JProbe, QuickTest Pro 8.2
Responsibilities:
- Analyzed user requirements and business rules in coordination with the project team and prepared test procedures.
- Recorded performance scripts, adding parameters, correlation, and error checks.
- Designed a comprehensive scenario for running performance tests on the product, covering baseline, scalability, stress, and endurance.
- Observed the application server's CPU and memory utilization, identifying bottlenecks and choke points.
- Prepared test status reports and bug reports.
- Involved in Functional and Regression testing.
- Developed and executed various scripts using WinRunner for automated regression and system testing.
- Created compiled modules and implemented customized functions to make WinRunner scripts more effective and efficient.
- Performed data-driven tests to verify the application handles a wide variety of data across different scenarios.
- Responsible for tracking bugs using a bug-tracking tool.
- Participated in preparing the Requirements Traceability Matrix.
Environment: Java/JSP/Oracle
Project Type: Web-based
Protocol: HTTP/HTTPS
Tools Used: LoadRunner 8.0, WinRunner 7.6
Responsibilities:
- Involved in designing and verifying test plans and test cases.
- Conducted Functional Testing, System Testing and Performance Testing.
- Understanding and analyzing the business requirements.
- Preparation of Performance Test Scenarios.
- Used LoadRunner to simulate multi-Vuser scenarios and measure server performance.
- Defined rendezvous points to measure server performance under peak load (see the sketch after this list).
- Scheduled scenarios by specifying ramp-up, ramp-down, and duration to depict varying amounts of user load on the server at any given moment.
- Prepared test scripts for automated testing using WinRunner.
- Conducted GUI Testing using WinRunner.
- Performed Functionality Testing using WinRunner.
- Drafted test scripts for data-driven testing in WinRunner, allowing different parameters to be supplied dynamically.
- Performed security testing in WinRunner, using positive and negative tests to verify user authentication.
- Used exception handling in WinRunner to handle errors during test execution.
- Recorded TSL scripts in WinRunner for automated regression testing.
- Created a Traceability Matrix to ensure comprehensive test coverage of requirements and to identify all test conditions and test data needs.
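As noted in the rendezvous bullet above, a sketch of how a rendezvous point is placed in a Vuser script so the Controller can release all virtual users into a transaction at the same moment; the rendezvous and transaction names are illustrative:

    Action()
    {
        /* All Vusers block here until the Controller's rendezvous policy
           releases them together, creating true peak concurrency. */
        lr_rendezvous("peak_checkout");

        lr_start_transaction("checkout");
        web_url("checkout",
                "URL=http://app.example.com/checkout",
                LAST);
        lr_end_transaction("checkout", LR_AUTO);

        return 0;
    }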
Environment: Java 1.2, J2EE, JSP, Servlets, Struts Framework, EJB, XML, Log4j, Oracle 9i, WebLogic 7.x
Project Type: Web-based
Responsibilities:
- Involved in designing and verifying test plans and test cases.
- Conducted Functional Testing, System Testing and Performance Testing.
- Understanding and analyzing the business requirements.
- Preparation of Performance Test Scenarios.
- Used LoadRunner to simulate multi-Vuser scenarios and measure server performance.
- Defined rendezvous points to measure server performance under peak load.
- Scheduled scenarios by specifying ramp-up, ramp-down, and duration to depict varying amounts of user load on the server at any given moment.
- Prepared test scripts for automated testing using WinRunner.
- Conducted GUI Testing using WinRunner.
- Performed Functionality Testing using WinRunner.
- Drafted test scripts for data-driven testing in WinRunner, allowing different parameters to be supplied dynamically.
- Performed security testing in WinRunner, using positive and negative tests to verify user authentication.
- Used exception handling in WinRunner to handle errors during test execution.
- Recorded TSL scripts in WinRunner for automated regression testing.
- Created a Traceability Matrix to ensure comprehensive test coverage of requirements and to identify all test conditions and test data needs.
Environment: J2EE Technologies, Java 1.3, JSP, Servlets, JavaBeans, BEA WebLogic 7.0, Oracle 9i
Tools Used: WinRunner 7.6
Responsibilities:
- Participated in Test requirement analysis.
- Recorded TSL scripts in WinRunner for automated regression testing.
- Developed functional and regression test cases.
- Involved in test case reviews.
- Prepared test status reports and bug reports.
- Executed functional and system tests.
Environment: Visual Basic, Oracle 8i
EDUCATION:
Bachelor of Engineering