
Test Engineer Resume

VA

Professional Summary:

  • Senior tester with 11+ years of professional experience in Information Technology, including 10+ years as a QA Tester/Performance Test Engineer across a wide variety of projects and environments.
  • Attended an advanced training course in Silk Performer Results Analysis and Correlation conducted by Segue.
  • Experienced in both manual and automated functional testing of Web and Client/Server applications using HP/Mercury tools (WinRunner 7.5/8.0/8.2, QuickTest Pro 8.0/8.2/9.2, Test Director/Quality Center 7.5/8.0/8.2/9.2/10.0) and Borland/Segue SilkTest 6.5/7.0/7.5/2006.
  • Experienced in performance testing of Web applications using HP/Mercury LoadRunner 7.0/7.8/8.1 (VuGen, Controller, and Analysis), Performance Center 8.1/9.52, and Borland/Segue Silk Performer 6.5/7.0/7.2/2006/2008-R2.
  • Expertise in analyzing test results, identifying bottlenecks, and reporting issues.
  • Hands-on experience in the software testing process, including executing tests, recording results, producing test reports, and tracking defects.
  • Knowledgeable and experienced in every phase of the Quality Assurance Life Cycle (QALC) and Software Development Life Cycle (SDLC).
  • Strong experience with test coordination and leading large projects.
  • Well experienced in requirements gathering, business analysis, gap analysis, and writing functional specifications.
  • Experienced in the complete project development life cycle: System Analysis, Design, Development, Testing, Deployment, and End User Training.
  • Exceptional team player with excellent communication skills.
  • Experienced in Unit, Black box, White box, Sub-System, Functional, Acceptance, Performance, Stress, Regression, Integration, and System Testing.
  • Well versed in the concepts of database design, development, and testing of various Client/Server applications using Oracle, SQL Server, and MS Access.
  • Capable of analyzing business documentation and specifications prior to transferring them to QA.
  • Strong in analyzing business specifications/requirements and developing and executing test plans, test cases, and test scripts.

Educational Qualification: BS Engineering

Technical Skills:

Functional Testing Tools: HP/Mercury tools: (WinRunner, Quick Test Pro, Test Director/Quality Center) and Borland/Segue: Silk Test.
Performance Testing Tools: HP/Mercury tools: LoadRunner (Vugen/Controller/ Analysis), Performance Center 8.1/9.52, Borland/Segue Silk Performer.
Operating Systems : Unix (Sun Solaris, HP-UX, IBM AIX), Mainframe, Windows XP.
Languages : C, C++, Java, J2EE, .Net, SQL, PL/SQL, XML.
RDBMS : Oracle, DB2, SQL SERVER and MS-Access
GUI/Web Tools : Developer/2000, J2EE, VB 5.0/6.0, HTML, Java Script, and VB Script
Browsers : Internet Explorer 6.0/7.0.
ERP : SAP R/3- 3.1H/4.0; Peoplesoft 8.0 (HRM, CRM and FDM modules)
Application Servers : WebLogic and Websphere
Web Servers : Apache and IIS

PROFESSIONAL EXPERIENCE:

Client : Confidential, VA
Duration : June 2009 to date
Role : QA Tester/Performance Test Engineer

Worked on various projects: LL3B, LTV125%, Title V, DFee recalculation, WebLogic upgrade, and Oracle Identity Manager.

Responsibilities:

  • Extensively worked on Sourcing and Pricing applications.
  • Created test data for various releases from the GUI and from VuGen scripts/backend, as per requirements.
  • Created VuGen scripts that included custom transaction response times, custom functions, and verification checkpoints to confirm retrieval of the correct page.
  • Extensively used correlation and parameterization in the VuGen scripts created.
  • Conducted Performance testing of various applications using LoadRunner/Performance Center during the various releases of the applications.
  • Conducted performance testing of servers for load and scalability by simulating 1200 virtual users using Performance Center.
  • Analyzed the complex production performance metrics to create accurate performance models.
  • Coordinated with different teams like WIO, Centralized Deployment team, MI and Data team to build and stabilize the performance test environments.
  • Analyzed the test results, identified bottlenecks and reported issues using monitoring tools.
  • Discussed test results with developers and project team members to isolate defects and resolve problems.
  • Used the Analysis tool to create detailed test status reports, performance reports, web trend analysis reports, and graphical charts for upper management.
  • Created and updated test cases in Quality Center and entered defects in QC/ClearQuest.
  • Conducted test case reviews with Business Analysts and Dev team.
  • Converted Business requirements and requirement documentation into test design products: Test Scenarios, Test Scripts and Test Cases.
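For illustration, the correlation and parameterization work described above can be sketched outside LoadRunner. In a VuGen script this is done with functions such as web_reg_save_param and per-user parameter files; the minimal Python sketch below (the page body, token name, and user list are invented for the example) shows the same idea: capture a dynamic server value from one response and substitute it, along with per-user data, into subsequent requests.

```python
import re

# Hypothetical first response: the server embeds a per-session token
# (a "dynamic value") that later requests must echo back.
login_page = '<input type="hidden" name="session_token" value="A1B2C3D4">'

def correlate(body, left, right):
    """Capture the text between two boundaries, analogous to
    web_reg_save_param with LB/RB boundaries in a VuGen script."""
    m = re.search(re.escape(left) + r'(.*?)' + re.escape(right), body)
    return m.group(1) if m else None

# Correlation: extract the dynamic token instead of hard-coding it.
token = correlate(login_page, 'name="session_token" value="', '"')

# Parameterization: replace a fixed user ID with rows from a data
# table, like a VuGen parameter file iterating per virtual user.
users = ["vuser01", "vuser02", "vuser03"]
next_requests = [{"user": u, "session_token": token} for u in users]

print(token)               # → A1B2C3D4
print(len(next_requests))  # → 3
```

Without the correlation step, a recorded script would replay the stale token captured at record time and the server would reject the request; that is why recorded VuGen scripts rarely replay correctly until dynamic values are correlated.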

Environment: LoadRunner 8.1/9.52, Performance Center 8.1/9.52, Quality Center 9.2/10.0, Remedy 7.5, Java, J2EE, Unix (AIX), Mainframe, DB2, Rapid SQL, DOORS, ClearQuest, WebLogic 10.3 and Windows.

Client : Confidential, Princeton, NJ
Duration : April 2008 to June 2009
Role : Sr. Performance Engineer/Sr. QA Tester

Responsibilities:

  • Based on the entrance criteria received from the development team, developed an integrated test plan covering the main features of the designated modules of the application; designed test methods to verify the application's functions.
  • Worked closely with business users and the project team to analyze business requirements in order to create project test scenarios for performance testing.
  • Analyzed the business functionality from a performance testing perspective, and assisted in the design and architecture of specific performance tests.
  • Using LoadRunner VuGen, developed scripts according to requirements/specifications and executed multi-user (1000+) performance tests, making use of online monitors, real-time output messages, and other LoadRunner features.
  • Using the Borland Silk Performer tool, developed test scripts according to requirements and executed multi-user (6000+) performance tests, making use of online monitors, real-time output messages, and other Silk Performer features.
  • Worked with the development team to understand the source-to-destination data mapping, as well as the data transformation process, between multiple systems.
  • Created performance test scripts by including custom transaction/page response times, and verification checkpoints to confirm the retrieval of the correct page.
  • Parameterized the Vuser scripts by replacing fixed values with parameters.
  • Optimized the Vuser scripts by correlating the statements for Database queries.
  • Analyzed the test results, identified bottlenecks and reported issues.
  • Created detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
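The results analysis and bottleneck reporting described above typically centers on percentile response times per transaction, which is what LoadRunner's Analysis tool summarizes. A minimal sketch of that idea (the transaction names, sample timings, and the 3-second SLA are invented for illustration):

```python
# Flag transactions whose 90th-percentile response time exceeds an SLA.
# Sample timings (seconds) are made up for the example.
timings = {
    "login":    [0.8, 0.9, 1.1, 1.0, 0.7, 0.9, 1.2, 0.8, 1.0, 0.9],
    "search":   [2.1, 2.4, 3.8, 2.2, 2.0, 2.6, 4.1, 2.3, 2.5, 3.9],
    "checkout": [1.1, 1.3, 1.2, 1.0, 1.4, 1.2, 1.1, 1.3, 1.2, 1.1],
}

def percentile(samples, pct):
    """Nearest-rank percentile: smallest value >= pct% of the samples."""
    ordered = sorted(samples)
    rank = max(1, -(-len(ordered) * pct // 100))  # ceiling division
    return ordered[int(rank) - 1]

SLA_90TH = 3.0  # hypothetical service-level target, in seconds

bottlenecks = [
    name for name, samples in timings.items()
    if percentile(samples, 90) > SLA_90TH
]
print(bottlenecks)  # → ['search'] — only search breaches the 3 s target
```

Percentiles are preferred over averages for this kind of report because a few very slow outliers can hide behind a healthy-looking mean while still breaching the user-facing service level.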

Environment: LoadRunner 8.1, Performance Center 8.1, Silk Performer 2006-R2/2008-R2, IE 6.0/7.0, PeopleSoft 8.0 (HRM, CRM and FDM modules), Windows XP, Solaris 9.0, Java, J2EE, XML, Oracle 10g, DB2, Apache Tomcat, WebLogic and WebSphere.

Client : Confidential, Mount Laurel, NJ
Duration : June 2007 to March 2008
Role : QA Tester/Performance Engineer

Responsibilities:

  • Developed an integrated test plan covering the main features of the designated modules of the application; designed test methods to verify the application's functionalities.
  • Worked closely with business users and the project team to analyze business requirements in order to create project test scenarios.
  • Analyzed the business functionality from a performance testing perspective, and assisted in the design and architecture of specific performance tests.
  • Conducted Functional testing of Web applications using Quick Test Pro during the various phases of the product development.
  • Performed various testing types by inserting sync points and checkpoints (GUI, bitmap, and text checkpoints).
  • Using LoadRunner VuGen, developed scripts according to requirements/specifications and executed multi-user (300+) performance tests, making use of online monitors, real-time output messages, and other features of the LoadRunner Controller.
  • Worked with the development team to understand the source-to-destination data mapping, as well as the data transformation process, between multiple systems.
  • Created performance test scripts by inserting custom timers for recording transaction/page response times, and verification checks for confirming the retrieval of the correct page.
  • Parameterized the Vuser scripts by replacing fixed values with parameters.
  • Optimized the Vuser scripts by correlating the statements for Database queries.
  • Analyzed the test results, identified the bottlenecks and reported issues.
  • Created detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
  • Categorized functional bugs based on the severity and interfaced with developers to resolve them.

Environment: QuickTest Pro 9.2, Quality Center 8.2/9.1, LoadRunner 7.8/8.1, IE 6.0/7.0, Windows XP, Solaris 9.0, Java, J2EE, Java Applets, XML, Oracle 10g, WebLogic and WebSphere.

Client : Confidential, Philadelphia, PA
Duration : October 2006 to May 2007
Role : Sr. QA Tester

Confidential is the world's leading project and portfolio management software development organization. It provides the software foundation that enables all types of businesses to achieve excellence in managing their portfolios, programs, projects, and resources.

Responsibilities:

  • Converted Business requirements and requirement documentation into test design products: Test Scenarios, Test Scripts and Test Cases.
  • Developed an integrated test plan covering the main features of the application; designed test methods to verify the application's functions.
  • Conducted manual and automated functional testing of Web and Client/Server applications using SilkTest during the various phases of product development.
  • Performed various testing types by inserting sync points and checkpoints (GUI, bitmap, and text checkpoints).
  • Discussed test results with developers and project team members to isolate defects and resolve problems.
  • Performed Regression, Acceptance, Unit, Integration, Usability, Cross-Platform, etc. testing during different phases of the application development.
  • Categorized functional bugs based on the severity and interfaced with developers to resolve them.
  • Worked in a fast paced Scrum team environment using the latest Agile development methods.

Environment: Quality Center 9.1, SilkTest 2006-R2, IE 6.0/7.0, Windows XP, Solaris 8.0/9.0, AIX 4.3, HP-UX 10.0/11.0, Java, J2EE, Java Applets, XML, Oracle 8i/9i/10g, MS Access, DB2, Apache Tomcat, WebLogic and WebSphere.

Client : Confidential, Washington, DC
Duration : March 2006 to September 2006
Role : Performance Engineer

Confidential is the non-profit, non-stock parent company of CareFirst of Maryland, Inc. and Group Hospitalization and Medical Services, Inc., affiliates that do business as CareFirst BlueCross BlueShield.

Responsibilities:

  • Conducted Performance testing of Web based applications (Claims, Enrollment, Retrieval and eServices) using LoadRunner during the various phases of the product development.
  • Extensively used Load Runner Vugen component to develop Vuser scripts.
  • Created automated scripts by including custom timers for recording transaction/page response times, and verification checks for confirming the retrieval of the correct page.
  • Conducted performance testing of servers for load and scalability by simulating 300+ virtual users using LoadRunner.
  • Parameterized the Vuser scripts by replacing fixed values with parameters.
  • Optimized the Vuser scripts by correlating the dynamic values.
  • Analyzed the complex production performance metrics to create accurate performance models.
  • Analyzed the test results, identified the bottlenecks and reported issues.
  • Discussed test results with developers and project team members to isolate defects and resolve problems.
  • Participated in review meetings to evaluate documents, plans, code, requirements and specifications.
  • Created detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
  • Categorized functional bugs based on the severity and interfaced with developers to resolve them.

Environment: LoadRunner 8.1, Performance Center 8.1, TeamTrack, IE 6.0, Windows XP, Java, J2EE, Java Applets, Mainframe, DB2, Oracle 8i/9i, WebLogic and WebSphere.

Client : Confidential, Tampa, FL
Duration : October 2002 to February 2006
Role : Performance Engineer

Worked on various telecommunication applications, performing performance and functional testing of Web-based and Client/Server systems (e.g., eCO Systems, AAIS, NTAS, CAD, vBuild, CRE, AWAS, RAMS, vRepair, MSOL, WMS, and VTF).

Responsibilities:

  • Attended advanced training in Silk Performer - Results Analysis and Correlation conducted by Segue.
  • Developed an integrated test plan covering the main functions of the designated parts of the application; designed test methods to verify the application's functions.
  • Converted Business requirements and design documentation into test design products: Test Scenarios, Test Scripts and Test Cases.
  • Conducted Functional and Performance testing of Web based and Client/Server applications using WinRunner/Silk Test/Silk Performer during the various phases of the product development.
  • Created automated scripts by including timers for recording response times, and verification checks for confirming the retrieval of the correct page.
  • Conducted performance testing of servers for load and scalability by simulating 2000+ virtual users using the Silk Performer tool.
  • Parameterized the Vuser scripts by replacing fixed values with parameters and optimized the Vuser scripts by correlating the dynamic values.
  • Analyzed the test results, identified the bottlenecks, and reported issues.
  • Discussed test results with developers and project team members to isolate defects and resolve problems.
  • Participated in review meetings to evaluate documents, plans, code, requirements and specifications.
  • Created detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
  • Performed Regression and Performance testing during different stages of the application development.
  • Responsible for performing various types of process evaluations during each phase of the software development life cycle, including audits, reviews, walkthroughs, and hands-on system testing.
  • Worked with the development team to understand the source-to-destination data mapping, as well as the data transformation process, between multiple systems.

Environment: WinRunner 7.5/8.0/8.2, Segue SilkTest 6.5/7.0/7.5, Segue Silk Performer 6.5/7.0/7.2, LoadRunner 7.8, IE 6.0, Windows NT 4.0, Windows 95/98/2000/XP, Solaris 2.x/7.0/8.0/9.0, AIX 4.3, HP-UX 10.0/11.0, Java, J2EE, XML, Mainframe, Oracle 8i/9i, DB2, IIS 4.0, WebLogic 8.1, WebSphere.

Client : Confidential, TX

Duration : July 2001 to September 2002
Role : QA Tester

Confidential is an Internet-based tool that enables enterprises to share demand and supply information across the enterprise quickly, efficiently, and economically. TM has a flexible, broker-centric design with open system standards and a component-based architecture. Due to its open architecture, TM can be integrated with i2 Technologies planning engines, ERP systems, databases, and legacy systems. TM also provides timely and accurate procurement information to both customers and suppliers.

Responsibilities:

  • Developed an integrated test plan covering the main functions of the designated parts of the application; designed test methods to verify the application's functions.
  • Performed Acceptance, Unit, Integration, Usability, Cross-Platform, Regression, etc. testing during different stages of the application development.
  • Identified software problems, wrote easy-to-follow bug reports, logged them into the bug-tracking tool, monitored their progress, and verified their fixes.
  • Categorized bugs based on the severity and interfaced with developers to resolve them.
  • Responsible for performing various types of process evaluations during each phase of the software development life cycle, including audits, reviews, walkthroughs, and hands-on system testing.
  • Conducted Functionality and Performance testing using WinRunner/LoadRunner/Silk Test/Silk Performer tools during the various phases of the product development.
  • Extensively used Load Runner Vugen tool to develop Vuser scripts.
  • Conducted performance testing of servers for load and scalability by simulating 500+ virtual users using the LoadRunner and Silk Performer tools.
  • Parameterized the Vuser scripts by replacing fixed values with parameters and optimized the Vuser scripts by correlating the dynamic values.
  • Created detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.

Environment: WinRunner 6.0/7.0, TestDirector 6.0/7.0, LoadRunner 6.0/7.0, Silk 5.0.3, Silk Radar, Segue Silk Performer 5.0, IE 5.1, Windows NT 4.0, Solaris 2.x/7.0/8.0, AIX 4.3, HP-UX 10.0, C, C++, Java, J2EE, VC++, Mainframe, Oracle 8i, DB2, WebLogic, WebSphere.

Client : Confidential, Richardson, TX

Duration : July 2000 to June 2001
Role : QA Tester

NGSN Project:
The Next Generation Service Node (NGSN) is a state-of-the-art service node developed for the MCI WorldCom Intelligent Service Platform. NGSN provides Network Interactive Voice Response (NIVR) for the Business Markets Enhanced Call Router (ECR) portfolio.

  • Converted Business requirements and design documents into test design products: Test Scenarios, Test Scripts and Test Cases.
  • Developed an integrated test plan covering the main functions of the designated parts of the application; designed test methods to verify the application's functions.
  • Involved in the quality testing process of a software package that was designed and developed for this project.
  • Created and executed QA test plans on multiple platforms using manual and automated QA methods.
  • Created automated scripts by including timers for recording response times, and text checks for confirming the retrieval of the correct page.
  • Created detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
  • Parameterized the Vuser scripts by replacing fixed values with parameters.
  • Automated call plans on the provisioning system using WinRunner.
  • Categorized bugs based on the severity and interfaced with developers to resolve them.

Environment: WinRunner 6.0/7.0, LoadRunner 6.0/7.0, TestDirector 6.0/7.0, Silk 5.0.3, Segue Silk Performer 5.0, UNIX (Solaris 2.x/7.0/8.0), PowerBuilder 6.5/7.0, C, C++, VC++, vi editor, WebLogic 5.0/6.0/6.1, Mainframe, Oracle 7.3/8.0, MS Access, Windows NT, SQL, PL/SQL, SQL Loader.

Client : Confidential, VA

Duration : January 2000 to June 2000
Role : QA Tester

VTNS (Virtual Telecommunication & Networking System) is a service order processing system. Its main objective is to maintain the service plans of toll-free Bell Atlantic customers and VACS (VTNS Automated Customer Service). The VTNS system receives both Outbound (OCS) and Inbound (WATS/SOP) service orders.

  • Involved in the quality testing process of a software package that was designed and developed for VTNS.
  • Responsible for performing various types of process evaluations during each phase of the software development life cycle, including audits, reviews, walkthroughs, and hands-on system testing.
  • Involved in team testing, test planning, test script design, test development, debugging, and test execution.
  • Developed and implemented WinRunner automated test scripts, ensuring coverage and regression testing of all business rules.
  • Identified software problems, wrote easy-to-follow bug reports, logged them into the bug-tracking database, monitored their progress, and verified their fixes.
  • Tested the application in various stages: Regression Testing, Black/White Box, Stress Testing, Performance Testing, User Interface Testing, Positive/Negative Testing, and System Testing.

Environment: WinRunner 6.0, LoadRunner 6.0, TestDirector 6.0, TestPartner, Silk 5.0.3, HP-UX 9.0, Sun Solaris 2.x/7.0, C, C++, vi editor, WebLogic, Windows NT, Mainframe, Oracle 7.3/8.0, SQL, PL/SQL and SQL Loader.
