
Test Engineer Resume


Bridgewater, NJ

SUMMARY

  • Over six years of IT experience in the design, development and testing of intranet, client-server and web applications in both UNIX and Windows environments, using both manual and automated testing tools.
  • Experienced in all phases of SDLC, including identification of functional requirements, administration of test projects and defect tracking to ensure successful application delivery on time and within budget.
  • Strong technical expertise in implementing test phases ranging from smoke, unit, integration, functional, system and user acceptance testing to regression testing.
  • Specialized in analyzing functional specifications and creating automated test scripts using Mercury Interactive tools such as WinRunner and QTP.
  • Knowledge of version control tools such as Rational ClearCase, CVS, VSS and StarTeam.
  • Extensive experience using defect-tracking tools such as IBM Rational ClearQuest and Test Director.
  • Expertise in performance testing applications using load-testing tools such as LoadRunner and Rational Performance Tester.
  • Expertise in creating Web, COM/DCOM, MQ, SAP Web, Web Services, database, WinSock, client/server and RTE Vuser scripts.
  • Experience in performing Back-end testing of various database servers using WinSQL, DB2 CLP and TOAD.
  • Experienced in testing and validating ETL processes in data warehouse applications.
  • Knowledge of Object Oriented Software Development Methodology.
  • Excellent understanding of the Software Development Life Cycle and the role of QA within it.
  • Excellent communication, presentation and interpersonal skills.
  • Ability to work under tight schedules, on multiple applications concurrently and in teams.
  • Solid analytical and troubleshooting skills.

SKILL SET SUMMARY

Operating Systems: HP-UX, Linux, Solaris, Windows 2000

Languages: C, C++, TSL, Java, SQL.

Databases: Oracle, MS Access, DB2 and MS SQL Server

Testing Tools: WinRunner, LoadRunner, Test Director and Quick Test Pro.

Web/Application Servers: Tomcat, IIS, JBoss, WebSphere and WebLogic.

Web Technologies: Java Servlets, Java Beans, J2EE, JMS, JDBC, RMI, EJB, Swing, and JSP.

XML and Web services: XML, DOM, SAX, XSLT, XPATH, XSD, WSDL and SOAP.

Defect Tracking Tools: PVCS Tracker, Test Director and Rational ClearQuest.

Other Tools: Toad.

PROFESSIONAL EXPERIENCE

Confidential, Bridgewater, NJ

Test Engineer

Confidential, Institutional Finance seeks to improve internal analysis capabilities by redesigning the current Account Framework Initiative (AFI) system. The AFI system is a comprehensive financial analysis tool designed to capture and maintain financial data for the Institutional line of business. Its purpose is to facilitate the creation and analysis of customer-level profitability in order to drive decision-making for the Institutional business. This is accomplished by merging data from multiple sources. The system comprises modules such as Inbounds, Feeder Management, Reconciliation, Process, Reports, and Online Pages.

Responsibilities:

  • Analyzed the system and created a robust automation framework
  • Executed regression testing covering functionality across the GL modules
  • Created automation scripts for all online interfaces using QuickTest Professional 9.1 (QTP)
  • Designed data-driven tests in QTP to validate the application against different sets of test data
  • Tested the application's response to positive and negative data sets using data-driven testing procedures
  • Developed several Systems Integration test scripts based on system requirements, business rules and use cases.
  • Created a single consolidated shared object repository for all modules using the Object Repository Manager in QTP.
  • Adopted QTP descriptive programming techniques for maintainability
  • Parameterized the automated scripts with data tables to execute large numbers of claims
  • Used regular expressions in QTP scripts to handle varying object properties and data
  • Extensively used QTP scripting to create actions.
  • Extensively used VBScript to develop highly optimized scripts
  • Integrated Quality Center 9.0 (QC) with QuickTest Professional 9.1 (QTP)
  • Developed a framework to run the scripts, including data-driven tests, from Quality Center 9.0
  • Used VAPI-XP scripts in Quality Center 9.0 (QC) to control QTP execution
  • Documented the automation plan, design and execution specifications
  • Built queries in PeopleSoft Query (PS Query) for the automation suite to validate back-end operations from QTP scripts
  • Validated most of the Delivered/Customized Business Processes in PeopleSoft 8.8 GL
  • Verified the various Online/Inbound/Outbound/Middleware/Reconciliation Interfaces
  • Involved in Smoke, Grey Box, Ad Hoc, Functional, Regression and Back end Testing
  • Designed and Executed the Test Cases using Quality Center 9.0 (QC)
  • Performed the Defect Reporting/Tracking through Quality Center 9.0 (QC)
  • Transferred files through FTP Inbound/outbound directories using Hummingbird (FTP Tool)
  • Involved in the tracking and updating of the Requirement Traceability Matrix, thereby linking the test cases with the business requirements as well as other associated test cases in Quality Center.
  • Ensured the project met the defined business requirements and followed the defined quality process

Environment: PeopleSoft General Ledger 8.8, Oracle 9i, Tivoli Maestro, Hummingbird, Windows 2000 Professional, QTP 8.2/9.1, Test Director 8.0

Confidential, Sunnyvale, CA

Performance Engineer

Responsibilities:

  • Responsible for implementing the LoadRunner-based infrastructure, including architecting the load-testing environment and integrating hardware and software with LoadRunner.
  • Developed performance-testing processes and procedures and was responsible for performance testing of various Spansion third-party systems.
  • Designed, created & executed business realistic scenarios across multiple LOBs.
  • Acted as the LoadRunner expert; met with SDC engineers to determine performance requirements and goals and defined test strategies based on requirements and architecture.
  • Performed performance testing using LoadRunner and developed test scripts and scenarios (a representative Vuser script is sketched below).
  • Identified functionality and performance issues, including: deadlock conditions, database connectivity problems, and system crashes under load.
  • Developed performance test plans for new application releases and coordinated the performance engineering team through completion of performance-testing projects
  • Executed performance test scenarios and analyzed results
  • Profiled slow-performing areas of the application and system resources, and identified bottlenecks and opportunities for performance improvements.
  • Wrote custom LoadRunner functions and programs to support the load-testing effort, monitored resources to identify performance bottlenecks, analyzed test results, reported findings to the clients, and provided recommendations for performance improvements as needed.
  • Logged and prioritized performance bugs and worked with the engineering and program management teams on timely resolutions
  • Interacted with SDC engineers to resolve issues and defects

Environment: Windows 2000/NT, UNIX, SAP Web, SOA, LoadRunner, Oracle 10g and XML/SOAP.
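
For illustration, a minimal web Vuser script of the kind developed here might look like the sketch below. It only shows how one business step is wrapped in a LoadRunner transaction and fed from a VuGen parameter file; the URL, form field names and parameters (pUserName, pPassword) are placeholders, not the actual Spansion systems.

    Action()
    {
        /* Time a single business step as a named LoadRunner transaction. */
        lr_start_transaction("T01_Login");

        /* Placeholder form post; {pUserName}/{pPassword} are substituted from a VuGen data file. */
        web_submit_data("login",
            "Action=http://app.example.com/login",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=username", "Value={pUserName}", ENDITEM,
            "Name=password", "Value={pPassword}", ENDITEM,
            LAST);

        /* LR_AUTO lets the runtime mark the transaction passed or failed automatically. */
        lr_end_transaction("T01_Login", LR_AUTO);

        return 0;
    }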

Confidential, Indianapolis, IN

Performance Engineer

This project was developed for the Liberty Mutual Insurance Company Personal Lines Internet Quoting (PLIQ) application.

Responsibilities:

  • Responsible for capacity planning for PLIQ application to meet changing demands.
  • Used Wily Introscope to gather performance data for problem solving, trend analysis and capacity planning
  • Responsible for writing the performance test plan and performance strategy document
  • Responsible for creating LoadRunner scripts; made scripts more dynamic through parameterization and correlation (see the correlation sketch below).
  • Acted as the LoadRunner expert; trained and mentored junior team members.
  • Determined test strategies based on non-functional requirements and architecture.
  • Designed and developed performance test scenarios and test data for the company's applications, APIs and data-processing engine.
  • Executed performance test scenarios, analyzed results and reported findings to the project manager.
  • Tuned application server JVM properties by experimenting with different settings to find the best application performance, particularly parameters related to memory usage and garbage collection.
  • Profiled slow-performing areas of the application and system resources, and identified bottlenecks and opportunities for performance improvements using Wily Introscope.
  • Studied application performance and maximum scalability with LoadRunner via critical parameters such as number of users, response times, hits per second (HPS) and throughput.
  • Identified performance issues, including: deadlock conditions, database connectivity problems, and system crashes under load.
  • Logged and prioritized performance bugs and worked with the engineering and program management teams on timely resolutions
  • Tuned systems for optimal performance and characterized them on multiple platform and configuration combinations
  • Worked closely with the application architecture, development and management teams to resolve issues and defects.

Environment: Windows 2000/NT, UNIX, Java, J2EE, LoadRunner, Test Director, Introscope, Oracle 10g.
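
As an example of the correlation mentioned above, the sketch below registers a boundary-based capture before the request that returns a dynamic value and reuses that value in a later request instead of a recorded literal. The URLs, boundaries and the QuoteId parameter are illustrative placeholders, not the real PLIQ pages.

    Action()
    {
        /* Register the capture BEFORE the request that returns the dynamic value. */
        web_reg_save_param("QuoteId", "LB=quoteId=", "RB=&", "ORD=1", LAST);

        lr_start_transaction("T01_StartQuote");
        web_url("startQuote",
            "URL=http://pliq.example.com/quote/start",   /* placeholder URL */
            "Mode=HTML",
            LAST);
        lr_end_transaction("T01_StartQuote", LR_AUTO);

        /* Reuse the correlated value rather than a hard-coded, recorded literal. */
        lr_output_message("Correlated quote id: %s", lr_eval_string("{QuoteId}"));

        lr_start_transaction("T02_ViewQuote");
        web_url("viewQuote",
            "URL=http://pliq.example.com/quote/view?quoteId={QuoteId}",
            "Mode=HTML",
            LAST);
        lr_end_transaction("T02_ViewQuote", LR_AUTO);

        return 0;
    }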

Confidential, Sacramento, CA Aug ’06 to Jul ’07

Test Engineer

This project was developed for a California state government agency, the Legislative Data Center (LDC). The History application was developed to replace an existing legacy application by defining the integration of various technology components and interfacing with new external systems such as Legal Services, ISWEB, Bill Referral and Senate Journal.

Responsibilities:

  • Installed, customized and administered Mercury Interactive's TestDirector and Performance Center; troubleshot issues encountered and evaluated and performed upgrades on the tools in the Mercury test suite
  • Responsible for Test Management, Performance and Functional test execution, Defect Tracking and Reporting.
  • Responsible for software Reviews, Inspections and Walkthroughs
  • Coordinated with team members and the lead in preparing detailed test plan and test cases by understanding the business logic and user requirements for manual and automated testing.
  • Responsible for white-box testing, along with JUnit tests, to analyze whether the code followed the intended design and coding practices and to examine the expected and unexpected behavior of the application.
  • Installed the Java profiler JProbe to diagnose memory usage, performance and test coverage, and to pinpoint and repair the root cause of application code performance and stability problems.
  • Proactively worked with the development team on problem analysis, reproduction and defect resolution; maintained a close working relationship with teammates and management; led the efforts of other QC members when needed.
  • Involved in testing Business Objects reports and tested various classes, filters and conditions in the Universes
  • Tested Business Objects reports for correct data in different scenarios such as drill-down and slice-and-dice
  • Validated data warehouse ETL processes, XML, XML Schema structures and mappings; verified extraction, transformation and loading of data into the data warehouse by the ETL tool Informatica.
  • Responsible for data conversion, migration and data integrity testing, and for database and data validation
  • Developed SQL queries to extract, manipulate and calculate information to satisfy large-volume data and reporting requirements.
  • Responsible for conducting load, stress and performance testing using LoadRunner, and developed VuGen test scripts and scenarios.
  • Responsible for load tests using LoadRunner, creating scenarios that simulate real-time user load against the application (see the sketch below).
  • Set up LoadRunner resource monitors to identify performance bottlenecks, analyzed test results, reported findings to the clients, and provided recommendations for performance improvements as needed.
  • Tuned systems for optimal performance and characterized them on multiple platform and configuration combinations
  • Documented test findings by analyzing the results from both functional and load testing scenarios. Identified problematic areas and recommended solutions to the developers and upper management.

Environment: Windows 2000/NT, UNIX, Java, J2EE, Quick Test Pro, LoadRunner, JProbe, JMeter, Test Director, Informatica, Oracle 10g, SOA, WebLogic, Struts, BO, StarTeam and JBuilder.
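
The load scenarios referenced above can be sketched with a VuGen script like the one below, which holds Vusers at a rendezvous so a search fires concurrently once the Controller releases them, then paces iterations with think time. The URL and the pBillNumber parameter are hypothetical placeholders, not the actual History application; the rendezvous release policy itself is configured in the Controller scenario.

    Action()
    {
        /* Vusers wait here until the Controller's rendezvous policy releases them,
           producing a concurrent burst of searches. */
        lr_rendezvous("bill_search");

        lr_start_transaction("T01_BillSearch");
        web_url("search",
            "URL=http://history.example.gov/bills/search?measure={pBillNumber}",  /* placeholder */
            "Mode=HTML",
            LAST);
        lr_end_transaction("T01_BillSearch", LR_AUTO);

        /* Pace the iteration the way a real user would pause to read results. */
        lr_think_time(10);

        return 0;
    }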

Confidential, San Ramon, CA

Performance Engineer

Responsibilities:

  • Responsible for identifying and gathering test requirements from users/stakeholders to decide on the technical test architecture and infrastructure to be used and to define a test design process
  • Supported SBC in developing testing efforts, handling project transitions from the offshore team, setting up the testing network, and performing functional, integration and regression testing and test automation.
  • Responsible for setting up Mercury Quality Center for requirements, the performance test plan and test scripts.
  • Responsible for writing Test Plans for Internal and Integration Test environments
  • Worked with project managers to provide level of effort for testing activities.
  • Performed High Level Design document reviews. Participated in Feature Design review meetings and presented test case review, strategy and feature functionality. Involved in White-box testing of the application as well.
  • Utilized Test Director for tracking requirements and communicating them to the team during the test process and integrated this capability with e-mail, ensuring that all the communication about a requirement is in a single location
  • Performed Load and Performance testing using LoadRunner to validate system response time for designated transactions or business functions.
  • Configured Web/application/database server monitoring using the LoadRunner Controller.
  • Used IP spoofing to generate and assign different IP addresses to Virtual Users to emulate real-world scenarios and expose load-balancing issues
  • Activated and configured monitors and added the desired performance counters to the graphs
  • Performed SQL queries to validate the data in the back-end database and to check the data flow between different modules
  • Analyzed results for transaction response time, transactions under load, transaction summary by Vusers, hits per second and throughput
  • Used the Transactions and Web Resource monitors to pinpoint bottlenecks.
  • Responsible for developing scripts for data-driven tests and using correlation to supply unique data to the server.
  • Responsible for creating various user-defined functions for script enhancement and business-logic verification (see the sketch below).
  • Utilized Test Director (defect-tracking tool) for communication with production personnel, developers and team members; conducted defect tracking and produced management metrics
  • Documented test results, identified and communicated issues.
  • Worked closely with software developers to isolate, track, and troubleshoot defects.

Environment: UNIX, Windows NT, Oracle, DB2, Java, JUnit, HTML, XML/SOAP, QTP, LoadRunner, Test Director, JavaScript, JSP, IIS, J2EE, WebSphere 5.1.
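
A user-defined function of the kind mentioned above might look like the sketch below: a helper that compares a correlated response value against an expected value and passes or fails the enclosing transaction accordingly. The boundaries, URL and parameter names (OrderTotal, pExpectedTotal) are hypothetical and only illustrate the pattern.

    /* Hypothetical helper: fail the transaction when the captured value
       does not match the business-rule expectation. */
    int verify_order_total(char *expected)
    {
        char *actual = lr_eval_string("{OrderTotal}");   /* captured via web_reg_save_param */

        if (strcmp(actual, expected) != 0) {
            lr_error_message("Business rule failed: expected %s, got %s", expected, actual);
            return LR_FAIL;
        }
        return LR_PASS;
    }

    Action()
    {
        /* Capture the displayed order total from the response. */
        web_reg_save_param("OrderTotal", "LB=id=\"total\">", "RB=<", "ORD=1", LAST);

        lr_start_transaction("T01_SubmitOrder");
        web_url("submitOrder",
            "URL=http://billing.example.com/order/submit",   /* placeholder URL */
            "Mode=HTML",
            LAST);

        /* The helper's return value decides the transaction status explicitly. */
        lr_end_transaction("T01_SubmitOrder",
                           verify_order_total(lr_eval_string("{pExpectedTotal}")));

        return 0;
    }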

Confidential

Strategic Accounting Management System.

Confidential is a secure, web-based account management and reporting system that provides access to more than 140 standard reports, including purchasing and travel card management, cardholder activity, and merchant activity. SAM batch-processes all transaction data and makes it available 24-48 hours after merchant posting; it is a powerful relational database that offers extensive reporting capabilities and serves to automate transactions with default accounting. The mapping (electronic file transfer) feature offers the ability to feed your general ledger with transaction data.

Responsibilities:

  • Understood requirement specifications and design documents.
  • Involved in test design with respect to the test plan, using black-box testing techniques
  • Involved in test execution and defect reporting using Bugzilla
  • Performed sanity, GUI, functional and regression testing.
  • Communicated with members of other teams (Development, Technical Support, Business Support) to resolve issues
  • Involved in test reporting on a daily and weekly basis in the company-prescribed format.
  • Involved in defect analysis and reporting
  • Involved in pre-production testing.
  • Participated in bug reviews
  • Prepared the test summary report
  • Maintained issue logs

Environment: Test Director 8.0, UNIX, DB2, PL/SQL, ASP, C#, HTML, XML.
