LISA/SoapUI/JMeter Tester Resume
AZ
SUMMARY:
- 8 years of professional experience in Information Technology, with extensive experience in Manual and Automated Testing.
- Experience in testing Web-based and Client-Server applications in different environments.
- Strong knowledge of Software Development Life Cycle (SDLC)
- Proficient in analyzing Business/ System Requirements documents and Use Cases.
- Expertise in preparing/maintaining Requirement Traceability Matrix.
- Ability to write High Level Test Plans, Test Cases, Test Procedures and Test Scripts from Requirements and Use Cases.
- Experienced in software analysis, Requirements Management, Quality Assurance, Modeling, Configuration Management and Change Management.
- Extensive experience with testing tools such as DevTest, JIRA, Quality Center, iTKO LISA, QTP, Requisite Pro, and HP ALM.
- Extensive experience with object-oriented analysis and design and with developing use-case models.
- Ability to meet deadlines and handle pressure coordinating multiple tasks.
- Worked with business stakeholders, application developers, production teams, and cross-functional units to identify business needs and discuss solution options.
- Performed data analysis and data profiling using SQL on various source systems, including SQL Server.
- Good Knowledge on Selenium WebDriver and TestNG Framework.
- Good working experience with SoapUI, including writing assertions.
- Performed functional and load testing using SoapUI.
- Excellent work experience using JMeter for performance testing.
- Excellent experience performing Regression and Sanity Testing against databases including Derby, DB2, Oracle, SQL Server, and MySQL on Windows and Linux operating systems.
- Good Experience working with JMS Automation
- Expertise in performing different types of Testing: System, Functional, GUI, Regression, Smoke, Database Integrity, User Acceptance (UAT), Stress/Load and White Box Testing.
- Experienced in creating/maintaining Test Logs, Summary Reports.
- Good understanding of Database Management Systems (Oracle, DB2, SQL Server).
- Extensive working knowledge in UNIX and Windows platforms.
- Excellent Interpersonal and Customer Relational Skills. Proven Communication, Presentation Skills and Leadership Qualities.
TECHNICAL SKILLS:
Operating Systems: MS-DOS, UNIX, MVS, Sun Solaris, Windows 9x/NT/2000/XP.
Languages: VB.NET, Java, SQL, PL/SQL, VB, UNIX Shell Scripting, TSL, JavaScript, VBScript
RDBMS: Oracle 9i/10g/11g, MS SQL Server, DB2
Web Technologies: HTML, ASP, XML, JavaScript, Java Servlets, ASP.NET, JSP.
Packages: MS Office (MS Access, MS Word, MS FrontPage, and MS Excel)
Application Servers: WebLogic 6.0, IIS, Apache Server, WebSphere, NetDynamics 4.0.
Testing Tools: CA DevTest, iTKO LISA 7.5, Rational Suite (Requisite Pro, Rose, Robot, Test Manager, ClearQuest, ClearCase)
Version Control: Rational ClearCase, VSS, PVCS, CVS, StarTeam.
PROFESSIONAL EXPERIENCE:
Confidential, AZ
LISA/SoapUI/JMeter Tester
Roles and Responsibilities
- Coordinated with the development team to gather requirements for the services
- Coordinated with the Testing Points of Contact to ensure that data was populated to test the web services
- Coordinated with the consumers' team to validate that the responses returned after testing were correct
- Installed and configured SoapUI to test the SOAP services
- Imported the Project.xml file provided by the developers and tested the SOAP web service
- Used assertions to verify that the response data was valid
- Created Test Cases using element locators and Selenium WebDriver (see the sketch at the end of this section).
- Executed Selenium test cases and reported defects.
- Performed Cross-Browser Testing and Data-Driven Testing
- Enhanced Test Cases using Java programming features
- Documented the different parameters used in our web service testing
- Installed and configured the CA LISA client on our machines to test SOAP and REST Web Services
- Virtualized the services using the Request and Response pairs method
- Modified the existing VSIs to match newly created versions of the services
- Created test cases against the web service and added filters and assertions.
- Strong ability to communicate test progress, test results, and other relevant information to project stakeholders and management.
- Strong knowledge of the Software Development Lifecycle and related methodologies, including Agile Methodologies.
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Strong knowledge of system testing best practices and methodologies.
- Used the Virtual Service Image Recorder to create Virtual Service Images and Virtual Service Models
- Deployed the Virtual Service Images onto the Virtual Service Environment
- Troubleshot issues raised after deployment of the Virtual Service Images
- Involved in the design, development, enhancement, and debugging of the applications.
- Performed SQL Server RDBMS database development using T-SQL programming: queries, stored procedures, and views.
- Performed data analysis and data profiling using SQL on various source systems including SQL Server
- Developed and documented scripts used for data conversion and migration, program development, logic, coding, testing, and corrections.
- Worked extensively on functional and load testing using SoapUI and on performance testing using JMeter.
- Attended daily Scrum meetings and offshore meetings for knowledge transfer
Environment: DevTest, HP ALM, SoapUI, JMeter, MS Office, MS Visio, Java 1.7, SQL Server
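Illustrative only: a minimal sketch of the kind of data-driven Selenium WebDriver/TestNG test described above. The application URL, element locators, and credentials are hypothetical placeholders, not details from the project.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class LoginDataDrivenTest {
    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        driver = new ChromeDriver();              // swap in FirefoxDriver, etc., for cross-browser runs
        driver.get("https://example.com/login");  // hypothetical application URL
    }

    // Test data that would normally be fed from an Excel sheet or CSV
    @DataProvider(name = "credentials")
    public Object[][] credentials() {
        return new Object[][] {
            {"validUser", "validPass", true},
            {"badUser",   "badPass",   false}
        };
    }

    @Test(dataProvider = "credentials")
    public void loginShowsExpectedResult(String user, String pass, boolean shouldSucceed) {
        driver.findElement(By.id("username")).sendKeys(user);   // element locators are placeholders
        driver.findElement(By.id("password")).sendKeys(pass);
        driver.findElement(By.id("loginButton")).click();
        boolean dashboardShown = !driver.findElements(By.id("dashboard")).isEmpty();
        Assert.assertEquals(dashboardShown, shouldSucceed);
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```

A real cross-browser suite would typically parameterize the driver creation rather than hard-code ChromeDriver.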
Confidential, WA
LISA Virtualization Consultant
Roles and Responsibilities
- Installed LISA 7.5 in a Windows environment
- Executed the flows in IST and made sure that the flows ran correctly with the given data
- Recorded the transactions while pointing to the LIVE environment.
- Executed the flows while pointing to the Performance Test environment and made sure that they ran correctly while pointing to LISA.
- Used SOAP-based web services for the creation of VSIs and VSMs
- Created VSIs and VSMs using the RAW TRAFFIC file
- Created a pivot table to track how many error codes were present in the VSIs created during the daily maintenance period.
- Strong knowledge of the Software Development Lifecycle and related methodologies, including Agile Methodologies.
- Eliminated all the 500 and 400 error codes from the VSIs by running the utility from AT&T
- Deployed the VSMs onto LISA while pointing to the VSIs
- Made sure that no NO MATCH or SIGNATURE issues were present in the VSIs
- Deployed one-off Request and Response pairs given to us.
- Strong ability to communicate test progress, test results, and other relevant information to project stakeholders and management.
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Strong knowledge of system testing best practices and methodologies.
- Performed data analysis and data profiling using SQL on various source systems including SQL Server
- Interacted directly with customers to gather detailed reporting and data requirements.
- Created joins and sub-queries for complex queries involving multiple tables (see the sketch at the end of this section).
- Created SQL reports and data extraction and data loading scripts for different databases and schemas.
- Designed a report on server performance monitoring using Task Manager, giving management a detailed view of server performance.
- Modified complex Business Objects universes and reports based on the client's business requirements.
- Attended daily meetings with Developers and offshore team and participated in defect review meetings
- Worked on multiple projects simultaneously
Environment: iTKO LISA 7.5, SQL Server, MS Office, MS Visio, Windows 7, SOA, UNIX, WinSCP, Eclipse, SVN, Agile methodology, Java 1.7.
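Illustrative only: a minimal sketch of the kind of multi-table join/sub-query reporting described above, executed over JDBC against SQL Server. The connection string, table names, and column names are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ErrorCodeReport {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://dbhost:1433;databaseName=Reporting"; // hypothetical connection
        String sql =
            "SELECT s.service_name, t.error_code, COUNT(*) AS occurrences " +
            "FROM transactions t " +
            "JOIN services s ON s.service_id = t.service_id " +                // join across tables
            "WHERE t.error_code IN (SELECT error_code FROM tracked_errors) " + // sub-query filter
            "GROUP BY s.service_name, t.error_code " +
            "ORDER BY occurrences DESC";

        try (Connection con = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = con.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.printf("%s  %s  %d%n",
                    rs.getString("service_name"),
                    rs.getString("error_code"),
                    rs.getInt("occurrences"));
            }
        }
    }
}
```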
Confidential, Plano, TX
Sr. QA Tester
Responsibilities:
- Installed and configured various builds of DevTest on Windows and Linux.
- Performed regression testing on the builds to make sure that the required functionality is working as expected.
- Performed regression testing with different kinds of databases, including Oracle, SQL Server, MySQL, and Derby.
- Worked with Windows as well as Red Hat Linux operating systems for regression testing.
- Performed Sanity testing whenever required on the test cases.
- Automated the existing manual JMS test cases and created test suites out of them (see the sketch at the end of this section).
- Created Staging documents as and when required.
- Used filters and assertions on test cases for proper validation
- Deployed the test suites and made sure the test cases ran correctly.
- Very good knowledge of Continuous Application Insight; created baselines and deployed services to the Virtual Service Environment
- SQL Server RDBMS database development using T-SQL programming, queries, stored procedures, views.
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Demonstrated ability to gather and understand customers' requirements and to find and produce company data to support business decisions.
- Performed data analysis and data profiling using SQL on various source systems including SQL Server
- Able to deal effectively with unclear situations and short time frames
- Wrote automation scripts for the DevTest portal using Geb and Selenium.
- Strong ability to communicate test progress, test results, and other relevant information to project stakeholders and management.
- Strong knowledge of system testing best practices and methodologies.
- Used Selenium WebDriver for Cross-Browser Testing and Data-Driven Testing.
- Automated Selenium Test Cases using the already existing TestNG Framework
- Attended daily meetings with the offshore team and made sure that glitches were addressed.
- Worked as team leader for the offshore team, assigning tasks and training them in automation scripting using Geb
Environment: DevTest, HP ALM, GGTS, Oracle, SQL, MySQL, MS Office, MS Visio, Java 1.7
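Illustrative only: a minimal sketch of a JMS request/reply check of the kind automated above, assuming an ActiveMQ broker and the activemq-client library on the classpath. The broker URL, queue names, and message content are hypothetical.

```java
import javax.jms.Connection;
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import org.apache.activemq.ActiveMQConnectionFactory;

public class JmsRequestReplyCheck {
    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory factory =
            new ActiveMQConnectionFactory("tcp://broker-host:61616"); // hypothetical broker URL
        Connection connection = factory.createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        Queue requestQueue = session.createQueue("orders.request");   // hypothetical queue names
        Queue replyQueue   = session.createQueue("orders.reply");

        MessageProducer producer = session.createProducer(requestQueue);
        MessageConsumer consumer = session.createConsumer(replyQueue);

        // Send a request and wait up to 10 seconds for the reply
        producer.send(session.createTextMessage("<order id=\"123\"/>"));
        TextMessage reply = (TextMessage) consumer.receive(10_000);

        if (reply == null || !reply.getText().contains("ACCEPTED")) {
            throw new AssertionError("Expected an ACCEPTED reply, got: "
                + (reply == null ? "no message" : reply.getText()));
        }
        connection.close();
    }
}
```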
Confidential, Madison, WI
Sr. QA Analyst
Roles and Responsibilities
- Worked closely with teams of clients, developers and testers throughout the entire development life cycle, to identify and manage requirements.
- Documented, implemented, monitored, enforced all processes for testing as per standards defined by the organization.
- Worked on Guidewire Policy center and Billing Center to test functionality related to Policies and Billing Cycle.
- Strong knowledge of the Software Development Lifecycle and related methodologies, including Agile Methodologies.
- Coordinated with Claims Services team to create policies required for claims testing.
- Performed Policy and Billing Center Integrations.
- Prepared Test Cases using User Stories and Business Rules from the boot camps.
- Prepared and Maintained Requirements Traceability Matrix to ensure that the test cases cover all the business requirements.
- Performed data analysis, data verification, and data profiling using SQL on various source systems including SQL Server.
- Strong ability to communicate test progress, test results, and other relevant information to project stakeholders and management.
- Prepared Test Cases for Functional, Integration and End-to-End user scenarios.
- Performed Integration, System, Regression, Performance and User Acceptance testing of an application.
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Worked with BAs and SMEs to identify and analyze functional gaps in the requirements.
- Worked with legacy teams to make sure that the integration is working as expected.
- Enhanced scripts by adding datasets, filters, assertions, and parameterization using iTKO LISA 5.0.28
- Used Interactive Test Run (ITR) Mode to play back the recorded/created iTKO LISA test scripts.
- Created and maintained automated smoke test scripts for Guidewire Policy Center using iTKO LISA.
- Tested web services using SoapUI and iTKO LISA, validating the XML requests and responses (see the sketch at the end of this section).
- Worked with Business Analysts and Functional experts for analyzing and prioritizing test cases for automation testing.
- Responsible for test data preparation for the system / integration testing, traceability matrix to ensure the test case coverage and defect logging.
- Responsible for running automated Smoke Test Suites after each build.
- Interacted with development team for defect prioritization and resolution using Defect Tracking Tool JIRA.
- Strong knowledge of system testing best practices and methodologies.
- Strong experience in writing and tuning T-SQL (DDL, DML, DCL) and stored procedures to improve database performance and availability.
- Expert in various source transformations, including flat files, XML, and relational systems.
- Participated in defect review meetings with team members.
- Communicated defects encountered during regression testing and followed up with developers until all issues were resolved.
Environment: Guidewire Policy Center and Billing Center, Oracle 11g, SQL Server, MS Office, MS Visio, Windows XP, iTKO Lisa 5.0.28/7, JIRA, HTML, Rally, Toad, SOA, Altova XML Spy, SharePoint, Agile methodology.
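Illustrative only: a minimal sketch of sending a SOAP request and checking the XML response, in the spirit of the web service testing described above, using the JDK's SAAJ API (available in Java 7). The endpoint URL, namespace, operation, and test data are hypothetical.

```java
import javax.xml.namespace.QName;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPBody;
import javax.xml.soap.SOAPConnection;
import javax.xml.soap.SOAPConnectionFactory;
import javax.xml.soap.SOAPEnvelope;
import javax.xml.soap.SOAPMessage;

public class PolicyLookupSoapCheck {
    public static void main(String[] args) throws Exception {
        // Build the request envelope
        SOAPMessage request = MessageFactory.newInstance().createMessage();
        SOAPEnvelope envelope = request.getSOAPPart().getEnvelope();
        envelope.addNamespaceDeclaration("pol", "http://example.com/policy");      // hypothetical namespace
        request.getSOAPBody()
               .addChildElement(new QName("http://example.com/policy", "getPolicy", "pol"))
               .addChildElement("policyNumber")
               .addTextNode("P-1001");                                             // hypothetical test data
        request.saveChanges();

        // Call the service and assert on the response body
        SOAPConnection connection = SOAPConnectionFactory.newInstance().createConnection();
        SOAPMessage response = connection.call(request, "https://example.com/policyService"); // hypothetical endpoint
        SOAPBody body = response.getSOAPBody();
        if (body.hasFault()) {
            throw new AssertionError("SOAP fault: " + body.getFault().getFaultString());
        }
        connection.close();
    }
}
```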
Confidential, Chicago, IL
Sr. QA Analyst
Roles and Responsibilities
- Reviewed Business Requirements, Design Documents and specs to prepare Test Plans for System Testing and Integration Testing.
- Designed and developed test scenarios (Use Cases) as per the requirements in Quality Center.
- Responsible for doing coverage analysis / gap analysis for the given functionality.
- Executed test cases, logged defects, retested and closed the defects after dev fix.
- Used Quality Center for writing, executing test cases and managing defects.
- Wrote moderate to complex SQL queries to retrieve and modify the data needed for testing.
- Tested end-to-end internal and external Data Exchanges along with related XMLs and schemas as part of web service testing (see the sketch at the end of this section).
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Strong ability to communicate test progress, test results, and other relevant information to project stakeholders and management.
- Worked closely with developers to make sure that the fixes are done in time and efficiently.
- Performed regression testing for the defects fixed in every build.
- Worked as an Automation tester to write and execute automation scripts for regression and smoke testing using QTP.
- Wrote and simulated multiple end to end business scenarios for performance testing and automation testing.
- Strong knowledge of system testing best practices and methodologies.
- Prepared income and demand presentations in PowerPoint and Excel
- Performed marketplace analysis to attain product goals and strategies
- Led the planning, identification, development, and completion of design changes to maintain product metrics reports.
- Strong knowledge of the Software Development Lifecycle and related methodologies and also Agile Methodologies.
- Team player and rapid learner with the aptitude to work in fast-paced environments
- Used Infrared (performance monitoring tool) for monitoring and diagnosing performance problems.
- Worked with Load Runner team to create and execute performance scripts for our module.
- Created Reports and Graphs using the Quality Center client for different modules.
- Generated defect reports from Quality Center for status meetings.
- Prepared slides, demos and mentored new testing team resources.
- Acted as point of contact for developers and testers for the defects logged by our team.
- Responsible for Reviewing test cases and defects logged by all the team members
- Created and maintained automation test scripts using QTP for smoke and regression testing, using data provided in Excel sheets.
Environment: Java, J2EE, Oracle 10g, SQL, XML, Quality Center 9.2, QTP 9.5, Load Runner 9.0, Windows XP, Clear Case, Tibco, Cognos, TOAD, iTKO LISA, Agile Methodology, JavaScript.
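Illustrative only: a minimal sketch of validating an exchanged XML document against its schema, in the spirit of the XML/schema testing described above, using the JDK's javax.xml.validation API. The schema and document file names are hypothetical.

```java
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class DataExchangeSchemaCheck {
    public static void main(String[] args) throws Exception {
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("dataExchange.xsd"));     // hypothetical schema file
        Validator validator = schema.newValidator();
        try {
            validator.validate(new StreamSource(new File("response.xml"))); // hypothetical response file
            System.out.println("response.xml is valid against dataExchange.xsd");
        } catch (org.xml.sax.SAXException e) {
            throw new AssertionError("Schema validation failed: " + e.getMessage());
        }
    }
}
```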
Confidential, Madison, WI
Sr. QA Analyst
Roles and Responsibilities
- Performed functional, regression, end-to-end and usability testing.
- Involved in Unit testing, Integration testing and UAT.
- Wrote, updated and executed test cases manually using Quality Center.
- Worked closely with Business Analysts to find out the gaps in functional requirements.
- Created and maintained a requirement traceability matrix to identify gaps in requirements and test case coverage.
- Strong knowledge of the Software Development Lifecycle and related methodologies, including Agile Methodologies.
- Created and executed regression test scenarios using automation tools and VB scripting.
- Duties included documentation reviewing and analysis.
- Strong knowledge of system testing best practices and methodologies.
- Filed, followed and verified bugs and bug reports using Clear Quest.
- Worked closely with project management teams.
- Responsible for uploading test cases from excel sheets to Quality Center.
- Responsible for analyzing the defects logged by client in pilot mode.
- Performed Smoke Test after each new build.
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Helped data migration team in testing the migrated data through back end.
- Worked closely with the development team to discuss and prioritize defect fixes.
- Created and worked on user Security Matrix.
- Involved in training the users for UAT.
- Was part of the UAT support team for a group of 40 UAT testers (users).
- Good Knowledge in Logical and Physical Data modeling, creating new data models and data dictionaries.
- Was responsible for interacting with clients and explaining the functionality and the navigational steps to them.
- Worked closely with performance and stress testing team.
Environment: IBM Curam Framework, Java, J2EE, Oracle 10g, XML, SQL, PL/SQL, Windows XP, SAHI, WinCVS, TOAD, Quality Center and Load Runner.
Confidential, Hartford, CT
QA Analyst / Tester
Responsibilities:
- Reviewed Business Requirement and Technical Specification Documents.
- Involved in Developing High Level Test Plans, Test Cases, Test Scripts and Traceability Matrix with screen prints of the policy screens wherever required.
- Documented Test cases based on the DOS policy outlines in Quality Center.
- Responsible for GUI Testing, System Testing, Regression Testing and Functional Testing.
- Was involved in the regression testing during automation process.
- Prepared Status Summary Reports with details of executed, passed and failed Test Cases.
- Created XML files using XML schema.
- Strong knowledge of the Software Development Lifecycle and related methodologies, including Agile Methodologies.
- Good experience in overseeing the design, development, and implementation of quality assurance standards for software testing.
- Strong knowledge of system testing best practices and methodologies.
- Made extensive use of MS Office tools to create and maintain documents such as test plans, test execution and test results documents.
- Responsible for updating Quality Center with the test case results and findings.
- Involved in Data Analysis, Data Validation and Data extraction.
- Used Quality Center as the defect logging tool and passed defects on to the developers
- Extensively used various file compare tools like Beyond Compare, UltraEdit, and Adobe Reader to compare the Rating Work Sheets and Policy Issuance Files (PIF) generated from the new Web application with those of the current DOS application.
- Maintained Test Logs, Test Summary Reports and participated in Defect Review / Status / GO-NOGO Meetings.
Environment: ASP.NET, VB.NET, IIS Server, XML, DB2, Vitria, WINXP/NT, QTP, Quality Center, Lotus Notes, Irfan View, TSO, Beyond Compare.
Confidential
QA Tester
Responsibilities:
- Developed Test Plan and overall Test Strategy for the applications
- Participated in Business Analysis, Requirement Analysis, Use-Case Analysis (Rose) and Data Analysis.
- Developed Test Cases and Test Scripts for System and UAT testing
- Tested all applications End-End Manually.
- Validated the interface from Front-end to back-end tables.
- Validated the Swift Message Application User Roles and profiles to ensure the security through Logon process.
- Validated the Data staging process before the Message creation.
- Tested the messaging formats and trade data through SWIFT Alliance Network.
- Performed regression testing to ensure other components were not impacted by modifications.
- Performed Performance testing and validated the stats by running SQL stored procedures (see the sketch at the end of this section).
- Validated the back-end data by using SQL and PL/SQL.
- Worked extensively with Unix Shell Scripts (Ksh, Bsh)
- Performed Smoke testing after all the code was moved, to validate the components that were enhanced to support the new front end and to confirm that other components were not impacted by the modifications.
- Tracked defects using Test Director and conducted Bug-Review meetings.
- Developed Test Summary Reports and participated in GO / NO-GO meetings
- Expertise in demonstrating the projects to different groups.
- Participated in regular project status meetings related to testing
Environment: Java, J2EE, EJB, XML, WebLogic 6.1, Oracle 8i, Sybase, PowerBuilder, VB, TOAD, Test Director, Rational Suite 2003, Windows NT, Unix (Sun Solaris), MS Project, MS Word, MS Excel, MS Visio, MS Office.
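Illustrative only: a minimal sketch of running a stored procedure over JDBC and then validating the resulting stats with a query, in the spirit of the back-end validation described above. The connection string, procedure name, table, and column names are hypothetical.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class StatsProcedureCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@dbhost:1521:ORCL"; // hypothetical Oracle connection
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            // Run the stored procedure that refreshes the statistics (name is hypothetical)
            try (CallableStatement cs = con.prepareCall("{call refresh_message_stats(?)}")) {
                cs.setString(1, "2023-01-31"); // hypothetical run-date parameter
                cs.execute();
            }
            // Validate the refreshed stats against expected counts
            try (PreparedStatement ps = con.prepareStatement(
                     "SELECT message_type, message_count FROM message_stats"); // hypothetical table
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s: %d messages%n",
                        rs.getString("message_type"), rs.getInt("message_count"));
                }
            }
        }
    }
}
```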