
SOA Tester and Virtualization Consultant Resume

Appleton, WI

SUMMARY:

  • 7+ years of experience in Software Quality Assurance, with strong experience in Service/SOA/API Testing and Service Virtualization using the CA DevTest/iTKO LISA toolset.
  • Strong experience in testing Middleware/SOA services as standalone components before they are consumed by external applications, identifying integration, data, and mapping issues ahead of time and thereby shifting quality left in the overall SDLC.
  • Expertise in testing and virtualizing several types of SOA-based services, including Web Services (SOAP/HTTP(S)), REST services (XML/JSON), MQ, and JMS.
  • Experienced in using XSD, WSDL, XPath, and XMLSpy for service testing and virtualization.
  • Experienced in understanding service virtualization needs/requirements and creating VSIs using WSDLs, WADLs, recordings, and request/response pairs.
  • Experience working with various ESB teams that have used integration tools such as webMethods, IIB (IBM Integration Bus), DataPower, TIBCO, and JBoss for developing services.
  • Experience in analyzing and supporting various teams in testing and debugging SOA-based services and applications.
  • Good experience in Agile software delivery using the SAFe methodology.
  • Excellent interpersonal skills; committed, result-oriented, and hard-working, with a zeal for learning new technologies.
  • Excellent analytical, critical-thinking, and creative problem-solving skills.
  • Excellent communication skills and the ability to work effectively and efficiently both in teams and individually.

TECHNICAL SKILLS:

Testing Tools: iTKO LISA, DevTest, SoapUI, UFT, QTP, Quality Center

Languages: Java, SQL, XML, HTML, VB Script, Java Script.

Databases: Oracle, DB2, SQL Server, MySQL, MS Access

Database Tools: Toad, DbVisualizer

Version Control Tools: Stash, Git, Subversion, Visual SourceSafe.

Operating Systems: Windows, UNIX & Linux

PROFESSIONAL EXPERIENCE:

Confidential, Appleton, WI

SOA Tester and Virtualization Consultant

Responsibilities:

  • Tested over 17 high-priority SOAP services and MQ interfaces as standalone components before they were consumed by source applications.
  • Automated all test scenarios within the test cases, including the pre- and post-procedures for each scenario.
  • Parameterized environment-specific dependencies so the automated test cases could run in different environments.
  • Created various filters and assertions to validate actual responses against expected responses.
  • Generated reports for each test case.
  • Involved in developing a Java application to run all the automated test cases.
  • Coordinated with the off-shore team.
  • Trained developers on DevTest so they could develop their own test cases.
  • Validated responses against schemas/XSDs to ensure structural and data conformity; tested web services using the SoapUI tool, validating WSDLs and request/response XML.
  • Successfully virtualized over 20 SOAP services.
  • Developed match scripts for virtual services that return the correct response or a fault response based on the incoming request.
  • Made effective use of match styles where necessary.
  • Used Image Validation execution mode to configure the Response Selector step to invoke the live service when a request did not match the configured request/response pair data.
  • Provided support for consumers of virtual services.
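DevTest match scripts are written inside the tool itself, but the core idea behind the match logic above can be sketched in plain Java. Everything here is illustrative: the class name, operations, and recorded responses are invented, and this is not DevTest's actual API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of virtual-service request matching: a known
// request key returns its recorded response, anything else falls
// back to a fault payload. Not DevTest's actual scripting API.
public class VirtualServiceStub {

    private static final Map<String, String> RECORDED = new HashMap<>();
    static {
        // Invented request/response pairs standing in for recorded traffic.
        RECORDED.put("GetAccount:12345", "<accountStatus>ACTIVE</accountStatus>");
        RECORDED.put("GetAccount:67890", "<accountStatus>CLOSED</accountStatus>");
    }

    // Returns the recorded response for a matched request key,
    // or a fault payload when no match exists.
    public static String respond(String operation, String accountId) {
        String key = operation + ":" + accountId;
        String response = RECORDED.get(key);
        return (response != null)
                ? response
                : "<fault>Unknown request: " + key + "</fault>";
    }

    public static void main(String[] args) {
        System.out.println(respond("GetAccount", "12345"));
        System.out.println(respond("GetAccount", "00000"));
    }
}
```

In the real tool the fallback branch is where a Response Selector step can route unmatched requests to the live service instead of a fault.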

Environment: DevTest 9.1, SoapUI, ALM, TortoiseSVN, SQL, JavaScript, Java, Jira.

Confidential, Chicago, IL

SOA Tester and Virtualization Consultant

Responsibilities:
  • Identified virtualization needs across the enterprise, and developed and implemented virtual services.
  • Successfully virtualized over 10 services developed in the ESB layer using SOAP/HTTP(S), REST, and MQ protocols, where needed and applicable.
  • Identified and implemented LISA virtualization for the following use cases, where applicable:
  • Non-availability of test data
  • Non-availability of systems / eliminating system dependencies
  • Simulating error conditions/scenarios
  • Emulating load on the systems using the virtual service layer
  • Used Image Validation execution mode to configure the Response Selector step to invoke the live service when a request did not match the configured request/response pair data.
  • Provided support for consumers of virtual services.
  • Tested over 60 services developed in the ESB layer using SOAP/HTTP(S), REST, and MQ protocols as standalone components before they were consumed by source applications.
  • Service testing validated the following:
  • The service response adheres to the defined data mapping/XSLT.
  • Boundary conditions, field lengths, type restrictions, and optional/required field behavior are as expected.
  • The service handles errors gracefully.
  • Tested web services/XML/SOAP and RESTful services using the SoapUI tool.
  • Leveraged various filters and assertions to validate actual responses against expected responses.
  • Used Altova XMLSpy to read schema definitions and understand required/optional field definitions, boundary conditions, enumerations, patterns, and message structure.
  • Leveraged DevTest staging documents to emulate load on services and monitor average response times.
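The filter-and-assertion pattern used throughout this work can be sketched outside any tool with the JDK's built-in XPath support. The sample payload, element names, and expected value below are invented purely for illustration:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class XPathAssertion {

    // Evaluates an XPath expression against an XML payload and returns
    // the selected text, mimicking a tool's XPath Match assertion.
    public static String select(String xml, String xpath) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        return XPathFactory.newInstance().newXPath().evaluate(xpath, doc);
    }

    public static void main(String[] args) throws Exception {
        // Invented sample response; a real test would use the service's payload.
        String response = "<order><status>SHIPPED</status></order>";
        String actual = select(response, "/order/status");
        // Assertion: the actual value must equal the expected value.
        if (!"SHIPPED".equals(actual)) {
            throw new AssertionError("Expected SHIPPED but got " + actual);
        }
        System.out.println("XPath assertion passed: " + actual);
    }
}
```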

Environment: DevTest 8.5, SoapUI, Altova XMLSpy, IBM WebSphere, ESB, DB2, Oracle, SQL Server, Toad, Jira, Stash/Bitbucket, TeamCity

Confidential, Northbrook, IL

SOA Tester

Responsibilities:
  • Extensively involved in performing standalone service testing using SoapUI, schemas, and SOAP/WSDL.
  • Used filters and assertions (XPath Match) to compare the result of an XPath expression to an expected value.
  • Hands-on experience using Groovy scripting to develop custom utilities and to drive data-driven testing in the SoapUI free version.
  • Built and deployed virtualized services from WSDL (Web Services Description Language, which combines SOAP and XML Schema) and XSD (XML Schema Definition) artifacts.
  • Studied functional requirement specifications and application detail designs to understand each change and its impact on LISA (Learn, Invoke, Simulate, Analyze), keeping the virtualized services ready.
  • Extracted required data from the test database through TOAD for pre-testing activities.
  • Virtualized the Sprint Nextel application in LISA: set up working sessions with front-end subject matter experts to understand the functionality, the business scenarios, and the transaction calls made to different applications over different protocols, in order to stub them in LISA.
  • Created different payload transactions in LISA, consisting of SOAP XML, text XML, fixed-length messages, and web services with and without attributes.
  • Supported front-end applications at Sprint in getting request/response pairs stubbed in LISA for unit and system testing.
  • Wrote Java classes to parse specific request XML that was not understood by the LISA tool.
  • Responsible for managing the scope, planning, tracking, and change-control aspects of the project.
  • Translated customer requirements into formal requirements and design documents, established specific solutions, and led the programming and testing efforts that culminated in client acceptance of the results.
  • Established quality procedures for the team and continuously monitored and audited them to ensure the team met its quality goals.
  • Set up a team of two offshore resources to support LISA virtualization for the Sprint development life cycle.
  • Exposure to the SoapUI tool.
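Fixed-length message payloads like those stubbed above are parsed by slicing a record at known offsets. A minimal sketch, with an invented record layout (the field names and widths are assumptions, not any actual Sprint format):

```java
// Illustrative sketch of parsing a fixed-length message of the kind
// stubbed in LISA; the record layout here is invented.
public class FixedLengthParser {

    // Extracts a field from a fixed-width record given its offset and
    // length, trimming the space padding typical of such formats.
    public static String field(String record, int offset, int length) {
        return record.substring(offset, offset + length).trim();
    }

    public static void main(String[] args) {
        // Hypothetical layout: account id (10 chars), status (8), balance (8).
        String record = "0000012345ACTIVE  00150.75";
        System.out.println(field(record, 0, 10));  // account id
        System.out.println(field(record, 10, 8));  // status
        System.out.println(field(record, 18, 8));  // balance
    }
}
```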

Environment: SoapUI 4.5, iTKO LISA 7.5, UFT 11.5, Oracle 11i, Toad, SQL, DB2, SQL Server.

Confidential, Columbus, OH

QA Tester

Responsibilities:
  • Worked in an Agile methodology using Scrum.
  • Wrote test cases and performed manual testing, including positive and negative testing.
  • Reviewed manual testing methods, and developed and executed automated scripts.
  • Actively participated in creating Requirements Traceability Matrices.
  • Attended test case review meetings and walkthroughs.
  • Created, developed, and executed test cases and test scripts both manually and using automation tools.
  • Worked on AutoSys job scheduling and execution.
  • Wrote test cases and test scripts based on use cases.
  • Wrote test cases and test scripts for functionality testing.
  • Executed test cases manually using Quality Center.
  • Participated in Test Plan, Test Strategy, Test Case, and Test Script walkthroughs.
  • Identified and wrote test scripts to perform regression testing as needed.
  • Performed backend testing using complex SQL queries.
  • Performed end-to-end testing manually and regression testing using QTP.
  • Documented test cases corresponding to business rules and other operating conditions.
  • Analyzed user requirements, attended Change Request meetings to document changes, and implemented procedures to test the changes.
  • Extracted required data from the test database using TOAD or SQL Developer for pre-testing activities.
  • Performed backend verification of data using SQL queries.
  • Executed progression and regression test cases and tracked defects.
  • Involved in defect status reporting and preparing bug summary reports.
  • Conducted data-driven testing (DDT) by passing parameters from an Excel file.
  • Established the test strategy and planning for Integration, System, and User Acceptance Testing (UAT).
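The data-driven testing described above boils down to a loop over rows of input/expected pairs. A toy sketch, using in-memory CSV text as a stand-in for the Excel sheet and an invented function under test:

```java
// Sketch of data-driven testing (DDT): each row supplies an input and
// its expected result. The CSV text stands in for an Excel sheet, and
// the function being checked is a toy example, not a real system.
public class DataDrivenTest {

    // Toy system under test: normalize a username to uppercase.
    static String normalize(String user) {
        return user.trim().toUpperCase();
    }

    public static void main(String[] args) {
        // Columns: raw input, expected output.
        String rows = "alice,ALICE\n bob ,BOB\ncarol,CAROL";
        int passed = 0;
        for (String row : rows.split("\n")) {
            String[] cols = row.split(",");
            String actual = normalize(cols[0]);
            if (!actual.equals(cols[1])) {
                throw new AssertionError(cols[0] + " -> " + actual);
            }
            passed++;
        }
        System.out.println(passed + " data-driven checks passed");
    }
}
```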

Environment: Manual testing, QTP 9.0, Quality Center 8.2, Agile-Scrum, MS Office, Java, J2EE, Oracle 9i, SQL, PL/SQL, TOAD, and Windows 2000.

Confidential

Quality Assurance Analyst

Responsibilities:
  • Performed end-to-end testing on the entire system.
  • Documented every stage of testing for historical purposes.
  • Reviewed functional and design specifications to ensure a full understanding of individual deliverables.
  • Performed backend database testing in a Microsoft SQL Server environment, including validating stored procedures, jobs, and triggers.
  • Identified test requirements from specifications, mapped test case requirements, and designed the test coverage plan.
  • Developed, documented, and maintained functional test cases and other test artifacts such as test data, data validations, harness scripts, and automated scripts.
  • Executed and evaluated manual and automated test cases and reported test results.
  • Held and facilitated test plan/case reviews with cross-functional team members.
  • Identified potential quality issues per the defined process and escalated them immediately to management.
  • Ensured that validated deliverables met functional and design specifications and requirements.
  • Isolated, replicated, and reported defects, and verified defect fixes.

Environment: VB6.0, MS Excel, Oracle 8i, SQL and Windows NT.
