Test Engineer/Team Lead Resume

Charlotte, NC

EXPERIENCE SUMMARY:

  • Over 10 years of extensive experience in software QA, testing different types of applications using procedural and object-oriented techniques. Well versed in testing Internet/intranet-based web applications using Java, Selenium, Cucumber, Ruby, Eclipse, Jenkins, Bamboo, J2EE, ASP.NET, VB.NET, C#, JavaScript, VBScript, Python scripting, Maven, DHTML, HTML, CSS, MS SQL Server 200X, Oracle .../11g, HP Quality Center 11.0, QTP 12.0, LoadRunner 12.2, Apache JMeter 3.2, BlazeMeter 1.2.1 cloud-based testing, RESTful web services, SOAP, and MS Access as databases on Windows XP/7 and Red Hat Linux.
  • Expertise in automation testing using Java, Selenium, Cucumber, Ruby, Eclipse, Maven, Jenkins, Bamboo, Python, and Quick Test Professional 12.0 to automate web application modules.
  • Expertise in performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values and generated load on the servers to measure the performance of the web application under multiple concurrent users against those thresholds. Generated and provided the test results and reports in PDF.
  • Experience in writing SQL scripts, executing them against the database, and verifying the test data using Oracle 11g, MS SQL Server 2008, jQuery, and MS Access. Good knowledge of stored procedures, triggers, cursors, and indexes.
  • Experience in ETL performance testing: created files with voluminous data to support stress and space estimates for ETL (data warehouse) loads by extracting data from a database and transferring and loading it into another type of database, using SQL queries or database tools that support multiple databases and perform ETL functions.
  • Hands-on experience in different types of testing, such as System Testing, Functionality Testing, Regression Testing, Black Box Testing, White Box Testing, Database Testing, and Reports Testing using both PANORAMA and SSRS.
  • Experience in networking protocols such as TCP/IP, SOAP, True Client, SMTP, DHCP, and LDAP, and with network routers, SCSI, iSCSI, and gateways, in the same and different network domains.
  • Experience in implementing Active Directory for data replication and de-duplication purposes. Experience in several types of backup and recovery for terabytes of data using different tools for SAN disks, RAID configuration, data replication, and data mirroring.
  • Excellent communication and interpersonal skills; motivated self-starter with exceptional team-building and leadership skills. Good team player with the ability to work in time-sensitive environments.
  • Experience with ALM tool for test preparation, execution and defect management
  • Knowledge of Waterfall, Lean, and Agile methodologies, experience conducting SCRUM and PMAP meetings, and strong knowledge of the Software Development Life Cycle (SDLC).
  • Hands-on experience in the software testing SDLC process, including writing Test Plans, Test Scenarios, and Test Cases, performing and maintaining RTMs using Quality Center and BOXI, and maintaining GAP analysis.
  • Executed Test Cases, generated Test Results and Reports, and recorded defects using defect-tracking tools such as Jira and Quality Center. Expertise in QA methodology and QA validations to ensure the best quality assurance for the applications.

TECHNICAL SKILLS:

Testing Tools: QTP/UFT, LoadRunner, JMeter, BlazeMeter, SoapUI, RESTful web services, qTest, DataX, Rally, Weasless, APIs.

Languages: Java, IBM Curam v6, C, C++, C#, ASP.NET, VB.NET, Cucumber, Selenium, Maven, Protractor, AngularJS, XML, HTML

Database: Oracle 11g/12c, MS SQL Server, DB2, PostgreSQL.

Scripting: JavaScript, VBScript, Unix Shell, Python, PL/SQL, and Ruby

Management Tools: Jira, Git, Jenkins, Bamboo, VSTS, HP ALM, Remedy.

Methodologies: Waterfall, V-Model, Iterative, Agile, Scrum.

Operating System: Windows 7/XP/NT, Linux, Unix.

Servers: IBM WebSphere, JBoss, Apache, Tomcat, IIS.

Networking: LAN, WAN, Router, Modem, Switch

Reporting Tools: Crystal Reports, SSRS

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Test Engineer/Team Lead

Responsibilities:

  • Analyzed business requirements and functional documents; created the test strategy, test scenario, and test case documents. Maintained the Requirement Traceability Matrix (RTM) between requirements, test cases, test execution, and Jira. Maintained all QA artifacts in qTest.
  • Reviewed the test cases, test scenarios, and test results documents for team members and provided the appropriate comments.
  • Attended the daily Scrum meeting and the Iteration Planning and Release Planning meetings.
  • Assigned testing tasks to team members.
  • Reviewed the test scripts, provided comments, and updated status to the managers.
  • Created user stories for the testing tasks in Jira and allocated a schedule for each story based on the complexity of the functions.
  • Created the automation test scripts using Selenium and Python.
  • Developed test automation framework scripts using Python Selenium WebDriver.
  • Implemented a Page Object Model framework with Python and Selenium (see the Page Object Model sketch after this list).
  • Created an E2E test suite using Python Selenium WebDriver and executed it on a daily basis.
  • Generated the reports and sent them to the respective team members.
  • Tested compatibility of the application for dynamic and static content across browsers such as Chrome, IE, Edge, Firefox, and Safari.
  • Set up Selenium Grid to run automation scripts on different browsers (see the Selenium Grid sketch after this list).
  • Created sub-stories for each user story based on the functionalities, and updated each sub-story and main story along with the schedule for when the task had to be completed.
  • Extensively worked on regression testing per release and backlog status.
  • Extensively involved in test approach, test case, and test result walkthroughs and reviews with peers and test leads. The workgroup consisted of business system analysts, developers, architects, SEM, and managers.
  • Extensively involved in multitasking activities: current sprint testing, backlog testing, regression testing, automation testing, and REST API/SOAP UI/Postman testing, handling all of these activities at the same time and delivering the results on time.
  • Highly enthusiastic, self-starting, and self-driven in all day-to-day activities, working on multiple tasks effectively and delivering all test results on schedule.
  • Followed the SDLC process with a focus on test planning, test execution, defect management, and QA processes and procedures in an Agile, test-driven environment; superior written and verbal communication skills with an aim toward continual improvement.
  • Extensively worked on mobile device, REST API/SOAP UI/Postman, and automation testing.
  • Highly determined in creating data conditioning for all types of test scenarios using DataX, BOSS, and Weaseless.
  • Worked with AWS, which provides elastic data-center capacity on demand for packaged application development and test frameworks. Set up development and test environments, including a free-tier environment on the AWS cloud, directly from IDEs, taking full control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Identified the elements using locators and developed web elements, then reused these web elements across all pages based on element visibility.
  • Developed the automation test scripts using reusable objects, elements and locators.
  • Developed automation test scripts using Autogen.
  • Expertise in developing automation testing frameworks using Selenium WebDriver, Java, manual tools, and Autogen.
  • Developed automation test scripts using Selenium webdriver, cucumber BDD/TDD, TestNG.
  • Designed and updated the automation framework using Java and Cucumber BDD/TDD.
  • Used XPath and DOM to work with dynamic and static objects in Selenium.
  • Developed test results reports using TestNG.
  • Validated the test results document against user acceptance testing (UAT).
  • Developed synchronization using implicit and explicit wait conditions (see the wait-condition sketch after this list).
  • Incorporated the test data into test scripts using Cucumber BDD & TDD.
  • Created a hybrid automation framework to integrate Cucumber, Protractor, and databases, utilizing multiple data-driven approaches so that the hybrid framework can support multiple frameworks.
  • Developed complex queries to retrieve data from multiple databases to validate the information in the UI, databases, and reports.
  • Maintained the source code in SVN.
  • Maintained continuous test integration and automatic build by using Jenkins and mailed the build outputs to the team members.
  • Followed Agile testing methodology, participated in daily status meetings and testing each deliverable.
  • Tested applications in different types of browsers (cross-browser testing) and versions, such as Internet Explorer, Firefox, and Google Chrome, to simulate different test environments.
  • Participated in weekly QA meetings and various other meetings to discuss enhancement and modification requests, issues, and defects in the application.
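
A minimal sketch of the Page Object Model pattern referenced above. The project used Python with Selenium; the same structure is shown here in Java for consistency with the other code examples, and the page class, locators, and element IDs are illustrative only.

    // LoginPage.java - hypothetical page object: locators and actions for one page live in one class
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;

    public class LoginPage {
        private final WebDriver driver;
        private final By username = By.id("username");   // assumed locator, for illustration
        private final By password = By.id("password");
        private final By loginBtn = By.id("login");

        public LoginPage(WebDriver driver) {
            this.driver = driver;
        }

        public void login(String user, String pass) {
            driver.findElement(username).sendKeys(user);
            driver.findElement(password).sendKeys(pass);
            driver.findElement(loginBtn).click();
        }
    }

Tests then call page methods (for example, new LoginPage(driver).login("user", "pass")) instead of repeating raw locators, so a UI change is fixed in one place.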
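
A short sketch of pointing a test at a Selenium Grid hub so the same script can run on different browsers, as described above; the hub URL and target site are placeholders.

    // Connect to a Selenium Grid hub and request a specific browser
    import java.net.URL;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeOptions;
    import org.openqa.selenium.remote.RemoteWebDriver;

    public class GridExample {
        public static void main(String[] args) throws Exception {
            ChromeOptions options = new ChromeOptions();           // swap in FirefoxOptions, EdgeOptions, etc.
            WebDriver driver = new RemoteWebDriver(
                    new URL("http://grid-hub:4444/wd/hub"),        // placeholder Grid hub address
                    options);
            driver.get("https://example.com");                     // placeholder application URL
            System.out.println(driver.getTitle());
            driver.quit();
        }
    }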
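
A brief sketch of the implicit and explicit wait synchronization mentioned above, using the Selenium 3-era WebDriverWait and ExpectedConditions API; the locator and timeout values are illustrative.

    import java.util.concurrent.TimeUnit;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;

    public class WaitExamples {
        public static WebElement waitForStatus(WebDriver driver) {
            // Implicit wait: applied to every findElement lookup on this driver
            driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS);

            // Explicit wait: poll until the condition is true, or fail after 20 seconds
            WebDriverWait wait = new WebDriverWait(driver, 20);
            return wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("status")));
        }
    }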

Environment: HP ALM Quality Center 12.0, RESTful web services, Postman, SOAP UI, JMeter, BlazeMeter 1.2.1, SQL Server 12.0, Java, Selenium, Cucumber, SVN, Autogen, XML, HTML, SharePoint, 508 Compliance, Jira, qTest, Splunk, Weasless, DataX, Rally, BOSS, etc.

Confidential, MD

Test Engineer/Team Lead

Responsibilities:

  • Analyzed business requirements and functional documents and created the test strategy document defining the test environment, the phases of testing, entrance and exit criteria for the different phases of testing, and the resources required to conduct the effort.
  • Responsible for working within a team to create, document and execute testing strategy, test plans, test cases and test scripts in support of a set of global tools, systems, and databases.
  • Performed agile testing, reviewed the stories, and participated in the Daily Scrum, Iteration Planning, and Release Planning meetings.
  • Performed builds and managed Maven dependencies using the Maven build tool.
  • Extensively worked on regression testing and parallel execution using Selenium WebDriver with TestNG.
  • Developed test scripts for web elements by identifying the locators using the Firebug and FirePath Firefox plugins.
  • Developed automation test scripts using Java, Selenium, Cucumber, Protractor
  • Involved in Web Application GUI automation by creating regression suites using Selenium Web Driver, JUnit/TestNG, Eclipse.
  • Developed automated test scripts using TestNG annotations such as groups, @Parameters, and @DataProvider (see the data-provider sketch after this list).
  • Worked with Selenium WebDriver in C#.NET/MSTest with the Visual Studio IDE. Developed test automation in BDD/TDD with FitNesse/C#/Selenium WebDriver.
  • Expertise in developing Automation frameworks with Selenium webdriver using JAVA and C#.
  • Designed automation framework using C# .NET and JAVA automation scripts.
  • Strong Experience in developing projects/application using C#, ASP.NET, ADO.NET, VB.NET, C++, AJAX, XML and Web Services.
  • Involved in test approach & test case walkthrough, review with peers and test leads. The workgroup consists of business system analysts, developers and architects.
  • Used XPath and DOM to work with dynamic and static objects in Selenium.
  • User interface validation was done through JavaScript; validated the interface against the database.
  • Created test approach and test cases for logging module.
  • Reviewed and analysed the existing test scripts.
  • Developed HTML TestNG reports for analyzing test output using the Extent Reports API.
  • Maintained user documentation with TestNG output screenshots for User acceptance testing (UAT)
  • Developed synchronization using implicit and explicit wait conditions.
  • Incorporated the test data into test scripts from Excel using the Apache POI API (see the Apache POI sketch after this list).
  • Created automation scripts using Selenium Hybrid and Data-driven development framework.
  • Developed back-end testing using complex queries to retrieve user information and cross-validate it between the UI and databases.
  • Maintained the Java and selenium test source code and resources in the SVN source control repository tool.
  • Analyzed test results, tracked the defects, and generated reports using JIRA.
  • Prepared the data to cover various scenarios and wrote SQL scripts to verify the database updates, inserts and deletion of the records.
  • Maintained continuous test integration and automatic build by using Jenkins and mailed the build outputs to the team members.
  • Followed Agile testing methodology, participated in daily status meetings and testing each deliverable.
  • Tested applications in different types of browsers (cross-browser testing) and versions, such as Internet Explorer, Firefox, and Google Chrome, to simulate the production environment.
  • Developed features and test scripts using BDD (behavior-driven development) in Cucumber (see the step-definition sketch after this list).
  • Developed Cucumber HTML and JSON test reports for analyzing the test outputs.
  • Participated in weekly QA meetings and various other meetings to discuss enhancement and modification requests, issues, and defects in the application.
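
A brief sketch of driving a TestNG test from a @DataProvider with groups, as mentioned above; the test data, group name, and login check are illustrative only.

    import org.testng.Assert;
    import org.testng.annotations.DataProvider;
    import org.testng.annotations.Test;

    public class LoginDataDrivenTest {

        // Supplies multiple rows of test data to the test method
        @DataProvider(name = "credentials")
        public Object[][] credentials() {
            return new Object[][] {
                {"user1", "pass1", true},
                {"user1", "wrong", false},
            };
        }

        @Test(dataProvider = "credentials", groups = "regression")
        public void loginTest(String user, String pass, boolean expected) {
            // Placeholder for the real Selenium login flow
            boolean actual = user.equals("user1") && pass.equals("pass1");
            Assert.assertEquals(actual, expected);
        }
    }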
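
A minimal sketch of reading test data from an Excel workbook with the Apache POI API, as referenced above; the file path, sheet name, and column layout are assumptions.

    import java.io.FileInputStream;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.xssf.usermodel.XSSFWorkbook;

    public class ExcelDataReader {
        public static void main(String[] args) throws Exception {
            try (FileInputStream fis = new FileInputStream("testdata.xlsx");   // hypothetical data file
                 Workbook workbook = new XSSFWorkbook(fis)) {
                Sheet sheet = workbook.getSheet("LoginData");                  // hypothetical sheet name
                for (Row row : sheet) {
                    String user = row.getCell(0).getStringCellValue();
                    String pass = row.getCell(1).getStringCellValue();
                    System.out.println(user + " / " + pass);                   // feed these values into the test
                }
            }
        }
    }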
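
A short sketch of a Cucumber BDD scenario and its Java step definitions, as described above; the feature wording, step text, and page-object calls are illustrative, and the io.cucumber packages assume a recent cucumber-jvm version.

    // login.feature (Gherkin), shown here as a comment:
    //   Feature: Login
    //     Scenario: Valid user can log in
    //       Given the user is on the login page
    //       When the user logs in as "user1"
    //       Then the dashboard is displayed

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.When;
    import io.cucumber.java.en.Then;
    import static org.junit.Assert.assertTrue;

    public class LoginSteps {
        @Given("the user is on the login page")
        public void onLoginPage() { /* driver.get(loginUrl); */ }

        @When("the user logs in as {string}")
        public void logsInAs(String user) { /* loginPage.login(user, password); */ }

        @Then("the dashboard is displayed")
        public void dashboardDisplayed() { assertTrue(true /* replace with a real dashboard check */); }
    }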

Environment: HP ALM Quality Center 12.0, Java, Selenium, Cucumber, Protractor, AngularJS, Ruby, Maven, Eclipse, C#, .NET, Jenkins, Bamboo, Git, LoadRunner 12.2, Apache JMeter 3.2, RESTful web services, JMeter, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, PostgreSQL, XML, HTML, SharePoint, 508 Compliance, AWS, Cloud, Jira, Xray, etc.

Confidential, DC

Team Lead

Responsibilities:

  • Analyzed the design and requirements and developed the test scenarios, test cases, and automated test scripts for the ULS, Auction, GIS, and iHHS projects.
  • Created and executed automated test scripts using Java, Selenium, Cucumber, Eclipse, and Maven.
  • Created test data for automated scripts with respect to the Development, System Test, and User Acceptance Test environments, executed these automated scripts in those environments, and provided the results.
  • Created and executed web services test scripts with RESTful web service clients within Curam and generated the test results report.
  • Created and executed performance test scripts using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values and generated load on the servers to measure the performance of the web application under multiple users against those thresholds. Generated and provided the test results and reports in PDF for the ULS, ULS: HAC, ULS: PND, and ASR projects. Created test data for the performance test scripts with respect to the different environments (Development, System Test, and User Acceptance Test), executed these scripts, and provided the test results and reports.
  • Extensively involved in ETL testing using Informatica and SSIS.
  • Transferred all the data from the source to the staging area, integrated all the tables in the staging area, and transferred the data to the destination database, from which the end client extracts the data.
  • Verified all the tables, procedures, functions, views, total number of rows, and data from the source to the destination database.
  • Executed all the relevant SQL queries based on the requirements and compared the result-set data between source and destination (see the row-count comparison sketch after this list). Extensively worked with the SSIS tool for data verification.
  • Extensively worked with the SSRS tool to generate reports and compare the data in the reports against the target database.
  • Compared the web services test scripts with other web services tools such as SOAP and Load, and provided the best result.
  • Hands-on experience with core AWS features, which provide elastic data-center capacity on demand for packaged application development and test frameworks. Set up development and test environments, including a free-tier environment on the AWS cloud, directly from IDEs, taking full control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Worked on an existing Selenium automation framework with C# using data-driven testing and NUnit.
  • Designed structured user interface (UI) automated test cases using Selenium WebDriver in C# for software products used to perform software testing.
  • User interface validations were handled using JavaScript; all front-end validations were verified against the database through JavaScript, Java, Selenium, SQL Server, and PostgreSQL.
  • Involved in writing test scripts using C# and executed it through Selenium WebDriver for Functional and Regression testing.
  • Developed the automation test scripts using Protractor, Gherkin, Cucumber, Java, and Selenium.
  • Used both Scenario and Scenario Outline in the Gherkin language to build the automation test scripts and the end-to-end test suite.
  • Created the end-to-end test suite and deployed it in Jenkins for CI/CD purposes.
  • Working knowledge of creating business-friendly acceptance tests using SpecFlow (C#).
  • Performed load testing using BlazeMeter for performance testing purposes.
  • Extensively involved in 508 Testing for ASR Applet removal, and TCNS projects.
  • Experience with ALM tool for test preparation, execution and defect management
  • Documented the defects identified in the different testing environments: System, Development, and AT.
  • Extensively involved in CMMI Level 3 process for Computech Inc, and provided all the artifacts as per the Testing perspective.
  • Currently involved in CMMI Level 3 process for NCI, and in the preparation of artifacts as per the Testing Perspective.
  • Extensively involved in System Test and Regression test in AT for Shut Down Processes.
  • Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.
  • Collaborated with the BAs for performing the testing activities like analyzing the requirements, functional specifications, Test Scripts, and Test Results.
  • Involved in reproducing the defects found in AT, retesting those defects in the other environments (System Test and Development), and providing a workaround if necessary.
  • Attended Agile Training provided by Computech and also participated in In-house training.
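
A minimal sketch of the source-to-destination ETL validation described above: run the same aggregate query against both databases over JDBC and compare the results. The JDBC URLs, credentials, and table name are placeholders, and the appropriate drivers must be on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class EtlRowCountCheck {
        // Run SELECT COUNT(*) against one database and return the value
        private static long rowCount(String jdbcUrl, String user, String pass, String table) throws Exception {
            try (Connection con = DriverManager.getConnection(jdbcUrl, user, pass);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
                rs.next();
                return rs.getLong(1);
            }
        }

        public static void main(String[] args) throws Exception {
            long source = rowCount("jdbc:sqlserver://source-db", "qa", "secret", "CUSTOMER");      // placeholders
            long target = rowCount("jdbc:postgresql://target-db/dw", "qa", "secret", "CUSTOMER");  // placeholders
            System.out.println(source == target ? "PASS: row counts match"
                                                : "FAIL: " + source + " vs " + target);
        }
    }

The same pattern extends to column-level checks by comparing the result sets of matching SELECT statements rather than just counts.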

Environment: Java, Selenium, Cucumber, Ruby, Maven, Eclipse, C#, .NET, Jenkins, IBM Curam v6, Bamboo, Quick Test Professional 11.0, LoadRunner 12.2, Apache JMeter 3.2, RESTful web services, JMeter, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, PostgreSQL, Toad, Big Data, Informatica, SSIS, SSRS, Quality Center 11.0, XML, HTML, VMWare, ILOG, JRules, Valtool, 508 Compliance, ALM, Serena Change Version Manager, AWS, Cloud, etc.

Confidential, VA

QA Test Engineer

Responsibilities:

  • Analyzed requirements, developed the test scenarios, and uploaded them to the wiki.
  • Developed the Positive, Negative, Boundary, Verification and Validation Test Cases in Quality Center.
  • Created the test data for the test cases in all test environments, e.g., System Test and User Acceptance Test.
  • Expertise in creating Automated Test Script using Ruby, Selenium, Cucumber, Java, Eclipse, Maven, and QTP tools.
  • Created test data for automated scripts with respect to the Development, System Test, and User Acceptance Test environments, executed these automated scripts in those environments, and provided the results.
  • Expertise in creating load test scripts for performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values and generated load on the servers to measure the performance of the web application under multiple users against those thresholds. Generated and provided the test results and reports in PDF for the Portal 1.0 and Portal 1.1 projects.
  • Integrated the required test scripts into Quality Center and generated the test results reports using Quality Center.
  • Extensively worked on SQL: wrote and executed SQL queries per the requirements, exported the data from SQL into spreadsheets, and performed data validation using Excel Compare.
  • Validated the data between SQL and the portal website to make sure the data validation is accurate across the database, web portal, and third-party test tools.
  • Extensively worked on Linux: generated and executed Bash scripts for input and output files for data validation.
  • Involved in reproducing the defects found in UAT, retesting those defects, and providing solutions or workarounds.
  • Extensively involved in generating the Automation Script using Quick Test Professional.
  • Executed Automated scripts in different Environments
  • Extensive experience with ALM tool for test preparation, execution and defect management
  • Generated the automated test results for each test case using Quality Center in a Word document and uploaded them to the wiki.
  • Extensively involved in testing AWS cloud features: set up development and test environments, including a free-tier environment on the AWS cloud, directly from IDEs, taking full control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Extensively involved in generating scripts using RESTful web services for web service testing (see the REST validation sketch after this list).
  • Executed the web service test scripts using RESTful web services and SOAP, and compared the data with the web service logs, database, and portals.
  • Generated the web service test results using HP Web Service and maintained them in the wiki.
  • Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.
  • Knowledge of Cucumber, Ruby for Generating the Automated scripts.
  • Extensively involved in the deployment of O&M releases and the Portals projects, and verified the deployment test results/status whenever necessary.
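
A small sketch of the RESTful web service validation described above, using the standard Java 11+ HTTP client; the endpoint URL, header, and expected field are placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RestServiceCheck {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://portal.example.com/api/accounts/123"))   // placeholder endpoint
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            // Check the pieces the test cares about: status code and a field expected in the body
            System.out.println("Status: " + response.statusCode());
            System.out.println("Has accountId field: " + response.body().contains("\"accountId\""));
        }
    }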

Environment: Java, Selenium, Cucumber, Eclipse, Maven, Jenkins, Bamboo, Quick Test Professional 11.0, LoadRunner 12.2, RESTful web services, Apache JMeter 3.2, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, Toad, Big Data, Quality Center 11.0, XML, HTML, VMWare, ILOG, JRules, Valtool, 508 Compliance, ALM, Serena Change Version Manager, AWS, Cloud, etc.

Confidential, DC

QA Test Engineer

Responsibilities:

  • Thoroughly analyzed the requirements and design defined for BCVS and O&M in the JRule template, developed the test scenarios, and uploaded them to Serena Change Manager Version Control and Rule Studio for the following projects: ACT Archive 6.2.1, ADT 6.2/6.3, EOSL Release 2012, Web Recalculation, End-to-End Test Cases, UPT Functional Test Cases, and 508 Compliance.
  • Developed the Positive, Negative, Regression Test Cases after analyzing the Requirements and Design specifications in Quality Center.
  • Created the Test Environment for testing from within Rule Studio.
  • Documented the Test Results in Rule studio.
  • Expertise in creating automated test scripts with the QTP/UFT tool. Performed interface validation using JavaScript, VBScript, and HP ALM, and validated the interface against the backend database.
  • Created test data for automated scripts with respect to the Development, System Test, and User Acceptance Test environments, executed these automated scripts in those environments, and provided the results.
  • Expertise in creating load test scripts for performance testing using LoadRunner 12.0 and Apache JMeter 3.2: created virtual users with calculated threshold values and generated load on the servers to measure the performance of the web application under multiple users against those thresholds. Generated and provided the test results and reports in PDF for the ACT Archive 6.2.1, ADT 6.2/6.3, and Web Recalculation projects.
  • Created test data for the load test scripts with respect to the different environments (Development, System Test, and User Acceptance Test), executed these scripts, and provided the test results and reports.
  • Extensively involved in creating web services test scripts using RESTful web services and compared the test results using other tools such as SOAP and Load.
  • Used ILOG for data validation and for monitoring the data in terms of component graphs.
  • Extensively involved in testing AWS cloud features: set up development and test environments, including a free-tier environment on the AWS cloud, directly from IDEs, taking full control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Documented the defects identified during different types of testing environments: system, CDE-I and ITC.
  • Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.
  • Wrote and executed the test scripts for automation testing using Quick Test Professional/Unified Functional Testing, developed the test suite, and ran it in the different test environments (CDE-I and ITC).
  • Wrote and executed the test scripts for performance testing using LoadRunner, developed the test suite, ran it in the different environments (CDE-I and ITC), and generated the test results.
  • Collaborated with the BAs for performing the testing activities like analyzing the requirements, functional design, Test Scripts, and Test Results.
  • Involved in reproducing the defects found in UAT, retesting those defects, and providing solutions or workarounds.
  • Extensively involved in the deployment of O&M releases and BCVS projects, and verified the deployment test results/status whenever necessary.
  • Extensively involved in reproducing and analyzing the defects found in all the testing environments: System, CDE-I, and ITC.
  • Developed an automation suite using QTP/UFT and ran the suite for all the different stages of each new build generated.
  • Developed the performance test suite using LoadRunner and ran the suite in all the different environments (CDE-I and ITC).

Environment: Quick Test Professional 11.0, Unified Functional Testing 14.50, LoadRunner 12.0, Apache JMeter 3.2, RESTful web services, SQL Server 12.0, Oracle 11g, PostgreSQL, Toad, Big Data, MS Access, Quality Center 11.0, XML, HTML, VMWare, ILOG, JRules, Valtool, 508 Compliance, Serena Change Manager Version Control, AWS, Cloud.

Confidential, VA

QA Test Engineer

Responsibilities:

  • Analyzed business/functional requirements, uploaded the requirements to Requisite Pro, and tagged them in Requisite Pro as functional and business requirements.
  • Pushed the requirements from Requisite Pro to Quality Center using ICART - RMSync.
  • Wrote the test cases/test scripts in Quality Center and mapped them to the respective requirements.
  • Created the test data for each test case and executed the test cases in Quality Center.
  • Generated the RTM by running SQL scripts/BOXI.
  • Executed the test cases/test scripts in Quality Center for all the new releases.
  • Created automated test scripts using VBScript and the QTP/UFT tools.
  • Managed the user interface validations using JavaScript and integrated them with the VBScripts. Created the controls using JavaScript and optimized them across all the web pages in the web applications.
  • Created test data for automated scripts with respect to the Development, System Test, and User Acceptance Test environments, executed these automated scripts in those environments, and provided the results.
  • Expertise in creating load test scripts using LoadRunner for performance testing purposes.
  • Created test data for the load test scripts with respect to the different environments (Development, System Test, and User Acceptance Test), executed these scripts, and provided the test results and reports.
  • Generated the Test Plan and Test Result Summary documents.
  • Wrote SQL queries with respect to the business requirements and executed them in SQL Server 2005.
  • Maintained the admin role for Requisite Pro, Quality Center, ICART, RMSync, and ClearQuest.
  • Analyzed the data and the different report formats (bar, pie, etc.) in Panorama.
  • Developed the complete end-to-end test suite using QTP/UFT and ran the suite for all the different stages of each new build generated.
  • Developed graphs in Excel for the test results for all types of testing.
  • Worked extensively with SQL queries.
  • Involved in the review process of test cases.
  • Identified and notified showstopper issues, escalating when required during testing, with proper communication to management and the offshore team to relay details of issues and their impact on the application.

Environment: Quick Test Professional 11.0, Unified Functional Testing 14.50, LoadRunner 11.0, SQL Server 12.0, Oracle 11g, Big Data, Toad, Quality Center 11.0, XML, HTML, VMWare, ILOG, JRules, Valtool, 508 Compliance, Serena Change Version Manager

Confidential, VA

QA Test Engineer

Responsibilities:

  • Analyzed business requirements from DOORS and developed the test plan and test cases, including developing the test cases in HP Quality Center.
  • Developed the test cases for SIT, end-to-end testing, services testing, and back-end testing in Quality Center for the respective business functions.
  • Created the traceability matrix for the SIT, E2E, and service test cases against the respective business requirements.
  • Wrote SQL queries with respect to the business requirements in DOORS and the tables in the database.
  • Analyzed gaps between the requirements and test case development.
  • Maintained the RTM and generated the SDLC report for the test cases for all the business functionalities.
  • Raised defects in ClearQuest and maintained the Requirement Traceability Matrix between ClearQuest and Quality Center.
  • Created the test data for each test case and executed the test cases in Quality Center at the different stages: Preliminary, System, Integration, and Regression.
  • Created the test data for each test case and executed all the SQL queries for all the stages: Preliminary, System, Integration, and Regression.
  • Executed all the service test cases using the SOAP SONAR tool.
  • Generated request XML files, ran them against the respective business functionalities for the web services, generated response XML files, and compared the request and response files.
  • Captured the XML file differences and screenshots during execution.
  • Automated the scripts using VBScript and ran the automated scripts for the respective business functionalities.
  • Developed an automation suite using QTP and ran the suite for all the different stages of each new build generated.
  • Developed the Test cases for Performance Testing with respect to the Product features.
  • Executed the test cases to check the Performance by using Performance Testing methodologies.
  • Developed graphs in Excel to compare the performance of the product.
  • Responsible for GUI Testing, System Testing, Integration Testing, E2E Testing, Regression Testing, Service Testing and Back End Testing.
  • Worked extensively with SQL queries.
  • Generated SDLC report for all the modules for all the different stages.
  • Involved in the review process of test cases and the RTM process.
  • Identified and notified showstopper issues, escalating when required during testing, with proper communication to management and the offshore team to relay details of issues and their impact on the application.

Environment: QTP, SOAP SONAR, Service Testing, Performance Testing, Manual Testing, GUI Testing, E2E Testing, Regression Testing, Black Box testing, Integration Testing, PVCS, Java, J2EE, VB Script, C#, HTML, XML, Oracle 9i and Web Services, HP Quality Center, Clear Quest.

Confidential

QA Test Engineer

Responsibilities:

  • Configured the Quantum Storage devices DXI3500, DXI4500, DXI 5500, and DXI 7500 using LUN 5 mechanism.
  • Configured the DX100 machines using Solstice DiskSuite and shared Quantum storage over Ultra Wide Differential SCSI interfaces; striped volumes were carefully configured to work optimally with Oracle 7.3.3 and 8.
  • Created both RAID 5 and RAID 0+1 striped LUNs for use with the DXI3500, DXI5500, and DXI7500 for terabytes of disk configuration.
  • Configured the SCSI and iSCSI sender and receiver for backup and restore of terabytes of data storage.
  • Generated Test Suite by using Rational Robot and Rational Unified Process.
  • Installed Oracle 7.3.3 and created a basic database. Configured the SCSI options for optimal disk I/O and the UFS and VxFS file systems with appropriate block sizes.
  • Installed, debugged, and configured UNIX-based subsystems: installed and configured Solaris 9/10, TCP/IP network configuration, RAID volumes on storage array disks using Disk Suite, file systems, NAS and NFS services, user accounts, automatic installation via JumpStart, and network terminal servers. Configured departmental intranets with HTTP and FTP services, a basic firewall with FireWall-1, and high-availability servers for NFS.
  • Analyzed business requirements and developed the test plan and test cases.
  • Created test data and ran the test cases for SIT, end-to-end testing, and regression testing.
  • Raised the Defects in Bugzilla
  • Generated SDLC report for all the stages of SDLC Process.

Environment: Quantum DX100, DXI3500, DXI4500, DXI5500 and DXI7500, Rational tools like Rational Robot, Rational Unified Process, SOLARIS 2.x; Linux, Kernel 2.4,2.6, Raid Manager 6, IP Multicasting, Storage Array: Connector SAN Directors and switches, File System (VxFS), NFS, NAS, DNS, Jumpstart, TCP/IP, SSH, Brocade Switch, JNI/Qlogic/Emulex fibre boards and configuration, Fibre Channel Switch, Brocade 3800 FC Switches, SATA based storage arrays, FC Switches; Bugzilla, Atlassian JIRA, Atlassian Confluence
