
Software Test Engineer Resume


MD

SUMMARY:

  • Over 10 years of extensive experience in software QA, testing a wide range of applications using procedural and object-oriented techniques. Well versed in testing Internet/intranet web applications using Java, Selenium, Cucumber, Ruby, Eclipse, Jenkins, Bamboo, J2EE, ASP.NET, VB.NET, C#, JavaScript, VBScript, Python scripting, Maven, DHTML, HTML, CSS, MS SQL Server 200X, Oracle .../11g, HP Quality Center 11.0, QTP 12.0, LoadRunner 12.2, Apache JMeter 3.2, BlazeMeter 1.2.1 cloud-based testing, RESTful web services, and SOAP, with MS Access as a database, on Windows XP/7 and Red Hat Linux.
  • Expertise in automation testing of web application modules using Java, Selenium, Cucumber, Ruby, Eclipse, Maven, Jenkins, Bamboo, Python, and Quick Test Professional 12.0.
  • Expertise in performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values, generated load on the servers, and measured the performance of the web application under multiple concurrent users against those thresholds. Generated and delivered test results and reports in PDF.
  • Experience in writing SQL scripts, executing them against the database, and verifying test data using Oracle 11g, MS SQL Server 2008, and MS Access. Good knowledge of stored procedures, triggers, cursors, and indexes.
  • Experience in ETL performance testing: created files with voluminous data to support stress and space estimates for ETL (data warehouse) loads by extracting data from one database and transforming and loading it into another using SQL queries or database tools that support multiple databases and perform ETL functions.
  • Hands-on experience in different types of testing, including system testing, functionality testing, regression testing, black-box testing, white-box testing, database testing, and report testing using both Panorama and SSRS.
  • Experience with networking protocols and components such as TCP/IP, SOAP, TruClient, SMTP, DHCP, and LDAP, as well as network routers, gateways, SCSI, and iSCSI, within the same and across different network domains.
  • Experience implementing Active Directory for data replication and deduplication. Experience with several types of backup and recovery for terabytes of data using different tools for SAN disks, RAID configurations, data replication, and data mirroring.
  • Excellent communication skills; motivated self-starter with exceptional team-building, leadership, and interpersonal skills. Good team player with the ability to work in time-sensitive environments.
  • Experience with the ALM tool for test preparation, execution, and defect management.
  • Knowledge of Waterfall, Lean, and Agile methodologies, conducting Scrum and PMAP meetings, and strong knowledge of the Software Development Life Cycle (SDLC).
  • Hands-on experience across the SDLC testing process: writing test plans, test scenarios, and test cases; creating and maintaining RTMs using Quality Center and BOXI; maintaining gap analysis; executing test cases and generating test results and reports; and recording defects using defect trackers such as Jira and Quality Center. Expertise in QA methodology and QA validations to ensure the best quality assurance for applications.
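As a sketch of the threshold-based performance evaluation described above, the 90th-percentile response time per transaction can be compared against a threshold. The sample response times and threshold values below are illustrative, not measurements from the projects.

```python
from statistics import quantiles

# Hypothetical per-transaction response times (seconds) and assumed
# threshold values; real numbers came from LoadRunner/JMeter runs.
samples = {
    "login":  [0.8, 1.1, 0.9, 1.4, 1.0, 1.2, 0.7, 1.3, 0.9, 1.1],
    "search": [2.1, 2.4, 2.2, 3.9, 2.0, 2.3, 2.2, 2.6, 2.1, 2.5],
}
thresholds = {"login": 1.5, "search": 3.0}

def evaluate(times, threshold):
    """Return (p90, passed): 90th-percentile response time vs. the threshold."""
    p90 = quantiles(times, n=10)[-1]  # last cut point = 90th percentile
    return round(p90, 2), p90 <= threshold

results = {name: evaluate(times, thresholds[name])
           for name, times in samples.items()}
```

Here `results["login"]` passes its 1.5 s threshold while `results["search"]` fails its 3.0 s one, which is the pass/fail status a load report would surface.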

PROFESSIONAL EXPERIENCE:

Confidential, MD

Software Test Engineer

Roles and Responsibilities:

  • Analyzed business requirements and functional documents and created the test strategy document defining the test environment, the phases of testing, entrance and exit criteria for each phase, and the resources required to conduct the effort.
  • Responsible for working within a team to create, document and execute testing strategy, test plans, test cases and test scripts in support of a set of global tools, systems, and databases.
  • Performed Agile testing, reviewed user stories, and participated in Daily Scrum, Iteration Planning, and Release Planning meetings.
  • Performed the build and maven dependencies using Maven build tool.
  • Extensively worked on regression testing and parallel execution using Selenium WebDriver with TestNG.
  • Developed test scripts for web elements by identifying locators using the Firebug and FirePath Firefox plugins.
  • Developed automation test scripts using Java, Selenium, Cucumber, and Protractor.
  • Involved in Web Application GUI automation by creating regression suites using Selenium Web Driver, JUnit/TestNG, Eclipse.
  • Developed automated test scripts using TestNG annotations such as groups, @Parameters, and @DataProvider.
  • Involved in test approach and test case walkthroughs and reviews with peers and test leads; the workgroup consisted of business systems analysts, developers, and architects.
  • Used XPath and DOM to work with dynamic and static objects in Selenium.
  • Performed user interface validation through JavaScript and validated the interface against the database.
  • Created the test approach and test cases for the logging module.
  • Reviewed and analyzed the existing test scripts.
  • Developed HTML TestNG reports for analyzing test output using the Extent Reports API.
  • Maintained user documentation with TestNG output screenshots for user acceptance testing (UAT).
  • Developed synchronization using implicit and explicit wait conditions.
  • Incorporated the test data in Test scripts from Excel using Apache POI API.
  • Created automation scripts using Selenium Hybrid and Data-driven development framework.
  • Performed back-end testing using complex queries to retrieve user information and cross-validate it between the UI and the database.
  • Maintained the Java and Selenium test source code and resources in the SVN source control repository.
  • Analyzed test results, tracked defects, and generated reports using JIRA.
  • Prepared the data to cover various scenarios and wrote SQL scripts to verify the database updates, inserts and deletion of the records.
  • Maintained continuous test integration and automated builds using Jenkins and emailed build outputs to team members.
  • Followed Agile testing methodology, participated in daily status meetings and testing each deliverable.
  • Tested applications in different types of browsers (cross-browser testing) and versions, such as Internet Explorer, Firefox, Google Chrome to simulate production environment.
  • Developed features and test scripts using BDD (Behaviour driven development) in Cucumber.
  • Developed Cucumber html, JSON test reports for analyzing the test outputs.
  • Participated in weekly QA meetings and various other meetings to discuss enhancement and modification requests, issues, and defects in the application.
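The synchronization work above used Selenium's implicit and explicit waits against a live browser. The explicit-wait pattern can be sketched without a browser as a generic polling helper; the simulated condition below is illustrative.

```python
import time

def wait_until(condition, timeout=10.0, poll=0.5):
    """Poll `condition` until it returns a truthy value or `timeout`
    expires, mirroring the semantics of Selenium's WebDriverWait.until()."""
    deadline = time.monotonic() + timeout
    while True:
        value = condition()
        if value:
            return value
        if time.monotonic() >= deadline:
            raise TimeoutError(f"condition not met within {timeout:.1f}s")
        time.sleep(poll)

# Simulate an element that becomes "visible" on the third poll.
checks = iter([False, False, True])
result = wait_until(lambda: next(checks), timeout=5.0, poll=0.01)
```

In real Selenium code the condition would be an expected condition such as visibility of an element; the polling-with-deadline structure is the same.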

Environment: HP ALM Quality Center 12.0, Java, Selenium, Cucumber, Protractor, AngularJS, Ruby, Maven, Eclipse, Jenkins, Bamboo, Git, LoadRunner 12.2, Apache JMeter 3.2, RESTful web services, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, PostgreSQL, XML, HTML, SharePoint, 508 Compliance, AWS, Cloud, Jira, Xray, etc.

Confidential, DC

Software Test Engineer

Roles and Responsibilities:

  • Analyzed the design and requirements and developed test scenarios, test cases, and automated test scripts for the ULS, Auction, GIS, and iHHS projects.
  • Created and executed automated test scripts using Java, Selenium, Cucumber, Eclipse, and Maven.
  • Created test data for automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.
  • Created and executed web service test scripts with RESTful web service clients within Curam and generated the test results report.
  • Created and executed performance test scripts using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values, generated load on the servers, and measured the performance of the web application under multiple concurrent users against those thresholds. Generated and delivered test results and reports in PDF for the ULS, ULS: HAC, ULS: PND, and ASR projects, and created and executed performance test data for each environment (Development, System Test, and User Acceptance Test).
  • Compared the web service test scripts with other web service tools, such as SOAP-based clients and load tools, and provided the best result.
  • Hands-on experience with core AWS features, which provide an elastic data center on demand for packaged application development and test frameworks; set up free-tier development and test environments on the AWS cloud directly from the IDE, took control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Worked on an existing Selenium automation framework in C# using data-driven testing and NUnit.
  • Designed structured user interface (UI) automated test cases using Selenium WebDriver in C# for software products used to perform software testing.
  • Handled user interface validations using JavaScript; verified all front-end validations against the database through JavaScript, Java, Selenium, SQL Server, and PostgreSQL.
  • Involved in writing test scripts in C# and executing them through Selenium WebDriver for functional and regression testing.
  • Working knowledge of creating business-friendly acceptance tests using SpecFlow (C#).
  • Performed load testing using BlazeMeter for performance testing purposes.
  • Extensively involved in 508 Testing for ASR Applet removal, and TCNS projects.
  • Experience with ALM tool for test preparation, execution and defect management
  • Documented the defects identified in the System, Development, and AT test environments.
  • Extensively involved in the CMMI Level 3 process for Computech Inc. and provided all artifacts from the testing perspective.
  • Currently involved in the CMMI Level 3 process for NCI and in the preparation of artifacts from the testing perspective.
  • Extensively involved in System Test and Regression test in AT for Shut Down Processes.
  • Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.
  • Collaborated with the BAs for performing the testing activities like analyzing the requirements, functional specifications, Test Scripts, and Test Results.
  • Involved in reproducing defects found in AT, retesting them in the System Test and Development environments, and providing workarounds when necessary.
  • Attended Agile training provided by Computech and participated in in-house training.
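The data-driven testing above was done in C# with NUnit. A minimal Python sketch of the same pattern, with hypothetical login rows and a stand-in for the system under test, shows how each data row drives one check:

```python
import csv
import io

# Hypothetical login test data; the project fed similar rows into
# NUnit-driven Selenium tests (shown here as a Python sketch).
raw = """username,password,expected
alice,secret123,success
bob,,failure
,hunter2,failure
"""

def login_result(username, password):
    # Stand-in for the system under test: both fields must be non-empty.
    return "success" if username and password else "failure"

# Run one check per data row and collect any rows whose actual result
# disagrees with the expected column.
failures = [row for row in csv.DictReader(io.StringIO(raw))
            if login_result(row["username"], row["password"]) != row["expected"]]
```

In the C#/NUnit version the same shape is expressed with `[TestCaseSource]` feeding rows into a single parameterized test.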

Environment: Java, Selenium, Cucumber, Ruby, Maven, Eclipse, Jenkins, IBM Curam v6, Bamboo, Quick Test Professional 11.0, LoadRunner 12.2, Apache JMeter 3.2, RESTful web services, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, PostgreSQL, Toad, Big Data, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, ALM, Serena Change Version Manager, AWS, Cloud, etc.

Confidential, VA

QA Test Engineer

Roles and Responsibilities:

  • Analyzed requirements, developed test scenarios, and uploaded them to the wiki.
  • Developed positive, negative, boundary, verification, and validation test cases in Quality Center.
  • Created test data for test cases in all test environments, e.g., System Test and User Acceptance Test.
  • Created automated test scripts using Ruby, Selenium, Cucumber, Java, Eclipse, Maven, and the QTP tools.
  • Created test data for automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.
  • Created load test scripts for performance testing using LoadRunner 12.2, Apache JMeter 3.2, and BlazeMeter 1.2.1: created virtual users with calculated threshold values, generated load on the servers, and measured the performance of the web application under multiple concurrent users against those thresholds. Generated and delivered test results and reports in PDF for the Portal 1.0 and Portal 1.1 projects.
  • Integrated the required test scripts into Quality Center and generated test results reports from Quality Center.
  • Extensively worked on SQL: wrote and executed SQL queries per the requirements, exported the data from SQL to spreadsheets, and performed data validation using Excel compare.
  • Validated data between SQL and the portal website to ensure data accuracy across the database, web portal, and third-party test tools.
  • Extensively worked on Linux: wrote and executed Bash scripts for input and output files for data validation.
  • Involved in reproducing defects found in UAT, retesting them, and providing solutions and workarounds.
  • Extensively involved in generating automation scripts using Quick Test Professional.
  • Executed automated scripts in different environments.
  • Extensive experience with the ALM tool for test preparation, execution, and defect management.
  • Generated automated test results for each test case using Quality Center in Word documents and uploaded them to the wiki.
  • Extensively involved in testing AWS cloud features: set up free-tier development and test environments on the AWS cloud directly from the IDE, took control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Extensively involved in generating scripts for RESTful web service testing.
  • Executed the web service test scripts using RESTful web services and SOAP and compared the data with the web service logs, database, and portals.
  • Generated the web service test results using HP Web Service and maintained them in the wiki.
  • Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.
  • Knowledge of Cucumber and Ruby for generating automated scripts.
  • Extensively involved in the deployment of O&M releases and Portal projects and verified the deployment test results/status whenever necessary.
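A minimal sketch of the SQL-vs-portal data validation described above, using an in-memory SQLite table and hypothetical portal data in place of the real database and website:

```python
import sqlite3

# In-memory table standing in for the application database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
db.executemany("INSERT INTO users VALUES (?, ?)",
               [(1, "a@x.com"), (2, "b@x.com"), (3, "c@x.com")])

# Hypothetical rows captured from the web portal for the same records;
# record 3 deliberately disagrees with the database.
portal_rows = {1: "a@x.com", 2: "b@x.com", 3: "c@y.com"}

# Compare the exported database rows against the portal values and
# collect the ids whose data does not match.
db_rows = dict(db.execute("SELECT id, email FROM users"))
mismatched_ids = {k for k in db_rows if db_rows[k] != portal_rows.get(k)}
```

The real validation exported the SQL results to spreadsheets and used Excel compare, but the core operation is this row-by-row equality check.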

Environment: Java, Selenium, Cucumber, Eclipse, Maven, Jenkins, Bamboo, Quick Test Professional 11.0, LoadRunner 12.2, RESTful web services, Apache JMeter 3.2, BlazeMeter 1.2.1, SQL Server 12.0, Oracle 11g, Toad, Big Data, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, ALM, Serena Change Version Manager, AWS, Cloud, etc.

Confidential, DC

QA Test Engineer

Roles and Responsibilities:

  • Thoroughly analyzed the requirements and design defined for BCVS and O&M in the JRule template, developed test scenarios, and uploaded them to Serena Change Manager Version Control and Rule Studio for the following projects: ACT Archive 6.2.1, ADT 6.2/6.3, EOSL Release 2012, Web Recalculation, end-to-end test cases, UPT functional test cases, and 508 Compliance.
  • Developed the Positive, Negative, Regression Test Cases after analyzing the Requirements and Design specifications in Quality Center.
  • Created the Test Environment for testing from within Rule Studio.
  • Documented the Test Results in Rule studio.
  • Created automated test scripts using the QTP/UFT tool; performed interface validation using JavaScript, VBScript, and HP ALM, and validated the interface against the back-end database.
  • Created test data for automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.
  • Created load test scripts for performance testing using LoadRunner 12.0 and Apache JMeter 3.2: created virtual users with calculated threshold values, generated load on the servers, and measured the performance of the web application under multiple concurrent users against those thresholds. Generated and delivered test results and reports in PDF for the ACT Archive 6.2.1, ADT 6.2/6.3, and Web Recalculation projects.
  • Created test data for load test scripts for the different environments (Development, System Test, and User Acceptance Test), executed the scripts, and provided the test results and reports.
  • Extensively involved in creating web service test scripts using RESTful web services and comparing the test results with other tools, such as SOAP-based clients and load tools.
  • Used ILOG for data validation and for monitoring the data through component graphs.
  • Extensively involved in testing AWS cloud features: set up free-tier development and test environments on the AWS cloud directly from the IDE, took control of complex environments to meet developer needs, and performed complete testing in the AWS cloud per the requirements.
  • Documented the defects identified during different types of testing environments: system, CDE-I and ITC.
  • Conducted and Participated in the peer review of QA artifacts such as Test Plan, Test Scenario, Test Cases and Test Results.
  • Wrote and executed test scripts for automation testing using Quick Test Professional/Unified Functional Testing; developed the test suite and ran it in the CDE-I and ITC test environments.
  • Wrote and executed test scripts for performance testing using LoadRunner; developed the test suite, ran it in the CDE-I and ITC environments, and generated the test results.
  • Collaborated with the BAs for performing the testing activities like analyzing the requirements, functional design, Test Scripts, and Test Results.
  • Involved in reproducing defects found in UAT, retesting them, and providing solutions and workarounds.
  • Extensively involved in the deployment of O&M releases and BCVS projects and verified the deployment test results/status whenever necessary.
  • Extensively involved in Reproducing and analyzing the defects found in all the testing environments: System, CDE-I and ITC.
  • Developed an automation suite using QTP/UFT and ran it for each stage of every new build.
  • Developed the performance test suite using LoadRunner and ran it in the CDE-I and ITC environments.
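A toy illustration of validating business-rule outputs of the kind produced by ILOG JRules: each rule is a (name, predicate, expected) triple evaluated against a test record. The record fields and rule logic here are invented for the example, not the real BCVS rules.

```python
# Hypothetical test record; field names are illustrative only.
record = {"age": 66, "status": "retired", "benefit": 1200}

# Each rule pairs a predicate with the outcome the rule set is
# expected to produce for this record.
rules = [
    ("eligible_age",  lambda r: r["age"] >= 65,          True),
    ("benefit_cap",   lambda r: r["benefit"] <= 2000,    True),
    ("active_status", lambda r: r["status"] == "active", False),
]

# A rule "passes" the test when its actual outcome matches the
# expected outcome documented for this scenario.
rule_results = {name: pred(record) == expected
                for name, pred, expected in rules}
all_rules_pass = all(rule_results.values())
```

In the real work the predicates lived in Rule Studio and the expected outcomes came from the JRule template; the test layer only performs this actual-vs-expected comparison.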

Environment: Quick Test Professional 11.0, Unified Functional Testing 14.50, LoadRunner 12.0, Apache JMeter 3.2, RESTful web services, SQL Server 12.0, Oracle 11g, PostgreSQL, Toad, Big Data, MS Access, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, Serena Change Manager Version Control, AWS, Cloud.

Confidential, VA

QA Test Engineer

Roles and Responsibilities:

  • Analyzed business/functional requirements, uploaded them to Requisite Pro, and tagged them in Requisite Pro as functional and business requirements.
  • Pushing the requirements from Requisite Pro to Quality Center using ICART - RMSync.
  • Wrote the test cases/test scripts in Quality Center and mapped them to the respective requirements.
  • Created the Test Data for each Test Case and Executed the Test Cases in Quality Center.
  • Generating the RTM by running SQL Scripts/BOXI.
  • Executed the Test Cases/ Test Scripts in Quality Center for all the New Releases.
  • Created automated test scripts using VBScript and the QTP/UFT tools.
  • Managed user interface validations using JavaScript and integrated them with the VBScript code; created controls using JavaScript and optimized them across all pages of the web applications.
  • Created test data for automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.
  • Created load test scripts using LoadRunner for performance testing.
  • Created test data for load test scripts for the different environments (Development, System Test, and User Acceptance Test), executed the scripts, and provided the test results and reports.
  • Generated the Test Plan and Test Result Summary documents.
  • Wrote SQL queries per the business requirements and executed them in SQL Server 2005.
  • Maintained the admin role for Requisite Pro, Quality Center, ICART, RMSync, and ClearQuest.
  • Analyzed the data and the different report formats (bar, pie, etc.) in Panorama.
  • Developed a complete end-to-end test suite using QTP/UFT and ran it for each stage of every new build.
  • Developed graphs in Excel for the test results of all types of testing.
  • Worked extensively with SQL queries.
  • Involved in the review process for test cases.
  • Identification and notification of Showstopper issues, escalation if and when required during testing. Proper communication with management/offshore team to relay details of issues and their impact on the application.
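The RTM generation above came from SQL scripts and BOXI against Quality Center. At its core, the traceability and gap check reduces to a set difference between the requirements and the requirements covered by test cases; the IDs below are illustrative.

```python
# Hypothetical requirement and test-case ids; the real mapping lived
# in Quality Center and was extracted with SQL scripts/BOXI.
requirements = {"REQ-1", "REQ-2", "REQ-3"}
test_cases = {
    "TC-101": {"REQ-1"},
    "TC-102": {"REQ-1", "REQ-2"},
}

# Union of every requirement referenced by any test case, then the
# requirements with no covering test case (the gap-analysis output).
covered = set().union(*test_cases.values())
uncovered = requirements - covered
```

An RTM report would then list each requirement with its covering test cases and flag `uncovered` entries for new test development.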

Environment: Quick Test Professional 11.0, Unified Functional Testing 14.50, LoadRunner 11.0, SQL Server 12.0, Oracle 11g, Big Data, Toad, Quality Center 11.0, XML, HTML, VMware, ILOG, JRules, Valtool, 508 Compliance, Serena Change Version Manager

Confidential, VA

QA Test Engineer

Roles and Responsibilities:

  • Analyzed business requirements from DOORS and developed the test plan and test cases in HP Quality Center.
  • Developed the test cases for SIT, end-to-end testing, services testing, and back-end testing in Quality Center for the respective business functions.
  • Created the traceability matrix for the SIT, E2E, and service test cases against the respective business requirements.
  • Wrote SQL queries based on the business requirements in DOORS and the tables in the database.
  • Analyzed gaps between the requirements and the test cases under development.
  • Maintained the RTM and generated the SDLC report for the test cases for all business functionalities.
  • Raised defects in ClearQuest and maintained the requirement traceability matrix across ClearQuest and Quality Center.
  • Created test data for each test case and executed the test cases in Quality Center at different stages: preliminary, system, integration, and regression.
  • Created test data for each test case and executed all the SQL queries for each stage: preliminary, system, integration, and regression.
  • Executed all the service test cases using the SOAPSonar tool.
  • Generated request XML files, ran them against the respective business functionalities for web services, generated response XML files, and compared the request and response files.
  • Captured the XML file differences and screenshots during execution.
  • Automated the scripts using VBScript and ran them for the respective business functionalities.
  • Developed an automation suite using QTP and ran it for each stage of every new build.
  • Developed the Test cases for Performance Testing with respect to the Product features.
  • Executed the test cases to check the Performance by using Performance Testing methodologies.
  • Developed graphs in Excel to compare the performance of the product.
  • Responsible for GUI Testing, System Testing, Integration Testing, E2E Testing, Regression Testing, Service Testing and Back End Testing.
  • Worked extensively with SQL queries.
  • Generated SDLC report for all the modules for all the different stages.
  • Involved in the review process for test cases and the RTM process.
  • Identification and notification of Showstopper issues, escalation if and when required during testing. Proper communication with management/offshore team to relay details of issues and their impact on the application.
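A minimal sketch of the request/response XML comparison described above, using a hypothetical payload rather than the real business-function XML:

```python
import xml.etree.ElementTree as ET

# Hypothetical request/response pair; the real work compared generated
# request and response XML files for each business function.
request  = "<Payment><Id>42</Id><Amount>100.00</Amount></Payment>"
response = "<Payment><Id>42</Id><Amount>100.50</Amount></Payment>"

def fields(xml_text):
    """Flatten one level of child elements into a tag -> text dict."""
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

# Collect every tag whose text differs between the two documents.
req, resp = fields(request), fields(response)
diff = {tag: (req[tag], resp.get(tag))
        for tag in req if req[tag] != resp.get(tag)}
```

A tool such as SOAPSonar reports the same kind of field-level difference; this version handles only flat, single-level documents.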

Environment: QTP, SOAPSonar, service testing, performance testing, manual testing, GUI testing, E2E testing, regression testing, black-box testing, integration testing, PVCS, Java, J2EE, VBScript, C#, HTML, XML, Oracle 9i, web services, HP Quality Center, ClearQuest.

Confidential

QA Test Engineer

Roles and Responsibilities:

  • Configured the Quantum storage devices DXi3500, DXi4500, DXi5500, and DXi7500 using the LUN 5 mechanism.
  • Configured the DX100 machines using Solstice DiskSuite and shared Quantum storage over Ultra Wide Differential SCSI interfaces; striped volumes were carefully configured to work optimally with Oracle 7.3.3 and 8.
  • Created both RAID 5 and RAID 0+1 striped LUNs for use with the DXi3500, DXi5500, and DXi7500 in terabyte disk configurations.
  • Configured the SCSI and iSCSI initiators and targets for backup and restore of terabytes of data.
  • Generated a test suite using Rational Robot and the Rational Unified Process.
  • Installed Oracle 7.3.3 and created a basic database; configured the SCSI options for optimal disk I/O and the UFS and VxFS file systems with appropriate block sizes.
  • Installed, debugged, and configured UNIX-based subsystems: installed and configured Solaris 9/10, TCP/IP networking, RAID volumes on storage array disks using DiskSuite, file systems, NAS and NFS services, user accounts, automatic installation via JumpStart, and network terminal servers; configured departmental intranets with HTTP and FTP services, a basic firewall with FireWall-1, and high-availability servers for NFS.
  • Analyzed business requirements and developed the test plan and test cases.
  • Created test data and ran the test cases for SIT, end-to-end testing, and regression testing.
  • Raised defects in Bugzilla.
  • Generated the SDLC report for all stages of the SDLC process.

Environment: Quantum DX100, DXi3500, DXi4500, DXi5500, and DXi7500; Rational tools (Rational Robot, Rational Unified Process); Solaris 2.x; Linux kernels 2.4 and 2.6; RAID Manager 6; IP multicasting; storage arrays: SAN directors and switches; VxFS file system; NFS, NAS, DNS; JumpStart; TCP/IP; SSH; Brocade switches; JNI/QLogic/Emulex fibre boards and configuration; Fibre Channel switches; Brocade 3800 FC switches; SATA-based storage arrays; Bugzilla; Atlassian JIRA; Atlassian Confluence

Confidential

QA Test Engineer

Roles and Responsibilities:

  • Analyzed business/functional requirements, uploaded them to Requisite Pro, and tagged them in Requisite Pro as functional and business requirements.
  • Pushing the requirements from Requisite Pro to Quality Center using ICART - RMSync.
  • Wrote the test cases/test scripts in Quality Center and mapped them to the respective requirements.
  • Created the Test Data for each Test Case and Executed the Test Cases in Quality Center.
  • Generating the RTM by running SQL Scripts/BOXI.
  • Executed the Test Cases/ Test Scripts in Quality Center for all the New Releases.
  • Created automated test scripts using the WinRunner and VBScript tools.
  • Created test data for automated scripts for the Development, System Test, and User Acceptance Test environments, executed the automated scripts in those environments, and provided the results.
  • Generated the Test Plan and Test Result Summary documents.
  • Wrote SQL queries per the business requirements and executed them in SQL Server 2000.
  • Maintaining the Admin Role for Quality Center and Clear Quest.
  • Analyzed the Data and different types of Report format Bar/Pie/etc in SSRS.
  • Developed graphs in Excel for the test results of all types of testing.
  • Worked extensively with SQL queries.
  • Involved in the review process for test cases.
  • Identification and notification of Showstopper issues, escalation if and when required during testing. Proper communication with management/offshore team to relay details of issues and their impact on the application.

Environment: Quality Center, SQL Server 2000, Oracle 9i, Visual Basic, Java, JSP, ASP, WinRunner, VBScript, JavaScript, XML, HTML, DHTML.

TECHNICAL SKILLS:

Testing Tools: QTP/UFT, LoadRunner, JMeter, BlazeMeter, SoapUI, RESTful web services, APIs.

Languages: Java, IBM Curam v6, C, C++, ASP.NET, VB.NET, Cucumber, Selenium, Maven, Protractor, AngularJS, XML, HTML

Database: Oracle 11g/12c, MS SQL Server, DB2, PostgreSQL.

Scripting: JavaScript, VBScript, Unix shell, Python, PL/SQL, and Ruby

Management Tools: Jira, GIT, Jenkins, Bamboo, VSTS, HP ALM, Remedy.

Methodologies: Waterfall, V-Model, Iterative, Agile, Scrum.

Operating System: Windows 7/XP/NT, Linux, Unix.

Servers: IBM WebSphere, JBoss, Apache, Tomcat, IIS.

Networking: LAN, WAN, Router, Modem, Switch

Reporting Tools: Crystal Reports, SSRS
