Apps Systems Engineer Resume
Charlotte, NC
OBJECTIVE:
Senior QA professional with over ten years of experience in software testing, including client-server and web-based applications in Windows and UNIX environments, seeking a consulting or permanent position as a Sr. Software QA Engineer, Sr. Software QA Test Analyst, or Sr. Performance Test Engineer.
SUMMARY:
- Expert in Automation, Manual, Black Box, White Box, Functional, GUI, Positive, Negative, System Integration, Validation, Regression, End-to-End, User Acceptance (UAT), Sanity, Reliability, Compatibility, Security, and Database (Back-End) Testing, as well as Performance, Stress, Load, and Volume Testing within Agile Scrum.
- Expertise in diverse domains including Finance and Banking, US Government, Healthcare, US Medical University, US State Department, and Retail.
- Proficient in risk analysis, quality assurance plans, requirement documents, and hardware/software specifications, with in-depth knowledge of the Software Development Life Cycle (SDLC).
- Strong collaboration, interpersonal, and leadership skills; has managed teams of up to five core members.
- Led the QA team in Scrum ceremonies such as daily stand-up meetings, sprint planning meetings, and sprint recap meetings.
- Expert in Selenium automation frameworks and Selenium WebDriver scripting, performing cross-browser testing across Firefox, Google Chrome, Internet Explorer, Safari, and iPad.
- Expertise in cross-team communication, with the ability to develop good working relationships and partner with other teams, managers, and team members.
- Extensive understanding of SDLC methodologies such as Agile Scrum, Waterfall, RUP, V-Model, Iterative, and RAD.
- Proficient in coordinating UAT teams: providing test scenarios, joining and leading UAT meetings, and managing UAT defect triage calls.
- Expert in manual and automation testing across multiple application servers and portal technologies.
- Knowledge of HP Diagnostics, Introscope, and Rational Functional Tester.
- Expertise in writing, executing, and maintaining Test Strategies, Test Plans, Test Summary Reports, Performance Test Plans, test procedures, test cases, business processes, and test scenarios for every release, along with new-release summary documentation, across all phases of the SDLC.
- Expertise in testing Windows-based applications, IVR/CTI applications, web services, client-server, mainframe, and web-based applications.
- Experience in Capital Markets and Fixed Income for manual and automation testing.
- Solid experience with HP LoadRunner, HP Performance Center, HP Application Lifecycle Management (ALM)/Quality Center (QC), HP Unified Functional Testing (UFT)/QuickTest Pro, Selenium, WinRunner, and IBM Rational ClearQuest.
- Proficient in performance testing using HP LoadRunner and Performance Center.
- Expertise in performance monitoring tools such as HP SiteScope, Dynatrace, AppDynamics, and Splunk.
- Expertise in the HP LoadRunner Analysis tool, creating MS Office and HTML reports.
- Organized, detail-oriented, self-motivated, personable, and able to manage multiple priorities while meeting project milestones and delivery dates.
- Excellent interpersonal, written, and verbal communication skills, with strong analytical, problem-solving, and decision-making abilities.
- Able to work successfully both independently and as a member of a team.
- Expertise in data mapping, data analysis, SQL scripting, SQL*Plus scripting, executing Oracle queries, descriptive programming, VBScript, VuGen scripting, and automation scripting.
- Expertise in UFT (Unified Functional Testing)/QTP custom frameworks using ALM/QC.
- Expertise with multiple UFT/QTP add-ins: Web, Java, jQuery, .NET, Citrix, Silverlight, Flex, etc.
- Expertise in SoapUI for web services testing.
- Expertise in SWIFT, CHIPS, FedWire, and Book Transfer global payments systems.
TECHNICAL SKILLS:
Testing Tools: Selenium, HP LoadRunner, HP Performance Center, HP UFT (Unified Functional Testing)/QuickTest Pro, HP Application Lifecycle Management (ALM), HP WinRunner, VersionOne, Subversion, Oracle SQL Developer, Splunk, Dynatrace, ReadyAPI, Business Availability Center, JIRA, Jazz, Rational ClearCase, ClearQuest, RequisitePro, TOAD, DOORS, PVCS Tracker, and StarTeam.
Operating Systems: Windows, UNIX, Mainframe (VistA Legacy, VistA Legacy CPRS), Linux, and MS-DOS.
Languages: VB, C, C++, Java, JSP, J2EE, J2RE, HTML, XML, ASP, .NET, VBScript, JavaScript, SQL, and PL/SQL; familiar with Silk.
Databases: Oracle, MS SQL Server 2008-2012, and MS Access.
Web Servers: WebLogic, WebSphere, and IIS.
PROFESSIONAL EXPERIENCE:
Confidential, Charlotte, NC
Apps Systems Engineer
Responsibilities:- Work closely with the QA Group Manager, QA Team Lead, development team, Business Analysts, Database Administrators, Test Coordinator, UAT team, offshore team, and QA team members.
- Develop performance test plans, test scenarios, and business processes from the Business Requirements, Functional Requirements, and Design Documents, and save them to SharePoint.
- Create Vuser scripts in VuGen and use HP Performance Center to generate and execute LoadRunner scenarios (a minimal scripting sketch follows this list).
- Run baseline (100%) Stress, Performance, Load, and Infrastructure tests simulating more than 3,000 Vusers logging in at the same time, observing system behavior through HP Performance Center.
- Run Volume tests at 200% of baseline (double the Vusers) using HP Performance Center.
- Run long-duration (soak) tests using HP Performance Center.
- Develop LoadRunner test scripts in VuGen for web-based applications using the TruClient Web and Web HTTP/HTML protocols.
- Develop LoadRunner test scripts in VuGen for web service calls using the Web Services protocol.
- Monitor performance using Dynatrace, AppDynamics, and Splunk.
- Validate the FileNet application certificate and FileNet Transaction Services using ReadyAPI (the commercial successor to SoapUI).
- Analyze test results, develop HTML reports, and publish them to SharePoint.
- Create functional test cases and execute them from HP ALM for the FMS JDK UI, ILMS, and IFT mainframe applications.
- Write test cases and test scripts and, after peer review, upload them to ALM.
- Write SWIFT, CHIPS, FedWire, and Book Transfer test cases and test scenarios and, once peer review is complete, upload them to HP ALM.
- Join the team every morning for FMS JDK UI volume testing.
- Before each payment run, modify SWIFT, CHIPS, FedWire, and Book Transfer messages and inject them using MQJExplorer.
- Perform end-to-end testing for the FMS JDK update.
- Validate SWIFT, CHIPS, FedWire, and Book Transfer global payments processing using the IFT mainframe application.
- Validate SWIFT, CHIPS, FedWire, and Book Transfer global payments processing for the 601-region and 501-region banks' inbound and outbound queues.
- Perform validation testing of SWIFT, CHIPS, FedWire, and Book Transfer global payments processing: unique references, ISN, SSN, OSN, payment amounts, etc.
- Review project test plans, functional system designs, and business requirement documents with the team and management.
- Before opening any new defect found in the SIT environment, verify that the issue can be reproduced in the UAT environment.
- Use HP ALM to maintain the defect lifecycle (creation, reporting, fixing, retesting, and closing).
- Attend defect triage meetings twice a week to get defect updates from the development team.
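The Vuser scripts above were built in VuGen's C-based Web HTTP/HTML, TruClient, and Web Services protocols; purely as an illustration, the sketch below shows the same transaction pattern using LoadRunner's Java Vuser API (lrapi.lr). The transaction and parameter names are hypothetical placeholders, not the project's actual scripts.
```java
// Illustrative LoadRunner Java Vuser sketch (the real scripts used the C-based
// Web HTTP/HTML and TruClient protocols). Names below are hypothetical.
import lrapi.lr;

public class Actions {

    public int init() throws Throwable {
        return 0; // one-time setup per Vuser
    }

    public int action() throws Throwable {
        // Read a data-file parameter; delimiters follow the script's parameter settings.
        String paymentRef = lr.eval_string("<PaymentRef>");

        lr.think_time(3); // simulated user pause, in seconds

        lr.start_transaction("payment_submit"); // response timing starts here
        // ... drive the application under test (e.g., submit the payment) ...
        lr.end_transaction("payment_submit", lr.PASS);

        lr.output_message("Submitted payment " + paymentRef);
        return 0;
    }

    public int end() throws Throwable {
        return 0; // per-Vuser teardown
    }
}
```
In a Performance Center scenario, this action block is what each of the 3,000 concurrent Vusers iterates over, with pacing and think time controlled from the runtime settings.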
Environment: Windows, UNIX, ASP.NET, Solaris, J2EE, WebLogic, Java, Java JDK, JSP, JavaScript, C++, Oracle 10g, XML, HTML, HP LoadRunner, HP Performance Center, HP ALM Desktop Client, HP Analysis, Dynatrace, AppDynamics, Splunk, ReadyAPI, MQJExplorer, IFT Mainframe, SIDE, IBM Cognos, JIRA, Waterfall development methodology, and JDBC.
Confidential, Charlotte, NC
QA Tech Lead
Responsibilities:- Developed Test Outcome, Test Plan, and Test Phase Summary reports and test scenarios from the Business Requirements, Functional Requirements, and Design Documents and, after management approval, saved them to SharePoint.
- Created test cases and executed them for the IVR/CTI application.
- Led the offshore and onsite QA testing teams, meeting with the offshore team every morning for testing updates.
- Wrote test cases, test data, and test scripts and, after peer review, uploaded the test cases to ALM.
- Ensured the delivered product met end users' requirements by testing all scenarios sufficiently under the Agile development methodology.
- Before each new release, coordinated the UAT team: provided test scenarios, met twice a week for defect triage calls, and managed UAT defects.
- Built the Selenium automation framework and created Selenium WebDriver scripts for cross-browser testing across Firefox, Google Chrome, Internet Explorer, iPad, and Safari (a minimal sketch follows this list).
- Analyzed requirements and developed them into automated test scripts using Selenium.
- Identified errors in each modified build by repeatedly executing automated Selenium regression and smoke scripts.
- Developed LoadRunner test scripts in VuGen for web-based applications using the Web HTTP/HTML protocol.
- Created Vuser scripts in HP LoadRunner VuGen and used the LoadRunner Controller to generate and execute scenarios.
- Monitored performance using Dynatrace and Introscope.
- Performed Load, Stress, and Volume Testing, simulating more than 2,500 users logging in at the same time and observing system behavior through the HP LoadRunner Controller.
- Produced performance test reports after completing each Load, Stress, and Volume test.
- Reviewed performance test reports after other teams completed their Load, Stress, and Volume tests.
- Performed manual testing and test validation, independently supporting various QA test cycles and production cycles.
- Performed back-end (validation) testing for the Today TIAA Dashboard using Oracle SQL Developer.
- Used JIRA to maintain the defect lifecycle (creation, reporting, fixing, retesting, and closing) for the Today TIAA Dashboard application.
- Participated in team meetings and walkthroughs, compiled testing-related documents, and assisted in improving testing processes and procedures.
- Met with the project team twice a week for defect triage calls to discuss new defects, assign defects to the development team, and update defect statuses.
- Sent weekly project status reports (Green, Yellow, Red) to management.
- Used Application Lifecycle Management to maintain the defect lifecycle (creation, reporting, fixing, retesting, and closing).
- Manually performed positive, negative, cross-platform, and cross-browser testing.
- Checked the data flow through the front end and back end, used Splunk queries to extract the data, and tested data integrity by executing Splunk searches.
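A minimal cross-browser WebDriver sketch in Java of the approach described above; the URL and the expected title are hypothetical placeholders, and the browser driver binaries are assumed to be installed on the test machine.
```java
// Run the same smoke check in each target browser (hypothetical URL/title).
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class CrossBrowserSmokeTest {

    public static void main(String[] args) {
        checkHomePage(new ChromeDriver());
        checkHomePage(new FirefoxDriver());
    }

    private static void checkHomePage(WebDriver driver) {
        try {
            driver.get("https://example.com/dashboard"); // hypothetical URL
            String title = driver.getTitle();
            if (!title.contains("Dashboard")) {          // hypothetical expected title
                throw new AssertionError("Unexpected title: " + title);
            }
            System.out.println("Smoke check passed in " + driver.getClass().getSimpleName());
        } finally {
            driver.quit(); // always release the browser session
        }
    }
}
```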
Environment: Windows, Solaris, J2EE, WebLogic, JavaScript, C++, Oracle SQL Developer, Splunk, SQL*Plus, TOAD, HP ALM, JIRA, Jazz, Agile development methodology, XML, HTML, HP LoadRunner, HP UFT, and Selenium.
Confidential, Charlotte, NC
Sr. QA Software Test Engineer Lead
Responsibilities:- Led projects testing the implementation and system integration of PCMM for customers such as American Veterans, the US Department of Defense, and Ohio State.
- Met with SMEs and gave demos of the PCMM system in the VA's lab environments.
- Planned, designed, and implemented testing efforts across multiple platforms (Linux, UNIX, Windows) and VistA configurations in a VMware virtual server environment.
- Developed test cases by working closely with VistA MQ programmers to build CPRS tests for automated verification and validation of the MQ Series feed framework, business functionality, and batch data processes.
- Created and maintained test scripts based on user stories and functional requirement documents in the ALM Test Plan module, and set up the Test Lab module for the execution phase.
- Coordinated UAT testers: created UAT test cases and test scenarios, provided them to the testers, met with the testers every day for updates, logged any defects they found in ALM, and updated UAT defect statuses in ALM.
- Worked on the CISS Portal, which merged application servers such as PCMM, ESR, and OHRS, and performed CISS Portal performance testing.
- Helped implement Agile Scrum practices, including VersionOne (V1) scrum planning, sprint tracking, and analytics.
- Created and maintained Master System Test Plans and Performance Test Plans covering the scope of testing, testing schedules, hardware and software requirements, roles and responsibilities, risk, defect management procedures, Test Summary Reports, etc.
- Updated the Requirements Traceability Matrix (RTM) weekly, mapping functional test cases to functional requirements, user stories, and VersionOne tasks, and sent it to the business analyst.
- Developed weekly ALM testing reports covering new test cases created, test cases executed, and defects opened, fixed, closed, and rejected; updated and sent the report to the PM each week.
- Performed Section 508 compliance testing using JAWS and Window-Eyes.
- Performed Load, Stress, and Volume Testing each sprint using HP LoadRunner.
- Developed LoadRunner test scripts in VuGen for the web-based Java application using the TruClient (Firefox) protocol.
- Executed baseline load test scenarios each sprint with 3,000 concurrent users, monitored server performance using SiteScope, analyzed the results, created MS Office reports, and sent them to the development team and management.
- Developed the Unified Functional Testing framework using ALM.
- Developed the UFT automation smoke test script; after each new build, updated and ran the script in the QA SIT environment and sent the results to the PCMM team.
- Performed PCMM user-role and back-end (validation) testing using Oracle SQL Developer (a minimal validation sketch follows this list).
- Built the Selenium automation framework and created Selenium WebDriver scripts for cross-browser testing across Firefox, Google Chrome, and Internet Explorer.
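Back-end validation of this kind boils down to comparing what the UI reports against a direct database query. A minimal JDBC sketch against Oracle, with a hypothetical connection string, table, and role name:
```java
// Hypothetical back-end validation: count rows for a role and assert non-zero.
// Requires the Oracle JDBC driver on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class UserRoleValidation {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dbhost:1521/QASVC"; // hypothetical
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT COUNT(*) FROM user_roles WHERE role_name = ?")) { // hypothetical table
            ps.setString(1, "CARE_COORDINATOR"); // hypothetical role
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                int count = rs.getInt(1);
                System.out.println("Rows for role: " + count);
                if (count == 0) {
                    throw new AssertionError("Expected at least one row for the role");
                }
            }
        }
    }
}
```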
Environment: Windows, UNIX, Linux, Mainframe (VistA, VistA CPRS), Java, Solaris, J2EE, WebLogic, JSP, JavaScript, C++, MS SQL Server 2008/2012, Oracle SQL Developer, XML, HTML, HP LoadRunner, HP Unified Functional Testing (UFT), Application Lifecycle Management (ALM), VersionOne, Subversion, JAWS, Window-Eyes, Agile development methodology, and JDBC.
Confidential, Charlotte, NC
Sr. QA Engineer Lead
Responsibilities:- Understood requirements for new functionality by working closely with the business team and end users to prepare test scripts.
- Worked closely with Project Managers, Technical Project Managers, Business Analysts, Database Administrators, System Administrators, and developers in planning, coordinating, and implementing the QA methodology.
- Built QA test environments and split project work among team members.
- Prepared test cases in Quality Center covering all test scenarios for the new functionality, and tracked and reported issues/defects out of Quality Center.
- Developed the Test Strategy, Performance Test Plan and plan template, test cases, and test scenarios from the Business Requirements, Functional Requirements, and Design Documents, and saved them to SharePoint.
- Developed status update templates and provided timely updates for status review meetings.
- Participated in defect review meetings and coordinated with the development team for timely defect resolution.
- Performed manual testing of the Capital Markets application: wrote and executed test cases and opened defects in HP Quality Center.
- Automated the Capital Markets application's manual test cases using HP QTP.
- Coordinated UAT testers, provided high-level UAT test scenarios, and met with the testers twice a week to review application issues and defects.
- Worked on the Wealth Management Portal, which merged multiple applications (IM&T, Wealth Station, Capital Markets, NAM, AAR, DARS), and performance-tested the portal's many application servers.
- Ensured the delivered product met end users' requirements by testing all scenarios sufficiently under the Agile development methodology.
- Led weekly team meetings, collected updates from team members, gave daily updates to the team, conducted walkthroughs, completed testing-related documents, and assisted in improving testing processes and procedures.
- Used Quality Center to maintain the defect lifecycle (creation, reporting, fixing, retesting, and closing).
- Developed an Excel template, reused across projects, for QA team members to upload their test cases to QC.
- Created Vuser scripts in VuGen and used HP Performance Center to generate and execute LoadRunner scenarios.
- Performed Load, Stress, and Volume Testing, simulating more than 700 Vusers logging in at the same time and observing system behavior through HP Performance Center.
- Developed LoadRunner test scripts in VuGen for web-based applications using the Web HTTP/HTML protocol.
- Enhanced the scripts by adding checkpoints, C-language functions, transactions, and rendezvous points, creating parameters, and performing manual correlation on the recorded scripts.
- Met with Technical Project Managers, Project Managers, developers, and Business Analysts for each project prior to performance testing.
- Configured scenarios and set up the monitoring environment, adding required measurements to capture the performance of the application's web server and client-server resources, including Oracle 10g server performance counters, custom queries, firewall monitors, Windows resource counters, UNIX resource performance counters, and SQL Server performance counters, using HP LoadRunner Performance Center.
- Monitored online graphs such as Transactions per Second (TPS), throughput, and client-side response time, and analyzed them after test completion.
- Analyzed LoadRunner Analysis graphs, including database monitors, network monitor graphs, user graphs, error graphs, and transaction graphs, and used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Analyzed test results, developed MS Office reports, and published them to SharePoint.
- Performed integration and regression testing by repeatedly executing automated test scripts against each modified build using QuickTest Professional.
Environment: Windows, UNIX, ASP.NET, Solaris, J2EE, WebLogic, Java, JSP, JavaScript, C++, Oracle SQL*Plus, TOAD, Oracle 10g, XML, HTML, HP LoadRunner, HP Performance Center, WinRunner, QuickTest Pro, Quality Center, JIRA, Agile development methodology, and JDBC.
Confidential, Norfolk, VA
Sr. Computer Systems Analyst
Responsibilities:- Worked closely with the project team in planning, coordinating, and implementing the QA methodology.
- Gathered and consolidated requirements to generate performance goals and test plans.
- Developed LoadRunner test scripts in VuGen for a Windows-based application using the Oracle (2-tier) protocol.
- Used the Oracle developer library and manual correlation to enhance the LoadRunner VuGen scripts.
- Created correlation rules in LoadRunner VuGen to correlate Oracle Row IDs.
- Configured scenarios and set up the monitoring environment, adding required measurements to capture application client-server performance, including Oracle 10g server performance counters, custom queries, firewall monitors, Windows resource counters, UNIX resource performance counters, and SQL Server performance counters, using the LoadRunner Controller.
- Analyzed test results, tracing and troubleshooting performance bottlenecks.
- Monitored online graphs such as Transactions per Second (TPS), throughput, and client-side response time, and analyzed them after test completion.
- Analyzed LoadRunner Analysis graphs, including database monitors, network monitor graphs, user graphs, error graphs, and transaction graphs, and used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Analyzed test results, developed MS Office reports, and published them to SharePoint.
- Maintained a test log, reported defects, and monitored defect resolution using Quality Center as the defect-tracking tool.
- Actively participated in team meetings and walkthroughs, compiled testing-related documents, and assisted in improving testing processes and procedures.
- Developed the QTP object repository, data files, function library, reusable actions, and main driver script, and showed team members how to maintain and work with them throughout the project.
- Gave hands-on HP LoadRunner training to other team members to familiarize them with the existing project.
Environment: Windows, UNIX, Solaris, J2EE, WebLogic, Java, JSP, JavaScript, C++, Oracle SQL*Plus, TOAD, Oracle 10g, Oracle Project Raptor, XML, HTML, HP LoadRunner, WinRunner, QuickTest Pro, Quality Center, Rational ClearQuest, Rational ClearCase, and JDBC.
Confidential, Charlotte, NC
Sr. QA Analyst
Responsibilities:- Developed the Test Strategy, Test Plan, test cases, and test scenarios from the Business Requirements, Functional Requirements, and Design Documents, and saved them to SharePoint.
- Wrote test cases, test data, and test scripts.
- Ensured the delivered product met end users' requirements by testing all scenarios sufficiently under the Agile development methodology.
- Analyzed requirements and developed them into automated test scripts using QuickTest Professional.
- Performed regression testing by repeatedly executing automated test scripts against each modified build using QuickTest Pro.
- Created Vuser scripts in VuGen and used the Controller to generate and execute LoadRunner scenarios.
- Performed Load, Stress, and Volume Testing, simulating more than 4,000 users logging in at the same time and observing system behavior through the LoadRunner Controller.
- Inserted transactions and checkpoints into the web application VuGen scripts, and parameterized and manually correlated the scripts.
- Created scripts that let the Controller measure the performance of the web server, Java application, client server, database server, and middleware server under various load conditions.
- Used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Created scenarios, set up monitors and the monitoring environment, and added required measurements, including Oracle 10g server performance counters, custom queries, firewall monitors, web server resource monitors, Windows resource counters, middleware performance monitors, client-server resource monitors, UNIX resource performance counters, J2EE performance counters, IIS web server monitors, etc.
- Conducted manual testing and test validation, independently supporting various QA test cycles and production cycles.
- Extensively developed SQL*Plus queries and performed back-end testing for web/internet applications.
- Used StarTeam to maintain the defect lifecycle (creation, reporting, fixing, retesting, and closing).
- Actively participated in team meetings and walkthroughs, compiled testing-related documents, and assisted in improving testing processes and procedures.
- Used Quality Center to maintain the defect lifecycle (creation, reporting, fixing, retesting, and closing).
- Manually performed positive, negative, cross-platform, and cross-browser testing.
- Checked the data flow through the front end and back end, using SQL*Plus queries to extract data from the database, and tested database integrity by executing SQL*Plus statements.
- Maintained a test log, reported defects, and monitored defect resolution using StarTeam as the defect-tracking tool.
Environment: Windows, Solaris, J2EE, WebLogic, JavaScript, C++, Oracle SQL*Plus, TOAD, StarTeam, Agile development methodology, XML, HTML, HP LoadRunner, QuickTest Pro, and Quality Center.
Confidential, Nashville, TN
QA Performance Test Engineer
Responsibilities:- Created LoadRunner Vuser scripts for the business transactions using multiple protocols (Web HTTP/HTML and Windows Sockets).
- Inserted transactions and checkpoints into the ASP.NET web application LR VuGen scripts, and parameterized and correlated the scripts.
- Created scripts that let the Controller measure the performance of the web server, client server, database server, and middleware server under various load conditions.
- Performed Load and Stress Testing, simulating more than 2,000 users logging in at the same time and observing system behavior through LoadRunner.
- Used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Created scenarios, set up monitors and the monitoring environment, and added required measurements, including web server performance counters, SiteScope resource monitors, J2EE monitors, web server resource monitors, network delay monitors, Windows resource counters, middleware performance monitors, client-server resource monitors, IIS web server monitors, Application Deployment Solution monitors, etc.
- Analyzed test results, developed HTML reports, and published them in Quality Center.
- Developed the object repository, data files, function library, reusable actions, and main driver script, and showed team members how to maintain and work with them throughout the project.
- Maintained a test log, reported defects, and monitored defect resolution using Quality Center as the defect-tracking tool.
Environment: Windows, ASP.NET, Solaris, J2EE, WebLogic, JavaScript, C++, Oracle SQL*Plus, XML, HTML, TOAD, LoadRunner, QuickTest Pro, WinRunner, Business Availability Center, and Quality Center.
Confidential, SC
Software Performance Test Engineer
Responsibilities:- Created LoadRunner Vuser scripts for the business transactions using the Web HTTP/HTML protocol.
- Inserted transactions and checkpoints into the ASP.NET web application VuGen scripts, and parameterized and correlated the scripts.
- Created scripts that let the Controller measure the performance of the web server, client server, database server, and middleware server under various load conditions.
- Performed Load and Stress Testing, simulating more than 3,000 users logging in at the same time and observing system behavior through LoadRunner.
- Used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Created scenarios, set up monitors and the monitoring environment, and added required measurements, including web server performance counters, SiteScope resource monitors, Oracle performance counters, web server resource monitors, network delay monitors, Windows resource counters, middleware performance monitors, client-server resource monitors, IIS web server monitors, etc.
- Analyzed test results and developed the HTML report.
Environment: Windows, ASP.NET, Citrix, Solaris, J2EE, WebLogic, JavaScript, C++, Oracle SQL*Plus, TOAD, LoadRunner, QuickTest Pro, and Quality Center.
Confidential, Charlotte, NC
Software Performance Test Engineer
Responsibilities:- Created Vuser scripts in VuGen and used the Controller to generate and execute LoadRunner scenarios.
- Performed Stress, Load, and Volume Testing, simulating more than 4,000 Vusers logging in at the same time and observing system behavior through LoadRunner.
- Created scripts that let the Controller measure the performance of the web server, Java application, client server, database server, and middleware server under various load conditions.
- Used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Created scenarios, set up monitors and the monitoring environment, and added required measurements, including Oracle 9i server performance counters, Oracle performance counters, custom queries, firewall monitors, web server resource monitors, Windows resource counters, middleware performance monitors, client-server resource monitors, UNIX resource performance counters, J2EE performance counters, IIS web server monitors, etc.
- Developed GUI, regression, functionality, positive/negative, and data-driven tests using QTP.
- Performed regression testing by repeatedly executing automated test scripts against each modified build using QuickTest Pro.
- Maintained a test log, reported defects, and monitored defect resolution using Quality Center as the defect-tracking tool.
- Analyzed test results, developed the HTML report, and published the reports to SharePoint.
- Wrote Java Message Service (JMS) programs using WebSphere MQ Version 6 and Rational Application Developer Version 6 (a minimal sketch follows this list).
- Used JMS and WebSphere MQ for the test environment.
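A minimal sketch of the kind of JMS program described above, putting a test message onto a WebSphere MQ queue; the host, queue manager, and queue names are hypothetical placeholders.
```java
// Put a test message on a WebSphere MQ queue via the classic MQ JMS classes.
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import com.ibm.mq.jms.JMSC;
import com.ibm.mq.jms.MQQueueConnectionFactory;

public class TestMessageSender {

    public static void main(String[] args) throws Exception {
        MQQueueConnectionFactory factory = new MQQueueConnectionFactory();
        factory.setHostName("mqhost");      // hypothetical host
        factory.setPort(1414);
        factory.setQueueManager("QA.QMGR"); // hypothetical queue manager
        factory.setTransportType(JMSC.MQJMS_TP_CLIENT_MQ_TCPIP);

        QueueConnection conn = factory.createQueueConnection();
        try {
            QueueSession session = conn.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("QA.TEST.INBOUND"); // hypothetical queue
            QueueSender sender = session.createSender(queue);

            // The body stands in for a generated test message.
            TextMessage msg = session.createTextMessage("<test><ref>TEST-001</ref></test>");
            sender.send(msg);
            System.out.println("Test message sent");
        } finally {
            conn.close();
        }
    }
}
```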
Environment: Windows, ASP.NET, Citrix, Solaris, J2EE, WebLogic, JavaScript, WebSphere MQ, C++, Oracle SQL*Plus, TOAD, LoadRunner, QuickTest Pro, StarTeam, and Quality Center.
Confidential, Annapolis, MD
Performance Test Engineer
Responsibilities:- Created Vuser scripts in VuGen using the SAP and Citrix protocols and used the Controller to generate and execute LoadRunner scenarios.
- Performed Stress, Load, and Volume Testing, simulating more than 5,000 users logging in at the same time and observing system behavior through LoadRunner.
- Inserted transactions and checkpoints into the Web, SAP GUI R/3, Citrix, and Win GUI VuGen scripts, and parameterized and correlated the scripts.
- Created scripts that let the Controller measure the performance of the web, SAP GUI R/3, Citrix, and Win GUI servers under various load conditions.
- Used Performance Monitor to analyze % CPU usage, memory, and requests per second for each scenario.
- Created scenarios, set up monitors and the monitoring environment, and added required measurements, including ERP/CRM server monitors, SAP Portal server resource monitors, SAP CCMS resource monitors, SAPGUI server resource monitors, Citrix MetaFrame XP monitors, Oracle database monitors, firewall monitors, web server resource monitors, middleware performance monitors, client-server resource monitors, UNIX resource monitors, IIS web server monitors, etc.
- Worked with SAP Solution Manager as well as SAP Portal R/3.
- Analyzed test results and developed the HTML report.
Environment: Windows, ASP.NET, Citrix, Win GUI, Solaris, J2RE, WebLogic, JavaScript, C++, Oracle SQL*Plus, TOAD, LoadRunner, QuickTest Pro, DOORS, and Quality Center.