
Systems Engineer Resume


Falls Church, VA

SUMMARY:

  • 10 years of IT experience with an active MBI (Public Trust) clearance as a Systems Engineer, Quality Analyst, Software Test Engineer, and Performance Engineer. Experienced with automated performance testing tools; skilled in managing, modeling, analyzing, and tuning the performance characteristics of a system; proficient with standard performance test tools.
  • Performed performance engineering at many different companies across multiple applications, including load, endurance, and stress testing and capacity planning to understand production performance.
  • Experience performance testing cloud applications end to end: scripting, running high volumes of virtual users, analyzing test summaries, and presenting results to all stakeholders.
  • Good understanding of storage technologies, networking, and performance testing of storage subsystems; able to recognize the limitations of software architectures and to conduct load and performance tests. Familiar with performance test scripts and with automated and manual functional test scripts.
  • Experience as an Agile QA engineer working with business and technology teams to define, execute, and manage iterative functional and regression testing processes.
  • Able to handle multiple projects at the same time, make decisions under pressure, and manage an offshore team.
  • As a QA Analyst, worked closely with Project Managers, Business Analysts, and Team Leads to develop test strategies and plan production schedules.
  • Responsible for tracking the progress of test activities and reporting them to the Project Manager.
  • Possess good communication skills; self-motivated, proactive, task-oriented, and a quick learner of new technologies; able to work independently; a good team player who interacts with all levels of management.
  • Experienced in implementing Agile methodology in a QA environment for better efficiency, productivity, and quality of work.
  • Performed manual and automated testing on both client-server and web-based applications, including front-end and back-end testing.
  • Performed functionality, regression, stress, and screen-navigation testing using QTP/WinRunner. Experienced in web-based and console-based Finesse fixture testing.
  • Worked extensively in functional, integration, system, and user acceptance testing (UAT) and production checkouts, with extensive positive and negative testing.
  • Speaks clearly and persuasively in positive or negative situations; speaks three languages.
  • Detail-oriented, capable of prioritizing a broad range of responsibilities in order to meet deadlines.
  • Able to deal with frequent changes, delays or unexpected events.
  • Positive thinking, self-directed and results-oriented with the ability to motivate and inspire others.
  • Adept at communicating effectively with customers, vendors, and staff.
  • Able to motivate employees to perform to their maximum potential.
  • Exceptional organizational and planning skills; adaptable; enjoy new challenges.

TECHNICAL SKILLS:

Languages/GUI: VB.NET, C#, VB, C, HTML, SQL, Java, PowerBuilder, Smalltalk, Pega, Silverlight.

Testing Tools: QTP, RQM, Quality Center, LoadRunner, Performance Center, WinRunner, JMeter, SharePoint, ClearQuest, Quality Manager, HP SiteScope, TFS, ALM.

RDBMS/DBMS: Oracle (10g, 11g, and 12c), Teradata, SQL Server, MS Access

Web development: VBScript, C, ASP.NET, C#, Visio

Tools: XML Diff, Stylus Studio, UltraEdit, TextPad, SharePoint, MQWIN, VSS, PowerPoint, Toad, SQL Developer, Citrix, DB2, Mainframe, IBM uDeploy.

Operating Systems: Windows 10, Windows 7, Windows XP Pro, Windows NT 4.0, Windows 2000/98/95, and Unix

LoadRunner Protocols: HTTP/HTML, Web Services, Tuxedo, Oracle 2-Tier, Citrix ICA, DB2 CLI, Ajax TruClient

PROFESSIONAL EXPERIENCE:

Confidential, Falls Church, VA

Systems Engineer

Responsibilities:

  • Working as a Systems Engineer performing application performance testing for the Department of Defense.
  • Create LoadRunner scripts using the Web Services, Oracle 2-Tier, and Tuxedo protocols for performance testing of the defense medical application, and deliver the scripts to the client for performance testing in their own environment.
  • Understand overall system and environment details to analyze and calculate the system load for each test.
  • Understand the different versions of the application (AHLTA), the mid-tier applications (12 SyncManagers), and other parts of the system for performance testing in each Agile sprint.
  • Understand critical system architecture diagrams and system details relevant to application performance.
  • Use LoadRunner to create scripts with customization and complex parameterization, then run and analyze multi-user load tests (see the sketch after this list).
  • Run load and soak tests in the test environment to uncover production performance issues.
  • Analyze data in the Oracle database with SQL queries in Oracle SQL Developer.
  • Create test summary reports and other important work-related documents using Microsoft Word, Excel, and PowerPoint, and keep them on the shared drive and in SharePoint.
  • Created workload documents that calculate load by script, transaction, and think time.
  • Analyze test results from LoadRunner transactions together with disk, CPU, and memory data from Windows Perfmon.
  • Delivered test summary reports to management on time while preparing for the next test.
  • Use Jira to open and track tickets for all day-to-day assignments.
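
A minimal sketch of the kind of parameterized VuGen (C) script described above, using the Web (HTTP/HTML) protocol for illustration; the transaction name, URL, and {pPatientId} parameter are hypothetical examples, not the actual AHLTA scripts.

/* Minimal VuGen Action() sketch (Web - HTTP/HTML protocol). */
Action()
{
    /* Mark the start of a measured business transaction. */
    lr_start_transaction("patient_lookup");

    /* {pPatientId} is drawn from a VuGen parameter file, so each
       iteration sends a different, realistic value. */
    web_url("patient_lookup",
        "URL=http://appserver.example/lookup?id={pPatientId}",
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("patient_lookup", LR_AUTO);

    /* Pause between iterations to model user pacing. */
    lr_think_time(5);

    return 0;
}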

Environment: Oracle 11g/12c, SharePoint, Tuxedo, LoadRunner (9.5, 11.5, 12.52, 12.6), Protocols (Tuxedo, Oracle 2-Tier, and Web Services), .NET.

Confidential, Bowie, MD

Performance Test Lead

Responsibilities:

  • Working in an Agile Scrum environment, participating closely in sprint planning, daily stand-ups, and sprint demos, and running performance tests on each build before production.
  • Managing performance testing for two applications at the same time.
  • Also performance testing a cloud application end to end, from scripting through report analysis and report delivery to the client.
  • Review the product backlog and select test scenarios each sprint.
  • Prepared test cases based on the technical specification document.
  • Reviewed database test cases for the assigned module.
  • Before each sprint release, deploy code to the performance testing environment using IBM uDeploy.
  • Use Performance Center, LoadRunner, and VuGen to create scripts and run multiple tests in the performance test environment.
  • Test multiple applications using the HTTP/HTML, Silverlight, and Ajax TruClient LoadRunner protocols to communicate with the application servers (a correlation sketch for this kind of script follows this list).
  • Use SiteScope to set up monitoring of all web and application servers and watch them while the application is under test.
  • Monitor the database server with IDERA SQL Server tools.
  • Coordinated closely with the testing team and developers during the testing life cycle.
  • Work very closely with developers and the infrastructure team on all application performance issues.
  • Collect, document, and review all production performance issues every week and discuss them with the performance team.
  • Set up the performance environment and resolve environmental issues before testing.
  • Keep the entire performance test document matrix in SharePoint to share with the offshore team and others.
  • Present the test plan and test summary report to stakeholders for every scripted test.
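
A minimal VuGen (C) correlation sketch for the HTTP/HTML scripts mentioned above: a dynamic server value is captured from one response and reused in the next request. The boundaries, URLs, and parameter name are hypothetical.

/* Capture a dynamic value (e.g., a session token) and reuse it. */
Action()
{
    /* Register the capture BEFORE the response that contains the value. */
    web_reg_save_param("SessionToken",
        "LB=token=\"",
        "RB=\"",
        LAST);

    web_url("login_page",
        "URL=http://app.example/login",
        "Resource=0",
        "Mode=HTML",
        LAST);

    /* Reuse the captured value in the next request. */
    web_url("dashboard",
        "URL=http://app.example/dashboard?token={SessionToken}",
        "Resource=0",
        "Mode=HTML",
        LAST);

    return 0;
}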

Environment: Windows Server, SQL Server, TFS, SharePoint, LoadRunner, Performance Center, SiteScope, IDERA, .NET, Silverlight, and IBM uDeploy.

Confidential, Lanham, MD

Information Technology Specialist

Responsibilities:

  • Working on backend testing for the Confidential database conversion, built on mainframe DB2 technology.
  • Identifying the test environments for database testing.
  • Prepared test cases based on the technical specification document.
  • Reviewed database test cases for the assigned module.
  • Using the IBM tool RQM to create and update test cases for backend testing of the tax application.
  • As an SDLC process committee member, responsible for creating and reviewing the testing process, testing templates, guidelines, checklists, and related work instructions.
  • Coordinated closely with the testing team during the testing life cycle.
  • Documented the whole testing procedure, including test cases, results, status, and test summary reports.
  • Developed test cases for black-box testing and executed them.
  • Performed System Acceptance Testing (SAT).
  • Created a test matrix to report the progress and status of the testing effort.
  • Followed up with developers on defect status on a daily basis.
  • Performed database conversion testing using DB2 on the mainframe.
  • Analyze the design document, requirements document, Balance & Control documents, and Individual Master File documents for the yearly tax return.
  • Use the IBM tool RQM to write test cases, execute tests, and log defects.
  • Held regular QA meetings with the business review team, developers, and project manager on test report status; involved in enhancements and production request issues.

Environment: Mainframe, DB2, IBM RQM, CADE2 Database Tax Module.

Confidential, Arlington, VA

Performance Test Engineer

Responsibilities:

  • Working as a performance test engineer on a Pega system (Pega is a leader in business process management (BPM) and customer relationship management solutions).
  • Working at the client site in West Virginia (client Confidential) and the home office in Northern Virginia (main office) on the performance of the Confidential background check application.
  • Define the overall performance test plan goals (why the performance test is needed).
  • Use LoadRunner to create, run, and analyze multi-user load tests.
  • Use the Ajax TruClient and HTTP/HTML protocols in LoadRunner to create scripts.
  • Talk to the client every week to better understand the application's performance-critical points and their requirements.
  • Created a performance testing tracker to make the work easier.
  • Upload and download all documents via SharePoint.
  • Use HP Performance Manager to monitor servers and the network.
  • Conducting requirements and performance scenario identification workshops with process teams to determine key performance test scenarios for test execution.
  • Determining the performance test approach (load test, volume test, stress test, endurance test) and interleaving this across the identified scenarios (a back-of-the-envelope sizing sketch follows this list).
  • Building a performance test plan that details the scenarios and approach, and working with the Basis & infrastructure and process teams to execute the plan.
  • Recording and reporting results of the plan to the PMO.
  • Working with all relevant process teams to mitigate and document issues.
  • Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Report.
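
A back-of-the-envelope sizing sketch in C for the workload-determination step above, using Little's Law (concurrent Vusers = target throughput x (response time + think time)); the numbers are hypothetical examples, not project figures.

#include <stdio.h>

/* Little's Law sizing: Vusers = throughput * (response + think time). */
int main(void)
{
    double target_tps   = 20.0;  /* target transactions per second */
    double resp_time_s  = 2.0;   /* expected response time (seconds) */
    double think_time_s = 8.0;   /* scripted think time (seconds) */

    double vusers = target_tps * (resp_time_s + think_time_s);
    printf("Concurrent Vusers needed: %.0f\n", vusers);  /* 200 */
    return 0;
}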

Environment: Pega, SharePoint, ClearQuest, Quality Manager, Ajax TruClient and HTML/HTTP protocols, Java.

Confidential, Camp Hill, PA

QA Analyst\Performance Test

Responsibilities:

  • Analyzed the functional specification document for testing the application, and prepared and executed test cases according to the business analysis document specifications.
  • Involved in testing the application during smoke testing, system testing, regression testing, and performance testing.
  • Created new and updated existing test cases according to the business requirements and functional specifications.
  • Used a requirements traceability matrix to verify the functional specifications against the test cases.
  • Developed automation scripts in Qualitia to automate smoke and regression testing.
  • Created and updated test cases in JIRA and managed the test cases for each module and scenario.
  • Performed all web testing for this application and conducted security testing.
  • Identified critical test cases to be executed during smoke testing and documented them in JIRA.
  • Involved in end-to-end testing and responsible for the quality assurance of this application, completed in record time.
  • Working with IBM Rational Performance Tester to run large numbers of concurrent users against the web-based application.
  • Create performance test scenarios with the right number of concurrent users for accurate measurement, and monitor server performance.
  • Validated the performance, scalability, and reliability of the web-based application.
  • Created and developed workload schedules for various performance and load testing scenarios.
  • Executed tests with both small and large loads and evaluated the resulting data.
  • Created test data to parameterize the scripts for many users (a data-generation sketch follows this list).
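
A small, hypothetical C sketch of the kind of test-data generation mentioned in the last bullet: it writes a CSV of unique accounts that a performance test script can consume as a datapool or parameter file.

#include <stdio.h>

/* Generate a CSV of unique test accounts for script parameterization. */
int main(void)
{
    FILE *fp = fopen("users.csv", "w");
    if (!fp) { perror("users.csv"); return 1; }

    fprintf(fp, "username,password\n");
    for (int i = 1; i <= 500; i++)
        fprintf(fp, "loaduser%04d,Passw0rd!%04d\n", i, i);

    fclose(fp);
    return 0;
}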

Environment: JIRA, JAMA, Qualitia, IBM Rational Performance Tester, HTML/HTTP protocol, Java, DB2, JSP, C#, and .NET

Confidential, Whitehouse Station, NJ

Performance Test Engineer

Responsibilities:

  • Working on multiple applications with different protocols, using the Citrix ICA, DB2 CLI, ODBC, and HTTP/HTML protocols as appropriate to each application.
  • Create test plans, test cases, and test scenarios for performance testing, and meet with the project manager, QA manager, and corporate stakeholders twice a week.
  • Working with applications written in many different languages: Java, Smalltalk, PowerBuilder, C#/.NET, and Visual Basic.
  • Record scripts against the Citrix client-server application and with HTTP/HTML for the e-business application.
  • Use a protocol analyzer to analyze each application and find the correct protocol.
  • Working in HP Performance Center with large numbers of concurrent users against the web-based application.
  • Create performance test scenarios with the right number of Vusers for accurate measurement, and monitor server performance.
  • Using LoadRunner, execute multi-user performance tests with online monitors, real-time output messages, and other features of the LoadRunner Controller (see the sketch after this list).
  • Used SiteScope to collect metrics from servers.
  • Used SiteScope to monitor the databases, application servers, and web servers (at the OS and application level) for performance bottlenecks while conducting load, stress, volume, and memory tests.
  • Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Report.
  • Perform in-depth analysis to isolate points of failure in the application.
  • Monitor and administer hardware capacity to ensure the necessary resources are available for all tests.
  • Identified disk usage, CPU, and memory for the web, application, and database servers and how the servers are loaded.
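
A VuGen (C) sketch of a measured, verified request that emits real-time output messages, as referenced in the list above; the URL, check text, and transaction name are hypothetical.

/* Verified, measured request with real-time output messages. */
Action()
{
    /* Register a text check BEFORE the request it guards; SaveCount
       stores how many times the text was found. */
    web_reg_find("Text=Welcome", "SaveCount=WelcomeCount", LAST);

    lr_start_transaction("home_page");

    web_url("home_page",
        "URL=http://app.example/",
        "Resource=0",
        "Mode=HTML",
        LAST);

    /* Fail the transaction explicitly if the expected text is missing. */
    if (atoi(lr_eval_string("{WelcomeCount}")) == 0) {
        lr_error_message("Expected text not found on home page");
        lr_end_transaction("home_page", LR_FAIL);
    } else {
        lr_end_transaction("home_page", LR_PASS);
    }

    return 0;
}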

Environment: LoadRunner, LoadRunner Agent, Performance Center, Quality Center, QTP. Protocols: HTML/HTTP, Citrix, ODBC, and DB2 CLI. Languages: C, C++, Java, Smalltalk, and PowerBuilder.

Confidential, Fort Worth, TX

Performance Test Engineer

Responsibilities:

  • Involved in creating test cases and test scenarios from test plans and analyzing business requirements documents.
  • Attended all business and technical requirements reviews and daily meetings, and reviewed the business requirements documents.
  • Understood the complex rich-client architecture of the F-35 fighter aircraft applications.
  • Created and maintained all automation test scripts in LoadRunner based on use cases.
  • Recorded scripts against the Java-based client-server application.
  • Monitored server performance metrics at the UNIX level.
  • Developed and enhanced scripts using LoadRunner VuGen and designed scenarios in Performance Center to generate realistic load on the application under test.
  • Created the test data to parameterize the scripts for many users.
  • Developed automation scripts in QTP to automate smoke and regression testing.
  • Used Quality Center to execute automation scripts in batch or unattended mode.
  • Enhanced QTP scripts by applying checkpoints, parameterization, synchronization points, and data-driven tests, and by creating modular tests.

Environment: LoadRunner, Performance Center, Quality Center, QTP, HTML/HTTP protocol, C, WebLogic, JSP, HTML, Java.

Confidential, Silver Spring, MD

QA Analyst

Responsibilities:

  • Fully involved in analyzing business requirements documents and writing the test plan.
  • Involved in creating test plans, test estimates, test execution, and QA schedules.
  • Developed and maintained all manual test cases and test scripts based on use cases.
  • Performed manual testing on different modules of the application.
  • Involved in manual testing using Test Director to develop test cases and test scripts and to execute the scripts. Analyzed the test plan, which detailed the testing scope, strategy, test requirements, and necessary resources.
  • Prepared and maintained the automation test scripts to perform regression testing.
  • Prepared the traceability matrix and root-cause analysis reports.
  • Coordinated closely with the testing team during the testing life cycle.
  • Documented the whole testing procedure, including test cases, results, status, and test summary reports.
  • Developed test cases for black-box testing and executed them.
  • Performed all web testing for this application and conducted security testing.
  • Performed functional testing, black-box testing, and gray-box testing.
  • Wrote SQL queries for data manipulation language (DML) operations and data-feed testing on Oracle.
  • Developed and executed automated test scripts for functional and regression testing using HP's QTP.
  • Worked extensively with QuickTest Pro, creating master scripts and modifying scripts in expert mode for web applications.
  • Wrote functional scripts using QuickTest Professional, identifying all windows, pages, objects, methods, and properties.
  • Responsible for performing functional and regression testing on the application by creating automated test scripts using QTP.
  • Developed manual test cases and test scripts to test the functionality of the application using QTP.
  • Created scripts using the LoadRunner Virtual User Generator; parameterized unique and dynamic content in the application scripts for realistic emulation. Data was queried from the database, and input data for data-driven tests was kept in Excel files.
  • Created Vuser scripts in the Virtual User Generator and created load scenarios in the LoadRunner Controller.
  • Worked extensively with LoadRunner to analyze application performance under varying load and stress conditions.
  • Created load test scripts using LoadRunner and developed test cases for stress and performance testing of the application by creating virtual users.
  • Created the Vusers in LoadRunner that put load on the application; Vuser scripts use C as the scripting language (see the sketch after this list).
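
A minimal sketch of the three-section C Vuser structure noted in the last bullet, as VuGen generates it (vuser_init / Action / vuser_end); the URL and messages are illustrative.

/* Three-section Vuser structure as generated by VuGen. */
vuser_init()
{
    int id, scid;
    char *group;

    /* lr_whoami fills in the Vuser's id and group at runtime. */
    lr_whoami(&id, &group, &scid);
    lr_output_message("Vuser %d in group %s starting", id, group);
    return 0;
}

Action()
{
    lr_start_transaction("home_page");
    web_url("home_page",
        "URL=http://app.example/",
        "Resource=0",
        "Mode=HTML",
        LAST);
    lr_end_transaction("home_page", LR_AUTO);
    lr_think_time(3);
    return 0;
}

vuser_end()
{
    lr_output_message("Vuser finished");
    return 0;
}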

Environment: LoadRunner, QTP, Quality Center, J2EE, Sun Application Server, XML, RSS, Windows XP, TOAD, Oracle, HTML, JavaScript, ClearQuest, C#.
