
Software Engineer / Senior Performance Engineer Resume

Columbus, OH

SUMMARY:

  • Over 9 years of cumulative experience in the IT Software Development Life Cycle, with emphasis on Performance Engineering and load testing in Windows and Unix environments. Extensive testing exposure includes performance and automated testing of client-server, multi-tier, stand-alone, and web-based applications, using JMeter/BlazeMeter and HP LoadRunner/Performance Center for testing; Dynatrace, CA Wily Introscope, and NMON for monitoring; and HP ALM suite, SPLUNK, TDM services, MQ (message queues), and CA LISA.
  • Experienced in the requirements gathering and documentation process and in overall load testing.
  • Maintain relationship with business units - work with all levels of staff to identify work process, definition of success and any existing expectations to ensure a successful outcome.
  • Lead Test Strategy development, Test Planning, Resource Allocation, and Environment and Configuration Management for the Lab Program.
  • Experienced in Planning, Implementation and Operations of Performance Engineering and Functional Test Automation.
  • Extensive experience in designing Test Strategies, Test Plans, Test Cases, Test Scenarios, Test Scripts and Test reports of manual and automated tests.
  • Strong experience on Web-based, SaaS, Mainframe technologies and AIX platforms.
  • Possess excellent abilities in HP Performance Center/LoadRunner and the HP ALM suite.
  • Created scripts in JMeter and/or HP LoadRunner, and executed Smoke, Load, Endurance, and Stress tests using HP Performance Center / BlazeMeter.
  • Analyzed results using the LoadRunner Analysis tool and NMON (for UNIX).
  • Used SPLUNK to retrieve volume data from production and used it to align Load/Endurance/Stress tests with production.
  • Hands on experience in system testing and performance engineering of numerous Banking and Loan Applications, e-Commerce sites and internal web portals.
  • Solid understanding of Web application technologies and implementations in a complex multi-tier environment.
  • Experienced in transforming raw performance/scalability testing data into meaningful Capacity Planning for Software and Hardware.
  • Experienced in troubleshooting performance test failures and in effectively communicating with and resolving issues alongside core product engineering teams.

TECHNICAL SKILLS

  • HP Performance Center/LoadRunner 9.5
  • Quality Center 11.5
  • ALM12.01
  • ALM12.53
  • Wily CA
  • LISA
  • SPLUNK
  • LDAP
  • Oracle DB
  • MS SQL Server
  • DB2
  • WebLogic
  • WebSphere
  • Apache Tomcat
  • Load Balancer - F5
  • J2EE
  • Web
  • Windows 2000 / XP/2003
  • AIX 5.3
  • Linux
  • Citrix
  • Quick Test Pro
  • Oracle BPM 10gR3
  • WCI Portal 10gR4
  • CRM

PROFESSIONAL EXPERIENCE:

Confidential, Columbus, OH

Software Engineer /Senior Performance Engineer

Responsibilities:

  • Managing offshore and onshore teams and utilizing resources as effectively as possible to maximize accurate deliverables.
  • Creating test plans incorporating performance testing objectives, testing environment, user profiles, risks, test scenarios, explanation of the tools used, schedules, analysis, monitoring, and presentation of results.
  • Analyzed the requirement and design documents.
  • Wrote LoadRunner scripts, enhanced scripts with C functions, parameterized users, stored dynamic content with LoadRunner functions, and used client-side secure certificates.
  • Worked extensively in the Web-HTTP/HTML, Web Service, Ajax TruClient, and RTE protocols in LoadRunner as well as JMeter.
  • Wrote text checks and created scenarios for concurrent (rendezvous) and sequential users.
  • Created Single User, Base Line and Soak test scenarios. Random pacing between iterations was introduced to get the desired transactions per hour.
  • Worked on TDM services and MQ (Message queues) technology.
  • Used SPLUNK to retrieve volume data from production and align Load/Endurance/Stress tests with production traffic.
  • Used DataPower for Performance test.
  • Performance tested a mainframe application using the LoadRunner RTE protocol.
  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Used JMeter and HP LoadRunner to create scripts, and executed Smoke, Load, Endurance, and Stress tests using HP Performance Center / BlazeMeter.
  • Analyzed results using the LoadRunner Analysis tool; used Dynatrace, CA Wily, and NMON for monitoring.

Confidential, Jersey City, NJ

Performance Test Lead

Responsibilities:

  • Managed and coordinated an offshore team of six as well as an onsite team of three.
  • Defining the performance goals and objectives based on the client requirements and inputs.
  • Analyzed the requirement and design documents.
  • Establish and maintain productive relationships with Development and other applicable departments.
  • Met with business owners on a daily basis to understand requirements.
  • Responsible for developing and executing performance and volume tests for different types of banking and financial applications (e.g., online banking, loan processing).
  • Worked extensively in the Web/HTTP and Web Service protocols in LoadRunner and used Virtual User Generator (VuGen) to create LoadRunner scripts, ensuring that quality issues are appropriately identified, analyzed, documented, tracked, and resolved.
  • Develop test scenarios to properly load / stress the system in a lab environment and monitor / debug performance & stability problems.
  • Partner with the Software development organization to analyze system components and performance to identify needed changes in the application design
  • Implemented and maintained an effective performance test environment.
  • Used Dynatrace report to determine the root cause of issues.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete Performance Test Report.
  • Ensured issues/defects were appropriately identified, analyzed, documented, tracked, and resolved in ALM.

Confidential, Topeka, KS

Performance Engineer Lead

Responsibilities:

  • Defining the performance goals and objectives based on the client requirements and inputs.
  • Worked extensively in the Web-HTTP/HTML, Web Service, and Ajax TruClient protocols in LoadRunner.
  • Provided effort estimates to project and development managers for budgetary purposes for every performance testing engagement.
  • Establish and maintain productive relationships with Development and other applicable departments.
  • Designed tests for Benchmark, Load, Stress and Endurance testing for internal Web portals, CRM, PLM, ATG E-commerce sites, WCI Portal 10gR4, several Citrix applications.
  • Designed and Performance tested application with 5000 concurrent virtual users.
  • Used Controller/Performance Center to launch 300, 1,000, and 5,000 concurrent users to generate load.
  • Monitored performance measurements for Oracle, WebLogic, and IIS in LoadRunner Controller.
  • Analyzed results using the LoadRunner Analysis tool.
  • Analyzed performance test results and contributed to capacity planning for the space, computer hardware, software, and connection infrastructure resources needed over a future period.
  • Provide Risk Assessment for new or upgraded applications to the Enterprise Architect Team.
  • Used CA Wily Introscope for monitoring.
  • Follow established process for defect tracking and reporting, including logging defects in a detailed and reproducible way.
  • Assist in production of testing and capacity certification reports.
  • Investigate and troubleshoot performance problems in a lab environment.

Confidential, New York, NY

Performance Engineer

Responsibilities:

  • Responsible for creating test plans incorporating performance testing objectives, testing environment, user profiles, risks, test scenarios, explanation of the tools used, schedules, and monitoring.
  • Wrote LoadRunner scripts, enhanced scripts with C functions, parameterized users, stored dynamic content with LoadRunner functions, and used client-side secure certificates.
  • Wrote text checks and created scenarios for concurrent (rendezvous) and sequential users.
  • Configured run-time settings for HTTP and iterations.
  • Created Single User, Base Line and Soak test scenarios. Random pacing between iterations was introduced to get the desired transactions per hour.
  • Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner Controller.
  • Analyzed results using the LoadRunner Analysis tool and NMON for server metrics.
  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Worked with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Extensively used LoadRunner for performance testing.
  • Analyzed database stored procedure executions, indexes, and deadlocks under load; tuned SQL Server by changing parallelism, update-statistics, and hint settings.
  • Analyzed network throughput and byte-level differencing downloads with Marimba electronic software distribution.

Confidential, Wichita, KS

Performance Engineer

Responsibilities:

  • Reviewed business requirement documents to create high-level test scripts.
  • Performance Engineering of Financial applications (Trading and Optimizer applications)
  • Interacted with the Business community and the end users to gather requirements
  • Created and implemented high level and detailed test planning to cover the scope of the software releases
  • Used LoadRunner (VuGen) to create test scripts and execute Load, Duration, and Stress tests.
  • Worked with LoadRunner to create scripts using Web-HTTP/HTML and Web Services protocol
  • Created workload model to create the scenario and used LoadRunner for performance testing and LR Analysis to analyze the results
  • Monitored database for sessions, connection pool and Memory issues.
  • Closely monitored transactions per second (TPS) and transaction response time (RT) during testing to make sure SLAs were met.
  • Extensively used JIRA for test planning, bug tracking and reporting.
  • Coordinated the tasks and workflow of offshore resources.
  • Worked closely with the functional team defect coordinator, stream PMs, and stream resources to identify, categorize, and escalate defects found within streams.

Confidential, Van Nuys, CA

Software Quality Analyst

Responsibilities:

  • Analyzed systems design specifications and developed Test Plans, Test Cases to cover overall quality assurance testing.
  • Performed Functional Testing for usability and compatibility of the custom Web application.
  • Performed end-to-end regression, integration, and database testing.
  • Worked with Use cases and requirements to write the test plan and test cases
  • Manually executed test cases and reported results to development teams
  • Performed manual GUI test for Data entry screen in the Application.
  • Accessed data from Oracle database using SQL.
  • Performed defect tracking for the bugs in the application that included documentation, tracking and re-validating defects that helped developers to track the problem and resolve the technical issues.
  • Prepared test plans, test cases, and test scripts.
  • Executed the tests manually to verify the application functionality
  • Used Quality Center for error (defects) reporting and communicating between developers, product support and test team members.
  • Involved in the Functional testing of web pages.
  • Developed Test scripts and enhanced scripts.
  • Monitored status using Quality Center to close the bugs/cases as and when fixed.
