
Sr. Application Quality And Performance Engineer Resume


Brooklyn, NY

SUMMARY

  • 8 years of Performance Engineering experience across multiple protocols and cloud infrastructure.
  • Extensive exposure to Agile (Scrum, Kanban) methods across diversified frameworks and platforms.
  • Extensive experience in high-performance architecture, benchmarking, capacity planning from SLA/BRS/SRS, performance modeling, and performance requirements analysis; load balancing, application module pooling, exception monitoring, and proactive reporting of disaster scenarios.
  • Experience in automating test scripts for Performance, Load, and Stress testing using OATS, LoadRunner, JMeter, MS/HP ALM, Performance Center, and Microsoft Visual Studio TFS.
  • In-depth knowledge of PTLC, SDLC, and STLC concepts and Test Automation within an agile-driven approach.
  • Experienced in analysis of Service Level Agreement, Business Boundary Documents, RFP, Software Requirement Specifications (SRS), Non-Functional Requirement Specification (NFRS) and Software Design Specification (SDS).
  • Experienced with monitoring, profiling, and diagnostics tools such as Oracle Management Cloud, RUEI, Bluemix, HP SiteScope, SCOM, AppDynamics, Nagios, JConsole, and JVisualVM (HotSpot).
  • Cross-platform monitoring experience on Oracle Linux, Unix (AIX), and Microsoft Windows.
  • Extensive log analytics and incident reporting with sar, top, sysstat, iostat, vmstat, and nslookup.
  • Expertise in SLA/SOP reviews on production, system cost, documentation of system deployment and fine-tuning plans, and creating and executing scenario-based KPIs and test plans.
  • Proficient in multi-protocol scripting, iterations, correlation, and altering scripts based on workload using LoadRunner, Oracle Application Testing Suite - OLT, and JMeter (a minimal scripting sketch follows this list).
  • Strong knowledge in Load, Stress, Regression, Spike, Scalability, Volume, Endurance, Baseline and Benchmark testing to analyze performance statistics, operability, scalability and capacity.
  • Experienced in understanding future application requirements, developing performance testing objectives, prioritizing execution, mapping workloads, and forecasting through reports.
  • Wide-ranging testing exposure to both .NET and J2EE application counters and memory leaks.
  • Expert in n-tier Architecture, Design Patterns, ISOM, MVC, Virtualization, and Network Topology.
  • Hands-on experience with J2EE application architecture to identify points of vulnerability, execution flow, bottlenecks, deadlocks, and locking, and to develop suitable performance test cases.
  • Experience in executing multi-protocol scripts using Web HTML/HTTP, Ajax TruClient, Citrix, and Web Service standards such as XML, SOAP, WSDL, and UDDI.
  • Skilled in markup and scripting technologies such as XML, JavaScript, Ajax TruClient, JSP, JMX, and Servlets.
  • Experience in reproducing incidents, analyzing exceptions, and identifying bottlenecks.
  • Strong understanding of SOA, SaaS, KPI, RTM, Performance Metrics, and tuning methodologies.
  • Expertise in RDBMS, DML, TCL, profiling, database performance bottlenecks, Consistency Checker, Confidential-SQL, SQL*Plus, and SQL*Trace on Unix, AIX, and Windows platforms.
  • Experience in troubleshooting client/server, web server, and standalone test architectures in open source, closed source, and proprietary environments.
  • Proficient in performing agent-based monitoring, IP spoofing, and load balancing in the network.
  • Ability to produce detailed documentation; personable and able to meet project milestones.
  • Experience in tracing blocking issues and performance monitoring with Activity Monitor, SQL Server Profiler, Resmon, Database Tuning Advisor, DMVs, and SQL Diagnostic Manager.
  • Experienced in generating load on the system using multiple controllers and thin and thick clients at the same time.
  • Strong team player with a solid work ethic, able to serve in proxy and observer roles, adapt and learn quickly, and apply analytical skills under pressure, with strong interpersonal and change management skills.
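
To illustrate the multi-protocol LoadRunner scripting mentioned above, the following is a minimal, hedged VuGen-style C sketch. The URL, transaction name, and text check are placeholders, not values taken from any engagement listed in this resume.

```c
/*
 * Minimal VuGen-style Action() sketch (LoadRunner Web HTTP/HTML protocol assumed).
 * The endpoint, transaction name, and expected text are illustrative placeholders.
 */
Action()
{
    // Verify the landing page actually rendered before timing anything.
    web_reg_find("Text=Welcome", "Fail=NotFound", LAST);

    lr_start_transaction("01_Open_Home_Page");

    web_url("home",
            "URL=https://example.internal/app/home",   // hypothetical endpoint
            "Resource=0",
            "Mode=HTML",
            LAST);

    lr_end_transaction("01_Open_Home_Page", LR_AUTO);

    lr_think_time(5);   // emulate a user pause between steps

    return 0;
}
```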

TECHNICAL SKILLS

Test Tools: LoadRunner, JMeter, SOAP UI, PC, OATS, VersionOne, ALM/QC.

Test Simulation: .Net, J2EE, BTM, RUEI, Web Services, JVM, IIS, HTTPD, JDBC, GC, ADF.

Analysis: Bottleneck, Non-Functional, Cloud, SRS, AQM, Compliance, Oracle Ent. Manager, APM, Performance, Capacity, Deadlock.

Language: Java, C/C++, XML, Core JavaScript, DHTML, HTML, VB, AJAX, Shell.

Protocols: Web HTML/HTTP, SAP-GUI, TruClient, Citrix, RDP, ADF Web Services.

Application Server: WebLogic, JBoss, Dynamics, BizTalk, Apache HTTPD, Azure, WebSphere 4.0, Tomcat, Exalogic.

Monitoring: RUEI, SiteScope, SAR, JConsole, SQL Profiler, SCOM, NPM, AppDynamics, Dynatrace, IBM Bluemix.

Bug Management Tool: Bugzilla, QC, JIRA, ClearQuest, ALM-QC, MTM.

Test Mgt Tool: PCOE, Test Link.

Utility: WinSCP, PAL, typeperf, Log Parser, WebLog Expert Lite, MobaXterm.

Database: Oracle DB 12c, SQL*Plus, SQL Developer, PuTTY, OBIEE, TOAD, Forms, Procedures, ETL.

Operating Systems: Red Hat RHEL 5, Windows 2003 32/64-bit, Windows Server 2008, DOS, UNIX, Solaris.

Domain Exposure: Healthcare, Financial, Telecom, Retail, Manufacturing, IS Audit, Supply Chain, Transportation.

Infrastructure Exposure: Code Merge, CLI, Thin Client, Terminal Server, SaaS, IaaS, Virtualization, VPN, Citrix, Mobile Cloud Services, Elastic Topology, IntelliJ, JDeveloper, Oracle-FMW, Synthetic.

PROFESSIONAL EXPERIENCE

Confidential, Brooklyn, NY

Sr. Application Quality and Performance Engineer

Responsibilities:

  • Leading the Application Quality Management (AQM) team to test, analyze, and report performance and quality issues raised.
  • Leading Application Performance Management on Portal (WebCenter Portal), FMW (intranet and external), and SOA Suite.
  • Reporting performance metrics and monitoring KPIs, UX, infrastructure, elastic topology, components, and log analytics.
  • Infrastructure Management (Application Tiers in DMZ) and Network Artifacts through OEM, RUEI, JVMD, ExtraHop.
  • Electronic Medical Records (EMR) management in compliance with HL7, HIPAA, and PCI standards.
  • Data masking and parameterization for performance and compliance tests.
  • Monitoring audit logs to analyze hit maps and creating compliance reports.
  • OEM/RUEI/BTM administration for ASH (Active Session History), DB ADDM, log analyzer, and topology analysis.
  • Oracle Management Cloud Control: elastic topology, call tree diagram restructuring, alert configuration and analysis.
  • DMZ infrastructure monitoring and violation reporting through ExtraHop, RUEI, and SolarWinds NPM (Network Performance Monitor).
  • OLM: load testing with Oracle Load Manager.
  • Monitored CPU utilization, memory utilization, server throughput, response time, log analysis, TPS, stuck thread analysis, error/exception reporting, and warning/notification analysis.
  • OTM: test management with Oracle Test Manager.
  • Created workload models, test automation, and automation scripts; performance metrics on 17 distinct applications at the Confidential Gateway portal, BHS (Behavioral Health Service), and the WTC (World Trade Center) Medical Monitoring Program.
  • Update test cases based on Spiral and Incremental development models and requirements.
  • Leading junior testers and BAs in requirement preparation, design mockups, and documentation.
  • Quality Assurance through HP-ALM(Application Lifecycle Management) and JIRA.
  • Application Performance Management on elastic topology and proactive controls over key performance indicators, analysis and reporting.
  • Middleware Stack and backend performance, PL/SQL package analysis, code review, finding performance parameters.
  • Data masking, parameterization, and correlation on the load model for trials (a minimal sketch follows this list); tier monitoring with agent configuration for issues.
  • Business Analysis documentation from storyboarding.
  • OBIEE, BI Publisher, and Oracle Reports analysis on infrastructure and user request analysis.
  • Publishing requirements through HP-ALM, JIRA or PPM conduits.
  • Create Performance Strategy and implement procedure to reach maximum availability, operability and high uptime.
  • Joining daily standup meetings.
  • Project Charter, RFP, Technical Proposal, Implementation and Rollout Plan, Technical Design Documents, progress Documentation, Performance Requirement Document, Performance Test Plan, KPI, Application Capacity Plan.
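
To illustrate the data masking and parameterization items above, here is a hedged, generic VuGen-style C sketch (Performance Center appears in this environment, so a VuGen action is assumed). The parameter name PatientID, the endpoint, and the form field are hypothetical; no real PHI or application details are implied.

```c
/*
 * Hedged sketch of feeding masked, parameterized test data into a load script.
 * Assumes a file-based parameter {PatientID} holding masked surrogate values,
 * defined in the script's parameter list; endpoint and field name are illustrative.
 */
Action()
{
    // Log which masked record this iteration uses (never real identifiers).
    lr_output_message("Submitting masked record id=%s", lr_eval_string("{PatientID}"));

    lr_start_transaction("02_Submit_Record");

    // Post the masked identifier instead of real patient data; endpoint is hypothetical.
    web_submit_data("submit_record",
            "Action=https://example.internal/emr/records",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=recordId", "Value={PatientID}", ENDITEM,
            LAST);

    lr_end_transaction("02_Submit_Record", LR_AUTO);

    return 0;
}
```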

Environment: OEM, RUEI, BTM, Performance Center, Fiddler, Mobile Cloud Services, Toad, JIRA, JMeter, JDeveloper, Eclipse, HP-ALM, OATS, WebLogic, MobaXterm, Putty, Visio, Unix, Oracle FMW, BI Publisher, OBIEE, OAM, OTD, WCP, OID, LDAP, SSO, Benchmark Factory, ADF Log Analyzer, UCM, ExtraHop, OMCS (Oracle Management Cloud Service), SolarWinds NPM.

Confidential, New York, NY

Performance Engineer

Responsibilities:

  • Mapped user workflows through UML, merging SLAs after coordinating with Subject Matter Experts (SMEs).
  • Created user workflow diagrams in MS Visio and OOAD models with MagicDraw UML, Visio, and MS Project.
  • Involved in preparing the PTLC test plan, test cases, and load models; reviewing test execution to verify business process coverage for assigned modules; and following up on Change Requests/BPRs and test objectives.
  • Involved in preparing Test Strategy, Test Coverage, Entry/Exit Criteria, and RTM (Requirement Traceability Matrix).
  • Involved in specifying functional as well as ad-hoc requirements from a performance perspective, such as benchmark profiling, fail-over and fail-back tests against supported configurations, and redefining baselines.
  • Creating scripts on Load, Endurance and Volume testing based on business critical transactions, workload models.
  • Performed code testing of the application after each new build, release and code coverage wif Developer.
  • Designed and executed test criteria, scenarios, and scripts from test definitions and test objectives.
  • Executed large-scale performance tests using Performance Center on web, application, and database servers.
  • Used LR Controller scenarios to perform load tests, gather reports, and analyze results into summaries.
  • Involved in logging BUG/Defect and Performance Issues in global schedule through Performance Center.
  • Performed correlation and parameterization on LR scripts to ensure the scripts run successfully during replay (a minimal sketch follows this list).
  • Monitored tests from the unit (code) level to the component (module) level and simulated application load based on B2B workloads.
  • Performed extensive OS and JVM monitoring through typeperf and Perfmon for key performance counters and reported results through the PAL (Performance Analysis of Logs) utility.
  • Involved in application performance benchmarking and analysis of execution logs using WebLog Expert Lite.
  • Developed performance test deliverables such as throughput, response time, and network load for the production application, a performance inventory coverage matrix through LR, and a performance test schedule using HP Performance Center.
  • Involved in consistency checking with open source tools such as JMeter and SOAP UI, comparing results with web load tests for multiple virtual users using LoadRunner.
  • Analyzed defect reports and worked closely with application developers to resolve bugs using QC/ALM.
  • Simulated workloads based on manual scenarios to find bottlenecks, memory leaks, breaking points, deadlocks, and queues.
  • Executed load test scripts for different QA environments, identifying capacity, consistency, durability, and risk.
  • Monitored PC-COE for multiple web service, database server, and application server logs during test execution.
  • Monitored health checks of servers and databases (CPU utilization, throughput, and memory) using HP SiteScope.
  • Actively participated in weekly Agile meetings with the project team, serving as both a proxy and a testing SME.
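
As a hedged illustration of the correlation and parameterization work described above, the sketch below shows the general VuGen pattern in C. The left/right boundaries, parameter names, and URLs are assumptions for illustration; real values would come from recorded server responses of the application under test.

```c
/*
 * Hedged sketch of LoadRunner correlation plus parameterization.
 * Boundaries, parameter names, and endpoints are hypothetical placeholders.
 */
Action()
{
    // Capture a dynamic session token from the next server response.
    web_reg_save_param("SessionToken",
            "LB=name=\"sessionToken\" value=\"",   // assumed left boundary
            "RB=\"",                               // assumed right boundary
            "ORD=1",
            LAST);

    web_url("login_page",
            "URL=https://example.internal/login",
            "Resource=0",
            "Mode=HTML",
            LAST);

    // Replay the captured token plus a file-based username parameter on submit.
    web_submit_data("login",
            "Action=https://example.internal/login",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=user",         "Value={UserName}",     ENDITEM,  // parameter file value
            "Name=sessionToken", "Value={SessionToken}", ENDITEM,  // correlated value
            LAST);

    return 0;
}
```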

Environment: Agile-Lean (DEV-UAT), MS Azure, SaaS, Performance Center, LoadRunner, Perl, Ajax TruClient, UNIX, UML, MS SQL, Visio, TFS, Oracle, MongoDB, LDAP, HTTP, Web Services, Java, JBoss, JMeter, JConsole, Eclipse, Bash scripts.

Confidential, Oakton, VA

Performance Engineer

Responsibilities:

  • Analyzed Business Requirement Documents (BRD), Use Case Documents, Blueprints, Workflow and the Performance Requirement Specifications (PRS) based on SLA and SOP.
  • Assisted in creating the Test plan based on Test Objectives by Integrating the Agile Testing Approach.
  • Exported test cases using the add-in utility into Quality Center, ALM, and Performance Center, and converted them into test scenarios for automation from the Test Plan module through the Test Lab module.
  • Generated Project status reports through Quality Center and Document Generator for team meetings and Management review.
  • Created Test Cases in HP Quality Center and mapped Test Cases to requirements using Requirements Coverage.
  • Maintained test cases, test data, and test builds, and reported on them in Quality Center.
  • Performed data validation testing, performing data integrity testing by executing SQL statements and validating the tables in the database.
  • Used SQL queries to perform data testing. Prepared manual test cases to test the GUI application and performed data validation.
  • Involved in developing the test environment, installing and configuring LoadRunner; performed load testing of the application for various scenarios and analyzed performance monitors and graphs to improve the efficiency and scalability of the legacy application using multiple protocols.
  • Created automated scripts for performance testing using LoadRunner to test performance issues on physical servers, memory efficiency, disk I/O, web services, and database servers to identify bottlenecks.
  • Created VuGen scripts, implemented correlation rules, and parameterized dynamically changing values such as session IDs and database responses in LoadRunner for realistic simulation.
  • Used JMeter for JDBC testing with different host systems and other protocols appropriate to the AUT environment.
  • Performed stress testing on the AUT using JMeter on the Unix platform, configuring threads, samplers, and controllers.
  • Involved in testing TPS, throughput, latency, and required deliverables for performance requirements using JMeter.
  • Developed VuGen scripts for load testing and used rendezvous points to control and generate peak load on the server, stressing it with users to find server bottlenecks and queues/deadlocks in the database using LoadRunner (a minimal sketch follows this list).
  • Parameterized dynamic data with baseline test data for each load and developed recovery scenarios for smooth runs when scheduling over 150 scripts in the Test Lab.
  • Involved in creating and executing batch files, crontab entries, job schedulers, and UNIX shell scripts.
  • Used virtualization technology and hypervisor types such as Xen, Hyper-V, vSphere, and KVM as required.
  • Involved in scheduling jobs in both Windows and UNIX systems through Task Scheduler and crontab.
  • Attended the weekly Scrum Meetings and discussed the bugs, impediments raised according to their priority level.
  • Validated testing infrastructure (connectivity, scripting/protocol compatibility and Generator capacity of hardware).
  • Developed and enhanced scripts using LoadRunner VuGen and designed scenarios using Performance Center to generate realistic load on system under test.
  • Traced Java methods and database queries execution using J2EE Diagnostic Tool (Load Runner Add-In).
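
To illustrate the rendezvous-point and session-ID correlation work described above, here is a hedged VuGen-style C sketch. The rendezvous name, boundaries, and URLs are placeholders; the rendezvous policy itself would also be configured in the Controller scenario.

```c
/*
 * Hedged sketch of a rendezvous point combined with a correlated session ID.
 * All names and endpoints are illustrative assumptions.
 */
Action()
{
    // Capture the dynamically changing session ID from the login response.
    web_reg_save_param("SessionID", "LB=JSESSIONID=", "RB=;", "ORD=1", LAST);

    web_url("login",
            "URL=https://example.internal/app/login",
            "Resource=0",
            "Mode=HTML",
            LAST);

    // Hold Vusers here so the next step hits the server simultaneously at peak load.
    lr_rendezvous("submit_order_peak");

    lr_start_transaction("03_Submit_Order");

    web_url("submit_order",
            "URL=https://example.internal/app/order?session={SessionID}",
            "Resource=0",
            "Mode=HTML",
            LAST);

    lr_end_transaction("03_Submit_Order", LR_AUTO);

    return 0;
}
```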

Environment: VPN, ASP, Ajax, B2B-ETL, SOAP, IIS, LoadRunner, SOAP UI, Oracle, MySQL, Apache, Red Hat, JSON, Bugzilla.

Confidential, Buffalo, NY

Performance Engineer

Responsibilities:

  • Responsible for developing and executing performance and volume tests based on seasonal supply chain peak variance.
  • Developed test scenarios to properly load/stress based on shipment/consignment in the lab environment, and monitored/debugged performance and stability problems based on document processing, transaction management, and reverse logistics.
  • Partnered with the software development organization to analyze system components and performance and to identify needed changes in application design patterns, architecture, and capacity planning.
  • Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards as specified in C-TPAT and ISACA audit standards.
  • Correlated Local Legacy systems as well as COTS systems for load generation scenario and developed documentation for capacity planning and audit.
  • Reproduced the database deadlock and queue scenario through profiling, pin-point the bottleneck and threshold.
  • Diagnosed performance bottle-necks, performed tuning (OS and the applications), retesting, and system configuration changes for application performance optimization.
  • Manually correlated opportunity IDs, saving the dynamically changing opportunity IDs into a parameter by examining the body of the server response in LoadRunner.
  • Used LR functions to search for text/images to verify that the desired pages are returned during replay in the Virtual User Generator (a minimal sketch follows this list).
  • Extended VuGen scripts with multiple functions using C/C++ loops, iterations, and customized functions.
  • Performed agile testing methodologies using SOAP UI to specify COTS product load tolerance criteria.
  • Changed runtime settings such as pacing, think time, log settings, browser emulation, and timeout settings in LoadRunner VuGen and the Controller to simulate real-world scenarios.
  • Performed manual testing based on business rules in support of non-functional testing.
  • Created goal-oriented scenarios in LR controller for performing baseline, benchmark, stress and endurance tests.
  • Performed baseline tests with 1 user and 5 iterations across multiple transactions, and a workload-model benchmark test under a load of 200 simultaneous users using the LoadRunner Controller.
  • Included Java plugins such as JMeter into Eclipse.
  • Used Scenario by Schedule in the controller to change the Ramp Up, Duration and Ramp Down settings.
  • Executed stress tests wif a load of 225 users to see the breakpoint of the application.
  • Used multiple protocols script for Web Applications built on Ajax, DHTML and other web services.
  • Monitored metrics such as response times and throughput, and server resources such as CPU utilization, Available Bytes, HTTP request contention, and process counters (Process Bytes), using LoadRunner monitors for the IIS server.
  • Monitored the WebLogic server using Foglight, a performance monitoring tool from Quest Software.
  • Helped in performance tuning of the application.
  • Analyzed the Transaction Summary Report and graphs generated in a LoadRunner Analysis session.
  • Created templates in the Analysis session and analyzed web page diagnostics to determine whether the server or the network was the bottleneck.
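
To illustrate the LR text-check and custom-loop work described above, here is a hedged VuGen-style C sketch. The page text, URL, and iteration count are placeholders, not details of the actual application.

```c
/*
 * Hedged sketch of a web_reg_find text check inside a simple C loop.
 * The expected text, endpoint, and loop count are illustrative assumptions.
 */
Action()
{
    int i;

    // Repeat the page view three times as a simple illustrative loop.
    for (i = 1; i <= 3; i++) {
        // Fail the step if the expected confirmation text is not returned on replay.
        web_reg_find("Text=Shipment Details", "Fail=NotFound", LAST);

        lr_start_transaction("04_View_Shipment");

        web_url("view_shipment",
                "URL=https://example.internal/logistics/shipment?page=1",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("04_View_Shipment", LR_AUTO);

        lr_think_time(3);
    }

    return 0;
}
```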

Environment: LoadRunner, SOAP UI, Performance Center, Quality Center, HP BAC, VB Script, .NET, JavaScript, SQL, Java, JUnit, J2EE, JSP, IIS, XML/XSLT, Oracle, JDBC, JMeter, FTP.

Confidential, Chevy Chase, MD

Performance Tester

Responsibilities:

  • Analyzed system requirements and the Mockups, and developed detailed Test Strategy for Performance Testing.
  • Designed and developed performance test procedures and cases, with associated test data.
  • Wrote Test Cases and Performed GUI, Functional, Non-Functional, Integration, Regression, UAT testing.
  • Analyzed Software and Business Requirements documents to get a better understanding of the system from both technical and business perspectives.
  • Prepared test cases in QC based on use cases, executed test scripts in LR, and verified actual vs. expected results.
  • Executed, reported, and tracked functional bugs using Quality Center, then migrated to Performance Center.
  • Parameterized and used transactions in Vuser scripts to create peak-time load variations depending on the number of virtual users.
  • Performed correlation to anticipate and handle dynamic data from web server and database server.
  • Used the XPath query language in LoadRunner to validate XML responses received from web services (a minimal sketch follows this list).
  • Designed custom reports based on the requirements using LoadRunner Analysis.
  • Interacted with developers and users to resolve critical and major bugs.
  • Communicated with application developers, project managers, and other team members on application testing status on an ongoing basis when necessary.
  • Created and maintained regression test suite to check the progress of the testing process by performing identical tests before and after fixing defects.
  • Worked with TFS for source control, data collection, reporting, and project tracking.
  • Created test scripts for the beta application and estimated test execution completion.
  • Reported weekly status reports to the manager.
  • Involved in Peer Reviews and Team Walkthroughs for the project as per Test methodology.
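
As a hedged illustration of the XPath-based XML validation mentioned above, the following VuGen-style C sketch uses lr_xml_find on a captured response body. The service URL, SOAP body, XPath query, and expected value are placeholders, not the actual services tested.

```c
/*
 * Hedged sketch of validating a web-service XML response with an XPath query.
 * Endpoint, request body, query, and expected value are illustrative assumptions.
 */
Action()
{
    int matches;

    // One common way to capture the entire next response body into {SoapResponse}.
    web_reg_save_param("SoapResponse", "LB=", "RB=", "Search=Body", LAST);

    web_custom_request("get_quote",
            "URL=https://example.internal/ws/quote",
            "Method=POST",
            "EncType=text/xml",
            "Body=<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soap:Body><GetQuote/></soap:Body></soap:Envelope>",
            LAST);

    // XPath check: expect a <status> element equal to "OK" somewhere in the response.
    matches = lr_xml_find("XML={SoapResponse}",
                          "Query=//status",
                          "Value=OK",
                          LAST);

    if (matches == 0)
        lr_error_message("XPath validation failed: //status did not equal OK");

    return 0;
}
```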

Environment: VB, C, Java, LoadRunner, Quality Center, WebSphere, Oracle, SQL Server, Windows, AIX.
