
Performance Engineering Lead / Performance Architect Resume

MN

EXPERIENCE SUMMARY:

  • Around 14 years of experience in Performance Engineering, providing solutions to complex performance issues, with end-to-end QA experience emphasizing Client/Server and Web-based applications using automation tools and manual testing.
  • Strong experience in testing applications built for the Finance, E-Business, Medical, Healthcare and Banking domains.
  • Strong experience in embedded testing of medical devices, client/server, web (RIA) and database-driven applications in environments using J2EE, .NET, Flex, SQL, Oracle, WebLogic and WebSphere.
  • Involved in all phases of the SDLC and STLC, with timely delivery against aggressive deadlines, under QA methodologies such as ISO, CMM, HL7 standards, FDA processes and Agile (Scrum).
  • Expertise in writing test plans, test strategies, test scenarios and test cases based on business requirements and use cases to support quality deliverables.
  • Diversified experience in white-box, black-box, smoke, functional, integration, security, performance, load, stress, regression, compatibility, installation and acceptance testing.
  • Extensive experience in writing, modifying and executing automated test scripts using QuickTest Pro, WinRunner, LoadRunner, Performance Center (SaaS), Rational Performance Tester, Silk Performer and JMeter. Extensively used VBScript in QTP for automating business process testing (BPT).
  • Expertise in the use of Test Director, Quality Center, Rational ClearQuest and Bugzilla for defect reporting. Experience in using version control tools such as PVCS, CVS and VSS.
  • Proficient in testing database applications developed with Oracle, MySQL, DB2, Sybase, Microsoft SQL Server and Microsoft Access using DDL, DML and DCL commands.
  • Knowledge of ETL tools (Informatica, Business Objects) and experience with reporting tools (Crystal Reports, SQL Server Reports).
  • Very good knowledge of relational database administration.
  • Knowledge of and experience with TCP/IP, SMTP, network configuration, Active Directory, DNS, ADS, DHCP and data communication techniques, and in using different protocols for testing products.
  • Knowledge of SaaS applications.
  • Experience testing medical device systems in an FDA-regulated environment, as well as web applications for cross-browser/platform compatibility and multilingual support.
  • Experience in reviewing functional specifications and identifying incomplete, inconsistent and contradictory requirements. Interacted with business analysts on baseline requirement specifications.
  • Outstanding reputation for meeting demanding deadlines and delivering critical solutions at various levels of Quality Assurance for client/server and web-based applications in Windows, Java and UNIX/Linux environments.
  • Strong diagnostic, analytical, troubleshooting and problem-solving skills, with the ability to multitask.
  • Good team player with excellent communication, analytical, interpersonal and writing skills.

TECHNICAL SKILLS:

Operating Systems: Windows 2000, XP, Vista, DOS, Unix, Linux, Solaris

Content Management Tools: Documentum Content Server 5.3, WebTop, Web Publisher, eRoom.

Testing Tools: JMeter, QTP 8.0/9.0/9.2, LoadRunner 8.0/9.1/9.5/9.51/11.00/11.52, Rational Robot, WinRunner 7.0, Web Link Validator, Mercury Quality Center 9.0, PM Smart, Test Director 7.6, Silk Performer, Performance Center 9.1/9.51/11/11.52, Dynatrace AppMon 5/5.5/6.5/7, Dynatrace UEM, HP Performance Manager, SiteScope, DTSaaS, New Relic

Version Control tools: PVCS VM 7.5, Microsoft VSS 6.0, CVS

Defect Tracking Tools: JIRA, PVCS Tracker, Bugzilla (configuration), D-Tracker, DDTS, Mantis, eProject, PM Smart.

Web Technologies: Java, ASP, JSP, .NET, XML, Ajax, HTML, DHTML, J2EE, Adobe Flex3.0, Adobe Flash 9.0

Scripting Languages: Python, Perl, Shell scripting, TSL, VB Script

Programming Languages: C, C++, Java

Databases & DB Tools: Oracle 9i/10g, SQL, MySQL, Informix, DB2, Sybase, TOAD, DBVisualizer

Office Software: MS-Outlook, MS-Word, MS-Excel, MS-PowerPoint, MS-Project, Adobe Acrobat Reader

Networking: TCP/IP, SMTP, LAN, FTP, Telnet, Putty

Others: Hercules, TMS, Wing IDE, WebLogic, Mirth 1.4, Apache 3.x, perfmon 3.0, ANT, CruiseControl, Regedit, Remote Desktop, JProbe, VNC

PROFESSIONAL EXPERIENCE:

Confidential, MN

Performance Engineering Lead / Performance Architect

Responsibilities:

  • Involved in gathering non-functional requirements from business heads/product owners/analysts and converting requirements into performance goals.
  • Involved in identifying the hardware required to create a performance environment close to production.
  • Responsible for planning & executing the performance engineering effort on more than 10 applications in OPTUM (Pulse Check, EncoderPro, Insite & IMS-Patient/Provider, Caremanager, ODX, HPM), and provided PE consultation to many other applications.
  • Created effort estimations required to do the performance testing.
  • Involved in creating/reviewing test plan and test strategy documents.
  • Responsible for setting up the test environment in isolated test labs.
  • Performed functional validations prior to putting the app under load to make sure the business processes function as per requirements.
  • Identified test cases to be automated and developed the scripts using LoadRunner/JMeter.
  • Performed functional verification testing, single-business-transaction performance testing, baseline testing, average workload testing, peak workload testing and endurance testing.
  • Performed load, stress, endurance, failover & bandwidth testing using Performance Center.
  • Developed LoadRunner scripts with the following protocols: HTTP/HTML, Remote Desktop Protocol (RDP), TruClient and Web Services.
  • Created LoadRunner VuGen scripts and used correlation to handle dynamically changing values such as session IDs.
  • Responsible for setting up Dynatrace AppMon agents on web, app & DB servers to monitor traffic in real time during tests. Monitored server utilization metrics such as CPU, memory, heap, I/O transfer rates, disk utilization, network utilization, number of requests, average queue size, and disk & queue wait times.
  • Analyzed test results with business teams to meet SLAs.
  • Used Dynatrace AppMon, Dynatrace UEM and HPPM monitors to identify the root cause of transactions causing slowness in the system.
  • Analyzed CPU sampling, thread dumps, core dumps and heap dumps in deep-dive analysis to find bottlenecks.
  • Applied expertise in JVM tuning, IIS tuning and DB tuning.
  • Analyzed various graphs generated by the LoadRunner Analysis tool, including database monitors, network monitor graphs, user graphs, error graphs and transaction graphs.
  • Analyzed and reviewed the reports with stakeholders to identify bottlenecks.
  • Executed scenarios, analyzed graphs and coordinated with DBAs and network admins to ensure optimum performance.
  • Coordinated with application SMEs to analyze application issues.
  • Generated detailed analysis reports with all observations and bottlenecks.
  • Actively communicated with developers to help them reproduce and resolve issues.
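The SLA analysis described above can be sketched in code. The following is an illustrative Python example only (the sample response times, the 250 ms threshold and the function names are invented for this sketch, not taken from any actual project):

```python
# Illustrative sketch: checking raw response-time samples against an SLA,
# the kind of pass/fail summary produced after a LoadRunner/JMeter run.
# All numbers and the sla_ms threshold are made up for the example.

def percentile(samples, pct):
    """Return the pct-th percentile (nearest-rank method) of a list of numbers."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def sla_report(samples_ms, sla_ms, pct=90):
    """Summarize whether the pct-th percentile response time meets the SLA."""
    p = percentile(samples_ms, pct)
    return {"p%d_ms" % pct: p, "meets_sla": p <= sla_ms}

response_times = [120, 135, 150, 180, 210, 250, 300, 95, 140, 160]
print(sla_report(response_times, sla_ms=250))  # {'p90_ms': 250, 'meets_sla': True}
```

In practice this percentile/threshold check would run over the exported raw-results data rather than a hard-coded list.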

Environment: Windows 2008 Server, Java, Perl, Linux, LoadRunner 11.00/11.52/12.55, Performance Center 11.00/11.52/12.53, VuGen 11.00/11.52, Dynatrace AppMon 5/5.5/6.5/7.0, Dynatrace UEM (User Experience Management), SiteScope, WebSphere, PuTTY, Oracle, SQL Server, MS Office tools.

Confidential, MN

Performance Test Lead

Responsibilities:

  • Responsible for planning & executing the performance engineering effort on the Claims Router application.
  • Created effort estimations required to do the performance testing.
  • Involved in creating/reviewing test plan and test strategy documents.
  • Responsible for setting up the test environment in isolated test labs.
  • Performed functional validations prior to putting the app under load to make sure the business processes function as per requirements.
  • Identified test cases to be automated and developed the scripts using LoadRunner.
  • Performed functional verification testing, single-business-transaction performance testing, baseline testing, average workload testing, peak workload testing and endurance testing.
  • Performed load, stress & endurance testing using LoadRunner scripts developed in VuGen.
  • Developed LoadRunner scripts with the following protocols: HTTP/HTML, Remote Terminal Emulation (RTE) and Windows Sockets.
  • Created LoadRunner VuGen scripts and used correlation to handle dynamically changing values such as session IDs.
  • Responsible for setting up monitors on the Controller box, Mule servers & mainframe regions to track CPU, memory, I/O transfer rates, disk utilization, network utilization, number of requests, average queue size, and disk & queue wait times.
  • Responsible for running MQ monitors to identify QDepth and enqueue & dequeue message rates.
  • Analyzed test results with business teams to meet SLAs.
  • Used CA Wily monitors to identify the root cause of transactions causing slowness in the system.
  • Analyzed various graphs generated by the LoadRunner Analysis tool, including database monitors, network monitor graphs, user graphs, error graphs, transaction graphs and Mule server resource graphs.
  • Analyzed and reviewed the reports with stakeholders to identify bottlenecks.
  • Executed scenarios, analyzed graphs and coordinated with DBAs and network admins to ensure optimum performance.
  • Coordinated with application SMEs to analyze application issues.
  • Generated analysis reports/graphs for CPU utilization, I/O throughput, network utilization and F5 load balancer stats.
  • Actively communicated with developers to help them reproduce and resolve issues.
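The enqueue/dequeue rate monitoring mentioned above can be illustrated with a small sketch. This Python example is hypothetical (the sample counters and the tuple layout are invented; real MQ monitors expose these figures through their own interfaces):

```python
# Hypothetical sketch: deriving average enqueue/dequeue message rates from
# periodic samples of cumulative MQ counters, in the spirit of the QDepth
# monitoring described above. Sample data is invented for illustration.

def rates(samples):
    """samples: list of (t_seconds, enq_count, deq_count) cumulative counters.
    Returns (enqueue msgs/sec, dequeue msgs/sec) averaged over the window."""
    (t0, e0, d0), (t1, e1, d1) = samples[0], samples[-1]
    window = t1 - t0
    return (e1 - e0) / window, (d1 - d0) / window

# Three samples taken 30 s apart: 1200 messages in, 1080 out over 60 s.
samples = [(0, 0, 0), (30, 600, 450), (60, 1200, 1080)]
enq_rate, deq_rate = rates(samples)
print(enq_rate, deq_rate)  # 20.0 18.0
```

A sustained gap between the two rates is what shows up as growing QDepth during a load test.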

Environment: Linux, Mainframe, Java, LoadRunner 8.1, VuGen 8.1, PerfJava, CA Wily, MQ monitors, PuTTY, Oracle, Sybase, MS Office tools.

Confidential, NY

Performance Engineer

Responsibilities:

  • Involved in creating/reviewing test plan and test strategy documents.
  • Involved in setting up the test environment & performance testing.
  • Performed testing on all 4 apps to make sure the business processes function as per requirements.
  • Identified test cases to be automated and developed the scripts using LoadRunner.
  • Performed functional verification testing, single-business-transaction performance testing, baseline testing, average workload testing, peak workload testing and endurance testing.
  • Generated Vusers to stress test the application.
  • Performed load, stress & endurance testing on all applications using LoadRunner scripts developed in VuGen.
  • Developed LoadRunner scripts with the following protocols: HTTP/HTML, SOAP, Remote Desktop Protocol (RDP) and Citrix.
  • Used the Leela launcher to control the broker process for the MQ application.
  • Created LoadRunner VuGen scripts and used correlation to handle dynamically changing values such as session IDs.
  • Responsible for setting up OPNET Panorama & PerfJava monitors to track CPU, memory, I/O and network utilization.
  • Responsible for running MQ monitors to identify QDepth and enqueue & dequeue message rates.
  • Analyzed test results with business teams to meet SLAs.
  • Used Panorama diagnostics to identify the root cause of transactions causing slowness in the system.
  • Analyzed various graphs generated by the LoadRunner Analysis tool, including database monitors, network monitor graphs, user graphs, error graphs, transaction graphs and web server resource graphs.
  • Analyzed and reviewed the reports with stakeholders to identify bottlenecks.
  • Executed scenarios, analyzed graphs and coordinated with DBAs and network admins to ensure optimum performance.
  • Automated regular tasks using shell scripts, such as running DB monitors (SP target & SP Slowmon) on Sybase.
  • Used system performance monitoring utilities on UNIX such as vmstat, iostat, prstat and mpstat to identify I/O transfer rates, disk utilization, number of requests, average queue size, and disk & queue wait times.
  • Coordinated with application SMEs to analyze application issues.
  • Generated analysis reports/graphs for CPU utilization, I/O throughput and memory utilization.
  • Actively communicated with developers to help them reproduce and resolve issues.
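The correlation technique referenced above (as done with LoadRunner's `web_reg_save_param`) can be illustrated with a minimal stand-in: capture a dynamically generated value between left/right text boundaries. This Python sketch is for illustration only; the HTML snippet and boundary strings are fabricated:

```python
# Illustrative stand-in for LoadRunner correlation: extract a dynamically
# generated session ID from a server response using left/right boundaries,
# as web_reg_save_param does. The sample HTML below is invented.
import re

def capture(body, left, right):
    """Return the text between the first left/right boundary pair, or None."""
    m = re.search(re.escape(left) + r"(.*?)" + re.escape(right), body)
    return m.group(1) if m else None

html = '<input type="hidden" name="sessionId" value="A1B2C3D4">'
session_id = capture(html, 'name="sessionId" value="', '"')
print(session_id)  # A1B2C3D4
```

The captured value would then be substituted into subsequent requests in place of the recorded (stale) session ID.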

Environment: Linux, SAN storage, Java, LoadRunner 11.0, VuGen 11.0, PerfJava, OPNET Panorama, Leela Launcher, MQ monitors, PuTTY, DB2, Sybase, MS Office tools.

Confidential, IN

Performance Engineer

Responsibilities:

  • Understood FDA regulatory processes and the regulated environment.
  • Involved in creating/reviewing test plan and test strategy documents.
  • Involved in operability (go/no-go), system & integration, regression & performance testing.
  • Performed manual testing on all 7 apps to make sure the application functions as per requirements.
  • Executed test cases, recorded results and reported defects in HP Quality Center 11.0.
  • Created & executed automated scripts using QTP 11.00 for TEDI testing as part of regression testing.
  • Identified test cases to be automated and developed the scripts using LoadRunner.
  • Participated in peer reviews of performance test cases and test scripts.
  • Performed functional verification testing, single-business-transaction performance testing, baseline scalability testing, growth-rate scalability testing, endurance testing and migration testing.
  • Created LoadRunner scripts using VuGen & used the Controller to run multi-user tests.
  • Performed load and performance testing on all applications using LoadRunner scripts.
  • Developed LoadRunner scripts with the following protocols: HTTP/HTML, Informix, and Remote Terminal Emulation (RTE) for green screens.
  • Worked with the development team to sort out issues due to proxy and firewall settings so as to measure the true performance of the application.
  • Created LoadRunner VuGen scripts and used correlation to handle dynamically changing values such as session IDs. Used rendezvous points, load balancing and IP spoofing to load test specific transactions.
  • Responsible for setting up monitors in the Controller, SiteScope and Introscope to track CPU and memory utilization, and for setting up SiteScope monitors to drill down into different layers of the application to identify performance bottlenecks.
  • Analyzed test results with business teams to meet SLAs.
  • Used HP Diagnostics to identify the root cause of transactions causing slowness in the system.
  • Conducted regression and functional testing for different applications.
  • Wrote SQL commands to verify backend data and also monitored CPU and memory usage on servers.
  • Analyzed various graphs generated by the LoadRunner Analysis tool, including database monitors, network monitor graphs, user graphs, error graphs, transaction graphs and web server resource graphs.
  • Analyzed and reviewed the reports with developers and network engineers to resolve bottlenecks.
  • Executed scenarios, analyzed graphs and coordinated with DBAs and network admins to ensure optimum performance.
  • Used system performance monitoring utilities on UNIX such as vmstat, iostat, prstat and mpstat to identify I/O transfer rates, disk utilization, number of requests, average queue size, and disk & queue wait times.
  • Coordinated with application SMEs to analyze application issues.
  • Generated analysis reports/graphs for CPU utilization, I/O throughput and memory utilization.
  • Actively communicated with developers to help them reproduce and resolve issues.
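The vmstat-style monitoring above can be automated with a small parser. This Python sketch is illustrative only: the sample output lines, the 20% wait threshold and the assumption that the I/O-wait ("wa") figure is the second-to-last column (as in Linux vmstat's us/sy/id/wa/st layout) are all stated assumptions:

```python
# Sketch (assumed Linux vmstat column layout, us/sy/id/wa/st at the end):
# pulling the I/O-wait figure out of vmstat-style text to flag disk-bound
# samples. The sample lines and the 20% threshold are invented.

def high_iowait(vmstat_lines, wa_col=-2, threshold=20):
    """Return True if any sample's I/O-wait column exceeds the threshold."""
    readings = [int(line.split()[wa_col]) for line in vmstat_lines]
    return any(wa > threshold for wa in readings)

sample = [
    "1 0 0 812000 20000 90000 0 0 5 12 150 300 10 5 60 25 0",
    "2 0 0 810000 20000 90000 0 0 3  8 140 280 12 6 80  2 0",
]
print(high_iowait(sample))  # True (first sample shows 25% I/O wait)
```

A check like this, run against captured monitor output, turns the manual "eyeball the wa column" step into a repeatable pass/fail signal.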

Environment: Solaris 10.x, Windows NT, UNIX M5000 server, SAN storage, Informix, LoadRunner 9.51, Performance Center 9.51 (SaaS), VuGen 9.52, QTP 11.00, Quality Center 11.00, DBVisualizer 6.0.2, Oracle 11g, MS Office tools.

Confidential, MN

Automation Tester

Responsibilities:

  • Understood FDA regulatory processes and the regulated environment.
  • Involved in implementing agile methodologies, playing the role of Scrum Master.
  • Involved in creating test plan and test strategy documents.
  • Designed protocols as per FDA regulatory processes and developed Python test scripts to implement the protocols.
  • Participated in peer reviews of protocols and scripts, and managed them in PVCS VM.
  • Identified protocols to be automated or semi-automated and developed the scripts in Python.
  • Debugged the Python scripts during the Execute/Debug phase to ensure correctness of the scripts.
  • Performed installation and uninstallation testing of firmware on the Latitude device.
  • Prepared/updated and executed test scenarios/cases for the Latitude Web Application Server.
  • Tested data exchange between medical applications using the HL7 protocol/standards.
  • Verified that Electronic Medical Records (EMR) followed HL7 standards.
  • Used the Mirth cross-platform HL7 interface engine.
  • Baselined the test stations, starting from OS installation, based on the release versions.
  • Automated a few EMR reports with Selenium.
  • Used Selenium to check/automate attributes of PDF reports.
  • Involved in defining performance requirements and benchmarking.
  • Performed performance and stress testing on the WAS application using LoadRunner.
  • Used JProbe Leak Doctor to identify memory leak issues.
  • Performed load testing on web services (SOAP).
  • Set up different LR monitors on servers to find bottlenecks.
  • Analyzed and reviewed the reports with developers and network engineers to resolve bottlenecks.
  • Generated Vusers to stress the application.
  • Validated XML files and XML schemas.
  • Validated web services (SOAP), both request and response messages.
  • Automated routine activities using UNIX/Linux shell scripting, including test log and flat-file comparison.
  • Used system performance monitoring utilities on UNIX/Linux such as vmstat, top and iostat to identify I/O transfer rates, disk utilization, number of requests, average queue size, and disk & queue wait times.
  • Involved in end-to-end testing of the WAS application interfaces.
  • Created automated test scripts using VBScript in QTP.
  • Created data-driven and parameterized automated test scripts using QTP Actions and VBScript functions.
  • Enhanced VB scripts with database checkpoints, user-defined functions and data-driven test scripts for the AUT using QuickTest Professional.
  • As part of database testing, developed SQL queries to test the backend process using Toad.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database to validate and test data.
  • Used CruiseControl to automate the build process.
  • Involved in CCB meetings and documented issues discussed.
  • Logged defects in QC and tracked them to closure.
  • Coordinated with the tools group to set up the test environment/test bed and tools for the team.
  • Helped the Team Lead create a traceability matrix to ensure comprehensive test coverage of requirements and identify all test conditions and test data needs.
  • Responsible for formal prep and internal audits to meet FDA regulatory standards for Latitude.
  • Executed all test scripts in formal DVT and archived the event logs, RAS, TeraTerm and test logs as required by the FDA.
  • Actively communicated with developers to help them reproduce and resolve issues.
  • Provided estimates for verification of existing SCRs related to fixed issues or new features, and verified them.
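The flat-file and test-log comparison mentioned above can be sketched with the standard library. This Python example is illustrative (the file contents and field names are invented); in the original work this was done with shell scripts:

```python
# Hedged sketch of the shell-style flat-file comparison above, done with
# difflib so mismatched lines are reported explicitly. The expected/actual
# contents are invented sample data.
import difflib

def diff_logs(expected_lines, actual_lines):
    """Return unified-diff lines; an empty list means the files match."""
    return list(difflib.unified_diff(expected_lines, actual_lines,
                                     fromfile="expected", tofile="actual",
                                     lineterm=""))

expected = ["device=Latitude", "status=OK"]
actual = ["device=Latitude", "status=FAIL"]
for line in diff_logs(expected, actual):
    print(line)
```

In a regression run, a non-empty diff on an archived log would be flagged as a failure rather than inspected by hand.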

Environment: Python, Hercules, Oracle, SQL, WebSphere, Unix/Linux, Windows NT/XP, VMware, Quality Center, QTP 9.0, VuGen 9.5, Performance Center 9.5, PVCS VM 7.5, TMS, MS Office tools, implant devices, simulation equipment, CruiseControl, Toad, Selenium IDE

Confidential, NY

Test Lead

Responsibilities:

  • Involved from requirements walkthrough through system integration testing and pre-production validation testing.
  • Responsible for leading the testing team.
  • Used DOORS (Dynamic Object-Oriented Requirements System) for configuration management, version control and requirements management.
  • As part of the lead role, proposed and created the testing strategy, test plan, execution plan, effort estimations and timeline to ensure delivery of a quality software system.
  • Performed technical testing on the application.
  • Involved in preparing test schedules using MS Project.
  • Involved in benchmarking performance requirements; performed load and stress testing of the AUT.
  • Performed performance and stress testing on the Angel Portal application using Silk Performer.
  • Created and debugged load test scripts using Silk Performer.
  • Performed load testing on web services using Silk Performer.
  • Tried out scripted transactions using TrueLog Explorer.
  • Customized session handling using TrueLog Explorer.
  • Set up different monitors on servers to find bottlenecks using Silk Performer.
  • Performed load testing on web services (SOAP) UI.
  • Extensively worked with the LoadRunner Controller to prepare goal-based and manual scenarios.
  • Used Performance Monitor and LoadRunner graphs to analyze the results.
  • Analyzed average CPU usage, response time and TPS for each scenario.
  • Created performance test reports.
  • Used Quality Center for requirements management, test development and test execution.
  • Developed automation scripts from scratch using QTP.
  • Automated the functionality and generated VB scripts (QTP).
  • Involved in object creation and descriptive programming in QTP.
  • Developed and executed test cases to test interactive Flash images and offline streaming videos.
  • Developed and executed various scripts/functions/object repositories using QTP for automated testing of application functionality.
  • Worked with synchronization, parameterization, debugging and regular expressions in QTP.
  • Performed volume testing by transmitting batch files and validating them.
  • Modified scripts as required for new builds and monitored automation runs using QTP.
  • Tracked defects using Quality Center & conducted bug-review meetings and root-cause analysis.
  • Designed and implemented complex SQL queries using joins and aggregate functions to extract data from different databases for timely reporting to subject matter experts and for data analysis and validation.
  • Involved in performance testing of the application to determine response time and throughput.
  • Responsible for creating the traceability matrix to ensure test coverage.
  • Used ANT to automate the build process.
  • Used Apache Tomcat as the web server to process client requests.
  • Performed browser compatibility testing of data export on different browsers: IE, Netscape, Firefox and Safari.
  • Used shell scripts to compare data in flat files.
  • Documented and communicated testing progress, test results and test summary reports.
  • Involved in evaluating Quality Assurance processes to recommend efficiency improvements.
  • Worked with the development team to ensure testing issues were resolved.
  • Participated in walkthroughs, inspections and reviews of specifications and test cases.
  • Conducted daily huddle meetings and weekly team meetings.
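The per-scenario TPS analysis mentioned above can be sketched as a small calculation over transaction completion timestamps. This Python example is illustrative only (the timestamps are fabricated sample data):

```python
# Illustrative calculation of transactions-per-second (TPS) from raw
# transaction-end timestamps, the per-scenario metric analyzed above.
# The timestamps below are invented sample data.
from collections import Counter

def tps(end_times):
    """Return (average TPS over the run, peak transactions in any 1-s bucket)."""
    buckets = Counter(int(t) for t in end_times)      # group by whole second
    span = max(end_times) - min(end_times) or 1       # avoid divide-by-zero
    return len(end_times) / span, max(buckets.values())

ends = [0.2, 0.5, 0.9, 1.1, 1.4, 2.0, 2.3, 2.6, 2.9, 4.0]
avg, peak = tps(ends)
print(round(avg, 2), peak)  # 2.63 4
```

Comparing average against peak TPS is a quick way to see whether throughput was steady or bursty across the scenario.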

Environment: Java, JSP, MXML, J2EE, Flex Builder, Adobe Flex 3.0, Adobe Flash Player 9.0, iMac, Win 2000, Win XP, Win Vista, Safari, IE 7.0, Firefox 2.0, MySQL, JVM, JSF, WebLogic, Silk Performer, LoadRunner, DOORS, Quality Center 9.0, QTP 9.0, Web Link Validator, ANT.

Confidential, CA

Automation Tester

Responsibilities:

  • Responsible for leading a testing team of 3.
  • Involved in preparing test plan, test strategy and requirements traceability matrix documents.
  • Used DOORS (Dynamic Object-Oriented Requirements System) for version control and requirements management.
  • Designed, reviewed and executed system test cases using Quality Center.
  • Performed smoke testing on each build and communicated the status of each build.
  • Performed technical testing on the AUT.
  • Involved in designing the QTP framework and writing reusable functions.
  • Developed test scripts using QTP for functionality and regression testing across multiple environments with multiple global sheets of test data.
  • Created checkpoints and synchronization points for functional testing using QTP.
  • Created data-driven and parameterized automated test scripts using QTP Actions and VBScript functions.
  • Extensively used descriptive programming in QTP for automating business process testing (BPT).
  • Took ownership of maintaining the QTP object repository using regular expressions.
  • Designed reusable and modular action components for the AUT using QTP.
  • Validated XML files and XML schemas.
  • Validated web services (SOAP), both request and response messages.
  • Monitored system performance using the perfmon 3.0 utility.
  • Involved in defining performance requirements.
  • Created scripts in VuGen, parameterizing and correlating where necessary using LoadRunner.
  • Performed shakeout runs with different numbers of Vusers to check the scripts.
  • Created manual and goal-oriented scenarios according to requirements using LoadRunner.
  • Performed performance, load and stress testing of the whole program with LoadRunner to verify that the system executes commands within the specified time.
  • Monitored the performance of the application using the LoadRunner Controller.
  • Analyzed reports with different groups to find bottlenecks.
  • Identified defects, retested fixes and reported defects using Quality Center during various phases of testing.
  • Followed the Agile (Scrum) methodology for testing the application.
  • Wrote SQL scripts to retrieve and update database tables for backend testing using the querying tool Toad.
  • Used Apache Tomcat as the web server to process client requests.
  • Conducted triage meetings to evaluate bugs and/or issues found during various testing phases.
  • Developed and executed system test cases for the SUT.
  • Involved in functional, GUI and usability testing.
  • Involved in smoke, integration, regression, system, security and UAT testing.
  • Used UNIX shell scripts to compare data in flat files.
  • Set up the test environment & created test data.
  • Performed testing on different browsers for cross-browser compatibility.
  • Documented test results, test execution status and test reports using Quality Center.
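The backend SQL checks described above can be sketched with a self-contained example. This Python snippet is hypothetical: it uses an in-memory SQLite table (standing in for the real SQL Server/Oracle schema), and the `orders` table, its columns and the sample rows are all invented for illustration:

```python
# Hypothetical backend-data check in the spirit of the Toad/SQL bullets:
# validating that expected rows reached the database after a test run.
# The orders table, columns and data are invented; SQLite stands in for
# the real SQL Server/Oracle backend.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "SHIPPED"), (2, "PENDING"), (3, "SHIPPED")])

def count_by_status(conn, status):
    """Backend check: count rows matching an expected status."""
    row = conn.execute("SELECT COUNT(*) FROM orders WHERE status = ?",
                       (status,)).fetchone()
    return row[0]

print(count_by_status(conn, "SHIPPED"))  # 2
```

In practice the same parameterized query pattern runs against the production-like test database, and the count is asserted against what the front-end transaction was expected to write.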

Environment: QTP 9.0, Quality Center 9.0, DOORS, Windows XP, Agile (Scrum), Apache 3.1, SQL Server 2005, ADO.NET, ASP.NET, C#.NET, XML, SOAP web services, IIS 6.0, MS-Excel, LoadRunner, JBoss, Win 2000, XP, Vista, IE 6.0/7.0, Firefox 2.0, Safari, Toad, XR perfmon 3.0
