Sr. Performance Engineer Resume
Charlotte, NC
SUMMARY:
- Performance Tester/Engineer with over 7 years of diversified experience testing web-based e-commerce and GUI-based client/server applications using manual and automated testing procedures.
- Involved in all stages of the Quality Assurance life cycle; extensively followed CMM and TSO methodologies for quality analysis.
- Extensively worked with the LoadRunner/Performance Center Web, Web Services, Winsock, Database, Citrix, Oracle, and Click and Script protocols.
- Worked in centralized performance test teams that cater to different application owners; experienced in gathering performance requirements (peak load, number of users, load rate, environment) from the client, designing performance test scenarios, and developing the performance test plan for the application.
- Hands-on experience modularizing scripts using C header files and including them in LoadRunner/Performance Center scripts (see the sketch at the end of this summary).
- Experienced in parameterization, host configuration, and scheduling operations.
- Ability to effectively analyze a variety of scenarios, performance reports, and summary reports.
- Configured run-time settings for VuGen and the Controller.
- Well versed in all functionality of the Virtual User Generator and in correlating statements.
- Monitored Oracle database performance for indexes, sessions, connections, poorly written SQL queries, and deadlocks for each component of the WSJ application.
- Experience in Rational Unified Process (RUP), ClearCase, ClearQuest, Rational Requisite Pro and UML
- Good understanding of object-oriented methodologies, the software life cycle, and software testing methods
- Automated testing of games with TouchTest; used SOASTA to record and customize complex motions, gestures, and context with high precision. Tested various games such as Skeleton and Climbers.
- Automated testing of Mobile applications
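A minimal sketch of the header-file modularization approach noted above (illustrative only; the file name, function name, parameters, and URL are hypothetical placeholders, not actual project code):

    /* login_helpers.h - reusable helper compiled into the VuGen script */
    int perform_login()
    {
        lr_start_transaction("Login");
        web_submit_data("login",
            "Action=http://app.example.com/login",   /* placeholder URL */
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=username", "Value={pUser}", ENDITEM,  /* {pUser}/{pPwd} come from a data file */
            "Name=password", "Value={pPwd}",  ENDITEM,
            LAST);
        lr_end_transaction("Login", LR_AUTO);
        return 0;
    }

    /* Action.c - the header is included (typically via globals.h) and the helper is called here */
    #include "login_helpers.h"

    Action()
    {
        perform_login();
        return 0;
    }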
TECHNICAL SKILLS:
Operating Systems: AIX, HP-UX, Solaris, UNIX, Windows XP, 2003, 2000, Vista, Windows NT and Linux
Languages: C, C++, Java/J2EE, VBScript, Perl, Python, XML, Shell Scripting, Ruby
Databases: Oracle, DB2, SQL Server, MS-ACCESS, MySQL
Web Related: DHTML, XML, HTML
Testing Tools: Mercury LoadRunner/Performance Center, WinRunner, Quick Test Pro, SOASTA
Web / Application Servers: Apache, Tomcat, Weblogic, WebSphere, IIS
Methodologies: RUP, Agile, Performance Testing
Project Management /Analysis: MS Project, MS Visio, ClearCase, ClearQuest, Rational Requisite Pro and UML
Other: Performance Center, SiteScope, TeamQuest, Wily, ALM/Quality Center
PROFESSIONAL EXPERIENCE:
Sr. Performance Engineer
Confidential, Charlotte, NC
Responsibilities:
- Developed Performance testing plan based on business and technical requirements.
- Performed large-scale, end-to-end load and volume testing using large user data files
- Used the Virtual User Generator to create VuGen scripts for the Web (HTTP/HTML), .NET, and Web Services protocols
- Configured and used IP Spoofing in LoadRunner/Performance Center to simulate a more realistic testing scenario.
- Conducted verification, benchmark, stress, failover, and volume tests against the application using LoadRunner/Performance Center.
- Used Schedule by Scenario in the Controller to change the ramp-up, duration, and ramp-down settings
- Used the lr_xml_get_values function to capture values from the SOAP response and reuse them in the next SOAP request (see the sketch at the end of this section)
- Identified the load balancing issues on the servers
- Tested different versions of the application in performance and pre-production environments before go-live to production
- Designed different scenarios in the Controller and set up performance monitors to help identify application issues such as database locks and bottlenecks
- Monitored system performance using HP Business Availability Center, HP SiteScope, NMON, and Windows Performance Monitor
- Monitored metrics such as response times, and server resources such as Total Processor Time, Available Bytes, and Process Bytes, using LoadRunner/Performance Center monitors
- Used ALM/Quality Center to record defects and scenarios and to invoke deferrals
- Produced status reports, test results, analysis, and recommendations; identified risks where applicable; and published metrics used in stakeholders' decisions.
- Performed performance testing in the cloud using SOASTA CloudTest; used SOASTA TouchTest to record and customize complex motions, gestures, and context with high precision.
Environment: LoadRunner, UFT/QTP, Oracle, MS SQL Server, WebSphere, Load Balancer, Java, Informatica, ALM/Quality Center, J2EE Diagnostic Tool, Ethereal, JMeter, Windows, Wily, Cloud Testing, SOASTA.
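A minimal sketch of the lr_xml_get_values flow described above (illustrative only; the endpoint, payload, and XPath query are hypothetical placeholders):

    /* Capture the full SOAP response into a parameter (empty LB/RB saves the whole body) */
    web_reg_save_param("SoapResponse", "LB=", "RB=", LAST);

    web_custom_request("SubmitOrder",
        "URL=http://svc.example.com/OrderService",          /* placeholder endpoint */
        "Method=POST",
        "EncType=text/xml",
        "Body=<soapenv:Envelope>...order request...</soapenv:Envelope>",
        LAST);

    /* Pull a value out of the captured SOAP response with an XPath query */
    lr_xml_get_values("XML={SoapResponse}",
                      "ValueParam=OrderId",
                      "Query=/Envelope/Body/OrderResponse/OrderId",
                      LAST);

    /* {OrderId} is then substituted into the body of the next SOAP request */
    lr_output_message("Captured OrderId: %s", lr_eval_string("{OrderId}"));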
Performance Analyst
Confidential, NJ
Responsibilities:
- Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
- Extensively used LoadRunner/Performance Center for performance and stress testing.
- Used the Winsock and ODBC protocols to execute stored procedures in LoadRunner/Performance Center for the database migration from Sybase to SQL Server
- Used the Java RMI protocol in LoadRunner to capture Java API-based web applications.
- Used manual and automated correlation to parameterize dynamically changing values (see the correlation sketch at the end of this section)
- Prepared large data sets for the parameterized values in the scripts across multiple scenarios
- Developed Vuser scripts and enhanced the basic scripts by parameterizing constant values in LoadRunner/Performance Center
- Extensively used ALM/Quality Center for test planning, maintaining test cases and test scripts, test execution, and bug reporting.
- Used ALM/Quality Center to invoke the scripts, performed the initial baseline testing, organized all the scripts systematically, and generated reports.
- Checked network latency with WAN emulation on and off and analyzed the results by comparing them to the current location.
- Monitored BufferCacheHitRatio, FullScansPerSecond, SqlCompilationsPerSecond, and LockTimeoutsPerSecond, and traced database server performance using SQL Profiler.
- Developed daily status reports and published them to the Development, Configuration, DBA, and Network teams.
Environment: LoadRunner/Performance Center, UFT/QTP, JMeter, ALM/Quality Center, Oracle, IIS, Apache Tomcat, Unix, Java, ASP.NET, ADO.NET, Web Services, IBM AIX, Solaris, WebLogic
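A minimal sketch of the manual correlation described above (illustrative only; the boundaries, parameter name, and URLs are hypothetical placeholders):

    /* Register the capture before the step that returns the dynamic value */
    web_reg_save_param("SessionId",
                       "LB=sessionId=",      /* left boundary  - placeholder */
                       "RB=\"",              /* right boundary - placeholder */
                       "Ord=1",
                       LAST);

    web_url("Login",
            "URL=http://app.example.com/login",
            "Mode=HTML",
            LAST);

    /* Reuse the captured value in place of the recorded hard-coded value */
    web_url("Account",
            "URL=http://app.example.com/account?sessionId={SessionId}",
            "Mode=HTML",
            LAST);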
Performance Engineer
Confidential, MN
Responsibilities:
- Responsible for gathering business requirements, studying the application, and collecting information from developers and the business.
- Parameterized large and complex test data to accurately depict production trends.
- Developed and enhanced LoadRunner/Performance Center scripts using C functions.
- Parameterized cookies, stored dynamic content using LoadRunner/Performance Center functions, and used client-side secure certificates.
- Parameterized unique IDs, stored dynamic content in variables, and passed the values to web submits under the HTTP protocol (see the sketch at the end of this section).
- Validated scripts to ensure they have been executed correctly and meet the scenario description.
- Analyzed CPU and memory statistics on web servers, application servers, and DB servers using Wily.
- Monitored garbage collections, JDBC connections, and timeouts during test execution.
- Analyzed the LoadRunner/Performance Center reports to calculate response times and transactions per second
- Developed performance analysis reports and graphs (including LoadRunner built-in graphs and custom MS Excel graphs).
Environment: Windows 2000 Advanced Server, .NET, IIS, Oracle Database, SQL Server, WebLogic, MQ Series (IBM and MS), ClearQuest, ALM/Quality Center, QuickTest Pro, LoadRunner/Performance Center, WinRunner
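A minimal sketch of the cookie parameterization and dynamic-content handling described above (illustrative only; the cookie name, domain, boundaries, and URLs are hypothetical placeholders):

    /* Parameterized cookie added before the request */
    web_add_cookie("APP_TOKEN={pToken}; DOMAIN=app.example.com");

    /* Capture a dynamically generated unique ID from the response */
    web_reg_save_param("UniqueId", "LB=id=\"", "RB=\"", "Ord=1", LAST);

    web_url("OpenForm",
            "URL=http://app.example.com/form",
            "Mode=HTML",
            LAST);

    /* Store the captured value in a script variable and pass it to the web submit */
    lr_save_string(lr_eval_string("{UniqueId}"), "pUniqueId");

    web_submit_data("SubmitForm",
        "Action=http://app.example.com/form/submit",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=recordId", "Value={pUniqueId}", ENDITEM,
        LAST);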
Performance Engineer
Confidential, OH
Responsibilities:
- Gathered business requirements, studied the application, and collected information from analysts.
- Developed and deployed load test scripts for end-to-end performance testing using LoadRunner/Performance Center
- Developed virtual user scripts for the Web (HTTP/HTML), Java, and Citrix ICA protocols.
- Used the MQ Client protocol to test WebSphere MQ.
- Developed LoadRunner/Performance Center scripts in the Virtual User Generator for baseline, soak (endurance), and stress test scenarios, storing dynamically varying object IDs in parameters and validating correct downloads of HTML pages by checking the content in the page source.
- Simulated multiple Vuser scenarios in LoadRunner/Performance Center.
- Defined rendezvous points to create intense load on the server and measure server performance under load (see the sketch at the end of this section).
- Monitored software and hardware behavior during test runs using PerfMon and LoadRunner/Performance Center online monitors.
- Responsible for monitoring Oracle database performance for indexes, sessions, connections, poorly written SQL queries, and deadlocks for each component of the application.
- Identified and analyzed memory leaks at each component level.
- Analyzed database stored procedure executions, indexes, and deadlocks under load.
- Ensured that defects were appropriately identified, analyzed, documented, tracked, and resolved in ALM/Quality Center.
Environment: WebSphere, RUP, Windows 2000 Advanced Server, IIS 5, JMeter, IBM AIX, DB2, PL/SQL, MQ Series (IBM and MS), Mainframe, Performance Center, LoadRunner/Performance Center, ALM/Quality Center, WinRunner, QuickTest Professional, IBM Rational, SiteScope, HP J2EE Diagnostics, HP Business Availability Center.
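A minimal sketch of the rendezvous usage described above (illustrative only; the rendezvous name, transaction name, and URL are hypothetical placeholders):

    Action()
    {
        /* All Vusers pause here until the rendezvous policy releases them,
           producing a simultaneous spike of load on the server */
        lr_rendezvous("submit_order");

        lr_start_transaction("Submit_Order");
        web_url("SubmitOrder",
                "URL=http://app.example.com/order/submit",
                "Mode=HTML",
                LAST);
        lr_end_transaction("Submit_Order", LR_AUTO);

        return 0;
    }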
Performance Tester
Confidential, NE
Responsibilities:
- Involved in project planning and coordination, and implemented the performance testing methodology.
- Developed Performance Test Plans and Test Strategies based on business requirements.
- Conducted performance testing by creating virtual users and scenarios using LoadRunner
- Responsible for the development of Vuser scripts for several different protocols such as COM/DCOM, RDP and Web Services.
- Created Rendezvous points to simulate a scenario of multiple users performing the same action simultaneously
- Manually correlated session IDs and database primary keys, saving the dynamically changing values into parameters by locating them in the body of the server response.
- Measured response times at the sub-transaction level for web, application, and database servers using Optimal Application Expert; concentrated heavily on transactions per second during testing (see the transaction-timing sketch at the end of this section).
- Analyzed the load test results by drilling down into transactions, merging graphs (overlay and correlate graphs), using cross-result graphs, and auto-correlating measurements, thereby focusing on behavior patterns and identifying problematic elements with the LoadRunner/Performance Center Analysis tool.
- Monitored PerfMon counters and Windows resources such as CPU usage, percentage of memory occupied, and I/O statistics
Environment: Windows, HP-UX, AIX, Solaris, JavaScript, Oracle, C, C++, DB2, Sybase, MS Access, WebSphere, Apache, WebLogic, SQL, Tuxedo, Wily Introscope, Mainframe, Netscape, IE, XML, SSH, ALM/Quality Center, WinRunner, LoadRunner, UFT/QTP, SiteScope
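A minimal sketch of the transaction-level timing behind the response-time and transactions-per-second figures above (illustrative only; the transaction name and URL are hypothetical placeholders):

    lr_think_time(5);                       /* pacing between iterations */

    lr_start_transaction("Search_Quote");
    web_url("SearchQuote",
            "URL=http://app.example.com/quote/search",
            "Mode=HTML",
            LAST);

    /* Mark the transaction failed if the request did not return HTTP 200 */
    if (web_get_int_property(HTTP_INFO_RETURN_CODE) != 200)
        lr_end_transaction("Search_Quote", LR_FAIL);
    else
        lr_end_transaction("Search_Quote", LR_PASS);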