
Lead Performance Engineer/Analyst Consultant Resume


Warren, NJ

SUMMARY

  • 10+ years of diversified experience in Software Quality Assurance, Performance Testing/Engineering.
  • 3+ years of experience as Performance Test Lead.
  • Strong knowledge and experience in performance tuning of applications.
  • Expertise in root-cause analysis and in identifying bottlenecks.
  • Knowledge and working experience with Amazon Web Services - AWS (EC2, RDS, RedShift DB).
  • Experience in performance testing of web applications, client/server applications, UNIX, Linux and Internet applications using Load Runner/Performance Center and JMeter.
  • Performance testing Experience in J2EE, Oracle, web and Thin client applications using different protocols such as HTTP/HTML, Oracle NCA, JAVA, RTE, RDP, Citrix, web services and multiple protocols.
  • Expertise in Server-side profiling using Wily Introscope, Riverbed (OPNET), AppDynamics and DynaTrace.
  • Expertise in client-side profiling using Fiddler and DynaTrace front-end analysis to find problems affecting the end user.
  • Experienced in using vmstat, nmon, sar and top system monitors to measure Linux system performance under load.
  • Expertise in Heap Dump and Thread Dump analysis, JVM heap tuning including Garbage Collection algorithms.
  • Experienced in Performance testing .Net and Java web applications.
  • Monitored system resources such as CPU usage, memory usage, network bandwidth, I/O state and vmstat output using monitoring tools.
  • Performed batch testing of databases using LoadRunner VuGen.
  • Worked as an independent contributor and managed teams both onsite and offshore.
  • Experienced in using monitoring tools like Wily Introscope, PerfMon, JConsole, TeamQuest Viewer, BMC Portal, HP Diagnostics, SiteScope, BAC RUM, DynaTrace, BMC Patrol, BMC Performance Assurance Console (UNIX, Windows) and QPasa.
  • Expertise in creating documents such as Performance Test Plans, Performance Test Strategies and Performance Test Result Analysis document.
  • Experienced in developing workload models.
  • Performed load, endurance, scalability, failover, stress and disaster recovery tests using LoadRunner and Performance Center/ALM.
  • Knowledge in writing Bash scripts.
  • Experienced in installing test platforms, test applications and test tools.
  • Experienced with defect tracking tools like Quality Center, VSTS, IBM ClearQuest and JIRA.
  • Excellent verbal communication, interpersonal and organizational skills.
  • Extensively worked on applications that were hosted on Windows, Linux and UNIX, both physical and virtual.
  • 10 years of extensive experience in Microsoft Tools: Word, Excel, PowerPoint, Visio, MS Project and Access
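As a brief illustration of the Linux monitoring and Bash scripting mentioned above, the following sketch parses a hypothetical `vmstat` capture taken during a load test and computes the average CPU idle percentage (the sample data, file name, and figures are illustrative, not taken from any actual engagement):

```shell
# Hypothetical snapshot of `vmstat 5` output saved during a load test.
# On a standard Linux vmstat layout, field 15 is the CPU idle percentage.
cat > vmstat_sample.txt <<'EOF'
procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812344  10240 512000    0    0     5    10  220  340 35  5 58  2  0
 4  0      0 798120  10240 511800    0    0     8    22  310  420 62  8 28  2  0
 6  1      0 771456  10244 511600    0    0    12    40  405  515 78 10 10  2  0
EOF

# Average the idle column over the data rows to gauge CPU headroom under load.
avg_idle=$(awk 'NR > 2 { sum += $15; n++ } END { printf "%d", sum / n }' vmstat_sample.txt)
echo "average CPU idle under load: ${avg_idle}%"
```

In practice the capture would come from running `vmstat` on the system under test for the duration of the run; the same approach extends to `sar` or `nmon` exports.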

TECHNICAL SKILLS

Operating Systems: AIX, HP-UX, Solaris, UNIX, Windows XP, 2003, 2000, Vista and Linux

Languages: C, C++, JAVA/J2EE, XML

Databases: Oracle, DB2, SQL Server, MySQL, VISION, Cassandra

Web Technologies: XML, XHTML, SOAP, WSDL, XML Web Services

Testing Tools: Load Runner, Performance Center, JMeter, SOAP UI

Monitoring Tools: HP Diagnostics, BAC RUM, SiteScope, Wily IntroScope, DynaTrace, BMC Patrol, BMC Performance Assurance Console (UNIX, Windows), QPasa, Splunk, Optier BTM, JProfiler, TeamQuest

Project Mgmt: MS Project, MS Visio, Quality Center, JIRA

Web/App servers: Apache, Tomcat, WebLogic, WebSphere

Analytical tools: Tableau (Data visualization tool), Google Analytics

PROFESSIONAL EXPERIENCE

Confidential - Warren, NJ

Lead Performance Engineer/Analyst consultant

Responsibilities:

  • Worked with application teams and developers to select use cases and gather performance test requirements, including SLAs, for Java-based applications.
  • Developed and implemented test procedures for the projects impacted by every Enterprise Release.
  • Managed performance testing team of 5 offshore and 2 on site resources working on APIs performance testing every release.
  • Handled multiple projects (6 to 7 every ER) for performance testing and provided performance sign off for every Enterprise Release.
  • Coordinated with developers to identify the projects (APIs) being impacted and the requirements for performance testing. Assigned projects to the performance testing team and created a daily touch-point meeting to track progress against the ER timeline.
  • Generated “Created vs. Resolved issues” reports from JIRA and presented defect statistics every ER. Participated in daily walkthrough and defect report meetings.
  • Extensively worked on API XML comparison tests to validate XML tags and values in the responses from the Mainframe and the new Cassandra DB. Reported XML tag/value mismatches to the developers and retested after issues were fixed.
  • Worked extensively on API testing on the VISION Mainframe. Prepared a framework for performance testing of thousands of APIs using a single script.
  • Developed LoadRunner scripts and executed load, stress, capacity and failover tests for an application running on cloud-based AWS Linux servers and a Cassandra cloud database.
  • Captured and compared application KPIs on premises and on the Cassandra DB before providing performance sign-off for any project moving to the new platform.
  • Monitored server resources and performance metrics with Wily IntroScope while tests ran and provided recommendations to tune the applications/servers.
  • Analyzed GC logs and identified the root cause of full garbage collections by examining heap dumps.
  • Provided JVM heap-size recommendations, broke abnormally long GC pauses into smaller incremental pauses, and reduced the number of full GCs causing CPU spikes under high-memory conditions by increasing heap size, thereby eliminating JVM abnormalities.
  • Worked with DBA, Sys Admin, and development team to tune database and application performance.
  • Performed multiple capacity test runs to fine-tune the heap size and recommended the most suitable garbage collection algorithm for optimum performance.
  • Performed failover and disaster recovery tests for the environment on a single (passive) cell in production. Provided failover latency and server resource metrics.
  • Captured and analyzed thread dumps to troubleshoot high response times and high CPU consumption; the analysis identified and presented the root cause of a hung-thread/deadlock condition.
  • Tested APIs performance under stress on Cassandra/Datastax cloud Database.
  • Prepared presentations on the performance testing effort for the Order Rewire (Cassandra) projects and presented them to the app team and upper management.
  • Presented MIPS savings progression graphically after every rollout (offloading APIs from the Mainframe to Cassandra).
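The GC log analysis described above can be illustrated with a short sketch that totals Full GC pauses from a verbose GC log; the log lines, file name, and timings below are made-up samples in a common HotSpot format, not output from any actual system:

```shell
# Hypothetical verbose GC log excerpt (HotSpot-style "-verbose:gc" output).
cat > gc_sample.log <<'EOF'
0.512: [GC (Allocation Failure)  512M->128M(2048M), 0.0451 secs]
12.204: [GC (Allocation Failure)  640M->256M(2048M), 0.0523 secs]
45.881: [Full GC (Ergonomics)  1900M->900M(2048M), 2.3100 secs]
90.417: [Full GC (Ergonomics)  1950M->940M(2048M), 2.6900 secs]
EOF

# Count Full GC events and sum their pause times -- the stop-the-world
# pauses that typically drive abnormal response-time spikes under load.
full_count=$(grep -c 'Full GC' gc_sample.log)
full_pause=$(grep 'Full GC' gc_sample.log \
  | sed 's/.*), \(.*\) secs]/\1/' \
  | awk '{ s += $1 } END { printf "%.2f", s }')
echo "Full GCs: ${full_count}, total pause: ${full_pause}s"
```

A rising Full GC count or pause total across runs is what motivates the heap-size and collector-algorithm tuning mentioned in the bullets above.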

Environment: WebSphere, Load Runner 12.5, Wily IntroScope, Quality Center, One JIRA, Linux, AIX, Cassandra, MCS GRID, Windows 2008 servers, AWS cloud (EC2, RDS, Redshift DB), DVS, VISION Mainframe

Confidential - Jersey City, NJ

Sr. Performance Analyst consultant

Responsibilities:

  • Gathered performance requirements by scheduling multiple rounds of meetings.
  • Prepared test cases, a test strategy, and a test plan based on the non-functional business requirements for two different projects (Combined Intranet and MDH/P&L Reporting).
  • Extracted the batch jobs from TIDAL and compared job durations after upgrading the Data Warehouse and Informatica servers from IBM Power 6 to Power 7 boxes.
  • Verified that newly added jobs had no performance impact on existing batch jobs.
  • Developed Test script using LoadRunner Vugen Web HTTP/HTML and Citrix protocols.
  • Debugged and customized test scripts using functions such as string manipulation functions to make them work as expected for the business functionality.
  • Captured the dynamic SAML response token from the server response using the web_reg_save_param function.
  • Analyzed Thread Dump and identified/reported threads with “Stuck” status on Hotspot JVM on WebLogic server.
  • Created dashboards in AppDynamics and Splunk to monitor server resource utilization and service/method-level response times.
  • Analyzed verbose GC logs and identified high response times caused by full GCs and a memory leak.
  • Worked on log parsing, complex Splunk searches, including external table lookups.
  • Planned, designed, executed and maintained different types of performance tests, such as load, stress, endurance, spike and capacity tests.
  • Created Bash shell scripts to evaluate system logs for failures.
  • Involved in performance tuning and monitoring; tuned SQL queries using Explain Plan.
  • Performed MQ load tests using the LoadRunner Web Services protocol to send large numbers of messages to MQ.
  • Involved in capacity planning for Enterprise Release applications, calculating future growth in transaction volume and concurrent users on the system.
  • Used HTTPWatch and Fiddler to gather Web network metrics and finalized expected concurrent user load metric.
  • Used TeamQuest Tool to measure resources utilization in virtual environment.
  • Tuned servers for memory and CPU as required after test result analysis.
  • Prepared a summary report for all test runs and presented it to management and the team.
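The Bash log-evaluation scripting noted above can be sketched as follows; the sample log content, file name, and counts are hypothetical stand-ins, not artifacts from any actual project:

```shell
# Hypothetical application log captured during a test run.
cat > app_sample.log <<'EOF'
2018-05-01 10:00:01 INFO  request ok
2018-05-01 10:00:02 ERROR timeout calling downstream service
2018-05-01 10:00:03 WARN  slow response 4200 ms
2018-05-01 10:00:04 ERROR connection reset by peer
EOF

# Flag the run as failed if any ERROR entries appear in the log.
errors=$(grep -c ' ERROR ' app_sample.log)
if [ "$errors" -gt 0 ]; then
  echo "FAIL: ${errors} error(s) found"
else
  echo "PASS: no errors"
fi
```

A script like this can be wired into the test harness so a run is automatically marked for triage whenever server-side errors occur under load.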

Environment: WebLogic, WebSphere, Load Runner 11, Wily IntroScope, TIDAL, PerfMON, TeamQuest, Eclipse, MQ, HTTPWatch, Data Warehouse, Informatica, IBM Power 6 & 7, AppDynamics, Splunk

Confidential - Wilmington, DE

Sr. Performance Engineer consultant

Responsibilities:

  • Interacted with the business community and end users to gather requirements and developed the User Requirement Specification (URS) document.
  • Worked in an agile environment and provided reports on a day-to-day basis.
  • Prepared test cases, a test strategy, and a test plan based on the non-functional business requirements.
  • Collaborated with architect and development teams to analyze the application’s core functionalities and its various dependencies to identify potential bottlenecks.
  • Developed and maintained repeatable and scalable performance test suites.
  • Analyzed load patterns and created test scenarios to emulate real-life stress conditions.
  • Designed and executed different types of performance tests like load, stress, endurance and capacity test.
  • Designed test case documents for performance testing in Quality Center and reported defects.
  • Addressed both production and enhancement modification requests, ensuring that all defects were addressed within the software development life cycle.
  • Worked with operations and capacity teams to gather production data to create workload models.
  • Prepared Workload Model and Configured appropriate Pacing and Think Time to meet targeted load rates.
  • Developed LoadRunner scripts using WEB/HTTP to performance test .Net application.
  • Correlated large dynamic values such as ViewState, EventValidation, etc.
  • Developed LoadRunner scripts using RTE and RDP protocols to test client server application.
  • Utilized TE synchronization functions such as TE_wait_sync, TE_wait_text, and TE_wait_cursor in RTE scripts.
  • Manually added a synchronization point on top of each mouse-click function to verify the recorded and replayed screen images before continuing in the RDP protocol.
  • Have a thorough understanding of assigning text/image checks, rendezvous points, parameterization, and correlation (capturing dynamic values like session IDs/cookies) irrespective of the application.
  • Developed and enhanced complex VUgen scripts using RTE and RDP Protocols.
  • Generated Loan Numbers using RTE script and passed it to another RTE script using Virtual Table Server (VTS).
  • Gathered the results from each test run and conducted in-depth analysis of transaction response times, throughput, error rate, TPS, network latency and the performance of each server.
  • Monitored system level resources as well as CLR metrics such as Total Processor Time, Processor Queue Length, Available MBytes, Private Bytes usage, %Time spent in GC and LocksAndThreads metrics using PerfMon.
  • Created dashboards in Dynatrace to monitor system resources, exceptions and transaction response times.
  • Using Dynatrace PurePath drill-down and response-time hotspot features, identified the method calls and DB statements contributing to high transaction response times.
  • Coordinated with the tools team to install SiteScope, Wily IntroScope and Dynatrace on the performance environments for triage calls to identify bottlenecks.
  • Developed analytical Production environment capacity forecast model using Microsoft Excel.
  • Entered defects in QC and updated the status of current defects. Generated defect chart reports for each project.
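The ViewState correlation described above amounts to extracting a dynamic value from a server response so it can be replayed in the next request. A minimal sketch of that extraction step, using an invented response fragment and file name (the token value is a meaningless placeholder):

```shell
# Hypothetical .NET page response fragment containing dynamic hidden fields.
cat > response_sample.html <<'EOF'
<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="dDwtMTg3NjI1NDM1O1Q8cDw=" />
<input type="hidden" name="__EVENTVALIDATION" id="__EVENTVALIDATION" value="wEWAgL3x5iKCQKc=" />
EOF

# Capture the __VIEWSTATE value between left and right boundaries,
# analogous to what a LoadRunner correlation rule does.
viewstate=$(sed -n 's/.*name="__VIEWSTATE".*value="\([^"]*\)".*/\1/p' response_sample.html)
echo "captured __VIEWSTATE: ${viewstate}"
```

In a VuGen script the equivalent is a boundary-based capture registered before the request, with the saved parameter substituted into the subsequent POST.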

Environment: WebLogic, Load Runner 11, Wily IntroScope, SiteScope, PerfMON, NMON, Quality Center, UNIX, AIX 6.1, RHEL 5.5, ONBase, Windows 2008 servers, Virtual Table Server, Dynatrace

Confidential - Bloomington, IL

Lead Performance Engineer consultant

Responsibilities:

  • Introduced the automation environment and led the team in testing the Java web-based application across various testing phases, testing techniques and quality work products.
  • Participated in all meetings planned for a particular release to obtain the necessary technical automation requirements; such meetings included design reviews, test execution timelines, etc.
  • Conducted a complete assessment of teams, processes and environment; defined issues and risks; and oversaw the design and introduction of tools, processes and best practices across the testing life cycle to improve throughput, communication and on-time delivery of projects in an Agile development environment.
  • Developed VUser scripts using protocols like HTTP/HTML, Oracle NCA, Java and Web Services, and enhanced scripts with C functions. Validated the scripts to ensure they ran successfully during replay.
  • Worked with other technical team members (architects, DBAs) to support test execution and ensure correct environment configuration just prior to execution.
  • Executed load and stress tests in Performance Center to find the performance breaking point of the system.
  • Monitored CPU, Memory, Network, Web connections and throughput using different monitoring tools such as HP Diagnostics and SiteScope while running different types of tests.
  • Set up alerts in the HP Diagnostics tool to receive automated emails for metrics crossing specific thresholds.
  • Analyzed heap dumps using IBM Heap Dump Analyzer, identified a memory leak issue and presented the analysis to the application team.
  • Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Measured response times at sub-transaction levels on the web, app and database servers using HP Diagnostics.
  • Analyzed Thread Dump and identified/reported threads with “Wait” status and having deadlock condition.
  • Monitored Oracle DB performance for Indexes, Sessions, Connections, poorly written SQL queries.
  • Found performance degradation issues with high CPU usage metric due to Full GC happening on JVM.
  • Monitored server metrics such as Thread pool utilization, JDBC connection Pool, EJB connections and recommended ideal size of thread pool after multiple test runs.
  • Performed regular, detailed analysis of historical business data (using HP SiteScope), such as business-critical transaction load data, metrics, and other key drivers of workload volume, to develop analytical models aiding short- and long-term capacity planning.
  • Prepared and presented capacity analyses and projected forecasts to leadership to proactively handle future workload and eliminate unhandled faults that could impact the business.
  • Captured individual statements in a stored procedure through SQL Server Profiler and observed execution.
  • Performed MQ testing using LoadRunner and monitored MQ behavior using QPasa tool.
  • Accessed different server logs using Splunk Lite to analyze exceptions and long running transactions.
  • Created and configured management reports and dashboards using Splunk.
  • Used BMC Performance Assurance Console to monitor server side metrics such as CPU and Memory Utilization.
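The thread dump analysis mentioned above typically starts by counting threads blocked on monitors, since a cluster of BLOCKED threads points at contention or deadlock. A minimal sketch over an invented jstack-style excerpt (thread names, IDs, and the file name are illustrative only):

```shell
# Hypothetical excerpt of a JVM thread dump (jstack-style format).
cat > threaddump_sample.txt <<'EOF'
"http-worker-12" #12 prio=5 tid=0x1a nid=0x2f waiting for monitor entry
   java.lang.Thread.State: BLOCKED (on object monitor)
"http-worker-13" #13 prio=5 tid=0x1b nid=0x30 runnable
   java.lang.Thread.State: RUNNABLE
"http-worker-14" #14 prio=5 tid=0x1c nid=0x31 waiting for monitor entry
   java.lang.Thread.State: BLOCKED (on object monitor)
EOF

# Count threads blocked waiting on object monitors.
blocked=$(grep -c 'State: BLOCKED' threaddump_sample.txt)
echo "threads blocked on monitors: ${blocked}"
```

From there the blocked threads' stack frames identify which lock they are waiting on and which thread holds it.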

Environment: Web Sphere, Windows 2000 Advanced Server, IBM AIX, UNIX, Oracle Database, SQL Server 7.0, MQ series (IBM and MS), Load Runner, JMeter, Performance Center 11, SiteScope, HP Diagnostics, HP RUM, Qpasa, Splunk, Optier

Confidential

Performance Tester

Responsibilities:

  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
  • Extensively used LoadRunner for performance and stress testing.
  • Extensively used JMeter to automate business transaction to test Web services application.
  • Used Manual and Automated Correlation to Parameterize Dynamically changing Parameters.
  • Developed and executed JMeter scripts, monitored the performance of the load testing hardware under load, analyzed and reported the results, and worked with the right people to resolve issues.
  • Extensively used Quality Center for test planning, maintaining test cases and test scripts for test execution, as well as bug reporting.
  • Used developer tools to analyze client-side performance for a single user and identified the cause of slow-responding transactions in an application.
  • Checked network latency with WAN emulation on and off and analyzed the results by comparing them to the current location.

Environment: LoadRunner, JMeter, Quality Center, Oracle, IIS, Apache Tomcat, UNIX, Java, ASP.NET, ADO.NET, Web Services, IBM AIX, Solaris, WebLogic
