
Lead Performance Architect Resume


Dallas, TX

SUMMARY

  • 10 years of experience in the IT industry, mainly in designing, developing and integrating components for robust web-based and client/server software applications.
  • Hands-on experience in design, development and execution of performance test plans as well as Test Strategies for Functional, Automated and Performance Testing.
  • Ability to identify problems, analyze test results, investigate their causes and suggest remedies.
  • Extensive knowledge of Software Development Life Cycle (SDLC), Performance Test Life Cycle, Scrum and Agile having thorough understanding of various phases like Requirements, Analysis/Design, Development and Testing.
  • QA experience in Manual, Automated Testing as well as Load Testing on web and client-server applications
  • Worked extensively on Mercury Interactive testing tools - LoadRunner, Soap UI, WinRunner, QuickTest Pro (QTP), Quality Center and Test Director (TD)
  • Involved in writing Test Plans and Test Methodologies (Scrum)
  • Installed and configured LoadRunner and load generators
  • Performed System, Functional, Performance/Load, Compatibility, Regression, Integration, Positive, Negative, and Black box Testing
  • Created automated Test Case scenarios using QTP and WinRunner.
  • Prepared and reviewed performance test scripts in JMeter
  • Used SQL queries to interact with database.
  • Defect tracking using Test Director (TD) and Quality Centre (QC)
  • Knowledge in all stages of the Software Development Life Cycle (SDLC) beginning from definition and initiation to deployment and support.
  • Hands on experience on troubleshooting Windows Server and desktop based systems.
  • Hands-on experience in LoadRunner protocols like Web (HTTP/HTML), Web Services, SAP GUI and SAP Web. Some hands-on experience using other tools - Silk Performer, NeoLoad, Soap UI, QuickTest Pro, Wily Introscope, and Rational tools
  • Performed SOA testing using web services.
  • Employed Agile, Waterfall methodologies and especially SCRUM to ensure rapid iterative Software development
  • Performance testing expertise in developing Performance Test plans, Test Strategy, Load Modeling, Performance Metrics and Performance Analysis.
  • Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
  • Sound knowledge of creating Load, Stress and Performance tests using LoadRunner
  • Experience in monitoring servers using tools like Site Scope, Wily Introscope, Solarwinds, Fact finder and DynaTrace.
  • Worked on Mobile Computing to validate mobile user traffic
  • Actively involved in Application Performance Monitoring using BSM, BPM, Site Scope, Dynatrace and Fiddler for end client behavior.
  • Involved in Root Cause Analysis with Middleware, Infrastructure Admins and Developers
  • Analyzed test results (TPS, hits/second, transaction response time, CPU utilization, etc.) using LoadRunner Analysis and various monitoring tools, and prepared test reports
  • Checked for Network Bottlenecks using Network Delay Time and Vuser Graphs
  • Extensively used Quality Center for Change Management and Defect Tracking.
  • Actively involved in performance tuning, such as JVM/GC tuning of GC parameters and heap sizes
  • Proactively monitored application Logs/GC Logs, VMSTAT, NMON logs and AWR reports.
  • Proactively monitored end client response/behavior using HTTP Watch and Dynatrace
  • Excellent Documentation, Management and Presentation skills for the QA Project Team including Test Plan, Test Case Specs, Test Requirement, Test Case Matrix and Defect Reports.
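
As a rough sketch of the result analysis described above (TPS, response times, percentiles), the arithmetic can be expressed in a few lines of Python; the sample timings below are invented purely for illustration:

```python
# Sketch of load-test result analysis: average response time, 90th
# percentile and TPS from raw transaction timings. Sample data is made up.
timings_s = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.9, 1.2, 1.0]  # response times (s)
test_duration_s = 5.0                                            # wall-clock test time

avg = sum(timings_s) / len(timings_s)

# 90th percentile: sort and take the value below which 90% of samples fall.
ordered = sorted(timings_s)
p90 = ordered[int(0.9 * len(ordered)) - 1]

tps = len(timings_s) / test_duration_s  # completed transactions per second

print(round(avg, 2), p90, tps)
```

In practice these figures come straight out of LoadRunner Analysis graphs; the point of the sketch is only the underlying arithmetic.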

TECHNICAL SKILLS

Testing Tools: LoadRunner 8.0/8.1/9.0/9.1/9.5/11.0/11.3/11.5/12.02/Performance Center, NeoLoad, Silk Performer, Load UI, Soap UI, QuickTest Professional, WinRunner, Wily, Quality Center/Test Director 6.0/7.6/8.0/9.2, SiteScope, Rational ClearQuest, Lotus Notes, Rational Suite.

Operating Systems: Windows XP/2000/NT/2003/2008, Macintosh, UNIX, AIX.

Monitoring Tools: Wily Introscope, Dynatrace, HP SiteScope, SolarWinds, BlueStripe FactFinder, Windows Perfmon, Fiddler

Languages: Java/J2EE, C, C#, .NET, JavaScript, XML, VBScript, 4Test, TSL, PL/SQL

Databases: MS SQL Server 7.0/2000/2005/2008, DB2, Oracle 9i/10g/11i, SQL, PL/SQL.

Web Servers: iPlanet Web Server, Tomcat, IIS

Application Servers: WebLogic, iPlanet, WebSphere, Jakarta Tomcat, JRun

Design Tools: UML, Rational Rose, ERwin 4.0/3.5, Visio, MS Project

ERP: SAP ECC/4.6 B/4.6C, IBM Maximo

Middleware: WebMethods, EDI, JMS

Internet Tools: HTML, JavaScript, VBScript, VBA, JSP, XML, ASP, .NET

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Lead Performance Architect

Responsibilities:

  • Designed and configured Performance Infrastructure (Load generators, Performance Center, Controllers) as per the application environment zone (DMZ, SESF)
  • Worked on planning testing schedules and resources to ensure adequate test coverage for all applications in each release
  • Worked closely with peers in the technology organization to plan and co-ordinate testing activities
  • Escalated critical and blocking issues when necessary and drove them to closure
  • Tracked overall progress and provided regular status updates to senior management
  • Researched, evaluated and implemented new performance testing approaches based on existing needs
  • Identified and managed the implementation of process improvement initiatives
  • Managed and mentored staff, including skill training and career development
  • Performed GUI testing, Functional testing, Integration testing, Regression testing, Ad-hoc testing, Negative testing, End-to-End testing, Load testing and User Acceptance testing on multiple projects.
  • Evaluated business requirements for testing needs and looped in business teams on improvements.
  • Designed and developed Domain/Value Objects.
  • Enabled Transaction demarcation in Data Access Objects.
  • Wrote Queries using SQL.
  • Used sniffer tools like HTTP Watch and Dynatrace to understand/observe end client behavior and for application monitoring.
  • Worked on Mobile Computing to validate mobile user traffic.
  • Used SVN for Source Code Control.
  • Worked on web services using SOAP UI tool to validate the services on application.
  • Verified SQL queries against the backend database to ensure test code retrieved the right data during testing.
  • Scoped performance testing efforts, prioritized test cases and developed solid project milestones.
  • Developed robust benchmark workloads based on production traffic patterns and anticipated growth.
  • Created, analyzed and summarized test results in reports and capacity planning/best practice guides.
  • Developed and reviewed test plans, results analyses and capacity planning guides.
  • Involved in creating various Performance test scripts for testing the web-based applications using Load Runner 11.5, 12.02 and Performance Center with HTTP/HTML, and Web Services protocols.
  • Performed software performance and stress testing against requirements
  • Interacted with the various project teams to find out the end user actions and scenarios.
  • Created and executed performance testing scripts on J2EE and ERP applications
  • Experienced in developing and implementing comprehensive SDLC and test strategy methodologies
  • Experienced in planning and executing large, system-wide test efforts (SIT, Functional, Regression, UAT, Performance, etc.).
  • Coordinates and facilitates meetings with developers, administrators, Architecture teams.
  • Involved in UI JVM parameter validation/tuning.
  • Involved in Memory, Cores/JVM validation/tuning.
  • Involved in Garbage Collector (GC) tuning process.
  • Prepared Load Runner automation scripts and validated with appropriate data inputs.
  • Prepared different Load Runner scenarios as per test plan.
  • Extensive experience in working with both agile and waterfall methodologies.
  • Organized and maintained information within the assigned test management system.
  • Proactively monitored application Logs/GC Logs, VMSTAT, NMON logs and AWR reports.
  • Involved in writing detailed Performance Test Plans, Test Scripts and Test Cases based on requirements.
  • Executed SOAP, API and RESTful web service tests using LoadRunner
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 12.0 and 12.02 with HTTP/HTML, Mobile App, API, Web Services and RTE (mainframe) protocols.
  • Performed workflow analysis and recommended process and quality improvements.
  • Prepared a concrete load model using LoadRunner scenarios so that the applied load matched production metrics.
  • Analyzed application logs during root cause analysis.
  • Reported activities and results to senior management
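
Several bullets above describe writing SQL queries and verifying backend data during testing. A minimal sketch of that pattern, using Python's sqlite3 in place of the Oracle database named in this resume (the table, column names and counts are all hypothetical):

```python
import sqlite3

# Backend data validation sketch: run the same aggregate the application
# reports, then compare the database's answer with the application's.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
cur.executemany("INSERT INTO orders (status) VALUES (?)",
                [("SHIPPED",), ("SHIPPED",), ("PENDING",)])
conn.commit()

# Count reported by the application under test (hypothetical figure).
app_reported_shipped = 2

cur.execute("SELECT COUNT(*) FROM orders WHERE status = 'SHIPPED'")
db_shipped = cur.fetchone()[0]

print(db_shipped == app_reported_shipped)  # True when UI and DB agree
conn.close()
```

Against Oracle the connection and driver differ, but the validation pattern (independent query, then comparison) is the same.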

Environment: Java, .NET, WebLogic, Web Services, AIX, XML, LoadRunner, Performance Center, SOAP UI, Wily Introscope, Quality Center, Oracle, SQL, Dynatrace, HP SiteScope, HP Diagnostics, Mobile Computing, Windows Perfmon, JMeter

Confidential, Chattanooga, TN

Sr Performance Engineering Lead

Responsibilities:

  • Effectively managed performance and automation engineers, allowing them to be impactful in their roles.
  • Created a technical strategy for future test, performance and QTP frameworks and worked with the team to implement it
  • Worked on planning testing schedules and resources to ensure adequate test coverage for all applications in each release
  • Worked closely with peers in the technology organization to plan and co-ordinate testing activities
  • Researched, evaluated and implemented new performance testing approaches based on existing needs
  • Identified and managed the implementation of process improvement initiatives
  • Managed and mentored staff, including skill training and career development
  • Performed GUI testing, Functional testing, Integration testing, Regression testing, Ad-hoc testing, Negative testing, End-to-End testing, Load testing and User Acceptance testing on multiple projects.
  • Worked on Descriptive Programming to create user-defined objects in QTP
  • Scoped performance testing efforts, prioritized test cases and developed solid project milestones.
  • Provided technical expertise and leadership to major performance testing tasks
  • Created QTP frameworks in line with the technical strategy.
  • Worked on setting up Object repositories in QTP
  • Involved in creating various Performance test scripts for testing the web-based applications using Load Runner 11.5, 12.02 and Performance Center with HTTP/HTML, and Web Services protocols.
  • Created and executed performance testing scripts on J2EE and ERP applications
  • Experienced in developing and implementing comprehensive SDLC and test strategy methodologies
  • Performed requirements analyses, risk assessments, issue analyses and software/hardware development with emphasis on analysis of user requirements, test design and test execution.
  • Created and maintained documentation on test requirements, cases, scripts and execution, ensuring completeness, accuracy and correctness.
  • Experienced in planning and executing large, system-wide test efforts (SIT, Functional, Regression, UAT, Performance, etc.).
  • Involved in UI JVM parameter validation/tuning.
  • Involved in Memory, Cores/JVM validation/tuning.
  • Involved in Garbage Collector (GC) tuning process.
  • Prepared Load Runner automation scripts and validated with appropriate data inputs.
  • Prepared different Load Runner scenarios as per test plan.
  • Extensive experience in working with both agile and waterfall methodologies.
  • Organized and maintained information within the assigned test management system.
  • Proactively monitored application Logs/GC Logs, VMSTAT, NMON logs and AWR reports.
  • Involved in writing detailed Performance Test Plans, Test Scripts and Test Cases based on requirements.
  • Executed SOAP, API and RESTful web service tests using LoadRunner
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 9.5, 11.0.3, 11.5 and 12.02 with HTTP/HTML, Mobile App, API, Web Services and RTE (mainframe) protocols.
  • Performed workflow analysis and recommended process and quality improvements.
  • Prepared a concrete load model using LoadRunner scenarios so that the applied load matched production metrics.
  • Analyzed application logs during root cause analysis.
  • Reported activities and results to senior management
  • Strong interpersonal and communication skills with a track record of motivating and developing team leaders and team players.
  • Proven hands-on experience with quality assurance practices, including project plan development, test strategy development, test plan development, test case & test data review, and test automation.
  • Creative problem solver with advanced analytical, planning, and scheduling skills with a focus on timely delivery with utmost quality.
  • Working experience with HP tools.

Environment: Java, .NET, WebLogic, Web Services, AIX, XML, LoadRunner, SOAP UI, Wily Introscope, Quality Center, Oracle, SQL, Dynatrace, HP SiteScope, HP Diagnostics, Mobile Computing, Windows Perfmon, JMeter

Confidential, Jersey City, NJ

Performance Architect

Responsibilities:

  • Led implementation of pilot systems and proofs of concept.
  • Created new instrumentation process to support Mobile Technology
  • Implemented successful POCs on various emerging mobile technology tools and established processes for different scenarios on the Mobile Platform. Effectively led and participated in cross-functional meetings, clearly articulated project risk, and provided concise recommendations to the Development and Operations teams
  • Involved in AGILE Methodology for the software development process.
  • Worked with Agile, Scrum methodology to ensure delivery of high quality work
  • Extensively worked on establishing the benchmark results for various 3rd party Vendors.
  • Partnered with subject matter experts in Different components in the Infrastructure technology stack in areas including LAN, WAN, UNIX, LINUX, AIX, VM and Application deployment technologies
  • Led certification of image processing servers from various vendors to company standards.
  • Involved in Root Cause Analysis with various teams in bug fixes.
  • Participated in Agile planning sessions
  • Participated in SOS meetings (Scrum of Scrums)
  • Participated in Demos and Retrospectives
  • Evaluated and implemented commercial and open source tools for use by the team
  • Evaluated new hardware and commercial software for use in resolving potential issues
  • Identified bottlenecks and single points of failure in the architecture
  • Worked with leads in development, architecture, operations and product management to identify, prioritize and resolve potential future and current issues around performance, latency, availability, accuracy and scalability.
  • Coordinated and communicated with Business to gather requirements.
  • Prepared and reviewed performance test scripts in JMeter
  • Evaluated business requirements for testing needs and looped in business teams on improvements.
  • Designed and developed Domain/Value Objects.
  • Enabled Transaction demarcation in Data Access Objects.
  • Wrote Queries using SQL.
  • Used sniffer tools like HTTP Watch and Dynatrace to understand/observe end client behavior and for application monitoring.
  • Worked on Mobile Computing to validate mobile user traffic.
  • Used SVN for Source Code Control.
  • Worked on web services using SOAP UI tool to validate the services on application.
  • Verified SQL queries against the backend database to ensure test code retrieved the right data during testing.
  • Performed GUI testing, Functional testing, Integration testing, Regression testing, Ad-hoc testing, Negative testing, End-to-End testing, Load testing and User Acceptance testing on multiple projects.
  • Scoped performance testing efforts, prioritized test cases and developed solid project milestones.
  • Developed robust benchmark workloads based on production traffic patterns and anticipated growth.
  • Created, analyzed and summarized test results in reports and capacity planning/best practice guides.
  • Developed and reviewed test plans, results analyses and capacity planning guides.
  • Reproduce critical customer situations requiring special performance tests or simulations.
  • Proactively monitored critical business applications/services and provided tuning information as needed
  • Proactively monitored application Logs/GC Logs, VMSTAT, NMON logs and AWR reports.
  • Involved in writing detailed Performance Test Plans, Test Scripts and Test Cases based on requirements.
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 9.5, 11.0.3 and 11.5 with HTTP/HTML, Mobile App and Web Services protocols.
  • Interacted with the various project teams to find out the end user actions and scenarios.

Environment: Java, .NET, WebLogic, Web Services, AIX, XML, LoadRunner, NeoLoad, SOAP UI, Wily Introscope, Quality Center, Oracle, SQL, Dynatrace, HP SiteScope, HP Diagnostics, Mobile Computing, Windows Perfmon, JMeter

Confidential, Tampa, FL

Performance Analyst/QA Analyst

Responsibilities:

  • Defined the test scenarios and made sure that scripts worked according to the planned scenarios.
  • Generated different loads at different increments, starting from 50 users and ramping up to 1,000 users until CPU utilization reached 100%.
  • Extensively worked on Virtual User Generator (VuGen), creating scripts and enhancing them through correlation, parameterization and debugging.
  • Involved in developing Performance scripts Using SAP GUI for SD module.
  • Configured LoadRunner to monitor various performance parameters of database, application and web servers - e.g., CPU usage, total sessions created, current open session count, and number of open users on the database.
  • Heavily involved in designing load models for all the projects undertaken here, which involved many calculations for accurate balancing of the load.
  • Extensively used NT Performance Monitor to analyze system bottlenecks like memory leaks and CPU utilization, as well as network bottlenecks.
  • Verified firewall checkpoints with the HP Diagnostics tool.
  • Involved in developing and maintaining scalable, reusable performance test scripts using Load Runner.
  • Extensively used VuGen to create and debug scripts.
  • Debugged and enhanced the Load Runner Scripts using C Language.
  • Parameterized the scripts and enhanced large files according to the test cases.
  • Provided multiple sets of data during test execution and used the data randomly, sequentially and uniquely.
  • Uploaded scripts, created timeslots, created scenarios and ran load tests in the Controller.
  • Added and monitored Web Logic, App server and Windows servers during performance testing by using Site Scope.
  • Developed and maintained test strategies, processes and standards.
  • Involved in the performance testing of a number of Apps Running on a variety of platforms ranging from legacy systems to Web (JAVA J2EE, Microsoft.NET).
  • Used Controller to perform Load Test, Longevity test and Stress Test.
  • Performed load testing to test the behavior of the DB servers, web servers and application servers.
  • Analyzed Windows resource utilization, viz. CPU and memory impact on the application and server, when there was a change in load and environment.
  • Analyzed average CPU usage, response time and TPS, and analyzed Web Page Breakdown graphs to pinpoint response time problems.
  • Performed Base Line test for comparison with actual Load Test.
  • Performed back-end testing by querying the database with complex SQL queries and validated the data populated in tables.
  • Identified defects, assess root cause, and prepared detailed information for developers and business stakeholders
  • Responsible for preparing the defect status report and project status report every week.
  • Identified performance bottlenecks in the Web, Middleware and databases, by configuring Site scope monitor on servers, and using detailed analysis.
  • Conducted performance testing and coordinated monitoring as a joint activity, with DBAs and application developers monitoring server health.
  • Designed and executed scenarios in Performance Center and used the Controller to perform load and stress tests.
  • Used Performance Monitor and LoadRunner graphs to analyze the results.
  • Identified bottlenecks and reported violations against the SLAs.
  • Worked with the development team, users and support groups to understand the application architecture and used current production issues to simulate the best possible real-time scenarios for load and stress testing.
  • Communicated with and led offshore teams, scheduling regular status meetings.
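
The load-model calculations mentioned above can be sketched with Little's Law: concurrent virtual users ≈ target throughput × (response time + think time). All figures below are hypothetical, for illustration only:

```python
# Load-model arithmetic (Little's Law):
#   concurrent_users = throughput * (response_time + think_time)
# Figures are invented; real values come from production metrics.

target_tps = 50.0     # target transactions per second from production metrics
avg_response_s = 1.5  # average transaction response time (seconds)
think_time_s = 8.5    # scripted think time between transactions (seconds)

concurrent_users = target_tps * (avg_response_s + think_time_s)
print(int(concurrent_users))  # 500 vusers needed to sustain 50 TPS
```

In a LoadRunner scenario the result feeds directly into the vuser count and pacing settings so the applied load matches the production figures being modeled.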

Environment: LoadRunner, Silk Performer, Winsock, HP Quality Center, HP Performance Center, SAP ECC, BI, BW, BW Portal, MM, SD, XML, SAP GUI, BEA WebLogic, J2EE Diagnostics, Web Services, JavaScript, IIS, COM+, CA Wily Introscope, Oracle 10g/9i, SCM, F&R, FI, SAP NetWeaver, MDM, webMethods and Client.

Confidential, Dallas, TX

QA Analyst

Responsibilities:

  • Prepared Test Plan and Test Cases based on the business requirements.
  • Involved in conversion testing to convert data from existing systems for use in replacement systems.
  • Identified the test requirements based on application business requirements and blueprints.
  • Involved in analyzing the applications and development of test cases
  • Involved in doing System testing of the entire applications along with team members
  • Tested the applications manually.
  • Analyzed and reviewed the software requirements, functional specifications and design documents.
  • Proficient in QA processes and test strategies, with experience in creating documents like Test Plans and Test Procedures.
  • Developed test scenarios and test procedures based on the test requirements.
  • Participated in Preparing Test Plans.
  • Used LoadRunner for load and performance testing and analysis of various reports.
  • Created scripts that randomized test data and parsed responses for session ID information in order to dynamically update virtual user scripts while running load tests.
  • Configured different Run Time Settings in Controller for various scripts.
  • Performed database validations using SQL.
  • Performed resource analysis (defined number of generators, hardware and software requirements, and number of machines for stress testing).
  • Used Toad to access database and performed functional Database Testing.
  • Created and designed different scenarios to best simulate the real world conditions.
  • Monitored server resources for different scenarios and end user response time for various transactions.
  • Used Analysis to check the performance using various graphs and reports.
  • Interacted with Developers to report and track Defects using various defect tracking tools like Quality Center, Clear Quest.
  • Wrote SQL queries and stored procedures to validate data.
  • Documented errors and implemented their resolutions.
  • Created test scripts, executed test scripts.
  • Developed Test Objectives and test Procedures.
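
The correlation step described above (parsing responses for session-ID information and feeding it into later virtual-user requests) can be sketched in Python; the response body and URL shape here are hypothetical:

```python
import re

# Correlation sketch: extract a dynamic session ID from a server response
# and substitute it into the next request, as a load script would.
response_body = '<input type="hidden" name="sessionId" value="A1B2C3D4">'

match = re.search(r'name="sessionId" value="([^"]+)"', response_body)
session_id = match.group(1) if match else None

next_request = f"/checkout?sessionId={session_id}"
print(next_request)  # /checkout?sessionId=A1B2C3D4
```

In LoadRunner the same idea is expressed with `web_reg_save_param` before the request and a `{param}` placeholder afterwards; the regex above just makes the mechanism explicit.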

Environment: Oracle 7.1, Visual Basic 5.0, Manual Testing, WinRunner 5.01, Test Director 5.0, LoadRunner, Rational ClearQuest, Quality Center, BAC, VBScript, .NET, JavaScript, SQL, Java/J2EE, JSP, IIS, XML/XSLT, Oracle, Windows 2000, UNIX, Solaris, Visio, UML, Rational RequisitePro.

Confidential

Quality Analyst

Responsibilities:

  • Reviewed the Business Requirement Document, System Requirement Specifications and use cases in the initial phase of development
  • Involved in preparing Test Cases, Test Script based on Business Requirements Document (BRD).
  • Developed test cases to test the functionality and interface of the application
  • Performed Manual Testing on different modules of the application.
  • Planned and managed tests using Test Director.
  • Wrote SQL Queries to retrieve the data from various Tables and to test the database.
  • Tracked and reported bugs using Test Director.

Environment: Manual Testing, VB, ASP, Test Director 5.0, SQL, Windows 98.
