
Lead Performance Architect Resume


Dallas, TX

SUMMARY

  • 10 years of experience in the IT industry, mainly in designing, developing, and integrating components for robust web-based and client/server software applications.
  • Hands-on experience in the design, development, and execution of performance test plans, as well as test strategies for functional, automated, and performance testing.
  • Ability to identify problems, analyze test results, investigate root causes, and suggest remedies.
  • Extensive knowledge of the Software Development Life Cycle (SDLC), Performance Test Life Cycle, Scrum, and Agile, with a thorough understanding of phases such as Requirements, Analysis/Design, Development, and Testing.
  • QA experience in manual, automated, and load testing of web and client-server applications.
  • Worked extensively with Mercury Interactive testing tools: LoadRunner, SoapUI, WinRunner, QuickTest Pro (QTP), Quality Center, and Test Director (TD).
  • Involved in writing Test Plans and Test Methodologies (Scrum).
  • Installed and configured LoadRunner and load generators.
  • Performed system, functional, performance/load, compatibility, regression, integration, positive, negative, and black-box testing.
  • Created automated test case scenarios using QTP and WinRunner.
  • Prepared and reviewed performance test scripts in JMeter.
  • Used SQL queries to interact with databases.
  • Tracked defects using Test Director (TD) and Quality Center (QC).
  • Knowledge of all stages of the Software Development Life Cycle (SDLC), from definition and initiation to deployment and support.
  • Hands-on experience troubleshooting Windows Server and desktop-based systems.
  • Hands-on experience with LoadRunner protocols such as Web (HTTP/HTML), Web Services, SAP GUI, and SAP Web; some hands-on experience with other tools: Silk Performer, NeoLoad, SoapUI, QuickTest Pro, Wily Introscope, and Rational tools.
  • Performed SOA testing using web services.
  • Employed Agile and Waterfall methodologies, especially Scrum, to ensure rapid, iterative software development.
  • Performance testing expertise in developing performance test plans, test strategies, load modeling, performance metrics, and performance analysis.
  • Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
  • Sound knowledge of creating load, stress, and performance tests using LoadRunner.
  • Experience in monitoring servers using tools such as SiteScope, Wily Introscope, SolarWinds, FactFinder, and Dynatrace.
  • Worked on mobile computing to validate mobile user traffic.
  • Actively involved in application performance monitoring using BSM, BPM, SiteScope, Dynatrace, and Fiddler to observe end-client behavior.
  • Involved in root cause analysis with middleware teams, infrastructure admins, and developers.
  • Analyzed test results (TPS, hits/second, transaction response time, CPU utilization, etc.) using LoadRunner Analysis and various monitoring tools, and prepared test reports.
  • Checked for network bottlenecks using Network Delay Time and Vuser graphs.
  • Extensively used Quality Center for change management and defect tracking.
  • Actively involved in performance tuning, such as JVM/GC tuning of GC parameters and heap size.
  • Proactively monitored application logs/GC logs, vmstat, nmon logs, and AWR reports.
  • Proactively monitored end-client response/behavior using HttpWatch and Dynatrace.
  • Excellent documentation, management, and presentation skills for the QA project team, covering the Test Plan, Test Case Specs, Test Requirements, Test Case Matrix, and Defect Reports.

TECHNICAL SKILLS

Testing Tools: LoadRunner 8.0/8.1/9.0/9.1/9.5/11.0/11.3/11.5/12.02, Performance Center, NeoLoad, Silk Performer, Load UI, SoapUI, QuickTest Professional, WinRunner, Wily, Quality Center/Test Director 6.0/7.6/8.0/9.2, SiteScope, Rational ClearQuest, Lotus Notes, Rational Suite.

Operating Systems: Windows XP/2000/NT/2003/2008, Macintosh, UNIX, AIX.

Monitoring Tools: Wily Introscope, Dynatrace, HP SiteScope, SolarWinds, BlueStripe FactFinder, Windows Perfmon, Fiddler

Languages: Java/J2EE, C, C#, .NET, JavaScript, XML, VBScript, 4Test, TSL, PL/SQL

Databases: MS SQL Server 7.0/2000/2005/2008, DB2, Oracle 9i/10g/11i, SQL, PL/SQL.

Web Servers: iPlanet Web Server, Tomcat, IIS

Application Servers: WebLogic, iPlanet, WebSphere, Jakarta Tomcat, JRun

Design Tools: UML, Rational Rose, ERwin 4.0/3.5, Visio, MS Project

ERP: SAP ECC/4.6B/4.6C, IBM Maximo

Middleware: WebMethods, EDI, JMS

Internet Tools: HTML, JavaScript, VBScript, VBA, JSP, XML, ASP, .NET

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Lead Performance Architect

Responsibilities:

  • Designed and configured performance infrastructure (load generators, Performance Center, controllers) per the application environment zone (DMZ, SESF).
  • Worked on planning testing schedules and resources to ensure adequate test coverage for all applications in each release.
  • Worked closely with peers in the technology organization to plan and coordinate testing activities.
  • Escalated critical and blocking issues when necessary and drove them to closure.
  • Tracked overall progress and provided regular status updates to senior management.
  • Researched, evaluated, and implemented new performance testing based on existing needs.
  • Identified and managed the implementation of process improvement initiatives.
  • Managed and mentored staff, including skill training and career development.
  • Performed GUI, functional, integration, regression, ad-hoc, negative, end-to-end, load, and user acceptance testing on multiple projects.
  • Evaluated business requirements for testing needs and coordinated with business teams on improvements.
  • Designed and developed domain/value objects.
  • Enabled transaction demarcation in data access objects.
  • Wrote queries using SQL.
  • Used sniffer tools such as HttpWatch and Dynatrace to observe end-client behavior and monitor the application.
  • Worked on mobile computing to validate mobile user traffic.
  • Used SVN for source code control.
  • Worked on web services, using the SoapUI tool to validate the application's services.
  • Verified SQL queries against the backend database to ensure tests retrieved the right data.
  • Involved in scoping performance testing efforts, prioritizing test cases, and developing solid project milestones.
  • Developed robust benchmark workloads based on production traffic patterns and anticipated growth.
  • Created, analyzed, and summarized test results in reports and capacity planning/best practice guides.
  • Developed and reviewed test plans, results analyses, and capacity planning guides.
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 11.5, 12.02, and Performance Center with the HTTP/HTML and Web Services protocols.
  • Performed software performance and stress testing against requirements.
  • Interacted with the various project teams to identify end-user actions and scenarios.
  • Involved in creating and executing performance testing scripts on J2EE and/or ERP applications.
  • Experienced in developing and implementing comprehensive SDLC and test strategy methodologies.
  • Experienced in planning and executing tests for large system-wide testing (SIT, functional, regression, UAT, performance, etc.).
  • Coordinated and facilitated meetings with developers, administrators, and architecture teams.
  • Involved in UI JVM parameter validation/tuning.
  • Involved in memory and cores/JVM validation/tuning.
  • Involved in the Garbage Collector (GC) tuning process.
  • Prepared LoadRunner automation scripts and validated them with appropriate data inputs.
  • Prepared different LoadRunner scenarios per the test plan.
  • Extensive experience working with both Agile and Waterfall methodologies.
  • Organized and maintained information within the assigned test management system.
  • Proactively monitored application logs/GC logs, vmstat, nmon logs, and AWR reports.
  • Involved in writing detailed performance test plans, test scripts, and test cases based on requirements.
  • Executed SOAP, API, and RESTful web service tests using LoadRunner.
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 12.0 and 12.02 with the HTTP/HTML, Mobile App, API, Web Services, and RTE (mainframe) protocols.
  • Performed workflow analysis and recommended process and quality improvements.
  • Prepared a concrete load model using LoadRunner scenarios so that it would apply the exact load per production metrics.
  • Analyzed application logs during root cause analysis.
  • Reported activities and results to senior management.
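The load-modeling work described above can be illustrated with a minimal sketch. Little's Law relates throughput, response time, and think time to the virtual-user count a scenario needs; the metric values below are hypothetical, not taken from any actual engagement:

```python
# Minimal load-model sketch using Little's Law: N = X * (R + Z),
# where X = throughput (transactions/sec), R = response time (sec),
# and Z = think time (sec). All values below are illustrative only.
import math

def required_vusers(tps: float, resp_time_s: float, think_time_s: float) -> int:
    """Virtual users needed to sustain `tps` with the given timings."""
    return math.ceil(tps * (resp_time_s + think_time_s))

# Example: 40 TPS observed in production, 1.5 s avg response, 8 s think time
vusers = required_vusers(40, 1.5, 8)
print(vusers)  # 380
```

In practice the resulting user count is fed into the LoadRunner Controller scenario as the target Vuser load, with pacing adjusted to hold the production throughput.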

Environment: Java, .NET, WebLogic, Web Services, AIX, XML, LoadRunner, Performance Center, SoapUI, Wily Introscope, Quality Center, Oracle, SQL, Dynatrace, HP SiteScope, HP Diagnostics, Mobile Computing, Windows Perfmon, JMeter

Confidential, Chattanooga, TN

Sr Performance Engineering Lead

Responsibilities:

  • Effectively managed performance and automation engineers, enabling them to be impactful in their roles.
  • Created the technical strategy for future test, performance, and QTP frameworks, and worked with the team to implement that strategy.
  • Worked on planning testing schedules and resources to ensure adequate test coverage for all applications in each release.
  • Worked closely with peers in the technology organization to plan and coordinate testing activities.
  • Researched, evaluated, and implemented new performance testing based on existing needs.
  • Identified and managed the implementation of process improvement initiatives.
  • Managed and mentored staff, including skill training and career development.
  • Performed GUI, functional, integration, regression, ad-hoc, negative, end-to-end, load, and user acceptance testing on multiple projects.
  • Worked on descriptive programming to create user-defined objects in QTP.
  • Involved in scoping performance testing efforts, prioritizing test cases, and developing solid project milestones.
  • Provided technical expertise and leadership on major performance testing tasks.
  • Created QTP frameworks using the technical strategies.
  • Worked on setting up object repositories in QTP.
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 11.5, 12.02, and Performance Center with the HTTP/HTML and Web Services protocols.
  • Involved in creating and executing performance testing scripts on J2EE and/or ERP applications.
  • Experienced in developing and implementing comprehensive SDLC and test strategy methodologies.
  • Performed requirements analyses, risk assessments, issue analyses, and software/hardware development, with emphasis on analysis of user requirements, test design, and test execution.
  • Created and maintained documentation on test requirements, cases, scripts, and execution; ensured completeness, accuracy, and correctness.
  • Experienced in planning and executing tests for large system-wide testing (SIT, functional, regression, UAT, performance, etc.).
  • Involved in UI JVM parameter validation/tuning.
  • Involved in memory and cores/JVM validation/tuning.
  • Involved in the Garbage Collector (GC) tuning process.
  • Prepared LoadRunner automation scripts and validated them with appropriate data inputs.
  • Prepared different LoadRunner scenarios per the test plan.
  • Extensive experience working with both Agile and Waterfall methodologies.
  • Organized and maintained information within the assigned test management system.
  • Proactively monitored application logs/GC logs, vmstat, nmon logs, and AWR reports.
  • Involved in writing detailed performance test plans, test scripts, and test cases based on requirements.
  • Executed SOAP, API, and RESTful web service tests using LoadRunner.
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 9.5, 11.0.3, 11.5, and 12.02 with the HTTP/HTML, Mobile App, API, Web Services, and RTE (mainframe) protocols.
  • Performed workflow analysis and recommended process and quality improvements.
  • Prepared a concrete load model using LoadRunner scenarios so that it would apply the exact load per production metrics.
  • Analyzed application logs during root cause analysis.
  • Reported activities and results to senior management.
  • Strong interpersonal and communication skills, with a track record of motivating and developing team leaders and team players.
  • Proven hands-on experience with quality assurance practices, including project plan development, test strategy development, test plan development, test case and test data review, and test automation.
  • Creative problem solver with advanced analytical, planning, and scheduling skills, focused on timely delivery with utmost quality.
  • Working experience with HP tools.
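The GC-tuning and GC-log monitoring bullets above involve scanning pause times out of collector logs. As a minimal sketch, assuming a simplified, hypothetical log format (not any specific JVM's exact output), pause extraction and long-pause flagging might look like:

```python
import re

# GC-log monitoring sketch: pull pause times out of log lines and flag long
# pauses. The log format and threshold below are illustrative assumptions.
log_lines = [
    "[GC pause (young) 120.5ms]",
    "[GC pause (full) 2350.0ms]",
]

PAUSE_RE = re.compile(r"\(([a-z]+)\)\s+([\d.]+)ms")
long_pauses = []
for line in log_lines:
    m = PAUSE_RE.search(line)
    if m and float(m.group(2)) > 1000.0:  # flag pauses over 1 second
        long_pauses.append((m.group(1), float(m.group(2))))

print(long_pauses)  # [('full', 2350.0)]
```

A recurring long full-GC pause like the one flagged here is the typical trigger for revisiting heap sizing and GC parameters.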

Environment: Java, .NET, WebLogic, Web Services, AIX, XML, LoadRunner, SoapUI, Wily Introscope, Quality Center, Oracle, SQL, Dynatrace, HP SiteScope, HP Diagnostics, Mobile Computing, Windows Perfmon, JMeter

Confidential, Jersey City, NJ

Performance Architect

Responsibilities:

  • Led the implementation of pilot systems and proofs of concept.
  • Created a new instrumentation process to support mobile technology.
  • Implemented successful POCs on various emerging mobile technology tools and established processes for different scenarios on the mobile platform.
  • Effectively led and participated in cross-functional meetings, clearly articulated project risk, and provided concise recommendations to the Development and Operations teams.
  • Involved in the Agile methodology for the software development process.
  • Worked with the Agile/Scrum methodology to ensure delivery of high-quality work.
  • Extensively worked on establishing benchmark results for various third-party vendors.
  • Partnered with subject matter experts on different components of the infrastructure technology stack, in areas including LAN, WAN, UNIX, Linux, AIX, VM, and application deployment technologies.
  • Led the certification of image processing servers from various vendors to company standards.
  • Involved in root cause analysis with various teams on bug fixes.
  • Participated in Agile planning sessions.
  • Participated in SOS (Scrum of Scrums) meetings.
  • Participated in demos and retrospectives.
  • Evaluated and implemented commercial and open-source tools for use by the team.
  • Evaluated new hardware and commercial software for use in resolving potential issues.
  • Identified bottlenecks and single points of failure in the architecture.
  • Worked with leads in development, architecture, operations, and product management to identify, prioritize, and resolve potential future and current issues around performance, latency, availability, accuracy, and scalability.
  • Coordinated and communicated with the business to gather requirements.
  • Prepared and reviewed performance test scripts in JMeter.
  • Evaluated business requirements for testing needs and coordinated with business teams on improvements.
  • Designed and developed domain/value objects.
  • Enabled transaction demarcation in data access objects.
  • Wrote queries using SQL.
  • Used sniffer tools such as HttpWatch and Dynatrace to observe end-client behavior and monitor the application.
  • Worked on mobile computing to validate mobile user traffic.
  • Used SVN for source code control.
  • Worked on web services, using the SoapUI tool to validate the application's services.
  • Verified SQL queries against the backend database to ensure tests retrieved the right data.
  • Performed GUI, functional, integration, regression, ad-hoc, negative, end-to-end, load, and user acceptance testing on multiple projects.
  • Involved in scoping performance testing efforts, prioritizing test cases, and developing solid project milestones.
  • Developed robust benchmark workloads based on production traffic patterns and anticipated growth.
  • Created, analyzed, and summarized test results in reports and capacity planning/best practice guides.
  • Developed and reviewed test plans, results analyses, and capacity planning guides.
  • Reproduced critical customer situations requiring special performance tests or simulations.
  • Proactively monitored critical business applications/services and provided tuning information as needed.
  • Proactively monitored application logs/GC logs, vmstat, nmon logs, and AWR reports.
  • Involved in writing detailed performance test plans, test scripts, and test cases based on requirements.
  • Involved in creating various performance test scripts for testing web-based applications using LoadRunner 9.5, 11.0.3, and 11.5 with the HTTP/HTML, Mobile App, and Web Services protocols.
  • Interacted with the various project teams to identify end-user actions and scenarios.
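The backend-validation bullets above compare what the application reports against a direct SQL query. As a minimal sketch, with sqlite3 standing in for the real backend (e.g. Oracle) and a hypothetical `orders` table:

```python
import sqlite3

# Backend-validation sketch: compare an application-reported count against
# a direct SQL query. sqlite3 stands in for the actual backend database;
# the table, rows, and reported value are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders (status) VALUES (?)",
                 [("OPEN",), ("OPEN",), ("CLOSED",)])

app_reported_open_orders = 2  # value the UI/API under test displayed
(db_count,) = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'OPEN'").fetchone()

assert db_count == app_reported_open_orders, "UI count disagrees with backend"
print(db_count)  # 2
```

The same pattern scales to any field: run the query the application should be running, and assert the retrieved data matches what the test observed on screen.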

Environment: Java, .NET, WebLogic, Web Services, AIX, XML, LoadRunner, NeoLoad, SoapUI, Wily Introscope, Quality Center, Oracle, SQL, Dynatrace, HP SiteScope, HP Diagnostics, Mobile Computing, Windows Perfmon, JMeter

Confidential, Tampa, FL

Performance Analyst/QA Analyst

Responsibilities:

  • Defined test scenarios and ensured that scripts worked according to the planned scenario.
  • Generated different loads at different increments, starting from 50 users and ramping up to 1,000 users until CPU utilization reached 100%.
  • Extensively worked on the Virtual User Generator (VuGen): created scripts and enhanced them through correlation, parameterization, and debugging.
  • Involved in developing performance scripts using SAP GUI for the SD module.
  • Configured LoadRunner to monitor various performance parameters of database, application, and web servers, e.g., CPU usage, total sessions created, current open session count, and number of open users on the database.
  • Heavily involved in designing load models for all the projects undertaken; this involved many calculations for accurate balancing of the load.
  • Extensively used NT Performance Monitor to analyze system bottlenecks such as memory leaks and CPU utilization, as well as network bottlenecks.
  • Verified firewall checkpoints with the HP Diagnostics tool.
  • Involved in developing and maintaining scalable, reusable performance test scripts using LoadRunner.
  • Extensively used VuGen to create and debug scripts.
  • Debugged and enhanced the LoadRunner scripts using the C language.
  • Parameterized the scripts and enhanced large data files according to the test cases.
  • Provided multiple sets of data during test execution and used the data randomly, sequentially, and uniquely.
  • Uploaded scripts, created timeslots, created scenarios, and ran load tests in the Controller.
  • Added and monitored WebLogic, application, and Windows servers during performance testing using SiteScope.
  • Developed and maintained the test strategies, processes, and standards.
  • Involved in the performance testing of a number of applications running on a variety of platforms, ranging from legacy systems to the web (Java/J2EE, Microsoft .NET).
  • Used the Controller to perform load, longevity, and stress tests.
  • Performed load testing to test the behavior of the DB, web, and application servers.
  • Analyzed Windows resource utilization, viz. CPU and memory impact on the application and server, when there was a change in load or environment.
  • Analyzed average CPU usage, response time, and TPS, and also analyzed Web Page Breakdown graphs to pinpoint response time problems.
  • Performed a baseline test for comparison with the actual load test.
  • Performed back-end testing by querying the database with complex SQL queries and validating the data populated in tables.
  • Identified defects, assessed root causes, and prepared detailed information for developers and business stakeholders.
  • Responsible for producing the defect status report and project status report every week.
  • Identified performance bottlenecks in the web, middleware, and database tiers by configuring SiteScope monitors on servers and through detailed analysis.
  • Conducted performance testing and coordinated monitoring as a joint activity, with the DBA and application developers monitoring server health.
  • Designed and executed scenarios in Performance Center and used the Controller to perform load and stress tests.
  • Used Performance Monitor and LoadRunner graphs to analyze the results.
  • Identified bottlenecks and reported conflicts against the SLAs.
  • Worked with the development team, users, and support groups to understand the application architecture, and used current production issues to simulate the best possible real-time scenarios for load and stress testing.
  • Communicated with and led offshore teams; scheduled meetings to discuss progress.
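The incremental ramp-up described above (50 users stepped up to 1,000) corresponds to a simple schedule that a Controller scenario would apply. A minimal sketch, with the 50-user step size as an illustrative assumption:

```python
# Ramp-up schedule sketch: step the load from 50 to 1,000 virtual users,
# as in the scenario described above. The step size is illustrative.

def ramp_schedule(start: int, end: int, step: int) -> list[int]:
    """User counts at each ramp step, inclusive of start and end."""
    schedule = list(range(start, end, step))
    if schedule[-1] != end:
        schedule.append(end)
    return schedule

steps = ramp_schedule(50, 1000, 50)
print(steps[0], steps[-1], len(steps))  # 50 1000 20
```

At each step the test holds the load while server-side counters (CPU, memory, sessions) are sampled, stopping when a resource saturates, here, when CPU hit 100%.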

Environment: LoadRunner, Silk Performer, Winsock, HP Quality Center, HP Performance Center, SAP ECC, BI, BW, BW Portal, MM, SD, XML, SAP GUI, BEA WebLogic, J2EE Diagnostics, Web Services, JavaScript, IIS, COM+, CA Wily Introscope, Oracle 10g/9i, SCM, F&R, FI, SAP NetWeaver, MDM, webMethods and Client.

Confidential, Dallas, TX

QA Analyst

Responsibilities:

  • Prepared the Test Plan and Test Cases based on the business requirements.
  • Involved in conversion testing to convert data from existing systems for use in replacement systems.
  • Identified the test requirements based on application business requirements and blueprints.
  • Performed manual testing and maintained documentation on different types of testing, viz. positive, negative, regression, integration, system, user acceptance, performance, and black box.
  • Involved in analyzing the applications and developing test cases.
  • Involved in system testing of the entire application along with team members; the applications were tested manually.
  • Analyzed and reviewed the software requirements, functional specifications, and design documents.
  • Proficient in QA processes and test strategies, with experience creating documents such as the test plan and test procedures.
  • Developed test scenarios and test procedures based on the test requirements.
  • Participated in preparing test plans.
  • Used LoadRunner for load and performance testing and for analysis of various reports.
  • Created scripts that randomized test data and parsed responses for session ID information in order to dynamically update virtual user scripts while running load tests.
  • Configured different run-time settings in the Controller for various scripts.
  • Performed database validations using SQL.
  • Performed resource analysis (defined the number of generators, hardware and software requirements, and number of machines for stress testing).
  • Used Toad to access the database and performed functional database testing.
  • Created and designed different scenarios to best simulate real-world conditions.
  • Monitored server resources for different scenarios and end-user response time for various transactions.
  • Used Analysis to check performance via various graphs and reports.
  • Interacted with developers to report and track defects using defect tracking tools such as Quality Center and ClearQuest.
  • Wrote SQL queries and stored procedures to validate data.
  • Documented errors and implemented their resolutions.
  • Created and executed test scripts.
  • Developed test objectives and test procedures.
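The session-ID bullet above describes correlation: parsing a dynamic value out of one response and substituting it, along with randomized data, into the next request. A minimal sketch, where the response body and parameter names are hypothetical:

```python
import re
import random

# Correlation sketch: extract a session ID from a server response and feed
# randomized test data into the next request, mirroring the bullet above.
# The response body, URL, and parameter names are hypothetical.
response_body = '<input type="hidden" name="sessionId" value="A1B2C3D4">'

match = re.search(r'name="sessionId" value="([^"]+)"', response_body)
session_id = match.group(1) if match else None

user_id = random.randint(1000, 9999)  # randomized test data
next_request = f"/checkout?sessionId={session_id}&userId={user_id}"

print(session_id)  # A1B2C3D4
```

In LoadRunner the same job is done declaratively inside the script (capture on one request, parameter substitution on the next); the sketch just makes the mechanism explicit.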

Environment: Oracle 7.1, Visual Basic 5.0, Manual Testing, WinRunner 5.01, Test Director 5.0, LoadRunner, Rational ClearQuest, Quality Center, BAC, VBScript, .NET, JavaScript, SQL, Java/J2EE, JSP, IIS, XML/XSLT, Oracle, Windows 2000, UNIX, Solaris, Visio, UML, Rational RequisitePro.

Confidential

Quality Analyst

Responsibilities:

  • Reviewed the Business Requirement Document, System Requirement Specifications, and use cases in the initial phase of development.
  • Involved in preparing Test Cases and Test Scripts based on the Business Requirements Document (BRD).
  • Developed the test cases to test the functionality and interface of the application.
  • Performed manual testing on different modules of the application.
  • Planned and managed tests using Test Director.
  • Wrote SQL queries to retrieve data from various tables and to test the database.
  • Tracked and reported bugs using Test Director.

Environment: Manual Testing, VB, ASP, Test Director 5.0, SQL, Windows 98.
