Sr. Performance Monitoring, Diagnostic And Tuning Manager Resume
Montvale, NJ
SUMMARY:
- 15+ years’ experience in the field of Application Performance Engineering, Architecture, Testing, Bottleneck Analysis, Tuning, Capacity Management and Assurance, including performance test methodology and tools.
- Certified as PMP, CSQA, Six Sigma Yellow Belt
- Strong Financial background (Confidential, Citi, Confidential/Bank of America, Confidential, Confidential) and Health Care background (Confidential)
- 8+ years of Application Performance Management/Monitoring tool experience (including server-side code profiling) in AppDynamics, OPNET AppInternals and AppResponse, dynaTrace PurePath and Ajax Edition, Microsoft Avicode, CA Wily Introscope, JProfiler and HP Diagnostics
- Experience in implementing APMs such as AppDynamics and dynaTrace from the requirements phase through implementation, in high-transaction, cloud-based Production systems that serviced more than 30,000 users.
- Experience in defining monitoring and metric strategies for major products which shortened the Performance issue resolution process tremendously (from months to a few weeks).
- Performance and automation testing experience with HP Performance Center/HP LoadRunner, HP SiteScope, HP BAC (Business Availability Center), WinRunner, QTP and Test Director, Ranorex, SoapUI/LoadUI and VSTS Load Test
- Experienced in defining, establishing, managing, implementing and coordinating the complete Performance Testing/Engineering and Tuning processes and teams. (Can successfully lead the Performance Testing and Tuning Process from start to completion including tools setup)
- Experienced in building and maintaining Performance Test environments, for varying technologies.
- Expertise in performance improvement recommendations and in recommending performance tool sets.
- Experienced in installing, managing and administering Performance Center, LoadRunner, VSTS, SiteScope and APM environments (performance labs), test platforms, test applications and other test tools
- Experienced in leading/testing Financial, Health Care, Transportation, Services and Trading (FIX/FAST protocol) applications
- Experience in leading/testing Cloud based solutions, Mobile Native apps/solutions, JD Edwards, ERP applications (GL), PeopleSoft applications (GL, AP and FA), Oracle Financial applications, Sarbanes - Oxley applications, Cognos, Essbase, Hyperion Products
- Experienced in Performance Testing using Java RMI, C template (FIX), CITRIX, Web, ODBC, Oracle 2 Tier, Oracle NCA, and Winsock Protocols
- Experienced in testing and tuning application servers (.Net, IBM WebSphere, JBoss/Hibernate, J2EE, JVMs, WebLogic, Delphi, IIS) and database servers (Oracle, SQL Server and Sybase)
- Experienced in Quality Assurance of clustered multi-tier/n-tier systems, virtualized servers, SOA based systems, Client/server systems and Web Applications/Sites including testing large enterprise software applications and across the networks using Shunra WAN emulation.
- Experienced with Database validation using Oracle, Sybase, Microsoft SQL Server, DB2 and Access and working knowledge of RDBMS concepts and knowledge of SQL, TSQL, and PL/SQL.
- Working knowledge of object-oriented languages (Java, C++, C#) as well as TSL, C, ASP, JSP, HTML, XML, JavaScript, Shell scripting, Visual Basic and Unix
- Working knowledge of quality principles, test engineering methodologies, verification and validation techniques, defect tracking/management, quantitative methods, risk management and measurement programs
- Detail-oriented, self-motivated and analytical, with strong problem-solving skills and excellent verbal, written, business and internal communication, presentation, interpersonal, time management and organization skills
- Experienced in working with a diverse workforce including other Project Team members, Business teams/partners, Customers, QA teams, Capacity Planning teams
- Experienced in conducting Status meetings with technical reviews and in sending out Project status reports to stakeholders
EXPERTISE INCLUDES:
- Performance, Throughput, Concurrency, Peak Load, Volume, Stress, Capacity, Soak, Endurance/Scalability, Stability, Fail Over, Spike, Response Time tests, Single User baselines, Regional, WAN, Hardware profiling and configurations and Baseline Testing.
- Performance instrumentation, monitoring, diagnostics and tuning for end to end application infrastructure/architecture.
- Profiling, troubleshooting, bottleneck and root cause analysis using APM/profiling tools.
- Capacity and scalability planning, modeling and workload planning.
- Client side - web browser, desktop, thin client and thick client performance analysis and tuning.
- Web server and application server profiling, performance analysis and tuning - .Net, Java.
- Database server profiling, performance analysis and tuning - SQL server and Oracle.
- Performance testing Cloud based solutions and mobile enabled solutions.
- Interface/integration and Network performance impact analysis and tuning.
TECHNICAL SKILLS:
APM/Profiling Tools: AppDynamics, dynaTrace PurePath, dynaTrace Ajax, Microsoft Avicode, SiteScope, Wily, OPNET AppResponse and AppInternals, HP Diagnostics, JProfiler, Business Availability Center (BAC), Fiddler
Performance/QA Tools: Performance Center, LoadRunner, VSTS Load Test, SoapUI/LoadUI, Ranorex, WinRunner, QTP (Quick Test Professional), Test Director, Rational Test Manager, PVCS, Rational Robot, TeamQuest, ClearQuest, TFS, CTS, JMeter
LoadRunner Protocols: CITRIX, HTTP/HTML, PeopleSoft, ODBC, Oracle 2 Tier, Oracle NCA, Winsock, COM/DCOM and Java Protocols
Testing Technologies: Performance, Stress, Load, Volume, Longevity, Stability, Scalability, Automated, GUI/User Interface (UI), Black Box, Unit, System, End-to-end, Integration, Component Based, Web, Sanity, Smoke, Functionality, Compatibility, Configuration, Security, Database, Regression and White Box testing
Methodologies: Performance Test Methodology, Software Development Life Cycle (SDLC), Waterfall, Agile Testing, Quality Assurance Life Cycle (QALC), QA Methodology, LoadRunner Testing Process, WinRunner Testing Process, Rational Unified Process (RUP), Capability Maturity Model (CMM)
Operating Systems: Windows, UNIX, Linux, Sun Solaris, IBM AIX and MS-DOS
Application/Web Servers: .Net, IIS, JBoss, BEA WebLogic, IBM WebSphere, J2EE app servers, Jakarta Tomcat, Apache server
ERP/Reporting Software: Oracle 11i Forms, SAP (AP, GL and AR), JD Edwards, PeopleSoft, Hyperion Reports, Spreadsheet services, Essbase tools, Actuate Reports, Cognos
Languages/Technologies: TSL, Java, C, C++, C#, SQL, PL/SQL, Shell, VB, .Net, J2EE, JSP, ASP, JavaScript, VBScript, HTML
RDBMS: Sybase, Oracle, MS SQL Server, DB2, IBM AS/400 iSeries and MS-Access
Other Tools/Editors: Quest, TOAD, Putty, Rational Rose, Visual SourceSafe, Crystal Reports, SQL Advantage, Erwin, DB Artisan, Remedy Desktop Application, Macromedia Dreamweaver, PowerBuilder, Comm Server, Requisite Pro, MS Office (PowerPoint, Visio, Word, Excel) and TeamQuest
Project Tools: MS Project; knowledge of virtualization, DNS, Akamai, IP addressing, firewalls, load balancing, network ports, WAN optimization and system tools (vmstat/top)
PROFESSIONAL EXPERIENCE:
Confidential, Montvale, NJ
Sr. Performance Monitoring, Diagnostic and Tuning Manager
Responsibilities:
- Responsible for Application Performance Management and Monitoring Tools of multiple projects
- AppDynamics implementation from the ground up for various technology projects
- Initiated the requirements analysis for an APM tool and gathered the requirements.
- Responsible for communication with Vendor for license procurement, implementation support.
- Procured and implemented the tool in a standalone Performance Monitoring environment.
- Onboarded multiple applications to enable monitoring and instrumented the system, application and code in Test, Staging and Production environments.
- Defined the Monitoring and Metric strategy to include performance metrics across the system and application.
- Identify potential bottlenecks using the APMs and recommend solutions to improve the performance of the solution.
- Drive the tuning process working with the Performance Engineering team and ensure the improvements are quantified.
- Enhanced the Performance issue resolution process via APMs, which improved system performance as a whole.
- Develop, implement and enhance Performance strategy and process/frameworks for multiple .Net products.
- Led the Performance Engineering practice for multiple products; deliverables included architecture, engineering, planning, scope, tools and roadmap.
- Responsible for Performance Engineering/Testing tasks for engaged projects, including strategy, estimation, planning, execution oversight, metrics reporting and risk management
- Oversee Test engineers and other test team members to ensure performance test scripts and test cases are developed according to the identified test scenarios and test objectives, in line with the Service Level Agreements (SLAs) specified in the Non-Functional Requirements documents.
- Identify, recommend, and implement Performance test framework, strategies and toolsets to optimize performance test execution, expand test coverage, and reduce risk
- Establish and maintain strong working relationships with Development, Project Management, Client Support, Business Analysts and Users to foster a team environment.
- Partner with Project Management Organization in the development of project plans, status communications, and the management of development software solutions.
- Provide regular status reporting consistent with established program governance requirements and present updates to senior management and at All-hands meetings
Skill Environment: AppDynamics 4.0/4.1, Performance Center 11, .Net, SQL Server, IIS, Audit applications, Document management systems, Java, IBM Websphere, SharePoint, Project Planning, JProfiler
Confidential, New Jersey
Enterprise Performance and Monitoring Architect
Responsibilities:
- Implementation and management of Application Monitoring for multiple applications via tools such as eG, Orion, OPNET AppInternals, OPNET AppResponse and dynaTrace Purepath.
- Oversaw dynaTrace implementation for a cloud-based system that services more than 30,000 users
- Initiated the requirements analysis for an APM tool and gathered the requirements.
- Responsible for communication with Vendor for license procurement, implementation support.
- Procured and implemented the tool in a standalone Performance Monitoring environment.
- Onboarded multiple applications to enable monitoring and instrumented the system, application and code.
- Defined the Monitoring and Metric strategy to include performance metrics across the system and application.
- Identify potential bottlenecks using the APMs, recommend solutions to improve the performance of the solution and drive the tuning process.
- Enhanced the Performance issue resolution process via APMs, which improved system performance.
- Identify, recommend, and implement Performance test framework, strategies and toolsets to optimize performance test execution, expand test coverage, and reduce risk.
- Manage a continuous process improvement program within the team and at the project level reviewing trends, data, etc.
- APM tool implementation (dynaTrace and OPNET) for major products, which shortened the Performance issue resolution process tremendously
- Interaction with projects/Business Analysts to document Non-Functional Requirements (SLAs) and lead test engagements from a PE/PT perspective.
- Develop standards and templates for Performance Engineering Strategy, Plan, and Scenarios. Ensure sign-off from all parties
- Ensure adherence of the Performance standards and ensure that each application is reviewed through the Architecture teams
- Review, Recommend and Approve Performance Engineering Strategy, Test Plan and Design for multiple portfolio projects.
- Conduct Load tests, analyze the results and produce Tuning Recommendations accordingly
- Define, Produce and maintain Key Performance Indicators (KPIs) around the testing performed
- Analyze and tune Performance issues found throughout the lifecycle
- Coordinate/drive resolution across a matrix organization, and work with 3rd party service vendors based on their test recommendations, expertise, and experience.
- Identify, diagnose and resolve performance bottlenecks, through performance testing techniques and tools (including investigating or remediating)
- Identify potential limitations in software architectures; devise and communicate improvements working with Product architects
- Select and implement Application Performance Management tools for monitoring and diagnostics - dynaTrace purePath, OPNET AppResponse and AppInternals, Avicode
- Define Monitoring/Metric strategy for a comprehensive instrumentation of the system, application and code.
- Work with Client Support to analyze external problems and develop a resolution plan
- Monitor quality/performance risks and escalate, as required.
- Build reports for trending analysis.
- Performed the analysis and published the Performance Test Results that included System Metrics, Response Time Metrics and any Tuning performed.
- Oversaw the database/stored procedure tuning, which brought down response times, lowered CPU utilization and improved overall system performance
- Analyze, understand, and communicate technical data, specifications, designs, etc.
- Lead by example by being hands-on when needed to test, debug and troubleshoot.
- Train the offshore teams to work on Performance problems and to identify code level performance concerns and communicate with the team
- Follow up through the entire process to ensure the defect workflow is followed and the performance recommendations are implemented
- Conduct lunch-and-learn sessions to educate developers on performance best practices
- Making recommendations, assessing industry best practices and striving for constant improvements to ensure most effective performance testing and engineering approach.
Skill Environment: Performance Center 9.5, LoadRunner 11.0, .NET, ADO.NET, SQL Server, Windows Server 2008/2003, IE8, Silverlight, WinForms, Citrix XenApp, RDP/Terminal Server, Teradata, Data Warehouse, ETL (Extraction, Transformation and Loading), Data Marts, Delphi, COM/DCOM, WebLoad, SoapUI/LoadUI, dynaTrace, OPNET, Avicode, Web services, VMware, Hyper-V (server virtualization).
Confidential, Jersey City, NJ
Performance Test Architect
Responsibilities:
- Responsible for defining and implementing Performance Engineering strategy for Phoenix
- Initiated the Performance Engineering process with development and defined the process through delivery
- Implemented profiling tools - including JProfiler, dynaTrace, Wily to identify Performance hotspots for developers to fix during the development life cycle
- Drove the testing process with TCS and ensured they were successful in implementing it
- Published and presented the findings to the stakeholders
- Train, mentor and drive Performance Analysts for the project to implement Performance Engineering process
- Diagnose issues within the system and suggest Performance recommendations to Development and Architects
- Work with Business Analysts to ensure the Performance Requirements are defined and Testing is planned as part of application’s SDLC
- Work with Development to incorporate proper unit performance testing into the SDLC
- Train the offshore teams for Performance Test process
- Follow up through the entire process to ensure the defect workflow is followed and the performance recommendations are implemented
Skill Environment: Performance Center 9.5, .NET, Java, SQL Server, Windows Server 2003, IE6, TIBCO, ComputeGrid, Ab Initio, Object Grid, nCache 3.8, iPlanet Web Server, Sun Solaris, SiteMinder.
Confidential, New York, NY
Sr. Performance Test/Tuning Lead/Performance Architect/PC and BAC Admin
Responsibilities:
- Train, mentor and manage 6 Performance Engineers
- Involved in bringing stakeholders together to design and agree on a Performance Strategy for testing Vision application
- Translate software specifications and user requirements into performance test scenarios
- Involved in improving the Performance Test Process by implementing standard practices and processes
- Develop the Project Plans for the project that included task lists and timeline for Performance Testing the different releases
- Use Fiddler to understand each call’s footprint and suggest performance improvements early on at the UI level
- Developed/Oversaw detailed Performance Test Plans including Load Models for different releases
- Coordinate the deployment of the application and needed software for hardware setup in Performance Environment
- Manage script development in VUGen, Service Test and QTP for .Net and SOA based systems
- Manage execution of Performance tests that included Load, Fail Over, Peak, Endurance (Stability) and Capacity (Scalability) tests using PC 9.5
- Have status meetings with the resources involved to ensure smooth execution of the project and send out status reports to the stakeholders
- Work with HP to resolve HP Performance Center installation issues; administered Performance Center for all of Confidential’s Performance projects and worked with HP on licensing requests and issues
- Implement HP Business Availability Center solutions for the applications under test and use Real Time monitoring for Production comparison and proper Performance Analysis and planning
- Install and implement SiteScope for monitoring the application servers under test
- Suggested multiple enhancement areas for the application under test for improving performance. Contributed to meeting the performance goals of the application by reducing the response times from half a minute to less than 4 seconds.
- Involved in testing SOA/Web services layer using HP Service test for sister applications
Skill Environment: Performance Center 9.5, LR 9.5, QTP 9.2, Quality Center 9.0, HTTP/HTML Protocol, HP BAC 6.5, SiteScope 10.0, Windows 2003, WebSphere Web Services (SOA), Tandems, MS Project, Office, Fiddler, HP Service Test
Confidential, New York, NY
Performance Testing and Tuning Lead/Performance Center (PC) Admin
Responsibilities:
- Train, coach and manage 3 Performance Team members.
- Involved in the study of QA processes in the organization to define and create Performance Testing Standards, Methodologies or Processes.
- Developed relationships with various teams that are in need of Performance Testing.
- Interacted with various teams to gather Performance Test requirements for their applications and acquire resources for proper planning and executions of the Performance Test process.
- Developed the Project Plans for the project that included task lists and timeline for Performance Testing.
- Coordinated the hardware setup and software/application deployment in Performance Environment
- Oversee and finalize the development of Performance Test Plans and Load Models as part of the Test Prep.
- Ensure Performance requirements are properly captured by conducting stakeholder Test Plan reviews.
- Manage the development of scripts in multiple protocols, including Java (FIX/FAST protocol), C template, CITRIX and HTTP/HTML, and of scenarios for the Test execution phase.
- Manage and execute the Performance Tests planned.
- Have status meetings with the resources involved to ensure smooth execution of the project and send out status reports
- Manage the analysis and preparation of the Performance Test Report
- Manage the submittal of performance defects through the proper Defect Life Cycle for escalation
- Manage the tuning of proprietary application servers on Linux and Solaris platforms and the involved Oracle database servers, including MIT servers
- Present the Performance Test results with any recommendations or enhancements and escalate any issues or risks that may arise during the Performance Test Life Cycle
- Ensure performance tested systems meet all application capacity and scalability goals
- Have regular meetings with the Performance Team (onshore and offshore) and ensure the Team goals are met through best practices, knowledge sharing
- Have final Closure meetings to gather lessons learned and best practices
- Maintained and administered Performance Center
Skill Environment: Performance Center 9.0, LR 9.1, QTP 9.5, Quality Center 9.0, Java RMI, C template, FIX protocol, CITRIX and HTTP/HTML Protocol, WebSphere Web Services (SOA), SAP, Java, Solaris (SunOS 5.1), Linux, .Net 3.0, Oracle 10g & 11g, MS Project 2007
Confidential, Secaucus, NJ
Performance Testing and Tuning Lead
Responsibilities:
- Train, mentor and manage 4 Performance Team members (3 onshore and 1 offshore).
- Develop Project charter and Project Schedules with the proper input of the stakeholders.
- Initiated and maintained contact with Business for accurate Performance modeling.
- Managed the development of a detailed Performance Strategy Plan to test the different modules of IOCM suite which included US, Canada and Asia Pac/Europe (APEU).
- Setup requirement reviews for identifying potential performance risks/issues and escalated to the proper teams for action.
- Coordinated the deployment of the application and needed software for hardware setup in Performance Environment.
- Oversaw the development of scripts and scenarios using Load Runner and VSTS.
- Oversaw and executed the Scalability tests to find the breaking point of the application.
- Oversaw and executed the Stability tests to find the stability of the application over a period of time.
- Worked with the architecture and development teams to come up with tuning advice.
- Worked towards reducing the overall run time of batch jobs by 75%.
- Worked towards drastically reducing user response times for high-response-time functions, some by as much as 19 minutes.
- Oversaw the database and stored procedure tuning, which brought down response times, lowered CPU utilization and improved overall system performance.
- Involved in managing the testing and tuning of clustered .Net and clustered JBoss application servers, and of Sybase and clustered SQL Server database servers
- Found critical flaws in the implementation of the design and suggested implementation changes that led to redevelopment of code.
- Ensured the Defects found followed the proper Defect Life Cycle in Test Director, as part of standard QA procedure.
- Proactively escalated issues to the QA Lead and alerted the project team on potential impact to the systems and testing schedule.
- Involved in code evaluations and conducting code reviews for Performance Defect resolution
- Involved in reviewing the Test Results with the Business to ensure user approval of the tuning and state of the application.
- Analyzed Load Runner/SiteScope monitors/graphs for Transaction response times, Resource utilization etc.
- Obtain approval from management for the new environment and the build resources
- Obtain estimates and develop a project schedule for building the environment
- Manage onsite and offsite resources to successfully execute the project plan that built the US Performance environment
- Manage smoke/sanity testing of the environment to ensure the environment is set up correctly
Skill Environment: LoadRunner 8.1, JD Edwards, VSTS Load Test 2005, JBoss, .Net 2.0, Sybase, SQL Server, Unix, Solaris, Windows 2003 Server, IIS, DB Artisan, Quality Center 9.0, Windows XP, MS Project, IBM WebSphere Application Server, Apache with WebSphere plug-in as Web server, database on IBM AS/400 iSeries and Windows 2003
Confidential, Paramus, NJ
Performance Test Lead/ UAT Testing Coordinator
Responsibilities:
- Manage 3 Performance Team members.
- Establish processes to improve Performance QA Team, which included developing strategies for documentation, automation and result analysis.
- Developed procedures and standards for the Performance QA Team.
- Developed Non-Functional (Performance and Installation) Test Plans that included requirement analysis (NFRs), test case creation, automation strategy and results analysis strategy.
- Oversaw creation of Performance Test Cases and Test Scripts for successful test execution and oversaw the development of Automation scripts for the Test Cases that are automatable.
- Oversaw the Hardware setup needed for Test environment and oversaw the installations for the test environment, which included installation of OS, Application and Tools.
- Oversaw the Performance Test Execution.
- Responsible for analysis of results, identifying bottlenecks for tests performed and tuning activities.
- Oversaw the defect submittal process to ensure appropriate resolution takes place.
- Performed and aided other teams in monitoring and troubleshooting different versions of our application, which included Integration Testing.
- Coordinated and managed the User Acceptance Testing (UAT) for GSS application.
- Coordinated the setup of systems and installation of the application on the new hardware for acceptance testing.
- Coordinated the timings of the UAT with the user schedules.
- Interacted with the Users to make sure that they were able to access and use the system as intended.
- Escalated the defects found during the process through the appropriate escalation path.
Skill Environment: Load Runner 8.0, .Net, Oracle, Rational Test Manager, Rational ReqPro, PVCS, Rational Robot, Windows 2000, Windows XP, Windows 2000 Server, Windows 2003 Server
Confidential, Richmond, VA
Senior Performance Lead
Responsibilities:
- Train, mentor and manage 3 Performance Team members/Engineers.
- Developed and designed Performance Test Strategy and Plan for Confidential 2005.
- Managed Test Plan development for all the applications involved in the exercise.
- Helped Project Teams come up with a Performance Test Strategy for their Disaster Recovery Plans.
- Interacted with IBM consultants to create a Test Execution Strategy and a Test Schedule.
- Develop standards to stabilize the Performance Test processes.
- Managed the Performance Test Team and supervised the work of Performance Testers in creating and executing performance test scripts.
- Coordinated with Technical Architecture, Infrastructure, Network, Security, Capacity Planning and Desktop teams to ensure accurate set-up of performance environment.
- Managed the execution of the load during the exercise at the recovery site of Confidential.
- Defined process for identifying issues during the different project phases and reporting status to management.
- Acted as point of contact for collecting all Performance Test Results.
- Published the final Performance Test Results that included System Metrics, Response Time Metrics and Load Metrics.
Skill Environment: LoadRunner 8.0, Web (HTTP/HTML), Oracle 2 Tier, WebServices (SOA), CITRIX, ClearQuest, MS Project, TOAD, Putty, Windows XP
Confidential, New York, NY
Senior Performance Engineer-Lead/Offshore team coordinator
Responsibilities:
- Interacted with the customer to gather Performance Test requirements and used TestDirector for requirement analysis.
- Developed cost estimates for the engagement and prepared Statements of Work (SOWs)
- Prepared SOWs (estimates) and project schedules/plans for the Performance Test Life Cycle for each application with the appropriate onshore and offshore resources
- Developed Performance Test Plans and Load Models as part of the Test Preparation
- Analyze and gather Test data for Test execution
- Developed Vuser scripts using VUGen with Oracle NCA, Oracle 2 Tier, HTTP/HTML protocols.
- Performed Load/Performance/Stress Tests using Load Runner
- Ran Tests from various locations around the globe to sample response times over the WAN (this included installing LoadRunner on these computers), using Shunra WAN emulation.
- Used TeamQuest to monitor the server resources under Test
- Analyzed Load Runner monitors/graphs for Transaction response times, Throughput, Users, Errors etc
- Predicted the response times over the WAN using OPNET and LoadRunner WAN emulation techniques.
- Involved in analysis of Test Results with the Project, Business and System Testing Team
- Tracked Project progress and Defects using CMextra - a ClearQuest tool.
- Migrated the project to the off-shore Performance Test team and trained the team for executing regression tests
- Managed the off-shore team during the regression tests
- Also involved in evaluating Quest software and PVCS
- Review and obtain approval of the SOW’s from the stakeholders
- Managed the hardware costs for the projects and billed to the teams
Skill Environment: LoadRunner 7.8, VSTS Load Test, Oracle 11i Forms, Hyperion Reports, Spreadsheet services, Essbase tools, TestDirector, Quest, TeamQuest, PVCS, Sun Solaris, IBM AIX, Tomcat server, Apache server, Oracle, ClearQuest, MS Project, Windows XP/2000, OPNET, J2EE, Sarbanes-Oxley Applications, Java, JSP/ASP
Confidential, Richmond, VA
Senior Performance Engineer/Test Lead
Responsibilities:
- Involved in the study, definition, execution and measurement of QA processes across the application.
- Defined performance test automation standards for different combinations of technology components used in application.
- Interacted with the customer to gather Performance Test requirements.
- Developed Performance Test Plans and Load Models as part of the Test Preparation
- Analyzed and gathered Test data for Test execution
- Developed Vuser scripts using VUGen with WinSock, ODBC, PeopleSoft, Java and Oracle 2-Tier protocols.
- Performed and supervised Load/Performance/Stress/Fail Over/Capacity/Tuning tests using Load Runner, by creating and executing different sets of scenarios to accomplish different objectives.
- Performance tested applications for daily production volume during the disaster simulation.
- Coordinated the communications to setup, plan and execute the actual testing exercise.
- Involved in analysis of Test Results with the Project, Business and System Testing Team
- Tracked Performance Defects and Functional Defects using ClearQuest.
- Analyzed WebLogic Server resources and LoadRunner monitors/graphs for Transaction response times, Throughput, Users, Errors, etc.
- Monitored the performance of the application using BMC Patrol agent and analyzed the results using PerformaPredict Models.
Skill Environment: LoadRunner 7.8/7.5, WinRunner 7.5, PowerBuilder, Comm Server, PeopleSoft, WebLogic Server, J2EE, HTTP, SQL Server, Oracle, ClearQuest, MS Project, TOAD, Putty, Windows 2000, NT and XP
Confidential, Hackensack, NJ
Senior QA/ Automation /Performance Tester
Responsibilities:
- Analyzed the Business Requirements of both the systems and developed test cases, scripts and test plans.
- Responsible for writing the Test Plan for different components of the System and involved in writing the Master Test Plan.
- Developed Test Scenarios with Test Data to support Test Objectives.
- Designed Test Cases and Test Scripts for Functional testing and for Automation using Win Runner.
- Responsible for automated testing of Smoke test and Regression Test using WinRunner 7.0
- Involved in developing a test suite, using Quick Test Professional, for evaluation.
- Used LoadRunner 7.2 for Volume, Performance and Stress testing with Web Protocol (HTTP/HTML)
- Responsible for comprehensive Functional, System testing of assigned features of the system.
- Responsible for validating the security privileges in the system.
- Responsible for executing test procedures and documenting results
- Analyzing and reporting the test problems/failures and defects to the Development Team.
- Instrumental in reporting and tracking defects/bugs using PVCS Tracker
Skill Environment: WinRunner 7.2, Test Director, LoadRunner 7.2, Quick Test Professional, ASP, VB script, Java script, HTML, Microsoft SQL Server 8.X, Microsoft IIS, Crystal Reports, Windows XP and 2000, Visual Basic, MS Access and Requisite Pro
Confidential, Jersey City, NJ
Senior QA/ Automation /Performance Tester
Responsibilities:
- Instrumental in Test documentation (Test Plan, Test Scenarios, Test Cases, Test Data, Test Scripts), using Test Director
- Analyzed the Business Requirements of the Product/system. Worked with the development team to review project plans, requirements specification, design documents, and computer software.
- Involved in comprehensive Functional testing of the CoPeR application and the IPRO system and in setting up automated regression suites
- Involved in designing, developing and executing the WinRunner Test scripts (TSL) and reporting Test Results in Test Director.
- Performed Data Driven Testing using WinRunner and Database Testing via Database checkpoints
- Performed Load/Performance/Stress testing using LoadRunner with the Web (HTTP/HTML) protocol.
- Reported the defects to the team using CMExtra, an in house bug-tracking tool.
- Involved in analyzing the problems using Transaction Breakdown, Network Monitoring and Resource Monitoring graphs.
- Involved in debugging SQL procedures and Shell scripts.
- Involved in installing and testing Patches and Hot Fixes in Test environment.
Skill Environment: LoadRunner 7.2, WinRunner 7.2, Test Director, J2EE, JDK, JSP, Sybase, JavaScript, Java Servlets, WebLogic 6.0 Application Server, Korn Shell, Rational Rose, Visual SourceSafe, TSQL, SQL Advantage, Remedy, Windows NT and UNIX