Sr. Performance Analyst Resume
Charlotte, NC
Professional Summary
- Seven years of experience and expertise in the IT industry in Quality Assurance and Software Testing, covering manual and automation testing of client/server and web applications.
- Strong experience in writing and executing test cases, providing estimations, preparing test plans, performing functional, integration, regression, system, and performance testing, writing test summary reports, and preparing bug severity charts.
- Proficient in automated testing tools such as LoadRunner, QuickTest Professional (QTP), Quality Center, Performance Center, Mercury TestDirector, and WinRunner.
- Strong experience in writing manual and automation Test Plans.
- Extensively involved in all phases of performance/load/stress testing, including setting up the test environment, planning load tests, creating virtual user scripts, and creating and running scenarios to analyze system performance under different loads.
- Performed QA activities for mainframe, Java, .NET, and web-based applications.
- Prepared various metrics and reports, such as defect prevention plans, test execution reports, weekly status reports, and causal analysis.
- Maintained Requirement Traceability Matrices (RTM) to track testing progress against requirements.
- Expertise in querying and testing RDBMSs such as Teradata, Oracle, and SQL Server using T-SQL and PL/SQL for data integrity.
- Strong experience in writing UNIX shell scripts and expertise in testing batch jobs.
- Strong experience in performing web services testing by simulating, mocking and invoking web service requests between two applications.
- Involved in all phases of the Software Development Life Cycle (SDLC) and all phases of the testing life cycle.
- Extensive experience in preparing test suites, test cases, test scenarios, test reports, and documentation for both manual and automated tests.
- Experience in verification, positive, negative, and boundary testing.
- Good understanding of Six Sigma and CMMI Levels.
- Positive & Forward-looking attitude, Self-Motivated, positive orientation to expand present knowledge base, and Team spirit.
- Outstanding communicator with extensive experience in customer service as well as ability to identify, develop and enhance client relationships.
- Strong analytical and interpersonal skill and consistently being organized by management and peer for producing high quality works.
Education
MS in Computer Science & Engineering.
Technical Skills
Testing Tools : LoadRunner 11.0/9.x/8.x, ALM Performance Center, QTP 11/10/9.0, WinRunner 8.2/7.5, Quality Center 10/9.x, Jira, JMeter, TestDirector
Monitoring Tools : SiteScope, Mercury Diagnostics
Languages : C, C++, Java
Script Languages : HTML, DHTML, VBScript, shell scripts, JavaScript
GUI Technologies : Visual Basic 6.0, VB.NET
Work Flow Tools : MS Word, MS Excel
Query Languages : T-SQL, PL/SQL
Operating Systems : MS-DOS, Windows 95/98/2000/XP/NT/Vista
Databases : SQL Server 2000, Oracle 9i/10g, Teradata
Experience Summary
Confidential, (July ’10 till date)
Client : Confidential, Charlotte, NC
Role : Sr. Performance Analyst
Internet Banking deals with testing the systems centered on the applications used by Wachovia Retail Banking customers. Wachovia provides online banking to all its customers, allowing them to manage their accounts, and the main objective of the application is to give the end user a complete online banking experience. Online customers can access services such as online banking with bill pay, online brokerage, online retirement access, online statements, transfer of funds, mobile banking, a one-stop view of all online accounts, images of paid checks, online management of checking and savings accounts, credit cards, and more.
Responsibilities:
- Involved in the entire QA life cycle, including designing, developing, implementing, and executing the QA process and methodologies.
- Involved in the complete cycle of gathering performance requirements, designing the load model, scripting, executing tests, and analyzing and reporting results.
- Performed performance testing, smoke testing, system testing, full regression testing, and backend testing consisting of mainframe testing.
- Independently developed load test scripts using LoadRunner according to performance test specifications/requirements.
- Developed LoadRunner VUser scripts using Web and Java protocols.
- Enhanced the scripts with parameterization, correlation, error-handling logic, and synchronization checkpoints (see the sketch after this list).
- Extensively worked on manual and automatic correlation of dynamically changing values, such as .NET ViewStates, response buffers, and encrypted data, to ensure scripts run for the desired number of iterations without failure.
- Using LoadRunner, executed multi-user performance tests and used online monitors, real-time output messages, and other features of the LoadRunner Controller.
- Analyzed LoadRunner reports to calculate response times and transactions per second (TPS).
- Created a workload matrix based on production volumes, and designed and executed test run scenarios based on it.
- Extensively used SiteScope, connected to the WebLogic server, the Apache server, and other database servers, to collect measurements such as CPU and memory utilization, coordinating with server and network support to clear bottlenecks.
- Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report.
- Responsible for preparing project estimations.
- Raised defects in Jira.
- Worked on mainframes to create customers and customer accounts.
- Performed cross-browser testing to verify webpage content compatibility across different browsers.
- Wrote modification/change requests for defects in the application and helped developers track and resolve the problems.
- Prepared test closure reports and test sign-off documents.
- Participated in walkthroughs and meetings; responsible for daily and weekly status reports.
- Shared application and testing tool knowledge with team members to build effective test scenarios.
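A minimal VuGen sketch of the correlation, parameterization, and checkpoint pattern described above. LoadRunner Vuser scripts are written in C; the URL, correlation boundaries, transaction names, and parameter names here are illustrative placeholders, not the actual application values.

```c
Action()
{
    // Manual correlation: capture the dynamic .NET __VIEWSTATE value
    // from the login page response so it can be replayed later.
    web_reg_save_param("ViewState",
                       "LB=id=\"__VIEWSTATE\" value=\"",
                       "RB=\"",
                       "ORD=1",
                       LAST);

    lr_start_transaction("OLB_LoginPage");
    web_url("login_page", "URL=https://onlinebanking.example.com/login", LAST);
    lr_end_transaction("OLB_LoginPage", LR_AUTO);

    // Error handling: abort this iteration cleanly if correlation failed.
    if (strlen(lr_eval_string("{ViewState}")) == 0) {
        lr_error_message("ViewState correlation failed");
        lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
    }

    // Synchronization/text checkpoint on the next response.
    web_reg_find("Text=Account Summary", "Fail=NotFound", LAST);

    lr_start_transaction("OLB_Login");
    // {UserName} and {Password} come from a parameter data file in VuGen.
    web_submit_data("login",
                    "Action=https://onlinebanking.example.com/login",
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=userid",      "Value={UserName}",  ENDITEM,
                    "Name=password",    "Value={Password}",  ENDITEM,
                    "Name=__VIEWSTATE", "Value={ViewState}", ENDITEM,
                    LAST);
    lr_end_transaction("OLB_Login", LR_AUTO);

    return 0;
}
```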
Environment: J2EE, Mainframes, Jira, LoadRunner 9.5, SiteScope, Windows XP, MS Excel, Oracle 10g, RUMBA, RSA case management tool.
Confidential, (Mar ’08 to June ’10)
Client : Confidential, Rhode Island
Role : Performance Analyst
The objective of the project is the migration of customer data from DB2 to Oracle and the addition of new features to the online banking application, such as transfer of funds to non-Citizens accounts, bill pay, mobile banking, and viewing online statements.
Responsibilities:
- Responsible for gathering business requirements and transforming them into functional specifications by creating process flow diagrams, data flow diagrams, and business use cases.
- Involved in the entire QA life cycle, including designing, developing, and executing the QA process and documenting test plans, test cases, and test scripts.
- Created, maintained and executed test cases in Quality Center.
- Tested the seamless migration of customer data using web tools such as DB Viewer and CMOD reports; testing also covered the inactivation of customers in the old system and the new features of the online banking application.
- Performed functional testing on newly added features and also executed the regression suite.
- Involved in performance testing using LoadRunner: executed multi-user performance tests and used online monitors, real-time output messages, and other features of the LoadRunner Controller.
- Extensively used the Controller to run various load test scenarios, ramping up Vusers at different intervals; each scenario was run twice to confirm there was no significant difference between the two results.
- Used the Controller to configure runtime settings and run Vuser scripts, verifying that they worked correctly on the various load generators.
- Developed and implemented load and stress tests with Mercury LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
- Extensively used LoadRunner functions to enhance the scripts and verify that the required business process is captured and validated during replay (see the sketch after this list).
- Different loads were generated, starting at 100 Vusers and ramping up until CPU utilization reached 100%.
- Average CPU usage, response time, and TPS were analyzed for each scenario.
- Interacted directly with developers, architects, DBAs, and all other required resources whenever there were performance-related issues.
- Analyzed, interpreted, and summarized results in a complete performance test report.
- Worked on data conversion from the old system to the new one; wrote SQL and PL/SQL to validate the converted data.
- Troubleshot various issues in the testing and production environments.
- Organized progress meetings with the team on the status of outstanding defects.
- Involved in various sign-off meetings with the lead developers, business analysts, managers, and team leads.
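A minimal sketch of the replay-validation pattern mentioned above, again in LoadRunner's C scripting language: the transaction is passed or failed based on whether the expected business outcome actually appeared. The form fields, checkpoint texts, and parameter names are assumptions for illustration.

```c
Action()
{
    // Global check: flag any step whose response contains an error page.
    web_global_verification("Text=Application Error", "Fail=Found",
                            "ID=AppErrCheck", LAST);

    // SaveCount records how many times the confirmation text was found,
    // so the script can validate the business process after replay.
    web_reg_find("Text=Transfer Confirmation", "SaveCount=conf_count", LAST);

    lr_start_transaction("OLB_FundsTransfer");

    web_submit_form("transfer",
                    ITEMDATA,
                    "Name=fromAcct", "Value={FromAccount}", ENDITEM,
                    "Name=toAcct",   "Value={ToAccount}",   ENDITEM,
                    "Name=amount",   "Value={Amount}",      ENDITEM,
                    LAST);

    // Pass/fail the transaction on the business outcome, not just HTTP status.
    if (atoi(lr_eval_string("{conf_count}")) > 0)
        lr_end_transaction("OLB_FundsTransfer", LR_PASS);
    else
        lr_end_transaction("OLB_FundsTransfer", LR_FAIL);

    return 0;
}
```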
Environment: Windows XP/NT, Quality Center, DB Viewer Tool, LoadRunner 9.x, MS Excel, DB2, UNIX
Confidential, (Dec ’06 – Feb ’08)
Client : Confidential,
Work Location : Infosys Technologies, India
Role : QA/Performance Analyst
The objective of this middle-office application is to report the risk of equities-related positions held by UBS as of close of business on the prior working day. The main function of the application is to read position-level and slide data from front-office systems and load the data into designated tables under specific schemas so that it can be used to create and generate reports. The application is mainly used by the MRO to analyze market risk across different desks.
Responsibilities:
- Analyzed business and technical requirements, developed strategic test plans, test cases, and test scripts, and was responsible for executing test cases.
- Involved in system, regression, functional, and performance testing.
- Extensively used VuGen for generating, replaying, and debugging LoadRunner scripts.
- Validated the scripts through parameterization of values such as domain names and input test data.
- Configured runtime settings, recording settings, and general options based on client requirements, and used rendezvous points wherever required in the scripts (see the sketch after this list).
- Created test scripts to meet the performance test requirements and performed load tests with up to 750 virtual users.
- Extensively used LoadRunner Analysis after test runs to examine the results and diagnose issues such as excessive CPU utilization, memory leaks, and network, database, and application problems.
- Maintained test logs and test summary reports, and participated in defect review/status meetings.
- Designed various test scenarios per the load model; load models were created based on user requirements, the number of concurrent Vusers per workflow, and the estimated TPS per workflow.
- Generated scripts and handled correlation and parameterization using LoadRunner VuGen, executed scenarios using the LoadRunner Controller, and analyzed the results using LoadRunner Analysis.
- Monitored WebLogic application server counters, such as pending requests, current free JVM heap, and CPU, during the test.
- Reported the final results, observations and conclusions of the performance test to the Project team.
- Verified the flow of data from front-end systems to the end user. Used TOAD to write SQL queries for data completeness, ensuring that all expected data was loaded correctly into tables.
- Worked closely with business analysts, development team, database administrators and system architects in troubleshooting activities to identify business logic and coding errors.
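A minimal sketch of the rendezvous and domain-parameterization techniques noted above, in LoadRunner's C scripting language. The {Domain} parameter, URL path, rendezvous name, and transaction name are placeholders, not the project's actual values.

```c
Action()
{
    char url[256];

    // {Domain} is parameterized in VuGen so the same script can target
    // different environments without edits (placeholder parameter name).
    sprintf(url, "URL=https://%s/reports/positions", lr_eval_string("{Domain}"));

    // Rendezvous: the Controller holds Vusers here and releases them
    // together to create a concurrency spike on report generation.
    lr_rendezvous("generate_report");

    lr_start_transaction("MRO_GenerateReport");
    web_url("positions_report", url, LAST);
    lr_end_transaction("MRO_GenerateReport", LR_AUTO);

    lr_think_time(5);  // pacing between iterations
    return 0;
}
```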
Environment: Windows XP/NT, Quality Center, LoadRunner 8.x, MS Excel, Oracle 9i, TOAD, UNIX, Shell Scripts, Teradata, Informatica 7.1, MS Access.
Confidential, (Mar ’06 – Nov ’06)
Client : Confidential,
Work Location : Confidential, India
Role : Test Engineer
This web application enables Microsoft’s partners to generate revenue for Microsoft by selling its products to end customers and registering for incentives against those sales.
Responsibilities:
- Owned the complete test responsibility of the core PSX functionality.
- Validated the back-end data by using SQL.
- Was the test team’s point of contact for UAT and production support issues.
- Analyzed and translated business requirement documents into test scenarios and test cases.
- Maintained the requirements traceability matrix across the deliverables of the project.
- Escalated and communicated critical defects and quality issues to design and development teams.
- Documented the test results in detail, including data snapshots before and after execution, comparing the data to prove the test.
- Participated in bug triage meetings with developers to validate bug severity, and was responsible for tracking the bug life cycle using ETCM.
- Involved in tracking the defects to ensure Quality Control of the product.
Environment: WindowsXP/NT, VSTF, ETCM, Siebel, MSCRM, MS Excel, Lotus Notes, Test Director 8.0
Confidential, (Nov ’04 – Jan ’06)
Client : Confidential, India
Role : Test Engineer
The main objective of this application is to track information about software companies. It mainly deals with company information, appointments, and reports. Company Information retrieves all the information about a company, which is tracked through the company ID. Appointments deals with contacting the company’s contact person, scheduling the appointment, and meeting the client. Reports generates reports such as the Company Report, Appointment Report, and Discussion Report.
- Involved in the functional study of the application.
- Analyzed the UAT requirements of the application.
- Participated in the design, development, testing and deployment of the application.
- Maintained records of clients who visited the company.
- Involved in manually testing the application under test (AUT).
- Created Test Cases for some of the modules of the application under test (AUT) using Functional Specifications.
- Involved in developing a full suite of test cases to test the product’s stability.
- Executed test cases and reported defects classified as Major, Minor, or Critical.
- Involved in integration testing and system testing.
- Defect Tracking.
Environment: VB.NET, MS Access, Windows 2000, MS Excel, Manual Testing