Summary of Qualifications:
- Over 8 years of professional experience in the IT industry, with core competency in manual and automated testing and end-to-end test coordination for distributed, client/server, and web-based applications built on ASP, .NET, and Java/J2EE.
- Quality assurance experience spanning student loan, banking, telecommunications, insurance, and e-commerce organizations.
- Expertise using manual and automated tools (Mercury Quality Center, LoadRunner, and QuickTest Professional) to test applications developed in the .NET Framework, Java, VB, Oracle, and SQL Server on Windows and UNIX platforms.
- Expertise in preparing master test plans, formulating test scenarios, and preparing traceability matrices from the requirements and test case documents.
- Worked closely with developers and business analysts to create scripts that emulate the actual business process for documenting and testing.
- Experience in writing test scripts, designing test procedures and test cases, and performing verification and validation.
- Good knowledge and understanding of various testing methodologies, such as Agile and Waterfall, and their life cycles.
- Strong experience with client/server and web-based business applications, including test planning, test case development, test case reviews, test data preparation, test setup, test execution, test analysis, and defect reporting and tracking.
- Expertise in System Testing, Functional Testing, Integration Testing, Performance Testing, Regression Testing, and User Acceptance Testing (UAT).
- Experience in development of test suites and test harness for regression testing.
- Strong knowledge of test management and bug reporting tools such as MS Product Studio, TestDirector, Bugzilla, and Quality Center. Extensively involved in backend testing with Oracle and SQL Server.
- Knowledge of the Oracle E-Business Suite global ERP solution (supply chain and finance modules). Experienced in working closely with release management teams and third-party clients, managing test teams, and interfacing between the test team and clients.
- Experience leading a team of test engineers, coordinating the testing activities between the onsite and offshore test team, reporting the testing activities to the client and ensuring that deliverables are met.
- Testing Tools: LoadRunner 11.52/11.0/9.5/8.1, Performance Center 11.0/9.5, HP UFT 11.5, QTP 10.x/9.x, HP QC 10.x/9.x, HP BPT, NeoLoad 4.2, Oracle Application Testing Suite 12.x
- Languages: PL/SQL, C, and C++
- Databases: Oracle, MS SQL Server, MS Access
- Web Tools: HTML, DHTML, XML, SOAP
- Tools: TOAD, SQL Query Analyzer, Visual Basic, MS office tools, Database Monitors
- Operating Systems: MS-DOS, Windows NT/XP/2003/Vista/07/08, UNIX
- Data Warehouse Tools: Informatica
- Test Management Tools: Product Studio, Quality Center, Bugzilla, Jira
- Version Control: MS Visual SourceSafe, Microsoft TFS, MS SharePoint
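As an illustration of the backend testing mentioned above, a typical check compares a source table against its loaded target with plain SQL. This is a minimal sketch using Python's built-in SQLite as a stand-in for the Oracle/SQL Server backends; the table names and data are invented:

```python
import sqlite3

# In-memory SQLite database standing in for the real backends,
# with a hypothetical source table and its loaded target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

# Check 1: row counts must match after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# Check 2: no joined rows may differ between source and target.
diffs = cur.execute("""
    SELECT s.id FROM src s JOIN tgt t ON s.id = t.id
    WHERE s.amount <> t.amount
""").fetchall()
print(f"rows={src_count}, mismatches={len(diffs)}")
```

The same count-and-diff pattern applies unchanged to Oracle or SQL Server; only the connection layer differs.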
Role: Lead Performance Testing Analyst
Walmart is the world's largest retailer. Walmart IT maintains a number of applications covering supply chain, forecasting and replenishment, legacy systems, and Java-based eCommerce. I worked on functionality around billing and payment, gifting, shipping, POS, forecasting and replenishment, and pricing.
- Responsible for writing both high-level and detailed test plans.
- Organized walkthroughs and inspections with business stakeholders, business analysts, and developers to determine the scope of testing and the testability of requirements.
- Led a team of 6 performance test engineers
- Responsible for setting up the Performance QA environment, scheduling the testing efforts.
- Responsible for coordinating the build migration to UAT and other testing environments
- Worked with Business Analysts and customers to compile the Functional Requirements.
- Responsible for testing new releases and comparing them against existing baseline results
- Used SharePoint for uploading of project deliverables and testing resources
- Performance tested both .NET Apps and J2EE JAVA Apps
- Responsible for conducting defect triages to identify and analyze open defects
- Created Data Loading activities using LoadRunner scripts, Batch scripts, SQL Scripts etc.
- Developed scripts using Web (HTTP/HTML), Web (Click and Script), Ajax, SAP GUI, SAP Web, Citrix, and WCF web service protocols.
- Conducted ETL testing in terms of performance
- Used SOAPUI for validating and testing web services across various applications
- Used SOAPUI to develop XML requests from WSDLs and verified that the services responded correctly.
- Performed Data Seeding for Vusers and assisted Dev team in Database build for the application.
- Verified that new or upgraded applications met specified performance requirements; identified long-running queries and optimized them to improve performance
- Scheduled Oracle AWR reports for the load test duration and analyzed the snapshots after each load test completed
- Used TFS for configuration management
- Used NeoLoad for creating scripts for HTML5 and FLEX based Applications.
- Worked with the Application, DB and other teams on Tuning of the Application and Database tiers
- Responsible for analysis of Production logs in IIS and WebSphere Application servers
- Used Wily and TeamQuest for analyzing thread counts, thread dumps, and other performance metrics
- Generated and analyzed Wily reports to confirm the load levels
- Worked with DBAs on SPM (SQL Plan Management) and tuning of expensive queries across various applications.
- Created test scenarios based on the test plan and used the LoadRunner Controller to execute multi-user performance tests, making use of online monitors, real-time output messages, and other Controller features.
- Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report.
- Used APM monitoring tools such as Wily Introscope and HP Diagnostics to monitor parameters like CPU, memory, and thread pools
- Assisted in the production of testing and capacity certification reports; investigated and troubleshot performance problems in both lab and production environments.
- Extensively used SQL Server 2008 and Oracle 11 for data preparation and to validate data in the UI and reports.
- Used Quality Center for defect tracking and script creation and execution under the Waterfall methodology.
- Worked closely with Scrum Master during Sprint Planning, Retrospective and Estimation sessions.
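The SOAPUI/WSDL work above amounts to reading a service's WSDL and exercising its operations. As a rough illustration, the snippet below parses a tiny, hypothetical WSDL fragment with Python's standard library to list the operations a tester would target (the service and operation names are invented):

```python
import xml.etree.ElementTree as ET

# Hypothetical WSDL fragment standing in for the real WSDLs that
# SOAPUI would load; only the parts needed to enumerate operations.
WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/"
                       name="BillingService">
  <portType name="BillingPort">
    <operation name="GetInvoice"/>
    <operation name="SubmitPayment"/>
  </portType>
</definitions>"""

ns = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}
root = ET.fromstring(WSDL)
# Collect the operation names declared in the portType.
ops = [op.attrib["name"] for op in root.findall(".//wsdl:operation", ns)]
print(ops)
```

In practice SOAPUI generates a sample request per operation from the WSDL's message schemas; this sketch only shows the discovery step.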
Environment: HP LoadRunner 11.52, HP Performance Center 11.5, Windows 2000/NT, SQL, Apache, Quality Center, VuGen, web services, SAP SD/MM/F&R/BI, ETL, Informatica 9.5, SOAPUI 5.0, COGNOS, IIS 7.5/6.0, Microsoft TFS, WebSphere Application Server (WAS), AIX, Wily Introscope, Apache JMeter, Oracle 11i/12c, BAC, Linux, UNIX, AWR, TOAD
Role: Performance Testing Analyst
Created by the Texas Legislature in 1979, TG is a public, nonprofit corporation that administers the Federal Family Education Loan Program (FFELP). The FFELP, formerly known as the Guaranteed Student Loan Program, is a government-sponsored program that provides low-interest loans to help students and their parents pay for education beyond high school.
- Worked closely with Business Analysts and Developers to gather Application Requirements and Business Processes in order to formulate the test plan.
- Complied with the existing and established test processes and methodologies
- Involved as a Performance Testing Analyst in establishing individual benchmarks and baselines for a number of J2EE and .NET applications.
- Created numerous LoadRunner scripts using a variety of protocols, including Web (HTTP/HTML), Mobile (HTTP/HTML), Ajax TruClient, Web (Click and Script), SAP-Web, and web services.
- Created mobile scripts and executed them with different types of network speed simulation.
- Created the LoadRunner scripts using the LoadRunner API in an effort to enhance the scripts and incorporate the business logic.
- Led a team of 5 performance testers across various performance engagements.
- Created a number of load testing scripts to populate data from the front end for data seeding purposes.
- Conducted a variety of tests including Baseline, Benchmark, Endurance, Stress and Volume Testing.
- Worked extensively with HP Performance center for scheduling and execution of Load Tests.
- Involved in setup and installation of LoadRunner and Generators across multiple Geographic locations.
- Performed WAN emulation for load tests using Shunra.
- Generated virtual users to simulate multi-user and multi-session logins and analyzed the results.
- Performed stress testing, endurance and volume testing.
- Used Red Gate Deployment Manager 2 to deploy releases to various environments such as INT, UAT, and Staging.
- Monitored scenario runs using the various online monitors in LoadRunner.
- Used HP SiteScope for monitoring .NET and Java web/app servers
- Performed JVM profiling in terms of heap dumps and memory usage.
- Analyzed the results and produced write-ups using LoadRunner Analysis graphs.
- Responsible for creating the workload Distribution tables for various scripting modules involved.
- Responsible for conducting Batch volume testing apart from online testing.
- Responsible for creating the scenario mix and various runtime configurations for the individual scripts that are part of the mix.
- Ensured connectivity between Performance Center, QC and Load Generators.
- Involved in stress testing, volume testing and endurance testing across various Apps
- Involved in creation of high level strategy documentation and detailed testing docs.
- Worked with the DBA team on the submission and processing of data requests and on migration activities to the load environment.
- Published a high-level load test report to business users and a technical report to the technical team.
- Deeply involved in unit testing, integration testing, performance testing, system testing, and UAT.
- Prepared test data for positive and negative testing, used in data-driven testing to exercise the application dynamically
- Responsible for maintaining automation script library functions.
- Analyzed the Business Requirement Document (BRD), developed detailed test plans, and prepared test cases.
- Performed manual testing on critical functionalities of the application to verify the application is complete and stable.
- Accountable for writing Detail Test Plan by understanding business logic and user requirements.
- Maintained amicable relationships with developers and other stakeholders to better triage and narrow down bugs.
- Set up test cases, test sets, and defects in Quality Center.
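The workload distribution tables mentioned above are usually derived from Little's Law: the Vusers needed to sustain a target throughput equal the arrival rate times the per-iteration time (response plus think time). A small sketch with illustrative numbers (the script names, percentages, and timings are hypothetical, not from any real engagement):

```python
import math

def required_vusers(tps: float, resp_s: float, think_s: float) -> int:
    """Little's Law: concurrency = arrival rate * time each user spends per iteration."""
    return math.ceil(tps * (resp_s + think_s))

# Hypothetical workload mix: each module's share of a 10 tx/s total load.
mix = {"login_search": 0.50, "loan_status": 0.30, "payment": 0.20}
total_tps = 10.0
for script, share in mix.items():
    v = required_vusers(total_tps * share, resp_s=2.0, think_s=8.0)
    print(script, v)
```

With a 2 s response time and 8 s think time, each Vuser completes one iteration every 10 s, so 50% of a 10 tx/s load (5 tx/s) needs 50 Vusers, and so on down the mix.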
Environment: HP LoadRunner 11.52/11.0/9.5, HP Performance Center 11.0/9.5, SAP BI, NetWeaver, HP QC 11.0/9.5, SAP HRMS, HP UFT 11.5/QTP 9.5, OATS, NeoLoad 4.2, Windows 2008/2007/2003/XP/2000, Java J2EE, .NET, IBM Tivoli Performance Viewer, HP Diagnostics, IBM WebSphere, Oracle 11g, Microsoft IIS 7.0, HTML
Role: Performance Analyst
Project: Department of Public Welfare Projects
DPW maintains a number of IT applications supporting various social programs for the state of PA. Applications include food stamps, health and human services, child care, and the Department of Children and Families web application. The IT technology stack ranges from mainframe to open systems.
- Responsible for conducting walk through and discussions on the requirements with the clients and business analysts.
- Understood and digested the business requirements and prepared the software test plan and test cases
- Created a number of load scripts for data seeding purposes.
- Created the workload models based on the requirements
- Designed and created a number of LR Scripts for Applications deployed on a variety of platforms including Java J2EE, .NET, Ajax and web service protocols etc.
- Created LoadRunner scripts manually as needed, without relying heavily on the recording feature
- Developed LoadRunner scripts using a variety of protocols, including Web (HTTP/HTML), SAP, Oracle 11i, web services, Ajax, and Web (Click and Script).
- Worked in onsite-offshore model across various performance testing engagements
- Involved in performing volume testing based on the production volumes and cycles.
- Involved in setting up and configuration of HP Performance center
- Responsible for execution of load tests using HP Performance center
- Responsible for creating the Load Distribution tables for various scripting modules involved.
- Responsible for coordinating the Batch processes alongside the online performance testing efforts.
- Responsible for creating the scenario mix and various runtime configurations for the individual scripts that are part of the mix.
- Worked with the Architecture/Tech team in Performance tuning exercises.
- Responsible for analyzing the load tests run and presenting the results to management
- Generated load test reports and handled their distribution and publication.
- Performed extensive correlation and parameterization of recorded scripts in a variety of environments, including .NET and Oracle WebLogic, to handle varying and dynamic values.
- Conducted various kinds of Benchmark testing, Baseline testing, Silo testing, volume testing, batch testing etc.
- Used HP SiteScope and CA Wily Introscope for monitoring Java web/app servers
- Created a number of scripts exclusively for data setup and data seeding for the AUT.
- Conducted endurance and batch volume testing of various applications
- Used HTTPWatch to monitor HTTP and HTTPS data such as parameters, HTTP status codes, and redirections.
- Tracked issues in HP Quality Center and reported them to stakeholders.
- Ensured test case delivery on schedule and provided regular, timely updates on project testing progress
- Coordinated and cooperated with the team to enhance efficiency and productivity
- Good understanding of SOA requirements and SOA Reference Architecture
- Designed and executed SOA web services for performance testing
- Created, maintained and updated test plans, test cases, test environment, and test ware through life cycle.
- Created and managed testing schedule and monitored completion of activities.
- Set up the test environment and conducted execution per approved test cases and procedures.
- Performed manual testing of various business scenarios for which Automation is not an option.
- Conducted cross-browser testing to check the compatibility of the AUT with different browsers (IE, Netscape, Mozilla, Chrome).
- Prepared test reports and defect reports.
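The correlation and parameterization work described above boils down to capturing a dynamic server value between known boundaries and replaying it in the next request, which is what LoadRunner's web_reg_save_param does with its left/right boundary arguments. A rough Python equivalent, using a made-up .NET __VIEWSTATE response fragment:

```python
import re

# Hypothetical server response fragment, standing in for a recorded
# .NET page; the __VIEWSTATE value changes on every request.
response = '<input type="hidden" name="__VIEWSTATE" value="dDwtMTQ4OTIxNjA2Mjs7Pg==" />'

# Correlate: capture the dynamic value between known boundaries so the
# next request can send it back instead of the stale recorded value.
left, right = 'name="__VIEWSTATE" value="', '"'
m = re.search(re.escape(left) + r"(.*?)" + re.escape(right), response)
viewstate = m.group(1)
print(viewstate)
```

Parameterization is the complementary step: replacing hard-coded recorded inputs (user IDs, dates) with values drawn from a data file per iteration.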
Role: Performance Tester
Capital One is one of the biggest banks in the country. About 120 applications cater to various industry needs, with products ranging from banking and mortgages to securities, retirement, and insurance. The platforms include Java/J2EE, Microsoft, SAP, legacy systems, and middleware.
- Identified the Functional Requirements based on application business requirements and blueprints.
- Developed Test Plan and Test Scenarios to test out the existing reports and analytics for functionality and performance.
- Responsible for creating test scripts using LoadRunner with various protocols, including Web/HTTP and DB Users.
- Responsible for installation of the LoadRunner software including controller software, Agent software, Load Generators etc.
- Performed Systems Integration testing for various Apps.
- Responsible for creating load test scenarios based on the load test scripts to be executed.
- Responsible for identifying the counters that need to be added to the scenario for data collection from an analysis perspective
- Monitored the Available Bytes graph, which indicates whether a memory leak is present in the system.
- Performed load testing on Microsoft COM+, WebLogic, and BizTalk application servers.
- Responsible for migrating the stored procedures and other database objects needed by the load test database.
- Responsible for installing Quality Center with a SQL Server database and creating user profiles.
- Responsible for monitoring the web/app servers for performance and issues
- Used HP Performance center for scheduling and execution of Load tests
- Analyzed the results for Performance bottlenecks across the Technology stack.
- Worked with the Architecture team in Performance tuning exercises.
- Used Fiddler for logging all HTTP(S) traffic.
- Customized views in Quality Center for different kinds of users using VBScript.
- Utilized Quality center for the Test Management and defect tracking activities.
- Involved in compatibility testing of the application with different platforms using Quick Test Pro.
- Involved in Database testing by writing SQL queries, testing triggers and PL/SQL procedures.
- Involved in setting up the test data for the individual modules for the load test.
- Worked with the DBAs to ensure the databases were re-pointed to their original environments once the load test environment was no longer needed.
- Responsible for load testing coordination with the various other projects involved in the load testing activity.
- Responsible for the generation of the LoadRunner Analysis files based on the LoadRunner Results file generated by the load test.
- Responsible for filtering the analysis file data based on the durations required.
- Generated detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
- Involved in preparing the test plan and test cases
- Participated in regular meetings with developers for reviews and walkthroughs.
- Used HP Quality center QC for defect management and test management
- Expertise in design and execution of Test Plans and Test Strategy, Test cases and reporting defects.
- Worked as Quality Center Administrator to maintain users, groups, domain, projects, and policies.
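Memory-leak monitoring of the kind described above (watching the Available Bytes counter) can be reduced to checking whether the counter trends steadily downward under sustained load. A minimal sketch with fabricated sample data; the threshold is illustrative:

```python
def slope(samples):
    """Least-squares slope of samples taken at 1-unit intervals."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical Available Bytes samples (MB) over a soak test, shrinking
# steadily; a healthy system would hover around a stable plateau.
available_mb = [4096, 4050, 4001, 3950, 3905, 3846]
s = slope(available_mb)
print("leak suspected" if s < -1 else "stable")
```

In practice the raw PerfMon samples are noisy, so the trend is computed over a long soak window rather than a handful of points.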
Environment: HP LoadRunner 9.5, HP Performance Center 9.0, Agile, Microsoft .NET, VB.NET, ASP.NET, IIS, Fiddler, JSF Spring 2.0, RAD/STS, WebLOAD, JTest, UNIX Sun Solaris 9.0, Oracle 10g/DB2, SVN, Maven/Hudson/Nexus, QTP 10.0, Quality Center 9
Role: QA Analyst
E-Procurement: Electronic procurement is the company's operational buying process for non-production-related materials and services commodity types. Involved in all phases of the testing life cycle. Performed screen-level testing, functional testing, system testing, and regression testing. Screen-level testing was performed using WinRunner 6.02 and TestDirector 6.0, and load testing using LoadRunner.
- Prepared a test presentation on the existing system detailing the test processes and testing technologies to be implemented.
- Developed the integration and system test plan.
- Tested the business plan.
- Involved in preparing the test plan and test cases in Rational Test Manager.
- Involved in manual and automation testing.
- Automated various tests using Rational Robot.
- Test responsibilities included functional and database testing.
- Involved in QA team meetings and bug tracking meetings.
- Involved in system, integration, and regression testing.
- Helped users perform user acceptance testing.
- Developed scripts using Test Manager for different transactions in the applications.
- Involved in designing test plans and performance test plans.
- Wrote SQL queries to extract data from the database.
- Maintained test logs and test summary reports and participated in status meetings.
- Reviewed and executed test scripts.
- Prepared test reports.
- Prepared defect reports.
Environment: LoadRunner 6.02, Rational Suite (RequisitePro, Test Manager, Robot, ClearQuest, ClearCase), ASP, ADO, COM, MS Site Server, IIS Server, MS Transaction Server, SQL Server, Windows NT 4.0