Performance Tester Resume
Chicago, IL
SUMMARY:
- 12+ years of IT experience in software testing and L3 production support, with a focus on LoadRunner, UNIX shell scripting, Core Java, EAI/ETL, SQL, Oracle, SOAP UI, and Google Analytics.
- Over those 12+ years, served as a Performance Tester, ETL Tester, and L3 Production Support SME.
- As a performance tester/analyst, my responsibilities included:
- Performance test engineering using LoadRunner.
- Building and maintaining the performance testing strategy, test scenarios, and test cases.
- Developing and executing performance, load, concurrent-user, and stress tests, and publishing the results.
- Reviewing requirements and functional specifications.
- Managing and configuring the test environment.
- Performing capacity, scenario, and endurance tests.
- Analyzing scalability, throughput, and load testing metrics against test servers.
- Comparing and contrasting system performance under varying levels of physical resources such as RAM, CPU, disk utilization, and network.
- Executing performance optimization experiments and recommending short- and long-term plans.
- Measuring, reporting, and recommending performance response-time guidelines/SLAs.
- Generating test summary reports for management review.
- Analyzing root causes of performance issues and providing corrective actions.
- Reporting defects in HP ALM and JIRA so that the DEV team can fix code and tune SQL queries to improve performance (a small sqlplus sketch of this kind of tuning check appears at the end of this summary).
- As an ETL tester, my responsibilities included:
- Testing ETL components/software built with the ETL tools Ab Initio and DataStage.
- Testing ETL data warehouse components.
- Executing backend data-driven tests.
- Creating, designing, and executing the test strategy, test scenarios, test plans, and test cases.
- Writing and maintaining test cases in HP ALM and SharePoint.
- Executing test cases in both INT and UAT environments.
- Logging defects in HP ALM.
- Identifying, troubleshooting, and providing solutions for potential issues.
- Covering test execution through unit, integration, UAT, performance, and smoke testing.
- Holding defect triage meetings twice a week to classify defects as must-fix or not.
- Running multiple rounds of testing (system, integration, regression) to ensure the software functions per business requirements and the code is defect-free once moved to production.
- Performing exploratory testing to evaluate the robustness and functionality of software products.
- As an L3 production support SME, analyzed root causes and fixed multiple production issues.
- Performed code-level development for minor enhancement requests (MERs).
- Primarily used UNIX shell scripting and Core Java for development, and tested changes in lower environments.
- Had development work reviewed and approved by business users in the UAT environment before moving code to production through a change request (CR).
- Executed all ITSM processes (incident, problem, change, and release management, SRQs, etc.) to carry out day-to-day production support activities.
- Used HPSD and ServiceNow to track ITSM process activities.
- Extensive experience with testing tools and software: LoadRunner, JUnit, SOAP UI (REST and SOAP automated API testing), Google Analytics, HP ALM, and JIRA.
- Used LoadRunner to measure system behavior and performance under load.
- Used JUnit for unit testing applications.
- Good knowledge of the HP QTP automation tool; executed functional and regression test cases with it to speed up testing efforts.
- Extensive experience with UNIX shell scripting, Core Java, JSP, XML, SQL, Web Services, WebSphere MQ, EDI, and EAI/ETL technologies.
- Used HPSD (HP Service Desk) and ServiceNow for incident, problem, change, SRQ, and release management.
- Business domains: Manufacturing and Supply Chain, Automotive, Retail and Pharmacy, Banking and Financial Services.
- Worked on projects in both Scrum/Agile and Waterfall frameworks.
- Experience with DevOps tools including Git and Atlassian JIRA.
- Experience in using Oracle 11g.
- Used HP ALM and JIRA for defect management and release management.
- Used Google Analytics to measure the results of individual activities in real time and to compare data across dates for business decisions.
- Experience in testing the web services using SOAP UI.
- Experience with version control and build tools such as Tortoise SVN, Git, and CVS.
- Strong problem-solving skills, delivering quality solutions to complex problems within limited time spans.
- Self-motivated team player with natural leadership traits and excellent communication skills.
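A minimal sketch of the SQL tuning check referenced above, assuming sqlplus access to an Oracle test schema; the connection string, user, password variable, and orders table are hypothetical:

#!/bin/sh
# Capture the optimizer's plan for a slow query so it can be attached to an
# HP ALM/JIRA defect for the DEV team. All names below are illustrative;
# QA_PASS is supplied via the environment.
sqlplus -s "qa_user/${QA_PASS}@TESTDB" <<'EOF'
EXPLAIN PLAN FOR
  SELECT o.order_id, o.status
  FROM   orders o                      -- hypothetical table
  WHERE  o.created_dt > SYSDATE - 1;   -- suspect filter from the defect

-- Render the plan (standard Oracle utility)
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT
EOF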
TECHNICAL SKILLS:
Testing Tools: LoadRunner, JUnit, SOAP UI, Google Analytics, HP ALM, JIRA.
EAI/ETL Middleware Integration: Sun JCAPS, Ab Initio, DataStage, EDI, Tumbleweed, WebSphere MQ.
Web Technologies: JSP, SOAP UI, MQ, Axway B2B Integrator.
ITSM Tools: HPSD, ServiceNow.
RDBMS: Oracle 11g, MSSQL, Netezza 7.0
Operating Systems and Scripting: UNIX, UNIX shell scripting, Sun Solaris.
Office Tools: MS Office.
Repositories: Tortoise SVN, SharePoint, Git, CVS.
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
Performance Tester
Responsibilities:
- Conduct system performance testing to ensure system reliability, capacity and scalability.
- Conduct smoke testing to show that the application performs as expected under normal conditions.
- Conduct stress testing to find out how the application performs under worst-case conditions, helping developers determine which components are likely to fail first (a simple shell sketch of this kind of check follows this list).
- Develop a complete understanding of the FRD, BRD, and TDD.
- Work with testing team to develop performance test plans and cases.
- Analyze test results and coordinate with development teams for bug fixes.
- Generate test summary reports for management review and sign-off.
- Upload the test results, scenarios, cases, and strategies to the designated SharePoint location.
- Analyze root causes of performance issues and provide corrective actions.
- Take part in coaching sessions conducted by the Scrum Master on Agile Scrum practices to support successful deliveries.
- Make sure the project runs smoothly and every team member has the tools, software, and access needed to work effectively.
- Take part in daily scrum, sprint planning, sprint demo, and retrospective meetings.
- Ensure correct use of the Scrum process in coordination with the Scrum team.
- Attend the daily stand-up meeting and give a high-level update on the previous day's work and the plan for the current day.
- Help the Scrum Master keep the product backlog in shape and ready for the next sprint, as needed.
- Helped the offshore team understand epics, user stories, tasking, story points, and backlog tracking.
- Coordinate with other Scrum team members to avoid delays in requirements gathering.
- Take part in user story grooming sessions to better understand requirements and provide story points and velocity estimates from the testing side.
- Log defects in JIRA under the defined board.
- Provide continuous feedback to the entire development team on the status/progress of the product being designed.
- Make suggestions and give input on automating nightly jobs, integration, and regression testing, as well as exploratory and acceptance testing.
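A simple shell sketch of the smoke/stress checks above, using curl as a stand-in for a LoadRunner scenario; the endpoint URL and user count are illustrative assumptions:

#!/bin/sh
# Fire a burst of concurrent requests and record status and response time per
# "virtual user" -- a plain-shell stand-in for a LoadRunner scenario.
URL="https://app.example.com/health"   # hypothetical endpoint
USERS=25                               # simulated concurrent users

for i in $(seq 1 "$USERS"); do
  # Each background curl reports its HTTP status and total response time
  curl -s -o /dev/null \
       -w "user=$i status=%{http_code} time=%{time_total}s\n" "$URL" &
done
wait   # let all background requests finish
echo "smoke run complete: $USERS concurrent requests against $URL"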
Software and Tools Used: LoadRunner, Charles River Advisor, Oracle SQL, UNIX, JIRA, SOAP UI, Google Analytics.
Confidential, Chicago, IL
Performance Tester/ETL Tester
Responsibilities:
- Conduct system performance testing to ensure system reliability, capacity and scalability.
- Conduct smoke testing to show that the application performs as expected under normal conditions.
- Conduct stress testing to find out how the application performs under worst-case conditions, helping developers determine which components are likely to fail first.
- Work with testing team to develop performance test plans and cases.
- Develop a complete understanding of the FRD, BRD, and TDD.
- Analyze test results and coordinate with development teams for bug fixes.
- Generate test summary reports for management review and sign-off.
- Upload the test results, scenarios, cases, and strategies to the designated SharePoint location.
- Analyze root causes of performance issues and provide corrective actions.
- Take part in daily scrum, sprint planning, sprint demo, and retrospective meetings.
- Ensure correct use of the Scrum process in coordination with the Scrum team.
- Attend the daily stand-up meeting and give a high-level update on the previous day's work and the plan for the current day.
- Help the Scrum Master keep the product backlog in shape and ready for the next sprint, as needed.
- Helped the offshore team understand epics, user stories, tasking, story points, and backlog tracking.
- Coordinate with other Scrum team members to avoid delays in requirements gathering.
- Take part in user story grooming sessions to better understand requirements and provide story points and velocity estimates from the testing side.
- Design test cases after studying the FRD and BRD documents; once the test cases are signed off, perform the execution.
- Followed the complete software testing life cycle (STLC) for successful end-to-end execution of the project.
- Test the ETL components and their functions, as well as the ETL data warehouse system.
- Run the ETL graphs through UNIX scripts and validate the results in the Oracle DB and the generated flat files (a shell sketch of this kind of validation follows this list).
- Generate sample graphs as required so that DB tables or files can be compared to find differences.
- Report failures with complete information as defects in HP ALM and assign them to the DEV team for fix and RCA.
- Develop and implement quality assurance problem reporting processes and systems.
- Analyze the application data model document in full and design the test cases accordingly.
- Log defects in HP ALM and JIRA.
- Execute test cases: sanity, integration, and regression tests.
- Discuss process flows with the DEV team and business users and redesign test cases whenever required.
- Understand project requirements and provide QA effort estimates.
- Get the final test strategy, test scenarios, test cases, and test summary report approved by the QA manager.
- Conduct meetings with the various stakeholders to keep them aligned with the project timeline.
- Create the regression test suite in HP ALM and get it signed off by the customer.
- Upload test cases, test results, FRD/FDD/BRD, etc. to SharePoint.
- Explain project requirements and technical details to the offshore team so they can carry out testing activities smoothly.
- Schedule UAT meetings with business users well in advance to share the testing strategy and timelines so they are aligned and ready to support UAT activities.
- Follow the Agile process methodology throughout.
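A minimal sketch of the ETL validation flow referenced above, assuming a deployed Ab Initio graph wrapper script and sqlplus access; the script, table, file, and connection names are hypothetical:

#!/bin/sh
# Run a deployed Ab Initio graph, then validate: compare source/target row
# counts in Oracle and diff the generated flat file against a baseline.
GRAPH=/app/abinitio/run/load_customer.ksh   # hypothetical deployed graph script
OUT_FILE=/data/out/customer_extract.dat     # flat file produced by the graph
BASELINE=/data/expected/customer_extract.dat

"$GRAPH" || { echo "graph failed"; exit 1; }

# Row-count check between staging (source) and warehouse (target) tables;
# QA_PASS is supplied via the environment.
sqlplus -s "qa_user/${QA_PASS}@TESTDB" <<'EOF'
SELECT 'SRC=' || COUNT(*) FROM stg_customer;   -- hypothetical source table
SELECT 'TGT=' || COUNT(*) FROM dw_customer;    -- hypothetical target table
EXIT
EOF

# Compare the generated flat file with the expected baseline
if diff -q "$OUT_FILE" "$BASELINE" >/dev/null; then
  echo "flat file matches baseline"
else
  echo "flat file differs from baseline -- logging a defect in HP ALM"
fi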
Tools Used: LoadRunner, JUnit, HP ALM, JIRA, SOAP UI.
Software Used: UNIX shell scripting, Oracle, Java, Ab Initio, Google Analytics.
Confidential, Chicago, IL
Performance Tester/ETL Tester
Responsibilities:
- Conduct system performance testing to ensure system reliability, capacity and scalability.
- Conduct smoke testing to show that the application performs as expected under normal conditions.
- Conduct stress testing to find out how the application performs under worst-case conditions, helping developers determine which components are likely to fail first.
- Develop a complete understanding of the FRD, BRD, and TDD.
- Work with testing team to develop performance test plans and cases.
- Analyze test results and coordinate with development teams for bug fixes.
- Generate test summary reports for management review and sign-off.
- Upload the test results, scenarios, cases, and strategies to the designated SharePoint location.
- Analyze root causes of performance issues and provide corrective actions.
- Followed the complete software testing life cycle (STLC) for successful end-to-end execution of the project.
- Test the ETL components and their functions, as well as the ETL data warehouse system.
- Run the ETL graphs through UNIX scripts and validate the results in the Oracle DB and the generated flat files.
- Generate sample graphs as required so that DB tables or files can be compared to find differences.
- Report failures with complete information as defects in HP ALM and assign them to the DEV team for fix and RCA.
- Develop and implement quality assurance problem reporting processes and systems.
- Design test cases after studying the FRD and BRD documents; once the test cases are signed off, perform the execution.
- Analyze the application data model document in full and design the test cases accordingly.
- Log defects in HP ALM and JIRA.
- Execute test cases: sanity, integration, and regression tests.
- Discuss process flows with the DEV team and business users and redesign test cases whenever required.
- Understand project requirements and provide QA effort estimates.
- Get the final test strategy, test scenarios, test cases, and test summary report approved by the QA manager.
- Conduct meetings with the various stakeholders to keep them aligned with the project timeline.
- Create the regression test suite in HP ALM and get it signed off by the customer.
- Upload test cases, test results, FRD/FDD/BRD, etc. to SharePoint.
- Explain project requirements and technical details to the offshore team so they can carry out testing activities smoothly.
- Schedule UAT meetings with business users well in advance to share the testing strategy and timelines so they are aligned and ready to support UAT activities.
- Follow the Agile process methodology throughout.
Software Used: UNIX, Oracle, SQL, Ab Initio.
Tools Used for Defect Tracking: HP ALM, JIRA.
Tools Used for Testing: LoadRunner, JUnit.
Confidential, Chicago, IL
Performance Tester/ETL Tester
Responsibilities:
- Conduct system performance testing to ensure system reliability, capacity and scalability.
- Conduct smoke testing to show that the application performs as expected under normal conditions.
- Conduct stress testing to find out how the application performs under worst-case conditions, helping developers determine which components are likely to fail first.
- Develop a complete understanding of the FRD, BRD, and TDD.
- Work with testing team to develop performance test plans and cases.
- Analyze test results and coordinate with development teams for bug fixes.
- Generate test summary reports for management review and sign-off.
- Upload the test results, scenarios, cases, and strategies to the designated SharePoint location.
- Analyze root causes of performance issues and provide corrective actions.
- Followed the complete software testing life cycle (STLC) for successful end-to-end execution of the project.
- Test the ETL components and their functions, as well as the ETL data warehouse system.
- Run the ETL graphs through UNIX scripts and validate the results in the Oracle DB and the generated flat files.
- Generate sample graphs as required so that DB tables or files can be compared to find differences.
- Report failures with complete information as defects in HP ALM and assign them to the DEV team for fix and RCA.
- Develop and implement quality assurance problem reporting processes and systems.
- Design test cases after studying the FRD and BRD documents; once the test cases are signed off, perform the execution.
- Analyze the application data model document in full and design the test cases accordingly.
- Log defects in HP ALM and JIRA.
- Execute test cases: sanity, integration, and regression tests.
- Discuss process flows with the DEV team and business users and redesign test cases whenever required.
- Understand project requirements and provide QA effort estimates.
- Get the final test strategy, test scenarios, test cases, and test summary report approved by the QA manager.
- Conduct meetings with the various stakeholders to keep them aligned with the project timeline.
- Create the regression test suite in HP ALM and get it signed off by the customer.
- Upload test cases, test results, FRD/FDD/BRD, etc. to SharePoint.
- Explain project requirements and technical details to the offshore team so they can carry out testing activities smoothly.
- Schedule UAT meetings with business users well in advance to share the testing strategy and timelines so they are aligned and ready to support UAT activities.
- Follow the Agile process methodology throughout.
Software Used: UNIX, Oracle, Java, Ab Initio.
Tools Used for Defect Tracking: HP ALM, JIRA.
Testing Tools: LoadRunner, JUnit.
Version Repositories: Git, Tortoise SVN, CVS.
Confidential, Detroit, MI
Performance Tester/ETL Tester
Responsibilities:
- Conduct system performance testing to ensure system reliability, capacity and scalability.
- Conduct smoke testing to show that the application performs as expected under normal conditions.
- Conduct stress testing to find out how the application performs under worst-case conditions, helping developers determine which components are likely to fail first.
- Develop a complete understanding of the FRD, BRD, and TDD.
- Work with testing team to develop performance test plans and cases.
- Analyze test results and coordinate with development teams for bug fixes.
- Generate test summary reports for management review and sign-off.
- Upload the test results, scenarios, cases, and strategies to the designated SharePoint location.
- Analyze root causes of performance issues and provide corrective actions.
- Followed the complete software testing life cycle (STLC) for successful end-to-end execution of the project.
- Test the ETL components and their functions, as well as the ETL data warehouse system.
- Run the ETL graphs through UNIX scripts and validate the results in the Oracle DB and the generated flat files.
- Generate sample graphs as required so that DB tables or files can be compared to find differences.
- Report failures with complete information as defects in HP ALM and assign them to the DEV team for fix and RCA.
- Develop and implement quality assurance problem reporting processes and systems.
- Design test cases after studying the FRD and BRD documents; once the test cases are signed off, perform the execution.
- Analyze the application data model document in full and design the test cases accordingly.
- Log defects in HP ALM and JIRA.
- Execute test cases: sanity, integration, and regression tests.
- Discuss process flows with the DEV team and business users and redesign test cases whenever required.
- Understand project requirements and provide QA effort estimates.
Software Used: UNIX, DataStage, Java, Oracle, SQL, WebSphere MQ, SOAP UI, Tumbleweed.
Tools Used for Defect Tracking: HP ALM, JIRA, HPSD.
Tools Used for Testing: LoadRunner, JUnit.
Confidential, Detroit, MI
L3 Production Support SME
Responsibilities:
- Provide 24×7 operational support for production, including holidays and weekends.
- Monitor all alerts and escalate issues to the concerned team for quick resolution.
- Conduct SME reviews with the team to ensure coding standards and best practices are followed.
- Supported new projects in setting-up the code templates, checklists, standards and SLA documents.
- Participate in business and technical requirements discussions.
- Work closely with business users and technical architects to deal with critical issues and close them within the SLA.
- Coordinate with the IT team and external vendors to ensure effective application services and reliability across all applications.
- Coordinate with different teams, raise incident tickets for all issues, analyze root causes, and assist in efficient resolution of all production issues/bugs.
- Create customized UNIX shell scripts, scheduled via cron jobs, that monitor the end-to-end flow of each application and send an alert mail to the owning team when an issue is found; these scripts reduce manual checks per application and helped avoid customer complaints (a minimal sketch follows this list).
- Perform development fixes for all minor enhancement requests (MERs) using UNIX shell scripting and Java, integrated within the EAI/ETL tools.
- Create change requests and get them approved by the change authorization board so that dev code can be moved to production.
- Developed a UNIX script feeding a team dashboard displayed on a TV screen, helping the team take corrective action in a timely manner.
- Provide functional and technical knowledge transfer to team members for smooth support of the application.
- Receive incidents and requests from end users, analyze them, and either respond with a solution or escalate to the appropriate IT team.
- Support troubleshooting of client issues with high-level data analysis and project review.
- Maintain interaction across the organization and with third-party entities.
- Engage in client calls as required.
- Provide support during hours of operation and for off-hours production emergencies.
- Identify root causes of production issues and implement permanent solutions to avoid recurrence.
- Maintain the KEDB (known error database) of identified issues for future reference.
- Manage ITSM processes through HPSD.
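A minimal sketch of the cron-driven monitoring scripts described above; the heartbeat-file convention, paths, and team addresses are illustrative assumptions:

#!/bin/sh
# crontab entry (illustrative): */30 * * * * /app/support/monitor_flows.sh
# Check that each monitored application has produced a fresh heartbeat file
# and mail the owning team if the end-to-end flow appears stalled.

APP_LIST=/app/support/monitored_apps.txt   # lines of: app_name heartbeat_file team_email

while read -r APP HEARTBEAT TEAM; do
  # Flag the app if its heartbeat file was not updated in the last 30 minutes
  if ! find "$HEARTBEAT" -mmin -30 2>/dev/null | grep -q .; then
    echo "End-to-end flow check failed for $APP at $(date)" \
      | mailx -s "ALERT: $APP flow stalled" "$TEAM"
  fi
done < "$APP_LIST"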
Software Used: UNIX shell scripting, Java, Oracle, EAI/ETL (JCAPS/SRE), WebSphere MQ.
Tools Used for Defect Tracking: HPSD.