Sr. QA Automation Resume
Pennington, NJ
SUMMARY
- Around 18 years of diversified experience in Software Quality Assurance, covering analysis, design, and setup of QA processes for functional and performance testing of client/server, mainframe, and web-based applications as well as SAP modules.
- Actively participated in all stages of the Software Development Life Cycle and the testing life cycle, with extensive involvement in all dimensions of performance testing and UAT.
- Involved in various stages of performance testing, including scalability testing, volume testing, failover testing, and disaster recovery testing.
- Extensive experience in setting up testing processes to perform data-driven and regression testing using Mercury/HP and IBM tools.
- Extensive hands-on experience with the HP ALM and IBM test suites.
- Excellent communication and professional skills; have worked independently and led teams in both structured and unstructured environments.
TECHNICAL SKILLS
Testing Tools: HP ALM Suite, LoadRunner, Performance Center, Postman, Sense (ES), Dynatrace, Selenium, OpenSTA, UFT, Elasticsearch HEAD, RFHUtil
Scripting Languages: TSL, VBScript, VBA, JAVA
Protocols: HTTP/HTML, SAP, TruClient, IE/Firefox
Bug Tracking Tools: HP Quality Center, Rational ClearQuest, Bugzilla, JIRA
Database Management: Oracle, MS Access, Sybase
Programming Languages: C/C++, Java, ABAP, ASP .NET, Python
Web Technologies: HTML
Operating Systems: Linux, UNIX, Microsoft Windows NT/XP, iOS
Other: VeriFIX Certification, MS Visio, Balsamiq, FIX Protocol
PROFESSIONAL EXPERIENCE
Confidential, Pennington NJ
Sr. QA Automation
Responsibilities:
- Identified the scope of automation and created an automation framework for regression testing using QTP/UFT for the portfolio rebalancing system Optima.
- Create Automation infrastructure to execute automation scripts on a scheduled basis using Selenium.
- Tested FIX messages by simulating third parties internally, using simulation tools like VeriFIX.
- Tested third-party REST APIs backed by Elasticsearch engines using Postman (a sketch of this style of check appears after this list).
- Developed and executed manual test cases and managed them using Quality Center.
- Tracking and Logging Defects in Rally and QC.
- Maintaining automation scripts for the Order Management System for any future enhancement.
- Involved in all aspects of the STLC, from creating the test plan and test strategy to execution, coordination, and reporting. Developed a customized script framework and complex LoadRunner scripts in C, automating all the use cases in the product.
- Creating and executing performance scenarios for benchmarking to optimize JVM and application configuration and tune performance.
- Identified key performance bottlenecks in the application and performed end-to-end root cause analysis on these bottlenecks to pinpoint the reasons. Presented the analysis in reports to the client, which helped improve performance.
- Creating and executing scenarios for Focus tests to target key business use cases.
- Effectively analyzed GC logs and provided recommendations that helped the client reduce bottlenecks and increase performance on the system under test (a simplified GC-pause sketch follows this list).
- Involved in UNIX testing of trade-related file systems and trader-related file permissions and processes.
- Reviewed error log files on the UNIX box when orders failed to load into SQL tables.
- Conducted user training sessions, one-on-one as well as group lectures, to train end users on system integrity.
- Ensured that all procedures for reconciling equity trade breaks were tested in a timely manner.
- Assessed and analyzed current issue/defect tracking tools and user needs, identified gaps, and developed plans to implement and standardize an issue/defect tracking and reporting tool across the organization. Managed development, architecture, and implementation of the overall workflow for Test Plan/Test Lab.
- Created scenarios using Vusers in LoadRunner and ran load tests in the Controller as per defined SLAs, in tandem with different trading teams and broker/dealers.
- Conduct testing and defect analysis and resolution using HPQC
- Follow Scrum development methodology for development, system and integration testing of application components.
- Develop and deliver appropriate QA work products defined for QA Planning, Preparation and Execution phases. These include test estimates, test plans, project status and defect management, and test metrics reports, and other reports as may be requested by the project team.
- Tested trade-related FIX protocol communications between the application and its interfacing broker/dealer systems.
- Build and maintain effective working relationships with PMO.
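For illustration, a minimal sketch of the style of REST API check run against an Elasticsearch-backed search endpoint, written in Python with the requests library; the endpoint URL, index name, query field, and expected fields are hypothetical placeholders, not the actual project values.

    import requests

    # Hypothetical Elasticsearch search endpoint; the real host and index differed.
    ES_URL = "http://localhost:9200/orders/_search"

    def search_orders(account_id):
        """Query the search API and return the matching documents."""
        query = {"query": {"term": {"accountId": account_id}}, "size": 10}
        resp = requests.post(ES_URL, json=query, timeout=10)
        resp.raise_for_status()  # fail fast on non-2xx responses
        return [hit["_source"] for hit in resp.json()["hits"]["hits"]]

    if __name__ == "__main__":
        docs = search_orders("ACC-1001")
        # Basic response validation: expected fields present on every hit
        for doc in docs:
            assert "accountId" in doc and "symbol" in doc, f"missing fields: {doc}"
        print(f"{len(docs)} documents returned and validated")

In practice Postman collections drove these calls; the sketch only shows the shape of the request and the assertions.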
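A simplified sketch of the GC-pause analysis mentioned above, assuming classic HotSpot GC log lines that end with a "real=<seconds> secs" timing; the log format and the 0.5-second threshold are assumptions, not the client's actual configuration.

    import re
    import sys

    # Matches the trailing "real=0.12 secs" timing on classic HotSpot GC log lines.
    PAUSE_RE = re.compile(r"real=(\d+\.\d+)\s*secs")

    def summarize_gc_pauses(log_path, threshold_secs=0.5):
        """Report count, max, and average GC pause, plus pauses over a threshold."""
        pauses = []
        with open(log_path) as fh:
            for line in fh:
                match = PAUSE_RE.search(line)
                if match:
                    pauses.append(float(match.group(1)))
        if not pauses:
            return "no GC pauses found"
        long_pauses = [p for p in pauses if p > threshold_secs]
        return (f"pauses={len(pauses)} max={max(pauses):.3f}s "
                f"avg={sum(pauses)/len(pauses):.3f}s over_threshold={len(long_pauses)}")

    if __name__ == "__main__":
        print(summarize_gc_pauses(sys.argv[1]))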
Tools Used: HP ALM 12.5, Selenium, JIRA, LoadRunner, RMA, Oracle 12.0, Dynatrace, XMLSpy Editor, Python, FIXML, VeriFIX
Confidential, Brooklyn NY
Sr. QA/Automation Tester/Analyst
Responsibilities:
- Review the data received from the agencies and validate that the files provided are as per the requirements and the data dictionary.
- Process the test files using iWay, ensuring that all valid data flows through the right MQs (message queues) and incorrect data goes to the error queues.
- After iWay processes the files, extract the XMLs using RFHUtil, parse the data using Excel macros, and validate that data against the flat file for any anomalies; report them using QC.
- Once the data is pushed to Elasticsearch using Storm, the JSONs are exported, parsed using Excel macros, and compared with the XMLs for any anomalies (a sketch of this comparison follows this list).
- Ensure all primary and alternate search parameters on Elastic Search queries work as expected.
- Ensure quality and quantity of data that is processed from agencies through Data Share (HUB) via Elastic Search to NYC Portal (UI).
- Used MS Visio for preparing Use cases and relationship diagrams for User Acceptance Testing which was conducted with business users.
- Created Python-based unit tests on core modules (see the unit-test sketch after this list).
- Created Data Driven Test using QuickTest Professional (QTP) / UFT.
- Create infrastructure to execute automation scripts on a scheduled basis using QTP.
- Create Automation infrastructure to execute automation scripts on a scheduled basis using Selenium.
- Performed performance testing on the iWay and Storm servers by processing 3x the expected number of messages at any given time, monitoring the servers using Perfmon and Dynatrace and creating test reports.
- Involved in Performance, Stress and Endurance testing using LoadRunner
- Executing Load Test, Stress Test and Endurance Test in Load Runner
- Design, Develop and Execute load tests using LoadRunner
- Creating test scripts and categorizing them under iWay and Elasticsearch for maximum coverage of positive and negative scenarios.
- Execute all the test cases and report defects, defining severity and priority for each defect.
- Developed and Executed Test Scripts using Selenium and analyzed Test Results.
- Carry out regression testing every time changes are made to the code to fix defects.
- Used JIRA for bug tracking, reporting and requirement traceability and coverage.
- Developed manual and performance test scripts for the entire system/regression-level testing using HP ALM. Worked on designing and developing an automation framework for the Agile development methodology.
- Develop and deliver appropriate QA work products defined for QA Planning, Preparation and Execution phases. These include Load test estimates, Performance test plans, project status and defect management, and test metrics reports, and other reports as may be requested by the project team.
- Ensuring that the performance test schedule and milestones are strictly adhered to and that performance test deliverables are produced on time.
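The JSON-versus-XML comparison above can be pictured with a small Python sketch that checks an exported JSON record against the corresponding XML element; the field names, element names, and file names are illustrative assumptions, not the actual data dictionary.

    import json
    import xml.etree.ElementTree as ET

    # Illustrative field names; the real feed had its own data dictionary.
    FIELDS = ["caseId", "agencyCode", "status"]

    def compare_record(json_record, xml_element):
        """Return (field, json_value, xml_value) tuples that disagree."""
        mismatches = []
        for field in FIELDS:
            json_value = str(json_record.get(field, "") or "")
            xml_value = xml_element.findtext(field, default="")
            if json_value != xml_value:
                mismatches.append((field, json_value, xml_value))
        return mismatches

    if __name__ == "__main__":
        with open("export.json") as fh:
            records = {rec["caseId"]: rec for rec in json.load(fh)}
        for elem in ET.parse("feed.xml").getroot().iter("record"):
            case_id = elem.findtext("caseId", default="")
            rec = records.get(case_id)
            if rec is None:
                print(f"caseId {case_id} missing from JSON export")
                continue
            for field, json_value, xml_value in compare_record(rec, elem):
                print(f"caseId {case_id} mismatch on {field}: json={json_value} xml={xml_value}")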
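A minimal example of the style of Python unit test mentioned above, using the standard unittest module; the core-module function under test is hypothetical.

    import unittest

    # Hypothetical core-module function under test.
    def normalize_agency_code(code):
        """Trim and upper-case an agency code; reject empty values."""
        if code is None or not code.strip():
            raise ValueError("agency code is required")
        return code.strip().upper()

    class NormalizeAgencyCodeTest(unittest.TestCase):
        def test_trims_and_uppercases(self):
            self.assertEqual(normalize_agency_code("  nypd "), "NYPD")

        def test_rejects_blank(self):
            with self.assertRaises(ValueError):
                normalize_agency_code("   ")

    if __name__ == "__main__":
        unittest.main()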
Tools Used: MQ Explorer, iWay, HP ALM 12.0, LoadRunner 12.0, RMA, CMS FICO, Oracle 11/12.0, Perfmon, Dynatrace, XMLSpy Editor, Python
Confidential, Brooklyn NY
Sr. QA and Automation Tester
Responsibilities:
- Handled all responsibilities of test planning, execution, and reporting.
- Ensure that the team has all the necessary resources to execute the testing activities and that testing goes hand in hand with software development in all phases.
- Prepare status reports of testing activities and update the project manager regularly on the progress of testing.
- Created daily defect status reports for management reporting.
- Review all the documents to understand what needs to be tested and, based on the information procured, decide how it is to be tested.
- Inform the test lead about all the resources that will be required for software testing.
- Develop test cases and prioritize testing activities
- Developed and executed test scripts using Selenium and analyzed test results (a Selenium sketch follows this list).
- Used Balsamiq for creating clickable mockups and wireframes.
- Identified the scope for Automation and created scripts using QTP for functional automation.
- Created Data Driven Test using QuickTest Professional (QTP) / UFT.
- Create Automation infrastructure to execute automation scripts on a scheduled basis using Selenium.
- Performed mobile device testing across browsers on Android and iOS devices.
- Execute all the test cases and report defects, defining severity and priority for each defect.
- Carry out regression testing every time changes are made to the code to fix defects.
- Developed manual and performance test scripts for the entire system/regression-level testing using HP ALM. Worked on designing and developing an automation framework for the Agile development methodology.
- Develop and deliver appropriate QA work products defined for QA Planning, Preparation and Execution phases. These include Load test estimates, Performance test plans, project status and defect management, and test metrics reports, and other reports as may be requested by the project team.
- Involved in Performance, Stress and Endurance testing using LoadRunner.
- Executing Load Test, Stress Test and Endurance Test in Load Runner.
- Design, Develop and Execute load tests using LoadRunner
- Ensuring that the performance test schedule and milestones are strictly adhered to and that performance test deliverables are produced on time.
- Involved in defining the overall Performance Test Strategy and scope for performance testing for the R4 project and defining the baseline with all stakeholders.
- Design, develop and support frameworks for our test infrastructure and provide automation expertise to our development teams with a strong emphasis on using code to solve technical challenges and shorten the test cycles through automation using UFT.
- Used MS Visio for preparing Use cases and relationship diagrams for UAT
- Involved in developing a customized script framework and complex LoadRunner scripts in C, creating the LR scripts for each scenario.
- Performed backend testing to ensure data quality on cross environments.
- Performed data-integrity checks across SQL Server, Oracle, and CSV dumps (see the sketch after this list).
- Performed integrated testing using RMA (for rule trigger) and CMS (FICO for content trigger)
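A bare-bones sketch of the kind of Selenium check referenced above, using the Python bindings; the URL, locators, and credentials are placeholders, not the application's real ones, and a locally available Chrome/driver setup is assumed.

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    # Placeholder URL and locators for illustration only.
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.org/login")
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "login-btn").click()
        # Simple post-login assertion on the page title
        assert "Dashboard" in driver.title, f"unexpected title: {driver.title}"
        print("login smoke test passed")
    finally:
        driver.quit()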
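The data-integrity bullet above can be pictured with a short sketch that reconciles key columns between a CSV dump and a database copy; sqlite3 stands in here for the Oracle/SQL Server sources, and the table, column, and file names are hypothetical.

    import csv
    import sqlite3

    def csv_keys(path, key_column="trade_id"):
        """Collect the key column values from a CSV dump."""
        with open(path, newline="") as fh:
            return {row[key_column] for row in csv.DictReader(fh)}

    def db_keys(conn, table="trades", key_column="trade_id"):
        """Collect the same keys from the database copy."""
        cur = conn.execute(f"SELECT {key_column} FROM {table}")
        return {str(row[0]) for row in cur.fetchall()}

    if __name__ == "__main__":
        conn = sqlite3.connect("qa_copy.db")
        in_csv, in_db = csv_keys("trades_dump.csv"), db_keys(conn)
        print("missing in DB:", sorted(in_csv - in_db))
        print("extra in DB:", sorted(in_db - in_csv))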
Tools Used: HP ALM 12.0, LoadRunner 12.0, RMA, CMS FICO, Oracle 11/12.0, QTP, Selenium, UFT
Confidential, Stamford CT
QA Automation Lead/Performance Tester
Responsibilities:
- Identified the scope of automation and created an automation framework for regression testing using QTP/UFT.
- Create Automation infrastructure to execute automation scripts on a scheduled basis using Selenium.
- Tested FIX messages by simulating third parties internally, using simulation tools like VeriFIX (a tag-level validation sketch follows this list).
- Involved in defining the performance scenarios based on the client provided QA use cases and inputs for different Trader and Trade related transactions.
- Tested third-party REST APIs backed by Elasticsearch engines.
- Developed and executed manual test cases and managed them using Quality Center.
- Tracking and Logging Defects in Rally and QC.
- Maintaining automation scripts for the Order Management System for any future enhancement.
- Involved in all aspects of the STLC, from creating the test plan and test strategy to execution, coordination, and reporting. Developed a customized script framework and complex LoadRunner scripts in C, automating all the use cases in the product.
- Creating and executing performance scenarios for benchmarking to optimize JVM and application configuration and tune performance.
- Identified key performance bottlenecks in the application and performed end-to-end root cause analysis on these bottlenecks to pinpoint the reasons. Presented the analysis in reports to the client, which helped improve performance.
- Used Balsamiq for creating clickable mockups and wireframes.
- Creating and executing LR scenarios for Performance Verification Testing (PVT), which helps compare performance across different versions and across different dimensions of the data model within the same version.
- Creating and executing scenarios for Focus tests to target key business use cases.
- Involved in analyzing various graphs for the client side and server side metrics like Transaction Response Time, Hits per second graph, Pages download per second, Throughput, Memory & CPU utilization, trace logs.
- Created unique Peak Load scenarios for “Market Hours”, with various ramp-ups and ramp-downs.
- Effectively analyzed GC logs and provided necessary recommendations to the client that helped reduce bottlenecks and increase performance on the system under test.
- Involved in UNIX testing of trade-related file systems and trader-related file permissions and processes.
- Reviewed error log files on the UNIX box when orders failed to load into SQL tables.
- Conducted user training sessions, one-on-one as well as group lectures, to train end users on system integrity.
- Ensured that all procedures for reconciling equity trade breaks were tested in a timely manner.
- Assessed and analyzed current issue/defect tracking tools and user needs, identified gaps, and developed plans to implement and standardize an issue/defect tracking and reporting tool across the organization. Managed development, architecture, and implementation of the overall workflow for Test Plan/Test Lab.
- Created scenarios using Vusers in LoadRunner and ran load tests in the Controller as per defined SLAs, in tandem with different trading teams and broker/dealers.
- Conduct testing and defect analysis and resolution using HPQC
- Follow Scrum development methodology for development, system and integration testing of application components.
- Develop and deliver appropriate QA work products defined for QA Planning, Preparation and Execution phases. These include test estimates, test plans, project status and defect management, and test metrics reports, and other reports as may be requested by the project team.
- Tested trade-related FIX protocol communications between the application and its interfacing systems, including the NYSE interface.
- Build and maintain effective working relationships with PMO.
- Wrote SQL queries and executed them using TOAD and SQL Editor.
- Serve as the main point of contact with business and offshore teams and communicate test plans, scope and timelines.
- Liaison between the business and development teams to translate business workflows into testable business use cases.
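For illustration, a tiny sketch of the kind of tag-level check applied to raw FIX messages; the tag numbers follow the public FIX specification, but the required-tag set and sample message are simplified assumptions rather than the desk's actual rules.

    SOH = "\x01"  # standard FIX field delimiter

    # Tags a simplified NewOrderSingle (35=D) is expected to carry.
    REQUIRED_TAGS = {"8", "35", "49", "56", "55", "54", "38", "40"}

    def parse_fix(raw):
        """Split a raw FIX message into a {tag: value} dict."""
        fields = (pair.split("=", 1) for pair in raw.strip(SOH).split(SOH))
        return {tag: value for tag, value in fields}

    def validate_new_order(raw):
        """Return a list of problems; empty means the message passed."""
        msg = parse_fix(raw)
        missing = REQUIRED_TAGS - msg.keys()
        problems = [f"missing tag {t}" for t in sorted(missing, key=int)]
        if msg.get("35") != "D":
            problems.append(f"unexpected MsgType 35={msg.get('35')}")
        return problems

    if __name__ == "__main__":
        sample = SOH.join([
            "8=FIX.4.2", "35=D", "49=BUYSIDE", "56=BROKER",
            "55=IBM", "54=1", "38=100", "40=2",
        ]) + SOH
        print(validate_new_order(sample) or "message OK")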
Environment: Windows 2000, XP, QTP 12.1, HPQC 12.1, Clear Quest, UNIX, Load Runner 12.0, J2EE, Oracle, IE4.X, C, C++, .NET, iOS
Confidential, NY
QA Lead/Automation Tester
Responsibilities:
- Developed manual and automation scripts for the entire system/regression-level testing using the HP and IBM tools. Worked on designing and developing an automation framework for the Agile development methodology.
- Set up all performance test environments and tools such as LoadRunner and Performance Center, and tested the entire LR suite on the ACCELA products for the DoB and DCA agencies. Developed automation scripts.
- Tested the XML-based protocol for exchanging information between systems and verified the contents of incoming and outgoing messages on the CMBS trading OMS.
- Contributed to configuring Continuous Integration and running the nightly build using Jenkins.
- Developed and executed manual test cases and managed them using Quality Center.
- Identified the scope for Automation and created scripts using QTP for functional automation.
- Used shell scripts to check/validate middle-tier services, browse the pi.log files, and verify test attributes (a log-scan sketch, shown in Python for consistency, follows this list).
- Tracking and Logging Defects in Rally and QC.
- Maintaining automation scripts for any future needs
- Involved in all aspects of the STLC, from creating the test plan and test strategy to execution, coordination, and reporting.
- Involved in defining the overall Performance Test Strategy and scope for performance testing for the FFE project and defining the baseline with all stakeholders.
- Design, develop and support frameworks for our test infrastructure and provide automation expertise to our development teams with a strong emphasis on using code to solve technical challenges and shorten the test cycles through automation using UFT.
- Involved in developing a customized script framework and complex LoadRunner scripts in C, automating all the use cases in the product.
- Creating and executing performance scenarios for benchmarking to optimize JVM and application configuration and tune performance.
- Used JIRA for bug tracking, reporting and requirement traceability and coverage.
- Identified key performance bottlenecks in the application and performed end-to-end root cause analysis on these bottlenecks to pinpoint the reasons, presenting the analysis in reports used to improve performance.
- Creating and executing LR scenarios for Performance Verification Testing (PVT), which helps compare performance across different versions and across different dimensions of the data model within the same version.
- Creating and executing scenarios for Focus tests to target key business use cases.
- Involved in analyzing various graphs for the client side and server-side metrics like Transaction Response Time, Hits per second graph, Pages download per second, Throughput, Memory & CPU utilization, trace logs.
- Follow Scrum development methodology for development, system and integration testing of application components.
- Develop and deliver appropriate QA work products defined for QA Planning, Preparation and Execution phases. These include Load test estimates, Performance test plans, project status and defect management, and test metrics reports, and other reports as may be requested by the project team.
- Ensuring that the performance test schedule and milestones are strictly adhered to and that performance test deliverables are produced on time.
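The log-review step above is sketched here in Python rather than shell, for consistency with the other examples; the log path, error patterns, and tail size are assumptions, not the actual middle-tier setup.

    import re
    from pathlib import Path

    # Hypothetical error patterns; the real pi.log lived on the middle-tier UNIX hosts.
    ERROR_RE = re.compile(r"\b(ERROR|FATAL|Exception)\b")

    def scan_log(path="pi.log", last_n=2000):
        """Print error-level lines from the tail of the log."""
        lines = Path(path).read_text(errors="replace").splitlines()[-last_n:]
        hits = [line for line in lines if ERROR_RE.search(line)]
        for line in hits:
            print(line)
        print(f"{len(hits)} error lines in last {len(lines)} lines of {path}")

    if __name__ == "__main__":
        scan_log()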
Confidential, Cranford, NJ
QA Analyst
Responsibilities:
- Design and creation of appropriate Test Plan, Test Scenarios and Test Cases.
- Set up QC, created test scripts, ran and analyzed them for functional and technical issues, and mitigated bottlenecks. Performed data-driven testing and data validation using QC.
- Installed, customized, and administered Mercury Interactive's Quality Center. Troubleshot any issues encountered and evaluated and performed upgrades on the tools in the Mercury test suite.
- Extensively created scripts in Mercury Quality Center for SAP modules like MM, SD, and FICO.
- Coordinate the test execution and documentation of test results. Ensure that testing is being done accurately, as planned within the allotted timelines.
- Coordinating across the development teams and ensuring systematic Unit Testing and End to End testing
- Assessed and analyzed current issue/defect tracking tools and user needs, identified gaps, and developed plans to implement and standardize an issue/defect tracking and reporting tool across the organization through Mercury Quality Center.
- Managed development, architecture, and implementation of overall workflow for test plans/test lab
- Developed test plans and prioritize testing efforts across the QA group to meet product and project schedules.
- Manage QA environments to ensure correctness and completeness of coverage.
Environment: SAP MM, SD, FICO, Windows 2000, XP, WinRunner 7.0, HP Quality Center, MS Office Suite, EDI, Netscape 4.X, IE4.X, Oracle.
Confidential
Sr. QA / Automation Engineer
Responsibilities:
- Setting up QA process, manual suite and automation framework in the newly structured ‘Client Services’ group. Generated and implemented templates for Test Plan, Test Cases, Test Scripts, Business Analysis, Test Defect Log, Test Case Checklist etc.
- Set up QTP and LoadRunner. Created load scripts, ran and analyzed them for performance-related issues and potential bottlenecks. Performed data-driven testing and data validation using WinRunner and QTP.
- Installed, customized, and administered Mercury Interactive's Quality Center and Performance Center. Troubleshot any issues encountered and evaluated and performed upgrades on the tools in the Mercury test suite.
- Extensively created scripts using LoadRunner v9.0.
- Executing Load Test, Stress Test and Endurance Test in Load Runner. Designed and tested peak load scenarios with various clients that gave specific SLAs for different LR scenarios.
- Customized parameterization in the DATA file via LoadRunner to test the application with different sets of data. Inserted rendezvous points to create intense load on the server and thereby measure server performance.
- Responsible for Test Management, Performance and Functional test execution, Defect Tracking and Reporting. Prepared Test cases and Executed Test cases.
- Designed and executed test cases and generated test scripts and test scenarios using the VuGen component in LoadRunner, with various scalability combinations of ramp-ups and ramp-downs.
- Involved in analyzing various graphs for the client side and server-side metrics like Transaction Response Time, Hits per second graph, Pages download per second, Throughput, Memory & CPU utilization, trace logs.
- Performed performance engineering with different stakeholders, resolving bottlenecks using application server, web server, database server, and network monitors.
- Analyzed system requirements and developed detailed manual and automation test plans. Reviewed existing test cases and created new test scripts as per the BRD.
- Work with Project and Development managers to define QA activities for project plans along with ensuring delivery schedules for testing activities
- Testing and reviewing error log files in UNIX and identifying zombie processes.
- Acted as release manager, where required to ensure product components are released for production deployment as tested by Software QA Group
- Led a team of 5 testers and worked in the onsite-offshore model to ensure product delivery on time.
- Conducted user training sessions, one-on-one as well as group lectures, to train end users on system integrity.
- Ensured that all procedures for reconciling equity trade breaks were tested in a timely manner.
- Assessed and analyzed current issue/defect tracking tools and user needs, identified gaps, and developed plans to implement and standardize an issue/defect tracking and reporting tool across the organization. Managed development, architecture, and implementation of the overall workflow for test plans/test lab.
- Performed API testing to make sure that the basic units of the software application function as desired, from the initial stages of the product cycle through the final phase, ensuring that the product released to market is error-free. API testing involved exercising the methods of .NET, Java, and J2EE APIs with valid and invalid input, plus testing the APIs on application servers (a sketch of this positive/negative input pattern follows this list).
- Acted as liaison between the company and vendor for Test Management Tools and services to support testing.
- Provided support to groups of tool users regarding global setup and maintenance activities. Worked with primary and secondary users to define standards, and methods to achieve best results in tool utilization and standardization.
- Evaluated and integrated Test Management Tool solutions. Provided analysis, configuration, and standards monitoring across the testing organization. Consulted on training needs and advised various test groups regarding specific needs.
- Developed and maintained strong relationships with the other groups within the project to ensure sharing of QA knowledge, testing consistency, and quality of testing throughout the department, including designing the SOA value proposition as well as SOA integration strategies.
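A compact sketch of the valid/invalid-input pattern described above, shown against a hypothetical HTTP endpoint with the requests library rather than the actual .NET/Java method-level calls; the URL, payloads, and expected status codes are assumptions.

    import requests

    # Hypothetical endpoint under test.
    API_URL = "https://api.example.org/v1/orders"

    # (payload, expected HTTP status) pairs: one valid case, two invalid ones.
    CASES = [
        ({"symbol": "IBM", "qty": 100}, 201),   # valid order
        ({"symbol": "IBM", "qty": -5}, 400),    # invalid: negative quantity
        ({"qty": 100}, 400),                    # invalid: missing symbol
    ]

    def run_cases():
        failures = 0
        for payload, expected in CASES:
            resp = requests.post(API_URL, json=payload, timeout=10)
            if resp.status_code != expected:
                failures += 1
                print(f"FAIL {payload}: got {resp.status_code}, expected {expected}")
        print(f"{len(CASES) - failures}/{len(CASES)} cases passed")

    if __name__ == "__main__":
        run_cases()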
Environment: Windows 2000, XP, QTP, Load Runner 8.0, OSTA, Clear Quest, UNIX, J2EE, MS Office Suite, Oracle, IE4.X, .NET