
Team Lead Resume


Topeka, KS

SUMMARY:

  • 11 years of diversified experience as a Software Test Engineer across domains including Banking and Finance, Healthcare, Insurance, Communications, and Railroad. Proficient in mobile, manual, web, and database testing. Expertise in analyzing business requirements to develop and document test plans and strategies, use cases, test scenarios, test cases, test metrics, and defect reports.
  • Strong knowledge of the Software Testing Life Cycle (STLC) and QA methodologies (Waterfall, Spiral, V-Model, Agile, Iterative, Incremental).
  • Proficient in formulating test procedures, test plans, test scripts, and test result validation derived from analyzing Business Requirements and Software Requirement Specifications.
  • Very strong experience in mobile and manual testing. Expertise with tools such as HP Application Lifecycle Management (ALM), HP Quality Center, and Device Anywhere.
  • Experienced in implementing Agile Product Lifecycle Management (PLM) and Waterfall software development methodologies, including Scrum and TDD (Test-Driven Development).
  • Expertise in mobile application testing for apps on BlackBerry, iOS, iPad, and Android devices.
  • Application validation using simulators and emulators as well as real devices such as iPad, iPhone, and BlackBerry.
  • Write detailed functional test plans to ensure system changes work properly, existing processes remain unaffected, and the needs of the users are met.
  • Very strong experience in implementation and maintenance of Quality Engineering Processes.
  • Well experienced in validating data integrity for web services and REST applications using SOAP UI and the Postman client.
  • Expertise in User acceptance testing and regression testing in Client-Server and Web-based applications.
  • Implemented test case execution using Java and JUnit.
  • Strong skills in integration, functional, regression, performance, stress, web, GUI, and database testing.
  • Good experience in accessibility/role-based testing, verifying the access different kinds of users have to different modules and links in the application.
  • Experience with manual and automated testing of client/server, web-based, mobile (native apps and mobile web), and cloud-based applications in different environments and platforms. Also involved in Omniture testing along with general functional testing such as smoke and regression testing.
  • Proficient in Test Automation using UFT (Formerly QTP) and Selenium Tools.
  • Worked on security testing in multiple banking projects, including testing the login page user experience and gatekeeper checks against security and compliance standards.
  • Proficient in analyzing business and technical requirements to design test strategies and test plans, and in executing test scenarios, test cases, and test scripts for service-level, functional, SIT, regression, automation, performance, end-to-end, and UAT testing of complex applications in large projects and programs.
  • Verification of data using SQL queries by performing back-end testing.
  • Taking lead in setting up the test environment and creating test data as per the requirement.
  • Cross-project collaboration experience with developers and testers; as an experienced lead tester, coordinated and maintained collaboration with team members to enhance software quality.
  • Experience in the Railroad domain, including Positive Train Control (PTC) test strategy design and development.
  • Excellent leadership skills in managing QA teams, with strong diagnostic skills and attention to detail, ensuring efficient test coverage.
  • A dedicated team leader, team player, and goal-oriented professional with excellent verbal and written communication, organizational, and interpersonal skills.

TECHNICAL SKILLS:

Web Technologies: HTML, VB Scripting, Java, XML, JavaScript

Database Systems: SQL server 2005, Oracle

Tools & Utilities: Device Anywhere Tool, IDX, HP ALM, HP Quality Center 9.0/9.2/10.0, HP QTP 9.2, Python, JavaScript, Test Director 8.0, Citrix, Toad, Bugzilla, DOORS, SOAP UI, Postman

Operating Systems: Windows XP, UNIX, Android, iOS

PROFESSIONAL EXPERIENCE:

Confidential, Topeka, KS

Team Lead

Environment:

Tools: VersionOne, HP ALM, SOAP UI, Postman, Eclipse

O/S: Windows XP, Unix

Responsibilities:

  • Designed test strategies and test plans for enterprise systems and identified performance testing needs.
  • Handled test planning, documentation, coordination, and execution, and worked with other QA engineers to coordinate testing.
  • Followed Agile methodology for the project and attended daily Scrum meetings to discuss day-to-day activities and issues.
  • Executed tests using Postman and SOAP UI for REST and SOAP services to validate XML and JSON payloads for data integrity, and updated status in the VersionOne tool (see the sketch after this list).
  • Performed wireframe validations for the respective web pages to confirm the structure of the functionality and verify the internal composition.
  • Performed several types of testing like smoke, functional, system integration, black box, positive, negative and regression testing.
  • Validation included creating test strategies and test plans, functional testing of site page behavior against initial wire frames and designs.
  • Collaborate with System Architect, Software Engineers and Product Management to identify and prioritize areas requiring testing as well as areas requiring design review.
  • Perform initial sanity and smoke testing on the application build to check the readiness of the application for further system integration testing. This also includes security testing to validate the data confidentiality.
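A minimal, hypothetical sketch of the kind of REST validation described above, written as a Java JUnit test rather than in Postman (the endpoint URL, account id, and JSON field are illustrative assumptions; Java 11+ and JUnit 4 are assumed on the classpath):

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    import org.junit.Test;

    public class AccountServiceRestTest {

        // Hypothetical REST endpoint in the test environment.
        private static final String ACCOUNTS_URL = "https://test-env.example.com/api/accounts/12345";

        @Test
        public void accountLookupReturnsExpectedJson() throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(ACCOUNTS_URL))
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            // The same checks a Postman test would assert: status code and key response fields.
            assertEquals(200, response.statusCode());
            assertTrue("Response should echo the requested account id",
                    response.body().contains("\"accountId\":\"12345\""));
        }
    }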

Confidential, Phoenix, AZ

Team Lead

Environment:

Tools: Device Anywhere, Firefox user agents, iOS Simulators, Android Emulators, HP ALM

O/S: Windows XP, Unix

Responsibilities:

  • Designed test strategies and test plans for enterprise systems and identified performance testing needs.
  • Handled test planning, documentation, coordination, and execution, and worked with other QA engineers to coordinate testing.
  • Followed Agile PLM methodology for the project and attended daily Scrum meetings to discuss day-to-day activities and issues.
  • Executed tests using simulators, emulators, and the Device Anywhere tool for mobile app testing on Android and iOS platforms.
  • Performed wireframe validations for the respective web pages to confirm the structure of the functionality and verify the internal composition.
  • Performed several types of testing like smoke, functional, system integration, black box, positive, negative and regression testing.
  • Validation included creating test strategies and test plans, functional testing of site page behavior against initial wire frames and designs.
  • Performed web services testing using the SOAP UI tool, adding assertions to validate XML payloads for data integrity in SOAP services (see the sketch after this list).
  • Collaborate with System Architect, Software Engineers and Product Management to identify and prioritize areas requiring testing as well as areas requiring design review.
  • Performed testing on different combinations of operating systems and browsers on both mobile and web devices, including link validation and page rendering checks.
  • Performed User Acceptance Testing (UAT) manually and developed Test Matrix to give a better view of testing effort.
  • Perform initial sanity and smoke testing on the application build to check the readiness of the application for further system integration testing. This also includes security testing to validate the data confidentiality.
  • Cross-Browser Testing was performed on different versions of IE and other Browsers.
  • Executed baseline, performance, and load tests with LoadRunner/Performance Center.
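An illustrative JUnit sketch of the XML data-integrity checks that the SOAP UI assertions covered (the response payload, element names, and expected values are hypothetical; JUnit 4 is assumed):

    import static org.junit.Assert.assertEquals;

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPathFactory;

    import org.junit.Test;
    import org.w3c.dom.Document;

    public class CardBalanceSoapResponseTest {

        // Hypothetical SOAP response captured from the test environment.
        private static final String RESPONSE_XML =
                "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
              + "<soap:Body><BalanceResponse><currency>USD</currency>"
              + "<available>250.00</available></BalanceResponse></soap:Body>"
              + "</soap:Envelope>";

        @Test
        public void balanceResponseCarriesExpectedFields() throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(RESPONSE_XML.getBytes(StandardCharsets.UTF_8)));

            // Equivalent to SOAP UI XPath assertions on the response body.
            String currency = XPathFactory.newInstance().newXPath()
                    .evaluate("//BalanceResponse/currency", doc);
            String available = XPathFactory.newInstance().newXPath()
                    .evaluate("//BalanceResponse/available", doc);

            assertEquals("USD", currency);
            assertEquals("250.00", available);
        }
    }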

Confidential, Phoenix, AZ

Test Lead

Environment:

Tools: Device Anywhere, Firefox user agents, iOS Simulators, Android Emulators, HP ALM, Quality Center

O/S: Windows XP, Unix

Responsibilities:

  • Developed and maintained innovative, repeatable QA test plans based on functional requirements, use cases, user interface designs, system design documents and domain knowledge.
  • Validations include testing of site page behavior against preliminary wire frames, conformity to structural standards and verification of middleware functionality.
  • Executed tests using Firefox user agents, iOS simulators, Android emulators, some handheld devices, and the Device Anywhere tool.
  • Analyzed and translated requirements and business design into Test Cases.
  • Testing includes link validation, proper page rendering and security on mobile and web devices using multiple OS/browser combinations.
  • As part of a team, involved in project planning, requirements gathering, and modifying and executing GUI tests, and participated in weekly project status meetings and reviews.
  • Performed execution of test cases manually to verify expected results and performed positive and negative testing.
  • Wrote Java JUnit code to implement assertions for the code base later used for the mobile app.
  • Reproduced reported defects and analyzed root causes to assist developers with focused defect correction. Used Quality Center to create and track defects for failures until they were resolved, and generated defect summary reports.
  • Created Test requirements using Quality Center to provide full test coverage.
  • Tested the performance of the application under load using LoadRunner.
  • Prepared and reported Day-to-Day Reports and Testing Documentation.
  • Followed Agile PLM methodology for the project and attended daily Scrum meetings to discuss day-to-day activities and issues.
  • Identified User and System requirements for the application and established links for better trace of requirements using Traceability Matrix.
  • Coordinated with the offshore testing team, helped resolve issues, and ensured smooth execution of offshore testing activities.

Confidential, Phoenix, AZ

Senior QA

Environment:

Tools: Device Anywhere, Firefox user agents, iOS Simulators, Android Emulators, HP ALM, Quality Center

O/S: Windows XP, Unix

Responsibilities:

  • Gathering business requirements through interviews and sessions with end users.
  • Responsible for writing Test Cases, Test Plans, Test scripts and other test documents based on business requirement.
  • Followed Agile methodology for the project and attended daily Scrum meetings to discuss day-to-day activities and issues.
  • Responsible for developing detailed SIT and UAT test plans, conditions, scenarios, cases and related data and participated in SIT and UAT planning and execution activities for business requirements.
  • Coordinated with the offshore testing team, helped resolve issues, and ensured smooth execution of offshore testing activities.
  • Executed tests using Firefox user agents, iOS simulators, Android emulators, some handheld devices, and the Device Anywhere tool.
  • Responsible for gathering business requirements to create detailed test cases and uploading test cases into Quality Center. Able to map test cases to business requirements and perform high-end regression testing.
  • Participated in weekly test planning, project status meetings and reviews and reported Day-to-Day Reports and Testing Documentation in test environments.
  • Participated in walkthroughs of requirements, specifications, database designs and test strategies.
  • Mentoring a team of testers on several projects and working closely with the end users during UAT Testing.
  • Recognize scope limitations and raise potential scope issues while designing the best system approach.
  • Tested the user interface and constantly redesigned the system to increase its user friendliness for the clients.

Confidential, Bethesda, MD

Senior QA

Environment:

Tools: IDX (Interpretive Data System), HP ALM (Application Lifecycle Management), Citrix

O/S: Windows XP

Responsibilities:

  • Worked closely with Business Analysts and Developers to gather Application Requirements and Business processes to formulate the test plan.
  • Develop test strategies, plans, schedules and cases utilizing product knowledge and coordination with technical analysts.
  • Involved in creating claims in the IDX application and testing edit functionalities in Claims, including creating test data by extracting reports from IDX.
  • Developed test plans, test strategies, test cases and test process, test case suites based on project documentation, communication with clients and feedback from test groups.
  • Involved in all phases of testing (Requirements gathering, Test Case Design, Test data Creation, Test Execution, Results Reporting and Status Reporting) using industry standard reporting techniques.
  • Execution of System Integration Testing, Functional Testing, Regression Testing, Performance and Load Testing and result analysis.
  • Responsible for designing and developing integration test plans and test documentation.
  • Responsible for release testing, the last phase of testing before pushing changes into production.
  • Involved in Java coding of JUnit test cases covering various functionalities in the project.
  • Reported bugs and coordinated with teams for testing in multiple test environments, including system testing and production environments, using Application Lifecycle Management.
  • Worked with the TAE (test engineer environment) in setting up the test environment.
  • Performed execution of test cases manually to verify expected results.
  • Carried out boundary value analysis using extremes of the input domain such as maximum, minimum, just inside/outside boundaries, typical values, and error values (see the sketch after this list).
  • Participated in daily status calls with the client to coordinate on status and work.
  • Reviewed test cases of other team members and provided mentoring and feedback.
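A hypothetical example of the boundary value checks described above, expressed as a JUnit test (the claim-units rule and the 1-999 range are placeholders chosen only for illustration):

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class ClaimUnitsBoundaryTest {

        // Hypothetical business rule: claim line units must be between 1 and 999 inclusive.
        private boolean isValidUnits(int units) {
            return units >= 1 && units <= 999;
        }

        @Test
        public void unitsAtAndAroundTheBoundaries() {
            assertFalse("Just below the minimum", isValidUnits(0));
            assertTrue("Minimum boundary", isValidUnits(1));
            assertTrue("Typical value", isValidUnits(50));
            assertTrue("Maximum boundary", isValidUnits(999));
            assertFalse("Just above the maximum", isValidUnits(1000));
        }
    }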

Confidential, Bethesda, MD

Senior QA

Environment:

Tools: IDX (Interpretive Data System), HP ALM (Application Lifecycle Management), Citrix

O/S: Windows XP

Responsibilities:

  • Responsible for end-to-end testing activities to ensure quality, defect-free products delivered within budget and schedule without impacting customer-facing applications; coordinated extensively to ensure adequate test coverage throughout the project/program lifecycle and that each phase of project execution met QA methodology requirements.
  • Coordinate project testing, often including testing between multiple test teams
  • Develop test plans, test strategies, test cases utilizing product knowledge and coordination with technical analysts.
  • Involved in creating claims in the IDX application and testing edit functionalities in Claims.
  • Worked on creation of test data on Claims by extracting reports from IDX.
  • Execute test cases and create defect reports using industry standard reporting techniques
  • Write and review Test plans and Test Strategies with participating teams in formal procedure.
  • Involved in all phases of testing (Requirements gathering, Test Case Design, Test data Creation, Test Execution, Results Reporting and Status Reporting).
  • Execution of System Integration Testing, Functional Testing, Regression Testing, Performance and Load Testing and result analysis
  • Responsible for release testing, the last phase of testing before pushing changes into production.
  • Responsible for providing test summary reports for all system testing scenarios for all projects involved.
  • Worked with the TAE (test engineer environment) in setting up the test environment.
  • Created high level strategy documentation and detailed test documents.
  • Carried out boundary value analysis using extremes of the input domain such as maximum, minimum, just inside/outside boundaries, typical values, and error values.
  • Participated in daily status calls with the client to coordinate on status and work.
  • Reviewed test cases of other team members and provided mentoring and feedback.

Confidential, Minnetonka, MN

Senior QA

Environment:

Tools: Bug Tracking Tool, Mercury Quality Center

Database: Oracle

O/S: Windows, UNIX

Responsibilities:

  • Coordinate project testing, often including testing between multiple test teams.
  • Develop test strategies, plans, schedules and cases utilizing product knowledge and coordination with technical analysts.
  • Execute test cases and create defect reports using industry standard reporting techniques.
  • Developed test case suites based on project documentation, communication with clients, and feedback from test groups.
  • Write and review Test plans and Test Strategies with participating teams in formal procedure.
  • Involved in all phases of testing (Requirements gathering, Test Case Design, Test data Creation, Test Execution, Results Reporting and Status Reporting).
  • Developed and implemented test plans, test strategies, test cases, and test processes.
  • Execution of System Integration Testing, Functional Testing, Regression Testing, Performance and Load Testing and result analysis
  • Responsible for designing and developing integration test plans and test documentation
  • Created high level strategy documentation and detailed test documents.
  • Performed execution of test cases manually to verify expected results.
  • Bug reporting using Application Lifecycle Management.
  • Documented risks, impacts, replication procedures, availability, enhancements and stability concerns from root cause analysis; provided detailed solutions to resolve these issues.
  • Participated in daily status calls with the client to coordinate on status and work.

Confidential, Milpitas, CA

Senior Test engineer

Environment:

Tools: Mercury Quality Center

Database: Oracle

O/S: Windows

Responsibilities:

  • Responsible for developing detailed UAT test plans, conditions, scenarios, cases and related data and participate in UAT planning and execution activities for business and operations participants.
  • Coordinated with the offshore testing team, helped resolve issues, and ensured smooth execution of offshore testing activities.
  • Participated in walkthroughs of requirements, specifications, database designs, ETL code, and test strategies.
  • Participate in the early phases of project to ensure the inputs into testing process and translate into Test plan and cases.
  • Tested back-out scripts for various CRs (change requests) throughout the application.
  • Responsible for the creation and execution of test plans and scripts to verify software functionality is working according to business requirements.
  • Worked with software testing methodologies across all desired phases and stages of testing (Functional, System, Integration, Regression, Data Validation, User Acceptance (UAT)).
  • Responsible for gathering business requirements to create detailed test cases. Ability to map test cases to business requirements and perform high end regression testing.
  • Reporting bugs for analysis. Documenting the actual results and metrics.
  • Creating the test data before executing the tests.
  • Responsible for designing, developing and the execution of reusable performance scripts.
  • Coordinating with the Business Analysts to understand the business Requirements.

Confidential

Senior Test engineer

Environment:

Tools: Bug Tracking Tool, Mercury Quality Center

Database: Oracle

O/S: Windows, UNIX

Responsibilities:

  • Analyzed Business Requirements, Technical and Functional Documents and prepared test cases. Importing and maintaining test procedures into DOORS.
  • Develop test strategies, plans, schedules and cases utilizing product knowledge and coordination with technical analysts.
  • Execute test cases and create defect reports using industry standard reporting techniques
  • Developed test case suites based on project documentation, communication with clients and feedback from test groups
  • Responsible for complete testing activities, test plan, test cases, test scripts, test reports, defect management.
  • Coordinate project testing, often including testing between multiple test teams
  • Involved in all phases of testing (Requirements gathering, Test Case Design, Test data Creation, Test Execution, Results Reporting and Status Reporting).
  • Execution of System Integration Testing, Functional Testing, Regression Testing, Performance and Load Testing and result analysis
  • Responsible for release testing, the last phase of testing before pushing changes into production.
  • Created high level strategy documentation and detailed test documents.
  • Performed execution of test cases manually to verify expected results.
  • Responsible for testing, implementing, documenting, and supporting rail and train control systems, including transportation systems that allow remote control of trackside power switches and signals to manage train movement.
  • Performed Positive Train Control (PTC) integration testing of the subsystems to verify compliance with system functionality and design requirements.
  • Tested, implemented, documented, and provided ongoing support for the Train Management Dispatch System (TMDS) used by the railway in rail traffic control centers.
  • Documented risks, impacts, replication procedures, availability, enhancements and stability concerns from root cause analysis; provided detailed solutions to resolve these issues.
  • Participated in daily status calls with the client to coordinate on status and work.
  • Responsible for overseeing the Quality procedures related to the project and preparation of MPTT (Metrics Planning and Tracking Tool).

Confidential

Test Engineer

Environment:

Tools: Mercury Quality Center, Citrix, PLM

Database: Oracle

O/S: Windows

Responsibilities:

  • Gathering business requirements through interviews and sessions with end users.
  • Responsible for data validation testing using SQL (see the sketch after this list).
  • Responsible for developing detailed SIT and UAT test plans, conditions, scenarios, cases and related data and participated in SIT and UAT planning and execution activities for business and operations participants.
  • Prepared and reported Day-to-Day Reports and Testing Documentation in test environments.
  • Participated in walkthroughs of requirements, specifications, database designs and test strategies.
  • Involved in testing various modules, including Application & Data, Roles & Privileges, and Schedule Management, throughout the application.
  • Responsible for writing Test Cases, Test Plans, Test scripts and other test documents based on business requirement.
  • Analyzed and translated requirements and business design into Test Cases.
  • Recognize scope limitations and raise potential scope issues while designing the best system approach.
  • Tested the user interface and constantly redesigned the system to increase its user friendliness for the clients.
  • Responsible for overseeing the Quality procedures related to the project.
  • Preparation of Traceability Matrix both at Micro and Macro level.
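A minimal sketch of the SQL-based back-end validation mentioned above, written as a JUnit test over JDBC (the connection details, table names, and the staged-versus-target row-count check are hypothetical; an Oracle JDBC driver is assumed on the classpath):

    import static org.junit.Assert.assertEquals;

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.junit.Test;

    public class OrderLoadValidationTest {

        // Hypothetical test-environment connection details.
        private static final String URL = "jdbc:oracle:thin:@testdb.example.com:1521:TESTDB";
        private static final String USER = "qa_user";
        private static final String PASSWORD = "changeit";

        @Test
        public void stagedOrdersMatchTargetTable() throws Exception {
            try (Connection conn = DriverManager.getConnection(URL, USER, PASSWORD);
                 PreparedStatement source = conn.prepareStatement("SELECT COUNT(*) FROM STG_ORDERS");
                 PreparedStatement target = conn.prepareStatement("SELECT COUNT(*) FROM ORDERS")) {

                // Back-end check: every staged row should have been loaded into the target table.
                assertEquals(rowCount(source), rowCount(target));
            }
        }

        private long rowCount(PreparedStatement stmt) throws Exception {
            try (ResultSet rs = stmt.executeQuery()) {
                rs.next();
                return rs.getLong(1);
            }
        }
    }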

Confidential

Test Engineer

Environment:

Tools: Test Director

Language: VBScript

Database: Oracle

O/S: Windows

Server: BEA WebLogic 8.1

Responsibilities:

  • Involved in documenting Test Plan and Created Test Cases.
  • Performed major role in the execution of test cases.
  • Performed manual testing of the application and identified defects.
  • Responsible for Functional, Performance Testing, Back End testing and User acceptance testing.
  • Developed Test plan and modified the test plan when required in later stages of testing.
  • Prepared test cases for Navigational testing, Functionality testing under load and GUI testing using Test Director.
  • Parameterized the test scripts to generate and test different reasons for inquiry (see the sketch after this list).
  • Involved in database testing of the application to check for insert and update operations.
  • Handled unexpected errors using inbuilt exception handling.
  • Performed regression testing to check for defects in previously tested modules.
  • Correlated the unique data generated from the server such as Session ID.
  • Analyzed transaction graphs, web resource graphs and network graphs to pinpoint the bottleneck.
  • Analyzed Business Requirements, Technical and Functional Documents and prepared test cases.
  • Involved in Database Testing using SQL Queries.
  • Participated in meetings and walkthroughs.
  • Logged errors and tracked them using Test Director, and coordinated effort estimation during test planning.
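An illustrative JUnit parameterized test in the spirit of the parameterized inquiry scripts above (the inquiry reasons, expected codes, and the mapReasonToCode call under test are made up for illustration; JUnit 4 is assumed):

    import static org.junit.Assert.assertEquals;

    import java.util.Arrays;
    import java.util.Collection;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Parameterized;
    import org.junit.runners.Parameterized.Parameters;

    @RunWith(Parameterized.class)
    public class InquiryReasonTest {

        @Parameters(name = "{0}")
        public static Collection<Object[]> reasons() {
            // Hypothetical inquiry reasons and the codes the application should map them to.
            return Arrays.asList(new Object[][] {
                    { "BILLING", "INQ-01" },
                    { "SERVICE", "INQ-02" },
                    { "COMPLAINT", "INQ-03" },
            });
        }

        private final String reason;
        private final String expectedCode;

        public InquiryReasonTest(String reason, String expectedCode) {
            this.reason = reason;
            this.expectedCode = expectedCode;
        }

        // Stand-in for the application call under test.
        private String mapReasonToCode(String reason) {
            switch (reason) {
                case "BILLING": return "INQ-01";
                case "SERVICE": return "INQ-02";
                default:        return "INQ-03";
            }
        }

        @Test
        public void reasonIsMappedToExpectedCode() {
            assertEquals(expectedCode, mapReasonToCode(reason));
        }
    }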
