Lead Quality Assurance (QA) Analyst Resume
Washington, DC
SUMMARY
- 8+ years of extensive experience as a Quality Assurance (QA) Analyst across various domains
- Excellent expertise in the application development process employing different Software Development Life Cycle (SDLC) methodologies: Waterfall, Agile Scrum, Agile XP, RUP, and Iterative and Incremental
- Extensive experience following Test Life Cycle - Test Planning, Test Designing, Test Development, Test Execution, and Test Evaluation
- Excellent analytical skills for understanding the business requirements, business rules/processes, and detailed application design
- Highly skilled in software specification analysis and in writing master test plans documenting test objectives, test strategy, requirements traceability, and the test environment
- Strong experience in ETL testing as per the Data Mapping document - validating the data source, data target, and the respective business rules (a minimal sketch of such a reconciliation query follows this summary)
- Strong expertise in Functional testing, Integration testing, System testing, GUI testing, Performance testing, Load testing, Volume testing, Stress testing, Configuration testing, DB testing, Hardware testing, Smoke testing, Regression testing, and supporting User Acceptance testing (UAT)
- Solid experience in database testing, ensuring the database has been built as per the Entity Relationship model (ER Diagram) - testing entities, relationships, data integrity rules, database constraints, and triggers
- Proficient in working with different databases such as SQL Server, Oracle, and Sybase; wrote complex SQL queries using data retrieval tools such as TOAD, PL/SQL, Management Studio, SQL Analyzer, and SQL Plus to extract data, perform data validations, and investigate data issues
- Proven expertise in managing the Defect Tracking Lifecycle process: maintaining test logs, opening & tracking defects, assigning defects to developers, following up, and generating defect summary reports using Quality Center (Test Director), JIRA, and Bugzilla
- Well versed in generating Test Scripts and conducting automated testing via QTP, LoadRunner, and Selenium
- Hands-on experience assisting project managers in creating detailed project plans, managing resources, scheduling development and testing, and identifying & tracking project timelines
- Good understanding of different types of risks: Market Risk, Credit Risk, Operational Risk, Counter-party Risk, Interest Rate Risk, Inflation Risk, Systematic Risk, and Un-systematic Risk
- Expertise in organizing, conducting, and leading various JAD sessions with clients, status meetings with stakeholders & developers, and walkthrough sessions throughout the application development process
- Experienced in handling both on-shore and off-shore teams
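A minimal sketch of the source-to-target reconciliation query typically written for ETL testing against a data mapping document; the table and column names (src_orders, tgt_orders, order_amt) are hypothetical placeholders rather than names from any project below.

    -- Reconcile row counts and a key measure between the data source and
    -- the data target as specified in the data mapping document.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amt) AS total_amt
    FROM   src_orders
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amt)
    FROM   tgt_orders;
    -- Any difference between the two result rows points to a load or mapping defect.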
TECHNICAL SKILLS
OS: Windows 2000/2003/NT/XP, Unix
Languages: SQL, Visual Basic, JavaScript, Java, .NET, J2EE, C++, VBScript, HTML, T-SQL
RDBMS / Database Tools: SQL Server, Oracle, DB2, Sybase, MS Access, PL/SQL, TOAD, Management Studio, SQL Analyzer, SQL Plus
Testing Tools: QTP, WinRunner, LoadRunner, Selenium, Silk, DiffEngineX, GEM
Defect Tracking: Quality Center (Test Director), JIRA, Bugzilla, Rational Clear Quest
Modeling Tools: Erwin, MS Visio, Rational Rose
Other Tools: Rational RequisitePro, Rally, Autosys, Confluence, Wiki, Teradata, Business Objects, Crystal Reports, OBIEE, SSRS, Informatica, SSIS, MS Office, MS SharePoint
PROFESSIONAL EXPERIENCE
Confidential, Washington DC
Lead Quality Assurance (QA) Analyst
Responsibilities:
- Extensively followed Agile SCRUM methodology and implemented various QA methodologies, testing strategies, and test plans in all stages of SDLC
- Effectively worked in developing Test Plan, Test Cases, and Test Scripts for the AHDP application as per the user requirements, system specifications, and business rules for each release
- Worked with the TOAD tool for testing raw data from different Agency databases
- Extensively used database techniques for validating, transforming, and cleaning the data
- Worked on different testing methods such as Functional testing, Regression testing, Parallel testing, System testing, and Negative & Positive testing for maintaining data quality and data completeness
- Extensively used QC ALM for creating Test Cases and managing defect tracking lifecycle - logging test cases, capturing open defects, and generating testing reports & graphs
- Wrote SQL queries in TOAD to access data from the database tables and cross-check the data from different Agencies (see the sketch after this list)
- Collaborated with the development team to conduct ETL batch processes and create Stored Procedures to transform the data
- Maintained High Level Summary document (Traceability Matrix for Data) to trace the test cases and to ensure the user requirements have been successfully developed, tested, and implemented effectively
- Prepared Weekly Status Reports covering Test Execution Coverage, Test Execution Summary, Defect Trend, and Issue Log status
- Assisted the project manager with timelines, resource allocation, and deliverables
- Participated in Issue Log weekly status meetings, Report status meetings and Project status meetings to discuss issues and workarounds
- Communicated with developers through all phases of testing to prioritize bug resolutions
- Generated daily progress reports for the project team and conducted formal bug review meetings
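As referenced above, a hedged example of the kind of cross-check query run in TOAD or SQL Server Management Studio to compare an agency feed against the consolidated AHDP data; agency_a_extract, ahdp_records, and their columns are illustrative names only. Written with EXCEPT (T-SQL); the Oracle equivalent uses MINUS.

    -- Rows present in one agency's feed but missing or different in the
    -- consolidated AHDP table (column list taken from the mapping document).
    SELECT record_id, record_dt, amount, status_cd
    FROM   agency_a_extract
    EXCEPT
    SELECT record_id, record_dt, amount, status_cd
    FROM   ahdp_records;
    -- A non-empty result means the data did not reconcile; each mismatch is
    -- investigated and, if confirmed, logged as a defect in QC ALM.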
Environment: QC ALM, TOAD, SQL Server Management Studio, Microsoft Outlook 2007, MS Visio, MS Office, Windows
Confidential, Chicago, IL
Senior Quality Assurance (QA) Analyst
Responsibilities:
- Extensively followed Agile SCRUM methodology and implemented various QA methodologies, testing strategies, and test plans in all stages of SDLC
- Effectively worked in developing Test Plan, Test Cases, and Test Scripts for the GOLD application as per the user requirements, system specifications, and business rules for each release
- Extensively worked with GEM tool for testing Raw extracts from YOLUS and GOLD databases
- Conducted Data Centric Testing, Functional testing, Regression Testing, Parallel Testing, System testing and Negative & Positive testing
- Worked with the DiffEngineX tool for generating reports from multiple components in GOLD for the downstream RISK team
- Extensively used JIRA for managing defect tracking lifecycle - logging test cases, capturing open defects, and generating testing reports & graphs
- Wrote SQL queries in TOAD to access data from the database tables and cross-check the data in extracts
- Wrote SQL queries in SQL Server Management Studio and SQuirreL SQL Client for validating the code and cross-checking the data in raw files (see the sketch after this list)
- Worked with Yolus Application Manager for monitoring heterogeneous database servers
- Collaborated with the development team to conduct ETL batch processes and create Stored Procedures to transform the data
- Extensively participated in User Acceptance testing (UAT) for GOLD-APAC Region project by closely working with end users / development team
- Maintained GEM High Level Summary document (Traceability Matrix for extracts) to trace the test cases and to ensure the user requirements have been successfully developed, tested, and implemented effectively
- Prepared Weekly Status Reports covering Test Execution Coverage, Test Execution Summary, Defect Trend, and Issue Log status
- Assisted the project manager with timelines, resource allocation, and deliverables
- Responsible for Issue log and Acceptable Difference log Reports and GEM detail and High Level Summary reports
- Developed defect tracking and analysis procedures using JIRA
- Participated in Issue Log weekly status meetings, Report status meetings and Project status meetings to discuss issues and workarounds
- Communicated with developers through all phases of testing to prioritize bug resolutions
- Generated daily progress reports for the project team and conducted formal bug review meetings
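A minimal sketch of the data-quality checks run against a raw extract loaded to staging before it is compared with GEM results; stg_gold_extract and its columns are hypothetical names, not the actual GOLD or YOLUS schema.

    -- Duplicate business keys in the staged extract
    SELECT trade_id, COUNT(*) AS dup_cnt
    FROM   stg_gold_extract
    GROUP  BY trade_id
    HAVING COUNT(*) > 1;

    -- Mandatory fields left NULL by the extract
    SELECT COUNT(*) AS null_key_cnt
    FROM   stg_gold_extract
    WHERE  trade_id IS NULL OR trade_dt IS NULL;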
Environment: JIRA, SQuirreL SQL Client, TOAD, Yolus Application Manager, GEM Tool, SQL Server Management Studio, DiffEngineX, Lotus Notes 8.5, MS Visio, MS Office, Windows
Confidential, WA
Sr. Quality Assurance Analyst
Responsibilities:
- Effectively implemented different QA methodologies/policies, testing strategies, and test plans in all stages of SDLC - following Agile SCRUM methodology
- Designed and developed Test Plan, Test Cases, and Test Scripts for the application as per the user requirements, system specifications, and business rules for each release
- Identified & designed complex positive and negative test scenarios by analyzing different UML diagrams such as Use Cases, Activity Diagrams, Swim Lane Diagrams, and Data Flow Diagrams
- Conducted Functional testing, Integration testing, System testing, GUI testing, Database Integrity testing, Performance testing, Negative & Positive testing, Smoke testing, and Regression testing
- Worked with the GUI team to understand user interface guidelines and conducted GUI testing on various screens - Trade Blotter, New Order, Order History, and Trade Execution
- Conducted end-to-end ETL load process testing validating data source, data target, and business rules as per the data mapping document
- Participated in Validation and Data driven testing with different sets of data
- Performed database validation to ensure database triggers, dependencies, constraints, relationships, and data objects are in accordance with the ER data model (see the sketch after this list)
- Extensively participated in User Acceptance testing (UAT) closely working with end users / developers
- Extensively used Quality Center as a repository for all testing deliverables and to manage defect tracking lifecycle - logging test cases, capturing open defects, and generating testing reports & graphs
- Maintained Requirement Traceability Matrix to trace the test cases to business requirements and to ensure the user requirements have been successfully developed, tested, and implemented effectively
- Assisted the project manager with creating the project plan, timelines, resource allocation, and deliverables
- Responsible for Testing Reports and Auditing Reports
- Wrote SQL queries to access the data from the database tables and crosscheck the results
- Assisted in writing Stored Procedures, Triggers, Tables, Views, and SQL Joins and other statements for various applications
- Collaborated with the development team to create Stored Procedures to transform the data and worked extensively in T-SQL for the various transformations needed while loading the data
- Validated T-SQL stored procedures to generate DML scripts that modified database objects dynamically based on user inputs
- Developed defect tracking and analysis procedures using Quality Center
- Participated in strategy meetings, build release meetings, and status meetings to discuss issues and workarounds
- Worked with the OBIEE Reporting System for reporting to downstream systems
- Communicated with developers through all phases of testing to prioritize bug resolutions
- Generated daily progress reports for the project team and conducted formal bug review meetings
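A sketch of the referential-integrity checks implied by the ER data model, written in T-SQL for the SQL Server environment above; dbo.OrderHeader and dbo.TradeExecution are illustrative table names, not the application's actual schema.

    -- Every execution row must reference an existing order (FK rule from the ER model)
    SELECT e.ExecutionId, e.OrderId
    FROM   dbo.TradeExecution AS e
    LEFT   JOIN dbo.OrderHeader AS o ON o.OrderId = e.OrderId
    WHERE  o.OrderId IS NULL;                 -- orphaned rows violate the relationship

    -- Confirm the foreign-key constraints on the child table are enabled
    SELECT name, is_disabled
    FROM   sys.foreign_keys
    WHERE  parent_object_id = OBJECT_ID('dbo.TradeExecution');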
Environment: Quality Center, SQL Server, Management Studio, SSIS, SSRS, MS Outlook, T-SQL, QTP, MS Visio, MS Office, Windows, Unix, Java, C++, J2EE, HTML
Confidential, Richmond, VA
Senior QA Tester
Responsibilities:
- Extensively worked with end users, systems analysts, designers, and programmers to analyze and create detailed test plans, test scenarios, test cases, and test scripts conforming to quality assurance processes
- Followed complete testing lifecycle - Test Planning, Test Designing, Test Development, Test Execution, and Test Evaluation for quality assurance process
- Implemented the platform following the principles of Waterfall methodology
- Analyzed the BRD and FSD to develop a master test plan documenting testing scope, testing approach, testing resources, and testing entry/exit criteria
- Reviewed and analyzed UML models such as Use Cases, Written Use Cases, Activity Diagrams, and Business Process Flows to understand the application functionality and document test description, test scope, test objectives, and test environment
- Developed test cases by analyzing functional requirements and executed test cases to maximize bug count, block premature product releases and to verify compliance with a specific requirement
- Assisted test automation team in creating test scripts for functional, performance and regression testing
- Performed different types of testing: Functional testing, System testing, Negative testing, Black Box testing, End-to-End testing, Usability testing, Performance testing, Smoke testing, and Regression testing, and facilitated User Acceptance testing of the application
- Validated the ETL process from flat file to target database tables/columns conforming to the data mapping document; tested the database to ensure entities, relationships, and data attributes are defined as per the conceptual, logical, and physical data model
- Managed the complete defect tracking lifecycle - executed test cases, analyzed test results, reported bugs, tracked bugs, and followed up with developers to resolve bugs
- Developed several SQL queries to retrieve data from database to conduct back-end testing, data validation testing, and data analysis
- Developed SQL queries to fetch complex data from different tables in remote databases using joins and database links, and formatted the results into reports (see the sketch after this list)
- Assisted in drafting documentation describing the database schema as well as metadata for each entity
- Analyzed, optimized, and tuned complex Queries/Stored Procedures in database for faster execution and assisted in developing database structures
- Involved in the UAT phase to support clients by setting up the testing environment, reporting defects, and notifying the test lead of any issues impeding progress of testing
- Actively participated in change requirement meetings and reviewed change request documents
- Generated various project reports such as Test Case Execution reports, Daily Status reports, and Weekly Status reports for testing process
- Conducted formal bug review meeting, JAD sessions, status meetings, and generated progress reports for development team, QA team, business team, and PMO team
- Managed Requirement Traceability matrix to track the test cases to business requirements and ensure the requirements have been successfully developed, tested, and implemented effectively
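A hedged example of the remote-database queries referenced above, using an Oracle database link to reconcile the same table across systems; acct_balance and the link name remote_risk are hypothetical.

    -- Join the local table to its counterpart over a database link and
    -- return only the accounts whose balances fail to reconcile.
    SELECT l.acct_id,
           l.balance AS local_balance,
           r.balance AS remote_balance
    FROM   acct_balance l
    JOIN   acct_balance@remote_risk r ON r.acct_id = l.acct_id
    WHERE  l.balance <> r.balance;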
Environment: JIRA, LoadRunner, SQL, Oracle, PL/SQL, MS Outlook, Rational Rose, MS Office, Windows, Java
Confidential, MD
Quality Analyst Tester
Responsibilities:
- Developed test plan and test cases by reviewing business requirements, functional specifications, and use cases
- Conducted requirement review meetings with Business Analysis team and Development team to determine test timelines and test scenarios
- Employed Agile XP methodology guidelines throughout the application development process
- Developed Test Scripts based on the business requirements and technical specifications
- Involved in Test Case walkthroughs and assessment meetings
- Performed Sanity Testing, Integration Testing, GUI Testing, Functional Testing, Regression Testing, System Integration Testing, and Performance testing
- Conducted back end testing using SQL queries
- Wrote and executed SQL queries for data retrieval, back-end testing, data validation, and data analysis
- Supported User Acceptance Testing (UAT) process before rolling out to production
- Managed the defect tracking lifecycle, logged defects, and maintained the defect repository using Quality Center
- Responsible for providing regular test reports to management
- Determined and documented the nature and priority of defects found during testing and coordinated with the development team to confirm fixes
- Conducted formal team review meetings and produced daily progress reports and defect reports for Project Managers
- Worked extensively to schedule and execute the batch jobs
- Actively participated in Change Requirement Meetings and reviewed Change Request Documents
- Assisted in migration of data and database objects from the Development to the Production server (see the sketch after this list)
- Created Requirement Traceability Matrix document to trace test cases and defects to requirement
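A minimal sketch of the post-migration check referenced above, assuming both environments are reachable from one session (otherwise the counts are captured separately and compared); dev_schema, prod_schema, and customer_account are hypothetical names.

    -- Row counts for a migrated table in Development vs. Production
    SELECT 'DEV'  AS env, COUNT(*) AS row_cnt FROM dev_schema.customer_account
    UNION ALL
    SELECT 'PROD', COUNT(*) FROM prod_schema.customer_account;
    -- Counts (followed by column-level spot checks) must match before sign-off.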
Environment: Windows, UNIX, SQL, Java, Quality Center, HTML, MS Visio, Oracle, TOAD, MS Office, MS Excel
Confidential
Quality Analyst
Responsibilities:
- Analyzed and reviewed the software requirements, functional specifications, and design documents
- Identified the test requirements based on application business requirements and blueprints
- Implemented QA processes – documenting Test Plan, Test Strategies, Test Scenarios, Test Objectives, and Test Procedures
- Involved in analyzing the applications and development of test cases
- Performed manual testing and maintained documentation on different types of testing - Positive, Negative, Regression, Integration, System, User Acceptance, Performance, and Black Box
- Wrote SQL queries and stored procedures to validate data (a minimal sketch follows)
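A minimal T-SQL sketch of a data-validation stored procedure of the kind referenced above; the table, columns, and the business rule itself are illustrative assumptions.

    -- Returns staging rows that break a simple business rule (non-negative amount)
    CREATE PROCEDURE dbo.usp_ValidateStagingAmounts
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT record_id, amount
        FROM   dbo.stg_records
        WHERE  amount IS NULL OR amount < 0;
    END;
    GO

    -- Run during back-end testing; any rows returned are logged as defects
    EXEC dbo.usp_ValidateStagingAmounts;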