Lead QA Tester Resume
West Des Moines, IA
SUMMARY
- 8 years of diverse experience in Information Technology with an emphasis on Software Quality Assurance and software testing of desktop and web-based applications
- Good understanding of quality assurance testing methodology relative to the Software Development Life Cycle (SDLC) as well as the Software Testing Life Cycle (STLC)
- Extensive knowledge of QA methodologies such as Rational Unified Process (RUP), Waterfall Model and Agile Methodologies
- Extensive experience in reviewing and analyzing Functional Requirement Specifications (FRS) and Software Requirement Specifications (SRS), and in writing detailed Test Plans, Test Cases, and Test Scripts
- Excellent working knowledge of designing and implementing QA test strategies and plans for both manual and automated testing, as demanded by UAT
- Extensive experience in test planning, test execution, and post-execution result analysis
- Thorough understanding of structured testing methodologies and their implementation, from both the testing team's and the end user's perspective
- Extensive working knowledge of Windows and UNIX
- Experience in both manual and automated testing of client/server and web-based applications in a mature process framework; experience in testing web services and XML
- Experience in performing various types of testing, including Integration Testing, Functional Testing, System Testing, Regression Testing, and Acceptance Testing
- In-depth experience working with manual and automated test cases using automation tools including Quick Test Professional and WinRunner
- Experience in back-end testing using SQL statements to verify that changes made in the front end are correctly reflected in the back end
- Used defect-tracking tools such as Mercury Quality Center, Bugzilla, and Jira to report defects
- Excellent problem solving skills, experience in working on group projects with a desire and ability to learn and apply new technologies
- Team player with excellent verbal and written communication skills
- Highly self-motivated, responsible, reliable, and resourceful team player with strong technical skills who takes a positive approach to problem resolution
TECHNICAL SKILLS
Testing Tools: WinRunner, QTP and Quality Center
Operating Systems: Windows, Linux and UNIX
Languages: C, C++, Java, HTML
Database: Oracle, SQL Server 2000, MS-Access, DB2, Teradata
Query Tools: Toad, MySQL, SQL Navigator, SQL*Plus, SQL Developer, Altova XMLSpy
Packages: MS Word, MS Power Point, MS Excel, Microsoft Query
Test Strategies: Functional Testing, GUI Testing, Integration Testing, System Testing, Regression Testing, UAT Testing, Black Box Testing
ETL Tools: Talend 5.1.1, Data Stage, Informatica 8.6
PROFESSIONAL EXPERIENCE
Confidential, West Des Moines, IA
Lead QA Tester
Responsibilities:
- Analyzed the Business Requirements of the project and worked with the Business Analyst to define the Acceptance Criteria and design the Test scenarios.
- Analyzed the feasibility of automation, worked with the Automation Architect to determine which tests would be executed via automation and which manually, and documented the split in the Test Plan; also involved in developing and reviewing the Test Plan.
- Designed and developed the Test Automation Framework that tests from the source to the target databases using Quick Test Pro, and modified the framework per client requirements.
- Automated the test scripts for all business scenarios for Functional and Regression Testing.
- Ensured requirement traceability was kept up to date and linked the test cases to the requirements.
- Performed ETL testing on various legacy systems (INGENIUM, AS400, FASAT, ADMINSERVER, AINA and Capsil), importing the data from these systems according to the business rules using management tools.
- Performed Integration, System, and UAT testing on Informatica mappings.
- Extensively worked on writing complex SQL queries on various ETL source and target databases such as Teradata, Oracle, Microsoft SQL Server, and DB2, retrieving data using various kinds of joins; also worked on querying extract files and XML files (web services) by importing the data into MS Access.
- Extensively worked on XML data feeds sent to Confidential's Data Aggregators and Key Distribution Partners (KDPs), writing complex XPath queries in the Altova XMLSpy tool.
- Extensively performed Functionality Testing, Sanity Testing, Smoke Testing, System Integration Testing, and Regression Testing on ETL source and target databases.
- Involved in testing generated surrogate keys and natural/business keys on various ETL target database tables, and validated the entire data set using these key fields.
- Developed and modified UNIX shell scripts to test the ETL workflows.
- Extensively worked on error handling and balance-and-control checks as part of test execution.
- Logged and tracked defects in Quality Center and linked the defects to their corresponding test cases in the Test Lab.
- Coordinated testing activities with the offshore team on a daily basis and reported weekly status to the client.
- Held daily defect meetings during the test execution phase to keep the QA process running smoothly, and worked with developers and tech leads to review defects and identify root causes.
- Provided technical support and production support.
Environment: Quick Test Pro 11.0/10.0, HP ALM Quality Center, Oracle 10g, Altova XMLSpy 2007/2012, MS Access, TOAD, UNIX, PuTTY, vi editor, Oracle Express, MS Query, Beyond Compare, Business Objects, SQL Developer, Teradata SQL Assistant 13.0, DbVisualizer 7.1.5, Informatica PowerCenter 8.6, Microsoft SQL Server 2005 and Windows XP.
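The source-to-target validation described above can be sketched as a minimal, self-contained example. This is illustrative only: SQLite stands in for the Teradata/Oracle/SQL Server/DB2 databases named above, and the table and column names are hypothetical.

```python
import sqlite3

# Illustrative sketch of source-to-target ETL validation; SQLite stands in
# for the real source/target databases, and all names/rows are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_policy (policy_id INTEGER, holder TEXT, premium REAL);
    CREATE TABLE tgt_policy (policy_id INTEGER, holder TEXT, premium REAL);
    INSERT INTO src_policy VALUES (1, 'A', 100.0), (2, 'B', 250.0), (3, 'C', 75.5);
    INSERT INTO tgt_policy VALUES (1, 'A', 100.0), (2, 'B', 250.0);
""")

# Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_policy").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_policy").fetchone()[0]

# EXCEPT (MINUS in Oracle/Teradata) finds source rows missing from the target.
missing = cur.execute(
    "SELECT policy_id, holder, premium FROM src_policy "
    "EXCEPT SELECT policy_id, holder, premium FROM tgt_policy"
).fetchall()

print(src_count, tgt_count, missing)
```

The same query shape applies against the real databases; in Oracle and Teradata the set operator is spelled MINUS rather than EXCEPT.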
Confidential
Team Lead
Responsibilities:
- Analyze business requirements and module-specific functionalities to identify test requirements and formulate an effective master test plan
- Involved in writing Test Plans, Test cases and developing Test scripts using Quality Center
- Maintain requirements and create Traceability between Requirements and Test Cases
- Validate the data of source tables with the target tables by using complex SQL Queries
- Maintain documentation on different types of Testing (Regression, Integration, System, Functional and User Acceptance Testing/UAT)
- Communicate inconsistencies between system specifications and test results to development and/or analyst team
- Perform various types of testing (System Testing, Integration Testing and Functional Testing)
- Perform regression testing of application for every new build released
- Perform testing with extract files by pulling them into Microsoft Query and comparing them with the target data in the data warehouse
- Attend business review meetings to discuss Change Requests
- Keep track of introduced Change Requests
- Perform the System testing and regression testing of the Change Request
- Manipulate the data and validate the data from source to target
- Manage the overall testing effort, including coordination of testing personnel and test lab resources
- Develop detailed Test Plans and Test Cases and provide documentation
- Execute the Test Cases by moving them into the Test Lab in Quality Center
- Retest and perform Regression Testing after fixing the problems
- Execute automated scripts responsible for Regression Testing using QTP
- Extensive use of Quality Center for Defect Tracking.
- Track and Report software defects and interact with developers to resolve technical issues
- Attend the release status meetings and update the team about the status of the defects
- Coordinate onsite/offshore team members to ensure the team stays on task and completes assigned work in a timely manner
- Take daily and weekly status updates from the offshore team members
Environment: Informatica Power Center 8.1, Quality Center, Quick Test Professional, SQL Developer, Windows XP, Virtual Machines, Microsoft Query, SQL Server 2005, DB Visualizer, AS400, Teradata, Oracle, DB2
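The extract-file reconciliation described in the bullets above can be sketched as follows. This is a hedged illustration: a pipe-delimited extract is parsed and its rows compared against the warehouse target rows, mirroring what was done with Microsoft Query; the file layout, column names, and values are all hypothetical.

```python
import csv
import io

# Hypothetical pipe-delimited extract file content (stands in for a real feed).
extract_text = "policy_id|holder|premium\n1|A|100.0\n2|B|250.0\n4|D|80.0\n"

# Rows already loaded into the hypothetical warehouse target table.
target_rows = {("1", "A", "100.0"), ("2", "B", "250.0")}

reader = csv.reader(io.StringIO(extract_text), delimiter="|")
header = next(reader)                      # skip the header record
extract_rows = {tuple(row) for row in reader}

# Reconciliation: rows present in the extract but absent from the target.
missing_in_target = extract_rows - target_rows
print(sorted(missing_in_target))
```

In practice the target side came from a SQL query against the warehouse rather than an in-memory set, but the set-difference comparison is the same.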
Confidential
Team Lead
Responsibilities:
- Analyzed business requirements and module-specific functionalities to identify test requirements and formulate an effective master test plan
- Involved in writing Test Plans, Test cases and developing Test scripts using Quality Center
- Maintained requirements and created Traceability between Requirements and Test Cases
- Validated the data of source tables with the target tables by using complex SQL Queries
- Maintained documentation on different types of Testing (Regression, Integration, System, Functional and User Acceptance Testing/UAT)
- Communicated inconsistencies between system specifications and test results to development and/or analyst team
- Performed various types of testing (System Testing, Integration Testing and Functional Testing)
- Performed regression testing of application for every new build released
- Managed the overall testing effort, including coordination of testing personnel and test lab resources
- Developed detailed Test Plans and Test Cases and provided the documentation
- Executed the Test Cases by moving into Test Lab in Quality Center
- Performed retesting and Regression Testing after defect fixes
- Executed automated scripts responsible for Regression Testing using QTP
- Extensively used Quality Center for Defect Tracking.
- Tracked and Reported software defects and interacted with developers to resolve technical issues
- Attended the release status meeting and updated the team about the status of the defects
- Coordinated onsite/offshore team members to ensure the team stayed on task and completed assigned work in a timely manner
- Took daily and weekly status updates from offshore team members
Environment: Informatica Power Center 8.1, Quality Center, Quick Test Professional, SQL Developer, Windows XP, Virtual Machines, SQL Server 2005, DB Visualizer, AS400, Teradata, Oracle, DB2
Confidential, Madison, WI
QA Tester
Responsibilities:
- Analyzed Business Requirements documents for the iMAS and NimbleDesign applications.
- Performed requirements review and manual test verification
- Performed extensive manual testing of Java/J2EE Windows and web-based applications
- Created Test Run Plans, Test Run Reports, Test Plans, Test Cases and Test Scenarios based on customer and product software requirements, Use Cases, and Functional Specifications in Quality Center and Excel
- Conducted smoke testing in the Eclipse Java IDE before each release, and again as soon as the version was released, to ensure it was ready for testing.
- Performed System, Functional, Integration, Regression, Compatibility, GUI, UAT, End-to- End and Sanity testing for each iteration
- Participated in unit testing using JUnit.
- Created the SQL queries to perform the testing against the Oracle database
- Wrote defects using Quality Center tool, Bugzilla and Jira then followed-up with developers, to retest defects and ultimately close the defects
- Attended CCB (Change Control Board) meetings, participated in bug review meetings with the software development team throughout the testing phase
- Worked with the whole project team during go-live and Production testing, problem resolution
- Performed Multi-OS testing with Windows XP/Vista/7, Linux 2.6 and Mac 10.6
- Created traceability of test cases to requirements, test metrics and participated in status meetings and reported the progress to the manager.
- Validated data by executing Perl scripts
- Extensive use of Linux/Unix commands to perform data comparisons
- Tested the DAU tool in a Red Hat Linux environment
Environment: Java/J2EE, JDBC, Servlets, JSPs, Java Beans, Spring, JMS, Web Services (SOAP, WSDL), J2EE design patterns, Tomcat, Quality Center, Bugzilla, Oracle 10g, Virtual Machines, JUnit, Toad, Windows XP/Vista/7, Linux/Unix, and Mac OS X 10.5 and 10.6
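The Linux/Unix data comparisons mentioned above (commands such as diff and comm over exported data sets) can be sketched in Python using difflib, which produces a unified diff. This is a minimal illustration; the record values are hypothetical.

```python
import difflib

# Two hypothetical exported data sets, shown as line lists; in practice
# these would be files compared with Linux commands such as diff or comm.
baseline = ["101,OPEN", "102,CLOSED", "103,OPEN"]
candidate = ["101,OPEN", "102,PENDING", "103,OPEN"]

# Produce unified-diff output, then keep only the changed records
# (skipping the "---"/"+++" file-header lines).
diff = list(difflib.unified_diff(baseline, candidate,
                                 fromfile="baseline", tofile="candidate",
                                 lineterm=""))
changed = [line for line in diff
           if line.startswith(("-", "+"))
           and not line.startswith(("---", "+++"))]
print(changed)
```

The filtered output isolates the records that differ between the two runs, which is the same signal a shell-level `diff` gives during data validation.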
Confidential, Madison, WI
QA Tester
Responsibilities:
- Analyzed business requirements and module-specific functionalities to identify test requirements and formulate an effective master test plan
- Expertise in developing Test Scenarios, Test Procedures, Test Cases and Test Data
- Conducted manual tests meticulously and exhaustively for the entire application using manual test scripts designed for this purpose
- Strong experience testing client/server and web-based applications with both manual and automated testing tools in a mature process framework
- Validated source data against target data by writing complex SQL queries on both databases
- Expertise in Functional, Manual, Automated, Regression, User Acceptance (UAT), System, Integration, White Box, and Black Box testing.
- Performed Functional and Regression Testing
- Conducted Functionality, Interface and Regression Testing during the various phases of development
- Executed Test scripts from Quality Center Test lab on local and remote machines
- Developed SQL queries to test the database functionality of the application
- Worked with the QA Manager and Project Managers to prioritize and schedule testing and to resolve scheduling conflicts
- Documented test execution results and prepared reports
- Conducted Regression testing for the fixes and enhancements in the application
- Reported and tracked defects in the Defects tab in Quality Center
- Responsible for weekly status meeting to gather the progress of manual testing
- Updated team lead with daily and weekly testing status
Environment: Quality Center, Oracle 10g, Java, Informatica PowerCenter, SQL Developer, MS Excel, MS Word 2003, JavaScript, HTML, DHTML, Agile, Windows Server 2003, WinRunner, MS Outlook 2003, QTP, UNIX, and Windows
Confidential, Brooklyn, NY
QA Tester
Responsibilities:
- Analyzed the business requirements and specifications.
- Designed and developed test plans and tests scripts for manual testing of all the modules.
- Developed test cases after analyzing the specifications document.
- Analyzed the Use Case diagram, Sequence diagram, Class diagram and wrote test cases.
- Tested the website for all the functionalities.
- Involved in creating environment for the test bed to test the application.
- Performed GUI testing using Mercury Win Runner automation tool.
- Performed extensive Regression Testing for subsequent versions of the application using WinRunner
- Proficiency in database testing and browser testing
- Writing and enhancing Test Scripts with every release of the product.
- Excellent logical skills for understanding and developing system workflows and for computing and verifying software metrics; well suited to communicating with both technical and non-technical professionals
- Experience in writing SQL queries for database manipulations.
- Participated in unit testing using JUnit.
- Performed end-to-end testing on the release version of the software application and detected GUI bugs.
- Reported and tracked defects in the Defects tab in Quality Center.
- Facilitated defect calls between the Business, Production, Development, QA, System Integration Testing, and User Acceptance Testing teams
- Set up a Performance Lab with multiple machines, networks, and simulated environments.
- Knowledge of defect severity analysis, defect tracking systems, and defect reporting
Environment: Mercury WinRunner 7.0, LoadRunner 8.1, Java, J2EE, Servlets, JSP, JavaScript, JUnit, HTML, XML, MS SQL Server 2000, Oracle, Windows NT