QA Lead Resume
Hartford, CT
Summary:
Around 13 years of IT experience, including around 8 years as a Team Lead in the design, development, and implementation of testing procedures and methodologies for Web, Client/Server, and ERP-based applications. Extensive experience in ETL (Informatica) testing and in Oracle, SQL Server, UDB DB2, and ERP applications. Over 5 years of experience developing and executing automated/manual test scenarios. Experienced in leading and coordinating testing efforts and communicating their status to ensure that QA best practices are followed. Extensive knowledge of QA methodology, standards, and procedures, such as creating test plans and developing test cases and test scripts on multiple operating systems, including Windows, UNIX, and LINUX. Involved in status/project meetings to resolve issues and report testing progress.
Expertise includes:
- Strong experience implementing all phases of the Software Testing Life Cycle, including automation testing using WinRunner, LoadRunner, and Quality Center.
- Extensive knowledge of Black Box, Functionality, Positive, Negative, and Regression Testing.
- Involved in Functional, GUI, Database, and Performance Testing of applications.
- Strong knowledge of writing SQL queries to perform Database Testing.
- Experience creating UDFs (User Defined Functions) to enhance test scripts.
- Excellent interpersonal and oral/written communication skills, with the ability to work independently as well as in a team.
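The SQL-based database testing mentioned above can be sketched with a minimal example; the table, columns, and data-quality rule are purely illustrative, using Python's built-in sqlite3 module:

```python
import sqlite3

# Hypothetical customer table used to illustrate SQL-based database checks.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, status TEXT)")
cur.executemany(
    "INSERT INTO customers (id, email, status) VALUES (?, ?, ?)",
    [(1, "a@example.com", "active"), (2, "b@example.com", "inactive"), (3, None, "active")],
)

# Check 1: row count after a load matches the expected count.
row_count = cur.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print("row count:", row_count)  # 3

# Check 2: data-quality rule -- no active customer may have a NULL email.
bad = cur.execute(
    "SELECT COUNT(*) FROM customers WHERE status = 'active' AND email IS NULL"
).fetchone()[0]
print("active rows with NULL email:", bad)  # 1 -> would be logged as a defect
```

In practice the same queries would run against the project database (Oracle, DB2, SQL Server) rather than an in-memory store.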
Technical Skills:
Operating Systems: Windows XP/2000/NT/98, UNIX
Languages: C/C++, Java, VB, Siebel, PeopleSoft, SQL, PL/SQL
Databases: Oracle, DB2, MS SQL Server, MS Access
Testing Tools: WinRunner, LoadRunner, TestDirector
Web Technologies: Java, J2EE, JSP, HTML, JavaScript, VBScript, ASP.NET, VB.NET
Tools: Informatica, DataStage, SQL Navigator
Education:
Master's in Computer Applications.
Professional Experience:
Confidential, Windsor/Hartford, CT
QA Lead
August 2007 – Present
Responsibilities:
- Design and develop test strategies, master test plans, and test cases, and deploy data warehouse extract/transform/load (ETL) processes using Informatica and SQL in DB2 and Oracle environments, based on functional requirements and specifications (Process Flows and Pop Specs).
- Experienced in back-end data warehouse testing; proficient in writing and executing complex SQL statements.
- Understand ETL concepts, with hands-on experience testing Informatica workflows.
- Proficient in most test automation tools (WinRunner, QTP, SilkTest, etc.), automation methodologies, and writing automated scripts.
- Execute test cases and perform functional and regression testing; mentor and manage test team members.
- Work with end users to create user acceptance test strategies and test scenarios to verify that the system performs the required business functions.
- Experienced in leading small to large test teams; prioritize and schedule testing tasks in accordance with project delivery guidelines and methodology.
- Experience working with offshore teams and writing UNIX shell scripts.
- Strong experience writing and testing PL/SQL code (queries, stored procedures, etc.).
- Communicate test team expectations and strategies to the rest of the team and the project manager.
- Work experience in the insurance domain, with exposure to SOX compliance.
- Create test status reports and defect reports for management.
- Advanced skills in test management tools such as Quality Center for requirements, planning, design and execution of automated scripts, defect management, report generation, and traceability.
- Testing coordination/execution: defect management, status, issues, and metric reporting.
Environment: IBM Mainframe, Oracle, DB2, Sybase, UNIX, LINUX, JCL, Informatica, WMS, WinRunner, LoadRunner, MS Office, MS Visio
Confidential, New Jersey
Warehouse Management System QA Lead
March 2003 – July 2007
Responsibilities:
- Formed a QA team, including developing job descriptions, hiring and training team members, and performing staff performance reviews.
- Owned the creation and ongoing maintenance of test process documentation and test plans
- Worked with the development team to create a suite of test data (both input files and expected results) that fully exercises data validation (detecting and rejecting bad data) and actual ETL (Informatica) functionality (properly processing good data, i.e., transformations, calculations, and derivations)
- Managed the ETL build validation process on an ongoing basis
- Responsible for the validation of any additions/modifications to the Cognos reports before release to production
- Coordinated with the ETL and Cognos administrators on the development/test/production promotion process, i.e., how new builds get deployed to production after successful testing
- Wrote SQL queries to validate that actual test results match expected results
- Directed the load and verification process for new clients' initial data before release to production, working with other QA analysts to ensure that any client-specific business rules were applied correctly.
- Identified and logged defects when a test failed, using SQL where necessary to narrow down the root cause for efficient investigation by the development team.
- Owned the overall defect tracking process, by which defects are prioritized and scheduled for resolution, including preparation of reports that summarize overall project status for review by the team
- Participated in stress testing activities, in collaboration with the development and production teams, to ensure satisfactory performance of both ETL processing and end-user reporting
- Collaborated with the development team on the creation of test strategies and plans.
Environment: AS/400, DB2, UNIX, LINUX, JCL, Informatica, WMS, WinRunner, LoadRunner, MS Office, MS Visio
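The SQL validation of actual versus expected results described in this role can be sketched as a set comparison; the tables and rows below are illustrative stand-ins, not actual project data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical actual (post-ETL) output and hand-built expected-results tables.
cur.execute("CREATE TABLE actual (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE expected (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO actual VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 31.0)])
cur.executemany("INSERT INTO expected VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])

# Rows produced by the ETL run that were not expected, and expected rows that are missing.
extra = cur.execute("SELECT * FROM actual EXCEPT SELECT * FROM expected").fetchall()
missing = cur.execute("SELECT * FROM expected EXCEPT SELECT * FROM actual").fetchall()
print("unexpected rows:", extra)   # [(3, 31.0)]
print("missing rows:", missing)    # [(3, 30.0)]
```

An empty result for both queries means the actual output matches the expected set exactly; any row returned becomes a candidate defect.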
Confidential, Parsippany, New Jersey
ETL/QA Analyst
May 2001 – February 2003
Responsibilities:
- As a QA Analyst, provided technical quality assurance for web-based applications, meeting both company standards and end-user requirements.
- Developed manual and automated test cases to verify application functionality, and generated WinRunner test scripts and automated them for future reuse.
- Experience in mainframe and client/server testing, data validation, and cross-system integration.
- Executed UNIX shell scripts for batch job execution.
- Prepared scorecards and test metrics after each cycle's test execution.
- Logged defects in Quality Center, assigning appropriate priority and severity levels to each defect.
- Performed data reconciliation for the entire ETL (Informatica) project and wrote several back-end SQL queries.
- Interacted heavily with the ETL team on data quality testing and data validation.
- Worked on the ETL reconciliation process to verify exact counts of source, target, and reject records.
- Wrote test plans, test condition scripts, and test cases.
- Performed all phases of testing, including User Acceptance, Functionality, Regression, Integration, Black Box, Interface, and End-to-End Testing.
- Performed Unit Testing and validated the data.
- Communicated defects identified in the testing environment to the developers using Mercury's Test Director defect tracking tool.
- Created and executed various load test scenarios, analyzed load test results using LoadRunner Analysis, and wrote performance/load test plans based on user requirements.
- Worked with the database to compare front-end data values with back-end data values.
- Responsible for executing test scripts manually to ensure that quality control procedures were correctly implemented and effectively improved.
- Wrote test cases to test the application manually and automated them using WinRunner.
- Logged and tracked defects in Mercury Quality Center.
Environment: SQL, PL/SQL, UNIX, LINUX, Informatica, WinRunner, RCS, SAS, Oracle 9i, XML, Excel, IE, Netscape, Windows XP
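The ETL reconciliation described in this role (exact counts on source, target, and reject records) can be sketched as a simple balance check; the tables and record counts are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical source feed plus the target and reject tables produced by an ETL run.
cur.execute("CREATE TABLE source (id INTEGER)")
cur.execute("CREATE TABLE target (id INTEGER)")
cur.execute("CREATE TABLE reject (id INTEGER)")
cur.executemany("INSERT INTO source VALUES (?)", [(i,) for i in range(100)])
cur.executemany("INSERT INTO target VALUES (?)", [(i,) for i in range(97)])
cur.executemany("INSERT INTO reject VALUES (?)", [(i,) for i in (97, 98, 99)])

counts = {
    t: cur.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
    for t in ("source", "target", "reject")
}
# Reconciliation rule: every source record is either loaded or rejected.
balanced = counts["source"] == counts["target"] + counts["reject"]
print(counts, "balanced:", balanced)
```

A count mismatch signals dropped or duplicated records and would be investigated with row-level comparison queries.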
Confidential
QA Engineer
September 1998 – April 2001
Responsibilities:
- Created Test conditions and Test Plan after reviewing the Business Requirements and Technical Specifications.
- Prepared Test Cases for all the links and various policy activities to test the company Portal.
- Identified and Automated the Test cases for Regression Testing
- Developed Manual Scripts to test the functionality of the new portal.
- Created multi-environment scripts that could be executed in different environments.
- Used regular expressions to identify object properties in changing environments.
- Built data-driven reusable actions that could be used in different test scripts/flows.
- Created different Test Lab test sets to run in QA and production environments.
- Used verification points and output values, and conducted data-driven tests with multiple test data sets to verify the contents of the website.
- Executed automation scripts for QA release cycles, production checkout, and infrastructure changes.
Environment: Siebel 6.0 (CRM), MS SQL Server, SQL Navigator, VB.NET, ASP.NET, UNIX, WebLogic 6.1/8.1, XML, Windows XP, IE, Netscape
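The data-driven testing pattern used in this role (one verification routine run against multiple test data sets) can be sketched as follows; the validation rule and the data rows are hypothetical, chosen only to illustrate the pattern:

```python
# Hypothetical data-driven test: the same verification logic is executed once
# per row of a test data table, as with iterations of an automated script.
def is_valid_policy_number(value: str) -> bool:
    """Illustrative validation rule: 2 letters followed by 6 digits."""
    return len(value) == 8 and value[:2].isalpha() and value[2:].isdigit()

# Each row pairs an input with its expected result (a stand-in for a data sheet).
test_data = [
    ("AB123456", True),
    ("12345678", False),
    ("AB12345", False),
    ("CD987654", True),
]

results = []
for value, expected in test_data:
    actual = is_valid_policy_number(value)
    results.append(actual == expected)
    print(f"{value!r}: expected={expected} actual={actual} -> "
          f"{'PASS' if actual == expected else 'FAIL'}")

print("all passed:", all(results))
```

Adding a new scenario then means adding a data row, not writing a new script, which is the main payoff of the data-driven approach.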