Testing Lead Resume
IL
Work Experience
Software Engineer (Data Warehouse), Confidential, May 2006
Data Warehouse Tester, Confidential, Dec 2007
Data Warehouse Engineer, Confidential, May 2008
Sr. Data Warehouse Analyst, Confidential, Apr 2009
QA Analyst, Confidential, Aug 2010
Testing Lead, Confidential, Mar 2012
PROFESSIONAL EXPERIENCE
- 6+ years of IT experience in Quality Assurance for ETL, ERP, Web based, Client/Server applications using Manual and Automated testing tools.
- Proficient in writing SQL queries to perform data-driven tests, with involvement in both front-end and back-end testing. Strong knowledge of RDBMS concepts; developed SQL queries against Oracle databases to conduct DB testing. Worked with data files and their label parameters, and strong in writing UNIX Korn shell scripts.
- Experience in analyzing Business Requirement Documents (BRD) and Functional Requirement Specifications (FRS) and assisting in developing test plans.
- Expertise in creating and developing test cases and test scripts.
- Experience in working as a liaison between the team members and the manager.
- IT industry experience in the Banking, Financial, Software Services and Telecommunication industries, with strong business and functional knowledge, the ability to evaluate ETL/BI specifications and verify them against systems, and experience with Master Data Management (MDM).
- Experience in testing Data Marts, Data Warehouse/ETL Applications developed in Informatica, Data Stage and Ab Initio using Oracle, DB2, SQL Server and UNIX.
- Expertise in writing SQL queries, PL/SQL Stored Procedures and Triggers
- Familiar with Informatica Repository Manager, Designer, Workflow Manager, Workflow Monitor
- Comfortable working with fixed-length and delimited flat files, as well as large volumes of data in the database/data warehouse.
- Expertise in developing test cases/ test scripts for Inbound and Outbound ETL processes.
- Experience in testing/validating the Source, Stage and Target (End-to-End).
- Expert in analyzing errors that cause ETL load failures by reviewing the log files and reporting them to the responsible team for resolution.
- Strong knowledge of working with Data Marts (Star Schema and Snowflake Schema)
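The SQL-driven source-to-target validation described above can be illustrated with a minimal sketch. The table names and data are hypothetical, and SQLite stands in for the Oracle/SQL Server databases named in this resume so the example is self-contained:

```python
import sqlite3

# A minimal sketch of source-vs-target ETL validation; the "src_orders" and
# "tgt_orders" tables are hypothetical stand-ins for real warehouse tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

# Row-count reconciliation: source and target must agree after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# Data reconciliation: rows present in the source but missing in the target.
mismatches = cur.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM tgt_orders
""").fetchall()
print(len(mismatches))  # 0 when source and target agree
```

The same count-then-compare pattern applies whether the comparison runs against staging, the ODS, or the final mart.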
Project Details
Confidential, IL
Project Title: Used Truck Solution
Domain: Finance
Scope of the project: Navistar is decommissioning the legacy mainframe Marketing and Retail Accounting Systems (MRAS). The MRAS system has ties to Warranty, Parts, Navistar Finance Corporation (NFC), Truck Invoicing, miscellaneous GL transactions and Used Truck Operations.
Duration: From: AUG 2010 TO: Present
Team Size: 9
Role: Testing Lead
Responsibilities
- Participating in MRAS Decommissioning project.
- Managing the end-to-end testing activities for the implementation of the Used Truck Solution application throughout the project life cycle.
- Working directly with testing resources to co-ordinate technical and functional testing.
- Identifying the project business requirements to schedule and communicate the testing plans and results that support the production release decisions.
- Breaking down the requirements and developing test strategies for projects of simple to high complexity.
- Confirming that the testing resource follows the testing standards, guidelines and testing methodology as specified in the test approach.
- Reviewing test deliverables including test plans, test cases and requirement traceability.
- Measuring and Monitoring testing progress during each test cycle to confirm that the application meets the requirements and is piloted on time and within budget.
- Review and confirm defect tracking (Identification, fixing, re-testing and migration of defects) is conducted during each testing cycle.
- Manage test environments and version control for each test cycle.
- Document test results (metrics) for status meetings in order to track testing progress; ensure results are accessible to project teams.
- Checking the performance of the solution.
- Develop a test plan to co-ordinate time and cost estimates.
- Facilitate the resolution of issues through individual and collaborative efforts.
- Delivering a Microsoft Dynamics AX/IDMS Used Truck Solution covering modules such as Customer, Accounts Receivable, Accounts Payable, Commissions, Locators, and new VIN attributes.
- Performing test over Customer Master Interface between multiple ERP interfaces like Siebel, PeopleSoft, and Microsoft Dynamics AX.
- Performing integration test over the used truck solution which interlinks to six different tools.
- Participating in Mass sync test and support the production.
- Secured the interfaces on different environments (UT1, DEV1, TRN1) and performed regression testing to check for DataPower (SoapUI) issues.
- Integrating all the modules of Microsoft Dynamics AX-IDMS (AR, AP, Banking, GL, Administration, Inventory etc.)
- Software Tools/Skills: T-SQL, PL/SQL, Microsoft Dynamics AX 2009, Microsoft SQL Server, SharePoint, Control Version System, Siebel CRM 8.0, PeopleSoft, Management Studio, Visio, ActivePerl 5.8.9, Shell Scripting, Teradata, SoapUI
- Environment: UNIX, Windows XP, ASP.Net, Visual Studio 2010, Cognos 8.0, SQL Server 2008, Oracle 10G.
Confidential, CA
Project Title: Data Migration Delivery
Domain: Banking
Scope of the project: Data migration of wholesale customers from Wachovia Bank to Wells Fargo Bank. The migration includes services such as Desktop Deposit and remote deposit capture, and involves migrating the legacy Wachovia database (SQL Server) into the Wells Fargo database (Oracle).
Duration: From: March 2010 TO: March 2012
Team Size: 30
Role: QA Analyst Lead
Responsibilities
- Liaison between onsite and offshore resources.
- Involved in functional, exploratory and Integration testing.
- Performed Data validity testing for reports and feeds based on client's requirement.
- Validated format of the reports and feeds.
- Performed manual testing of the application to check the data validity of the reports.
- Written SQL queries to access the data in SQL Server and Oracle database to execute back-end testing.
- Created special SQL script files for automation.
- Updated weekly status on the testing progress and other concerned issues.
- Involved in Formal Reviews and walkthrough for preparing test plans and test cases.
- Participated in daily QA meetings to resolve technical issues.
- Communicated with developers and Business Analysts to discuss issues and priorities.
- Test Case Management using Quality Center
- Identify the primary key (logical/physical) and the references between the tables, and apply update or insert logic
- Deleting the target data before processing based on logical or physical primary key
- Design and execute the test cases on the application as per company standards
- Preventing occurrences of multiple runs by flagging processed dates
- Written Test Cases for ETL to compare Source and Target database systems.
- Interacting with senior peers or business to learn more about the data
- Identifying duplicate records in the staging area before data gets processed
- Test data creation in various required formats up to millions of records as per banking requirements.
- Actively involved in the requirements gathering sessions
- Testing the source and target databases for conformance to specifications
- Conditional testing of constraints based on the business rules
- Developed Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any Change Control in requirements leads to test case update.
- Conducted regression testing on the application under test
- Software Tools/Skills: TSQL, PL/SQL, Informatica 8.6.1, Oracle SQL Developer, TOAD, Quality Center 10.0, Control Version System, Data Center Manager, Management studio, Erwin, Visio, Active Perl 5.8.9, Shell Scripting
- Environment: UNIX, Windows XP, SQL Server 2008, Oracle 10G.
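The staging-area duplicate check and delete-before-insert logic mentioned in the responsibilities above can be sketched as follows. The table and key names are illustrative, with SQLite standing in for the actual SQL Server/Oracle databases:

```python
import sqlite3

# Sketch of two checks: finding duplicates on the logical key in staging,
# and delete-before-insert on the target. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_customers (customer_id INTEGER, name TEXT);
    INSERT INTO stg_customers VALUES (101, 'Acme'), (102, 'Beta'), (101, 'Acme');
    CREATE TABLE tgt_customers (customer_id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO tgt_customers VALUES (101, 'Old Acme'), (103, 'Gamma');
""")

# Duplicate check: any logical key appearing more than once in staging.
dupes = cur.execute("""
    SELECT customer_id, COUNT(*) FROM stg_customers
    GROUP BY customer_id HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [(101, 2)]

# Delete-before-insert: clear target rows whose key is about to be reloaded,
# so reprocessing a batch cannot create duplicates in the target.
cur.execute("""
    DELETE FROM tgt_customers
    WHERE customer_id IN (SELECT DISTINCT customer_id FROM stg_customers)
""")
remaining = cur.execute("SELECT customer_id FROM tgt_customers").fetchall()
print(remaining)  # [(103,)]
```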
Confidential, Illinois
Project Title: Equity Sales system
Domain: Finance
Scope of the project: The project involved validation of equity quote request provided by the users. It has checks for validating the UI portal using tools QTP. The data validation is carried out on the basis of the specifications and use case documents provided by the business team.
Duration: From: JAN 2009 TO JUN 2010
Team Size: 4
Role: Sr. Data Warehouse Tester
Responsibilities
- Executed test cases and updated them as needed.
- Involved in functional, exploratory and Integration testing.
- Performed Data validity testing for reports and feeds based on client's requirement.
- Validated format of the reports and feeds.
- Performed manual testing of the application to check the data validity of the reports.
- Written SQL queries to access the data in SQL Server database to execute back-end testing.
- Updated weekly status on the testing progress and other concerned issues.
- Involved in Formal Reviews and walkthrough for preparing test plans and test cases.
- Participated in daily QA meetings to resolve technical issues.
- Communicated with developers and Business Analysts to discuss issues and priorities.
- Used the Test Case Management System tool to create test cases.
- Identify the primary key (logical/physical) and apply update or insert logic
- Deleting the target data before processing based on logical or physical primary key
- Design and execute the test cases on the application as per company standards
- Preventing occurrences of multiple runs by flagging processed dates
- Written Test Cases for ETL to compare Source and Target database systems.
- Testing of records with logical delete using flags
- Interacting with senior peers or subject matter experts to learn more about the data
- Identifying duplicate records in the staging area before data gets processed
- Extensively written test scripts for back-end validations
- Ensured that the mappings are correct
- Conducted data validation testing
- Actively involved in the requirements gathering sessions
- Testing the source and target databases for conformance to specifications
- Conditional testing of constraints based on the business rules
- Identify and request test resources like QA engineers, Software (SW) and Hardware (HW)
- Performed functional testing using QTP for end to end application testing.
- Written SQL scripts to test the mappings.
- Developed Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any Change Control in requirements leads to test case update.
- Conducted regression testing on the application under test
- Create and execute load test plans and scripts using Load Runner
- Debugging the SQL-Statements and stored procedures
- Developed regression test scripts for the application
- Involved in metrics gathering , analysis and reporting to concerned team
- Tested the Oracle PL/SQL testing programs
- Prepared daily/weekly bug status reports highlighting bug fix metrics and tracked the progress of test cycles in Quality Center
- Conducted Training & Knowledge Transfer Sessions on new applications to QA Analysts
- Software Tools/Skills: Informatica 8.6.1, SQL, PL/SQL, XML, XML Spy 2010, Shell Scripting, QTP 9.5, TOAD, Quality Center 10.0, VB Script, Teradata SQL Assistant
- Environment: UNIX, Windows XP, Oracle 10G, Teradata v12, SQL Server 2008.
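"Preventing occurrences of multiple runs by flagging processed dates," noted in the responsibilities above, amounts to a run-control check. A minimal sketch, assuming a hypothetical `etl_run_log` audit table:

```python
import sqlite3

# Sketch of run control via a processed-date flag; a second run for the
# same batch date is refused. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE etl_run_log (batch_date TEXT PRIMARY KEY, status TEXT)")

def run_batch(batch_date: str) -> bool:
    """Return True if the batch ran, False if that date was already processed."""
    seen = cur.execute(
        "SELECT 1 FROM etl_run_log WHERE batch_date = ?", (batch_date,)
    ).fetchone()
    if seen:
        return False
    # ... the actual load would run here ...
    cur.execute("INSERT INTO etl_run_log VALUES (?, 'PROCESSED')", (batch_date,))
    return True

print(run_batch("2010-01-15"))  # True: first run loads and flags the date
print(run_batch("2010-01-15"))  # False: the flag blocks a second run
```

A tester exercises both paths: the first run must load and flag the date, and a rerun for the same date must be rejected without touching the target.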
Confidential, New Jersey
Project Title: Financial Data warehouse
Domain: Finance, Banking
Scope of the project: This is an anti-money-laundering project. It features a brand-new framework, APPOLLO or CDS, referred to as QCDS in the QA environment, PCDS in production, and DCDS in Dev. There is an audit in Sept/Oct to bring Barclays card processing up to anti-money-laundering standards. This covers all cards, Consumer and Business.
Duration: From: MAY 2008 TO APR 2009
Team Size: 11
Role: Data Warehouse Engineer
Responsibilities
- Designed and created test cases based on the business requirements (also referencing the source-to-target detailed mapping document and transformation rules document).
- Involved in extensive DATA validation using SQL queries and back-end testing
- Used SQL for Querying the database in UNIX environment
- Developed separate test cases for ETL process (Inbound & Outbound) and reporting
- Involved with Design and Development team to implement the requirements.
- Developed and Performed execution of Test Scripts manually to verify the expected results
- Design and development of ETL processes using Data Stage ETL tool for dimension and fact file creation
- Involved in Manual and Automated testing using QTP and Quality Center.
- Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration, with positive and negative scenarios).
- Tracked, reviewed, analyzed, and compared defect results using Quality Center.
- Participating in the MR/CR review meetings to resolve the issues.
- Defined the Scope for System and Integration Testing
- Identifying field and data defects, with the required information, in the DataStage ETL process across various jobs and one-to-one mappings.
- Prepared and submitted summarized audit reports and took corrective actions
- Involved in Uploading Master and Transactional data from flat files and preparation of Test cases, Sub System Testing.
- Document and publish test results, troubleshoot and escalate issues
- Preparation of various test documents for ETL process in Quality Center.
- Involved in Test Scheduling and milestones with the dependencies
- Functionality testing of email notifications for DataStage job failures, aborts, or data issue problems.
- Identify, assess, and communicate potential risks to the testing scope, product quality, and schedule.
- Used PL/SQL, SQL Loader as part of ETL process to populate the operational data store.
- Created and executed test cases for DataStage jobs to upload master data to repository.
- Identified & presented effort estimations related to testing efforts to Project Management Team
- Conducted test case review and revision of surrogate key generation in DataStage to uniquely identify master data elements for newly inserted data.
- Responsible for understanding enhancements and new features developed and training others on them
- Conduct load testing and provide input into capacity planning efforts.
- Provide support to client with assessing how many virtual user licenses would be needed for performance testing, specifically load testing using Load Runner
- Create and execute test scripts, cases, and scenarios that will determine optimal system performance according to specifications.
- Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
- Sending package install requests for new builds and verifying proper packages are installed.
- Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
- Software Tools/Skills: Windows XP, DataStage 7.x, QTP 9.2, SQL Server 2005, PL/SQL, Quality Center 9.0, LoadRunner 7.0, Oracle 10g, Java, UNIX AIX 5.2, VB Script.
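The field-size validation against metadata described in the responsibilities above can be sketched as a data-versus-specification check. The table, columns, and length limits below are hypothetical:

```python
import sqlite3

# Sketch of field-size validation against documented metadata; the table,
# columns, and limits are illustrative stand-ins for the real schema.
expected_max_len = {"card_no": 16, "mcc_code": 4}  # from the mapping document

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE card_txns (card_no TEXT, mcc_code TEXT);
    INSERT INTO card_txns VALUES ('4111111111111111', '5411'), ('4000', '59999X');
""")

# Flag any loaded value longer than the metadata allows.
violations = []
for col, max_len in expected_max_len.items():
    rows = cur.execute(
        f"SELECT {col} FROM card_txns WHERE LENGTH({col}) > ?", (max_len,)
    ).fetchall()
    violations.extend((col, r[0]) for r in rows)

print(violations)  # [('mcc_code', '59999X')]
```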
Confidential, Illinois
Project Title: Enterprise Data warehouse – Claims Data mart
Domain: Insurance
Scope of the project: The project concerns a UI web portal developed for providing quotes for group/corporate insurance. End-date and reminder notifications are triggered to the corporate offices through production.
Duration: From: DEC 2007 TO APR 2008
Team Size: 11
Role: Data Warehouse/ ETL Tester
Responsibilities
- Analyzed specifications and test plan for the testing process of Insurance Applications
- Developed test cases after analyzing the specifications Document.
- Developed Base line scripts for testing the future releases of the application using Rational Robot
- Ability to express technical concepts and procedures clearly, in easy-to-understand terms, both verbally and in writing
- Developed test scripts for Functional, Performance and data driven tests for the site.
- Written test cases to test the Performance of the application using Rational Performance Tester
- Performed Load and Performance testing using Rational Performance Tester.
- Conducted stress testing using Rational Performance Tester.
- Conducted Performance testing under off load and peak load conditions.
- Lead test case review sessions
- Used Rational Clear Quest for the defect reporting and tracking
- Identify and record defects with valuable information for issue to be reproduced by development team
- Generated functional JUnit test cases that capture actual code behavior as the deployed application is exercised
- Generated extendable JUnit and Cactus (in-container) tests that expose reliability problems and capture behavior
- Executed the test suite to identify regressions and unexpected side effects
- Parameterized test cases for use with varied, controlled test input values (runtime-generated, user-defined, or from data sources)
- Monitored test coverage and achieved high coverage using branch coverage analysis
- Identified memory leaks during test execution
- Tested within an Agile environment
- Stepped through tests with the debugger
- Tested individual methods, classes, and large, complex applications
- Tested within a timebox
- Tracked how test results and code quality changed over time
- Communicated application/product readiness and testing results to the project team on a regular basis
- Conducted Functionality& Regression testing during the various phases of the application Using Rational Robot
- Executed the test scripts using Rational Robot and analyzed the results.
- Created automated scripts in Rational Robot to conduct GUI and functionality testing
- Created batch test for overnight execution of SQA test scripts.
- Used Rational Clear Case for version controlling
- Software Tools/Skills: Apple Mac 10.4, Oracle 7.3, Java 2.0, Shell Scripting, XML, JDBC, JavaScript, JSP, Servlets, JBuilder, J2EE, EJB, Rational Robot 2003, Rational Performance Tester 7.0, Rational ClearQuest 7.0.1, Rational ClearCase 7.0.
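The data-driven, parameterized testing described above can be illustrated with a small sketch. The project itself used JUnit; this is an equivalent in Python's unittest, and `premium_rate` is a toy function invented purely for the example:

```python
import unittest

# Illustrative sketch of a data-driven (parameterized) test; premium_rate is
# a hypothetical rating rule, not part of the actual project.
def premium_rate(age: int) -> float:
    """Toy rating rule used only to demonstrate parameterization."""
    return 1.5 if age >= 60 else 1.0

class TestPremiumRate(unittest.TestCase):
    def test_rate_by_age(self):
        # Controlled input values paired with expected outputs.
        cases = [(25, 1.0), (59, 1.0), (60, 1.5), (75, 1.5)]
        for age, expected in cases:
            with self.subTest(age=age):
                self.assertEqual(premium_rate(age), expected)

# Run the suite programmatically instead of via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestPremiumRate)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Each (input, expected) pair is reported individually by `subTest`, mirroring how a parameterized JUnit case surfaces one failure per data row.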
Confidential, India (Wipro Technologies)
Project Title: Corporate Banking Data mart
Domain: Banking
Duration: From: MAY 2006 TO OCT 2007
Team Size: 6
Role: Data Warehouse Engineer
Responsibilities
- Understanding of complex financial business requirements
- Writing test plans and test scripts
- Maintained Progress Report of team members for update on individual work during test execution cycle.
- Written complex SQL queries.
- Performed testing in mainframe environment
- Tagging the testable and non-testable requirements in the FSS
- Updating the tagging process for new versions of the FSS
- High Level Test Case Design and writing detailed test cases
- Involved in extensive DATA validation using SQL queries and back-end testing
- Reported periodic project status and updates to the QA Lead and QA Manager
- Heavily involved in interacting with UNIX Shell scripts.
- Analyzed Business Requirements and Developed the Test Plan, Test Scripts and Test Cases
- Exporting test cases to Test Director
- Preparation of Data Requirements
- Data Mapping (one-to-one mappings)
- Accessing mainframes and validating the data
- Identify & record defects with required information for issue to be reproduced by other teams.
- Worked with high-volume, real-time DB2 database applications and systems
- Conducted bug review meetings for updates on defects from the development team and retesting of bug fixes. Worked with developers to fix faults found in the structure and functionality of the application.
- Experienced in co-ordinating resources within Onsite-Offshore model
- Conducted Training Sessions and Knowledge Transfer Sessions on new applications
- Managing testing documents in Test Director.
- Software Tools/Skills: Windows, SQL, PL/SQL, Mainframes, SAP BP, SAP XI, JCL, .Net, DB2, ASP.Net, Test Director
