
Technical Test Lead Resume


  • Over 15 years of diverse professional experience across all phases of the Information Systems Development Life Cycle in Banking, Telecom, Insurance, and other domains. Work experience includes 8+ years in senior/lead roles in Software Quality Assurance, Testing, Process Improvement, and Software Development. Recent projects included project assessment, planning and sizing, creating functional and system requirements, data collection and analysis, timeline and issue management, team building and team management, and software test planning and strategy.
  • Applied several methodologies for assessing and improving implementation processes, including Agile, the Rational Unified Process (RUP), and the Software Development Life Cycle (SDLC).
  • Work collaboratively with team members to achieve common goals while maintaining responsibilities and commitments. Foster professional growth through mentoring relationships. Skilled in presentations, end-user training, and negotiation; communicate easily with users, developers, and executives for software requirements gathering and reporting.
  • Extensive experience with various BI tools, including Business Objects, Crystal Reports, SAP BA, and Hyperion. Oracle certified and currently training in OBIEE. Strong ETL experience over several years.
  • Used in-depth technical strengths and business understanding to solve project (pre & post implementation) problems.
  • Projects involved include: system and functional requirements (Test Plan/Test Strategy/Test Cases/Test Scripts) and system testing.
  • Extensive experience with and understanding of: (1) Use Case / Business Case / Systems Case / Test Case (technical and functional requirements); (2) Software Development Life Cycle (SDLC); (3) Project Life Cycle; (4) Visio Diagrams / UML / RUP; (5) Quality Assurance (QA).
  • Conducted all levels of testing within the Quality Assurance testing cycle: Functionality Testing, Integration testing, Ad Hoc Testing, Regression Testing, End-to-End System Testing, Co-existence testing, black box/white box testing on Compliance/Trading/P&L applications, etc.
  • Performed calculations and business rule validations on UOMs and on Commodity and Currency valuations (dollar-amount differences between source and target), which were then used to calculate portfolio-based results (both Confidential and Confidential projects).
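A source-to-target valuation check of this kind can be sketched in Python. This is an illustrative sketch only: the record layout, the `trade_id` key, and the penny-level tolerance are assumptions, not details from the projects above.

```python
# Sketch: compare dollar amounts between source and target rows keyed by a
# hypothetical trade_id. A difference beyond the tolerance is a defect candidate.
TOLERANCE = 0.01  # assumed penny-level tolerance

def reconcile(source_rows, target_rows, tolerance=TOLERANCE):
    """Return trade ids whose source/target amounts differ beyond tolerance."""
    target_by_id = {r["trade_id"]: r["amount"] for r in target_rows}
    mismatches = []
    for row in source_rows:
        tgt = target_by_id.get(row["trade_id"])
        if tgt is None or abs(row["amount"] - tgt) > tolerance:
            mismatches.append(row["trade_id"])
    return mismatches

source = [{"trade_id": 1, "amount": 100.00}, {"trade_id": 2, "amount": 250.50}]
target = [{"trade_id": 1, "amount": 100.00}, {"trade_id": 2, "amount": 250.75}]
print(reconcile(source, target))  # [2]: trade 2 differs beyond tolerance
```

The same shape extends to UOM and commodity valuations: convert both sides to a common unit first, then apply the tolerance comparison.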


Operating Systems: HP-UX, Sun Solaris, and Windows 7/10

Hardware: IBM PC/AT, Sun E3000, SPARC 5, and HP 9000

Software: .NET, ASP, Java, Microsoft Visio, Microsoft Project

Languages: SQL, PL/SQL, Unix shell (HP-UX/Sun Solaris), XML

Databases: Oracle 9i, SQL Server Management Studio 2012, DB2, Lotus Notes, and MS-Access

Testing Tools: ALM 11, Beyond Compare

Database Tools: SQL*Plus, SQL Navigator, TOAD 9.0.1, Query Analyzer, and DB Artisan 8.5.4

Version Control/Defect Tracking Tools: StarTeam, Lotus Notes SIR Database, Visual SourceSafe, ALM 11

Reporting Tools: nVision, Hyperion, OBIEE, Crystal Reports, Business Objects

Third Party Tools: Livelink, OpenText tools - Elink and OODE (Online Office Document Editor)

ETL Tools: Informatica 8.1.1, Business Objects, SAP BI/BW concepts and navigation flow, PeopleSoft (Financials, Treasury, GL), SQR, and Oracle Applications 11i





  • Working on multiple .NET-based applications (eApproved projects to meet FDA requirements).
  • Verifying data integration between the old ERP system and the SAP system for Quotes, Sales Orders, Materials, etc.
  • Validating all customer, material, and material-instance data sourced from SAP against the old ERP system, including renaming of labels to reflect SAP naming conventions.
  • Verifying/comparing reports against existing and new changes for all applications (about 50 reports verified).
  • Integration testing with SAP - All changed fields and functionality associated with SAP integration requirements are to be tested.
  • Sourcing of all customers, material, and material instance data from SAP like Equipment Number, SAP Batch Number, SAP Serial Number, etc.
  • Experience with creating and maintaining dashboards, communicating status, preparing status reports and keeping upper management informed.
  • Provide support for CSV (GxP) testing in the PQ/OQ environment for all eApproved projects that must meet FDA requirements, and provide additional CSV testing support as needed.
  • Handle projects in an onshore/offshore model, managing offshore resources (assigning daily tasks to the offshore team and communicating with them as needed).
  • Perform testing on multiple browsers (IE, Chrome, and Firefox) and on Mac for marketing applications.
  • Generate Test Summary/Test Execution/ Traceability Matrix reports from ALM for upper management for all releases.
  • Query/send requests in SOAPUI to validate SAP transactional data (GetCustomerDetails, GetMaterials, GetSalesOrderDetails, GetVendors, etc.) for testing purposes.
  • MS Azure cloud migration testing: validating data on the SAP system posted from front-end .NET apps.
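The legacy-ERP-to-SAP comparison described in the bullets above can be sketched as a field-mapping check. The names in `FIELD_MAP` are hypothetical examples of a legacy-to-SAP label mapping, not actual project fields.

```python
# Sketch: validate that a record sourced from SAP matches the legacy ERP record
# after applying a legacy->SAP field-name mapping (mapping keys are hypothetical).
FIELD_MAP = {"equip_no": "EquipmentNumber", "batch": "SAPBatchNumber"}

def compare_record(legacy, sap, field_map=FIELD_MAP):
    """Return the legacy fields whose values disagree with the SAP record."""
    return [lf for lf, sf in field_map.items() if legacy.get(lf) != sap.get(sf)]

legacy = {"equip_no": "EQ-100", "batch": "B42"}
sap = {"EquipmentNumber": "EQ-100", "SAPBatchNumber": "B43"}
print(compare_record(legacy, sap))  # ['batch']: the batch numbers disagree
```

In practice the SAP side of such a comparison would come from SOAP responses (e.g. the GetMaterials-style requests mentioned above) rather than in-memory dicts.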

Hardware/Software: Windows 7, ALM 11, Oracle, .NET, Microsoft Office Suite 2003, SQL Server Management Studio 2012, IE 8, IE 9, Firefox, Chrome, SOAPUI, SAP ECQ


Test Lead/Project Coordinator


  • Review the requirements documents (Business Requirements, Functional Requirements and Technical Requirements) and create test plan/strategy, test scenarios and test cases from the system requirement documents including appropriate scripts to execute testing cases.
  • Review the Test Scenario/Test Cases with the Business Users/Analysts after the peer review.
  • Prepare and review Test Scenarios/Test Cases for Client Acceptance Testing with the CAT Testers.
  • Generate daily status reports (defect status/test sets/SIT/CAT/requirements not in scope) from ALM and share them with the Business Users/Analysts.
  • Log, track, and verify resolution of software and specification defects in ALM (Quality Center).
  • Document software issues and enhancement requests, and assist development with concise, detailed steps to reproduce the problem.
  • Verify resolved Change Requests and maintain accurate status for Change Requests entered in each project.
  • Determine testing strategies and provide testing resource estimates and consulting to project teams.
  • Participate in test planning, coordination, and resource estimation for the project.
  • Prepare and manage the test schedule and resources in line with the project development schedule.
  • Solid understanding of relational database concepts, SQL and procedural SQL
  • Execute automated test scripts using QTP in ALM where automation is in place.
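A minimal sketch of the SQL back-end check implied by the bullets above: row counts and a column checksum compared between a source table and its target copy. sqlite3 stands in for the Oracle/DB2 databases named in this environment, and the table/column names are made up.

```python
import sqlite3

# In-memory stand-in for a source system and its loaded target.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src(id INTEGER, amt REAL);
    CREATE TABLE tgt(id INTEGER, amt REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")

def table_profile(con, table):
    # COUNT(*) and SUM(amt) together catch both dropped rows and changed values.
    return con.execute(f"SELECT COUNT(*), SUM(amt) FROM {table}").fetchone()

print(table_profile(con, "src") == table_profile(con, "tgt"))  # True when in sync
```

Count/sum profiling is a cheap first pass; row-by-row comparison (as in tools like Beyond Compare) follows only where the profiles disagree.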

Environment: Web-based/Database/Mainframe Modules, ALM (Quality Center), Ab Initio, Unix, DB2 Data Studio, Oracle, Toad, MS Access, QTP 11.0, PeopleSoft HR, HP Sprinter, Workday, UltraEdit, PuTTY, Windows Server


Project coordinator /Hands-on TEST LEAD


  • Manage overall project risks and issues.
  • Review and approve sign-off for the Test Strategy/Plan.
  • Identify appropriate test stages and techniques to be used during testing and guide the project team through implementation.
  • Validate data loaded from source tables/files to staging, and from staging tables to the Teradata data mart, using the Teradata FastLoad utility.
  • Validate data in reports and in the Teradata data mart using the Teradata BTEQ utility, including incremental load testing.
  • Validate the error-handling process and the error log table using valid/invalid scenarios.
  • Perform end-to-end testing of business functionality for accuracy; generated reports are tested for data consistency.

Environment: Teradata, MicroStrategy, FastLoad and BTEQ (Teradata utilities), Unix Solaris, Jira




  • Participate in Requirements review with business
  • Develop standard artifacts using Quality Center based on business and systems requirements; establish traceability between requirements and test cases; create and execute the defined test cases in a Unix/Netezza environment; analyze test results; report and track defects to closure; and conduct root-cause analysis.
  • Prepare test data requirements and cross reference data with test conditions.
  • Create functional, regression, and end-to-end test cases and review them with the business analyst.
  • Execute test cases using HP Quality Center and track progress.
  • Verify Tidal Jobs (Parent and its dependencies) are migrated successfully before commencing the testing.
  • Verify all Tidal Jobs run successfully (including dependencies) without any failure.
  • Verify that the files are processed successfully at every layer (staging/EDS/Business Objects report)
  • Perform functional/regression testing, including negative testing by mocking up files for error validation: empty files, missing done files, missing rows in the data files, flag updates in the control table during invalid-scenario testing, etc.
  • Verify archival of done and data files (encrypted) from the source directory after successful processing.
  • Verify decryption flag is updated appropriately in the control table when the data file on the Grid Server is decrypted.
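The negative tests above (empty data files, missing done files) can be mocked up along these lines. The `.dat`/`.done` naming and the expected-row rule are assumptions for illustration, not the actual feed specification.

```python
import os
import tempfile

def batch_is_loadable(data_path, done_path, expected_rows):
    """Decide whether a mocked feed should be accepted for loading."""
    if not os.path.exists(done_path):
        return False                      # missing done file -> hold the batch
    with open(data_path) as f:
        rows = [line for line in f if line.strip()]
    return len(rows) == expected_rows     # empty/short file -> reject

tmp = tempfile.mkdtemp()
data = os.path.join(tmp, "feed.dat")
open(data, "w").close()                   # mock an empty data file
done = os.path.join(tmp, "feed.done")     # done file deliberately not created
print(batch_is_loadable(data, done, expected_rows=10))  # False
```

Each rejected scenario would then be cross-checked against the control table to confirm the error flag was set, as described in the bullets above.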

Environment: Netezza database, WinCVS (GUI revision control for files), Unix, TIDAL Job Scheduler, Aginity Workbench for Netezza, WinSQL, WinSCP (FTP/SFTP client).


Senior QA Analyst


  • Verify Replication of data between homogeneous/ heterogeneous database management systems
  • Validation/Comparison of data between production & replicated data source (RDS)
  • Perform valid/negative testing between RDS/development and data warehouse
  • Verify performance is as expected and appropriate checks are performed (validating indexes, profiling tables, partitioning etc.)
  • Unit of Measure (UOM) testing between source and target (e.g., conversion to SI base units: meter, kilogram, second, ampere, kelvin, mole, candela), tested with a Java function and a custom function.
  • Verify appropriate transformations are applied for UOM and any other fields as defined in the specification document
  • Performed Column/Table Analysis, Primary Key Analysis, Cross-Table Analysis, Relationship Analysis between source and target
  • Data analysis includes database entities, data types (SQL Server vs. Oracle, particularly SQL_VARIANT data-type conversion from SQL Server to Oracle), ambiguous columns, data type comparison, column uniqueness, cross-table analysis, cross-table domain comparison, relationship analysis, data quality issues, and source-to-target mappings.
  • Verify that error-free records are separated into two sets: existing records to be updated and new records to be inserted. In every case (error or processed), the staging table is updated with a processed or error status, and records in error can be reprocessed.
  • Verify reports and data extracts on SQL Server/Oracle databases using SQL Server Reporting Services (SSRS) for data validation, calculation of UOM values, formatting, tables/rows/columns are imported correctly, data received without truncation, time taken/speed for transfer, etc.
  • Manage/update projects and documents and provide status updates via SharePoint at the daily Scrum meeting.
  • Provide estimates and track tasks entered into the TFS system daily for Agile planning/test execution.
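The UOM testing described earlier in this list (conversion to SI base units between source and target) can be sketched as below. The conversion factors are standard, but the tolerance and function shape are assumptions; the actual testing used a Java function and a custom function.

```python
# Map a source unit to its SI base unit and conversion factor.
TO_SI = {"lb": ("kg", 0.45359237), "ft": ("m", 0.3048), "kg": ("kg", 1.0)}

def check_uom(value, unit, target_value, target_unit, tol=1e-6):
    """True if the target value equals the source converted to SI, within tol."""
    si_unit, factor = TO_SI[unit]
    return si_unit == target_unit and abs(value * factor - target_value) <= tol

print(check_uom(10, "lb", 4.5359237, "kg"))  # True: exact conversion
print(check_uom(1, "ft", 0.30, "m"))         # False: target off by 0.0048 m
```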





  • Perform batch setup
  • Performed data quality checks in parallel with prod before releasing any job, for transaction data received from various sources such as ENERGY KAPITAL.
  • Investigated/analyzed failures during batch runs and reran batch jobs per the Run Book instructions. When a batch picked up the previous day's feeds because of delayed receipt from the LOB in prod, reran the job with the current day's feeds (cleaned up the data from the affected tables and ensured the job picked up the correct version/run id on rerun).
  • Communicate issues identified during testing with SA, DBA and AD for resolution
  • Wrote test plans; maintained records of test progress, documented test results, prepared incident reports, and presented results as appropriate.
  • Performed impact and risk analysis as part of a risk based testing methodology
  • Proficient in navigating a UNIX environment to analyze data flow; able to query, analyze, and update data in relational databases.
  • Performed back-end validation testing as needed during the batch run.



Test Analyst/ Lead


  • Developed test plans and test scripts in Quality Center (QC) after analyzing the complex requirements/use cases and the URT and LUM (Logical Universal Model) documents for the Webi reports on the Business Objects reporting tool.
  • Performed extensive data quality testing on the SQL Server source systems, the Oracle data warehouse, and the Business Objects Universe for different applications. Created SQL queries in SQL Server Management Studio, Toad for Oracle, and Teradata SQL Assistant. Wrote complex/lengthy SQL queries for the business derivatives provided by the business users.
  • Created test sets in QC and executed test cases. Reported defects in the Rational ClearQuest defect-tracking tool.
  • Executed QA and regression test cases as needed for the development projects and the Change Requests.
  • Interacted with the end users during UAT /Post UAT and helped in validating the requirements of each release.
  • Ensured that all requirements are met by creating traceability matrix and business requirements are addressed within all relevant project deliverables throughout the SDLC with the help of Quality Center.

Environment: DataStage/Business Objects Universe, Webi reports, SQL Server 2005, Oracle data warehouse, Teradata, EJB, Servlets, SQL/PLSQL, Quality Center, SnagIt, Rational ClearQuest defect-tracking tool

Test Analyst/Lead



  • One iteration every two weeks (one or more mappings released for testing in each iteration).
  • Responsible for testing the data transformation processes developed in Informatica.
  • Review design specifications with Functional Analysts (FAs)/Data Modelers/Project Managers in bi-weekly meetings to discuss ongoing requirement changes to the iteration plan against the FA/Source-to-Target Mapping (STM) documentation.
  • Create tests based on functional specifications and verify test scenarios with FAs.
  • Define test conditions based on design documentation: create test cases for positive, negative, NULL, calculation, aggregation, performance, legacy-comparison, and improbable or illogical conditions that exercise all processing paths in the Informatica ETL process.
  • Prepare expected results and cross-reference them to test conditions. Walk through test conditions with the client QA team and obtain sign-off on the test plan/test cases to ensure they understand our test logic and that it meets their requirements and expectations.
  • Log defects in Team Track and load defects in Certify and create a link in TDR Viewer for tracking/updating the test cases.
  • Prepare test data requirements and cross reference data with test conditions.
  • Take sessions as breakpoints for testing and focus on source-to-target comparisons to see whether transformations are correct.
  • Debug workflows to ensure mapping logic is consistent with the STM.
  • Perform incremental integration testing: continuous testing of new functionality.
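The test-condition categories above (positive, negative, NULL, aggregation) can be exercised against a toy transformation like the following. The rule that negative and NULL amounts are rejected is a hypothetical mapping rule, not one taken from the actual STM.

```python
def transform(rows):
    """Mimic an ETL mapping's filter + aggregation: keep valid rows, sum them."""
    valid = [r for r in rows if r["amt"] is not None and r["amt"] >= 0]
    return valid, sum(r["amt"] for r in valid)

# One row per test-condition category: positive, negative, NULL, and the
# aggregate total checked against an expected value.
rows = [{"amt": 5.0}, {"amt": -1.0}, {"amt": None}, {"amt": 7.5}]
valid, total = transform(rows)
print(len(valid), total)  # 2 12.5
```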

Environment: SQL Developer, Informatica, Team Track (process management tool), Certify (testing tool), Data Quality (data standardization tool from Informatica), Auto Unit Test (Java auto unit test tool), Autosys, Unix, SQL Server


Senior Data/QA Analyst


  • Developed and executed User Acceptance Test (UAT) plans/test cases for all business processes in QC from mapped business requirements documents and obtained sign-off from the business users.
  • Tested business rules and data validation/transformation rules across the data warehouse layers.
  • Validated historical/normal views (reports) by comparing results in DB Solo/Beyond Compare between the production and test environments.
  • Performed root cause analysis for all variations found by executing queries via TOAD/RapidSQL at the appropriate data warehouse layers as well as within the ENDUR database level when necessary.
  • Tested and validated Freeze & Non-Freeze data.
  • Analyzed data using business logic/rules including the default values for Errors/Warnings expected.
  • Used HP Quality Center to Execute/update test cases and provide update status by generating reports at the end of the day, and followed up with Operations/development teams for resolution.
  • Created test cases using Quality Center to validate all ETL and data-handling processes.
  • Created test data sets (for positive and negative testing between different layers) to assure data quality for all ENDUR, Nucleus data sources (data includes slowly changing dimension data, fact data).
  • Validated data for COMPLIANCE requirements between ENDUR and test environments
  • Performed P&L reports testing (Impact Analyst Reports & Impact Results).
  • Verified that the system allows the user to export reports in the following formats: Excel, PDF, CSV, CSV (with options).

Environment: ETL Tool (Informatica), Rapid SQL, Oracle, Microsoft Office, Toad, DB Solo, Beyond Compare, ENDUR/Nucleus/Swaptions, SharePoint, Quality Center, BO Reports


QA Lead/Data Analyst


  • Took lead responsibility in planning and coordination of technology deliverables that are tangibly linked to business deliverables for Derivatives/TCP projects. Gathered business requirements from the Line of Business (LOB) to meet the defined requirements for Investment Banking/Risk Technology/Credit Risk/Market Risk.
  • Ensured thorough testing of new Fixed Income/Derivatives/Collateral/Energy Trading supplier feeds so that on-boarding and end-to-end processes work correctly, and reviewed the System Integration Testing (SIT) test plan to ensure coverage of test scenarios in SIT.
  • Ensured that day-to-day exposures received through the various lines of business were reported to BASEL II regulatory reporting per the requirements, after passing through adjustment/calculation processes on the business data.
  • Tested web-based applications developed in J2EE (Java) and/or the .NET framework to perform critical validation (header/trailer) on the LOB files before loading through the Informatica ETL process.
  • Ran batch commands, scheduled jobs, and reviewed output, logs, etc. for the supplier files received from the LOB.
  • Performed connectivity testing with Secure FTP/NDM, coordinating with the Line of Business/Operations teams to receive files through a secured process on the Unix server.
  • Performed critical validation and verification rules on the caret-delimited flat/XML files (header and trailer records) sent by the suppliers from Asia & Europe via the SFTP/NDM process.
  • Developed standard artifacts using Quality Center based on business and systems requirements; established traceability between requirements and test cases; physically created and executed the defined test cases in a UNIX, Oracle, and DataStage environment; analyzed test results; reported and tracked defects to closure; and conducted root-cause analysis via application logs, error queues, and database dumps.
  • Provided leadership and supervision for the onshore and offshore Project QA/ODS Testing Team (led a team of 3 testers, offshore and onshore combined).
  • Worked with deployment team participating in after-hours deployments/migration of data and quality assurance testing efforts to test production environment updates before they are migrated from UAT environment.
  • Provided Run Metrics at the end of each cycle using Test Director/Excel to measure test progress and quality outcome.
  • Used SQL queries extensively to validate data against verification, validation, and transformation rules, including data manipulation in the data staging area; DB Artisan/TOAD were used for this purpose.
  • Prepared test scripts by referring to the ETL and reporting functional specifications. Executed ETL test cases (which involved executing Informatica mappings and checking the output in the database tables).
  • Performed positive & negative testing at both the DataStage and ODS levels when new Change Requests were raised and implemented, and for defect fixes.
  • Created test data (both positive and negative scenarios) to support the application during unit/system testing when required by mocking up XML/TXT files. Mocked up the source files for NULL values (flat/XML documents) by inputting valid/invalid values (identifying the correct location with the physical message specs in the flat file and inserting/adding elements).
  • Supported emergency on-call 24/7 support during production migration.
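The header/trailer validation on caret-delimited supplier files described above can be sketched as follows. The HDR/TRL record layout and the trailer's record-count position are assumed for illustration, not an actual LOB file specification.

```python
def validate_feed(lines):
    """Check header/trailer presence and that the trailer count matches detail rows."""
    if not lines or not lines[0].startswith("HDR^"):
        return False, "missing header"
    if not lines[-1].startswith("TRL^"):
        return False, "missing trailer"
    detail = lines[1:-1]
    declared = int(lines[-1].split("^")[1])  # trailer carries the record count
    if declared != len(detail):
        return False, f"trailer count {declared} != {len(detail)} detail rows"
    return True, "ok"

feed = ["HDR^20240101^SUPPLIER1", "1^ACME^100.00", "2^BETA^250.50", "TRL^2"]
print(validate_feed(feed))  # (True, 'ok')
```

A file failing this check would be rejected before the Informatica load, matching the pre-load critical validation described in the bullets above.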

Environment: ETL Tool (Informatica), XML, DB Artisan, Oracle/SQL Server, SQL Navigator, Business Objects Web version - Credit Risk Reporting Tool, Microsoft Office, FTP/NDM, XML/XMS, HTTP, Unix Emulator, TLM, MQ Series, Quality Center, MS Project, Intraspect Tool, Batch Testing, UI Web-based, UNIX, Toad, Hummingbird, PuTTY, AutoSys, ClearQuest, Java/J2EE, Data Mart
