
Sr. QA Hadoop Automation Tester Resume


PROFESSIONAL SUMMARY:

  • Domain experience in Banking, Insurance, Industrial, Manufacturing and Health Care (US).
  • 12 years of experience in Quality Assurance & Software Testing.
  • 8 years of experience as a QA Lead covering Test Strategy, Test Planning, Test Completion Reports, Traceability Matrices, Test Estimates and Test Design.
  • 7 years of experience in Test Execution Analysis, Defect Tracking, Capturing Test Metrics, Test Reporting and QA Sign-Off.
  • 4 years of Big Data Testing experience with different Hadoop distributions such as Cloudera and Hortonworks.
  • 5 years of Data Warehouse (ETL) Testing experience with ETL tools such as Teradata, Informatica, Ab Initio and DataStage, and BI tools such as Cognos and QlikView.
  • 1 year of experience in Big Data Automation Testing using Spark/Scala, reading data from CSV, JSON, Parquet and ORC files and applying transformations to validate business rules and perform E2E Integration Testing.
  • 3 years of Mainframe Testing experience covering Batch Testing and Online Testing (CICS) with Mainframe tools such as EZCopy, TSO/ISPF, QMF, SPUFI, JCL, IMS/DB, File-AID, VSAM, Control-M and DB2.
  • 2 years of AS400 Testing experience covering Batch and Online Testing.
  • 1 year of performance testing experience with JMeter.
  • 1 year of REST services test automation experience using Cucumber with Ruby.
  • Experience with Hadoop components such as Apache Phoenix, HBase, Spark, Hive, Sqoop, HDFS, Pig, Oozie, ZooKeeper and Flume, and sound knowledge of MapReduce, MongoDB, YARN and Scala.
  • Experienced in DevOps/Agile operations processes and tools (code review, build & release, environment, service, incident and change management).
  • In-depth experience with cross-platform environments (Windows, Linux, Ubuntu, CentOS) and a good understanding of clusters and replication.
  • Good experience in testing Distributed, Client/Server, Web and Mainframe based applications.
  • Experience with XML, JSON, TSV, CSV, Sequence, ORC, Avro and Parquet file formats.
  • Experience with compression codecs such as Snappy and Gzip.
  • Well versed in networking and file-transfer tools such as PuTTY, FileZilla & WinSCP.
  • Good knowledge of PL/SQL and experience troubleshooting procedures and packages.
  • Extensive experience in Data Warehousing and Data Migration testing; expertise in writing complex SQL queries and in data validation of Data Warehouse & BI applications.
  • Experience across Data Acquisition, Data Integration, Data Storage and Data Presentation, including end-to-end data validation for ETL & BI systems.
  • Highly experienced in data quality, data correctness, data consistency and process-interdependency tests to determine whether processed data is correctly extracted, transformed and loaded.
  • Verified and validated missing records, duplicate records, nulls and default records against the design specifications.
  • Verified data at the back-end level, covering data integrity checks such as default values, null checks, data cleaning, sampling, scrubbing, aggregation and data merging operations.
  • Good experience in testing initial (reconcile) and daily (delta) ETL loads.
  • Strong skills in analyzing business rules, data requirements, data mapping (S2T) and transformation/business logic for ETL processes.
  • Good experience in the testing domain with exposure to DB, ETL and Web concepts.
  • Experience in Unit, Smoke, Sanity, Functional, System Integration, GUI, System, Regression, Performance, Database, Data Redundancy, End-to-End, ETL Workflow, User Acceptance (UAT), Compatibility and Product Assurance Testing for Web based and Client/Server applications.
  • Worked in Agile/Scrum methodologies and participated in daily stand-up meetings. Performed root cause analysis of defects and worked with developers, BSAs and the business to resolve issues arising from the testing process.
  • Contributed to Defect Management throughout the phases of testing and re-testing; facilitated defect review meetings covering defect/bug prioritization, planning, effort estimates and status updates, and worked closely with the project manager to keep project releases on track.
  • Good experience in testing BPM, Client/Server, Web and Mainframe based applications.
  • Experienced in test management tools such as HP QC/ALM and version control tools such as VSS & RPM.
  • Experience preparing test execution status, daily/weekly status reports and test metric reports, and handling client calls.
  • Performed scheduling and estimation of project work products, coordinating and communicating with various teams to ensure quality deliveries.
  • Good experience with Web Services and SOAP UI XML test processes and concepts; familiar with Selenium WebDriver.
  • Knowledge of functional, performance, load, stress, volume and endurance testing using HP LoadRunner.
  • Knowledge of Java, Python and Ruby.
  • Very good at root cause analysis and fast defect resolution, thereby contributing to productivity improvement on the project.
  • Good experience maintaining and troubleshooting servers in test environments.
  • Demonstrated ability to handle multiple tasks both independently and in a team.
  • Ability to mentor and provide knowledge transfer to team members and support teams.

TECHNICAL SKILLS

Operating Systems: Windows 10/8/7, Windows XP/2000, Unix, Linux Red Hat, Mainframe & AS400, Ubuntu

Database: Oracle 11g/10g/9i, DB2, SQL Server, MySQL, NoSQL (MongoDB), IMS/DB, HBase

Big Data Ecosystems: HDFS, MapReduce, HBase, Hive, Pig, Sqoop, Oozie, Phoenix, Flume, Cassandra, Apache NiFi, StreamSets

DWH Tools: Ab Initio, Informatica, Teradata, DataStage, Cognos, QlikView

Languages: Java, Ruby, Python, SQL, JCL

Web Servers and Application Servers: Tomcat, Apache, WebLogic 10.x/8.x, WebSphere 8.x/7.x

CM Tools: Microsoft Visual SourceSafe, Rational ClearCase, Rational Portfolio Manager (RPM)

Testing Tools: HP LoadRunner 12.x

Test Management Tools: HP Quality Center/ALM, Rational CQTM, JIRA, JAMA

Mainframe Tools: TSO/ISPF, File-AID, SPUFI, EZCOPY, QMF, IMS, VSAM

Other Tools: AutoSys, Teradata SQL Assistant, SQL Developer, IBM BPM, Aqua Data Studio, TOAD, PuTTY, WinSCP, SOAP UI, UML, MS Project, MS Visio

PROFESSIONAL EXPERIENCE

Confidential

Sr. QA Hadoop Automation Tester

Environment: Python, HDFS, Hive, HBase, Spark, KAFKA, Shell Script, Scala, SQL, VersionOne

Responsibilities:

  • Worked on a script to read JSON, CSV, ORC and Parquet files using Spark/Scala.
  • Worked on scripts to validate business rules in Scala using Spark SQL and string functions.
  • Worked on a script to compare schema & data between source tables and Hive tables.
  • Worked on a script to compare schema & data between Parquet files and Hive tables.
  • Worked on a script to perform Integration Testing on JSON and Parquet files.
  • Involved in Test Plan preparation; prepared a traceability matrix for test coverage and high-level business scenarios.
  • Participated in Agile Scrum meetings (Grooming, Planning, Daily Stand-up Meetings).
  • Worked in the CI/CD process and was involved in build deployments & validation for each build release using Bamboo/Bitbucket.
  • Worked on post-production validation, identifying root causes and troubleshooting issues in PROD.
  • Raised defects/issues in JIRA.
  • Created shell scripts to re-process production data.
  • Involved in release management and validated build components.
  • Validated YARN application jobs against failures and identified root causes.
  • Validated audit logs, record counts and hash totals between source and Hadoop.
  • Simulated scenarios to check the restartability of batch jobs, logs, error log tables and reject logs.
  • Validated data completeness and accuracy.
  • Validated transformation logic as per the mapping.
  • Validated batch functionality for job failures.
  • Involved in System Testing, System Integration Testing, Regression Testing & Performance Testing.
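The schema and data comparison scripts above follow a common two-step pattern: compare column definitions first, then compare row contents keyed on a unique column. A minimal pure-Python sketch of that pattern (the actual work used Spark/Scala against Hive; the schemas and rows here are hypothetical stand-ins for DataFrame metadata and collected rows):

```python
def compare_schemas(source_schema, target_schema):
    """Report columns missing on either side and columns whose types differ.

    Schemas are dicts of column name -> type string (hypothetical layout).
    """
    missing_in_target = sorted(set(source_schema) - set(target_schema))
    missing_in_source = sorted(set(target_schema) - set(source_schema))
    type_mismatches = {
        col: (source_schema[col], target_schema[col])
        for col in set(source_schema) & set(target_schema)
        if source_schema[col] != target_schema[col]
    }
    return missing_in_target, missing_in_source, type_mismatches


def compare_rows(source_rows, target_rows, key):
    """Compare two row sets keyed on a unique column.

    Returns keys missing from the target, extra keys in the target,
    and keys whose row contents differ.
    """
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(set(src) - set(tgt))
    extra = sorted(set(tgt) - set(src))
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return missing, extra, mismatched
```

In practice the same checks run over `df.dtypes` and joined DataFrames rather than in-memory dicts, but the pass/fail logic is identical.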

Confidential

Sr. QA Hadoop Tester/Prod Support

Environment: Python, HDFS, Hive, HBase, Spark, KAFKA, Shell Script, Streamsets, SQL, Oozie, Bit Bucket, Bamboo, JIRA

Responsibilities:

  • Worked on NASCO, CS90 and FEP Claims/Membership data migrated from legacy systems to Big Data.
  • Involved in Test Plan preparation; prepared a traceability matrix for test coverage and high-level business scenarios.
  • Participated in Agile Scrum meetings (Grooming, Planning, Daily Stand-up Meetings).
  • Worked in the CI/CD process and was involved in build deployments & validation for each build release using Bamboo/Bitbucket.
  • Prepared SIT/PPV test cases based on business requirements.
  • Worked on post-production validation, identifying root causes and troubleshooting issues in PROD.
  • Raised defects/issues in JIRA.
  • Developed StreamSets pipelines (workflows) to ingest EBCDIC/fixed-width and delimiter-separated files into Hive.
  • Created shell scripts to re-process production data.
  • Involved in release management and validated build components.
  • Validated StreamSets pipelines and the insertion of data into Hive tables for ASCII/fixed-width and delimiter-separated files.
  • Validated metrics/report calculations using Ground System log files against business rules.
  • Created complex Hive queries for PPV against business rules/transformations.
  • Validated input files in the landing area.
  • Validated YARN application jobs against failures and identified root causes.
  • Validated the folder structure of files in Hadoop clusters (HDFS).
  • Validated ETL jobs and troubleshot SQL procedures based on errors displayed in the error log.
  • Validated audit logs, record counts and hash totals between source and Hadoop.
  • Simulated scenarios to check the restartability of batch jobs, logs, error log tables and reject logs.
  • Validated data completeness and accuracy.
  • Validated transformation logic as per the mapping.
  • Validated batch functionality for job failures.
  • Involved in System Testing, System Integration Testing, Regression Testing & Performance Testing.
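The audit-log reconciliation above (record counts and hash totals between source and Hadoop) can be sketched in plain Python. This is an illustrative, order-independent hash-total scheme, not the exact production check; the row layout is hypothetical:

```python
import hashlib


def hash_total(rows):
    """Order-independent hash total: hash each canonicalized row, sum the digests.

    Shuffling the rows does not change the total, so source and target
    extracts can be compared without sorting either side.
    """
    total = 0
    for row in rows:
        # Canonicalize the row so field ordering differences don't matter.
        canonical = "|".join(str(row[k]) for k in sorted(row))
        digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
        total = (total + int(digest, 16)) % (1 << 256)
    return total


def reconcile(source_rows, target_rows):
    """Audit check: record counts must match and hash totals must agree."""
    return {
        "count_match": len(source_rows) == len(target_rows),
        "hash_match": hash_total(source_rows) == hash_total(target_rows),
    }
```

The same idea scales to Hive by computing per-row hashes in SQL and summing them on each side, then comparing only two aggregate values across systems.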

Confidential, Melbourne, FL

Sr. QA Hadoop Tester

Environment: MapReduce, HDFS, Hive, HBase, Spark, Shell Script, NIFI, Apache Phoenix, Cassandra, Flume, SQL, Oozie, Bamboo, JIRA, JAMA, SQL Server

Responsibilities:

  • Worked on various Ground System applications, FPM (TV/AVOD Services), Health Reporting (TV/AVOD Services & Connectivity) and Analytics/User Reports, migrated from legacy systems to Big Data.
  • Involved in Test Plan preparation; prepared a traceability matrix for test coverage and high-level business scenarios.
  • Participated in Agile Scrum meetings (Grooming, Planning, Daily Stand-up Meetings).
  • Worked in the CI/CD process and was involved in build deployments & validation for each build release using Bamboo.
  • Prepared test cases in JAMA based on business requirements.
  • Raised defects/issues in JIRA.
  • Involved in various IVV processes as per Chorus 2.0.
  • Deployed builds on the HDP/HDF cluster using NodeJS and Ansible, and prepared integration test cases.
  • Validated NiFi data flows and the insertion of data into Hive tables for nested XML data based on business rules/ICDs.
  • Validated metrics/report calculations using Ground System log files against business rules.
  • Created complex Hive queries using LATERAL VIEW on nested XML data for validation against business rules.
  • Created SQL queries in Apache Phoenix for validation of business rules against metrics/reports.
  • Validated input files in the landing area.
  • Validated data flow from the landing area to Hive/Phoenix tables using a Flume interceptor.
  • Validated incremental extracts and bulk-load data using XCOPY/File Router.
  • Validated YARN application jobs against failures and identified root causes.
  • Ran various Oozie jobs for various applications on a daily/hourly basis.
  • Validated FlightAware/ASFD data pulled by the Flume interceptor into Phoenix tables via a Kafka queue.
  • Validated batch functionality on job failure or partial success by simulating changes in input files.
  • Performed API testing using Postman (OAuth2) for JSON data.
  • Validated email notifications (EMM) to the source system on batch/job failures.
  • Validated the folder structure of files in Hadoop clusters (HDFS).
  • Validated ETL jobs and troubleshot SQL procedures based on errors displayed in the error log.
  • Validated audit logs, record counts and hash totals between source and Hadoop.
  • Simulated scenarios to check the restartability of batch jobs, logs, error log tables and reject logs.
  • Validated data completeness and accuracy.
  • Validated transformation logic as per the mapping.
  • Validated batch functionality for job failures.
  • Involved in System Testing, System Integration Testing, Regression Testing & Performance Testing.
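Hive's LATERAL VIEW with explode(), used above to validate nested XML-derived data, flattens an array column into one output row per element while repeating the other columns. A small Python analogue of that explode step (the record layout and field names are hypothetical, for illustration only):

```python
def explode(records, array_field):
    """Emit one output row per element of `array_field`, copying the other columns.

    Mirrors: SELECT ..., elem FROM t LATERAL VIEW explode(array_field) v AS elem
    """
    out = []
    for rec in records:
        base = {k: v for k, v in rec.items() if k != array_field}
        for element in rec.get(array_field, []):
            row = dict(base)
            row[array_field] = element  # one flattened row per array element
            out.append(row)
    return out


# Hypothetical nested records, e.g. parsed from XML.
flights = [
    {"flight_id": "F100", "events": ["boarding", "takeoff", "landing"]},
    {"flight_id": "F200", "events": ["boarding"]},
]
rows = explode(flights, "events")
```

A tester can run the flattened rows through the same business-rule checks as the Hive output and diff the two result sets.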

Confidential, Reston, VA

QA Lead - Hadoop/Automation

Environment: MapReduce, HDFS, Hive, HBase, Spark, Shell Script, SQL, Oozie, AutoSys, Jenkins

Responsibilities:

  • Involved in Test Plan preparation; prepared a traceability matrix for test coverage and high-level business scenarios.
  • Involved in Automation Testing (Cucumber) of the REST API for the existing regression test suite.
  • Involved in Performance Testing (JMeter) of the REST API.
  • Worked in an Agile Scrum (Kanban) environment and participated in Scrum meetings (Grooming, Planning, Daily Stand-up Meetings).
  • Worked in the CI/CD process and was involved in build deployments & validation for each build release using Jenkins.
  • Validated input files in the landing area.
  • Validated data flow from the landing area to Hive tables.
  • Validated incremental extracts.
  • Validated batch functionality on job failure or partial success by simulating changes in input files.
  • Validated AutoSys jobs.
  • Validated email notifications (EMM) to the source system on batch/job failures.
  • Validated the folder structure of files in Hadoop clusters (HDFS).
  • Validated audit logs, record counts and hash totals between source and Hadoop.
  • Simulated scenarios to check the restartability of batch jobs, logs, error log tables and reject logs.
  • Validated data completeness and accuracy.
  • Validated transformation logic as per the mapping.
  • Validated batch functionality for job failures.
  • Involved in System Testing and System Integration Testing.

Confidential, Atlanta, GA

QA Lead - Hadoop/ETL

Environment: MapReduce, HDFS, Hive, Java, Shell Script, SQL, Sqoop

Responsibilities:

  • Involved in Test Plan preparation; prepared a traceability matrix for test coverage and high-level business scenarios.
  • Understood the Big Data Hadoop architecture, the EDW data model and the scope of the system.
  • Validated input files in the landing area.
  • Validated data flow from the landing area to HDFS.
  • Validated incremental extracts.
  • Validated batch functionality on job failure or partial success by simulating changes in input files.
  • Validated email notifications to the source system on batch/job failures.
  • Validated the folder structure of files in Hadoop clusters (HDFS).
  • Validated audit logs, record counts and hash totals between source and Hadoop.
  • Simulated scenarios to check the restartability of batch jobs, logs, error log tables and reject logs.
  • Validated data completeness and accuracy.
  • Validated transformation logic as per the mapping.
  • Validated batch functionality for job failures.
  • Checked for data loss and duplicate data.
  • Involved in System Testing and System Integration Testing.
  • Sent daily/weekly status reports to the client and arranged defect tracking calls.
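The data-loss and duplicate checks above reduce to comparing key multisets between source and target. A compact sketch of that check (key values are hypothetical; in practice the keys come from GROUP BY queries on each side):

```python
from collections import Counter


def data_loss_and_duplicates(source_keys, target_keys):
    """Report keys lost in the load and keys that landed more than once.

    Counter tracks multiplicity, so a key loaded twice is flagged as a
    duplicate even if it exists exactly once in the source.
    """
    src, tgt = Counter(source_keys), Counter(target_keys)
    lost = sorted(k for k in src if k not in tgt)
    duplicates = sorted(k for k, n in tgt.items() if n > 1)
    return lost, duplicates
```

The equivalent SQL is a LEFT JOIN with a NULL check for loss and a `HAVING COUNT(*) > 1` for duplicates; the Python version is handy for file-level spot checks.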

Confidential, Eagan, MN

Test Lead - ETL/BI

Responsibilities:

  • Understood the ETL architecture, the EDW data model and the scope of the system.
  • Understood the existing/legacy claims data model.
  • Understood the scope and requirements and wrote appropriate test scenarios and test scripts.
  • Prepared the test plan for data migration.
  • Validated data within target tables to ensure data is present in the required format and that there is no data loss or bad data from source to target tables.
  • Performed end-to-end data validation for ETL & BI systems; experienced in SQL preparation in Oracle/SQL Server/DB2.
  • Prepared, reviewed and executed test cases.
  • Validated ETL jobs and troubleshot SQL procedures based on errors displayed in the error log.
  • Checked source data (tables, columns, data types and constraints) against target data (tables, columns, data types and constraints).
  • Wrote queries for data scrubbing, data aggregation, data merging and data cleansing transformation logic.
  • Assigned tasks to the team and tracked task status accordingly; worked with onsite and offshore teams.
  • Interacted with the onsite team and dev teams for query resolution.
  • Sent daily and weekly status reports to the client.
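Source-to-target validation of the kind above is typically written as a MINUS/EXCEPT query: rows present in the source but absent from the target are migration defects. A self-contained sketch using SQLite in place of Oracle/SQL Server/DB2 (the table and column names are made up; SQLite spells Oracle's MINUS as EXCEPT):

```python
import sqlite3

# In-memory database standing in for the real source/target schemas.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_claims (claim_id TEXT, amount REAL)")
cur.execute("CREATE TABLE tgt_claims (claim_id TEXT, amount REAL)")
cur.executemany("INSERT INTO src_claims VALUES (?, ?)",
                [("C1", 100.0), ("C2", 250.5), ("C3", 75.0)])
cur.executemany("INSERT INTO tgt_claims VALUES (?, ?)",
                [("C1", 100.0), ("C2", 250.5)])  # C3 dropped: simulated data loss

# Rows in source but not in target; a clean load returns zero rows.
cur.execute("""
    SELECT claim_id, amount FROM src_claims
    EXCEPT
    SELECT claim_id, amount FROM tgt_claims
""")
missing = cur.fetchall()
print(missing)  # the un-migrated rows
```

Running the query in the other direction (target EXCEPT source) catches rows that appeared in the target without a source counterpart.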

Confidential

Sr. Test Analyst - ETL/BI

Environment: HP Quality Center, Ab Initio, QlikView

Responsibilities:

  • Ran the ETL plans and tests provided by the dev team in the UNIX environment to load data into the database.
  • Validated the data loaded into the target against the business requirements using SQL queries.
  • Prepared test data per the business requirements and business rules given by the client.
  • Managed and reported on progress for own and team deliverables.
  • Reported supplier progress, risks and issues in the weekly test team meeting.
  • Responsible for developing test scenarios, test data and test cases from functional documents.
  • Executed test cases in the UNIX environment and used Cognos for executing reporting test cases.
  • Attended & arranged requirements meetings, defect calls and status calls.
  • Acted as coordinator to close functional gaps in deliverables.
  • Coordinated and communicated with the client on a regular basis to provide estimates, planning updates and reports.
  • Provided estimates for functional requirements and individually handled all STLC phases.
  • Prepared the traceability matrix and detailed test plan.
  • Analyzed the user/business requirements and functional specification documents and created test scripts in Quality Center.
  • Validated ETL jobs and troubleshot SQL procedures based on errors displayed in the error log.
  • Prepared the DOU (Document of Understanding) after completing test execution.
  • Mentored and provided knowledge transfer to new entrants.
  • Performed performance testing using JMeter.

Confidential, Plano, TX

Sr. Test Analyst

Environment: Oracle RMS, Toad, Teradata SQL Assistant, Informatica, SOAP UI

Responsibilities:

  • Carried out System, Interface and End-to-End testing.
  • Involved in comparison testing as part of the data migration from the legacy applications into the Oracle RMS environment.
  • Involved in web service testing for the web services created as part of the integration.
  • Involved in test planning, test execution, test result reporting, and status tracking and reporting to management.
  • Involved in test scenario and test case reviews, test result reviews and the sign-off process.
  • Extensively involved in test environment setup and test data identification.
  • Performed System Testing, including validation of inbound data feeds from EBS and other systems.
  • Tested the 40-plus interfaces developed as part of these implementations.
  • Identified End-to-End test scenarios by identifying the impacted legacy applications and carried out the testing by coordinating with different IT/Business teams at JCPenney.
  • Performed End-to-End test planning, data identification and test execution.
  • Involved in usability testing of the new screens developed for JCPenney.
  • Organized daily defect triage with the development teams and Business Analysts.
  • Held weekly status meetings to facilitate communication and maximize productivity.
  • Provided a demo to the business users after every sprint.
  • Coordinated UAT testing.

Confidential

Mainframe/BPM/ETL Tester

Environment: BPM 8.0, DB2, Java, HP Quality Center, Informatica

Responsibilities:

  • Successfully delivered the R1 release in the Eriskay BPMS workflow.
  • Attended & arranged requirements meetings, defect calls and status calls.
  • Acted as coordinator to close functional gaps in deliverables.
  • Successfully delivered various work packages in the Pensions Reform programme across various releases.
  • Successfully delivered WP22.2/23.2 MI Payment (ETL Testing).
  • Led a team of 3-5 members for work packages in the Pensions Reform programme across various releases.
  • Coordinated and communicated with the client on a regular basis to provide estimates, planning updates and reports.
  • Handled the role of scrum lead, discussing progress, risks and issues to monitor the health of WPs on a regular basis.
  • Provided estimates for functional requirements and individually handled all STLC phases.
  • Prepared the traceability matrix and detailed test plan.
  • Analyzed the user/business requirements and functional specification documents and created test scripts in Quality Center.
  • Prepared presentations for client visits to demonstrate the Pension Reforms Eligibility Calculator and the Pension Reforms case study for the R1, R2 & R3 releases.
  • Created a test condition template useful in drafting test cases.
  • Created and executed test scripts and scenarios to determine optimal system performance according to specifications, and prepared the required test data.
  • Prepared regression test cases for Pension Reforms across various releases.
  • Involved in System/System Integration Testing for the Eligibility Calculator & Waiting Period.
  • Created a conditional parameterized template in QC, saving time in the test preparation phase.
  • Validated the data per the business rules from source to target systems.
  • Validated that data was transformed correctly from OLTP to OLAP, ensuring expected data was loaded and all data was transformed according to design specifications.
  • Executed plans/batches to load the data into various target sources.
  • Created test data setups for different member statuses (Active, In Force, Opt-Out & Terminated) using Mainframe online transaction processing (CICS), DB2 and other GUI applications.
  • Ran JCL for batch testing & the ACR process to update the deferral period and for the Eligibility Calculator.
  • Verified system logs using JESMSGLOG, JESSYSLOG and SYSOUT under the status of jobs, and the message queue using MQB.
  • Kept track of new product requirements and discussed them with the team on time.
  • Communicated test progress, test results and other relevant information to UK counterparts.
  • Reported and tracked defects using HP Quality Center.
  • Proactively worked with developers to ensure timely bug resolution.
  • Worked closely with the team to perform extensive Smoke, Functional & Regression testing.
  • Involved in peer reviews, weekly status meetings and weekly client status meetings.
  • Prepared the DOU (Document of Understanding) after completing test execution.
  • Mentored and provided knowledge transfer to new entrants.

Confidential, Minneapolis, MN

Environment: Mainframe, HP Quality Center, File Aid, QMF, SPUFI & JCL

Sr. Test Engineer - Lead

Responsibilities:

  • Provided estimates for change requests and individually handled all STLC phases.
  • Involved in preparing the traceability matrix and test plan.
  • Analyzed the user/business requirements and functional specification documents and created test cases.
  • Created and executed test cases and scenarios to determine optimal system performance according to specifications, and prepared the required test data.
  • Created different types of pharmacy claims (Manual, Electronic, Paper) and different plan setups with different members depending on the requirement, using Mainframe online transaction processing (CICS).
  • Ran JCL for batch testing & the ACR process to verify claim adjustments.
  • Verified system logs using JESMSGLOG, JESSYSLOG and SYSOUT under the status of jobs, and the message queue using MQB.
  • Kept track of new product requirements and discussed them with the team on time.
  • Communicated test progress, test results and other relevant information to US counterparts.
  • Reported and tracked defects using HP Quality Center and Rational ClearQuest.
  • Involved in peer reviews, weekly status meetings and weekly client status meetings.

Confidential, Richardson TX

Mainframe Testing

Environment: Mainframe, DB2 & HP Quality Center

Responsibilities:

  • Provided estimates for change requests and individually handled all STLC phases.
  • Tracked work allocation using the Rational Team Concert tool.
  • Involved in preparing the traceability matrix and test plan.
  • Analyzed the user/business requirements and functional specification documents and created test cases.
  • Created and executed test cases and scenarios to determine optimal system performance according to specifications, and prepared the required test data.
  • Created different types of pharmacy claims (Manual, Electronic, Paper) and different plan setups with different members depending on the requirement in the AS400 application.
  • Ran the PDE process and FIR and R&R transactions to verify claim adjustments.
  • Ran EZTEST for batch claims.
  • Kept track of new product requirements and discussed them with the team on time.
  • Communicated test progress, test results and other relevant information to US counterparts.
  • Reported and tracked defects using HP Quality Center and Rational ClearQuest.
  • Proactively worked with developers to ensure timely bug resolution.
  • Worked closely with the team to perform extensive Smoke, Functional & Regression testing.
  • Involved in peer reviews, weekly status meetings and weekly client status meetings.
  • Mentored and provided knowledge transfer to new entrants.

Confidential

Manual/Linux Tester

Environment: Java, C++, Oracle 11g, Linux

Responsibilities:

  • Analyzed the user/business requirements and functional specification documents.
  • Analyzed the use cases/change requests and prepared unit, integration and system test cases.
  • Involved in various types of testing: GUI, Integration, System and Regression testing.
  • Involved in test plan preparation.
  • Involved in installing the database (Oracle) and Linux (Red Hat), and creating Oracle instances in the Linux environment.
  • Created test data based on test cases and created SQL scripts.
  • Involved in detailed Functional and End-to-End testing.
  • Performed database testing using SQL queries in the Linux environment.
  • Performed system performance testing using UNIX commands such as sar, iostat, netstat, top and vmstat.
  • Reported and tracked defects using ePMS.
  • Actively participated in client, DPR and peer review meetings.
  • Maintained the traceability matrix.
  • Prepared the user manual guide.
  • Prepared UTC and ITC execution status on a weekly basis.
  • Interacted with the development team to discuss technical problems, reported bugs and supported the team.
