
Senior Production Validation Testing Team Engineer / Sr. QA Analyst Resume


Richardson, TX

PROFESSIONAL SUMMARY:

  • 15+ years of IT experience with expertise in analyzing and understanding business requirements and providing Quality Assurance through Manual and Automated Testing, System Testing, and White Box Testing, including Functionality, Performance, Stress, and Regression Testing of Client/Server and Web-based applications.
  • Knowledge and understanding of the full Project/QA life cycle (SDLC), from requirements analysis to User Acceptance Testing.
  • Developed strong professional skills working independently and as a team member on manual and automated testing of applications using Mercury tools. Expertise in data-driven testing and script writing. A committed team player with excellent communication skills. Involved in project management sessions and assisted QA Managers with the Quality Assurance project plan.
  • Hands-on experience analyzing business, technical, and functional requirements to develop test plans, test cases, test scripts, and test strategies, execute tests, and report defects.
  • Performed GUI Testing, Performance/Stress/Load Testing, Functionality Testing, Back-end Testing using SQL, Unit/Integration Testing, White Box Testing, and Defect Tracking using automated testing tools such as Mercury's QuickTest Pro, LoadRunner, and WinRunner.
  • Extensive knowledge of relational database concepts and proficient in data manipulation and retrieval using SQL (inner joins, outer joins, GROUP BY, ORDER BY, cursors, etc.); a brief sketch of this kind of query follows this list.
  • Ability to work in a team environment. Strong communication and interpersonal skills. Ability to interact with customers with ease and professionalism.
  • Attention to detail and the ability to work on tight schedules and on several applications concurrently.
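A minimal, hypothetical sketch of the kind of SQL retrieval referred to above (outer join plus GROUP BY / ORDER BY), issued from Java over JDBC as it would be during backend verification. The CUSTOMER/INVOICE schema, connection URL, and credentials are illustrative placeholders, not taken from any project listed here.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

/**
 * Illustrative only: the schema (CUSTOMER, INVOICE) and connection details are
 * hypothetical placeholders, not the schema of any application named in this resume.
 */
public class JoinGroupByCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL/credentials - substitute real values before running.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/TESTDB", "qa_user", "qa_pass");
             Statement st = con.createStatement()) {

            // Outer join + GROUP BY / ORDER BY: invoice totals per customer,
            // including customers with no invoices at all.
            String sql =
                "SELECT c.customer_id, COUNT(i.invoice_id) AS invoice_cnt, " +
                "       NVL(SUM(i.amount), 0) AS total_amount " +
                "FROM customer c LEFT OUTER JOIN invoice i " +
                "  ON c.customer_id = i.customer_id " +
                "GROUP BY c.customer_id " +
                "ORDER BY total_amount DESC";

            try (ResultSet rs = st.executeQuery(sql)) {
                while (rs.next()) {
                    System.out.printf("%s  invoices=%d  total=%.2f%n",
                            rs.getString("customer_id"),
                            rs.getInt("invoice_cnt"),
                            rs.getDouble("total_amount"));
                }
            }
        }
    }
}
```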

TECHNICAL SKILLS:

Testing Tools: Test Director 7.6, Win Runner 7.6, Load Runner 8.0, QTP 9.0/9.2, Quality Center 9.0/9.2, Test Track Pro, FogBugz and Change Synergy

Environment: Windows 98/XP/Vista, Windows NT, Windows 2000, UNIX

Language: Visual Basic 6, C, C++, C#, Java, SQL

Scripting: TSL, VBScript, JavaScript, Perl Script

RDBMS: SQL Server 7/2000, Oracle 8i, Informix, Sybase, MS Access 2000

Web: HTML, DHTML, ASP, IIS, PWS, JSP, Java, XML, J2EE, EJB, Dreamweaver

Packages: MS Office 2003, Eudora, MS Outlook 2000, Photoshop, Flash, Perforce

Development tools: Photoshop, Flash, Visual Studio 6

PROFESSIONAL EXPERIENCE:

Confidential, Richardson, TX

Senior Production Validation Testing Team Engineer / Sr. QA Analyst

Responsibilities:

  • Production NDA lab test engineer focusing on production validation testing of major AT&T launch initiatives.
  • FirstNet network launch: retail validation, subscriber provisioning validation, network usability validation, and device and firmware validation. First-responder migration flow validation for customers moving from the AT&T Commercial Network Core to the new FirstNet Network Core, including WPS subscriber provisioning and validation alongside FirstNet.
  • iOS device launch test engineer: iOS firmware device validation and IT project production validation testing focused on device usability, IT retail and network provisioning, subscriber network usage consumption, and billing validation for the following device launches: iPhone 7 and 7 Plus, iPhone 8 and 8 Plus, iPhone X, iPhone XR, XS, and XS Max, Apple Watch Series 3 and 4 with LTE, and iPad and iPad Pro (2016 to the latest generation). Retail system launch validation for AT&T retail POS and Direct Fulfillment for all device launches.
  • IT project production validation test engineer in support of the iPad, Apple Watch, and iPhone XS, XR, and XS Max eSIM projects, requiring complete end-to-end validation in the production environment of new infrastructure, processes, and activation flows for each device launch as the environment evolved over the years to streamline activation flows and incorporate new eSIM types.
  • 5G service and device launch production test engineer. Our lab team validated the new Nighthawk device in the PVT lab and at 5G field sites: new 5G ICCID type validation, IT provisioning flow validation, retail order activation, direct fulfillment validation, 4G/5G network attach, usage generation and network validation, device firmware validation via FOTA, and billing validation in the production environment.
  • Overall responsibilities included test case execution; defect reporting, reviews, and retests; status reporting to the PVT test lead on issues and overall project status; and production subscriber MSISDN pool management and cleanup.

Confidential, Dallas, TX

Senior Functional Test Engineer

Responsibilities:

  • Supported production issues by replicating and testing them in FQC and certifying the fixes in the FQC environments.
  • Involved in the design, analysis, and testing of the applications; handled CSM, API, Titan, and U-verse issues in the FQC testing environments.
  • Performed regression testing on CSM, API, iPhone, Billing, FBF, and WNLP before the fixes were deployed into Production.
  • Used SQL queries to perform backend testing on the database.
  • Performed functional, regression, white-box, and black-box testing on the payment and billing application.
  • Designed and executed test cases for sustainment releases, ECRs, and merges.
  • Used QC to execute the test cases, track execution against the plan during testing, and manage defects from inception to resolution.
  • Identified, analyzed, and documented defects using Mingle as the defect tracking system, along with the WebTrax and iTrack tools.
  • Responsible for development of the test strategy and test plans, creation and execution of test scripts, reporting test results, and working closely with project team members on issue resolution and process optimization.
  • Worked closely with Developers, Business Analysts, and Business User Representatives and participated in the product design process, including specification reviews and documentation inspections. Documented and communicated test results.
  • Participated in day-to-day meetings with the Implementation, Development, and testing teams.
  • Conducted and coordinated daily handoff meetings between the onshore and offshore teams, taking and providing instructions and suggestions.
  • Involved in defect reporting, tracking, and reproduction.
  • Wrote SQL statements to extract data from the application backend and verify it against frontend results during the end-to-end stage (a brief sketch of this kind of check follows this list).
  • Ran billing and validated the bills in FQC.
  • Performed backend testing in the database to validate that the correct tables and data were being updated and inserted.
  • Validated the data in the database using database checkpoints.
  • Interacted with developers to report software bugs and retested the fixed issues.
  • Participated in QA weekly meetings and various other meetings to discuss Enhancement and Modification Request issues and defects in the application.
  • Coordinated with the BA and Development teams on defect fixes.
  • Coordinated with the development team on test case execution and collecting test results.
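A minimal sketch of the backend-versus-frontend check described above, assuming a hypothetical BILLING_CHARGE table and BAN column; the real table names, connection details, and expected amount would come from the application under test.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

/**
 * Hypothetical sketch: the table/column names (BILLING_CHARGE, BAN, CHARGE_AMT),
 * the JDBC URL, and the expected value are illustrative, not actual Telegence/Enabler schema.
 */
public class BackendBillingCheck {
    public static void main(String[] args) throws Exception {
        String ban = "123456789";          // account under test (example value)
        double expectedCharge = 45.99;     // amount shown on the frontend/bill

        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/FQC", "qa_user", "qa_pass");
             PreparedStatement ps = con.prepareStatement(
                "SELECT SUM(charge_amt) FROM billing_charge WHERE ban = ?")) {

            ps.setString(1, ban);
            try (ResultSet rs = ps.executeQuery()) {
                double actual = rs.next() ? rs.getDouble(1) : 0.0;
                if (Math.abs(actual - expectedCharge) > 0.005) {
                    // Mismatch between backend data and frontend result -> candidate defect.
                    System.out.printf("DEFECT: BAN %s backend total %.2f != expected %.2f%n",
                            ban, actual, expectedCharge);
                } else {
                    System.out.printf("PASS: BAN %s charge total matches (%.2f)%n", ban, actual);
                }
            }
        }
    }
}
```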

Environment: Manual Testing, Telegence CSM, Amdocs Enabler, API, CRM, OMS, CITRIX, Quality Center, WebTrax, iTrack, Java, Oracle, Toad, SQL*Plus, UNIX Batch Operations, Windows NT, Tuxedo, AMC tool.

Confidential, Dallas, TX

Senior Software Quality Assurance Analyst

Responsibilities:

  • Analyzed business requirements and functional specifications to create test plans and test scenarios.
  • Wrote test scenarios and then test cases from high-level documents.
  • Actively participated in test case walkthroughs and client meetings.
  • Worked extensively on Amdocs Ensemble and Enabler modules - Customer Service Management (CSM), MPS (Guiding & Rating), Billing Validation, AR and Collections, ASMM, ASAP, CRM, APRM, CM, OLC, and RPL.
  • Ran billing jobs to produce bills for new and converted BANs in different billing cycles for multiple markets such as Boston, Philly, and New York.
  • Verified and analyzed logs on the Enabler and Ensemble side to resolve issues encountered during billing and AR batch jobs.
  • Actively participated in control tests to run and validate EOD and EOM AR/GL/CSM/Billing/Revenue Assurance/Tax Jurisdiction/Prepaid OLC Balance reports.
  • Performed Build Acceptance, Functional, GUI, Integration, Database, User Acceptance, and Regression testing of the Identity Access Management application.
  • Performed Integration, Regression, and End-to-End testing for application version releases.
  • Ran billing for BANs (customers) on a daily basis.
  • Checked and analyzed Ensemble and Enabler logs and databases (DB) to troubleshoot issues encountered during billing.
  • Performed various CSM activities such as BAN creation, activation, suspension, cancellation, restoration, and resumption of subscribers.
  • Extensively tested various modules and applications, viz. ASAP, POS, CRM, AMSS, CM, and GL (General Ledger).
  • Tested build sanity on a regular basis.
  • Deployed hot fixes (HF) and retested the defects.
  • Tested and validated messages and images on the invoices for every monthly statement.
  • Attended conference calls with Developers, the Infra/Operations team, DBAs, Scoping, R.T., and the UNIX team to resolve issues.

Environment: Amdocs Ensemble/Enabler, ASAP, AMSS, Clarify CRM, POS, INTERFACES, CITRIX, Quality Center, Java, Oracle, Toad, SQL*Plus, UNIX Batch Operations, Windows NT, Tuxedo, XML, SoapUI

Confidential, Dallas, TX

Senior Test Engineer / QA Analyst

Responsibilities:

  • Performed different tests to measure the performance of the Event Processes (EP) when A&F, rating, and dispatcher run concurrently (a conceptual sketch of this kind of concurrent timing check follows this list).
  • Performed different tests to check the performance of Usage Query under varied load conditions, for both details and summary.
  • Ran Rating, Rerating, and EOC to check the performance of the Rater processes.
  • Used LoadRunner to create new subscribers, change price plans, and perform other TRB activities on the Telegence side and then pass them on to Enabler to check the performance of JMSAdapter, TRBManager, and API Invoker.
  • Ran different reports on large production volumes to check their performance.
  • Compiled and presented a report to the client at the end of each test drop.
  • Tested different Emergency Change Requests in the labs.
  • Supported the production night release for every version on deployment night.
  • Went through the HLD for new Kintanas and then discussed and collected requirements from the Dev and Biz Dev teams to understand the goal of the tests.
  • Used Team Track and WebTrax for logging defects and opening Work Requests for support teams, respectively.
  • For any issues found, we got the fixes in the form of an ECR, which was then tested for problem resolution; ECRs fixing production issues were also tested for performance when needed.
  • Used Amdocs Service Mediation Manager to process rejected events, rectify them, and then reprocess them.
  • On completion of tests, logged the details of the tests, processes, timings, and special comments on a performance website, which the Performance DB team then used to analyze system behavior over the course of the test.
  • API (we used TLG and API for Usage Query only); UUP (unlimited usage goes to the Ab Initio platform in flat-file format, while limited usage goes into Enabler); UQ and EOC would use data from both Enabler and UUP for a cycle month and cycle code when UQ or EOC is run.
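The load itself was driven with LoadRunner against Telegence/Enabler; the sketch below only illustrates, in plain Java, the underlying pattern of firing a transaction from several concurrent "virtual users" and recording per-request latency. The transaction body is a placeholder, not the real call.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/**
 * Conceptual sketch only: illustrates measuring response time under concurrent
 * load; the placeholder sleep stands in for the actual transaction under test.
 */
public class ConcurrentLatencyProbe {
    public static void main(String[] args) throws Exception {
        int virtualUsers = 20;
        ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);

        Callable<Long> transaction = () -> {
            long start = System.nanoTime();
            Thread.sleep(50);              // placeholder for the real call (e.g. a usage query)
            return System.nanoTime() - start;
        };

        List<Future<Long>> results = new ArrayList<>();
        for (int i = 0; i < virtualUsers; i++) {
            results.add(pool.submit(transaction));
        }

        long total = 0, max = 0;
        for (Future<Long> f : results) {
            long elapsedMs = f.get() / 1_000_000;
            total += elapsedMs;
            max = Math.max(max, elapsedMs);
        }
        pool.shutdown();

        System.out.printf("avg=%d ms  max=%d ms over %d concurrent requests%n",
                total / virtualUsers, max, virtualUsers);
    }
}
```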

Environment: Manual Testing, Amdocs Enabler, Telegence, API, CITRIX, Quality Center, WebTrax, Team Track, Java, Oracle, Toad, SQL*Plus, UNIX Batch Operations, Windows NT, Tuxedo, LoadRunner

Confidential, Dallas, TX

Senior Software Quality Assurance Analyst

Responsibilities:

  • Analyzed business requirements and functional specifications to create test plans and test scenarios.
  • Wrote test scenarios and then test cases from high-level documents.
  • Actively participated in test case walkthroughs and client meetings.
  • Worked extensively on Amdocs Ensemble and Enabler modules - Customer Service Management (CSM), MPS (Guiding & Rating), Billing Validation, AR and Collections, ASMM, ASAP, CRM, APRM, CM, OLC, and RPL.
  • Ran billing jobs to produce bills for new and converted BANs in different billing cycles for multiple markets such as Boston, Philly, and New York.
  • Verified and analyzed logs on the Enabler and Ensemble side to resolve issues encountered during billing and AR batch jobs.
  • Actively participated in control tests to run and validate EOD and EOM AR/GL/CSM/Billing/Revenue Assurance/Tax Jurisdiction/Prepaid OLC Balance reports.
  • Performed Build Acceptance, Functional, GUI, Integration, Database, User Acceptance, and Regression testing of the Identity Access Management application.
  • Performed Integration, Regression, and End-to-End testing for application version releases.
  • Ran billing for BANs (customers) on a daily basis.
  • Checked and analyzed Ensemble and Enabler logs and databases (DB) to troubleshoot issues encountered during billing.
  • Performed various CSM activities such as BAN creation, activation, suspension, cancellation, restoration, and resumption of subscribers.
  • Extensively tested various modules and applications, viz. ASAP, POS, CRM, AMSS, CM, and GL (General Ledger).
  • Tested build sanity on a regular basis.
  • Deployed hot fixes (HF) and retested the defects.
  • Tested and validated messages and images on the invoices for every monthly statement.
  • Attended conference calls with Developers, the Infra/Operations team, DBAs, Scoping, R.T., and the UNIX team to resolve issues.

Environment: Amdocs Ensemble/Enabler, ASAP, AMSS, Clarify CRM, POS, INTERFACES, CITRIX, Quality Center, Java, Oracle, Toad, SQL*Plus, UNIX Batch Operations, Windows NT, Tuxedo, XML, SoapUI

Confidential, Dallas, TX

Senior Performance Test Engineer

Responsibilities:

  • Performed different tests to measure the performance of the Event Processes (EP) when A&F, rating, and dispatcher run concurrently.
  • Performed different tests to check the performance of Usage Query under varied load conditions, for both details and summary.
  • Ran Rating, Rerating, and EOC to check the performance of the Rater processes.
  • Used LoadRunner to create new subscribers, change price plans, and perform other TRB activities on the Telegence side and then pass them on to Enabler to check the performance of JMSAdapter, TRBManager, and API Invoker.
  • Ran different reports on large production volumes to check their performance.
  • Compiled and presented a report to the client at the end of each test drop.
  • Tested different Emergency Change Requests in the labs.
  • Supported the production night release for every version on deployment night.
  • Went through the HLD for new Kintanas and then discussed and collected requirements from the Dev and Biz Dev teams to understand the goal of the tests.
  • Used Team Track and WebTrax for logging defects and opening Work Requests for support teams, respectively.
  • For any issues found, we got the fixes in the form of an ECR, which was then tested for problem resolution; ECRs fixing production issues were also tested for performance when needed.
  • Used Amdocs Service Mediation Manager to process rejected events, rectify them, and then reprocess them.
  • On completion of tests, logged the details of the tests, processes, timings, and special comments on a performance website, which the Performance DB team then used to analyze system behavior over the course of the test.
  • API (we used TLG and API for Usage Query only); UUP (unlimited usage goes to the Ab Initio platform in flat-file format, while limited usage goes into Enabler); UQ and EOC would use data from both Enabler and UUP for a cycle month and cycle code when UQ or EOC is run.

Environment: Manual Testing, Amdocs Enabler, Telegence, API, CITRIX, Quality Center, WebTrax, Team Track, Java, Oracle, Toad, SQL*Plus, UNIX Batch Operations, Windows NT, Tuxedo, LoadRunner

Confidential, Dallas, TX

Senior Software Quality Assurance Analyst

Responsibilities:

  • Analyzed business requirements and functional specifications to create test plans and test scenarios.
  • Wrote test scenarios and then test cases from high-level documents.
  • Actively participated in test case walkthroughs and client meetings.
  • Executed test cases in Quality Center.
  • Performed Build Acceptance, Functional, GUI, Integration, Database, User Acceptance, and Regression testing of the Identity Access Management application.
  • Involved in conducting walkthroughs of the Test Plan and Requirements Traceability Matrices (RTM).
  • Manually executed the test cases from the given calendar on each build.
  • Performed Integration, Regression, and End-to-End testing for application version releases.
  • Ran billing for BANs (customers) on a daily basis.
  • Checked and analyzed Ensemble and Enabler logs and databases (DB) to troubleshoot issues encountered during billing.
  • Ran AR (Accounts Receivable) jobs on a daily basis.
  • Coordinated with developers on defect analysis and fixes.
  • Performed various CSM activities such as BAN creation, activation, suspension, cancellation, restoration, and resumption of subscribers.
  • Extensively tested various modules and applications, viz. ASAP, POS, CRM, AMSS, CM, and GL (General Ledger).
  • Tested build sanity on a regular basis.
  • Provisioned handsets/devices through Amdocs Activation Manager (AAM).
  • Responsible for functionality testing, Integration testing, Data Validation testing, Control testing, and Regression testing.
  • Performed interface testing using XML (a brief sketch of this kind of check follows this list).
  • Deployed hot fixes (HF).
  • Tested and validated messages and images on the invoices for every monthly statement.
  • Attended conference calls with Developers, the Infra/Operations team, DBAs, Scoping, R.T., and the UNIX team to resolve issues.
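A minimal sketch of an XML interface check of the kind described above; the subscriberStatusResponse payload and its fields are invented for illustration and are not the actual interface format.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

/**
 * Hypothetical sketch: the XML layout (a subscriberStatusResponse with a <status>
 * element) is invented for illustration, not taken from the real interface.
 */
public class InterfaceXmlCheck {
    public static void main(String[] args) throws Exception {
        String response =
            "<subscriberStatusResponse>" +
            "  <ban>123456789</ban>" +
            "  <status>ACTIVE</status>" +
            "</subscriberStatusResponse>";

        // Parse the interface response and pull out the field under test.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(response.getBytes("UTF-8")));

        NodeList statusNodes = doc.getElementsByTagName("status");
        String status = statusNodes.getLength() > 0
                ? statusNodes.item(0).getTextContent().trim() : "";

        // The check passes only if the interface reports the expected state.
        if ("ACTIVE".equals(status)) {
            System.out.println("PASS: subscriber status is ACTIVE");
        } else {
            System.out.println("FAIL: unexpected status '" + status + "'");
        }
    }
}
```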

Environment: Amdocs Ensemble/Enabler, ASAP, AMSS, Clarify CRM, POS, INTERFACES, CITRIX, Quality Center, Java, Oracle, Toad, SQL*Plus, UNIX Batch Operations, Windows NT, Tuxedo, XML, SoapUI

Confidential, Charlottesville, VA

Sr. Software Test Engineer

Responsibilities:

  • Reviewed and updated the existing AMC Test Plan
  • Developed test plans covering the testing overview, approach, strategy, and the complete scope of testing.
  • Performed Black Box and White Box testing to ensure that the functionality of the application matched the business requirements.
  • Performed Manual Testing and created test cases, including Functional, Acceptance, and Regression test cases.
  • Created test scripts using QTP and enhanced them by adding functions and inserting synchronization points and checkpoints whenever necessary.
  • Developed test scripts for performance and data-driven tests using QTP.
  • Extensively used Ethereal to capture the communication between the clients and the server in the form of packets.
  • Responsible for testing the S2S Message Protocol and developed a Test Plan for it.
  • Executed the test cases to verify actual results against expected results.
  • Verified the pay tables and performed validation testing on data tables.
  • Wrote simple to complex SQL queries to verify the database tables for the data inserted from the GUI
  • Developed, debugged, executed, and maintained software test plans and test cases using Quality Center
  • Involved in tracking, reviewing, analyzing and comparing bugs using Fogbugz, Test Track Pro and Quality Center
  • Responsible for weekly status updates showing the progress of the testing effort and open issues to be resolved.
  • Attended daily bug meetings with testers, business analysts, and developers to discuss defects.
  • Submitted defect reports and ran regression tests.
  • As a Senior Test Engineer, led a four-member onsite team and coordinated with team members at other locations on testing activities and status updates.

Environment: Manual Testing, QTP 8.6/9.0, Quality Center 9.0, FogBugz, Test Track Pro, C, C++, C#, ASP.NET, SQL Server, XML, IIS, Windows XP/NT, Crystal Reports, Python, Ethereal, Ether term.

Confidential, VA

QA Analyst

Responsibilities:

  • Involved in developing test case inventory, Test Plan and detailed test cases for Manual Testing for CSM and Price Plan Modules.
  • Responsible for Testing the Web CSM module as a first phase.
  • Analyzed business requirements, system requirement specifications and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Ran Smoke test every morning after each build.
  • Performed Functionality, Regression and User Acceptance Testing.
  • Performed Black box testing for the application and reported the defects using Quality Center
  • Wrote positive and negative test cases and conducted tests as needed.
  • Performed unit testing using JUnit test cases to test the behavior of the coded classes (a sample test of this kind follows this list).
  • Developed automated test scripts using QTP to perform functional and regression testing.
  • Used the QuickTest Pro data table to parameterize the tests.
  • Created test input requirements and prepared the test data for data-driven testing.
  • Carried out extensive testing with the QTP tool using different test cases that reflect various real-time business situations.
  • Performed actual load and stress testing using LoadRunner by creating virtual users and multiple Scenarios.
  • Performed load and stress testing using rendezvous points and recording transactions in LoadRunner, to pinpoint potential problem areas and bottlenecks.
  • Created Check points and Synchronization points for functional testing using QTP.
  • Data validation and Database integrity testing done by executing SQL, PL/SQL statements.
  • Performed cross browser functionality testing on multiple versions of IE/NN/AOL.
  • Used Quality Center to track and report system defects and bug fixes. Wrote modification requests for the bugs found while testing the application.
  • Participated in defect review meetings with the team members. Used MS Word and Microsoft Excel for documentation.
  • Involved in other test planning meetings and submitted test matrices daily to the management.
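A small JUnit 4 example of the kind of unit test mentioned above; PricePlan is a stand-in class invented for illustration, since the real tests targeted the project's own coded classes.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

/**
 * Hypothetical sketch: PricePlan is a minimal stand-in for a class under test,
 * not an actual class from the applications listed in this resume.
 */
public class PricePlanTest {

    /** Minimal stand-in for a class under test. */
    static class PricePlan {
        private final double monthlyRate;
        private final double perMinuteOverage;

        PricePlan(double monthlyRate, double perMinuteOverage) {
            this.monthlyRate = monthlyRate;
            this.perMinuteOverage = perMinuteOverage;
        }

        /** Monthly charge: base rate plus overage for minutes beyond the bundle. */
        double monthlyCharge(int bundledMinutes, int usedMinutes) {
            int overage = Math.max(0, usedMinutes - bundledMinutes);
            return monthlyRate + overage * perMinuteOverage;
        }
    }

    @Test
    public void chargeWithinBundleIsBaseRateOnly() {
        PricePlan plan = new PricePlan(39.99, 0.45);
        assertEquals(39.99, plan.monthlyCharge(450, 300), 0.001);
    }

    @Test
    public void overageMinutesAreBilledAtPerMinuteRate() {
        PricePlan plan = new PricePlan(39.99, 0.45);
        // 50 minutes over the bundle: 39.99 + 50 * 0.45 = 62.49
        assertEquals(62.49, plan.monthlyCharge(450, 500), 0.001);
    }
}
```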

Environment: LoadRunner 8.0, Quality Center 8.6, QTP 8.6, HTML, DHTML, Java Applets, JSP, Servlets with JDBC, EJB 2.0, JUnit, APPLICATIONS, Oracle, PL/SQL, MS Access, Windows NT/UNIX.

Confidential, NY

QA Analyst

Responsibilities:

  • Helped QA Team to implement testing objectives, studied and understood the functionality of the application under test.
  • Formulated Test plans from Business Requirements/Function Specifications documents and created several Test Cases.
  • Formulated Performance Test Strategy, Test Scenarios and Test types based on the transaction details, user profile details and transaction distribution using LoadRunner.
  • Administered and participated in Load and Performance testing done using LoadRunner by creating virtual users.
  • Wrote and executed test scripts to verify most of the Test cases using QTP.
  • Performed White Box and manual testing of the application using QTP to test the system for both the functional and user requirements for positive and negative scenarios.
  • Performed Web Testing.
  • Created and enhanced numerous test scripts to handle changes in the objects, in the Tested application’s GUI and in the testing environment using QTP.
  • Manually performed Functional testing that included Security, Unit, System, Mutation, Smoke, and Back-end testing.
  • Performed database table manipulations on relational database systems by writing complex SQL queries manually.
  • Conducted Regression Testing using QTP.
  • Inserted different checkpoints in the QTP scripts in order for the GUI and functional testing of the application to be more detailed and exhaustive.
  • Managed these tests in Quality Center.
  • Reported defects about any failures in the script with Quality Center.
  • Analyzed results of the tests by creating reports in Quality Center.
  • Documented and reported the progress to the management on an ongoing basis and updated the requirement and defect status as per the current status of the testing project in the Quality Center.
  • Actively participated in Status Reporting meetings.

Environment: Quality Center 8.6, LoadRunner 7.6, QTP 8.6, SQL, PL/SQL, Oracle, Java, Windows NT/2000, TSL.

Confidential, NEWARK, NJ

QA Tester

Responsibilities:

  • Analyzed business and system requirements for the Prudential Financial Website
  • Performed Manual testing of the application before performing Automated testing
  • Managed the test cases using Quality Center
  • Developed automated scripts using QTP
  • Developed the Test Scripts for stress testing
  • Ran the scenarios and analyzed the results through monitors and graphs to identify bottlenecks.
  • Implemented User Acceptance Testing (UAT)

Environment: Oracle, PL/SQL, Quality Center 8.0, QTP 8.0, LoadRunner 7.0, Java, JavaScript, WebLogic 5.1, HP-UX, FTP, SQL, TCP/IP, UML, Windows 2000
