
UAT Lead Resume


SUMMARY:

  • Software professional with over 15 years of experience in the software industry as a software development and test engineer, most recently as Software Test Lead/Quality Assurance Lead. Hands-on experience in the design, development, and testing of applications using C#, VB 6.0, VB.NET, Oracle 6.x, HTML, Quality Center, Rally, Team Foundation Server, and SQL, with excellent management skills leading both onshore and offshore teams.
  • Strong technical and communication skills. Versatile team player with the ability to communicate effectively at all levels of the development/testing cycle.
  • Played the role of Scrum Master for agile projects, having the team create the tasks (test cases), execute them, and mark the test cases and the respective user stories complete.

TECHNICAL SKILLS:

Languages: C#, SQL, VB6.0, VB.NET, HTML

Platforms: Win 2008 Server, Win 2003 Server, Win XP, Vista, Win 98, Win 95.

Tools: Team Foundation Server (TFS), Product Studio (for bugs), WTT (for test cases), Octopus (for deployment/installation), Magellan (for code coverage), NetMon - ACT - PerfMon (for monitoring performance counters), Source Depot (for code check-ins), Quality Center, Remedy Ticketing System.

Working Knowledge: White box and black box testing, writing test plans and test cases, automating test cases, debugging, crash dumps, code coverage, the full Software Development Life Cycle (SDLC), Agile and Scrum processes, extensive manual and unit testing; team player with good interpersonal and communication skills; QA/UAT Lead.

WORK EXPERIENCE:

UAT Lead

Confidential

Responsibilities:

  • I was responsible for everything from initial assessment through sign-off and go-live readiness for all the projects, which included the following responsibilities:
  • Reviewing and assessing the requirements
  • Extracting requirements (by analyzing the current requirements with the stakeholders) and anticipating them (identifying requirement gaps).
  • Ensuring that the defined tests (test cases) provide effective coverage of all in-scope functionality via a detailed and comprehensive test plan
  • Getting the priorities of the respective features set.
  • Gathering the key acceptance criteria.
  • Coordinating the data requests needed for the smooth execution of the UAT.
  • Defining the right exit criteria.
  • Making sure that the proper buy-off is gathered from the business owners in case of any discrepancies in the expected and the actual results.
  • Recognize any change necessary to existing processes and take a lead role locally in ensuring that the changes are made and adequately communicated to the concerned parties.
  • Evaluating the process and facilitate the issue resolution.
  • Automation using C# (with internal Confidential Automation tools) and Selenium (beginner level).
  • Sign off on the UAT with all the caveats listed accordingly and communicated to the whole project team as needed.
  • Assist on the Night of Deployment with the production testing.
  • Mentoring the team on the projects based on the priorities
  • Aligning the projects based on priority and complexity.
  • Assigning the resources based on skill set and availability
  • Providing detailed insight to the assigned team on the project, logistics, and data required
  • Simplifying the requirements
  • Evaluating the test cases written, creating test data, and helping the team execute the cases
  • Pushing back on any last-minute requirement changes, or getting buy-off from management to accept them based on their value to the customer
  • Backing up the team in ambiguous situations such as defect resolution
  • Was completely focused on customers and the business. Was constantly in touch with the customers to see what they were actually seeking from the product, rather than testing only against what the expected behavior should be. Surfaced valuable missing requirements for the team to fulfill, keeping customers happy.
  • I was involved in cross-team collaboration to understand the processes other teams follow so we could bridge the gaps in our team. For example, gaps in gathering the test data requirements, accepting/rejecting requirement changes at the last minute, etc.
  • Driving and contributing with the team to meet business goals and objectives
  • Evaluating and consolidating the information gathered from high-level and low-level sources into a general understanding.
  • Link business goals to everyday actions and priorities
  • Reviewing deliverables and providing input for other analysts. Strong understanding of all business channels.
  • Understanding how policies, systems, and processes impact requests/projects, and fostering business knowledge and understanding among business systems analysts and project stakeholders
  • Facilitating requirement sessions with stakeholders and creating comprehensive work products, documenting requirement gaps.

Test Lead

Confidential

Responsibilities:

  • I was responsible for leading the projects on the CSSNG (Compass site System Next Generation) team which included the following responsibilities:
  • Was involved in the SR and the HLD reviews
  • Drafted the test deliverable schedules.
  • Was involved in the test plans and test cases. Made sure all the test cases were covered, entered in the tool (QC, TFS, etc.), and prioritized.
  • Made sure all the test cases were correctly mapped to the requirements.
  • Made sure the team met the test schedules with no hitches.
  • Involved the team in daily stand-ups to track the progress of the testing cycle and surface any issues that needed to be addressed.
  • Made sure all the defects entered had all the details needed, such as title, description, additional info, and images. Also made sure they were real bugs and not design changes.
  • Was responsible for everyday triages on bugs.
  • Was involved with the Integrated Pair Wise Testing (IPWT) and UAT. Made sure the environment and end-point details were communicated to the team in time.
  • Was involved in getting the test package submitted to the team, which consisted of:
  • Test Plan
  • Test Cases
  • Test Results
  • Defect Disposition Report
  • Traceability Tree (Which contains the mapping between the Requirements and the Test Cases).
  • Bug Status
  • Was solely responsible for the test sign-off from the team.
  • Was involved during the Deployments and Smoke testing after the deployment.

Sr. QA Analyst / QA Lead

Confidential

Responsibilities:

  • I worked on the Remedy Ticketing System. I was responsible for designing test strategies including test plans and test cases, coming up with test schedules, establishing a process for test sign-off, reviewing test plans/test cases, executing the tests, reporting bugs, debugging issues, and assisting with deployment and smoke testing in production, on the Remedy platform.
  • Change Management
  • Incident Management
  • Work Order Management
  • Service Request Management
  • People Feed Web Service (through SoapUI)
  • Took up the QA Lead role and worked on ramping up the new QA folks: assisting them with QA tasks, assigning them modules to work on, reviewing their work, guiding them on daily tasks, educating them on the projects when questions arose, and overseeing bug reporting, striving towards Zero Bug Bash (bugs are either closed or postponed to a future release) before sign-off.

SDET

Confidential

Responsibilities:

  • Was involved right from the design phase of the project and gave constructive feedback to the team about the implementation of the web service
  • Was involved in the review of the design document
  • Created and enumerated the test plan and test cases.
  • Automated the ScomConnector Service and Auto Pilot Connector web service. Tests included:
  • Processing events to the output adaptor.
  • Calling ScomConnector when the other dependent services (e.g., the Pipeline service) are down.
  • Verifying the custom fields in SCOM for proper updates of the events.
  • A few manual tests.
  • Full test passes, regression testing, and smoke testing.
  • Creating test MSI for the test automation
  • Setting up the SCOM server
  • Creating/importing management packs (OLEDB, NTService, TCP, URL, etc.).
  • Setting up monitored objects (the computers that need to be monitored by SCOM to gather the events).
  • Creating rules in monitored objects (like the event ID number, event source, etc.).
  • Generating test alerts (event alerts, URL alerts, etc.) from different SCOM agents.
  • Smoke tested in production (looking at the Heartbeat and Events Processed counters, etc.) with limited access.
  • Wrote the following helper functions to enhance the automation:
  • Method that determines if an XML file is empty
  • Method that gets and modifies an element value from an XML file.
  • Method that stops the service, deletes some files, and then restarts the service
  • Method to check whether a particular service is installed.
  • Method that returns all the files from a directory
  • Method that deletes records from the database
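As a rough illustration only (in Python rather than the original C#, with hypothetical file and element names), a few of the helper functions above might look like:

```python
import os
import xml.etree.ElementTree as ET

def xml_is_empty(path):
    """Return True if the XML file's root element has no child elements."""
    root = ET.parse(path).getroot()
    return len(root) == 0

def get_element_value(path, tag):
    """Return the text of the first element matching `tag`, or None."""
    elem = ET.parse(path).getroot().find(tag)
    return elem.text if elem is not None else None

def set_element_value(path, tag, value):
    """Set the text of the first element matching `tag` and save the file."""
    tree = ET.parse(path)
    elem = tree.getroot().find(tag)
    if elem is None:
        raise KeyError(f"no <{tag}> element in {path}")
    elem.text = value
    tree.write(path)

def list_files(directory):
    """Return the full paths of all regular files in a directory."""
    return [os.path.join(directory, name) for name in os.listdir(directory)
            if os.path.isfile(os.path.join(directory, name))]
```

The service-control and database helpers would follow the same pattern, wrapping the relevant service-manager and database APIs behind small, reusable methods.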

Confidential

SDET

Responsibilities:

  • Automated the Contacts area in Outlook using the DASATK test framework, which uses a fake server. DASATK connects to the ActiveSync Engine (ASE) through an HTTP port, communicates with ASE through the AirSync protocol, and sends XMLs to ASE to manipulate data. DASATK uses the Outlook Object Model to verify the data, and I was responsible for automating the protocol.
  • A function that changes the system's date format. This required researching Windows APIs and ultimately calling them via P/Invoke.
  • A function that takes an XML file and a tag name as inputs and returns the tag value as output.
  • A function that compares the tag value (from an XML file) on the server with that in the response body (which again is an XML file).
  • Function that compares two pictures.
  • Function that creates random numbers and strings to be used as input data for automating test cases. We never used hardcoded inputs in our automation.
  • Function that writes information about the cases that failed, and why and where they failed.
  • Function that converts an XML document into a string
  • Automated the Tasks test cases using the EWS test framework. This used the real Exchange web server.
  • Worked on setting up the virtual Confidential (this emulates x86)
  • Fixed cases written by vendors which failed in our daily runs.
  • Coordinated with the vendor team (China) about the runs and test cases fixing.
  • Narrowed down all the cases to Emulator, Simulator, NoEnroll Emulator, Cmdlets and NodeMonCSP cases.
  • Ran all the TEE automation and fixed any automation bugs.
  • Ran code coverage on the TEE
  • Communicated the TEE test pass report to the team with the exact number of tests run, tests passed, and tests failed, along with the code coverage numbers.
  • Uploaded the TEE results into WTT with batch Files.
  • Ran all the ADGP Driver cases ( RSOP Functional, WMI Negative, WMI Positive, Update CmdLets)
  • Did some automation fixes
  • Ran all the GPMC automated testcases and fixed some failures in the automation.
  • Wrote batch files to run the Automation
  • Got traces for bugs with the help of WPPTracer.exe (the built-in tracing code inside the product) and reported the traces to the devs or other concerned people.
  • Tested a lot of DCRs and did extensive regression testing.
  • Tested multiple instances. Had SQL in two instances, with Device Management Server 1 and Enrollment Server 1 pointing to instance 1 of SQL, and Device Management Server 2 and Enrollment Server 2 pointing to instance 2 of SQL, with the client box and the domain controller on their own machines/instances.
  • Set up the test environments with single-box and multi-box topologies.
  • Wrote numerous test docs on setting up the environment, running the test cases, setting up single-box and multi-box topologies, etc.
  • Ramped up a new contractor on setting up the environment and running the GPMC cases, and got him up to speed on the project automation.
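A couple of the XML-comparison and random-input helpers described above can be sketched roughly as follows (in Python rather than the original C#, with made-up tag names for illustration):

```python
import random
import string
import xml.etree.ElementTree as ET

def get_tag_value(xml_text, tag):
    """Return the text of the first `tag` element found anywhere in an XML string."""
    elem = ET.fromstring(xml_text).find(f".//{tag}")
    return elem.text if elem is not None else None

def tags_match(server_xml, response_xml, tag):
    """True if both XML documents carry the same value for `tag`."""
    return get_tag_value(server_xml, tag) == get_tag_value(response_xml, tag)

def random_string(length=8):
    """Random alphanumeric test input, so cases never rely on hardcoded data."""
    return "".join(random.choice(string.ascii_letters + string.digits)
                   for _ in range(length))
```

Verification then becomes a single call, e.g. `tags_match(server_xml, response_xml, "Email")`, with the random generator feeding fresh input data into every run.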

SDET

Confidential

Responsibilities:

  • I worked as an SDET in the Image group of Confidential. My features included the Footage Details page, the Price Update tool, and the Pricing Provider (web service). I tested the updates that add new prices for each image based on territory (e.g., US, UK, Japan, Spain) and currency. Main work items included the following:
  • Involved in writing Test Cases and Test Plans
  • As part of the sprint process, involved in daily triages and schedule tracking.
  • Execution of the tests
  • Automation (using C# and SQL) of the Pricing Provider (web service) to test four broad categories: basic tests, boundary tests, multiple tests, and invalid tests.
  • Ran SQL jobs to update the database with the latest prices for the Price Update Tool.
  • Used basic SQL queries inside the C# automation code to verify the returned data.
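The database-verification step above can be sketched like this (a Python/SQLite stand-in for the C#/SQL Server original; the `Prices` table and its columns are hypothetical):

```python
import sqlite3

def price_matches(conn, image_id, territory, service_price):
    """Compare the price returned by the (hypothetical) Pricing Provider
    web service against the price stored in the database."""
    row = conn.execute(
        "SELECT price FROM Prices WHERE image_id = ? AND territory = ?",
        (image_id, territory),
    ).fetchone()
    return row is not None and row[0] == service_price

# Demo with an in-memory database standing in for the real price store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Prices (image_id INTEGER, territory TEXT, price REAL)")
conn.execute("INSERT INTO Prices VALUES (1, 'US', 49.99), (1, 'UK', 39.99)")
```

The automation would call the web service for an (image, territory) pair and then assert that `price_matches(...)` returns true for the value it got back.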

SDET

Confidential

Responsibilities:

  • I worked on the Windows Desktop Search team. The project I was involved in was the Casino project, the new search feature shipping with Vista.
  • My Primary responsibilities included:
  • Ran the tests and debugged issues using debugger tools like WinDbg, console debuggers, etc.
  • Used Watson for crash dumps and debugging.
  • Performed code coverage for the features I was involved in, using the Magellan tool.
  • Coordinated with offshore vendors (India) to keep them up to speed on project progress, requirements, and status to drive the projects to success. I was involved with the following:
  • Assigning work items to them
  • Giving them guidelines and status on the new features added and how to test features
  • Keeping track of their daily tasks
  • Verifying the bugs entered by them
  • Resolving any issues/questions they have on the project
  • Guiding them on test runs
  • Automation for BVT’s.

Software Test Engineer

Confidential

Responsibilities:

  • I worked in MSCOM team for 2.5 years as a full time employee. I worked on several projects during that period including Media center Online Project, Customer Support Tool, Regsys Builder, Anonymous Provider, Blogs, and WebBoards (Forum Service Webservice, ReachOut Webservice, Admin Webservice, Background Search, Search Indexer, Word Breaker)
  • On all these projects, my primary responsibilities were:
  • Involved in writing Automation for most of the projects
  • Reviewing and giving feedback on the functional specs.
  • Writing Test Plans and Test Cases and getting them reviewed from the team.
  • Installing the bits of the latest drop on all the test servers, and backing up and restoring the database
  • Performance Testing, Localization Testing, Browser Testing, Backward Compatibility Test, Cross Site Scripting Testing
  • Performed Code Coverage testing for all the projects (Used Sleuth to analyze the results).
  • Coordinated with offshore vendors (India) to keep them up to speed on project progress, requirements, and status to drive the projects to success.
  • Conducted triage meetings for all the projects I was responsible for, to make sure we fixed the right bugs in the product.
  • I created Test Environment Diagrams in Visio before I actually set them up in the lab.
  • I have done extensive performance testing as part of my job. I have used tools like Anvil Tool, MsHammer Tool, Act Tool, Perfmon, NetMon, RT Analyzer, BandWidth Analyzer and Webrunner while testing performance of the websites.
  • I have created the following tools as part of my job:
  • Wrote a tool to stress test a web service.
  • Wrote a tool for multithreaded testing of the web service.
  • Wrote a tool to retrieve data from the database in a given format.
  • Wrote test pages for the projects to test the web service.
  • Wrote a tool to test the Word Breaker.
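The core of a stress/multithreaded web-service test tool like those above can be sketched as follows (a Python illustration, not the original tool; the service call is stubbed out):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def stress_test(call, requests=100, workers=10):
    """Fire `requests` concurrent calls against a web-service endpoint
    and tally successes, failures, and total wall-clock time."""
    results = {"ok": 0, "error": 0}
    start = time.time()

    def one_call(_):
        # Any exception from the service call counts as a failure.
        try:
            call()
            return "ok"
        except Exception:
            return "error"

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for outcome in pool.map(one_call, range(requests)):
            results[outcome] += 1
    results["seconds"] = time.time() - start
    return results
```

In practice `call` would issue an HTTP/SOAP request to the web service under test; the worker count and request volume are turned up until the service's throughput limits show.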
