
Senior Lead Consultant Resume


SUMMARY:

  • Seasoned techno-functional professional with 13+ years of overall experience in Product Delivery as a Quality Assurance Specialist, including 6 years of international experience (USA); currently working in a Senior QA Lead role.
  • Results-focused professional with increasing responsibility in software quality assurance and software testing, spanning both manual testing (Functional, Security, SOA, SAP) and automation/performance testing; a proven, award-recognized track record of delivering quality, strong strategic vision, and a demonstrated ability to inspire, mentor, and manage QA teams and their activities in geographically dispersed companies.
  • Extensive experience in building and setting up a QA Center of Excellence (CoE) and implementing best practices (coding standards, code reviews, unit testing, continuous testing, version control, merged QA/Dev efforts, management transparency, extensive knowledge sharing, regular feedback, etc.), strategies, and automation frameworks.
  • Experience working with business and development teams to define QA organization roadmaps and drive the deliverables of technology projects with high quality.
  • Strong experience leading Agile and Modern Agile implementations.
  • Proficient in Manual Testing, Automation Testing, Performance Testing, Security Testing, Mobile Testing and Accessibility Testing.
  • Delivered large-scale real-time distributed systems and built highly scalable systems in stressful and challenging environments by mentoring /motivating teams to achieve the business goals of the projects.
  • Drive process improvements in test design, test execution, and test reporting to deliver high-quality applications.
  • Demonstrated success in driving end-to-end development lifecycles, right from requirement collation, planning/scheduling, resource-task alignment, actual implementation and post-implementation support.
  • Skilled in the application of various development/testing methodologies, including Agile, Scrum, Waterfall, and Iterative, aimed at executing mission-critical programs/projects in a seamless manner.
  • Proven expertise in defining QA functions, setting-up all operations from scratch and laying down robust systems to support the same; driving operations with focus on qualitative project delivery.
  • Recognized for defining effective policies / procedures, critical process methodologies/improvements and systems, in line with organizational as well as client requirements.
  • Deft at collaborating with the senior management and functional managers to plan, execute and deliver projects in a timely manner, through a committed resource matrix.
  • Successfully executed several projects by utilizing onsite / offshore service delivery models which focus on risk management, enterprise release management, relocations, problem / incident / change management and conflict resolution.
  • Strong knowledge of functional, negative, regression, system integration (SIT), user acceptance (UAT), browser compatibility, and performance testing.
  • Creative in writing comprehensive test plans, test suites, test scripts, test scenarios, and test cases; wide knowledge of all phases of end-to-end quality assurance, test strategies, and other SQA documentation.
  • Hands-on experience in the complete testing cycle, including preparation of estimations and test artifacts such as the Test Plan, RTM (Requirements Traceability Matrix), bug reports, and test metrics documents.
  • Worked with QA tools and technologies such as Rational Functional Tester, SQL queries, Excel, Visio, PowerPoint, Test Director, Rational Performance Tester, Rational Test Manager, and Rational ClearCase; well versed with the test management tool HP Quality Center.
  • Used the SOA Test framework to integrate with Confidential WebSphere Business Services.
  • Experience in understanding application performance requirements, developing performance engineering strategies, wide-ranging exposure to complete performance testing using the protocols and usage of performance monitoring tools.
  • Well experienced in performance testing tools: HP Performance Center, LoadRunner, Iliad Stress, and Rational Performance Tester.
  • Worked on various LoadRunner protocols such as Web (HTTP/HTML), Web Services, Flex, Winsock, Citrix, SAP GUI/Web, .NET, Java RMI, and Java over HTTP.
  • Experience in creating Test approach, performance test scripts, work plan and designing the test Scenarios, test cases for Performance test requirements.
  • Worked on various monitoring tools like Perfmon, BMC patrol, Dynatrace, Wily Introscope, and Team Quest.
  • Remarkable experience in Performance Testing Methodologies and Best Practices.
  • Knowledge of performance tuning activities and of analyzing test results.
  • Ability to provide guidance in building Performance Test Environments.
  • Expertise in SAP automation and manual testing across modules such as PS, FICO, PSRM, MM, HR, CRM, SD, and Portal; SAP testing experience includes best practices for test documentation, test tools, and test approach.
  • Performed SOA / Web Service testing using SoapUI and have experience in XML Testing for Non-GUI applications.
  • Experience with various internet protocols such as TCP/IP, SNMP, and DHCP.
  • Expertise in Business Process Testing (BPT). Domain experience in Banking, Insurance, Accounting, Financial Management, HR, Health Care, e-Learning, HCM, and Retail, with the ability to learn new application domain knowledge in a short period of time.
  • Good experience in test methodologies including agile scrum methodology and Test-Driven Development.
  • Good knowledge of backend testing using SQL queries; participated in installation testing activities.
  • Wide experience in quality assurance, with exposure to CMM and ISO standards.
  • Extensive knowledge of mobile testing (iOS, Android).
  • Track record of developing a good rapport and healthy relationships with all stakeholders, while leading geographically distributed Manual, Automation and Performance test teams.
  • Sound communicator and good team player, with a strong work ethic, time management, and task prioritization skills, coupled with an aptitude for focusing on the minutest details.

TECHNICAL SKILLS:

Languages: C, C++, VB, Java

Internet Skills: JSP, ASP, HTML, DHTML, XML, VBScript, JavaScript.

Application Servers: WebSphere, WebLogic

Databases: MS Access, DB2 7.0, Oracle 8i/7.x, IMS, Developer 2000

RDBMS: SQL Server, DB2

OS: Windows NT/95/98/2000, UNIX (Solaris, Linux, AIX), MS DOS.

Servers: Confidential WebSphere 3.5.2/2.0, WebLogic 5.1/4.5, Java Web Server

Office Suites: MS-Word, MS-Excel & MS-Access.

Software testing tools: Bug Tracker, Jira Bug Tracker, CDMP and Track Plus

Testing Techniques: Verification and Validation, Black Box Testing, Functional testing, Standardization Testing

Application Testing: GUI Testing, Unit Functional Testing, User Interface Testing, Integration Testing and Regression Testing, Internationalization Testing, Interoperability Testing.

Automated Tools: LoadRunner, Performance Center, QA Run, QA Load, QA Director, QTP.

Version Control Tool: VSS, Rational Clear Case, CVS

XML Technologies: XML, XSD, DTD, WSDL

Monitoring Tools: AppDynamics, HP SiteScope, HP Diagnostics, Wily Introscope

Other Tools: JMeter, ClearCase, ClearQuest, HP ALM, Rational Rose, UML, Visio, Coradiant, TrueSight, SiteScope, Shunra

ERP: SAP R/3 Releases 4.6B, 4.6C, 4.7; ECC 6.0

Other SAP Tools: SAP Portal 7.0, SAP TAO 2.0, SAP Solution Manager

PROFESSIONAL EXPERIENCE:

Senior Lead Consultant

Confidential

Environment: C#, SQL Server, Jira, ALM, Selenium, SoapUI

Testing Type: Release management, Automation Testing, Functional & Regression Testing

Responsibilities:

  • Manage Software Product Delivery in terms of QA.
  • Design Quality Assurance Strategy, Automation Framework and Automation Strategy.
  • Manage risks, estimate work, allocate resources between projects, and track project progress.
  • Drive triage and status meetings with the team; meet with senior leadership, provide project health reports, and ensure quality deliverables on time.
  • Represent Delivery Team (offshore) as a SPOC (Single Point of Contact) for Onsite Delivery teams.
  • Collaborate with senior leadership, Scrum Masters, BAs, and all leads to ensure that project features (PCA, CCV, and BMPS) are delivered on time with utmost quality.
  • Worked with the Engineering Leader and Agile Managers to ensure effective resource utilization across all Agile teams; creatively engaged resources by sharing their skills across teams during resource crunches.
  • Worked with the QA leadership team to initiate release/sprint metrics that track the progress of deliverables in each sprint and minimize the risk of spilled-over stories and/or last-minute builds.
  • Streamlined the delivery metrics and presented them to the business team to show the quality effectiveness of all products.
  • Worked closely with support teams (PCA-SMT, CCV) to ensure issues were addressed and closed per defined SLAs, with zero critical issues open at the end of every release.
  • Participated in Modern Agile and Managing in a Modern Agile Environment trainings.
  • Initiated frequent feedback program, provided constructive feedback to all individuals every quarter and motivated the team to improve their performance.
  • Initiated “Consolidation of builds in UAT and Prod environments” to help Tech Ops team for deployment accuracy.
  • Ensured cross collaboration between Agile teams and enabled better hand-shake/coordination with Product teams in terms of sharing the feedback on the risks/needs for effective delivery.
  • Worked with every leader, enabled collaboration across teams, implemented knowledge sharing between teams, improved process for effective/high quality delivery.
  • Provided and solicited feedback for continuous development and improvement, facilitated the prerequisites for building a high-performance team, and ensured utmost client satisfaction.
  • Defined roles and responsibilities for every individual and provided guidance so each could play their role effectively.
  • Hired experienced individuals based on skill/capability requirements and provided adequate training and mentorship to develop high performers.
  • Collated the information gathered through sessions into a high-level test strategy document covering scope, functions, risks, and mitigations.
  • Assist with design of system integration, regression, and UAT testing solutions utilizing business/functional requirements
  • Evaluate test specifications, requirements, strategies and methodologies against the solution
  • Develop and establish quality assurance measures and testing standards for new applications, products, and/or enhancements to existing applications throughout the lifecycles
  • Creating and maintaining test plans for all the projects covering the test approach, test coverage, methods, roles and responsibilities, suspension criteria and resumption requirements, test deliverables, test schedules, test environments, risk and contingencies, and approvals.
  • Managing and implementing Confidential's test data management strategy to address the test data requirements for all programs and projects;
  • Customizing data management tools like data creation utility, data conditioning utility, file creation tool, and file validation tool for the data requirements; and
  • Overseeing the data requirements to ensure that they are in line with the test scenarios and the online test data repository is accurately built.
  • Responsible for creating, maintaining and executing manual tests, as well as providing constructive technical feedback throughout the development lifecycle
  • Perform System testing to ensure that the performance Parameters and other features of the System were not impacted by Code Changes included as a part of the enhancement.
  • Execute test plans and procedures using manual, semi-automated, and automated techniques, and system test harnesses
  • Execution of test cases and results gathering to a highest possible degree, and work closely with the engineering team to identify, characterize and resolve software issues
  • Test system reliability, scalability, load and throughput
  • Reviewing test execution progress daily for all testing projects using the application lifecycle management (ALM) and Jira tool;
  • Managing the defects in the ALM tool;
  • Conducting defect meetings; and Track quality assurance metrics, like defect densities and open defect counts
  • Publishing defect metrics to key stakeholders
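As a purely illustrative sketch of the defect metrics tracked above (defect densities and open defect counts), the calculations can be expressed in a few lines of Java. The class, record, and field names here are hypothetical, not taken from any tool mentioned in this resume:

```java
import java.util.List;

// Illustrative sketch: computing simple QA metrics of the kind tracked
// above (defect density, open defect count). All names are hypothetical.
public class DefectMetrics {

    public record Defect(String id, String severity, boolean open) {}

    // Defect density = total defects / code size (here, per KLOC).
    public static double defectDensity(List<Defect> defects, double kloc) {
        if (kloc <= 0) throw new IllegalArgumentException("kloc must be > 0");
        return defects.size() / kloc;
    }

    public static long openDefectCount(List<Defect> defects) {
        return defects.stream().filter(Defect::open).count();
    }

    public static void main(String[] args) {
        List<Defect> defects = List.of(
            new Defect("D-1", "critical", true),
            new Defect("D-2", "major", false),
            new Defect("D-3", "minor", true));
        System.out.println(defectDensity(defects, 12.0)); // 0.25
        System.out.println(openDefectCount(defects));     // 2
    }
}
```

In practice these figures would come from the ALM/Jira exports referenced above rather than hand-built lists.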

Quality Assurance

Confidential

Environment: C#, SQL Server, AppScan, RTC, MTM

Testing Type: Release management, Security Testing, Functional & Regression Testing

Responsibilities:

  • As QA for Application Testing, act as the liaison between Project Managers, Production Support, and the Application Testing Team. The role includes accountability for project team activities, including analysis of functional and technical requirements, determining resource needs, developing timelines for test plans, test cases, and test scripts, and directing assigned team members to meet or exceed testing expectations.
  • Assist in the evaluation and publication of test results and metrics to determine compliance with test plans and established business processes. Process optimizations and lessons learned are duly shared with top management.
  • Key responsibilities are defining testing strategies, leadership, reporting, managing risks, and process improvements.
  • The QA team supports production support teams in ensuring that production defects are duly fixed on time, tested in the System Test environment, validated by business teams in UAT, moved to PROD, and that hot-fixes are applied through a staging environment.
  • Manage, plan, schedule, and control software builds through different stages and environments, including testing and deploying software releases.
  • Own the review of the regression and impact analysis methodology to ensure that PROD failures do not recur.
  • Manage all aspects of the end to end release process
  • Update the service knowledge management system (KEDB, Knowledge Management).
  • Ensures coordination of build and test environments teams and release teams
  • Ensures teams follow the organization’s established policies and procedures
  • Provides management reports on release progress
  • Service release and deployment policy and planning
  • Deals with release package design, build and configuration
  • Deals with release package acceptance including business sign off
  • Deals with service roll out planning including method of deployment
  • Deals with release package testing to predefined acceptance criteria
  • Signs off the release package for implementation
  • Deals with communication, preparation and training
  • Audits hardware and software before and after the implementation of release package changes
  • Installs new or upgraded hardware
  • Deals with storage and traceability/auditability of controlled software in both centralized and distributed systems
  • Deals with release, distribution and the installation of packaged software
  • Establishes and reports outstanding known errors and workarounds
  • Coordinate release documentation and communications, including training and customer, service management and technical release notes
  • Plan the deployment in conjunction with change and knowledge management and service asset and configuration management
  • Provide IT service and business functional support from prior to final acceptance by service operations
  • Ensure that the service operations have an appropriate handover and training
  • Ensure delivery of appropriate support documentation
  • Deal with formal transition of the service to service operations and CSI
  • Coordinate Security AppScan runs for all builds and lead the effort. Own a module and initiate security scans using the AppScan tool; follow up, analyze, test, re-test, report issues, and re-verify them after fixes by SE. Maintain proper scan depth and breadth at regular intervals to keep scans high quality.
  • Define security testing framework and best practices for PD testing group.
  • Exploit security flaws and vulnerabilities with attack simulations on multiple systems and projects working against specific focused scopes of work.
  • Integrate the security testing activity into the development/testing lifecycle.
  • Perform web application Penetration testing (manual and Automated) and pinpoint the security issues and suggest countermeasures for security improvements.
  • Research and develop security testing tools, techniques, and process improvements.
  • Assists test management to ensure the delivery of security testing activities throughout the duration of a project until release.
  • Mentor junior engineers where necessary to build their skills and contribution levels in security testing.
  • Working knowledge of Security principles, techniques and technologies
  • Good understanding of network protocols, design and operations
  • Perform application and infrastructure penetration tests, as well as physical security review and social engineering tests for our global clients in an IOT environment
  • Well versed with Security Testing from Application to Server (E2E) in the IOT space
  • Review and define requirements for information security solutions
  • Perform security reviews of application designs, source code and deployments as required, covering all types of applications (web application, web services, thick client applications, SaaS)
  • Participate in Security Assessments of networks, systems and applications
  • Work on improvements for provided security services, including the continuous enhancement of existing methodology material and supporting assets
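One small, self-contained flavor of the security testing work described above is checking that an input validator rejects common injection-style payloads. The sketch below is purely illustrative (a toy allow-list validator and invented payloads); real penetration testing relies on dedicated tools such as AppScan rather than ad-hoc checks like this:

```java
import java.util.List;
import java.util.regex.Pattern;

// Illustrative sketch only: a toy allow-list validator exercised with
// common injection-style payloads, in the spirit of the security testing
// described above. The validator and payloads are invented for illustration.
public class InjectionSmokeTest {

    // Allow-list: letters, digits, spaces, and a few safe punctuation marks.
    private static final Pattern SAFE = Pattern.compile("[A-Za-z0-9 .,_-]*");

    public static boolean isSafeInput(String input) {
        return input != null && SAFE.matcher(input).matches();
    }

    public static void main(String[] args) {
        List<String> payloads = List.of(
            "' OR '1'='1",                  // classic SQL injection probe
            "<script>alert(1)</script>",    // reflected XSS probe
            "Robert Smith");                // benign input
        for (String p : payloads) {
            System.out.println(p + " -> safe=" + isSafeInput(p));
        }
    }
}
```

The design choice here (allow-listing known-good characters rather than block-listing known-bad ones) mirrors common secure-coding guidance.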

Offshore Test Coordinator

Confidential

Responsibilities:

  • Demonstrated the ability to work in complex environments including matrix organizations. Worked with external vendors, offshore vendors, outsourcing contracts, CoSourcing/InSourcing contracts and managed the performance of the contracts.
  • Solid analytical skills, exceptionally strong problem solving, decision making, and people management skills. Determined, self-starter, quick learner with strong interpersonal skills with proven ability to manage the positive and productive client relationship.
  • Coordinates the testing efforts of multiple projects executed from near-shore and offshore.
  • Responsible for coordination of both offshore and onsite team’s daily tasks planning, assignment and tracking.
  • Establishes standards and processes that Test Leads should follow to manage their areas of responsibilities.
  • Preparation of System Test Strategy, Test Plan and effort estimation.
  • Formation of Test Team, distribute the workloads, track the progress.
  • Schedule Preparation, Module Allocation, Reviews on Test Process, Client Interaction, Verify Status Reports
  • Write job descriptions, Review resumes, Interview candidates
  • Track project Development and team costs
  • Determine milestone deliverables, communicate changes to schedule, Determines impact of schedule changes on quality objectives, Coordinates schedule with dependent groups.
  • Map quality measurements to Customer satisfaction goal. Determine effective quality metrics, Establish quality criteria / release criteria.
  • Synchronize schedules, set expectations and establish service level agreements (SLA), Escalate issues with internal tools.
  • Responsible for Resource planning, project forecasts, project estimations etc.
  • Successfully built an Automation Performance team at offshore
  • Successfully built a near shore team at Confidential USA for the project DC7 and DC5.
  • Successfully built an offshore functional team at Hyderabad for the project CDE projects
  • Accountable and responsible for the client's performance testing: plan, coordinate, and execute DC-level performance testing; perform strategic and operational planning, budgeting, and staffing forecasts for performance testing and monitor these activities through the project lifecycle; staff and onboard the DC-level performance test team.
  • Coordinate and plan the DC level Performance test environment requirements
  • Provide hands-on coaching and leadership to the Performance test team and team lead
  • Ensure DC level Performance test Entrance and Exit Criteria are met
  • Review and provide input to Performance test plan
  • Serve as an escalation point for Performance test teams and team leads
  • Act as a liaison between performance testing and established management communities to drive consistency, share best practices, and identify performance-testing lessons learned.
  • Create and manage Performance test objectives, testing approach, and overall test schedule
  • Identify Performance testing improvements or areas of focus for current and future Phases
  • Communicate DC Performance test status and enable the sharing of key metrics
  • Proactively gather required information from Product Management, Business Analysts, Chief architects and Clients on requirements, functional specification clarification, and support team to prepare relevant documents.
  • Check/review the Test Case Document, System, Integration and User Acceptance test cases prepared by Test engineers
  • Conducting peer test reviews and Collection of Test and Defect Metrics
  • Assist in the design and development of the Test Automation Framework.
  • Designs, develops and executes reusable and maintainable automated scripts
  • Send status reports (Daily/weekly/Monthly) to Project manager/Client
  • Assist team members on clarifications and technical issues and mentor them.
  • Independently advise stakeholders of best security test strategy to use for each project.

Onsite Offshore Coordinator/Performance Tester

Confidential

Environment: Web services, Web Application and Store procedures

Responsibilities:

  • When onsite in Bloomington, IL, USA, maintained close coordination with the client Confidential Insurance and updated project teams in India on new requirements.
  • Managed Onshore/Offshore and vendor relations
  • Working with team members to capture lessons learned from performance test environment setup and shakedown and develop action items to help prevent these problems from happening in the future
  • Prepared Test Plan, High level test scenarios, Test data and Traceability matrix and Single Point of Contact for Performance Questions.
  • Responsible for Monitoring Early Performance Testing for efforts and for Performance Test Requirements, Performance Test Plan, Test Schedule, and Test Summary Report
  • Maintaining the Performance Test Team Responsibilities Document
  • Coordinates performance testing activity for client and outside efforts. This includes handling environment sharing and test scheduling to prevent component overlays on the systems that can result in significant downtime during performance testing
  • Support the production of the UAT Test Scripts by the functional consultants and/or other designated project personnel, ensuring that the definition of the tests provide comprehensive and effective coverage of all reasonable aspects of functionality.
  • Execute the test cases using sample source documents as inputs and ensure that the outcomes of the tests are recorded.
  • Validate that all test case input sources and test case output results are documented and can be audited.
  • Document any problems, and work with the project team to resolve problems identified during the tests.
  • Sign off on all test cases in accordance with the stated acceptance criteria by signing the completed test worksheets
  • Accept the results on behalf of the relevant user population.
  • Recognize any changes necessary to existing processes and take a lead role locally in ensuring that the changes are made and adequately communicated to other users.
  • Record timings of transactions for purposes of comparison with previous test phases and expected timings relating to legacy systems.
  • Conduct Volume/Performance/Capacity Testing as requested.
  • Attend de-briefing sessions with functional leads and/or other designated project personnel following UAT.
  • Managed Performance testing CoE using HP Performance Center/ LoadRunner including the following activities: directing/identifying volume/stress/performance testing solutions, scheduling, tracking, measuring of Performance testing and test results
  • Executed stress/load/rendezvous scenarios and regression testing for various operations and performed detailed test analysis reports and perform Disaster Recovery.
  • Created performance scenarios and scripts for doing multiple iterations.
  • Executed various load tests such as stress test, endurance test, throughput test, capacity test.
  • Collected information from Business analysts and software developers.
  • Developed and managed test data and the test environment; as well as document and track problem reports.
  • Review deliverables such as the Test Report and Test Analysis (Weekly Status Report, Work Breakdown Structure, Defect Trend, etc.).
  • Assisting analysts with estimation/scheduling, and risk prioritization for both Early and Formal Performance Testing
  • Assisting analysts with performance test results analysis and the determination of next steps (application tuning, additional testing, etc.)
  • Responsible for performance results communications and for providing analyst-specific mentoring to aid in the development of performance-focused skills.
  • Setting strategic direction for performance testing and automation
  • Participate or send representation to Client Project PPP sessions to ensure early performance testing is included in project planning and Responsible for Managing dependencies and risks.
  • Coordinate, review, and route for authorization all recommended performance changes to be implemented in production. This improves communication, ensures validation, and minimizes errors and/or last-minute changes going to production.
  • Regularly meet with Early Engagement Analyst and platform performance analysts to get statuses on projects and to stay engaged on upcoming projects/efforts
  • Meet with project and SRs early in project life cycle to proactively estimate Capacity Concerns, Identify Application Tuning Opportunities, request Application Distribution Diagrams, Request BVM’s (Business Volume Metrics) from Business analyst, request SVM’s (System Volume Metrics) from Systems Analyst, and all necessary information to develop complete understanding of application and environmental changes.
  • Educate business areas on the importance of identifying Application Performance Requirements and SLAs.
  • Responsible for Follow up meeting with Early Engagement Analyst, Platform representatives, PPEM, TAE’s, Scripters, Test Lead, and additional resources required to properly size Performance environment and identify effective performance tests and load for performance Testing.
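The performance-test result analysis described above often reduces to response-time percentiles and throughput over the raw samples from a load tool. The sketch below is an illustrative assumption, not output from any client engagement; the sample data and method names are invented:

```java
import java.util.Arrays;

// Illustrative sketch: analyzing raw response-time samples from a load
// test, the kind of post-run analysis described above. Data is invented.
public class PerfAnalysis {

    // Nearest-rank percentile over a sorted copy of the samples (millis).
    public static long percentile(long[] samplesMs, double pct) {
        long[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(pct / 100.0 * sorted.length);
        return sorted[Math.max(rank - 1, 0)];
    }

    // Throughput = completed transactions / test duration in seconds.
    public static double throughputPerSec(int transactions, double durationSec) {
        return transactions / durationSec;
    }

    public static void main(String[] args) {
        long[] samples = {120, 80, 200, 150, 90, 300, 110, 95, 130, 105};
        System.out.println("p90 = " + percentile(samples, 90) + " ms"); // p90 = 200 ms
        System.out.println("tps = " + throughputPerSec(samples.length, 2.0)); // tps = 5.0
    }
}
```

Tools such as LoadRunner and Performance Center compute these figures automatically; the point of the sketch is only to make the underlying arithmetic concrete.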

Onsite Coordinator

Confidential

Environment: Java, AS400, Mainframe, DB2, RAD, XML, QTP.

Testing Type: Automation (QTP) and SOA (SOAPUI)

Responsibilities:

  • When onsite in Dublin, OH, USA, maintained close coordination with the client Nationwide Insurance and updated project teams in India on new requirements.
  • Prepared Test Plan, High level test scenarios, Test data and Traceability matrix.
  • Created scripts for critical production defects in a short span of time, helping achieve customer satisfaction.
  • Executed the predefined QTP automation regression scripts for each release.
  • Providing ongoing support and working on projects such as eCCAp and Horizon.
  • Work closely with international clients, product management, project managers and engineering technical leads to assure that user interface achieves desired user experience and performance goals for the application.
  • Conduct peer test reviews and formal test reviews.
  • Provide regular resource performance feedback to Project Leads/ Project Managers.
  • Assist team members on clarifications and technical issues and mentor them.
  • Interact with Product Management, Business Analysts, and Clients on requirements and functional specification clarification.
  • Work with Project Managers and IT on Hardware and software requirements for test setup
  • Communicate and share with Clients QA standards and processes.
  • Get familiar with technology used in the project
  • Supports software releases and deployments, configurations, upgrades and migrations through problem isolation, verification, resolution and documentation.
  • Assist Project Managers and HR in recruitment process of technical resources.
  • Participates regularly in the recruitment of new hires through our interviewing process and involvement in recruiting events.
  • Performed retesting, Regression, Functionality and System testing.
  • Performed Batch Execution and Product Verification in the role of a team lead, executing batch processes in Natural, Adabas, and COBOL.
  • Executed Oracle stored procedures in TOAD, executed credit card processing in Java and XML, synchronized claims on EPIC SYNC, set up new environments to execute all processes, developed applications, and fixed application problems.
  • Testing & executing batches as part of GMACI Convergence project.
  • Good knowledge on the different systems like policy admin, cash, print and claims. Resolved issues quickly based on the knowledge on the migration and book roll systems and balancing policies that are found to be out of balance in the policy admin simulation cycle.
  • Proficient in setting up the new environment to execute the batch and other processes mentioned above specific to each environment.
  • Prepared Test Plans and Test Cases based on Requirements and General Design Documents. Planning the testing timelines of a release for the various SDLC testing phases including Integration testing coordination and execution.
  • Preparing Release level test plan and conducting Release level testing status meetings to track the testing progress and presented the Release level metrics on a regular basis.
  • Reduced the cost by 40% of System Analysis/Testing by proactively using the resources to do both Analysis and Testing of the modules at the release level.
  • Mentored and coordinated the teams, aided in effective team building, and took responsibility for completing tasks in the stipulated time for the various releases.
  • Analyzed Test Plans and Test Cases based on Requirements and General Design Documents
  • Proficient in using Quality Center and Prolite tools to track the issues and problems in the project.
  • Planning the analysis and testing timelines of a release for the various SDLC testing phases, including Integration testing coordination and execution.
  • Prepared/executed the Release level test plan for successful implementations in the scheduled releases.

Onsite Offshore Coordinator

Confidential

Environment: Java, J2EE, WAS, SQLServer, RAD, XML, Filenet, Quality Center, and Zoom.

Testing Type: SOA (SOAPUI)

Responsibilities:

  • During onsite visits to Dublin, OH, USA, coordinated closely with the client, Nationwide Insurance, and updated project teams in India on new requirements. Played a primary role in defect governance, reviewing defects to ensure the necessary information was provided and kept up to date.
  • Performed SOA / Web Service testing using soapUI.
  • Performed and validated business process, service level testing.
  • Created CTM test automation scripts with custom fixtures.
  • Responsible for test planning for Agile / Scrum and TDD.
  • Prepared Test Plan, High level test scenarios, Test scripts and Traceability matrix.
  • Mentored the team in authoring and executing test cases.
  • Tracked action items, review comments, defects, and clarifications to closure.
  • Identified scenarios for regression testing.
  • Reviewed test cases and conducted review and kickoff meetings.
  • Planned and executed retesting, regression, functionality, and system testing.
  • Lead analysis sessions, gather requirements and write specification and functional design documents for enhancements and customization; Analyze product impact
  • Present and defend product designs and architecture to clients
  • Performed retesting, Regression, Functionality and System testing.
  • Conducted Test Sign-Off Meeting.
  • Communicate activities/progress to project managers, business development, business analysts and clients
  • Develop implementation and test plans, build software acceptance criteria, coordinate and work with clients to oversee the acceptance and dissemination process
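The SOA / web service testing described above boils down to sending a SOAP request and asserting on fields in the response envelope, as SoapUI does. A minimal sketch of the validation side in Java, using only the JDK's XML parser (the envelope contents and the `SoapResponseChecker` class are illustrative examples, not the project's actual services):

```java
import java.io.StringReader;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class SoapResponseChecker {

    // Return the text of the first element with the given local name
    // (in any namespace) from a SOAP response, or null if it is absent.
    public static String extractValue(String soapXml, String localName) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        f.setNamespaceAware(true);
        Document doc = f.newDocumentBuilder()
                .parse(new InputSource(new StringReader(soapXml)));
        NodeList nodes = doc.getElementsByTagNameNS("*", localName);
        return nodes.getLength() > 0 ? nodes.item(0).getTextContent() : null;
    }

    public static void main(String[] args) throws Exception {
        // A made-up response envelope, standing in for a real service reply.
        String resp = "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<soap:Body><ns:GetPolicyResponse xmlns:ns=\"urn:example\">"
                + "<ns:status>ACTIVE</ns:status></ns:GetPolicyResponse></soap:Body></soap:Envelope>";
        System.out.println(extractValue(resp, "status"));   // expected: ACTIVE
    }
}
```

In SoapUI itself this check would be an XPath or contains assertion attached to the test step; the Java version shows the same idea programmatically.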

Confidential

Environment: Java, AS400, Mainframe, DB2, RAD, XML, QTP.

Testing Type: Manual Testing and SOA (SOAPUI)

Team Coordinator

Responsibilities:

  • Work with Business Analyst in translating business requirements into Functional Requirements Document and to Detailed Design Documents
  • During onsite visits to Dublin, OH, USA, coordinated closely with the client, Nationwide Insurance, and updated project teams in India on new requirements.
  • Worked on the "Class Application", the core functionality of the Claims solution application, meeting project deadlines and completing the work within the allotted time.
  • Prepared Application Information Documents (AIDs) from the knowledge gathered from the customer.
  • Prepared Test Plan, High level test scenarios, Test data and Traceability matrix.
  • Created scripts for critical production defects within short timeframes, helping to meet customer satisfaction goals.
  • Provided ongoing support and worked on smaller projects such as Satellite, FRC, and CUP.
  • Work closely with international clients, product management, project managers and engineering technical leads to assure that user interface achieves desired user experience and performance goals for the application.
  • Conducted peer test reviews.
  • Provide regular resource performance feedback to Project Leads/ Project Managers.
  • Assist team members on clarifications and technical issues and mentor them.
  • Interact with Product Management, Business Analysts, and Clients on requirements and functional specification clarification.
  • Work with Project Managers and IT on Hardware and software requirements for test setup
  • Communicate and share with Clients QA standards and processes.
  • Get familiar with technology used in the project
  • Supports software releases and deployments, configurations, upgrades and migrations through problem isolation, verification, resolution and documentation.
  • Assist Project Managers and HR in recruitment process of technical resources.
  • Participates regularly in the recruitment of new hires through our interviewing process and involvement in recruiting events.
  • Performed retesting, Regression, Functionality and System testing.

SAP Consultant/SAP Tester

Confidential

Environment: SAP 4.6B, CATT, QTP, Load runner, Test Director, SD, MM and FI/CO.

Testing Type: Manual Testing and Automation Testing

Responsibilities:

  • Review business requirement documents and derive test scenarios.
  • Creating Business Components, Test Sets in Test Plan and executing in Test Lab in QC.
  • Report defects in QC and assign them directly to the developer.
  • Tracking the defects in QC and maintaining those defects in Catapult Knowledge Management (KM) tool for future reference.
  • Involved in development activities for the HR module in Shift Planning and Rosters, maintaining daily work schedules for employees, and in IDoc monitoring.
  • Communicate with Team Leader, Business Analyst on regular basis.
  • Actively involved in Team meetings for Review of the project. Post Go Live support by addressing project specific calls.
  • Perform Functional Testing, Integration Testing and System Testing.
  • Extensive knowledge of SAP system acceptance testing.
  • Creating automated components using SAP TAO Inspection.
  • Creating manual business components using HP Quality Center.
  • Involved in automating test cases for the main business applications using HP Quality Center, SAP TAO and QTP.
  • Wrote Test Strategy and Test Plan documents for SAP CRM, WM, SCM, APO, and FI using the FDDs and BRDs per client requirements.
  • Testing of Client/Server environment in SAP FI, MM, SD modules using manual as well as automated tools.
  • Involved in documenting Test Plan, Test Cases and Test Procedure using Business requirements document and Functional requirements document of the system.
  • Conducted entry validation tests on all the online Applications - both positive and negative Tests.
  • Identification of Test data and validation of Test environment.
  • Involved in Integration testing (end-to-end Cross-Functional process chain).
  • Recorded various tests and stored them in the database for reusability.
  • Reporting of bugs using Test Director. Create test summary report.
  • Monitored the status of bugs until they were resolved; produced summary and detailed test reports.
  • Involved in creation of Virtual Users using Load Runner.
  • Involved in creation of low, medium, and Peak load scenarios.
  • Involved in generating scripts for the load tests as per the requirements.
  • Involved in creating the Scenarios for single and Multi-Vuser test.
  • Involved in data migration testing, i.e., re-used legacy data and checked its validity after migration.
  • Prepared training documents for end users of application.
  • Developed various Reports, Interfaces, Forms/Layouts using ABAP/4 in SD and MM modules.
  • Performed data migration from legacy systems to SAP R/3 using session and call Transaction techniques.
  • Test planning for the requirements given & the required functionality.
  • Created Test plans & Test Cases in Test Director.
  • Executed Test cases using Test Director.
  • Tested and transferred legacy data into SAP system.
  • Performed different types of testing: unit, integration, regression, data-driven, and validation testing.
  • Modified existing test scripts.
  • Tracking, analyzing, review and comparison of defects using Test Director.
  • Testing & verifying the custom development & system changes which the ABAP programmers developed.
  • The automated components created are stored in the ‘Business Component’ module of Quality Center (QC) and then arranged in order in the ‘Test Plan’ module of QC.
  • Automated manual test cases using QTP; HP QTP was used to run those automated test cases.
  • In addition to automating the BPTs, transaction-level components of SAP modules such as FICO, HR, MM, and PS were automated using SAP TAO; the list of t-codes to automate was provided by the functional consultants of those modules.
  • Over time, the Confidential GDC testing team automated 700+ transaction codes, building a large, robust regression testing library that covers the varied scenarios used in the project.
  • This library makes regression testing fast and smooth: a run of around 100 automated test cases can finish overnight.
  • Followed a data-driven methodology in automated testing: each automated test case has a corresponding data sheet, so a single test case can be run for a different scenario or a whole new set of data.
  • The main advantage of this data-driven methodology is that re-automating the test case is not required; changing the data in the data sheet lets the same test case cover a whole new scenario.
  • Scenario creation is another automation activity, in which sets of similar test cases are combined into a complete scenario; TAO is used to consolidate the scenario.
  • Individual test cases are also consolidated using TAO after confirmation.
  • Periodic maintenance of the automated test cases is a major activity for the testing team: streamlining the automated test library is of the utmost importance, as is identifying and updating test cases no longer relevant to the current release.
  • Analyzed failed test cases after each automated run to determine whether the failure stemmed from the script itself or from SAP.
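The data-driven methodology described above can be sketched in a few lines of Java: the test logic stays fixed, and each row of the data sheet supplies a new scenario. `DataDrivenRunner` and the row fields are illustrative stand-ins; in the project the data sheets were maintained per automated test case:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class DataDrivenRunner {

    // Run one fixed check against every data row; results are keyed by
    // the row's "id" column, mirroring one data-sheet row per scenario.
    public static Map<String, Boolean> run(List<Map<String, String>> rows,
                                           Predicate<Map<String, String>> check) {
        Map<String, Boolean> results = new LinkedHashMap<>();
        for (Map<String, String> row : rows) {
            results.put(row.get("id"), check.test(row));
        }
        return results;
    }

    public static void main(String[] args) {
        List<Map<String, String>> rows = new ArrayList<>();
        rows.add(Map.of("id", "TC01", "amount", "100"));
        rows.add(Map.of("id", "TC02", "amount", "-5"));

        // The test logic never changes; only the data sheet does.
        Map<String, Boolean> r = run(rows, row -> Integer.parseInt(row.get("amount")) > 0);
        System.out.println(r);   // {TC01=true, TC02=false}
    }
}
```

Adding a scenario means adding a row of data, not writing or re-automating a script, which is exactly the advantage the bullets above describe.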

Senior Test Engineer

Confidential, Norwood, MA

Environment: Java, J2EE, Websphere portal, EJB, Bus Tester, Oracle

Testing Type: Functional testing, compatibility testing, Interoperability testing

Responsibilities:

  • Verified business requirements and functional specifications and analyzed the business requirements and the site specifications.
  • Performed functional decomposition and designed test scenarios for different financial user groups based on system requirements, solution diagrams, help files, and screen mock ups
  • Conducted system functionality testing and developed and produced problem reports, suggestions, system test logs, and test incident reports.
  • Identified Software bugs and interacted with developers to resolve technical issues.
  • Recorded defects encountered while executing test cases.
  • Involved in preparing the checklist for Browser Compatibility Testing (Internet Explorer 6.0 & Netscape Navigator 8.0)
  • Ensured, as an integral aspect of the QA environment, that regression testing was undertaken after every bug fix.
  • Created request files using UltraEdit 3.2, executed them in SoapUI, collected the response files, and checked whether the status number matched the expected result.
  • Updating the test cases by cross checking with the response file.
  • Created critical business process scripts for the huge Web application and load tested using V-users for Performance evaluation
  • Compared the results of previous version with the latest execution to investigate the performance of the application
  • Analyzed test results, created bug reports, and documented test results using JIRA.
  • Captured screenshots using the Snagit / Quick Screen Capture tools and provided them to help identify and reproduce bugs.
  • Trained and educated the newly joined team members on functional and technical aspects.
  • Reached the project milestones within the scheduled time.
  • Preparation and execution of test scripts using JMeter and SOAP UI tool to perform Web Services testing.
  • Provide support to the development team in identifying real world use cases and appropriate workflows
  • Performs in-depth analysis to isolate points of failure in the application
  • Developed detailed Testing Strategy for the entire application and developed various test cases.
  • Initiated, coordinated and implemented the QA Process and Methodologies
  • Tested application in Citrix server using Citrix protocol and monitored Citrix server through PerfMon.
  • Wrote and executed SQL queries in SQL*Plus to retrieve data from database tables and perform back-end testing.
  • Inserted various check points like bitmap check points, text checkpoints and database check Points to check the functionality of the application
  • Tested forms in Documaker, manually triggering document components and setting breakpoints at multiple points and execution levels.
  • Located rating factors, table details, and configuration files in Documaker.
  • Used ClearQuest as the repository for reporting, tracking, and updating bugs.
  • Used ClearCase for version control; each version of the application was stored in ClearCase, where the necessary modifications, updates, and analysis were done.
  • Performed regression testing and generated additional scripts for each version.
  • Extensively used QC, for test planning, bug tracking and reporting.
  • Regular meetings and updates were made to the Management team of the ongoing QA process.
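Back-end testing of the kind described above compares rows returned by SQL queries against expected values, much like a database checkpoint in QTP. A simplified sketch in Java (`BackendCheckpoint` is a hypothetical helper; a real run would pull the actual rows via JDBC rather than hard-code them):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class BackendCheckpoint {

    // Compare expected vs. actual result rows (column -> value maps) in
    // order and return the indexes of every row that deviates.
    public static List<Integer> deviations(List<Map<String, String>> expected,
                                           List<Map<String, String>> actual) {
        List<Integer> diffs = new ArrayList<>();
        int n = Math.max(expected.size(), actual.size());
        for (int i = 0; i < n; i++) {
            Map<String, String> e = i < expected.size() ? expected.get(i) : null;
            Map<String, String> a = i < actual.size() ? actual.get(i) : null;
            if (!Objects.equals(e, a)) {
                diffs.add(i);
            }
        }
        return diffs;
    }

    public static void main(String[] args) {
        // Hypothetical rows; in practice "actual" comes from a query result.
        List<Map<String, String>> expected = List.of(Map.of("policy", "P1", "status", "ACTIVE"));
        List<Map<String, String>> actual = List.of(Map.of("policy", "P1", "status", "LAPSED"));
        System.out.println(deviations(expected, actual));   // [0]
    }
}
```

Reporting deviating row indexes, rather than a single pass/fail flag, makes it easier to attach the exact mismatch to a defect report.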

Performance Analyst

Confidential, Norwood, MA

Environment: Java, J2EE, Websphere portal, EJB, Bus Tester, Oracle

Testing Type: Load testing

Responsibilities:

  • Developed Load Runner test scripts according to test specifications/ requirements.
  • Identify and eliminate performance bottlenecks during the development Lifecycle.
  • Accurately produce regular project status reports to senior management to ensure on-time project launch.
  • Verified that new or upgraded applications meet specified performance requirements.
  • Identified long-running queries and optimized them to improve performance.
  • Monitored disk usage, CPU, and memory on the web and database servers to see how the servers were being loaded.
  • Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards.
  • Executed multi-user performance tests with LoadRunner, using online monitors and real-time output messages.
  • Develop and implement load and stress tests with LoadRunner, and present performance statistics to application teams, and provide recommendations on how and where performance can be improved
  • Monitor and administrate hardware capacity to ensure the necessary resources are available for all tests.
  • Assist in production testing and capacity certification reports.
  • Investigate and troubleshoot performance problems in a lab environment. This will also include analysis of performance problems in a production environment.
  • Interface with developers, project managers, and management in the development, execution and reporting of test automation results.
  • Used the scheduler to run scripts at designated times.
  • Build Script with Data Driven Methodology which applies the Business rules to validate the components displayed on the website.
  • Customized scripts for error detection and recovery
  • Responsible for writing Startup scripts and Compiled Module Functions for front and backend validation.
  • Wrote and executed SQL queries to validate test results.
  • Compare and analyze actual to expected results and report all deviations
  • Used Virtual User Generator to generate VuGen Scripts for web (J2EE) and Citrix.
  • Developed and deployed test automation scripts to do end to end performance testing using Load Runner.
  • Parameterize Company Name, Contact Name, Category Name, Category Description, Contact Title, Address City, Region, Postal Code Country, Phone and Fax to simulate concurrent virtual users.
  • Checked server replies and directed the flow to pass or fail criteria accordingly.
  • Collected SQL Server memory and memory utilization of web server and database server
  • Used SQL Profiler to trace queries consuming more than 500 ms (running the SQL Server trace file during the scenario run).
  • Participated in project review meetings & discussions
  • Involved in test cases Authoring and Execution.
  • Involved in reviewing the test cases and review meetings.
  • Involved in Retesting, Functionality, System testing.
  • To perform Regression testing after each modification of the application.
  • Executing Test cases and Gathering the Test execution results.
  • Troubleshot performance issues in packaged applications.
  • Worked in a shared environment, testing different applications.
  • Tracking Bugs using Rational Clear Quest.
  • Prepared Defect Summary Report, Test Summary Report.
  • Worked closely with programmers and Business Analysts to resolve technical and functional issues.
  • Implemented and maintained an effective automated test environment and the QA Lab.
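Conceptually, the parameterized virtual users above run the same transaction concurrently with different data per user and record each response time. A simplified Java sketch of that idea (`VirtualUserLoad` is illustrative; LoadRunner's VuGen scripts achieve this with parameter files and its own runtime):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

public class VirtualUserLoad {

    // Run one transaction per data row on a pool of "virtual users" and
    // return each row's elapsed wall-clock time in nanoseconds.
    public static Map<String, Long> run(List<String> userData,
                                        Function<String, String> transaction,
                                        int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        Map<String, Future<Long>> futures = new LinkedHashMap<>();
        for (String data : userData) {
            futures.put(data, pool.submit(() -> {
                long start = System.nanoTime();
                transaction.apply(data);          // the business step under load
                return System.nanoTime() - start;
            }));
        }
        Map<String, Long> timings = new LinkedHashMap<>();
        for (Map.Entry<String, Future<Long>> e : futures.entrySet()) {
            timings.put(e.getKey(), e.getValue().get());
        }
        pool.shutdown();
        return timings;
    }

    public static void main(String[] args) throws Exception {
        // Parameterized data, one value per virtual user (e.g. Company Name).
        List<String> companies = List.of("Acme", "Globex", "Initech");
        Map<String, Long> t = run(companies, name -> "processed " + name, 3);
        System.out.println(t.keySet());
    }
}
```

Varying the parameter per virtual user (company name, contact, address, and so on) prevents server-side caching from making the load unrealistically cheap, which is the point of the parameterization bullet above.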
