
Software Engineer Resume Profile


SUMMARY

  • A professionally qualified QA Test Manager with strong analytical skills and over 15 years of experience in the design, development and implementation of test plans, test strategies and test procedures in compliance with Capability Maturity Model (CMM) Key Process Areas (KPAs). Strong hands-on testing experience on applications developed in Java, J2EE, .Net, C/C++, VC++, Oracle, Data Warehousing, Oracle Apps, SAP, Cognos, ERP and Siebel CRM/ERM/PRM for Government, Federal, Transportation, Life Sciences/Healthcare, Insurance, Pharma and Public Sector clients, web-based eCommerce portals, client/server multi-tier applications, hospital management, finance/banking, mobile and telecom, on both Unix and Windows environments. Strong experience in all types of manual, automation and performance testing using HP test suites such as ALM/Quality Center 10/11.5, UFT 11.52, QuickTest Pro 11.0, LoadRunner 11.5, Performance Center 9.52/11.5, SiteScope 11.5, Diagnostics 9.0 and Rational testing tools.
  • Over 15 years of experience in all phases of the Software Development and Test Life Cycles (SDLC/STLC): requirement analysis, planning, design, implementation and execution.
  • Over 13 years of experience as a QA Manager/Lead and Test Coordinator; managed testing projects developed in Agile, Waterfall and V-model and led teams of 25-30 testers in an onsite-offshore model.
  • Over 5 years of experience in delivery management, working with client technology and business teams and directing the technical manager, project manager and GSP teams.
  • 3 years of experience with the PMO Quality Gates process and ORR; designed the PMO testing process templates such as Requirements, Test Plan and Test Summary reports, and submitted compliance reports.
  • 10 years of strong project management responsibilities including test strategy, planning, scheduling, work allocation, budgeting and estimation; tracked QA and integration task progress and generated functional/performance test matrices with geographically distributed onsite-offshore teams.
  • 8 years of strong experience as HP ALM/Quality Center admin and 7 years as HP Performance Center admin; maintained the testing activities of large enterprise, real-time, high-integrity software in complex corporate testing environments.
  • Over 9 years of strong experience in performance testing with LoadRunner and Performance Center.
  • 7 years of strong experience as a performance test architect and in performance engineering; designed, developed and executed performance test plans, analyzed raw results, prepared reports, compared results with baseline SLAs and suggested performance improvements.
  • 8 years of experience in business analysis and requirements management; interacted with business stakeholders to gather detailed requirements and acted as a liaison between business and development teams.
  • 5 years of experience testing mobile applications on iOS, Android, Symbian and Windows Mobile smartphones and tablets.
  • 5 years of experience in web services testing with SOAPUI Pro and SOAtest, and in Selenium automation.
  • Expertise in SOA architecture and working with web services (WSDL/XML/SOAP).
  • 6 years of strong experience in backend ETL testing with SQL; tested heavily data-oriented projects such as ETL, Data Warehouse and Business Intelligence applications.
  • Strong hands-on experience in the design, development and execution of test strategies, test plans, testing methodologies, test management and the defect management process.
  • Strong experience in the PMO process; involved in all phases of the project life cycle: initiation, planning, requirements, design, development, testing, deployment and closeout.
  • Expertise in meeting with client groups to determine performance requirements and SLA goals, determining test strategies based on requirements and architecture, and creating and implementing performance tests using LoadRunner.
  • Strong experience delivering testing activities to clients at both program and project level.
  • Created program-level Master Test Strategy and Test Plan documents, test status reports and defect status reports.
  • Managed relationships with multi-million-dollar contracted vendors and successfully tested and delivered projects.
  • Governed Global Service Providers' testing deliverables and provided the required leadership, oversight and coordination to 3rd-party vendors and geographically distributed teams.
  • Used monitoring tools such as HP SiteScope and HP Diagnostics and profiling tools such as Precise i3 and JProbe to analyze profiles at the web server, application server and database server levels, find bottlenecks, debug root causes and suggest solutions and improvements to management.
  • Good working knowledge of AIX, UNIX, Solaris, Red Hat Linux, AWS and cloud computing.

TECHNICAL SKILLS

O/S

Windows, Windows Server, iOS, Android, Windows Mobile, Symbian, z/OS, Solaris, UNIX, Linux, IBM AIX (Power 7) and HP-UX

Testing Tools

HP testing tools (ALM/QC 11.5, QuickTest Pro 11.5, Unified Functional Testing 11.5, LoadRunner 11.5/12.0, Performance Center 11.52, SiteScope 9.52, HP Diagnostics), JIRA 5.2, Silk Performer 2010, Bugzilla, MS TFS, Selenium WebDriver 2.0, JMeter, SOAPUI Pro 5, Blueprint, SAP Solution Manager, Rational Performance Tester, Rational TestManager, Rational DOORS, Rational ClearQuest, RequisitePro.

Testing Techniques / Methodologies

White box, black box, smoke, system, functional, integration, regression, end-to-end, performance, scalability, load and stress testing, SIT, UAT, installation testing, product testing, mobile testing with DeviceAnywhere, web services testing, HA/DR testing, 508 compliance testing, systems management and security; Agile Scrum, Waterfall, V-model.

Technologies

Java, J2EE, UML, SQL, PL/SQL, HTML, SAP, Siebel CRM/ERM, Oracle Business Intelligence (OBIEE), Product and Catalog Management, DHTML, XML, MS Visual Studio and HTML-Kit.

Web Servers

Oracle AS, WebLogic, JBOSS, Java Web Server, Tomcat.

RDBMS

Oracle 11g, DB2, Big Data (Hadoop), Oracle Database Lite 10g, MySQL, SQL Server, MS Access.

Tools

ADE, Rational Rose EE 2001, JDeveloper 10g, MS VSS 6.0, Windows Mobile, Symbian and Smartphone, Eclipse, LDAP, Precise i3, JProbe, MS Office Suite, MS Project, dashboard reporting tools, TOAD, Oracle SQL Developer.

Network Protocols

HTTP, SSL, FTP, SMTP, POP3

Scripting

TSL (Test Script Language), Perl, VBScript, shell, JavaScript, JSP.

Software Configuration

Build and configuration tools: PVCS, VSS 5.0/4.0.

Business Knowledge

Transportation, Life Sciences/Healthcare, HIPAA, eCommerce portals, state government capital projects, public sector, telecom and mobile technology, biotechnology, genetics, pharmaceutical, education, finance/banking, hospital management, brokerage/securities, client/server, n-tier web-based technology.

EXPERIENCE

Confidential

QA Test Manager / Lead Automation and Performance Test Engineer (Technical)

Responsibilities:

  • Performed audit event testing using SOAPUI Pro 5.0.
  • Validated WSDL and XML files in SOAPUI and verified SOAP requests and responses for various services, both through the proxy and directly, from Curam through the EXACT integration layer to third-party legacy systems such as FDHS, MMIS, MABS and CIS.
  • Led the web services testing activities using SOAPUI 4.6.4 for 270/271 testing, verifying that responses were returned for EVS (Employee Verification System) and CMS Medicaid/CHIP data from CMS systems.
  • Led the release testing activities for Medicaid, IA, UA, CHIP (Children's Health Insurance Program), 8001, 834, 270/271 and QHP.
  • Led the Section 508 compliance testing on MHBE, used the IE Web Accessibility tool and the Firefox WAVE plugin, tested the production site and presented a daily report to the State.
  • Led and managed validation of testing scenarios across releases, such as eligibility verification, income verification, coverage group, notice generation and extended Open Enrollment testing, in both external and internal systems for Native American and non-Native American applicants.
  • Identified 65 critical test scenarios for UAT and executed them whenever a new build arrived.
  • Prepared and executed a total of 400 test scenarios based on the possible combinations of applicant information and family income levels for Native American and non-Native American applicants.
  • Managed the system integration testing and UAT activities and led a team of 22 testers.
  • Reviewed and analyzed the user stories and functional requirements and created the project-scope Test Strategy and Test Plan aligned with the project releases and scope.
  • The MHBE portal was developed in Agile Scrum with 2-3 week sprint releases.
  • Conducted requirements sessions with business stakeholders to gather detailed requirements for new and enhanced system functionality and process automation.
  • Worked with the multiple vendor teams involved in the project, negotiating test deliverables, testing tasking and build deployments in environments such as System, UAT, Stage and Production.
  • Created and maintained the daily/weekly testing action items log, project report, decision log and risks/issues log.
  • Created the testing standards document, Test Strategy, test plans, test management reports and test summary reports, and provided training to the project teams.
  • Created the standard test and requirements management processes and procedures, trained project teams across the enterprise, provided consulting to the project teams and maintained the training materials in MS SharePoint.
  • Onsite QA delivery management for the various sprint releases: QA resource management, budget forecasting, resource recruiting and allocation, QMO, release management, risk and issue resolution, change management and control, requirements management process, vendor management, test data management, test environment management, performance test management, performance engineering and the Testing COE.
  • Tested requests and responses using SOAPUI.
  • Created mock responses using SOAPUI.
  • Worked with XML, WSDL, web services and client/server messaging in SOAPUI Pro.
  • Validated XML responses against the XSD format and verified that no SOAP faults were returned (see the validation sketch after this list).
  • Executed the 834 QHP test scenarios, seeding the data both manually and through SOAPUI.
  • Led an automation team of 4, designed the Selenium automation framework with Selenium WebDriver, executed the scripts in regression testing and modified them when functionality changed or scripts stopped working (see the WebDriver sketch after this list).
  • Selenium WebDriver 2.0 with Eclipse was used to run the automation scripts.
  • Performed backend testing by running SQL and PL/SQL queries against the Curam T2, T3 and pre-production DB2 databases.
  • Performed extensive backend testing for the IRS reporting release: wrote SQL and PL/SQL queries, sent the data in batch files from the Marketplace to CMS and from CMS to IRS, and verified that the proper response was returned from IRS to the Marketplace.
  • Involved in 834 and 8001 EDI transaction testing; seeded large volumes of data into the portal and verified various QHP and Medicaid scenarios.
  • Ran load tests with 5,000 users over various durations, collected transaction response times and compared them with the baseline SLA results.
  • Performed performance engineering and monitoring during the performance test runs, found bottlenecks at the web server, application server and database server tiers and reported them to the developers.
  • Tuned the application and database servers to improve the performance of the runs.
  • Analyzed the response times of the various transactions and prepared the final performance report with graphs comparing results against the baseline across transaction groups.
  • Prepared and analyzed the Oracle DB AWR reports, checked the load distribution across all the Oracle nodes, reviewed OS, network and monitoring data, and suggested recommendations to the database and release management teams.
  • Captured application server, web server and Oracle DB server performance metrics such as throughput, CPU utilization, disk I/O, memory utilization and processor time.
  • Analyzed and compared the results with the baseline results, prepared comparison graphs and supported the GO/NO-GO decision for the release.
  • Led the performance engineering activities: analyzed the completed test results and log files, diagnosed performance bottlenecks, conducted performance tuning and led the testing of performance benchmarks.
  • Identified and listed the key transactions from the entire set of load tests and set SLAs for those key transactions.
  • Prepared the performance test summary report and presented it to the PMO and DHMH managers.
  • Led the performance testing activities for MHBE, performed the performance testing with Rational Performance Tester and submitted the performance test summary report to the PMO and DHMH managers.
  • Prepared the consolidated test summary report by collecting the other testing team members' data and sent it to the client managers.
  • Presented the daily test status and defect review meetings with the development team, PMO manager and DHMH managers.
  • Interacted daily with multiple teams and client personnel and conducted meetings.
  • Developed and presented the daily test execution summary report, defect summary report and testing execution progress in the scrum meetings.
  • Trained newly joined testers through knowledge transfer (KT) sessions and tracked their progress.
  • Managed the development of test strategies, methods, test plans and test data, combining methods and phases as needed.
  • Owned program-level test deliverables and maintained relationships with the PMO to leverage in delivery.
  • Built strategic and tactical client relationships through delivery excellence and leveraged them with the sales representatives.
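For illustration of the XSD/SOAP-fault check referenced above: the project work was done in SOAPUI Pro, which drives these checks through its own UI and Groovy assertions, so the Java sketch below is only a minimal, hedged equivalent. The file names and the assumption that the captured response can be validated as a whole are placeholders, not details from the project.

```java
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;
import org.w3c.dom.Document;

// Minimal sketch: validate a captured response against an XSD and confirm
// that no SOAP Fault element is present. File names are hypothetical.
public class ResponseCheck {
    public static void main(String[] args) throws Exception {
        File xsd = new File("response.xsd");   // hypothetical schema file
        File xml = new File("response.xml");   // hypothetical captured response

        // 1. Schema validation: throws an exception if the XML violates the XSD.
        SchemaFactory sf = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        sf.newSchema(xsd).newValidator().validate(new StreamSource(xml));

        // 2. Fault check: a SOAP 1.1 fault appears as a Fault element in the
        //    SOAP envelope namespace.
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder().parse(xml);
        int faults = doc.getElementsByTagNameNS(
                "http://schemas.xmlsoap.org/soap/envelope/", "Fault").getLength();

        System.out.println(faults == 0 ? "PASS: schema valid, no SOAP fault"
                                       : "FAIL: SOAP fault present");
    }
}
```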
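And as a sketch of the kind of regression script run under the Selenium WebDriver framework referenced above: the actual MHBE framework classes and page objects are not shown in this profile, so the URL, locators and title assertion below are purely illustrative assumptions.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

// Illustrative regression check; the URL and locators are hypothetical
// placeholders, not the actual application under test.
public class LoginRegressionTest {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("https://example.org/portal/login");           // hypothetical URL
            driver.findElement(By.id("username")).sendKeys("qa_user"); // hypothetical locators
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("loginButton")).click();

            // Simple verification step; a real framework would use TestNG/JUnit asserts.
            if (!driver.getTitle().contains("Dashboard")) {
                throw new AssertionError("Login regression failed: unexpected page title");
            }
            System.out.println("PASS: login regression");
        } finally {
            driver.quit();
        }
    }
}
```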

Environment: Java, J2EE, HP ALM Quality Center 11.52, LoadRunner 11.50, QTP 11.5, SiteScope 9.52, JIRA, Selenium WebDriver, Eclipse, SOAPUI Pro, IBM Curam, Connecture, IBM WebSphere, DB2, Informatica, ETL, Autosys, Datamart, Firefox, Firebug, ANT, FirePath, JMeter, 508 compliance testing with the Web Accessibility Tool and WAVE, Oracle 11g, XML, HTML, Hadoop, Oracle SQL Developer, Solaris, Red Hat Linux, Unix, Windows 7.

Confidential

PMO QA Test Delivery Manager / Lead Tools Specialist / Performance Test Lead

Responsibilities:

  • Budget forecasting, resource allocation to tasks, risk and issue resolution, change management and control, Quality Management Operations, release engineering, test data management, test environment management and the Testing Center of Excellence.
  • Defined and designed the testing strategy, requirements management process, test procedures and processes across the enterprise at program and project level.
  • Handled multiple roles as PMO QA Test Delivery Manager and ALM/Quality Center (QC) Lead, responsible for the maintenance and administration of all Amtrak projects: 60 domains and 300 projects in HP QC.
  • Managed and delivered various projects at program level: created, reviewed and executed the project plans, test plans, test deliverables and test timelines and produced the test summary reports for SAM, HCM eForms, EIMP4, eWMS, Ariba CoSource, MM, Amtrak.com RESNG, eTicketing, the SharePoint 2010 upgrade, the EPM 2010 upgrade, Conductor Mobile Devices, Windows Mobile 8 for the eCommerce mobile apps project, Transportation Foundation projects and various small projects developed in Agile, Agile/Waterfall hybrid, V-model and Waterfall releases at Amtrak.
  • Managed a team of 8-10 QA analysts, testers and test leads using automated and manual testing methods to ensure high-quality software products were delivered.
  • Led the QA Agile process by generating the sprint backlogs to track resource utilization and estimated hours and sending reports to upper management to buy hours for resources.
  • Tracked the sprint backlogs daily and led the standup meetings to track progress.
  • Led a team of 5 QA engineers and managed end-to-end QA tasks in a Scrum agile environment.
  • Represented the QA team in sprint planning, grooming, retrospective and demo meetings.
  • Attended the Agile ceremonies: sprint planning, the daily stand-up, sprint retrospective and sprint review.
  • Worked closely with the business stakeholders and the technical team.
  • Attended Scrum meetings with the internal team and 3rd-party teams.
  • Reviewed the test artifacts and published the test reports.
  • Led the defect triage process within the Scrum team.
  • Led the work from the sprint until the product was installed.
  • Initiated, participated in and facilitated the project work groups, internal and external peer reviews, status meetings and process improvement.
  • Experience in building, managing transitions, organizational change management and running a Testing Center of Excellence.
  • Managed the development of test strategies, methods, test plans and test data, combining methods and phases as needed.
  • Owned program-level test deliverables and maintained relationships with the PMO to leverage in delivery.
  • Built strategic and tactical client relationships through delivery excellence and leveraged them with the sales representatives.
  • Worked with delivery team personnel at all levels to drive improved client service; assisted with sending, processing and responding to client satisfaction surveys.
  • Provided feedback on personnel performance and recommended assignments based on capabilities.
  • Maintained oversight of the Gross Profit Project Report (GPPR) and project financial performance for all work performed for the client; verified reported project information to determine revenue recognition.
  • Understood the Employee Advantage Program, prepared performance review information and assisted with the performance appraisal process as needed.
  • Provided delivery oversight and support to Project Launch Reviews (PLRs) and ensured Project Progress Reviews (PPRs) occurred.
  • Tested the ETL Informatica mappings and other ETL processes for Data Warehouse testing; involved in extensive data validation using SQL queries and back-end testing (see the JDBC sketch after this list).
  • Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
  • Assisted with tracking of contracts, service agreements and service schedules.
  • Monitored delivery performance at the client and made recommendations for improvement.
  • Reviewed all SOWs, proposals and change orders for the client.
  • Handled system, functional, regression, end-to-end, UAT, automation and performance testing.
  • Prepared the Performance Validation Plan for various projects at Amtrak, designed the performance test scripts in VuGen and executed the scenarios in the Controller and Performance Center.
  • Designed the LoadRunner scenarios for 1-hour, 8-hour and 12-hour runs and analyzed the results.
  • Reserved the timeslots for the projects, kicked off the performance testing scenarios of various projects in HP Performance Center and LoadRunner, generated and analyzed the reports and documented the results.
  • Developed the performance test scripts with LoadRunner VuGen, designed the scenarios in HP Performance Center, ran the tests and reported the performance of the application along with the bottlenecks.
  • Performed performance engineering and monitoring on the applications while running the performance tests, found the bottlenecks and tuned the applications.
  • Developed the functional automation test scripts with QTP and Unified Functional Testing.
  • Developed the functional automation testing framework with QTP and Unified Functional Testing.
  • Maintained and designed the testing process templates for the Requirements, Test Plan and Test Summary Report documents and drove the use of these templates at the corporate level across all Amtrak projects.
  • Participated in functional requirements, design specification and technical documentation reviews.
  • Created the policies and procedures relating to software testing and releases and improved the process.
  • Created the HP QC project template and created new custom process documents per the test leads' and test managers' requirements.
  • Provided hands-on training to individuals and groups to familiarize them with the HP QC, ALM QC, HP Performance Center, QTP, SiteScope and Unified Functional Testing tools.
  • Provided administrative, consulting and technical support to IT and business staff in the configuration and use of QC for requirements, tests and defect management.
  • Handled network management of the HP tools and protocol systems testing and maintenance operations.
  • Maintained the license servers for automation tools such as QTP 11.0 and Unified Functional Testing 11.5.
  • Prepared the requirements and test case upload templates and shared them with the testing teams across the enterprise.
  • Consulted with IT and business users to troubleshoot and resolve all project testing-related issues.
  • Prepared the tool maintenance reports, license usage reports, user usage reports, dashboard reports, workflow reports and training materials for ALM QC, HP PC, QTP and UFT.
  • Reviewed the PMO Quality Gates documents, such as requirements, test plans and test summary reports, and provided feedback to the PMO officers and project managers.
  • Prepared the compliance checklist for all projects coming to the quality gates, escalated issues to management and made sure the right quality application was delivered.
  • Interfaced with HP Support, logged issues and followed up until they were resolved for HP QC and HP PC.
  • Managed, reviewed and executed the test plans and test scenarios for mobile apps developed for Amtrak.com, eTicketing, Conductor Mobile Devices and Windows Mobile 8 for Amtrak.com.
  • Prepared and reviewed the UAT test plan and the development of scenarios and scripts to ensure that sufficient testing was planned for UAT.
  • Attended daily defect status meetings and presented progress updates on the status of the various phases of testing.
  • Escalated and analyzed defects with the project team, test leads, SMEs and release management to resolve issues.
  • Contributed to the ongoing development of the UAT process and its measures of success.
  • Coordinated with business stakeholders, reported the testing activities, obtained sign-off at project and program level and escalated issues.
  • Post-install activities included monitoring of the project and identification of defects.
  • Defined test execution processes and created test strategies, defect filing and escalation procedures.
  • Helped define and implement QA best practices, including risk-based testing.
  • Scheduled, directed and monitored the work effort of testing resources; helped identify and mitigate risks.
  • Trained and mentored resources on projects.
  • Facilitated review meetings.
  • Frequently communicated progress and impediments to the project team.
  • Developed good working relationships with team members and stakeholders across the enterprise to coordinate efforts.
  • Worked closely with the application data center teams (Unix SA, Oracle DBA) to make sure the testing tools were fully functional and available to the enterprise IT staff 24x7.
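The ETL and Data Warehouse bullets above describe validating loads with SQL against source and target tables. Purely as a minimal sketch of that style of check via JDBC, the code below compares row counts between a staging table and a warehouse table; the connection URLs, credentials and table names are hypothetical placeholders, not the actual project schemas.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of a source-vs-target row-count reconciliation, a common first-pass
// ETL validation. Connection details and table names are hypothetical, and the
// appropriate JDBC drivers are assumed to be on the classpath.
public class EtlRowCountCheck {
    private static long count(Connection con, String table) throws Exception {
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        try (Connection src = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//src-host:1521/SRC", "qa_user", "secret");
             Connection tgt = DriverManager.getConnection(
                     "jdbc:db2://tgt-host:50000/TGT", "qa_user", "secret")) {

            long srcRows = count(src, "STG_ORDERS"); // hypothetical staging table
            long tgtRows = count(tgt, "DW_ORDERS");  // hypothetical warehouse table

            System.out.printf("source=%d target=%d -> %s%n", srcRows, tgtRows,
                    srcRows == tgtRows ? "PASS" : "FAIL");
        }
    }
}
```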

Environment: HP Quality Center 10, ALM QC 11.00, Performance Center 9.52/11.5, QTP 11.5, Unified Functional Testing (UFT) 11.52, SiteScope 9.52, LoadRunner 12.00, Diagnostics, SOAPUI Pro, Parasoft SOAtest, JMeter, ANT, J2EE, SaaS, ASP, .Net, DB2, z/TPF, z/OS, Arrow, WebSphere, Foglight, BAC, Siebel CRM, Oracle 11g, Oracle Business Intelligence (OBIEE), ETL, Informatica, Data Warehouse, JBoss, Blueprint Requirements Center 2010, Network Management, HTTP, SSL, FTP, SMTP, POP3, MS Office 2010, MS Project 2010, MS SharePoint 2010, EPM 2010, MS TFS, SQL Developer, Solaris, Linux, Unix, Windows XP/7, Windows 2003/2008 Servers.

Confidential

QA Manager / Performance Test Lead

Responsibilities:

  • Handled multiple roles as QA Manager and individual lead for the 14.x and 15.x releases; reviewed the PVP with the stakeholders, updated it with their comments, obtained sign-off and baselined it.
  • Responsible for overall delivery management, working with client technology and business teams and directing the technical manager, project manager and the entire team.
  • Responsible for identifying and mitigating delivery risks and delivering projects within budget and timelines, ensuring individual, project and organization goals were met.
  • Responsible for associate growth and concerns, mentoring and providing both technical and domain guidance to the team.
  • Planned and managed the technical execution of the project and the day-to-day delivery activities.
  • Managed the implementation of L1 production support processes for mainframe applications and the monitoring of SQL jobs for distributed applications.
  • Project dashboard review, SLA reporting and review, status reporting, risk mitigation, and issue tracking and resolution with the client.
  • Project scheduling, monitoring, tracking and progress reporting.
  • Tracked project status every week, identified corrective and preventive actions, managed dependencies, served on the change control board and facilitated sign-offs.
  • Tracked post-delivery defects and communicated them to the project team.
  • Tested and delivered EMS-implemented OSS services such as real-time charging solutions for TTSL and Aircel.
  • Responsible for the execution and delivery of the test plans for end-to-end testing in the QA test lab per the PVP.
  • Worked with peer teams and stakeholders to write test cases and review test plans for various releases and obtained sign-off on the test cases and test plans; prepared the test beds in the test lab and maintained them across releases.
  • Designed, developed and executed the performance tests of the Real-Time Charging System (RCS): prepaid, postpaid, SMS, MMS, WAP and data on AIN, WIN and CIN service solutions using CvAS, SPACE, SIMDB, SIP, SS7, HTTP, SSL, FTP, SMTP and POP3 protocols.
  • Executed the test cases for call flows such as SMS, Charge, Diameter, DATA, GPRS, GSM, CDMA and VoIP calls on the AIN, CIN and WIN platforms on IBM JS21/JS22 and Power 7 clusters.
  • Mentored and led a small team of 3-5 QA testers, assigned the work and tracked the daily progress.
  • Ran the different voice calls such as SMS, MMS, Charge, WAP, Diameter and DATA through the NGS simulator via the SS7 protocol front-end linked board (M3UA/ATM), collected the call-track data and RRLT, debugged the call flows and fixed the flows.
  • Tested and executed MMS calls such as VoIP, multimedia conferencing and video streaming, using DOCSIS (Data Over Cable Service Interface Specifications).
  • Created 100K subscribers, set up the performance load on the application and collected performance counters with svmon, iostat, vmstat, netstat, mpstat and NGB (see the monitoring sketch after this list).
  • Developed and executed the performance load scripts for the Custom Presentation Layer (CPL), set up the load scenarios over different time intervals, captured the response times and the middle-tier (WebLogic) and DB statistics, tuned the parameters to their best values and reported the bottlenecks.
  • Tested the CPL application with different subscriber IDs, pulled the subscribers' call history, ran the performance load and captured the server response times for retrieving the data.
  • Tested the synchronization capacity of SPACE and the performance capacity of SLEE, SLDB, SIMDB and IDB, found the bottlenecks, engineered the root cause of the problems and reported to the dev team.
  • Executed the test cases on the converged and non-converged platforms and documented the results.
  • Designed and developed the performance automation framework using the NGS simulator for the RCS and CvAS applications.
  • Generated the AWR reports from the Oracle DB after the performance traffic completed, analyzed the report data and escalated the performance bottlenecks to the database team.
  • Developed automation and diagnostic tools using shell and Perl scripting for different components of the ISCP applications.
  • Tested web services access and inbound and outbound calls such as CORBA, SOAP and SimpleXML flows, and ran performance traffic against them.
  • Coordinated and maintained the installation of the right versions of third-party software, with fresh installs and patches of different releases of the OSP applications (Java, WebLogic, VisiBroker, etc.) on AIX, UNIX and Linux.
  • Designed, built and executed performance tests for both browser and non-browser applications.
  • Managed daily testing activities across client testing resources, 3rd parties and remote teams.
  • Coordinated with team members to support technical issues, debugging and issue resolution.
  • Documented the test results, CRs and MRs for each release and prepared lessons-learned documentation.
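On the counter collection mentioned above (svmon, iostat, vmstat, etc.): those are normally gathered with OS scripts, so the Java sketch below is only a hedged illustration of a collection harness that samples `vmstat` at a fixed interval and appends the output to a log for later correlation with response times. The interval, sample count and log path are assumptions, not values from the project.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

// Sketch: periodically run `vmstat` during a performance test and append each
// sample to a log file. Requires Java 9+ for InputStream.readAllBytes().
public class VmstatSampler {
    public static void main(String[] args) throws IOException, InterruptedException {
        Path log = Paths.get("vmstat_samples.log"); // hypothetical log location
        int samples = 60;                           // e.g. one hour of samples
        long intervalMs = 60_000L;                  // one-minute interval (assumed)

        for (int i = 0; i < samples; i++) {
            Process p = new ProcessBuilder("vmstat", "1", "1")
                    .redirectErrorStream(true)
                    .start();
            byte[] out = p.getInputStream().readAllBytes();
            p.waitFor();
            Files.write(log, out, StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            Thread.sleep(intervalMs);
        }
    }
}
```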

Environment: Java, J2EE, C, C++, Oracle 10g, Web services, WebLogic 11g, SecureCRT, Silk Performer 2010, HP Quality Center 10, LoadRunner 10, HP Performance Center 10, IMS, EMS, VoIP, SIP, IPv4, IPv6, SS7, M3UA, ATM, NGS SIM, DOCSIS, Ethernet, SaaS, HTTP, FTP, SSL, SMTP, POP3, CORBA, XML (inbound/outbound), JProbe, Precise i3, Windows Mobile, Perl/Tcl scripting, SQL, IBM AIX JS21/JS22 machines, IBM AIX Power 7 machines, Network Management, Routers, Firewalls, GNU AWK, Solaris, Linux, Unix, Windows.

Confidential

QA Manager / Performance Test Lead

Responsibilities:

  • Managed a team of 8-10 QA analysts, testers and test leads using automated and manual testing methods to ensure high-quality software products were delivered.
  • Worked with different departments including Cash Management, General Accounting and IT (ESB, PeopleSoft, Kofax and OnBase developers) to understand their business process flows and create test scenarios.
  • Created the test plan and test scenarios based on the process flows, the requirements document and the transformation logic in the data mapping document.
  • Performed test execution, logged defects and verified the resolved defects.
  • Helped business users with User Acceptance Testing (UAT).
  • Ensured that the application would function according to the defined business requirements.
  • Assisted the project manager with project estimates; participated in project status meetings to keep track of issues and the project schedule.

Functional PeopleSoft Financials and Data Conversion Testing:

  • Performed functional testing for various PeopleSoft Accounts Payable (AP) features, including payment processing, which enables users to create payments for vouchers that have been entered, approved and scheduled for payment; this includes the pay cycle manager (pay cycle job), pospay and reconciliation functionality.
  • Performed payment processing for different payment methods such as Check and ACH.
  • Processed payments for vendors on Levy and Backup Withholding.
  • Processed financial transactions such as payment request transactions, cancel transactions (stop/void) and recovery transactions.
  • Performed data conversion testing for payments by executing SQL queries and matching the fields between the source ETL staging tables and the target PeopleSoft tables; validated record counts, data and transformations.
  • Performed functional testing for various PeopleSoft Accounts Receivable (AR) features, including processing of recovery receipts, miscellaneous receipts and returned checks.

Middle-tier ESB testing (Check Processing Unit):

  • Performed functional and integration testing for a middle-tier ESB layer that serves as a data assembly layer (a.k.a. the CPU) between source systems and the check production/storage utilities.
  • The XML produced by the CPU was captured and validated manually for the expected tags and data; the XML was also mocked up for certain scenarios (see the XPath sketch after this list).
  • The XML was then fed to xPression and the printout was tested per the requirements.
  • Retrieved and validated the EOB/check images stored in OnBase that were sent from xPression.
  • Responsible for getting the checks scanned through the Gunther machine and validating that each envelope had documents with the correct sequence numbers.
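The tag-and-data checks above were done manually against the mapping document. Purely for illustration, the sketch below shows the same kind of check automated with XPath; the element names, file name and expected value are hypothetical, since the real CPU message layout is not given here.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

// Sketch: assert that an expected tag exists in the captured XML and that its
// value matches the mapping document. Tag names and values are placeholders.
public class CpuXmlCheck {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        Document doc = dbf.newDocumentBuilder().parse(new File("cpu_output.xml"));

        XPath xpath = XPathFactory.newInstance().newXPath();
        String checkAmount = xpath.evaluate("//Payment/CheckAmount", doc); // hypothetical path

        if (checkAmount == null || checkAmount.isEmpty()) {
            System.out.println("FAIL: <CheckAmount> tag missing");
        } else if (!checkAmount.equals("125.00")) { // expected value from the mapping doc (assumed)
            System.out.println("FAIL: unexpected CheckAmount " + checkAmount);
        } else {
            System.out.println("PASS: CheckAmount present and correct");
        }
    }
}
```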

Kofax / OnBase testing:

  • Used Kofax to perform functional testing of the processes necessary to scan and image documents generated from PeopleSoft AR and the mainframe.
  • Validated incoming and outgoing mail barcodes using a hand scanner.
  • Performed Kofax validation to test the batch class (recognition server, document separator), validation keywords and the export connector to OnBase.
  • Performed OnBase validation to test the OnBase keywords, Auto-Name string and workflow and to verify the stored images.

Integration testing:

  • Performed integration testing from the Guidewire products (ClaimCenter, BillingCenter, Underwriting) to PeopleSoft AP and from PeopleSoft AP to the CPU.
  • Processed different types of payments through the different pay cycles and validated the output XML from PeopleSoft AP against the mapping document.
  • The XML produced by PeopleSoft AP was captured and validated manually for the expected tags and data; the XML was also mocked up for certain scenarios.

Environment/Tools: Java, .NET, WAS/WPS, xPression, Solimar/Rubika, HP/Mercury Quality Center, Kofax Capture 9.0, SBL language, OnBase, DbVisualizer, SQL, Oracle, DB2, LoadRunner, PeopleCode, SQL Server 2008, XML Notepad, MS Office, MS Project.

Confidential

Performance Test QA Lead

Responsibilities:

Led the testing activities of the Reserve Mortgage and Portfolio Trading testing team of 10-12. Reviewed the application architecture and performance requirements and prepared the Test Strategy and Test Plan.

Developed and reviewed the LoadRunner scripts for the Conduit, Confine, WLTS and Deal Management modules.

  • Designed various types of load test scenarios such as 1-hour runs, 8-hour stress tests, endurance tests and end-to-end focus tests.
  • Designed and developed the load test scripts with VuGen, executed the test scenarios in the LoadRunner Controller and analyzed the results with the LoadRunner Analysis tool.
  • Executed different load test scenarios such as 1-hour, 1-hour end-to-end, 8-hour with imports and 8-hour with no imports.
  • Ran SQL and PL/SQL queries in TOAD to check the imports and messages for processing.
  • Designed and executed the endurance test in the staging environment and tested the application server for memory leaks and other problems over a prolonged period of time.
  • Analyzed the response times of the various transactions and prepared the final performance report with graphs comparing results against the baseline across transaction groups (see the SLA-comparison sketch after this list).
  • Prepared and analyzed the Oracle DB AWR reports, checked the load distribution across all the Oracle nodes, reviewed OS, network and monitoring data, and suggested recommendations to the database and release management teams.
  • Captured the application server, web server and Oracle DB server performance metrics such as throughput, CPU utilization, disk I/O, memory utilization and processor time.
  • Analyzed and compared the results with the baseline results, prepared comparison graphs and supported the GO/NO-GO decision for the release.
  • Led the performance engineering activities: analyzed the completed test results and log files, diagnosed performance bottlenecks, conducted performance tuning and led the testing of performance benchmarks.
  • Identified and listed the key transactions from the entire set of load tests and set SLAs for those key transactions.
  • Worked closely with the portal SMEs, dev team, release management, DB teams and infrastructure experts to ensure the performance of multiple internal and customer-facing systems.
  • Conducted performance tuning at the network, system and hardware levels.
  • Experienced in measuring and analyzing key performance indicators such as CPU, network throughput and memory footprint.
  • Distributed the workload and the debugging of failing transactions across the team and communicated the root cause of problems to the development team.
  • Designed the performance test scenarios in HP Performance Center 9.0, distributed the load, set the SLAs and ran a daily 1-hour test in the staging and production environments to check for application spikes and find network problems.
  • Extensively used GUI, bitmap and text checkpoints for checking dollar and euro icons, and inserted synchronization points wherever required using QTP.
  • Conducted data-driven testing using the QTP Data Driven Wizard to re-test with multiple data sets from Excel sheets.
  • Designed and developed the automation framework to set the constraints, standards and procedures to be followed in providing support for automated testing.
  • Performed back-end database verification manually and with QuickTest Professional, automatically verifying the database against the values entered during automated testing by inserting database checkpoints.
  • Checked the database validations with SQL statements using database checkpoints.
  • Performed web tests using QuickTest Professional to check page contents, web objects, frames and tables.
  • Conducted multi-browser testing on Internet Explorer (IE), Firefox and Netscape Navigator (NN).
  • Coordinated testing activities across client and 3rd-party resources and remote testing teams.
  • Worked with various checkpoints (standard, database, text and bitmap checkpoints) to perform regression testing.
  • Interacted with developers to resolve defects and performed regression testing with QTP to verify that bug fixes did not break other parts of the system.
  • Applied descriptive programming in QTP for the CRM module and generated the reports in Cognos.
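On the baseline/SLA comparison referenced above: the LoadRunner Analysis tool produces these comparisons natively, so the sketch below only illustrates the underlying logic. It reads a CSV of transaction names, measured response times and SLA thresholds (a file format assumed purely for this example) and flags breaches before a GO/NO-GO call.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

// Sketch: flag transactions whose measured response time exceeds the SLA.
// Expected CSV format (assumed): transaction,measuredSeconds,slaSeconds
public class SlaComparison {
    public static void main(String[] args) throws IOException {
        List<String> lines = Files.readAllLines(Paths.get("transaction_times.csv"));
        boolean allPass = true;

        for (String line : lines) {
            String[] f = line.split(",");
            if (f.length < 3) continue;            // skip malformed rows
            double measured, sla;
            try {
                measured = Double.parseDouble(f[1].trim());
                sla = Double.parseDouble(f[2].trim());
            } catch (NumberFormatException e) {
                continue;                          // skip the header row
            }
            if (measured > sla) {
                allPass = false;
                System.out.printf("BREACH: %s %.2fs > SLA %.2fs%n", f[0].trim(), measured, sla);
            }
        }
        System.out.println(allPass ? "GO: all transactions within SLA"
                                   : "NO-GO: SLA breaches found");
    }
}
```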

Environment: J2EE, Ajax, HTML, WebLogic, Oracle 10g, TOAD 8.0, HTTP, FTP, TCP/IP, Cognos, Siebel, IBM MQ, QuickTest Pro 9.0, LoadRunner 9.0, HP Quality Center, HP Performance Center 9.0, Diagnostics 9.0, SiteScope 9.0, Informatica, SQL Server, Linux, HP-UX, Windows 2003 Server, Windows XP.

Confidential

QA Manager

Responsibilities:

  • Managed and led the development of the performance test scripts for releases 10 and 11 of the PA-NEDSS application.
  • Identified the scope of performance testing, prepared the performance test strategy and test plan and set up the project timelines.
  • Developed the load test scripts with LoadRunner and executed different load testing scenarios in the LoadRunner Controller.
  • Executed the end-to-end load testing scenarios against the test build and prepared the load test scripts; prepared the performance test data for the load tests and led the performance testing team of 3.
  • Prepared and executed the performance test scenarios, such as blended scenarios and scalability scenarios, with different user loads.
  • Conducted scalability testing and hardware scalability testing and captured performance metrics such as response time, throughput, processor time, disk I/O, CPU utilization, memory utilization and network problems.
  • Captured the performance counters of the web server, application server and database server, including statistics such as bytes sent and received, and checked the load on the servers.
  • Prepared the performance reports, compared the results with the previous version of the NEDSS application and suggested performance improvements such as modifying the complex stored procedures and SQL queries.
  • Designed and developed the test cases for the integration of the Siebel Product and Catalog Management application to develop, manage and deliver dynamic product catalogs across all channels.
  • Upgraded LoadRunner 7.8 to 8.1, set up remote agents in different locations and ensured that load was generated on those machines during load testing.
  • The project was developed using an Agile methodology and tested in an Agile incremental model.
  • Deployed the test builds into the testing environments by pulling the code from VSS and configuring the builds; identified potential performance problems early in each build by conducting a smoke test in the NEDSS testing environment.
  • Tested that the PA-NEDSS application integrated with the other ETL legacy applications for proper report generation.
  • Identified the test scenarios for automation, estimated the timelines for automation, automated the application with QuickTest Pro and reviewed the automation scripts developed by the team members.
  • Worked as administrator for TestDirector maintenance and coordinated testing-related activities with other team members.
  • Led the system testing activities, led a team of 8-10 members, assigned the tasks, collected metrics on a daily basis, prepared the system testing report from TestDirector and presented it to the client.
  • Executed the performance test scenarios manually, prepared the daily test report and escalated defect issues to the development leads in the daily defect review meeting.
  • As QC admin, customized projects by adding and deleting fields in any module, added and deleted users in QC and assigned permissions in the project.
  • Added new projects for different releases and maintained the projects and the corresponding users at remote locations.
  • Assigned users to user groups such as QC Admin, Project Manager, QA Tester, Viewer and Developer and managed permissions at all levels.
  • Installed, tested and deployed patches and add-ins on the server machines and maintained the licenses.
  • Provided support to the QC users and developed advanced QC reports.
  • Performed project/workflow customization and user administration in QC and managed 3rd-party integrations of QC.

Environment: Microsoft .NET Framework 2.0, BizTalk, Oracle 9i, PL/SQL, TOAD 8.0, AppWorx, QuickTest Pro, LoadRunner 7.8, Quality Center Admin, Site Admin, Siebel Product and Catalog Management, SAP, Cognos, Agile methodology, Windows 2003 Server, Windows XP.

Confidential

QA Lead

Responsibilities:

  • Presented the daily test reports to stakeholders and the client and escalated defect issues in the daily defect review meeting.
  • Conducted a quick smoke test after the builds were deployed into the testing environments before proceeding with test case execution.
  • Reviewed the test scenarios, test cases and defects logged by testers in Quality Center.
  • Worked closely with the solution architect teams to create or update the plans and documents that effectively articulate multiple views of the project changes, and maintained the testing traceability matrices.
  • Worked closely with business, project managers and development teams to understand the requirements and changes in the application and to create a maintainable and extendable test plan and testing timelines.
  • Responsible for designing and executing functional testing, web application testing, load testing, user acceptance testing, integration testing, black box and white box testing, error processing and navigation testing.
  • Led the testing activities of the claims processing system team of 4 and developed and maintained the system integration test strategy, test plan and test delivery schedules.
  • Worked closely with the product planning and development team and prepared the testing estimations for new change requests.
  • Worked with internal and external development teams to support the testing activities and measure project-level integration solution development, performing quality testing reviews and measuring adherence to integration guidelines and procedures.
  • Managed daily testing activities across client testing resources, 3rd parties and remote teams.
  • Worked with DB2 and VSAM files to pull the test data and set up data-driven tests using QTP.
  • Facets was used to support entry and update of enrollment information and stored all the enrollment and provider data.
  • Participated extensively in the claims testing phases: online screens, resolution screens, adjustment/void and mass adjustment functionality and the adjudication process.
  • Tested the entire claims processing life cycle (claim submission, claim processing and claim payment) through Facets Claims Administration, using SOAPUI (see the SOAP request sketch after this list).
  • Tested the report generation from information supplied by third-party ETL legacy systems.
  • Used the claims system to test and track the insurance status of patients, automated it with auto-adjudication and integrated it with other payer systems through trading-partner systems to enhance the processes.
  • Wrote SQL queries for backend testing.
  • Performed database integrity testing by executing SQL statements against the Oracle database.
  • Served as administrator for Quality Center 9.0 and helped client personnel resolve HP QC admin issues such as repairing corrupted projects in HP QC.
  • Reviewed and assigned the defects logged by testers in Quality Center.
  • Prepared the daily test reports from Quality Center and presented them to project management and stakeholders.
  • Led and presented the daily test reports to stakeholders and the client, escalated defect issues in the daily defect review meeting and followed up on the defect status until each defect was closed.
  • Identified the risks and resource time estimations for the project and reported them to the PM.
  • Developed automated test scripts and defined scenarios and procedures to test the performance and functionality of the application.
  • Conducted end-to-end testing of the system manually as well as automating the functionality using QTP.
  • Used LoadRunner to analyze the response times of the business transactions under different user loads.
  • Developed reports and graphs to present the stress test results to management.
  • Participated in the status meetings to report issues; communicated with developers through all phases of testing to prioritize bug resolution.
  • Developed the load testing scripts in VuGen, wrote the custom functions and executed different load test scenarios in the Controller.
  • Created the performance test data, analyzed the response times and graphs and prepared the performance reports.
  • Captured performance metrics such as response time, CPU utilization, disk I/O, memory utilization and processor time.
  • Used Precise i3 to analyze different profiles at the application, database and storage server levels, find the bottlenecks and root causes and suggest improvements.
  • Participated in User Acceptance Testing (UAT) and coordinated with clients to maintain better quality standards.
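On the SOAPUI-driven claims testing referenced above: SOAPUI handles the SOAP plumbing itself, so the sketch below is only a hedged illustration of the underlying call, posting a SOAP envelope over HTTP and checking the response for a Fault. The endpoint, SOAPAction and payload are hypothetical, not the actual Facets service interface.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch: post a SOAP 1.1 envelope and check the raw response for a Fault.
// Endpoint, action and body are placeholders for illustration only.
public class ClaimSoapCheck {
    public static void main(String[] args) throws IOException {
        String envelope =
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<soapenv:Body><SubmitClaim><ClaimId>12345</ClaimId></SubmitClaim></soapenv:Body>"
          + "</soapenv:Envelope>";

        HttpURLConnection con = (HttpURLConnection)
                new URL("https://example.org/claims/service").openConnection(); // hypothetical endpoint
        con.setDoOutput(true);
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        con.setRequestProperty("SOAPAction", "SubmitClaim");                    // hypothetical action

        try (OutputStream os = con.getOutputStream()) {
            os.write(envelope.getBytes(StandardCharsets.UTF_8));
        }

        // SOAP faults typically come back with an HTTP 500, so read the error
        // stream in that case. Requires Java 9+ for readAllBytes().
        int code = con.getResponseCode();
        InputStream in = code < 400 ? con.getInputStream() : con.getErrorStream();
        String response = in == null ? "" : new String(in.readAllBytes(), StandardCharsets.UTF_8);

        boolean fault = response.contains(":Fault>") || response.contains("<Fault>");
        System.out.println(fault ? "FAIL: SOAP fault in response" : "PASS: no SOAP fault");
    }
}
```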

Environment: VSAM, Informatica, TriZetto Facets, Claims Processing, MVS, JCL, DB2, CICS, OS/390, Java, JSP, JavaScript, HIPAA, Ab Initio, ETL, SOAPUI, XML, MQ, TOAD 8.0, Oracle, QuickTest Pro 9.0, LoadRunner 9.0, HP Quality Center, Precise i3, SQL Server, TFS, Windows 2003 Server, Windows XP.

Confidential

QA Manager (Onsite-Offshore Model)

Responsibilities:

  • Worked as a Test Lead and Test Manager for a team of 22; managed the allocation of modules to team members and the time estimation for the preparation and execution of test cases.
  • Prepared the test strategies and test plan and set the timelines for test execution.
  • The project was developed using Agile Scrum and tested in an Agile incremental model.
  • Prepared the test scenarios and test cases for the Punch-Out module and executed them in TestDirector.
  • Worked closely with the client development team onsite to cover new requirements.
  • Coordinated with other test team members to help them understand the SRSs, FRs and test case writing, and escalated issues to the product planning functional leads.
  • Tested the Order Management and Product Catalog Management modules integrated with Siebel, enabling the development, management and delivery of dynamic product catalogs across B2B channels.
  • Identified the test scenarios for automation, estimated the time for automation, automated the portal application with QuickTest Pro and reviewed the automation scripts.
  • Coordinated all the B2B testing-related activities with geographically distributed teams and escalated issues to the client.
  • Prepared the performance test strategy and performance test plan, developed the load test scripts with LoadRunner and executed different load testing scenarios in the LoadRunner Controller.
  • Captured performance metrics such as response time, CPU utilization, disk I/O, memory utilization and processor time.
  • Wrote Excel macros with SQL queries to pull the test reports from TestDirector and prepared the daily test reports.
  • Maintained and administered TestDirector, prepared the test sets in the test lab and pulled the test reports from TD.
  • Migrated TestDirector to Quality Center 9.0 and maintained all the test cases and test plans in Quality Center.
  • Monitored the work, tracked the testing status on a daily basis and escalated issues directly to the client.
  • Conducted the daily defect review meetings and test team status meetings with managers and the client.
  • Provided hands-on support for pre-UAT and UAT testing.

Environment: J2EE, JSP, JDK 1.5, Comergent, Ariba, SciQuest, QuickTest Pro 8.2, LoadRunner 8.1, TestDirector/Quality Center, Verity, webMethods, BEA WebLogic, CyberSource, SAP, Siebel CRM, cXML, EDI, iDoc, File Builder, EBS, Business Objects, Agile methodology, Oracle 9i, Linux, HP-UX, Windows XP.

Confidential

Product Test Lead

Responsibilities:

  • Led a team of 6; responsible for work allocation to the team, daily progress tracking, monitoring and managing the team.
  • Prepared the product test strategy and product test plan for the Telecom Mobile Oracle Lite 10g product.
  • Prepared the test case documents for the Mobile Server, mobile devices (Windows Mobile, Symbian and Smartphone), Web-to-Go, Win32, Branch Office and the migration of Oracle Lite 10g.
  • Installed the Oracle 9i/10g database and the Oracle 9i Application Server, installed Telecom Mobile Oracle Lite 10g on top of them in different environments (Windows, Solaris, Unix, Linux), executed the Mobile Server and mobile client test cases and reported the defects.
  • Installed the various Telecom Mobile clients (Web-to-Go, Win32, Branch Office) on Windows Mobile, Symbian, WinCE and Palm and checked for installation problems and data synchronization issues.
  • Tested mobile applications such as music, ringtones, email, browser, games and video streaming, upgraded mobile devices with new firmware and reviewed device logs from a troubleshooting perspective.
  • Identified the possible test cases for automation and automated the Mobile Server and the mobile clients (Web-to-Go, Win32 and Branch Office) using WinRunner.
  • Prepared WinRunner scripts for the mobile devices, Web-to-Go client, Win32 client and Branch Office.
  • Conducted sanity checks: whenever a build arrived, installed it, ran the WinRunner regression scripts and reported the defects.
  • Thought critically about how consumers would interact with the handset being tested and turned those thoughts into test cases, bugs, action items and avenues of discussion.
  • Reviewed test cases and problem reports to find areas where test cases were missing; worked on automation of test cases using scripts and other tools.
  • Installed older versions of Oracle Lite on different operating systems (Windows 97, 98, 2000, Solaris, UNIX and Linux), migrated to the latest Oracle 10g version and identified and reported the issues.
  • Conducted frequent status check meetings with testers, prepared the daily and weekly test status reports and presented them to the product director.

Environment: J2EE, JSP, JDBC 2.0, JDK 1.4, UML, XML, UIX Framework, ADE for code management, JDeveloper 10g, Oracle 10g/9i, Oracle 9iAS, WinRunner 7.0, BUG, Windows Mobile, Symbian, Smartphone, Perl, Sun Solaris, Windows XP/2000.

Confidential

Sr. Software Engineer / Sr. QA Testing Consultant

Job Duties / Technical Skills: Test Lead / Sr. QA Lead Consultant

  • Prepared the test plans and test scenarios for system testing, functional testing, regression testing, end-to-end scenario testing and user acceptance testing.
  • Prepared the load testing scripts with LoadRunner and the automation test scripts with WinRunner.
  • Wrote the automation scripts with Rational Functional Tester and maintained them in Rational TestManager.
  • Worked as a Senior QA Engineer onsite for one year in a process-driven global organization (Nokia, Helsinki, Finland).
  • Primarily involved in writing test cases and executing them in TestDirector.
  • Involved in writing and reviewing the test plan and test estimation.
  • Maintained all the test cases and automation scripts in TestDirector, ran the tests from TD and reported the defects.
  • Conducted baseline testing for a single user and presented the report to management, then proceeded to performance testing with different user-load scenarios, prepared the performance report, presented it to the client and suggested improvements.
  • Reported performance degradation between releases by running baseline tests and comparing against the existing release.
  • Able to locate disk I/O bottlenecks, network-limited problems, CPU utilization issues and database bottlenecks.

Confidential

Software QA Lead

Job Duties:

  • Wrote code in Java and developed the E-Gurkha core server components.
  • Prepared and reviewed the test plan and performed system testing, functional testing, regression testing, end-to-end scenario testing and user acceptance testing.
  • Prepared the test scenarios and test cases and executed them.
  • Involved in identifying the scenarios for automation testing and automated the product using WinRunner.

Confidential

Software Engineer QA Lead

Job Duties:

  • Test plan preparation, system testing, functional testing, regression testing, end-to-end scenario testing and user acceptance testing.
  • Prepared the test scenarios and test cases and executed them.
  • Involved in the review of the test plan and test estimation.
  • Maintained all the test scripts in Excel sheets, executed them and reported the defects in Bugzilla.
