QA Manager Resume
NC
SUMMARY:
- 15+ years of experience as a Software QA Manager/Lead/Test Engineer. Strong experience in managing QA/test teams, identifying and implementing test policies and processes, and executing test plans and scripts. Strong domain knowledge in Health Care, Pharmaceuticals, Retail, Financial, Higher Education, Insurance, Banking, and Media/Publishing applications in Client Server, Distributed, ERP, BPM, E-Commerce and Unix environments.
- Strong background in forming test teams, resource planning, defining test policies and processes, coaching testers, advocating quality and implementing those processes.
- Strong background in the V-model, Agile, Waterfall and RUP processes/methodologies.
- Strong background in QA release management and maintenance.
- Strong background in SharePoint implementation projects.
- Security Testing experience within various organizations
- Extensive experience with PEGA testing, the PEGA Test Framework and the PEGA Project Management Framework.
- Strong knowledge of the Capability Maturity Model (CMMI), the Testing Maturity Model, the Automated Software Testing Maturity Model, and the Automated Testing Life-Cycle Methodology.
- Experience in analyzing Business Requirements, Functional and Technical Specifications.
- Hands-on experience using Mercury Interactive QuickTest Pro, LoadRunner, WinRunner, TestDirector, and Quality Center.
- Solid experience in Developing, Reviewing, Executing and Maintaining Test Plans, Test Cases and Test Procedures.
- Expertise in Quality Assurance and in manual and automated testing of Client/Server, Windows-based and Web applications.
- Strong understanding of QA Principles, QA Process, Use Cases and Software Development Life Cycle.
- Extensively involved in Black Box and Full Life Cycle Testing.
- Experience in tracking test coverage and providing Metrics (RVM) to management.
- Performed Smoke, Integration, Functionality, Performance, Load, Regression, and System Tests.
- Experience in documenting Defects with high level of detail, accuracy, and informative recreation steps.
- Experience in Software Verification, Validation and Testing Methodology.
- Experience in Oracle, SQL Server, Teradata, MS-Access databases and UNIX, Windows operating systems.
- Strong knowledge of Security/Penetration Testing
- Sound knowledge of Mobile Apps Testing
- Sound knowledge of Salesforce (SFDC) Testing
- Extensively involved in PEGA (BPM) Testing
- Experience in Mainframe Testing
- Exceptionally well organized, with strong communication skills and a willingness to work hard to earn the employer’s confidence and achieve its objectives.
TECHNICAL SKILLS:
TESTING TOOLS: Mercury Interactive QuickTest Pro 6.5/8.0.1/11, Page Detailer, WAT, PHAT, Broadband, CKAT, Charting, Quote, StockTicker, Symbol, BrowserCam, Microsoft Web Expression, Scanner, Total Jam, WireShark, Sniffers, Snort, Netcat, WinRunner 4.0/5.0/6.0/7.0/7.5, TestDirector 5.0/6.0/7.0/7.6/8.0, LoadRunner 6.5/7.5/11, Quality Center/ALM, Jira, PEGA 4/5/6, PEGA Test Framework, PEGA PM Framework
Operating Systems: MS-DOS, UNIX, Sun Solaris, Windows 9x, Windows NT/2000/XP
Languages: C, C++, PERL, JAVA, SQL, UNIX Shell, TSL, VBScript, JavaScript
Mainframe Technologies: COBOL II, OVS, SPUFI, QMF, IMS DB/DC, CICS, JCL, Hiperstation
ERP: Oracle 11i Applications, GL, AR, PO, Order Management
ETL Tools: Informatica, ETL
RDBMS: Oracle 9i, MS SQL Server, Informix, Teradata and DB2
Web Technologies: HTML, IIS, ASP, JavaScript, Java Servlets, JSP, FrontPage, WebLogic, AJAX, JSON, and SharePoint
Packages: MS-Office, MS-Access, and MS-Excel
GUI Tools: Oracle Developer 2000 and Visual Basic 4.0/5.0
Test Strategies: Integration, System, Regression, User Acceptance (UAT), Smoke, and Black Box testing
Defect Tracking Tools: Remedy, Bugzilla, TestDirector, Quality Center, Test Track Pro, ClearQuest, Test Manager
PROFESSIONAL EXPERIENCE:
Confidential, NC
QA Manager
Responsibilities:
- Identified QA strategy for testing GBE Pega project
- Hired, coached and trained conventional testers to transition into mobile and Pega testing
- Built a mobile test team from the ground up
- Identified atomic test cases, a risk-based testing approach and appropriate methodologies using Direct Capture of Objectives (DCO)
- Executed Pega test cases and gathered results
- Worked with SWIFT messaging services
- Tested Pega work items, cases, workbaskets, flows, clipboards etc.
- Created the test schedule and estimates
- Evaluated automation test tools for mobile testing
- Implemented the Perfecto Mobile framework for test automation
- Created, executed and managed mobile automation using Perfecto Mobile
- Captured test metrics and defect trends and communicated them to stakeholders
- Designed, architected and built the entire mobile testing framework from the ground up
- Led mobile testing of iPhone, iPad, Android and Windows Mobile applications
- Built the automated test infrastructure and testing lab for mobile application testing
- Coordinated mobile testing work with application architects, developers, designers and project/product managers
- Established the overall mobile application testing policy, process, standards, procedures, tools and metrics
- Led and mentored use case and test case creation for mobile application testing
- Led the mobile application security testing effort
- Managed mobile application testing on 2G, 3G, 3GS and 4G/LTE technologies
- Performed mobile test task estimation, test execution, regression testing and functional system testing
- Tested mobile handsets including iPhone, Nokia, Samsung Galaxy, Motorola, Sony Ericsson, G1, Windows Mobile, HTC and LG devices
- Created, executed and managed automated testing using Appium, DeviceAnywhere and Robotium (see the example after this section)
- Defined and communicated an accurate mobile test strategy and provided qualitative and quantitative status/defect reports in a timely fashion
- Performed web service testing of the middleware using SoapUI
- Created, managed and executed web service tests using an XML editor, WSDL files and automation scripts
- Managed mobile application security testing using various pen testing standards, practices and tools
- Managed the QA dashboard
- Gathered and communicated QA metrics and measurements
- Performed pen testing of web applications and mobile security
- Performed vulnerability assessments and dynamic web scanning
Environment: PEGA PRPC (6.1), DCO, PAL, Preflight, Java, JSP, Quality Center (QC), SWIFT, GSM, iOS, Android, XML, MQ, SOAP, Unix, Oracle, SharePoint, Jira, Perfecto, DeviceAnywhere, Appium, Metasploit, Nmap, AppScan, Pega, SFDC
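Illustrative only: a minimal sketch of the kind of Appium-based mobile check referenced in the Appium bullet above, written in Java against a locally running Appium server. The app path, element IDs and expected screen are hypothetical placeholders, not details from the actual project.

```java
import io.appium.java_client.MobileElement;
import io.appium.java_client.android.AndroidDriver;
import org.openqa.selenium.By;
import org.openqa.selenium.remote.DesiredCapabilities;
import java.net.URL;

public class LoginSmokeTest {
    public static void main(String[] args) throws Exception {
        // Capabilities for a local Android emulator; all values are illustrative.
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("platformName", "Android");
        caps.setCapability("deviceName", "emulator-5554");
        caps.setCapability("app", "/path/to/app-under-test.apk"); // hypothetical path

        // Connect to a locally running Appium server.
        AndroidDriver<MobileElement> driver =
                new AndroidDriver<>(new URL("http://127.0.0.1:4723/wd/hub"), caps);
        try {
            // Hypothetical element IDs for a login screen.
            driver.findElement(By.id("username")).sendKeys("qa_user");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("loginButton")).click();

            // Simple smoke check: the home screen title is displayed after login.
            boolean loggedIn = driver.findElement(By.id("homeTitle")).isDisplayed();
            System.out.println(loggedIn ? "PASS: login smoke test" : "FAIL: home screen not shown");
        } finally {
            driver.quit();
        }
    }
}
```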
Confidential, Phoenix, AZ
QA Manager, Lead
Responsibilities:
- Identified QA test process for BPM/Pega testing
- Synchronized the BPM/Pega test strategy with the organization's existing testing practices
- Managed 15 testers, both onshore and offshore
- Coached and trained testers as necessary
- Managed the Pega Test Framework and Pega Test Suite
- Hired/coached and mentored testing resources for BPM Pega testing
- Organized daily Test status meetings as well as other test/project related meetings
- Identified atomic test cases, a risk-based testing approach and appropriate methodologies using Direct Capture of Objectives (DCO)
- Created BPM test methodology for Confidential
- Created Test Plans and Test Strategies
- Created Test Cases and Scripts
- Executed Functional, Non-Functional, System, Smoke, UAT Testing
- Executed Web Services Testing
- Executed mobile device and application testing on disparate systems
- Executed BPM test cases and gathered results
- Implemented, documented and managed system testing
- Provided testing best practice to ensure implementation success
- Tested Pega XML, SOAP, MQ, Clipboard, Harness, Preflight, DBTrace, DCO, PAL etc.
- Performed test database configuration, tuning and troubleshooting
- Ensured that Pegasystems’ design standards were adhered to
- Kept up to date on relevant technologies and PRPC functionality
- Implemented Pega Test Management Framework
Environment: PEGA PRPC (5.4/6.1), DCO, PAL, Preflight, Java, JSP, Quality Center (QC), XML, MQ, SOAP, UNIX
Confidential, North Dakota
QA Manager/Information Technology Department
Responsibilities:
- Managed QA team members and 25 seasoned testers from various government agencies
- Established QA goals, Policy and Process for the Organization
- Managed/Coordinated testing effort with other agencies/vendors
- Trained and coached testers as necessary
- Managed a budget of about $3.5 million for the year
- Ran and participated in daily/weekly project planning and feature demo meetings
- Planned, scheduled and controlled large-scale programs and individual projects
- Offered the right strategic mix of projects - created scope/traceability documents
- Established QA goal/policy/processes for State Gov. of North Dakota.
- Created a seamless defect management and build management process within the government
- Managed and participated in security testing and vulnerability assessment QA checkpoints
- Allocated the best resources to various projects and tracked/monitored progress
- Monitored & visualized project performance vs. plan
- Estimated and finalized budget for QA team and QA involvement in various Gov. projects
- Calculated project costs using the direct and indirect costs per resource
- Determined whether a project was ahead of schedule, behind schedule, over budget or under budget by analyzing earned value KPIs
- Tracked the variances between what was budgeted and how the project was actually executed
- Created WBS and developed schedules
- Captured metrics, Identified correct Process and Provided schedule/budget information into project repository
- Met, communicated and collaborated with various project resources and updated project schedules and plans
- Created status reports for stakeholders, identified risks and created mitigation plans
- Created Performance Confidential and feedback for full-time/contract QA resources
- Planned and Managed Joint/Partner Testing effort between different agencies and DOT
- Organized daily Test status Meetings for various projects
- Created and Managed QA metrics documentation and reporting
- Created and Prepared various test scenario, traceability and test cases
- Executed Performance, Integration, System, Regression and Smoke testing to test the application functionality.
- Involved in Performance testing effort for client side and server side load testing
- Performed pen/security testing using various network security tools to scan, hack and identify bottlenecks
- Involved in creating security policy and processes
- Created and managed the digital forensics management process
- Managed BPM/Pega Test team
- Identified/Created BPM/Pega testing policy, process and goals
- Created Pega test framework
- Created/Tested Pega automation rulebase testing
Environment: Mercury/IBM Suite (ReqPro, ClearCase, ClearQuest, Test Manager), Segue, IBM Page Detailer, LoadRunner, AppScan, Nessus, WireShark, Charles Proxy, Visual Studio Team Foundation.
Confidential, NY
QA Manager
Responsibilities:
- Developed Test Strategy, Test Plans and Test Cases.
- Planned and Managed Joint/Partner Testing effort within different organizations under WSJ
- Managed an onshore/offshore QA team consisting of 6 onshore and 10 offshore members
- Organized daily Test Scrum Meeting within different sprints
- Created and Managed Project backlog documentation and reporting
- Created and prepared various test scenarios for the rich front-end application
- Created WAT, PHAT, Conga and other editorial and media-publishing test scenarios
- Participated on Sprint Demo release meetings
- Performed Integration, System, Regression and smoke testing to test the application functionality.
- Validated Quote, Symbols, Chart, Ticker, Stocks through various Tools
- Tested web services using SOAP testing tool
- Validated Web Service for Quote, Symbol Data
- Validated the XML Schema and WSDL files using XML Spy
- Generated and validated various automation scripts to test the load and stress of the application using Jiffy, Page Detailer and LoadRunner
- Tracked project sprint progress with the PM tool Trac
- Performed pen testing using various network security tools to scan, hack and identify bottlenecks
- Involved in creating security policy and processes
- Performed MQ messaging testing using Hermes
- Performed web service and SOAP testing (see the example after this section)
- Business process management testing with PEGA BPM SUITE
- Pega Unit Testing, Clipboard testing
- Pega DB Tracer Testing
- Pega PAL (Performance Analyzer) testing
- Managed Pega Test management framework testing effort
- Mentored, Coached, Led Pega testing team and testing effort
Environment: Mercury Suite (QTP, QC, LoadRunner), Jiffy, IBM Page Detailer, Broadband, WAT, PHAT, Conga, BrowserCam, Trac, AJAX, JSON, SOAP, XML, HTTP, Charting, Quoting, Ticker, Snort, Sniffer, WireShark, Charles, Microsoft Web Expression, PEGA 4.0/5.0, PEGA Test Framework
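Illustrative only: a minimal sketch of the kind of SOAP-over-HTTP validation referenced in the web service bullets above (e.g., quote/symbol checks), using only standard JDK HTTP and XML APIs. The endpoint URL, SOAPAction and element names are hypothetical placeholders; the real contract came from the project WSDL.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class QuoteServiceCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical SOAP 1.1 request for a stock quote; the real schema is defined in the WSDL.
        String soapRequest =
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "  <soapenv:Body>" +
            "    <GetQuote><Symbol>IBM</Symbol></GetQuote>" +
            "  </soapenv:Body>" +
            "</soapenv:Envelope>";

        // Hypothetical service endpoint.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/services/QuoteService").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "\"GetQuote\"");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(soapRequest.getBytes(StandardCharsets.UTF_8));
        }

        // Basic checks: HTTP status code and presence of the expected response element.
        System.out.println("HTTP status: " + conn.getResponseCode());
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(conn.getInputStream());
        int priceNodes = doc.getElementsByTagName("LastPrice").getLength();
        System.out.println(priceNodes > 0 ? "PASS: LastPrice element present"
                                          : "FAIL: LastPrice element missing");
    }
}
```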
Confidential, MN
Test Manager
Responsibilities:
- Developed Test Strategy, Test Plans and Test Cases.
- Planned and Managed Joint/Partner Testing effort with different organizations.
- Worked in an onshore/offshore model; led a team of 10 members, including 3 onshore and 7 offshore
- Extensive experience with the Pega Test Framework and Pega Test Suite
- Involved in Pega Performance, Work Object and associated testing
- Organized daily Test status meetings as well as other test/project related meetings
- Managed requirements and their attributes using Rational RequisitePro.
- Tested claim information that comes from the provider and loads into RxTPM system.
- Involved in preparing Test plans and Test cases based on user requirement documents in Req Pro.
- Tested essential health plan functions, including enrollment, member eligibility, benefit tracking, billing and claims.
- Performed Integration, System, Regression and smoke testing to test the application functionality.
- Validated the EDI/HIPAA transactions.
- Generated and validated various automation scripts to test the load and stress of the application using LoadRunner.
- Managed End to End testing, and automated tests using Robot/Functional tester
- Created Test Scripts and assisted in User Acceptance Testing.
- Interacted with data from SQL Server by writing SQL queries in Enterprise Manager and Query Analyzer (see the example after this section).
- Worked with SQL and Unix Shell Scripts to test database integrity.
- Performed Web service testing related to claim processing
- Tested SOAP messaging between applications and validated XML attributes
- Used MQ Series messaging and tools, Hermes XMLSpy to validate messaging
- Used Quality Center (QC) for planning, executing and managing test plan and test cases
- Used Quality center for reporting test progress and administering QC activities
- Maintained defects in ClearQuest and participated in defect review meetings and GO/NO-GO meetings.
- Maintained test logs, test suites and reports in Test Manager.
- Involved in security testing including SQL injection and Vulnerability
- Provided support for change management tickets using the internal tools IT2B and Tivoli
- Executed mainframe jobs and scripts to exercise internal and joint testing.
- Worked on best practices in developing traceability between Use-Cases in Rose to Requirements in RequisitePro and between Requirements in RequisitePro and Test Cases in Test Manager.
- Trained personnel in usage of Rational Tools and RUP.
- Maintained configuration management and all documents and scripts using ClearCase.
Environment: Rational Suite (Test Manager, Robot, RequisitePro, ClearQuest, ClearCase), QA Hiperstation, Java, J2EE, JDBC, Struts, EJB, Pega 4.2, Pega 5.2/5.4, HTML, DHTML, XML, WebLogic, JDK, Oracle, Unix, DataStage, DB2, PVCS, SQL, CICS, JCL, MS Excel, Windows NT 4.0, Quality Center, LoadRunner, PEGA (4.0 & 5.2), Unix (AIX), SOAP, HTTP, XML Spy, UltraEdit, Hermes, Sniffer, Wireshark, Snort
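Illustrative only: a minimal sketch of the kind of back-end SQL validation referenced in the SQL Server query bullet above, using plain JDBC. The connection URL, credentials, table, column and expected value are hypothetical, and the vendor JDBC driver is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ClaimStatusCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details for the test database.
        String url = "jdbc:sqlserver://testdb:1433;databaseName=claims";
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT status FROM claim WHERE claim_id = ?")) {
            ps.setLong(1, 1001L); // hypothetical claim id used as test data
            try (ResultSet rs = ps.executeQuery()) {
                // Expected status comes from the test case; "ADJUDICATED" is illustrative.
                if (rs.next() && "ADJUDICATED".equals(rs.getString("status"))) {
                    System.out.println("PASS: claim status matches the expected value");
                } else {
                    System.out.println("FAIL: claim status missing or incorrect");
                }
            }
        }
    }
}
```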
Confidential, Minneapolis, MN
Sr. QA Analyst
Responsibilities:
- Involved in manual and automated testing.
- Conducted walkthroughs and reviews with designers and developers to establish quality policy conformance
- Reviewed the requirements to write test cases and test plans covering various scenarios; test cases were written in TestDirector.
- Involved in the development of system test plan and test scripts using business and system requirement documents.
- Analyzed system requirements, developed & executed detailed Test plan, Test cases
- Prepared test data for data driven test cases.
- Developed, Enhanced, and Maintained automated testing scripts with QTP for Regression, Integration, and system Testing
- Performed the Back-End integration testing to ensure data consistency on front-end by writing and executing SQL queries on the Oracle database
- Validated HTTP protocol and SOAP messaging
- Validated XML nodes and attributes (see the example after this section)
- Used Mercury TestDirector, Bugzilla and Quality Center as defect tracking tools to report application defects and enhancement requests, and worked with developers to resolve technical issues.
- Created and executed security testing scenarios and involved in security policy establishment
- Documented, tracked and communicated test plans, test results, analysis, and unresolved problems.
- Compared and analyzed actual with expected results and reported all deviations to the appropriate individual(s) for resolution.
- Reviewed application server logs and error reports to identify application processing errors and possible improvements.
- Used Mercury Quality Center 8.0 (TestDirector 8.0) for scheduling test scripts
Environment: Oracle, SQL, QTP, Load Runner, Test Director, Clear Quest, UNIX, Java, JSP, Bugzilla, Servlets, EJB, Web Applications, MS Server, PL/SQL, HTTP, XML Editor, SOAP, Windows 2000/NT, Wireshark, Scanner, NetFishing
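Illustrative only: a minimal sketch of the XML node and attribute validation referenced above, using the JDK's DOM and XPath APIs. The response file name, element paths and expected values are hypothetical.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import java.io.File;

public class ResponseXmlCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical captured HTTP/SOAP response saved to a file by the test harness.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File("response.xml"));
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Check that a required node exists and an attribute carries the expected value.
        String orderStatus = xpath.evaluate("/Order/Status", doc);
        String currency = xpath.evaluate("/Order/Total/@currency", doc);

        boolean pass = "CONFIRMED".equals(orderStatus) && "USD".equals(currency);
        System.out.println(pass ? "PASS: expected node and attribute values present"
                                : "FAIL: Status=" + orderStatus + ", currency=" + currency);
    }
}
```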
Confidential, MN
QA Lead
Responsibilities:
- Created Test Strategy according to the business functions, priorities & schedule.
- Led both onshore and offshore teams sized 12 members.
- Created Test Plan that identified the items to be tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning
- Prepared test cases and test data for all tests defined in the test plan
- Executed back-end data-driven test efforts with a focus on data transformations between various systems and a data warehouse
- Performed testing of ETL and flat file data transfers without relying on a GUI layer.
- Extracted data from an Oracle 10g database and flat files, performed mappings and loaded the data into Confidential Oracle tables using SQL*Loader (see the example after this section)
- Tracked defects during various activities and stages within the development lifecycles, as well as post-implementation (in production)
- Performed Performance testing using LoadRunner
- Used TestDirector as the central repository for all testing material; all test cases were stored in and run from TestDirector, which also served as the defect-tracking database.
- Wrote SQL*Plus queries against Oracle 9i views and tables and validated the data against other Oracle/MS SQL Server tables as well as between the production and test environments
- Validated web interface data against that of the Oracle Database
- Verified that all business/technical requirements were met in User Interface
- Organized daily Test status meetings as well as other test/project related meetings
- Reported test report to the project team in timely fashion
- Reported weekly testing status to the project team
- Reported the daily production cap by validating daily metrics
- Created use cases to write effective test cases
Environment: QTP, LoadRunner, Windows XP, Oracle 9i, Oracle Forms, Java, JBuilder, TestDirector, ClearQuest, PL/SQL, Informatica 8.0, Unix (Solaris), ASP, JSP.
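Illustrative only: a minimal sketch of the back-end ETL validation referenced in the SQL*Loader bullet above, comparing row counts between a source staging table and a target table over JDBC. The connection details and table names are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class EtlRowCountCheck {
    // Helper: run a COUNT(*) query against one table and return the single numeric result.
    private static long countRows(Connection conn, String table) throws Exception {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection details for the test environment.
        String url = "jdbc:oracle:thin:@testdb:1521:ORCL";
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password")) {
            long sourceCount = countRows(conn, "stg_orders"); // staging table loaded from the flat file
            long targetCount = countRows(conn, "dw_orders");  // warehouse target table
            System.out.println(sourceCount == targetCount
                    ? "PASS: row counts match (" + sourceCount + ")"
                    : "FAIL: source=" + sourceCount + ", target=" + targetCount);
        }
    }
}
```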
Confidential
Sr. QA Analyst
Responsibilities:
- Developed & involved in QC strategies, Test plans, Flow graph, Test cases, Test data, and Test Status report.
- Involved in a CMM Level 4 project and exercised detailed QA and PM processes in a testing lab environment.
- Involved in the design and development of the system.
- Ghosted and tested different OS builds with localization, including English, German, French, Japanese, Chinese and Arabic
- Internationalized test centers and created test data for different OSes and versions.
- Involved in Localization Testing.
- Reviewed design documents, Requirements notes, Use cases & participated in Requirements inspection meetings.
- Created QTI XML exams for Testing.
- Created LoadRunner Vuser scripts to test the load and stress of the application
- Performed the manual testing life cycle.
- Performed functional, regression, Integration, Black box, Ad-hoc, Sanity & web testing.
- Used SQL queries to access data from different database tables.
- Reported defects using Test Director & Test Track Pro and interfaced with developers to resolve technical issues.
- Performed negative testing to determine how functions behave when they encounter invalid or unexpected values.
- Verified, reported, assigned and closed bugs.
- Performed Compilation for stand-alone testing.
- Reviewed testing log files and the resource container to verify events and report errors.
Environment: Java, XML, Java Script, SQL Server, Windows NT/2000/98, UNIX (Solaris), Test Director, Load Runner.
Confidential, MD
Sr. QA Analyst
Responsibilities:
- Involved in System Testing, creating and running scripts
- Manually implemented several test scripts and documented the results
- Performed Functionality Testing and GUI testing.
- Extensively worked with WinRunner functions to automate test scripts.
- Wrote and enhanced WinRunner test cases and test scripts by adding the required functionality per the business requirements.
- Used WinRunner to automate the regression testing process
- Analyzed system requirements, developed & executed detailed Test plan, Test cases
- Prepared test data for data driven test cases
- Used Test Director as the bug reporting tool
Environment: ORACLE, SQL, VB, .NET, Win Runner, Load Runner, Test Director and WINDOWS NT/2000, Java, ASP