QA/Program Manager Resume
NYC, NY
SUMMARY
- Over 10 years of experience in the information technology industry.
- Extensive knowledge of all stages of the Software Development Life Cycle (SDLC).
- 6 years of program management experience managing several large-scale, multi-million-dollar, business-critical projects.
- Managed several projects through Inception, Elaboration, Construction and Transition phases.
- Led large software Quality Assurance teams, local and remote, working on Client/Server, multi-tier, and Web-based applications.
- Set up testing infrastructure to support multiple simultaneous application deployments spanning several LPARs (logical partitions on a single server).
- Worked with business and technology test experts to develop comprehensive test strategies.
- Influenced and guided system test plan development for complex interfacing applications.
- Built consensus among multiple and diverse stakeholders at the program level.
- Demonstrated the ability to make complex decisions impacting project budgets, durations and business continuity.
- Successfully balanced project budgets and resource allocations across projects spanning various business lines.
- Built software testing teams and implemented testing processes customized to suit the size of the project at hand.
- Developed SRPs (standard repeatable practices) to introduce efficiencies and standardization across projects.
- Drafted policies, procedures and associated software testing artifacts using ITIL (Information Technology Infrastructure Library).
- Introduced and implemented AGILE methodologies to new and existing projects. Certified SCRUM Master.
- Extensive experience in Functional, Integration, Regression, Positive, Negative testing, Back End and System Level Testing.
- Continually worked on increasing automation using in-house developed testing frameworks and external vendor products.
- Regularly challenged and re-evaluated testing efficiencies to reduce Regression costs by effectively using automation.
- Demonstrated the ability to influence activities of Test Leads directly reporting to me and Test Managers assigned to other organizational units.
- Directly managed Functional Testing, Performance Testing, and ASAR (application security assurance review) teams.
- Successfully worked with business analysts, clients, developers, and management staff.
- Independently and consistently managed multiple complex assignments across business lines.
- Demonstrated managerial and motivational skills to lead and direct testing activities.
- During my junior years in this field, I proved my skills as a Tester and a hands-on Test Lead.
- Excellent communication, interpersonal, analytical, organizational and management skills.
TECHNICAL SKILLS
Technologies: J2EE, IBM WebSphere, BEA WebLogic, XML 1.0/1.1, EJB 1.1/1.2, JSP 1.0/1.1, Servlets 2.0, Java (JDK 1.1/1.2), AWT, Java Swing, JavaBeans, JMS, JNDI, JavaScript, HTML, COM, JDBC 2.0, CVS, SVN.
Languages: J2EE (Java, Servlets, JSP), C, C++, XML, Perl, TSL, SQL, UML, Shell Scripting.
Testing Tools: QTP, Test Partner, WinRunner, Trac, ALM.
Platforms: Unix/Linux, Windows Server 2003, Windows 2000/NT/ME/XP/Vista/7/8
RDBMS: DB2, Oracle, MS Access, MySQL, SQL Server, SQL Developer
Tools and Technologies: Visual Basic (VB), J2EE, JavaScript
Version Control: PVCS, Visual Source Safe, CVS, Tortoise SVN, Eclipse 3.7
PROFESSIONAL EXPERIENCE
Confidential, NYC NY
QA/Program Manager
Responsibilities:
- Introduced and helped architect multiple test environments to support simultaneous code deployments, allowing multiple testing groups to work in parallel.
- Controlled resource allocations across the team while actively assessing demand and capacity on a monthly basis.
- Implemented and followed SRPs (Standard Repeatable Processes) for each project that I managed.
- Provided Risk assessment based on the scope and budget allocation for each of the projects.
- Provided functional, performance, security testing estimates (HLEs - High Level Estimates) for all new testing initiatives being introduced to the Technology Services Group.
- Participated in several discussions with the Business/Development/Program Management teams to highlight possible testing approaches while understanding the business requirements and technical designs.
- Produced and managed project schedules using MS Project; determined task allocation for each tester, % allocation between projects, and estimated % completion on a weekly basis.
- Assisted test leads in developing test plans and provided guidance in designing test scenarios based on requirements, enhancements and/or defect fixes.
- Reviewed Use Cases and helped develop the traceability matrix to build test plans. All documents and templates created followed ITIL (Information Technology Infrastructure Library) concepts and processes.
- Initiated a conversion effort to migrate existing manual test cases to automation scripts, reducing Regression testing effort by 75%.
- Automation involved developing scripts for the Front End GUI, DB mapping validation and MQ functionality.
- UTP’s Keyword Driven framework was used to test the front end, while an in-house developed Back End Automation tool performed database validations. (I personally provided input for several enhancements to our automation tool that helped improve its overall functionality and automation capability.)
- Produced status reports for senior management via an Excel macro-based dashboard that tracked Time/Cost/Quality/Defect Counts/Variance for the application under test.
- Defect reports were produced using ALM (Application Lifecycle Management, an HP tool).
- While transitioning to the UAT phase, I provided an overall quality assessment to the business area and discussed all existing open defects/issues.
- Attended and provided input to UAT defect review meetings.
- Provided UAT user support, addressing application functionality questions/concerns from the business users.
- Completed a Project Closure checklist that consisted of a Budget summary, defect analysis, and overall quality assessment of the application tested.
- Tools:
- Newly adopted applications were automated using UTP directly; no manual test case development was required.
- Eclipse 3.7 was the software of choice for version control.
- Introduced new techniques to allow parallel testing on the same system with different network connections via a Virtual Machine. This VM would run on a different VPN connection, allowing testers to seamlessly switch between networks without losing connectivity to either one.
- Cross-browser testing was effectively managed for IE 6, 7, 8, 9 and Mozilla Firefox.
- Bug reporting and tracking was done using an open source application called Bug@theBank (Trac). Some applications required defect tracking using ClearQuest. Currently, the team has implemented ALM (HP Tool) for defect and requirements tracking.
- DB testing was carried out using SQL Developer and SQL Server Management Studio, varied based on the application being tested.
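The database validations described above can be sketched roughly as a source-to-target reconciliation; the connection string, table names, and columns below are hypothetical illustrations, not the actual schema:

```shell
# Hypothetical reconciliation: compare row counts and spot-check one
# mapped column between a staging table and its target after a load.
# Credentials, DB alias, and table/column names are invented.
sqlplus -s qa_user/qa_pass@TESTDB <<'EOF'
-- Row counts should match after the nightly load
SELECT COUNT(*) FROM stg_trades;
SELECT COUNT(*) FROM trades;

-- Rows where the mapped column did not transfer as specified
SELECT t.trade_id, s.cusip AS src_cusip, t.cusip AS tgt_cusip
FROM   stg_trades s JOIN trades t ON t.trade_id = s.trade_id
WHERE  s.cusip <> t.cusip;
EOF
```

In practice the same checks would be run through SQL Developer or SQL Server Management Studio, depending on the application's database.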
Environment: Java, J2EE, SQL, Oracle, UNIX, XML/HTML, Windows XP, and Win7.
Confidential, Jersey City, NY
Lead QA Engineer
Responsibilities:
- Created a basic test plan to establish a logical flow for my team to begin testing. This consisted of objectives, functions to test, approach, responsibilities, error management, and exit criteria.
- Assigned tasks to individual testers and provided them with requirements pertaining to their task at hand.
- Assisted in the re-designing of the entire application.
- From time to time, suggested several enhancements to improve the application's architecture.
- Created several documents and templates to streamline testing. These also proved to be extremely helpful in reducing the learning curve for new hires.
- Initial testing efforts began as Manual Unit Testing.
- Used Tortoise SVN to check out the most recent revisions committed by the development team and monitored each defect and/or enhancement as it was made ready for the QA Team.
- Configured my desktop as a server to run the web application.
- Compiled the checked out code using Apache Ant 1.7.
- Deployed the compiled WAR file via Tomcat 5.5.
- Managed cross browser functionality testing using IE 5.5, 6.0, 7.0 and FireFox 2.0.
- Bug reporting and tracking was done using Trac.
- Introduced better reporting methods and visually appealing graphs to provide at a glance summary of reports.
- Monitored and performed regression testing once an RC (Release Candidate revision) was ready for deployment in the QA Lab.
- Created several scripts using QTP to perform regression testing.
- Manually Tested reports that were generated by Crystal Reports. These were internal media player usage reports that were generated as a browser embedded PDF file.
- Back-end testing was carried out on an Oracle DB, by performing data validation.
- Database testing was done via SQL Developer. (Manually configured and setup connections to the DB Server)
- A second method of database testing involved using PuTTY to log in to the remote DB server, starting SQL*Plus, and running command-line queries to pull tables, rows, etc.
- Continued testing efforts on my previous project “Media Anywhere” based in Minneapolis, MN.
- Along with the above responsibilities, I also coordinated testing efforts with a QA Team of 6 testers based in Bangalore, India.
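The check-out, build, and deploy cycle described above can be sketched as a short shell sequence; the repository URL, project name, and Ant target below are hypothetical placeholders, not the actual project settings:

```shell
# Check out the latest revision (URL and project name are invented).
svn checkout https://svn.example.com/mediaplayer/trunk mediaplayer
cd mediaplayer

# Compile and package the WAR with Apache Ant 1.7 (target names assumed
# to be defined in the project's build.xml).
ant clean war

# Deploy by dropping the WAR into Tomcat 5.5's webapps directory,
# which auto-deploys it when the server starts.
cp dist/mediaplayer.war "$CATALINA_HOME/webapps/"
"$CATALINA_HOME/bin/catalina.sh" start
```

Running a local Tomcat this way is what made it possible to use a desktop workstation as the QA application server.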
Environment: Java, J2EE, Crystal Reports, SQL, Oracle, UNIX, XML/HTML, and Windows XP/VISTA.
Confidential, Minneapolis, MN
Senior QA Analyst
Responsibilities:
- Analyzed and modified existing test plans and test cases.
- Solely responsible for the entire testing effort which involved all stages of testing.
- Coordinated and led testing efforts with a tester based in Moscow, Russia.
- Planned and implemented a test strategy for the project which involved unit, functional and regression testing.
- Created several progress reports to provide real-time application development and testing status.
- Provided recommendations and cost analysis on possible automation software tools that could be used to streamline future testing efforts.
- Performed cross browser functionality testing using Internet Explorer 7, Firefox and Safari (MAC).
- Backend testing was performed using IBM CM8 eClient for Windows. Searches based on SQL queries were executed to perform data validation.
- Suggested specification and requirement documentation updates based on testing and development progress.
- Detected and reported bugs using Trac (Integrated SCM and Project Management).
- Individually identified and reported over 200 bugs within the first month of testing.
- Developed a Google spreadsheet to track existing bug status and testing results that could be viewed by global stakeholders involved in the testing effort.
- Installed and configured a new server to be used for future migration of the application.
- Configured and installed Windows Server 2003 Standard Edition for the new server.
- Server installation included installing and configuring the following software packages: IBM DB2, IBM Content Manager 8.3, Net Search Extender, WebSphere Application Server 5, and all relevant fix packs required.
- Performed regular maintenance of the newly installed server which included updating Enterprise Archive (.ear) files using WebSphere’s Administrative Console.
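The .ear update mentioned above was done through the Administrative Console; the same operation can also be sketched with WebSphere's wsadmin scripting tool. The application name and file path below are hypothetical:

```shell
# Update a deployed enterprise application from a new .ear file, then
# save the configuration change. "MediaApp" and the .ear path are
# invented for illustration; the resume describes using the
# Administrative Console UI rather than scripting.
wsadmin.sh -c '$AdminApp update MediaApp app {-operation update -contents /tmp/MediaApp.ear}' \
           -c '$AdminConfig save'
```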
Environment: Java, J2EE, WebSphere, Adobe Flex 2, SQL, DB2, UNIX, XML/HTML, and Windows 2003 Server.
Confidential
QA Analyst
Responsibilities:
- Developed test plans, test cases and test strategy for the project.
- Analyzed and documented the Business requirements for WDW ILOL (Mortgage Apps).
- Responsible for writing the Test Plan and developing a Test strategy.
- Involved in development of test cases from functional requirements, technical specification and user interface documents (UI Docs).
- Developed and enhanced test scripts by using feedback from Business Consultants and Project Leads.
- Created and executed manual test scripts using Quality Center and ClearQuest.
- Checked the existing functionality of the application manually.
- ClearQuest was used for defect tracking, Incident Reporting, and monitoring Change Requests.
- Verified all the test scripts from the team and developed a Test Suite for the production release, optimizing it for run time.
- Developed Test Matrices for consolidating the data required for Positive and Negative testing.
- Tested the application manually during the development phase.
- Created automation scripts using QTP for Loan Registration.
- Performed Integration and Regression testing to check compatibility of new functionality with the existing functionality.
- Assisted in Production Validation towards the end of release.
- Worked with the QA / Development team to further improve testing methodology.
- Used Loan Origination Software such as Byte, Encompass, Genesis, and Calyx to create loans and test direct upload functionality to ILOL.
- Extensively used Mainframe applications such as LPS (Loan Processing System) and ACAPS (Automated Credit Application Processing System) for back end testing, data validation and loan decision.
- Validated Pricing calls made in ILOL on newly Locked Loans using Financial Markets (MEX Price).
Environment: Java, WSAD, SQL, Oracle, Mainframe, ClearQuest, Quality Center, QTP, and XML/HTML
