Software Quality Assurance Resume
Fort Collins, CO
SUMMARY
- For the past 20-plus years, Software Testing and Quality Assurance has been my area of interest and expertise.
- I have stayed abreast of the latest technology in this field, including automated testing applications, use cases, Agile methodology, RUP, and other software development lifecycle approaches, as well as establishing Test Plans and Test Case outlines.
- I have been instrumental in establishing quality assurance processes, metrics, and SDLC practices.
- My leadership abilities have been demonstrated through appointments as Assistant Project Manager, Quality Assurance Manager, Team Lead, and Test Lead.
- This, coupled with the variety of platforms I have worked on, offers an employer a wide and diverse background to draw upon.
TECHNICAL SKILLS
- QTP
- LoadRunner
- NeoLoad
- Test Case and Test Suite
- PeopleSoft
- SalesForce
- Toad
- various security apps
- Java
- Visual Basic
- Visual Studio
- Quality Center Enterprise
- Rational Test Workbench
- Rational Performance Tester
- Rational Quality Manager
- various defect tracking tools
PROFESSIONAL EXPERIENCE:
Confidential, Fort Collins, CO
Team Lead
Responsibilities:
- As the Team Lead of 14 automated test personnel, I assign projects to each member and oversee the creation of functional scripts and the conversion of manual functional scripts to automated scripts.
- I also oversee all performance and load testing, including the creation of a formal test plan and a formal test report, which is provided to the project manager and upper-level management.
- I review these scripts to ensure they meet the functional steps outlined in the manual test scripts.
- I hold weekly team meetings to review with each automation tester how many manual scripts have been converted and the status of automated script runs in Rational Quality Manager.
- I then generate a report for upper management outlining week-to-week progress. I have also set up a weekly Tips and Tricks meeting to help each member learn new shortcuts, methods of creating new scripts, and ways to reuse existing scripts.
- I create additional reports from Rational for management review to ensure we are on schedule.
- Train new automation testers by giving them an overview of the functional and operational aspects of the tool, and provide follow up training as needed.
- I create and execute performance and load (P&L) scripts for various projects.
Confidential, Denver, CO
Team Lead
Responsibilities:
- Conducted data testing and performed data analysis on Global Data Warehouse operations. Worked with DBAs to determine the validation approach and outlined where anomalies occurred. Recorded defects in Rally and retested Rally defects created prior to my arrival. Created and ran macros to populate data worksheets for the data comparison.
- Performed functional testing on SalesForce, Yardi, the Global Data Warehouse, and limited PeopleSoft interactions.
- Performed security testing of user permissions.
- Worked with two others to determine if any automation tools would allow better regression testing of the GDW application.
Confidential, West Bend, WI
Team Lead
Responsibilities:
- West Bend Mutual had over 2,000 old and current defects they decided needed to be addressed.
- Categorized the defects with the Project Manager, prioritized them, and determined how they would be redesigned for best business practice.
- Worked with developers on establishing a proper test environment. Attended daily standups to review progress. Wrote and created test cases for testers to execute.
- Oversaw 12 test leads and 40 test personnel. Documented all results for the Project Manager's reports.
Confidential, Denver, CO
Team Lead
Responsibilities:
- Conducted all performance and load testing using NeoLoad.
- Outlined and submitted documents on what was needed for a formal performance environment.
- Set up load generators for use with NeoLoad. Created monitoring links to testing and sub-production environments.
- Created performance scripts to establish a baseline for functionality. Redirected and ran scripts, then conducted analysis on results.
- Documented and charted results for distribution to appropriate staff. Reviewed and outlined results to staff for any change consideration.
- Edited scripts for testing in multiple environments for performance comparisons.
- Worked with developers in Colorado and Argentina to coordinate upgrades and improvements based on performance and load results.
- Internal applications tested: Store Search, Online Order, Safety Security Risk, Online Video Training, Account Creation, and Password Creation and Update.
Confidential, Colorado Springs, CO
Software Quality Assurance
Responsibilities:
- Established the complete software Quality Assurance system for Metso.
- Metso's defect tracking had consisted of sending someone an email saying the product would not work.
- Upon arriving, I established a defect tracking process using the ExtraView defect tracking system; all defects were logged into the system with the problems detailed and outlined.
- I then established twice-weekly defect review meetings to review correction progress with the developers and outline any new defects. Because Metso had no documentation, I also established a Software Design Document process.
- When the customer requested any change, upgrade, or repair, I required the developer to document what was being changed, the parts of the program it interfaced with, the new functionality and its importance, and any calculations the new process might use.
- I was also asked by my supervisor to review customer redesign work and calculate costs to be presented to the customer.
- Outlined and documented requirements for the product, and created test cases, test scenarios, and scripts for regression, functional, scalability, and performance testing.
- Performed performance testing to determine whether the application worked within performance guidelines, using Selenium IDE, Selenium Remote Control, Test Case, and Test Suite.
Confidential, Colorado Springs, CO
Requirements Analyst
Responsibilities:
- Worked extensively with Quality Center in an agile environment to create High Level Story Cards, based on project outline.
- Assigned points to each card and detailed sub card projects while calculating additional risk factors.
- After this I would create the Detailed Story Cards and assign them to the appropriate developer.
- I would then create the Fit Criteria of expectations, and attach any screen shots or images for reference.
- At this time I would schedule a Gateway Review meeting to go over each High Level Story Card for the iteration.
- I would then set a Pre-IPM Meeting for iteration review and workload determination.
- Also scheduled and ran the Iteration and Requirements Sync meetings.
- Assisted in the creation of QTP scripts for automated testing of the application.
- Would query data to establish test models for other testers and story cards.
Confidential, Englewood, CO
Software Quality Engineer IV
Responsibilities:
- Electronic Flight Bag is an application used by commercial and private pilots to set departures and destinations on portable units (tablets) and to have flight information available while en route.
- I reviewed requirements and created test cases, performed test case reviews, ran test case updates, peer-reviewed other testers' test cases, executed test cases, wrote CRs on software defects, and executed regression tests on new software builds.
- I participated in the review and updating of some 900 legacy test cases to bring them up to current use.
- I outlined and documented all additional requirements for product, and created test cases, test scenarios and scripts for regression, functional, scalability, and performance testing.
- All defects were recorded in Performance Center for revision back to developers.
- Extensive SQL work on an Oracle database.
Confidential, Colorado Springs, CO
Software Quality Engineer IV
Responsibilities:
- Barclaycard partners with various companies (L.L. Bean, Travelocity, Choice Hotels) to provide credit cards bearing their logo and name. Performed Quality Assurance functions including review of requirements to facilitate the creation of test cases; test case creation and development in HP Quality Center; execution of test cases in the Quality Center Test Lab; execution of processes in Control-M; and creation of test accounts and validation in TSYS, including a simulated payment process. I also tested and monitored the test accounts in CCD (the customer service website).
- I outlined and documented all additional requirements for the product, and created test cases, test scenarios, and scripts for regression, functional, scalability, and performance testing. All defects were recorded in Quality Center for revision back to developers.
- Extensive SQL work on an Oracle database. Verified the API applications would work and display properly.
Confidential, Englewood, CO
Software Quality Engineer IV
Responsibilities:
- Interfaced with an Oracle data warehouse. I created test cases and test scripts to validate airport, runway, waypoint, refueling, and navigational aid data. Validation covered the ETL (extract, transform, load) process from an Oracle database to an Access flat file and back into an Oracle database.
- Results were then reconfirmed on a flight simulator and through extensive use of SQL queries.
- In addition, I performed validation of specific data through the use of aeronautical charts worldwide.
- Because the C-17 has specific air and landing requirements (e.g., runway length, width, and surface material), validation was completed using both the simulator and data queries. Test cases and test scripts were created using Microsoft Excel and imported, stored, and run from HP Quality Center.
- In conjunction with data validation, I attended requirements meetings, use case review meetings, high-level design meetings, and RIO meetings to review and validate compliance.
- Platform and tools consisted of: Oracle, SQL, MSQuery, Unix, HP Quality Center, ETL process.
Confidential, Colorado Springs, CO
Software Quality Engineer/Automated Testing
Responsibilities:
- Testing consisted of unit, functional, and integration testing, including both manual and automated product version upgrades.
- Some testing was done on remote Wi-Fi locations to verify connectivity and basic product functionality.
- Test features of the product consisted of call forwarding, call blocking, caller identification, contact list configuration, as well as a number of other features.
- Testing was performed to confirm functionality on Windows XP, Windows 7, and Mac platforms.
- Some automated scripts were created with the use of QTP for validation and use with Mexico and Argentina test locations.
- Test cases and test scripts were created using Microsoft Excel and imported, stored, and run from HP Quality Center.
- This application is a SaaS offering and is also an EDI application, interfacing with VoIP applications.
- Created and ran Unix scripts for basic research and operational use.
- Verified the API application would work and display properly.
Platforms and tools consisted of: Windows XP, Windows 7, QTP, HP Quality Center, and J2EE.
Confidential, Denver, CO
Software Quality Engineer III
Responsibilities:
- Included here are a number of software applications integral to the operation and accuracy of the medical equipment Baxa manufactures. The primary application (EM2400) is used for the infusion of intravenous (IV) bags by a pharmacist. Accuracy must be within one one-thousandth, as patient safety depends on it. The software is embedded in the hardware, and testing consisted of uploading a current or new software version, integrating patient information, and performing manual processes to validate requirements.
- A secondary application (Abacus) is a network application used to configure pharmacy medications, which are then sent to the EM2400 equipment at a central location. Abacus issues warnings and displays graphs to show proper configuration of medication and to help the pharmacist determine any carbohydrate, protein, or lipid needs in the ingredient configuration. Software testing consisted of requirement review, test case configuration and execution, and generation of a traceability matrix for FDA (Food and Drug Administration) approval, to demonstrate proper safety testing.
Confidential, Nashville, TN
Quality Assurance Team Lead
Responsibilities:
- I performed extensive data comparisons for music sent to Landmark for insertion into a specific database. This ETL process consisted of transferring data from a supplied portable hard drive (sometimes in a Mac format) into a specific database, where it was then parsed out and inserted. A transfer could consist of as many as 50,000 records. Validation was performed to ensure artists and writers would receive credit for appropriate royalties.
- I also evaluated automated tools for consideration and compatibility with programs being developed, and made recommendations to the Senior VP of Development.
Platforms and tools consisted of: Windows XP, SQL, Oracle, Mac, J2EE, Web based, ETL
Confidential, Cedarburg, TN
Automated/Performance/Load Test Engineer
Responsibilities:
- AEDC provides PeopleSoft, Windows, and security applications to Arnold Air Force Base and the DoD.
- Testing consisted of creating LoadRunner scripts to validate the performance of a PeopleSoft upgrade and an Oracle upgrade and configuration.
- I also performed software security upgrades and manual testing to confirm security settings.
- Additional LoadRunner testing involved configuring LoadRunner to bypass the MS load balancer and distribute load across multiple servers through IP spoofing.
- Monitored servers for CPU, memory, and additional performance requirements. Monitored WebLogic for performance. Created and submitted detailed reports of test results, including historical reports for comparison.
Platforms and tools consisted of: Windows 2000/2003, QTP, LoadRunner, PeopleSoft, Synergen, WebLogic, MS LoadBalancer, and Oracle
Confidential, Brentwood, TN
Software Quality Assurance, Manual Testing
Responsibilities:
- Performed manual Quality Assurance testing on a medical application for use by hospitals and inpatient facilities.
- Manual testing of this application included, but was not limited to, formal test case review, manual execution of test cases, structuring and configuration of virtual server systems, and reporting of anomalies within the application.
- Reported anomalies to the VP of Development for review and assignment of repairs.
Confidential, Brentwood, TN
Software Quality Assurance
Responsibilities:
- Performed Quality Assurance testing on a medical application for use in renal care. Worked with the Project Manager to establish requirements. Outlined, documented, and created test cases. Performed manual testing of the application. Recorded defects and retested. Reported project status to the IT manager. The product is a WebLogic server application, interfacing with J2EE and an Oracle data application, for distribution from corporate data applications to customer thin clients. Also created and ran Unix scripts to ensure data transfer.
- Development and testing were conducted within the EDI and HIPAA structure to ensure data transfer and security restrictions.
Platforms and tools consisted of: J2EE, Oracle, Unix, Web based
Confidential, Nashville, TN
Performance Test Engineer
Responsibilities:
- Established performance test operations for various groups within the Confidential operation. Worked with groups to establish performance criteria, outline and document requirements, create and run test scripts, and create result matrices. Created and executed LoadRunner scripts, compiled the results, and reported them to the appropriate groups. Basic scripts were created and run through QTP and LoadRunner for regression testing upon upgrades.
- Worked with HP (Mercury) to upgrade the product version and change license status from assigned to concurrent.
Platforms and tools consisted of: Weblogic, J2EE, Oracle, Unix, Loadrunner
Confidential, Brentwood, TN
Quality Assurance Manager
Responsibilities:
- Established the full quality assurance process and practice for software and hardware provided to customers. Outlined and defined requirements for test documentation, including formal Test Plans, Test Case outlines, testing metrics, and preliminary deployment of automated test scripts. Implemented correct practices and procedures for the use of Mercury Test Director. Outlined the advantages of automated test tools. Established Quality Assurance guidelines for hardware systems. Established and documented baseline performance for hardware and applications.
- Worked with over 14 people to coordinate test functions for the integration of multi-component applications and hardware systems, meeting customer expectations and delivery dates.
- Outlined and established value added aspects of Automated Testing.
- Performed testing functions myself, including unit testing, integration testing, system testing, performance testing, and user acceptance testing. Outlined staffing requirements, detailed mission-critical testing to the project manager, and oversaw all testing functions.
- Implemented customer support operations.
- Created, established, and implemented processes for User Documentation and Customer Training.
Confidential, Cincinnati, OH
Quality Assurance Manager
Responsibilities:
- In charge of complete testing operations for the Manager Data Integration Program as well as Bond Builder 3.0. Oversaw 12 people and distributed testing functions, as well as performing testing functions myself, including unit testing, integration testing, system testing, performance testing, and user acceptance testing. Established test plans and staffing requirements, detailed mission-critical testing to the project manager, and oversaw all testing functions.
- Helped create automated load test scripts using QuickTest, WinRunner, and LoadRunner. Defects were recorded in Test Director for revision back to developers.
Confidential, Houston, TX
Team Lead Test Engineer
Responsibilities:
- PeopleSoft HR application, including all modules: new hire, benefits, tax calculation, reports, and payroll.
- Constructed automated testing applications for PeopleSoft installs and upgrades, plus integration testing and performance testing.
- Integrated and created automated scripts for performance testing (QuickTest, Mercury WinRunner).
- Created automated load test scripts (Mercury LoadRunner).
- Defects were recorded in Test Director for revision back to developers.
Confidential, Jupiter, FL
Performance Test Analyst
Responsibilities:
- HR staffing application for management to oversee work crews throughout the State of Florida. Establish criteria for Performance Testing. Document Performance Testing Hierarchy. Write Performance Test Plan.
- Create Automated Scripts for Performance Testing. (Mercury WinRunner)
- Create Automated Load Test Scripts. (Mercury LoadRunner)
- Created a performance testing timeline. Reviewed the test environment for suitability for performance testing. Documented requirements as outlined by users.
- Created test cases and test scripts, and performed complete performance testing operations.
- Provided FPL and Worksuite with Performance Analysis Document for review.
Confidential, New Orleans, LA
Lead Test Engineer
Responsibilities:
- Development of an HR system for the U.S. Navy using CMM Level III maturity modeling.
- Tested a web-based application integrated with data warehousing, Comms, and PeopleSoft.
- Tested the PeopleSoft migration from version 7.0 to version 8.8.
- Payroll, HR, and modifications to configure within the NSIPS application.
- Worked with automated test scripts (Mercury WinRunner) to determine product functionality as specified in the requirements.
- Manual testing responsibilities included execution of Test Plans, Test Cases, and Test Scenarios for operational verification, a Requirements Matrix, and manual testing of web-enabled storage application software.
- System testing performed separately from development.
- Data queries, including manipulation of data for testing purposes.
Confidential, Colorado Springs, CO
Lead Test Engineer
Responsibilities:
- Mass storage application
- Test Reporting: Reported test status to the QA Manager, PM, and Development Manager.
- QA Processes and Procedures: Served on a QA team to create and implement best practices for QA policies and procedures.
- Manual Testing: Responsibilities included execution of Test Plans, Test Cases, and Test Scenarios for operational verification, a Requirements Matrix, and manual testing of web-enabled storage application software.
- Tools Evaluation: Served on a QA team to evaluate test tools, defect tracking tools, and requirements tools.
- Tested host-dependent and host-independent applications for storage controllers.
