
Lead Secure Code Review Team Resume


  • Over twenty years of experience in the field of Information Technology (IT), combined with service in the Confidential.
  • Certified Senior Software Test Engineer and certified Project Management Professional (PMP).
  • Experience in project management, quality assurance, quality control, black-box and white-box software testing, and Section 508 compliance testing of large, multi-tier systems.
  • Experience in client server, web, .NET, UNIX, cloud, mobile and mainframe environments.
  • Worked in ad hoc environments as well as disciplined ISO 9000 and CMMI-certified environments.
  • Test Automation Tools: Experienced in performing Decision Analysis and Resolution (DAR) to determine which test automation tool is most compatible with the application under test and its supporting infrastructure. Worked with development teams to install each tool and verify it operates correctly. Used test automation tools, most notably Rational Functional Tester, Quick Test Pro, and Selenium, to develop and run regression tests, retests, and tests prone to human error (e.g., math calculations).
  • Confidential Citizen with Public Trust and SSBI Background Investigation.
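The DAR-based tool selection described above can be sketched as a weighted scoring matrix. The criteria, weights, and per-tool scores below are illustrative assumptions, not actual evaluation data:

```python
# Illustrative Decision Analysis and Resolution (DAR) scoring matrix for
# selecting a test automation tool. Criteria, weights, and scores are
# hypothetical examples, not real evaluation results.
CRITERIA_WEIGHTS = {
    "app_compatibility": 0.4,   # works with the application under test
    "infrastructure_fit": 0.3,  # supported by the existing environment
    "team_skills": 0.2,         # learning curve for the test team
    "license_cost": 0.1,        # lower cost scores higher
}

# Scores on a 1-5 scale per candidate tool (illustrative).
candidate_scores = {
    "Rational Functional Tester": {"app_compatibility": 4, "infrastructure_fit": 5,
                                   "team_skills": 4, "license_cost": 2},
    "Quick Test Pro":             {"app_compatibility": 5, "infrastructure_fit": 4,
                                   "team_skills": 3, "license_cost": 2},
    "Selenium":                   {"app_compatibility": 4, "infrastructure_fit": 4,
                                   "team_skills": 5, "license_cost": 5},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

def pick_tool(candidates: dict) -> str:
    """Return the candidate with the highest weighted score."""
    return max(candidates, key=lambda name: weighted_score(candidates[name]))
```

In a real DAR, the criteria and weights would come from the stakeholders and the scores from hands-on trial installations of each tool.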


Operating Systems: Windows, Unix, Linux, Solaris, MS-DOS, Android

Languages: Java, JavaScript, ASP, Blaise, C, C#, Python, SQL, VB, Perl, XML

Databases: SQL Server, DB2, Oracle, MySQL, Access

Software: Microsoft Office, Visio, Lync, SharePoint, InfoPath; IBM Policy Tester; Rational Suite (Rational Functional Tester (RFT), Rational Performance Tester (RPT), Policy Checker, Requisite Pro (ReqPro), Rational Quality Manager, Rational Team Concert); Lotus Notes; BluePrint; Mercury Interactive WinRunner, LoadRunner, Test Director, Quality Center; Oracle; JIRA; Quick Test Pro (QTP); Synergex; PVCS Tracker; CollabNet; TeamForge; Freedom Scientific JAWS Assistive Technology; UML Modeling; Object Oriented Design (OOD); Object Oriented Analysis and Design (OOAD); Web Server Performance Monitors; Database Server Resource Monitors; Web, .NET, UNIX and Mainframe Environments; Windows 2000/NT/XP/7; LDAP; WebLogic; Big-IP Router; Oracle 9i; Siebel; Java; JavaScript; Shell Scripts; C; C#; C++; XML; .NET; JSP; ActiveX; TOAD; JBuilder; VeriSign; Internet Explorer 6.0/7.0; Google Chrome; Netscape; Firefox; PEGA



Lead Secure Code Review Team


  • Manage the Secure Code Review Team (CRT) in performing static code analysis of DHS ICE web and client server applications
  • Schedule and facilitate secure code audit kickoffs and meetings with the development teams to discuss the audit findings for DHS ICE applications
  • Scan and analyze application source code, generate reports, and determine its security posture using SonarQube and HP Fortify static code analyzers
  • Generate and compose reports of secure code audit findings, including violations mapped to the OWASP Top Ten or MITRE’s Common Weakness Enumeration (CWE), and recommend mitigation solutions
  • Assist Information System Security Officers (ISSOs) in tracking the status of Authorities to Operate (ATOs) for various DHS ICE development and production systems
  • Facilitate weekly Code Review Team meetings with the DHS ICE client to assess work performance, report measured progress, and receive client direction
  • Schedule CRT workload, conduct daily scrums, resolve issues and document team activities and progress
  • Compare secure code scan audit results against the industry-standard rate of errors per total lines of code (TLOC) and work with development teams to resolve findings
  • Perform peer-review walkthroughs and inspections of CRT documentation
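As a sketch of the audit-reporting step above, a small script can group scanner findings by CWE identifier and by severity. The findings list and its field names are hypothetical; actual SonarQube and HP Fortify exports use their own schemas:

```python
from collections import Counter

# Hypothetical findings, as might be exported from a static analyzer.
# The field names ("cwe", "severity", "file") are illustrative and are
# not the actual SonarQube or Fortify report schema.
findings = [
    {"cwe": "CWE-89",  "severity": "critical", "file": "login.jsp"},
    {"cwe": "CWE-79",  "severity": "high",     "file": "search.jsp"},
    {"cwe": "CWE-79",  "severity": "high",     "file": "results.jsp"},
    {"cwe": "CWE-798", "severity": "medium",   "file": "db.py"},
]

def summarize(findings: list) -> dict:
    """Count findings per CWE and per severity for the audit report."""
    return {
        "by_cwe": Counter(f["cwe"] for f in findings),
        "by_severity": Counter(f["severity"] for f in findings),
    }

summary = summarize(findings)
```

A report generator would then map each CWE entry onto its OWASP Top Ten category and attach the recommended mitigation.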

Sr. Software Systems Engineer


  • Managed program level system engineering phases for CEDCaP (Census Enterprise Data Collection and Processing) projects to include the following systems: 2016 ECON Refile, 2016 ECON COS/ASM, 2017 ECON, 2016 Decennial Test, 2018 Decennial Test, and the 2020 Decennial Census
  • Created program-level test plan to provide guidance for CEDCaP project teams to maintain a cohesive standard
  • Scheduled program milestones such as P-CDR (Program Level Critical Design Review), P-TRR (Program Level Test Readiness Review), and P-ORR (Program Level Operational Readiness Review) for CEDCaP projects
  • Facilitated weekly CEDCaP Integrated Project Team (IPT) meetings to convey, brief and discuss project status, milestones, roadblocks and all other pertinent information with project and program stakeholders
  • Facilitated weekly "Mind Meld" meetings among CEDCaP, EDITE (Enterprise Development Integration and Test Environment), and ETSB (Enterprise Testing Services Branch) to discuss and resolve project roadblocks with program and project leadership
  • Tracked CEDCaP project security impact analysis and ATOs (Authority To Operate) for CEDCaP cloud and non-cloud development and production environments
  • Produced program level guidance documents for CEDCaP project teams such as: Configuration Management Plan, Performance Management Plan and Test Implementation Plan
  • Proactively worked with CEDCaP project teams to ensure projects meet exit criteria in preparation for the enterprise Test Readiness Review (TRR)

Software Testing and Process Improvement Trainer


  • Performed hands-on manual and automated Selenium testing for the Affordable Care Act (ACA) application
  • Trained software testers on test methodologies and test management techniques using standards and guidance of the ISTQB (International Software Testing Qualifications Board)
  • Performed static testing of development artifacts, such as requirements specifications and architectural designs, before those artifacts were implemented
  • Used black-box, experience-based, and white-box techniques to compose comprehensive test designs ensuring full, in-depth test coverage of requirements
  • Wrote master test plan to address multiple levels of testing (unit testing, functional testing, system testing, User Acceptance Testing (UAT), performance testing)
  • Reviewed system requirements, architectural designs, and risk reports to identify and document test conditions to be verified by one or more test cases
  • Performed test case prioritization reviews to determine the order in which the test cases should be run given the time allotted for the testing phase
  • Wrote and executed manual test cases and identified and scripted test cases that were candidates for automation using Selenium test automation software
  • Created, loaded, and verified test data using XML templates
  • Documented defects using JIRA and communicated with developers and stakeholders to determine the priority of defect fixes
  • Ensured full traceability of system requirements, test design, test cases and defects to ensure full accountability of all testable requirements
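The requirement-to-test-case traceability described above can be sketched as a simple coverage check. The requirement IDs and test cases below are made up for illustration:

```python
# Hypothetical requirement IDs and test cases; illustrative only.
requirements = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]

test_cases = {
    "TC-101": ["REQ-1", "REQ-2"],  # requirements each test case verifies
    "TC-102": ["REQ-2"],
    "TC-103": ["REQ-4"],
}

def untraced(requirements, test_cases):
    """Return requirements not covered by any test case."""
    covered = {req for reqs in test_cases.values() for req in reqs}
    return [r for r in requirements if r not in covered]

gaps = untraced(requirements, test_cases)  # REQ-3 has no test case
```

In practice a tool such as JIRA or Rational Quality Manager maintains these links; the check is the same: every testable requirement must map to at least one test case.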

Software Test Engineer


  • Performed hands-on testing of new and legacy applications developed in various languages, including Java, .NET, C, C#, and Python, on Unix and Linux platforms
  • Assessed projects and developed a testing approach based on the availability of the test basis; approaches included requirements-based, risk-based, and reactive testing
  • Wrote SQL queries to verify test data, create test data, and troubleshoot issues
  • Used Rational Suite to manage tests, provide traceability, capture metrics, and develop metric reports
  • Conducted hands-on functional (black-box), backend (white-box), performance (non-functional), and Section 508 testing of websites and web applications
  • Performed test analysis of projects to determine and document test conditions, reviewed them with stakeholders, and performed test design to determine how to test each condition
  • Wrote, modified, and tuned test automation scripts for regression testing and retesting, often using elementary Java code to optimize the scripts
  • Created a regression test suite specific to each project using Rational Functional Tester
  • Facilitated defect review meetings with developers to discuss defect validity, fixes, and planned release dates
  • Created a new process improvement workflow to incorporate Forest Service corporate Section 508 standards into the Requirements and Development phases of the Software Development Life Cycle (SDLC), and trained FS staff on the new process
  • Created and ran automation scripts for Section 508 testing using Rational Policy Tester
  • Tested websites and web applications against the Section 508 Communication and IT standards (1194.21, 1194.22, 1194.23, 1194.24, 1194.25, and 1194.26)
  • Worked with IBM to install software updates, troubleshoot problems, and resolve inconsistencies
  • Reported Section 508 violations to stakeholders and worked with the development team to fix non-compliances
  • Set up the test environment and established parameters for system load and stress tests, using LoadRunner to create V-Users and assess system performance under expected user load
  • Rational Requisite Pro
  • Rational Functional Tester
  • Rational Quality Manager
  • Rational Team Concert
  • Rational Policy Tester
  • SQL Developer
  • LoadRunner
  • Performance Tester
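The SQL-based test data verification mentioned above can be illustrated with a self-contained sqlite3 example. The schema and rows here are invented; the actual work ran comparable queries against Oracle and DB2:

```python
import sqlite3

# In-memory stand-in for the project database; the real verification ran
# against Oracle/DB2, and this schema is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE respondents (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO respondents VALUES (?, ?)",
                 [(1, "complete"), (2, "partial"), (3, "complete")])

# Verify test data: count records in each status before running a test,
# so the test's preconditions are known to hold.
rows = conn.execute(
    "SELECT status, COUNT(*) FROM respondents GROUP BY status ORDER BY status"
).fetchall()
# rows == [("complete", 2), ("partial", 1)]
```

The same pattern (a grouped count run before and after a test step) is a quick way to create, validate, or troubleshoot test data.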

Technical Environment: Windows 2000, Windows 7, Unix, Linux, Android, Apple, Java, JavaScript, .NET, Oracle 9i, Oracle 10g, Oracle Forms, IBM Rational Suite, HP LoadRunner, Visio, Lync, SharePoint, InfoPath; IBM Policy Tester; Rational Suite (Rational Functional Tester (RFT), Rational Performance Tester (RPT), Policy Checker, Requisite Pro (ReqPro), Rational Quality Manager, Rational Team Concert); Lotus Notes; BluePrint; CollabNet; TeamForge; Freedom Scientific JAWS Assistive Technology; VeriSign


CMMI Manager


  • Streamlined a process that normally takes 2 - 2.5 years to 1.5 years
  • Used MS Project to identify and track project tasks and deadlines
  • Played key role in CMMI SCAMPI A inspection by SEI Carnegie Mellon
  • Applied CMMI Specific Goals and Specific Practices to assess and improve the process for secure products
  • Conducted hands-on testing of software applications for the AT&L website redesign
  • Wrote the Software System Test Plan to establish scope, resources, test types, duration, inputs, outputs, and test expectations
  • Analyzed requirements and design to identify what to test and to assess the testability of the requirements
  • Designed tests to include entry and exit criteria, pre-conditions, and expected results
  • Identified, documented, and tracked defects until closure using PVCS Tracker
  • Conducted functional, system, and system integration testing of application releases
  • Created and ran manual tests and assessed which tests were candidates for automation
  • Created and ran automated retests and regression tests using HP Quick Test Pro (QTP)
  • Wrote test scripts and debugged the Java code to ensure script automation ran correctly
  • Created data-driven tests that allowed scripts to link to a spreadsheet, extract the data, plug in that data, and execute the test to reach the expected results
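The data-driven testing pattern described above (scripts pulling inputs and expected results from a spreadsheet) can be sketched with Python's csv module. The column names and the function under test are hypothetical stand-ins:

```python
import csv
import io

# Hypothetical spreadsheet export: input values and expected results.
# Real data-driven QTP scripts read these rows from an Excel data table.
SHEET = """a,b,expected
2,3,5
10,-4,6
0,0,0
"""

def add(a: int, b: int) -> int:
    """Stand-in for the function under test."""
    return a + b

def run_data_driven(sheet_text: str):
    """Execute one test per spreadsheet row and collect pass/fail results."""
    results = []
    for row in csv.DictReader(io.StringIO(sheet_text)):
        actual = add(int(row["a"]), int(row["b"]))
        results.append(actual == int(row["expected"]))
    return results

outcomes = run_data_driven(SHEET)  # one boolean per data row
```

Keeping the data in the spreadsheet lets testers add cases without touching the script, which is the point of the data-driven design described above.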

Technical Environment: Windows, Java, .Net, ASP, Oracle DB, Linux, UNIX, Solaris, TOAD


Software Test Engineer Contractor


  • Managed team of 5 manual and automated software test engineers
  • Conducted test planning to assess the maturity of XM’s test practices and to determine the monitoring and control techniques to use (i.e., how metrics would be captured, which metrics would be captured, and how they would be reported)
  • Performed test analysis of test basis materials inclusive of loose requirements, use cases, and SME input to determine test conditions (what to test) and conducted test analysis review with stakeholders
  • Assessed which test techniques (white-box or black-box) to use and assigned workload based upon the test team’s capability matrix
  • Performed test design to ensure requirements and risks coverage and to detail entry and exit criteria
  • Prioritized test execution according to priority level of requirements and risks
  • Assigned workload to test staff for manual and automated testing
  • Performed automated and manual testing of a primarily Java written application that also included JavaScript, C# and Python
  • Used QTP and Selenium to automate regression tests run against Internet Explorer, Google Chrome, and Firefox browsers
  • Documented and managed defects in PVCS Tracker
  • Facilitated defect status meeting with stakeholders to assess and prioritize defect fixes
  • Facilitated User Acceptance Test, captured and prioritized user found defects and comments and presented to stakeholders for resolution
  • Conducted performance test using LoadRunner to assess system performance against the expected load
  • Conducted stress tests using LoadRunner to assess which component(s) of the system was/were most likely to underperform
  • Performed test closure activities (test cases concluded, source documents in repository, data forwarded to maintenance team, lessons learned documented)
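The execution-prioritization step above (ordering tests by requirement priority and risk) can be sketched as a simple sort. The test cases and their priority/risk values are illustrative:

```python
# Hypothetical test cases tagged with requirement priority and risk level;
# the IDs and values are made up for illustration.
PRIORITY_RANK = {"high": 0, "medium": 1, "low": 2}

test_cases = [
    {"id": "TC-1", "req_priority": "medium", "risk": "low"},
    {"id": "TC-2", "req_priority": "high",   "risk": "high"},
    {"id": "TC-3", "req_priority": "high",   "risk": "medium"},
    {"id": "TC-4", "req_priority": "low",    "risk": "high"},
]

def execution_order(cases):
    """Order tests by requirement priority first, then by risk level."""
    return [c["id"] for c in sorted(
        cases,
        key=lambda c: (PRIORITY_RANK[c["req_priority"]],
                       PRIORITY_RANK[c["risk"]]))]

order = execution_order(test_cases)
```

If the testing window closes early, the highest-priority, highest-risk cases have already run, which is the rationale for prioritizing execution this way.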

Technical Environment: Windows 2000, Unix, Linux, Java, JavaScript, C#, .Net, J Builder, Oracle DB, PVCS Tracker, Selenium, HP QTP, HP LoadRunner


Software Test Engineer - Contractor


  • Participated in the project life cycle of multiple census projects: DADs (Dangerous Address Database), CenCATI (Census Computer Automated Telephone Interviewing), WebCATI (Web Computer Automated Telephone Interviewing), ICATI, PFIRS (Paperless Fax Image Reporting System), CAPI (Computer Assisted Personal Interviewing), ACS (American Community Survey).
  • Used UML models and use case development approaches to conduct testing on projects following an agile, iterative SDLC process
  • Performed hands-on functional testing of applications’ features that were planned, designed and built using agile process
  • Used WinRunner (predecessor of Quick Test Pro (QTP)) to automate regression tests
  • Wrote code to modify and tailor WinRunner test scripts to enhance efficiency of automated tests against Java developed applications and .NET developed applications
  • Participated in requirements reviews to establish test planning and test control approaches and test techniques for projects that used a sequential SDLC process and for projects that used an agile SDLC process
  • Conducted test analysis to determine test conditions (what to test) by reviewing test basis sources such as requirements, high level designs, low level designs and documented risks for projects following a sequential SDLC process (e.g., V-Model)
  • Conducted test design to ensure requirements coverage, identify gaps (sequential processes) and to ensure functional coverage of features (agile processes)
  • Created project master test plans to document the test strategy, test approach, test levels, test techniques, test schedule, test entry/exit criteria, test metrics plan, etc.
  • Effectively used Test Director (now called Application Lifecycle Management) to manage test cases, defect status and maintain traceability
  • Efficiently produced metric reports using Test Director to control the test process, identify problem areas and collaborated with project team to resolve discrepancies
  • Wrote SQL queries to verify, validate, create or modify test data and to troubleshoot defects
  • Successfully used LoadRunner to create V-Users (virtual users) simulating anticipated user volume on the system to assess system performance
  • Participated in performance test reviews, performance test tweaks and performance re-tests to meet performance test objectives
  • Managed User Acceptance Test, documented, prioritized user found defects and comments
  • Participated in the Change Control Board (CCB) to discuss, assess, approve, or disapprove proposed changes to the system.
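The V-User concept above (many simulated users hitting the system concurrently) can be sketched with threads driving a stub transaction. The thread count and the stub are illustrative; this is not LoadRunner itself, only the idea behind it:

```python
import threading
import time

def transaction():
    """Stub for one user transaction; real V-Users drive the application."""
    time.sleep(0.01)  # simulated server response time
    return True

def run_virtual_users(n_users: int, results: list):
    """Launch n_users concurrent 'virtual users' and record each outcome."""
    lock = threading.Lock()

    def user():
        ok = transaction()
        with lock:  # protect the shared results list
            results.append(ok)

    threads = [threading.Thread(target=user) for _ in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

results = []
run_virtual_users(20, results)  # 20 concurrent simulated users
```

A load test would ramp n_users up to the expected volume and record response times; a stress test would keep raising it until a component underperforms.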

Technical Environment: Mercury Interactive WinRunner, Selenium, LoadRunner, Test Director, Quality Center, Java, JavaScript, ASP, Blaise, C, C#, Oracle, WebPortal, Internet Explorer, Mozilla Firefox, Google Chrome, Netscape, Linux, UNIX, Python, SQL, VB, Perl, XML; UML Modeling; Object Oriented Design (OOD); Object Oriented Analysis and Design (OOAD); Web Server Performance Monitors; Database Server Resource Monitors; LDAP; WebLogic; Big-IP Router; Siebel; .NET; JSP; ActiveX; TOAD
