OpTech’s Quality Control Plan

OpTech’s quality goals are to achieve the Performance Standards set by our client by providing and implementing the procedures necessary to maintain control and ensure continuous quality improvements in each functional area for each task. The OpTech approach prevents deficiencies through a systematic approach that continuously analyzes, designs, develops, implements, and evaluates performance. OpTech recognizes that quality is achieved by maintaining repeatable, scheduled, documented, and managed processes and by properly motivating and assigning responsibility and oversight through a Quality Management System based on ISO and SEI/CMMI principles.

The first step in delivering quality is an accurate understanding of the formal deliverables and work products. OpTech will work with the Navy to gain a complete understanding of each Task Order (Call) and the associated deliverables.

The second step is identifying and integrating the Navy’s policies, templates and guidelines into OpTech’s Quality Management Plan so that we comply with all of the Navy’s standards and policies. Two of our Contract Teaming Partners (Sayres & Associates and Phoenix Group) are already highly successful vendors to the Navy, and their experience and understanding of the Navy’s policies and standards will be very beneficial in adapting our Project and Quality Management Plans to the Navy’s environment even prior to the first Task Order Call.

The third step in establishing our Quality Control Plan is to monitor and maximize quality through a standard, repeatable quality-control program. This program has the following key elements:
• Basing its quality program on ISO, CMMI and other proven processes
• A SharePoint site providing quality reporting to Navy customers and OpTech team members
• Performance metrics quantifying the meeting of customer expectations and SLAs
• The quality specialist reporting results directly to the OpTech Program Manager on all Task Order Calls managed by the OpTech Team
• Workers having primary responsibility for quality
• Management performing random quality inspections
• Clear, documented processes and plans covering all work
• Actively soliciting feedback from customers
• Incorporating lessons learned into new procedures
• Tracking problems until issues are corrected
• Conducting performance evaluation meetings with customers
• Implementing performance-based management for today’s solutions

OpTech will institute a specific Quality Control Plan (QCP) for each Task Order Call awarded under SeaPort-e. The QCP defines standards, techniques for measuring compliance, an approach for eliminating causes of unsatisfactory performance, and the associated roles, responsibilities, and resources. OpTech’s quality control system is a key contributor to its ability to deliver high-quality products and outcomes. This system is documented and is used for all contract work.

OpTech has a defined approach, method of surveillance, and incentives as part of our QCP that ensure quality standards are met. The following list identifies the focus of OpTech’s quality standards:
• Schedule – deliverables and reports submitted on time
• Budget – meet cost and budget constraints
• Quality – ensure accuracy, completeness, and minimal errors and rework
• Security and information assurance – meet the Navy’s requirements

The OpTech team has extensive experience successfully managing and executing performance-based contracts.
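As an illustration of how the schedule standard above might be quantified against an SLA, the following is a minimal sketch; the deliverable names, dates and 95% target are invented for the example and are not taken from the plan:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Deliverable:
    name: str
    due: date
    delivered: Optional[date]  # None until submitted

def on_time_rate(items: List[Deliverable]) -> float:
    """Fraction of submitted deliverables that met their due date."""
    submitted = [d for d in items if d.delivered is not None]
    if not submitted:
        return 1.0  # nothing submitted yet, so nothing is late
    return sum(d.delivered <= d.due for d in submitted) / len(submitted)

# Hypothetical month: one on-time deliverable, one a day late.
items = [
    Deliverable("Monthly Status Report", date(2024, 3, 1), date(2024, 3, 1)),
    Deliverable("QA Audit Checklist", date(2024, 3, 15), date(2024, 3, 16)),
]
SLA_TARGET = 0.95  # assumed SLA threshold, set per task in practice
rate = on_time_rate(items)
print(f"On-time rate {rate:.0%} vs. SLA target {SLA_TARGET:.0%}")
```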
OpTech’s approach facilitates partnering with customers and ensures that all performance standards are met. OpTech engages the customer during the entire project life cycle, which results in customer input on deliverables, minimizes problems, and eliminates last-minute surprises. OpTech’s Quality Control Plan assures a continuous cycle of excellence.

OpTech’s Quality Management System is based on the ISO 9001:2000 standard. Our quality management process addresses the Navy’s concerns in the areas of product and services delivery, customer satisfaction, cost, future maintainability and future consolidation. OpTech’s QA/QC processes incorporate the key practices of the SEI CMMI into our Business Practice System Requirements and Business Practice Standards (BPS) documentation – the guidebooks of our technical and management processes and procedures. The crosswalk between the SEI CMMI processes and sample BPSs is shown in Exhibit 001. We utilize these CMMI-derived practices daily to determine our ability to fulfill the goals set forth by our clients. Implementation of and adherence to these practices enables OpTech and all our employees to meet the industry and governmental standards necessary for Continuous Process Improvement (CPI), growth and performance in all contractual matters.

SEI CMMI Commitments and Standard Practice Reference Table

SEI CMMI Process – OpTech Reference
• Business Process Manual – O-BPS-A10
• Program Control Review – O-BPS-A20
• Schedule Management – O-BPS-D20
• Business Practices and Contracts – O-BPS-D30
• Integrated Engineering – O-BPS-E05
• Engineering Plan & Estimation – O-BPS-E10
• Integrated Engineering Tracking – O-BPS-E15
• Inspections – O-BPS-E20
• System Requirements Definition – O-BPS-E25
• System Design – O-BPS-E30
• Product Design – O-BPS-E35
• Product Development – O-BPS-E40
• System Integration and Test – O-BPS-E45
• Engineering Changes – O-BPS-E50
• Technical Manuals – O-BPS-E55
• Maintenance and Support – O-BPS-E70
• Configuration Management – O-BPS-F10
• Problem Management – O-BPS-F20
• Deployment – O-BPS-F40
• System Ops & Maintenance – O-BPS-F50
• Subcontractor Management – O-BPS-F55
• Quality Assurance – O-BPS-G10

Exhibit 001 - Standard Practice Reference Table

OpTech’s management follows the appropriate Business Practice System (BPS) in the performance of our tasks and flows these processes and procedures down to all projects. We also offer our clients the opportunity to modify these procedures to their in-house standards, or to use ours and move to a higher SEI CMMI level.

OpTech maintains documented policies, plans and practices (each marked “X” in the original exhibit) for the following CMMI process areas:
1. Requirements Management
2. Project Planning
3. Project Monitoring and Control
4. Supplier Agreement Requirements
5. Process and Product Quality Assurance
6. Configuration Management
7. Measurement and Analysis
8. Organizational Process Focus
9. Organizational Process Definition
10. Organizational Process Training
11. Integrated Process Management
12. Requirements Definition
13. Technical Solutions
14. Product Integration
15. Verification
16. Validation
17. Risk Management
18. Decision Analysis and Resolution

Exhibit 002 - CMMI Policies, Plans and Practices

OpTech’s Quality Control Plan includes Quality Assurance (QA), Configuration Management (CM), a Metrics Program and a Continuous Process Improvement Program (CPIP) for all project tasks. OpTech offers all of these elements, or selected ones of the Navy’s choosing, to be applied on tasked projects.
Each element of the Quality Control Plan is described below, with detailed discussion of how deficiencies are identified, who is responsible for staffing each activity, and how our QC Plan supports the Navy’s requirements. OpTech has a draft Quality Control Plan and (if tasked) will develop and submit a specific Quality Control Plan to the Navy for each task as our first quality work product. The plan will specify roles and responsibilities, scheduled audits with checklists, and details on problem reporting.

Quality Assurance Approach

Our approach to providing QA will be to tailor our standard QA practices to support the Navy PgM and all TOs. Our PMP will also document our QA/QC program for the Navy TOs. Our QA Plan (if tasked) will be tailored to the needs of each Navy task/project and included in the Navy Project Plan. It will cover services, products and processes as requested by the Navy tasks. That is, both deliverables and the processes used to develop the deliverables will be subject to our rigorous QA approach.

OpTech has formed an internal Software Engineering Process Group (SEPG) to continuously improve our engineering processes. The SEPG consists of representatives from various OpTech functional groups, including systems engineering, software development, QA, CM and technical writing, as well as the program managers or their representatives. The Navy can take advantage of and benefit from this group, for example through lessons learned from other programs in the commercial sector. The SEPG’s goal is to improve the software engineering process by streamlining the sequence of steps through which a project or an activity must progress. The SEPG meets bi-weekly to review all the company’s system and software development processes and to discuss all suggested or recommended improvements. It reviews and approves all amendments to current processes and any introduction of a new process. The SEPG also monitors the company’s success in complying with the SEI CMMI processes. Data from trouble reports, peer reviews, customer feedback and QA reviews are used to improve processes. OpTech’s continuous improvement program is iterative and uses metrics extensively.

OpTech’s QA places emphasis on identifying and recording design and programming discrepancies (and errors) as early as possible in the development process, recognizing that the later in the process an error is detected, the more costly it is to correct. QA embodies the major elements of testing, system evaluation, and configuration management, in addition to specific mechanisms for QA monitoring throughout the system development project.

Quality Surveillance Plan
• Introduction
• Scope
• Document Review
• Updates
• Reference Documents
• Roles and Responsibilities: Program Manager, Task Manager, Corporate QA & CM, QA Engineer
• Engineering Planning Quality
• Engineering Quality Surveillance
• Plan Implementation

QA Activities Matrix
• Engineering Reviews
• Audits / Checklists
• QA Activity Schedule
• Problem Reporting
• QA Activity Reporting
• Quality Records
• Monthly QA Reports
• Performance Metrics Collection, Analysis, and Reporting
• Non-Compliance & Non-Performance Escalation Planning

Performance-Based Solutions
• Performance metrics assigned at the WBS Task Level – earned value, # of defects, timeliness, # of changes, activities, etc.; the best metric is chosen for each WBS element
• Checklist / audit form for each metric
• Independent data collection and reporting

Exhibit 003 – Quality Surveillance Plan
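Exhibit 003 names earned value among the WBS-level performance metrics. As a minimal sketch of how such an indicator might be computed, the following applies the standard earned-value index formulas to invented WBS numbers (not OpTech or Navy data):

```python
def evm_indices(bcws: float, bcwp: float, acwp: float) -> tuple:
    """Standard earned-value indices for one WBS element:
    SPI = BCWP / BCWS (schedule), CPI = BCWP / ACWP (cost)."""
    return bcwp / bcws, bcwp / acwp

# Hypothetical WBS element: $100k planned, $90k earned, $95k actual cost.
spi, cpi = evm_indices(bcws=100_000, bcwp=90_000, acwp=95_000)
print(f"SPI={spi:.2f} (<1 means behind schedule), CPI={cpi:.2f} (<1 means over cost)")
```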
The specific QA mechanisms used are technical reviews or walk-throughs and formal QA reviews. In a walk-through, a team of functional and technical experts (identified through OpTech’s SEPG) is assembled to examine products of the life cycle for correctness and technical quality. The QA review is a management-level review conducted to ensure that methods and standards have been observed that comply with the Navy’s QA goals. Formal inspections throughout the life cycle have proven to deliver the largest return on investment of any QA technique available within the software industry. Inspection is a proactive approach to error detection and resolution, and it has the added benefit of increasing communication and learning.

OpTech (if tasked) will assign a QA Engineer to each task order. The QA Engineer may be responsible for more than one task order, based on the level of complexity. The QA Engineer will be part of the team overseeing OpTech’s operations and maintenance of the Navy systems and will report to the Quality Control Manager. This QA Engineer will ensure that all aspects of the software development lifecycle have quality standards and metrics that are implemented in accordance with proven business practices and the Navy’s requirements.

Every project will have a Quality Surveillance Plan. Exhibit 003 provides a “quick glance” Quality Surveillance Plan. This QASP will provide evidence that the Navy-approved Quality Control Plan is continuously and consistently applied on all Task Order Calls and embedded in the management processes for all work that OpTech performs across this BPA.

The underlying concept of OpTech’s Quality Assurance Surveillance Plan (QASP) approach is the proactive establishment of standards and expectations that provide consistency throughout the project and enable quality to become a component of the delivered service or product, rather than a post-development activity. The key objectives of our QASP approach are:
• To provide assistance in managing, monitoring and directing the projects.
• To establish a quality-focused culture whereby projects are defined, planned, executed and controlled in a manner that meets their business objectives and completes them on schedule and within budget.
• To review and assess third-party vendors, contract teaming partners and subcontractors in terms of their performance of project tasks and development of work products that meet pre-defined quality standards or service level agreements.
• To provide continuous and ongoing counsel and advice to client project management, including performing activities to proactively identify issues and problems and to provide recommendations to address real and potential issues.
• To evaluate development and implementation project methodologies, approaches and work plans, assist in problem solving and develop/evaluate alternative approaches for conducting tasks and meeting expectations.
• To provide independent and objective analysis and assessments of processes and products and assist in resolving project-related disputes.

The QA Engineer will perform QA activities as an independent agent using information collected by our management tools and techniques. We will post QA checklists and evaluations in our Project Activity Collaboration Folders, which will be used to report the status of QA activities.
QA activities will include participating in, and recording minutes from, various meetings. The QA function will be separated organizationally from the project development staff to avoid the potential for, and appearance of, a conflict of interest. The QA Engineer is responsible for:
• Conducting reviews of life cycle products and providing feedback to the project team
• Reviewing systems for conformance to requirements and design
• Assuring the quality of the data through the entire life cycle of the project
• Participating in the testing of new or modified software
• Participating in developing the disposition plan for the system

The QA process will dictate the means by which the QA requirements established for each system development effort will be met. The degree of QA applied and the level of detail contained in the process will be appropriately tailored to, and consistent with, the complexity, size, intended use, mission-criticality and cost of failure of each system development effort. To ensure that the appropriate levels of QA activities are defined for each task and project, OpTech will:
• Identify the specific purpose and scope of the QCP for each task and project
• Identify any deliverables needed for the QA review
• Identify any required documentation
• Establish traceability of requirements through the execution of reviews and audits
• Identify the level of testing and evaluate the results
• Resolve problems that result from reviews, audits and tests

The PgM and a Team QA resource will utilize a well-defined, proven Quality Control Plan (QCP) that provides plans and techniques for verification and validation of the applications and data that we develop and support, and that ensures a quality product is delivered to the Navy. Our QCPs are derived from our codified and structured Quality Management System (QMS) to ensure that projects receive quality services and are entered into our Continuous Process Improvement Cycle (CPIC) of support. The QA process (described in Exhibit 004 below) helps to monitor and control project services by identifying and correcting deficiencies in performance before they become unacceptable, and assists in continuous improvement of service. Our approach is to baseline existing QA processes and validate current metrics and objectives, or to mutually determine them with the Navy or the stakeholders/users. We then work with the project personnel and stakeholders to determine the sufficiency and effectiveness of documented processes and measures. OpTech provides weekly, monthly and quarterly reports to our customer on quality activities and the effectiveness of our Quality Control Plan.
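As a simple sketch of how the deficiency monitoring described above (and the monthly compliance audits shown in Exhibit 004 below) might be rolled up against an acceptable quality level, consider the following; the process areas, pass/fail results and 90% threshold are assumptions for illustration only:

```python
# Hypothetical pass/fail audit results for three process areas in one month.
audits = {
    "Configuration Management": [True, True, True, False],
    "Problem Reporting":        [True, True, True, True],
    "Document Review":          [True, False, True, True],
}
AQL = 0.90  # assumed acceptable quality level, set per task in practice

for area, checks in audits.items():
    rate = sum(checks) / len(checks)
    verdict = "meets AQL" if rate >= AQL else "corrective action required"
    print(f"{area}: {rate:.0%} compliant -> {verdict}")
```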
Develop / Update Plan and Metrics
• Plans: Quality Control Plan; Metrics and Measurements; Performance Standards; Communication Process; Corrective Action Procedures; Continuous Process Improvement Plan (Implementation Plan)
• Metrics: Define metrics and performance standards; define acceptable quality levels

Assess Program Performance
• Performance Measurements: Monthly compliance audits; ongoing monitoring of metrics; internal assessment process
• Measure, collect, validate and analyze

Use Results to Inform & Improve Performance
• Improvement Efforts: Proactive corrective action; identify and implement program improvements; revise Quality Control Plan (as required)
• Reporting: Daily, weekly, and monthly reports; quarterly reviews; metrics review (as needed)
• Update plans and metrics

Exhibit 004 - Team OpTech QA Process. Our QA process ensures customer satisfaction.

Operations and Maintenance (O&M): The O&M task represents one of the most challenging aspects of operating an IT portfolio of applications and supporting infrastructure. Our approach encompasses producing a current-state assessment of the operating environment. This assessment includes:
• Policies and Procedures
• Business Partnership Management
• Development Methodologies
• Technology Platform
• Infrastructure
• Quality of Service

Upon assessment completion we will develop a set of recommendations with the Navy’s guidance. The assessment will serve as the foundational O&M roadmap for transitioning the Navy organization to a high-performing shared services model. This roadmap may include, but is not limited to, application, platform and service integration and consolidation to achieve reduced costs and improved service responsiveness. Our systematic approach is based on leveraging our many years of experience implementing successful IT O&M projects for Federal customers that include FTA, FAA, Department of Transportation (DOT), U.S. Coast Guard (USCG), Department of Labor (DOL), and Health and Human Services (HHS). Our standard approach employs our CMMI ML 2/3 and ITIL best practices to derive and deliver efficiencies and assure successful outcomes. We consistently use the right resources (people, processes, tools and technologies) to achieve optimal outcomes and reduce risks for our customers. We will collaborate with the Navy’s infrastructure stakeholders to clarify, document and confirm their requirements and performance criteria, and we implement IT strategies and plans that satisfy client needs. We apply current Federal and industry standards and best practices to all tasks related to operating and maintaining our customers’ IT infrastructure.
We employ directed performance and service-level measurement and improvement techniques applicable to the infrastructure to continuously improve mission outcomes and enhance productivity. We ensure optimum infrastructure use, support emerging technology evolution and innovation within the existing infrastructure, maintain skill currency, and ensure consistent IT support for the business processes. Our PMs and Task Leads define and implement formal management processes and plans for change and configuration, performance, service assets, security, and incidents and problems. We initiate an evaluation of the existing IT environment; assess emerging technologies; plan enhancements, reengineering, and consolidations; and conduct feasibility studies and prototypes.

Our PgM will produce and deliver a Risk Management Plan (RMP) to the CO and COR that contains processes based upon best practices (including the SEI CMMI Risk Management Guide) and includes risk identification, management and mitigation. We have gained experience and expertise in RM processes through our stewardship and performance on our many government projects. Responsibility for the RMP rests with the PM for planning and monitoring, and with all technical and project personnel for identifying, reporting and mitigating risks in their areas of work and responsibility. Risk management commences immediately and continues consistently throughout the Period of Performance. All management personnel will be responsible for reporting risk, mitigation and the results of these activities at weekly and monthly program meetings.

Exhibit 005 - Risk Management Process. Our risk management process is institutionalized in a Risk Management Plan that allows the PM to monitor, investigate, and mitigate risks throughout the project(s) so that problems do not arise.

Our RMP approach incorporates a continuing, closed-loop review and analysis of technical, programmatic, cost, and schedule risks. We have mitigated and eliminated risk throughout our performance on projects by selectively hiring the best-qualified technical personnel, documenting processes and procedures and standardizing them through SOPs, instituting standardized and relevant training and technical upgrading, and forming close relationships with client personnel for the coordination of information and processes. We will proactively detect, address, escalate, and resolve problems using a five-step risk management process, which includes documenting and tracking all identified risks/problems from point of discovery through resolution (Exhibit 005).

The five steps are: Identifying Risks; Assessing and Analyzing Risks; Planning Risk Response; Executing Risk Response; and Tracking, Monitoring, and Controlling Risk. Combined, these steps provide assurance that we will quickly identify, assess, and resolve each risk before it can have an impact on the solution.

A program/task risk is a potential future event which, if it occurs, will result in an undesirable outcome. Risks are characterized by the following three attributes:
• A definable event
• Probability of occurrence – there must be a meaningful possibility that the future event may actually occur
• Consequence of occurrence – there must be some negative consequence associated with its occurrence

Project risks will be further identified and documented (Project Risk Identification and Mitigation) in the PMP, which will be created and finalized within 30 days of contract startup.
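As a small sketch of how a risk might be stepped through this five-step process from discovery to resolution, consider the following; the step names come from the text above, while the enum-based tracker itself is purely an illustrative assumption:

```python
from enum import Enum

class RiskStep(Enum):
    """The five risk management steps named in the text, in order."""
    IDENTIFY = 1
    ASSESS_AND_ANALYZE = 2
    PLAN_RESPONSE = 3
    EXECUTE_RESPONSE = 4
    TRACK_MONITOR_CONTROL = 5

def next_step(step: RiskStep) -> RiskStep:
    """Advance a risk to the next step; the final step repeats until closure."""
    return RiskStep(min(step.value + 1, RiskStep.TRACK_MONITOR_CONTROL.value))

# Walk a newly discovered risk through the process.
step = RiskStep.IDENTIFY
print(step.name)
while step is not RiskStep.TRACK_MONITOR_CONTROL:
    step = next_step(step)
    print(step.name)
```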
The project risk table will be monitored and updated continuously throughout the project. Risk will be a topic for discussion at project kickoff and at each status meeting. Risk assessments will be updated with customer, stakeholder, and user input, and each group will also be consulted for updated risk assessments as the project develops. Once risk items are identified:
• They are captured in a Risk Identification and Mitigation Table and assigned to a specific manager or technical person for monitoring and status updating
• The level of risk impact is assessed and a mitigation decision is made at the appropriate level (timely decisions prevent reactive and/or negative outcomes)
• Rigorous risk identification and continuing reassessment of exposure throughout program activities minimize impact
• Implemented mitigation plans are cost-effective
• Contingency plans are developed early to avoid costly ad hoc or reactive actions
• A mitigation strategy is decided upon at the appropriate level
• Risk is updated at each status meeting

Risk management is the overall responsibility of the PM. The PM will assign project personnel to implement risk mitigation efforts and monitor identified risk items as needed. Initial risk assessment covers project risk, process risk, customer risk, and technical risk. PMP Table T.1 identifies the risk categories that Team OpTech uses in its projects.

• Transition Risk – risks associated with transitioning from one environment and contractor to another
• Business Impact – risks associated with the constraints imposed by management or the customer
• Process Definition – risks associated with the degree to which the process is defined and followed by the organization
• Customer Characteristics – risks associated with the sophistication of the customer and the ability to communicate with the customer in a timely manner
• Operations Environment – risks associated with the availability and quality of the tools to be used to build the product
• Future Technology – risks associated with the complexity of the system to be built and the “newness” of the technology packaged by the system
• Staff Size and Experience – risks associated with the overall technical and program experience of the technical resources who do the work, and with staff size

PMP Table T.1 - Risk Identification Categories. Predefined risks are placed in different categories by the PM.

Risk Assessment Method

The PM will use the qualitative risk assessment method. This methodology uses an ordinal rating system: probability and impact may each be qualified as high, moderate, or low, as shown in PMP Table T.2.

• High – Probability: likely to occur, even with special emphasis and close monitoring (60-100%). Impact: serious schedule delays, cost overruns, performance failures, or loss of customer satisfaction.
• Moderate – Probability: has the potential to occur, but special emphasis and close monitoring should reduce the likelihood (30-60%). Impact: some disruption of schedule including deliverables, some cost increases, performance problems, or quality problems.
• Low – Probability: little potential to occur; normal effort and monitoring could prevent it from happening (0-30%). Impact: small schedule delays that do not affect deliverables, small cost increases, performance problems, or quality problems.

PMP Table T.2 - Qualitative Risk Analysis Guidelines. The PM will assess and assign a risk probability to each identified risk.

Risk Exposure. Risk exposure is determined by the risk impact and probability. Risks with a high probability of occurrence and a significant impact represent a serious exposure to the PM; risks of low probability and low impact represent far less serious exposure.
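A minimal sketch of this qualitative method follows, assuming a simple 1-3 numeric mapping of the ordinal ratings from PMP Table T.2 and exposure as their product (a common convention, not one prescribed by the plan); the risks listed are invented examples, and the rank ordering anticipates the prioritization step described next:

```python
# Ordinal ratings from PMP Table T.2 mapped to assumed numeric scores.
RATING = {"low": 1, "moderate": 2, "high": 3}

def exposure(probability: str, impact: str) -> int:
    """Exposure as probability score x impact score (assumed convention)."""
    return RATING[probability] * RATING[impact]

# Hypothetical risk register entries: (title, probability, impact).
risks = [
    ("Transition staffing gap", "high", "moderate"),
    ("Tool license lapse", "low", "low"),
    ("Requirements churn", "moderate", "high"),
]

# Rank-order by exposure, most serious first.
for title, p, i in sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True):
    print(f"{title}: exposure {exposure(p, i)} (P={p}, I={i})")
```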
Risk Prioritization. Risk prioritization is the process of identifying and ranking those risks that will have the greatest impact on the program if they occur. Prioritization gives the team direction in performing risk management and optimizes mitigation resources for the most serious threats to the project. This program uses the exposure level as the basis for prioritizing risks: risks are rank-ordered based on the impact each could have on the program should it materialize.

Risk Response Strategies. Planning risk response strategies includes identifying candidate response strategies for each risk, evaluating the effectiveness of each option, and selecting the best approach. There are generally four possible response strategies, as defined in PMP Table T.3:
• Avoidance – choose a different approach that eliminates the risk but also eliminates the potential benefits of the original approach
• Mitigation – plan specific actions that reduce the probability or potential impact of the risk
• Transfer – transfer all or a portion of the risk to another party that may be better equipped to deal with it
• Contingency – prepare a contingency plan to be implemented should the risk occur, and set aside reserves to cover exposure as appropriate

PMP Table T.3 - Possible Risk Response Strategies. The risk response is predicated on the type and severity of the risk; typically the PgM will choose the most effective and least invasive strategy.

Development and Submission of Risks. Initial program/task risks are identified during program/task planning and re-planning, and new risks are added during the program. Risk identification is an ongoing process that occurs throughout the program or task lifecycle. Potential risks may be submitted to the PM or Task Leader by any member of the program team; they will be analyzed and added to the program risks as appropriate.

Risk Management Database. We create and continually populate a Risk Management Database that identifies potential risks along with mitigation and contingency strategies for each risk (a sketch of one such record appears at the end of this section). The PM will use this database to maintain program and task risk information. This database, which contains information pertaining to all program risks, is reviewed at least monthly.

Review of Program Risks. The PM reviews program and task risks as part of program execution. A review of all risks is performed as part of preparing the monthly Program Review. Major program and task risks are reviewed during the conduct of the Program Review, and other risks may be reviewed as deemed appropriate. Any action items stemming from the review of program and task risks will be captured in the program action item database to control action and ensure resolution.
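To make the Risk Management Database concrete, here is a minimal sketch of what one record might hold, using the four response strategies of PMP Table T.3 and the monthly review cadence described above; the field names and example entries are illustrative assumptions, not actual program data:

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Response(Enum):
    """The four response strategies of PMP Table T.3."""
    AVOIDANCE = "avoidance"
    MITIGATION = "mitigation"
    TRANSFER = "transfer"
    CONTINGENCY = "contingency"

@dataclass
class RiskRecord:
    title: str
    owner: str                 # manager or technical person assigned
    probability: str           # low / moderate / high per PMP Table T.2
    impact: str                # low / moderate / high per PMP Table T.2
    response: Response
    mitigation_plan: str
    contingency_plan: str = ""
    last_reviewed: date = field(default_factory=date.today)

register = [
    RiskRecord("Legacy data migration errors", "Task Lead", "moderate", "high",
               Response.MITIGATION, "Dual-run validation before cutover",
               "Roll back to the legacy system"),
]

# Monthly review: flag any record not reviewed within the last 30 days.
stale = [r.title for r in register if (date.today() - r.last_reviewed).days > 30]
print("Records needing review:", stale or "none")
```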