Notice

The Intelligent Ship: Competition Document

Updated 17 May 2023

Typographical errors were found in the original Terms attached to this page. Please ensure you refer to the V4 Terms now attached.

1. Introduction

Defence wants to understand the design implications, challenges and opportunities of future Command and Control (C2) systems that incorporate multiple Artificial Intelligence (AI)-based agents. These agents are expected to interact and work together as a Collaborative AI system. Defence also needs to understand how to successfully integrate these collaborative AI systems with (potentially smaller teams of) Human Operators and decision makers, forming a complex Human-Autonomy Team (HAT).

Developing our understanding of these future systems is expected to lead to more timely and informed decision-making that can be trusted in a complex and contested data environment.

This DASA competition seeks proposals that will design, optimise and demonstrate an effective HAT based upon collaborative Intelligent Agents and Humans. The primary aim will be to investigate and develop understanding through the design and evaluation of a collaborative AI based HAT system for naval Command and Control.

The capabilities and understanding developed in the project are expected to be exploited in the design and specification of future C2 systems (such as the Future Air Dominance System (FADS)) and in support of connected HAT, autonomy and naval C2 focused research activities.

This phase shall focus in particular on investigating and developing concepts that address the human aspects of a system - i.e. the required roles, interactions, and information and interface needs of both human operators and the matching AI based agents. This phase shall also investigate and further develop arbitration approaches for managing potentially conflicting outputs and proposed Courses of Action (CoA) generated by the AI agents.

2. Competition key information

Submission deadline

6 June 2023 at 12 midday (BST).

Where do I submit my proposal?

Via the DASA Online Submission Service for which you will require an account. Only proposals submitted through the DASA Online Submission Service will be accepted.

Total funding available

The total possible funding available for phase three of this competition is £1.7 Million (ex VAT).
As this is a phase three competition we aim to fund a single proposal for up to the full funding.

Supporting materials

Dial-in 1-2-1 Q&A sessions

18 April 2023 – A series of 15 minute one-to-one teleconference sessions, giving you the opportunity to ask specific questions. If you would like to participate, please register on the Eventbrite page. Booking is on a first come first served basis.

25 April 2023 – A series of 15 minute one-to-one teleconference sessions, giving you the opportunity to ask specific questions. If you would like to participate, please register on the Eventbrite page. Booking is on a first come first served basis.

3. Competition Scope

3.1 Background

This phase three competition builds on the delivered capabilities from the first two phases of the Intelligent Ship competition.

Phases one and two developed both the integration and evaluation ‘sandpit’ (the Intelligent Ship AI Network – ISAIN) and a range of novel and innovative AI based machine agents to integrate within it. The final system’s performance and operator usability were then evaluated against a naval scenario within a simulation environment at Dstl’s Command Lab simulation facility, using Dstl’s Military Advisors as operators.

These first two phases primarily focused on the development, management and performance of collaborative AI systems, with the human operators in the evaluations effectively overlaid onto the system via a range of independent UIs, rather than being fundamentally designed into the system from the outset. This phase 3 competition therefore attempts to better integrate the ‘Human’ into ‘Human-Autonomy Teaming’.

A summary of Phase 2, ISAIN, the component intelligent agents and key conclusions is provided as part of this competition pack to give additional background for proposals.

Phase 3 shall focus on investigating and developing concepts that address the human aspects of a system - i.e. the required roles, interactions, and information and interface needs of both human operators and the matching AI based agents. This phase shall also investigate and further develop arbitration approaches for managing potentially conflicting outputs and proposed Courses of Action (CoA) generated by the AI agents.

Where possible it is expected that the supplier shall make best use of the existing phase 2 systems and component agents by reconfiguring, augmenting and adapting them to demonstrate the impact of a more human-focused HAT design approach and of a more detailed consideration of agent arbitration. The emphasis on AI agent development within phase 3 shall, however, be limited to that required to support these aspects and phase aims, rather than to develop wider or new capabilities.

The AI agents and human operators within the developed HAT system(s) will be integrated within the existing ISAIN or, if required, a modified version of it. Outline information on ISAIN, its key functions and architecture is supplied as part of this competition pack; the full software pack and supporting documentation will be provided to the successful supplier team upon contract.

To allow a greater focus on the human elements of the design, and the challenges of arbitration of collaborative AI, it is expected that phase 3 will focus on a sub-set of the naval capabilities evaluated in phase 2 – i.e. a less broad, but deeper, consideration of the Naval Above-Water Warfare decision making space. It is suggested that capabilities to support future anti-air defence, reflecting the expected challenges of the RN’s FADS, should be at the heart of any proposed capability, but Dstl is open to alternative approaches and focus that could be shown to better support the analysis of Collaborative AI based HATs.

The matching evaluation vignettes/scenario can also be more focused, but should work towards pushing the complexity and demands on the system and its operators to induce greater levels of ‘conflict’, and hence the need for arbitration, and to test the proposed interactions and interfaces.

The focus of the phase will continue to be the naval domain, but there is an expectation that the principles and approaches investigated will be applicable across multiple Defence domains and applications.

All developments conducted under this phase should be designed to be open, flexible and utilise industry standards to support future exploitation beyond this competition-phase. This includes avoiding the use of proprietary standards where possible.

As part of the delivery of this phase the successful team will be expected to support the development of ongoing exploitation plans in conjunction with Dstl. The aim will be to understand opportunities for both direct and indirect exploitation of the outputs and learning from this phase, with the aim of supporting the use of collaborative AI based systems on future naval platforms, across distributed maritime forces, or in wider defence systems.

Please note this is the third phase of funding for a multi-phase competition. It is not compulsory to have been involved in previous phases to apply. See the previous competition and bids we funded.

4. Competition Challenge

The project is seeking a supplier team or consortium that includes a mix of organisations able to provide the diverse skills and experience needed to support the delivery of the project scope. The supplier shall be responsible for the:

  • Management of the wider supplier team or consortium
  • Development and assessment of approaches to human-focused HAT system design - leading to an agreed HAT system design process and design(s) which includes both the AI agent and human elements, including definitions of operator numbers, roles, and their information and interface needs;
  • Development and then the implementation of integration, test (hardware, software and system requirements) and evaluation plans (i.e. evaluation design, metrics to be used and human aspects); this should include the assessment and use of appropriate measures of effectiveness, and approaches to assessing the system’s ability to build trust and explainability to human operators;
  • Development of suitable evaluation scenarios in collaboration with the Dstl team – it is expected that proposals shall provide a broad outline of a scenario and that this will be developed, refined and detailed in conjunction with Dstl once the phase is underway;
  • Any ethical approval required through the Ministry of Defence Research Ethics Committee (MODREC) process, including the development of the supporting documentation required (based on the plans listed above). This will require initial submission of evaluation plans into the process to secure non-binding assessment of the need for MODREC;
  • Establishment and management of relationships with any supporting AI agent suppliers
    • to enable their adaptation, integration and use within the system as required;
  • Management, modification or adaptation (as necessary) of ISAIN, its interfaces (both Human & AI) and the integration of required AI agents, UIs etc. within it to support the agreed evaluation aims and plan; this should include updates to matching documentation and delivery and installation of updated code on Dstl’s Command Lab at the end of the phase (if evaluated on another capability);
  • Development, integration and assessment of suitable arbitration approaches for the Collaborative AI-based HAT;
  • Managing the system’s Human-Machine Interfaces (HMIs) to ensure their design, commonality, maturity and complexity support the developed system design intent and the evaluation aims, while minimising any impacts on systems level performance evaluation;
  • Delivery and management of a series of evaluation sprints at a suitable simulation capability (see discussion below) to deliver agreed evaluation aims and plan(s);
  • Delivery of matching reports, presentations and stakeholder demonstrations including:
    • Standard programme management supporting materials (kick-off events, monthly reporting etc.);
    • An evaluation design & delivery plan and a HAT design approach summary, which will support a system & evaluation design review prior to the build and evaluation phases of the activity;
    • Physical demonstration of the final system to stakeholders (at the proposed simulation location);
    • Final customer reports & presentational material;

Dstl is open to alternative approaches to effectively communicate the outputs of the phase and the project aims as a whole, e.g. a video. In the development of the design of the HAT, the supplier shall consider alternative decision making options and roles for the human operators, i.e. the supplier is not restricted to duplicating or automating current Naval operations room processes or to emulating current operator roles, ranks or skills.

Where suppliers intend to use open source software in their deliverables, they must make this clear in their proposal, and only use open source software available under ‘permissive licences’.

The intent is that any software development of ISAIN and supporting Intelligent Agents shall be limited to what is required to support the agreed HAT design and the evaluation plan, i.e. the aim is to focus on HAT system design and arbitration needs, not the further development of current or new naval AI capabilities or agents.

The supplier may propose the use of any existing agents, HMIs or enabling capabilities developed for the Intelligent Ship project, or other Dstl projects. All component intelligent agents from phase 2 were delivered under DEFCON 705 – the supplier will need to work with (within their team or as sub-contractors) these sub-suppliers if any modification or in-phase support is required to specific agents or tools. A list of agents used by the projects, their suppliers and an outline of their capabilities are provided as Government Furnished Information (GFI) within the phase 2 summary document supplied as part of this Competition Document.

The supplier may also specify agents or capabilities developed elsewhere to a TRL of 4 or above, i.e. they must be of sufficient maturity and robustness that they will not adversely impact the functionality or evaluation of the HAT system as a whole.

All agents and tools proposed and used in the project must be compatible with the government owned interface definitions used by ISAIN and supported within the ISAIN SDK (Software Development Kit – as described in the GFI provided with this document and delivered at the start of the contract).

The supplier shall work with Dstl to define a suitable evaluation (test) scenario. This can be developed or modified from existing scenarios or the previous Intelligent Ship scenario (phase 2 scenario outlined in the phase 2 summary provided as GFI). It shall support functional testing of the key elements of the HAT design (e.g. human interfaces) and shall enable the assessment of usability and the system’s performance, especially during complex transitions between, for example, different levels of human interactions, tempo, and threat. It should also address issues of trust and explainability.

If MODREC approval is required to support the agreed evaluation aims and the required human-focused interactions and experimentation approach and plan, then the supplier shall assume in their plans that the process may take up to 6 months to complete.

For Phase 3, Dstl is open to the proposed use of alternative simulation environments, the continued use of Dstl’s Command Lab at Dstl Portsdown West site (as mandated in Phase 2) or a combination of the two based on the most efficient and effective use of resources and the best access for stakeholders. The final system, however, shall be compatible with, and installed and functionally tested on Command Lab’s systems to ensure access and usability beyond this phase.

Details of the current capabilities of Dstl’s Command Lab and matching VBS simulation gateway are provided as GFI within the phase 2 summary pack provided with this competition document. It should be noted that this provides a current snapshot of the capability, which may change or be augmented during the life of the project as the lab supports a range of Dstl and MOD testing activities.

Dstl will also support the availability of Military Advisors (MAs) to support evaluations if required or considered the most appropriate operators for the developed system. Selection of potential operators for the evaluations should be based on the agreed evaluation plan, with consideration of the potential relative impacts of experience and previous training on the selected evaluation metrics and approaches.

It is expected to be challenging to identify real supporting data, hence suppliers shall expect to use simulated data as a default; any requests for data-sets will be investigated but cannot be guaranteed to be available.

The Intelligent Ship AI Network (ISAIN) is a government owned asset, and therefore it is a requirement that any new Intellectual Property, generated as a result of funding under this competition, on the infrastructure, information architecture and interface specifications for ISAIN and its interfaces is vested in the Authority. Any further development of ISAIN and associated interfaces shall therefore be subject to DEFCON 703.

5. We are interested in…

Strong proposals will address all of the following:

Explore approaches to, and the benefits and challenges of, a more human centred approach to the design of the collaborative AI based HAT

  • develop approaches to designing and then implementing and testing a HAT with human operators and their roles, tasking and needs intrinsic to the design.

Explore approaches to, and the benefits and challenges of, different approaches to system level arbitration in a Collaborative AI based HAT

  • develop, implement and evaluate approaches (including Phase 2 approaches) to implementing effective and robust arbitration of proposed Course(s) of Action (CoA) from different agents within a collaborative AI based HAT.
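
The competition document does not prescribe any particular arbitration mechanism. Purely as an illustration of the kind of problem described above, the sketch below scores candidate CoAs proposed by different agents and flags low-margin (i.e. genuinely conflicting) decisions for referral to a human operator; the CoA attributes, weights, thresholds and agent names are hypothetical and are not drawn from ISAIN or the Phase 2 agents.

```python
# Minimal, purely illustrative sketch of one possible arbitration approach:
# score candidate Courses of Action (CoA) proposed by different AI agents and
# refer low-margin (i.e. conflicting) decisions to a human operator.
# The CoA structure, weights and thresholds are hypothetical, not part of ISAIN.
from dataclasses import dataclass

@dataclass
class CourseOfAction:
    agent: str            # which agent proposed this CoA
    description: str
    effectiveness: float  # agent's own estimate, 0..1
    risk: float           # estimated risk, 0..1 (higher is worse)
    confidence: float     # agent's confidence in its estimates, 0..1

def score(coa: CourseOfAction, w_eff=0.5, w_risk=0.3, w_conf=0.2) -> float:
    """Combine the agent-reported factors into a single comparable score."""
    return w_eff * coa.effectiveness + w_risk * (1 - coa.risk) + w_conf * coa.confidence

def arbitrate(candidates: list[CourseOfAction], margin=0.05):
    """Pick the best-scoring CoA; flag for human review when the top two are close."""
    ranked = sorted(candidates, key=score, reverse=True)
    needs_human = len(ranked) > 1 and (score(ranked[0]) - score(ranked[1])) < margin
    return ranked[0], needs_human

# Hypothetical conflicting proposals from two agents:
a = CourseOfAction("threat_evaluation_agent", "engage with hard kill", 0.8, 0.6, 0.7)
b = CourseOfAction("resource_planning_agent", "soft-kill and manoeuvre", 0.7, 0.3, 0.8)
best, refer = arbitrate([a, b])
print(best.description, "| refer to operator:", refer)
```

A real design would also need to consider where in the decision making hierarchy such arbitration sits and how its reasoning is exposed to operators, which is exactly the design space this phase asks proposals to explore.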

The effective use of evaluation approaches to understand the impacts of a more human centred approach on the design of the collaborative AI based HAT

  • using simulation approaches to undertake a series of evaluation sprints with human operators to assess the relative performance of the system(s). This should include the assessment and use of appropriate measures of effectiveness, and the system’s ability to build trust and explainability to a human operator.
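
The competition does not mandate specific metrics. As one purely illustrative example of the kind of standardised usability measure an evaluation plan might draw on, the sketch below computes a System Usability Scale (SUS) score from ten 1–5 Likert responses; the function name and example responses are hypothetical.

```python
# Illustrative only: a standard System Usability Scale (SUS) calculation, shown
# as one example of a usability measure an evaluation plan might adopt.
# The competition document does not mandate SUS or any specific metric.

def sus_score(responses: list[int]) -> float:
    """Compute a SUS score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if (i % 2 == 0) else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical example: one operator's responses after an evaluation sprint.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```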

These shall be delivered through a phased approach, with the expectation of an initial design period, followed by build and evaluation periods in which the more human-focused HAT system design(s) are developed and integrated within ISAIN at Dstl’s Command Lab facility at Dstl Portsdown West or another suitable facility. Finally, the developed HAT system shall be evaluated in areas such as performance, usability, trust and explainability, through a series of incremental evaluation events or sprints.

Exceptional proposals will seek to include the following as secondary objectives:

  • Dstl and DASA expect to work with the successful supplier to further develop exploitation plans beyond phase 3 of this project. The aim will be both to inform future requirements and to identify opportunities for earlier exploitation, re-use or adaptation of the developed ideas, approaches, intelligent agents or the system as a whole. This will include the consideration of options to exploit within other Dstl projects, in naval platforms, in other simulation environments or in wider naval experimentation. There will also be an aim of identifying opportunities across a range of time-frames, both near and further term.
  • Non-core funded additional options that could support or accelerate exploitation beyond phase 3 – these would be considered during Phase 3 and would be subject to securing additional funding.

6. We are not interested in…

The project is not seeking proposals that:

  • Are proposed by a single supplier with no collaborators/supporting organisations;
  • Have a primary focus on the development of individual component AI agents other than where required to support the broader aims of the competition;
  • Are focused wholly on capability development or the development of nearer-term tools and approaches – the aim is to explore solutions for next generation, and generation after next, platforms and capabilities to inform their requirements and capability targets;
  • Are focused purely on maturing an aspect of Above-Water (AW) warfare C2, rather than using the selected capability as an aid to understanding and evaluating collaborative AI based HATs;
  • Provide a HAT system design that covers a wide and unfocused range of naval capabilities;
  • Do not include sufficient intelligent agents to be considered a collaborative AI system, or that avoid the need for arbitration;
  • Constitute consultancy, paper-based studies or literature reviews;
  • Offer no real long-term prospect of integration into defence and security capabilities.

7. Accelerating and commercially exploiting your innovation

It is important that over the lifetime of DASA competitions, ideas are matured and accelerated towards appropriate end-users to enhance capability. How long this takes will depend on the nature and starting point of the innovation.

A clear route for exploitation

For DASA to consider routes for exploitation, ensure your deliverables are designed with the aim of making it as easy as possible for collaborators/stakeholders to identify the innovative elements of your proposal.

Whilst DASA recognises that early identification and engagement with potential end users during the competition and subsequent phases are essential to implementing an exploitation plan, during the competition phase there should be no correspondence between suppliers and DASA other than via the DASA helpdesk email at accelerator@dstl.gov.uk, or their local Innovation Partner.

All proposals to DASA should articulate the expected development in technology maturity of the potential solution over the lifetime of the contract and how this relates to improved capability against the current known (or presumed) baseline.

How to outline your exploitation plan

A higher technology maturity is expected in subsequent phases. Include the following information to help the assessors understand your exploitation plans to date:

  • the intended defence or security users of your final product and whether you have previously engaged with them, their procurement arm or their research and development arm
  • awareness of, and alignment to, any existing end user procurement programmes
  • the anticipated benefits (for example, in cost, time, improved capability) that your solution will provide to the user
  • whether it is likely to be a standalone product or integrated with other technologies or platforms
  • expected additional work required beyond the end of the contract to develop an operationally deployable commercial product (for example, “scaling up” for manufacture, cyber security, integration with existing technologies, environmental operating conditions)
  • additional future applications and wider markets for exploitation
  • wider collaborations and networks you have already developed or any additional relationships you see as a requirement to support exploitation
  • how your product could be tested in a representative environment in later phases
  • any specific legal, ethical, commercial or regulatory considerations for exploitation

7.1 Is your exploitation plan long term?

Long term studies may not be able to articulate exploitation in great detail, but it should be clear that there is credible advantage to be gained from the technology development.

Include project specific information which will help exploitation. This competition is being carried out as part of a wider MOD programme and with cognisance of cross-Government initiatives. We may collaborate with organisations outside of the UK Government and this may provide the opportunity to carry out international trials and demonstrations in the future.

The primary aim of the project is to explore solutions for next generation, and generation after next, platforms and capabilities to inform their requirements and capability targets. It is expected that this phase of work will inform future requirements both with respect to the design and implementation of future naval C2 systems, and the corresponding impacts on platform and force level systems design.

The development of overarching principles in collaborative AI based HAT systems is likely to have wide utility across defence and security domains, with likely spin-offs into certain autonomy focused commercial sectors. Some of the project output is expected to inform Dstl’s Human Autonomy Teaming (HAT) project design guides & biscuit books.

Dstl and DASA expect to work with the successful supplier to further develop exploitation plans beyond phase 3 of this project. The aim will be both to inform future requirements and to identify opportunities for earlier exploitation, re-use or adaptation of the developed ideas, approaches, intelligent agents or the system as a whole. This will include the consideration of options to exploit within other Dstl projects, in naval platforms, in other simulation environments or in wider naval experimentation. There will also be an aim of identifying opportunities across a range of time-frames, both near and further term.

8. How to apply

8.1 Submission deadline

6 June 2023 at 12 midday (BST).

8.2 Where do I submit my proposal?

Via the DASA Online Submission Service for which you will be required to register.
Only proposals submitted through the DASA Online Submission Service will be accepted.

8.3 Total funding available

The total funding available for Phase 3 of this competition is £1.7 Million (ex VAT).

8.4 How many proposals will DASA fund

As this is a phase three competition we aim to fund a single proposal for up to the full funding.

8.5 For further guidance

Further information on our competition process and how your proposal is assessed is available on the DASA website.
Queries should be sent to the DASA Help Centre – accelerator@dstl.gov.uk.

9. What your proposal must include

The following information shall be provided within proposals:

  • Evidence to show understanding of the project aims, in particular the Human-Autonomy Teaming design aspects and those pertaining to collaborative AI arbitration;
  • The proposed team structure (identifying all industrial/university partners or sub-contractors), and evidence of commitment from each team member to support the task, highlighting the lead or leads for key areas of responsibility;
  • An initial definition of naval capability area(s) that are expected to be addressed[footnote 1];
  • Explanation of the approaches used to explore and develop the HAT design for this phase, including approaches and tools to be assessed and/or used;
  • Explanation of the approaches used to explore and develop the arbitration approaches used in this phase;
  • An initial outline of the tools, agents and enablers expected to be included within the developed system[footnote 1];
  • Explanation of the expected approach to evaluation planning and execution, including a description of the expected evaluation capability (if not the Command Lab) and a broad outline of scenarios expected to be used in the evaluations[footnote 1];
  • Identification of any additional GFX and data requirements, and any specific hardware, software or interface needs;
  • Initial identification of project risks and expected mitigations[footnote 1];
  • Proposed approach to developing exploitation plans in conjunction with Dstl;
  • An initial project plan[footnote 1];
  • Any non-core options identified by the team that address transition from phase 3 to follow-on work, aid future exploitation, or that would augment the planned work if additional funding is identified during the phase.

It is the intent to develop the detail of the evaluation design and supporting system during the initial part of this task; hence it is desirable for the project to have a degree of flexibility in scope, and in the prioritisation of tasks, as concepts and plans develop throughout the project. Therefore, proposals should highlight how project and team flexibility will be managed within the proposed budget and time, e.g. through the use of MoSCoW prioritisation.
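
Purely as an illustrative sketch of how MoSCoW prioritisation might be kept in a simple, re-plannable form as the budget or findings change (the task names, priorities and effort figures below are hypothetical and not drawn from this competition):

```python
# Illustrative only: a MoSCoW-tagged backlog that can be re-planned as the
# evaluation design matures. Task names and effort figures are hypothetical.
from enum import Enum

class MoSCoW(Enum):
    MUST = 1
    SHOULD = 2
    COULD = 3
    WONT = 4   # "won't have this time"

backlog = [
    ("HAT design workshops with Dstl", MoSCoW.MUST, 20),   # (task, priority, effort days)
    ("Arbitration approach comparison", MoSCoW.MUST, 30),
    ("Additional UI explainability study", MoSCoW.SHOULD, 15),
    ("Third evaluation sprint", MoSCoW.COULD, 25),
    ("Extra agent integration", MoSCoW.WONT, 40),
]

def plan(backlog, budget_days):
    """Select tasks in MoSCoW priority order until the effort budget is used."""
    selected, remaining = [], budget_days
    for task, priority, effort in sorted(backlog, key=lambda t: t[1].value):
        if priority is not MoSCoW.WONT and effort <= remaining:
            selected.append(task)
            remaining -= effort
    return selected

print(plan(backlog, budget_days=70))  # re-run as priorities or budget change
```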

The proposal should focus on the Phase 3 requirements, but can include outline descriptions of any non-core (unfunded) options identified by the team that address transition from phase 3 to follow-on work, aid future exploitation, or that would augment the planned work if additional funding is identified during the phase.

When submitting a proposal, you must complete all sections of the online form, including an appropriate level of technical information to allow assessment of the bid and a completed finances section.

Completed proposals must comply with the financial rules set for this competition. The upper-limit for this competition is £1.7 Million (ex VAT). Proposals will be rejected if the financial cost exceeds this capped level.

You must include a list of other current or recent government funding you may have received in this area if appropriate, making it clear how this proposal differs from that work.

A project plan with clear milestones and deliverables must be provided. Deliverables must be well defined and designed to provide evidence of progress against the project plan and the end-point for this phase; they must include a final report.

You should also plan for attendance at a kick-off meeting at the start of Phase 3, a mid-project event and an end of project event at the end of Phase 3, as well as regular monthly reviews with the appointed Technical Partner and Project Manager; all meetings will be in the UK. Meetings may also take place virtually.

Your proposal must demonstrate how you will complete all activities/services and provide all deliverables within the competition timescales (16 months). Proposals with any deliverables (including the final report) outside the competition timeline will be rejected as non-compliant.

10. What your resourcing plan should include

Your resourcing plan must identify all participating organisations within the team and, where possible, the nationalities of the proposed employees who will work on this phase.

You must also provide evidence that your team contains relevant skillsets and capabilities in the following areas:

  • The development, integration and management of the network of connected agents and/or apps – i.e. the skills to manage, modify and configure ISAIN and to manage the integration of 3rd party intelligent agents into it;
  • Experience & management of Simulation Environments;
  • Development and understanding of military scenarios to exercise a HAT;
  • Human Factors – skills in the design and optimisation of a HAT, including the design and optimisation of user interfaces; familiarity with the MODREC process (if required);
  • Experience in designing and undertaking of experiments and evaluations using AIs and humans, including understanding of the challenges of developing, capturing and assessing suitable metrics.

In the event of a proposal being recommended for funding, DASA reserves the right to undertake due diligence checks including the clearance of proposed employees. Please note that this process will take as long as necessary and could take up to 6 weeks in some cases for non-UK nationals.

You must identify any ethical / legal / regulatory factors within your proposal and how the associated risks will be managed, including break points in the project if approvals are not received.

MODREC approvals can take up to 6 months, therefore you should plan your work programme and risk management accordingly. It is expected that the development of evaluation plans early in the contract will determine if ethical approval will be required. For general guidance on the implications and requirements of the MODREC process, please refer to the MODREC Guidance for Suppliers or contact your Innovation Partner for further guidance.

Requirements for access to Government Furnished Assets (GFA), beyond those stated to be provided in this document, may be included in your proposal (for example: information, software, equipment). However, DASA cannot guarantee that GFA will be available. If you apply for GFA, you should include an alternative plan in case it is not available.

Failure to provide any of the above listed will automatically render your proposal non-compliant.

11. Cyber risk assessment

11.1 Supplier Assurance Questionnaire (SAQ)

On receipt of a ‘Fund’ decision, successful suppliers must provide evidence of cyber resilience before the contract is awarded. The start of this process is the submission of a Supplier Assurance Questionnaire (SAQ). The SAQ allows suppliers to demonstrate compliance with the specified risk level and the corresponding profile in Def Stan 05-138, and the level of control required will depend on this risk level.

To expedite the contracting time of successful suppliers we ask all suppliers to complete the SAQ before they submit their proposal. The SAQ should be completed using the DASA Risk Assessment reference RAR-800736853, answering the questions for risk level “Very Low”.

Defence Cyber Protection Partnership

The Defence Cyber Protection Partnership (DCPP) will review your SAQ submission and respond with a reference number within 2 working days. The completed SAQ form and resulting email response from DCPP must be downloaded and included within the DASA submission service portal when the proposal is submitted. Please allow enough time to receive the SAQ reference number prior to competition close on 6 June 2023 at 12 midday (BST).

If the proposal is funded, the SAQ will be evaluated against the Cyber Risk Assessment (CRA) for the competition, and it will be put into one of the following categories:

  1. compliant – no further action
  2. not compliant – if successful in competition and being funded, the innovator will be required to complete a Cyber Implementation Plan (CIP) before the contract is placed, which will need to be reviewed and agreed with the relevant project manager

Innovators can enter a proposal without all controls in place, but are expected to have all the cyber protection measures necessary to fulfil the requirements of the contract in place at the time of contract award, or have an agreed Cyber Implementation Plan (CIP).

The CIP provides evidence as to how and when potential innovators will achieve compliance. Provided the measures proposed in the Cyber Implementation Plan do not pose an unacceptable risk to the MOD, a submission with a Cyber Implementation Plan will be considered alongside those who can achieve the controls. A final check will be made to ensure cyber resilience before the contract is placed. Commercial staff cannot progress without it. This process does not replace any contract specific security requirements.

Further guidance for completing this process can be requested by emailing the DASA Help Centre: accelerator@dstl.gov.uk.
Additional information about cyber security can be found at: DCPP: Cyber Security Model industry buyer and supplier guide.

12. Public facing information

When submitting your proposal, you will be required to include a title and a short abstract. The title and abstract you provide will be used by DASA, and other government departments, to describe your project and its intended outcomes and benefits. They may be included at DASA events in relation to this competition and in documentation such as brochures. The proposal title will be published in the DASA transparency data on GOV.UK, along with your company name, the amount of funding, and the start and end dates of your contract. As this information can be shared, it should not contain information that may compromise intellectual property.

13. How your proposal will be assessed

At Stage 1, all proposals will be checked for compliance with the competition document and may be rejected before full assessment if they do not comply. Only those proposals that demonstrate compliance against the competition scope and DASA mandatory criteria will be taken forward to full assessment.

14. Mandatory Criteria

The proposal outlines how it meets the scope of the competition – Within scope (Pass) / Out of scope (Fail)
The proposal fully explains in all three sections of the DASA submission service how it meets the DASA criteria – Pass / Fail
The supplier team is formed of more than one supplier or entity – Pass / Fail
The proposal clearly outlines each of the proposed supplier team members, and their contribution to the project in terms of both skills and outputs to be delivered – Pass / Fail
The proposal clearly details a financial plan, a project plan and a resourcing plan to complete the work proposed in Phase 3 – Pass / Fail
The proposal clearly identifies an approach to assessing the need for MODREC approval, and has allocated time and risk to obtaining MODREC approvals if required – Pass / Fail
The proposal clearly identifies what simulation locations and environments are to be used – Pass / Fail
The proposal identifies any GFX required for Phase 3 – Pass / Fail
Maximum value of proposal is £1.7 Million – Pass / Fail
The proposal demonstrates how all research and development activities / services (including delivery of the final report) will be completed within 16 months from award of contract (or less) – Pass / Fail
The bidder has obtained the authority to provide unqualified acceptance of the terms and conditions of the Contract – Pass / Fail

Proposals that pass Stage 1 will then be assessed against the standard DASA assessment criteria (Desirability, Feasibility and Viability) by subject matter experts from the MOD (including Dstl), other government departments and the front-line military commands. You will not have the opportunity to view or comment on assessors’ recommendations.

DASA reserves the right to disclose on a confidential basis any information it receives from innovators during the procurement process (including information identified by the innovator as Commercially Sensitive Information in accordance with the provisions of this competition) to any third party engaged by DASA for the specific purpose of evaluating or assisting DASA in the evaluation of the innovator’s proposal. In providing such information the innovator consents to such disclosure. Appropriate confidentiality agreements will be put in place.

Further guidance on how your proposal is assessed is available on the DASA website.

After assessment, proposals will be discussed internally at a Decision Conference where, based on the assessments, budget and wider strategic considerations, a decision will be made on the proposals that are recommended for funding.

Innovators are not permitted to attend the Decision Conference.

Proposals that are unsuccessful will receive brief feedback after the Decision Conference.

15. Things you should know about DASA contracts: DASA terms and conditions

Please read the DASA terms and conditions which contain important information for innovators. For this competition we will be using the attached Terms and Schedules.

We will require unqualified acceptance of the terms and conditions; if applicable, please ensure your commercial department has provided their acceptance.

The Intelligent Ship AI Network (ISAIN) is a government owned asset, and therefore it is a requirement that any new Intellectual Property, generated as a result of funding under this competition, on the infrastructure, information architecture and interface specifications for ISAIN and its interfaces is vested in the Authority. Any further development of ISAIN and associated interfaces shall therefore be subject to DEFCON 703.

Intellectual property for all other deliverables shall be managed and vested in accordance with the standard MOD intellectual property contract condition for fully funded research contracts – DEFCON 705. Under the terms of DEFCON 705 any intellectual property generated under the contract belongs to the contractor. In return, the funding Authority obtains a set of rights to use the delivered technical information and associated intellectual property for specified purposes.

More information on DEFCON 703 and DEFCON 705 can be found by registering on the Knowledge in Defence site.

Funded projects will be allocated a Project Manager (to run the project) and a Technical Partner (as a technical point of contact). In addition, the DASA and Dstl teams will work with you to support delivery and exploitation including, when appropriate, introductions to end-users and business support to help develop your business.

We will use deliverables from DASA contracts in accordance with our rights detailed in the contract terms and conditions.

For this competition, £1.7 Million is currently available to fund proposals. There may be occasions when additional funding may become available to allow us to revisit proposals deemed suitable for funding. Therefore, DASA reserves the right to keep such proposals in reserve. In the event that additional funding becomes available, DASA may ask whether you would still be prepared to undertake the work outlined in your proposal under the same terms.

16. Phase 3 key dates

Pre-bookable 1-to-1 telecom session 1: 18 April 2023
Pre-bookable 1-to-1 telecom session 2: 25 April 2023
Competition closes: 6 June 2023 at 12 midday (BST)
Feedback release: 4 August 2023
Contracting: aim to start at the beginning of September 2023 and to end no later than the end of December 2024 (i.e. a maximum of 16 months later)

17. Help: Contact the DASA Help Centre

Competition queries including on process, application, commercial, technical and intellectual property aspects should be sent to the DASA Help Centre at accelerator@dstl.gov.uk, quoting the competition title. If you wish to receive future updates on this competition, please email the DASA Help Centre.

While all reasonable efforts will be made to answer queries, DASA reserves the right to impose management controls if volumes of queries restrict fair access of information to all potential innovators.

18. Appendix: Clarifications arising from Collaboration Event and 1 to 1 sessions

Organisations

Q. Do organisations need to have taken part in either Phase 1 or Phase 2 to be part of a bid for Phase 3?

A. No. Organisations can take part in Phase 3 without having been involved in previous Phases.

Q. Is a bidder allowed to both prime an offer whilst also supporting another prime?

A. Yes. There are no issues with suppliers submitting or contributing to multiple bids. We would normally recommend you are open about this in the respective proposals, in particular if there is significant overlap between what you are aiming to do in each of the proposals.

Q. Will the outputs of IS Phase 2 be made available to all the bidders?

A. The only outputs that will be made available are those explicitly stated in the Competition Document, and specifically the references to key information in Section 2. The Phase 2 outputs will be shared with the winning bid as Government Furnished Information.

Q. How can I join a consortium?

A. You should reach out to successful applicants from Phases 1 and 2 and sign up to the Collaboration survey to make yourself and your areas of specialism visible to other potential suppliers and leads. DASA/Dstl are not able to act as a brokering service.

Q. Do you foresee contributors changing during the project?

A. This is an S & T project, so it is exploring a technical space rather than specifying a detailed solution; this inevitably leads to subtle changes in scope based on learning and evaluations throughout the activity, potentially impacting individual team members’ scope – see the question on flexibility below. We don’t expect wholesale change to the team make-up during the project, however.

Q. If we have existing projects in other areas can we bring these in?

A. Yes, provided their development or adaptation isn’t a significant element of the overall proposed scope – i.e. they bridge a potential capability gap in a proposed offering, but their development is not a ‘distraction’ from the specific aims of this call.

Q. Is Dstl going to coordinate the collaborative team required for the single successful proposal? Is there previous experience of similar approaches?

A. No. There have been previous examples of multiple organisations working together in DASA competitions (swarming drones), but for this call we expect organisations to form their own teams and choose a lead to make the submission. We need visibility of the team and its structure within proposal responses. There is a collaboration survey which can support organisations in identifying links to other organisations.

Background Information

Q. Is there anything more about what you want to get out of Phase 3 than what is said in the documentation?

A. In short, no. In this phase we want to give teams greater flexibility to explore the design space, e.g. in terms of designing the human in from the start: where should the humans sit in the system? What skills do they need? I.e. really explore the human aspects. As stated, there is an aim to have significantly less focus on individual agent development other than to support that aim.

Q. Are there any standards that apply to the proposed system that we would need to adhere to?

A. No; this approach will undoubtedly challenge existing standards and generate demand for new ones too, however as this is exploratory research we do not want to be overly constrained by standards. So for this competition we would be working to Def Stans where relevant and useful, but we would expect outputs, in part, to help to inform future standards and their development needs.

Q. What is the desired outcome regarding arbitration?

A. In this phase we would ideally see proposals create more conflict at a more appropriate level of the system to show the need for, and power of, having arbitration within the system. Hence we would expect various arbitration approaches to be explored, covering both how arbitration is done and also where (potentially at multiple locations) in the decision making hierarchy it should live and how it is linked to, or informs, human operators.

Earlier Phases

Q. In the review of Phases 1 and 2 there was no strong conclusion. Is this because it ‘was a mess’?

A. We set ourselves a challenging target and much was ‘designed through competition and opportunity’ – e.g. we designed the system around the successfully funded agents rather than against a specific capability requirement. There were clear conclusions around the potential of collaborative AI, its risks and challenges, as well as less definitive conclusions on the impacts of greater automation on crew workload, trust and situational awareness. We also learned that we need to limit the number of dimensions and that learning will be brought into Phase 3.

Q. In Phases 1 and 2 the use of VBS brought opportunities and constraints. What is the intended situation for Phase 3?

A. In the next phase we will allow more scope for innovators to test in different environments and with different simulation options. We will still be aiming to integrate the solution into Dstl’s Command Lab at the end of the phase to allow future exploitation and exploration.

Operating Environment

Q. Are we moving towards a ‘Standard architecture’ and if so, when?

A. This is a challenge which is being addressed elsewhere in the Autonomy programme and across MOD, but thinking and approaches are yet to mature – again, this project informs those activities through the use of ISAIN.

Q. Does the agent need to run on Command Lab?

A. We are open to proposals that are evaluated on alternative environments and with alternative simulation engines. We would require a risk based assessment of the benefits and costs of using an alternative vs. Command Lab, both from a technical perspective and from the practical impact of location on movement and access for contractors, Dstl staff and wider MOD stakeholders. As stated above, we require the final system to be installed on Command Lab even if testing and evaluation are undertaken elsewhere.

General

Q. How do we understand if our solutions work from a human perspective? Is the target to get more/different output from people or reduce people?

A. Phase 2 focused on improving capability and this will still be the overall aim of Phase 3. This time we want to identify how the human and machine elements can best be integrated into the team as a whole to achieve the best combined capability result. We are open to alternative approaches that reduce the burden on human operators but support a net improvement in capability, and to solutions that keep operators engaged and situationally aware. I.e. manpower reduction is not a primary aim; capability enhancement is.

Q. Is there any requirement for data collection/management technologies e.g. sorting the wheat from the chaff?

A. This is being addressed in other areas of Dstl and hence is not directly addressed by this competition – it is, however, a clear enabler to the capability being investigated and developed.

Q. Can you comment on feedback loops?

A. We realised in Phase 2 that it was necessary to keep operators more situationally aware of what the autonomy was doing and why. We did design some solutions to improve this, but this aspect will also need to be part of the system developed in Phase 3.

Q. Were usability assessments of agents done before testing in Command Lab? What testing was done to assess workload?

A. Some were, as part of their individual development; however, others were adapted from agents designed to support full autonomy – i.e. there was a mix of maturities with respect to how they interacted with humans and, for example, in their developed UIs. Usability was assessed at a whole system level as part of the evaluations, which also provided insights into individual agents’ usability.

Q. Will the MODREC committee be notified in advance that we are expecting submissions to speed up approval?

A. Yes, the committee will be informed about our competition – however the committee take no action until formal applications/submissions are made – i.e. early dialogue is unlikely to reduce timescales.

Q. What classification will apply and if it is at Official will that compromise output?

A. The project aims to remain at Official-Sensitive or lower to maximise the ability to share learning and outputs widely. Command Lab would give an option to manage data at higher levels if required or desirable to improve outputs, but we (Dstl & supplier team) would need to balance the risks and benefits of doing so.

Q. You mentioned that development will need to be flexible. Are there any commercial issues with this?

A. In terms of flexibility – we expect to practically need a degree of flexibility within the proposed (and hence fixed) budget and timescale – i.e. the ability to re-prioritise activities or sub-tasks depending on what is discovered during the development and evaluation of the system. This was achieved via routinely updating a MoSCoW-type analysis in Phase 2, for example.

Q. It is mentioned in the reports that software was produced to query data from ISAIN. Is that software itself subject to DEFCON 703?

A. Yes. All software produced in terms of generating the ISAIN dashboards, specific tables showing the messaging traffic flow and the information contained therein are subject to DEFCON 703 and any further developments will also be subject to DEFCON 703.

Q. Human factors can be broad - are there any particular areas in focus or are the team wanting everything?

A. The team has an open view. Primarily we are interested in the effects on the design of the system – i.e. the general principles at a systems level of how the human operators and AIs should work together. The key question is how we get the design approach right at the beginning – accepting that we won’t be able to fix all of the potential User Interface issues, or fully represent future potential alternative UIs that may help the overall challenge. One aim, for example, will be to inform the Human-Autonomy Teaming project’s design guides.

  1. These are expected to be further developed & maintained throughout the project in conjunction with Dstl, so are expected to potentially change and mature during delivery.