Advice Letter: Michelle Donelan, Advisor, Fullbrook Strategies
Published 30 September 2025
1. BUSINESS APPOINTMENT APPLICATION: The Rt Hon Michelle Donelan, former Secretary of State for Science, Innovation and Technology. Paid appointment with Fullbrook Strategies Limited.
You sought advice from the Advisory Committee on Business Appointments (the Committee) under the government’s Business Appointment Rules for Former Ministers (the Rules) on your role as a Business Unit Manager and Advisor on International Online Safety and Free Speech (Advisor) with Fullbrook Strategies Limited (Fullbrook Strategies).
The purpose of the Rules is to protect the integrity of the government. The Committee[footnote 1] considered whether this appointment was unsuitable. There is clear overlap between your responsibilities as Secretary of State for the department responsible for AI and digital policy and the focus of this work – advising on online safety and free speech online.
The Committee has also considered the information provided by you and your former department – the Department for Science, Innovation and Technology (DSIT) – about your access to information and decisions taken in office. The material information taken into consideration by the Committee is set out in the annex.
The Committee has advised that a number of conditions be imposed to mitigate the potential risks to the government associated with this appointment under the Rules – including a limitation to the role and a waiting period before taking up the post. The Committee’s advice is not an endorsement of this appointment in any other respect.
The Ministerial Code sets out that ministers must abide by the Committee’s advice. It is an applicant’s personal responsibility to manage the propriety of any appointment. Former ministers of the Crown, and Members of Parliament, are expected to uphold the highest standards of propriety and act in accordance with the 7 Principles of Public Life.
2. The Committee’s consideration of the risks presented
Fullbrook Strategies is an international PR, crisis management, and political campaigns company. It has been registered as a consultant lobbyist with the Office of the Registrar of Consultant Lobbyists since June 2022.[footnote 2] It advises clients both in the UK and internationally across multiple sectors, including election campaigns, counter fraud, AI and cyber, defence, energy, and reputation management.
You described your role as advising Fullbrook Strategies and its clients on online safety and free speech online. In particular, you will advise on the international approach and on what is and is not deliverable, and support clients’ awareness/campaigning work in these areas.
You were the Secretary of State responsible for the UK’s policies on AI and online safety – setting up the UK’s AI Safety Institute[footnote 3] and delivering the Online Safety Act 2023. You met with companies with an interest in online safety, in the approach to managing mis/disinformation online and in free speech online. The majority of these companies operate internationally, including in the UK.
As Secretary of State you made policy, regulatory and operational decisions in the online safety sphere and in relation to frontier AI models – setting the tone for the government’s approach in this area at the time. None of these decisions were specific to Fullbrook Strategies – which is not a tech company but may represent companies operating in this space. There is no relationship between DSIT and Fullbrook Strategies, nor any direct overlap between the company and your responsibilities in office. Whilst there is a reasonable concern that your work with the company and the clients you may advise could directly overlap with your role in office, there is no suggestion you made decisions as a minister in expectation of work with Fullbrook Strategies. This significantly limits the risk that this role could reasonably be perceived as a reward for decisions made or actions taken in office.
As above, while Fullbrook Strategies is not primarily a tech or AI company, it does operate in this sector, where you plan to focus your work. As the former Secretary of State at DSIT you had access to sensitive information which would likely be seen to benefit any company with an interest in relevant policy areas. Overall, DSIT’s view was that there are some mitigating factors that reduce the risks associated with your general access to information, particularly if your work is restricted to that outside of the UK. In addition, you have been out of office for over nine months, meaning the majority of information to which you had access has moved on, including the technology associated with AI frontier models (more information below) and policy around the Online Safety Act 2023.
There remain risks associated with your access to information. DSIT highlighted two areas as presenting a potentially high risk of offering an unfair advantage, whether real or perceived:
- The UK’s approach to online misinformation[footnote 4] and disinformation[footnote 5]. This included information about the National Security Online Information Team (NSOIT) – the team that leads the UK’s operational response to online information threats.[footnote 6] You indicated this is not an area you would advise on in this role. The department said you will still be in possession of some privileged information regarding online safety policy, particularly around mis/disinformation and operations policy, which is likely to have some ongoing relevance and may overlap with the Free Speech part of this role. Overall, DSIT said that if the proposed role does not include mis/disinformation or matters related to NSOIT, the risk of conflict is reduced.
- Your privileged access to DSIT’s stakeholders. You met with several important stakeholders in the government’s policy in this area whilst in office, undertaking, for example, discussions on digital infrastructure and online protections. The department said your access to commercially sensitive information is limited – it is unaware of anything specific and noted the capability of AI has moved on significantly even in the nine months since you were in office. However, it said there is a reasonable concern that you could have, or be seen to have, some insight from those meetings about companies’ approaches/views on online safety and handling of mis/disinformation. It recommended you be prevented from working on UK policy matters that could potentially overlap with this information.
The Committee considered there are risks associated with your access to information that needed to be mitigated. This risk is compounded given that the clients you will be working with are unknown, and it is most likely to occur if you work on matters, or with clients, that are directly related to your ministerial responsibilities.
There are also significant risks associated with your contacts and influence within government, particularly as Fullbrook Strategies is a firm that lobbies the UK government on behalf of its clients. All former ministers are prevented from lobbying the UK government for two years on leaving office. You said that you will not advise on UK matters, and your work will be separated from Fullbrook Strategies’ lobbying work. It is significant that Fullbrook Strategies has confirmed this in writing to the Committee.
3. The Committee’s advice
There are risks under the Rules associated with the potential for your responsibilities in office to conflict with your proposed work for, and the clients of, Fullbrook Strategies. Your closeness to the sector, as the previous Secretary of State for Science, Innovation and Technology, risks being seen to offer unfair access to information on the UK’s approach to online safety and free speech and related matters. The Committee is also mindful of the new government and of the fact that your access to sufficiently up-to-date information is limited.
The Committee advises that a gap is necessary between your access to information, involvement with stakeholders and decisions in office, and taking up this work. The Committee determined that a 12-month gap from your last day in ministerial office would be appropriate.
Further, the Committee advises that your role must be limited to that described in this application – focussing on international online safety and free speech outside of the UK and avoiding any work on UK policy.
It is likely that some of Fullbrook Strategies’ clients will have a significant interest in the UK government and its approach to AI, digital and technology. There is a need to demonstrably separate your work from UK government policy and the lobbying work carried out within the company. Therefore, the Committee’s advice is that you should have no direct engagement with the UK government while you are subject to the Rules. This helps to mitigate the risk you may be seen to be making improper use of your contacts and influence. Fullbrook Strategies has confirmed your role can and will be separated from its lobbying work, and that adherence to the conditions in this letter will take precedence over any other contractual obligations.
The Committee advises, under the government’s Business Appointment Rules, that this appointment as Business Unit Manager and Advisor on International Online Safety and Free Speech with Fullbrook Strategies Limited should be subject to the following conditions:
- a waiting period of 12 months from your last day in ministerial office;
- you should not draw on (disclose or use for the benefit of yourself or the persons or organisations to which this advice refers) any privileged information available to you from your time in ministerial office;
- for two years from your last day in ministerial office, you should not become personally involved in lobbying the government or its arm’s length bodies on behalf of Fullbrook Strategies Limited (including parent companies, subsidiaries, partners and clients); nor should you make use, directly or indirectly, of your contacts in the government and/or Crown service to influence policy, secure business/funding or otherwise unfairly advantage Fullbrook Strategies Limited (including parent companies, subsidiaries, partners and clients);
- for two years from your last day in ministerial office, you should not undertake any work with Fullbrook Strategies Limited (including parent companies, subsidiaries, partners and clients) that involves providing advice on the terms of, or with regard to the subject matter of, a bid with, or contract relating directly to the work of government, or its arm’s length bodies; and
- for two years from your last day in ministerial office, your role must be limited to advising Fullbrook Strategies Limited and its clients on online safety and free speech outside of the UK. In doing so, you must not:
  - advise on any UK policy, regulations or operations including but not limited to matters related to misinformation, disinformation and/or frontier AI models; or
  - have any contact with the UK government on behalf of Fullbrook Strategies Limited (including parent companies, subsidiaries, partners and clients).
The advice and the conditions under the government’s Business Appointment Rules relate to your previous role in government only; they are separate from rules administered by other bodies such as the Office of the Registrar of Consultant Lobbyists, the Parliamentary Commissioner for Standards and the Registrar of Lords’ Interests.[footnote 7] It is an applicant’s personal responsibility to understand any other rules and regulations they may be subject to in parallel with this Committee’s advice.
By ‘privileged information’ we mean official information to which a minister or Crown servant has had access as a consequence of his or her office or employment and which has not been made publicly available. Applicants are also reminded that they may be subject to other duties of confidentiality, whether under the Official Secrets Act, the Civil Service Code or otherwise.
The Business Appointment Rules explain that the restriction on lobbying means that the former Crown servant/minister ‘should not engage in communication with government (ministers, civil servants, including special advisers, and other relevant officials/public office holders) – wherever it takes place – with a view to influencing a government decision, policy or contract award/grant in relation to their own interests or the interests of the organisation by which they are employed, or to whom they are contracted or with which they hold office.’
You must inform us as soon as you take up this role, or if it is announced that you will do so. You must also inform us if you propose to extend or otherwise change the nature of your role as, depending on the circumstances, it may be necessary for you to make a fresh application.
Once the appointment has been publicly announced or taken up, we will publish this letter on the Committee’s website, and where appropriate, refer to it in the relevant annual report.
4. Annex – Material Information
4.1 The role
Fullbrook Strategies Limited is an international PR, crisis management, and political campaigns company. It advises clients on political and non-political issues in the UK and internationally. It states it has expertise in strategy, research, strategic communications, campaigning and media management. Fullbrook Strategies has been registered as a consultant lobbyist since June 2022.[footnote 8] It provides services in the following sectors:
- government relations;
- election campaigns;
- counter fraud and financial crime;
- AI and cyber;
- defence;
- energy;
- infrastructure; and
- reputation management.
You told the Committee this will be a paid, part-time role as a Business Unit Manager and Advisor on International Online Safety and Free Speech in which you will:
- consult and advise clients on the topics of online safety and online free speech;
- advise on the feasibility of support and delivery of campaigns and/or awareness in the clients’ home countries – these clients will vary, for example charities, companies, and individuals;
- be advising outside of the UK – none of your work will have any overlap with UK policy on these topics;
- should you advise a company that operates in both the UK and abroad, only advise on non-UK matters, including speaking only with territory-specific staff outside of the UK;
- not advise on frontier AI models;
- not advise on mis/disinformation – including matters related to NSOIT (the National Security Online Information Team); and
- not lobby or have contact with the UK government.
You said that you have known the founder of Fullbrook Strategies, Mark Fullbrook, for many years.
4.2 Dealings in office
You confirmed that as Secretary of State you were responsible for the UK’s online safety policy, at DCMS and then DSIT, between Autumn 2022 and July 2024. You said that whilst you were responsible for the UK’s policy, you were not involved in starting or shaping debates in other nations or helping with campaigns outside of the UK.
Online safety
The Online Safety Act 2023[footnote 9] (the Act) was delivered when you were in office; you said that any information you held will now be out of date given that the Act is in force.
The Act is aimed at protecting children and adults online. Ofcom is the independent regulator – setting out steps providers can take to fulfil their safety duties in codes of practice. It has a broad range of powers to assess and enforce providers’ compliance with the framework. The Act:
- puts a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms;
- gives providers new duties to implement systems and processes to reduce risks that their services are used for illegal activity, and to take down illegal content when it does appear;
- requires platforms to prevent children from accessing harmful and age-inappropriate content and provide parents and children with clear and accessible ways to report problems online when they do arise;
- protects adult users, ensuring that major platforms are more transparent about which kinds of potentially harmful content they allow, and gives people more control over the types of content they want to see; and
- categorises online services based on their size and the level of risk they pose to users. Larger platforms with a higher risk profile face stricter regulations.
Mis/disinformation refers to the spread of false or inaccurate information. The difference lies in the intent, and so do the methods to counter them. Countering misinformation is likely to require education and fact-checking, whilst identifying and exposing the source of the malicious intent is needed to help counter disinformation. You said you do not consider your access to information in this space to be a particular risk. You described your main role in this area as rebranding the team that worked on it – the National Security Online Information Team (NSOIT – which was previously the Counter Disinformation Unit (CDU)).
NSOIT leads the UK’s operational response to online information threats.[footnote 10] It is a government unit within DSIT responsible for identifying and responding to online misinformation and disinformation threats, particularly those posed by foreign states, with a focus on risks to elections and use of AI and deepfakes.
You said that you do not have any knowledge on this topic beyond what is in the public domain. You also noted that you received access to some of the weekly reports the team produced (which contained no methodological information), some of which have since been released into the public domain in response to freedom of information requests.[footnote 11]
Frontier AI models
‘Frontier models’ are the most advanced, cutting-edge models at the forefront of what AI can achieve. Examples include ChatGPT (OpenAI), Meta AI (Meta) and Gemini (Google), as well as models from other providers geared towards more specialist services. Frontier models support input from text, images, and audio – these advanced capabilities and contextual reasoning make them effective at complex decision-making tasks. Frontier models are also being developed for use in cybersecurity – advancing and improving what is possible in threat detection, response, and analysis. These often have capabilities like natural language understanding, complex pattern recognition, and rapid adaptation to emerging threats, but also present unique security challenges due to their powerful capabilities and potential for misuse.
You said that your main role in developing AI policy was setting up the UK’s AI Safety Institute[footnote 12] (now the AI Security Institute[footnote 13]), including setting up the task force responsible for delivering it.[footnote 14] The Institute was set up in 2023 in response to the risks to national security arising from frontier systems. You said you have no access to relevant and up-to-date policy or privileged information in this area.
Information available publicly: the task force established an expert advisory board of the world’s leading experts in AI research/safety and key figures from the UK’s national security community. The AI Safety Summit held in November 2023 brought together key countries, as well as technology organisations, academics and civil society to inform national and international frameworks around the development of AI. This resulted in the Bletchley Declaration, signed by 28 countries, which was the start of global cooperation on AI safety. Since then, there have been two further AI summits – in Seoul in May 2024 and in France in 2025 – including delivery of the first International AI Safety Report, which provides scientific information, backed by tens of countries and their experts, to support informed policymaking.
Stakeholder meetings
You said that you met with several organisations with an interest in online harms and mis/disinformation, in discharge of your ministerial responsibilities. These included Apple, Google, Meta and Amazon. You said that all of these meetings were declared and minuted, and would show no conflict. You said that you did not meet with, or have any dealings with, Fullbrook Strategies in office.
Correspondence with Fullbrook Strategies
Fullbrook Strategies provided the Committee with the following information about how any conflicts between your role and this advice will be managed. It said:
- ‘Where any of the staff provided by FSL to the Client are subject to advice and authorisation provided by the government Advisory Committee on Business Appointments (ACOBA), the Services will be provided in compliance with such advice’;
- ‘For the avoidance of doubt where the terms of this Agreement and the advice and authorisation provided by ACOBA come into conflict the advice and authorisation provided by ACOBA shall be paramount.’;
- ‘Ms Donelan will not be engaged in any form of lobbying, will fully comply with ACOBA’s advice, and will not have access to any information relating to other clients or their lobbying activities.’;
- your employment contract will preclude you from lobbying on behalf of FSL or any of its clients; and
- ‘At FSL, we adhere strictly to the guidance issued by ACOBA. Our standard engagement contract clearly states that ACOBA’s advice takes precedence over any other contractual obligations. In addition, we operate with clearly defined internal protocols: all teams work within robust Chinese wall policies to maintain confidentiality between client matters.’
4.3 Departmental assessment
DSIT confirmed there is no contractual relationship between DSIT and Fullbrook Strategies, and you made no decisions specific to the company in office.
DSIT said that as the Secretary of State, you took policy and regulatory decisions in the online safety/protection/harms sphere, where you will focus this work with Fullbrook Strategies. This included mis/disinformation and the development of frontier AI models:
- In relation to frontier models – you had no statutory powers to make decisions relating directly to the release of frontier models. However, you did make decisions related to the government’s position and policy approach on this. DSIT noted that decisions taken on the Online Safety Act which impacted the sector have moved on now that the Act is live and in the public domain.
- Regarding mis/disinformation, DSIT considered you had access to information on NSOIT’s activities and methods, given previous proposals to transfer the team to the Home Office, and your role in refining the team’s monitoring categories in advance of the general election in 2024.
In relation to access to information, DSIT said:
- any access to privileged information relating specifically to online safety or freedom of speech will now likely be in the public domain (the Online Safety Act and the work on bias presented at the Bletchley AI Summit in 2023[footnote 15]) or be out of date;
- you will still be in possession of some privileged information regarding online safety policy, particularly around mis/disinformation and operations policy. It said that you had access to information relating to NSOIT’s activities and methods, given your role in refining the team’s monitoring categories in advance of the general election in 2024. This is likely to have some ongoing relevance and may overlap with the Free Speech aspects of this role. Overall, DSIT said that if the proposed role does not include mis/disinformation or matters related to NSOIT, the risk of conflict is reduced; and
- there is a risk, real or perceived, that working in this area could overlap with operational models and your meetings with DSIT’s stakeholders operating in this sphere. You had privileged information related to the potential national security risks from AI, though this is now largely out of date as model capabilities have continued to change over the last year. You also met with many stakeholders in office,[footnote 16] some of which were large, international companies – including Meta, Apple, Amazon, Google and WhatsApp. This offers you potential insight into these companies’ mis/disinformation and operations policy, and their views on online safety and how they are managing these changes.
4.4 Departmental recommendation
DSIT said there is a reasonable risk that your ministerial responsibilities will cross over with this proposed role. In particular, it noted that:
- Online disinformation policy is one of the department’s more sensitive areas internationally, with many key stakeholders being US-based companies that also operate in the UK.
- There is overlap with operational models and your meetings with DSIT’s stakeholders operating in this sphere.
- Some of the privileged information you had access to regarding online safety policy, particularly around mis/disinformation and operations policy, is likely to have some ongoing relevance and may overlap with the Free Speech part of this role.
Whilst DSIT said you were privy to operational details that will become less relevant over time, to reduce the risk of conflict it recommended:
- restrictions on the role to prevent you from working on relevant content and policy areas, specifically the UK’s policy, operation and regulation; and
- preventing any work on mis/disinformation or matters related to NSOIT.
Footnotes

1. This application for advice was considered by Isabel Doverty; Hedley Finn OBE; Sarah de Gay; Dawid Konotey-Ahulu CBE DL; The Baroness Thornton; and Mike Weir. Andrew Cumpsty was unavailable. Michael Prescott was recused. ↩
2. https://orcl.my.site.com/CLR_Public_Profile?id=0014J00000k01NMQAY ↩
3. https://www.gov.uk/government/publications/ai-safety-institute-overview#:~:text=It%20will%20work%20towards%20this,AI%20and%20enable%20its%20governance – now the AI Security Institute https://www.aisi.gov.uk/ ↩
4. Countering misinformation is likely to be about education and checking facts that need correcting. ↩
5. Countering disinformation is more likely to be about identifying and exposing the source of the malicious intent. ↩
6. NSOIT leads the UK’s operational response to online information threats. It is a government unit within DSIT responsible for identifying and responding to online misinformation and disinformation threats, particularly those posed by foreign states, with a focus on risks to elections and the use of AI and deepfakes: https://www.gov.uk/government/publications/national-security-online-information-team-privacy-notice/national-security-online-information-team-privacy-notice#:~:text=The%20NSOIT%20leads%20the%20UK,to%20cause%20harm ↩
7. All Peers and Members of Parliament are prevented from paid lobbying under the House of Commons Code of Conduct and the Code of Conduct for Members of the House of Lords. Advice on obligations under the Code can be sought from the Parliamentary Commissioner for Standards, in the case of MPs, or the Registrar of Lords’ Interests, in the case of peers. ↩
8. https://orcl.my.site.com/CLR_Public_Profile?id=0014J00000k01NMQAY ↩
9. https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer ↩
10. https://www.gov.uk/government/publications/national-security-online-information-team-privacy-notice/national-security-online-information-team-privacy-notice#:~:text=The%20NSOIT%20leads%20the%20UK,to%20cause%20harm ↩
11. https://bigbrotherwatch.org.uk/wp-content/uploads/2024/05/FOI2024-00576-NSOIT-Copy-compliance-policy_Redacted.pd ↩
12. https://www.gov.uk/government/publications/ai-safety-institute-overview#:~:text=It%20will%20work%20towards%20this,AI%20and%20enable%20its%20governance ↩
13. https://www.aisi.gov.uk/ ↩
14. https://www.gov.uk/government/publications/frontier-ai-taskforce-first-progress-report/frontier-ai-taskforce-first-progress-report ↩
15. https://www.gov.uk/government/topical-events/ai-safety-summit-2023 ↩
16. https://www.gov.uk/government/collections/dsit-ministerial-gifts-hospitality-travel-and-meetings ↩