Research and analysis

Intimidation in Public Life: progress report on recommendations

Published 17 December 2020

1. Summary of progress made against the report’s recommendations

1.1 Government

The government has made progress in a number of areas. In 2019, it published its Online Harms White Paper, which established a new regulatory framework for online safety, including a statutory duty of care to make companies take more responsibility for the safety of their users. This will be backed up by an independent online harms regulator. However, the government has not committed to bringing forward legislation to shift the liability for illegal content online towards social media companies.

As per our recommendation, the government consulted on the introduction of a new electoral offence of intimidation of candidates and campaigners during elections. It has committed to legislating for this offence when parliamentary time allows. Similarly, the government published legislation in 2018 to remove the requirement for candidates standing as local councillors to have their home addresses published on the ballot paper. These provisions came into force for the polls on 2 May 2019.

1.2 Political parties

In 2017, we found that political parties needed to do more to protect their candidates from intimidation – to show leadership in setting an appropriate tone for candidates and supporters; to tackle intimidatory behaviour undertaken by their members; and to provide support to their candidates who face intimidation during elections.

Political parties have made progress in a number of key areas, but there is still work to be done in others.

All of the political parties represented in Westminster now have in place their own Code of Conduct, which sets out the minimum standards of behaviour expected of their members. The party codes all prohibit bullying, harassment and unlawful discrimination – conduct that clearly falls within the scope of intimidation. Some of the codes list further categories of behaviour that will not be tolerated by parties, including victimisation, abuse and hateful language. Many of the codes explicitly refer to the positive behaviours expected of party members, including fairness, respect, tolerance and dignity, as well as the expectation that members will challenge unacceptable behaviour where it occurs. This is a significant step forward.

Similarly, each party has in place its own internal disciplinary process for dealing with alleged breaches of the party’s code. A range of sanctions are included in those frameworks, including formal warning, suspension from party membership, prohibition from holding office or standing for election, and revocation of party membership. It is not clear to what extent parties enforce the full range of sanctions available to them to discipline intimidatory behaviour by their members. We would like to see all parties collecting data on the number of complaints against members for engaging in intimidation and the outcome of any disciplinary process resulting from these complaints.

We have been working with the Jo Cox Foundation since 2019 on the recommendation that political parties work together to develop a joint code of conduct on intimidatory behaviour. That work has resulted in a high-level statement of principle outlining the minimum standards of behaviour that all party members should aspire to. We welcome support for the statement from the Labour Party, the Scottish National Party, the Liberal Democrats, Plaid Cymru, and the Green Party.

1.3 Policing

In 2017, we found that the approach taken on intimidation offences by local police forces was inconsistent. To that end, we recommended better training and guidance.

In line with our recommendation, the National Police Chiefs' Council published joint guidance with the Crown Prosecution Service, the College of Policing, and the Electoral Commission in 2019, on behaviour that candidates may experience during an election campaign and that is likely to constitute a criminal offence. We were pleased to see that the guidance includes practical advice on how to protect yourself, as well as legal definitions and what might constitute a breach of criminal law.

We were also pleased to see that the College of Policing has updated their Authorised Professional Practice for elections to include information on the Committee’s report, intimidation and the police’s responsibility to mitigate and investigate allegations related to intimidation.

1.4 Social media

In 2017, we found that social media had been the most significant factor enabling intimidation in recent years. We were concerned that not enough was being done by social media companies to proactively address intimidation online.

All three social media companies (Twitter, Facebook and Google) now have measures in place to protect their users from intimidation and harassment. These include policies and guidelines that are regularly reviewed and updated, mechanisms to identify and remove abusive content, and reporting channels for users to report content that violates their policies. They also all give users options to control the content they see and who they can interact with online. These include block, mute and safe search functions.

In line with our recommendation, all three companies now publish transparency data on reported content and takedowns. This is a significant step forward. However, none of Twitter, Facebook or Google appears to publish data on the time it takes to remove reported content. Publishing this data would help satisfy the Committee that social media companies are able to make decisions quickly and consistently on the takedown of intimidatory content.

All three companies established temporary election teams during the 2019 General Election to protect the integrity of election-related content and identify and respond more quickly to potential threats and challenges, including removing intimidatory content. We were pleased to see that Facebook has since established a permanent reporting channel for MPs to flag abusive or threatening content, which runs year round for sitting MPs and is extended for Parliamentary candidates during elections.

We were also pleased to see that all three companies shared bespoke election and safety resources with MPs, political parties and the government, ahead of the General Election.

We were disappointed to see that social media companies have not adequately revised their tools for users to escalate potential illegal online activity to the police. We said in 2017 that general statements, such as “remember that you should contact local law enforcement if you ever feel threatened by something you see on Facebook”, do not help users to constructively engage with the police. It remains our view that social media companies have a responsibility to advise their users about how to escalate any credible threats they receive.

1.5 Press regulators

Press regulators IPSO and Impress both wrote this year to update the Committee.

It is clear that the Editors’ Code of Practice Committee, who oversee IPSO’s Code of Practice, acknowledge that intimidation is a problem for all those in public life, and that their Code is robust and protects individuals in a range of circumstances, including discrimination and harassment. They have satisfied the Committee that editors exercise discretion over their own editorial content and language, and that they are open to criticism and can be held to account by the public and those in public life. Editors must comply with the Code and the law. We were glad to hear that publishers are responsible for their freelancers’ work, which must also comply with the Code.

We were pleased to see that Impress is currently undertaking a comprehensive review of their Standards Code, considering issues around discrimination, harassment, online threats and intimidation. They intend to publish a new version of the Code in July 2022.

2. Recommendation tracker

Recommendation Responsibility Progress Progress rating
Government should bring forward legislation to shift the liability of illegal content online towards social media companies. Government The government published the joint DCMS-Home Office Online Harms White Paper in 2019. In that White Paper, the government said that shifting liability for illegal content is not the most effective mechanism for driving behavioural change by companies. Instead, they have increased the responsibility on companies in a way that is compatible with existing law. Amber
The government should consult on the introduction of a new offence in electoral law of intimidating Parliamentary candidates and party campaigners. Government In 2018, the government consulted on the introduction of a new electoral offence of intimidation of all candidates and campaigners during elections. The government has committed to legislating for this offence when parliamentary time allows. Green
The government should bring forward legislation to remove the requirement for candidates standing as local councillors to have their home addresses published on the ballot paper. Returning Officers should not disclose the home addresses of those attending an election court. Government The government published legislation in 2018 to remove the requirement for candidates standing as local councillors to have their home addresses published on the ballot paper. These provisions came into force for the polls on 2 May 2019. Green
Political parties should set clear expectations about the behaviour expected of their members, both offline and online through a code of conduct for members which specifically prohibits any intimidatory behaviour. Parties should ensure that members are familiar with the code. The consequences of any breach of the code should be clear and unambiguous. Political parties All of the political parties represented in Westminster have in place their own Code of Conduct, which sets out the minimum standards of behaviour expected of all party members. Green
Political parties must ensure that party members who breach the party’s code of conduct by engaging in intimidation are consistently and appropriately disciplined in a timely manner. Political parties All of the political parties represented in Westminster have in place their own internal disciplinary process for dealing with alleged breaches of the party’s code to ensure appropriate and timely discipline. It is not clear to what extent parties use the full range of sanctions available to them. Amber
Political parties must collect data on the number of complaints against members for engaging in intimidatory behaviour, and the outcome of any disciplinary processes which result from these complaints. Political parties Some of the political parties represented in Westminster have written to confirm they collect data on the number of complaints against members for engaging in intimidatory behaviour, and the outcome of any disciplinary processes which result from these complaints. Amber
The political parties must work together to develop a joint code of conduct on intimidatory behaviour during election campaigns by December 2018. The code should be jointly enforced by the political parties. Political parties The Committee has been working with The Jo Cox Foundation to take forward this recommendation. Our work has resulted in a high-level statement of principle outlining the minimum standards of behaviour that all party members should aspire to. We have written to political parties represented in Westminster asking them to confirm their support for the statement in time for the 2021 local elections. We welcome support for the statement from the Labour Party, the SNP, the Liberal Democrats, Plaid Cymru and the Green Party. We will be publishing formal replies from other political parties as they come in. Amber
Political parties must take steps to provide support for all candidates, including through networks, training, and support and resources. In particular, the parties should develop these support mechanisms for female, BAME, and LGBT candidates who are more likely to be targeted as subjects of intimidation. Political parties Some of the political parties represented in Westminster have written to confirm they provide training and support for candidates on intimidation, including social media training. Amber
Political parties must offer more support and training to candidates on their use of social media. This training should include: managing social media profiles, block and mute features, reporting content, and recognising when behaviour should be reported directly to the police. Political parties Some of the political parties represented in Westminster have written to confirm they provide training and support for candidates on intimidation, including social media training. Amber
The Home Office and the Department for Digital, Culture, Media and Sport should develop a strategy for engaging with international partners to promote international consensus on what constitutes hate crime and intimidation online. Home Office and the Department for Digital, Culture, Media and Sport In 2018, the government committed to developing a strategy for engaging with international partners to promote international consensus on what constitutes hate crime and intimidation online. The government is currently working with international partners on this issue. Amber
The National Police Chiefs Council should ensure that local police forces have sufficient training to enable them to effectively investigate offences committed through social media. Local police forces should be able to access advice and guidance on the context in which MPs and Parliamentary candidates work. National Police Chiefs Council The NPCC wrote to confirm that all police forces have access to training provided by the College of Policing and that police forces continue to receive advice and guidance through designated single points of contact in each force area. Green
The College of Policing Authorised Professional Practice for elections should be updated to include offences relating to intimidation, including offences committed through social media. College of Policing The College of Policing’s Authorised Professional Practice for elections has been updated to include information on the Committee’s report, intimidation and the police’s responsibility to mitigate it and investigate allegations and offences related to intimidation. Green
The National Police Chiefs Council, working with the Crown Prosecution Service and the College of Policing, should produce accessible guidance for Parliamentary candidates giving clear advice on behaviour they may experience during a campaign which is likely to constitute a criminal offence. National Police Chiefs Council, working with the Crown Prosecution Service and the College of Policing In 2019, the NPCC published joint guidance with the Crown Prosecution Service, the College of Policing and the Electoral Commission, about behaviour which candidates in elections may experience during a campaign which is likely to constitute a criminal offence. Green
Social media companies must develop and implement automated techniques to identify intimidatory content posted on their platforms. They should use this technology to ensure intimidatory content is taken down as soon as possible. Social media companies Twitter, Facebook and Google have written to confirm they use automated techniques to help them identify and take down intimidatory content faster. Green
Social media companies must do more to prevent users being inundated with hostile messages on their platforms, and to support users who become victims of this behaviour. Social media companies Twitter, Facebook and Google have written to confirm the steps they are taking to prevent users being victims of intimidation and harassment on their platforms. Green
Social media companies must implement tools to enhance the ability of users to tackle online intimidation through user options. Social media companies Twitter, Facebook and Google have written to confirm they offer a range of tools and user options to enhance the ability of users to tackle online intimidation. Green
All social media companies must ensure they are able to make decisions quickly and consistently on the takedown of intimidatory content online. Social media companies Twitter, Facebook and Google have written to confirm the steps they are taking to ensure they’re able to make decisions quickly and consistently on the takedown of intimidatory content. Amber
Twitter, Facebook and Google must publish UK-level performance data on the number of reports they receive, the percentage of reported content that is taken down, and the time it takes to take down that content, on at least a quarterly basis. Social media companies Twitter, Facebook and Google have written to confirm they now publish transparency data on reported content and takedowns. None of Twitter, Facebook or Google publishes data on the time it takes to remove reported content. Amber
Social media companies must urgently revise their tools for users to escalate any reports of potential illegal online activity to the police. Social media companies Neither Twitter nor Facebook has revised their tools for users to escalate reports of potential illegal online activity to the police. Google has written to confirm they have systems in place to escalate content that may be unlawful. Red
The social media companies should work with the government to establish a ‘pop up’ social media reporting team for election campaigns. Social media companies Twitter, Facebook and Google have written to confirm they established cross-functional election teams to monitor and respond to challenges, including quickly taking down intimidatory content, during the 2019 General Election. Green
Social media companies should actively provide advice, guidance and support to Parliamentary candidates on steps they can take to remain safe and secure while using their sites. Social media companies Twitter, Facebook and Google have written to confirm they actively provide advice, guidance and support to Parliamentary candidates on steps they can take to remain safe and secure while using their sites. Green
Press regulation bodies should extend their codes of conduct to prohibit unacceptable language that incites intimidation. Press regulation bodies (IPSO and Impress) In 2018, the Editors’ Code of Practice Committee, who oversee IPSO’s Code of Practice, wrote to say that they would not be amending the code to prohibit unacceptable language that incites intimidation. They acknowledged that online intimidation is a disturbing aspect of public life but said that the Code already protects individuals in a range of circumstances, including discrimination and harassment. Impress has written to confirm they are undertaking a comprehensive review of their Standards Code over the next 18 months with the intention of publishing a new version of the Code in July 2022. It will consider issues around discrimination, harassment, online threats and intimidation. Amber
News organisations should only consider stories from freelance journalists that meet the standards of IPSO’s Editors’ Code, or the Editorial Guidelines of Impress, as appropriate, and ensure that freelance journalists are aware of this policy. News organisations In 2020, the Committee followed up with the Editors’ Code of Practice Committee, who oversee IPSO’s Code of Practice. They said that publishers are responsible for the content they publish, including that from freelancers. Editors who publish freelancers’ work must ensure that it complies with the Code of Practice. They also said that freelancers would have signed a contract saying they would abide by the Code. Green

3. Recommendation tracker: further information

Recommendation Responsibility More Information
Government should bring forward legislation to shift the liability of illegal content online towards social media companies. Government The Online Harms White Paper establishes a new regulatory framework for online safety and sets out a range of legislative and non-legislative measures for tackling online harms, including a statutory duty of care to make companies take more responsibility for the safety of their users and to tackle harm caused by illegal and harmful content on their platforms. This will be overseen by an independent online harms regulator, which will set clear safety standards, backed up by reporting requirements and enforcement powers.
The government should consult on the introduction of a new offence in electoral law of intimidating Parliamentary candidates and party campaigners. Government In May 2019, the government announced a range of new measures intended to safeguard UK elections from intimidation, influence and disinformation. As well as committing to legislate to introduce a new electoral offence of intimidating candidates or campaigners, the government has committed to: legislate to clarify the electoral offence of undue influence of a voter; implement a digital imprints regime for online campaigning material; and launch a consultation on electoral integrity.
The government should bring forward legislation to remove the requirement for candidates standing as local councillors to have their home addresses published on the ballot paper. Returning Officers should not disclose the home addresses of those attending an election court. Government The government made four statutory instruments that removed the preexisting requirement for candidates at local, parish and mayoral elections to publish their home addresses during the election process and be included on the ballot paper at these polls. A candidate may now choose that their home address is not made public and that the local authority area in which they live appears on the ballot paper instead.
Political parties should set clear expectations about the behaviour expected of their members, both offline and online through a code of conduct for members which specifically prohibits any intimidatory behaviour. Parties should ensure that members are familiar with the code. The consequences of any breach of the code should be clear and unambiguous. Political parties In 2019, the Committee reviewed the codes of conduct of political parties holding seats in Westminster, considering scope, themes and language. The party codes all prohibit bullying, harassment and unlawful discrimination – conduct that clearly falls within the scope of intimidation. Some of the codes list further categories of behaviour that will not be tolerated by parties, including victimisation, abuse and hateful language. As well as precluding poor behaviour, many of the codes refer explicitly to the positive behaviours expected by party members – fairness, respect, tolerance and dignity are common themes. Several of the codes also include the expectation that members will challenge unacceptable behaviour where it occurs. The codes vary in their approach to referring to procedures for alleged breaches of the code.
Political parties must ensure that party members who breach the party’s code of conduct by engaging in intimidation are consistently and appropriately disciplined in a timely manner. Political parties A range of sanctions are included in parties’ disciplinary frameworks, should any member breach their party’s code of conduct. These include: formal warning; reprimand; suspension from party membership; prohibition from holding office or standing for election to any specified party role, permanently or for a specified period; and revocation of party membership. It is not clear to what extent parties enforce the full range of sanctions available to them to discipline intimidatory behaviour by their members.
Political parties must collect data on the number of complaints against members for engaging in intimidatory behaviour, and the outcome of any disciplinary processes which result from these complaints. Political parties See correspondence from the Liberal Democrats; the Labour Party; and the Green Party. The Committee has not heard from other parties on this issue.
The political parties must work together to develop a joint code of conduct on intimidatory behaviour during election campaigns by December 2018. The code should be jointly enforced by the political parties. Political parties The Committee has been working with The Jo Cox Foundation since 2019. We have been working with political parties to draw up a common statement of principle on intimidatory behaviour to encourage cross-party consensus on the issue. Our work to date has evolved through close collaboration with the political parties and draws on the language of existing party codes. The resulting statement is not intended to supersede or replace party codes of conduct or disciplinary processes, but to complement them by acting as a high level statement of principle outlining the minimum standards of behaviour that all party members should aspire to. See the joint statement on the conduct of political party members.
Political parties must take steps to provide support for all candidates, including through networks, training, and support and resources. In particular, the parties should develop these support mechanisms for female, BAME, and LGBT candidates who are more likely to be targeted as subjects of intimidation. Political parties See correspondence from the Liberal Democrats; the Labour Party; the Conservative Party; the Green Party; and Plaid Cymru. The Committee has not heard from other parties on this issue.
Political parties must offer more support and training to candidates on their use of social media. This training should include: managing social media profiles, block and mute features, reporting content, and recognising when behaviour should be reported directly to the police. Political parties See correspondence from the Liberal Democrats; the Labour Party; the Conservative Party; the Green Party; and Plaid Cymru. The Committee has not heard from other parties on this issue.
The National Police Chiefs Council should ensure that local police forces have sufficient training to enable them to effectively investigate offences committed through social media. Local police forces should be able to access advice and guidance on the context in which MPs and Parliamentary candidates work. National Police Chiefs Council The NPCC has established a network under Operation Bridger, the mechanism by which the Parliamentary Liaison and Investigation Team supports single points of contact within each police force area. The Parliamentary Liaison and Investigation Team use this network to disseminate knowledge and learning to single points of contact across the country. This includes advice and guidance on the current threat picture for MPs, the context of the political landscape and other emerging issues, including online activity against MPs.
The College of Policing Authorised Professional Practice for elections should be updated to include offences relating to intimidation, including offences committed through social media. College of Policing The Authorised Professional Practice for elections sets out the key roles for the police service and other agencies who need to work together to ensure an effective police response to elections, including their responsibilities for recognising and responding to harassment, intimidation or threatening behaviour.
The National Police Chiefs Council, working with the Crown Prosecution Service and the College of Policing, should produce accessible guidance for Parliamentary candidates giving clear advice on behaviour they may experience during a campaign which is likely to constitute a criminal offence. National Police Chiefs Council, working with the Crown Prosecution Service and the College of Policing The guidance was updated ahead of the 2019 General Election. It includes key indicators, offences and ways to protect yourself, legal definitions and what might constitute a breach of criminal law. A copy of the guidance was disseminated to political parties and returning officers during the General Election and briefings were held in each police force area for candidates.
Social media companies must develop and implement automated techniques to identify intimidatory content posted on their platforms. They should use this technology to ensure intimidatory content is taken down as soon as possible. Social media companies Twitter, Facebook and Google are increasingly using technology like machine learning to scale the efforts of human moderators and proactively identify, review and remove content that violates their policies: during the reporting period January to June 2019, more than 50% of Tweets that Twitter took action on for abuse were proactively identified using technology; the proportion of hate speech removed by Facebook before it is reported has tripled since 2017; and between April and June 2020, YouTube removed 11.4 million videos for violating their guidelines, 10.8 million of which were flagged by machines, with 42% not receiving a single view.
Social media companies must do more to prevent users being inundated with hostile messages on their platforms, and to support users who become victims of this behaviour. Social media companies All three social media companies have measures in place to protect their users from intimidation and harassment. These include policies and guidelines that outline what is and is not allowed on their platforms; mechanisms to detect and remove abusive content; and reporting channels for users to report content and behaviour that violates their policies. See correspondence from Twitter, Facebook and Google.
Social media companies must implement tools to enhance the ability of users to tackle online intimidation through user options. Social media companies Twitter, Facebook and Google give users a range of options to control the content they see, what others see and who they can interact with online. These include advanced filter settings that allow users to disable notifications from certain types of accounts, and block, mute and safe search functions. For example: in August 2020, Twitter made conversation controls available to all users, meaning that before you Tweet, you can choose who can reply (everyone; only people you follow; or only people you mention); people who manage Facebook pages, like those for candidates, can hide or delete individual comments, and can also moderate posts by turning on the profanity filter or by blocking specific words that they do not want to appear on their page; and account holders on YouTube can delete, turn off or manage comments on their videos by requiring pre-approval before they are posted publicly.
All social media companies must ensure they are able to make decisions quickly and consistently on the takedown of intimidatory content online. Social media companies All three social media companies have measures in place to ensure they’re able to make decisions quickly and consistently on the takedown of intimidatory content. This includes using both human moderators and machine learning systems to identify and remove harmful content at speed, often before users have seen it; increasing the number of people working to address content that might violate their policies; and providing tools for users and organisations to report intimidatory content. See correspondence from Twitter, Facebook and Google.
Twitter, Facebook and Google must publish UK-level performance data on the number of reports they receive, the percentage of reported content that is taken down, and the time it takes to take down that content, on at least a quarterly basis. Social media companies Facebook and Google both publish quarterly transparency reports on the number of reports they receive and the amount of content they take down. Twitter publishes transparency data biannually, as below: Facebook’s transparency reports include a Community Standards Enforcement report that publishes data on the amount of content taken down and a proactive removal score that shows how much content was found before it was reported. It now includes a report on the amount of content actioned for violating their bullying and harassment policy; YouTube’s transparency report provides aggregate data about the flags they receive and the actions they take to remove videos and comments that violate their policies. It also breaks the data down by country or region. Google also publishes a broader transparency report for their services, detailing the number of requests they have received from government or law enforcement to remove or delist content; and Twitter has recently launched a transparency centre, which covers an array of transparency efforts, including information requests, removal requests, copyright and trademark notices, email security and platform manipulation. It also includes a rules enforcement report, which publishes data about the number of accounts actioned and suspended and the amount of content removed for breaching their rules.
Recommendation: Social media companies must urgently revise their tools for users to escalate any reports of potential illegal online activity to the police.

Responsibility: Social media companies

Progress: Twitter and Facebook both have guidelines for law enforcement to support their work, but do not provide adequate advice to users on how to escalate illegal online activity to the police. Their guidance on reporting abusive behaviour and content tends to advise users to “contact [their] local law enforcement agency”, which does not help them to engage constructively with the police when they are facing illegal abuse online. Google said that YouTube operates a “notice-and-action system” for users and government authorities to report content which may be unlawful in local jurisdictions, and provides a tool to help users report content they believe should be removed from YouTube under applicable laws.
Recommendation: The social media companies should work with the government to establish a ‘pop-up’ social media reporting team for election campaigns.

Responsibility: Social media companies

Progress: Twitter, Facebook and Google all established temporary election teams during the 2019 General Election to protect the integrity of election-related content and to identify and respond more quickly to potential threats and challenges, including by removing intimidatory content. Facebook has since established a permanent reporting channel for MPs to flag abusive or threatening content, which runs year-round for sitting MPs and is extended to parliamentary candidates during elections. All three companies also run “trusted flagger” programmes, which allow organisations such as the UK Parliament’s Members’ Security Support Service to expedite reports directly to them.
Recommendation: Social media companies should actively provide advice, guidance and support to parliamentary candidates on steps they can take to remain safe and secure while using their sites.

Responsibility: Social media companies

Progress: Twitter, Facebook and Google all work closely with the parliamentary authorities, political parties and the government to ensure that MPs and candidates are properly supported in staying safe online. Ahead of the 2019 UK General Election, they published joint candidate safety guidance on the types of content and activity that breach their terms and conditions and on how candidates can report content that violates their policies. All three companies shared bespoke election and safety resources with sitting MPs, political parties and the government. Twitter also said they offer safety and security training for parties and candidates, and drop-in sessions for MPs and their staff.
Recommendation: Press regulation bodies should extend their codes of conduct to prohibit unacceptable language that incites intimidation.

Responsibility: Press regulation bodies (IPSO and Impress)

Progress: In 2020, the Committee followed up with the Editors’ Code of Practice Committee on why it would not be amending the Code. The Committee said that:

- the press regulated by IPSO should not be held responsible for the activities of unregulated individuals acting as online trolls, and a change to the Code would unduly restrict freedom of expression and the right of the press to report and comment on public affairs;
- editors exercise discretion over their own editorial content and use of language, and they are open to criticism and are called to account by their readers and those in public life. The Code does not and should not seek to dictate the language of journalism, which is entirely a matter for editors, who must comply with the Code and the law; and
- the Editors’ Code seeks to balance the rights of the individual and the public’s right to know. It states that it should be interpreted “neither so narrowly as to compromise its commitment to respect the rights of the individual, nor so broadly that it infringes the fundamental right to freedom of expression – such as to inform, to be partisan, to challenge, shock, be satirical and to entertain – or prevents publication in the public interest.” It is in the interests of everyone that the press is free to campaign, scrutinise and criticise those exercising the power of the state. Those rights are balanced by responsibilities, and the Code offers protection to individuals, including those in public life.
Recommendation: News organisations should only consider stories from freelance journalists that meet the standards of IPSO’s Editors’ Code, or the Editorial Guidelines of Impress, as appropriate, and ensure that freelance journalists are aware of this policy.

Responsibility: News organisations

Progress: The preamble to IPSO’s Code of Practice says that “It is the responsibility of editors and publishers to apply the Code to editorial material in both printed and online versions of their publications. They should take care to ensure it is observed rigorously by all editorial staff and external contributors, including non-journalists.” The Editors’ Codebook, which explains how the Code is interpreted and enforced by IPSO, gives examples of how this works in practice in the section devoted to the preamble.