Automated vehicles: protecting marketing terms
Published 10 June 2025
Applies to England, Scotland and Wales
Introduction
The Automated Vehicles Act 2024 (the 2024 act) received royal assent in May 2024.[footnote 1] The act gives the Secretary of State for Transport the power to protect certain terms, so that they can only be used to market vehicles authorised as being automated (self-driving). Protected terms[footnote 2] must not be used to market driver assistance systems.
In this consultation, we seek views on which terms should be protected.
Background
The law sets out an authorisation process to determine whether a vehicle can safely drive itself without being controlled or monitored by a human. This process would be undermined if businesses were able to claim that their vehicles are self-driving without getting them authorised.
Such marketing is also dangerous: it can mislead drivers into thinking that they do not need to pay attention to the road. As more manufacturers offer high-end driver assistance systems, misleading marketing risks worsening road safety. It could also undermine trust in the self-driving sector.
The 2024 act therefore includes 2 new marketing offences.
- The first offence – the ‘protected terms’ offence – restricts the use of listed terms. The list must be set out in secondary legislation, following consultation. We are therefore issuing this consultation.
- The second offence – the ‘confusion’ offence – is broader: it applies to any commercial communication likely to confuse end-users into thinking that an unauthorised vehicle can travel autonomously. The offence does not need secondary legislation, so is not the subject of this consultation. However, it complements the first offence: marketing may constitute an offence if it is likely to confuse, even if it does not use terms on the protected list.
The 2024 act sets out a procedure to authorise self-driving vehicles; this procedure is not yet in force. Under the current law, vehicles that are capable of safely driving themselves may be listed under section 1 of the Automated and Electric Vehicles Act 2018 (the 2018 act). At present, listed vehicles will be treated as authorised vehicles for the purposes of the misleading marketing offences (2024 act, s 81(5)). However, once the authorisation procedure comes into force, it will supersede the listing procedure under the 2018 act.
Therefore, under the current law, only listed vehicles will be allowed to be marketed using the protected terms. In future the intention is that only authorised vehicles will be allowed to be marketed using protected terms (for example ‘self-driving’, if this term is specified in the regulations). So, the protected terms offence will only apply where the vehicle is neither listed nor authorised.
Geographic scope
The relevant sections of the 2024 act apply to England, Wales and Scotland. This consultation also applies to England, Wales and Scotland.
Automated Vehicles Act implementation programme
This consultation is being published alongside a call for evidence on the statement of safety principles. Together, these represent the first part of a wider programme of secondary legislation to implement the 2024 act. We are targeting full implementation of the regulatory framework in the second half of 2027.
The implementation programme is purposefully designed to prioritise the development of a regulatory framework that maximises the potential for innovation and growth, strengthens public confidence and ensures public safety. The timeline for implementation of the 2024 act incorporates extensive public consultation and we welcome your response as part of this process.
Outline
This document has 4 chapters.
In chapter 1, we summarise the offences in the 2024 act.
In chapter 2, we consider the need to protect certain terms, summarising evidence that the public are confused about what is and what is not self-driving.
In chapter 3, we consider which terms should be protected.
In chapter 4, we ask for views.
The final section, ‘How would the protections work in practice?’, provides practical examples of how the offences would apply.
1. The new marketing offences: a summary
The Automated Vehicles Act 2024 includes 2 new marketing offences, based on the Law Commissions’ recommendations.[footnote 3]
- Section 78 restricts the use of certain terms to authorised automated vehicles (AVs). The exact list of terms will be set out in secondary legislation, following consultation. It may include words, expressions, symbols or marks.
- Section 79 is broader. It applies to commercial communications likely to confuse end-users into thinking that unauthorised vehicles can travel autonomously.
The purpose of the offences is straightforward: they are designed to prevent businesses from misleading drivers into thinking that unauthorised vehicles can drive themselves. However, some of the detail of the offences is complex.
Here we provide a brief overview of how the offences work. Those who wish to understand these offences fully should refer to the act itself.
Limited scope
The offences are not intended to limit academic discussion. Nor do they apply to communications between developers and investors. Essentially, the act limits the offences to promotional communications directed at end-users. We consider each requirement in turn.
Promotional communications
The offences only apply to those acting in the course of business. Furthermore, the communication must be ‘in connection with the promotion or supply’ of products or services. The product in question depends on the offence.
- Section 78(2) applies to terms used ‘in connection with the promotion or supply of a road vehicle’.
- Section 78(3) applies to terms used ‘in connection with the promotion or supply of a product intended for use as equipment of a road vehicle’. This includes software. An example could be an aftermarket kit, sold ‘to make your car self-driving’.
- Section 79(1) applies to the promotion or supply of ‘any product or service’, if the communication is likely to confuse end-users of road vehicles. The section would apply, for example, to a car manufacturer’s advert showing an unauthorised model of car operating without a driver. However, the confusion offence goes slightly further than this. It would also cover a similar advert that was ostensibly for insurance rather than the car itself.
Directed at end-users
The offences only apply if the communication is ‘directed at an end-user or potential end-user of the vehicle’. The term ‘end-user’ is defined in section 81(1) as ‘a person who uses the vehicle on a road or other public place’ for purposes which are not to do with ‘the development, manufacture or supply of the vehicle’.
The concept of ‘using a motor vehicle’ appears many times in road traffic legislation, most notably in the offence of failing to insure under section 143 of the Road Traffic Act 1988. Although the meaning has yet to be determined in the context of the 2024 act, previous case law[footnote 4] says there must be some element ‘of controlling, managing or operating the vehicle’.[footnote 5]
This means that the offence only applies if there is potential for the vehicle to be owned or driven by people who are not connected to the development or supply of the vehicle.
Examples
A developer sends out a press release stating that it is conducting self-driving trials with paying customers. The trial vehicles are not yet authorised and use a safety driver. This is not an offence: neither the passengers nor the safety drivers are end-users.
A manufacturer has sold its vehicles to consumers. It then offers vehicle owners a trial of ‘self-driving software’, which is unauthorised. If ‘self-driving’ is a protected term, this will be an offence. The vehicle owners are end-users.
Authorised vehicles
The offences do not apply where vehicles are authorised as automated vehicles under Part 1 of the 2024 act. If the marketing offences come into effect before the authorisation scheme, vehicles listed under the Automated and Electric Vehicles Act 2018 are similarly exempt.[footnote 6]
Defences
The section 78 offence is subject to 3 possible defences.
Terms used in a way not connected with automation
Under section 78(4), a protected term can be used in connection with a road vehicle provided that it ‘was used in a way that was not intended to convey, and could not reasonably have been understood as conveying, any meaning to do with automation’.
Example
A firm advertises ‘self-drive van for hire’. It might be clear that, as the van does not come with a driver, the person hiring the van must drive it themselves. The use of the term ‘self-drive’ in this context may not be an offence if it cannot reasonably be understood as referring to automation.
Terms directed at end-users from outside Great Britain
Under section 78, it may be an offence to direct protected terms at end-users from another country, if it is ‘reasonable to anticipate’ that the term will come to the attention of end-users in Great Britain. However, the business has a defence if it can prove that it has taken ‘all reasonable precautions and exercised all due diligence’ to ensure that end-users here understand that the term was not directed at them.[footnote 7]
For the confusion offence, a similar defence applies under section 79(3).
Example
A manufacturer sets up a website which may be accessed in Great Britain, claiming that its vehicle has been authorised to drive itself in California. This is not an offence provided the manufacturer can show that it had taken all reasonable precautions and exercised all due diligence to ensure that end-users would understand that the vehicle cannot drive itself on roads in Great Britain.
Secondary involvement
Special rules apply to secondary businesses, which are not involved in the manufacture or supply of the vehicle or equipment in question. Under section 78(6), such businesses have a defence if they can show that they did not know, and had no reason to suspect, that the communication would constitute an offence.
For the confusion offence, a similar defence applies under section 79(4).
Example
A newspaper carries an advert wrongly describing a new car model as ‘self-driving’. At first sight, the newspaper has ‘caused or permitted’ an illegal communication. However, it has a defence if it had no reason to suspect that the vehicle was unauthorised.
Once the regulator has notified the newspaper, the newspaper can no longer use this defence if it continues to carry the advert.
Directors’ liability
Under section 80, company directors, secretaries and managers may also be held personally liable for these offences. This occurs where the offence took place with their consent or connivance or was attributable to their neglect.
Penalties
A conviction on indictment carries a maximum penalty of an unlimited fine or 2 years’ imprisonment (or both).
Civil enforcement
Civil enforcement powers are set out in schedule 5. Under paragraph 1, the Secretary of State has a duty to enforce the provisions. This may be done through accepting out-of-court undertakings (paragraph 3), or by bringing proceedings for an injunction before the civil courts (paragraph 4).
2. Why protect terms which could mislead?
The distinction between driver assistance and self-driving is crucial. Yet many drivers are currently confused about where the boundary lies.
The problem is aggravated if the way that driver assistance systems are marketed gives drivers the misleading impression that they do not need to pay attention to the road. Serious consequences could arise if businesses describe driver assistance technology as ‘self-driving’, even though it is neither authorised nor listed. It could undermine the authorisation process, present a risk to public safety and reduce trust in self-driving vehicles.
The Law Commissions’ consultations
As part of their 4-year review, the Law Commissions considered the existing law in this area and concluded that it did not provide adequate protection.
In their first consultation paper, the Law Commissions proposed that an agency should be responsible for the safety of automated vehicles following deployment. The Law Commissions then asked whether the agency should have specific responsibility for, among other things, market surveillance and regulating consumer and marketing materials. Most respondents (60%) agreed.[footnote 8] As KPMG LLP noted:
While these points are often viewed as second or third-order questions, they are critical to investment and the safe deployment of AVs[footnote 9].
The Association of British Insurers and Thatcham Research felt that there was a ‘real need for more coherent messaging from manufacturers’, citing the 2018 #TestingAutomation study:
7 in 10 (71%) drivers globally and 53% in the UK believe that they can purchase a car that can drive itself today; and 1 in 5 (18%) British motorists think that a car marketed as being capable of automatic steering, braking and acceleration (i.e. an assisted driving system) allows them to ‘sit back and relax and let the car do the driving’.[footnote 10]
In the light of these concerns, the Law Commissions recommended the new offences.
Evidence from the 2018 #TestingAutomation study
The #TestingAutomation consumer survey was commissioned by Thatcham Research, Euro NCAP and Global NCAP. It surveyed 1,567 car owners from China, France, Germany, Italy, Spain, the UK and US. The research showed potentially dangerous false impressions around new cars which combine adaptive cruise control, lane centring and speed assist systems to support the driver.[footnote 11] These were often confused with self-driving.
Key findings from the study include:
- 7 in 10 (71%) drivers globally and 53% in the UK believed that they could already purchase a car that could drive itself
- 1 in 5 (18%) British motorists thought that a car marketed as being capable of automatic steering, braking and acceleration allowed them to ‘sit back and relax and let the car do the driving’
- many respondents said that they would be tempted to break the law while using an assisted driving system by texting on a mobile phone (34%), making a hand-held call (33%) or having a brief nap (11%)
- only half (51%) of drivers believed they would be liable in the event of a crash when using assisted driving systems
Evidence from Thatcham’s 2022 survey
When Thatcham repeated the survey in 2022, they found that little had changed. In a survey of 2,000 UK car owners, 52% mistakenly believed that they could ‘purchase a car today that can drive itself’.[footnote 12] This misperception was even greater among the young, with 77% of 17 to 25-year-olds believing that self-driving cars were already on the market.
Evidence from the 2023 great self-driving exploration study
In 2023 the Department for Transport (DfT) conducted further research to explore people’s understanding of self-driving technology. It also revealed widespread confusion.[footnote 13]
The research included a survey of 4,027 people across the UK. This showed that half of respondents (50%) were unsure whether a vehicle can currently legally drive itself. The remaining half were almost evenly split between thinking this is currently legal (26%) and illegal (24%).
High levels of positivity towards technology did not necessarily lead to higher levels of understanding. Early adopters were more likely than others to think that vehicles can currently drive themselves. For example, 36% of early adopters said (incorrectly) that vehicles can legally drive themselves within a pre-determined area without human oversight (compared to 26% of the general population).
The research also included large-scale public engagement events, in which members of the public learnt more about self-driving vehicles and took part in trials. After this, participants felt they had a better understanding of the ‘rules’ for using self-driving vehicles and were more likely to play back this understanding accurately. However, there were still areas of potential confusion, such as what level of autonomy is currently legal on UK roads.[footnote 14]
There was concern about the ‘minefield’ of new legal terminology associated with self-driving. This made participants unsure of how easily they could navigate new systems around self-driving vehicles, as drivers or users. They also thought that it could provide opportunities for manufacturers or individuals to take advantage of the system.[footnote 15]
An annex to the great self-driving exploration looked specifically at updates to The Highway Code which explain the difference between automated vehicles and advanced driver assistance systems.[footnote 16] The research found that drivers generally felt that they had a clear understanding of the relevant paragraphs when they read them in detail. However, participants pointed out that they ‘don’t tend to refer to The Highway Code in their day to day lives’. And, as one advanced driving instructor put it, ‘drivers rarely engage fully with the manufacturer’s instructions’.[footnote 17]
The research indicates that drivers’ understanding of the distinction between automated vehicles and advanced driver assistance systems tends to be shaped by wider communications, in particular the media, advertising and branding. The conclusion is that marketing matters: misleading marketing can undermine understanding.
Safety concerns in practice
Driver confusion can have real-life consequences. A recent example of this occurred on 29 March 2025 in China, when a Xiaomi SU7 operating in ‘Navigation on Autopilot’ intelligent-assisted driving mode was involved in a fatal collision that claimed the lives of 3 students.[footnote 18] Earlier, in February, the Ministry of Industry and Information Technology issued a new regulation that prohibited driver assistance systems from being named and marketed in a way that indicated that they could be used as an autonomous driving system.[footnote 19] On 16 April, reflecting public concern over this incident, the Ministry issued updated guidance to prevent driver assistance features from being advertised using terms such as ‘autonomous driving’, ‘smart driving’ and ‘intelligent driving’.[footnote 20]
This has led to a stronger emphasis on safety, with manufacturers increasingly using more precise terminology such as ‘assisted driving’. Xiaomi, for example, has renamed its ‘Pilot Pro’ system to ‘Assisted Driving Pro’, in addition to replacing references to ‘autonomous driving’ with ‘assisted driving’ on its website. Other manufacturers have adopted similar approaches, with sales teams emphasising to customers that driver assistance systems require full driver supervision. At the 2025 Shanghai Auto Show, executives from several companies acknowledged the need for more accurate public information.[footnote 21]
The importance of trust
Public trust in and acceptance of any new technology are key to its success. As a recent literature review by Partners for Automated Vehicle Education (PAVE) UK puts it, even if the need for trust slows down the process of research and innovation in the short term, ‘it is critical to long-term success’.[footnote 22] The paper shows that trust requires understanding, information and engagement: unjustified claims have been shown to lead to mistrust.[footnote 23]
Research by the Safe Autonomy Research Group at the University of Warwick demonstrates that trust in the system increases with the introduction of knowledge about its capabilities and limitations. This enables ‘informed safety’ which helps calibrate drivers’ trust to an appropriate level, subsequently ensuring safe use.[footnote 24]
In 2022, a report from the Centre for Data Ethics and Innovation reached a similar conclusion.[footnote 25] It identified the need to build public trust and confidence in self-driving vehicles to drive adoption and innovation. The report noted ‘there is currently public confusion, exacerbated by some claims from industry, about the capabilities and limits of AV systems’.
In 2021, BritainThinks conducted a research programme on the future of transport on behalf of DfT,[footnote 26] surveying 2,842 UK adults and holding focus groups across the UK. The research found that regulatory oversight and safety assurance were the core factors influencing public perceptions of the safety of connected and automated vehicles. If the marketing of vehicles not authorised as self-driving is left unregulated, there is a danger that it will undermine public trust in the safety of the technology.
There are therefore 3 reasons to regulate misleading marketing: to protect the integrity of the authorisation/listing process; to ensure public safety; and to engender trust.
3. Which terms should be protected?
The language used in connection with driving technologies continues to develop and evolve. As with every new endeavour, driving automation has acquired its own specialist language.
The Law Commissions noted that one difficulty in understanding driving automation is the complexity of the language associated with it. For example, the important distinction between ‘driving automation’ and ‘automated driving’ may not be immediately apparent. While ‘driving automation’ is a broad term which includes driver support features, ‘automated driving’ refers to situations where the vehicle itself is performing the driving task. Furthermore, driver assistance technology is developing rapidly, with concepts such as ‘autonomous emergency braking’ entering the language. Differences in usage and competing taxonomies have the potential to add further complexity.
In the absence of regulation, it is difficult to convey to the public a clear and succinct message about how to interact with driving automation. It is important to bring clarity to public understanding by protecting certain marketing terms, so that they can only be used in connection with authorised or listed vehicles.
There is established precedent for specific terms to receive legal protection. For example, it is an offence in some circumstances to use the word ‘taxi’ to advertise the hire of a vehicle that does not have a taxi licence.[footnote 27]
On the other hand, we do not wish to be overly prescriptive. Not all terms can be covered: some confusing terms will need to be regulated under the confusion offence, rather than be listed as specific protected terms. In particular, we only intend to protect terms in the English language, where consumers’ understanding of terms has been researched. The Welsh version of this paper therefore retains the protected terms in their English versions, rather than attempting to find a Welsh equivalent. Misleading marketing in Welsh and other languages can be prosecuted under the confusion offence.
Which vehicles are considered self-driving?
The 2024 act introduces a ‘monitoring and control’ test to determine whether a vehicle can be considered self-driving. A vehicle is considered to ‘travel autonomously’ if it is being controlled by the vehicle’s equipment – and an individual is not monitoring the vehicle or its surroundings with a view to immediate intervention in the driving task.
This creates a clear dividing line between self-driving technology, which requires no monitoring or control by an individual, and driver assist technology, which requires human supervision.
We start by looking at the terms used in legislation to refer to self-driving technology and the Law Commissions’ recommendations before discussing individual terms.
What words are used in statutes?
The legislation uses a variety of words to describe vehicles that are listed or authorised as able to drive themselves:
- Section 1 of the Automated and Electric Vehicles Act 2018 puts an obligation on the Secretary of State to list vehicles capable of ‘safely driving themselves’. Under section 1(4), a vehicle listed under the section is referred to as an ‘automated vehicle’.
- Section 1 of the Automated Vehicles Act 2024 defines basic concepts. The key issue is whether a vehicle ‘satisfies the self-driving test’, which, in turn, depends on whether a vehicle feature will allow it ‘to travel autonomously’.[footnote 28]
- Under section 3, the Secretary of State may authorise a vehicle that satisfies the self-driving test as an ‘automated vehicle’. And, under section 6, each authorised automated vehicle requires an ‘authorised self-driving entity’ to be responsible for it.
In other words, the legislation uses 4 terms to refer to vehicles where human monitoring and control is not required, at least in some circumstances. These are: ‘driving themselves’, ‘self-driving’, ‘automated vehicle’ and ‘travel autonomously’.
The Law Commissions’ recommendations
In 2022, the Law Commissions noted the debates over terminology, commenting that ‘our aim is not to prescribe what language is best’. Instead, they confined their recommendations to the terms which they thought were most likely to mislead the public into thinking that they did not need to pay attention to the road. The terms identified were: ‘self-drive’, ‘self-driving’, ‘drive itself’, ‘driverless’ and ‘automated vehicle’.
The Law Commissions did not wish to be overly prescriptive. They therefore recommended a fairly limited list, noting that other confusing terms would be caught by the confusion offence (now in section 79 of the 2024 act). However, they also recommended a power to add other protected terms in future, especially if consumer research showed them to be misleading. They commented that this might include, for example, ‘automated driving’ or ‘autonomous’.
Here we consider the terms recommended by the Law Commissions, those that are used in legislation and related terms and ask whether they should be protected. We then ask about symbols and marks. Finally, we ask whether other terms or brand names might be misleading.
‘Self-driving’/‘driving itself’
In our view, there is a strong case for restricting the use of ‘self-driving’ and ‘driving itself’ to vehicles which are authorised or listed as able to drive themselves safely. These terms are used in legislation to refer to the test for listed and authorised vehicles. To allow an unsafe or unlisted/unauthorised vehicle to be described as ‘self-driving’ or capable of ‘driving itself’ would undermine the safety assurance scheme and lead to unacceptable public confusion.
We think that the protection should apply to these phrases whatever form of the verb ‘to drive’ is used. It would therefore cover ‘self-drive’ and ‘self-driven’, and a vehicle which ‘drives’ or ‘drove’ itself. Furthermore, ‘itself’ would include ‘themselves’ – as in the phrase ‘these vehicles drive themselves’. It would also cover use of ‘self-drive’ as an adjective (as in ‘this is a self-drive vehicle’). As regulation 2(3) of the draft SI puts it, the protection applies to:
- Variants of the term resulting from its assignment to a different part of speech, and
- Other grammatical forms of the term.
However, it is important not to be too broad. The offence may not apply to marketing material where it is reasonably clear from the context that the driving is done not by automation, but by a human. An example could be ‘self-drive van hire’, where the human hirer is expected to drive the vehicle themselves.
The 2024 act already provides a defence in these circumstances. Section 78(4) applies where the accused proves that the protected term was ‘used in a way that was not intended to convey, and could not reasonably have been understood as conveying, any meaning to do with automation’.
However, it might be helpful to provide further clarification in the regulation that the protection does not apply to terms which describe human driving. We ask for views.
‘Driverless’
The Law Commissions recommended protecting the use of ‘driverless’. We agree.
Vehicles which are not authorised or listed will require a driver (that is, ‘an individual who is exercising, or in a position to exercise, control of the vehicle’[footnote 29]). We do not see any justification for marketing such vehicles as ‘driverless’.
‘Automated vehicle’
The Law Commissions recommended protecting the use of the specific phrase, ‘automated vehicle’. This has now become a term of art. Both the 2018 and 2024 acts use the term specifically to refer to vehicles which are authorised or listed as self-driving.
It would be highly misleading to describe vehicles which are not authorised or listed in this way. We therefore agree that the use of these 2 words together – ‘automated vehicle’ – should be protected under section 78.
‘Automated’, when applied to the whole vehicle
The Law Commissions considered whether the term ‘automated’ should be protected more widely. They noted that variations of the words ‘automated’, ‘automation’ and ‘automatic’ are applied widely to automobiles and the automotive industry. They are also applied to a variety of parts, from gearboxes and windscreen wipers to brakes. The Law Commissions thought that protecting the single word, ‘automated’, would be too wide.
The government agrees. We do not wish to protect the word ‘automated’ when applied to specific features. However, there is a strong case to protect its use when it is applied to the whole vehicle.
Although the term ‘automated’ is included in the draft SI under the list of protected terms, regulation 2(2) clarifies that it will be a protected term ‘only when used to describe a vehicle as a whole’. On this basis, the protection would apply to phrases such as ‘automated car’, ‘automated van’, ‘automated hatchback’, or ‘our automated [name of model]’. However, it would not apply to ‘automated braking’.
‘Automated driving’
The Law Commissions left open the possibility of protecting the specific phrase ‘automated driving’. This term is now seen as synonymous with self-driving, as in the phrase ‘automated driving system’. We therefore think there is a case for protecting the phrase ‘automated driving’, when the 2 words are used together.
The protection would also apply where ‘automated driving’ was used as an adjective (as in ‘this vehicle is equipped with an automated driving system’, or ‘the car has automated driving capability’).
However, we would wish to ensure that the protection does not affect components, such as drivetrains. It would still be acceptable to refer to an ‘automatic drive’, or ‘automated transmission’, so long as this did not lead to confusion.
‘Autonomous’, when applied to the whole vehicle
As with ‘automated’, the word ‘autonomous’ is applied to a variety of vehicle features. The phrase ‘autonomous emergency braking’, for example, is now widely used to refer to a driver assistance feature, without causing confusion. As vehicle parts/features become automated, it would be unduly restrictive to prevent them from being described in this way.
However, the phrase ‘autonomous vehicle’ has gained a distinct meaning. It is now commonly used to refer to a vehicle that is self-driving. It would be highly misleading to describe vehicles which are not authorised or listed as autonomous.
Therefore, the term ‘autonomous’ is included in the draft SI under the list of protected terms, but regulation 2(2) clarifies that it will be a protected term ‘only when used to describe a vehicle as a whole’. Examples where the use of autonomous might be protected are ‘autonomous car’, ‘autonomous van’, ‘autonomous hatchback’, or ‘our autonomous [name of model]’.
‘Autonomous driving’/‘driving autonomously’
The greatest risk of confusion is where the word ‘autonomous’ is applied not to the vehicle or its equipment, but to the driving process. End-users are particularly likely to think that when a vehicle ‘drives autonomously’, the human is not responsible for the driving task. Although applying the phrase ‘autonomous driving’ to vehicles that are not authorised or listed would almost certainly breach the confusion offence, there is a case for making this point particularly clear. We welcome views.
Again, the word ‘driving’ would include any form of the verb ‘to drive’, including ‘autonomous drive’, ‘drives autonomously’, ‘drove autonomously’ or ‘driven autonomously’. Under draft regulation 2(3), these phrases would be protected, whichever word appeared first, so as to include ‘autonomously driven’ or ‘driving: autonomous’.
‘Travel autonomously’
The 2024 act uses the phrase ‘travelling autonomously’ to decide whether a vehicle satisfies the ‘self-driving test’.
At this stage, we do not intend to protect the phrase ‘travel autonomously’, even though it is used in the legislation. It is a somewhat legalistic phrase, which is unlikely to appeal to end-users. However, if it is used in a misleading way it may be captured under the confusion offence in section 79.
Symbols or marks
During the Law Commissions’ consultation, several consultees drew attention to the possibility of a symbol or kitemark to indicate that a vehicle is self-driving. The Law Commissions commented that ‘as the industry develops there may be merit in developing’ such a mark, ‘particularly if standardised internationally’.[footnote 30] This possibility is provided for in section 78(1), which not only covers words and expressions but also ‘symbols or marks’.
At present, we are not aware of any such symbols. We welcome views on whether there are any specific symbols or marks that deserve special protection and whether there is merit in developing a standardised symbol or kitemark in the future.
Other possible terms
We are interested in receiving views about whether any other terms should be protected.
One particular area of concern is the use of brand names for advanced driver assistance systems. Manufacturers currently use a variety of names for their driver assistance systems. Examples include Tesla’s ‘Autopilot’, Ford’s ‘BlueCruise’ and ‘Co-Pilot 360’, Nissan’s ‘ProPILOT’ and Volvo’s ‘Pilot Assist’.
Specific concerns have been raised over the term ‘Autopilot’.[footnote 31] In the US, the National Highway Traffic Safety Administration (NHTSA) has said that ‘this terminology may lead drivers to believe that the automation has greater capabilities than it does’.[footnote 32]
However, there may be similar concerns with other brands, which have yet to be investigated.
Furthermore, any concerns may be due to the overall effect of using a range of marketing and design features that could encourage over-reliance, of which the term ‘Autopilot’, for example, is just one part. It might be possible to use ‘Autopilot’ (and other similar brand names) without being misleading. In this case, the scope of the section 79 offence would allow government to take action if the overall effect of the marketing is likely to confuse end-users into thinking that the unauthorised vehicles can safely drive themselves. We welcome views.
4. Consultation questions
These questions are included here so you can read them in the context of this document. The response form may include more questions, for example, questions about who you are.
See ways to respond for a full list of questions and how you can respond to them.
Conceptually, do you agree or disagree that certain terms should be protected for these vehicles?
Do you agree or disagree that the following terms should be protected?
- self-driving
- drive itself
- driverless
- automated driving
- autonomous driving
- drive autonomously
Do you agree or disagree that different parts of speech and other grammatical forms of protected terms should also be protected?
Do you agree or disagree that the terms ‘automated’ and ‘autonomous’ should be protected only when they are used to describe the whole vehicle?
In your view, are there any other terms that should be protected under the Automated Vehicles Act 2024?
Do you agree or disagree with our approach of only protecting English terms?
Do you agree or disagree that there will be sufficient legal safeguards to prevent the protection from being applied to marketing which is unconnected with driving automation?
In your view, are there any specific symbols and marks that indicate a vehicle is self-driving that deserve special protection?
If the terms proposed in this consultation were protected, would your business incur any costs as a result?
What costs would your business incur and to what scale?
How would the protections work in practice?
As discussed in chapter 1, the protected terms offence is limited. It only applies when the term is used by a business in connection with the promotion or supply of a road vehicle or vehicle equipment – and the vehicle is not listed or authorised. Furthermore, it must be directed at end-users or potential end-users.
Here we consider some possible scenarios and discuss whether the protected terms offence applies.
Using a protected term in a caveated format
A company promotes its new model as ‘self-driving’ but caveats it, for example, by saying the vehicle’s system provides ‘enhanced driver support’ or that it operates ‘under human supervision’. The vehicle has not been listed or authorised. Would this fall foul of the marketing offence?
Yes. The company would be using a protected term in connection with the promotion or supply of a vehicle that is not listed or authorised, even if it has been caveated in this manner.
Referring to future self-driving capability
In this example, a company releases new software claiming it can provide ‘future self-driving capability, subject to regulatory approval’. The vehicle has not been listed or authorised as an automated vehicle. Would this still fall foul of the marketing offence?
Yes. If the vehicle is yet to be authorised or listed as self-driving, it cannot be marketed using a term that is protected for use only in connection with authorised or listed vehicles.
The offence might apply even if the claim is factually accurate. An example might be where authorisation has been applied for but not yet received. In these circumstances a TV advertisement for the vehicle might state: ‘we have applied for authorisation to allow it to drive itself’.
This would fall within the offence. The business has used a protected term in connection with the promotion or supply of a road vehicle. Even though the statement is factually correct, it may still cause harm. Viewers do not necessarily absorb the precise meaning of every statement made in an advert. Instead, the general association of the vehicle with self-driving may lead end-users to think that they do not need to pay attention to the road.
It would be different if the same statement were made in a press release for a vehicle which is not yet on the market. This would be legal. The term would not be used in connection with the promotion or supply of a road vehicle.
Trials of self-driving passenger services
In this scenario, a developer trials a self-driving passenger service, in which members of the public are invited to participate. The vehicle has a safety driver, while members of the public travel in the vehicle as passengers. Can the service be promoted as a ‘self-driving trial’?
Yes. The misleading marketing offences only apply if the communication is ‘directed at an end-user or potential end-user of the vehicle’. In this scenario, there is no end-user who is likely to be misled into thinking that they do not have to pay attention to the road.
- As discussed in chapter 1, the passengers are not end-users, because they have no responsibility for ‘controlling, managing or operating the vehicle’.[footnote 33]
- The safety driver is not an end-user because they are specifically exempt. Under section 81(1), a person is not to be taken as an end-user if they use the vehicle ‘for commercial purposes to do with the development’ of the vehicle. It is clearly vital for the safety of the trial that the safety driver is fully trained and understands the need to pay attention. However, given this training, it is reasonable to assume that the safety driver will not be misled by references to ‘self-driving’ in the marketing materials.
However, as stressed in chapter 1, it would be an offence to invite consumers who have already bought vehicles to participate in a trial of unauthorised ‘self-driving software’. Consumers who own or drive vehicles are end-users, and the use of phrases such as ‘self-driving’ is likely to mislead them into not paying full attention to the road. It does not matter that the unauthorised software is described as a ‘trial’.
How to respond
See ways to respond to find out how you can respond to this consultation.
The consultation period began on 10 June 2025 and will run until 23:59 on 1 September 2025. Ensure that your response reaches us before the closing date.
Freedom of Information
Information provided in response to this consultation, including personal information, may be subject to publication or disclosure in accordance with the Freedom of Information Act 2000 (FOIA) or the Environmental Information Regulations 2004.
If you want information that you provide to be treated as confidential, please be aware that, under the FOIA, there is a statutory code of practice with which public authorities must comply and which deals, amongst other things, with obligations of confidence.
In view of this it would be helpful if you could explain to us why you regard the information you have provided as confidential. If we receive a request for disclosure of the information, we will take full account of your explanation, but we cannot give an assurance that confidentiality can be maintained in all circumstances. An automatic confidentiality disclaimer generated by your IT system will not, of itself, be regarded as binding on the Department for Transport.
DfT will process your personal data in accordance with the Data Protection Act (DPA) and in the majority of circumstances this will mean that your personal data will not be disclosed to third parties.
Data protection
Your consultation response and the processing of personal data that it entails is necessary for the exercise of our functions as a government department. DfT will, under data protection law, be the controller for this information. DfT’s privacy policy has more information about your rights in relation to your personal data, how to complain and how to contact the Data Protection Officer.
Footnotes
- Most provisions in the Automated Vehicles Act 2024 are not yet in force. References in this document to provisions of the 2024 act are phrased to reflect the anticipated legal position once the relevant provisions are in force.
- By ‘protected terms’, we mean the same as ‘restricted terms’ defined under s 78(8) of the Automated Vehicles Act 2024.
- Automated vehicles: joint report, recommendations 34 and 35.
- For a short summary, see Law Commissions, Background paper A (2022), paragraphs 1.22 to 1.26.
- Brown v Roberts [1965] 1 QB 1 at pages 15A to B. See also R&S Pilling v UK Insurance Ltd [2019] UKSC 16.
- Automated Vehicles Act 2024, s 78(5)(b). There is also a defence if the business can show that it took all reasonable precautions and exercised all due diligence to prevent the term from coming to the attention of end-users in Great Britain. Under section 78(5)(a), the business must also prove that the use of the restricted term was directed only at end-users or potential end-users of vehicles outside Great Britain.
- Automated vehicles: analysis of responses to the preliminary consultation paper, paragraph 5.9.
- Automated vehicles: analysis of responses to the preliminary consultation paper, paragraph 5.28.
- Automated vehicles: analysis of responses to the preliminary consultation paper, paragraph 5.29.
- Automated driving hype is dangerously confusing drivers, study reveals, Thatcham Research.
- Confused UK drivers believe they can buy a fully autonomous car today. ‘Drive itself’ was defined as ‘a car with technology that can drive the car completely autonomously, as safely as a competent human driver would, and allows you to remove your hands from the steering wheel’.
- Great self-driving exploration: a citizen view of self-driving technology in future transport, DfT (June 2023).
- Great self-driving exploration: a citizen view of self-driving technology in future transport, page 15.
- Great self-driving exploration: a citizen view of self-driving technology in future transport, page 76.
- Automated vehicles: The Highway Code update cognitive testing report, BritainThinks (July 2021).
- Automated vehicles: The Highway Code update cognitive testing report, page 7.
- Xiaomi will cooperate with investigation into fatal EV crash, says founder, Reuters.
- China mandates regulatory approvals for autonomous driving software upgrades, Reuters.
- China bans ‘smart’ and ‘autonomous’ driving terms from vehicle ads, Reuters.
- China’s MIIT tightens regulations on autonomous driving features, banning key functions, CarNewsChina.
- Self-driving technologies need the help of the public, PAVE UK (May 2025), page 18.
- Self-driving technologies need the help of the public, PAVE UK (May 2025), page 35.
- S. Khastgir, S. Birrell, G. Dhadyalla and P. Jennings, Calibrating trust through knowledge: introducing the concept of informed safety for automation in vehicles, Transportation Research Part C: Emerging Technologies, vol. 96, pp. 290–303, 2018.
- Responsible innovation in self-driving vehicles, CDEI (August 2022).
- Future of transport: deliberative research, BritainThinks (March 2021).
- Private Hire Vehicles (London) Act 1998, s 31; Transport Act 1980, s 64.
- The phrase ‘travelling autonomously’ is also used in sections 2(1), 2(4), 7(3)(c), 79(1)(e), 83(2) and 91(1)(a).
- Road Traffic Act 1988, s 34B, inserted by Automated Vehicles Act 2024, s 53 (subject to being commenced).
- Automated vehicles: joint report, paragraph 7.29.
- The Euro NCAP Assisted Driving 2020 Tesla Model 3 datasheet commented: ‘Tesla’s system name Autopilot is inappropriate as it suggests full automation. The promotional material suggests automation where the handbook correctly indicates the limitations of the system capabilities, which could lead to confusion’.
- Brown v Roberts [1965] 1 QB 1 at pages 15A to B. See also R&S Pilling v UK Insurance Ltd [2019] UKSC 16.