Smart Data opportunities in digital markets: government response
Updated 12 May 2026
Introduction
In the Modern Industrial Strategy, His Majesty’s Government (HMG) committed to examining how it could empower individuals by improving access to data in a way that supports innovation, competition and economic growth, while maintaining strong protections for customers.
This document analyses responses received to the government’s call for evidence on the potential development of a Smart Data scheme in digital markets. The call for evidence sought views on whether the powers in the Data Use and Access (DUA) Act could be used to establish Smart Data schemes in digital market sectors, what data access issues these schemes could resolve, the use cases such a scheme might support, and the key design considerations, risks and impacts associated with its implementation. The analysis presented here is intended to provide an objective synthesis of stakeholder views to inform the government’s consideration of next steps. It does not set out government policy or pre-judge future decisions.
The call for evidence explored a broad range of issues, including:
- barriers customers face in accessing and sharing their data
- potential use cases and benefits of Smart Data in digital markets
- the scope of any potential scheme, including data and data holders
- feasibility, costs, risks and impacts
- customer trust, governance and design principles
- interactions with the existing regulatory landscape
The government received 54 responses to the call for evidence. Responses were submitted by a diverse range of stakeholders, including digital markets firms, technology providers, potential authorised third-party providers, trade bodies, consumer groups, academic organisations and other interested parties. Not all respondents answered every question, reflecting the breadth of topics covered and differing areas of interest.
This document provides a question-by-question thematic analysis of responses to the call for evidence. The analysis seeks to reflect the balance of views expressed neutrally. Individual respondents are not attributed unless explicitly stated in their submissions. The purpose of this analysis is to capture the range of perspectives received and to support evidence-led consideration of the potential role of Smart Data in digital markets.
Theme 1: Trust, control, and protection of end users as the central success factor
Respondents consistently said that customers often do not have clear sight of the data held about them and face practical difficulties when trying to access or share it. Many described current processes as complicated and time-consuming, with platform design, identity checks and limited system compatibility making transfers harder than they should be. Together, these issues were seen to limit people’s ability to use their data in practice and to slow the growth of trusted third-party services that depend on straightforward data access.
Common threads include:
- Strong concern about data misuse, commodification, profiling, and discriminatory outcomes, especially where data is aggregated across sectors.
- Repeated guidance that consent must be meaningful, granular, revocable, and understandable, particularly for vulnerable or less digitally confident users.
- Recognition of the importance of trust to scheme success – once users feel data sharing is exploitative, adoption collapses and public confidence in data regulation erodes.
- Emphasis on privacy-by-design, transparency, auditability, redress mechanisms, and clear accountability for data holders and authorised third parties (ATPs).
Theme 2: Governance, Standards, and Scope Matter More Than Technology
Respondents also pointed to limits in how data sharing tools are currently designed. Many noted that firms often do not provide real-time or automated API access for third parties, even where similar systems exist internally or are required in other jurisdictions. This was seen to restrict how data can be used in practice, reducing consumer control and making it harder for third parties to access information in a structured and efficient way.
While technology was acknowledged as necessary, respondents repeatedly argued that governance and scope decisions will determine outcomes far more than technical capability.
Key recurring points include:
- Strong calls for independent, outcomes-based governance, with clear allocation of responsibilities between government, regulators, data holders, and ATPs.
- Widespread agreement that poorly scoped or overly broad schemes increase risk, cost, and complexity without delivering proportional benefits.
- Repeated lessons drawn from Open Banking, UK GDPR portability, and international regimes that standards must be interoperable, enforced, and resistant to fragmentation or capture.
- Concerns that weak governance could lead to the exploitation of gaps or inconsistencies in oversight, inconsistent compliance, or de facto monopolies, especially among ATPs.
- Preference for phased, use-case-driven implementation, rather than a “big bang” rollout.
Theme 3: Balancing Competition, Innovation, and Proportionality
Responses consistently highlighted a tension between pro-competitive ambitions and real-world market dynamics, particularly in complex digital ecosystems.
Shared concerns include:
- Recognition that Smart Data can lower barriers to entry and enable small and medium-sized enterprise (SME) innovation, but only if compliance costs and certification requirements are proportionate.
- Warnings that mandatory data sharing may entrench incumbents if only large firms can absorb implementation costs or influence standards.
- Acknowledgement that digital markets differ fundamentally from financial services: data is more varied, more context-dependent, and more tightly bound to product differentiation.
- Strong emphasis on avoiding one-size-fits-all rules, with repeated calls for flexibility by sector, use case, and risk profile.
- Clear expectation that economic benefits depend on uptake, which in turn depends on demonstrable value for businesses and customers alike.
Conclusion
The call for evidence has demonstrated that there are clear concerns regarding how effectively data portability works in digital markets. Stakeholders noted problems with the time it takes for UK GDPR portability requests to be addressed, as well as with the formats in which data is provided. Firms also noted the difficulty of verifying that data portability requests are genuine. Several points were also made about the inconsistent provision of APIs, particularly those developed for compliance with the EU’s Digital Markets Act.
Respondents also argued that addressing these concerns could support a range of promising business use cases, including opportunities for data monetisation and donation, and contextual AI services.
There was a broad consensus that a Smart Data scheme could address these issues and that a scheme in digital markets would be beneficial for growth and individuals. However, there was less consensus on precisely what a scheme should look like.
Responses highlighted the diverse and broad nature of digital markets. Unlike other broadly homogenous sectors where Smart Data schemes are envisioned, digital markets firms form a complex ecosystem offering complementary rather than necessarily equivalent products and services. This makes the question of the scope of any scheme particularly complex.
Stakeholders also presented arguments in favour of a mandatory scheme, an “opt-in” scheme, or an industry-led approach.
Any scheme in digital markets must be evidence-led, proportionate and designed to deliver meaningful benefits for businesses and individuals. As such, the government will consult shortly on these issues and the overall design of a digital markets scheme.
Evidence Summaries
Q1: What issues do customers face in accessing their data held by digital markets firms and sharing that data with third parties?
Across the responses, there was broad agreement that the issues faced by customers are not isolated, but mutually reinforcing, stemming from:
- platform design that favours data retention
- security and verification flows that create disproportionate friction
- technical ecosystems that lack standardisation or portability by default
Overall, respondents indicated that customers are often left without clear visibility of their data, without efficient means to transfer it, and without technical routes that support seamless third-party ingestion, limiting both the exercise of rights and the potential growth of trusted data intermediation services.
Respondents identified persistent challenges that customers face when accessing data held by digital markets firms and sharing it with third parties. The barriers most frequently raised were process complexity, restrictive platform design, verification friction, and limited interoperability, with many noting that these issues collectively inhibit effective data mobility.
A significant proportion of respondents stated that data access request processes are overly burdensome for customers, often requiring multiple steps, unclear navigation, or lengthy identity verification flows that are disproportionate to the request. Several responses emphasised that firms frequently rely on account-level security checks that prioritise fraud prevention over user experience, resulting in customers being asked to repeatedly authenticate or supply additional documentation, even when already logged in. This was viewed as a source of friction that can deter customers from completing requests, particularly where data is not surfaced in intuitive dashboards or export tools.
A common theme was that data sharing mechanisms are often technically constrained or commercially imbalanced. Respondents reported that many firms provide access only through one-off downloads, non-standard file formats, or export flows that do not support onward transfer, making it difficult for customers to transmit data directly into third-party services. There was repeated mention that APIs enabling real-time or continuous data transfers are rarely made available to customers, even where similar functionality exists internally or is required under the EU Digital Markets Act. Respondents noted that this lack of interoperability reinforces market lock-in and restricts the ability for third parties to meaningfully act on user-authorised data.
Several respondents highlighted that terms of service and platform permissions are frequently used to shape or limit data access. Some suggested that digital markets firms design data flows in ways that discourage external sharing by default, for example by embedding consent within lengthy legal agreements, or failing to provide transparency about what data is held, where it is stored, or which onward recipients are permitted. These design choices were perceived as creating structural barriers for customers seeking to exercise rights through intermediaries, particularly where the benefits of sharing are not clearly communicated.
Q2: To what extent do digital markets firms provide customers with clear, accessible, and user-friendly mechanisms to access and share their data with third parties?
Across the evidence base, there was a consensus that the current level of provision is partial, fragmented, and inconsistently user-centric, with barriers stemming from:
- unclear and non-intuitive user-interface placement of data tools
- limited explanation of data scope and exclusions
- lack of third-party ingestible API access
- sharing mechanisms that assume technical literacy
Overall, respondents indicated that while mechanisms to access data are technically present, they are not typically clear, accessible, or user-friendly. This means customers cannot confidently exercise their rights or share data onwards at scale, limiting both customer agency and the development of trusted third-party data intermediation services.
A common theme across responses was that existing data access tools lack clarity and accessibility, with many respondents stating that mechanisms are frequently buried within account settings, privacy menus, or legal compliance pages, rather than being presented through intuitive, front-facing user dashboards. Several responses noted that customers are often required to interpret complex terminology or traverse fragmented settings architectures, which was perceived as inhibiting ease of use and limiting completion rates.
Respondents highlighted that there were inconsistencies in user experience, particularly for data sharing. Many reported that platforms tend to offer download-based exports, but that these flows are not designed for structured onward transfer, lack clear guidance on third-party sharing, and are often provided in formats that are difficult for customers to understand or reuse without technical literacy. There was repeated mention that firms prioritise minimum legal compliance over usability, which was framed as creating a gap between theoretical availability and practical accessibility.
Several respondents emphasised that data sharing tools are often restrictive by design, noting that digital markets firms typically do not offer real-time, continuous, or third-party ingestible API access, even where comparable infrastructure exists internally or under EU Digital Markets Act obligations. Respondents indicated that this limits customer agency and undermines effective data mobility, particularly where third parties require structured or automated access.
A smaller number of respondents took a more positive view, acknowledging that some firms provide reasonably accessible export tools, particularly for profile, transactional, or account-activity data. However, even within these responses, caveats were included that data scope is often unclear, customers are not shown what data is excluded, and consent and sharing permissions are not sufficiently transparent.
Q3: What technical, commercial, or legal barriers do third parties face in receiving customer-authorised data from digital markets firms?
Across the evidence base, respondents commented that barriers are not discrete, but arise from interlocking structural conditions, including:
- absence of standardised, secure, third-party ingestible API provision
- commercial incentives that favour data retention over mobility
- developer and contractual terms that shape data reception conditions
- legal uncertainty around liability and downstream compliance
Overall, respondents indicated that third parties are often placed in a position where data transfer is technically possible but commercially constrained, legally ambiguous, and operationally unscalable, inhibiting both effective customer-authorised data sharing and the development of a competitive, trusted third-party data services ecosystem.
A major theme across responses was that technical ecosystems are not built for secure, standardised, or scalable third-party data ingestion. Respondents reported that where firms do share data, it is typically provided in inconsistent formats, often via manual downloads, structured differently on each occasion, or missing important fields, making the data difficult and costly for recipients to use. Many noted that digital markets firms typically do not make APIs available for continuous, machine-readable, customer-authorised data transfer, even where such infrastructure exists internally or in EU-regulated markets, creating a two-tier access environment that disadvantages third-party services.
Commercial barriers were repeatedly emphasised. Respondents noted that platforms have limited commercial incentives to enable friction-free data mobility and often apply commercial controls – such as deliberately slowing down or limiting how quickly data can be accessed, transferred, or used. Restrictive developer terms, or conditional access models, also limit how customer-authorised data can be received and reused. Several responses highlighted that data flows are frequently subject to access gating, lack of pricing transparency, or contractual obligations that favour the platform’s retention advantage, reinforcing market lock-in and inhibiting competition for third-party innovators.
Legal barriers were framed as material but secondary to commercial and technical blockers. Respondents frequently cited liability ambiguity, unclear data ownership boundaries, and uncertainty around onward compliance obligations, particularly where data contains mixed party, inferred, or co-generated content. Some respondents noted that UK GDPR rights and platform terms of service can conflict in practice, creating uncertainty for third parties on whether the firm or recipient bears responsibility for breach risk, secure transfer, redaction, or downstream consent. Several responses suggested that firms interpret legal obligations narrowly, opting for risk-averse data minimisation rather than enabling broader portability.
Q4: What controls or safeguards are appropriate when enabling customers to share their data with third parties, particularly via intermediaries?
Across the evidence base, respondents agreed that appropriate safeguards must balance security and usability, including:
- proportionate identity verification, avoiding repetitive authentication
- secure, encrypted, and integrity-checked data transfer
- granular, auditable, and revocable consent records
- clear accountability for intermediaries and recipients
Overall, respondents indicated that controls should be trust-enhancing, customer-centric, interoperable, and operationally scalable, ensuring that safeguards protect users without becoming a barrier to effective customer-authorised data sharing or third-party innovation.
A recurring theme was that robust identity verification and secure transfer protocols are essential, but respondents emphasised these must be proportionate and risk-appropriate, warning that current platform approaches create friction that harms usability. Many respondents advocated for end-to-end encrypted delivery, token-based authentication, and secure authorisation standards (such as OAuth-style delegated access) to ensure customers, not platforms, control permissions. There was broad support for firms to provide revocable, time-bound access credentials, enabling customers to withdraw third-party access easily.
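The model respondents describe, in which a customer holds a revocable, time-bound access credential scoped to specific data rather than a static password, can be sketched in a few lines. This is a minimal illustration only; the class and field names are assumptions for the example, not part of any proposed scheme or existing standard.

```python
import secrets
import time

class DelegatedAccessGrant:
    """Illustrative time-bound, revocable credential a customer grants to a third party."""

    def __init__(self, customer_id: str, third_party_id: str, scopes: list[str],
                 ttl_seconds: int = 3600):
        self.token = secrets.token_urlsafe(32)   # opaque bearer token, never the account password
        self.customer_id = customer_id
        self.third_party_id = third_party_id
        self.scopes = scopes                     # granular permissions, e.g. ["orders:read"]
        self.expires_at = time.time() + ttl_seconds
        self.revoked = False

    def revoke(self) -> None:
        """The customer can withdraw third-party access at any time."""
        self.revoked = True

    def is_valid(self, requested_scope: str) -> bool:
        """A data holder would check all three conditions before releasing any data."""
        return (not self.revoked
                and time.time() < self.expires_at
                and requested_scope in self.scopes)

# Usage: a customer grants an intermediary ten minutes of read access to order history
grant = DelegatedAccessGrant("cust-123", "atp-456", ["orders:read"], ttl_seconds=600)
assert grant.is_valid("orders:read")
assert not grant.is_valid("profile:read")  # request outside the granted scope is refused
grant.revoke()
assert not grant.is_valid("orders:read")   # access withdrawn by the customer
```

The design point the sketch illustrates is that expiry, revocation, and scope are properties of the credential itself, so withdrawal of consent takes effect without the data holder changing anything on the customer's account.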
Transparency and accountability were repeatedly highlighted. Respondents supported clear data inventories showing what data is held and what is portable. They noted that sharing mechanisms should include tamper-evident logs or audit trails so customers and recipients can confirm when data was accessed, shared, or modified. Several responses emphasised that consent must be explicit, granular, and auditable, ideally via digital consent receipts, machine-readable permission records, or standardised consent objects that can be stored and verified by intermediaries. Respondents also stressed that customers should receive clear explanations of sharing risks, retention periods, and permitted onward uses, in plain language, not embedded solely within legal terms.
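The machine-readable consent receipts mentioned above can be represented as simple structured records carrying a tamper-evidence hash, so an intermediary can later verify the record has not been altered. The field names and hashing approach below are illustrative assumptions, not a standardised schema.

```python
import json
import hashlib
from datetime import datetime, timezone

def make_consent_receipt(customer_id: str, recipient_id: str,
                         data_categories: list[str], retention_days: int) -> dict:
    """Build an illustrative machine-readable consent record with an integrity hash."""
    receipt = {
        "customer_id": customer_id,
        "recipient_id": recipient_id,
        "data_categories": sorted(data_categories),  # granular scope, e.g. ["account", "orders"]
        "retention_days": retention_days,
        "issued_at": datetime.now(timezone.utc).isoformat(),
    }
    # Hashing the canonical serialisation makes later modification detectable
    canonical = json.dumps(receipt, sort_keys=True)
    receipt["integrity_hash"] = hashlib.sha256(canonical.encode()).hexdigest()
    return receipt

def verify_receipt(receipt: dict) -> bool:
    """Recompute the hash to confirm the receipt has not been altered since issue."""
    body = {k: v for k, v in receipt.items() if k != "integrity_hash"}
    canonical = json.dumps(body, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest() == receipt["integrity_hash"]

receipt = make_consent_receipt("cust-123", "atp-456", ["orders", "account"], retention_days=90)
assert verify_receipt(receipt)
receipt["retention_days"] = 9999   # any tampering with the record is detectable
assert not verify_receipt(receipt)
```

A production scheme would use cryptographic signatures rather than a bare hash so that the issuer can be authenticated as well, but the sketch shows the core idea of a consent object that can be stored, exchanged, and independently verified.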
Respondents indicated that any future accountability frameworks should be unambiguous, particularly where intermediaries facilitate data transfer. Many supported clear role delineations between the data-holding firm, the intermediary, and the third-party recipient, including agreement on downstream breach liability, redaction responsibility, and data security obligations. Some respondents suggested that regulatory alignment or accreditation for intermediaries could strengthen trust, though this was not universally stated. There was general agreement that safeguards must avoid locking customers into platform-defined intermediaries only, noting that customer choice of recipient and intermediary should remain protected.
Q5: What role should the government take to improve data mobility for customers in digital markets?
Across the evidence base, respondents agreed that government should take a role that is:
- standards-setting and ecosystem-convening
- transparency-enhancing
- mindful of competitive dynamics and aligned with balanced incentives
- legally clarifying without prescribing firm-specific solutions
Overall, respondents characterised the appropriate government role as enabling, harmonising, and trust-building, ensuring that improvements to data mobility preserve customer agency, avoid reinforcing lock-in, and support a competitive third-party data services ecosystem.
A dominant theme was that government should prioritise the development and promotion of interoperable data transfer standards, rather than mandating proprietary tools created by and for individual companies. Respondents repeatedly supported government acting as a convener of ecosystem-wide technical frameworks, including open schema standards, secure delegated access protocols, and common data portability taxonomies. Many noted that where standards are not harmonised, customers face fragmented export experiences, and third parties incur duplicative integration costs, suggesting government coordination could reduce both.
Respondents also highlighted a role for government in strengthening transparency and accountability norms, including encouraging or requiring firms to provide clear, customer-facing data inventories, consistent export labelling, and plain-language explanations of data portability rights and risks. Several responses indicated that improved data mobility requires not only technical standards but trust frameworks, with government positioned as a neutral actor capable of supporting accreditation, assurance models, or best-practice guidance for intermediaries and third-party data recipients. Some respondents framed this as trust infrastructure rather than regulatory gatekeeping, emphasising customer choice must remain protected.
Commercial incentives were recognised as a structural issue. Many respondents stated that digital markets firms have limited intrinsic motivation to maximise data mobility, particularly when data retention confers competitive advantage. Respondents suggested government could explore mechanisms that rebalance incentives toward portability, including monitoring competitive asymmetries, reducing policy ambiguity around liability, and ensuring data mobility provisions do not inadvertently lock customers into incumbent ecosystems.
Legal clarity was another recurring theme. Respondents noted that government should help address ambiguity around data ownership, downstream compliance responsibilities, and liability for intermediary-facilitated transfers, particularly where data contains mixed-party or inferred content. Several responses suggested that rights under data protection law and platform terms of service should be more clearly aligned in practice, and that government could investigate non-binding guidance or clarifications to support firms and recipients.
Q6: What are the potential risks or unintended consequences of requiring digital markets firms to provide greater customer data access and portability?
Across the evidence, respondents agreed the principal risks fall into the following interlocking categories:
- security and fraud, particularly around authentication and account compromise
- competitive distortion, if compliance burdens or intermediary selection are poorly structured
- liability and accountability ambiguity, deterring ecosystem participation
- usability and integrity, where bulk data provision shifts complexity onto users and recipients
Overall, respondents framed greater portability as high-value but risk-sensitive, emphasising that unintended consequences are not a reason to avoid action, but rather to ensure government and firms pursue standardised, secure, competition-aware, and customer-centric implementations.
A key theme was security and fraud risk. Many respondents cautioned that expanding portability could increase the attack surface for account compromise, impersonation, or unauthorised transfers, particularly if firms implement controls that are inconsistent or poorly standardised. Several responses stressed that bad actors could exploit portability flows to exfiltrate data at scale or use intermediaries as a new vector for phishing or credential theft, if authentication standards are not robust and revocable. Respondents highlighted that risks are heightened where data transfers involve high-volume automated delivery, reinforcing the need for rate-limiting, encryption, and delegated access tokens rather than static credentials.
Respondents also identified commercial and competitive asymmetries. Many noted that poorly designed portability requirements could inadvertently strengthen incumbent firms, for example if obligations carry compliance costs that only large platforms can absorb, or if customers are funnelled toward platform-preferred intermediaries. Several responses warned that mandates lacking clear standards could create fragmented tooling, increasing costs for smaller firms and third parties, and leading to uneven compliance burdens that distort competition.
Another major theme was legal and liability ambiguity. Respondents stated that greater portability could expose uncertainty around data ownership boundaries, inferred or co-generated data, and downstream breach responsibility, especially if firms take a narrow, risk-averse interpretation that limits data fields rather than clarifying obligations. Several responses highlighted a risk of UK GDPR accountability shifting informally onto intermediaries or third parties without explicit role delineation, which could deter participation in data mobility ecosystems. Respondents emphasised that unclear liability frameworks may drive firms toward over-redaction or portability avoidance, limiting the policy intent.
A smaller but consistent theme was data quality, integrity, and usability risk. Respondents indicated that greater access could lead to customers receiving very large data packages without context, structure, or clear provenance, increasing the chance of misinterpretation, improper reuse, or accidental onward sharing of sensitive fields. Several responses noted that platforms might meet obligations by providing bulk exports that are technically compliant but not meaningfully intelligible to users, shifting usability burdens rather than solving them.
Q7: What challenges and risks should we consider when developing a digital markets Smart Data scheme, and how can we mitigate these? This might include (but is not limited to): competition; customer exclusion; data quality or data misuse; ethical, operational or technical readiness.
Across the evidence base, respondents agreed that mitigations should centre on:
- open, non-discriminatory and technology-neutral standards to preserve competition
- accessibility-by-design and supported journeys to mitigate customer exclusion
- standardised, validated, context-preserving exports to improve data quality and usability
- revocable, encrypted, auditable transfer controls to prevent data misuse
- phased, tested rollout supported by government coordination and legal clarity
Overall, respondents characterised the scheme’s challenges as interdependent rather than discrete and indicated that effective mitigation requires trust-enhancing infrastructure, common standards, clear accountability, and inclusive design, ensuring that greater portability expands rather than limits customer agency while enabling a competitive third-party ecosystem.
A dominant theme was competition risk and market asymmetry. Many respondents warned that scheme design could unintentionally advantage incumbents if technical standards, compliance expectations, or delivery models are defined in ways that only the largest firms can operationalise at scale. There was strong support for government ensuring the scheme is open, non-discriminatory, and technology-neutral, including avoiding designs that funnel customers toward a closed set of platform-preferred third parties or intermediaries. Respondents indicated that mandatory participation without clear standards could increase fragmentation and raise costs for smaller firms and third parties, undermining competitive objectives.
Respondents also raised the risk of customer exclusion and digital divide impacts. Several responses highlighted that low digital literacy, accessibility needs, language barriers, and socio-economic factors could limit scheme uptake or prevent customers from exercising data sharing rights effectively. Many supported mitigations such as assisted journeys via accredited intermediaries, accessibility-by-design UI standards, and offline or supported routes for vulnerable users, to ensure portability is not only legally available but practically exercisable. Respondents agreed that customer choice must remain protected, with the scheme supporting mobility for all user groups, not only digitally confident consumers.
A major recurring theme was data quality, integrity, and usability risk. Respondents noted that digital markets firms may meet obligations by providing bulk exports that are incomplete, inconsistently labelled, poorly structured, or lack provenance, shifting complexity onto users and recipients. There was broad support for mitigations including standardised schemas, common data dictionaries, validation layers before export, and data-completeness signalling or confidence markers, to help customers and third parties understand what data is portable, verified, and reusable. Respondents also stressed that data transfers should preserve metadata and context, to reduce misinterpretation and downstream processing cost.
Respondents further emphasised data misuse and security risk, particularly where transfers occur via intermediaries. Many warned that portability flows could be exploited for unauthorised exfiltration, credential theft, or large-scale misuse, if safeguards are not robust. There was strong support for end-to-end encryption, revocable delegated access tokens, time-bound credentials, rate-limiting, and auditable consent objects (e.g., digital consent receipts or permission logs) to ensure transfers are secure, user-authorised, and traceable. Respondents agreed that static credentials or insecure API delivery would materially heighten misuse risk.
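The rate-limiting safeguard respondents call for can be sketched as a token-bucket check applied per access credential, so that routine user-authorised transfers proceed while bulk exfiltration attempts are throttled. The parameters below are illustrative assumptions, not proposed scheme values.

```python
import time

class TokenBucket:
    """Illustrative per-credential rate limiter for portability API requests."""

    def __init__(self, capacity: int, refill_per_second: float):
        self.capacity = capacity                  # maximum burst size
        self.tokens = float(capacity)
        self.refill_per_second = refill_per_second  # sustained rate allowed over time
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at bucket capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_second)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller should reject or queue the request, limiting bulk exfiltration

# Usage: a burst of ten requests against a bucket allowing five
bucket = TokenBucket(capacity=5, refill_per_second=0.1)
results = [bucket.allow() for _ in range(10)]
assert results[:5] == [True] * 5    # the first five requests in the burst pass
assert results[5:] == [False] * 5   # requests beyond capacity are throttled
```

In a real deployment the limiter would be keyed to the delegated access token rather than to an IP address, so that revoking or throttling one third party's credential does not affect others acting for the same customer.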
Several respondents highlighted ethical, operational, and technical readiness gaps. Many noted that firms vary significantly in technical maturity, legacy infrastructure, data-mapping capability, and operational capacity, meaning scheme rollout should be phased, tested, and supported by clear guidance, rather than deployed in a single step. Government was positioned as a neutral convenor to support ecosystem assurance frameworks, best-practice guidance, and industry coordination, particularly around liability clarity and intermediary accountability norms.
Q8: What are the potential implementation costs to industry of introducing a digital markets Smart Data scheme? What aspects of a scheme might be most expensive to implement?
Across the evidence, respondents converged on the view that the most expensive aspects of implementation are likely to be:
- building and maintaining secure, interoperable APIs and data-sharing infrastructure
- cybersecurity, fraud prevention, and resilience measures
- compliance, accreditation, and ongoing governance requirements
- legacy system upgrades and data standardisation efforts
Overall, respondents emphasised that while implementation costs will be significant and unavoidable, they are highly sensitive to design choices. A scheme that is standards-based, interoperable, phased, and aligned with existing frameworks was viewed as the most effective way to manage costs, concentrate burdens on those best able to bear them, and ensure that long-term efficiency, competition, and innovation benefits outweigh the initial investment.
Respondents broadly agreed that introducing a digital markets Smart Data scheme would involve material upfront and ongoing costs for industry, with the scale and distribution of those costs highly dependent on scheme design, scope, and governance choices. There was strong consensus that poorly scoped, fragmented, or overly prescriptive approaches would significantly increase costs, while standards-led, phased, and interoperable designs could mitigate them over time.
The most consistently cited cost driver was technical integration and infrastructure build. Many respondents identified the development, upgrading, and maintenance of secure, interoperable APIs as the single most expensive element, particularly where firms operate legacy or highly fragmented internal systems. Costs were noted to include schema mapping, data standardisation, real-time or event-driven data access, consent and authorisation dashboards, logging and audit infrastructure, fraud and abuse detection, and ongoing API maintenance. Several respondents emphasised that cybersecurity investment—including encryption, authentication, monitoring, and resilience testing—would represent a substantial and recurring cost, especially for large data holders managing high-volume data flows.
A second major theme was compliance, governance, and accreditation costs. Respondents noted that firms would need to invest in legal advice, data governance frameworks, Data Protection Impact Assessments (DPIAs), audits, certification processes, and liability management, with these costs recurring over time. Large platforms were seen as likely to require dedicated compliance and assurance teams, while SMEs and third-party providers could face proportionally higher burdens if accreditation and reporting requirements are not designed to be simple, standardised, and proportionate. Several responses highlighted that inconsistency between schemes or interpretations of standards is a key cost multiplier, forcing firms—particularly SMEs—to duplicate work across platforms.
Respondents also highlighted operational and organisational change costs, including staff training, customer support, process redesign, incident response capability, and internal change management programmes. For larger organisations, these were described as potentially enterprise-wide transformation efforts, while for smaller firms the primary cost was often staff time and disruption rather than capital expenditure. Customer-facing design work, such as consent journeys and transparency tooling, was generally seen as necessary but less costly relative to core technical integration.
Several responses drew attention to legacy data and digitisation costs, particularly for sectors or organisations holding paper-based, unstructured, or inconsistent historical records. Converting these into machine-readable, standardised formats was described as resource-intensive but essential to ensure scheme effectiveness. Some respondents also noted data quality uplift and validation as a non-trivial cost, as inaccurate or poorly maintained data would undermine the value of Smart Data while increasing downstream risk.
A recurring cross-cutting theme was the uneven distribution of costs across market participants. Many respondents argued that large incumbents are best placed to absorb high upfront costs, while SMEs and start-ups are more sensitive to complex standards, bespoke integrations, and unclear governance models. Several responses warned that without mitigation, such as shared toolkits, reference implementations, phased rollout, and reuse of existing trust frameworks, the scheme could inadvertently raise barriers to entry or accelerate market consolidation. Experience from Open Banking was frequently cited to illustrate both the scale of initial investment required and the long-term reduction in marginal costs once infrastructure is established.
Q9: How can we build and maintain customer trust in a digital markets Smart Data scheme? For example, what responsibilities need to be considered for data owners and ATPs?
Overall, the evidence indicates that customer trust in a digital markets Smart Data scheme is most likely to be built and sustained where the scheme combines:
-
meaningful customer control and transparent consent
-
clear accountability, liability, and redress
-
robust security and privacy safeguards
-
accreditation and independent oversight
-
inclusive, accessible, and user-centred design
Respondents consistently stressed that trust is not a one-off achievement but an ongoing condition, requiring visible enforcement, continuous communication, and demonstrable customer benefit to endure.
A dominant theme was the importance of meaningful customer control and consent. Most respondents emphasised that trust depends on explicit, informed, granular, and revocable consent, with customers able to understand what data is shared, with whom, for what purpose, and for how long. Many highlighted the need for simple, centralised dashboards or consent management tools allowing users to view access histories, revoke permissions easily, and manage multiple ATP relationships in one place. Several responses stressed that consent must be opt-in by default, free from dark patterns, and presented in plain language, particularly to protect vulnerable users and those with low digital literacy.
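The centralised consent-management tooling described above can be sketched as a simple data model: a single dashboard listing every active permission across multiple ATPs, with one-step revocation and an access history. The names and structure here are illustrative assumptions, not a proposed design.

```python
from dataclasses import dataclass


@dataclass
class Permission:
    atp_name: str        # the authorised third party holding access
    data_category: str   # granular scope, e.g. "purchase_history"
    purpose: str         # the stated purpose the customer consented to
    active: bool = True


class ConsentDashboard:
    """One place to view and revoke all of a customer's permissions."""

    def __init__(self):
        self.permissions = []
        self.audit_log = []   # visible history: what was granted or revoked

    def grant(self, atp_name, data_category, purpose):
        p = Permission(atp_name, data_category, purpose)
        self.permissions.append(p)
        self.audit_log.append(("grant", atp_name, data_category))
        return p

    def revoke_all_for(self, atp_name):
        """Single-step revocation of every permission held by one ATP."""
        for p in self.permissions:
            if p.atp_name == atp_name:
                p.active = False
        self.audit_log.append(("revoke", atp_name, "*"))

    def active_permissions(self):
        return [p for p in self.permissions if p.active]
```

The point of the sketch is that consent is granular (per data category and purpose), revocable in one step, and logged, so the customer can always answer "who has what, and why" from a single view.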
Closely linked to this was a strong emphasis on transparency and accountability. Respondents argued that customers must be able to see and audit data flows, including clear logs of access and purpose, and must know who is responsible if something goes wrong. There was broad support for clear liability allocation between data holders and ATPs, with customers shielded from harm caused by authorised participants. Accessible complaints, redress, and dispute resolution mechanisms, backed by enforcement powers and sanctions, were seen as essential to sustaining trust over time rather than merely establishing it at launch.
Another widely shared theme was the role of accreditation, standards, and independent oversight. Many respondents drew on lessons from Open Banking, arguing that a centralised, scheme-wide accreditation regime for Authorised Third Parties—covering security, operational resilience, data protection, and conduct—would provide a visible “stamp of approval” that customers can recognise and rely upon. Several respondents cautioned that accreditation should be proportionate and interoperable across sectors, to avoid excluding SMEs or creating unnecessary duplication, while still maintaining robust safeguards. Independent governance, with transparency in decision-making and consumer representation, was repeatedly highlighted as critical to credibility.
Security and data protection were consistently identified as non-negotiable foundations of trust. Respondents emphasised the need for secure APIs, strong authentication, encryption, fraud prevention, audit trails, and breach notification requirements, with ATPs held to standards at least as high as data holders. Some respondents argued that trust should be “designed in” through privacy-preserving architectures, data minimisation, and purpose limitation, rather than relying solely on behavioural commitments. Others noted that international consistency and alignment with existing frameworks (such as UK GDPR, the Digital Identity and Attributes Trust Framework, or ISO standards) would help avoid weakening security or fragmenting trust across borders.
A further recurring theme was clarity of roles and responsibilities. Many respondents stressed that trust depends on customers understanding who does what within the scheme. Data holders were commonly seen as responsible for data accuracy, timely availability, and secure provision, while ATPs were expected to use data only for authorised purposes, protect it appropriately, and deliver clear customer value. Several responses challenged the terminology of “data ownership,” arguing instead that schemes should reinforce the principle that individuals ultimately control their personal data, consistent with existing data protection law.
Inclusion, accessibility, and fairness were also frequently raised. Respondents warned that complex consent journeys, technical language, or digital-only processes could exclude vulnerable groups, undermining trust and adoption. There was broad support for accessible design, assisted journeys, alternative access routes, and age-appropriate or risk-based defaults. Some responses also highlighted the need for delegated authority mechanisms (e.g. for carers, executors, or legal representatives) to ensure trust is maintained across life events such as illness or death.
Q10: What common principles are needed to support the development of a digital markets Smart Data scheme and why?
Overall, respondents agreed that the development of a digital markets Smart Data scheme should be guided by a coherent set of principles centred on:
-
customer control and empowerment
-
transparency and accountability
-
security and privacy by design
-
interoperability and open standards
-
proportionality, flexibility, and inclusivity
-
competition and effective governance
Respondents emphasised that these principles are mutually reinforcing and collectively necessary to ensure the scheme is trusted, scalable, and capable of delivering long-term benefits to customers, markets, and the wider digital economy.
A core and consistently cited principle was customer control and empowerment. Respondents emphasised that individuals must be able to decide when, how, and with whom their data is shared, supported by explicit, informed, granular, and revocable consent. This was framed not only as a legal requirement but as a foundational trust mechanism, ensuring the scheme operates in the interests of customers rather than data holders or intermediaries. Several responses noted that without genuine customer agency, uptake and long-term legitimacy would be undermined.
Closely linked to this was the principle of transparency and clarity. Respondents argued that a Smart Data scheme must make data flows, roles, and responsibilities visible and understandable, both to customers and participating organisations. This includes clarity on what data is in scope, how it is standardised, who is accountable at each stage of the data journey, and how issues can be escalated and resolved. Transparency was repeatedly identified as critical to maintaining trust, supporting regulatory oversight, and reducing disputes between participants.
Security and privacy by design emerged as another foundational principle. Respondents stressed that robust security controls, data minimisation, purpose limitation, and compliance with existing data protection frameworks must be embedded into the scheme’s architecture from the outset, rather than added retrospectively. Many noted that high and consistent security standards across all participants are necessary to avoid weak-link risks, where trust in the entire scheme is damaged by the failure of a single participant.
Respondents also highlighted the importance of interoperability and standardisation. There was broad consensus that common technical standards, shared data schemas, and reusable components are essential to reduce integration costs, enable scalability, and prevent market fragmentation. Respondents emphasised that standards should be open, technology-neutral, and aligned with existing domestic and international frameworks, allowing innovation while ensuring compatibility across sectors and jurisdictions.
Another frequently cited principle was proportionality and flexibility. Respondents argued that the scheme should be risk-based and outcomes-focused, rather than prescriptive about specific technologies or business models. This includes tailoring requirements to the sensitivity of data, scale of participants, and maturity of sectors, and allowing for phased implementation and iterative improvement. Proportionality was seen as key to ensuring participation by SMEs and new entrants, while avoiding unnecessary compliance burdens.
Competition and inclusivity were also identified as essential principles. Respondents stressed that the scheme should lower barriers to entry, avoid favouring incumbents, and preserve customer choice, including the ability to select and switch between ATPs freely. Inclusivity was framed both in market terms and user terms, with respondents emphasising the need for accessible design, support for vulnerable users, and mechanisms to prevent digital exclusion.
Several respondents additionally pointed to the principle of clear governance and accountability. This includes defined roles for data holders, ATPs, and scheme operators; effective monitoring and enforcement; and mechanisms for dispute resolution and redress. Respondents noted that without clear governance, even well-designed technical systems would struggle to maintain trust and coherence over time.
Q11: Are there any tensions, overlaps, gaps or other features of the regulatory landscape in digital markets that the government should take into consideration?
Overall, respondents emphasised that the government should take into account:
-
overlap with existing data protection, competition, and consumer protection regimes
-
the need for coordination across multiple regulators
-
potential gaps at the boundaries of regulatory responsibility
-
international alignment and cross-border considerations
-
differences in sectoral maturity and regulatory readiness
Respondents consistently stressed that addressing these tensions and overlaps proactively, through clear articulation of scope, responsibilities, and governance, will be critical to ensuring that a digital markets Smart Data scheme is coherent, credible, and effective within the wider regulatory landscape.
Respondents broadly agreed that the current digital markets regulatory landscape is complex, fragmented, and evolving, and that these characteristics create both risks and opportunities for the development of a Smart Data scheme. Many respondents emphasised that careful navigation of existing frameworks will be essential to avoid duplication, regulatory conflict, and unnecessary compliance burdens, while ensuring that Smart Data delivers additional value rather than restating existing rights and obligations.
A dominant theme across responses was concern about overlap with existing data protection and competition regimes, particularly UK GDPR, the Data Protection Act 2018, consumer protection law, and the emerging digital markets competition framework. Respondents noted that Smart Data schemes must be clearly differentiated from, and aligned with, data subject access rights, portability rights, and consent requirements, to avoid confusion for customers and firms alike. Several responses stressed that if Smart Data is perceived as duplicative of UK GDPR rights without offering clearer, more usable mechanisms, its legitimacy and uptake could be undermined.
Respondents also highlighted regulatory fragmentation and coordination challenges. Many pointed to the number of regulators with relevant remits, including the CMA, ICO, FCA, Ofcom, and sector-specific bodies, and warned that inconsistent interpretations, overlapping guidance, or uncoordinated enforcement could create uncertainty and raise compliance costs. There was strong support for clear leadership, coordination mechanisms, and consistent cross-regulator messaging, with some respondents suggesting joint guidance or shared governance structures to reduce friction for market participants.
Another recurring theme was the risk of gaps between regimes, particularly at the boundaries between competition regulation, data protection, consumer protection, and sector-specific rules. Respondents argued that Smart Data schemes could fall into grey areas where responsibilities are unclear, for example in relation to liability for data misuse, redress mechanisms, or oversight of ATP conduct. Several respondents cautioned that without explicit clarity on how Smart Data interacts with existing enforcement powers, there is a risk of regulatory blind spots that could weaken consumer protection or trust.
International alignment and cross-border data flows were also frequently raised. Respondents noted that many digital markets firms operate globally, and that divergence between UK Smart Data requirements and international regimes, such as the EU’s Digital Markets Act, Data Act, or existing Open Banking-style frameworks, could increase costs and complexity. Several responses argued that alignment, or at least interoperability, with international standards would help avoid regulatory arbitrage and ensure UK schemes remain attractive and competitive.
A further theme concerned sectoral inconsistency and uneven maturity. Respondents observed that some sectors (such as financial services) have relatively mature data-sharing frameworks, while others lack established standards or governance models. This raises tensions around how far common approaches can or should be applied across sectors, and whether sector-specific adaptations are necessary. Some respondents warned that attempting to impose a single, uniform model could create friction or inefficiency, while others cautioned that excessive sectoral divergence could undermine coherence and scalability.
Several responses also noted potential tensions between innovation and regulatory certainty. While respondents broadly supported strong safeguards, some cautioned that overly rigid or prescriptive regulation could stifle experimentation and new business models, particularly for SMEs and new entrants. Others argued that regulatory clarity and certainty are themselves enablers of innovation, provided requirements are proportionate and outcomes-focused.
Q12: What data sharing initiatives already exist in digital markets that the government should be aware of when evaluating a Smart Data scheme in digital markets?
Overall, respondents converged on the view that:
-
numerous data sharing initiatives already exist, spanning regulated schemes, voluntary tools, sector pilots, and international models
-
these initiatives demonstrate feasibility and demand, but are fragmented, inconsistent, and often platform- or sector-controlled
-
a digital markets Smart Data scheme should build on existing successes, particularly Open Banking-style governance and trust frameworks, while avoiding duplication and addressing the limitations of voluntary, siloed, or opaque approaches
Respondents consistently stressed that the government’s key challenge will be to synthesise lessons from these initiatives into a coherent, cross-sector Smart Data framework, rather than replicating or entrenching existing fragmentation within digital markets.
The most frequently cited and consistently referenced initiative was Open Banking, alongside emerging Open Finance work in the UK and abroad. Respondents widely viewed Open Banking as the clearest and most mature precedent for a regulated Smart Data-style intervention, highlighting its use of mandated APIs, common standards, accreditation of third parties, and independent governance. Many noted that Open Banking has delivered tangible benefits in competition and innovation, while also offering important lessons about implementation costs, phased rollout, and the limits of sector-specific approaches. Several respondents cautioned that Open Banking is not itself a full Smart Data scheme, but rather a useful reference point.
Beyond financial services, respondents pointed to sector-specific Smart Data or data-sharing initiatives in areas such as energy, communications, property, healthcare, and gambling. Examples included smart metering data schemes, telecoms portability initiatives, property data trust frameworks, and government-backed pilots to digitise and standardise sector data. These initiatives were generally seen as demonstrating practical progress and accumulated expertise, but respondents warned that proliferation of sector-by-sector models risks fragmentation, duplication, and inconsistent user experience if not aligned under a common framework.
A second major theme concerned platform-led and voluntary portability tools operated by large digital firms. Respondents referenced “download your data” services, platform APIs, advertising and analytics dashboards, and cross-platform projects such as the Data Transfer Project (DTP) and the Data Transfer Initiative (DTI). While these initiatives were acknowledged as evidence that data sharing is technically possible at scale, respondents consistently described them as limited, non-standardised, and controlled by platform interests. Several responses argued that these tools provide batch exports rather than real-time, actionable data, are poorly understood by users, and lack meaningful guarantees around continuity, governance, or redress.
Respondents also highlighted international regulatory initiatives as important reference points. Australia’s Consumer Data Right, India’s Account Aggregator framework, EU developments under GDPR, the Digital Markets Act, and the EU Data Act, and other global Smart Data or portability regimes were frequently cited. These were viewed as offering valuable lessons on consent artefacts, accreditation, scope definition, and regulatory sequencing, as well as cautionary examples where ambiguity or lack of harmonised guidance has complicated implementation.
Another recurring theme was the emergence of personal data stores, personal information management systems (PIMS), and trust framework models. Some respondents pointed to initiatives such as Solid, MyData, and Safe Secure Cloud as closer to the user-centred ambitions of Smart Data, allowing individuals to hold and control their own data directly. While these approaches were seen by some as promising, others noted that adoption has been limited without strong incentives, regulatory backing, or widespread interoperability.
Q13: What lessons should we bear in mind from Open Banking that would be helpful to consider when developing a digital markets Smart Data scheme?
Overall, respondents converged on the view that:
-
strong, independent governance and clear regulatory oversight are essential
-
standardisation and interoperable technical infrastructure underpin functionality
-
clear user value, trust, and effective consent mechanisms drive adoption
-
regulatory intervention is often required to ensure participation
-
costs, funding models, and proportionality must be carefully managed
-
phased, iterative implementation supports scalability and reduces risk
Respondents consistently highlighted the importance of strong governance and clear institutional frameworks, noting that Open Banking’s progress was enabled by a central body with a clear mandate to set standards, accredit participants, and enforce compliance. Many emphasised that governance must be established early and remain adaptable, with ongoing oversight required to maintain momentum.
A second key theme was the role of standardisation and technical infrastructure. Respondents widely agreed that standardised APIs, common data formats, and interoperability are critical to enabling effective data sharing. However, some noted that inconsistent implementation and overly rigid specifications in Open Banking created friction, suggesting future schemes should balance consistency with flexibility.
There was also broad agreement on the need for clear regulatory frameworks and active enforcement. Many respondents observed that participation in Open Banking was driven by regulatory mandates rather than market incentives, with continued oversight and enforcement seen as necessary to ensure delivery. A minority, however, questioned whether this model would translate effectively to less regulated digital markets.
Respondents emphasised that user trust, consent, and clear value propositions are central to uptake. Granular, transparent, and revocable consent mechanisms, alongside strong security safeguards, were seen as essential. At the same time, many noted that user awareness of Open Banking remains limited, highlighting the need for clearer communication and demonstrable benefits to drive engagement.
Cost and proportionality were also recurring themes. Several respondents highlighted the high implementation and compliance costs associated with Open Banking, particularly for smaller firms, and called for proportionate accreditation, sustainable funding models, and support for SMEs. A small number of responses raised more fundamental concerns about whether the cost and complexity of the model are replicable across sectors.
Finally, respondents pointed to the importance of phased implementation and iterative development, with Open Banking’s staged rollout seen as enabling manageable delivery and refinement over time. There were also calls for greater cross-sector interoperability and international alignment to maximise the potential of Smart Data schemes.
While many respondents viewed Open Banking as a successful foundation for Smart Data, others noted limitations in consumer uptake, competition impacts, and overall engagement. A small number of respondents raised specific concerns, including the design of consent mechanisms, risks of over-prescriptive standards, and challenges in applying the model to more dynamic or sensitive digital markets.
Q14: What lessons should the government bear in mind from the EU Digital Markets Act (DMA) and other Smart Data schemes in other jurisdictions including the establishment of Open Banking schemes around the world?
Overall, respondents converged on several key lessons from international experience:
-
mandates are necessary but not sufficient; enforcement and usability are critical
-
clear, open, and well-governed technical standards underpin successful Smart Data schemes
-
proportionality, phased implementation, and clear scope definition reduce risk and cost
-
security, accreditation, and trust frameworks must be embedded from the outset
-
visible, practical benefits are essential to drive adoption and sustain momentum
Respondents consistently emphasised that the UK has an opportunity to learn from both the successes and shortcomings of international approaches, particularly the EU DMA and global Open Banking initiatives, and to design a Smart Data scheme that is enforceable, proportionate, user-centred, and grounded in practical implementation realities rather than purely formal rights.
A dominant and recurring theme was that voluntary approaches are insufficient. Many respondents argued that experience from the EU, the US, and other jurisdictions shows that data holders, particularly firms with significant market power, rarely provide effective interoperability or data access without enforceable obligations. The DMA was often cited as evidence that mandates are necessary to unlock access, though respondents stressed that mandates alone do not guarantee usability or consumer benefit.
Closely linked to this was the lesson that enforcement quality matters as much as legal rights. Respondents repeatedly warned against what several characterised as “compliance theatre” under the DMA, where firms meet the letter of obligations while undermining their practical value through poor user experience, degraded data quality, restrictive terms, or technical constraints such as rate limiting. Strong enforcement powers, clear deadlines, active monitoring, and credible penalties were widely seen as essential to prevent circumvention and delay.
Another strong theme was the importance of interoperability and robust technical standards. International Open Banking schemes were frequently cited as demonstrating that common, open, and well-governed standards are critical to scalability, competition, and innovation. Respondents emphasised that bulk data exports or poorly specified APIs are insufficient; instead, real-time or near-real-time, event-driven access is often required to enable switching and dynamic services. Several respondents cautioned that mandates without detailed technical specifications risk fragmented and uneven implementation, as seen in parts of the EU.
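The distinction respondents draw between bulk exports and event-driven access can be made concrete with a minimal sketch: a feed that pushes each change to subscribers as it occurs, alongside the snapshot-on-request model of a "download your data" export. The class and method names are illustrative assumptions only.

```python
class EventDrivenFeed:
    """Pushes each data change to subscribers as it happens, in contrast
    to a periodic batch export (hypothetical sketch, not a specification)."""

    def __init__(self):
        self.subscribers = []
        self.records = []

    def subscribe(self, callback):
        """An ATP registers to receive changes in near real time."""
        self.subscribers.append(callback)

    def record_change(self, event):
        self.records.append(event)
        for callback in self.subscribers:   # delivered as the change occurs
            callback(event)

    def batch_export(self):
        """The 'download your data' model: a one-off snapshot on request."""
        return list(self.records)
```

A switching or comparison service built on `record_change` notifications can react to a tariff change the moment it happens, whereas one built on `batch_export` only sees it at the next download, which is the usability gap respondents highlighted.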
Proportionality and scope definition were also widely emphasised. Many respondents supported the DMA’s focus on targeting firms with entrenched market power but argued that the UK should define in-scope entities, datasets, and obligations more clearly to avoid ambiguity and overreach. Comparisons were drawn with Australia’s Consumer Data Right, which uses a sector-by-sector designation and phased implementation approach, providing greater predictability and clarity for market participants. Respondents stressed that requirements should be risk-based and tiered, so that SMEs and new entrants are not excluded by disproportionate compliance burdens.
Another frequently cited lesson was the need to align legal obligations with technical and operational reality. Respondents highlighted challenges under the DMA where portability rights exist in law but are difficult to exercise in practice due to onboarding processes, security constraints, or reliance on third parties. Many argued that Smart Data schemes must be designed with early and ongoing technical engagement, clear guidance, and hands-on regulatory involvement from the outset, rather than relying on post hoc interpretation.
Security, privacy, and trust frameworks featured prominently. Respondents warned that some international regimes, particularly the DMA, lack sufficiently clear standards for third-party data requesters, creating risks around data misuse, over-collection, and cross-border transfers. Several responses argued that a UK scheme should learn from these shortcomings by embedding minimum security standards, accreditation, purpose limitation, and clear liability allocation, ensuring that portability does not weaken data protection or expose users to harm.
Alongside these shared themes, several divergent or singular perspectives emerged. Some respondents strongly cautioned against replicating perceived ideological or cultural assumptions embedded in EU frameworks, arguing that UK schemes should prioritise domestic legal traditions, sovereignty, and individual agency. Others argued that the EU experience shows how overly rigid or prescriptive rules can degrade user experience or stifle innovation. A small number of responses emphasised emerging technical approaches—such as proof-based portability, edge processing, or digital identity frameworks—as ways for the UK to leapfrog existing models, though these views were not universally shared.
Q15: Do you have any additional comments on any aspect of developing a digital markets Smart Data scheme that has not been covered elsewhere in this call for evidence?
Overall, respondents used Q15 to reinforce that the success of a digital markets Smart Data scheme will depend not only on technical design and legal powers, but on:
-
clear strategic purpose and prioritisation
-
phased, evidence-led delivery
-
durable, independent, and inclusive governance
-
avoidance of concentration and single points of failure
-
ongoing evaluation, communication, and adaptation
Respondents consistently framed these additional considerations as enablers rather than alternatives to the core elements discussed elsewhere in the call for evidence, emphasising that attention to implementation, governance, and long-term stewardship will be critical to translating Smart Data principles into durable, real-world outcomes.
A prominent theme in the Q15 responses was the importance of clarity of purpose and policy intent. Several respondents emphasised that a Smart Data scheme must have a clearly articulated objective, whether centred on competition, consumer empowerment, innovation, or trust, and that ambiguity risks diluting impact. Respondents cautioned that attempting to solve too many policy problems simultaneously could lead to over-complex design, stakeholder confusion, and weak accountability, arguing instead for a focused initial scope with the flexibility to evolve over time.
Many respondents returned to the issue of delivery sequencing and prioritisation. There was strong support for a phased or modular approach, beginning with high-value, lower-risk use cases and expanding as standards, governance, and market capability mature. Respondents suggested that early success would be critical to maintaining stakeholder confidence and political momentum, noting that poorly executed initial phases could undermine trust even if later improvements are planned.
Respondents also raised concerns around long-term governance of the scheme. Several questioned who should act as the scheme operator or coordinator, and how independence, accountability, and adaptability would be maintained over time. There was support for governance models that include clear regulatory oversight, strong industry participation, and meaningful consumer representation, with some respondents stressing that governance arrangements must be sufficiently resourced and empowered to evolve standards and address emerging risks.
A further recurring theme was the need to avoid unintended centralisation or dependency risks. Some respondents cautioned that Smart Data schemes could inadvertently create single points of failure, whether technical (e.g. central infrastructure), institutional (e.g. a dominant scheme operator), or commercial (e.g. reliance on a small number of large authorised third-party providers (ATPs)). Mitigations suggested included distributed architectures, open standards, and safeguards against concentration, though these points were generally raised at a high level rather than with specific technical prescriptions.
Respondents also highlighted the importance of ongoing evaluation and evidence-gathering. Several responses suggested that government should build in formal review points, performance metrics, and transparency mechanisms to assess whether the scheme is delivering intended outcomes for customers and markets. Suggested indicators included uptake, switching behaviour, diversity of ATPs, consumer satisfaction, and incidence of harm. Respondents emphasised that without clear success measures, it would be difficult to justify expansion or adjustment of the scheme over time.
Some respondents raised broader considerations around public awareness and engagement. It was noted that Smart Data schemes are unlikely to succeed if customers are unaware of them or do not understand their benefits. Several responses suggested that clear public communication, education, and trusted intermediaries will be essential to drive uptake, particularly outside technically literate user groups. However, respondents also cautioned against placing excessive responsibility on customers to navigate complex choices without adequate support.
A number of less frequent but noteworthy points were also raised. In particular, a few respondents reiterated concerns about regulatory burden and consultation fatigue, particularly for smaller firms, and encouraged alignment with existing initiatives to minimise duplication.