Guidance

Delivering the Data Maturity Assessment: objectives and resources

Published 27 March 2023

1. Year one objectives and supporting resources

Our shared ambition is for all departments to have actionable data maturity insights from using the Data Maturity Assessment (DMA) for Government within 12 months. You can benefit from your first use of this assessment by completing a carefully targeted assessment of a strategically important area of your organisation.

These supporting resources have been designed to assist you in initiating, delivering and actioning the results of your first assessment using the DMA for Government. The resources are deliberately light-touch: you have significant freedom to decide the ‘what’ and ‘how’ of your first assessment. They include a range of options, considerations, suggestions and simple tools.

The DMA supporting resources are broadly linear and cover:

  • The year one objectives for departments
  • Planning your assessment
  • Assessing the targeted part of your organisation and recording the maturity levels
  • Communicating the results of your assessment inside your organisation
  • Reviewing your results in terms of the organisation’s objectives, prioritising maturity issues for improvement or maintenance, and reflecting on your assessment experience

1.1 Year one objectives for departments

Actionable insights from the Data Maturity Assessment are achievable at pace.

All departments should complete an assessment of a strategically important part of the organisation, using the DMA for Government, within the year. It is for you to determine what ‘strategically important’ means for your organisation. Factors to consider when deciding which part of your organisation to focus on are described in Strategic targeting of your first assessment.

Completion of an assessment means that you have:

  • A clear accountability, leadership and delivery structure for data maturity.
  • Planned the assessment (scope and scale, information gathering methodologies, validation activities, resource requirements, etc).
  • Delivered the assessment and recorded line-by-line maturity level selections.
  • Reported and communicated the results to key internal stakeholders.
  • Reviewed the results in the context of the assessed area’s strategic, operational and/or corporate priorities.
  • Reached conclusions about priorities for improving and/or maintaining data maturity.
  • Reflected on the process to inform your future DMA planning and delivery.
  • Documented your decisions, choices, methods and rationale throughout the process.
  • Shared your conclusions and learning with others to build cross-government data maturity assessment knowledge.

1.2 Communicate with the Central Digital and Data Office

We ask that you check in with the data maturity team at the Central Digital and Data Office (CDDO) periodically over the course of your first assessment in 2023-24:

  • Q2: advise CDDO of the proposed scope of your assessment
  • Q2/3: have a progress conversation with CDDO
  • Q3/4: share your conclusions and learning with CDDO

These check-ins are an opportunity to explore your planning and delivery, seek advice and guidance, maintain momentum and support a cross-government knowledge base for data maturity. These conversations will contribute to the development of the wider strategic deliverables of the DMA for Government, as we learn from your experience of using the assessment and resources. Engagement with CDDO on data maturity is welcomed and encouraged beyond these milestones. Contact the data maturity team.

1.3 Structure of the data maturity assessment framework

The data maturity assessment framework is structured to enable a systematic review of your data environment.

The Data Maturity Assessment (DMA) for Government is a self-assessment framework. The DMA is organised into rows: each row belongs to one of ten topics, is assigned one of six themes, and provides five maturity levels to choose between. The comprehensive assessment has 97 rows.

An assessment requires gathering expert views, non-expert views, information and evidence as needed for each of the topics and rows. These inputs are used to identify the maturity statement in each row which most closely reflects the current status of your organisation’s data environment. The assessment outputs provide a maturity level for each of the ten topics and a score for each of the themes. Together, these outputs provide a holistic and nuanced picture of your organisation’s capability, effectiveness and readiness to use data to achieve your priorities.

There is more information in the DMA Framework about the topics, themes and maturity levels.

1.3.1 The topics

The ten topics of the DMA cover the areas which are most important to understanding our capability, effectiveness and readiness to use data for our organisational priorities in a government context. The result of your maturity assessment includes a maturity level for each topic.

1.3.2 The themes

Each row in the assessment has been assigned a theme. The themes provide a lens or perspective for considering what drives or underpins the maturity level for each row. Your theme results are especially helpful for planning maturity improvement or maintenance actions.

1.3.3 The maturity levels

For each row in the DMA there are five levels of maturity to consider. The maturity levels describe the features or behaviours associated with progression from low to high maturity for that specific row. An organisation chooses the maturity level which best describes its current state for each of the rows in the assessment.
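To make this structure concrete, here is a minimal Python sketch of how rows, topics, themes and maturity levels relate, and how row-level selections might roll up into topic and theme results. The `Row` fields, the example row and theme names, and the averaging rule are illustrative assumptions only; the official outputs are generated by the DMA Record and Calculate Results spreadsheet.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Row:
    """One row of the DMA: belongs to a topic, carries a theme, offers five levels."""
    row_id: str
    topic: str                         # one of the ten DMA topics
    theme: str                         # one of the six DMA themes
    selected_level: int | None = None  # 1 (low) to 5 (high); None = not assessed

def roll_up(rows: list[Row], by: str) -> dict[str, float]:
    """Illustrative roll-up: mean selected level per topic ('topic') or theme ('theme')."""
    groups: dict[str, list[int]] = {}
    for row in rows:
        if row.selected_level is not None:
            groups.setdefault(getattr(row, by), []).append(row.selected_level)
    return {name: round(mean(levels), 2) for name, levels in groups.items()}

# Hypothetical row identifiers and theme names, for illustration only.
rows = [
    Row("MD-01", "Managing your data", "Culture", 2),
    Row("MD-02", "Managing your data", "Tools", 3),
    Row("SY-01", "Having the right systems", "Tools", 2),
]
topic_scores = roll_up(rows, by="topic")  # {'Managing your data': 2.5, 'Having the right systems': 2.0}
theme_scores = roll_up(rows, by="theme")  # {'Culture': 2.0, 'Tools': 2.5}
```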

1.4 These resources are for people who have delivery responsibilities for the year one assessment goals

These resources have been designed to support people and teams with direct responsibility for and/or a hands-on role in undertaking an assessment. All users, whether they hold strategic or delivery responsibilities, will benefit from reading through these resources. Teams and individuals who are tasked with the detailed delivery work may need to refer to these resources a number of times at different stages in the assessment.

1.5 Feedback on these resources

These supporting resources were developed under the guidance of a cross-government Task and Finish Group. We will iterate these resources as we learn more from wider use of the DMA for Government. We welcome your comments and suggestions - please get in touch.

2. Plan

There is no single ‘right’ way to deliver a maturity assessment and your choices will reflect your specific context and organisational needs. Working through this document, alongside your usual project management structures and methods, will help to ensure you are prepared and able to undertake a meaningful assessment. A meaningful assessment will give you the information you need to understand your current data ecosystem and prioritise the right areas for development.

2.1 Leadership and accountability for your data maturity goals

You will need senior, strategic and delivery leadership and accountability to make data maturity assessment a success for your organisation.

Senior leaders have been asked to assign accountability for the first year data maturity goals to an appropriate officer in their organisations. This may be a data specialist senior leader, or a non-data specialist such as the chief operating officer, chief executive or director general, with the data office and/or data specialists acting as subject matter expert advisors and leading on delivery. Check with your senior data leaders on accountability for the first year data maturity goals in your organisation; this will make any internal sign-off or reporting requirements easier to identify and understand.

Your organisation will have its own structures for leadership, accountability and management of projects. We are using the following project management view of accountability and delivery. This may not map directly to your organisation, but each of the high-level responsibilities should be clearly assigned.

Responsibility | Accountability role | Example civil service roles / grades
--- | --- | ---
Senior accountability | Senior accountable officer | Perm Sec, Director General, CEO, CDO, COO (SCS4-2)
Strategic leadership and accountability | Sponsoring leader and/or group | CEO, COO, CDO, Director (SCS2-1)
Delivery leadership and accountability | Programme/delivery director | DD, Head of, Team Leader (SCS1-G6)
Delivery responsibility | Programme/delivery team | Lead, senior advisor, senior (specialist), manager, assistant (G6-7, SEO, HEO, EO)

2.2 A right-sized assessment

The comprehensive assessment framework can be refined within these parameters.

The comprehensive assessment framework is large and you may not have capacity or need to assess against every row. One of our priorities is to work with stakeholders and experts across government to deliver a core set of rows which are applicable to all departments. We anticipate this will be completed in quarter 3 of 2023-24.

Until then, you can contribute to this refinement and create a manageable framework for your first assessment using the following method and meeting these criteria:

Method:

  • Convene a group of experts and stakeholders.
  • Review the rows in the comprehensive DMA framework.
  • Identify the rows of greatest importance or relevance to your organisation and/or the part of the organisation to be assessed.
  • Take care not to eliminate rows because they may return a low maturity score.
  • Document the rationale for inclusion and non-inclusion of rows.
  • Agree the subset of DMA rows with the group, and obtain senior sign-off as needed in your organisation.

Your subset of rows from the comprehensive DMA should:

  • Include all ten topics.
  • Maintain the general proportionality of the topics with the full assessment.
  • Not edit the content of rows.
  • Contain no fewer than around 50 rows.

Keeping to these criteria will make it easier to align with other assessments (within and between departments) and accelerate the identification of priority rows for the cross-government core assessment.
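If it helps, the criteria above can be checked mechanically. The following Python sketch takes the full framework and a proposed subset, each expressed as a mapping from row identifier to topic, and reports any breaches. The 10% proportionality tolerance is our assumption; the criteria themselves do not prescribe a threshold.

```python
def check_subset(full: dict[str, str], subset: dict[str, str],
                 min_rows: int = 50, tolerance: float = 0.10) -> list[str]:
    """Check a proposed subset of DMA rows against the right-sizing criteria.

    `full` and `subset` map row identifiers to topics. Returns a list of
    problems; an empty list means the subset meets the criteria as modelled.
    """
    if not subset:
        return ["subset is empty"]
    problems = []
    missing = set(full.values()) - set(subset.values())
    if missing:
        problems.append(f"topics missing from subset: {sorted(missing)}")
    if len(subset) < min_rows:
        problems.append(f"subset has {len(subset)} rows; around {min_rows} or more expected")
    for topic in sorted(set(full.values())):
        full_share = sum(t == topic for t in full.values()) / len(full)
        sub_share = sum(t == topic for t in subset.values()) / len(subset)
        if abs(full_share - sub_share) > tolerance:  # assumed 10% tolerance
            problems.append(f"'{topic}' share moves from {full_share:.0%} to {sub_share:.0%}")
    return problems
```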

2.3 Key delivery considerations

Planning your data maturity assessment requires making a range of decisions and choices. You should document your decisions and rationale for later reference and reflection, and where appropriate, as part of what you share with others in government for collective learning and intelligence.

2.3.1 Strategic targeting of your first assessment

Starting with a strategically important part of your organisation is the key to unlocking value from your first use of the Data Maturity Assessment for Government

You will need to define the scope and scale of your assessment. This is an important decision and will impact and inform your assessment plans.

While there is significant flexibility in how an assessment is delivered, your assessment should:

  • cover a strategically important part of the organisation;
  • be a ‘deep’ assessment.

We have set these criteria to ensure you get real value from your effort. You should agree with your senior accountable officer for data maturity which strategically important part of the organisation you will assess. A deep assessment of a strategically important part of the organisation means the results and insights will be important and useful beyond the ‘data function’.

A ‘deep’ assessment is one that enables a rich view of the data ecosystem of the assessed area. It will provide useful granularity and opportunities to identify and address some of the delivery challenges this work may present in your context. A deep assessment is one that uses several different sources of evidence and methodologies to identify your current maturity status in each row.

A ‘strategically important’ part of the organisation may be a specific domain, service, business unit or function that is fundamental to your remit, strategy or business success. The aim is to achieve a meaningful assessment with actionable insights, and this may mean choosing a part of your organisation where you may not be confident of achieving a strong maturity result.

For smaller or less complex organisations, for example where the organisation is not divided into business units or where overall management converges in one team, a ‘whole organisation’ approach may be the best route to value.

2.3.2 Identifying contributors and evidence: who and what

You will need to work across your organisation to find the people and evidence for your assessment

The maturity assessment covers the length and breadth of the data environment relevant to your targeted assessment. You will need to find people who can answer questions and provide evidence from a range of different specialisms and perspectives. Information from people and evidence, such as relevant documentation, is used to identify your current data maturity level for each row.

Depending on the methods you choose, you might seek out experts and those responsible for different data activities, accountabilities and systems in your organisation. You should also include non-experts or non-technical colleagues who may be the users of data outputs, policies, products, systems, and tools.

To assist with finding contributors and evidence, each topic from the data maturity assessment is listed below with a brief summary of its associated data issues and concepts.

For some topics and rows in the assessment you might ask for evidence to support a maturity level choice. Evidence may include policies, recent audit reports, memoranda of understanding for data sharing (providing or acquiring), strategies, organisation charts, logs, product licences, data schema, etc. The existence of documentation is not in itself proof of a particular maturity level, and may need to be queried to understand what it represents for the assessment.

  • Engaging with others: understanding user needs for data; collaboration; data sharing; networks and communities of practice.
  • Having the right data skills and knowledge: staff data literacy; data skills in leadership roles; developing specialist data staff; provision of data skill development opportunities.
  • Having the right systems: infrastructure; storage; tools/technology; investing in data systems and tools.
  • Knowing the data you have: asset registers/data catalogues; data disposal; metadata.
  • Making decisions with data: data-driven operational and strategic decision making; understanding users; monitoring performance.
  • Managing and using data ethically: transparency; bias mitigation; inclusivity; oversight and scrutiny.
  • Managing your data: data standards; data quality; data collection; data pipelines; data disposal.
  • Protecting your data: data security; data protection; disaster recovery.
  • Setting your data direction: data principles; data policy; data strategy.
  • Taking responsibility for data: governance; responsibility and accountability structures; ownership of assets and processes; oversight of adherence to policy and procedure.
2.3.3 Obtaining inputs to identify current maturity level: methodologies

Choose and record your methodologies for obtaining inputs which support decisions about your current maturity levels

You will need to carefully plan for how you will determine which maturity level represents the current status of your data environment for each row in the assessment.

You will likely need to use different methods for different topics: some topics may lend themselves to more qualitative approaches, and others to quantitative ones. Your method should be consistent across each row within a topic, but you may use more than one method per topic area. Using multiple methods per topic will help to clarify or reinforce findings, will likely increase confidence in the results and outputs of your assessment, and is part of what makes an assessment ‘deep’.

Your chosen method or methods may mean you have a number of inputs informing the maturity level selection, particularly if you have used multiple methods in a single topic area. You will need to plan how you will bring those inputs together and how the decision on which maturity level to select will be made.
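As a sketch of what bringing inputs together might look like, the Python function below applies one possible consensus rule: take the most common level suggested across methods, break ties towards the lower level, and flag any row where suggestions span more than one level for discussion. The rule and the method names are illustrative assumptions, not part of the DMA.

```python
from collections import Counter

def consensus_level(inputs: dict[str, int]) -> tuple[int, bool]:
    """Combine per-method maturity suggestions (method name -> level 1-5) for one row."""
    counts = Counter(inputs.values())
    top = max(counts.values())
    level = min(lvl for lvl, n in counts.items() if n == top)  # tie-break towards lower
    needs_discussion = max(inputs.values()) - min(inputs.values()) > 1
    return level, needs_discussion

# Hypothetical inputs: workshop and interviews agree, the survey is more optimistic.
level, flag = consensus_level({"workshop": 2, "interviews": 2, "survey": 4})
# level == 2; flag is True, so a panel might revisit this row before recording it
```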

Provided below is a brief overview of different methods and how they might be used in delivering a data maturity assessment. It is not exhaustive or prescriptive. Included in the descriptions are reflections on the advantages and disadvantages of using a particular method. Every approach has limitations and there is no single methodology which will guarantee impartial, fair and accurate results.

Before undertaking research with contributors, ensure your plans and processes are compliant with your organisation’s policies for research, ethics and privacy.

Method: Individual expert response

  • Example: specific rows from the maturity assessment are allocated to targeted contributors, who answer on behalf of the organisation.
  • Resource needed: low-medium. This method likely needs resource to identify the right individuals and brief them, but analysis should be simple and straightforward.
  • Useful for: quickly gathering one perspective of data maturity ahead of making decisions about the use of more detailed methods; validation of results; clarifying other inputs.
  • Pros: relatively low resource; can produce results quickly.
  • Cons: unlikely to provide high levels of confidence in accuracy and bias mitigation for users of the assessment outputs if it is the sole method for gaining inputs for a topic or the entire assessment; may reinforce concerns about the validity of self-assessment approaches.

Method: Central team/panel consensus

  • Example: a central team or panel chooses the row-by-row maturity levels on behalf of the organisation. This may be one team/panel for the entire assessment, or different representatives by topic area.
  • Resource needed: low-medium. Most resource will be used in setting up and delivering the panel sessions.
  • Useful for: reaching consensus across inputs from different methods; validation.
  • Pros: outputs of sessions will deliver clear results.
  • Cons: if used as a standalone method it may not provide high levels of confidence in accuracy and bias mitigation for users of the assessment outputs.

Method: Structured group workshops

  • Example: representatives from across the organisation discuss data maturity in a structured workshop. Workshops are arranged on a topic basis, and there may be one or multiple workshops on each topic. Participants should come from all aspects of the data environment (data experts, data users and non-experts from across the assessed part of the organisation), in mixed or separate groups by subject matter expertise or perspective. Participants agree or reach a majority decision on the maturity levels for each of the rows of the topic covered.
  • Resource needed: medium-high. Medium if one or two sessions per topic; high if three or more sessions per topic.
  • Useful for: a detailed, qualitative view of data maturity; gathering evidence from non-data experts or similar users; gathering views related to ‘soft’ themes such as culture and leadership; reaching a shared understanding and agreement on current maturity levels.
  • Pros: outputs of sessions will deliver clear results.
  • Cons: unless carefully managed (either in the contributor mix or the delivery of the workshop), some views may dominate the discussion and skew the result.

Method: Qualitative survey

  • Example: a qualitative survey is sent to contributors, who provide free-text responses about the current status of the data environment in relation to their area of expertise or perspective. Responses are analysed and used to identify the best-fit maturity level for the rows being assessed.
  • Resource needed: high. A significant amount of time will be needed to design the survey and to code, analyse and validate results.
  • Useful for: gathering evidence from non-data experts or similar users; gathering views related to ‘soft’ themes such as culture and leadership; increasing confidence or providing greater depth in results of specific topics; understanding specific parts of the data environment for which you may have low initial understanding or concerns; building a detailed picture and narrative evidence base.
  • Pros: can provide a detailed picture of the data environment.
  • Cons: it may be difficult to achieve a response rate that ensures confidence in the results; it may be difficult to code responses so that a clear enough picture emerges to choose maturity levels; access to an appropriate and approved digital tool for delivering the survey may be limited.

Method: Quantitative survey

  • Example: a self-completed survey structured to capture a respondent’s specific row-by-row maturity level choices. Respondents are segmented and respond to specific rows or topics relevant to their expertise, experience or perspective. The survey responses are analysed and used to decide a consensus answer.
  • Resource needed: low-medium. You will need to factor in effort to identify and segment contributors, design the survey and undertake the analysis, with resource to ensure the quality of survey design.
  • Useful for: developing a clear picture of views of the current status of the data environment; gathering views from expert respondents.
  • Pros: potentially a way to get a fast and wide view of the area of the organisation you are assessing.
  • Cons: it may be difficult to achieve a response rate that ensures confidence in the results; access to an appropriate and approved digital tool for delivering the survey may be limited.

Method: Semi-structured interviews with experts and other stakeholders

  • Example: representatives from across the organisation respond to questions in a one-to-one interview. Interviews are arranged on a topic basis, and there may be one or multiple interviews or interview subjects for each topic. Participants should cover all aspects of the data environment (data experts, data users and non-experts from across the assessed part of the organisation). Interviewees select the maturity levels for the relevant rows during the interview, or narrative results are used by others to determine maturity levels.
  • Resource needed: high. Design, delivery and analysis will all use resource.
  • Useful for: gathering evidence from non-data experts or similar users; gathering views related to ‘soft’ themes such as culture and leadership; increasing confidence or providing greater depth in results of specific topics; understanding specific parts of the data environment for which you may have low initial understanding or concerns; building a detailed picture and narrative evidence base.
  • Pros: can provide a rich picture of the data environment.
  • Cons: it may be difficult to achieve enough interviews to ensure confidence in the results; the skills and knowledge of the interviewer may affect results; achieving responses across topics and the organisation requires high resource.

Method: Random sampling deep dives

  • Example: a random selection of rows is subject to a deep dive, which may include using elements of other methods (interviews, quantitative surveys, etc).
  • Resource needed: high. Design, delivery and analysis will all use resource.
  • Useful for: providing detail; validation.
  • Pros: may provide useful detail when used in conjunction with wide-shallow methods; may increase confidence levels for users of the assessment outputs where wide-shallow methods have been used.
  • Cons: cannot be used as a standalone method, as a sample of rows will not provide information for every row in the assessment.

2.4 Testing your plan

Your data maturity assessment plan needs to be achievable. Assess whether you have adequate resources to deliver the plan, considering the resource needs across skills, tools, time and people. Also consider how much time contributors will need to give and whether that is realistic.

You might trial the more resource intensive methods to provide a more accurate view of the resource implications. If your organisation has a project management office or function you might consult with them and use the readiness tools and processes they recommend.

If your testing demonstrates that your plan will have delivery challenges you should either seek additional resources or revise your plan to ensure it is deliverable while still meeting the objectives.

2.5 Validating your assessment

Ensuring users of your assessment outputs have confidence in their impartiality, fairness and accuracy

The users of your assessment outputs need to be confident that your results are impartial, fair and accurate. To achieve this you will need to undertake validation activities. This is particularly important for self-assessment, where users may need additional assurance that potential bias has been mitigated.

Validation activities will take place at various points in the data maturity assessment process, so it is important to plan for and build in validation from the start.

There are up to four stages to validation:

  • Design: design your validation process, including who you will involve and what methods you will use.
  • Delivery: undertake the validation activities and summarise results.
  • Confirmation: confirm whether or not the results are impartial, fair and accurate.
  • Remediation: if there are areas that are not successfully confirmed, plan and deliver activities that will provide impartial, fair and accurate results.

Your process might seek to ensure confidence and assure the results by validating different aspects of the assessment, for example you might look at:

  • Methodologies: the methodologies used to gather information and evidence for the assessment are robust enough to achieve reliable results.
  • Delivery: the assessment is delivered consistently across the assessed area.
  • Evidence: evidence to support maturity level choices is sufficient and relevant.
  • Accuracy: responses and maturity level choices do not under- or overestimate your current maturity levels.

To achieve validation of your assessment you might choose one or more of the methodologies suggested and described in Obtaining inputs to identify current maturity level: methodologies and/or devise your own validation approach and methodology.
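As one illustration of the ‘accuracy’ domain, you could re-assess a random sample of rows using an independent method and measure agreement with the primary results. The Python sketch below assumes both sets of choices are expressed as mappings from row identifier to level; the agreement measure and the ‘differs by more than one level’ remediation flag are our assumptions.

```python
def agreement_report(primary: dict[str, int], revalidated: dict[str, int]) -> dict:
    """Compare primary maturity choices with an independent re-assessment of a sample."""
    common = primary.keys() & revalidated.keys()
    exact = sum(primary[r] == revalidated[r] for r in common)
    flagged = sorted(r for r in common if abs(primary[r] - revalidated[r]) > 1)
    return {
        "rows_compared": len(common),
        "exact_agreement": exact / len(common) if common else 0.0,
        "rows_to_remediate": flagged,  # levels differ by more than one
    }

# Hypothetical sample: one exact match, two rows flagged for remediation.
report = agreement_report(
    primary={"MD-01": 2, "MD-02": 3, "SY-01": 2},
    revalidated={"MD-01": 2, "MD-02": 1, "SY-01": 4},
)
```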

2.6 Plan: checklist

This is a high-level checklist covering the content in this section of the supporting resources. Different organisations will have their own processes and requirements for agreeing and approving decisions, so we have focussed on the end point of each broad task area. Your specific list will be a product of your plan and your organisation’s procedures.

  1. Senior, strategic and delivery leadership and accountability for the data maturity goals has been allocated and responsibilities are clear.
  2. The specific rows to be used in your first assessment were determined using the methods suggested and meet the criteria provided.
  3. The scope and scale of the first assessment was set and meets the criteria provided.
  4. You have had a conversation with the data maturity team at CDDO about the scope and scale of your assessment.
  5. Contributors (data experts and non-experts) and evidence sources have been identified.
  6. The methods to obtain inputs which will determine the current maturity status of the data environment have been chosen.
  7. The delivery plan has been tested and revisions have been made where needed.
  8. The domains and methods to validate the assessment approach and outputs have been chosen.

3. Assess and record

You are now ready to deliver your data maturity assessment plan. At the end of this stage you will have ‘raw’ maturity results for each of the topics and themes.

You may find that your plan changes during delivery as you learn more, identify new opportunities or face unexpected challenges. If this happens, document the changes and reasons. If you find a row that you had planned to include really does not apply, it can be marked as ‘not assessed’ and should be included in your refinement documentation or other reporting. If you are not certain how to interpret a term, you might discuss it with others in government undertaking an assessment and/or decide what it means for your organisation and document that definition for the purposes of this assessment.

3.1 Obtain information and make maturity decisions

You should keep these principles in mind when delivering the assessment:

  • Make decisions based on the typical and usual: gather evidence to understand what is typical or usual for your organisation, and minimise the impact of any outliers.
  • Include specialists and non-specialists: capture the full organisational experience of your data ecosystem by ensuring you include the views and experience of non-specialists and stakeholders outside the data function.
  • Default to the lower maturity level: if you do not meet all the criteria of a level, choose the level below.
  • Take a wide perspective: consider the entire data ecosystem for the assessed area, across the whole lifecycle, and from end to end.

While these activities can be described succinctly, this will be a resource intensive stage of your assessment. The headline activities are:

  1. Deliver actions to obtain inputs to identify your current data maturity levels.
  2. Use the inputs to inform decisions on the data maturity level for each row you have assessed.
  3. Validate your decisions.
  4. Record your maturity levels and generate your results.

At the end of this stage you will have a documented record of your assessment. For each row you assess, note:

  • The methodologies used to obtain inputs.
  • How the maturity level was agreed.
  • The reason for the maturity level decision.
  • The people or roles involved in the decision.
  • The maturity level selected.

This information will support validation processes, reviewing your assessment experience, and contribute to aligning different assessments over time.
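A minimal sketch of how this per-row record might be captured follows, using a simple Python dataclass; the field names and example values are hypothetical, and a spreadsheet column per item would serve equally well.

```python
from dataclasses import dataclass

@dataclass
class RowRecord:
    """Audit record for one assessed row, mirroring the list above."""
    row_id: str
    methodologies: list[str]    # how inputs were obtained
    agreement_process: str      # how the maturity level was agreed
    rationale: str              # the reason for the decision
    decision_makers: list[str]  # the people or roles involved
    selected_level: int         # the maturity level selected, 1-5

record = RowRecord(
    row_id="MD-07",             # hypothetical row identifier
    methodologies=["structured workshop", "policy document review"],
    agreement_process="workshop majority, confirmed by central panel",
    rationale="quality checks exist but are not applied consistently",
    decision_makers=["Head of Data", "service performance lead"],
    selected_level=2,
)
```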

3.2 A note on differences between the results for topics and themes

The results generated for the themes are less robust than for the topics, because the themes are not evenly represented across the assessment. This discrepancy will likely be increased where organisations have not assessed all rows. The results for the themes should be used as a way to consider the drivers of high or low maturity for a topic or specific row, rather than as a standalone output of the assessment.

3.3 Assess and record: checklist

This is a high-level checklist covering the content in this section of the supporting resources. Different organisations will have their own processes and requirements for agreeing and approving decisions, so we have focussed on the end point of each broad task area. Your specific list will be a product of your plan and your organisation’s procedures.

  1. You have obtained a range of inputs from different sources and using a range of methods.
  2. Your approach was in line with the assessment principles.
  3. Any changes to your delivery plan were documented.
  4. Evidence based decisions determined your current maturity levels for each row that was assessed.
  5. Your process was documented.
  6. Results have been validated.
  7. You have had a progress conversation with the data maturity team at CDDO.
  8. Maturity levels for each row that was assessed have been recorded on the DMA Record and Calculate Results spreadsheet.
  9. You have maturity results for each of the topics and themes, generated by the DMA Record and Calculate Results spreadsheet.

4. Communicate

Communicating your data maturity results within your organisation is the first step towards getting value from your assessment. You will need to present your results to others in your organisation, both within and beyond the data function.

You will now have ‘raw’ maturity results generated in your copy of the DMA Record and Calculate Results spreadsheet. The results are presented in the spreadsheet as a maturity level and a numerical score for each of the topics and themes. For the topics you might use either or both, depending on what works best for your organisation’s needs.

How you present your maturity results will depend on your organisation’s usual ways of communicating. The suggestions below apply whether you use, for example, a word-processed document or a slide-based format.

4.1 Presenting your data maturity results

Help others to understand your assessment results by providing materials that are engaging and easy to understand

For your data maturity results to have impact you will need to describe and present them in ways that are relevant to the audience and make it easy for them to understand why data maturity and the results of your assessment are important.

The content of your data maturity results communications should follow these principles:

  • Be brief - this is a report on the results for internal use, rather than a report on the data maturity assessment process.
  • Be visual - using charts and diagrams, as well as text, can make the information more engaging and easier for many to understand.
  • Include examples and quotes - provide depth and context to the results by including examples and quotes obtained during the evidence and information gathering activities.
  • Provide a narrative - consider the ‘story’ your results tell and make that clear through the way the communication document is structured.
  • Consider developing a small selection of communications products to meet different needs, such as different levels of understanding of data and different types of responsibilities in the organisation.
  • Test your communications products with others and revise as needed.

4.2 The results for topics and themes need different messages

The results generated for the themes are less robust than for the topics. This is because the six themes are not evenly represented across the assessment and this makes it difficult to say what is a lower or higher result. This difficulty will likely be increased where not all rows have been assessed.

The results for the themes should be used as a way to frame the drivers of higher or lower maturity for a topic or specific row, rather than as a standalone output of the assessment. For example, you might consider if the same theme is seen across topics in relation to rows assessed as higher or lower maturity.

4.3 Enable your organisation to explore what the results mean and what to do next

You will need to provide enough context and ways of considering the meaning and implications of your results so that others can start to understand how data maturity affects them and their work.

For the data team or data specialists, a straightforward description can provide enough detail:

  • Our result for the topic ‘Having the right systems’ was Level 2 - Emerging.

For non-data specialists or those with less background knowledge of data maturity, you may need to provide additional context, for example:

  • Our result for the topic ‘Having the right systems’ was Level 2 - Emerging.
  • This topic is about the tools and systems we have for using and managing our data effectively.
  • Level 2 maturity means that we have challenges related to prioritising data effectively in the organisation and lots of opportunities to get more value from our data.
  • Some of our biggest challenges are having the right tools to organise our data and to allow appropriate internal access to data.

In this example we have added very high-level context about:

  • The topic and what it covers.
  • The meaning of the maturity level, with light framing in relation to the organisation.
  • The specific areas of low maturity in this topic, with light framing in relation to the organisation.

The depth and detail of the context you include will differ in relation to the audience and the purpose of the document or meeting. This contextual information was taken from the Data Maturity Assessment for Government: framework.

4.4 Communicate: checklist

This is a high-level checklist covering the content in this section of the supporting resources. Different organisations will have their own processes and requirements for designing, agreeing and approving internal communications, so we have focussed on the end point of each broad task area. Your specific list will be a product of your plan and your organisation’s procedures.

  1. You obtained your completed DMA Record and Calculate Results spreadsheet.
  2. The Data Maturity Assessment for Government: framework was referred to for context and to check information as needed.
  3. You followed the principles of communicating your results.
  4. You have considered the needs of your audiences.
  5. You have provided deeper context where needed.
  6. You presented or shared your maturity results with data specialist and non-data specialist audiences.
  7. You had a conversation with the data maturity team at CDDO.

5. Review, prioritise and reflect

The final stage of your first-year data maturity assessment has three areas:

  1. Review your results in terms of the assessed area’s objectives and responsibilities.
  2. Use that review to prioritise the right maturity improvements and maintenance.
  3. Reflect on your data maturity assessment experiences.

In this stage you will deliver:

  • Recommendations for which areas of lower maturity should be prioritised for improvement.
  • Recommendations for which areas of higher maturity should be prioritised for monitoring or maintenance.
  • Recommendations for how to carry out your next data maturity assessment.

You will have answers to the following questions:

  • Are we capable, effective and ready to use data to achieve our strategic goals, meet performance targets and prepare for the future?
  • What are our main priorities for data maturity improvement and maintenance?
  • What will we do differently for our next data maturity assessment?

5.1 Review your maturity results in the context of the organisation

Data maturity assessment is a tool for the data function and the business. Data is fundamental to everything our organisations do and your maturity results provide a starting point for understanding your data ecosystem in the wider organisation.

One approach to achieving this understanding is to develop a set of proposals for testing with senior leaders. You could start by developing an understanding of the strategies, performance indicators and risks that are specific to the data function, specific to the assessed area and organisation-wide.

The documents and information you will need include:

  • Your completed DMA Record and Calculate Results spreadsheet.
  • Data Maturity Assessment for Government: framework.
  • The strategy relevant to the part of the organisation you have assessed.
  • Your wider organisational strategy.
  • Your data strategy, if a standalone strategy is available.
  • Any performance indicators or similar evaluation metrics for, or relevant to, the part of the organisation you have assessed.
  • The risk register for the part of the organisation you have assessed.
  • The risk register for the data function, if available.
  • Information about any changes in the organisation, legislative environment, technology, etc, anticipated in the near, mid- or longer-term future which may affect the assessed area.

Use the documents and information listed above to map your data maturity results, strengths and weaknesses, against:

  • Your data priorities, strategy, performance indicators and risks.
  • The assessed area’s strategy, performance indicators and risks.
  • Your organisational responsibilities, strategy, performance indicators and risks.

Use the ‘map’ you have developed to inform draft answers to these questions:

  • Do any areas of lower data maturity, per your results, present a challenge, barrier or risk to achieving the aims of the service/programme/domain/function you have assessed?
  • Do any areas of lower data maturity mean that you will not achieve, or only partly achieve, key indicators?
  • Do any areas of lower data maturity mean that any indicators or ambitions for the area of the organisation you have assessed have been set lower than those of comparable organisations/programmes?
  • Does lower maturity contribute to lower efficiency in the area of the organisation you have assessed?

Using the answers to the questions, develop a set of proposals for improving or maintaining data maturity in specific areas, including:

  • A description of all areas of lower data maturity, including relevant contextual information as set out above.
  • A description of the impact of these areas of lower maturity on the delivery of the assessed area’s objectives and responsibilities.
  • The risk implications, or a risk view, for each of these areas of lower maturity.
  • A priority rating, and the rationale for that rating, for each of the areas of lower maturity.
  • The areas of higher maturity which are important to the delivery of the assessed area’s objectives and responsibilities.
  • A priority rating, and the rationale for that rating, for each of these areas of higher maturity.
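To illustrate how priority ratings might be derived from this mapping, here is a hedged Python sketch that ranks areas by a simple score combining the maturity gap, strategic impact and risk severity. The scoring rule and the 1-3 scales are illustrative assumptions, not a prescribed method.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """A maturity result mapped against strategy and risk (illustrative)."""
    area: str
    level: int             # assessed maturity level, 1 (low) to 5 (high)
    strategic_impact: int  # 1 (low) to 3 (high), from the mapping exercise
    risk_severity: int     # 1 (low) to 3 (high), from the risk register

def priority_score(f: Finding) -> int:
    """Assumed rule: a bigger maturity gap, impact and risk rank higher."""
    return (5 - f.level) * (f.strategic_impact + f.risk_severity)

# Hypothetical findings ranked for the proposals document.
findings = [
    Finding("Managing your data", level=2, strategic_impact=3, risk_severity=3),
    Finding("Having the right systems", level=3, strategic_impact=2, risk_severity=1),
]
for f in sorted(findings, key=priority_score, reverse=True):
    print(f"{f.area}: priority score {priority_score(f)}")  # 18, then 6
```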

5.2 Prioritise the right data initiatives

The outputs of your review work should be presented in a way that enables others to provide feedback. You should seek feedback from senior leaders relevant to the goals and objectives of the assessed area. Different organisations have their own processes and ways of engaging senior leaders; use those methods and channels.

Use the feedback to refine your proposals to create a ‘data maturity assessment conclusions, improvements and maintenance priorities’ report. Ensure appropriate internal clearance.

5.3 Reflect on your data maturity delivery experience

To assist teams delivering the next data maturity assessments and others across government you should reflect on your experiences and provide a document covering what you have learned. This might be achieved by obtaining and analysing feedback from those who were involved in the delivery of the assessment and users of the outputs. You might use a range of methodologies to obtain this feedback, as well as building in ‘retrospectives’ at various points in the process, asking:

  • What went well, what worked?
  • What went less well, what didn’t work?
  • What will you do differently next time?

This might be in terms of:

  • Delivering different stages of the assessment process effectively.
  • Designing different elements, such as validation actions or using inputs to determine maturity levels.
  • The team structures and resources needed for different stages of the process.
  • The tools and systems used or needed.

Your organisation may have tools or structures for undertaking this kind of review available through your project management office or specialists.

5.3.1 Reflection outputs

The outputs of your reflections should include a document with recommendations for use within your organisation and a version of this report that you can share with other organisations as part of the data maturity knowledge hub.

5.4 Review, prioritise and reflect: checklist

This is a high-level checklist covering the content in this section of the supporting resources. Different organisations will have their own processes and requirements for completing projects; these should be adhered to and may inform aspects of completing the data maturity first year goals. Here we have focussed on the end point of each broad task area. Your specific list will be a product of your plan and your organisation’s procedures.

  1. You have obtained information to understand the strategic, performance and risk environment for the assessed area.
  2. You have mapped your data maturity results, strengths and weaknesses, against relevant priorities, strategy, performance indicators and risks.
  3. You have developed a set of proposals for improving or maintaining data maturity in specific areas.
  4. You have consulted on the proposals with relevant leaders and experts in your organisation.
  5. You have incorporated the feedback in a final data maturity assessment conclusions, priorities and recommendations document.
  6. You have shared your data maturity assessment conclusions, priorities and recommendations within your organisation.
  7. You shared an appropriate version of your data maturity assessment conclusions, priorities and recommendations with the data maturity community via the data maturity team at CDDO.
  8. You shared your data maturity assessment process reflections and recommendations with the data maturity community via the data maturity team at CDDO.
  9. You have had an end of first-year data maturity commitments conversation with the data maturity team at CDDO.

5.5 Feedback on these resources

These resources are dynamic and will be iterated as we learn more from wider use of the DMA. We welcome your comments and suggestions - please get in touch.

© Crown copyright, 2023

Licensed under the Open Government Licence v3.0 except where otherwise stated.