Personal social services ASCS report, England: 2024 to 2025 - methodology
Published 30 October 2025
Applies to England
Introduction
This methodology relates to the personal social services adult social care survey (ASCS), which is an annual national survey conducted by councils with adult social services responsibilities (CASSRs) in England. CASSRs will be referred to as ‘local authorities’ throughout this document.
The ASCS asks service users aged 18 and over in receipt of long-term adult social care services what they think of the support they receive. These services are provided or commissioned by local authorities following a full assessment of need.
The survey is designed to help the adult social care sector understand more about how services are affecting lives. The ASCS also collects information about service users’ self-reported general health and wellbeing across 6 sections of the questionnaire:
- overall satisfaction with care and support
- quality of life
- knowledge and information
- your health
- your surroundings
- yourself, the service user
The ASCS was developed in consultation with the Social Services User Survey Group (SSUSG). SSUSG exists to recommend a programme of social services user experience surveys, develop their content and advise on the methodology. The group includes:
- Department of Health and Social Care (DHSC) policy leads
- NHS England
- local authority representatives, and researchers from the Personal Social Services Research Unit (PSSRU, now the Care and Outcomes Research Centre (COReC)) at the University of Kent
SSUSG reports to the Data Delivery Action Group, which in turn reports to the Adult Social Care Data and Outcomes Board (DOB). DOB oversees the development of national data collections and surveys relating to adult social care, and is co-chaired by the Association of Directors of Adult Social Services (ADASS) in England and DHSC. The survey has DHSC and ADASS approval.
Business case and relevance
The adult social care data collection: September 2025 notice to local authorities informs them about the requirement for, and timing of, the survey of service users. The accompanying adult social care data collections data provision notice states the legal basis for this work.
The survey is also listed on the Ministry of Housing, Communities and Local Government’s Single Data List as one of the data returns that local authorities are required to submit under current arrangements. The Single Data List lists all the data sets that local government must submit to central government.
Survey rationale
There is a need to understand more about how services and support are affecting the outcomes in people’s lives. It is critical to have high-quality information that:
- aids local authorities’ and wider government’s understanding of the impact and outcomes achieved
- enables choice
- informs service development and improvement
A robust survey programme that collects the views of the people who use services and support is the best and most appropriate vehicle to achieve this.
The Care Act 2014 consolidates past legislation and regulation, and continues to strive for greater transparency, accountability and personalisation in health and social care. Outcome-focused intelligence is important to supporting the implementation of the act.
The ASCS is the most significant pool of personal outcome information for those receiving local authority-funded or managed adult social care. It is an important resource for:
- reporting what has been achieved for local people
- supporting development and improvement of local services
- enabling people to make better choices about their care
It is important to understand at a national level how well services are meeting service user and carer needs. However, data from the survey is not intended to be used solely to monitor performance through national outcome measures - it should also be used locally to inform delivery of services and support, and to monitor and develop standards.
It is understood that some local authorities may undertake regular feedback through their agreements with service providers, but this survey will give a greater insight into outcomes for service users and provide a consistent basis for comparing results across different areas.
ASCS outcomes, measures and uses
The survey provides assured, benchmarked data to help local services find ways to improve outcomes in a challenging financial climate.
It is constructed so that an individual outcome can be disaggregated into constituent groups. So, as well as providing an overall quality-of-life index, it provides intelligence on:
- whether specific groups experience better outcomes
- whether services and support are meeting all outcome needs
- in time, the value added by adult social care services
Data from the survey is used to populate 7 of the measures in the Adult Social Care Outcomes Framework (ASCOF). These are:
- 1A: social care-related quality of life (questions 3a, 4a, 5a, 6a, 7a, 8a, 9a and 11)
- 1B: adjusted social care-related quality of life - impact of adult social care services (questions 3a, 4a, 5a, 6a, 7a, 8a, 9a and 11)
- 1D: overall satisfaction of people who use services with their care and support (question 1)
- 3A: the proportion of people who use services who have control over their daily life (question 3a)
- 3C1: the proportion of people who use services who find it easy to find information about support (question 13)
- 4A: the proportion of people who use services who feel safe (question 7a)
- 5A1: the proportion of people who use services who reported that they had as much social contact as they would like (question 8a)
For more information, see the ASCOF handbook of definitions.
More uses of the ASCS are given in the ‘Personal social services ASCS report, England: 2024 to 2025 - data quality statement’ on the Personal social services adult social care survey report, England: 2024 to 2025 statistical release.
Overview of methodology
Local authorities could choose any date between 30 September and 31 December 2024 to extract from their systems the list of service users who make up the eligible population for the survey.
To be included, a service user must - at the point data is extracted - be in receipt of long-term support services funded or managed by the local authority following a full assessment of need. This includes part‑funded and fully funded service users, and is the same population of service users as those who would be reported in table LTS001b of the Short and Long Term (SALT) support data return.
All eligible service users and supplementary information are included in the initial extract of data. Local authorities send questionnaires to a sample of these service users in the period of January to March each year.
Guidance and materials for local authorities
Local authorities are provided with detailed survey guidance and materials such as questionnaires, forms, letters and interview scripts. This year, the ASCS 2024 to 2025 guidance and materials for councils have been published on the NHS England website.
There are 2 main versions of the survey questionnaire for those in:
- residential or nursing care
- receipt of community-based services
There are also accessible versions of the questionnaires, including:
- easy read versions designed for service users with a learning disability
- large-print versions
- translated versions for service users who may not be fluent in English
The questionnaires are also provided as interview scripts so that service users who request a face-to-face or telephone interview can participate in the survey.
The model questionnaires and interview scripts are generic and contain sections that are customised by local authorities. There are optional questions and local authorities can also include additional questions or free-text boxes for local research purposes. Any proposals to do so are subject to approval.
Margin of error
The survey uses data from a sample of service users to make estimates about the whole population. These estimates are subject to uncertainty that can be expressed as a ‘margin of error’.
The margin of error of an estimate relates to the proportion of the population who respond to the survey. As a greater proportion of the population responds to the survey, the margin of error decreases. Therefore, the margin of error can be reduced by increasing the survey sample size, the response rate or both.
Local authorities are required to select a sample so that their results will have a margin of error that is less than 5 percentage points.
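As an illustration of how the margin of error relates to the eligible population and the number of responses, the sketch below estimates a 95% margin of error for a proportion, using the conservative assumption of a 50% response split and a finite population correction. The function name and figures are hypothetical and do not reproduce the exact formula in the survey guidance.

```python
import math

def margin_of_error(eligible_population, expected_responses, p=0.5, z=1.96):
    """Illustrative 95% margin of error (as a proportion) for an estimate,
    using the conservative p = 0.5 and a finite population correction.
    The exact calculation used in the ASCS guidance may differ."""
    n = expected_responses
    N = eligible_population
    fpc = math.sqrt((N - n) / (N - 1)) if N > n else 0.0
    return z * math.sqrt(p * (1 - p) / n) * fpc

# For example, an eligible population of 4,000 with 400 expected responses
# gives a margin of error of roughly 4.6 percentage points, which is within
# the required 5 percentage points.
print(round(margin_of_error(4000, 400) * 100, 1))
```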
Sampling method
The sampling method used for the ASCS is known as stratified random sampling. This involves splitting the eligible population into discrete groups, known as strata, and drawing an independent sample from within each stratum. This:
- helps to make the sample more representative of the eligible population
- allows local authorities to oversample in strata of interest to obtain robust results for that group
The use of stratified sampling means that there is a need to weight local authority-level data to adjust the results to represent the eligible population from which the sample is drawn. If it is not possible to assign a service user to a stratum then they are removed.
There are 4 strata as follows. Service users must meet all criteria listed.
Stratum 1
Stratum 1 includes service users:
- with a primary support reason (PSR) of learning disability support
- at any age
- within any service setting
Stratum 2
Stratum 2 includes service users:
- with a PSR of disability support, but excluding those with learning disability support
- at ages 18 to 64
- within any service setting
Stratum 3
Stratum 3 includes service users:
- with a PSR of disability support, but excluding those with learning disability support
- at ages 65 and over
- within permanent residential or nursing care settings
Stratum 4
Stratum 4 includes service users:
- with a PSR of disability support, but excluding those with learning disability support
- at ages 65 and over
- within community-based services (including supported living) settings
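As an illustration of the stratified approach described above, the sketch below assigns service users to strata and draws an independent random sample from each. The record fields (psr, age, setting) and the sample-size allocation are hypothetical; local authorities follow the published survey guidance when drawing their samples.

```python
import random
from collections import defaultdict

def assign_stratum(user):
    """Assign a service user to one of the 4 strata described above.
    Returns None if the user cannot be assigned (they are then removed)."""
    if user["psr"] == "learning disability support":
        return 1
    if user["age"] is None or user["setting"] is None:
        return None
    if 18 <= user["age"] <= 64:
        return 2
    if user["setting"] == "residential or nursing care":
        return 3
    return 4  # aged 65 and over, receiving community-based services

def stratified_sample(eligible_population, sample_sizes, seed=2024):
    """Draw an independent simple random sample within each stratum.
    sample_sizes maps stratum number to the number of users to select."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for user in eligible_population:
        stratum = assign_stratum(user)
        if stratum is not None:
            strata[stratum].append(user)
    sample = []
    for stratum, users in strata.items():
        k = min(sample_sizes.get(stratum, 0), len(users))
        sample.extend(rng.sample(users, k))
    return sample
```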
Fieldwork period
The recommended fieldwork period, in which the questionnaires are distributed and the completed questionnaires collected, is January to March of each year.
In most instances, local authorities post a questionnaire to each member of the sample, with one reminder letter sent to any participant who does not respond to the initial questionnaire.
Participants return their questionnaire to their local authority, which codes the results onto a data return spreadsheet. The resulting data sets produced by each local authority are then submitted to NHS England for validation and analysis.
Identification of non-participants
At some point during the survey process, local authorities must identify service users who should not participate in the survey. Those who should not be included are service users who:
- have stopped receiving long-term support services
- have died
- have moved away
- are hospitalised
- are known to not have the mental capacity to consent to take part
- are involved in an open safeguarding alert or investigation
In addition, a questionnaire is not sent to those service users who are in active dispute with the local authority because this might be perceived as being provocative or insensitive. Service users are also removed where there are insufficient address details known to send them a survey.
Local authorities may run all or some of these checks at different points in the process once the eligible population is extracted.
Some local authorities will identify and remove non-participants before determining the sample, thereby creating a discrete sample frame from which the sample is drawn. Some service users may be identified and removed after the sample has been drawn, in which case they are replaced with other service users on a like-for-like basis.
Service users who are removed from the sample are still counted as members of the eligible population.
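A minimal sketch of the like-for-like replacement step is shown below, assuming hypothetical ‘id’ and ‘stratum’ fields on each record; the actual process followed by local authorities is set out in the survey guidance.

```python
import random

def replace_like_for_like(sample, removed_user, eligible_population, seed=2024):
    """Replace a service user removed from the sample with another eligible
    service user from the same stratum who is not already sampled.
    Assumes each record carries hypothetical 'id' and 'stratum' fields."""
    rng = random.Random(seed)
    sampled_ids = {u["id"] for u in sample}
    candidates = [
        u for u in eligible_population
        if u["stratum"] == removed_user["stratum"]
        and u["id"] not in sampled_ids
        and u["id"] != removed_user["id"]
    ]
    new_sample = [u for u in sample if u["id"] != removed_user["id"]]
    if candidates:
        new_sample.append(rng.choice(candidates))
    return new_sample
```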
Local authorities remove any identifiable data items before sending their completed data return. DHSC cannot identify any individual service users in the data.
Eligible population
In 2024 to 2025, local authorities reported an eligible population totalling 643,050 service users, of which 248,010 were sent a survey. 60,490 surveys were completed and returned, equating to an overall response rate of 24.4%.
Demographic profile
Local authorities are required to complete a data return with administrative data on all the service users in their sample. If a service user responds to the survey, their questionnaire responses are then added to the data return.
Further details on missing administrative data and data quality are provided in the ‘Personal social services ASCS report, England: 2024 to 2025 - data quality statement’ and ‘Personal social services ASCS report, England: 2024 to 2025 - data quality tables’ published as part of the Personal social services adult social care survey report, England: 2024 to 2025 statistical release.
2024 to 2025 demographic sample data
In the 2024 to 2025 ASCS sample, there were 248,010 service users. The gender was known for 247,945 (100.0%) service users, of which 108,120 (43.6%) were male and 139,585 (56.3%) were female. Those who were ‘Other’ accounted for 240 (0.1%).
Age was known for all service users in the sample, of which 106,860 (43.1%) were aged 18 to 64 and 141,150 (56.9%) were aged 65 and over.
The ethnic group was known for 234,485 (94.5%) service users in the sample, of which 190,045 (81.0%) were White and 44,440 (19.0%) comprised all other ethnic minorities.
2024 to 2025 demographic respondent data
In the 2024 to 2025 ASCS, there were 60,490 respondents to the questionnaire.
The gender was known for 60,480 (100.0%) respondents, of which 26,435 (43.7%) were male and 34,000 (56.2%) were female. Those who were ‘Other’ accounted for 40 (0.1%).
Age was known for all respondents, of which 26,280 (43.4%) were aged 18 to 64 and 34,210 (56.6%) were aged 65 and over.
The ethnic group was known for 57,575 (95.2%) respondents, of which 48,515 (84.3%) were White and 9,060 (15.7%) comprised all other ethnic minorities.
Changes to the methodology
Information on methodological changes to the ASCS that happened before the 2024 to 2025 data collection is covered in the ‘Methodological change notices’ section below.
Development of the survey
The survey ran for the first time in 2010 to 2011 following the successful completion of a pilot survey in 2010 and a development project carried out by PSSRU.
The 2010 to 2011 survey methodology and questionnaire were reviewed by the Office for National Statistics (ONS) Methodology Advisory Service and a response to this review was provided in collaboration with SSUSG.
In 2019, the 2017 to 2018 release was reviewed by the Government Statistical Service’s good practice team. Changes made based on the resulting recommendations are reflected in statistical releases published since 2018 to 2019.
In 2022 to 2023, the survey was redefined as a service evaluation as opposed to research. Because of this, ethical approval was not required for that year’s survey.
Developing a preference-weighted measure
Questions 3 to 9 (parts a and b), and questions 10 and 11 from the ‘Quality of life’ section of the survey were developed by PSSRU as part of the Adult Social Care Outcomes Toolkit (ASCOT).
A specific research project entitled ‘Outcomes of social care for adults: developing a preference-weighted measure, health technology assessment’ was carried out primarily to establish a way of weighting the different social care domains covered by these questions into an overall preference-weighted measure for social care-related quality of life.
The project also included some rigorous testing of the questions. This built on the work of previous studies associated with the development of ASCOT, the history of which is detailed within the research paper. Pages 13 to 17 of the paper detail the findings of the cognitive interviews carried out to ensure service users understood the survey questions.
Findings for each of the ASCOT domains
Main findings for each of the ASCOT domains were as follows.
Control (question 3a in the current survey)
The term ‘control over daily life’ was understood by the people interviewed. People distinguished between making decisions and carrying out those decisions.
Most spoke about a dependency to some extent on having help from others. They said that having control over their daily life depended on having someone - and, importantly, the right someone - to help them.
A written definition of ‘control over daily life’ was added to the self-completion version of the questionnaire to ensure the question captures these issues.
Personal care (question 4a)
The term ‘clean and presentable’ was understood well. People talked about how frequently they washed, showered or bathed, and whether they were able to do their hair the way they liked and wear the clothes they liked.
Many of the women talked about the difficulties they had with jewellery and make-up, and the importance of being able to wear them.
Food (question 5a)
The term ‘food and drink’ was found to work well in terms of expressing the aspects of meals and nutrition. Including ‘drink’ was very important as people drink more often than they eat.
The original wording tested for the question was ‘I can get …’ - however, this was being interpreted literally by people as referring to their own physical ability to get food and drink without help, so the wording was changed to ‘I get …’ which removed this problem.
Accommodation (question 6a)
The wording ‘my home is clean and comfortable’ was found to work well.
Important aspects were having clean, dust-free surfaces and hygienic kitchens and bathrooms, but people also mentioned:
- the state of their décor
- whether their home was neat and tidy
- whether they had as much of their own ‘stuff’ in it that they could get to easily
- whether they could get around their home easily
Personal safety (question 7a)
This question was understood well, although people questioned whether it was referring to safety outside as well as inside the home. A definition of safety was added to make it clear that both feelings should be captured.
Feedback from local authorities running the 2010 to 2011 ASCS suggested that some service users responded to this question with respect to fears relating to neighbourhood factors (such as local crime and youths hanging around estates). The reference to fear of being ‘attacked or robbed’ was therefore removed from the definition to try to make the question capture fewer factors that are outside the control of social services departments.
Social life (question 8a)
This domain was found to be quite difficult and, after trying different options, the phrase ‘social contact with people I like’ was found to work well.
The original answer options tested included the phrase ‘I feel lonely’, but this was removed because people found it confusing: it could refer to feeling lonely because they did not have a special person in their life, as well as feeling lonely because they did not know many people. It was decided to focus the question on social rather than personal loneliness because it was felt that social care would be expected to address social loneliness, not personal loneliness.
Occupation (question 9)
The term ‘doing things I value and enjoy’ was found to capture the type of things intended.
People mentioned things such as:
- voluntary and paid work
- activities they did with others, such as shopping or eating out
- activities they did on their own, such as reading, craft activities and other hobbies
The wording for the answer options was developed during the testing to capture the frequency of doing these things as it was found that, for some people, the issue was not whether they did things they valued or enjoyed, but whether they could do as many things as they would like due to health limitations. A written definition was added to the self-completion version of the questionnaire to ensure the question captures these issues.
Dignity (question 11)
The term ‘the way I think and feel about myself’ was found to capture a person’s sense of self and significance well.
The term ‘the way I’m helped and treated’ (question 11) forced people to consider the way their care and support packages had an impact on their sense of self and significance. However, the research found that some people felt negatively about themselves not because of how they were treated but because they found it difficult to accept that they needed help.
Therefore, an additional question (question 10) was asked about the impact of having help, to provide an outlet for these thoughts. This helped to stop these thoughts from influencing responses to the question on the way people were helped and treated (question 11).
General point for all domains
A time frame of ‘the past couple of weeks’ was also tested, but it was found that this made the question difficult for people to answer. This was because many people had conditions that fluctuated, and others tended to ignore this part of the question when discussing reasons for their answer. Therefore, this phrase was left out of the eventual question wording.
Comparing ASCOT domains with established measures
A further research project, ‘An assessment of the construct validity of the ASCOT measure of social care-related quality of life with older people’, took place in 2012 that compared the ASCOT domains with other established social care and quality-of-life measures.
The research consisted of conducting interviews with older people receiving publicly funded home care and asking the ASCOT questions along with other questions that enabled the established social care and quality-of-life measures to be calculated. The data from the interviews was also merged with a previous survey of older home care users conducted by NHS Digital (now NHS England) 6 months previously that contained questions on perceptions of survey quality.
The responses to the questions covering the ASCOT domains were then compared with the other measures and the home care survey data.
The research found there was evidence for the validity of the control, occupation, personal care, personal safety, accommodation and social life domains. The results are summarised as follows.
Control and occupation
There was evidence to support the validity of these domains as there were strong associations with the expected variables and, where there were unexpected relationships, there was an explanation.
Personal care and accommodation
There was good evidence concerning these domains since both had strong associations in the expected direction with significant variables.
Social life
This had the anticipated relationships with other variables, although sometimes the associations were weak.
Personal safety
This seemed to capture factors inside and outside the home that could make a person feel unsafe, but there was a lack of association with how often the person met up with friends and family, which was unexpected.
Food and dignity
These domains had the weakest evidence, but this may have been a consequence of not being able to find good data to compare them against. The few measures that were used had the expected relationship, but often the relationship was weak.
These concerns should be considered when using the results in this publication, alongside the fact that the validity has only been proven for older home care users.
Addressing areas of concern around operating a national survey
The other questions were cognitively tested through a research project carried out by PSSRU specifically to address areas of concern around how the survey would operate. The report, ‘A report on the development studies for the national social care user experience survey’ (PSSRU discussion paper 2721, University of Kent), focused on 4 areas and aimed to:
- explore the variety of help received by service users in completing the questionnaire and consequences for the validity of the data
- examine the feasibility of using the proposed approach and the suitability of the questionnaire for people living in care homes
- develop a version of the questionnaire suitable for people with learning disabilities and explore the feasibility of the approach with this group
- ensure the feasibility of asking advocates to help service users to complete the questionnaire and the consequences for the validity of the data
Note that the report is currently not available on the University of Kent website.
Report conclusions
Allow service users to seek help
There was the possibility that the person helping to complete the survey may influence the responses of the service user, but this should be balanced against the desire for the survey to be as inclusive as possible.
It was recommended that service users should be encouraged to seek help from friends and family if they cannot answer the questionnaire without help.
Encouraging service users to seek help to complete the survey from advocates was not recommended except in circumstances where service users already had an existing relationship with the advocate.
Minimise bias
It was also recommended to minimise the potential for the person helping to influence the results by:
- adding instructions for the people supporting the service user to respond to the questionnaire to the front cover
- adding additional questions about who helped and how
- considering adding a mention of the survey for carers to the front cover
Care home staff members were more likely to be on hand to help than relatives or friends of the resident. Again, the possibility of bias needed to be balanced against the desire to be inclusive, so it was recommended that:
- care home staff should be engaged in the survey to make sure they are on hand to help
- a letter should be sent to the care home manager to gain support for the survey
- the letter to the care home manager should outline what type of help is acceptable
- while service users in care homes could seek help from care home staff, staff should encourage residents to initially seek help from regular visitors or a local authority helpline before offering assistance
Clarify how data will be used
It was recommended that local authorities consider making it clear to care homes how the data will be used, as staff are more likely to present truthful accounts if they think the survey will not be used to judge the care home.
Adapt the questionnaire for care home use
Also, the questionnaire should be adapted for care home residents to ensure all questions are applicable by replacing ‘home’ with ‘care home’.
Create an easy read version of the questionnaire
The questionnaire should be adapted for people with learning disabilities. Questions should not have more than 5 response options, the language should be simpler and images should be used.
Exclude service users who lack capacity
Another important conclusion was that those service users who lacked capacity to consent to take part in the survey should be excluded from the sample.
Questions that were not understood or appropriate for care home residents
Most of the questions were understood well and elicited the responses intended across all the studies. The exceptions were:
- several questions on abilities in activities of daily living were not appropriate for care home residents. Questions on ability with steps, mobility outside, managing the household shopping and preparing meals were felt to not be relevant to care home residents
- different questions to ascertain the health of the service user were tested. It was reported that the questions on self-perceived health and some of the questions in EQ-5D were not recommended. EQ-5D is a standardised instrument for use as a measure of health outcome
- the question on health conditions was answered unreliably and, instead, local authorities should report the primary reason someone is receiving support (their PSR) in their ASCS return, and use their own records to gather data on any additional or secondary support needs (such as health conditions)
Testing and reviewing the questionnaires and survey methodology
National Adult Social Care User Experience Pilot Survey
The questionnaire and survey methodology were tested through a National Adult Social Care User Experience Pilot Survey (NASCUEPS). NHS Digital ran NASCUEPS with 18 volunteer local authorities in April 2010. A report on the findings of this pilot survey is available as ‘Paper 2 - Report Pilot NASCUEPS Survey’ on SSUSG’s 20 July 2010 meeting page. The discussion of the paper is reported in the minutes of the meeting at ‘Minutes - 20 July 10’.
This NASCUEPS pilot found the methodology and questionnaire to work well in general. There were some areas of concern around the:
- resource required to run the survey
- potential for differences in approach when removing clients who lacked capacity and replacing them with other service users
This led to a tightening of the guidance around this issue when the survey went live in the autumn of 2010.
All the questions were found to work reasonably well, although it was suggested that the sub-questions within one of the activities of daily living questions be reordered to address a relatively low response rate for that question. An additional option was also added to the question on getting around outside of the home to provide an answer for those people who never left their home.
Stratified sampling pilot exercise
Further developments since the survey was run for the first time in 2010 to 2011 have mainly related to the introduction of stratified sampling. A stratified sampling pilot exercise was run in April 2011 with 17 volunteer local authorities. A report on the pilot survey is available as ‘Paper 3 - Stratified Sampling Pilot’ on SSUSG’s 20 July 2011 meetings page. The local authorities were asked to resupply their 2010 to 2011 survey data but, this time, to stratify their sample according to draft guidance.
The findings of this exercise showed several local authorities had misinterpreted the requirement to stratify their sample as an instruction to deliver robust results at stratum level. The stratified sampling pilot report therefore concluded that the final guidance should:
- make the requirement clearer
- emphasise that local authorities should not need to survey any more service users than they did previously
ONS review of ASCS methodology and questionnaire
Around the same time, the ASCS methodology and questionnaire were reviewed by the ONS Methodology Advisory Service and a response to this review was provided in collaboration with SSUSG (PDF, 437KB). This included details of which recommendations would be taken forward in the 2011 to 2012 survey.
Most changes were minor with the most significant being to clarify that:
- interviews should only be offered if requested in response to an initial postal invite to take part
- all local authorities should issue one reminder letter regardless of whether they have already achieved the required sample size
Process for checking whether a service user has capacity
Another significant change to the 2011 to 2012 survey was to the process for checking whether a service user had capacity to complete the survey. The new process instructed that the check:
- only needed to be carried out for service users in residential care
- could be performed by the care home manager
This was based on feedback from local authorities that the previous process was burdensome and, as a result, was being applied inconsistently across local authorities. To read more about this, see ‘Paper 5 - Minutes of meeting between SSUSG subgroup and SC REC subgroup’ on SSUSG’s 19 April 2011 meetings page.
Calculating confidence intervals and standard error
The confidence coefficient is determined by the value of the alpha parameter, which by default equals 0.05. This corresponds to 95% confidence limits. The confidence limits are calculated using the:
- estimate of the mean
- standard error of the mean
- relevant percentile of the t-distribution based on the appropriate degrees of freedom
Weights are applied to ensure that the survey results accurately represent the eligible population. The standard formula for the variance of estimates in a stratified sample design is used.
The weight for each stratum reflects the number of eligible individuals in that stratum relative to the total eligible population across all strata.
The variance is then calculated using the weighted results from each stratum.
This variance is the main value needed to determine the 95% confidence interval for an estimate.
The 95% confidence interval is calculated by taking the estimate and adding and subtracting approximately 1.96 times the square root of the variance of that estimate. This provides a confidence interval around the estimate.
A standard formula is also used to calculate the standard error of an estimated percentage. The standard error shows how much an estimate from a sample survey is expected to vary from the true population value. It depends on both the sample percentage and the size of the sample achieved.
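The sketch below illustrates this calculation for an estimated proportion under stratified sampling. It uses a fixed multiplier of 1.96 rather than the percentile of the t-distribution, omits any finite population correction and uses made-up stratum figures, so it is an approximation of the approach rather than the published calculation.

```python
import math

def stratified_estimate(strata):
    """Weighted estimate, variance and approximate 95% confidence interval
    for a proportion from a stratified sample. Each stratum is a dict with:
    N (eligible population), n (usable responses) and p (stratum proportion)."""
    total_N = sum(s["N"] for s in strata)
    weights = [s["N"] / total_N for s in strata]
    estimate = sum(w * s["p"] for w, s in zip(weights, strata))
    variance = sum(
        (w ** 2) * s["p"] * (1 - s["p"]) / s["n"]
        for w, s in zip(weights, strata)
    )
    half_width = 1.96 * math.sqrt(variance)
    return estimate, variance, (estimate - half_width, estimate + half_width)

def standard_error_percentage(p, n):
    """Standard error of an estimated percentage p (0 to 100) from n responses."""
    return math.sqrt(p * (100 - p) / n)

# Illustrative stratum figures only - not taken from the survey
est, var, ci = stratified_estimate([
    {"N": 2000, "n": 150, "p": 0.62},
    {"N": 1000, "n": 120, "p": 0.55},
    {"N": 3000, "n": 200, "p": 0.70},
    {"N": 4000, "n": 230, "p": 0.66},
])
print(round(est, 3), tuple(round(x, 3) for x in ci))
```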
Rationale for questions and notes on interpretation
Section 1: overall satisfaction with your social care and support
Question 1
This is a general measure of how satisfied social care users are with the services they receive. It is similar to the satisfaction question asked in previous surveys and has been included to ensure some degree of continuity with the previous user experience surveys.
Section 2: your quality of life
Question 2
This is a general measure of quality of life that has been used in other national surveys. Local authorities will be able to use this question to:
- get a sense of users’ views about their overall quality of life
- compare this to the average for the UK population
Questions 3a, 4a, 5a, 6a, 7a, 8a, 9a, 10 and 11
These questions ask about aspects of social care-related quality of life (SCRQoL), meaning the aspects of quality of life that the range of social care services can be expected to impact upon. With these questions, local authorities should be able to monitor outcomes for social care users.
Questions 10 and 11 are as follows:
- question 10: which of these statements best describes how having help to do things makes you think and feel about yourself?
- question 11: which of these statements best describes how the way you are helped and treated makes you think and feel about yourself?
While the questions may seem similar, they have both been included as they complement each other. Most social care services are ongoing, so the services become an integral part of the user’s life. Aspects associated with the way the services are delivered are therefore very important and question 11 is designed to capture the effect of this on a person’s psychological wellbeing.
Cognitive testing found that some people interpreted question 11 as whether their need for care and support affects their psychological wellbeing. Clearly, for many disabled people, coming to terms with the consequences of their disability is an important issue and question 10 was introduced to capture this.
Including question 10 prior to question 11 ensures that question 11 is interpreted as intended.
Questions 3b and 7c
These questions - along with the other optional (b) questions - add context to their predecessors and let the service user say whether they feel social care services impact on the various aspects of their quality of life.
Section 3: knowledge and information
Question 13
Social care services have an important role in signposting service users to organisations that could help them and provide advice.
In the questionnaire, white space separates the substantive answers from the ‘not applicable’ option. It is very important that this white space is included as research has shown that respondents are guided in their answers by their impressions about the length of the response scale.
If the visual cue makes them think the scale consists of 5 rather than 4 options, then they will answer accordingly and responses will be, in this case, more negative.
Section 4: your health
Question 14
This question asks about users’ self-perceived health. Health can mean many things to different people and research has shown that this question correlates well with mortality. Local authorities can use this question to interpret questions 3 to 11.
Questions 15a and 15b
Being and staying healthy is an important goal of social care services, which can contribute to a person’s health by maximising their SCRQoL.
There are, however, some important aspects of health - in particular pain, anxiety, and depression - that are not included in the measure of SCRQoL as it was felt that social care services do not directly act to improve these aspects of health. They can of course have a role by ensuring medication is taken and providing company, but it is the responsibility of health services to manage dosage or put in place psychological interventions.
It is useful for local authorities to know about these aspects of health as it can help local authorities to interpret questions 3 to 11.
Questions 16a to 16d and questions 17a to 17d
These questions capture the extent to which the service user is dependent on help from another person to undertake activities of daily living.
They provide some information on the need level of the respondent across a variety of activities and are critical in helping to explain variations in questions 3 to 11.
Section 5: about your surroundings
Question 18
The layout of the person’s home can greatly influence the type and amount of help they need. This question provides information about the extent to which housing stock could be improved for people in the area and is an indicator of need.
Question 18 is important in helping to explain variations in questions 3 to 11.
Question 19
Similarly to question 18, the layout of the local area, transport links and proximity of amenities can greatly influence the type and amount of help a person needs when they venture out of their home.
While this question provides information about the extent to which the built environment and local transport could be improved for people in a local authority’s area, it is also an indicator of need and is critical in helping to explain variations in questions 3 to 11.
Section 6: about yourself, the service user
Question 20
Friends and family can also contribute to ensuring a person has good SCRQoL. This question is an indicator of the extent to which friends and family are involved in the care of social care users in your area, but it is also an indicator of need and is critical in allowing local authorities to interpret variations in questions 3 to 11.
Question 21
In our exploratory work, some service users bought care privately or topped up the care they received from the local authority. Often this was for specific aspects of their care, such as housework and cleaning. Service users wanted to make it clear that it was not the local authority that was helping them achieve good SCRQoL in these areas.
This question is an indicator of the extent to which service users need to draw on other resources to achieve the desired level of quality of life in each of the areas identified in questions 3 to 9.
This question is important in helping local authorities to interpret variations in questions 3 to 11.
Questions 22 and 23
A very large number of service users need help to answer the questionnaire. Exploratory work has shown that the help given is hugely varied and the way the help is given may well influence the responses.
These questions are very important in helping local authorities to understand variations in responses to questions 3 to 11.
Local authorities should consider, for example, whether they want to treat responses from people who had no help differently from those who had help - particularly where the help meant the service user had very little input into the answers.
Methodological change notices
The changes described in the notices below were agreed by SSUSG, which included representatives from NHS Digital (now NHS England), DHSC, PSSRU, the Care Quality Commission and local authorities in England.
There have been no other changes to the methodology for the survey or to the design of the questionnaire.
2021 to 2022
See the ASCS 2021 to 2022 methodological change notice.
2020 to 2021
See the ASCS 2020 to 2021 methodological change notice.
2016 to 2017
In the 2016 to 2017 survey, a new question was included in the standard version of the community questionnaire - question 2c: ‘Which of the following statements best describes how much choice you have over the care and support services you receive?’
This question is only asked of community service users who completed the standard questionnaire and therefore the weighting applied to this question reflects these specific service users within the eligible population. This question is weighted using the following filtered eligible population data:
- support setting is equal to ‘community’
- primary support reason does not equal ‘learning disability’
- stratum is equal to 2 (aged 18 to 64 excluding learning disability support) or 4 (aged 65 and over in the community, excluding learning disability support)
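A minimal sketch of this filter is shown below; the field names are hypothetical and simply mirror the criteria listed above.

```python
def eligible_for_question_2c(user):
    """Filter the eligible population used to weight question 2c:
    community setting, not learning disability support, and stratum 2 or 4."""
    return (
        user["setting"] == "community"
        and user["psr"] != "learning disability support"
        and user["stratum"] in (2, 4)
    )
```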
2014 to 2015
In the 2014 to 2015 reporting year, the main changes that had an impact on the ASCS were:
- the replacement of the Referrals, Assessments and Packages of Care (RAP) return with the SALT collection, which resulted in a change to the target population for the survey
- changes to the sample substitution criteria for the survey
- changes to the way in which sample weights are calculated when analysing the data
Change to the population covered by the survey
Previously, the eligible population of adult social care users for the ASCS had been those in receipt of local authority-funded services following a full assessment of need. This was the same group of individuals who would have been eligible for inclusion in a ‘snapshot’ of the RAP return.
When RAP was replaced by SALT, the eligible population for the ASCS changed from a snapshot in the RAP table to the most closely comparable SALT table at a chosen extract date.
To be included in the SALT table, a service user had to - at the point that data was extracted from the local authority’s system - have been in receipt of long-term support services funded or managed by the local authority, following a full assessment of need.
The changes to the population covered by the survey were:
- service users whose only services were the provision of equipment, professional support or short-term residential care were included in the RAP table but not in the SALT table. The exception to this was that service users receiving professional support for their mental health needs were included even where this support was the only service they received
- ‘full-cost clients’ (those who paid for the full costs of their services but whose care needs were assessed and supported through the local authority) were not eligible for inclusion in RAP but were included in SALT
Change to the sample substitution criteria
Before 2014 to 2015, where a service user had been selected in the sample, they were sent a questionnaire even if it was known that their services had stopped since they were selected. From 2014 to 2015, these service users were instead removed from the sample and replaced with a suitable alternative.
Changes to the weighting methodology
Stratified sampling was introduced into the ASCS in 2011 to 2012, and each response was assigned a weight so that the results were representative of the entire target population from which the sample was drawn.
For the years 2011 to 2012, 2012 to 2013 and 2013 to 2014, these weights were calculated by dividing the count of the target population by the count of respondents (the inverse probability of being a survey respondent) in each local authority for each stratum. The same weights were applied for all question responses.
For 2014 to 2015 onwards, a unique set of weights was calculated for each question by dividing the count of the target population by the count of usable responses to that question (the inverse probability of responding to that question) in each local authority for each stratum.
This method is more robust and produces more accurate results.
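The sketch below contrasts the two weighting approaches for a single local authority, assuming hypothetical record structures: a single weight per stratum based on all respondents (used from 2011 to 2012 to 2013 to 2014), and a per-question weight based on usable responses to that question (used from 2014 to 2015 onwards).

```python
from collections import Counter

def respondent_weights(eligible_counts, respondents):
    """Pre-2014 to 2015 approach: one weight per stratum, equal to the eligible
    population count divided by the count of respondents in that stratum."""
    responded = Counter(r["stratum"] for r in respondents)
    return {s: eligible_counts[s] / responded[s]
            for s in eligible_counts if responded.get(s)}

def question_weights(eligible_counts, respondents, question):
    """2014 to 2015 onwards: a weight per stratum for each question, equal to the
    eligible population count divided by the count of usable responses to it."""
    usable = Counter(r["stratum"] for r in respondents
                     if r["answers"].get(question) is not None)
    return {s: eligible_counts[s] / usable[s]
            for s in eligible_counts if usable.get(s)}
```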
