Evaluation of CQC's local authority pilot assessments

Published: 8 December 2023
Page last updated: 8 December 2023


Fieldwork evidence gathering

Methods used to gather evidence

The assessment team used a variety of methods to gather the breadth of evidence needed during the intensive fieldwork period. Members of the team gathered evidence both on site and off site through:

  • carrying out individual interviews with staff and stakeholders
  • meeting with different groups
  • offering drop-in sessions
  • visiting services and different groups.

Many of these engagements centred on the local authority’s main offices, but the team also travelled to other locations to meet different people. Overall, the fieldwork approach seemed to work well for the assessment team.

Assessment team timesheet data showed that time spent on evidence gathering increased as the pilots went on. This is likely explained by the team finding that some methods, such as group interviews, needed extra time, and by additional methods being introduced on the later pilots, including managing staff drop-ins and the provider survey. There was also more engagement with local voluntary groups on some of the pilots, and case tracking interviews were increasingly undertaken in person.

Leads for the pilots trialled slightly different meeting lengths, but arrived at a consensus of around 45 minutes to an hour for individual meetings and up to 1.5 hours for groups. For most interviews and groups, one member of the team would lead while another took notes. It became clear on the pilots that it was important to have a note taker who could digest what was said, help agree the key points, and discuss the meeting afterwards with the lead.

We found that there was generally a good balance of on-site and remote fieldwork from the local authorities’ perspective, and they appreciated the flexibility this offered to fieldwork participants. Most commented that their staff preferred meeting CQC face-to-face and that some local groups or providers requested online meetings. This largely reflects what happened in practice, and where specific requests were made, CQC was generally accommodating. Some local authorities reflected that, although some local groups requested online meetings, face-to-face may have worked better in encouraging participation.

We heard from the local authorities that staff were often apprehensive before speaking to CQC, but later reported positively on the experience. They said they were able to speak about their work and their achievements and that the team created an open and comfortable atmosphere. One described staff coming out of the meeting with CQC as being “on a bit of a high”.

We also heard that CQC considered features of individual local authorities in deciding evidence gathering methods, for example taking into consideration the geographical spread of a local authority and what visits could be accommodated practically in the time available for the fieldwork. These adjustments tended to be more practical considerations, which the approach was flexible enough to accommodate.

One local authority did make a wider comment on the importance of considering the local area context in how the approach is applied:

I think sort of tailoring the inspection model to particular characteristics of an area is something if there's scope to do it in an inspection and that adds valuable insight.

They went on to explain how deprivation in parts of the area they cover has a “huge impact on how we discharge our social care duties.”

Question techniques

There was some mixed feedback from the local authorities on whether the questions CQC asked were at the right level. One explained:

I think the nature of some of the questions were really open ended, very open ended in that you could look to respond to that from multiple angles.

This left some staff concerned that they were not always giving CQC the information needed. However, others seemed to appreciate the opportunity to talk openly about their work. The team seemed to adapt questioning styles where it made sense, using prompts to gather more detail, and we observed more pointed questioning where, for example, the team were following up points raised in the corroboration meetings.

There was potentially some benefit in team members following certain themes, such as safeguarding, across the meetings they led, as this helped them corroborate what they had previously heard. However, there is also value in attending a mix, and often the timetable and other logistical pressures dictated this. Similarly, there were suggestions of an ideal order for certain meetings, for example speaking to the local authority’s CEO late in the fieldwork to reflect on and test out what had been heard, but again this was not always possible owing to people’s availability.

A couple of points arose about good practice for interviewing, including giving a consistent message at the start of conversations about the confidentiality of what participants say (if this has not already been shared in briefings) and ensuring that confidentiality is upheld, for example by not sharing an individual’s viewpoints with other fieldwork participants.

Engaging the right people

In our survey of local authority staff who had been involved in the pilots in some way, 88% (n=136) agreed that CQC was speaking to the right people to understand and assess the local authority. Furthermore, most felt CQC asked them the right questions, with 82% answering ‘no’ when asked if there were any other questions CQC should have asked them. Separately, we heard reflections from some of the local authority lead members about CQC’s areas of interest when speaking to them, as they mentioned:

  • performance management
  • governance
  • scrutiny processes and oversight
  • the involvement and perspectives of people who use services.

In our interviews with local authorities, some told us that there were other groups and individuals who they felt should have been involved in the fieldwork but were not initially. Some of this was explained by early confusion in establishing the responsibilities of different teams and roles, which was usually rectified quickly. Some local authorities suggested in future CQC should outline the types of role responsibilities they are interested in, rather than specific role or team titles. One explained:

I think it's a case of more clarity around the roles you wanted to cover and then letting us identify who within our organisation could actually do that work to do that interview for you.

Another local authority commented that information they had already supplied could have been put to better use and avoided the need for meetings to discuss who CQC should engage with:

If someone had really read our self-assessment and looked at a structure chart of our department first before sending us the list of teams they wanted to meet, we could have avoided needing to have that meeting.

The assessment team also reported a lack of information from the local authorities on why certain teams or individuals had been scheduled to meet them. This was exacerbated by the rush to finalise timetables before fieldwork, allowing little time for queries.

It seems that the pressured time in the run-up to fieldwork may have prevented an opportunity to alleviate some of this confusion. Going forward, local authorities would appreciate more refined up-front guidance to support this. If effective, this would also help to ensure that CQC targets the fieldwork at the most pertinent groups and individuals. Speaking to staff without managers present is a long-established method used by CQC on other types of inspection and influenced the approach CQC took with local authorities. However, some local authorities expressed frustration at the request not to include managers in some meetings with frontline staff. Local authorities also said CQC was reluctant to engage some of the managers separately or as a group, and they felt this was potentially a missed opportunity to gain more of a strategic perspective.

Two local authorities felt there could have been more engagement with a wider range of health colleagues. One explained they’d expect this:

where there is that really strong dependency between social care and how to deliver and social care duties under the Care Act.

Another added that although CQC had asked to speak with health partners they had been unclear on precisely which ones:

We were asked to have interviews with Health Partners, but there wasn't any clarity as to which ones, strategic health colleagues, operational colleagues, or place based.

One of the local authorities also acknowledged that, as yet, they did not know how far this might be covered by integrated care system assessments, as they were aware that CQC is also piloting these. It will then be possible to consider the integrated care system and local authority pilots alongside each other, and the extent to which each examines parts of the Care Act requirements.

Linked to a number of these points was the importance of striking the right balance between the local authority and CQC over how far local authorities should influence who CQC engages in the fieldwork. Because these were pilots, it was possible for this to be more of a two-way process between CQC and the local authorities. Going into formal assessments, this is not expected to be sustainable from a practical perspective, but it raises an important point about the level of independence needed in the fieldwork arrangements.

People’s experience evidence

Gathering people’s experience as evidence is integral to validating other evidence found in the assessments and is one of the evidence categories in the single assessment framework. There has been progress in this area since the test and learn exercises, such as including case tracking in the approach. We asked local authorities and the team if CQC had good quality methods to collect enough information on experiences in the pilots, and while they could describe a range of ways this was happening, they often had suggestions about how CQC could do more.

The team frequently spoke of actively trying different ways to engage a wide range of community and other types of local groups. One described asking the local authority for a list of local voluntary groups, who were then telephoned, but this was at rather short notice and few responded. Local authorities described thinking during the planning stage that there was not sufficient engagement with certain groups of people, and in some cases they suggested wider groups to involve, which CQC was often able to accommodate. Others found that CQC made requests when on site to speak to or visit different groups, such as asking to visit a community hub at short notice. One local authority added that CQC would have,

got a different picture of [the local authority area] if they had gone out and met people in different locations.

It is clear that these engagements were valuable and should continue, alongside consideration of other opportunities to engage people in the assessments. Suggestions included engaging more community groups to get a more diverse set of views, including through:

  • drop-in sessions
  • involving Healthwatch to obtain people’s views
  • involving advocates to ask people questions on CQC’s behalf.

In increasing the range of methods, CQC should also consider how to ensure these are independent of the local authorities, who, as one team member flagged, tended to determine the groups that CQC engaged with. Additionally, to engage more people it is vital to ensure that opportunities are accessible and inclusive, considering people’s different needs.