The CQC continues to release updates in respect of its progress with the implementation of the new single assessment framework.
Earlier in the summer it was announced that there would be further delays, with most regions of the country now not moving to the new framework until Q1 of 2024. As of November 2023 the CQC will start using the new framework in the South region (Berkshire, Buckinghamshire, Cornwall, Devon, Dorset, Gloucestershire, Hampshire, Kent, Oxfordshire, Somerset, Surrey, Sussex and Wiltshire). The framework will then be rolled out to the other regions, with the roll-out expected to be complete across all regions by March 2024. Until that time, services operating elsewhere will continue to be inspected under the existing framework.
The CQC has confirmed that it will give providers 2-3 months’ notice of when the framework will be implemented in their area. Even then, not all services will be assessed immediately under the new regime: the CQC will continue to inspect services on a risk basis, so new assessments will take time to reach everyone, which may be frustrating for improved providers with historic poor ratings.
New provider portal
The new provider portal is being rolled out to all providers in phases from September 2023 (following an initial pilot in August), with availability to all providers expected by March 2024. All interactions with the CQC will be via the provider portal, enabling providers to share information, validate the information the CQC holds about them, and comment on CQC concerns in real time. The invitation to the portal will be sent to the Nominated Individual and Registered Manager, so it is vital that providers, nominated individuals and registered managers ensure the CQC has up-to-date contact details. It remains to be seen how easy the system is to use, whether it will make the process of sharing and validating information smoother, and how quickly the CQC will update ratings/data in response to provider responses.
We are told the portal will show a provider information gathered from other stakeholders, not just information collected from the provider itself, giving providers greater visibility of what information the CQC holds about them. This has the potential to affect relationships with other stakeholders and is something to consider when interacting and engaging with them.
Rating system under the single assessment framework
As previously confirmed, the CQC will continue to use the five questions (safe, effective, caring, responsive and well-led) and the four-point ratings scale (outstanding, good, requires improvement and inadequate).
Under each key question is a set of Quality Statements that describe what good care looks like. Information obtained by the CQC under each Quality Statement will be organised and scored under six evidence categories:
- People’s experience of health and care services
- Feedback from staff and leaders
- Feedback from partners
- Observation
- Processes
- Outcomes
Each evidence category for each Quality Statement will be assigned a score:
- 4 – evidence shows an exceptional standard
- 3 – evidence shows a good standard
- 2 – evidence shows some shortfalls
- 1 – evidence shows significant shortfalls
The scores for each evidence category will be combined to give a quality statement score, and the quality statement scores will then be combined to give a key question score. Interestingly, it appears that all quality statements will carry the same weighting regardless of the area they touch on. There has, so far, been no suggestion of rating limiters, whereby a low score on the evidence in certain key areas would cap the overall score a key question can be given, however highly every other quality statement scores.
Key question scores are then converted to give a key question percentage. These percentage scores are then used to assist benchmarking against other services.
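The aggregation described above can be sketched as follows. The CQC has not published the precise arithmetic, so this is a minimal illustration only: it assumes simple averaging with equal weighting (consistent with the equal weighting of quality statements noted above) and expresses the key question score as a percentage of the maximum score of 4. The function names and example scores are hypothetical.

```python
# Hedged sketch of score aggregation under the single assessment framework.
# Assumption: "combined" means a simple equal-weighted average at each level.

def combine(scores):
    """Average a list of scores on the 1-4 scale."""
    return sum(scores) / len(scores)

def key_question_percentage(quality_statements):
    """quality_statements: a list of quality statements, each a list of
    evidence-category scores (1-4).

    Each statement's evidence scores are combined into a quality statement
    score; the statement scores are combined (equal weighting) into a key
    question score, which is then expressed as a percentage of the maximum
    possible score of 4.
    """
    statement_scores = [combine(evidence) for evidence in quality_statements]
    key_question_score = combine(statement_scores)
    return 100 * key_question_score / 4

# Hypothetical "safe" key question with three quality statements
safe = [
    [3, 3, 4],  # evidence-category scores for statement 1
    [2, 3],     # statement 2
    [3, 3, 3],  # statement 3
]
print(round(key_question_percentage(safe), 1))  # → 73.6
```

On this reading, a single weak evidence category dilutes the percentage but never caps it, which is why the absence of rating limiters noted above matters.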
The idea is that the CQC could update scores for different evidence categories at different times. Any changes in evidence category scores can then update the existing quality score as well as the overall rating.
The CQC will continue to undertake inspections to gather evidence and assess quality. The CQC plans to publish shorter reports which will include a summary section about people’s experience of the service, and intends to provide benchmarking data against similar services in the area/country. The new report template will use standard template text for automatic generation of text within the report. We have already seen generic statements in reports that do not accurately represent the findings of the inspection, yet which have proven difficult to change during the factual accuracy process. The CQC has said that the fundamentals of the factual accuracy process will remain unchanged but will be improved; it is not clear how this will operate under the new framework, which is meant to be more dynamic.
In addition to inspections, the CQC will also collect evidence on an ongoing basis with the intention that it will “in time” be able to change a service’s rating at any time. The acknowledgement that this will take some time appears, for now at least, to move away from a key part of the drive to be more agile, responsive and up to date with ratings.
It is not clear what information will be available when a service rating is changed outside of an inspection process to ensure transparency around decision making. There is no guidance currently on how a provider can challenge such decisions, for example by way of a factual accuracy process or rating review – processes that are already considered by many to be clunky and too rigid to work effectively.
Emphasis on feedback and engagement
A key part of the new framework is an emphasis on obtaining the views of service users, relatives and staff. Our recent experience is showing that the CQC are not waiting for the implementation of the new framework to increase its focus on this aspect.
The CQC Public Engagement Strategy 2023-2026 sets out the CQC’s approach to improving public engagement, to deliver what it says will be smarter regulation driven by people’s needs and experiences of care. Whilst the strategy sets out the CQC’s own objectives and approach, it also refers to the CQC’s expectation that providers will be proactive in seeking feedback. For example, the strategy is clear that the CQC will assess how those who provide and organise care services in an area encourage and enable people and communities to provide feedback, how services act on what they hear, and how they work in partnership with people to design services. The CQC also states it is particularly interested in how services work with people who are more likely to have a poorer experience of care and who face inequalities; how well those people’s feedback is obtained and used will attract particular scrutiny. All of this will then be reflected in the assessments and ratings that the CQC gives to services.
We have had very recent experience of a substantial increase in the volume of feedback sought, including one provider where well over half of relatives and a substantial proportion of staff were spoken to. There had been no opportunity to notify staff or relatives, and some found it distressing, not knowing why the CQC was contacting them outside of work or while they were on holiday. We would recommend advising staff members and relatives that the CQC may reach out to them personally with no notice.
We have also seen requests for contact details of staff members for the CQC to contact those individuals outside of work, raising the question of whether a provider should be providing personal contact details.
There is also significant concern over the fairness of how feedback is selectively referenced in reports. The CQC’s routine refusal to share even anonymised feedback makes it impossible to properly assess whether a report reflects a fair balance of the feedback used.
All of the above demonstrates that it is imperative that providers seek feedback from a high proportion of service users, relatives and staff. This not only allows providers to identify and address any concerns, but also avoids self-selection for negative feedback, reduces the likelihood of ‘venting’ to the CQC, and provides evidence for contesting the CQC’s summary of feedback.
Right support, Right Care, Right Culture (RRR)
The new CQC Director of People with a Learning Disability and Autistic People, Rebecca Bauers, chose in her first blog to emphasise her commitment to Right Support, Right Care, Right Culture as the “framework of good practice for providers and systems to adhere to”. As well as potentially frustrating providers who consider that guidance fundamentally flawed and harmfully inflexible, this will have implications for local authorities and ICBs, who routinely seek to commission services that do not adhere to the framework.
It may also have implications for services registered before RRR or its predecessor guidance, many of which may have struggled to demonstrate strict compliance but have nonetheless been repeatedly found to be providing good care. On a more positive note, the CQC’s clearly increased drive in recent times to truly listen to the experience of people with learning disabilities and autism is encouraging, and comes across as a passion of Ms Bauers. The same blog also warns of an upcoming cross-sector CQC policy position on the use of restrictive practices, stating that all use of restrictive practice will be treated as a potential failure in a person’s care pathway. There will also be an increased emphasis on training, including the Oliver McGowan Code of Conduct.
What to expect in the near future
The CQC has promised monthly updates and further guidance documents on the roll-out of the new framework, and we therefore recommend that providers keep an eye out for these to stay up to date.
Our clients’ recent experience with the CQC shows examples of enforcement action and negative reports on the basis of information being ‘not provided’ or ‘unavailable’, when in fact this stems from a lack of requests, or ambiguity in those requests. There are also significant delays in CQC processing times. We have seen draft reports take several months to be issued following inspection, even where issues are identified as requiring quick action by the provider. Representations made to the CQC, for example in respect of Notices of Proposal to add conditions to a licence, are taking the CQC up to 60 days to respond to. We do not anticipate that timescales will decrease in the near future whilst the CQC transitions providers to the new framework.