Reflecting on your feedback supports our ongoing improvement

Care Quality Commission
26 March 2024

CQC’s Chief Executive, Ian Trenholm, talks about the feedback we’ve received on the rollout of our new regulatory approach and what we’re doing with it.

CQC is changing. We’ve developed and introduced a new assessment framework which, over the last few weeks, has been implemented across the health and social care providers that we rate in England. It provides a single vision of quality for the whole health and care system.

Our new regulatory framework is a significant change in the way that we assess health and social care services, though it’s important to remember that the regulations themselves have not changed. We have drawn heavily on our experience of regulation over the last decade, including lessons learned during the pandemic and our Key Lines of Enquiry. We’ve worked closely with Think Local Act Personal (TLAP) to embed people’s expectations about what they want from good health and social care into how we assess quality.

The new framework is supported by a restructured organisation, new technology and new ways of assessing. It builds on what we do well and introduces new ways of working to match how health and social care is changing. We have carried across elements where possible and are changing where necessary. Importantly, it makes more of the voice of people using services and enables them to be heard more clearly.

Our ambition for the new assessment approach is to use the data we collect more effectively, so that we can update our view of quality across services and systems more frequently. It will give people who receive care, and providers, a clear view of the quality of care being delivered and the areas for improvement.

Transformation of this scale has taken a huge amount of planning and development. Over the last 3 years we’ve worked with people who use services and the people who care for them to ensure that the new framework represents and supports them and their needs. We’ve worked closely with thousands of providers across the sectors we regulate to ensure that there’s clear understanding of the changes we’re introducing and what it means for them.

At the same time, transformation of this scale is, of course, neither linear nor simple. We needed a straightforward way to move from our previous approach to the new one while preserving the ratings we have developed over the last decade.

We’ve taken a staged approach, and the first providers to be assessed under the new framework have offered valuable perspectives on their experience so far. We’re learning and refining as we implement, and hugely value the feedback we’re receiving from colleagues across health and social care.

Because we have made significant investments in the latest generation of technology, we can change the way things operate very quickly, a major departure from our old ways of working. The examples below show where we have already been able to put changes in place based on feedback.

You said, we did

We’ve heard from providers that there’s overall support for the ambition, intention and outline for our new approach. There has also been support for the move to a single assessment framework made up of a set of quality statements that allows for greater flexibility. We’ve had feedback that interactions with assessment teams have often been positive and more collaborative.

We have, however, been challenged on where we can improve our regulatory approach. Feedback has concentrated on the number of quality statements we might choose to review at each assessment; the frequency of assessment; the way we manage relationships with providers; the clarity of our guidance; and the functionality of our new provider portal.

Quality statements

Key feedback we’ve heard has been about the number of quality statements used to assess services, and whether this means assessments, and therefore scores, are based on out-of-date evidence. We know how important it is for the public, providers and stakeholders that we share an up-to-date view of quality. We must also balance the need to undertake more assessments with ensuring we review as much new evidence as possible in each assessment.

We are committing significant resources to assessing services that have not been inspected for some time, and to those rated inadequate or requires improvement, with a view to changing the rating. We know this is of particular concern for providers. We plan to do this in a way that maximises the number of services we can visit, providing clarity to providers while avoiding inflexible rules on the number of quality statements we’ll assess, which in the longer term would be unhelpful and unnecessary.

Every service is different, of course, so the amount of work required to confirm or change a rating will vary depending on the starting position. When we assess a service rated inadequate or requires improvement, we will review all quality statements under any key question rated inadequate or requires improvement. It has always been true that a provider with many key questions rated requires improvement will need significantly more work to re-rate as good than a provider with a single key question rated requires improvement. That does not change in the new approach, though the amount of work per key question rating is reduced. Our new approach lets us do that work in smaller packages over shorter time frames, working on and off site, which providers should find less disruptive.

Responsive assessments will, of course, continue when we receive information of concern or need to follow up enforcement action. We’ll also assess when our data and insight work raises alerts, or when information indicates innovation or outstanding practice.

Frequency of assessments

We know as well that providers are keen to understand the planned frequency of assessments: when might they reasonably expect to be assessed? We’re developing that information now, using the feedback and data we’re gathering during this period of transition, and we aim to publish timelines in the summer.

Relationships with providers

Another thing we’re hearing is concern from providers that they will no longer have a dedicated relationship holder. Our strategic intent to make it easier for providers to work with us and to understand the quality of care in an area remains the same. Our new assessment team structure means that there will always be someone you can contact and speak to who has knowledge about your service. This might not always be the same person, but knowledge and relationships will be shared across a team.

On the back of feedback from providers and our colleagues, we are reviewing how this assessment team structure is working in practice. This might result in some changes to ways of working and how individuals and teams respond to providers. We will keep providers updated on any changes that come from this work. Our intention is to talk more about this during April and explain how to get in touch in the most effective way.

Clarity of guidance

We’re already making changes based on provider feedback. We heard that the guidance section of the website was difficult to navigate and generally confusing. We’ve created a new ‘guidance and regulation’ landing page, which is structured more clearly around the main ways providers interact with us: registration, regulations, notifications, assessment and enforcement. We’ve also designed an index page for assessment, covering the operational side of assessment as well as our assessment framework. It sets out a clear running order and lets providers download all of our assessment content from a single page in PDF format.

Adding dates for the ‘last significant update’ against our assessment guidance has also made it easier for providers to see when information has changed.

Provider portal

Our new provider portal is now available, and an increasing number of people will be able to enrol over the next couple of weeks. The portal has been built, and is being refined, using feedback from providers. Where we’ve encountered issues in the enrolment process, we’re taking swift action to resolve them, again thanks to helpful feedback from providers and colleagues. We’re continuing to work towards releasing more functionality in the coming weeks, along with further improvements such as a better experience for providers uploading and sharing information with us.

I’m confident that as we continue to assess services against the new framework, and as our technology changes mature and become embedded, the strategic aims behind our transformation will be realised. These aims of regulation driven by people’s needs and experiences, smarter regulation, safety through learning, and accelerated improvement underpin everything that we’re collectively working for.

Thank you for your patience and support as we navigate through this transition period.

