Developing our new regulatory model

Care Quality Commission
7 min read · Jun 24, 2022

In her new blog, Joyce Frederick, Director of Policy & Strategy, provides an update on the development of our new regulatory model.

In his latest blog, CQC’s Chief Executive, Ian Trenholm, gave an overview of what we have done as an organisation since the launch of our strategy in May 2021. It’s been a busy year for CQC, and we’ve made great progress in developing how we’ll deliver the ambitions of our strategy. This would not have been possible without the incredible amount of input we’ve received from providers, partners and the public through our engagement activity.

A huge part of our efforts over the past year has been developing a new regulatory model, which will change the way we operate and allow us to give a more up-to-date view of quality across health and social care. We want to move away from our current ‘monitor, inspect, and rate’ approach towards a more flexible one, where inspections won’t take place at set frequencies and where ongoing assessment of quality will be proportionate to risk.

Previously, I’ve talked about our ambition to replace our current four assessment frameworks with a single assessment framework, which we will use to assess quality in all service types at all levels, including local authorities and integrated care systems. This will help us to avoid duplication and streamline our approach across the system.

Our new model will also include:

  • registration becoming the first assessment activity in an integrated process based on our single assessment framework, with an ambition to register providers at the rating level of ‘good’
  • the four ratings and five key questions remaining central to our approach
  • a new set of focused quality statements that replace our existing key lines of enquiry (KLOEs) and ratings characteristics
  • more frequently updated ratings and an approach to updating them that doesn’t solely rely on inspection, but with inspections remaining a vital tool in assessing quality
  • a clear set of evidence that we’ll use to assess different types of providers and a scoring approach that supports greater consistency
  • reports that are simpler, shorter and clearer about the quality of care we’ve found.

Today I want to talk about some of these areas in a bit more detail.

Putting people at the heart of our approach

Our five key questions will remain at the centre of our new approach, but we will strengthen them using ‘I statements’. We want to use these to help articulate what a good experience of person-centred care looks and feels like, to be clear about the standard of care that people should expect, and to support a structured approach to gathering feedback from people who use services. In this area, we’re building on work previously developed by Think Local Act Personal (TLAP), National Voices and the Coalition for Collaborative Care on Making it Real. We have worked with these partners to develop the ‘I statements’ through co-production.

A simplified view of quality

In our current approach, there are over 300 KLOEs and prompts that set out the standard of care we expect to see from providers. We know we need to present a simpler view of quality so that the public can understand what they should expect to receive and that providers understand what we expect them to deliver.

In our new model, and to replace the KLOEs, we will describe the standard of care we expect to see through a series of 34 quality statements — part of the single assessment framework. These are written from the perspective of a provider, local authority or integrated care system and you may hear them referred to as the ‘we statements’. We will now be testing the quality statements with a number of providers and stakeholders.

Our quality statements focus on clear topic areas (for example, safeguarding or medicines optimisation), rather than these being spread across and within our KLOEs as they are now. Our new strategy has helped us to reduce duplication, and we have strengthened areas within the five key questions to align with it. For example:

  • safety is now stronger on safety cultures and learning
  • effective is looking more at how teams work together to improve standards
  • caring is focused on listening to the voices of people and also the wellbeing of the workforce
  • responsive will include equity of access, experience and outcomes
  • well-led will look at sustainability of services and the environment.

These are just a few of the changes we are making within the five key questions.

Our ambition is that this streamlined approach will help us all to understand what good quality care looks like in a way that supports providers and the system to drive improvement.

Flexible and consistent assessment

It’s also important that we’re clear how we’ll assess providers against our new quality statements, and what evidence we’ll use to make these assessments. And we know that it’s important to everyone that we’re as consistent as possible in this approach.

We also recognise that we will be able to collect evidence in different ways depending on the circumstances and will use the most appropriate method. Inspection activity remains a vital part of our model, but this is not the only way that we’ll gather evidence to contribute to assessments.

That’s why, in our new model, we will use a wider range of activities to gather evidence to assess quality. This means that when we do inspect, we will be able to maximise every minute we spend on site, focusing on the things that matter most, like speaking to people who use services and observing care. Some gathering of evidence just cannot be done without being there in person, such as observing how staff interact with people or understanding what the culture is like in a service.

To aid consistency, and to add structure and transparency, we will set out the evidence we will use each time we assess individual quality statements. This evidence will be grouped under six categories. Of course, we appreciate that the evidence we’ll need to see for a GP service, for example, will be completely different from that for a homecare service. So underneath the six evidence categories will be specific sets of evidence requirements for different service types. We want to be as specific and as clear as possible about what we are looking for, so that providers and system partners know what we will require of them. Underpinning all of this will be best practice standards and national guidance.

We are currently seeking feedback on the evidence we will use for specific services, so look out for more information on our online engagement platform.

Reporting to drive improvement

By adopting more flexible approaches to collecting evidence and assessing quality, we will be able to update ratings more regularly and publish reports that better reflect the quality provided at a particular point in time.

Our ambition is to always be able to give the most up-to-date view of quality possible.

We know how much our ratings are valued as a way of understanding the quality of a service, so we’ll keep them for the types of services we currently rate. But we’ll also explore, through engagement, how we might change the way we display ratings and how we describe what they mean in narrative reports. This could include ways for providers to benchmark themselves, for example by giving them access to how we’ve scored them against quality statements or types of evidence.

This will almost certainly mean shorter reports that focus on the information that the public, providers, and other stakeholders need. These changes will help us to be much clearer about what needs to change within a provider or system to improve the quality of care.

Working differently

To help us achieve these ambitions, we’ve also started to reorganise ourselves, creating a new Operations group and a Regulatory Leadership team. Our Chief Inspectors will lead the Regulatory Leadership team, with oversight of our regulatory policy, influencing external partners and the system to support quality improvement and innovation, and helping to raise standards across health and care.

The Operations group will deliver our regulatory activity, which is now divided into four geographical networks (North, Midlands, London & East of England, and South). We hope this will help us build a better picture of quality across an area. Within the networks, we will have teams with a mix of expertise and experience across different health and social care sectors, and with the specialist knowledge and skills to carry out the regulatory activity needed in that area. We’re still designing the detail of how this will work and will continue to update you.

Developing how we work in partnership

We know that all our plans will only work if we continue to develop them in partnership with stakeholders. So, over the coming months there will be opportunities to help shape our new model, including how we implement it. As part of this, we will soon be testing elements of our new approach with a small number of providers. This is to make sure it works, to improve it where needed, and to make sure we can offer the right level of support. Look out for opportunities to be involved in this work in our regular provider bulletins.

I wanted to finish by saying thank you again to all those who have already fed back their views. My colleague, Chris Day, recently reflected on the importance of co-production to developing the way we work. I’m incredibly grateful to the tens of thousands of people who have shared their views and helped us get to where we are. I look forward to hearing more feedback in future and working with you all to make sure our regulatory approach is fit for the future of health and social care.
