Victim service assessment – technical methodology 

Published on: 26 April 2021

Between April 2016 and February 2020, we carried out our rolling programme of crime data integrity (CDI) inspections in all 43 territorial forces in England & Wales measuring the accuracy of recorded crime.

We found that, in general, crime recording standards have improved. We found the combined recording accuracy for all reported crime (excluding fraud) in England & Wales was 90.3%; for violent offences it was 88.3%, and for sexual offences it was 94.0%[1]. This built on our 2014 thematic inspection, when we found that 80.5% of all reported crime, 66.9% of violent offences and 74.2% of sexual offences were correctly recorded.

However, too many victims of crime still do not receive the service they deserve. After the Audit Commission stopped auditing crime recording in 2004, our subsequent inspections and audits showed a gradual decline in crime recording accuracy. It is therefore important to continue inspecting forces to ensure standards are maintained and victims receive a proper service.

As such, from 2020 onwards, we continued our inspection activity into crime recording but with an expanded focus on the whole victim experience. Our victim services assessments (VSAs) will track a victim’s journey from reporting a crime to the police, through to a crime outcome being assigned. From 2023 onwards, this will focus on five areas:

  1. Call handling
  2. Deployment and response
  3. Crime recording
  4. Investigations
  5. Outcomes

Force selection

All forces will be subject to a VSA within our PEEL inspection programme, comprising stages 1, 2, 4 and 5. Some forces will additionally be tested on crime recording (stage 3), in a way that ensures every force is assessed on its crime recording practices at least once every three years.

All our VSAs will result in a narrative description of the victim services assessment within a force’s PEEL inspection report, and findings from the VSA will be considered in other questions. When we assess crime recording, we will give a graded judgment specifically for this area.

Methodology

For those forces being assessed on crime recording (stage 3), we will start by producing a statistically significant sample of the force’s crime reports, from which we will estimate the force’s recording accuracy.

Crime types

A statistically significant sample will be audited for three offence groups:

  • violence against the person;
  • sexual offences;
  • all other offences (excluding fraud).

The recording accuracies of these three offence groups will then be used to form the force recording accuracy for all crime (excluding fraud). This will involve weighting: first, to take account of the fact that samples rather than whole populations will be audited; and secondly, to take account of the fact that violence against the person and sexual offences make up a higher proportion of the sample than they do of recorded crime.
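As a sketch of this weighting step, the per-group accuracies can be combined using each group’s share of recorded crime. The group accuracies and shares below are invented figures for illustration, not audit results.

```python
# Illustrative weighting: combine per-group recording accuracies into an
# overall accuracy, weighted by each group's share of recorded crime.
# All figures below are invented for illustration, not audit results.

accuracy = {"violence": 0.883, "sexual": 0.940, "other": 0.903}

# Each group's assumed share of the force's recorded crime (excluding fraud)
share = {"violence": 0.35, "sexual": 0.05, "other": 0.60}

overall = sum(accuracy[g] * share[g] for g in accuracy)
print(f"Weighted overall accuracy: {overall:.1%}")
```

Weighting by recorded-crime share corrects for the deliberate over-sampling of violent and sexual offences in the audit.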

Police forces use different IT systems and routes for receiving crime reports. The main routes we will consider are incidents, directly recorded crime (DRC) and online reports of crime. All forces record some crime through incidents – predominantly from 999 calls. DRCs are crimes directly recorded without an incident record by a trained operator, typically in a crime management unit. Online reports of crime, reported for example through a force’s website, are typically recorded without an incident in a similar way to DRCs. For forces that create an incident based on an online report of crime, these will be considered as incidents. Not all forces use the DRC or online methods.

Forces also record small proportions of reported crimes through various other routes, for example from members of the public walking into a police station or by vulnerable victim departments. These will only be included if they make up a high proportion (at least 5%) of all recorded crime in the force and contain an auditable record (this will be decided on a force-by-force basis).

The samples for the three offence types will be split across these crime reporting routes according to the information about the proportion of crime reported through each route, as provided by the force.

Confidence intervals

The confidence interval provides an estimated range of values that the true recording accuracies are likely to fall within. For example, if an audit found that 85% of crimes were correctly recorded with a confidence interval of +/- 5%, we would be confident that between 80% and 90% of crimes during the time period were correctly recorded based on the parameters used.

We will apply the generally accepted 95 percent confidence level used in statistical tests. This means that were we to repeat the audit many times under the same conditions, we expect the confidence interval would contain the true population recording accuracy 95 times out of 100.

The audit aims to review a large enough sample to yield confidence intervals of no more than +/- 5% for each crime type (at the 95 percent confidence level) and +/- 3% for all crime (excluding fraud). Where the sample for any crime type (based on pre-audit estimates) is found during the audit not to be large enough to achieve this, the confidence interval can be extended to a maximum of +/- 5.4% if required. We will audit a minimum of 100 crime reports for each crime type, even if we achieve our required level of accuracy before this.
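A confidence interval of this kind can be sketched using the normal-approximation (Wald) interval for a proportion. The exact interval method is an assumption here (the methodology does not name one), and the counts in the example are invented.

```python
import math

def wald_interval(correct: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = correct / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

# e.g. 340 of 400 audited crime reports found to be correctly recorded
low, high = wald_interval(340, 400)
print(f"85% correct, 95% CI: {low:.1%} to {high:.1%}")
```

With 340 of 400 reports correct, the interval is roughly 81.5% to 88.5%, i.e. within the +/- 5% target width.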

Sampling of crime records

Each report of crime that we review can either be correctly recorded or not. Our aim, from a sample of all crime reports, is to estimate the recording accuracy, p, between 0 and 1, representing the proportion of crime reports correctly recorded by the force.

The binomial distribution provides us with the distribution of successes from a series of pass/fail tests. Under the binomial distribution, the estimated sample size (n) required is calculated as:

n = Z² × P × (1 − P) / I²

where:

Z ≈ 1.96 (Z score at the 95% confidence level)

P is the expected recording accuracy of the force

I is the confidence interval (+/- 5%).

The sample size is then adjusted to take account of the estimated population size (N) of the crime type in the force during the audit period, using the finite population correction:

n_adj = n / (1 + (n − 1) / N)

This adjusted sample size will then be split proportionally across incidents, and where applicable, directly recorded crimes, online reports of crime, or crimes from other routes.
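Under these definitions, the sample-size calculation and population adjustment can be sketched as follows. The expected accuracy and population figures are illustrative assumptions, not values from a real audit.

```python
import math

def sample_size(p: float, interval: float, z: float = 1.96) -> int:
    """Estimated binomial sample size: n = Z^2 * P * (1 - P) / I^2."""
    return math.ceil((z ** 2) * p * (1 - p) / interval ** 2)

def adjust_for_population(n: int, population: int) -> int:
    """Finite population correction for the crime type's population size N."""
    return math.ceil(n / (1 + (n - 1) / population))

n = sample_size(p=0.90, interval=0.05)             # expected accuracy 90%, +/- 5%
n_adj = adjust_for_population(n, population=1200)  # e.g. 1,200 crimes in period
print(n, n_adj)
```

Note how the correction shrinks the required sample when the crime type’s population in the audit period is small relative to the unadjusted sample size.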

Not all incidents generate a crime, and some generate multiple crimes. So, for practical reasons, we only review incidents where we would expect to find crimes – based on incident opening codes. Decisions on which opening codes are likely to contain crimes are made in conjunction with each force.

For incidents likely to contain crimes, we will set ratios of 1.4 violent incidents per violent crime, 1.2 sexual offence incidents per sexual offence crime and 1 other incident per other offence (excluding fraud), based on evidence from our audits between April 2016 and February 2020. For example, for every 140 incidents opened with a violence opening code, we would expect them to yield 100 notifiable violent crimes. We will therefore multiply the estimated sample sizes for crimes from incidents by these ratios to calculate the estimated number of incidents required.
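Applying these ratios is a simple multiplication; a minimal sketch, using the ratios stated above:

```python
import math

# Incident-to-crime ratios from the April 2016 - February 2020 audits
INCIDENT_RATIOS = {"violence": 1.4, "sexual": 1.2, "other": 1.0}

def incidents_required(crime_sample_size: int, offence_group: str) -> int:
    """Estimated number of incidents to review to yield the crime sample."""
    return math.ceil(crime_sample_size * INCIDENT_RATIOS[offence_group])

print(incidents_required(100, "violence"))  # 140 incidents for ~100 violent crimes
```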

To allow for the possibility that the recording accuracy is lower than estimated pre-audit, or that the incident-to-crime ratio is higher than assumed, we will prepare reserve records. For incidents, this will be based on a p value 5 percentage points lower than the expected value pre-audit. For directly recorded crimes, online reports of crime and crime reports from other routes, we will select an additional 10 records for the same reason.

Once calculated, the sample will be randomly selected from the force’s incident and crime records for the three-month period ending one complete calendar month prior to the commencement of the inspection.

Auditors will also examine dip samples on:

  • Antisocial behaviour (ASB)
    • 50 incidents closed as ASB personal
  • Cancelled crime decisions
    • 20 rape crime cancellations; commentary and feedback sent to the force
  • N100s
    • 20 reports of rape where the circumstances are insufficient to immediately record a crime of rape
  • Vulnerable victims
    • 25 vulnerable adult cases
    • 25 child protection cases
    • 20 referrals from partners or third parties

These dip samples allow us to ascertain the quality and consistency of crime recording decisions in these areas. The records will be taken in order, starting from the end of the audit period and working backwards no more than 12 months.

In circumstances where data or information is not available to allow us to follow the methodology stated above, we will amend our approach accordingly. In these circumstances, we will ensure our approach is as robust as possible and, if necessary, seek advice from the wider analytical community.

Case file review

To assess stages 1, 2, 4 and 5 we will review 10 recorded crime files for each of the following offences:

We will also review 20 recorded crime files that were assigned a specific crime outcome[2], chosen based on a risk assessment. In total we will review 100 crime files.

This will allow us to follow and assess an incident from initial contact through to the investigation stage and the application of a Home Office approved crime outcome (if applicable). Where there are insufficient records within the random sample to select these 100 files, we will choose the remaining files from the next available incidents within our sampling frame that match the requirements.

For forces not being assessed on crime recording (stage 3), we will randomly select the crime files from the force’s crime records for the three-month period ending one complete calendar month prior to the commencement of the inspection.

Diversity

As part of all of our reviews, we will assess whether the force collected and recorded diversity information (protected characteristics) from victims of crime in order to inform its compliance with its equality duty. Victims with different characteristics have different needs and the first step of an effective response to a victim’s different needs is to identify what protected characteristics they have.

Audit quality and validation – crime recording

The quality of audit decisions depends on the knowledge, experience, and skills of the auditors. All auditors will be required to attend an HMICFRS in-house course which will teach auditors the Home Office Counting Rules (HOCR) and auditing techniques used in the CDI programme. It will be overseen by the national crime registrar, who will validate the content of the course. Instruction will be provided by HMICFRS’s crime data specialist, who has passed the College of Policing force crime registrars’ (FCR) accreditation course.

To ensure consistency, the results of each audit will be subject to peer review by an independent expert. This peer reviewer will be an FCR, or deputy FCR, from another force who has also passed the College of Policing FCR accreditation course. In addition, forces will have the opportunity to review the audit decisions. We aim to resolve any issues with the force in the first instance, but if no agreement can be reached, then the matter will be passed to the national crime recording standards expert at HMICFRS for consideration in consultation with the national crime registrar. The ultimate decision on reconciliation of any disputed cases will rest with HMICFRS’s senior reporting officer for the inspection.

Audit quality and validation – case file review

All auditors have been accredited to professionalising investigations programme (PIP) level 3 as a minimum. They are required to attend an HMICFRS in-house course which teaches auditing techniques used in the case file review element of the programme. Instruction is provided by HMICFRS’s case file review specialists and subject matter experts.

Forces can review audit decisions through a structured resolution process. We aim to resolve any issues with the force in the first instance, but if no agreement can be reached, then the matter will be passed to the case file review Audit Coordinator at HMICFRS for consideration. The ultimate decision on reconciliation of any disputed cases will rest with HMICFRS’s senior reporting officer for the inspection.

[1] These figures have confidence intervals of: +/- 0.3% for all reported crime; +/- 0.4% for violent offences; and +/- 0.4% for sexual offences.

[2] From the following subset of police recorded crime outcomes: (2) Caution – youths, (3) Caution – adults, (8) Community Resolution, (10) Not in public interest (police decision), (14) Evidential difficulties: suspect not identified; victim does not support further action, (15) Evidential difficulties: suspect identified; victim supports action, (16) Evidential difficulties: suspect identified; victim does not support further action, (17) Prosecution time limit expired, (18) Investigation complete – no suspect identified, (21) Further investigation to support formal action not in the public interest (police decision), (22) Diversionary, educational or intervention activity, resulting from the crime report, has been undertaken and it is not in the public interest to take any further action.

Change log

26 March 2023: Updated to reflect methodology changes from 2023 onwards: removal of review of crime screening decisions, change to how crime outcomes are selected and rape crime cancellation commentary and feedback returned to forces.

26 April 2021: First published