RPC impact assessment survey results & turnaround times
October-December 2015

Introduction

The Regulatory Policy Committee (RPC) has been conducting a survey of individuals who have submitted impact assessments (IAs), validation statements and post-implementation reviews (PIRs) since Quarter 2 (Q2) of 2014. The survey aims to gather the views of those who write IAs and interact with the better regulation framework, including seeking opinions from the RPC. The RPC secretariat also records data internally on the processing of cases, including turnaround times and the ratings of submissions.

This paper sets out the RPC's perspective on the key themes emerging from the latest survey and internal data, provides specific information on the actions the RPC is taking in response to these themes, and provides an update on the actions taken in response to the key themes from the previous survey and set of internal data. It also gives a detailed breakdown of the latest survey results as well as RPC turnaround times.

The analysis in this paper focuses on Q4 2015, covering cases for which an opinion was issued to Departments in October, November or December 2015. Results are also presented using data from quarters going back to Q4 2014 to provide historical context to the latest results. Analysis is also presented for specific departments.

Structure of this document


1. The action plan
The RPC action plan forms the first part of this document. The aim is to set out
specific actions that are being taken, or will be taken, and how they will improve
customer experience in line with the priorities expressed by respondents to the
survey.

2. Results from the survey


We focus on four results from the survey which specifically relate to the RPC. These cover overall satisfaction, clarity of process, clarity of opinions and agreement with the RPC's interpretation of methodology. These results are presented over time and across departments. Additional analysis is provided on the links between these results and other variables, such as the colour of opinion received.

3. RPC turnaround times and output statistics


Finally, we present statistics on the RPC's outputs. This covers measures such as average turnaround times and the percentage of opinions that went late. In addition, we present the volume of cases the RPC has seen, split by type of submission, department and rating.

Executive summary

The results presented in this paper draw from the impact assessment (IA) survey and the RPC's own internal recording of the time taken for opinions to be issued. Care should be taken when giving weight to quarterly fluctuations, both positive and negative, as the sample considered is small, although the 78% response rate implies that it is at least representative.

Departmental satisfaction and the RPC's performance in issuing opinions have both remained at similar levels to previous periods. These levels remain high relative to their benchmarks.

Success in meeting turnaround time deadlines has marginally improved for cases that do not receive initial review notices (IRNs). The percentage of these cases going late has decreased from 4.9% to 2.6%, and average turnaround times marginally decreased from 22.2 days to 21.7 days this quarter.

Turnarounds are longer for cases which have IRNs issued. These cases take 25.4 days to turn around on average, and 14.3% of them had turnaround times longer than 30 days. This is, however, still significantly shorter than the 60-day maximum that the RPC used to have to produce a red and then a subsequent green opinion.

Clarity of process has recovered from 71% in the previous quarter and now stands at 84%. The previous dip likely reflected that BRUs and Departments were not effectively made aware of the changes to the fast-track process and the better regulation framework in advance of the previous survey.

Reported satisfaction and clarity of opinion are both similar to last quarter. Satisfaction has marginally fallen from 7.1 to 6.7 (out of a maximum of 10) and clarity of opinion has marginally risen from 78% to 79%.

The only clearly negative result is that agreement with the RPC's methodology decisions has fallen from 97% to 79%. While last quarter's percentage was the highest on record, this quarter's percentage is the lowest in over a year. This survey question clearly asks about respondents' agreement with the RPC's methodological comments; however, inevitably some respondents reply with comments on the methodology itself.

The percentage of cases receiving a 'not fit for purpose' rating (as initially submitted) remained the same as the previous quarter, at 20%. This is similar to the average over a longer period. Just under 16% of submissions received a red rating and just under 5% received an amber rating.

The rankings of Departments based on the percentage of fit-for-purpose submissions are relatively similar to those in the last update paper. DH and DECC remain strong while HMT shows room for improvement. However, there are some changes, such as BIS improving its performance.

These results suggest that the RPC needs to ensure that Departments are aware of the RPC's current expectations around key methodological areas. The free text comments implied that there was some unease with the system among Departments. They also showed that Departments want guidance on PIRs and a more proportionate approach from the RPC in relation to impacts that do not materially affect the analysis in submissions.

Action plan

Key themes emerging from the October-December (Q4) 2015 survey

We have identified these themes from the results of the October-December (Q4) 2015 survey and the comments made in the free text boxes. They represent the broad areas in which departments wish to see improvements.

Methodology. The percentage of respondents that agreed with the RPC's methodology decisions decreased from 97% last quarter to 79% in the most recent quarter. A number of survey comments also questioned the RPC's decisions on methodology issues. This may reflect a lack of awareness among Departments of the RPC's latest understanding of several methodological areas.

PIRs. Respondents who had completed PIRs stated that they would like to see additional guidance on the RPC's process and what factors it would take into account when rating PIRs. PIRs are a relatively new area for the RPC; however, we are expecting the number of PIR submissions to increase in 2016 and are starting to develop our case histories on them.

Proportionality and burdens. A number of respondents used the free text boxes to articulate a desire for a less burdensome framework and system. Departments have expressed related desires in past surveys; however, in this survey they particularly referred to a desire for a less burdensome approach in relation to very small impacts that do not materially affect analysis in submissions.

Key themes that emerged from the previous survey

These themes were identified from the results of the July-September (Q3) 2015 survey as well as the comments made in the free text boxes. They represent the broad areas in which departments wished to see improvements.

Proportionality and burdens. A number of respondents used the free text boxes to articulate a desire for a less burdensome framework and system, particularly for small value measures.

Clarity of process. The percentage of respondents agreeing that the RPC process was clear was the joint lowest on record last quarter. Free text comments indicated that this was partly, but not entirely, related to changes to the fast-track system.

Actions to be taken

The list of actions the RPC is taking on each of these themes in the current quarterly update paper is provided below.

Theme: Methodology

Action: Run classes at the BRU drop-in on complex methodology areas (such as direct/indirect and primary/secondary).
Impact: Build Departments' capacity so that more methodological issues can be resolved before IAs reach the RPC.
By: Monthly classes until May 2016.

Action: Upload case histories sections on complex areas.
Impact: Help Departments to understand the RPC's methodology and give them a guide to answering complex issues in IAs.
By: Monthly until May 2016.

Action: Arrange specific classes with, and take secondees from, Departments that appear to be having particular methodology issues.
Impact: Should improve the performance of the Departments that appear to be having the most methodological issues.
By: Currently being arranged with Departments.

Theme: PIRs

Action: Distribute PIRs guidance and case histories to Departments.
Impact: Help Departments to understand the RPC's methodology and give them a framework for PIRs.
By: April 2016.

Action: Contribute to BRE's suggested PIR template to ensure that it accurately reflects what the RPC looks for in PIRs.
Impact: Encourage Departments to address the areas the RPC looks for when assessing PIRs.
By: Ongoing (BRE's timescale).

Action: Work with BRE to try and ensure that small PIRs of EU origin don't need to be scrutinised by the RPC.
Impact: Reduce resource burden on Departments and the RPC in relation to small EU PIRs.
By: Ongoing (BRE's timescale).

Theme: Proportionality

Action: Clarify that the RPC only validates EANCBs to the nearest £100,000 and so does not require information in greater detail than this.
Impact: Reduces resource burden on Departments while maintaining the level of RPC scrutiny.
By: Completed.

Action: Decide whether to adopt a form of the IRN pilot process as the RPC's new permanent process.
Impact: If the RPC adopts this process it will reduce the amount of time needed for initially red-rated IAs to obtain green ratings (relative to the RPC issuing two opinions, and assuming that the Department addresses the issues the RPC raises in its IRN).
By: April 2016 (subject to negotiation with BRE).
Update on previous actions


An update on the actions the RPC has taken on the themes from the previous
quarterly update paper is provided below.

Theme: Communication of Process

Action: Uploading guidance on internal Departmental triaging and Initial Review Notices to the RPC portal and disseminating this news to BRUs.
Impact: Increase Departments' understanding of the new system, leading to a reduction in the amount of time spent clarifying issues.
Due by: Completed.

Action: Once process changes are known to the RPC, ensure that these are made clear to Departments via Better Regulation Unit catch-ups, Departmental meetings and the RPC portal.
Impact: Increase Departments' understanding of process changes before they happen, leading to a reduction in the amount of time spent clarifying issues.
Due by: When process changes occur.

Action: Develop RPC expected standards for key elements of assessments. In addition, identify examples of good practice.
Impact: Form one part of a new RPC guidance document to sit alongside the BRFM.
Due by: Completed (will be updated as new material becomes available).

Theme: Proportionality

Action: Work with BRE to reduce the number of times the RPC sees submissions. Introduce a fast track that doesn't require RTAs to be sent to the RPC.
Impact: Increase Departments' understanding of process changes before they happen, leading to a reduction in the amount of time spent clarifying issues.
Due by: When process changes occur.
Impact Assessment Survey Results

The RPC, in conjunction with BRE, conducts a survey of individuals who have submitted impact assessments. To date six surveys have been issued, covering seven quarters (Q2 2014 to Q4 2015).

This section of the performance paper outlines the results of the survey, focusing on four particular questions. These questions are presented under the headings of overall satisfaction, clarity of opinion, clarity of process and interpretation of methodology. The questions that underlie each of these measures are described in the sections that follow.

Summary of survey results

Fig. 1.0 Summary of survey responses (Q4 2014 to Q4 2015)

                      Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015  Agg.
Number of responses   46       27       29       35       35       172
Response rate         52%      43%      57%      50%      78%      54%
Satisfaction          6.7      7.0      7.0      7.1      6.7      6.9
Clarity of Opinion*   79%      96%      71%      78%      79%      80%
Clarity of Process*   84%      92%      93%      71%      84%      84%
Methodology*          84%      92%      82%      97%      79%      87%

*% answering "Yes completely" and "Yes to some extent"

Survey response composition

Response rate

Between Q4 2014 and Q4 2015 the survey was sent to 317 individuals, of which 172 responded, a response rate of 54%.

In the latest quarter we experienced our highest response rate to date, at 78%. The likely explanation for this extremely high response rate is that the RPC was conducting a number of surveys during this period, so the secretariat made a particular effort to communicate with Departments.

However, the number of responses this quarter was similar to previous quarters. This is because relatively few IAs were submitted to the RPC this quarter, and hence relatively few survey requests were sent out to Departments.

Fig 1.1 Response rates (Q4 2014 to Q4 2015)
[Chart: response rate by quarter - 52% (Q4 2014), 43% (Q1 2015), 57% (Q2 2015), 50% (Q3 2015), 78% (Q4 2015).]

Representativeness of sample

Across the whole sample (Q2 2014 to Q2 2015) we know that those who responded were more likely to have received a GREEN rating (at first submission) than the contacted group as a whole.

This pattern was also observed for Q4 2015. It was driven by a relatively small number of responses from those receiving red ratings (at first submission); there was actually a disproportionately large number of responses from those who received amber opinions.
Fig 1.2 RAG proportions for contacted and responded samples (Q4 2014 to Q4 2015)
[Chart: GREEN ratings account for 84% of the contacted sample and 89% of the responded sample; red ratings are under-represented and amber ratings over-represented among respondents.]

Survey Results

Below are the results from the latest survey, which covered Q4 2015, as well as a comparison of the key results over time.

Overall satisfaction

How would you rate your overall experience of the RPC? (Out of 10, discrete intervals of 1)

The average level of satisfaction has marginally fallen to 6.7. This level has remained relatively constant across surveys and there is no clear trend over time.

It is unlikely that the slight decrease this quarter is greatly informative. In particular, the result was heavily influenced by one extremely low rating of 1 out of 10 from an individual who received a 'not fit for purpose' opinion from the RPC.

A further breakdown of these results can be found in Annex section 1.4.

Fig 1.3 Distribution of satisfaction scores for Q4 2015
[Chart: number of responses by score on a 1-10 scale.]

Fig 1.4 Satisfaction scores over time
[Chart: average satisfaction score (scale 1-10) by quarter - 6.7 (Q4 2014), 7.0 (Q1 2015), 7.0 (Q2 2015), 7.1 (Q3 2015), 6.7 (Q4 2015).]

Selected quotes: "Do you have any further comments about the RPC (including its IA scrutiny process)?" Q4 2015 [asked after the satisfaction question]
Proportionality and Burdens:
The IA process is very time consuming - both writing the IA itself and then
waiting for the IA to go through RPC. I hope I don't need to do another one any
time soon!
Very much agree that if possible the 30 days followed by a red followed by a full
other 30 days should if possible be reduced. I heard they were trialing a 45 day
system whereby a red would be able to resubmit within 30 days, but haven't
heard anything more about that.
I think it would be helpful once submitted to hear back at some point if RPC
expect to meet the 30 day deadline rather than waiting and hoping.

Clarity of opinion

Did you understand what you needed to do to address the RPC's comments?

Answers: Yes completely, Yes to some extent, No and Not sure

The percentage of respondents answering "yes completely" or "yes to some extent" remained relatively constant at 79% in Q4 2015. This is similar to the average across past samples and there is no clear trend over time.

A further breakdown of these results can be found in Annex section 1.5.

Fig 1.5 Clarity of opinion results for Q4 2015
[Chart: Yes completely 42%, Yes to some extent 36%, Not sure 12%, No 9%.]

Fig 1.6 Clarity of opinion results over time
[Chart: % answering "yes completely" or "yes to some extent" by quarter - 79% (Q4 2014), 96% (Q1 2015), 71% (Q2 2015), 78% (Q3 2015), 79% (Q4 2015).]


Selected quotes (Q4 2015):
Positive:
Broadly we had an idea from the first RPC opinion what we needed to address but this was clarified when we had a meeting with the RPC to make sure the key
areas were covered, and what the RPC would expect us to have done in order to
get a 'green' - though there still remains some ambiguity, which may be
inevitable.
Methodology:

We (policy team) felt that we'd already addressed the comments identified by
the RPC in our initial submission. This may be because we were inexperienced in
drafting IA or, slightly controversially, the RPC saw a need to challenge albeit on,
in our opinion, slightly weak grounds.
The RPC comments were clear, but they were wrong in certain places - i.e.
misinterpreting the policy. Yields difficult situation where we have to assess
whether to refuse and explain why - risking a red - or just cave and do what we
think is wrong but path of least resistance for a green.
Proportionality and Burdens:
The comments were minor - and one was repeating a comment made at
consultation stage initial review notice, that the RPC had agreed had been
addressed. So this comment was spurious. It was also vague and unimportant to
the EANCB/NPV.

Clarity of process

Was the RPC process clear?

Answers: Yes or No

There has been an increase in the percentage of individuals who found the process clear. This has risen to 84% this period. This may represent a return to normality after the decline caused by the removal of the RTA system at relatively short notice in Q3 2015.
Fig 1.7 Clarity of process results over time
[Chart: % answering "Yes" by quarter - 84% (Q4 2014), 92% (Q1 2015), 93% (Q2 2015), 71% (Q3 2015), 84% (Q4 2015).]

If not, what was not clear about the process? (Q4 2015)
Post Implementation Reviews:
It would have been helpful if the RPC were clearer about what the exact scope of the Post
Implementation Review needed to be. It would have also been helpful if the RPC had committed
to a turnaround time. I did ask on a number of occasions. However, the RPC was not able to
confirm how long it would take for them to turnaround the PIR.
Proportionality and Burdens:
The process was overly complicated - there was little contact with the RPC, which made the
process inefficient (instead of another body - BRU / BRE - having to interpret how the RPC may
approach specific issues in IA). Though I appreciate the need to remain independent, how RPC
would treat IAs was very vague.
There should be more clarity on how the IA process fits into wider Govt processes.


Interpretation of methodology

There has been a decrease in the percentage of individuals who agreed with the RPC's interpretation of the methodology. This percentage has fallen to 79%.

This measure is volatile; however, it has fallen to its lowest level in over a year and so cannot be ignored.

The RPC's understanding of several complex methodological areas (such as direct/indirect, primary/secondary and SaMBA) has recently evolved. It is therefore possible that Departments do not yet fully comprehend these changes.

Did you agree with the RPC's comments in the issued opinion?

Answers: Yes completely, Yes to some extent, No and Not sure
Fig 1.8 Interpretation of methodology results for Q4 2015
[Chart: Yes completely 48%, Yes to some extent 30%, No 18%, Not sure 3%.]

Fig 1.9 Interpretation of methodology results over time
[Chart: % answering "yes completely" or "yes to some extent" by quarter - 84% (Q4 2014), 92% (Q1 2015), 82% (Q2 2015), 97% (Q3 2015), 79% (Q4 2015).]

Selected quotes (Q4 2015):


Methodology:
Costs - especially meaningful costs with an acceptable level of accuracy - very often do not
exist.
Proportionality and Burdens:
The level of quantification and detail seemed disproportionate.
The RPC approach of driving for almost everything to be costed militates against a
proportionate approach which is necessary to make prudent use of taxpayers' money.


Breakdown by department (Q4 2014 to Q4 2015)

There is significant variation between the responses of different Departments to the survey.

The combined score gives equal weighting to each of the four survey results. This measure demonstrates that DCLG has a far more negative view of the RPC than any other Department: its combined score for the RPC is only 5.9, whereas the next lowest score is 7.4.
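The paper does not state explicitly how the four measures are combined onto a single 0-10 scale. Below is a minimal sketch, assuming each percentage measure is rescaled to a score out of 10 before the equal-weighted average is taken; this assumption reproduces the published figures in Fig 3.6 (for example, DCLG: (5.6 + 3.0 + 8.0 + 7.0) / 4 = 5.9).

```python
def combined_score(satisfaction, opinion_clarity, process_clarity, methodology):
    """Equal-weighted combined score on a 0-10 scale.

    Satisfaction is already out of 10; the three percentage measures are
    given as proportions (e.g. 0.30 for 30%) and rescaled to 0-10. The
    rescaling is an assumption consistent with the published figures.
    """
    return (satisfaction
            + 10 * opinion_clarity
            + 10 * process_clarity
            + 10 * methodology) / 4

# DCLG inputs from Fig 3.6: satisfaction 5.6, opinion clarity 30%,
# process clarity 80%, methodology 70%
print(round(combined_score(5.6, 0.30, 0.80, 0.70), 1))  # 5.9
```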
Fig 1.10 Results for all four measures split by department (Q4 2014 to Q4 2015 inclusive)
[Chart: combined scores by department, in descending order - HO, DH, DECC, DEFRA, HSE, BIS, DCMS, DWP, DfE, DfT, HMT, DCLG.]

*A department is excluded if there are fewer than 5 responses recorded for each of the four questions.

Further analysis

Satisfaction

One question to ask of the results, particularly overall satisfaction, is whether they vary by the rating received and, if they do, whether some of the trends can be explained by changes in the proportions of green versus red opinions in the sample.
Responses by RAG rating
Fig 1.11 Number of responses to the satisfaction question (Q4 2014 to Q4 2015)

             Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015  Agg.
Green        41       22       25       35       31       154
Not Green    5        5        4        1        4        19
% Not Green  11%      19%      14%      3%       11%      11%

The proportion of respondents to the question "How would you rate your overall experience of the RPC?" who received a 'not fit for purpose' opinion has varied between 3% and 19%.

The evidence presented below suggests that, on average, individuals who received red or amber opinions report lower satisfaction with the RPC. However, this result is based on small sample sizes.

Fig 1.12 Average satisfaction scores by RAG rating of opinion received (Q4 2014 to Q4 2015)
[Chart: average score (Green) - 6.9, 7.1, 7.0, 7.2, 7.0, with an aggregate of 7.0; average score (Red/Amber) - 4.3, 6.4, 7.0, 2.0, 4.0, with an aggregate of 5.3. Full figures in Annex Fig 3.2.]

By looking at the change in satisfaction for those receiving green and non-green opinions separately, we can see that there appears to be very little clear change in satisfaction in the latest period. The overall decrease in satisfaction may well be due to a higher proportion of responses from individuals who received 'not fit for purpose' opinions this quarter, due to sampling variation.
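This compositional effect can be checked directly: the overall average is the response-weighted mean of the green and non-green averages, using the counts in Fig 1.11 and the scores in Annex Fig 3.2. A minimal sketch:

```python
# Q4 2015 figures: 31 green responses averaging 7.0 and 4 not-green
# responses averaging 4.0 (Fig 1.11 and Fig 3.2)
green_n, green_avg = 31, 7.0
other_n, other_avg = 4, 4.0

overall = (green_n * green_avg + other_n * other_avg) / (green_n + other_n)
print(round(overall, 1))  # 6.7, matching the reported Q4 2015 satisfaction
```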

Engagement

Another question we can attempt to answer is whether engaging with the RPC before submission results in improved outcomes for departments.

Departments were asked "did you engage with the RPC before submitting?" and were given the options of i) No, ii) Yes - by email, iii) Yes - by phone, and iv) Yes - we met in person.

Below we present the percentage of respondents who received a green opinion, conditional on providing each of the above answers. This covers the period Q4 2014 to Q4 2015, i.e. all the periods for which we have data.
Fig 1.13 Engagement with the RPC and RAG rating received (Q4 2014 to Q4 2015)

                Responses  % Green
Did not engage  87         89%
Engaged         74         89%

The above shows that the percentage of respondents receiving a green does not vary between those that engaged with the RPC and those who did not.

It seems reasonable to hypothesise that individuals with complex or controversial IAs are more likely to seek engagement with the RPC, and that those types of IAs are also more likely to receive 'not fit for purpose' opinions. Therefore we can, very tentatively, conclude that engagement with the RPC reduces the chances of receiving a 'not fit for purpose' opinion. Since this effect appears to be balancing out the countervailing effect of having a complex or controversial case, the RPC's pre-submission meetings appear to be serving their purpose well.
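For reference, a minimal sketch of the conditioning used here, with hypothetical case records; the data below is illustrative only, not the RPC's actual dataset:

```python
from collections import defaultdict

# Hypothetical (engagement answer, rating received) records
cases = [
    ("No", "Green"), ("Yes - by email", "Green"),
    ("Yes - by phone", "Red"), ("No", "Green"),
    ("Yes - we met in person", "Green"), ("No", "Red"),
]

totals = defaultdict(int)
greens = defaultdict(int)
for answer, rating in cases:
    group = "Did not engage" if answer == "No" else "Engaged"
    totals[group] += 1
    greens[group] += rating == "Green"

# % green conditional on engagement, as in Fig 1.13
for group, n in totals.items():
    print(f"{group}: {greens[group] / n:.0%} green")
```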

RPC ratings & turnaround times

The RPC's ratings and turnaround times data are covered below. This data is for Q4 2015 and includes comparisons of the key results over time.

Summary table

The results for Q4 2015 are fairly positive. The percentage of cases that didn't receive IRNs going late has decreased from 4.9% to 2.6%, and average turnaround times marginally decreased from 22.2 days to 21.7 days this quarter. The RPC is therefore clearly meeting its target to process these cases in fewer than 30 days at least 90% of the time.

However, the results are far less positive for cases that received IRNs. Turnaround times for these cases increased from 22.0 days last quarter to 25.4 days this quarter.

Overall, average turnaround times in the latest quarter, at 22.3 days, and the percentage of cases with turnaround times over 30 days, at 4.4%, are very similar to the results from the last quarter and the average over the period considered. This result is encouraging as it shows that the RPC is maintaining an average turnaround time clearly below its 30-day aim. This may, in part, be due to the relatively low caseload.

The percentage of cases that were rated as 'not fit for purpose' has also remained constant at the relatively high level of 20%.
Fig. 2.0 Summary of RPC ratings and turnaround times (Q4 2014 to Q4 2015)

         Volume of  % Late      Average days  Average days  % Not fit
         cases      (non-IRNs)  (non-IRNs)    (IRNs)        for purpose
Q4 2014  101        4.0%        20.9          NA            18%
Q1 2015  65         7.7%        23.1          NA            14%
Q2 2015  48         14.0%       27.3          28.3          19%
Q3 2015  50         4.9%        22.2          22.0          20%
Q4 2015  45         2.6%        21.7          25.4          20%
Agg.     309        6.8%        22.6          25.3          18%

Opinions issued

The volume of cases the RPC dealt with over Q4 2015 was low compared to past standards, although it is similar to Q2 and Q3 2015.

The RPC expects to see a significant increase in cases in 2016 as a result of PIRs falling due, regulators beginning to submit cases and increased departmental activity as we enter the second calendar year of the new parliament.

Fig 2.1 Number of opinions issued in each quarter (Q4 2014 to Q4 2015)
[Chart: opinions issued, split into IA Consultation, IA Final, EANCB and PIR, with quarterly totals of 101, 65, 48, 50 and 45.]

*The RPC also used to (and still occasionally does) scrutinise regulatory triage assessments. However, as these are no longer submitted to the RPC as a matter of course, they have been excluded.

Overall turnaround times and % late

Average turnaround times in the latest quarter, at 22.3 days, are very similar to the results from the last quarter and the average over the period considered. This result is encouraging as it shows that the RPC is maintaining an average turnaround time clearly below its 30-day aim. However, this may increase in 2016 if the expected increase in caseload is realised.

The percentage of cases going late has also remained similar to the previous quarter, at 4.4%. The RPC is clearly meeting its aim of fewer than 10% of cases going late.

However, again, this may become more of an issue in 2016, especially as a relatively small number of cases going late can have a significant effect on the percentage of cases late, as the spike in the percentage of late cases in Q2 2015 shows. A short illustration of this sensitivity is given below.
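A minimal sketch of that sensitivity, using the quarterly case volumes from Fig 2.0: with around 45-50 cases a quarter, a single late case moves the late percentage by roughly two percentage points.

```python
# Quarterly case volumes from Fig 2.0
volumes = {"Q4 2014": 101, "Q1 2015": 65, "Q2 2015": 48,
           "Q3 2015": 50, "Q4 2015": 45}

# Effect of one additional late case on the quarterly % late
for quarter, n in volumes.items():
    print(f"{quarter}: one late case shifts the late rate by {1 / n:.1%}")
```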
Fig 2.2 Average turnaround times and % of cases late (Q4 2014 to Q4 2015)
[Chart: average turnaround time (days) and % of cases late (>30 days) by quarter.]

Turnaround times and % late of IRNs

One question that has recently arisen in the RPC is whether turnaround times of cases are longer if an initial review notice (IRN) is issued on a case.

In this quarter it appears that this has been the case. On average, cases that received IRNs were turned around in 25.4 days, whereas cases which didn't receive IRNs were turned around in 21.7 days.

In addition, 14.3% of cases that received IRNs had turnaround times longer than 30 days, whereas only 2.6% of cases without an IRN did. However, the sample sizes are very small: only a single case which received an IRN (one of seven) and a single case which didn't receive an IRN (one of 38) were late.

The RPC is currently considering its turnaround time deadlines for cases that receive IRNs as part of the evaluation and decision on whether to continue the IRN process.
Fig. 2.3 Summary table for cases with and without IRNs (Q4 2015)

                      Number of cases  % over 30 days  Average days
Cases with an IRN     7                14.3%           25.4
Cases without an IRN  38               2.6%            21.7
All cases             45               4.4%            22.3

Fig. 2.4 Turnaround times and % late for cases with and without IRNs
[Chart: average days and % with turnaround times over 30 days, for cases with and without IRNs.]

Red, Amber, Green ratings

The percentage of first-time submissions receiving red or amber rated opinions, as initially submitted, has marginally increased this quarter to 20%. This may partly reflect that amber opinions have been used more in practice. The increased use of amber opinions may also explain the slight decline in the number of red opinions issued.

Fig 2.5 % of submissions that received a Red or Amber (or Red/any, Amber/any) rating at first submission (Q4 2014 to Q4 2015)
[Chart: % rated 'not fit for purpose' at first submission, split into Red and Amber, by quarter.]

*This data is based on first-time submissions to enable meaningful comparisons to be made between the periods before and after the introduction of IRNs. Therefore the % of opinions rated as 'not fit for purpose' is not the same in this graph as in the summary table above (which is based on all submissions).

Ratings of different types of submissions

The percentage of submissions receiving fit-for-purpose ratings does not seem to vary between different types of submission. Although 100% of PIRs received fit-for-purpose ratings, this is not hugely meaningful as it is based on a sample size of two.

Fig 2.6 % of each submission type that received a fit-for-purpose opinion (Q4 2014 to Q4 2015)
[Chart: % fit for purpose and number of cases for IA Consultation, IA Final, EANCB and PIR submissions.]

Departmental ratings

The sample size is insufficient to meaningfully break the ratings down by department for individual quarters, so departmental breakdowns must be conducted over a number of periods. This paper considers data from Q4 2014 to Q4 2015 inclusive, in line with the approach last quarter.

The rankings are relatively similar to those in the last update paper. Departments such as DH and DECC remain strong while HMT shows room for improvement.

Many of the changes in position between middle-ranked Departments are not hugely meaningful; for instance, HO has fallen from 6th to 10th on the basis of a one percentage point reduction in fit-for-purpose opinions. This reinforces that caution should be taken in attaching weight to the results below, as the sample sizes for individual Departments, even over multiple quarters, are small and the results are, consequently, volatile.

One of the few Departments whose ranking has meaningfully changed is BIS. The percentage of its submissions rated fit for purpose has increased from 80% to 85% as a result of very strong performance in Q4 2015.

Fig 2.7 % of each Department's submissions that received a fit-for-purpose opinion (Q4 2014 to Q4 2015 inclusive)
[Chart: % fit for purpose and number of cases by department.]

*Departments with fewer than 5 submissions in the period have been excluded.

Annex

Fig. 3.1 Number of responses by question and quarter (Q4 2014 to Q4 2015)

                               Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015
Overall satisfaction           41       24       27       35       32
Clarity of Opinion             43       26       28       36       33
Clarity of Process             43       26       29       35       32
Interpretation of Methodology  43       26       28       36       33
Average                        42.5     25.5     28       35.5     32.5

Fig. 3.2 Average satisfaction score by quarter (Q4 2014 to Q4 2015)

                           Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015  Agg.
Average Score              6.7      7.0      7.0      7.1      6.7      6.9
Average Score (Green)      6.9      7.1      7.0      7.2      7.0      7.0
Average Score (Not Green)  4.3      6.4      7.0      2.0      4.0      5.3

Fig. 3.3 Clarity of opinion by quarter (Q4 2014 to Q4 2015)

                          Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015  Agg.
Not sure                  12%      0%       21%      19%      12%      13%
No                        9%       4%       7%       3%       9%       7%
Yes to some extent        16%      42%      25%      28%      36%      28%
Yes completely            63%      54%      46%      50%      42%      52%
Yes + Yes to some extent  79%      96%      71%      78%      79%      80%

Fig. 3.4 Clarity of Process (Q4 2014 to Q4 2015)

     Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015
Yes  84%      92%      93%      71%      84%
No   16%      8%       7%       29%      16%

Fig. 3.5 Interpretation of methodology results by quarter (Q4 2014 to Q4 2015)

                          Q4 2014  Q1 2015  Q2 2015  Q3 2015  Q4 2015
Yes completely            53%      50%      57%      58%      48%
Yes to some extent        30%      42%      25%      39%      30%
No                        12%      4%       0%       0%       18%
Not sure                  5%       4%       18%      3%       3%
Yes + Yes to some extent  84%      92%      82%      97%      79%

Fig. 3.6 Departmental results using all data (Q4 2014 to Q4 2015)

Department  Satisfaction  Opinion clarity  Process clarity  Methodology  Combined
HO          6.6           100%             100%             100%         9.2
DH          7.4           89%              100%             89%          8.8
DECC        6.7           92%              83%              100%         8.6
DEFRA       7.7           76%              93%              94%          8.5
HSE         8.0           90%              90%              80%          8.5
BIS         7.1           82%              79%              87%          8.0
DCMS        5.4           100%             60%              100%         7.9
DWP         5.2           100%             60%              100%         7.8
DfE         5.2           80%              100%             80%          7.8
DfT         6.9           71%              91%              71%          7.6
HMT         6.6           60%              80%              90%          7.4
DCLG        5.6           30%              80%              70%          5.9

*A department is excluded if there are fewer than 5 responses recorded for each of the four questions.
Fig. 3.7 Quantity of submission types (Q4 2014 to Q4 2015)

         IA Consultation  IA Final  EANCB  PIR  Total
Q4 2014  28               34        39     0    101
Q1 2015  13               30        22     0    65
Q2 2015  9                10        29     0    48
Q3 2015  18               20        12     0    50
Q4 2015  14               17        12     2    45

Fig. 3.8 % not fit for purpose by submission type (Q4 2014 to Q4 2015)

         IA Consultation  IA Final  EANCB  PIR  Total
Q4 2014  29%              18%       10%    NA   18%
Q1 2015  15%              20%       5%     NA   14%
Q2 2015  33%              50%       3%     NA   19%
Q3 2015  44%              10%       0%     NA   20%
Q4 2015  21%              24%       17%    0%   20%

Fig. 3.9 % late and average turnaround times (Q4 2014 to Q4 2015)

         % Late  Days
Q4 2014  4.0%    20.9
Q1 2015  7.7%    23.1
Q2 2015  16.7%   27.0
Q3 2015  4.0%    22.5
Q4 2015  4.4%    22.3

Fig. 3.10 Percentage of first-time submissions rated fit for purpose by Department. Figures in brackets give the total number of first-time submissions (Q4 2014 to Q4 2015)

       Q4 2014   Q1 2015   Q2 2015   Q3 2015   Q4 2015   Total
DECC   100% (7)  100% (1)  100% (4)  100% (7)  100% (5)  100% (24)
DH     80% (10)  100% (7)  100% (3)  100% (5)  75% (4)   90% (29)
DWP    100% (2)  100% (3)  -         67% (3)   -         88% (8)
BIS    84% (19)  91% (11)  88% (8)   80% (30)  88% (16)  85% (84)
DfT    93% (15)  70% (10)  100% (7)  83% (6)   50% (6)   82% (44)
DEFRA  67% (9)   75% (8)   100% (4)  100% (6)  86% (7)   82% (34)
DCMS   100% (3)  100% (1)  67% (3)   67% (3)   100% (1)  82% (11)
HSE    67% (6)   100% (2)  100% (1)  100% (1)  100% (1)  82% (11)
CLG    86% (7)   75% (8)   71% (7)   100% (5)  -         81% (27)
HO     88% (8)   -         80% (5)   80% (5)   0% (1)    79% (19)
DfE    100% (2)  100% (1)  50% (4)   100% (2)  -         78% (9)
HMT    83% (12)  67% (9)   71% (7)   70% (10)  67% (3)   73% (41)

*Departments with fewer than 5 submissions in the period have been excluded.
Fig. 3.11 Turnaround times for non-IRN cases (Q4 2014 to Q4 2015)

         Number of cases  % over 30 days  Average days
Q4 2014  101              4.0%            20.9
Q1 2015  65               7.7%            23.1
Q2 2015  43               14.0%           27.3
Q3 2015  41               4.9%            22.2
Q4 2015  38               2.6%            21.7

Fig. 3.12 Turnaround times for IRN cases (Q4 2014 to Q4 2015)

         Number of cases  % over 30 days  Average days
Q4 2014  0                NA              NA
Q1 2015  0                NA              NA
Q2 2015  3                66.7%           28.3
Q3 2015  3                0.0%            22.0
Q4 2015  7                14.3%           25.4
