Impact assessment survey results & turnaround times, October-December 2015
Introduction
The Regulatory Policy Committee (RPC) has been conducting a survey of individuals who have submitted impact assessments (IAs), validation statements and post-implementation reviews (PIRs) since Quarter 2 (Q2) of 2014. The survey aims to gather the views of those who write IAs and interact with the better regulation framework, including by seeking opinions from the RPC. The RPC secretariat also records data internally on the processing of cases, including turnaround times and the ratings of submissions.
This paper sets out the RPC's perspective on the key themes emerging from the latest survey and internal data, details the actions the RPC is taking in response to these themes, and provides an update on the actions taken in response to the key themes from the previous survey and set of internal data. It also gives a detailed breakdown of the latest survey results as well as RPC turnaround times.
The analysis in this paper focuses on Q4 2015, covering cases for which an opinion was issued to Departments in October, November or December 2015. Results are also presented using data from quarters going back to Q4 2014, to provide historical context to the latest results, and for specific departments.
Executive summary
The results presented in this paper draw on the impact assessment (IA) survey and the RPC's own internal recording of the time taken for opinions to be issued. Care should be taken when giving weight to quarterly fluctuations, both positive and negative, as the sample considered is small, although the 78% response rate implies that it is at least representative.
Departmental satisfaction and the RPC's performance in issuing opinions have both remained at similar levels to previous periods. These levels remain high relative to their benchmarks.
Success in meeting turnaround time deadlines has marginally improved for cases that do not receive initial review notices (IRNs). The percentage of these cases going late decreased from 4.9% to 2.6%, and average turnaround times marginally decreased from 22.2 days to 21.7 days this quarter.
Turnarounds are longer for cases that have IRNs issued. These cases take 25.4 days on average to turn around, and 14.3% of them had turnaround times longer than 30 days. This is, however, still significantly shorter than the 60-day maximum that the RPC used to have to produce a red and then a subsequent green opinion.
Clarity of process has recovered from 71% in the previous quarter and now stands at 84%. The earlier dip is likely to reflect that BRUs and Departments were not effectively made aware of the changes to the fast-track process and the better regulation framework in advance of the previous survey.
Reported satisfaction and clarity of opinion are both similar to last quarter.
Satisfaction has marginally fallen from 7.1 to 6.7 (out of a maximum of 10) and
clarity of opinion has marginally risen from 78% to 79%.
The only clearly negative result is that agreement with the RPC's methodology decisions has fallen from 97% to 79%. While last quarter's percentage was the highest on record, this quarter's percentage is the lowest in over a year. This survey question clearly asks about respondents' agreement with the RPC's methodological comments; inevitably, however, some respondents reply with comments on the methodology itself.
The percentage of cases receiving a 'not fit for purpose' rating (as initially submitted) remained the same as the previous quarter, at 20%. This is similar to the average over a longer period. Just under 16% of submissions received a red rating and just under 5% received an amber rating.
The rankings of Departments based on the percentage of fit for purpose submissions are relatively similar to those in the last update paper. DH and DECC remain strong, while HMT shows room for improvement. However, there are some changes, such as BIS improving its performance.
These results suggest that the RPC needs to ensure that Departments are aware of the RPC's current expectations around key methodological areas. The free text comments implied that there was some unease with the system among Departments. They also showed that Departments want guidance on PIRs and a more proportionate approach from the RPC in relation to impacts that do not materially affect the analysis in submissions.
Action plan
Key themes emerging from the October-December (Q4) 2015
survey
We have identified these themes from the results of the October-December (Q4)
2015 survey and the comments made in the free text boxes. They represent the
broad areas in which departments wish to see improvements.
PIRs. Respondents who had completed PIRs stated that they would like to see additional guidance on the RPC's process and the factors it takes into account when rating PIRs. PIRs are a relatively new area for the RPC; however, we expect the number of PIR submissions to increase in 2016 and are starting to develop our case histories on them.
Actions to be taken
The list of actions the RPC is taking on each of these themes in the current
quarterly update paper is provided below.
Theme: PIRs / Methodology
Action: Monthly classes with Departments (currently being arranged), monthly until May 2016
Impact: Help Departments to understand the RPC's methodology and give them a framework for PIRs

Theme: PIRs
Action: Encourage Departments to address the areas the RPC looks for when assessing PIRs
Due by: Ongoing (BRE's timescale)

Theme: Proportionality
Due by: April 2016 (subject to negotiation with BRE)

Theme: Communication of process
Action: Uploading guidance on internal Departmental triaging and Initial Review Notices to the RPC portal and disseminating this news to BRUs
Impact: Increase Departments' understanding of the new system, leading to a reduction in the time spent clarifying issues
Due by: Completed (will be updated as new material becomes available)

Theme: Communication of process
Impact: Increase Departments' understanding of process changes before they happen, leading to a reduction in the time spent clarifying issues
Due by: When process changes occur
2015
2015
2015
2015
Agg.
Q4
Q1
Q2
Q3
Q4
Number
of
Respons
es
46
27
29
35
35
172
Respons
e rate
52%
43%
57%
50%
78%
54%
Satisfac
tion
6.7
7.0
7.0
7.1
6.7
6.9
Clarity
of
Opinion
79%
96%
71%
78%
79%
80%
Clarity
of
Process
84%
92%
93%
71%
84%
84%
Method
ology
84%
92%
82%
97%
79%
87%
[Chart: response rate by quarter: 52% (2014 Q4), 43% (2015 Q1), 57% (2015 Q2), 50% (2015 Q3), 78% (2015 Q4)]
Representativeness of sample
Across the whole sample (Q2 2014 to Q2 2015) we know that those who responded were more likely to have received a GREEN rating (at first submission) than the contacted group as a whole.

This pattern was also observed for Q4 2015. It was driven by a relatively small number of responses from those receiving red ratings (at first submission); there was actually a disproportionately large number of responses from those who received amber opinions.
Fig 1.2 RAG proportions for contacted and responded samples for (Q4
2014- Q4 2015)
[Chart: contacted sample: 84% green, 2% amber, 13% red; responded sample: 89% green, 9% amber, 3% red]
Survey Results
Below are the results from the latest survey, which covered the period Q4 2015,
as well as a comparison of the key results over time.
Overall satisfaction
How would you rate your overall experience of the RPC? (Out of 10,
discrete intervals of 1)
The average level of satisfaction has marginally fallen to 6.7. This level has remained relatively constant across surveys and there is no clear trend over time.

It is unlikely that the slight decrease this quarter is greatly informative. In particular, the result was heavily influenced by one extremely low rating of 1 out of 10 from an individual who received a 'not fit for purpose' opinion from the RPC.
A further breakdown of these results can be found in Annex section 1.4.
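The pull of a single outlier on a mean of this sample size can be sketched with a quick calculation (illustrative numbers only: the 31 stand-in scores below are not the actual survey responses):

```python
# Illustrative only: 31 stand-in scores of 7 approximate the rest of the
# sample; the actual distribution of responses is not reproduced here.
scores = [7] * 31
mean_before = sum(scores) / len(scores)

scores.append(1)  # the single 1-out-of-10 rating described above
mean_after = sum(scores) / len(scores)

print(round(mean_before, 2))  # 7.0
print(round(mean_after, 2))   # 6.81: one response shifts the mean by ~0.2
```

With a sample of about 32 responses, a single extreme score moves the quarterly average by roughly two tenths of a point, which is comparable to the quarter-on-quarter changes reported above.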
[Chart: average satisfaction score by quarter (scale 1-10): 6.7 (2014 Q4), 7.0 (2015 Q1), 7.0 (2015 Q2), 7.1 (2015 Q3), 6.7 (2015 Q4)]
Selected quotes: Do you have any further comments about the RPC
(including its IA scrutiny process)? Q4 2015 [Asked after satisfaction
question]
Proportionality and Burdens:
The IA process is very time consuming - both writing the IA itself and then
waiting for the IA to go through RPC. I hope I don't need to do another one any
time soon!
Very much agree that if possible the 30 days followed by a red followed by a full
other 30 days should if possible be reduced. I heard they were trialing a 45 day
system whereby a red would be able to resubmit within 30 days, but haven't
heard anything more about that.
I think it would be helpful once submitted to hear back at some point if RPC
expect to meet the 30 day deadline rather than waiting and hoping.
Clarity of opinion
Did you understand what you needed to do to address the RPC's comments?
Answers: Yes completely, Yes to some extent, No and Not sure
The percentage of respondents answering 'yes completely' or 'yes to some extent' remained relatively constant at 79% in Q4 2015.
This is similar to the average across past samples and there is no clear trend
over time.
A further breakdown of these results can be found in Annex section 1.5.
Fig 1.5 Clarity of opinion results for Q4 2015
[Chart: 'Yes completely' 42%, 'Yes to some extent' 36%, 'Not sure' 12%, 'No' 9%]

[Chart: % answering 'yes completely' or 'yes to some extent' over time: 79% (2014 Q4), 96% (2015 Q1), 71% (2015 Q2), 78% (2015 Q3), 79% (2015 Q4)]
We (policy team) felt that we'd already addressed the comments identified by
the RPC in our initial submission. This may be because we were inexperienced in
drafting IA or, slightly controversially, the RPC saw a need to challenge albeit on,
in our opinion, slightly weak grounds.
The RPC comments were clear, but they were wrong in certain places - i.e.
misinterpreting the policy. Yields difficult situation where we have to assess
whether to refuse and explain why - risking a red - or just cave and do what we
think is wrong but path of least resistance for a green.
Proportionality and Burdens:
The comments were minor - and one was repeating a comment made at
consultation stage initial review notice, that the RPC had agreed had been
addressed. So this comment was spurious. It was also vague and unimportant to
the EANCB/NPV.
Clarity of process
Was the RPC process clear?
Answers: Yes or No
There has been an increase in the % of individuals who found the
process clear. This has risen to 84% this period.
This may represent a return to normality after the decline caused by the removal
of the RTA system at relatively short notice in Q3 2015.
Fig 1.7 Clarity of process results over time
[Chart: % answering 'Yes': 84% (2014 Q4), 92% (2015 Q1), 93% (2015 Q2), 71% (2015 Q3), 84% (2015 Q4)]
If not, what was not clear about the process? (Q4 2015)
Post Implementation Reviews:
It would have been helpful if the RPC were clearer about what the exact scope of the Post
Implementation Review needed to be. It would have also been helpful if the RPC had committed
to a turnaround time. I did ask on a number of occasions. However, the RPC was not able to
confirm how long it would take for them to turnaround the PIR.
Proportionality and Burdens:
The process was overly complicated - there was little contact with the RPC, which made the
process inefficient (instead of another body - BRU / BRE - having to interpret how the RPC may
approach specific issues in IA). Though I appreciate the need to remain independent, how RPC
would treat IAs was very vague.
There should be more clarity on how the IA process fits into wider Govt processes.
Interpretation of methodology
There has been a decrease in the % of individuals who agreed with the RPC's interpretation of the methodology. This percentage has fallen to 79%.

This measure is volatile; however, it has fallen to the lowest level on record and so cannot be ignored.

The RPC's understanding of several complex methodological areas (such as direct/indirect, primary/secondary and SaMBA) has recently evolved. It is therefore possible that Departments do not yet fully comprehend these changes.
Did you agree with the RPC's comments in the issued opinion?
Answers: Yes completely, Yes to some extent, No and Not sure
Fig 1.8 Interpretation of methodology results for Q4 2015
[Chart: 'Yes completely' 48%, 'Yes to some extent' 30%, 'No' 18%, 'Not sure' 3%]
[Chart: % answering 'yes completely' or 'yes to some extent' over time: 84% (2014 Q4), 92% (2015 Q1), 82% (2015 Q2), 97% (2015 Q3), 79% (2015 Q4)]
[Chart: survey results by department: DH, BIS, DCMS, DWP, DfE, DfT, HMT, DCLG]
*A department is excluded if there are fewer than 5 responses recorded for each of the four questions.
Further analysis
Satisfaction
One question to ask of the results, particularly overall satisfaction, is whether they vary by the rating received and, if they do, whether some of the trends can be explained by changes in the proportions of green versus red opinions in the sample.
Responses by RAG rating
Fig 1.11 Number of Responses to satisfaction question (Q4 2014 to Q4
2015)
              2014 Q4   2015 Q1   2015 Q2   2015 Q3   2015 Q4   Agg.
Green         41        22        25        35        31        154
Not Green     5         5         4         1         4         19
% Not Green   11%       19%       14%       3%        11%       11%
The proportion of respondents to the question 'How would you rate your overall experience of the RPC?' who received a 'not fit for purpose' opinion has varied between 3% and 19%.

The evidence presented below suggests that, on average, individuals who received red or amber opinions report lower satisfaction with the RPC. However, this result is based on small sample sizes.
[Chart: average satisfaction score by quarter and RAG rating (scale 1-10). Green: 6.9, 7.1, 7.0, 7.2, 7.0 (Agg. 7.0); Not Green: 4.3, 6.4, 7.0, 2.0, 4.0 (Agg. 5.3)]
By looking at the change in satisfaction for those receiving green and non-green opinions separately, we can see that there appears to be very little clear change in satisfaction in the latest period.

The overall decrease in satisfaction may well be due to sampling variation producing a high proportion of responses this quarter from individuals who received 'not fit for purpose' opinions.
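The green/not-green split used in this comparison amounts to grouping scores by rating and averaging each group; a minimal sketch with hypothetical records (the ratings and scores below are invented for illustration, not the actual survey data):

```python
# Hypothetical (rating, score) pairs; not the actual survey responses.
responses = [
    ("green", 7), ("green", 8), ("green", 6),
    ("red", 4), ("amber", 5),
]

def mean_score(rows, green):
    """Average score for the green (green=True) or non-green group."""
    vals = [score for rating, score in rows if (rating == "green") == green]
    return sum(vals) / len(vals)

print(mean_score(responses, green=True))   # 7.0
print(mean_score(responses, green=False))  # 4.5
```

Because the non-green group is so small (19 responses across five quarters), its group mean is far more volatile than the green one, which is why the quarterly 'Not Green' averages swing so widely.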
Engagement
Another question we can attempt to answer is whether engaging with the RPC before submission results in improved outcomes for Departments. Departments were asked 'did you engage with the RPC before submitting?' and were given the options of i) No, ii) Yes, by email, iii) Yes, by phone, and iv) Yes, we met in person.

Below we present the % of respondents who received a green opinion conditional on providing each of the above answers. This covers the period Q4 2014 to Q4 2015, i.e. all the periods for which we have data.
Fig 1.13 Engagement with the RPC and RAG rating received (Q4 2014 to
Q4 2015)
                 Responses   % Green
Did not engage   87          89%
Engaged          74          89%
The above shows that the % of respondents receiving a green opinion does not vary between those who engaged with the RPC and those who did not.
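The figures above are simple conditional proportions; a sketch using the response counts from Fig 1.13 (the exact green counts per group, 77 and 66, are assumptions consistent with the rounded 89% figures):

```python
# Rows of (engaged_before_submission, received_green). Counts chosen to
# reproduce the reported 87 non-engaged and 74 engaged responses; the
# green counts (77 and 66) are assumed from the rounded 89% figures.
rows = ([(False, True)] * 77 + [(False, False)] * 10
        + [(True, True)] * 66 + [(True, False)] * 8)

def pct_green(data, engaged):
    """% of a group (engaged or not) that received a green opinion."""
    group = [green for eng, green in data if eng == engaged]
    return round(100 * sum(group) / len(group))

print(pct_green(rows, engaged=False))  # 89 (did not engage)
print(pct_green(rows, engaged=True))   # 89 (engaged)
```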
It seems reasonable to hypothesise that individuals with complex or controversial IAs are more likely to seek engagement with the RPC, and that those types of IAs are also more likely to receive 'not fit for purpose' opinions. Since the equal green rates suggest that engagement is balancing out this countervailing effect of having a complex or controversial case, we can, very tentatively, conclude that engagement with the RPC increases the chances of receiving a fit for purpose opinion, and that the RPC's pre-submission meetings appear to be serving their purpose well.
The RPC has kept its average turnaround time clearly below its 30 day aim. This may, in part, be due to the relatively low caseload.

The percentage of cases rated 'not fit for purpose' has also remained constant, at the relatively high level of 20%.
Fig. 2.0 Summary of RPC ratings and turnaround times (Q4 2014 to Q4 2015)

          Volume     % Late       Average days   Average days   % not fit
          of cases   (non-IRNs)   (non-IRNs)     (IRNs)         for purpose
Q4 2014   101        4.0%         20.9           NA             18%
Q1 2015   65         7.7%         23.1           NA             14%
Q2 2015   48         14.0%        27.3           28.3           19%
Q3 2015   50         4.9%         22.2           22.0           20%
Q4 2015   45         2.6%         21.7           25.4           20%
Agg.      309        6.8%         22.6           25.3           18%
Opinions issued
The volume of cases the RPC has dealt with over Q4 2015 has been low by past standards, although it is similar to Q2 and Q3 2015.

The RPC expects to see a significant increase in cases in 2016 as a result of PIRs falling due, regulators beginning to submit cases, and increased departmental activity as we enter the second calendar year of the new parliament.
Fig 2.1 Number of opinions issued in each quarter (Q4 2014 to Q4 2015)
[Chart: opinions issued per quarter, split by submission type (IA Consultation, IA Final, EANCB, PIR). Totals: 101 (Q4 2014), 65 (Q1 2015), 48 (Q2 2015), 50 (Q3 2015), 45 (Q4 2015)]
*The RPC also used to (and still occasionally does) scrutinise regulatory triage assessments. However, as these are no longer submitted to the RPC as a matter of course, they have been excluded.
Fig 2.2 Average turnaround times and % of cases late (Q4 2014 to Q4 2015)
[Chart: average turnaround time in days and % of cases late (>30 days) by quarter]
For Q4 2015 the sample size is very small: only a single case that received an IRN and a single case that did not were late.
The RPC is currently considering its turnaround time deadlines for cases that
receive IRNs as part of the evaluation and decision on whether to continue the
IRN process.
Fig. 2.3 Summary table for cases with and without IRNs

                     Number     % with turnaround   Average
                     of cases   time over 30 days   days
Cases with IRNs      7          14.3%               25.4
Cases without IRNs   38         2.6%                21.7
All cases            45         4.4%                22.3
Fig. 2.4 Turnaround times and % late for cases with and without IRNs
[Chart: average days and % with turnaround times over 30 days, for cases with and without IRNs]
Fig 2.5 % of submissions issued that received either a Red or Amber (or Red/any, Amber/any) rating. Not limited to first submissions (Q4 2014 to Q4 2015)
[Chart: Red and Amber percentages by quarter, with the % rated 'not fit for purpose' at first submission shown as a line]
Fig 2.6 % of each submission type that received a fit for purpose opinion (Q4 2014 to Q4 2015)
[Chart: % fit for purpose for IA Consultation, IA Final, EANCB and PIR submissions, with number of cases on the secondary axis]
Departmental ratings
The sample size is insufficient to meaningfully break the ratings down by
department for individual quarters so departmental breakdowns must be
conducted over a number of periods. This paper considers data from Q4 2014 to
Q4 2015 inclusive, in line with the approach last quarter.
The rankings are relatively similar to those in the last update paper. Departments such as DH and DECC remain strong, while HMT shows room for improvement. Many of the changes in position between middle-ranked Departments are not hugely meaningful; for instance, HO has fallen from 6th to 10th on the basis of a one percentage point reduction in fit for purpose opinions. This reinforces that caution should be taken in attaching weight to the results below, as the sample sizes for individual Departments, even over multiple quarters, are small and the results are, consequently, volatile.

One of the few Departments whose ranking has meaningfully changed is BIS. The percentage of its submissions rated fit for purpose has increased from 80% to 85% as a result of very strong performance in Q4 2015.
[Chart: % of first time submissions rated fit for purpose by department, with number of cases on the secondary axis]
*A department is excluded if there are fewer than 5 responses recorded for each of the four questions.
Annex
Fig. 3.1 Number of responses by question and quarter (Q4 2014 to Q4
2015)
                                2014 Q4   2015 Q1   2015 Q2   2015 Q3   2015 Q4
Overall satisfaction            41        24        27        35        32
Clarity of Opinion              43        26        28        36        33
Clarity of Process              43        26        29        35        32
Interpretation of Methodology   43        26        28        36        33
Average                         42.5      25.5      28        35.5      32.5
Satisfaction scores by quarter (Q4 2014 to Q4 2015)

                            2014 Q4   2015 Q1   2015 Q2   2015 Q3   2015 Q4   Agg.
Average Score               6.7       7.0       7.0       7.1       6.7       6.9
Average Score (Green)       6.9       7.1       7.0       7.2       7.0       7.0
Average Score (Not Green)   4.3       6.4       7.0       2.0       4.0       5.3
Clarity of opinion responses by quarter (Q4 2014 to Q4 2015)

                           2014 Q4   2015 Q1   2015 Q2   2015 Q3   2015 Q4   Agg.
Yes completely             63%       54%       46%       50%       42%       52%
Yes to some extent         16%       42%       25%       28%       36%       28%
No                         9%        4%        7%        3%        9%        7%
Not sure                   12%       0%        21%       19%       12%       13%
Yes + Yes to some extent   79%       96%       71%       78%       79%       80%
Clarity of process responses by quarter (Q4 2014 to Q4 2015)

      2014 Q4   2015 Q1   2015 Q2   2015 Q3   2015 Q4
Yes   84%       92%       93%       71%       84%
No    16%       8%        7%        29%       16%
Interpretation of methodology responses by quarter (Q4 2014 to Q4 2015)

                           2014 Q4   2015 Q1   2015 Q2   2015 Q3   2015 Q4
Yes completely             53%       50%       57%       58%       48%
Yes to some extent         30%       42%       25%       39%       30%
No                         12%       4%        0%        0%        18%
Not sure                   5%        4%        18%       3%        3%
Yes + Yes to some extent   84%       92%       82%       97%       79%
Fig. 3.6 Departmental results using all data (Q4 2014 to Q4 2015)

Department   Combined   Satisfaction   Opinion   Process   Methodology
                                       clarity   clarity
HO           9.2        6.6            100%      100%      100%
DH           8.8        7.4            89%       100%      89%
DECC         8.6        6.7            92%       83%       100%
DEFRA        8.5        7.7            76%       93%       94%
HSE          8.5        8.0            90%       90%       80%
BIS          8.0        7.1            82%       79%       87%
DCMS         7.9        5.4            100%      60%       100%
DWP          7.8        5.2            100%      60%       100%
DfE          7.8        5.2            80%       100%      80%
DfT          7.6        6.9            71%       91%       71%
HMT          7.4        6.6            60%       80%       90%
DCLG         5.9        5.6            30%       80%       70%
* A department is excluded if there are fewer than 5 responses recorded for each of the four questions.
Fig. 3.7 Quantity of submission types (Q4 2014 to Q4 2015)

                  Q4 2014   Q1 2015   Q2 2015   Q3 2015   Q4 2015
IA Consultation   28        13        9         18        14
IA Final          34        30        10        20        17
EANCB             39        22        29        12        12
PIR               0         0         0         0         2
Total             101       65        48        50        45
Fig. 3.8 % not fit for purpose by submission type (Q4 2014 to Q4 2015)

                  Q4 2014   Q1 2015   Q2 2015   Q3 2015   Q4 2015
IA Consultation   29%       15%       33%       44%       21%
IA Final          18%       20%       50%       10%       24%
EANCB             10%       5%        3%        0%        17%
PIR               NA        NA        NA        NA        0%
Total             18%       14%       19%       20%       20%
Fig. 3.9 % late and average turnaround times (Q4 2014 to Q4 2015)

          % Late   Days
Q4 2014   4.0%     20.9
Q1 2015   7.7%     23.1
Q2 2015   16.7%    27.0
Q3 2015   4.0%     22.5
Q4 2015   4.4%     22.3
Fig. 3.10 Percentage of first time submissions rated fit for purpose by Department. Figures in brackets give total number of first time submissions (Q4 2014 to Q4 2015)

        Q4 2014    Q1 2015    Q2 2015    Q3 2015    Q4 2015    Total
DECC    100% (7)   100% (1)   100% (4)   100% (7)   100% (5)   100% (24)
DH      80% (10)   100% (7)   100% (3)   100% (5)   75% (4)    90% (29)
DWP     100% (2)   100% (3)   -          67% (3)    -          88% (8)
BIS     84% (19)   91% (11)   88% (8)    80% (30)   88% (16)   85% (84)
DfT     93% (15)   70% (10)   100% (7)   83% (6)    50% (6)    82% (44)
DEFRA   67% (9)    75% (8)    100% (4)   100% (6)   86% (7)    82% (34)
DCMS    100% (3)   100% (1)   67% (3)    67% (3)    100% (1)   82% (11)
HSE     67% (6)    100% (2)   100% (1)   100% (1)   100% (1)   82% (11)
CLG     86% (7)    75% (8)    71% (7)    100% (5)   -          81% (27)
HO      88% (8)    -          80% (5)    80% (5)    0% (1)     79% (19)
DfE     100% (2)   100% (1)   50% (4)    100% (2)   -          78% (9)
HMT     83% (12)   67% (9)    71% (7)    70% (10)   67% (3)    73% (41)
*Departments with fewer than 5 submissions in the period have been excluded
Fig. 3.11 Turnaround times for non-IRNs (Q4 2014 to Q4 2015)

                                 Q4 2014   Q1 2015   Q2 2015   Q3 2015   Q4 2015
Number of cases                  101       65        43        41        38
% with turnaround over 30 days   4.0%      7.7%      14.0%     4.9%      2.6%
Average days                     20.9      23.1      27.3      22.2      21.7
Turnaround times for cases with IRNs (Q4 2014 to Q4 2015)

                  Q4 2014   Q1 2015   Q2 2015   Q3 2015   Q4 2015
Number of cases   0         0         3         3         7
Average days      NA        NA        28.3      22.0      25.4