We have two items of public evidence gathering on our agenda. The first is consideration of the Auditor General for Scotland’s report, “National Fraud Initiative in Scotland 2022”.
Our witnesses from Audit Scotland, who are in the committee room, are Antony Clark, executive director, performance audit and best value, and Anne Cairns, manager, performance audit and best value. We have some questions to put to the witnesses, but, before we get to those, I ask Antony to make a short opening statement.
Good morning, convener and committee members. I am delighted to be with the committee to brief you on Audit Scotland’s report, “National Fraud Initiative in Scotland 2022”, which we published on 18 August. As committee members will probably know, the NFI is a counter-fraud exercise across the United Kingdom’s public sector, which uses data matching to help to prevent and detect fraud. It looks for fraud and error that is related to things such as blue badges, public sector pensions and council tax discounts. The exercise takes place every two years.
One hundred and thirty-two public sector bodies in Scotland participated in the NFI exercise in 2020-21, which is an increase of eight since the previous exercise two years ago. Alongside the report, Audit Scotland published more detailed analysis of the findings on different organisations, sectors and localities so that the public can get a picture of what is happening in their area. We think that the communication of the findings of that exercise is important as part of the deterrent effect of the NFI.
I will briefly highlight some of the key messages in the report. Outcomes valued at £14.9 million were identified in the exercise, and the cumulative outcomes from the NFI since it started in 2006-07 now stand at £158.5 million in Scotland. Across the UK, the cumulative total for the NFI’s outcomes is £2.4 billion. The value of outcomes recorded in Scotland has fallen by £400,000 since our previous report—a small fall this year compared with our exercise two years ago.
We cannot say exactly what that is down to—there are a number of reasons why that might have happened. It is partly to do with there being less fraud and error in the system. People are learning lessons from previous exercises and improving their controls, and stronger internal controls are in place across a number of public bodies. It is also quite likely that the drop is, in part, a result of data matching on payroll and pensions being done separately by the Department for Work and Pensions with local authorities on a real-time basis. That is one of the areas in which we used to identify housing benefit fraud, so that is being picked up elsewhere.
A key benefit of participating in the UK-wide exercise is that data can be shared between bodies across national borders. Data that was submitted by Scottish bodies in the exercise has helped other organisations in Scotland and across the UK to identify outcomes of £1.2 million. I am pleased to say that most organisations demonstrate a strong commitment to counter-fraud work and the NFI, but a lower percentage of participating bodies than we would have liked managed their role in this year’s exercise effectively, and there has been a slight deterioration in the quality of participation among public bodies. External auditors have identified areas in which public bodies could improve their participation in the future, and they will follow that up with the bodies concerned.
As members know, we have discussed the issue on a number of occasions with your predecessor committee—the Public Audit and Post-legislative Scrutiny Committee. Audit Scotland is always keen to develop the NFI. We have undertaken a number of pilot exercises in which we have looked at new ways of data matching to see whether they will prove to be fruitful in identifying fraud. We set those out on pages 20 to 23 of the report. I suspect that members might want to talk to us a bit more about the pilot exercises and what they have found.
Audit Scotland continues to work with the Cabinet Office and the Scottish Government to develop new ways to prevent and detect fraud. We have set out how we are doing that on page 24.
As always, Anne Cairns and I are very happy to answer any questions.
Thank you very much. I have a couple of questions to get us under way.
It struck me that the cases that were identified—I think that they led to four prosecutions by the police—are largely small-scale, household-level examples of fraud or individual fraud. Is part of the exercise designed to look at the wider spread of organised crime fraud or at examples of much bigger, co-ordinated attempts to defraud the system?
It is important to recognise that our process through the NFI is designed to look at many of the large-scale systems and processes that are in place in public bodies and to identify whether there are systemic problems that can be addressed. If we saw systemic problems that could be linked to serious and organised crime, we would, obviously, follow them up, but that is not the purpose of the exercise. We have a range of other arrangements in place to look at the quality of counter-fraud arrangements in public bodies through internal audit and external audit. Many public bodies also have their own counter-fraud teams.
We in Audit Scotland liaise very closely with Police Scotland as well as the Cabinet Office in that work. However, that is not primarily the purpose of the exercise.
Thanks for that helpful clarification.
You have said that the amount of fraud identified in the exercise was slightly down on that identified in 2020, which was when the exercise was last undertaken. Do you see that as a sign of success in that there is growing public awareness, or as a sign of failure in that more fraud has gone undetected?
That is a brilliant question. I will bring in Anne Cairns once I have given my initial answer.
I am always slightly cautious about answering that question, which cropped up a number of times when colleagues were in front of the Public Audit and Post-legislative Scrutiny Committee. We feel that that is probably a sign of success in many ways and that people are learning the lessons that were identified in previous NFI exercises.
We know from our liaison with auditors, the Cabinet Office and participating bodies that people are learning lessons and sharing good practice. A whole network of agencies and arrangements sits alongside the NFI and wider counter-fraud activity, and it is used to share good practice. We see in the NFI report an example of people learning lessons and strengthening controls.
I would not want to be complacent, as it is a bit of a known unknown, and I do not think that anybody could sit here with confidence and say to members whether fraud is increasing or decreasing across the public sector. The environment is volatile. The reason why we undertook some of the pilot exercises this year is that new funding streams have come into play and new systems have been introduced. It would be very difficult to say that fraud in year X is X per cent and that fraud in year Y is Y per cent, and what the balance is, if you understand my drift.
Does Anne Cairns want to add anything to that?
Yes, thank you.
We should not be complacent. In the past couple of years, we have issued a report to the audited bodies on new, emerging risks that we identified as coming into play during the Covid pandemic. That is on our website for the general public. That is an additional challenge that the public sector has had to deal with.
I think that the outcomes have gone down because the local auditors, the external auditors and the audited bodies identified control weaknesses through the outcomes of the previous NFI exercise, and they followed those up with the bodies to ensure that their controls were strengthened. Obviously, that is a positive.
On the negative side, we have seen public bodies struggling with resource pressures and sickness absence in the past couple of years, due to Covid. As Antony Clark said, the majority have participated in the NFI and have reviewed the high-risk matches that we would expect them to look at. They might not have got there as timeously as they did in the previous exercise, but they got there.
You will see from the report that a few data sets were not included this time round. For example, the immigration data was not included, due to restrictions placed on it by the Home Office. Data about housing benefit, which was a source of significant outcomes in previous exercises, is now being matched through DWP systems. We are still working through some challenges around the legal definition of patient data, so those matches have not been included. Those are the areas from which we would have expected outcomes, had we been working on a like-for-like basis with previous exercises.
There is a mixed picture. There are some positives; there are also some negatives.
Thank you. Members of the committee will ask further detailed questions about some of the areas that you have identified.
One thing that you mentioned, and that Antony Clark mentioned in his opening statement, is alluded to in the report. There appears to be greater reliance on external agents this time. I presume that that is a euphemism for the outsourcing of some of the data matching work. First, do you have any reflections on whether that affected the quality of the data matching exercises? Secondly, was there a pattern? For example, did smaller local authorities struggle more with the effects of Covid and therefore have to rely on outsourcing some of that work, or did big local authorities and big public agencies also do that?
I will answer the question about data quality before handing over to Anne Cairns to talk about the nature of the agents and where they were used.
We are satisfied that the quality of data that was submitted and that we have reported in this exercise is as good as the data in previous NFI exercises. We would not have published the report had that not been the case. There are strong and robust quality assurance arrangements across all the UK audit agencies, the Cabinet Office and others to ensure that that is the case.
Anne might want to say more about the specifics of where agents were used and why that was the case.
A few councils decided to use external contractors for their council tax single person discount matches. There were a few reasons for that. The timing of the NFI, which happens only once every two years, did not suit some councils; with a contractor, they could do the matching as and when they wanted to. Some councils might also have had other contracts with an external provider for data matching in different areas, so the work was bolted on to an existing arrangement.
There was no pattern as to whether that happened with large or small authorities. It tended to happen with medium-sized authorities. I had discussions with one authority that had previously used an external contractor for some of the work and was considering doing that again, but it has decided to use the NFI in future.
I should also point out that the NFI is only one of many tools in councils’ counter-fraud toolkit. As long as councils are doing the work, we are generally content that they are addressing the fraud issue and we are not too harsh on them. A few councils have gone down that route.
We have some more questions on that subject. Colin Beattie wants to follow up on that line of questioning.
This NFI outturn is pretty mixed. It is good to see that there are eight additional participants. The number and spread of participants have been part of an on-going discussion between the committee and the Auditor General for an extended period. That is still unresolved.
I have some key points. Page 4 of the report says:
“Immigration data was also not included in the 2018-19 and 2020-21 exercises due to restrictions placed on it by the Home Office.”
Tell me more about the restrictions that the Home Office put in place which resulted in immigration data not being included.
09:15
I will give a brief overview and then hand over to Anne, who might want to add a bit more detail.
The situation arose as a consequence of the Home Office review of the Windrush generation issues. As a consequence of that review exercise, lawyers in the Home Office determined that it would not be appropriate for immigration data to be shared with us—or other agencies—as part of the NFI exercise.
Do you have any more detail to add to that, Anne?
That was the main reason that the Home Office initially gave for not sharing the data. It wanted to review its own data before sharing it with other bodies. More recently, the Home Office has been looking at alternative ways of sharing data with public sector bodies, for example through the Digital Economy Act 2017, as part of a wider project. The Cabinet Office is still in discussions with the Home Office about getting immigration data back into the NFI, but that has not been successful so far.
Do we know whether that data will be available for the next exercise?
We do not, but it is two years until the next exercise, and one would hope that the issue will be resolved by then.
What has been the impact of not having that data?
The immigration data was really helpful in relation to student award data, in that we could identify students who were getting funding from the Student Awards Agency Scotland to which they were not entitled because their visa had expired and so on. We were also finding it helpful in relation to some of the payroll data—typically, national health service bodies found that they had employees whose immigration status had changed and they were not aware of that. It was mainly payroll and student awards matches that were impacted.
We are all conscious that there have been quite a few different requirements placed on public and private sector employers around identifying the immigration status of members of staff and employees and being confident that they are as they should be. That is another adjunct to the NFI work.
The Home Office gave the excuse that it was about Windrush, but that is a fairly discrete group of people. Does it really impact that much on the big picture? Is the Home Office saying that that impacts on a wide group of people?
Our understanding is that the legal considerations in the Home Office were that it would not be appropriate to share immigration data in this NFI exercise because of some of the broader issues that had been identified through the way in which some of the people who arrived in England as part of the Windrush generation had been treated. The impact on the NFI was part of wider considerations, rather than being a specific NFI issue.
Okay.
The report says on page 4:
“Residential care home data, direct payments and social care customers’ data were not matched in the 2020/21 exercise due to a legal question being raised around the definition of patient data.”
Can you tell us a bit more about that legal issue? Will it be resolved soon?
I can give you a bit of a sense of the legal issue, and then I will ask Anne to add more colour and confirm whether it has been resolved.
The issue arose because there was a difference of opinion about the extent to which social care data should be treated as patient data. The legal advice in the Cabinet Office is that some aspects of social care data should be treated as patient data, and for that reason it was not possible to include some social care data as part of the NFI exercise. Resolution is being sought on that issue—I am not entirely sure whether it has been resolved yet, but I do not think that it has.
No, it has not. The real issue is that lawyers in the Cabinet Office started to look at that area of the NFI and started to question whether the data that the councils are submitting could be classed as patient data. Typically, such data is on people in care homes or in nursing homes. The lawyers came to the view that it could be classed as patient data. The definition of patient data in legislation can be read several ways and so it can be said that that is patient data.
The legislation for the data-matching exercises that we are permitted to carry out—and that the Cabinet Office has carried out—says that, if we take patient data from an NHS body, any matches that result from the data-matching exercise should be released to the NHS body. A council is not an NHS body. We could take the data from the councils and do the matching, but we would not be able to release the matches back to the council. I have been working with the Cabinet Office legal team to try to resolve the issue.
We think that there might be a way to amend the Local Audit and Accountability Act 2014. Jacob Rees-Mogg was involved until fairly recently when he changed job and he was quite supportive of making a change to the act that would allow us to take the data from councils, to carry out the matches and to release those back to them. The Cabinet Office lawyers have not signed that off—they are still considering the matter. That is where we are at; we are actively trying to get it resolved as soon as possible.
Is there a way round that by releasing the information back to health and social care partnerships?
I do not think that that would be possible under the legislation, Mr Beattie. The same issue would arise.
It has to go back to a health board.
The obvious question is: what has the impact been?
We have not been able to do those data matches and identify whether any outcomes would arise as a consequence of that exercise.
Have you found discrepancies in the past?
Yes.
In the past, we have identified people in care homes who had unfortunately passed away and the council had not been notified, so it had continued to pay the monthly bill to the care home—until that match was made and the payment was stopped.
Potentially, the impact could be significant.
Yes.
Let us carry on. Page 4 of the report has a lot of bad news on it.
Let us hope that we can move on to the good news soon.
The report says that data from 11 councils was
“inadvertently deleted”,
so
“full supporting documentation is not available for these councils. The Cabinet Office has taken steps to prevent this error from re-occurring.”
Let us hope so. What is the impact of that? Does that relate to council tax discounts?
Yes.
What has been the impact for the 11 councils concerned?
I will hand that question over to Anne Cairns because she was much more involved in the specifics.
That was human error—it was nothing to do with the information technology or anything else. In accordance with the Data Protection Act 2018, at the end of each exercise the data that is no longer required is deleted. The head of the NFI signed off the data deletion instruction to the IT contractor, but there was an error and instead of the instruction saying that the data was to be deleted up to November, as with the previous exercise, it said “December”, and neither the head of the NFI nor the person who second-checked it picked that up. The instruction was signed off and the IT contractor did what was in the instruction. Unfortunately, that meant that the data was deleted.
Am I correct in saying that one month of data was deleted?
It is not as simple as that. That was the month in which those 11 councils uploaded their data.
Ah.
So it was not just one month’s worth of matches that were affected.
Those matches stayed in the system for about a year. Because they were not deleted immediately, the councils were able to work on them. However, we do not have the detail. For example, the report on the system for the council tax matches and the outcomes would have the heading “Council tax”, then the name—say, Anne Cairns—then the match, the outcome, her address, her council tax, the recorded outcome of, say, £500, the dates and the reason why. All that detail has gone; what is left is just a management information summary report for each of those 11 councils.
We have asked the councils involved whether the data looks realistic and feels like the right value in order to try to verify whether the figure in the summary report is right, but they do not have the full details behind that. Some of the councils have some records—for example, if they were working offline on a certain number of cases—so we have been able to verify some of the information but, overall, we do not have the evidence to say, “That’s the figure for the outcomes and it’s made up of all these cases and values.”
We have raised the issue with the Cabinet Office, which has assured us that it has implemented new controls and new arrangements around data deletion.
How confident are you about those new controls? The fact is that it was a human error. An instruction was given and checked and then released to the contractor, who acted on it. Why would that not happen again? What will those who are responsible do—will they have triple checks or quadruple checks?
They have implemented some automation in the system as well. I do not really understand it all—I am not an IT expert—but they have implemented an automated process for situations in which a request is made for data to be deleted and the data is outwith the normal range. For example, if data was due to be deleted up to November, the system would ask a question such as, “Wait a minute—you are asking for data to be deleted up to December; are you sure?” That is as well as the members of staff involved being mortified.
Of course, we cannot give you a categoric assurance that such things will not happen again. Regrettably, human error is part of life. However, we are as confident as we can be that people have done as much as they can to avoid that happening again. It is a very regrettable thing. We are all very disappointed.
So there is hope.
Yes.
Willie Coffey wants to come in at this point.
The issue of data, data loss and whether back-up data could have been retrieved is one that has come to the committee before. Was there any discussion of that? Was the back-up data destroyed as well?
Again, I am not an IT expert, but I asked that question. My immediate thought was, “Can you not just wind back the clock to whatever date it was deleted?” The IT contractor went away and looked at that.
We identified the issue while we were drafting the report. We were trying to verify the figures—as the committee can imagine—so it was around May or June time this year. We were thinking, “Where’s that number coming from?” We were struggling, which is why we raised the issue with people at the Cabinet Office. They were not aware of it. When they said, “Oh, sorry—the data’s been deleted,” one of my first questions was, “Can you just turn the clock back?” Apparently, they could have done that, but it would have meant that everything that had been input into the NFI system since that date—say that it went back to the December—would have been deleted. We would have corrected one thing but created issues in every other set of data that was in the system.
To make a more general point, that reinforces the value of people moving to more real-time matching of data so that we have a more preventative model of identifying risks as and when they present. The NFI is a really important exercise. We would not do it or bring the report to the committee if we did not think that it was an important part of the fraud management toolkit. However, what has happened here says to me that there are merits in people sharing data on a real-time basis across different agencies, which is another important part of the work.
It also reinforces the importance of people identifying and putting in place improvement actions on the back of the work. This is an example of something having gone wrong. In this report, and in our fraud and irregularity report that Anne Cairns referred to earlier, which we share with all public bodies and auditors, we identify problems. This is an important bit of the learning that needs to go on all the time around fraud and corruption.
I will now bring in the deputy convener, who wants to follow a particular line of inquiry.
Good morning. Page 9 of the report provides outcomes and information relating to blue badges. There appears to have been a significant rise in the number of blue badge outcomes being reported, with 44 per cent more matches identified compared with the most recent NFI exercise. Are you aware of any reasons that may have led to that increase?
The honest answer to that question is that no one entirely knows quite what the reason is. However, the feedback that we have had from at least one local authority that has looked into it in a degree of detail is that it may be a consequence of the Covid-19 pandemic and the higher mortality rates during that period. That is our sense of what might be prompting that increase. I do not think that Anne Cairns has anything more to add to that, do you?
09:30
No, not really. We have had anecdotal information from councils. In general, they found that the Covid pandemic adversely impacted people who tended to be on disability benefits, including people who had blue badges. Some of those people unfortunately passed away, and due to the pandemic and everything being in lockdown, people did not know where to hand in the badges, because councils were shut and so on. When we did the data matching, councils found that there were still a number of badges out there in circulation for people who had died.
When we repeat the exercise in two years’ time, we will get a sense of whether it was a one-off blip or a more systemic problem. Had there been a more systemic problem beyond that, councils would have reported it to us as part of the NFI exercise, so that is our suspicion at the moment.
I will move on to pensions. It is clear that there has been a significant drop in the value of pension outcomes that have been identified by the NFI exercise, which the report suggests is down to improved effectiveness. Although that is very welcome news, what further steps do you think could be taken to reduce the £1.5 million of outcomes that were detected?
Gosh—that is a good question. To be perfectly honest, I am struggling to think of a single answer to it. The sense that we have from the exercise, as you have just said, is that the matches that were identified through the NFI, which we shared with the Scottish Public Pensions Agency and the appropriate pension bodies, were ones that they picked up through their own control of risks and so on.
It feels as though the control environment around pensions is relatively robust and strong, so I am not sure that there is anything that I could say to you—for example, “We need to fix X or Y”—that would reduce the level of fraud or errors. That might sound like a counsel of doom and that we must accept the situation. I am not saying that we should just accept that level of fraud or errors in the pension system, but I cannot think of a single obvious thing to say. Anne, do you have a silver bullet?
I do not have a silver bullet. The match is for people who have unfortunately died but their pension has remained in payment. We could make headway on improvement if we were somehow able to move to real-time reporting and the pension funds were able to get real-time information when someone passes away.
Pensions bodies use the “tell us once” reporting process and they have strengthened their controls, which has, as you can see, resulted in a significant drop in outcomes. We look closely at those outcomes; we think, “Oh my goodness! What is going on here?” The pension bodies got a number of matches, and we looked at them and found that they had properly investigated and reviewed them. They got the matches because of a timing difference: by the time they reviewed a case in which we said that someone had died, they had already actioned it in the period between the data being submitted and the matches coming out. The only way to improve that would be through the use of some kind of real-time information.
Good morning. I want to probe the issues of housing benefit and council tax reduction, and to ask a couple of questions about recovery and prosecution.
On housing benefit, page 11 of the report states that the NFI exercise identified 177 cases of housing benefit overpayments, which had a value of £1.2 million. That is a significant drop from previous years. The report explains that the reduction is mainly due to the use of the Department for Work and Pensions verify earnings and pensions—VEP—alert service.
When did the DWP system come into use in Scotland? Are you certain that it is the reason for the underlying drop?
I can give you an assurance that we think that it is the reason for the drop, but I cannot give you the date on which the system was introduced; perhaps Anne will be able to.
I cannot remember exactly when the system was introduced, but it was before the pandemic. It has been an emerging system from the DWP. The DWP implemented its accuracy award initiative, as it was called, before the pandemic. Initially, it was on a voluntary basis. Councils got matches and it was up to them whether they reviewed them or not.
The DWP has progressed the scheme and made it mandatory for all councils to participate. The matches go through to councils daily. They are risk scored, so at the top of the pile every morning will be the highest-risk match for that day. If it is not actioned that day, it might still be number 1 the next day or it might drop down if another one of higher risk comes in.
The councils received funding from the DWP to action those matches. The DWP has not reported any outcomes or results from that as yet. It reports the overall level of fraud or error in the benefits system. As part of that reporting, there is a section on housing benefit, but it is not just down to the VEP service.
The figures speak for themselves: in 2018-19, there were 1,238 cases and, most recently, the figure was 177, so there seems to be a causal link.
On the sum of the overpayments, the report states that the average individual value of housing benefit overpayments has risen from £2,300, in effect, in 2018-19 to nearly £6,700 in 2020-21. Are you aware of any particular reasons for that significant jump?
I will have to ask Anne Cairns to answer that. I am sorry.
That is fine.
We do not have any specific information on why that is the case, Mr Hoy. It could be down to many factors. It could be that the rent levels have gone up or that the overpayments took longer to be identified, so they ran for a significant period of time.
The matching that took place in the NFI was not against payroll. Usually, when you match against payroll, you find cases more quickly. Some of those overpayments are matched against right-to-buy cases in England or other housing records in another council. I do not know the particular reason for the rise, but I hazard a guess that it is because the overpayments have been going on for a longer time than in the previous exercise, when it was just a matter of people’s fluctuating earnings.
So there is probably work in progress to try to identify why that sum has risen.
Yes.
Given that you have asked the question, we will examine that issue when we do the next exercise.
On the council tax reduction, page 13 of the report states that councils identified 772 cases of fraud or error with a total outcome of £700,000 in 2020-21. That is 2.5 times the number of cases that were identified in 2018-19. The report appears to suggest that councils are of the view that that might have been directly caused by the Covid-19 pandemic. Will you elaborate on that? What aspect of the pandemic might have resulted in that increase?
We think that the main driver is probably the volatility of the employment market during the Covid-19 pandemic—people’s roles changing and people moving in and out of employment—and the challenges that that presents for having up-to-date and accurate information to support council tax reduction activity.
I have a couple of slightly broader questions. Given the relatively low number of prosecutions and the cost of living crisis, do you expect fraudulent activity to increase during a period of economic downturn?
I think that it creates a higher-risk environment for fraud and errors, for some of the reasons that we have already discussed with the committee, so that is plausible, Mr Hoy.
Perhaps more generally, the report makes it clear that, during the Covid-19 pandemic, people were conscious of the elevated risks. Anne Cairns mentioned our report, “Covid-19: Emerging fraud risks”. That set out a range of new risks that were emerging as a consequence of the Covid-19 pandemic. Obviously, the cost of living crisis is not the same thing as Covid-19, but it has some common characteristics. Therefore, people being alert to the emerging Covid-19 fraud risks is important in the context of the cost of living crisis.
As we mentioned, there is a range of groups that consider fraud across the public sector. We have talked about some of those national networks. They will also be thinking hard about what new issues will emerge as a consequence of the cost of living crisis. Our “Emerging fraud risks” report set out how many public bodies had strengthened their anti-fraud and counter-fraud activity as a consequence of Covid-19. It is likely that similar activity will take place as a consequence of the cost of living crisis.
On the broader issue of recovery and prosecution, the public might be quite shocked to see that there was £15 million of potential fraud and overpayment in a single year and £160 million in total since the initiative began but, in the year that we are considering, only four cases were referred for prosecution in Scotland, which obviously does not mean that there would be a legal sanction against those individuals.
Let us look at a couple of the case studies that you used to highlight examples of fraud. In one, which concerns non-domestic rates and the small business bonus scheme, a ratepayer failed to declare other business premises, which resulted in an £11,000 overpayment. Apparently, that is the first business rates case to be reported for prosecution in Scotland, so action is being taken on that. However, on pensions, there is an example of somebody who claimed £10,560 and was overpaid £6,600. He received a police caution and the full amount was repaid. Then there is a case in relation to a council tax reduction claimant who failed to declare pension contributions and a pension lump sum but made off with nearly £15,000. The council is recovering the amount but there is no reference to any prosecution or any report to the procurator fiscal.
Is there a sense that the system is getting tough on recovery but there still seems to be a light touch approach on sanctions and prosecution?
That is not the conclusion that I draw from the work, Mr Hoy. It is important to say that the NFI is one part of the fraud and corruption toolkit. It is not designed to focus solely on the points that you have raised. Local internal and external audit teams examine the matter, as do the counter-fraud teams in public bodies. The case studies that are mentioned in the report were identified through the NFI process. There are other cases that have been identified through annual audit, external audit and whistleblowing activity.
It is worth saying that we have reported those matters in the current NFI report because of the direct request of your predecessor committee, the Public Audit and Post-legislative Scrutiny Committee. Questions were asked about what happens when a fraud is identified, and we felt that it was appropriate for us to highlight what happens in terms of prosecutions. You are right that it is a small number, but it is also important to recognise that there is a lag. We identified four cases in the report, but more might be prosecuted in the future. There is also a risk exercise to be done within public bodies on the cost versus the benefit of proceeding to prosecution, based on the level of the funds involved.
Anne Cairns might want to add something to that, but that is our assessment. That is not to say that there is no more work to do on prosecution. It is important to send the right signals to the public that public money counts and there is a price to pay for illegal activity. However, I do not draw the same conclusion as you from the report.
Basically, we leave it down to the individual council to speak to the fiscal. Some councils can report cases directly for prosecution. It is down to their judgment whether they take that forward.
Having said that, most councils, as Antony Clark said, do a cost benefit analysis. Therefore, if someone puts their hand up and says, “I got that wrong,” or “You caught me,” and starts to pay the money back, councils tend not to take the case to prosecution unless it is of a significant value or has a high public-interest value. In practice, that is what tends to happen.
The number of prosecutions is low but, as Antony Clark said, other counter-fraud activity takes place, mainly in councils, which is reported. They report that activity to their committees and report cases on their websites.
That identifies my concern. If you were to steal £14,000 from your neighbour, that would be perceived to be a pretty heinous crime but, because of the size of council budgets, if you steal £14,000 from the council, it appears that, as long as it gets the money back, there is no legal sanction. That is an underlying concern but perhaps we shall return to it another day.
That might be a philosophical point, Mr Hoy.
I will look more closely at the pilots. One of the pilots that is highlighted in the report was undertaken in Fife Council on the national entitlement cards for travel. How long did that pilot take place for? What period did it cover?
09:45
Anne Cairns was heavily involved in that and worked closely with Fife Council. She will be able to give you chapter and verse. I am conscious that we need to finish at about 12, but she has a lot to say on the issue if you really want to know about it.
The pilot was run over the past year. Fife Council volunteered to participate, which was fantastic. It worked with the national entitlement card office and uploaded all the data about its national entitlement cards. The matches were all worked through—that work finished in the spring of this year—and the outcomes were recorded. Every match that we identified was a positive outcome.
I want to inject a sense of perspective. The pilot was presumably intended to understand whether that line of inquiry was worth pursuing and whether the resources invested in it will reap a significant harvest. The report says that, of all the cases in Fife during that year,
“Thirteen matches showed cause for concern as the NECs appeared to have been used after the death of the cardholder. Two of these cards were used for journeys to the value of almost £2,300 for one”—
I do not know where you can go to from Fife for that kind of money—
“and £240 for the other. The value of the journeys for the other 11 cards varied from £3.10 to £69.00.”
First, that seems to show how honest the good people of Fife are. Secondly, does that indicate that there is a major problem that would require lots of resources to be turned over to extend the pilot into a national-level scheme?
We are not sure at the moment. We are not convinced that that pilot merits roll-out across all of Scotland. We are reflecting on that with Fife Council and with the bodies that are involved more generally in Scotland. Anne Cairns has been having discussions with the relevant people and we are weighing that up quite carefully.
More generally, regarding mandating bodies’ participation in the NFI and the piloting and rolling out of new initiatives, we are very conscious of the costs and benefits. Participating in the process takes up people’s time, so there has to be real public value. The report says that we would only ever want to extend the NFI if we thought that that would offer real benefits, either by identifying a small number of very big outcomes or a lot of small outcomes that add up to a big number. Those are the criteria that we reflect on when we ask whether it is worth rolling something out. We are weighing that up carefully in relation to the national entitlement card being part of the NFI in future.
Yes, a single bus journey of £3.10 may not be worthy of a major national exercise in trying to understand what is happening.
Another interesting area that you mention is the pilot that is under way with Social Security Scotland. That pilot looks particularly at whether there are examples of people claiming benefits as if they are resident in Scotland when they are not, and at any cases of people who have multiple addresses and are therefore putting in multiple claims. How many matches have you found through that pilot?
We are in the process of reviewing the matches at the moment, so we cannot report any outcomes. Your question is about how many matches there were. I do not have those figures to hand and do not know whether Anne Cairns does. If not, we can write back to you with that information.
Do you have a sense of the scale of that? Is it at the level of national entitlement card fraud in Fife, or is it much more widespread?
I would need to double check. I am reluctant to give you an answer without being confident of the figures in front of me.
I appreciate that. If you can get back to us with those figures, that would be helpful.
I can make a more general point. That is one area where we think that there might be fruitful work to do, depending on the findings from the pilot exercise. Given the increased responsibilities that Scotland has for social security, we are taking that seriously as a potential area of work.
More generally, we set out a number of pilot exercises in the report. We have touched on a couple of those today. We are keen to keep the committee abreast of any decisions that are made in relation to an extension of the NFI in the future. That feels like an important conversation for us to have.
Absolutely, and thank you for that undertaking.
I will bring in Willie Coffey, who has questions on areas of future development.
Before I ask those questions on future developments, can you clarify the figure of £14.9 million that you reported? It is described as an outcome. Does that mean that that money has been recovered or is in the process of being recovered? Is that sum ever recovered in full?
The outcome is not just the money that is being recovered, but includes errors that are being prevented. It is made up of several different components—I will be corrected if I get this wrong. Sometimes, it is the amount that is sought to be recovered from an individual, if there has been overpayment or fraud. In other cases, for example in relation to rental issues, the figure is based on how much it would have cost the public purse had the error or fault continued. The total is made up of a combination of different figures.
Members of the public would want to know how much money we have lost and how much money we have to get back. It is not clear to me in the report what we are talking about and whether we ever get that money back.
I am sorry that it is not clear to you and that you feel that it might not be clear to the public. We can go away and reflect on that. We have to operate within the framework of the NFI, which is not entirely under our control. However, there may be ways in which we can present the outcomes more clearly in future.
Is there an element of that figure that is preventative, that is, money related to fraud that the NFI might have prevented?
Yes.
Does that figure go back two years to the previous NFI or does it relate only to the current year?
It is based on the matches that come out of the exercise and is a moment in time. It is made up of the frauds that were identified through that specific exercise and the amounts that are sought to be recovered from individuals who have defrauded the public body at that moment in time, alongside forward projections for those cases that rely on an algorithm.
Yes, but does the data that it uses go back two years to the commencement of the previous NFI?
I am obviously not making my point very clearly. It is based on a specific date—almost like a census. It is based on a moment in time. Perhaps Anne can explain it more clearly; I am trying my best.
For this report, it was all outcomes reported by councils, health boards and so on between 1 April 2020 and 31 March 2022. The majority of that would have related to the data that was submitted within that time, but there might also be some outcomes reported in that time that had not been included in the previous report because, as you will appreciate, when there is a match it takes a bit of time to undertake all the reviews, especially if there is a need to contact other bodies for confirmation. For example, if the match relates to housing, the council might need to contact a council down south because it appears that the person has a house there, or the match might show that a person is working down south while also working for NHS Greater Glasgow and Clyde. It is everything that the bodies have reported as an outcome between those two dates, but some outcomes will relate to data from the previous exercise that had not been reported at that time.
That is a lot clearer, thank you.
Antony, could you say a wee bit more about the future developments that you mention in the report? There is some commentary about new types of data matching that may be available to us and about Audit Scotland possibly having access to HM Revenue and Customs data that it did not have access to previously. We know that the Cabinet Office is consulting on potential new powers and on expanding the powers in relation to the NFI. Can you tell us a wee bit more about that and whether you have been part of the UK Government’s consultation work on improving the process?
We are very actively involved in those discussions, in which we bring to bear the views of the Public Audit Committee and its predecessor committee, as well as the views of the Auditor General and other interested parties. Anne Cairns is actively involved in those discussions.
On the specifics of future developments in relation to whether the legal powers will be expanded to cover the four bullet points on that page of the report, my understanding is that the Cabinet Office has determined, based on the consultation feedback, that it is not minded to change the legislation to expand the powers to cover those points because there was general consensus from the consultees that there were risks and concerns about protecting people’s identity and other data issues.
Having said that, the Cabinet Office recognised that there is merit in those objectives being achieved, and it is looking at other ways of delivering the improved outcomes through strengthening the role of a specific team that has been set up in the Cabinet Office that is working on that—I can find out the name of that team.
That is the situation with the future developments. Does that answer your question, or was there more that you wanted to know?
It does, partially. Have you got access to HMRC data that you did not have access to before, or has it ruled that out?
I thought that you were talking specifically about the consideration given by the Cabinet Office to expand the powers so that the data that is made available could be used for different purposes.
No change has been made to our access to the data. The question that arose was whether the data could be used for purposes beyond those set out in current legislation, and as I said, based on the consultation, the Cabinet Office is not minded to seek changes to the legislation to allow the data that is available through the NFI process to be used for the wider purposes that are set out in the bullet points in the report.
However, it is keen to think about ways in which the NFI exercise can be used in a more preventative way and to help people identify positives as well as negatives, and it is also keen to think about ways that we can use the NFI exercise to better promote anti-fraud and anti-corruption work.
I might have missed something, so I will hand over to Anne Cairns.
Based on its consultation on expanding the powers, the Cabinet Office has decided not to do that at this time, but it has not said that it will not do so in future. Instead, it has decided to allocate the resource to the Public Sector Fraud Authority. You might have picked up on stories in the media when that was launched. It was meant to be launched at the beginning of July but, obviously, there was a lot of political unrest and change in Government in Westminster at that time. The Cabinet Office has now launched it, and that is what it is concentrating on now. That is wider than the NFI.
With regard to the HMRC data, we are keen to use the powers that Scotland has had since earlier this year—March, I think. We have been in discussion with the Cabinet Office and HMRC to try to get access to that data. We submitted a data sharing request to HMRC in June. Usually, when you put in that request, you get an automatic response saying that it will reply in 21 days, but it did not reply, so we have chased that up. There is a meeting going on this morning to try to get that data sharing with HMRC progressed. Ideally, I want to use the HMRC data on payroll, pensions and some data on who has paid stamp duty—which relates more to England, but it can relate to some of our housing cases—and so on. I do not have a timescale for that, because the meeting is going on just now, but we will try to tie HMRC down to a timescale of when it can share that data with us.
Thanks. Convener, I completely forgot to ask a question earlier. May I ask it now?
Of course.
I was listening to colleagues’ questions about the various themes in the report, including the number of potentially fraudulent cases and the amounts of money involved, and I would like to know what preventative measures are taken to try to stop such cases arising.
I will use the blue badge scheme as an example. The amount that is potentially being defrauded is around £2 million every time that that scheme is looked at. What preventative measures are taken to try to stop that, so that the next time you sit in front of us the figure is not still £2 million?
There are a number of different dimensions to that. Part of the preventative measures is making sure that only the right people get the blue badge, so there should be controls to ensure that people have the right characteristics to justify them receiving a blue badge, but work on that is done within local authority areas.
Through the NFI, we identify what happens when people’s characteristics change or they die while they are still in possession of a blue badge. More real-time matching of data might help in that area. We would also be looking for appointed auditors to follow up on the findings of the NFI with public bodies.
In my response to Sharon Dowey’s question, I said that there might be a blip as a consequence of Covid-19. We will want auditors to follow up on whether people are learning the lessons of what we found in the NFI report. When we do the NFI in two years’ time, we will see whether there have been any improvements.
10:00
It seems to me that, year on year—or every two years—the same amount is potentially defrauded through the blue badge scheme, for example. Surely the public would expect that figure to go down with counter-measures and the NFI initiatives. The public might ask why it does not go down.
Anne Cairns might want to add something to my response that might illuminate that a bit more.
On preventative measures, it is councils that are impacted by the blue badge matches. They report to their audit committees and tend to put stuff on their websites about what they have identified and what they are clamping down on, so they do promotion-type work. They also promote the tell us once service so that, as soon as someone dies, they are informed. As Antony Clark said, real-time information would be the ideal. It is more a case of promotion.
I look forward to the next report.
I do not want to put words in the witnesses’ mouths but they said in answer to the initial question that was put to them on the blue badge scheme that the mortality rate among people with disabilities was higher than that in the general population. That might explain why there was a rise during the period. It is worth waiting to see what the next round of NFI results tells us about that before we jump to any conclusions.
In her letter of 3 May, the Cabinet Secretary for Finance and the Economy confirmed that she was keen to support Audit Scotland with any legislative changes that were required beyond any changes resulting from the Cabinet Office’s consultation. She confirmed that officials would continue to work with Audit Scotland to further improve engagement with the NFI in Scotland. Can you tell us about how that has been taken forward?
Anne Cairns has the operational engagement with the NFI team, the Cabinet Office and Scottish Government officials, so she might want to say a bit more about that.
We speak to the Cabinet Office several times every week and meet the Scottish Government regularly. The Scottish Government has a head of counter-fraud—I think that that is the title—as of last year, so we engage regularly with him and the wider Scottish Government risk team, who pulled together the response from the cabinet secretary and the action plan that is attached. The engagement was not so frequent during the pandemic because it was really difficult to get audited bodies to do more than the basics on the NFI, but we have opened up those discussions again over the past few months.
So, not a lot of progress has been made on it.
You specifically asked whether there were any plans to make changes to primary legislation. We are not aware of any such plans at the moment. We are still in discussions with Scottish Government officials and the Cabinet Office about whether legislative changes might be needed. We want to continue to discuss that with the committee as well, given your strong interest in that area and the discussion that we have had, in which you have highlighted scope to further improve the NFI. We will take the feedback from today’s discussion to our discussions with Scottish Government officials and the Cabinet Office. That is one of the purposes of coming to you.
I thank Antony Clark and Anne Cairns for their time and the level of information that they have given us. It has been really helpful and given us some direct answers to some important questions that we have been keen to put to them. We shall no doubt see them in the future on the NFI.
I suspend the meeting to allow a changeover of witnesses.
10:04 Meeting suspended.