Moving on to our next agenda item, the committee will take evidence from the OECD on its recent report on Scottish education. I am delighted to welcome to the committee Dr Beatriz Pont, senior analyst, education policy, and Romane Viennet, policy analyst. We hope to take about an hour and a half of your time, Dr Pont and—I am looking across my screen to make sure that I can see her—Romane Viennet.
I will start the questioning, then bring in my colleagues as we go. My questions are directed at both or either of you. How did the pandemic restrict your ability to do the report, especially when it came to evidence gathering? What would you have normally done that you could not do because of the restrictions that were placed on you? That will be an interesting scene setter for the committee.
It is a pleasure for us to be with you today after all the work that we have been engaged in with Scotland. We appreciate this opportunity. I want to give an introduction to the project and its background before we start taking questions, because we think that it is important to set the scene.
Fair enough.
If you do not mind, I will take that opportunity now. I want to cover a bit of the background, the methodology, the focus and the findings. It is important for us that the committee understands how we have been working with Scotland.
We have a project at the OECD that focuses on the implementation of education policies, because we have learned through the years that many countries fail to implement their reforms successfully, as they do not focus well enough, or deeply enough, on implementation. They design beautiful policies, but the process of implementation is not very focused, so it is important to develop a comparative analysis of implementation and to support countries with that process.
About three years ago, we started the project, and we have now worked with eight OECD education systems—in Austria; Estonia; Ireland, on its senior cycle review; Mexico; Norway; Scotland; Wales, on its curriculum review; and Iceland, on its education strategies. The lessons that we use are very much comparative. Our approach is comparative and is based on all the knowledge that we, as the OECD, have on a wide range of policies and education systems. That is important, to contextualise.
We have a tried and tested methodology. It has been in use not only in the years since we started this project—we have been doing country reviews for more than two decades. I have been at the OECD for more than two decades and have been doing reviews and working with countries in a way that is tried and tested. It draws extensively on qualitative analysis, quantitative information and comparative analysis that we tailor to the countries.
The way in which we work is that the country prepares background information for us and we do our own preliminary analysis of the data, the research, the literature and anything that the country might have published or developed. We gather all that information ourselves, then we visit the country. In the case of Scotland, we made two virtual visits by video. We had video interviews, which actually lasted a lot longer than our usual interviews. Normally, we travel to the country and meet all the different stakeholders whom we have advised the Government that we want to meet. We develop a list of education stakeholders whom we consider important. In developing the list, we balance our timing needs with the possibility of meeting everybody at a specific time. [Inaudible.]—develop the list with the Government, and then we, as a team, meet those on the list individually.
We created a specific team involving me, Romane Viennet, Anne Looney and Jan van den Akker, who is a world-renowned curriculum expert. Anne Looney is an excellent academic who is engaged with you on the continuation of the recommendations.
We made two visits, the first of which focused on policy. We met all the different stakeholders whom we considered it important to meet, and we developed more interviews with those whom we were not able to meet after the week in September that we spent virtually in Scotland.
In the second meeting week, we visited schools. Normally, we can visit only three or four schools but, because it was virtual, we were able to visit more and meet more students. We gathered perspectives from different students across Scotland, and we focused on meeting principals, teachers, students and parents in different regions of Scotland. In addition, we had webinars in which we gathered the perspectives of stakeholders on our preliminary recommendations.
We met the Scottish practitioner forum every six weeks to review progress, share how things were evolving with the project and check the preliminary findings with it.
We have been greatly engaged—although virtually—in developing the final analysis and the recommendations that members have seen. A summary has been provided to the committee.
The process has taken a while, and there were issues to do with the pandemic. We had negotiated the review with the Government before the pandemic hit, and we had considered doing the review visits in June. However, that was not the time to visit a school, because nobody was in school and teachers were really focused on something else, not on an OECD review. We considered that it was not the appropriate time to visit schools or to engage in an analysis when so many schools and policy makers were still in the middle of responding to an unforeseen situation.
We decided to make the visit between September and December, because we knew that we had to deliver our report in a specific time and that it was important for Scotland. We delayed and thought that we would make a visit in person, if that was possible and then again in—
It was not possible.
[Inaudible.] We then did the analysis. That was the methodology.
As members know, the initial focus was on the senior phase only, but there was then a parliamentary request to focus on the whole system. The main question that we were asked to focus on was whether curriculum for excellence was implemented in such a way as to contribute positively to the education of all young people. We were requested to focus on implementation and learning, attention to broad general education and the senior phase, and young people and learning at the centre. The review was collaborative and inclusive of stakeholders as much as possible.
We started our review and looked at the programme for international student assessment data, which were of great concern across Scotland. I think that one of the reasons why the review was requested was the concern that the PISA data were declining. We saw that there was a decline, but that was also the case in many other OECD countries.
We saw that Scotland was among the leading countries in global competency proficiency, which measures the new types of skills that CFE is delivering. We also saw that, in terms of equity, Scotland is above average across OECD countries.
10:00
We were concerned about the amount of working time that teachers spend in front of a class. It is very high in relation to other OECD countries—it is among the highest, actually.
We also reviewed selected evidence that your predecessor committee took on education outcomes.
It was difficult to see what CFE delivers, because there are many different data points. You use PISA, but that is a one-point-in-time measurement of 15-year-olds, and, for us, the analysis concerns children from the ages of three to 18. That means that PISA cannot give you a full picture of whether CFE is working well, especially in the senior phase. However, we saw that you have data that shows that 95 per cent of leavers end up in a positive destination, that more than 90 per cent of 16 to 19-year-olds participate in education, employment or training and that there has been a narrowing of the equity gap between the most and the least deprived areas.
We also saw that, in secondary 3, 88 per cent of learners achieved the expected level in literacy and 90 per cent achieved the expected level in numeracy. There was also improved attainment in the Scottish credit and qualifications framework in S4 to S6, with the attainment gap decreasing between 2009-10 and 2018-19.
We saw in some of your own data some progress in education. That is the background. Then we looked at CFE.
CFE has been in the making for many years, as you know. Work on its development started in 2004 and the OECD followed its progress for a while. CFE statements were published and implementation started in 2010, more than 10 years ago. Many education systems review the curriculum every 10 years, so this is a good time to review CFE, because it is not new and people have had a lot of experience of it.
You have the four main building blocks of CFE—the four fundamental capacities—as well as children’s rights, eight curriculum areas and three interdisciplinary areas. Assessment is an integral part of the system, as is school-based curriculum design. You also have a number of priorities: closing the poverty-related attainment gap, preparing children for the future, raising standards and providing competency-based education.
We saw CFE as a pioneer among education systems internationally. Since you started implementing CFE, many other systems have been implementing curriculums that focus on knowledge, skills and competencies. Many education systems internationally have been watching what has been happening in Scotland and regard the system as a high-performing one. That is quite important for us, and it is one of the reasons why we wanted to understand CFE.
We found that there were a number of important underlying tensions in CFE. There is a lack of balance between flexibility at the school level and system coherence. There are also tensions around depth and breadth; the focus on knowledge, skills and/or competencies; and the alignment between student assessment and system evaluation.
We analysed all those tensions and developed a set of recommendations. We suggest that it is important to provide all Scottish students with a coherent learning experience between the ages of three and 18, which is not the case now, because there is a gap when they reach the senior phase and move from the CFE experience into preparing for the end-of-school testing. That is an important point. In relation to that, we said that you need to reassess CFE’s aspirational vision against emerging trends in education and, especially, deliver a better focus on knowledge. We also said that you must find a balance between depth and breadth, adapt the senior phase to the vision of CFE and continue building curricular capacity.
The second area that we thought was important was collaboration between the different stakeholders, clarity in their roles and responsibilities, and the consolidation of institutional policy processes for effective change. I will stop there because I see that you want to come in.
You have given us a tour de force of the report, which I appreciate. I will ask my question again: is it correct to say that the pandemic restricted your ability to come here and, I assume, your usual way of doing business? Would you normally have come here and been on the ground? Has it been a different experience?
Yes, it has been a different experience. We have been taking that approach with all education systems, not only Scotland’s, and we have tried to find the best ways possible to gather evidence without being able to do the best thing, which would be coming to Scotland.
It would have been better to have been here on the ground. Are you satisfied with the diversity of the voices that you heard in building the report? One aspect of the constructive feedback that came after the report’s publication is that most of the people and organisations that you consulted are, or have been, on Scottish Government committees or have developed or managed CFE—the so-called insider bodies. That is one of the criticisms of the report. How do you respond to that?
We have met all the different education stakeholders that have been involved with CFE and that have lived CFE and experienced it as students, parents, teachers and policy makers. Of course, we needed to meet the policy makers who shaped the policy, but we also met a number of academics who had analysed and critiqued the policy, as well as many observers and representatives from different bodies and institutions across Scotland. CFE is a policy that covers the whole education system, and we met a representative set of stakeholders that matched the types that we would meet in any other education system.
Okay—that is a fair response. I will ask about changes that were made to the report between the draft and the final version. Were changes made at the request of the Scottish Government or any of the educational agencies?
The process for us is always the same with different countries. Developing the reports is an intricate process. We gather the data and then go back and meet the team, review the evidence and draft the report. We have a preliminary version that goes to a number of people internally at the OECD and externally in the country, because we believe that it is important to get the facts right. We are observers—we are not Scottish, and the education and assessment systems in Scotland are very confusing. So, it is important to—
That is an interesting comment—it is “very confusing”.
It is complex, not confusing. With all countries, we always send the preliminary draft for comments and review, to check that all the facts are right. Normally, the process is that we have a national co-ordinator and we interact with that person only. They are in charge of gathering all the different feedback in the country and giving it to us, because if the OECD was to open its mailbox to everybody in the country, it would be unmanageable for us. There is the—
I am sorry to interrupt—I do not wish to be rude—but I had better bring in my colleagues shortly. Is it fair to say that there were changes between the draft and the final report?
Yes, of course it is fair to say that. We prepared a preliminary report on which we got comments from staff at the OECD and at Education 2030 who have a good understanding of curriculum internationally, as well as from you and from other observers. We then reviewed the report ourselves, as a team, and prepared a final draft. That is how any academic would work. A first draft is never a final draft.
That was a process-related question, and we may ask more of those. However, I will now bring in Ross Greer. I hope that we might also be able to bring in Romane Viennet—I am conscious of the fact that we have not heard from her yet.
Good morning. Over the past few days, much of the commentary in Scotland around the report has been about Scottish national standardised assessments—the achievement of curriculum for excellence level assessments—in relation to the references made both in the report itself and at the launch, back in June.
Rather than put words into your mouths, I will ask you to expand on what was said in the report about SNSAs. Specifically, is their purpose clear and are they meeting that stated purpose at present?
To clarify, are you talking about the SNSAs?
Yes.
The report brings in SNSAs when we consider the data that is available to monitor the progress of students within curriculum for excellence. That is linked to what we said earlier about our dual observation that a lot of data was generated but that it was maybe not relevant or appropriate data for the purposes of monitoring the effect of curriculum for excellence on student learning.
I will be brief. The SNSAs are brought in as an example alongside the achievement of curriculum for excellence levels—I think that the acronym for that is ACELs. We consider the relevance of each to data collection mechanisms and compare them with what we would want to see as a monitoring system for curriculum for excellence.
The argument that is made is not to scrap SNSAs or to say that they are useless; it is simply that they are maybe not the most appropriate mechanism to use to measure the impact of curriculum for excellence on student learning. Does that answer your question?
Yes. Thank you. I will go a little bit further. The SNSAs have a dual purpose: they are supposed to collect both formative and summative data. Their stated purpose is to help individual teachers in supporting their pupils and to provide that larger summative data about how the system as a whole is working. Romane Viennet made the point that SNSAs are not necessarily the best way to collect that data. To clarify, are you talking about the summative data? Is your point that SNSAs are not necessarily the best way to collect system-level data?
They do not necessarily collect the best system-level data for measuring the impact of curriculum for excellence on student learning. I emphasise that last part of my sentence.
In relation to data collection, we suggested that CFE needs some sort of study with a focus on students’ experiences of curriculum and assessment and their experiences of and suggestions about the qualifications that are linked to those assessments—a study that would consider the diversity of what curriculum for excellence is trying to achieve, rather than what is currently measured via the SNSAs. The fact that they have that dual purpose perhaps makes them a little less relevant for CFE as a policy.
10:15
The Scottish Parliament information centre, which is a neutral research resource available to all members—it is not aligned with any one party—has just published more analysis of that. It highlights the potential difference between the Scottish Government accepting the headline recommendations of your report and responding to the wider commentary that it contains. For example, the report contains no specific headline recommendation on SNSAs, but there is wider commentary—as you just explained—on whether they are the most useful way to collect the required data. Would you expect the Scottish Government to respond directly to the points that the report makes around SNSAs?
The OECD is independent and delivers a set of recommendations for countries to take on board and consider. What Scotland does with our recommendations is down to its own political process and discussions. We try to provide the fairest, most independent and objective recommendations possible, and to set out what options could be adopted and why, with examples from other countries that have similar policies.
The next step for Scotland is to consider how you want to take the recommendations on board. The recommendations are a summary to guide action, but it is important that they are taken on board coherently, as part of the whole CFE experience for students and for teachers.
As Romane Viennet explained, in the OECD’s view, the assessment system does not fully provide information about how CFE is succeeding. It is focused more on the knowledge aspect, but there are three other capacities that do not appear in the data. That means that, when you try to understand how CFE is progressing, the only—or at least the more prominent—focus is on the knowledge side rather than on the other aspects.
We think that it is important to consider those other aspects, but it is up to Scotland to see how it wants to take on board our recommendations. We cannot tell Scotland what it should do.
Thank you both. That is all from me for now, convener.
Thank you, Ross.
I gather from Beatriz Pont’s answer—unless I have misunderstood—that the text beneath the sub-headings in the report forms part of the recommendations.
I will bring in Fergus Ewing, who has a follow-up question, and then we will come to Willie Rennie.
Good morning to both witnesses. Thank you for coming along. I want to pursue the issue that Mr Greer raised, which you have both already covered—namely data and the absence of sufficient data to enable us to determine outcomes and success in three of the four competencies under CFE.
I noticed that you say—I was going to quote from your report, but you have already confirmed it today—that some data is missing: it is absent. I fully accept that it is for Scotland to respond to that, but perhaps you can give us a little more help with identifying what type of data you think that we should be getting. From whom should we get it, and how are other countries dealing with reportage on data to assess how their children are responding in respect of key competencies?
That is a very large question with which many countries are grappling. The three capacities that sit alongside “successful learners” are more difficult to assess, and many countries are wondering how to assess such things. In addition to our report, we developed a working paper on student assessment in upper secondary education, which is the senior phase in Scotland, from a comparative perspective. The Government wanted us to focus on that area, not for recommendations but rather to provide options for the future.
Professor Gordon Stobart has developed a working paper on the current assessment system and how it could be better aligned to CFE. He provided options to move beyond what he called the “legacy system” of student assessment in Scotland. The options vary, but they focus on developing a more resilient upper secondary assessment system. As you know, the system has been hit by the pandemic, and there have been a number of issues. Other education systems internationally have been more resilient because they have what Professor Stobart calls a “mixed economy” of student assessment approaches.
It is about how you better align student assessment with curriculum and pedagogy. You have to broaden how you assess. Some of it might be school-based assessment by teachers, and some might involve using IT to measure other types of skills that are not necessarily core knowledge. Professor Stobart develops a number of examples that show how other countries are doing that, bringing in Norway, New Zealand, Australia and other countries that are introducing or have tested and successfully used different approaches that go beyond the pen-and-paper test. Scotland has been introducing such approaches more through the Scottish Qualifications Authority. You already have some experience of that, and that is what many countries are moving towards.
When we travelled virtually across Scotland and met parents, we asked them how they saw the impact of CFE on their children. Many said that their children speak much better, are much more open about their views and can cogently discuss and introduce many topics at the dinner table. Parents see a change, but that is such anecdotal evidence, and it is not enough to understand CFE. Therefore, it is important to find the right ways of measuring all the additional skills that CFE is focused on, which many education systems internationally are increasingly focused on, too.
Thank you for that extremely helpful answer. I was interested in your reference to the comments that some parents have made, albeit that it is anecdotal evidence, because it absolutely accords with my impression of listening to parents. They say that their children are well able to express themselves with confidence, and perhaps with greater confidence than was the case when I was at school, although that was a very long time ago.
We heard that quite often, actually—many parents said it. We heard critiques, but we also had anecdotal evidence that CFE is developing a certain set of skills, and that is perceived anecdotally by those who are observing the system and participating in it, and even by children.
We met a number of students. We even met a student who had dropped out, who lacked confidence and who was very disadvantaged. She was wonderful and told us about how she had been bullied and how she got out. We heard a number of personal experiences from students that were very relevant to us in understanding how CFE can help students. We believe that there is not enough engagement with students in Scotland to understand their views.
That interesting comment leads us to Willie Rennie’s questions.
There is a significant debate that we need to have about measurement not just of the overall system to help politicians in the national debate but of what is going on in the classroom, too. Your report makes it quite clear that you think that using the SNSA assessment process—[Inaudible.]—for broad general education. Following on from Ross Greer’s questions—[Inaudible.]—SNSAs would not be used for national monitoring purposes. There needs to be a different process.
You have also talked about the separation of the—[Interruption.] I am sorry—something has happened.
We can still hear you.
My screen has gone funny.
I understand the point that SNSAs narrowly focus on one capacity and that that needs to change, but there are two separate issues here. The fact is that national monitoring needs to be separate from the assessment process. Am I understanding you correctly that your very clear message to us is that SNSAs are not suitable for national monitoring purposes?
Did you get that question, Beatriz?
Yes. I will ask Romane whether she wants to respond to it.
Thank you for your question, Mr Rennie. I got a little bit of white noise while you were asking it, so I will repeat what I understood of it. Are you asking whether we are saying that the SNSA is not fit for system monitoring purposes?
That is right.
As I have said, our recommendations and suggestions are based only on curriculum for excellence, and the report makes no pronouncement on the broader issue of the education system. We did not assess SNSAs on that basis, so I can speak only to how it connects to curriculum for excellence and its intentions.
In the report, we state rather clearly that our team does not consider the SNSA approach to be the most appropriate system monitoring mechanism as far as CFE is concerned. However, I point out that the SNSA is cited alongside at least one other monitoring tool that was developed—the CFE levels. It is cited as an example, but only with regard to CFE.
I think that that is clear. What you have just said is not included in your main recommendations, and my fear, therefore, is that the Government will not address it in a substantial way. I understand the purpose of the SNSA in assisting the process with regard to the teacher in the classroom, but, as it is currently used, it is not really suitable for national monitoring purposes. I just want to be absolutely clear that that is what is being said.
The points that I made are actually included in the main recommendations. For example, recommendation 3 refers to the alignment between the curriculum, qualifications and system evaluation. Again, I make the point that the SNSA is cited as an example in that recommendation only with regard to CFE.
Thank you very much.
I go back to my earlier point. I concluded from your previous answers that the text underneath the sub-headings in the report is part of the recommendations, and you seem to have suggested that again. Am I correct in saying that?
10:30
As Beatriz Pont said, the text beneath the recommendations is part of the recommendations. The OECD does not expect the Government to provide a response or an action to every point immediately; it should use the whole text and the guidance that we provide on the recommendations. The recommendations cannot be understood or interpreted outside the context that we provide through the broader text.
That is exceptionally clear.
Michael Marra wants to come in on another topic.
I want to clarify something very quickly. The generic overall recommendation is recommendation 3. Underneath that, there are paragraphs 3.1, 3.2 and 3.3. The heading for paragraph 3.3 is:
“Align curriculum, qualifications and system evaluation to deliver on the commitment of Building the Curriculum 5”,
which is a document that the Scottish Government prepared a while ago. Underneath the heading is the text that will help the Government to consider how to deliver the recommendation.
You are confirming what Romane Viennet said. That is much appreciated.
I found all that commentary on assessments very useful.
Earlier, Dr Pont commented on the work that the OECD has done internationally on development of other systems. It is great to hear that other countries are observing Scotland, but I also want to learn a bit from those countries. Dr Pont said that other countries have implemented new curriculums that share the same ethos as curriculum for excellence. Have those countries faced implementation issues that are similar to those that we have had in Scotland? Are there any issues that are distinct to us, in Scotland?
Those are very interesting questions for us. Since the introduction of curriculum for excellence, a number of countries—Norway, Finland, Estonia, New Zealand, Japan and Wales, for example—have introduced curriculum reforms that involve what we call 21st century skills, competences and knowledge. Many countries, including Iceland more recently, are introducing transversal skills and values—the values that come under curriculum for excellence’s four purposes. We have watched many countries introduce such systems. A broader OECD project covers those systems, using a framework that is similar to the one that was used for Scotland, but we have looked at only a few of the systems that have been implemented.
One issue is teachers’ preparation and their development of the curriculum. In Scotland, teachers have to create and shape their own curriculum at a local level, which requires specific skills that teachers might not have been fully prepared for through their initial teacher training programmes. They also might not have the time in schools to develop those skills. We have found that to be a big issue.
There are high expectations of teachers, so it is important to build curricular capacity at various levels of the system. That is the case internationally. For example, Mexico introduced a skills-based curriculum and then delivered online training for one day to many teachers to start the course. Scotland has invested more in developing capacity. Many systems are developing different approaches to providing capacity for teachers.
That is the first big issue—you have designed a very good curriculum, but it is still difficult for teachers and principals to implement it at the school level.
The second big issue concerns student assessment, which we have discussed in response to a number of the questions that the committee has posed. The issue is how to align a 21st-century skills curriculum with 19th-century assessment systems. Many countries are upgrading and changing, or are looking at how to review, their assessment systems. Covid has provided an opportunity to do that, because assessments had to be cancelled and schools had to find different ways of delivering information on students’ progress. For example, France, Norway and other countries are giving greater weight to teacher-based or school-based assessments, and other methods.
The third issue is the need to find the right system-monitoring mechanisms to allow understanding of how the curriculum is moving forward.
The final issue, which we cover in our report on Scotland, concerns the institutional aspect: how the system should be reviewed, how often that should be done and who should be doing it. We found that it was important for Scotland to have a professional process for reviewing CFE rather than an ad hoc system, so we recommended that there should be an institution like the one in Ireland, which could review the curriculum in a cycle and take on board issues that arise throughout the year in a process that is both professional and informal.
That is, more or less, an overview of some of the challenges in implementation of the curriculum. There might be others, but I will stop there.
I understand that it was a broad question that probably requires broader analysis of the issues.
You mentioned that in Mexico there is a lack of training to prepare teachers to engage in curriculum development. That would have been somewhat familiar to teachers in Scotland at the start of curriculum for excellence, given the great challenges in its implementation phase. Are there places that have done that better, and are there lessons that we can learn? You have given Mexico as one example in which things have not gone well because of the lack of such capacity. Are you saying that we need to lift that capacity in Scotland? What kind of capacity do we need? One of the core issues that you mentioned is in-school development time. Where are the models that we should reflect on and learn from?
One of them is your neighbour, Wales. We have been working with Wales, which has, in a way, been following Scotland in developing a new curriculum. It will start to implement that curriculum from 2023, and it is aiming to develop a professional learning system for teachers that will engage with them at the local level and provide them with the right training and networks. Wales has a set of consortia that will act as school improvement partners. In a way, they will be a more formalised version of Scotland’s regional collaboratives, and their role will be at the core of the system.
We are also working with Norway, which introduced a decentralised funding scheme to develop a collaborative approach to training and capacity building at the local level and at the school level. The system is complex, but it is a savvy system—Norway is giving money to universities but only if they tailor their training to meet the demands of schools. Sometimes, universities have ready-made packages of training that do not fully help teachers. Norway is changing the way things are by setting up a collaborative approach to developing the right responses to meet teachers’ needs. That is a good approach, and we have been working with Norway on it. The development is slow and takes time—it is a process.
Scotland has the partnerships between universities and schools on professional inquiry. We find that those types of regional and local approaches to supporting schools and their staff to work together to solve their issues, and to develop themselves and their capacity, are working well internationally.
It has been a fascinating evidence session so far. I have a few questions on some of the recommendations on what the report sees as the mismatch between the senior phase of school and curriculum for excellence. One point is that there is too narrow a range of learning activities in the senior phase. I am keen to know how that might be improved and how you would broaden out the activities. There is much talk about diversity of pathways being required and about lack of time to go into detail in some subjects. I wonder about the range of learning activities and about going into detail on subjects.
If I have it correctly, the OECD’s suggestion is that there should be a limited number of core subjects in the senior phase, and some subjects in which students go into much more specialist detail. I am open minded on that, but I wonder whether it might have the unintended consequence of narrowing options for young people in the senior phase. I am interested in hearing your comments on that.
You have posed quite a large question. We analysed the senior phase and saw that there is a jump between the BGE, which meets the aspirations of CFE, and the senior phase. When students jump to the senior phase, there is what some have termed the two-term dash for exams, so that students get their qualifications and leave the system. There are all the student assessments, and the structures are set for students to pass the exams but not to have the kind of broad experience that CFE envisages.
We think that that is hindering the curriculum experience of many young people. Actually, the students whom we met told us that. They said that, when they arrive in the senior phase, having learned in a new way and having had a much broader experience, they then have to go back to learning for the test, which changes the way that they perceive education. We think that the senior phase has an issue between breadth and depth that is still unsolved and needs attention.
We think that a possible solution would be to clarify the structure of the senior phase but without restricting its diversity—as we heard, there is an objective of being diverse and providing as many opportunities as possible for young people. That was an issue that many highlighted to us. However, the current approach might be too broad and not deep enough. We think that without restricting diversity it could be possible to define a number of typical pathways or profiles for the senior phase with a limited number of compulsory courses or specialisation courses, and to provide room for additional or optional units. There would be some guidance from the system as to what the expectations would be.
We welcome the provision of courses by colleges. Students can take courses in colleges and colleges provide courses in schools. That is an interesting model that we valued greatly because it provides a range of opportunities for students to widen their learning experience while they are teenagers.
10:45
That is very helpful, and I absolutely recognise that two-term dash. I do not think that the OECD has been prescriptive about how that could be fixed. Some schools currently do nat 5 or highers over two years. They pace the curriculum and syllabus at a much more appropriate level for students. I get that. On additional provision of the further education that is already dropping down into schools, I absolutely get the idea of expanding those pathways and broadening that out.
My follow-up question relates to assessments. I see reference in the report to much more use of portfolio work, continuous assessment and teacher judgment—with appropriate moderation, of course. I also see that some of that moderation for continuous assessment should be external to the school, in order to build much more chunky checks and balances into the system. There is a lot to welcome in there.
My question is in the context of the poverty-related attainment gap. In years gone by, when we have given young people more content to produce, the young people who had better support at home for preparing folio work were, quite often, from higher-income backgrounds. They had more time and space at home, and they had tutors and that kind of thing.
I support what has been said, but would we have to be careful to broaden out continuous assessment, and not to build in an advantage, as we did with external assessment, for a cohort of young people who might be in a better place to take up the benefits of continuous assessment because of all the additional advantages of things such as tutors and parental support?
Yes. The equity dimension is very important in CFE. According to PISA, Scotland already has higher equity than the average education system. In addition, some of the data show that it has managed to reduce slightly some of the gaps between low and high socioeconomic backgrounds.
Given that so much of CFE is devolved to schools—in developing the curriculum, choosing the pathways and deciding which courses are to be offered—we were concerned that the more advantaged and privileged schools would have a broader array of offers, supply and courses. That can lead to higher inequality. It is important for Scotland to consider inequality and how CFE can, in all schools, be provided so that all children benefit from it.
When students are assessed externally, there is a question about which system is fairer. There has been discussion about what is fair. Do people think that external student assessments are fairer because they are the same for everybody? That has not been demonstrated to be true. We have seen that in the United States with the standard assessment tests, which have been dropped in many places because they have been considered to be unfair, given that—as you said—not all students have the same capacity to prepare at home.
It is important to have the right support mechanisms to enable schools to support their students and for schools that have more disadvantage to have more support, if possible, and the right conditions for their students to thrive.
Sometimes, teachers are in a better position to support their students, so continuous assessment by teachers and their school can be fairer than an exam that involves no support for, or individual knowledge of, the student.
We consider that it is important to have a balance—not necessarily to drop one or the other, but to make sure that schools have provision, that students have the right support mechanisms, and that the assessments are well balanced so that they give a good understanding of the performance of students.
Thank you, and thank you to Bob Doris. I turn to Oliver Mundell.
I return to the original line of questioning that you started, convener. I have serious concerns that the report is flawed and has not engaged properly with non-ministry academics. I have written twice to the OECD without ever receiving a reply, and when, after a freedom of information request, I asked the Scottish Government which non-ministry academics were suggested to the OECD, I was told that a planned phone call to discuss additional participants did not take place. I am therefore interested in finding out how the non-ministry academics were suggested and where the view that CFE had been universally embraced in Scotland came from.
Thank you for your comments and question. I do not know whether you were present when I introduced the methodology.
I was, but I was not really satisfied with what you said. My understanding was that the OECD sent a paper to Scottish Government officials about who would participate in the review, and one of the questions in that paper was about which additional non-ministry academics should be approached. The Scottish Government and the OECD have been unable to tell me who was discussed and why you chose particular individuals. I am confused by that, because there are a number of voices in Scottish education who have fundamental concerns about curriculum for excellence and the principles behind it.
As I said at the beginning of the session, we have a set methodology. In every country we look at, we meet a group of academics. An issue for us is time constraints; we cannot visit all the academics in a country, because our time is limited and the report has to come out. As a result, we ask for a select set of academics who are critical of, or who support, the policy in question. We approached Keir Bloomer at the Royal Society of Edinburgh, for example, and met a number of others, but even where we did not meet particular academics, we still read their papers and all their criticisms or supporting views. Regardless of whether we have met the academics, we have covered much of the critical ground on aspects of CFE that many Scottish academics have highlighted.
To tell you the truth, I cannot remember the specific exchange on the academics. We provide guidance about who we want to meet, we have an exchange and then we define a final set of five to 10 academics. Because all this was happening online, it was more challenging to have a large group of academics. In Sweden, for example, I have been in a room with 12 academics—such sessions are always fascinating. In Scotland, we still met a number of them either as a group or individually later, when we could do so.
With regard to the request for information, the OECD has a set process that works for all countries. It is sound, independent and objective, and we stand by it. I can tell you that Lindsay Paterson was on our shortlist, but it was not possible to fit him in, so we read what he had published. We ask our national co-ordinator to co-ordinate things for us and to send us all the information to ensure that we do not open ourselves up to receiving so much information that we get overloaded.
We covered a good set of academic perspectives, regardless of whether we met those academics or read their materials in the initial stages of or during the review. That is how we see the situation.
Quite frankly, I find it shocking that the OECD did not have the time to speak to Professor Paterson, who is highly regarded in Scotland by Scottish teachers, parents and many people in academia. That the voice of one of the leading critics of the current curriculum was not included and only his papers read confirms many of my concerns.
The report skirts over issues around knowledge. It pushes points, but it does not question whether the capacities that are at the heart of CFE are the cause of the problem. As a result, the report is less than it could have been.
I do not need an answer to that, convener. I am happy to let other members come in.
We approached the RSE and we met Keir Bloomer. Therefore, we got the RSE’s perspective.
With respect, Keir Bloomer was not happy with the process either. He said that it was evident that it had been “stage managed by government”. Therefore, I do not think that it is right to reference him as a defence for not having taken time to speak to Professor Paterson.
Thank you, Oliver. That point has been made. Does James Dornan want to come in on that particular line?
Yes. I found the previous intervention highly embarrassing for the committee. The OECD is an internationally respected organisation. Oliver Mundell seems to have a conspiracy theory that the Scottish Government has power over all sorts of international bodies and that, if they do not do exactly what he wants, some conspiracy is going on. It is unacceptable for the OECD to come here in good faith, take questions and get such abuse from a member who is trying to—
I am not sure that we can say that what occurred was abuse, but your point has been made. Would you like to make a further point to the OECD?
To be fair, the evidence that I have heard so far has been pretty good. Obviously, there are clear issues relating to the final stages, but I was wondering about the fact that we are talking about the apprenticeships being part of the end of the process. I know that the OECD did not write the report on the upper secondary assessment, but how does it see the work of integrating the foundation apprentices being best built on? How might existing mindsets be shifted to ensure that that work is easier?
Thank you for your comments, Mr Dornan. We did not cover apprenticeships specifically. We welcome the openness of CFE and the senior phase to develop other qualifications that are focused more on expanding the pathways and the capacities for students to develop professional training and vocational education and training. Apprenticeships are one way. There is another report by the OECD on apprenticeships, which is by a different department. I would be happy to send that to the committee, if members want to read it.
Thank you very much.
I am interested in the alignment of business links, links into universities and colleges, and collaborative work with them. Will you expand on that, please?
Will you repeat the question, please?
I am really interested in the links to colleges and universities and what they are looking for. Will you expand on that?
Do you mean what universities are looking for from students coming in from CFE?
Yes.
Do you mean in terms of knowledge and skills, and whether CFE is suitable for universities, according to the universities?
What work do we need to do with schools and universities to support young people to get the skills that colleges and universities want them to have?
11:00
We met a number of university rectors and heard their views on the types of skills that students should have. More broadly, the OECD considers that students need the knowledge, skills and values that are important in enabling them to participate and contribute effectively in their societies and in shaping their future. The skills that are developed in CFE are quite important in that sense, because universities are looking at ways of assessing whether students are ready for them. Internationally, universities are changing the way that they assess and gather evidence to understand the skills that students have. There are also efforts to expand universities’ role in shaping curriculums. It is important that they are consulted in the process and contribute to shaping curriculums for the future.
I cannot go into more detail on that because I do not think that we covered the matter in very much depth in our report. However, there is a whole tertiary education team at the OECD from which I can request information to send to you.
I have listened with great interest, and I declare an interest: I have been a teacher for 30 years, so a lot of what you have said resonates with me. I first started teaching when we had the five-to-14 curriculum, so I was teaching at the beginning of the implementation of curriculum for excellence. It has been interesting for me to track the journey of its implementation and review.
It is great to hear that other countries are following our pioneering curriculum for excellence. As a former teacher, I agree that it provides avenues for children to express, for example, their talking and listening skills. I think that you referred to that when you fed back that children are much more articulate and able to debate and put their views across. Active citizens and responsible learners are part of the four capacities—I am familiar with those.
It is perfectly reasonable that the curriculum requires refinement after this amount of time. We have to adapt to a future that has changed, especially in the context of Covid, in relation to the different balance of skills that we will need. I note the statistic that 95 per cent of leavers reach a positive destination, which I think reinforces the fact that, with universities and colleges as well as apprenticeships, there is a wide range of positive pathways for our young people. I was glad to see that.
I noted the narrowing of the equity and attainment gaps. You might have noticed that we, in Scotland, sometimes suffer from the Scottish cringe a wee bit. We can do down education and certain other things, and we do not celebrate our successes as much as we should. Can I clarify that, in your opinion, Scotland’s education is performing well and is internationally regarded and that our education is not going backwards? Teachers sometimes get quite upset, as do parents and pupils, when they hear the narrative that Scottish education is not that great.
With regard to SNSAs, I totally agree with what is in the OECD report. As a practitioner, I found that SNSAs were not properly measuring the actual skills that we were teaching. I also found that disadvantaged children were even further disadvantaged by the assessments, because the examples in the questions did not resonate with those who came from poorer backgrounds. For example, stories would be set in castles—I suppose that is not a good example in Scotland, as we have a lot of castles here, but you get my general point.
Thank you for your evidence—it is really interesting. It would be helpful to hear a wee bit from you on Scotland’s standing in education internationally and across Europe.
That would be very interesting.
Thank you for your comments, Ms Stewart. We see that curriculum for excellence has expanded the opportunities for Scottish learners to thrive, and we find that the four capacities are very much relevant to the future. We think that it is important to invest further in CFE, but Scotland is viewed internationally as an example of high performance. When we compare the data with that from other countries, we see that Scotland is above average on a number of indicators, especially the OECD’s new indicator on global competencies. On those types of skills, Scotland is a very high performer. You are being watched internationally.
In June, we held a webinar to launch our report on Scotland, and more than 1,000 participants logged in, not only from Scotland but internationally—from the four corners of the world. People want to hear what Scotland is doing and how you are developing your curriculum. Policy always needs to be reviewed—it does not stay fit for purpose, and you do not want to go backwards; you need to review for the future. It is therefore important to continue the effort and ensure that you always review and update policy so that it is fit for purpose. As our societies change, our education systems need to reflect that, so it is very important that you do so.
When we looked at CFE, we found that, after 10 years of implementation, you still need to review it professionally and see what still works and what could be improved, and you need to define a good process that is institutionalised in order to do so.
I totally agree that we need to refine the curriculum. Are we in a good place to be able to move forward in many of the areas that you mentioned? Will our structures be fit for that purpose? Will we be able to do that?
That is a difficult question. On whether your structures are fit for purpose, we are providing a set of recommendations to consolidate the structures in order to make curriculum for excellence less political and more policy oriented. We find that, at present, the politics overtake the policy. That is why we think that it is important to have the right institutional structure, so that CFE is professionally reviewed by an institution that has the experts to do that and that consults externally with all the different stakeholders.
In our view, Scotland has the will. The whole system is obviously interested in education—it is one of the top priorities in public policy. We welcome that, as it is immensely important. If it is such a priority for you, you will make it happen and drop the politics behind it as you move forward.
I want to ensure that colleagues get the opportunity to come back in to ask questions. Fergus Ewing has a further question.
I was very pleased to hear Beatriz Pont’s very positive remarks about the confidence that is displayed by Scottish young people. That was a tremendously positive comment and is very encouraging. I am afraid that I have to echo Mr Dornan’s remarks—the remarks that another committee member made were inappropriate.
I want to ensure that James Dornan has an adequate opportunity to ask his questions.
I am fine, thank you.
Three other members have indicated that they wish to ask further questions.
For the purposes of time, I will ask only one question, which is on the governance arrangements around curriculum for excellence and, specifically, on the OECD’s findings relating to the Scottish Qualifications Authority and Education Scotland, which are the two major agencies that are responsible for delivery, and their relationship.
In response to the OECD’s report, the Scottish Government announced that those two bodies will, in essence, be merged. Education Scotland’s inspection function is being removed. That function will be carried out independently, which is supported across the Parliament. However, the body that is responsible for developing the curriculum and the body that is responsible for developing qualifications will be brought together. I recognise the point that was made about the qualifications system and the curriculum simply not aligning, so, on the face of it, it makes a lot of sense to bring the two agencies together in order to get, I hope, better alignment. However, is that a common governance arrangement in other comparable education systems?
As you have said, we found that the system is not aligned, with the SQA and curriculum for excellence not responding to each other. In some systems, the equivalent of the SQA is a separate institution that does quality control—sometimes, it can be an inspectorate. You have a unique system of qualifications that is very UK based. Many countries do not have such a system; they have an external test that is developed by the Government and a set of qualifications that are developed by an independent or semi-independent institution. Your approach is unique.
I am not sure what example I can give you. For us, the most valuable example is the Irish National Council for Curriculum and Assessment, which is a professional institution that defines and reviews the curriculum. It provides advice to the Government on how to shape the curriculum, and then the Government takes action. [Interruption.]
Something has gone wrong. Ah, now something is going right. Back to you, Beatriz.
In our advice for Scotland, we build on the example of the NCCA, because it is an independent institution that shapes the curriculum, gathers opinions from different stakeholders and has professional staff working on different curriculum areas. In Northern Ireland, one institution has both remits. When the meeting is over, I can send the committee more information.
We did not recommend that responsibility for qualifications and curriculum lie with the same institution; instead, we recommended that student assessment beyond qualifications sit with the curriculum in the same agency. We left it somewhat open for Scotland to decide how to handle this—we did not make a direct recommendation on the SQA.
11:15
Thank you. Are you perfectly happy with that reply, Ross?
Yes, convener. It was very useful.
Before I turn to Willie Rennie, I think that Romane Viennet wants to make a contribution.
I just wanted to reinforce Beatriz Pont’s point that student assessment does not apply only to qualifications. In the report, we make it clear that student assessment should be dealt with by the same agency that deals with the curriculum to ensure coherence. However, the issue of qualifications is not part of that argument, and it has been left to the Scottish Government for further reflection.
Thank you very much for that.
The report makes a lot of comment and recommendations about how knowledge is addressed. There is often misunderstanding about what is covered by knowledge, but I note that there is also an issue about how it is addressed in the broad general education and how that should change. In your report, pupils talk about their difficulties in catching up with the knowledge requirements of the senior phase because it has not been covered sufficiently in the broad general education, but the report also identifies a bias in the system towards one of the four capacities—successful learners. Am I correct in saying that there is a tension in that respect? Can you explain a little bit more the issue of knowledge and what needs to be done to address it properly?
A gap that we found in the concept of knowledge in CFE is that, in the senior phase, the focus is fully on knowledge, while in the broad general education there is more focus on the four capacities more broadly. What students told us is exactly what you have just said. When they arrived at the senior phase, they felt that they were not fully prepared, because their learning up to that point had been based on a broad pedagogical approach, and they found the renewed focus on knowledge alone in the senior phase to be challenging. That is what the qualifications system gives weight to in the senior phase.
It is quite an important issue that we have detected. Kids in the senior phase are being tested only on knowledge, not on other skills and competencies, so there is a gap for 14 and 15-year-old students moving into the new two-term dash regime.
The other issue is that knowledge lies at the heart of Scotland’s pride, and we understand that it needs to be built in for CFE to move forward with everybody’s support. It is therefore important that it be given more clarity and included more in the vision, to ensure that it is well supported by everybody. The concept should be consolidated in the BGE so that kids arrive well prepared in the senior phase. There needs to be a more seamless process with regard to knowledge for students from three to 18, instead of there being a focus on the four capacities and then a focus on knowledge alone. It has to be consolidated throughout CFE.
The report also says that there is too much emphasis on the successful learners aspect of the four capacities. Is it not a slight contradiction to recommend more focus on knowledge in one part of the report and, in another, to say that there is almost too much emphasis on knowledge?
There is too much emphasis on knowledge in the senior phase. The balance that is required is not yet there. It is important to make sure that the four capacities are better developed and better assessed in the senior phase so that the concept of knowledge is spread throughout the student’s learning from three to 18. How best to ensure that knowledge is built in across the fourth capacity—without forgetting the three other capacities that you prioritise—is still something for you in Scotland to consider, and we consider it to be very important.
I have one more short question. Does that cause a problem with the connections with further education, higher education and employers? They are used to the current system, with its focus on knowledge, and you are proposing to change that. How do we make sure that it is fully integrated and that we do not have a problem at that end, by solving the problem between BGE and the senior phase?
We considered and provided commentary on how the assessment system needs to change—not so that it excludes knowledge, but so that it gives some weight to other types of skills and competencies. That is happening internationally, and many universities and employers are recognising students for other types of skills, which they consider to be as important as knowledge. Therefore, changing the assessment system will have an impact on how students are prepared throughout the whole system.
I will make a quick point on Mr Rennie’s question about whether the bias towards successful learners and the lack of treatment of knowledge is a contradiction. It is not a contradiction if you look a bit more deeply at what knowledge can cover and at how there are different kinds of knowledge and different ways of using that knowledge.
Our argument—which the report gets at—is not that there is not enough or that there is too much knowledge, but that the focus is too much on one specific type of knowledge and one specific way of rendering and using that knowledge—for instance, within student assessment. As students grow older and get to the senior phase and prepare for qualifications, they tend to narrow the kind of knowledge that they focus on. Again, this is a generalisation, but what is asked of them in most qualifications is that they render the concepts and memorise content, as opposed to showing, in a specific task or a specific exam, ways in which they can use that knowledge to get to another conclusion or to build the argument.
I emphasise that I am not speaking about skills or competencies, but about how ways of learning and rendering knowledge are also encapsulated in what we call “knowledge” and in what we say that CFE should get into. Knowledge is not only content and memorisation; it is also getting the facts right so that the arguments, the thinking process and—later on—the development of skills have that basis. It is important to bring out that distinction. It is not a contradiction; it depends on how deep into the concept of knowledge you go.
We will have one final question from Michael Marra, which is on the same line, before I return to the deputy convener for a comment.
It is useful to hear those points. One of the most common comments that I hear from university principals and vice-principals is a real concern about the level of knowledge, capabilities and capacity in some of the people who come to university as undergraduates—in particular, in science, technology, mathematics and engineering subjects. I have heard that, for first-year students, universities are having to teach, or re-teach, things that would previously, in their understanding, have been in the school curriculum. I go back to Willie Rennie’s comments on how we can work with universities to try to understand why that is happening. Is it inevitable? The committee could perhaps discuss that at a later point.
My question relates to some of the causal factors around that issue. There is much research on it, including a report from the Education and Skills Committee in the previous session of Parliament, which noted—to get quite technical—that a key issue with senior phase implementation is timetabling in the fourth year. That issue was created predominantly by moving from standard grades, with 160 hours of teaching time over two years, to nationals, with 160 hours over one year.
From your research, how key do you think those issues are to implementation of the curriculum? It would also be useful to hear comments on what seems to me to be the resulting inevitable narrowing of choice, with regard to the senior phase and the general education experience.
We hear you—that is a valuable comment. The issue of depth versus breadth, especially in the senior phase, is important and was raised in a number of schools by students and by principals, in particular. We heard of students having to take 17 courses, which is too many, and arriving at university without having studied any subject in depth. They have covered many subjects but lack the minimum level of knowledge and capacity in specific areas.
We do not want to constrict the choices for students, and we were very impressed by everybody wanting to provide choice, choice, choice—as much choice as possible—for students and as many options as possible in schools. However, we recommended that a balance be struck between choice and the quality of education.
We recommended that a number of typical pathways or profiles could be defined for students, who would then go into specific areas at university. They could take a limited number of compulsory courses, which would give them enough depth while also providing choice and diversity of courses. How to marry those things is an issue, but we think that it is an important one to tackle, so that is a very good question.
On that issue, I have struggled to understand some of the process in my home city of Dundee. There has been a collapse in choice for many students. It is not clear to me whether that is being driven only by the process that we are describing or whether the fact that the council administration has cut one in eight teachers—12 per cent of all teachers—from schools has resulted in that kind of narrowing.
Can you comment, from the work that you did, on the resourcing of choice versus the structure of choice? What are the constraining factors?
We heard many principals say that they did not have the teachers available to provide enough choice. That is an important issue in respect of providing the right number of courses. Another point is that students may choose their courses strategically because they want to get into university. Are they and their parents asking for a narrower choice because that is what is being measured to enter further and higher education?
There is a balance to be struck in that regard. What is the question? Is it that schools do not have enough teachers or enough school or classroom capacity to offer all the choice that is necessary? If a school opens up choice, it may have only three students in a class. There is a resource issue with offering so much choice when it might not be taken up by many students.
We heard about some interesting partnerships. We visited a school—in Skye, I think—that was able to provide choice by collaborating with other schools, and schools in Oban and Tiree worked in partnership so that a school could offer pupils a subject that other schools could not offer. Principals told us that analysing the number of teachers available and the number of students who might enrol in courses led them to make strategic choices about the courses that they would offer.
Kaukab Stewart would like to make a comment, then I will ask one final question.
11:30
I will try to be brief, as I am keeping an eye on the time.
I thank Dr Pont for clarifying the gap in relation to the concept of knowledge and for explaining what that means. In primary school, there is an emphasis on the application of knowledge—that is, getting pupils to do something with the knowledge that they have acquired. That involves problem-solving and critical-thinking skills. As I remember it, curriculum for excellence was based on Bloom’s taxonomy, and knowledge is at the foundation level of that triangle of higher-order thinking skills.
On assessments, I agree that they do not match up with what we are doing in terms of knowledge or how children learn as opposed to what children learn. Our young people are learning very differently. A lot of what they are doing involves the application of knowledge using critical-thinking skills and problem-solving abilities. However, our assessments do not measure that. We are still in a pencil-and-paper approach, or an online replacement for that.
I welcome the clarification that Dr Pont gave, because I do not think that everyone understands the situation. Everyone says that children must learn facts, but it is what they do with those facts that is important, because that is what will help society and help them in their jobs.
That issue also feeds into skills, which I mentioned before. We need people who can apply skills, not just people who have knowledge. That broadens out into not only university entry, but entry to colleges and apprenticeships.
I will ask the final question. You have packaged together quite a lot of recommendations in your report. My simple question—to which, I am afraid, I must ask you to give a short answer—is this: what recommendations should be prioritised? Which of your recommendations do you feel we should look at first?
Who would like to answer? I left the difficult question to the end.
I agree that it is a difficult question. Our core message is that you should find a balance between depth and breadth of learning throughout CFE and adapt the pedagogical and assessment practices in the senior phase. The balance between assessment and CFE needs to be found as a priority, but that cannot be done immediately, because it will take a while to think about the best way to do that.
You should combine a systematic and inclusive approach to curriculum review with the creation of a clear division of responsibilities. We found that the landscape is complex, with many committees and institutions, and we believe that the responsibilities need to be more clearly divided. You should also support the teaching profession and align teachers’ qualifications to the curriculum. For us, it is important that students have a trajectory and that there is no gap between the broad general education and the senior phase.
To respond to Kaukab Stewart, I say that, although highers involve teaching to the test and the repetition of knowledge, advanced highers were welcomed by students because they felt that those courses measured more of how they use knowledge, which they valued and felt was similar to the CFE experience.
Romane Viennet may want to add something, if we have time.
We do.
Beatriz Pont has set out our core recommendations clearly, and I do not want to add to what she said.
It remains only for me to give you our sincere thanks for the two hours that you have given us this morning. We have subjected you to a lot of questioning and you have not wilted once. Thank you for that and for the tremendous benefit that you have given us by responding so fully to our questions. We are indebted to you and we appreciate that. Your time this morning has been an investment in our understanding of the work that you have done for us. Thank you very much indeed.
The public part of today’s meeting is now at an end. I ask members to reconvene immediately in Microsoft Teams, so that we can discuss our final two items in private.
11:36
Meeting continued in private until 12:35.