Children's hearings redesign: consultation analysis

Independent analysis of responses to the Children's Hearings Redesign consultation commissioned by the Scottish Government.


Background to the Children’s Hearings System redesign consultation

Scotland’s unique Children’s Hearings System (CHS) dates back to the Kilbrandon Report, which was commissioned in 1961 “to consider the provisions of the law of Scotland relating to the treatment of juvenile delinquents and juveniles in need of care or protection or beyond parental control”[1] and published in 1964. The Report’s recommendations were incorporated into Scots law through the Social Work (Scotland) Act 1968 and took full effect three years later. The fundamental principles of the system as instituted by the Kilbrandon Report have remained relevant since the establishment of the CHS more than 50 years ago, enshrining Scotland’s welfare-based approach to children’s care and youth justice in law and offering legal protections to children who are in need or are at risk. In May 2023, following a 20-month review of the CHS, “Hearings for Children: The report of the Hearings System Working Group”[2] was published, setting out a package of recommendations relating to the hearings system’s redesign.

This consultation is based on the Scottish Government’s response[3] to the Hearings for Children report and is part of a series of steps intended to support the redesign of the CHS. The focus of the consultation was limited to areas that may require legislative change[4] and could be introduced to the Scottish Parliament as part of a Bill and considered towards the end of the current Parliamentary term. More details on the aims of the consultation are contained within the policy paper; in brief, though, the aim of the hearings system redesign is to build on the strengths of the current system to enable the best possible experiences for children and their families who require its support, in view of the Scottish Government’s commitment to Keep The Promise by 2030.

Consultation process and analytical methods

The consultation process

The consultation was launched on 26 July 2024 and ran for just over 13 weeks, to 28 October 2024. A small number of organisations were granted an extension to the original deadline by the Scottish Government, with 105 responses (inclusive of easy-read responses) received by mid-November. One response was received after this time and was not included in the analysis.

The consultation was hosted on Citizen Space, the Scottish Government’s consultation platform. The consultation comprised 90 questions in total and included closed questions (questions with specific answer options, including many yes / no questions), open questions (questions with space for respondents to express their views freely and in their own words), and questions combining both formats (i.e. asking a “yes / no” question with open-text space for a respondent to explain their answer in more detail). Some questions contained several parts (i.e. several questions within a question).

The consultation posed a range of questions pertaining to the redesign of the Children’s Hearings System, including, but not limited to, the broad topics of:

  • The principles of a redesigned Children’s Hearings System;
  • Before a Children’s Hearing;
  • Grounds for Referral and Associated Processes;
  • Role of the Children’s Reporter;
  • The Children’s Panel and Children’s Hearings;
  • After a Children’s Hearing;
  • Assessing Impact.

An accessible summary version of the consultation was developed by CYCJ based on the original consultation document and was made available upon request. These responses were mapped to the questions within the main consultation by the Scottish Government and analysed alongside the other responses.

Responses

A total of 105 responses were assessed and checked for duplication by the researchers, leaving 104 responses that were included in the qualitative analysis. As already noted, several of these responses were received after the original deadline had passed. The research team waited until the respondents who had been granted an extension to the deadline had submitted their responses before beginning the process of analysis. However, it was requested that a final late response be considered as part of the analysis after the quantitative component of the analysis had already been completed. This meant that the quantitative analysis in this report takes into account 103 (not 104) responses. It is unlikely, however, that this would have meaningfully impacted the answers to the quantitative questions, as this additional response did not contain any answers to these questions, instead leaving them blank. A further response was uploaded to Citizen Space by the Scottish Government in December; this response was included in neither the quantitative nor the qualitative part of the analysis. It is important to note that responses from different groups (such as young people or panel members) are not necessarily representative of the general views of young people or panel members (for example); rather, they reflect the views of those who did respond to the consultation.

In total, 47 responses identified as an individual response, representing 45% of responses (although it is noted some individual responses included more than one contributor and some organisational responses identified as an individual). Some organisational responses described how they had undertaken their engagement activities, with some responses containing the views of a wide range of people: “In responding to this consultation [organisation] has undertaken internal engagement to ensure that the circa 2,500 volunteers working within [the organisation] have all had an opportunity to express their views and contribute to the response, either by sharing their views with [the organisation] or responding directly to the Scottish Government’s consultation via the online portal” [R39, CH Org].

A further 59 responses identified as an organisational response, although one was a duplicate response and therefore was removed, meaning there were 58 organisational responses. This represented 56% of responses.

Additionally, 16 responses were submitted in response to the easy read version. These responses were either from children and young people or group responses that mainly involved young people: “In total, 25 young people across Scotland gave us their views; individually (13 interviews), or through three group discussions with board members at [organisations]” [R97, YP Org]. This represents 15% of responses. Various other responses will, however, have also included young people or drawn on young people’s views and lived experience as part of their response.

Where identification was possible, responses were categorised based on their role and sector. Although it is acknowledged that some respondents could feasibly be included in multiple categories, each response was assigned to the single category the researchers deemed most appropriate or relevant. Where it was clear from the content of an individual response that the respondent was a panel member, this has been identified. Where an individual response came from an organisational email account (usually a local authority), it has been categorised as such. Various responses across different sectors and categories had clearly drawn on research evidence to inform their response; however, only responses specifically identified as being from a research or academic institution were categorised as such. The categorisation of third sector organisations was based on their charitable status and, whilst many respondents will advocate on behalf of children in a general sense, either through their professional role or beyond, the “advocacy” category is limited to those whose primary purpose or function is directly related to advocacy work.

Table 1: characteristics of responses received (total n=104, those with identifiable information to allow categorisation by role/sector n=75)

Role/Sector                                                      Number   Percentage
Panel member(s)                                                      9      12%
Children’s hearings related                                          3       4%
Researcher                                                           4       5%
Local government/social work (including representative bodies)      22      29%
Third sector                                                        16      21%
Legal/law related                                                   11      15%
Advocacy providers                                                   5       7%
Other                                                                5       7%

Table 2: quotation label key

Role/Sector                                                      Quotation label abbreviation
Panel member(s)                                                  PM
Children’s hearings related                                      CH
Researcher / university                                          R
Local government/social work (including representative bodies)   LA/SW
Third sector                                                     3rd S
Legal sector                                                     Leg
Advocacy providers                                               Adv
Young people                                                     YP
Other / unknown                                                  O/U

A respondent answering as an individual in the legal sector, for example, might be labelled: R84, Leg Indiv. A respondent answering on behalf of a third sector organisation might be labelled: R5, 3rd S Org.

Methods

Data cleaning

Most responses (63 in total) were submitted directly to the Citizen Space consultation platform; a further 42 responses were submitted directly to the Scottish Government. Government staff then manually uploaded these responses to Citizen Space. Where responses did not follow the format of the consultation (for example, submitting a single piece of text rather than answering specific questions), Scottish Government staff manually arranged sections from these responses under the most appropriate question headings in order that they could be analysed along with the rest of the responses. This approach and any additional information provided was subsequently checked by researchers.

The researchers downloaded the responses from the Citizen Space platform to Microsoft Excel, where the responses were minimally reformatted for readability and columns containing irrelevant information were removed. Responses were then checked for duplicates and assessed for blanks – many respondents chose to answer a selection of questions rather than the whole consultation (as noted above, extended narrative responses that did not follow the consultation format had already been sorted under the most relevant question headings by Scottish Government staff). Duplicate responses were thoroughly checked and then removed from the dataset in Excel before the data was exported to NVivo (for the qualitative analysis) or to another Excel spreadsheet (for the quantitative analysis).
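The cleaning steps described above – duplicate removal and blank assessment – were carried out in Excel, but the logic can be sketched in outline as follows. This is an illustrative sketch only (the data structure and question IDs are hypothetical), not the researchers’ actual workflow:

```python
def clean_responses(responses):
    """Deduplicate responses and count blank answers per response.

    `responses` is a list of dicts mapping question IDs to answer text,
    where an empty string means the question was left unanswered.
    The structure and field names here are illustrative only.
    """
    seen = set()
    cleaned = []
    for resp in responses:
        # Treat two responses as duplicates only if every answer matches.
        key = tuple(sorted(resp.items()))
        if key not in seen:
            seen.add(key)
            cleaned.append(resp)
    # Count how many questions each retained response left blank.
    blanks = [sum(1 for answer in resp.values() if not answer.strip())
              for resp in cleaned]
    return cleaned, blanks
```

For instance, two identical submissions would be reduced to one, and a response answering only some questions would show a non-zero blank count.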

Consideration was given as to whether data could be weighted in analysis, for example, based on whether a response was from an individual or submitted as an organisational response, combining the contributions of multiple people and / or groups. However, as this information was not always included in responses and is not a typical approach to Scottish Government consultations, this was not progressed. The Scottish Government may want to consider an agreed approach to weighting for future consultations, especially where a greater number of responses is received.

Some respondents answered all questions contained within the consultation; however, as stated previously, most respondents only answered a limited number of questions. In addition, some respondents answered only the quantitative or only the qualitative parts of questions. Where respondents answered both parts of such questions, at times these answers were not consistent (i.e., a respondent who said they agreed with a proposal in the quantitative component of a question expressed solely opposing views in their answer to the qualitative component), or more nuance was provided in the qualitative response, effectively contrasting with the respondent’s answer to the binary “yes / no” part of the question. Where possible, efforts have been made to highlight this in order to contextualise the statistics representing the quantitative element of a question (for example, where a majority of respondents selected “yes” in response to a quantitative question but then used the qualitative space to add conditions or qualifications to their answer). Consequently, it is essential to read the quantitative analysis in the context of the corresponding qualitative component (in the case of questions containing both elements); reading the quantitative data alone does not provide a full picture of respondents’ views.

Analytical approach

The analytical approach was discussed and agreed with the Scottish Government team prior to embarking on the analysis. The Government requested that the researchers conduct the analysis in a question-by-question style (rather than, for example, by overall theme), given that readers may choose to focus on certain parts of the analysis when reading this report. This would ensure that a person who was interested in only one question (or a couple of questions) would not miss out on certain information or context, even though it would be likely that someone reading the analysis in its entirety would encounter some degree of repetition and duplication.

Although the consultation was analysed with this question-by-question approach in mind, nevertheless, strong themes and narratives emerged across questions, and these are briefly discussed in the conclusion of this report. While some respondents included a great deal of additional information or expressed detailed views on topics not specifically mentioned in the consultation, this could not always be included in the analysis (depending on its relevance to the question or topic). However, as far as possible, “other comments” or “other issues raised” have been included, especially when raised by multiple respondents, to ensure that these views were represented in some way.

Quantitative analysis

The quantitative data was uploaded to Excel by one of the researchers. Frequencies for the closed questions were produced using an Excel formula, taking into account blank responses and unanswered questions. Response percentages were converted into labelled charts, also using Excel. Across all the closed questions in the consultation, the average response rate (i.e., the percentage of respondents who selected an answer option rather than leaving it blank) was 48%; the lowest response rate for any question was 28% and the highest was 75%. Given that the proportion of non-responses was so high for most questions, non-responses have been included in the statistical analysis and in charts in order to avoid over-emphasising the weight or significance of the other answer options available to respondents. Throughout the report, percentages have been rounded to the nearest whole number. For quantitative questions, we have also included the actual numbers of responses (e.g., 10 respondents said yes, 94 said no); however, because the total number of respondents is close to 100, percentage values and actual numbers are very similar and should not be confused.
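The approach of counting non-responses within the percentage base can be illustrated with a minimal sketch (the answer data and option labels below are assumed for illustration, not taken from the actual consultation dataset):

```python
from collections import Counter

def closed_question_summary(answers, options=("Yes", "No")):
    """Tally a closed question, treating blanks as 'No response'.

    `answers` holds one entry per respondent; None or "" indicates the
    question was left unanswered. Percentages are taken over all
    respondents (blanks included in the base) and rounded to whole numbers.
    """
    counts = Counter("No response" if not a else a for a in answers)
    total = len(answers)
    percentages = {opt: round(100 * counts.get(opt, 0) / total)
                   for opt in (*options, "No response")}
    return counts, percentages
```

Keeping blanks in the denominator means that, for example, 10 “yes” answers out of 20 respondents is reported as 50%, rather than as a larger share of only those who answered.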

Qualitative analysis

Once the data was cleaned and reformatted in Excel, the responses were uploaded to NVivo 2020, qualitative analysis software. To begin with, both researchers read through all the responses to familiarise themselves with the dataset and to get a sense of how respondents felt about different topics. One researcher carried out an initial round of coding on all 90 questions, using NVivo 2020 to go through each question and sort responses into (non-exclusive) categories. The researchers selected a number of questions in advance, across the different sections of the consultation, to be double coded by the second researcher; the second researcher completed the coding process on paper rather than using NVivo. A question-by-question approach to coding was adopted by both researchers – in other words, each question was coded separately on its own terms, with any clear overarching themes (or additional comments that were not necessarily relevant to that specific question) noted separately to be revisited later in the process. These themes and recurring additional comments are briefly analysed as part of the conclusion. Together, both researchers reviewed the double-coded questions during a series of in-person meetings. There was little divergence between researchers in terms of identified codes, emerging themes, or categories for the double-coded questions; where there was, this was discussed and resolved. Early in the process, it was noted that Q85 and Q90 were duplicates; the responses to these questions were subsequently combined, analysed as one question, and reported under Q90, the final question of the consultation. Each response was therefore considered in a systematic way, and every response was given equal consideration during the process of analysis.

The researchers encountered several issues during the process of qualitative analysis.

Firstly, given the breadth of the consultation, and both the number and depth of responses, it would not have been possible to reflect every individual view expressed in response to each question, for reasons relating to project timeframe and report length. Thus, the general analytical approach, developed during and after the coding process, sought to identify for each consultation question: the key reasons respondents cited for and against a particular proposal; any specific areas of agreement or divergence (or any especially strongly held views); further detail in relation to each of these ideas (or at least the most frequently mentioned ideas, in cases where many different ideas emerged); and a non-exhaustive list or summary of other issues or comments mentioned. Comments made in response to a specific question that did not directly address the topic in question were generally excluded from the analysis, but noted and acknowledged elsewhere wherever possible. Respondents also frequently used open-text boxes for one question to add commentary on a different or unrelated closed question. This was especially the case where respondents did not feel comfortable selecting one of two binary answer options. As before, it should be noted that it was not always possible to reflect these additional comments in the analysis.

For some questions there were few identifiable common themes in respondents’ answers, with an extremely wide range of views or comments expressed within the responses. Often, too, a number of respondents listed one idea as an advantage or benefit of a proposal, with a similar number of respondents citing the same idea or reason as a disadvantage. Rather than offering straightforward responses that directly answered the question, several responses called for balance or nuance; for changes to be made but only in certain circumstances or only for particular demographics; the need for more detailed exploration, policy development, or consultation with particular groups; or complaints that the consultation did not provide sufficient information in order for the respondent to answer the question. These ideas have been included in the analysis. It was also apparent that there were conflicting perspectives among respondents as to what was possible within the framework of existing legislation and practice: this has also been included in the analysis, to illustrate the level of complexity and uncertainty within the system and amongst those involved, whether directly or indirectly. Therefore, where information about current practice is inaccurate, this is a reflection of the content of the responses (for example, an incorrect claim about panel members’ current powers or remit).

All consultation responses for which permission to publish was given are in the public domain and have been given detailed consideration by the Scottish Government. A number of respondents – particularly those in legal professions – provided highly technical analysis, including detailed references to current legislation and case law, often resulting in strong views on the proposed changes and, at times, significant concerns about the legal details or unintended consequences of a specific proposal. In their detail and technical expertise, some of these responses were beyond the remit of this analysis, so while they have been acknowledged in the report, they have not been subject to any more detailed analysis than other responses within the context of this consultation analysis. Many respondents highlighted the conclusions of the Hearings for Children report and, although the consultation is based on the Scottish Government’s response to this report, some questioned whether the proposals fully addressed these conclusions; where possible, this has been highlighted within the analysis. We trust this information (the technical legal analysis in particular) will be more fully considered as part of a wider redesign process than was possible within the scope of this report.

Where certain categories of respondents (such as legal professionals, young people, etc.) had particularly strong or clear views on a topic, these have been acknowledged; however, a breakdown of views by respondent category for each question was not carried out due to time limitations. Rather than reporting on the views of young people separately, it was decided that it was important to give young people’s responses the same value as the others and to analyse them together with the rest of the dataset. However, where young people’s views were particularly strongly felt, or diverged significantly from other responses, this was highlighted. A separate summary version of this analysis will be published in due course.

During the process of assigning labels to quotations used in this report (based on the respondent’s sector and whether they were answering as an individual or on behalf of an organisation), it was noted that organisational responses were more likely to consent to having their response published compared to individual responses. This means that the quotes used to illustrate various points made in this report are disproportionately based on organisational responses. However, it should be acknowledged that the views of those who did not consent to having their response published (and therefore, to having direct quotes included in this report) were still considered with the same value as those that feature in quotes.

Finally, a large volume of comments was received about the overall approach to the consultation, or on the wording or framing of particular questions. At times, respondents were highly critical of the format and content of the consultation itself and used the answer spaces to express this. The length of the consultation was also criticised, although this may to some extent have been unavoidable given that the consultation itself reflects not only the complexity of the CHS, but also the depth and breadth of information and number of detailed recommendations contained within the preceding reports (in particular, the Hearings for Children report). While these comments have not been subject to detailed analysis within this report, the Scottish Government have been made aware of this information.

Contact

Email: childrenshearingsconsultation@gov.scot
