Follow-Up Evaluation of Self-Directed Support Test Sites in Scotland
This follow-on evaluation built upon the initial evaluation of the self-directed support test sites, which reported in September 2011. It sought to assess continued uptake in the test sites; to identify activities to further promote and raise awareness of self-directed support; and to identify system-wide change within the test site local authorities.
5 SYSTEMS & PROCESSES TO IMPLEMENT SDS
Introduction
5.1 A key aspect of the test sites was the development of suitable systems and processes to implement SDS, particularly in relation to assessment and resource allocation. While the starting points had been In Control's 7-step process and self-assessment framework and the outcomes focused Talking Points, the test sites had invested significant staff time in redesigning processes in order to shift the focus to outcomes, and to fit with local circumstances and different service user groups. When the test sites ended, the local authorities therefore had bespoke processes of SDS assessment and resource allocation in place that would be applied to work with other service user groups and geographical areas.
5.2 In this chapter we use data primarily from stakeholder interviews and the survey of care managers to look at how these processes developed in the year following the test sites, and at the perceptions of those having to use these new systems and protocols. We focus especially on stakeholders' perceptions of the implementation of SDS assessment process, approaches to allocating resources and of the paperwork/red tape involved.
General
5.3 As a result of the test site activities the 3 local authorities had put in place new systems for implementing SDS. This had been a lengthy and time-consuming process, and at the end of the test sites the work remained on-going. The process of personalisation as described in Dumfries & Galloway involved individualised outcomes-based assessment, consideration of the proposed plan and budget by a resource panel and, in most cases, an Individual Budget (IB). Similarly in Glasgow, the process comprised a self-evaluation questionnaire (SEQ), a resource allocation panel, an agreed outcomes-based support plan and an IB. Since the test site, the local authority had set up Risk Enablement Panels (REPs) to address risk issues identified and discussed at the Resource Allocation Screening Group and referred by its Chair to the REP for help with "challenging or complex decisions" within the SDS processes of allocating IBs and support plan validation[15]. An outcomes-based assessment framework had also been developed by Highland, based upon In Control methods, for those opting for SDS. An equivalency model was used instead of a RAS to decide SDS budgets and in all cases an IB was agreed. At the end of the test site the local authority had resolved to develop an appropriate RAS for the future roll-out of SDS.
5.4 The original evaluation report suggested that some staff, service users and carers did not have enough information about SDS and did not understand the new processes or found them challenging. Varying experiences of resource allocation panels were also reported: some described positive engagement of service users and carers in presenting the case for a support package, while others felt frustrated at not being awarded the budget to meet the support needs they had identified, or felt baffled by panel decision making. This follow-up evaluation has enabled further exploration of these issues.
5.5 Although on the whole care managers were positive about SDS and its potential to offer flexibility and choice to service users, their views about new assessment protocols and processes were less positive. As illustrated in table 5.1 below, only about a third of care managers in 2 of the sites seemed to find these helpful. While those in Highland were relatively more positive, there was still criticism that paperwork was burdensome and protocols were too focused on young people with learning disabilities (the target group during the test site).
Table 5.1 Care Managers' perspectives of SDS assessment protocols and processes
| Local authority | Helpful | Unhelpful | Don't know |
|---|---|---|---|
| Dumfries and Galloway | 32% (17) | 54% (29) | 14% (8) |
| Glasgow | 35% (37) | 63% (67) | 2% (3) |
| Highland | 42% (14) | 39% (13) | 19% (6) |
5.6 Care managers in Glasgow were the least positive overall, and many reported that the SDS system incorporated a bureaucratic, lengthy and cumbersome 'stepped' process that was difficult to complete and often unnecessary, especially for very small packages of support. A number of care managers in Glasgow also mentioned the added pressure of having to get used to new IT systems (CareFirst6); indeed, much of the SDS Team's time was said to be spent supporting practitioners to record SDS on this system. In addition, many felt that procedures and systems kept changing and that the "goal posts kept moving" with regard to criteria and funding. This left them feeling they might give out-of-date, confusing and possibly contradictory information to service users.
5.7 Interviews with third sector providers in Glasgow supported this viewpoint as it seemed from their perspective that systems introduced to implement SDS were subject to frequent change and amendment. Furthermore, some said that while people are encouraged to want more choice and control, the system acts as a barrier to achieving this. In the words of one provider, "it seems like a square peg in a round hole, trying to fit SDS into traditional systems".
5.8 Some care managers in Dumfries & Galloway were also critical, suggesting there was no clear route through the personalisation system and that there was insufficient guidance about what was required: eligibility criteria, resource allocation, the recording of information and so on. This was reminiscent of the previous evaluation, when care managers involved with case study individuals had reported feeling confused when navigating new processes. There was also some concern expressed by care managers about their own and the Personalisation Team's roles and responsibilities, which resulted in a "long winded" and "laborious" process for them and for service users in setting up a support package.
Assessment
5.9 Key stakeholders interviewed in Dumfries & Galloway framed the main issue about assessment as being about supporting the cultural shift that would change the power relationship between professionals and service users. Similar language had been used about re-balancing power and culture change in all 3 areas during the test sites. Post-test site, the Dumfries & Galloway Personalisation Team emphasised the continued importance of developing practitioners' capacity for co-production rather than focusing on getting the 'right' assessment form in place, and this was where the team's efforts had been directed. There was, however, a move to further develop the self-assessment document from the test site, as this was considered "not fit for purpose". A short-life working group was set up during the follow-up period, involving the Frontline Improvement Team, the Personalisation Team, service users and carers, to develop more appropriate tools. In contrast, care managers from this local authority expressed concerns about the process to implement personalisation, stating that this was not always clear to them and that they needed more support.
5.10 Since the test site, many more staff in Glasgow had become involved in the process of supported self-evaluation and developing outcomes based support plans (OBSPs). Unlike during the test site, support plans are now agreed in localities by service managers and signed off by the resource allocation screening groups (RASGs). In parallel, a financial assessment is completed to determine levels of client contribution to the support package. The SEQ has continued to evolve (version 12 at time of writing) in light of experience and with the broadening out of the needs of service users included in the SDS programme. One criticism levelled by third sector providers at Glasgow's SEQ was that the language and style is more appropriate to people with learning disabilities than to other groups.
5.11 Although the SEQ process used in Glasgow aimed to increase flexibility, third sector providers pointed out that in practice budget allocation is still based on support hours, thus limiting creativity and choice. At the early stages of the SDS programme for people with mental health problems, providers reported difficulties in ensuring the sensitivity of the process to the people's fluctuating needs and circumstances. Moreover they found it essential to liaise closely with Social Work Finance personnel, which proved to be very constructive.
5.12 Whilst care managers from Highland were generally more positive about SDS processes than those from the other local authorities, they also expressed a number of concerns. For example, some care managers commented that they felt the self-assessment tools (e.g. 'My Plan' or SSAQ) were not specific enough to be able to identify detailed needs. In addition, some felt that the paperwork is still too burdensome and the protocols are better suited to young people with learning disabilities. As one care manager commented, other service users are "being shoe-horned into a process designed for people with learning disabilities". Some also found the SSAQ not helpful as the main assessment tool because in the absence of the RAS in Highland the SSAQ "carries little weight".
5.13 Despite aiming to involve advocates more in assessment processes, access to advocacy generally was reported as uneven, with the involvement of, and partnership with, advocacy organisations working with people with learning disabilities being the most common. Those examples where advocacy was used seemed to positively impact on the resulting support plans and outcomes, and we were given examples from all 3 sites. In addition, over time a better understanding of the role of advocacy in SDS had developed. However, advocacy organisations in all areas reported variable experience of supporting individuals going through SDS/personalisation assessment and suggested that some care managers did not fully involve service users while others made great effort to do so.
IBs and Resource Allocation
5.14 Although the test sites had adopted a broad definition of SDS in line with the Scottish Government's national strategy for Scotland (2010), they had universally implemented IBs and the In Control model of SDS, the emphasis being on financial allocations that were more transparent. As Glasby and Duffy (2007, p2) assert, an IB is about "being clear with people from day one how much is available to spend on meeting their needs", and then ensuring they have as much control and choice as possible over how this money is spent. From the start, the resource allocation is "up-front" (Mind 2009b, p3). IBs were to combine different funding streams, align assessments, encourage self-assessment and introduce a transparent RAS (Manthorpe et al, 2011). They were to focus on outcomes and allow users to choose where to purchase their support (Rabiee et al, 2009). The test sites had all trialled approaches to identifying IBs and had set up systems of resource allocation. Two had implemented a Resource Allocation System or RAS in line with the In Control model, and one had been encouraged by the Scottish Government to apply an alternative equivalency model (see original evaluation report). However, all had faced challenges in implementing a system that was equitable and appropriate to meet different needs, and at the end of the test sites, systems for IBs and RAS were under review.
5.15 According to key stakeholders in Dumfries & Galloway, the main learning regarding implementing IBs and RAS had been that being up front about the budget at the start was distracting: through experience they had found that giving a budget figure up front skewed people's thinking about ways of meeting needs. They emphasised the importance of work at the start of the process identifying natural supports and user-defined outcomes before money came into the equation. As a result, the Council developed a 10-stage process going beyond In Control's 7 steps:
1. Information pack on personalisation;
2. Complete self-assessment;
3. Create a support plan;
4. Support plan is checked;
5. Council considers the support plan;
6. Informed of decision about support plan;
7. If approved, plan put into action;
8. Funding put in place;
9. Regular review of how budget working;
10. Regular check of support plan and changes if needed.
5.16 All 3 local authorities had shifted from a centralised resource panel during the test site to delegating responsibility for funding/budgetary decisions to local area teams or geographical patches. In Dumfries & Galloway RAS is applied consistently across the region. Key stakeholders commented that the funding of personalisation packages was still under review and the Council had a finance sub-committee that regularly reviewed systems and had devised a 4-part banding structure for guiding decisions about IB levels to meet different needs. This system, implemented during the follow-up period, was felt to have more potential than the In Control RAS to allocate appropriate budgets. Deciding on appropriate levels of funding was described as "an art not a system" because "people's needs do not fit into boxes", and because personal and community capacity has to be taken into account. Several care managers commented positively on the personalisation panels which met every month (or more often if necessary), and agreed that they tended to approve appropriate funding requests. However, others expressed more uncertainty about this process; third sector organisations were more critical about the process of agreeing budgets and the reality of final funding levels of some personalisation packages.
5.17 Since the test site, localities in Glasgow had played a stronger role in the process and there are now local resource panels (locally known as RASGs) in each of the 3 areas. Budget decisions are taken on the basis of service managers' assessments of priority needs, with the RAS estimate viewed as a starting point. However, experience led to recognition that the formula cannot handle the complexity of needs. Since September 2011, it appeared that RASGs were held 5 days a week for each locality due to the high volumes of cases being processed. The Council was still involved in a process of "testing" the RAS with different client groups post-test site. Key stakeholders commented on how time consuming they had found the process of getting agreement on levels of IBs, particularly where risks were identified. To address risk issues positively, Glasgow had set up Risk Enablement Panels (REPs), which involved multi-agency stakeholders including service users, carers and advocates and were independently chaired. However, a third sector provider commented that in its experience disputes about IBs were rarely resolved by the REPs, although other interviewees reported that budgets were amended as a result of REP hearings. It was also commented that IB levels tended to be lower than prior budgets, so that for existing service users it appeared that their services were being cut. Service users and carers consulted for this research broadly held this perception or expressed concern about potential reduction in support.
5.18 Care managers in Glasgow were the least positive about how allocation panels were working and about decisions on IBs. Many care managers perceived that decisions at the funding panel were often arbitrary and inconsistent. Some care managers felt that "it depended on who the chair (of the panel) was", perhaps suggesting that too much power is given to the chair to make funding decisions, or at least a lack of transparency regarding these decisions. The RASG meetings were often reported as being stressful for workers, especially as they were experienced as very adversarial.
5.19 Local authority stakeholders interviewed in Highland argued that not having a RAS had held up development of SDS implementation during the test site. The equivalency model that they had been encouraged to apply had not provided a suitable mechanism, and since the test site they had resolved to move to a RAS. They had used underspend from the test site to employ an independent consultant to work on developing 3 different RAS - for children, adults, and older people over 65 years. As in the other areas, the centralised allocation panel set up during the test site had since been disbanded, and decision making now took place at local level. The mechanism for resource allocation in Highland was therefore in the early stages of development during the follow-up period. As in other areas, third sector interviewees had expressed concerns about the budgetary allocation process and the IBs agreed.
Paperwork and IT Systems
5.20 One of the issues that arose during the test site period concerned the duplication of assessment processes because alongside new SDS assessments, local authorities continued to use single shared assessment (SSA). The evaluation concluded that the work of the test sites had not resulted in a reduction in 'red tape' but had instead increased the amount of paperwork required. The local authorities planned to address these concerns post test site.
5.21 Care managers from all the sites highlighted the paperwork involved in implementing SDS as burdensome, especially in light of duplication with the SSA, which had continued post test site. Some stakeholders argued that the detailed paperwork was justified when taking an individualised approach, whilst others argued that unless the burden of bureaucracy for care managers was addressed, the desired widespread implementation of SDS would not take place.
5.22 As one stakeholder in Dumfries & Galloway commented "they (care managers) are spending too much time in front of a computer lacking permission to think and act differently, more creatively". Similarly, it was suggested that as a result of implementing SDS systems, care managers in Glasgow were spending more time on form filling and inputting information to the IT system than on support delivery. One third sector provider asserted that the way SDS has been implemented is "dominated by centralised control", which meant more bureaucracy. Similarly, third sector stakeholders interviewed in Highland felt there was too much paperwork and duplication of aspects of the system. Work had started on simplifying the forms used and in developing what was termed a 'Personal Plan', which they planned to pilot later in 2012.
Summary/Key Points
- Assessment and resource allocation systems had remained a main preoccupation in all areas since the test sites, with on-going change in both.
- Key stakeholders in Dumfries & Galloway, however, stressed the cultural shift needed to implement real choice and control rather than systems being 'right'.
- Care managers were the least positive about new protocols and processes to implement SDS, particularly in Glasgow.
- A key criticism of the assessment processes developed during the test sites was that they tended to be too orientated towards use with people with learning disabilities and were having to be further developed.
- There was evidence of increased involvement of independent advocacy since the test sites in all areas though this was inconsistent and dependent upon care managers' understanding of the role of independent advocacy as well as on the capacity and training of advocacy services.
- In all 3 areas, more resource allocation panels had been created to enable greater numbers of support plans to be considered and to enable decisions to be taken at locality level.
- Systems of resource allocation were an area of considerable uncertainty and one of the most problematic aspects of implementation, and these issues had not been fully resolved during the follow-up period.
- Paperwork resulting from implementing SDS had not decreased the bureaucratic burden but had rather increased it in those cases where self-assessment continued in parallel with, or in addition to, single shared assessment.
Contact
Email: Aileen McIntosh