
Research

Electronic Information Sharing Between Nursing and Adult Social Care Practitioners in Separate Locations: A Mixed-Methods Case Study

Authors:

Helen Chester, University of Nottingham, GB

Jane Hughes, University of Nottingham, GB

Ian Bowns, Public Health Consultant, GB

Michele Abendstern, University of Manchester, GB

Sue Davies, University of Manchester, GB

David Challis, University of Nottingham, GB

Abstract

Context: A longstanding concern, both in the UK and internationally, is that multiple health and social care professionals undertake assessments of adults and older people with complex needs, but that information is not shared. Electronic information sharing within assessment and support planning has been identified as a means of promoting integrated care for adults with complex health and social care needs.

Objective: To evaluate the implementation of a shared electronic record between nursing and adult social care practitioners in separate agencies and locations to inform the assessment of need for adults and older people with complex needs.

Methods: The design of the study reflected the incremental implementation of the shared electronic record between 2010 and 2012 in one geographical area within England. It was a mixed-methods case study employing data from three sources: audit of patient case files; survey of nurse practitioners’ time use, well-being and job satisfaction; and manager interviews post-implementation providing further insights into the implementation process.

Findings: Electronic information sharing facilitated greater involvement of adult social care practitioners in the continuing healthcare assessment process and contributed to a more streamlined service. No adverse effects of the intervention on the well-being and job satisfaction of nursing practitioners were reported.

Limitations: This research was undertaken in a single setting.

Implications: Continuing healthcare is a universal service that uses a standardised assessment process, offering the potential for this approach to be replicated elsewhere. Thus, the findings are of value to policy makers and practitioners and may inform wider roll-out.

How to Cite: Chester, H., Hughes, J., Bowns, I., Abendstern, M., Davies, S. and Challis, D., 2021. Electronic Information Sharing Between Nursing and Adult Social Care Practitioners in Separate Locations: A Mixed-Methods Case Study. Journal of Long-Term Care, (2021), pp.1–11. DOI: http://doi.org/10.31389/jltc.16
Submitted on 25 Apr 2019; Accepted on 23 Oct 2020; Published on 12 Jan 2021

Introduction

Decisions about allocation of public funds to pay for care or treatment are reliant on a composite assessment process usually involving contributions from a number of professions (New Zealand Government, 2016; Australian Government, 2018; Department of Health and Social Care, 2018a). Communication is facilitated by appropriate tools and other mechanisms to coordinate multiple contributions to the process (Taylor, 2012). In England, one such example is decision-making relating to the provision of NHS Continuing Healthcare, a package of ongoing care that is arranged and funded solely by the health service for individuals outside a hospital setting who have complex ongoing healthcare needs. Guidance sets out a process for the NHS to work in partnership with its local authority partners to assess health needs and determine eligibility for NHS Continuing Healthcare (Department of Health and Social Care, 2018a). This provides the context for the study reported in this paper.

Information sharing between professionals within the assessment process to improve outcomes for patients and service users is a longstanding policy objective (Department for Communities and Local Government, 2014; Department of Health, 2001, 2013; Department of Health and Social Care, 2018a). Advances in technology have long stimulated debate as to how electronic information systems may support greater service integration by promoting and enabling the sharing of data between organisations and professionals (Kane and Kane, 2000; Weiner et al., 2003; Witham et al., 2015). However, the potential value of such systems in the assessment of general health and care needs has been neglected, even though, as the health and social care needs of older people become more complex, the requirement for a coordinated and efficient assessment process increases (Loader et al., 2008; Taylor, 2012). In England, attempts to integrate assessments have aimed to build on new technologies so that information recorded by one professional can be shared to improve care and service arrangements (Department of Health, 2012).

This study provided an opportunity to test a novel approach to electronic information sharing spanning the health and social care boundary. It was one of a number of pilot projects within a demonstration programme categorised as ‘Information sharing within the professional assessment process’ (Chester et al., 2015, p. 152). Pilot projects may be established with the objective of achieving long-term sustainability or as a means of testing whether there is the appetite and potential for a certain type of change in the future (The Cabinet Office, 2003). In this instance, it was introduced in a service with a legal responsibility to meet the health care needs of patients, and sought to integrate information from the records of the local authority with responsibility for the provision of social care. This challenged health and social care professionals to work closely together in a joint practice development process (Taylor, 2012). The initiative had its origins in the Common Assessment Framework (CAF), conceived of as a mechanism to promote electronic means of sharing information between health and social care professionals within a multidisciplinary assessment (Department of Health, 2009a).

Methods

This paper reports findings from a mixed-methods study evaluating the implementation of a shared electronic record between nursing and adult social care practitioners in separate agencies and locations to inform the assessment of need for adults and older people with complex needs. Approval was obtained from the National Research Ethics Committee (10/H0107/60) as part of a multisite evaluation spanning multiple service user groups of pilot projects, developing a Common Assessment Framework for use in adult health and social care services. Research governance approval was subsequently received from the health and social care agencies that participated in this study. Both patient and staff level (practitioners and managers) data were collected.

Study design and sample

This study was located in one continuing healthcare team within the primary care sector of the National Health Service in England. The assessment of need of adults with complex health needs to determine eligibility for continuing healthcare funded by the NHS in England uses a universal Decision Support Tool. The purpose of this is to facilitate the decision as to whether or not a patient has a ‘primary health need’ and is therefore eligible for NHS continuing healthcare to fund their entire long-term care needs, or is eligible for the Registered Nursing Care Contribution, that is, payment for the nursing component of this care (Department of Health, 2009b; Department of Health and Social Care, 2018b).

The study commenced whilst systems to promote electronic information sharing between health and social care practitioners were under development. Implementation of the new initiative was incremental. To plan the evaluation, researchers met with the continuing healthcare team regularly, and jointly agreed the phases of implementation detailed in Table 1. In phase 1, a paper-based approach was used to gather and distribute information to complete the Decision Support Tool, whereas in phase 3 these activities were completed electronically. Phase 2 constituted an interim phase with a mixed approach to information collection and distribution. In its final form (phase 3), the shared electronic record offered the ability to access and complete the Decision Support Tool electronically, and also permitted continuing healthcare service practitioners to view the records held by adult social care services in respect of the patient whose circumstances were being assessed. In phase 3, the shared electronic record also allowed adult social care assessors to view the progress of a continuing healthcare assessment for service users (patients) on their caseloads. It was made possible by the CareView component of CareFirst, the local authority client record system and its link with the electronic record system of the continuing healthcare service.

Table 1

Data collection arrangements.

Phase (date) | Service arrangements | Data collections and sample size (audit of case files; staff survey; manager interviews)

Phase 1: Historical cohort (March 2010–February 2011)
    Service arrangements: Collection of information by visiting adult social care services and primary care providers, talking with colleagues and receiving faxed information. Distribution of Decision Support Tool by paper, fax and email.
    Audit of case files: 293 [1]; Staff survey: 8; Manager interviews: –

Phase 2: Intervention cohort – time 1 (March–June 2011)
    Service arrangements: Electronic access to adult social care records and visits to primary care providers. Distribution of Decision Support Tool by paper, fax and email.
    Audit of case files: 321 [2]; Staff survey: –; Manager interviews: –

Phase 3: Intervention cohort – time 2 (July 2011–June 2012 [3])
    Service arrangements: Electronic access to adult social care records and electronic distribution of the Decision Support Tool to adult social care staff.
    Audit of case files: see note [2]; Staff survey: 8; Manager interviews: 2

[1] Exclusions: 8 duplicate records, 2 outside time frame, 8 died before panel adjudication/no record of decision.

[2] Exclusions: 15 duplicate records, 19 died before panel adjudication/no record of decision. The 321 comprises both the Phase 2 and Phase 3 intervention cohorts.

[3] Audit of case files ceased in February 2012.

This was a mixed-methods case study employing data from three sources: an audit of case files and a staff survey, each conducted at two points in time, and manager interviews post-implementation (Table 1) (Bowling, 1997). Audit data were collected for the historical cohort in phase 1 and for the intervention cohort in phases 2 and 3. Staff survey data were collected in phases 1 and 3. Two manager interviews were conducted in phase 3, one of which was with a colleague within the partner local authority, the unit of local government responsible for the provision of social care to adults. Thus, the evaluation was embedded in evolving practice within the team, mirroring the development of the shared electronic record. Three research questions guided the data collection and analysis: (1) does electronic information sharing influence service delivery; (2) does electronic information sharing influence partnership working; and (3) how did the introduction of electronic information sharing affect practitioners?

Audit of case files

The schedule for the audit of case files comprised socio-demographic information and care plan details to permit judgements to be made about quality, inter-agency working and timeliness. These are intermediate outcomes, measures of agency performance, adjudged to impact on service user/patient well-being (Challis et al., 2006). The audit of case files was conducted in two tranches: the historical cohort (phase 1) and the intervention cohort (phases 2 and 3) (Table 1). The inclusion criteria for the study were: (1) adults 18 years and over; (2) referred to the continuing healthcare service team for assessment for eligibility and for whom an assessment for NHS continuing healthcare (including the Registered Nursing Care Contribution) was completed; and (3) resident in the geographical area covered by the local authority between 1 March 2010 and 28 February 2012. Data on 666 cases were extracted manually from case records and were transferred in a pseudonymised format to the researchers for subsequent data preparation and analysis. From these, we excluded 52 cases: duplicate records (n = 23); outside the time frame (n = 2); and the assessment for NHS continuing healthcare was not complete (n = 27). A total of 614 cases were included in the analysis, divided into a historical (n = 293) and an intervention cohort (n = 321) (Table 1).

Datasets were prepared and analysed in SPSS version 19 and Microsoft Excel. This included judgements about missing data, particularly relating to measures of performance which were applied to both cohorts. In terms of process measures, outliers were excluded following discussion with practitioners about the accuracy of the data. Individual cases were discussed and rules for interpreting the data generated. Checks for the consistency and validity of the data were undertaken. Where issues could not be resolved within the research team, advice was sought from the head of service.

To make comparisons between time points, t-tests and Mann-Whitney tests were conducted to identify significant differences in the mean scores (at the 5 per cent significance level). Both were conducted since some data were not normally distributed; we report the t-test results here, reflecting the large sample sizes collected (Lumley et al., 2002; Field, 2009). Differences in proportions for categorical variables were explored first using chi-square tests (or Fisher’s Exact tests where sample size assumptions for these were not met) and then using z-tests of column proportions (at the 5 per cent significance level) in SPSS version 19 (IBM, 2017; Lumley et al., 2002). To obtain exact p values, Microsoft Excel 2010 was used (DSS, 2017).
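As an illustration only, the sketch below reproduces the style of comparison described above in Python, using SciPy and statsmodels rather than the SPSS and Excel procedures used in the study. The continuous arrays are invented for the example, the categorical counts are taken from Table 3 (contribution of adult social care to assessment), and the variable names are hypothetical.

```python
# Illustrative sketch (not the authors' SPSS/Excel syntax) of the cohort
# comparisons described in the text.
import numpy as np
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical continuous measure (e.g. days between referral and assessment)
historical = np.array([11, 9, 14, 8, 12, 10, 13, 7])     # phase 1 cohort
intervention = np.array([10, 8, 12, 9, 11, 7, 10, 9])    # phases 2-3 cohort

# t-test (reported) and Mann-Whitney test (run alongside, as some data were skewed)
t_stat, t_p = stats.ttest_ind(historical, intervention)
u_stat, u_p = stats.mannwhitneyu(historical, intervention, alternative="two-sided")

# Categorical measure: whether adult social care contributed to the assessment
# (counts from Table 3: 237/291 in phase 1, 270/288 in phases 2 and 3)
contributed = np.array([237, 270])
totals = np.array([291, 288])

# Chi-square on the 2x2 table, then a z-test of the two column proportions
table = np.array([contributed, totals - contributed])
chi2, chi_p, dof, _ = stats.chi2_contingency(table)
z_stat, z_p = proportions_ztest(contributed, totals)

print(f"t = {t_stat:.3f}, p = {t_p:.3f}; U = {u_stat:.1f}, p = {u_p:.3f}")
print(f"chi2 = {chi2:.3f}, p = {chi_p:.3f}; z = {z_stat:.3f}, p = {z_p:.3f}")
```

Running both a t-test and a Mann-Whitney test on the same variable mirrors the dual approach described above for data that may not be normally distributed.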

Staff survey

Measures of staff well-being and time use were collected in a single survey. All nurse practitioners in the continuing healthcare services team were invited to complete it for one week on two occasions: in phase 1 (January 2011) and phase 3 (September 2011), at the end of the implementation of the intervention (Table 1). These dates were selected with the team to represent a typical working week, with the latter date regarded as the point at which the initiative was embedded into practice. The surveys, information sheets explaining the purpose, and guidance regarding completion were distributed by senior team members to promote a high response rate. Participation was voluntary and consent was assumed by completion of the survey. Surveys were returned anonymously directly to the research team. A 100 per cent completion and response rate was achieved from a team of eight practitioners on both occasions.

The Karasek Job Content Questionnaire (Karasek, 1979) was used to measure different aspects of practitioners’ views of their work environment (staff well-being). Scores for each practitioner were entered into SPSS version 19; thus the practitioner was the unit of analysis. To make comparisons between phase 1 and phase 3, t-tests were used to identify significant differences in well-being and job satisfaction between the two timeframes. In interpreting the findings, caution should be exercised because the sample comprised only eight practitioners. However, this represented the entire population, since all team members completed the survey.

In completing the time use diary, participants were asked to record, for each 30-minute interval, the code for the activity in which they had been predominantly engaged, selected from a coding list of 37 tasks designed to give a representative view of the distribution of practitioners’ time; ‘other’ was available where no appropriate code applied. Previous research suggests that a 30-minute interval provides an appropriate balance between accurate recording and respondent burden (e.g. Weinberg et al., 2003). The coding list distinguished between tasks undertaken in the presence of the patient and those completed on their behalf but not in their presence; these were coded separately from tasks which contributed to the development of the team or of the practitioner completing them. Accompanying instructions explained that where more than one activity was undertaken in an interval, the code relating to the activity that took the most time should be entered. Participants were asked to complete the diary throughout the day to avoid inaccurate recollection later.

Datasets were prepared and analysed in SPSS version 19. The analysis was planned to explore the research questions from the perspective of practitioners. It was based on 17 of the 37 tasks in the schedule which practitioners selected as being of particular interest. These were grouped into three activity categories: assessment (9 tasks), associated tasks of liaison with colleagues (3 tasks), and review and monitoring (5 tasks) (see Table 4). The unit of analysis was the team. Therefore, the total number of hours spent by the team on each of the tasks was calculated, as well as the relative proportion of time spent on each activity within the three categories. The analysis plan required the use of two-sample z-tests of column proportions. This permitted the identification of significant differences between phases 1 and 3 in the proportion of time allocated by the team to each of the 17 tasks within each category of activities (IBM, 2017; DSS, 2017).
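To make the column-proportion comparison concrete, the sketch below shows one way such a test could be run outside SPSS Custom Tables. It assumes, purely for illustration, that each 30-minute diary block is treated as a single observation so that the share of category time spent on a task can be compared between phases with a two-sample z-test of proportions; the hours are taken from Table 4 (‘information gathering from health services staff’), and this is a simplified reading of the analysis described above, not the authors’ actual procedure.

```python
# Sketch of the time-use comparison under the assumption that each 30-minute
# diary block counts as one observation. Hours are taken from Table 4.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

HOURS_PER_BLOCK = 0.5

# Team hours on one task and on all assessment activities, by phase (Table 4)
task_hours = np.array([19.8, 6.9])        # phase 1, phase 3
category_hours = np.array([105.5, 91.8])  # total assessment time per phase

# Convert hours to counts of 30-minute diary blocks
task_blocks = np.round(task_hours / HOURS_PER_BLOCK).astype(int)
category_blocks = np.round(category_hours / HOURS_PER_BLOCK).astype(int)

# Two-sample z-test comparing the proportion of assessment time on this task
z_stat, p_value = proportions_ztest(task_blocks, category_blocks)
print(f"share of assessment time per phase: {task_blocks / category_blocks}")
print(f"z = {z_stat:.3f}, p = {p_value:.3f}")
```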

Manager interviews

These were conducted in phase 3 of the study. Areas of enquiry were derived from a literature review and from emergent issues identified in the early stages of the research, agreed in a series of meetings of the research team (Challis et al., 2012). The areas were: information regarding system design (for example, involvement of front-line practitioners in design); information governance (for example, identification of information to be shared); and impact on practitioners/working practices (for example, impact within and between organisations). Two senior managers in primary health and adult social care were selected because of their responsibility in each agency for the implementation of the initiative. Participation was voluntary and each manager received an information sheet and completed a consent form prior to taking part. These two semi-structured telephone interviews were conducted by a single researcher (IB) in July 2012. They were recorded and professionally transcribed.

Analysis of interviews with managers was undertaken by a single researcher (MA), using ATLAS.ti 6.2 software, and shaped using a Framework approach (Ritchie and Spencer, 1994) developed specifically for use in research where information is required for policy or planning decisions. This is a systematic process of organising material according to key issues and themes, whereby data are examined for similarities and differences and conceptually similar text is grouped together. A preliminary coding frame was constructed collaboratively by the researcher with colleagues, using the literature review which informed the evaluation of the demonstration programme to structure the themes (Challis et al., 2012). This was adapted in accordance with emerging data, and the process moved from description to theory building via the refinement of categories, as the defining features and properties specific to each were more clearly identified (Corbin and Strauss, 2008). The framework as applied here combined deductive methods, whereby codes were generated according to predefined areas of interest (Pope et al., 2000), with an inductive element, whereby emergent themes ‘grounded’ in the data were recognised (Lewins and Silver, 2007).

Findings

Socio-demographic data

In each cohort, almost three-quarters of the patients were over the age of 75 years, and almost all were from the same ethnic background (White Caucasian). Table 2 provides additional information about their socio-demographic characteristics and health status. They were similar in most respects although there were three significant differences. Patients were more likely to be resident in hospital at the time of the assessment in the intervention cohort (phases 2 and 3) and less likely to be resident in a care home. With regard to cognitive skills, slightly more were identified as independent in the intervention cohort (phases 2 and 3). There were significantly more patients in the historical cohort (phase 1) with behavioural problems.

Table 2

Patient characteristics before and after implementation of shared electronic record.

Phase 1 Historical cohort Phase 2 and 3 Intervention cohort Significance

Mean (range) Mean (range)

Age (in years) 80 (18–101) 79 (17–99) T = –1.026, p = 0.305
N= 293 321
Gender n (%) n (%)
    Male 112 (38) 120 (39) Z = –0.252, p = 0.801
    Female 181 (62) 190 (61) Z = 0.252, p = 0.801
    N= 293 310
Domicile at time of assessment n (%) n (%)
    Alone 10 (3) 8 (3) Z = 0.000, p = 1.000
    With partner or family 24 (8) 35 (11) Z = –1.261, p = 0.208
    Residential/nursing home* 125 (43) 100 (31) Z = 3.076, p = 0.002
    Hospital* 134 (46) 176 (55) Z = –2.225, p = 0.027
    N= 293 319
Cognitive skills n (%) n (%)
    Independent 25 (9) 44 (14) Z = –1.915, p = 0.057
    Mildly impaired 11 (4) 22 (7) Z = –1.605, p = 0.110
    Moderately impaired 67 (23) 64 (20) Z = 0.897, p = 0.370
    Severely impaired 184 (64) 187 (59) Z = 1.260, p = 0.209
    N= 287 317
n (%) n (%)
Low mood 185 (63) 212 (67) Z = –1.035, p = 0.301
N= 293 316
Behavioural problems* 73 (25) 52 (16) Z = 2.765, p = 0.006
Katz Index of ADL Mean (SD) Mean (SD)
    Score 5.51 (1.05) 5.43 (1.18) T = –0.821, p = 0.412
n (%) n (%)
    High dependency (4–6) 274 (94) 298 (94) Z = 0.000, p = 1.000
N= 293 318

Source: Audit data. Note: N is the number of valid cases per variable.

Percentages may sum to >100 due to rounding. * p-value < 0.05.

Influence of electronic information sharing on service delivery

Perspectives on service delivery were elicited from interviews with managers and intermediate measures of outcome from the audit of case files. The impression from the manager interviews was that electronic information sharing had resulted in more timely service delivery. This was attributed to improved communication between team members and social workers and to changed working practices resulting from the move from a paper-based process to an electronic system of assessment and information sharing.

I wouldn’t say that it’s having an impact on care decisions.., because obviously the practitioners were always assessing previously.., its more about we’ve been able to have an impact on the way that we operate and the way we deliver things better than other people really in a more timely way (health manager)

Table 3 provides detailed information on measures of change relating to service delivery: integration, quality and timeliness of the response in the assessment process in each cohort. With regard to integration, in the intervention cohort (phases 2 and 3) it was significantly more likely that contributions from adult social care and secondary health care would be included in an assessment and less likely that a contribution from primary health care would be included. It is possible that information from this source was duplicated within adult social care and secondary health care. Although the mean number of individual persons contributing to assessment was significantly lower in the intervention cohort (phases 2 and 3), the overall number of agencies involved in the assessment did not differ between the cohorts. There were significant differences in respect of three of the four quality measures. It was significantly more likely that the assessment would be reviewed by a person other than the assessor and that details of the care plan were recorded in the patient’s record in the intervention group (phases 2 and 3). However, it was less likely that a reason would be listed for case closure, possibly indicative of incomplete administrative processes within the time frame for data collection. A composite measure of quality was significantly higher in the historical cohort; however, this appeared to be mostly influenced by one of the four quality indicators, the reason listed for case closure. There was one significant finding in relation to timeliness, an increase in the number of days between assessment and panel adjudication in the intervention group (phases 2 and 3). A possible explanation is that the assessment was completed in less time but this did not influence the timing of panel decisions. These measures of process confirm the views of managers relating to a redesign of administrative processes to promote a more streamlined service. Overall, they also demonstrate greater involvement of adult social care practitioners in the continuing healthcare assessment process.

Table 3

Measures of integration, quality and timeliness before and after implementation of shared electronic record.

Phase 1 Historical cohort Phase 2 and 3 Intervention cohort—time 1 and 2 Significance

n (%) n (%)

Integration
    Contribution to assessment
        Adult social care* 237 (81) 270 (94) Z = –4.905, p = 0.000
        Primary health care* 282 (97) 204 (71) Z = 8.646, p = 0.000
        Secondary health care* 216 (74) 252 (88) Z = –4.437, p = 0.000
        Other (third sector, care home staff) 63 (22) 47 (16) Z = 1.895, p = 0.059
        N= 291 288
    Extent of multidisciplinary contribution to assessment Mean (range) Mean (range)
        Number of agencies 2.7 (1 to 4) 2.7 (1 to 4) T = –1.062, p = 0.289
        N= 291 288
        Number of contributors* 3.5 (1 to 12) 3.2 (1 to 7) T = –3.472, p = 0.001
        N= 292 288
Quality n (%) n (%)
    Details of assessment in patient’s record (H = 293; I = 321) 293 (100) 319 (99) Z = 1.716, p = 0.087
    Assessment reviewed by another* (H = 293; I = 320) 279 (95) 317 (99) Z = –2.946, p = 0.003
    Care plan in patient’s record* (H = 293; I = 321) 253 (86) 302 (94) Z = –3.328, p = 0.001
    Reason listed for case closure* (H = 215; I = 240) 212 (99) 162 (68) Z = 10.165, p = 0.000
    Composite measure of quality Mean (range) Mean (range)
        Score* 3.8 (2 to 4) 3.6 (2 to 4) T = –5.005, p = 0.000
        N= 215 239
Timeliness Mean (range) Mean (range)
    Time between referral and assessment 11 (0 to 28) 10 (0 to 30) T = –1.301, p = 0.194
    N= 220 229
    Time between assessment and panel* 8 (0 to 28) 10 (0 to 30) T = 3.006, p = 0.003
    N= 145 251
    Time between referral and first service 23 (0 to 59) 25 (0 to 60) T = 0.966, p = 0.335
    N= 133 176

Source: Audit data. * p-value < 0.05.

Influence of electronic information sharing on partnership working

Data from the analysis of practitioner time use and interviews with managers provided insights into partnership working consequent on the development of the new ways of working associated with the introduction of electronic information sharing and the benefits which accrued from it. Both health and social care managers acknowledged that considerable work had been undertaken to ensure that both agencies were in agreement about what could and could not be shared. Any difficulties that had arisen were reported to have been well managed so that they did not delay or negatively affect implementation.

What we did together through the re-design and the changes was that we worked on it together.., it was never something that they just said - oh this is how you should do it (health manager)

Table 4 describes practitioners’ time use before (phase 1) and after the new ways of working had been established (phase 3), demonstrating the extent of partnership working between continuing healthcare team members and colleagues in other agencies. Overall, there was little change in the time spent in activities associated with assessment, review and monitoring and, more generally, liaison activities following the establishment of the new way of working (phase 3). However, with regard to assessment, significantly less time was spent gathering information from health services practitioners and significantly more time was spent assessing the needs of carers. More time was also spent gathering information from existing user records and discussing cases with adult social care practitioners, but this was not a statistically significant difference.

Table 4

Staff time use before and after implementation of shared electronic record.

Phase 1 Historical cohort Phase 3 Intervention cohort – time 2 Significance

Hours (% of category total) Hours (% of category total)

Assessment activities
    Pre-assessment information gathering 12.8 (12.1) 11.0 (12.0) Z = 0.022, p = 0.983
    Interview patient 0.8 (0.7) 5.3 (5.8) Z = –1.217, p = 0.226
    Information gathering about patient from carer 5.9 (5.6) 4.8 (5.2) Z = 0.252, p = 0.802
    Further information gathering from patient/carer by telephone 0.7 (0.7) 1.5 (1.6) Z = –1.217, p = 0.226
    Assessing and documenting carer’s own needs* 1.3 (1.2) 6.0 (6.5) Z = –1.976, p = 0.051
    Information gathering from health services staff* 19.8 (18.8) 6.9 (7.5) Z = 2.314, p = 0.023
    Information gathering from other agencies 6.8 (6.4) 6.1 (6.7) Z = –0.085, p = 0.932
    Information gathering from adult social care staff and records 10.9 (10.4) 16.8 (18.3) Z = –1.592, p = 0.115
    Complete documentation 46.6 (44.2) 33.5 (36.5) Z = 1.098, p = 0.275
Total time spent on assessment 105.5 (100) 91.8 (100)
Review and monitoring activities
    Review in person 18.5 (48.1) 9.5 (33.9) Z = 1.158, p = 0.254
    Review by telephone 1.7 (4.3) 0.3 (0.9) Z = 0.820, p = 0.417
    Monitoring social care* 0.5 (1.3) 4.3 (15.2) Z = –2.172, p = 0.036
    Monitoring health care 17.2 (44.6) 9.5 (33.9) Z = 0.879, p = 0.385
    Review with other providers and agencies* 0.6 (1.6) 4.5 (16.1) Z = –2.189, p = 0.035
Total time spent on review and monitoring 38.5 (100) 28.0 (100)
Liaison activities
    Liaise with health services staff1* 38.7 (62.7) 18.8 (33.7) Z = 3.076, p = 0.003
    Liaise with adult social care staff2* 15.7 (25.4) 26.3 (47.2) Z = –2.419, p = 0.019
    Liaise with staff from other agencies3* 7.4 (11.9) 10.6 (19.1) Z = –3.253, p = 0.002
Total time spent on liaison activities 61.7 (100) 55.7 (100)

* p-value < 0.05.

1 Information gathering from health services staff, arranging and monitoring health care.

2 Information gathering from adult social care staff and records, arranging and monitoring social care.

3 Information gathering from other agencies and review with other providers and agencies.

With regard to monitoring and review activities, significantly more time was spent on review in conjunction with other providers and agencies. The categories of ‘review in person’ and ‘monitoring health care’ constituted the greatest proportion of time spent on review and monitoring at both time points. Following the establishment of the new way of working (phase 3), both were lower, although this was not a statistically significant finding. In terms of significant findings for liaison activities, less time was spent liaising with health services practitioners and more time was spent liaising with adult social care practitioners, as a proportion of the time spent on this activity. Findings also suggested more time was spent liaising with practitioners from other agencies and service providers. Liaising with health services practitioners comprised the majority of time spent on this activity before the establishment of the new way of working (phase 1), whilst liaising with adult social care practitioners constituted the majority of this time after it.

Impact of electronic information sharing on practice

The impact of electronic information sharing was explored from two perspectives: managers’ views of the implementation process and objective measures of practitioners’ well-being and job satisfaction. Findings in respect of the former were reported in relation to both the development of the new ways of working and the period following their establishment. Both managers acknowledged that the introduction of the intervention was facilitated by the involvement of external consultants as part of a larger programme to reduce waste and improve administrative efficiency. Within the continuing healthcare services team, there was evidence of the new manager acting as a ‘change agent’ in terms of motivating and encouraging practitioners to review and appraise their working practices (Kanter, 1983). Team members felt that they had played an active role in the development of the new electronic record sharing system alongside their adult social care colleagues, rather than being recipients of an initiative.

I think we have been in control really… If the team weren’t happy… they would look at it again (health manager)

Nevertheless, it was acknowledged that the technicalities of accessing information across health and social care agencies as well as the formulation of information governance agreements and mechanisms to operationalise them were complex issues. A concern of social care practitioners was that the nurse practitioners were not taking responsibility for moving the continuing healthcare assessment forward. The new means of sharing information was said to have brought a greater appreciation of the process of assessment for continuing healthcare services funding by focusing attention on and clarifying the different responsibilities of the practitioner groups involved. This shared process was described as having removed the ‘mystique’ (health manager) that had previously surrounded the process and provided transparency, noted by the adult social care manager as a key aim.

The impact on the social work teams has been that they haven’t got that frustration. They know who to speak to … they don’t have to wait for the result of the panel process, they can go into the system … and they can see what the decisions were and follow that flow…. I think from our point of view it is a much improved process (adult social care manager)

Overall, the initiative served to improve the connection between the two agencies and their practitioners and to harmonise aims and expectations. This had been facilitated by the secondment of an experienced social worker to work alongside the continuing healthcare services team for several months. The two managers described both the processes themselves and the learning from them as having become fully embedded and become ‘the way that we do things’ (health manager). This was described as having provided the building blocks for future improvements in information sharing across health and social care in the locality.

Data on the impact of electronic information sharing on practitioners are reported in Table 5, which presents practitioners’ well-being and job satisfaction scores for members of the continuing healthcare services team before (phase 1, historical cohort) and after the new ways of working had been established (phase 3, intervention cohort, time 2).

Table 5

Staff well-being and job satisfaction before and after implementation of shared electronic record.

Phase 1 Historical cohort Phase 3 Intervention cohort – time 2 Significance

Mean (Range) N= Mean (Range) N=

Job insecurity –1.8 (–4 to 3) 8 –3.7 (–8 to 0) 7 T = 1.648, p = 0.123
Decision latitude 69.3 (54 to 84) 8 69.4 (60 to 74) 8 T = –0.586, p = 0.567
    Decision authority 34.5 (24 to 44) 8 35.5 (28 to 48) 8 T = –0.327, p = 0.749
    Skill discretion 34.8 (30 to 40) 8 35.7 (32 to 38) 8 T = –0.952, p = 0.357
Psychological job demands 38.3 (30 to 44) 8 35.8 (29 to 42) 8 T = 0.951, p = 0.358
Social support 25.0 (19 to 30) 8 25.0 (22 to 27) 8 T = 0.000, p = 1.000
    Co-worker support 12.8 (11 to 15) 8 12.4 (9 to 16) 8 T = 0.436, p = 0.669
    Supervisor support 12.3 (8 to 16) 8 12.6 (8 to 15) 8 T = –0.315, p = 0.757
Customer relationships 12.3 (7 to 16) 8 12.3 (10 to 16) 8 T = 0.000, p = 1.000
Self-identity through work 18.3 (15 to 23) 7 19.5 (16 to 23) 8 T = –0.962, p = 0.354
Job satisfaction 3.1 (2 to 4) 8 3.4 (2 to 5) 8 T = –0.650, p = 0.527

No significant differences in the Karasek sub-scales or in job satisfaction were evident in the small sample. Staff well-being and job satisfaction scores were high at both timepoints. Importantly, overall there were no adverse effects on the well-being and morale of team members reported consequent on the introduction of a shared electronic record.

Discussion

The aim of this mixed-methods study was to evaluate the implementation of a shared electronic record between nursing and adult social care practitioners employed in separate agencies and locations to inform the assessment of need for adults with complex health needs. It was conducted in a single site. Successful implementation of a shared electronic record between nursing and adult social care practitioners was achieved, demonstrating the importance of involving staff in the design and implementation of changed administrative processes. Electronic information sharing permitted more timely service delivery by promoting more efficient processes within formal working structures. Partnership working was facilitated by electronic information sharing because it permitted greater involvement of adult social care practitioners in the assessment and associated activities. Electronic information sharing had no adverse effects on the well-being and morale of practitioners within the continuing healthcare team.

However, there were a number of limitations. Firstly, with regard to patient data, the audit of case files was prescribed through the contractual arrangements with the funders, permitting only small changes to reflect local circumstances. There were missing data in the final patient sample, largely attributable to the frailty of the population, some of whom died during the assessment process or were unfit for discharge from hospital. Furthermore, the nature of the intervention and the circumstances in which it was conducted meant it was inappropriate to collect data from patients and carers about their experience, and sufficient carers could not be recruited to the study. Secondly, whilst staff survey data were collected from all team members, the small population limited the power to detect changes. However, the very small changes in the measures reported would be unlikely to result in a significant difference even in a much larger sample. Thirdly, the qualitative findings represented the perceptions of managers at a particular point in the implementation of the shared electronic record and in a single setting. Additionally, no data were collected on practitioners’ perspectives on the development of electronic information sharing between the two agencies. Fourthly, this was a study of intermediate and not final outcomes, and the perspective of patients and carers was absent. Fifthly, since this was a case study focussing on the circumstances, dynamics and complexity of a single service, some caution must be exercised with regard to the generalisability of the findings (Bowling, 1997).

A strength of this study was that it sought to achieve ‘ecological validity’, ‘to make the research fitting to the real world’ (Banister et al., 1994: p. 5). Regular meetings were held with senior members of the continuing healthcare team to facilitate data collection and interpretation of findings. For example, this group determined the distinction between the phases of the research (Table 1) and identified the sample. Spurious differences between the first phase and phases 2 and 3 were not found in the analysis, in part attributable to the involvement of team members in data collection, preparation, analysis and interpretation of the findings. Furthermore, the significant finding relating to the time between assessment and the adjudication of the panel was identified by team members as reflecting the reduction in time between referral and completion of assessment, because the dates of the adjudication panel were fixed (Table 3).

Whilst measures of change within the empirical data were small, they complemented evidence of successful implementation of the new technology reported in the manager interviews. This is evidenced in a number of ways. The Decision Support Tool, a schedule mandated by central government which did not change during the period of data collection, was the basis for the audit of case files. Differences between data collected in phase 1 and phases 2 and 3 were primarily related to measures of integration, quality and timeliness of tasks, process indicators of improved performance (Table 3). Data from the staff survey relating to time use revealed that more time was spent liaising with adult social care practitioners and less time liaising with and gathering information from health services practitioners following implementation of the shared electronic record (Table 4). Furthermore, staff well-being and job satisfaction changed little as a consequence of the implementation of the new technology (Table 5). Both were high before and after implementation of the shared electronic record, suggesting that there were no adverse effects on staff morale consequent on this change, a further measure of the successful implementation of the shared electronic record.

These data also demonstrated that electronic information sharing within a multidisciplinary assessment can promote partnership working and administrative efficiencies through standardisation and transparency in collecting information to inform decision making, reflecting long-standing policy objectives (Department of Health, 2001; Department of Health and Social Care, 2018a). In contrast to other research (Saleem et al., 2011; Waterson, 2014), practitioners did not exhibit reluctance to use the shared electronic record and achieved their goal of ‘paperless working’. This is perhaps a consequence of health and social care practitioners being in separate locations and lacking the opportunity for, and benefits of, informal corridor conversations (González-Martínez et al., 2015). Other research conducted in a district general hospital within the same demonstration programme demonstrated that face-to-face transfer of information was preferable to electronic transfer between ward staff and discharge coordinators (Wilberforce et al., 2017). Van der Meijden and colleagues (2003) suggested that the successful implementation of a computer-based information system is a relative concept, perceptions of which may change over time. Therefore, it might be that successful implementation of this shared electronic record was setting and time specific.

Nevertheless, this pilot project demonstrated that it was technically possible to integrate the assessment process within continuing healthcare with the parallel process within adult social care. Moreover, there were benefits in terms of administrative efficiency with no adverse effects on staff morale. However, whilst the potential for change was established, its suitability for wider adoption could not be proved, by virtue of this being a case study. Nevertheless, there may be potential for it to be replicated elsewhere because continuing healthcare is a universal service that uses a standardised assessment process, the Decision Support Tool, and a substantial number of local authorities use the CareFirst client record system that hosted CareView, which permitted practitioners in both agencies to view a single patient record (OLM Systems Ltd., 2013; Department of Health and Social Care, 2018b).

Conclusion

This case study evaluated the implementation of a shared electronic record between nursing and adult social care practitioners employed in separate agencies and locations to inform the assessment of need for adults with complex health needs. Whilst transfer of information between agencies is not new, electronic means to support the accumulation of assessment data within a single shared document, as demonstrated in this study, are. The potential for this to facilitate multidisciplinary assessment merits further consideration.

Acknowledgements

We thank nursing practitioners, administrative staff and managers for their contribution to this work and the clinical support officers for their contribution to data collection.

Source of funding

This article presents independent research funded by the Department of Health and Social Care. The views expressed in this article are those of the authors and not necessarily those of the Department of Health and Social Care.

Competing Interests

The authors have no competing interests to declare.

References

  1. Australian Government. 2018. Aged care Australia – types of care and services. Australia: Australian Government. Available at: https://www.myagedcare.gov.au/help-home/home-care-packages [Accessed 26 June 2018]. 

  2. Banister, P, et al. 1994. Qualitative methods in psychology: A research guide. Buckingham: Open University Press. 

  3. Bowling, A. 1997. Research methods in health: Investigating health and health Services. Buckingham: Open University Press. 

  4. Challis, D, et al. 2012. National evaluation of the Common Assessment Framework for Adults Demonstrator Sites volume 1: Service delivery and outcomes: literature review, discussion paper M268. Manchester: Personal Social Services Research Unit, University of Manchester. 

  5. Challis, D, Clarkson, P and Warburton, R. 2006. Performance indicators in social care for older people. Aldershot: Ashgate. 

  6. Chester, H, et al. 2015. Exploring patterns of care coordination within services for older people. International Journal of Care Coordination, 18(1): 5–17. DOI: https://doi.org/10.1177/2053434515571371 

  7. Corbin, J and Strauss, A. 2008. Basics of qualitative research: Techniques and procedures for developing grounded theory. London: SAGE Publications. DOI: https://doi.org/10.4135/9781452230153 

  8. Department of Health. 2001. National service framework for older people: modern standards and service models. London: Department of Health. 

  9. Department of Health. 2009a. Common assessment framework for adults: A consultation on proposals to improve information sharing around multidisciplinary assessment and care planning. London: Department of Health. 

  10. Department of Health. 2009b. The national framework for NHS continuing healthcare and NHS funded nursing care. London: Department of Health. 

  11. Department of Health. 2012. The power of information: putting all of us in control of the health and social care information we need. London: Department of Health. 

  12. Department of Health. 2013. Making sure health and social services work together. London: Department of Health. 

  13. Department of Health and Department for Communities and Local Government. 2014. Better care fund: Policy framework. London: Department of Health. 

  14. Department of Health and Social Care. 2018a. Care and support statutory guidance. London: Department of Health and Social Care. [Accessed 26 June 2018]. 

  15. Department of Health and Social Care. 2018b. National framework for NHS continuing healthcare and NHS-funded nursing care. London: Department of Health and Social Care. [Accessed 26 June 2018]. 

  16. DSS. 2017. Comparing means and proportions, data and statistical services. New Jersey: Princeton University. Available at: https://www.princeton.edu/~otorres/Excel/proportions.xls [Accessed 31 October 2017]. 

  17. Field, A. 2009. Discovering statistics using SPSS. London: SAGE Publications. 

  18. González-Martínez, E, et al. 2015. Hospital staff corridor conversations: work in passing. Journal of Advanced Nursing, 72(3): 521–532. DOI: https://doi.org/10.1111/jan.12842 

  19. IBM. 2017. IBM SPSS custom tables 24. New York: IBM Corporation. Available at: ftp://public.dhe.ibm.com/software/analytics/spss/documentation/statistics/24.0/en/client/Manuals/IBM_SPSS_Custom_Tables.pdf [Accessed 31 October 2017]. 

  20. Kane, RL and Kane, RA. 2000. Assessing older persons: Measures, meaning and practical applications. New York: Oxford University Press Inc. 

  21. Kanter, R. 1983. The change masters: Innovation and entrepreneurship in the American corporation. New York: Touchstone Books. 

  22. Karasek, RA. 1979. Job decision latitude and mental strain: Implications for job redesign. Administrative Science Quarterly, 24(2): 285–308. DOI: https://doi.org/10.2307/2392498 

  23. Lewins, A and Silver, C. 2007. Using software in qualitative research: A step-by- step guide. London: SAGE Publications. DOI: https://doi.org/10.4135/9780857025012 

  24. Loader, BD, Hardey, M and Keeble, L. 2008. Health informatics for older people: A review of ICT facilitated integrated care for older people. International Journal of Social Welfare, 17: 46–53. DOI: https://doi.org/10.1111/j.1468-2397.2007.00489.x 

  25. Lumley, T, et al. 2002. The importance of the normality assumption in large public health data sets. Annual Review of Public Health, 23: 151–169. DOI: https://doi.org/10.1146/annurev.publhealth.23.100901.140546 

  26. New Zealand Government. 2016. Support at home after a needs assessment. New Zealand: New Zealand Government. Available at: https://www.govt.nz/browse/health-system/help-in-your-home/needs-assessment/support-at-home-after-a-needs-assessment/ [Accessed 26 June 2018]. 

  27. OLM Systems Ltd. 2013. CareFirst: Service definition. London: OLM Systems Ltd. Available at: www.olmgroup.com/WorkArea/DownloadAsset.aspx?id=973 [Accessed 24 May 2016]. 

  28. Pope, C, Ziebland, S and Mays, N. 2000. Qualitative research in health care: analysing qualitative data. British Medical Journal, 320: 114–116. DOI: https://doi.org/10.1136/bmj.320.7227.114 

  29. Ritchie, J and Spencer, L. 1994. Qualitative data analysis for applied policy research. In: Bryman, A and Burgess, RG (eds.), Analysing Qualitative Data. London: Routledge. 

  30. Saleem, J, et al. 2011. Paper persistence, workarounds, and communication breakdowns in computerised consultation management. International Journal of Medical Informatics, 80: 466–479. DOI: https://doi.org/10.1016/j.ijmedinf.2011.03.016 

  31. Taylor, B. 2012. Developing an integrated assessment tool for the health and social care of older people. British Journal of Social Work, 42: 1293–1314. DOI: https://doi.org/10.1093/bjsw/bcr133 

  32. The Cabinet Office. 2003. Trying it out: the role of ‘pilots’ in policy-making. Report of a review of government pilots. London: The Cabinet Office. 

  33. Van der Meijden, M, et al. 2003. Determinants of success of inpatient clinical information systems: A literature review. Journal of the American Medical Informatics Association, 10(3): 235–243. DOI: https://doi.org/10.1197/jamia.M1094 

  34. Waterson, P. 2014. Health information technology and sociotechnical systems: a progress report on recent developments within the UK National Health Service (NHS). Applied Ergonomics, 45(2): 150–161. DOI: https://doi.org/10.1016/j.apergo.2013.07.004 

  35. Weinberg, A, et al. 2003. What do care managers do? A study of modern working practice in older people’s services. British Journal of Social Work, 33(7): 901–919. DOI: https://doi.org/10.1093/bjsw/33.7.901 

  36. Weiner, K, et al. 2003. Integrating health and social care at the micro level: Health care professionals as care managers for older people. Social Policy and Administration, 37(5): 498–515. DOI: https://doi.org/10.1111/1467-9515.00354 

  37. Wilberforce, M, et al. 2017. An electronic referral system supporting integrated hospital discharge. Journal of Integrated Care, 25(2): 99–109. DOI: https://doi.org/10.1108/JICA-09-2016-0034 

  38. Witham, M, et al. 2015. Construction of a linked health and social care database resource – lessons on process, content and culture. Informatics for Health and Social Care, 40(3): 229–239. DOI: https://doi.org/10.3109/17538157.2014.892491 
