Amy Vercell and Sarah Hanbridge

Achieving long-term sustainability of health and social care services will require investment in both people and technology. It is paramount that digital health systems and processes are implemented robustly, promoting interoperability between systems. Adequate education and support must be provided for healthcare workers accessing these systems, ensuring that they are clinically meaningful for the patient population to which they are directed. The cancer clinical nurse specialist plays a pivotal role in delivering cancer care: providing a point of access, often in a key worker role; ensuring that information needs are met; and delivering specialist holistic care to patients. Supporting nurses to be digitally literate to deliver digitally enabled care is crucial. This chapter explores the history of digital health, provides current examples of digital nursing and looks to the future of digital innovation.

The first author, Amy Vercell, is a chief clinical information officer (CCIO) for Nursing and Allied Health Professionals with protected time for research; she completed a National Institute for Health and Care Research Pre-Doctoral Fellowship last year and is now working towards a doctoral application. She also works as an acute oncology advanced nurse practitioner (ANP) one day each week. Her previous experience spans oncology and haematology services, including working on an inpatient haematology ward, as unit sister in outpatient systemic anti-cancer therapy (SACT) delivery, as an acute oncology clinical nurse specialist (CNS) and as lead nurse for cancer of unknown primary (CUP), before moving to acute oncology ANP upon completing a master's qualification. The second author, Sarah Hanbridge, is a CCIO at Leeds Teaching Hospitals NHS Trust, England, United Kingdom (UK).
Prior to this, Sarah worked at The Christie NHS Foundation Trust in Greater Manchester (a specialist cancer hospital) as the CCIO and was the regional chief nursing information officer (CNIO) for the Northwest of England. Sarah has predominantly worked in acute care in clinical, operational, educational and digital roles.

With an ever-growing public health need, healthcare providers globally are being challenged to improve patient outcomes whilst containing costs, and digital technologies are now acknowledged as key to achieving this goal. Digital health tools can potentially improve the efficiency, accessibility and quality of care delivered to patients worldwide (Fahy and Williams 2021). Digital transformation enables a more holistic view of patient health through increased access to data whilst providing the tools to improve patient autonomy and self-management (The King's Fund 2022). Digital health refers to the use of information and communication technologies in healthcare to manage illness, reduce health risks and promote wellness (Ronquillo et al. 2022). It incorporates a broad scope of categories, including mobile health, health information technology, wearables, telehealth and telemedicine, artificial intelligence and machine learning, virtual visits and personalised medicine (Food and Drug Administration [FDA] 2020). The World Health Organization (2021) established three key objectives to promote the adoption and scale-up of digital health and innovation: translating data, research and evidence into practice; enhancing knowledge through scientific collaborations; and systematically assessing country needs to co-develop innovations that are fit for purpose and meet the requirements of the individual population.
Digital health is perceived as a tool to improve access to healthcare, reduce inefficiencies in the healthcare system, improve care quality, lower costs and enable more personalised care (Ronquillo et al. 2022). Ruth May, chief nursing officer (CNO) for England, emphasised the importance of digital and data, proclaiming her support for every healthcare organisation to have a CNIO and identifying that this would enable significant developments within the digital sphere. Her strategic plan for research was announced in November 2021 and complements the ambitions set out in 'Saving and Improving Lives: The Future of UK Clinical Research Delivery' (Department of Health and Social Care 2021) and the NHS Long Term Plan (NHS England 2019). Ms May identified five objectives that will promote the delivery of a person-centred research environment that empowers nurses to lead, participate in and deliver research (NHS England 2021). Creating digitally enabled nurse-led research is key to delivering better outcomes for the public. This work is being supported by the Phillips Ives Review, which aims to prepare the nursing and midwifery workforce to deliver the digital future. In the UK context, Professor Natasha Phillips is currently the national CNIO for England.

The Topol Review (2019) examined digital healthcare technologies and made recommendations to enable NHS staff to make the most of innovative technologies such as genomics, digital medicine, artificial intelligence and robotics to improve services, projecting their impact on the NHS workforce over the next 12 years. The nursing-focussed Phillips Ives Review, which plans to publish its findings at the CNO Summit in 2023, will build upon Topol's findings and recommendations to determine what the nursing and midwifery workforce requires to deliver healthcare in the digital age over the next 5, 10 and 12 years.
Until recently, nursing informatics was one of the lesser-known nursing specialties, even though it has been recognised as a specialty for over 30 years (Kirby 2015). The evolution of nursing informatics may be surprising: the concept can be traced back to Florence Nightingale during the Crimean War, when she recognised the importance of documenting patient care to monitor a patient's progress. This ingenious notion may have seemed radical at the time, but we now understand the importance of documenting patient care and collecting patient data (Kirchner 2014).

Clinical nurses have many opportunities to get involved during the electronic health record (EHR) selection and implementation process, and this stakeholder involvement improves engagement and acceptance. In general, the phases of the process include system selection; system design and development, including current and future state validation; testing and education; implementation or 'go live'; and ongoing maintenance and optimisation (Coiera 2003). Through partnerships and collaboration with other professional leaders, the CNIO can present a unified message and direction for innovation focused on safety, quality and efficiency (American Nurses Association 2015). The Royal College of Nursing (RCN) kick-started the campaign 'Every nurse an e-nurse' during NHS Digital's Nursing Week (RCN 2018). The aim of the campaign was to transition every nurse into an e-nurse by 2020, with a focus on training for various aspects of e-nursing, including patient bedside technology, wearable technology, mobile health and data security (Stevens 2018).

Delivering healthcare is complex, and optimising quality and safety is the priority. One of the first examples of using digital technology to improve care was the introduction of electronic observations.
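Electronic observation systems automate the calculation of an early warning score from recorded vital signs. As an illustration of that automation, the aggregate NEWS2 calculation can be sketched in a few lines of Python. This is a simplified sketch of the published thresholds (Royal College of Physicians 2017), covering SpO2 Scale 1 only; it is illustrative and not intended for clinical use.

```python
def news2(resp_rate, spo2, on_oxygen, systolic_bp, pulse, alert, temp):
    """Illustrative aggregate NEWS2 score (SpO2 Scale 1 only)."""
    score = 0
    # Respiration rate (breaths/min)
    if resp_rate <= 8 or resp_rate >= 25:
        score += 3
    elif 21 <= resp_rate <= 24:
        score += 2
    elif 9 <= resp_rate <= 11:
        score += 1
    # Oxygen saturation (%), Scale 1
    if spo2 <= 91:
        score += 3
    elif spo2 <= 93:
        score += 2
    elif spo2 <= 95:
        score += 1
    # Supplemental oxygen scores 2
    if on_oxygen:
        score += 2
    # Systolic blood pressure (mmHg)
    if systolic_bp <= 90 or systolic_bp >= 220:
        score += 3
    elif systolic_bp <= 100:
        score += 2
    elif systolic_bp <= 110:
        score += 1
    # Pulse (beats/min)
    if pulse <= 40 or pulse >= 131:
        score += 3
    elif 111 <= pulse <= 130:
        score += 2
    elif 41 <= pulse <= 50 or 91 <= pulse <= 110:
        score += 1
    # Consciousness (ACVPU): anything other than Alert scores 3
    if not alert:
        score += 3
    # Temperature (degrees Celsius)
    if temp <= 35.0:
        score += 3
    elif temp >= 39.1:
        score += 2
    elif temp <= 36.0 or 38.1 <= temp <= 39.0:
        score += 1
    return score

# A stable set of observations scores 0; deteriorating observations raise
# the aggregate score and trigger escalation recommendations.
print(news2(18, 97, False, 120, 75, True, 37.0))  # 0
```

Because the score is computed rather than hand-tallied, the system can attach an escalation recommendation to every set of observations, addressing the calculation errors seen with paper charts.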
This technology facilitates real-time clinical assessment at the patient's bedside, enabling prompt detection of any change in the patient's condition and allowing the required intervention to be implemented swiftly to attenuate clinical deterioration (Lang et al. 2019). Prior to electronic observations, vital signs were documented on paper observation charts, and incorrect calculation and interpretation of the Early Warning Score (as it was then), combined with poor compliance with clinical escalation protocols, had a detrimental impact on morbidity and mortality (Schmidt et al. 2015; Wong et al. 2015). Electronic observations ensure that complete data are inputted, and the National Early Warning Score 2 (NEWS2) is automatically calculated, with clinical recommendations provided based on that score, optimising outcomes (Royal College of Physicians 2017). This innovative technology allows real-time, automatic information processing, with the aim of improving the efficacy of Early Warning Scores in practice, and provides greater visibility of key patient data.

Traditionally, health records were written on paper and maintained in folders divided into sections based on the type of note, with only one copy available. New computer technology developed in the 1960s and 1970s laid the foundation for the EHR. The use of EHRs has not only made patients' clinical information more accessible but also changed the format of health records and, thus, changed healthcare (Evans 2016). One of the first and most successful attempts to streamline and improve the keeping of patient records was the American problem-oriented medical record (POMR). Developed by Dr Lawrence Weed in 1968, the POMR is still used by some medical and behavioural health providers today. In 1972, the first iteration of what we now know as an EHR was introduced: the Regenstrief Institute in Indianapolis enlisted the help of Clement McDonald to develop its EHR programme.
McDonald set out to tackle a twofold problem: designing the database, with the multiple issues that arise when attempting to link healthcare organisations, disciplines and professions through one central records system; and allowing for the full spectrum of capabilities, such as medication ordering, laboratory tests and radiology. By the early 1990s, most American industries had taken the plunge into automating data and transactions; healthcare, on the other hand, was struggling to keep up. In 1991, a book titled The Computer-Based Patient Record: An Essential Technology for Health Care (Institute of Medicine [IoM] 1991) shook the industry out of complacency and helped drive the adoption of EHRs by breaking down the challenges associated with the technology. In response to advances in technology, a shift from paper-based health records to electronic records commenced in the UK (IoM 1997; Evans 2016) as the inadequacies of paper-based records gradually became evident in the healthcare industry (Ornstein et al. 1992).

In 2002, the UK government chose a top-down, government-driven approach to implement one EHR nationally, known as the NHS Care Records Service; this was the cornerstone of the £12.7 billion National Programme for Information Technology (NPfIT) investment (Maude 2011). This very ambitious programme required enormous resources and proposed to eliminate the challenges of interoperability between the various competing EHR systems around the UK, reforming the way the NHS uses information to improve the quality of care delivered (Justinia 2017). However, failings became apparent early in implementation: the design was overambitious, and the skills and strategy required to deliver it were lacking, so the programme was dismantled in 2011. This failure highlights the necessity of ensuring that digital systems and processes meet the needs of the population they serve, and emphasises the importance of culture and leadership.
One of the biggest advances in recent years in home blood sugar monitoring has been the introduction of the continuous glucose monitor (CGM), an alternative means of measuring glucose levels subcutaneously (Fain 2022). It uses a sensor placed on the upper arm and worn externally by the user, allowing glucose information to be monitored continuously. This information helps the patient and their multiprofessional team to identify what changes are needed to insulin administration to achieve optimal glucose control, allowing patients to quickly assess glycaemic patterns and prevent a hypoglycaemic episode. CGM devices comprehensively optimise diabetes management by reviewing activity levels, medication and insulin dosing, food intake and stress to provide patients with information about how self-care decisions affect glucose levels. This promotes patient empowerment and gives patients much more control over their diabetes management.

Rather than performing capillary finger-prick tests numerous times daily to monitor glucose levels, patients with a CGM device insert a small sensor wire under the surface of their skin with an automatic applicator and secure it with an adhesive patch. A glucose-oxidase platinum electrode attached to the sensor measures the glucose concentration in interstitial fluid throughout the day and night. A transmitter sends real-time glucose readings wirelessly to a handheld receiver or compatible smart device (or insulin pump), which displays current glucose levels and trends (Fain 2022). Personal CGM systems measure glucose continuously and record readings every five minutes. Unlike self-monitoring, which provides a snapshot or point-in-time glucose measurement, CGM devices report data in numerical and graphical formats, indicating current glucose levels and possible patterns. This allows patients to respond quickly to prevent acute glycaemic episodes and to make informed decisions about insulin dosing and self-management.
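The trend reporting described above can be sketched in a few lines of Python: given readings at five-minute intervals, the current rate of change is projected forward to flag a possible hypoglycaemic episode before it occurs. The threshold, lookback and projection horizon below are assumptions chosen for illustration, not clinical guidance, and the function names (`glucose_trend`, `predicted_hypo`) are hypothetical.

```python
from datetime import datetime, timedelta

HYPO_THRESHOLD = 4.0   # mmol/L; illustrative, not clinical guidance
LOOKBACK = 3           # number of recent readings used for the trend

def glucose_trend(readings):
    """Average rate of change (mmol/L per minute) over recent readings.

    `readings` is a list of (timestamp, glucose) pairs recorded at
    five-minute intervals, as transmitted by a CGM sensor.
    """
    recent = readings[-LOOKBACK:]
    (t0, g0), (t1, g1) = recent[0], recent[-1]
    minutes = (t1 - t0).total_seconds() / 60
    return (g1 - g0) / minutes

def predicted_hypo(readings, horizon_minutes=20):
    """Flag if the current trend would cross the hypo threshold soon."""
    _, current = readings[-1]
    projected = current + glucose_trend(readings) * horizon_minutes
    return projected < HYPO_THRESHOLD

# Four readings, five minutes apart, falling by 0.5 mmol/L each time.
start = datetime(2024, 1, 1, 8, 0)
falling = [(start + timedelta(minutes=5 * i), 7.0 - 0.5 * i) for i in range(4)]
print(predicted_hypo(falling))  # True: 5.5 mmol/L now, but falling fast
```

The point of the sketch is the contrast with finger-prick self-monitoring: a single point-in-time reading of 5.5 mmol/L looks safe, whereas the trend across consecutive readings reveals the impending episode.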
Corticosteroids are a mainstay within cancer services for managing the myriad complications associated with anti-cancer treatments; consequently, steroid-induced hyperglycaemia and steroid-induced diabetes mellitus are seen increasingly frequently within the cancer population (Jeong et al. 2016). Blood glucose monitoring is advised when commencing patients on steroids; however, there appears to be disparity in practice due to a lack of clear clinical guidelines (Dinn 2019). The cancer CNS plays a significant role in educating patients on the symptoms associated with hyperglycaemia and on how to monitor their blood sugars using the equipment provided, whilst liaising with the primary care and/or endocrine teams regarding any interventions required to optimise patient care.

There was a time when prescribing was the realm of the medical profession. In May 2006, appropriately qualified and registered nurses gained access to the full British National Formulary (BNF), giving them independent prescribing capabilities similar to those of medical practitioners. Non-medical prescribing in the UK is now well established as a mainstream qualification (it is the focus of Chapter 12 in this book). The ambition behind this expansion began in July 2000, when the Department of Health published 'The NHS Plan', which promised to create new roles and responsibilities for nurses and provide more opportunities to extend their nursing roles. To qualify as a non-medical prescriber, nurses must undertake a recognised Nursing and Midwifery Council (NMC) accredited prescribing course through a UK university. Upon successful completion, the qualification must be registered with the professional body, such as the NMC in the UK. Evidence shows that nurse prescribing improves patient care by ensuring timely access to medicines and treatment and by increasing flexibility for patients who would otherwise need to wait to see a medical doctor (Drennan et al. 2009).
Electronic independent nurse prescribing has been transformational for the cancer CNS. Electronic prescribing enhances quality and safety by reducing prescribing errors, increasing efficiency and containing healthcare costs (Porterfield et al. 2014). Similarly, cancer electronic prescribing systems enable improved access to drug reference information, warnings and alerts, with improved efficiencies for pharmacy and drug administration (NHS Transformation Directorate 2017). With new models of cancer care emerging and evolving, there is a clear clinical need for more effective information sharing between and across care settings. This is especially apparent in CNS roles when managing patients with complex cancer needs, in order to optimise clinical outcomes and improve the quality of care.

Enabling nurses and other healthcare professionals to order tests and procedures electronically has been a positive step towards the future of pathology, with the overall objective of improving patient care. It has enabled clinicians to access pathology, microbiology and cytology tests and results across healthcare settings by receiving the right information at the right time and place, supporting improved clinical decision-making and patient safety. Many cancer CNSs involved in the delivery of SACT order blood tests for patients prior to each cycle of treatment. Electronic order sets increase efficiency within the SACT pathway by ensuring that the correct blood tests are ordered, with results displayed within the EHR to promote accessibility for the clinical team whilst illustrating trends in blood results. Trend information is valuable when assessing tumour markers, for example, as rising tumour markers can indicate disease progression (Liu et al. 2021).
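The idea of an electronic order set can be sketched simply: a named bundle of tests requested in a single action, so the correct panel is ordered before every SACT cycle. The structure and names below (`PRE_SACT_ORDER_SET`, `place_orders`, the example test names) are hypothetical and do not reflect any particular EHR's interface.

```python
# Illustrative sketch of an electronic order set: a named bundle of
# blood tests that is requested as one action within the EHR.
PRE_SACT_ORDER_SET = {
    "name": "Pre-SACT bloods",
    "tests": [
        "Full blood count",
        "Urea and electrolytes",
        "Liver function tests",
    ],
}

def place_orders(order_set, patient_id):
    """Expand an order set into one electronic order per test."""
    return [
        {"patient_id": patient_id, "test": test, "order_set": order_set["name"]}
        for test in order_set["tests"]
    ]

# One action places the whole panel, so no required test is missed.
orders = place_orders(PRE_SACT_ORDER_SET, "example-patient-id")
print(len(orders))  # 3
```

The design point is that the bundle, not the individual clinician's memory, encodes which tests the pathway requires; results then flow back into the EHR where trends can be displayed.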
15 Digital Health
Abstract
15.1 Introduction
15.2 The Role of the Informatics Nurse/Chief Nursing Information Officer
15.3 Electronic Observations
15.4 Electronic Health Records
15.5 Digitalisation of Blood Glucose Monitoring
15.6 Electronic Nurse Prescribing
15.7 Nurse Digitally Requesting Bloods
15.8 Remote Consultations