On this page we will be reporting on our funded projects as they progress, and as they come to fruition.
- ‘ExTRA-PPOLATE (Explainable Therapy Related Annotations: Patient & Practitioner Oriented Learning Assisting Trust & Engagement)’, Mat Rawsthorne, Jacob Andrews et al.
- ‘BREATHE – IoT in the Wild’, Professor Katharine Willis (University of Plymouth) et al.
- ‘Emotional AI and Children: Ethics, Parents, Governance’, Prof A. McStay (Bangor University) and G. Rosner (IoT Privacy Forum)
Our Future of Mental Health theme has funded the ExTRA-PPOLATE project, which explores HDI approaches to the creation of an automatic coding tool that can help therapists improve their practice, and importantly, is trusted by stakeholders such as therapists and patients. The multidisciplinary research team includes Mat Rawsthorne and Jacob Andrews (Principal Investigators) from the NIHR MindTech MedTech Co-operative, Sam Malins (Clinical Psychologist), Dan Hunt (Linguist) and Jeremie Clos (Computer Scientist) from the University of Nottingham, as well as Tahseen Jilani (Data Analyst) from Health Data Research UK and Yunfei Long (Computer Scientist and natural language processing specialist) from the University of Essex.
Sadly, the fallout from COVID-19 is expected to affect mental health for years to come. Large quantities of high-quality therapy are going to be needed. This means carefully assessing therapy sessions’ effectiveness, whilst also thinking carefully about how therapists’ time is spent. Currently, therapy assessment is resource expensive, often requiring a second and more senior therapist in the room. This second therapist could instead be seeing another patient, and their presence can also alter the dynamic between the patient and their therapist. The ExTRA-PPOLATE tool aims to address this problem using machine learning. As an augmented intelligence system, ExTRA-PPOLATE aims to help therapists assess their sessions and support decision making, identifying weak spots and suggesting ways to improve the sessions. The legibility of ExTRA-PPOLATE’s algorithm is also important to the research team, and the model offers insight to therapists on how it has reached decisions, allowing the therapist to make corrections.
In order to create the ExTRA-PPOLATE tool, natural language processing techniques were applied to therapy transcripts to identify features associated with different psychological processes in therapy. Features included sentence length, emotive words, sentence polarity and readability scores. Machine learning was then applied to the numeric features generated by these techniques to train the ExTRA-PPOLATE tool. ExTRA-PPOLATE can now be used to identify processes that are (or aren’t) happening in sessions, although further work is needed to validate the tool.
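The kind of feature extraction described above can be sketched in a few lines. This is a minimal illustration only: the mini-lexicons, the syllable heuristic and the feature names below are invented for the example, and are not the project’s actual feature set or lexicons.

```python
import re

# Hypothetical mini-lexicons for illustration; the real project would use
# established sentiment and emotion lexicons.
EMOTIVE_WORDS = {"afraid", "happy", "sad", "angry", "hopeful", "worried"}
POSITIVE = {"happy", "hopeful", "better", "good"}
NEGATIVE = {"afraid", "sad", "angry", "worried", "worse"}

def count_syllables(word):
    """Crude syllable estimate: count runs of vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def extract_features(utterance):
    """Turn one transcript utterance into numeric features of the kinds
    named above: length, emotive-word count, polarity, readability."""
    words = re.findall(r"[a-zA-Z']+", utterance.lower())
    n = len(words)
    emotive = sum(w in EMOTIVE_WORDS for w in words)
    polarity = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch reading ease, treating the utterance as one sentence.
    flesch = 206.835 - 1.015 * n - 84.6 * (syllables / n) if n else 0.0
    return {"length": n, "emotive": emotive,
            "polarity": polarity, "readability": round(flesch, 1)}
```

Vectors like these, computed per utterance across a transcript, are what a classifier would then be trained on to predict psychological processes.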
The involvement and engagement of therapists, patients and the public have been priorities for the ExTRA-PPOLATE team throughout the project. Two PPIE (Patient and Public Involvement and Engagement) members with lived experience of mental health difficulties have been involved since the project’s inception. Also involved is a PPRG (Patient and Practitioner Reference Group), made up of patients, carers, psychotherapists, therapy trainers and therapy managers. With the help of Matt Burton McFaul from Virtual Health Labs, the ExTRA-PPOLATE team has conducted three interactive online workshops with the PPRG, to help reflect on and reassess the project, specifically looking at issues of transparency and trust. These sessions have been crucial in informing the tool’s development and enabled the implementation of HDI approaches that increase stakeholders’ agency, legibility and negotiability capabilities around personal data, differentiating the project from approaches that are exclusively data-driven.
Key changes to the ExTRA-PPOLATE tool have been informed by the workshops. For example, patients and carers indicated that if therapists were to review each output for accuracy, the resultant codings would be subject only to the therapists’ choices and professional biases, preventing any opportunities to negotiate and discuss interpretations of the output with patients. In addition, therapists thought it impractical to spend extra time reviewing the coding after seeing each patient. Thus, the system is now being approached as a tool to provide an indication of where therapists could improve, without them having to check each automatic coding.
Patients and carers also explained that they would like their own view of the system readout and would like to be able to review the processes occurring in their sessions together with therapists, to provide agency and enable negotiability in how therapists’ practice should be changed. Finally, patients suggested that a small part of a therapy session could be analysed together with patients, such that the patients had a better idea of what the system was doing, permitting them greater legibility.
The feedback from the workshops thus reveals a design trade-off between the legibility of the system and its usability and practicality for stakeholders. Tensions and frictions can emerge in clinical contexts where strong power dynamics are present, and limited resources may challenge patients’ agency. PPRG members stated that while patients should not be required to invest significant amounts of time to understand how their data is being used, they must still be enabled to understand this, so that informed consent can be legitimately obtained and patients retain negotiability in choosing whether or not to consent to the use of the system with their data.
Therapist members of the reference group explained that attempts to show the workings of the tool, here provided as short sentences describing the language features that have been used to predict the identified psychological process, do not inform action that can be taken by any stakeholder, and thus while increasing understanding of how the tool has identified particular processes, do not provide a practical purpose. Other explainability features of the system, including an indicator of which processes should be used more frequently within a psychotherapy session, and which less, were seen as more useful features by steering group members.
The project team are currently reviewing the output from these workshops to reflect on how best to move forward with the practical design of the system while moving away from traditional data-driven approaches. Our objective is to keep championing legibility, negotiability and agency for patients in future related projects that aim to improve therapeutic interventions and practices. For more on ExTRA-PPOLATE’s interface, from the second of these three workshops, see here.
On the 20th April, there is an (online!) roadshow event, aimed at those categories of people that form part of the PPRG (patients, therapists, therapy trainers, service managers).
To register for the roadshow, email Alice.Phillips@nottingham.ac.uk
If you want to know more about the project and have any questions, please feel free to email: firstname.lastname@example.org
The pandemic has imposed many obstacles that have needed innovative workarounds from those involved across our projects, in order to continue carrying out their cutting-edge research. There is one particular project in our second theme, Beyond ‘Smart Cities’, that we focus on for this showcase, because its deep entanglements with COVID-19, though a barrier to the project, also reveal its work to be ever more urgent. The project is BREATHE – IoT in the Wild by Katharine Willis (School of Art, Design and Architecture, University of Plymouth), with RA Marcin Roszkowski, the Breathers support group, the ERDF-funded EPIC eHealth project, the Hi9 start-up, the South West IoT Network and the U. Edinburgh IoT Network.
Taking place in isolated and rural communities in Cornwall, BREATHE aims to test the value of smart technology to people with health problems in those communities, to enable them to breathe more easily. Participants are given wearable blood oxygen readers, and are able to manage their own health data, facilitated by an IoT test bed network. Through this approach, the processes of data collection, sharing and assessment are put into the hands of the people the data concerns.
This project investigates the challenges of creating an IoT network in a low-connectivity, rural area, and also the potential of such a network for improving health outcomes for those living in more rural and isolated settings. The pandemic has both hindered and highlighted the need for these investigations.
At the beginning of 2020, BREATHE partnered with the Liskeard and Wadebridge Breathers support groups to cooperatively plan, conduct and assess fieldwork. Breathers is a patient-run group in which people with Chronic Obstructive Pulmonary Disease (COPD), and similar long-term conditions, can get together to share experiences and take gentle exercise. Participating members were given pulse oximeters and asked to keep a diary of their blood oxygen readings, alongside comments on their health. The next stage of the fieldwork had to be put on hold due to the COVID pandemic, as these vulnerable groups were required to shield.
Since March, the focus has been on the technical development of a prototype LoRaWAN IoT network, linking it to the wearable blood oxygen readers and to an interactive speaker that people can ask for reports about their data. Moving forward, the plan is to return the focus to the community when the pandemic allows, trialling a pilot of this IoT network ‘in the wild’ with the Breathers groups, alongside running a data ethics workshop.
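One practical constraint in a set-up like this is that LoRaWAN uplinks carry only tens of bytes, so readings need a compact encoding before they are sent. The sketch below is a hypothetical illustration of such an encoding; the project’s actual payload format is not described in this write-up.

```python
import struct
import time

def encode_reading(spo2, pulse, ts=None):
    """Pack one pulse-oximeter reading into a compact 7-byte payload
    (4-byte Unix timestamp, 1-byte SpO2 percentage, 2-byte pulse rate),
    suitable in size for a bandwidth-constrained LoRaWAN uplink."""
    if not 0 <= spo2 <= 100:
        raise ValueError("SpO2 must be a percentage")
    if ts is None:
        ts = int(time.time())
    # Big-endian: unsigned int, unsigned char, unsigned short.
    return struct.pack(">IBH", ts, spo2, pulse)

def decode_reading(payload):
    """Unpack a payload produced by encode_reading."""
    ts, spo2, pulse = struct.unpack(">IBH", payload)
    return {"timestamp": ts, "spo2": spo2, "pulse": pulse}
```

A gateway-side service could decode payloads like these and surface them both to the wearer (for example via the interactive speaker) and, with consent, to health professionals.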
Whilst the pandemic has hindered the follow-up trial and community-driven aspect of the fieldwork, it has at the same time put breathing health, and the need for support networks, firmly on many organisations’ agendas, clarifying the necessity of this project.
An additional arm of this project is to assess its replication in isolated communities elsewhere in the UK, through partnership with U. Edinburgh’s IoT test bed network, working in the Highlands and Islands of Scotland. As with Cornwall, this part of the project has also been encumbered by COVID-19.
However, as this project progresses in the midst of this pandemic, the ethical challenges it addresses are becoming more urgent. It matters how people access and share their data, especially highly personal data such as their health data, which can have large consequences if shared with commercial companies. Accessing personal data about breathing can also help self-awareness of the condition and allow data sharing with health professionals to help manage symptoms.
Testing smart sensor networks ‘in the wild’ has not traditionally been widely undertaken, and this project seeks to create a model for data sharing in which people and communities share data to benefit the individual, the group and, more widely, the treatment of particular health conditions.
On a technical level, the project aims to demonstrate the benefits of IoT networks ‘in the wild’, with a view to creating new products and services, innovating off-grid data sharing, informing health policy guidance on data ethics, and genuinely benefiting patients and people living with health conditions in rural and coastal communities.
We very much look forward to seeing this project progress, as it sheds light on the potential for improved health for those in rural communities, through applying HDI design principles.
Professor Katharine Willis
Professor Willis is Professor of Smart Cities and Communities and part of the Centre for Health Technology at the University of Plymouth. She leads on the UKRI-funded Centre for Health Technology Pop-up. Over the last two decades she has worked to understand how technology could support communities and contribute to better connections to space and place. Her recent research addresses issues of digital and social inclusion in smart cities, and aims to provide guidance as to how we can use digital connectivity to create smarter neighbourhoods.
One of the first projects to come out of the AI Intelligibility and Public Trust call is Prof A. McStay’s (Bangor University) and G. Rosner’s (IoT Privacy Forum) report on Emotional AI and Children: Ethics, Parents, Governance, which has already informed and been cited in UNICEF’s white paper Policy Guidance on AI for Children.
Emotional AI, a relatively new form of artificial intelligence, is expected to become increasingly widespread in children’s toys in the next few years. Through measuring biometrics of a child (such as heart rate, facial expression, and vocal timbre) these ‘intelligent toys’ have potential to improve learning, detect and assist with developmental health problems, assist with family dynamics, help with behaviour regulation, and diversify entertainment. However, being placed at the heart of a vulnerable and delicate stage of life, it is critical that emotional AI and the legal frameworks surrounding it are legible and negotiable. They should provide children and parents with agency every step of the way, in order to prevent the exploitation of children (and their parents) via emotional data.
Using AI to assess and act upon the complex emotional and educational development of a child is no simple task, and with algorithmic complexity comes obscurity. We must ensure parents are equipped to understand the implications these systems and the data they harvest have. It is one thing to attend to a child’s emotional state in a playful or educational setting, but what happens when a child’s emotions inform content marketed to that same child? What happens when the turbulent emotional development of childhood is used to profile that person in later life, affecting chances of employment or insurance? The reductive nature of AI in a parental role is concerning too, as is the commercialisation of parenting and childhood more generally.
Alert to both the positive and negative potentials, this report sets out to explore the socio-technical terms by which emotional AI and related applications should be used in children’s toys. Based on interviews with experts from the emotional AI industry, policy, academia, education and child health, it recognises that there are serious potential harms in introducing emotion and mood detection into children’s products. In addition, the report pays particular attention to the views of parents, drawing on UK surveys and focus groups (the latter conducted with the help of Dr Kate Armstrong and the Institute of Imagination in London). Through these, McStay and Rosner outline the necessary frameworks for emotional AI in toys. Problems and potential solutions discussed include the issue of current data protection and privacy law being very adult-focused, and not comprehensive enough to address child-focused emotional AI. The UN’s Convention on the Rights of the Child is also recommended as a valuable guide to the governance of children’s emotional AI technologies. It is also suggested that policymakers should consider a ban on using children’s emotion data to market to them or their parents.
In short, emotional AI design must be informed by good governance that reviews and adapts existing laws, and must be built around fairness to children, support for the nuanced roles of parenting, and care when involving AI in our early inner lives.
Read the full report and findings here.