Digital Mental Health: the past, present and future

By Dr Faith Matcham

At the time of writing, there were 52,564 healthcare and medical apps available to download from the Google Play store. According to a Deloitte report[1], global spending on mental health apps is likely to reach $500 million in 2022. Meanwhile, the smart-glasses, smart-textile and wearable electronics sector is expected to grow from $2.3 billion in 2021 to $6.6 billion by 2026[2]. This digital transformation has only been accelerated by the COVID-19 pandemic, with lockdowns pushing everything online: our friendships, education, employment and healthcare. With an increasing emphasis on public-private partnerships and expansive funding calls for more collaborative, digital research to meet the growing demand for innovation, we may all need to start thinking about how we can integrate technology into our research and clinical practice.

[Image: a phone with colourful apps coming out of the screen]

My introduction to the world of digital mental health came during my research assistant post at King’s College London in 2011. I worked on a healthcare service delivery project called Integrating Mental and Physical Healthcare Services: Research, Training and Services (IMPARTS), supporting general hospital services to implement routine web-based screening for common mental health problems. Patients would complete a series of tailored questionnaires on an iPad while they waited for their appointment; results would be sent immediately to their Electronic Patient Record for their clinician to review during the appointment, with in-built flags for risk assessments, or where onward referrals to mental health services might be needed. At the time, it was technically complex, and a lot of my time was spent liaising with IT services to find out why the 20 questions a patient had just completed had not turned up in the system in the 60 minutes they spent waiting to see their doctor.

Fast-forward ten years and I have just finished a six-year postdoctoral position working on the Remote Assessment of Disease and Relapse – Central Nervous System (RADAR-CNS) project. This involved collecting data from wearable devices, smartphone sensors and app-delivered questionnaires from people with three long-term conditions over the course of three years. Depending on the source of data, we could have new information to process and store about every individual every 10 seconds. These data could then be collated and turned into a daily summary of each participant’s behaviour, physiology, mood or cognitive function.

The incredible leap in technological capabilities between my research assistant days and my post-doc position still amazes me. We went from having to stand in certain areas of the clinic to make sure we had a proper wi-fi connection, to being able to collect immense amounts of data with very little burden on the individuals themselves, or on the researchers running the project. The use of digital technologies to collect data and provide personalised interventions is the future of healthcare, and it’s exciting. It also heralds the potential to adapt and change our healthcare provision; these data provide rich insight into individuals’ daily lives and may allow us to develop novel interventions which target previously unmeasurable characteristics.

However, for this technology to be meaningfully integrated into clinical care, there are questions which remain unanswered. First and foremost: can it provide something of intrinsic value to the patient? Can we improve self-management and a sense of empowerment over an otherwise unpredictable illness? Might having access to one’s own health data inadvertently increase health anxiety, increase inappropriate help-seeking behaviour, or even trigger a deterioration in symptoms or relapse? How can we integrate high-volume data usefully into our existing healthcare infrastructures without over-burdening already over-worked healthcare professionals? How do we make sense of what the data mean, and what actions should be taken in response to them? If risk is detected via any system, such as an adverse event related to treatment or a report of intent to self-harm, whose responsibility is it to intervene? Who “owns” the data, and how much data is too much data?

I was thrilled to start a new role as a Lecturer in Psychology in the School of Psychology in April 2022. In addition to my teaching responsibilities, I am developing a portfolio of research activities which are starting to address these questions. These include a series of EPSRC grants collaborating with multidisciplinary teams across the UK to test different types of remote measurement technologies in different clinical and societal contexts. For example, one of the teams I am working with is developing a smart monitoring textile for measuring loneliness and isolation in an ageing population[3]. Another group is developing an online peer-support intervention for family caregivers, using experience sampling and wearable devices to monitor real-time changes in mood and physiology as carers interact with different elements of the virtual platform. My hope is to build on these existing collaborations and develop new ones with colleagues across the University, to develop a wider understanding of how these technologies can challenge and revolutionise the way we measure and manage health and wellbeing. The future of healthcare is digital, and it’s our responsibility to make sure it’s done thoughtfully, conscientiously and ethically.




Faith Matcham is a Psychology lecturer at Sussex and a Health Psychologist specialising in mental/physical comorbidity and the use of digital technologies to measure and manage chronic health conditions.
