Should technology be at the centre of wellbeing?

Over recent years, we have seen rocketing numbers of students struggling with mental health issues, with many contributing factors, not least the pandemic. But, as technology continues to weave itself through our day-to-day existence, how far could – or should – it be involved in mental health and wellbeing?


Technology and wellbeing

Schools across the globe are increasingly leaning into EdTech, reflecting the modern employment market and society in general. However, wider technology plays a complex role in wellbeing. There’s no doubt its use (e.g. social media, excessive online gaming and so on) has actively contributed to poor mental health for some students, with out-of-control screen time meaning they can’t switch off. Yet websites can signpost people to online resources, and specialist tech coaching apps are available that can lead them back to better wellbeing.

Finding a balance between online and offline

Ensuring young people understand that technology is a tool and not the solution to their issues is critical. In a vulnerable mindset, it’s not easy to see that balance, especially if tech is giving you the support you think you need at a particular time.

In the absence of a qualified practitioner’s advice or a defined ‘end of appointment’ where people have the space to reflect on what has been discussed, tech use for wellbeing could soon spiral – disrupting sleep and further exacerbating any anxiety or stress a student may be experiencing. Such use is tricky to self-regulate and could lead to dependency rather than the ability to take time away to process and work things through independently.

Part of feeling ‘seen’ and connected is finding people in similar situations, such as in chat groups and forums. In moderation, these could be helpful places for young people to gain perspective. Still, there is a risk that excessive use could reduce face-to-face human connection, which is critical for full and effective support, socialisation and interaction.

The promise of AI

As we increasingly talk to our smart technology, AI may appear the perfect tool to help with wellbeing, where talking through emotions and feelings is critical. And chatbots can do just that: chat! Specialising in delivering responses in natural language, apps like Google’s Gemini are now on people’s smartphones and accessible to all. They can be a listening ear when there is no other, providing support 24/7 or when human-led services are out of reach. They can help people share their worries, direct them to the resources they need and offer anonymous help if there is fear or stigma around their problems. Students may feel relieved not to have to confide in someone they know about the issues they are experiencing, especially if discussing them is taboo. It sounds like the perfect solution.

The problem is that general chatbot AI is simply not ready for this role yet.

With AI, there’s no patient confidentiality or privacy. People’s problems are merely ‘data’ to the tech systems involved and, more likely than not, are being used as training material for AI models. Are we comfortable with our most private thoughts and feelings being used that way?

Then there are the known issues of bias and ethics – and there is still some way to go to develop AI so that it delivers what we want to see in these areas. (After all, we haven’t set it a great example to follow.) An additional issue in this category is AI hallucinations. Can we be sure that the advice any AI system may give students in this delicate area is correct and credible? Is there a risk it could lead them into harm or make things worse, as reportedly happened in one case in the USA?

Knowing the value of real and virtual

So, should tech be at the centre of wellbeing? It’s accessible, convenient and can fill the gap created by the unprecedented demand for therapist appointments. Used in moderation, it could prove a useful tool. For now, though, it’s a matter of making an informed judgment about the kind of tech we turn to for mental health and wellbeing support. Apps for mindfulness and meditation can indeed be helpful, but turning to AI for full-blown support is not advisable at this early stage in its development, with the regulations around privacy and data still being formulated.

The best option for authentic emotional support will always be a human therapist, who brings years of specialist training to a session and can respond to the nuances of issues in ways that AI cannot. Many cases of mental health distress arise from a feeling that people ‘don’t belong’; prioritising in-person interactions and face-to-face conversations can therefore help us learn how to interact and find our place in the real world. We will always need to know how to relate to each other to work together, live as family units and participate in our communities – these are simply not skills we can find in technology.

However, in the future, with all its issues ironed out, we can potentially look forward to AI being a complementary tool to help with wellbeing, working alongside human counsellors and providing immediate support when needed. It will certainly have a place. Right now, however, discussing these issues with students as part of their digital citizenship education seems wise, so that they understand that this is one case where technology may not currently be the best option.

Written by Al Kingsley, CEO of NetSupport

He is the CEO of the EdTech company NetSupport, with more than 30 years of experience in educational technology and digital safeguarding. Passionate about education – particularly educational technology, governance, and improving organisational performance – he has taken on numerous roles beyond his responsibilities at NetSupport. These include serving as Chair of a Multi-Academy Trust and an AP Academy, sitting on the Regional Schools Director’s Advisory Board for the East of England, and acting as Chair of the county SEND Board. With over 20 years of governance experience, he is also a FED Council member, Chair of the BESA EdTech Group, and Chair of his regional Business Board.