
Mental health tech cannot be like regular tech – Part 4

(Part 3 – My personal experience with seeking help with my mental health)

I’d called it simply Someone To Talk To.

Someone To Talk To – a chatbot

Background:
Several – most? – people with anxiety or mild depression would benefit tremendously from simply having someone to talk to. Someone who is receptive and non-judgemental, and who doesn’t have any other relationship with the person. It’d be ideal if people had someone like this to talk to in person, but that is impractical: not just because it involves travel for one or both people, but also because it requires scheduling and therefore a set time, and because people may feel the need to talk at any point in the day. The mobile phone, a deeply personal device available 24×7, is ideal.

Now, this need goes far beyond mood-logging apps that ask for a rating or an emoticon to describe one’s mood. Not only do people’s emotions vary significantly during the day (especially for those who are anxious or mildly depressed), and not only are they more likely than not to rate their overall day as negative – thereby feeling further down at their own constant negative rating of their days – but, more importantly, their needs go beyond such a one-point (or even multi-point) rating. They need an outlet for thoughts and anxieties and fears, to put into words.

Another approach to this is journaling apps, including visually beautiful ones like Day One. Journaling apps come in many forms, including 5 Minute Journals and Gratitude Journals. But the prospect of filling up a blank screenful, especially day after day, is too often overwhelming for someone who has low emotional bandwidth in the first place.

Here are the characteristics of what I think will make a good Someone To Talk To:

It will be conversational. It’ll feel like a chat (and in fact will be one, with pictures and links sent both ways just like in a regular chat) instead of a set of screens and buttons. The bot will be able to organize what one says, with context, into a journal of sorts for the person’s reference any time later. The person will be able to share with the bot not only text but also pictures and other media. The bot should be able to respond with at least some context – gentle encouragement or reassurance – but must take great care not to overdo it or sound artificial. Not responding is better than responding like, well, a bot. The bot will also learn about the person over time: what the person seems to like and not like, whether that changes over time, and the specific people and places the person refers to and their relationship to the person. The person may give them appellations that are not their real names or descriptions, and that is by choice; the bot will know this.
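As a rough illustration of that last idea – filing chat messages into a browsable journal, with mentioned people and places stored only under the appellations the user chose – here is a minimal sketch. All names (`Journal`, `JournalEntry`, `Entity`) are hypothetical; nothing here reflects an actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, date

@dataclass
class Entity:
    """A person or place the user mentions, kept under the
    appellation the user chose -- never a guessed real name."""
    appellation: str                                   # e.g. "M", "the office"
    notes: list[str] = field(default_factory=list)

@dataclass
class JournalEntry:
    """One chat message, filed with its context."""
    timestamp: datetime
    text: str
    media: list[str] = field(default_factory=list)     # shared pictures, links
    mentions: list[str] = field(default_factory=list)  # appellations referenced

class Journal:
    """Organises what the person says into a journal of sorts."""
    def __init__(self) -> None:
        self.entries: list[JournalEntry] = []
        self.entities: dict[str, Entity] = {}

    def add(self, entry: JournalEntry) -> None:
        self.entries.append(entry)
        for name in entry.mentions:
            # First mention creates the entity; later notes accumulate on it.
            self.entities.setdefault(name, Entity(appellation=name))

    def on_day(self, day: date) -> list[JournalEntry]:
        """Everything the person shared on a given day, for later reference."""
        return [e for e in self.entries if e.timestamp.date() == day]
```

The design point is that the journal is a by-product of the conversation, not a blank screen the person must fill.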

It will be empathetic and sensitive. It will be designed knowing that a regimented approach of asking for a mood update, or a diary entry at the same time every day, as many apps seem to do, is counter-productive and causes more stress than it relieves. It will be designed knowing that on occasion a notification or picture or video or piece of music can trigger anxiety, sadness or distress in people, and that they may not even recognise it as such, much less know why. Finally, it will know when the person is looking for a response from it, and when it simply needs to ‘listen’, providing occasional acknowledgement of its ‘presence’.

It will be gently intelligent. While it will often initiate conversation, it won’t ask with every interaction whether it felt right, or whether the person liked it or not. Being the one expected to be ‘in charge’ of the bot–human relationship can feel challenging. If the bot is designed to be a stand-in for a human companion, it must do better than asking for feedback often (wouldn’t it be stressful to have a friend do so?). Just like making decisions, being made to pass judgement also brings pressure. It will never expect anything from the person it engages with, such as a response within a certain amount of time, or at all. It will also be able to gauge improvement or decline in mood over time and adjust accordingly.
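Gauging improvement or decline without ever asking for a rating could be as simple as comparing windows of sentiment scores inferred from the person’s own messages. A minimal sketch, assuming scores in [-1, 1] from some sentiment model (the function name and thresholds are illustrative, not from the original design):

```python
from statistics import mean

def mood_trend(scores: list[float], window: int = 7) -> str:
    """Compare the average of the most recent `window` sentiment scores
    against the preceding window. Scores are inferred from message text,
    never asked for directly. Returns 'improving', 'declining' or 'steady'."""
    if len(scores) < 2 * window:
        return "steady"            # not enough history to judge either way
    recent = mean(scores[-window:])
    earlier = mean(scores[-2 * window:-window])
    delta = recent - earlier
    if delta > 0.15:               # illustrative threshold, would need tuning
        return "improving"
    if delta < -0.15:
        return "declining"
    return "steady"
```

The bot would then adjust its tone or frequency of check-ins based on the trend, rather than confronting the person with a chart of their own bad days.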

In addition to being empathetic, sensitive and intelligent, it will also be realistic. It will never make empty promises about things always getting better.

As things turned out, I returned to my day job and never actually worked on this beyond a few simple test versions with a couple of other collaborators. But this, still, is exactly how I’d go about bridging the gap between dealing with mild mental health issues on one’s own, and full-fledged therapy.

(ends)