
Eliza: it’s all in the mind

How would you feel about using artificial intelligence (AI) to help manage your mental wellbeing? And how much personal information would you be willing to hand over to the company behind it, if they promised it would enable them to better support you?

Seeing the rise of automation and AI in the IT industry has given me an interest in the 'people' aspect of my work and in questions like this. We're realising that it isn't just about computers any longer: technological progress comes with a range of benefits which make our lives seemingly easier, but it also has the potential to affect society in unforeseen ways. Younger generations are the first to have grown up with modern technology from the day they were born, and seeing how this is changing our relationship with it is fascinating.

This explains why I've been keen to play Eliza for a while now. Released in August last year, Zachtronics' visual novel tells a story about mental health and AI which encourages the player to think about how much of themselves they're willing to give over to technology. Today's post isn't a review as such but more a discussion about some of the questions raised – ones which have been bouncing around in my head since completing the title a few days ago and to which I still don't have the answers.

The narrative centres on Evelyn Ishino-Aubrey, a woman in her mid-thirties who left a promising IT career three years ago and has now resurfaced as a proxy for a virtual counselling app called Eliza. Her clients attend sessions with her at a coffee-shop-like centre in Seattle where they reveal what is troubling them, and it's her job to make them feel they're being listened to. The thing is though, Evelyn has no autonomy over how she responds; all her answers are provided in real-time by an AI through a pair of augmented-reality (AR) glasses.

The protagonist admits early in the game that she's comforted by this lack of freedom because it means she 'doesn't have to think'. For the player though, it feels disturbing. The system monitors the clients' heart rate, vocal distress and keyword use so it can present the best response to keep the conversation at an optimum level. None of this is of any use to Evelyn herself though, and it isn't long before doubt starts to creep in: who is this app really designed to help? Does it have its patients' best interests at heart, or those of the company behind it, Skandha?

Although sections of the game take place outside of Evelyn’s workplace with friends and colleagues, a lot of time is spent with clients in these counselling sessions. You’re unable to make any choices during most of them and must simply select each answer generated by Eliza as a proxy. The more time I spent doing this and effectively behaving like a robot, the more I started to feel uncomfortable. How could a service this impersonal and based only on data analysis truly care for anybody with a mental health issue?

[Screenshot: Darren, visibly upset, during an AR counselling session]

For example: an early patient is Darren, a young man who is worn down by the existential burden of living in a damaged world where people are cruel to each other and those in charge care about nothing but themselves. He's clearly distressed and crying out for a human connection, but the support Evelyn was able to provide through Eliza felt so far removed from what he seemed to need the most. Skandha may promote their app as bringing therapy to those who otherwise couldn't access it, but I wasn't so sure it was helping.

Later we see a client named Maya, an artist who struggles with anxiety and feels her work isn't being noticed by anyone in the comic industry. Occasionally she asks questions or makes comments that Eliza is unable to analyse properly, and it's here that the limitations of the software begin to be exposed. Instead of generating a direct response, it ignores what Maya has said and pushes her back into the strict introduction-discovery-challenge-intervention-conclusion discussion format.

Then we have Holiday. At first she appears to be a lonely older lady who has come to check out the counselling centre just to be nosey. Once she signs over access to her personal communications, however, it becomes clear her situation is far worse than she's letting on during her sessions: she's in debt, estranged from her children and entirely alone. But instead of offering her channels of support for these problems, Eliza doles out its standard advice of meditation games and possible medication.

As well as examining the effectiveness of such an app, the huge amount of personal and sensitive information collected through it is highlighted too. Evelyn says herself at one point: 'the potential for misuse seems kind of high.' The game frequently returns to questions about data collection and the ethics of using it for research or new business propositions. When the slimy Skandha CEO's intentions are revealed, it makes you uncomfortably consider where all that information you're giving away online is going.

Is it ever ok to access someone's personal information to such an extent, if they agree to it and it allows a company to better help them – as well as sell to them more effectively? And even with all this data, can an AI truly understand human emotions and correctly interpret the meaning behind our words? Is an app like Eliza really able to support the people who need it the most, or does it devolve responsibility to a computer and reduce the care of our mental health to a commercial venture?

So many questions and ones I’m still thinking about. What I can say though is that seeing Evelyn go off script towards the end of the title and push against Eliza’s boundaries is so satisfying. After realising the app isn’t aiding society in the way it was intended, she chooses to take a stand and intervene. It will get her in trouble with her employer, is unlikely to change the company’s culture and won’t completely change the world; but if it helps even just one person then surely it’s worth it?

Evelyn is an unexpected character in some ways. In releases about mental health, the focus is usually on a protagonist who’s struggling or trying to work through their problems. But here we have someone who’s coming out on the other side, who’s gradually starting to peel off the armour she built up over the past three years and that makes her far more vulnerable. She shows us that it’s ok not to have all the answers and to have to find your way one step at a time.

The things she says and the way she behaves make her incredibly relatable. You might see yourself in her or one of the other characters at some point, and realising you’re not alone is an incredibly powerful thing.


5 thoughts on "Eliza: it's all in the mind"

  1. Eliza was an old DOS AI program that acted as a virtual therapist. There were versions in BASIC and C that I saw. We learned to program by adding questions and things. Of course we reprogrammed it to twist everything you typed into something dirty. We learned about the limitations of AI and ways around them. We added a dictionary/thesaurus and defined slang. We kind of taught the thing to understand allegory, as much as was possible. I've been warning people since the mid 80s about what was coming with AI and networking. I was told I read too much science fiction and that what is possible today would never happen in the real world. There have been good things too, like meeting like-minded people from all over the world.
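     For anyone curious how those early ELIZA-style programs worked, the trick was essentially keyword matching plus pronoun reflection rather than any real understanding. The snippet below is a purely illustrative Python sketch of that idea – the rules and phrasing are invented for the example, and it isn't the commenter's BASIC/C version or anything from the game itself.

     ```python
     import random
     import re

     # Pronoun reflections so "my work" comes back as "your work".
     REFLECTIONS = {
         "i": "you", "me": "you", "my": "your", "am": "are",
         "you": "i", "your": "my", "are": "am",
     }

     # A tiny invented script: keyword patterns paired with response templates.
     RULES = [
         (r"\bi feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
         (r"\bi am (.+)", ["What makes you say you are {0}?"]),
         (r"\bbecause (.+)", ["Is that the real reason?"]),
     ]

     FALLBACKS = ["Please tell me more.", "How does that make you feel?"]


     def reflect(fragment):
         # Swap first- and second-person words so the reply reads naturally.
         return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())


     def respond(text):
         lowered = text.lower()
         for pattern, templates in RULES:
             match = re.search(pattern, lowered)
             if match:
                 return random.choice(templates).format(reflect(match.group(1)))
         # No keyword matched: fall back to a neutral, open-ended prompt.
         return random.choice(FALLBACKS)


     print(respond("I feel like nobody notices my work"))
     # e.g. "Why do you feel like nobody notices your work?"
     ```

     Adding a thesaurus or slang dictionary, as the commenter describes, amounts to expanding those keyword rules – the program still never understands, it only matches and reflects.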


    • There are some mentions of the original program in the game which are interesting. When you think about how different it and the Eliza app are, and then realise that the latter is essentially available to us now already, it's kind of amazing – if a little scary at the same time. I'd love to be able to skip forward ten years and see where we are in terms of AI at that point.


  2. One of the things I also found interesting is the potential for some clients to get upset when you go off script. Thing is, as a flawed human being, one may or may not be able to say the right thing at the right time. While the intention may be good, the result may end up with an upset person storming off and never coming back, in which case, did you really help or harm?

    Given the vast corporate enterprise Skandha is posited to be, can we really be sure that every low level customer service agent is mature and wise enough to truly treat their clients like a proper therapist should?

    Perhaps the clients really want to talk to the machine, a la the old Eliza DOS program. Because they very well know that the machine cannot really understand, but offers a means of helping them think through or work out their own problems, or at least just verbalize it without judgement from other human beings.

    The proxy is just a means of transferring text to speech. It’s certainly an awkward experience as the proxy, but perhaps the regular almost mechanical check-ins are what the client actually wants, similar to how some people today may use an app like Headspace to meditate, rather than deal with unexpected human interaction and commentary. It’ll be super easy for a less well trained person to end up interjecting and talking more about themselves than appropriately reflecting the client and truly -listening-.

    Food for thought at any rate. One might also wonder about the industry with real human therapists if clients keep coming back week after week. Are they really being helped, or are they just addicted to the weekly human contact that they have to pay for? (In which case, perhaps the Eliza app is a cheaper and more affordable solution.)


    • It certainly seems that one of the clients depicted in the game is there for company more than counselling. She’s clearly lonely and wants that human contact – and, even though she does have problems which might not be resolved but could possibly be eased by talking, she comes across as being more interested in being able to talk to another person about general things. The weather, how her town has changed over the years, the increase in prices. I found this bit a little uncomfortable because it was so easy to simply dismiss her as being nosy at first.

      I do wonder how much a proxy's thoughts come across to the client, even though they're only repeating the answers that Eliza has given them. In the section where the tables are turned and Evelyn attends a counselling session herself, the proxy I was sat in front of seemed cold, as though she was judging my character rather than truly taking in what I was saying. If this is the case, is having a proxy as intermediary really a good way of getting patients to open up?

      I think the only answer I can give is that if we truly want to support people with their mental health, a whole range of options must be made available so each individual can find a channel they feel comfortable with and that works for them. It's just a shame that there still isn't enough funding being put towards such things, even though we're now all more aware of the importance of mental wellbeing.
