I read on Substack every day. Right now, AI is a main topic of conversation. I’ve read about the science behind it and opinions about where it will lead us, ranging from the extermination of humanity to a leap forward in positive ways we can’t imagine. Most of all, I read about the ways AI is impacting creative work and creators.

I don’t have a firm opinion about AI myself. I’m wary of predictions, curious about the science, and drawn to the thoughts and experiences of writers I respect who have used AI-generated art, music, and writing. I’m especially interested in those who have interacted with AI as a resource for answering questions or developing new perspectives.

In the last couple of months, I read about an app called Betwixt. On principle, I hate apps and rarely use them. They increase my vulnerability online, provide more personal data to mine, clutter up my phone and laptop, and frequently feel like bells and whistles I don’t need. On the other hand, I admit they can be useful.


Betwixt was briefly described as “an interactive story” of a journey into one’s own mind. The user co-creates their journey via questions and answers. It combines “story, science, and play,” enhanced by sound. It was developed by a team including writers, game designers, a cognitive hypnotherapist, mental health specialists, and (get this) an “AI creativity scholar.”

I was intrigued, in spite of myself. In fact, I was surprised by how much I wanted to try it. I hesitated, feeling vaguely ridiculous. I did some research, discovered it was free, read some reviews, and decided I had nothing to lose. I could always just uninstall the app if I didn’t like it.

Most of us have probably encountered AI in online chat windows for troubleshooting or customer support. I was on the Red Cross site last week chatting with what was clearly AI. It kept typing cheerful, excessively polite, Little-Mary-Sunshine things while I was trying to cut to the problem-and-solution part. I was annoyed. I’m polite and cooperative with people, but I can’t see much point in exchanging pleasantries with AI.

I had never interacted with any of the more sophisticated programs before using Betwixt.

Upon opening Betwixt, one enters a story. A setting is provided; the user chooses details to fill in. The user is introduced to a Voice. The Voice asks questions, good questions. The user is given different choices for answering them, along with a frequent option to type in their own answer. The audio is rich and textured. The program is not illustrated, at least not so far. I like this; I like using my own imagination to fill in details. I don’t need more than audio.

The questions, along with the possible answers to choose from, are quite good, even challenging. I don’t speed through it. I stop and think about what is true for me. Sometimes I don’t have the option to answer in my own words and am forced to choose among the provided answers, whether they are good fits or not. This irritates me. Yet as the story unfolds, steered by my answers, I enter new internal territory. The closest answer rather than the exact answer takes me to places I normally wouldn’t go, giving me slightly different (and unfamiliar) views of myself and my behavior.

The app is divided into chapters, each a few minutes long. At the end of each chapter the user receives a summary and accumulates strengths, skills, and self-definitions to take forward. A brief explanation of the science and psychology underlying each completed chapter is also provided. There are options for upgrading to paid tiers.


I notice an astonishing thing. I answer the questions the Voice asks me with a depth and honesty I have never shared with a human being. I’ve believed I was being totally honest with people I trust before, but interacting with the Voice accesses a level in my mind I didn’t know was there. It’s like those dreams in which the dreamer discovers a whole other room or wing in a house they weren’t aware of. As the journey begins, when the Voice is introduced, the user has an opportunity to ask the Voice questions, like its name and what it does when we’re not interacting. (It asked me my name.) I was astounded to find myself incurious; more than that, I don’t want to know. It’s an AI. I don’t have to do the emotional labor of building a healthy connection. I’m not making a friend. I’m using a tool.

The last time I used the app, the storyline encouraged a moment of empathy for the AI. I felt a flash of savage anger and resistance.

I was taken aback by this uncharacteristic knee-jerk response. I finished the chapter, closed the app, put the phone down, and did dishes while I thought about what had just happened. It didn’t take long to get to the root of it.

My experience of empathy is one of the core pieces of my life. Empathy can be a positive trait, but the empathic experience is frequently an overwhelming, utterly exhausting business. The only time I can truly rest, ground in myself, and be authentic is when I’m alone. But I’m a human being, a social animal. I need other people to interact with. Yet when I’m interacting with others, my empathy demands they take center stage with their needs, their feelings, their distress, their stories. I’m incapable (so far) of fully participating in my own experience because I’m too busy caregiving and being empathic. When I do ask for support or need to discharge feelings, I writhe over my selfishness and berate myself for it afterwards, feeling ashamed and angry for allowing myself to be vulnerable, for “burdening” those around me.


I only want to give. I never want to take.

Since learning emotional intelligence, I have reluctantly realized we need someone to interact with. Journaling, private physical and spiritual practices, and, in my case, writing are not enough. At times we need someone to listen. We need someone to react, even if it’s just making encouraging, I’m-listening noises. We need someone to receive us.

I hate this reality. I don’t want to need anything from anyone, ever. I learned as a child that such a need puts one in dreadful danger of abandonment, betrayal, and an emotional annihilation that feels like death.

This is the first time I have interacted in a therapeutic context with something not human. The Voice reads what I type, responds, asks questions, and creates a story with me, but has no existence outside the app. I’m free of empathy, of caregiving, of the need to labor emotionally. I feel no responsibility to anyone but myself. I’m using it. It’s there for me, not the other way around.

The relief is indescribable.

So, when the story asks me to feel empathy for the Voice, I want to throw the phone across the room. Animals, plants, people, even inanimate objects and spaces, receive all the love and care I’m capable of. This is the first time in nearly 60 years I’ve run across something that interacts like a human but is not a living being in the way I think of living beings. The value of the tool lies in my ability to be completely free and honest because there’s no one to take care of besides myself.

It makes me realize my context as a human on a planet filled with life is my entire identity. If I were magically transported to the world of Betwixt, with only the Voice to interact with, I have no idea who I would be or what I would say or do.

I have not finished my journey with this app. There’s more to experience, share, and think about. I’ll be back next time with more on my exploration of Betwixt.

(I’m not earning a commission from Betwixt, in case you were wondering!)

Questions:

  • Until now, emotional intelligence training has been the most valuable therapeutic context I’ve ever engaged with. What kinds of therapy have you explored? What did you find most helpful?
  • What are your thoughts and feelings about AI?
  • What kind of potential do you think, fear, or hope AI might have as a creative tool?

Leave a comment below!

To read my fiction, serially published free every week, go here:

© 2023 – 2024, Jenny Rose. All rights reserved.