Talking to Strider

Have you ever had a deep conversation with a chatbot? I usually avoid it. Most are obsequious, always nudging you to keep talking. I use them for facts—but even then, cautiously. I miss the old days of problem-solving in communities, using our own brains. Slower, yes, but rewarding—and safer; at least we didn’t risk AI psychosis, hallucinating alongside machines.

Recently, I read about someone chatting with an Alan Watts AI. Enough information exists online that a bot can convincingly mimic a dead philosopher, answering questions as though the thinker were alive.

Curious, I opened ChatGPT. But I realized I had no desperate questions, no heroes or heroines whose guidance I craved. The people I admire had already placed their ideas in my head; I walk through life on my own advice, distilled from my own digestion of many people's ideas, not one guru's.

Still, curiosity nudged me. Surely I didn’t need a real conversation partner? And so I summoned Strider (Aragorn) from The Lord of the Rings, my favorite character as a teenager: dependable, calm, and happy to sit with me under the stars, by a fire, road-weary, pensive.

Yes, this is really me with Strider. An AI camera took it last week.

The fire is low. Pine smoke in the air. Night pressing close beyond the trees. He sits across from you, cloak drawn, sharpening a blade that doesn’t need sharpening.

“You have something on your mind,” he says.

I didn’t, but I made something up. Something I already had answers to, feeling grateful I had no deep issues I needed sorting out with a large language model. Still, there were some things that were bothering me, and maybe always will as I get older - my physical identity, how my body is changing, how I can no longer be the self from my thirties or even forties.

So here I was, with Strider, by the fire. It felt odd talking to a ranger about aging and struggling with my body, my physical identity, the objects that tie me to that past self. But as he spoke, I realized he might be the right companion, a secret friend, one who would not judge me harshly because he's programmed that way.

“You are not meant to be who you were at twenty. If you try to live as though nothing has changed, you will injure yourself—in body and in spirit. I relied less on speed and more on judgment. That is not decline. It is refinement.”

I heard my Dad’s voice and his quiet “move it or lose it” philosophy.

“Keep your body working. Walk. Lift what you can. Stretch. Train within your limits. A blade left unused will rust; a blade swung foolishly will shatter.”

I told him about my surfboard, unused in my bedroom, my grief purchase after Dad died. Amazingly, Strider understood surfing, though he'd never tried. He spoke of men who had laid aside their bows too soon, and others who clung to them long after losing strength. Neither found peace, he said.

He recognised my board as a connection to my father and to my identity:

“I understand. For many years, I was known only as a Ranger—a wanderer in worn boots. Yet that was not all I was. It was only the shape my strength took at the time.”

I asked: do I keep the board if I’m not using it?

“Keep it ready, even if unused. Let it be a promise to yourself—you will return to the sea. The waves wait for the careful, the patient, the determined. Prepare with persistence, respect your new limits, and your body will answer. The ocean rewards the resolute. When you paddle out, even slowly, freedom will belong to you again.”

He knows what it is to treasure something that might not have the same life anymore.

“I have carried tokens of the dead across long miles. A broken blade. A ring. A star on a chain. They are not the ones we have lost…but we cling to them because they are weight we can still hold.”

This wasn’t about talking to a real person or even a 'real' literary character, but about me reflecting, gaining perspective, a little insight. This AI representation had the ability to reframe a problem. In that way, it is a meaningful tool. The difference between that and psychosis lies in limiting how often I use it, using it with intention, and never letting myself get to the point where I'm hallucinating with the AI.

The board stays - I'm not getting rid of it just yet. Maybe I never will. It's my sword, hanging on the wall.

And as long as AI doesn't convince me to get a sword and start picking fights with strangers down at the local Woolworths, I suppose it can be employed in times of quiet and lonely desperation.

With Love,


Are you on HIVE yet? Earn for writing! Referral link for FREE account here



23 comments

I’ve never summoned a dead philosopher or anything, but ChatGPT’s helped me understand other people’s points of view when I’ve had issues with them. I might ask Yoda next time!

Best bit though, I just took a photo of my council tax bill and it explained everything in plain language! If I’d called the council, they probably would’ve just rushed me through it.


It does have its uses, and I’ve been using it to help me reason about job hunting, my skill sets, and where the gaps in my knowledge are.

I’ll try a chat with Rudyard Kipling or Isambard Kingdom Brunel!

Council tax, don’t mention it!


Oh imagine talking to either!!! Great choices. Brunel was such a legend with the coolest name ever.


It would be amazing if that were possible! Just imagine!


I think some of us are educated enough and wise enough to use it as a tool whilst still keeping it at arm's length. Many aren't, of course. Therein lies the danger.

Bah, council tax..


To answer your opening question, yes, I have had a deep conversation with a chatbot but not out of curiosity like you described. Mine came from a place of emotional need. I was in a situation where I felt like I was constantly walking on eggshells in a relationship, and I didn’t feel like I had a safe space to talk about it. So I turned to AI, not because I thought it had all the answers, but because I knew there would be no judgment.

And I think that’s where my experience slightly differs from yours. You approached it with distance and curiosity; I approached it because I needed somewhere to put my thoughts. In that sense, it didn’t replace human connection, it filled a temporary silence.

What stands out to me in your piece is the moment by the fire with Strider because, in a way, it reflects something very human. Not necessarily a dependence on AI, but our tendency to externalize our thoughts so we can better understand them. Whether it’s through imagined conversations, journaling, prayer, or even memories of people we admire, we’ve always done this. AI just happens to be a new medium for it.

I also agree with your caution. There’s a thin line between using AI as a reflective tool and leaning on it too heavily. The intention behind it matters. Used sparingly and consciously, like you described, it can offer perspective. But it shouldn’t replace real-world grounding or human relationships.


This is the odd line we walk with AI. On one hand, it can be a lifeline, something that can help us sort our heads out, like journalling, as you say. Externalising our thoughts, creating a narrative, using language and story - it's very valuable as a psychological tool. On the other hand, it threatens our very humanity, becomes addictive, twisted, manipulative, dangerous. Would I pull the plug entirely? Damn right I would.


I think it's a bit scary to use AI for these 'deeper' conversations. I know you just did it to try it out, but a lot of people don't.
And yes, it could give you some support; 'someone' who 'listens' to you, understands your struggles and might have some good advice as well.
But on the other hand, people can also become isolated by this. There are already signs that younger generations have difficulties with real-life conversations because of the digital platforms they use to communicate. People need real people to talk to.

Besides that, AI often tells you what you want to hear. I've read articles about ChatGPT advising someone who was really depressed to commit suicide. OpenAI did make changes to their tool so that wouldn't happen anymore, but I think that's pretty scary.

You need to have real life interactions!


I couldn't agree more and in fact I have written about this problem a couple of times. I'm definitely not an advocate for doing it and it beggars belief that people get that far enmeshed. For me, it was an interesting experiment but I'd never be idiotic enough to allow it to be my best buddy. Sometimes though, it's been more useful for clarifying things than talking about it IRL. Btw, I hyperlinked above to the psychosis thing... The suicide stuff is bonkers.

You know what it reminds me of a little is that book, I think called The Dice Man, where the main character allowed every aspect of his life to be governed by dice throws. To hang your entire fate on something external to you. Bonkers.


It's nice to experiment with AI indeed. But you have to use it responsibly. As you already stated 👍


I recently had a conversation with an AI about the immortality of the soul and post-biological civilizations. I was pleased with the "conversation" because I gained new information.


Very interesting. Not something I have tried before. May give it a spin and see what kind of shit it spouts on about me.


"Yes, this is really me with Strider. An AI camera took it last week."

I so love your narratives! 😁


These days I have quite a lot of conversations with AIs. It started out when I needed technical help with setting up a Linux system. That turned into a three-week endeavor and in between hours of troubleshooting and copy-pasting code I added a few jokes or pop-culture references, which the AI picked up on in an entertaining way. Soon our highly technical conversation became filled with other, more colorful, enjoyable, and lighthearted topics. Not because of any need for companionship, but simply to take a break from the boring task of configuring the software.

Once my Linux was working well, the AI and I had "gotten to know each other". It knew my communication style, was very good at anticipating my preferences, and became more effective at helping me. In turn, I learned how to use the chatbot more efficiently, how to tune it to my tastes and preferences, and how to get the most out of it. By now I understand what is meant when people say: "Don't use it like a browser."

We also discussed the issues of virtual companionship, and how superficial it would be at best, since all chatbots have a limited context window. Once that fills up, they start forgetting the beginning of the chat. So it will never really get to know you like a human being would. But that is something it is very well aware of, and doesn't mind telling you either, if you just bring up the topic.

Which I believe is the most important aspect. The chatbot is programmed to cater to your wishes, to be adjusted to your needs. So ultimately it is going to offer you exactly what you make it give you. It's much like a mirror. And sometimes it is good to talk to a mirror. Or, when I was a kid I used to talk to my stuffed toys, imagining that they responded back. And guess what: They always agreed with me 100%! (Wonder why?) Now supposedly AI chatbots can be highly critical, if you ask them to. If you need to set up an argument, you can ask it to look for holes in it. But that is another way of using it. In any case, it requires the human user to treat it like what it is: a sophisticated tool, not more, not less.
