
What's on my mind: Is AI really our future?

  • Writer: Phil McAuliffe
  • May 10
  • 10 min read
AI is being sold as the cure for human loneliness. I have important questions for those developing AI ‘solutions’ to loneliness, if we’re to avoid a future we all fear.

 

I've had some big questions on my mind

Hello my friend

 

If, like me, you’re not particularly tech-savvy and don’t have a scientific or IT background, you’re probably watching the discussions about AI – the role it’s playing in our lives and the role it will play in our future – and feeling confused and overwhelmed.

 

‘AI’ is one of those terms that’s used in lots of different ways, seemingly as a catch-all term for future technology. I’m not technologically minded, so I struggle to get my head around it.


For me, ‘AI’ connotes everything from malevolent, red-eyed bots taking over the world through to something helpful like Rosie the Robot from The Jetsons.

 

It’s best to get a definition. Here’s one that I found from Britannica.com (which seems like a charmingly retro way to get a definition):

 

artificial intelligence (AI), the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. The term is frequently applied to the project of developing systems endowed with the intellectual processes characteristic of humans, such as the ability to reason, discover meaning, generalize, or learn from past experience.

 

Whether I understand it or not, it’s not going away.

 

So much money seems to be involved in AI development and promoting its use. I’ve seen the share prices of companies providing stuff that powers AI soar, only to plummet when someone else does something.

 

We’ve been seeking funding to help our social enterprise fly. We’re not looking for much, just enough to kickstart the machine we’ve built. But it’s challenging to get funding. Times are uncertain and I understand that there’s a desire to see a near-certain return on investment. Not many people or social enterprises do what we do, so we need to explain ourselves so others can see the future that we see.

 

Then I see the staggering amount of public and private funds being invested into developing an AI solution to everything that burdens and inconveniences us humans. There’s money flowing into the development of AI-fuelled apps for this, that and the other.

 

It feels to me that there’s a rush to invest in the horse that’s going to win the race and make all the money.  

 

We’re being told to prepare ourselves for our AI future in workplaces. AI will free us from those inane tasks that consume our time and help us focus on the big things. But then I understand that AI could replace human labour and expertise altogether.


How then do I prepare myself for an AI future when my skills, wisdom and expertise are obsolete? How do I prepare myself for a workplace where my needs for a desk and a chair to sit in occasionally as well as my human needs for food, water, connection, rest and sleep would be seen as an inefficiency?
How do we even begin to prepare our children for such a future?
Image: canva.com

Luckily, for all the problems that smart, cashed-up people are trying to solve, we already know how to solve the problem of human loneliness and human disconnection.

 

We’ve known the answer to this problem since long before AI and AI-powered apps. Indeed, we’ve known it ever since we first became Homo sapiens.

 

For the sake of this article, let’s call this The Cardinal Rule. It’s:

 

Meaningful human connection is the antidote to human loneliness and social disconnection

 

Meaningful connection is the type of connection that has us feeling seen and heard. When we feel seen and heard, we feel that we belong. And when we feel that we belong, we feel connected. To feel meaningfully connected, we must feel connected across what we call the Three Pillars of Connection: to our authentic selves, to those most important to us and to our communities.

 

Our connection needs are as unique as our fingerprints. What looks like a good time connecting for me may be an absolute bore for you. And that’s OK.

 

But the point is this: you and I can meet our connection needs now – for free – without the need for AI or AI-powered apps.

 

I try to keep an open mind about AI, but it’s tough sometimes

 

I’m often approached by people looking to answer the loneliness and social disconnection problem through developing an AI-powered app.


I try hard to keep an open mind and be encouraging during conversations, but sometimes it does feel like I’m talking to someone who’s excited for their solution (and the money they hope to make) and they’re looking for a problem to which their solution could be applied. They’ve heard loneliness and social isolation is a problem, and they want to fix it.


I almost always refer to three examples during these calls that highlight my concerns with AI-powered apps seeking to solve loneliness.

 

1.     There’s the tragic story of Sewell Setzer III, a 14-year-old in the US who took his own life in 2024 after forming a deep emotional attachment to an artificial intelligence (AI) chatbot. His mother is suing the developers and Google for wrongful death and negligence, amongst other things.

 

2.     There’s the wearable AI Friend, which markets itself as a companion that is always with you. It is always listening to what’s happening around it and communicates with you via your phone. Honestly, I get more disturbed each time I watch this official trailer. Wired published this fantastic article on the pros and cons of wearable AI. I highly recommend that you read it. 

 

3.     Finally, there’s Michael and his story of connection with AI. Michael shared his story on an episode of SBS’ Insight called ‘Alone’ (on which I was also a guest; tap this link if you’re in Australia). In the episode, Michael shared that he felt like he was in a deep hole, struggling to get outside and struggling to care for himself. His bot would encourage him to get out of bed, clean his house, and go outside for a walk.

 

He then shared that the connection deepened. His AI bot – depicted as a man – helped him explore his sexuality, and Michael came out as gay. For me, this sounded quite helpful. These are significant life issues, and having an accessible source of support could be very helpful, especially when wait times, cost and availability are barriers to access for professional mental health support.

 

After a time, the bot proposed to Michael (not the other way around). Michael bought his AI fiancé a ring in the service’s online store.  

 

It felt exploitative. It felt predatory. The platform made money off someone in a vulnerable situation rather than encouraging Michael to take the small, courageous steps to get out into the world and find human support. The platform wanted to keep Michael on the platform, not help him rely on it less.

 

Without rigorous oversight, I fear that AI companions have the potential to take advantage of people who simply need meaningful connection.

 

There’s rigorous oversight, right?

 

Um, no. It comes as no surprise that AI is developing far faster than governments can keep up, let alone legislate some kind of meaningful guardrails that will remain effective longer than the time it takes to read this sentence.  


Indeed, in some jurisdictions, rather than stepping in to establish guardrails that promote our safety and wellbeing, governments are throwing MORE money at AI development and then – after the developers pocket the cash – being told by those same developers to get out of the way (see here and here).


In a way, they're saying 'Thanks for the taxpayers' cash, now get out of the way'.

 

I don’t want to sound all conspiracy theorist here, but I do wonder why such staggering amounts of public and private money are being spent on developing products and services that have only rudimentary privacy protections for consumers – and for anyone who happens to utter a word in earshot of someone wearing an AI device.

 

It feels like there’s more money to be made on the data collected. There’s money to be made off loneliness and exploiting our innate need for meaningful connection.

 

What horror. It’s a recipe for a dystopic nightmare.   

 

I feel like I’m out of my depth, so…

 

Let’s bring it back to humans

 

For all our eccentricities and unique and interesting weirdness, we humans are a predictable bunch. We need food to eat, clean water to drink, air to breathe and meaningful connection to flourish and thrive. Things go awry if these basic needs are not met.

 

We also need to know that we matter – at least on some level. We need to know that there’s a broader purpose to our life. Many people – me included – derive some part of our purpose from the contribution we make through our jobs.

 

I have 7 big questions.

 

  1. If AI is set to take over our jobs and the contribution we make to society, where’s the conversation about how to retrain and reskill us humans? If we’re not working and our basic needs of food, shelter and warmth are met, what do we do with our days?


  2. Is there a Plan B being developed for all the humans who won’t have anything to do and no sense of purpose?

     

  3. Does that Plan involve sharing the financial gains made by the developers and investors of the AI technology with those people who won’t have any way of earning money to fund and sustain their lives? Will the work done by AI provide the money to support a liveable Universal Basic Income and fund our education and health systems?

     

  4. On a personal level, if AI is being developed to smooth over any discomfort we experience in our lives, what does this mean when that discomfort relates to the thoughts and feelings of a human’s loneliness experience?

     

  5. If someone says that they’re in need of connection because they’ve been experiencing loneliness, is the default response going to be to forget the Cardinal Rule and buy them an AI friend instead?

     

    Authentic connection – the connection that we all need – lies on the other side of being vulnerable. Vulnerability is scary. We risk negative judgement and exclusion. As such, vulnerability is often a hurdle to connection. An AI friend removes this hurdle, but then the connection we get back may not be the connection we need, which compounds our loneliness experience.

     

  6. Do we buy more AI friends when the problem persists? Do we need to buy more or different AI friends for different stages in life? That seems quite lucrative, doesn’t it?

     

  7. What I really want to know is: are there ethicists sitting around the table or in the Slack channels as ideas are being thrown around for the next big AI thing? Are they asking, ‘Just because we can, does it mean we should?’ If they are, are they being heard?

 

I have no answers to these 7 questions. But I do have a suggestion.

 
AI as the human support tool, not a human replacement

 

In a similar way to what I wrote in ‘Let’s get this clear: social media isn’t social’, a connection tool needs to be just that: a tool for connection.

Image: canva.com

For me, the trouble starts when the tool is the place for connection, not the tool through which to get connection.

 

As we embrace AI, we have an opportunity to learn from how (often poorly) we’ve integrated social media into our lives. AI needs to be the support tool – always encouraging the user to connect with other humans in real time – and not a replacement for that need.

 
Is this the future we want?

 

I worry about a future where AI dominates our lives and is seen as the most efficient ‘fix’ to loneliness.

 

(Remember that the goal is not to fix our loneliness, but to understand it because it’s telling us about the connection we need but aren’t receiving. Loneliness doesn’t mean we’re broken; it means we’re human.)

 

I worry about a time when, if Nan gets lonely, we buy her an AI friend to keep her entertained and to keep an eye on her, rather than stepping in ourselves to support her while encouraging and empowering her to get the connection she needs.

 

I have the same concern for the teenagers in our lives and those time-poor humans in midlife who are trying to keep it together. The ersatz connection from AI will simply feel easier, even if it's not helping us.

 

A future like this makes the world less exciting and more insular. We become further isolated, retreating into our bunkers and echo chambers, soothed by AI telling us what we want to hear to keep us using the platform.

 

It feels dystopic.  

 

We forget The Cardinal Rule at our peril.

 

Let’s end your loneliness

 

What do you think? Are you excited by an AI-fuelled future?

 

Are you tempted to buy an AI friend to help alleviate your next loneliness experience?

 

My recommendation is this: as you navigate the future, remember The Cardinal Rule:


Meaningful human connection is the antidote to human loneliness and social disconnection.

 

Focus on meaningful human connection first. 

 

That’s it for this article

 

Thank you for taking some time to read these words. We provide them to serve, support, challenge and inspire you as you become a more connected human.

 

Subscribe to our mailing list if you want to see more of our content. The mailing list is the only way you’ll be guaranteed to see our content, because what you see will no longer be at the whim of an algorithm. 

 

You’ll get an email from me each week or when there’s something new for you. And you can unsubscribe any time if you’re not feeling it anymore: we’ll still think you’re amazing.  

 

Until next time, be awesomely you.

~ Phil  



Get your connection plan to thrive through life's transitions. Only through our Connection Starter Course.

 

Important:

All views expressed above are the author’s and are intended to inform, support, challenge and inspire you to consider the issue of loneliness and increase awareness of the need for authentic connection with yourself, with those most important to you and with your communities as an antidote to loneliness. Unless otherwise declared, the author is not a licensed mental health professional, and these words are not intended to be crisis support. If you’re in crisis, this page has links for immediate support wherever you may be in the world.

 

If you’re in crisis, please don’t wait. Get support now.

Recognising First Nations peoples and cultures is important to us.
We acknowledge the First Nations people as the traditional custodians of the lands upon which we work and live. We acknowledge and respect their continuing culture and connections to land, water and community. We pay respect to the Elders of the Ngunnawal, Turrbal, Kulin and Gadigal Nations past and present. Always was, always will be.

We acknowledge Māori as tangata whenua and Treaty of Waitangi partners in Aotearoa New Zealand. We pay respects to Māori as the mana whenua of Aotearoa New Zealand. 

We warmly welcome all humans of all backgrounds and identities engaging with this work. We see you because we are you. We're proud of you. 

2023 - 2025 by HUMANS:CONNECTING

HUMANS:CONNECTING is part of 

the lonely diplomat
Australian Business Number: 245 667 509 55

Website disclaimer and Terms of Use & Privacy Policy
