Always Be There
The quest for unconditional love
Late last year, I set up Google alerts for ‘missing’ and for ‘disappearance’. Earlier that year I’d revisited a book I’d bought secondhand as an undergrad, about people who’d gone missing: children who’d been taken from London estates in the 90s, men who’d been lost to the Thames the same decade, women my age in hostels with fake names, bruised, with rusting scars on their forearms. When I was still studying, I went to an exhibition of paintings of missing people, with A4 lined paper and my phone open to voice memos. I’d spoken to people, disguising conversations as casual, then gone to the uni library to write about it, and sent it in as my end-of-year project. The Google alerts are a kind of absolution. What I’m hoping for are stories like Sheila Seleoane’s or Laura Winham’s– a medical secretary and a former student, both found dead in their London homes, some years after the fact. I wrote about them both at the time, and about my neighbour, who’d lain unconscious at home for days and later died of sepsis.
“What times we live in that nobody missed her,” someone had told the Guardian when news first spread about Sheila. I am aiming to care before anybody else. Nearly 9,000 people in the UK died ‘lonely deaths’ in 2023. I want to glide an eye over the far corners, to always be present, free of my limits– to do as I’d want done for me. I want to care unrestrained, to help impossibly, and to be impossibly helped.
In The Atlantic’s December 2025 issue is a feature titled ‘Get a Real Friend.’ Mark Zuckerberg, head of Meta, father of Facebook, appears in it. Damon Beres paraphrased a conversation Zuckerberg had on a podcast in April,
“AI probably won’t ‘replace in-person connections or real-life connections’– at least not right away. Yet he [also] spoke of the potential for AI therapists and girlfriends, Meta’s desire to produce ‘always-on video-chat’ with an AI that looks, gestures, smiles and sounds like a real person.”
An image came to mind as I read, influenced by 2010– a time of overreliance on grainy webcam footage, when the UK took Skype calls from America, Canada, Nigeria, New Delhi. A fake person sits in a fake bedroom at a faux desk, wanting you to believe that computer light is capable of lighting their features. You call at 3 o’clock and they pick up like they always do. They’re wide awake like always. They always wish they were with you. Something about their face allows you to forgo shame– releases you to say odd things. But then, you never have to overexplain, thanks to the hard drive, thanks to the chat history. There’s always a remedy, and you need never restrain yourself. The average person wants more from people, more than anyone should wisely give, and more than can be given without signing oneself away. AI has already come to fulfil that need, to be the friend that sticks closer than a brother, to further portray the desire as inborn.
‘Sehnsucht’ is a German word that other languages fail to translate. I first used the word this time last year, writing an essay about yearning. ‘Sehnen’ means to yearn, to long for, and especially relates to pain– the desire for love. A website for people who want to move to Germany has ‘sucht’ down as ‘sick’. Together, it’s defined as ‘a high degree of intense, (recurring), and often painful desire for something, particularly if there is no hope to attain the desired,’ or, ‘when its attainment is uncertain, still far away.’
‘Love’, particularly the sort that impossibly endures, is an impossibility: we exclaim ‘Sehnsucht’ when we try, and fail, to grasp it, when we catch ourselves in the black mirror after exhausting even ourselves with ChatGPT. And yet, the need to be totally known, by some standards, is evidence of what can and should be.
From a 1990 book titled ‘Thinking Fragments…’ is a sentiment that goes like this,
“We experience as satisfaction and pleasurable whatever does away with a need,” writes Jane Flax, a psychotherapist and professor of political science at Howard University. “Organisms are governed by the ‘pleasure principle’– a need to reduce tension through instinctual satisfaction.”
The desire for an unfailing love is first valued as an instinct, before we can decide to assert it as a need. That could explain why our AI conversational habits bring a semblance of pleasure, a faux wholeness. Intense, unfiltered interaction– the knowledge that you’ll never be turned away– is often described as a want, and not a necessity: easily so, when the most sought-after version of unconditional love is an unholy kind,
“No chatbot will ever tell you it’s bored or glance at its phone while you’re talking or tell you to stop being so stupid or self-righteous,” wrote Damon Beres in ‘Get a Real Friend’; “they provide some facsimile of companionship while allowing users to avoid uncomfortable interactions or reciprocity.”
Our idea of unconditional love has evolved, like a glass of milk left under a white sun, into something curdled and unsanitary. Nonetheless, it’s a call to search within. In its purest form, desire for a love the world can scarcely meet is a hint to some,
“If I find in myself a desire which no experience in this world can satisfy,” wrote C.S. Lewis in 1952, “the most probable explanation is that I was made for another world.”
That would be a world in which neighbours knock with the courage, and the capacity, to intrude. AI will find a method of pouring soil into that hole, with many of us maintaining that it’s the wrong kind of soil, some of us feeling prompted to search for alternatives, and only a few feeling obliged to revisit options formerly written off. If people can’t do the job of true love, then neither is it within the skillset of their creations.



