Build What Matters - Responsibly. A 20fifty essay on ethics.

At 20fifty, our driving force is to Build What Matters. The line we must hold, however, is to build what matters responsibly. As we look into the not-so-distant future, the conversation about Artificial Intelligence (AI) and ethics is unavoidable.

Catherine Lückhoff, our CEO at 20fifty, explores the implications of collecting personal data. As we work to Build What Matters, we remain focused on the opportunities within AI. The collection of personal data by AI-powered wearables and in-home devices is already common technological practice. Inspired by discussions with the Human Embodiment community on Exponential View, Catherine was asked by the community leader to consider the role AI could, and likely will, play in our personal interactions in the near future. That community leader is Robbie Stamp: CEO of BIOSS International, Chairman of h2g2.com (The Earth Edition of the Hitchhiker’s Guide to the Galaxy) and a member of the BSI’s National Standing Committee on AI (SC 42, for the Hitchhiker fans!). We asked Robbie for a comment, and here’s what he had to say:

“I’ve been hosting these AI thought experiment webinars for a while now. I am currently thinking of forming the AI Goosebumps Club, named after a wonderful thought experiment that ended up on Mars and played with the question: could an AI get goosebumps? The thought experiment I asked Catherine to develop, which became one of our most detailed and thought-provoking of all time, had its origins in a movie script that never saw the light of day (well, not yet!). Catherine took the initial prompt and made the idea so rich, and you could feel in the webinar how hard people were having to think about the boundaries between human and human, and human and non-human; between desirable outcomes, our capacity to control those outcomes, and our fear that we might not.”

It’s Friday afternoon. David and Sara walk into the kitchen, arms laden with mostly plant-based groceries and a sneaky piece of overpriced smoked salmon for David, who still craves sustainably caught animal protein. On the kitchen table, yesterday’s paper, dated Thursday, 27 May 2026, lies open on the travel section, the azure seas of the Maldives beckoning the duo towards a break from their routine. They haven’t spoken much on the autonomous car ride home, both lost in their own thoughts.

As they unpack their groceries, Sara asks: “Babe, did you remember to confirm the holiday yet? I know you must be getting the reminders – I asked Alexa to send you one a day until the task was complete. I know you, you need at least 10 reminders to get anything done.”

Recording your life, live
That’s when David’s heartbeat increases by 20 beats a minute. His adrenaline spikes and, unbeknownst to him, his pupils dilate and his sweat glands release what will soon become a fine sheen across his forehead and top lip. His LiFE tracker has already sent this data to Johari. The built-in, state-of-the-art smart home system, complete with Bose speakers, quietly monitors his tone of voice, intent and cadence. Every word is mapped against his private and open conversations recorded over the past 6 months, ever since they signed up to the AI counselling service.

David: “I did not.” 
Sara: “Ugh, you always do this.”
David: “Do what?”
Sara: “Forget the stuff that actually matters. I mean, if this was a motorbike trip with Graham, you would have confirmed the whole thing months ago.”
David: “That’s such BS and you know it.”

AI and accountability
Sara had been complaining about this very situation to her sister just two nights ago. Her sense that David was becoming less and less interested in their joint happiness, or in making a success of their marriage, was only compounded by her increasing need to “be seen and heard”. She knew turning 45 would be hard, but she didn’t realise quite how invisible she would feel. As an introvert, she was grateful for her weekly Johari sessions, and for the emotional support trigger that alerted Ruth whenever Sara, who is not one to reach out at the best of times, really needed a shoulder to cry on. Ruth had pinged her at a time their linked Johari programmes had flagged as convenient. Although still a bit freaked out by how well the system worked, Sara was always grateful for the well-timed calls.

Back to the future
Back in the kitchen, the conversation had devolved into a full-blown argument and, before David could stop her, Sara uttered the dreaded word: “ARBITRATE”. He hated the damned thing and resented how Sara had insisted they sign up to Johari to “help their marriage”. Who gives a damn that it worked for Ruth and Bob!? They were becoming more Stepford by the day.

With both parties alerted by their wearables (a watch for David, an in-ear device for Sara) that an arbitration was underway, Johari stopped pinging only once both of them kept quiet.

Enter Johari
Johari: “As you both probably know from our previous session, the issue you are fighting about is not actually the problem. David, on 20 May you promised Sara that you would review the trip Alexa customised based on both your holiday preferences, a trip designed to accommodate both of your likes and dislikes. Yes, it may be slightly skewed towards Sara this time round, but let’s face it, last year’s skiing trip was more to your liking. Your vitals show that something else is nagging at you. Before I tell you, can you think of what this might be?”

David: “Uhm… no…”

Johari: “Well, from what I can glean from your vitals, irregular sleep patterns, REM and conversations over the past 6 weeks, you are feeling very disempowered, both at work and at home. Your exercise activity has dropped by 30%, your sex drive is the lowest it has been in 3 years, and your IBS is flaring up. You laugh, on average, 50% less than before. You hardly pay anyone any compliments and your screen time has more than doubled. From what I can tell, the trigger can be traced back to 1 May. Do you want to tell Sara what you are feeling and what you think that trigger may be?”

David: “Not exactly. I mean, is nothing private anymore? Bloody hell, I can barely pull my zipper down without someone listening.”

Sara: “Babe, don’t be so passive-aggressive.”

Johari: “Sara, it is not your turn to speak. Please allow David the space to come to grips with his own feelings. We will tackle your issues in a minute.”

Johari: “David?”

Although reluctant at first, David gives in and admits that his demotion on 1 May, communicated via email – email of all things! – has left him feeling worthless. Least of all did he think that a robot would one day replace him as a physiotherapist. Nonetheless, here he was at this point in his career: “human-in-the-looping” for a Boston Robotics physiobot while it manipulates painful intercostal muscle spasms.

Sara, in turn, finally admits that, after hearing Johari’s overwhelming mountain of evidence, her ever-increasing contempt for, and disapproval of, everything David does could (and would!) wear anyone down. They both recommit to weekly sessions and increased tracking for the next 3 weeks.

In accordance with their non-secular preference settings, Johari ends the session with a quote from Alain de Botton’s book The Course of Love: “I will never be able to do or be everything you want, nor vice versa, but I’d like to think we can be the sort of people who will dare to tell each other who we really are. The alternative is silence and lies, which are the real enemies of love.”

Johari: “It is a good thing I am here to tell you who you really are…”

About the thought experiment
When writing this piece, I was thinking about just how far we are willing to go in allowing AI to direct our lives and relationships. As we forge the future and seek to Build What Matters, these decisions truly are up to us. I hear too many people say that they simply don’t care what personal data is being captured; “I have nothing to hide” is one of many flippant responses. We can’t even begin to imagine how much agency we are willing to give up, and the incremental erosion of our privacy is already beyond the tipping point.

To quote Tim Berners-Lee, inventor of the World Wide Web:

“The Web as I envisaged it, we have not seen it yet. The future is still so much bigger than the past.”

The expanding ethical debates and ever-growing use cases for AI simply cannot be ignored. But we must stop and think before we build, and before we relinquish our privacy. The future may not be as dystopian as we shudder to think, but only we can direct it. We must Build What Matters, and we must build it responsibly.