
Mark Gomes - Four Way Interview

Mark Gomes is a writer and tech executive who uses fiction to ask the questions our systems won't. His latest novel, Age of Extinction, explores AI as a man-made extinction event—rooted not in rogue machines, but in profit-driven logic. At its core is The Equation—a simple but urgent framework for understanding what it takes for humanity to survive. He is also the author of The Heavy Butterfly, a work of mystical realism that uses quantum theory and surreal imagery to explore consciousness, identity, and what might lie beyond death. Mark studied the philosophy of science with a focus on Bayesian reasoning, but believes deeply that logic means nothing without moral clarity. He lives in Munich, thinks in story, and writes to provoke.

Why this book? 

I wrote Age of Extinction because the AI conversation felt like pantomime. Doomers shouting 'Skynet', TechBros promising utopia—and almost no one asking the simplest, most revealing question: who benefits from all this? When Geoffrey Hinton said AI might pose an existential threat, what struck me wasn't some vision of rogue machines—it was the very human choices being made, right now, in plain sight. That's what unsettled me. Not the tools, but the people using them. And what they're building isn't a future—it's a power structure.

I started thinking about extinction-level events. Not metaphorically, but literally. Dinosaurs didn’t survive theirs. What if AI is ours—but dressed in progress, not asteroids? That became the lens through which I built the story: step by step, the societal equivalent of a species being pushed out of relevance. But the bigger question was: how do you get anyone to care? You can write essays, give talks—but most people aren’t reading science journals or watching Senate hearings. They’re scrolling. They’re exhausted. So, I turned to fiction. Something that entertains on the surface but leaves something behind once it’s swallowed. This isn’t a story about artificial intelligence. It’s a story about artificial humanity—and what happens when we forget what being human is even for.

Your picture of an AI-driven future is bleak: do you see any way around massive job losses to AI?

Only if we stop pretending this is accidental.

AI is like a loaded gun. It doesn’t pull its own trigger. The danger isn’t in the tool—it’s in the hands that wield it, and the systems that let them get away with it. Right now, companies are quietly shrinking the workforce. Not with mass layoffs—but by not hiring, not training, not investing in people. Politicians know it’s happening, and they look the other way. Why? Because the people losing out don’t fund campaigns. The ones building the AI do. And through all of it, we’re told to 'upskill,' as if the solution is more training videos. It’s not. The problem isn’t that people lack skills—it’s that our system no longer values human contribution unless it’s profitable.

It doesn’t have to be like this. If AI is trained on our work—our words, our art, our decisions—we deserve a stake in its rewards. If companies cut costs by cutting people, then let them pay for the social damage they cause. We tax cigarettes for harming bodies. Why not tax AI for hollowing out livelihoods? People don’t just need jobs. They need purpose, belonging, meaning. Strip those away, and what’s left? A more efficient economy—and a broken society.

I find it difficult to imagine people would accept chips wired into the brain except for major medical conditions. Do you really think it’s likely to become commonplace?

Yes. Not because people want it—but because some will feel they have no choice. In wealthier countries, it’ll be sold as a lifestyle enhancement: faster cognition, seamless access, performance gains. But in poorer parts of the world, it’ll be about survival. If someone offered you a neural chip and, in return, your family could eat for six months—what would you say? That’s not science fiction. That’s economic reality.

And for the people pushing this—governments, tech giants, billionaire visionaries—it’s not really about medical breakthroughs or human progress. It’s about data. In the modern world, data is the core currency. If you own the platforms, the media, the commerce—and then gain access to thoughts themselves—you don’t just predict behaviour. You control it.

That’s the deeper danger. Once enough people accept the chip—willingly or not—it becomes a gateway to everything: education, employment, insurance, even citizenship. Augmentation won’t be about enhancement. It’ll be about eligibility. Yes, it’s dystopian thinking. That’s the point. Age of Extinction isn’t a prophecy—it’s a provocation. I want to force this conversation into the open now, while we still have the option to say no.

What’s next?

I’ve been thinking a lot about time—not just as a sequence, but as a threshold. The instant before the Big Bang was pure quantum possibility. No direction, no outcome—just potential. What if we’re living in something like that now? A moral tipping point where our choices still matter—until they don’t. That’s the idea behind my next novel: quantum agency vs. corporate determinism. A story about who we become when the future isn’t written yet—but a few powerful players are already trying to copyright the ending.

On a more personal level, I’ve been returning to the question that first pulled me into philosophy. Years ago, a friend who was studying anthropology gave me a thought experiment: Imagine someone who spends eight hours on a factory line, eight hours living as a king in VR, and eight hours asleep. Which version of that person is real?

It got under my skin. I couldn’t stop thinking about it. That’s when I started reading philosophy—not just metaphysics, but logic. Eventually, I studied the philosophy of science, with a focus on Bayesian logic and the reasoning behind algorithms. Not just how they work—but why they’re built the way they are. What assumptions they carry. What trade-offs they enforce. That matters now more than ever. We’ve become obsessed with the what—what AI can do, what’s possible, what’s next. But we’re losing the why. And if we can’t reclaim that—if we don’t reinsert intent, value, and human judgement into the conversation—we’ll be left with systems that are incredibly powerful and completely unaccountable.

That’s why I built The Equation into Age of Extinction. I wanted something as simple as E=mc² but rooted in human systems. A formula that asks: what does it take for humanity to survive — and to deserve survival? It’s not mysticism. It’s logic. Evolution isn’t just a biological process — it’s a test of coherence between individuals, communities, and the systems they depend on. The Equation became my way of expressing that. Something grounded. Immutable. A lens for seeing what’s working and what’s failing in our world.

So, what’s next for me? Trying to write stories that put the why back on the table.


Interview by Brian Clegg - See all Brian's online articles or subscribe to a weekly email free here
