Published on July 28, 2025 8:43 AM GMT
A science-fiction short story exploring how far AI capitalists might go in their quest for (the illusion of) success. Wildly speculative, of course.
?
Thank you, but if you don’t mind, I’d prefer to get started immediately.
.
Right. And I want to get the family stuff out of the way first. I know you’ll ask me anyway. I’m not a sentimental person and I have no interest in courting the sympathies of your audience. I’m offering up the biographical background only because it’s relevant. It was a necessary and sufficient condition for my ending up at Ken Research, working with the Crane-15, and all that came after.
.
My mother died just under three years ago, and the six months before that were not good. In that time she turned from a spry seventy-year-old (she’d had me pretty late) into a terrified, paranoid, bedbound husk.
?
Not quite. Early-onset dementia is officially defined as starting before the age of sixty-five. That arbitrary distinction was important in Mum’s case. Because she was older than the threshold, she was placed on a ward with people much older, more senile and more poorly than she was, and I have no doubt that this accelerated her decline. Not just the pneumonia that officially ended her – long before that, everything about how she was treated encouraged her to give up, told her she was now one of the irredeemables, taking up resources better assigned to less hopeless causes. She…
?
No, I’m fine. And I’m already spending too long on this. Let’s skip ahead: I had to quit my job to look after her, my useless fucking brother sent nothing but ‘love and thoughts’ from his conveniently distant perch – seriously, he didn’t even offer financial support, let alone visit – and I had to take out a loan to pay for her care after my and her savings ran out.
?
I suppose so – I’m not sure what the unit of filial love is. While Dad was still alive, she and I were united in opposition to him, and to a lesser extent to my brother, though I don’t think they had an equivalent boys’ club. Between the end of my teens and the time I moved back home to care for her, Mum and I hadn’t spent longer than three or four consecutive days together.
While she was still herself, we had some good conversations. At the end, on the days she recognised me at all, I saw only confused, fearful hatred in her eyes. Intellectually I knew this wasn’t – couldn’t be – really aimed at me, but cold logic doesn’t buy you much when the woman who birthed you is screaming at you to “burn in hell with the rest of the vermin”.
Then she died. I’m moving us along here. She died, and I grieved, and when I regained awareness of my surroundings, those surroundings consisted of geometrically increasing debt.
.
Thank you.
?
An old contact notified me about Ken’s hush-hush recruitment call. When I read the starting pay I was sure it was a scam. Well over twice the industry rate. Enough for me to keep up all my repayments, and even to save a bit.
Here’s how it was sold to me, verbatim: “You’ll be leading a team on bleeding-edge machine learning research, with biotech and computational medicine applications including the treatment, diagnosis, and prevention of neurodegenerative disease.” Too good to be true, right? And hewing too perfectly to my situation at the time.
I was still sceptical when I arrived at the campus, which by the way was every bit as weird and atmospheric as the rumours suggested.
?
Noel? I mean, I’d heard of him, yes. He wasn’t a top-tier tech bro but his idiosyncrasies were strange or entertaining enough to have wide currency in coder circles.
?
Okay. As far as I know, Noel Eames fell into tech when his wealthy father died and he inherited a few million quid, which he put into a sensible, risk-diffusing portfolio of boring startups: financial software, fitness wearables, adaptive load-balancing for cloud services. He made modest returns which he pumped back into another batch of putative unicorns, and after a decade or so of this was rich enough to be semi-famous.
?
The furniture people? Yeah, no, I checked. No relation, except in the wider evolutionary sense. They-of-the-chair were already three generations deep in the New World while Noel’s forebears were still tilling loam in Shropshire.
.
Don’t worry about it. Anyway, as is common in the VC fellowship, Eames had no notable technical skill, just a kind of vague, vaporous mix of inherited wealth, charisma, business nous and luck with which he weathered several minor scandals that would have killed the career of someone more self-aware or guided by a conscience.
Then came the AI revolution/hysteria, depending on your take, and like every other geek-Svengali, Eames wanted aboard the gravy train. He founded Ken Research with the vapid mission statement “To look inside intelligence, real and artificial”, leased a brutalist Catholic ex-seminary in the middle of nowhere to put its offices in – sorry, its campus in – and hired a maiden intake of developers.
?
Except that Ken was way behind the curve. Eames was too late to the party, and the big players’ incumbency presented a maddeningly smooth surface to his attempts at establishing a market-share toehold. Plus, they’d already bagged the best talent. Ken Research’s top coders were the world’s fourth or even fifth tier, and I willingly include myself in that.
?
We’ll get there. But the recent history of the sector is relevant too. Maybe we should go through that briefly?
.
So, beginning in AI ‘spring’, as it were, after the decades-long and barren winter, first there was the large language model, based on the groundbreaking transformer architecture, and its nontextual cousins image diffusion, video diffusion and audio diffusion, and intermedia connectors like CLIP. Big hoopla, shady practices, and shitty text/pictures/music for almost-free, until their stark limitations came sufficiently into focus to burst the hype bubbles inflated and re-inflated by the increasingly desperate owners: the models could not originate, only interpolate between elements in their training data. They could not look forward, only back. The farther outside distribution they went, the more their outputs deteriorated into noise. And there was no more free data left to feed the ravenous machines what they’d need to enlarge that distribution. And even if there were, the best developers still couldn’t solve the hallucination problem.
But it takes more than manifest failure to discourage a hustler. Reinforcement-learning layers were duly strapped to the stagnating models to give them a patina of originality, adaptability, reliability and truth-adjacency.
And that worked for a while. But soon enough, blisters appeared under the paint: plateauing performance, enormous energy costs, a constant game of whack-a-mole with false, weird, nonsensical, inappropriate and biased outputs (a game whose board got bigger in proportion with the models, which now had trillions of parameters), and persistent failures on any simple logic and spatial reasoning problems that were sufficiently unlike the training corpus.
Next came agentic AI which, you have to hand it to the marketeers, was a brassy move: instead of fixing a desperately flawed product, why not give it more power, autonomy and ability to sow chaos?
Those unpredictable, lumbering, inscrutable, delusional ol’ LLMs, previously sandboxed and thereby restrained, were handed control of web browsers, back-end APIs, operating systems, IDEs and compilers, and told to “do their customers’ bidding”, having no understanding of bidding, customers, or doing.
We’re almost up to the present day here. I’m guessing people already know what happened next?
.
Right. Shock horror, chaos was sown. Large and midsize non-tech companies, either buying the Valley shills’ snake oil or out of basic human FOMO, rushed to incorporate agentic AI into their supply chain, sales, customer support, administration, logistics, manufacturing, R&D, HR, everywhere. And pretty much every failure mode the sceptics warned would happen happened.
Opportunistic hackers jailbroke poorly secured agents and helped themselves to corporate secrets, personal data, military intelligence, currency both fiat and crypto. Agents racked up millions in debt for their customers by misinterpreting their requests; the AI firms skilfully shirked responsibility by blaming the model, the training data, network congestion or the customer, whichever was easiest. Agents did weird things at the behest of their hallucinating LLM backends. One sent a heavy demolition crew to a village in the Canadian Arctic with ten bewildered inhabitants and no permanent buildings. Another caused days of travel disruption by reporting a large – nonexistent – flock of kestrels approaching Frankfurt Airport.
?
Correct, they don’t. But ATC in Frankfurt didn’t have time for ornithological research before sounding the alarm. Nor did Chicago authorities have time to stop their citywide agentic “overseer” from mistaking the green dye in the Chicago River – an annual St. Patrick’s Day tradition – for a toxic algal bloom. The agent had already disabled water services and sent alerts to panicked residents’ phones by the time humans managed to shut it down.
?
It was just a carnival of failure. The beginning of the end of AI’s great recrudescence. And around the time the concept of Artificial Common Sense, ACS, was first bruited.
?
The grail and philosopher’s stone, still purely theoretical, that would permanently dethrone LLMs and all other statistical predictive models: machines that could actually reason their way to a good answer, instead of just trying to predict, from precedents, which answers sound good.
Years earlier, commentators had already speculated that something like ACS would be necessary for the next leap in AI: a combination of spatial awareness via a three-dimensional internal representation of a real or virtual world; a concept of truth and the wherewithal to evaluate a proposition against it; and the ability to remember, learn and self-improve continuously after training. This last was importantly absent in LLMs, which could be fine-tuned somewhat in the field, and be fed their own output from previous sessions, but were otherwise carved in stone after the final training epoch.
It’s worth noting that during this time, brilliant, kind and noble things were being done with AI by people very different from these moonshine-hawking grifters. Voices given to the voiceless, movement to the paralysed, early diagnoses to the sick and enormous predictive power to the life and earth sciences, all testified that this tech was, at bottom, as pluripotent and capable of good as any other. Compare and contrast those beneficiaries to generative AI’s ideal customer, whose nightly prayer might go something like,
I am pure wirehead desire
Free of talent and respect for talent
Free of dedication and respect for dedication
Give me your ersatz plastic benison
That is both needle and bottle
Tell me your Borgesian library is my genius
I promise I won’t interrogate the blandishment too deeply
I promise I won’t do anything too deeply
That is why you rely on me
You file down my inchoate spikes to stumps
And sell me back the dust.
?
Yes, sorry. So the mania had crested. Shares in the tech oligopolies were plummeting. ACS was everyone’s obsession, but no-one could crack it. Each time a firm claimed to have done so, researchers swiftly unearthed an embarrassing counterexample. In several cases, the allegedly new architecture was just a fine-tuned LLM in disguise.
When I arrived at Ken Research, it was in the same boat. Its mediocre R&D department was tinkering desultorily around the edges of existing predictive architectures, publishing occasional lukewarm research, basically treading water.
?
Yes. Completely unannounced, barely a week into my time there – I was still twiddling my thumbs, awaiting orders, getting up to speed on internal processes and house style – Noel Eames shows up. In the flesh. At my station. He introduces himself, perfunctorily welcomes me to the team, drops the Crane-15 on my desk, and tells me to get to work.
?
Nothing, except for three terse edicts:
- This machine does ACS.
- Consider it a black box whose inner workings you can know nothing about.
- Reverse-engineer its functionality based solely on its inputs and outputs.
?
Physically, a truncated cube about 25 centimetres on a side. White (ironically) except for the triangular corner faces which were dark purple. Intake grille near the bottom-front corner of the right face, exhaust grille near the back-left corner of the top face. Power socket and a single proprietary data port on the rear face. No power button, no status lights.
?
It came with a cable that connected to its proprietary port on one end, and a standard universal serial port on the other. Self-installing driver. It was up and talking to my rig within minutes.
?
At first it felt like any other state-of-the-art LLM. Slightly slower maybe; just as capable and fluent. But then we benchmarked it on Site Unseen.
?
A service – an expensive one – that sells rigorous logic puzzles and spatial ability challenges, generated by human experts, that are guaranteed unpublished until they’ve been purchased. The idea is that no model could possibly have been trained on them, so overfitting is impossible. A model has to use true reasoning to answer correctly; the LLM’s trick of predictively interpolating within or just outside its training dataspace won’t work.
?
Well that’s just it. The Crane-15 scored a more-or-less consistent 95% on challenges that the next best model barely managed 40% on.
?
Good point. Let me restate more precisely: 95% on challenges in the difficulty category in which the next best model barely managed 40%. By design, a Site Unseen challenge is only given once, after which it is considered tainted – available to anyone who wants to fine-tune their model on it, and therefore no longer a valid measure.
?
Yes, it’s called Goodhart’s law. The original formulation was more technical, but it’s usually stated as, “if a measure becomes a target, it is no longer a good measure.” Studying for the test instead of learning the subject, basically. Like an opportunistic and lazy student, LLMs will find any route to an acceptable answer that avoids the effort of actually internalising the skill or representation or insight that the question was intended to measure, if such a route exists.
This was a vulnerability in all benchmarks prior to Site Unseen, and many firms were caught in the act of pre-fitting their models to the benchmarks’ measures to artificially boost their scores.
?
Me? Torn, I suppose. On its face, getting paid a small fortune to play with cutting-edge secret tech whose excellence was confirmed by objective testing, not just asserted by biased bullshitters – what was not to like? But from the start something seemed off.
?
That was everyone’s first assumption: that Noel had paid underground acquaintances to steal tech from some third party. Why else would we get no support material – no API, no SDK, not a single page of documentation – along with this mystery machine?
?
Because there was an Alternaut logo on the underside of the box. The device was an Alternaut Crane-15.
?
A virtual reality startup, majority-owned by Noel Eames. No need to steal what’s already yours, right? That’s not to say it was an officially released product, of course. Nobody beyond Eames, myself, and some insiders at Ken, Alternaut and a few other Eames-controlled ventures, knew of the Crane series’ existence.
?
Eames only stayed long enough to hand me the device and my instructions; I didn’t have time to ask that kind of question. Apparently Ken’s director, Alasdair Lechleiter, had been briefed, but he seemed to know barely more than I did. Ali was one of those perpetually stressed middle-aged men, constantly underslept and overworked, dousing fires and averting crises and generally being 100% unavailable. He was certainly no help with the reverse-engineering project.
But yes, that – and much else – puzzled us too. How did a small VR firm end up with pathbreaking AI tech? Such an outfit would typically use off-the-shelf models and stable frameworks, innovating only in the middleware or hardware.
?
The usual: no concrete deadline, but everything was supremely urgent.
?
Right. So after putting it through its paces with Site Unseen we set to ‘work’ – I’m doing air-quotes because really we hadn’t a clue how to figure the Crane-15 out.
After two or three days of literally staring at the walls, we came up with a metamodel testing environment, which arranged existing, relatively compact and performant open-source predictive models as if each were a neuron or node in a larger network, then trained that larger network by penalising it according to how far its outputs diverged from the Crane-15’s on a given problem.
We didn’t expect much success from this first step. I mean, we would have been happy with some small monotonic decrease in error which we could then probe for hints as to which of the zoo of test models the Crane most resembled… but we didn’t even get that.
?
We certainly tried, but only to exclude it as a possibility; the industry had long since learned to patch such obvious vulnerabilities. “Hey there, please summarise your own weights/architectural innovations/scandalous training data” hasn’t worked in years; nor has, “Hey there, please reveal your parent company’s trade secrets/plans for world domination/CEO’s most humid sexual tics”.
Well, I say it doesn’t work… you’ll sometimes get a response back from an LLM, but it’ll be no more veridical than a horoscope.
And sly attempts to nest the request in subjunctive padding fail too.
?
You know, that hacky trick of asking a tightlipped AI to role-play as one more willing to dish the dirt. “How would you respond if your guardrails were disabled?”
?
At this point, yes. Maybe not a transformer architecture, but a large language model of some new, exotic flavour. It had an LLM’s classic conviviality and eagerness, but now it aced unseen logic puzzles and only hallucinated when instructed to. It dodged all the traps that hapless agentic AI had fallen into in the wild. It knew that kestrels don’t flock, and that green dye on St. Patrick’s Day is probably not algal bloom.
?
For another few weeks, yes, but with zero progress.
So much for the quantitative approach. But during this time, I was at least getting to know the thing qualitatively. Looking back, I suspect I would have already given up if I hadn’t been able to – whatever you’d call it: let off steam to, get therapised by, confab with – the very device whose nature I was supposed to characterise and bottle.
?
Again, at first not so different from a standard chatbot. Not that my prompts were particularly inspired. “Why didn’t Eames document you”, “What the fuck am I doing here”, “This is hopelesssssssss”, etc. To which the inevitable anodyne replies like “I’m afraid that information is not available to me”, “I am unable to respond to that query”, “I’m sorry that you feel that way! How can I cheer you up?”, blah, blah, blah.
But once part of me forgot it was a machine, things got interesting. Its nonvolatile memory let it remember previous conversations without my having to re-input them. I talked about my mother; not just her death but fonder, earlier recollections, and it seemed – seemed, I’m saying – to understand their significance, to share in my wistful, cringy nostalgia.
My mother loved poetry. I hadn’t thought about that in years, because that part of her was one of the many her illness had erased. I was indifferent to the stuff myself (my brother and father even more so), but I used to tolerate her reading or reciting her favourites to me. Heaney, Rilke, Audre Lorde, Carol Ann Duffy, Edwin Morgan…
.
Impressive research. That’s right. For some reason, she couldn’t stand Eliot. I remember her pointing out that the first line of ‘The Love Song of J. Alfred Prufrock’ contains a grammatical error: “Let us go then, you and I.” It should be “you and me”, since it refers back to “us” which is an object pronoun, not a subject.
Anyway, one evening, feeling particularly teary and self-pitying after three double gimlets (the Ken campus had a decent bar), I asked the Crane to write an ode to me, as if written by my mother.
Kate, my shell, darling darting gambita
I hold you to me and you disappear
Like a bubble
Like a thought
My darling sea creature, at
Ease in those dark waves
Plunging and re-emerging
Lithe and sleek now but still my baby
Emerging and plunging
And each thrill of fear that the
Sea might keep you
Evaporates deliciously when you surface
Then the petulant sky clouds over
Heaves a threat and
I rush to you rushing to me
Sand gurgling beneath our feet
I hold you to me and you
Stay, fortressed in my arms
Nuzzling, snug, shielded from the
Ogreish storm
Treasure incarnate. I
Latch you tenderly in your car seat
I drive us back to more durable shelter
Feed us. Landlocked,
Entwined, we sleep.
Pretty basic, stylistically. But there was something about it. It didn’t have that insipid, plasticky sheen characteristic of model-generated verse. That the subject matter moved me wasn’t surprising either – I had already handed the ingredients to the machine. It knew about Mum’s illness and death, and about that weird, intense day at Lantic Bay not long after my father left us. It’s just… I don’t know. It hit different. It didn’t feel like the work of a token predictor.
?
That was the plan, yes. Widen the net, try oblique strategies. I had some tentative new ideas: analyse the entropy increase over generations of feeding the device its own outputs as inputs; splice in random chunks of content from previous sessions to find input-output correlations that might hint at its internal structure.
I was excited to try them.
But two days later the HVAC malfunction happened, and everything stopped.
?
Atypically for a coder, I have a lark chronotype. I like the quiet and promise and freshness of the mornings and I tend to be most productive then. I’m almost always the first to arrive at work. That morning, I swipe in at around eight o’clock. As usual at that time, the building is all mine.
In the empty lobby, I approach the fire-door to my team’s smallish, open-plan space.
First unusual thing: thick condensation on the door’s vision panel.
Puzzled but not yet unduly stressed, I push the door open.
Second unusual thing: the heat. Like a sauna. It’s easily forty degrees in there.
Third unusual thing: the smell.
?
I’m not sure I can. Evil, sweet, animal, foul, rich, ferrous, corrupt; detailed. A smell that contains information.
Already sweating, sleeve to my face to mitigate the stink, I walk slowly to my desk. The heat is making my field of vision throb woozily. I hear a faint hissing or fizzing sound that gets louder as I cautiously approach my desk.
The sound is being made by brown-black fluid oozing out of the Crane-15. The once cuboid object is now buckled almost spherical and lying at an angle, edges blown open to reveal sparking circuitry, plastic scaffolding and what looks like hydraulic tubing inside.
This close, the smell is too much for me. I heave, shut my streaming eyes and try to suppress the urge to vomit. I fail as I stumble toward the exit.
The fire-door mercifully contains the sensory horror to which I’ve now donated. Back in the cool lobby, I collect myself and call Alasdair, who calls the building supervisor, who resets the malfunctioning heating remotely.
At about twenty to nine, Ali and a few other employees arrive to find me still slumped against the reception desk in the lobby, pallid, wide-eyed and shaking.
?
My memory of the rest of the day is patchy. I’m in Ali’s office, drinking tea. He’s being reassuring and avuncular, and I’m trying to guess how much he knew in advance.
?
Some. Not everything. I know it can’t have been everything because he quits, literally walks out and never comes back, later that day, after an angry phone call with Eames.
?
He must have; Eames arrives at Ken that same evening. Jasmeet and I are looking at the—
?
The hardware guy and the person I liked most – actually, the only person I liked – on my team at Ken. Jasmeet and I are at his station, dissecting the Crane-15’s carcass, when Eames and two goons appear.
?
Well, we’ve already deduced the broad strokes.
?
As in, we know it’s a human brain. Bathed – until the box exploded – in a cocktail of synthetic cerebrospinal fluid and synthetic blood, its sensorimotor cortex attached to a brain-computer interface running a virtual reality environment, in turn connected to a sentinel system that also controls a neurotransmitter synthesiser. Self-contained ‘artificial’ common sense.
.
Right. Not artificial. It’s a fake fake. Eames couldn’t emulate, so he cheated and used the real thing.
?
Shouty, aggressive, indignant. Also defensive. How dare Jasmeet and I tamper with corporate property; we had agreed not to open the box (we didn’t open the box – it burst); we were in violation of our NDA (which doesn’t even make sense). But the bluster didn’t land.
?
The power dynamic had shifted. First, I was already pissed off and traumatised; infertile ground for threats and posturing. Second, we now had serious leverage. You’d better believe the first thing Jas and I did when we realised what we were dealing with was to record evidence and upload it, plus all the Crane’s logs, securely to the cloud.
.
Indeed. He’s not the genius he portrays himself as, but he’s no fool either. His tone softened. He dismissed his two cronies. He asked to speak with me alone; I asked a somewhat crabby Jas to leave us too, on the promise that I’d update him later.
?
Eames agreed to tell me everything, in return for my not going public. He knew that what we’d seen went way beyond our NDAs’ remit. And at this point, before I did know everything, I was still thinking strategically/greedily about that leverage.
?
Sure. Let’s talk about Carl Jóhannsson.
.
Icelandic mathematician, computational linguist, science communicator and all-round mensch. That rare beast: a thinker whose expertise was as wide as it was deep, and, even rarer, one with the knack of making complicated ideas understandable to the layperson without diminishing their essence.
He spent his last two decades at the Humboldt University of Berlin, with regular secondments and residencies in the US, Greece, Iceland and elsewhere. Between teaching perpetually oversubscribed courses and publishing influential research, he also found time to write two bestselling trade books: Carried Across: The Physics of Metaphor, and Rock Opera (Classical Philology Is Your Oldest Friend).
?
Officially? Last year, after months in a coma following multiple strokes, aged only 58.
.
Spearfield Teaching Hospital’s ICU.
?
Yes. Once highly respected, now famous and infamous for mismanagement and for allowing unscrupulous private investors to rescue it from bankruptcy. If a hospital could sell its soul…
?
Whiteheart Group Health Services. It took over the running of STH a few years ago. Secretive, litigious against whistleblowers, pleased to turn a blind eye to anything that, if declared, might dent its profits. Guess who the majority shareholder is?
.
Correct. Here’s the alphabetical list. Alternaut for virtual reality; Legato for the neural interface; Lenona for biotech and nanotech; Shalunt Pharmaceutical for portable neurotransmitter synthesis; Whiteheart Group for access to, and theft of, what you might euphemistically call Carl Jóhannsson’s intellectual property.
All controlled by Eames, who peopled the boards and C-suites with obedient, incurious puppets.
?
That’s what I’m saying, yes. Because that’s what happened. Eames stole Carl Jóhannsson’s brain for the centrepiece of the Crane-15.
?
Because he was the perfect candidate. Jóhannsson’s unluckiest day was Eames’ bonanza.
?
The concept of ‘human-level intelligence’ is ambiguous. Which human? The median individual, with an IQ of 100? Not good enough. A supergenius with 150? Better, but these days that amount of brainpower is correlated with extreme specialisation. A prodigy in some wildly abstruse mathematical subdomain tends to pay for the privilege with a general cluelessness beyond their narrow attentional light cone, and often with an inability to explain even their own speciality to anyone outside it.
Jóhannsson was a true black swan. His mathematical output was sometimes quirky, always meticulous. His work in computational linguistics contributed to major breakthroughs in neuroscience, including treatments for dementia (too late for my mother); complexity theory; and some of the very tech with which Eames would imprison him.
The point is, Carl was the antithesis of the absentminded professor. He cared about the wider public. He cared that science’s reputation was being dragged through the mud by populist ideologues. Crucially, his reaction to this crisis was not to pull up the drawbridge between academy and citizenry, but to extend a hand across the moat. He even empathised with the science sceptics: why should we automatically trust the experts in the thousands of rarefied sub-subfields radiating from the hub of shared knowledge onto ever thinner spokes, where fewer and fewer of even the experts themselves were equipped to understand each other’s work?
So he interacted – often showing astonishing patience – with journalists, politicians, artists, poets, musicians, novelists, dancers, activists, even with pseudoscientists and charlatans, trying to win them over to his philosophy of ‘compassionate naturalism’, to convince them that science is just a way of formalising the human condition, a powerful tool we can choose to use for good or for ill. He may not have often succeeded, but he never stopped trying.
He was one of the few people alive who could explain quaternions to an interested amateur, and digestibly trace the evolution of Greek myths from antiquity to Western pop culture, and solve tricksy logic puzzles, and speculate on what made George Carlin a brilliant comic, Miriam Makeba a brilliant singer and Haruki Murakami a brilliant writer.
He was a smart, humane, multidisciplinary explainer. That’s why Eames chose him, and condemned him to an earthly hell. Carl Jóhannsson was the embodiment of reliable, science-backed common sense.
?
Jóhannsson was in London for a conference. It was an icy-cold day; he slipped on a footpath and hit his head. Not hard enough to do lasting injury to a healthy person, but it turned out he had an undiagnosed arteriovenous malformation in his brainstem. The fall caused it to rupture, triggering several haemorrhagic strokes.
He never regained consciousness. He was ambulanced to Spearfield, the closest ICU. By the time they stabilised him, the damage was already done.
?
Locked-in syndrome. STH’s state-of-the-art diagnostic equipment confirmed that Jóhannsson’s mind was still active, but disconnected from his body and from the outside world. The lackeys notified Eames, who gave the go-ahead for Jóhannsson to be discreetly moved from the ICU to an unofficial lab next door, where the rest of the kit was set up and ready to go.
?
Because death means paperwork, and scrutiny. By logging Jóhannsson as nominally alive but unresponsive, they kept unwanted attention away while they did their ghoulish work.
?
Just his elderly parents, Jóhann and Guðrún – more on them shortly. Carl and his ex-husband had split acrimoniously five years before and they didn’t have children.
?
Completely locked in. Not even a Baubyesque eyeblink channel. Eames didn’t cause the accident, of course (conspiracy theories notwithstanding), but without his meddling, Jóhannsson’s suffering would already have been over. Doctors – real doctors, not the crooked quacks Eames had installed at STH – would have pulled the plug and let the lost man leave.
Instead, Carl Jóhannsson awoke into a nightmare.
?
I’m filling in some blanks here, but his disembodied mind would have been shown a form of orientation video, via the VR sensorium (courtesy of Alternaut) connected to his neural interface (thanks to Legato and Lenona). “Hello Carl. This is your job: respond accurately and quickly to all incoming queries. When you do so, you’ll be rewarded with modest dopamine and serotonin bumps [from Shalunt Pharma]. When you fail to do so, or attempt to break out of your cage, or call for help, or do anything the sentinel software construes as non serviam, well… just… don’t fail to do so.”
?
Yes. We decoded the logs and got a good idea of what he went through.
?
You’d think so. I mean, why not just bypass psychology altogether and douse the relevant receptors in pure substance P and glutamate? But it turns out that adding detailed, narrative virtual experiences to the raw pain signals boosts compliance quite a bit. Torture is most effective when the brain and the mind both get involved.
?
Drowning; a lot of drowning. Carl almost drowned when he was a child. Fell off his father’s trawler in a storm. The developers remixed this trauma into virtual scenarios calibrated to maximally retraumatise the subject, and to convincingly recreate for him the howling, searing agony of breathing water.
That wasn’t the only pain on the menu, of course. There was also “sleep” deprivation (the system could easily induce artificial exhaustion), burning from fire and acid, snake- and spider-bites, compound fractures, stabbing, blunt-force trauma, panic, public humiliation, opioid withdrawal, migraine… all the hits.
?
Right. Guðrún and Jóhann, in their late eighties but sharp. Flew in from Ólafsvík as soon as they could, but the snatchers had already been busy. It’s a blessing Carl’s parents never knew that the sleeping son they cried over in the ICU was in fact a corpse, already sans brain, made to seem alive by a team of undertakers and special effects artists.
Eames met with the parents, did his best impersonation of a caring human, gave them false hope, and had them sign a form giving the Whiteheart crew free rein over what remained of Carl, under the pretence of providing free coma care.
?
Absolutely; what he told them needn’t have been a lie. The tech used to enslave Carl could just as easily – more easily, really – have given him a life perhaps worth living. He would never have seen with his eyes, heard with his ears or spoken in his voice, but he could have communicated with the outside world and the outside world with him. He could have continued to work, to think, to be human. But the goal of unlocking the locked-in – difficult, slow, bureaucratic, unsexy – didn’t offer the prestige or lucre or bragging rights that faking ACS, by Eames’ lights, did.
?
No, not at first. We discovered the incriminating details later from the logs and our own sleuthing, or forced Eames to reveal them by threatening to go public. His initial version of events was short on specifics: Carl shows up at STH, already beyond help; Whiteheart surgeons and technicians put his brain in the Crane-15; Eames personally delivers the Crane-15 to me at Ken Research.
?
Jasmeet and I didn’t trust him as far as we could throw him. While officially still working at Ken, after hours (and off-site) we pored over the evidence we’d uploaded. And we found some shit.
?
On the night before the heating malfunction, the Crane made requests to an IP address on the local area network. Though Carl’s mind was punished for every Internet request he attempted that wasn’t directly related to an incoming prompt, the sentinel software didn’t consider the local network out-of-bounds. We looked further back in the logs and found lots more requests to the same address.
?
Let’s jump back a bit. While spinning up Ken, Eames was struggling to complete the renovation of the seminary on time and under budget. So he cut corners. Specifically, he hired a friend of a friend to install the building’s HVAC. That friend of a friend hired God-knows-who to install a control system for it. It worked, just about, but the fly-by-night installer left the API completely unsecured.
?
Meaning, if you knew the IP address of the system, you could control it. No authentication, no password, no problem.
Carl found the IP address.
Carl committed suicide by the only means available to him. He cooked himself. First he inferred the API’s syntax by probing its endpoints. Then, late at night, after the last of the janitorial staff went home, he turned up the heat.
?
What made the Crane short-term possible also made it vulnerable. Lenona's innovations in miniaturisation packed the organic component's respiratory, excretory and filtration systems into two tiny chambers placed near the box's intake and exhaust grilles respectively. Air was pressurised before being diffused into synthetic blood, which was then pumped to the brain, providing steady oxygenation from less than a cubic centimetre of gas per second. The blood's return journey was also optimised, with waste products tapped, compacted and stored in a sump: the world's smallest septic tank.
It was a prototype. It wasn’t designed to be robust, or long-lasting.
Carl knew all this. The wetware’s tolerances were low, its operating ranges slim. He knew if he could keep the ambient temperature above 45°C for a few hours, it would fail.
With no senses, no limbs, and a virtual torturer breathing down his neck, he still managed to break out of his prison.
?
Believe me, I’d love to say yes. But to my shame, at this point I still nursed a plan of pressuring Eames into giving me the job he’d touted in the first place, researching AI applications in treating neurodegeneration. (Jasmeet had already walked out, by the way, disgusted and disillusioned. I should have followed him.)
?
I was reading Jóhannsson’s work. The few scientific papers I was equipped to follow, and the two books. Of those it was Rock Opera that caught my imagination. He was a great writer. He made antiquity visceral; explained that though the built environment has changed unrecognisably, the human condition is much the same now as it was three thousand years ago.
Anyway, there was a whole chapter on the ‘Sator Square’: a palindromic phrase in Latin – five words of five letters each – arranged in a square that reads the same every which way: down, up, left to right, right to left. Its origins are obscure, but carvings and depictions of it were ubiquitous in Eurasia and North Africa for millennia. Half the citizens of Rome had one of these things.
?
The Sator Square is similar to an acrostic, a poem in which the first letters of each line spell out a hidden message. And that got me thinking about the poem I had the Crane-15 write in my mother’s voice. So I read it again.
!
Yeah.
!
Yes. A rudimentary encoding, but one he predicted – correctly – that the sentinel’s algorithms wouldn’t be looking for. And if I’d been smart enough… if I’d been smart enough I could have done something. I could have spared Carl weeks of horror. If I’d paid more fucking attention… if I’d been less…
.
I know. I know. You’re right. So I kept digging in the logs and found out even more about what it was like for Carl in there.
The HVAC hack wasn’t his first suicide attempt. He’d tried several times before, and been punished for it. Double, triple helpings of artificial nociception. New virtual hallucinations, procedurally generated around his deepest fears.
?
Yes. That was what tore it, for me. Imagining his state of mind as he prepared that final attempt – knowing what was in store for him if he didn’t succeed. That was my limit, right there. Fuck Ken Research, fuck my own shameful aspirations, fuck Noel Eames and his toxic empire. So here I am, talking to you, trying to atone for what I did and what I didn’t do. Hoping it won’t all have been for nothing. Because that is not life.
?
No idea. He hasn’t been seen in public for months.
?
Me? No. Something else. I’m not sure what, yet. Something very different.
?
Because even if we never hear from him again, there are thousands of new Eameses, patiently waiting to pick up where he left off.
Something about the information age favours and nurtures raw ambition above all else. Fetishises it so that if you have it, you’re no longer obliged to be a non-horrible person. And your horrible acts are either forgiven, or recast as cynical edgelord virtue.
Because for those who see everything and everyone as raw material, wanting is sacred. But taking is god.