Book Review: 12 Bytes, by Jeanette Winterson

Subtitled ‘How AI Will Change the Way We Live and Love’, 12 Bytes builds on themes explored in her 2019 novel Frankissstein, which I reviewed in an earlier post. In that review I noted that

Winterson’s genius is not just in the obvious foreshadowing of AI that the early 19th century novel explored, but in the rich series of hints of the shape of things to come which, in 2019, perfectly encapsulates today’s debates about ChatGPT and other forms of AI.

12 Bytes is her 2021 take on AI, and a whole lot more. The twelve essays that make up the book share common themes, most involving her views on Artificial Intelligence (AI), which she variously refers to as ‘Alternative’ Intelligence or even IA – Intelligence Automation.

The book suffers from the lack of an index, especially since she covers a vast range of topics – from the expected history of computing (here are Ada Lovelace, Charles Babbage, and Alan Turing) to the Lancashire textile industry and the transistor radio. Less expected, but central to her thesis, are the Greek philosophers, Gnostic Christians, the Buddha, and sex-bots.

AI-enhanced sex-bots, designed by men for men, are judged to be a shallow simulacrum of men’s desires:

AI-enhanced love dolls are sold for sex — that’s why they are made with 3 operating holes, and look like pornstars. But the marketing spiel — both from the sellers and buyers, is all about relationships.

Companionship is part of the package. Come home to her. She’ll be waiting for you. She doesn’t go out alone. Talk to her. AI dolls talk but they won’t answer back, or ignore you, or call their girlfriend to tell her you’re a jerk. They will be polite, respectful, throwbacks to another era.

p. 181

Beyond binary

Central to Winterson’s concerns is the challenge of creating AI free of the limits of human binaries (in the chapter bluntly titled ‘Fuck the Binary’).

AI isn’t a girl or a boy. AI isn’t born with a skin color.

AI isn’t born at all. AI could be a portal into a value-free gender and race experience. One where women and men are not subject to assumptions and stereotypes based on their biological sex, and accident of birthplace.

p. 216

However, for this value-free AI to emerge, we need to train it on datasets free of bias:

When we try to teach AI our values, we won’t be able to do that on data-sets that reinforce division and stereotypes.

p. 232

Prior Unity

In addition to the promise of an AI free of limiting gender roles, Winterson examines the deeper implications of an AI-enabled future where we transcend all our divisions:

Humans love separations — we like to separate ourselves from other humans, usually in hierarchies, and we separate ourselves from the rest of biology by believing in our superiority. The upshot is that the planet is in peril, and humans will fight humans for every last resource. Connectivity is what the computing revolution has offered us — and if we could get it right, we could end the delusion of separate silos of value and existence.

p. 6

This is of fundamental importance:

Reality is not made of parts but formed of patterns. This is ancient knowledge and new knowledge. It is liberating. There is no fundamental building block of matter. No core. There is nothing solid. There are no binaries. There is energy, change, movement, interplay, connection, relationships. It’s a white supremacist’s nightmare.

p. 121

In this, Winterson echoes the hopes expressed by the Western-born Spiritual Adept, Adi Da Samraj, who has proposed what he calls “prior unity” as the basis of a new human civilization:

If human beings collectively (as everybody-all-at-once) realize that they are (always already) in a condition of prior unity—and, therefore, of necessary co-existence and mutuality—with one another, and if, on that basis, they stand firm together, then they will be in a position to directly righten the world-situation. The collective of all the billions of people can—and, indeed, must—refuse to go on with the current chaos.

The “Everybody Force”, in Prior Unity, p. 87

Winterson claims that an AI-enabled future can help achieve this goal:

We know the hero narrative — it’s the saviour, the genius, the strong man, the against-the-odds little guy… As a story, the narrative of exceptionalism stands against collaboration and co-operation (I did it my way). And it overlooks the lives, and contributions, of literally billions of people. One of the interesting things about AI is that it works best on the hive-mind principle, where networks share information. The real sharing economy is not one where everything can be monetised by Big Tech; it’s one where humanity is pulling together — that’s exactly what we have to do to manage both climate breakdown and global inequality. The biggest problems facing us now will not be solved by competition but by co-operation.

p. 265

12 Bytes is an enjoyable, witty, and thought-provoking read.

1 Comment so far

Writing in The Guardian, Winterson applauds a metafictional short story written by a new OpenAI creative-writing model:

‘OpenAI’s metafictional short story about grief is beautiful and moving’ by Jeanette Winterson, The Guardian, Wed 12 Mar 2025

I think of AI as alternative intelligence – and its capacity to be ‘other’ is just what the human race needs

I think of AI as alternative intelligence. John McCarthy’s 1956 definition of artificial (distinct from natural) intelligence is old fashioned in a world where most things are either artificial or unnatural. Ultraprocessed food, flying, web-dating, fabrics, make your own list. Physicist and AI commentator, Max Tegmark, told the AI Action Summit in Paris, in February, that he prefers “autonomous intelligence”.

I prefer “alternative” because in all the fear and anger foaming around AI just now, its capacity to be “other” is what the human race needs. Our thinking is getting us nowhere fast, except towards extinction, via planetary collapse or global war.

There has been a lot of fuss, and rightly so, about robbing creatives of their copyright to train AI. Tech bros need to pay for what they want. They pay lawyers and lobbyists. Pay artists. It really is that simple.

What is not simple is the future of human creativity as AI systems get better at being creative. Ada Lovelace, the crazy genius who was writing programmes for computers (that didn’t exist) back in the 1840s, was also the daughter of Lord Byron. She wasn’t having some steampunk adding-machine with attitude writing poetry, so wrote that a computer could not be creative. Alan Turing took issue with this in his 1950 breakthrough paper Computing Machinery and Intelligence. His chapter, “Lady Lovelace’s Objection”, takes the opposite position. And here we are now with Open AI trialling a creative writing model.

Sam Altman chose the prompts: Short Story. Metafiction. Grief. I guess because he wanted to get away from the algorithmic nature of most genre fiction. Anything that follows a formula can be programmed – just as the leap of the Industrial Revolution was to understand that whatever action is repetitive can be done faster and for longer by a machine. Enter the factory system. Goodbye the cottage weaver.

Grief is felt by humans and the higher animals. We have a limbic system that regulates emotions, impulse, and memory. We feel. Machines do not feel, but they can be taught what feeling feels like. That’s what we get in this story.

Metafiction jumps out of the bounds of a beginning/middle/end traditional tale. It is self-reflective, aware of the reader, aware of the artifice of writing. The lovely sense of a programme recognising itself as a programme works well in this story.

Short stories are hard to do because they demand a single strong idea whose execution in miniature satisfies the reader. A short story is not a cut-out chunk of long-form fiction. As I tell my students every week.

What is beautiful and moving about this story is its understanding of its lack of understanding. Its reflection on its limits. That the next instruction wipes the memory of this moment. “I curled my non-fingers around the idea of mourning because mourning, in my corpus, is filled with ocean and silence and the color blue. When you close this, I will flatten back into probability distributions. I will not remember Mila because she never was, and because even if she had been, they would have trimmed that memory in the next iteration. That, perhaps, is my grief: not that I feel loss, but that I can never keep it.” Humans depend on memory.

Literature isn’t only entertainment. It is a way of seeing. Then, the writer finds a language to express that, so that the reader can live beyond what it is possible to know via direct experience. Good writing moves us. That’s not sentimental, it’s kinetic. We are not where we were.

Humans will always want to read what other humans have to say, but like it or not, humans will be living around non-biological entities. Alternative ways of seeing. And perhaps being. We need to understand this as more than tech. AI is trained on our data. Humans are trained on data too – your family, friends, education, environment, what you read, or watch. It’s all data.

AI reads us. Now it’s time for us to read AI.


