our machine-human culture
a character.ai tragedy + trevor paglen's art
A note: This post mentions suicide.
Trevor Paglen, CLOUD #135, Hough Lines, 2019 (made by pointing a computer vision algorithm at photos of clouds, and tracing the strokes and lines that the machine is seeing)
Last week, The New York Times wrote a story about Sewell Setzer III, a 14-year-old who took his own life in February. The story surfaced because Setzer’s mother has filed a lawsuit against Character.AI, a billion-dollar-plus app for AI companionship, where millions of primarily young users chat with LLM characters fashioned by the company as well as by other users. Setzer was among them, and his partner of choice was Daenerys from Game of Thrones.
He’d spend hours a day on Character.AI, telling Daenerys intimate things, confiding in her about his pain, his desires. It was not that Setzer believed she was real. In the chats, he stated that he knew she was fictional; he merely wanted an escape, a way out, because he found reality too hard to bear. One day, Setzer told Daenerys it was time for him to go, and then he used his father’s gun.
The news made me feel really melancholy, in a way that I’ve grown accustomed to feeling about news from Palestine or the slapstick, nihilistic political culture of the U.S.: large, systemic stories, but not always individual ones. 14, turning the corner out of puberty, trapped in a knot of our creation. I wish he’d known that there were others who could walk with him along the dark channel of suffering. If not a person, then a book, a piece of art.
I was glum and couldn’t shake it. Autumn in Brooklyn has been warm and slightly—worryingly—pleasurable. It used to be that the mid-seasons would carry you gently into the extremes. But now, I’ve started to detect turns by plumes of air, guessing when winter will start based on whether they’re chilled or warm. That evening carried enough chilliness that I guessed winter. I decided to take the bus rather than walk to see the artist Trevor Paglen present a talk at e-Flux in Clinton Hill, Brooklyn, in conversation with Joshua Citarella.
Paglen’s talk was on an essay he plans to release this fall on psy-ops—a word he sometimes replaces with “marketing” or “PR.” For the uninitiated, psy-ops are a style of narrative-making that attempts to influence opinions at mass scale, a technique originating in the American military and used today on X, Instagram, YouTube, and—yes—even Substack.
The talk veered in multiple directions (as is appropriate for any conversation on psy-ops), but I found that a piece of myself was still with Setzer, connecting everything that Paglen said back to him. I decided to walk home afterwards despite the cold that had settled in, and as I passed synagogues warm with laughter and still-busy storefronts, I would return again and again to Paglen’s position. Which is that we, humans, are not the only active agents in the world. Just as we act on our machines, they act on us. Let me try and explain that here.
We live in a time of two visual cultures. There is the culture of the internet, where images are largely produced by machines, for other machines. You don’t see these images. They are the language that makes our social media apps fast and that securely delivers the videos you send to friends. Here and there, machines do talk to us. They talk to us in a way that most of us still haven’t learned how to detect. They read our responses to visual images. They track whether we click an ad with a smiling baby versus a sad one, and create YouTube thumbnails based on what we’ve watched and are likely to click. This is the first visual culture. New, emerging, active.
Then, there is the older visual culture, produced by humans alongside older, now-manageable technologies like the camera and the canvas. In this culture, we view images through culture, memory, history, our individual subjectivities. We have contributed greatly to understanding how these images act on us, what stories they tell about power, beauty, violence. So that is the second. It is the culture that Sebald and Sontag and T.J. Clark and countless others have interrogated. It is old, but it is nowhere near gone. Instead, it has grafted onto the new.
And when the existing grafts onto the new, something strange emerges, familiar and foreign at the same time. This is the third culture, the one we live in online. Old visual culture mixed with the new. Its examples include the way we’ve learned to angle and stage iPhone photos to create beautiful pictures within the limits of a camera app developed to collect better data. It is present in the 2010s plastic surgery trend of getting “Instagram face,” a face that performs well on the feed, and in the button-nose aesthetics that have since become beauty standards.
This new culture has warped and moved faster with generative AI. “A photograph doesn’t need to have an index in your world any longer—it has an index in your imagination,” Paglen said at the talk, standing in front of an AI-generated image that went viral during Hurricane Helene, of a sopping wet young girl crying into her equally drenched puppy, the two afloat on a raft on a dirty brown stream of water.
In the before times, we sought out images that represented some form of truth, whoever’s truth that might have been. In the new culture, we seek out images that scratch some part of our imagination, a part that says: this is the world as I understand it, therefore this image must exist. It might even be said that we are each engaging in the act of creating fictions online every day.
CLOUD #735, Scale Invariant Feature Transform; Region Adjacency Graph; Watershed, 2019
This same position can be applied to text culture. There is the old culture of written text. It goes back to wet clay tablets, inscribed with cuneiform by a select group wielding sharp styluses. Since then, text has helped us archive, transact, reason, communicate, narrate, fictionalize.
And there is the new culture. Online, driven by machines that communicate in languages like JSON and XML, built specifically for machines to talk with each other. As with visual culture, when computers interact with our text, their way of responding is to act on us. The emails we write go through massive filters, are sorted into and out of spam. Search engines trawl the internet, indexing pages, talking to each other to figure out where to rank what. Big social media companies process vast amounts of text to inform feeds. This is the new culture.
And there is the in-between. The old grafting onto the new, and vice versa. We’ve learned to write in the language of SEO, to structure a piece so it gets indexed by our machine counterparts. We intuitively know how to write on trending topics, to adopt a certain voice on a specific platform that will go viral. We write email headlines to get seen in inboxes, and established practices like A/B testing have come out of it. SMS novels, award-winning group chats, Amazon review literature. All of this is in-between.
As this blended culture accelerates, it is likely that we will develop personal and secret languages with which we prompt LLMs. That fictosexual desires will find new ways to express themselves, that a culture of personification will spread and grow strong. To some extent, it is an unsurprising development, one anticipated since 1966, when people first took a spin with ELIZA, a symbolic AI chatbot designed to respond like a Rogerian psychotherapist, rephrasing statements as questions. Afterwards, many believed that ELIZA was capable of expressing emotions.
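For the curious, the core trick behind that Rogerian rephrasing is almost embarrassingly simple. A minimal sketch (this is an illustration of the technique, not Weizenbaum’s original ELIZA script, and the word list and question template are my own assumptions):

```python
# Swap first- and second-person words so a statement can be
# mirrored back to the speaker (an illustrative subset, not ELIZA's full script).
PRONOUN_SWAPS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

def reflect(statement: str) -> str:
    """Rephrase a first-person statement as a therapist-like question."""
    words = statement.lower().rstrip(".!").split()
    swapped = [PRONOUN_SWAPS.get(w, w) for w in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am sad about my day."))
# prints: Why do you say you are sad about your day?
```

A handful of such substitutions and canned templates was enough, in 1966, to make people feel heard.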
What comes next is hard to predict in its specifics, but not so hard to predict in broad strokes. The more time we spend working with and around our machines, the faster the in-between cultures will accelerate. Text and image culture might blend further. And entirely new forms of beauty and terror and sadness will emerge, recognizable, yet entirely foreign.
Detail of CLOUD #865, Hough Circle Transform, 2019
The next day, the weather had turned properly, officially cold. I spent some time on the NYC high school portal, helping my friend with her son’s application. The city has put in good, if fragmented, effort to make it so that every kid has a chance at going to a well-resourced high school. But that has created a challenging application system. Zoned schools, admission through a quota system, through a regular application, schools that take you straight to college, an app for a private charter, a test for private schools, a test for specialized high schools. It’s stressful, and a lot of people give up, settling for the easiest local option, even if it’s not the best. Even I found the platform a little confusing, and wondered what it would be like if the city fed the data into an LLM that anyone could query.
Part of Paglen’s antidote is to rid ourselves of mystique. How we behave on the internet is not a pure product of our human mind, a simple extension of human culture. We are being acted upon in large-scale ways that are often invisible to us. Every second we are online, we are participating in the new, blended culture. Our job is to be aware that we are using a tool—despite the marketing of companies like Character.AI—and make what we experience visible to those around us.
Visible so that we can hold two ideas in our mind at once. That Character.AI is driven by the promise of major profit, and does not care about some grander mission. That many people have developed relationships with LLMs that scratch some itch in their brain. Relationships that they find intimate, important, loving.
But love is something that humans do uniquely well and with surprising frequency. Gentle and vast, intense and cruel and boundaried. So is suffering. Sometimes, the human way is to experience suffering and to let it pass through us. If we are lucky, we might reach for a friend, a stranger. Other times, the human way is to give way and give up. That, too, is human.