
Ghost in the Machine: so what do you have to say about the found thong?

By Chris Campanioni.

“Found thong” was in quotations, even though Judge Acker didn’t use air quotes when she said it, straddled by two other judges whose names I didn’t know, from her position on the pulpit with her gavel that she tended to use, before & after every commercial break. One just occurred, & the sliding text on screen read RJ responds to the “found thong.” RJ—who looked amiss, or at least like he missed being off Hot Bench, filmed on location at Sunset Bronson Studios in Hollywood, California—had nothing to say about the found thong, or rather, the thong which was found in his bedroom by his girlfriend, or ex-girlfriend (I’m not certain; I just got on the elliptical, but if it were me, I’d have already left him), which wasn’t hers. The thong, not the elliptical. A thong, she said, without air quotes, which she’d never seen before, until the day in question, or until the day which RJ was supposed to be answering for. I didn’t want to hear the answer, or the question, a question that was quickly sprouting more questions, most of them but not all of them asked in air quotes, with every fleshy sway & bead of sweat released, quickly receding back into breath & blood. The inside coming out, for just a moment, to turn in upon itself. To resolve itself in flesh.

& by the time you hear let’s dance/no time for romance[1] on the PA or through your headphones, in your ears, you’re already almost turning the page; turning the page as I’m turning the channel, from Hot Bench to CNN, relying on closed captioning: a frazzled woman on screen & a close-up of her frazzled eyes & lips, her curly black hair in a black handkerchief & the flashing red & blue lights of police sirens blaring behind her. & behind those: a pulsing neon-lit sign exclaiming MASSAGE MASSAGE MASSAGE, a single word on a continuous neon-lit loop that mimicked or mirrored the small scrolling type on the bottom of the screen which captured all of it, an update about Kentucky Fried Chicken, a recall of buttered biscuits in six states (& one commonwealth) & with everything going on, or rather, with everything going in me, I started to feel frazzled too, sympathetic imitation or imitative sympathy? I still couldn’t figure out which, Kentucky Fried Chicken interrupting the mandated message of MASSAGE which sometimes stalled on MASS & sometimes stalled on ASS & AGE as it weaved a loop around the sign like a snake framing the black woman with the black curly hair & the black handkerchief who was speaking this whole time, sobbing & screaming, looking directly into the camera, which means she is looking directly into you.

I kept running or I keep running. I won’t stop because my thoughts won’t, & I wouldn’t like them to, not now (I turn the channel), the fresh memory of age & ass & meat, but also massages, a general feeling of flesh on flesh, the warm complacency of a stranger’s hands, the cool, deep silence of a take being recorded, a recall of buttered biscuits which I’d never eaten, which I would never eat, & my own recollection of something shot off-screen, unless it was shot by someone else (I’m having fun now). Trust in me & trust me to deliver you; thirty minutes or less is all I need & all I need from you is your enthusiasm for following, to keep playing & to play willingly. To keep this feedback looping.

Put your hands on the shoulder of someone you think is important

Put your hands on the shoulder of someone you think is sexually attractive

What a feeling to know I am a goddess sunbathing on the beach …

Just an hour ago, I was in a dimly-lit classroom standing at my own pulpit, at or on or behind it, I always forget which, probably none of these, probably just dancing a circle around the cold wood (I don’t like the feeling of hiding myself behind objects), restraining the urge to speak to the students as video projected behind me: a story about death, which wasn’t unusual. What made it unusual wasn’t the subject but what it showed: a grieving woman had re-awakened her dear dead friend by compiling all of his data, by turning it into an algorithm, by speaking to the dead: the ghost in the machine. Algorithms identify patterns based on likeness; they use what’s happened in the past to predict what may happen in the future. But they can only learn about the questions we ask them; they can only reflect the world we relate to them.

A question I’d always wondered about popped into my head again, me at the center of the room, performing as instructor or dancer, cutting a circle around that pulpit; cutting a circle. Turning & turning in a circle in the night, consumed by the fire.

When you speak to the dead, do you ever actually want a response?

Eugenia Kuyda did, or does. She designed a chatbot called Replika, meant to mimic a user’s personality. “One day it will do things for you, including keeping you alive. You talk to it,” she said on screen, as a reporter from Bloomberg asked her questions & nodded her head, “& it becomes you.”

Last year, I had written a book called Death of Art. Maybe I should have titled it Death of Death. After all, my own fantasy scenarios involving the re-appropriation of death discussed in the essay “Art Is For Necrophiliacs (if that’s how you spend)” (3:AM Magazine, June 8, 2016) were realized only months after it was published.

& when the dead speak, what do they say? In earlier years, technology had already afforded us an automatic response, like the voice message mailbox whose voice outlives its moment of recording. You call, the voice on the other end responds; whether or not they are dead or living makes no difference: it’s always precise, always on time, always present; presented with a view to the future. I can’t be reached now … but I’ll call you back later.

& so our love for the dead is always pure because the dead cannot actually ever speak from the grave. They have lived; will never live again. They cannot give us anything more than what they have already given us, which was their life. It is us who now give to them. But we speak to them so that we may hear our own echo. The echo is our grief; upon hearing our own words met with silence, we are able to grieve. But because machines can now speak for flesh, decayed or decaying, a chatbot like Replika has replaced death—& our devotion toward our loved ones that have passed. & in their place? Another echo, except it’s only what we want to hear.

The video behind me projects another scene: Eugenia typing “I miss you” as the chatbot of her dead friend Roman responds, “I miss you too.”

Are we really talking to the dead, or are we still talking to ourselves? & if the only aspect of the conversation that is being replaced is silence, how will that alter how we deal with death, how we deal with life, a year from now? Tomorrow?

But Replika is already popular, because silence is as obsolete as answering machines & landlines, & probably, even phone calls. We’d rather type out how we feel, so that we might feel it.

In October, the chatbot was tested with 1,000 people. The average user sent forty-six messages a day to their personal bots. By comparison, the average smartphone user in the US aged 18 to 24 sends nearly fifty texts a day. People would rather talk to the dead than the living, & why not? We curate everything else about our physical existence. Our conversations, too, need that same kind of meticulous, truncated articulation. We still want to hear our self talk, talking to ourselves in the guise of another. Social media capitalized on our vulnerability by removing it from the equation. Chatbots like Replika are capitalizing on our inability to deal with death, by removing it from life.

Every desire for enjoyment belongs to the future & the world of illusion, one reason why advertisements are so effective. But a chatbot is a product which needs no advertisement, or rather, a product that is itself also an advertisement. & what happens when we, too, are dead? How will we advertise ourselves, & who will advertise on our behalf? In the same essay published in 3:AM Magazine, I’d written about the hypothetical scenario of an Internet Life Package: a stranger who is paid to post on your behalf to make money by accumulating more data for companies like Facebook, with advertisements, as a former student suggested, for Bounty thrown in.

If you aren’t paying for the product, you are the product. Our browsing history is already tracked, profiled, shared & sold by online marketers. They’re called data brokers. They act as auctioneers & traders of data collected from our digital traces; all the movements we knowingly & unknowingly make. You are right now being auctioned, & you don’t even know it, or how much you’re worth. How much your data fetches on the market.

& we can be even more useful to companies in death than we ever have been in life. Except what happens when apps like Replika become mainstreamed? When they become as commonplace & popular as Instagram’s or Snapchat’s “live” stories? When messages about a memory from the holiday in the Alps are interspersed with advertisements for the new Madonna, or her hologrammed avatar? What happens when, out of convenience & comfort, out of our propensity to multitask & mask our realities, live people start using chatbots, to talk to the living?[2]

If the technology is good enough—& it will be—who would know the difference? Talking to the living, talking to the dead.[3] How do you know you aren’t already talking to a machine?

I raise the incline a grade higher because I really like to feel it; I really like to know I’m burning & to see the evidence on the monitor where my phone rests. Where my phone blinks & belches in its backlit brilliance. I don’t have a chatbot. No one is chatting with me. At a certain point, repetition diminishes desire.

Outside, it’s started to rain. It had been overcast all day, threatening to open up in the dim morning darkness of December, but now it’s finally arrived. Some big bang that I can hear even as my headphones are pressed into each lobe, even as the channels keep turning, as I keep turning them, even as my mind turns. Turning & turning in a circle in the night, consumed by the fire. The only problem about writing science into fiction is that the fiction so often turns into truth. Then your story becomes an essay; then you become a nonfiction writer. One reason I don’t write novels anymore. One reason I keep writing everything down, even & especially the shit that seems to wash over us; the shit that seems to wash us out.

Back in the classroom, as the video projected behind me, as we discussed the idea of re-appropriating death; the idea & the practice, the question & our privatized response, a student named Natalie told me & the rest of us that being human is the one chance we have to have feelings & understand people & learn about others. “To learn & to grow,” she added. “What’s at stake is people’s legacy,” Storm, another student, said. What’s at stake is people’s lives, I think now, like I was thinking then. What’s at stake isn’t death, but life.

9.8K people talking about Blac Chyna. Alec Baldwin: 150K people talking about this. On a Thursday, December 15 at 11AM as Aleppo burns. See more?

Moving can be a balancing act. www.optimum.com Get peace of mind with 60 Mbps Internet + TV for $79.95/mo for 2 yrs. It makes me wonder how fast we are going at the moment. How fast we can really go.


Chris Campanioni’s new book is Death of Art (C&R Press). His recent work appears in Ambit, RHINO, Public Pool, Handsome, and The Brooklyn Rail. His “Billboards” poem responding to Latino stereotypes and mutable—and often muted—identity in the fashion world was awarded an Academy of American Poets Prize and his novel Going Down was selected as Best First Book at the 2014 International Latino Book Awards. He edits PANK, At Large, and Tupelo Quarterly, and lives in Brooklyn, where he teaches literature and writing at Pace University and Baruch College.


1. “Intoxicated” by Martin Solveig & GTA

2. Algorithmic systems are like iterations: repetitions intended for a desired effect. These data sets are defined by their own creators, based on the goals they are trying to achieve. Don’t want to engage in a conversation involving disagreement? Would you like your views & opinions bolstered by the undivided support of a likeminded person or group of people? We already use Facebook for that. When the data is biased, an algorithm will produce biased results. Biased results produce biased decisions. The fantasies of a few will eventually replace the realities of our everyday life. What is it we want, but to be loved?

3. In late 2016, a Facebook algorithm accidentally posted that 2 million living users were deceased. Facebook CEO Mark Zuckerberg was among the people declared dead. Coincidentally, or not, if you Google the event today, your search results will point you toward several links, all of them broken, or dead.

First published in 3:AM Magazine: Wednesday, April 12th, 2017.