The emergence of Large Language Model (LLM) technology and generative AI chatbots has had major repercussions across industries around the world. Software engineers wonder whether ChatGPT will soon write code more quickly and accurately at a fraction of the cost. Writers are afraid on two counts: first, that their copyrighted works will be fed free of cost into the all-consuming LLM machine, and second, that the resulting outputs will soon put them out of business. Generally speaking, there is a lot of mistrust and misinformation in the mix, along with a host of genuine concerns and open questions about the way generative AI is going to change the world.
It was only a matter of time before a serious, well-regarded, tech-savvy writer produced a book-length meditation on the ways in which we interact with these technologies. Vauhini Vara's new essay collection Searches: Selfhood in the Digital Age is that book. Forty-three-year-old Vara is the author of two previous books: the novel The Immortal King Rao (2022), a finalist for the Pulitzer Prize, and the short-story collection This Is Salvaged (2023). Vara covered Silicon Valley for The Wall Street Journal for a decade before working at The New Yorker, so she has been observing and writing about this world for a while now. With Searches, Vara turns her gaze on all of the interconnected ways in which we use technology not just to affirm but, in some cases, to shape our very identities.
The genesis of the book lies in two essays, both included here: the titular essay 'Searches', published in 2019, and the essay 'Ghosts', published in 2021. The titular essay is written entirely as a sequence of Google searches, one after another. Ranging from the utilitarian ("what happens if you take too many puffs of an inhaler") to the tragic ("what to send when a child dies") to the existential ("what do people need to live"), this deceptively straightforward essay will change the way you look at the humble Google search. 'Ghosts', on the other hand, is written as a series of dialogues with GPT-3 (one of ChatGPT's predecessors), in which Vara asks the chatbot to produce texts describing the life and death of her older sister, who died of Ewing's sarcoma (a form of cancer affecting the bones and soft tissues) during Vara's undergraduate years.
“When I wrote those two pieces I don’t know if I was thinking at a particularly conscious level about the intellectual goal of those exercises,” Vara recalled during a video interview. “With time I realised that what I was doing with those pieces was trying to engage with what I see as a fundamental problem with technology companies. What makes these companies’ products so compelling for us is often tied up with the ways in which these companies exploit us, exploit other people, exploit the planet as a whole. Just like there’s something worth investigating in the way people talk to each other, there’s something worth investigating here: how we communicate with tech companies through their products.”
Chances are you will end up reading and re-reading 'Ghosts' a few times, like I did. Not only is Vara's own writing brilliant and deeply affecting, but her conversations with the AI unfold before us step by step. As Vara feeds in more of her own genuine recollections, the AI comes up with increasingly elaborate fictions, describing moments of sisterly warmth and conjuring memories that never happened in the first place. Here are some of Vara's own lines about her sister:
“Here I should conjure my sister for you. Here I should describe her so that you feel her absence as I do— so that you’re made ghostly by it, too. But, though I’m a writer, I’ve never been able to conjure her. I remember the same small set of details: her loud laugh; her bossiness and swagger; her self-consciousness about her broad nose, her curly hair. But even this isn’t fixed. Her hair fell out. Her nose narrowed. She began moving slowly and carefully; we’d go down to Clarke Beach that spring when she was dying—she wanted to show us where to spread her ashes—and when we walked back up, I’d have to put a hand on the small of her back and push her.”
Interestingly, in response to lines like these, the generative AI produces several fictions: a writing instructor who enters Vara's life offering her solace, or a memory of Vara holding hands with her sister at an intersection (neither of which happened in real life). Finally, confronted with an extended version of Vara's essay, the technology gets caught in a glitchy loop, going on and on about a ghost hurtling through space in a spaceship: a fascinating example of a generative AI "hallucinating".
"I'd question the word 'hallucination' a little bit," Vara says. "It is indeed the commonly understood term for when an AI-based product generates text that is inaccurate. But I feel like the term has caught on because tech firms are always interested in making their products seem not that different from people, and the word 'hallucinations' primes you to think of it as, 'Oh, it's just a person who is not in their right mind'. Whereas what has actually happened is a tech glitch, one that, at the moment, appears to be fundamental to these products."
The other essays in the book take varied approaches to the human-technology interface. The concluding chapter, for example, is essentially crowdsourced: it consists of responses from around the world to specific prompts like "What do you know about the lives of those who raised you". Another chapter sees Vara using Google Translate and other tools to give us two versions of an essay: the original Spanish-language essay she wrote while learning the language, and a Google Translate English version. Both texts are thus midwifed into existence by technology, and the reader's ideas of 'authenticity' and 'originality' are elegantly challenged.
"I think what the Spanish-English chapter did was remind me (and hopefully, the readers) that ultimately, the meaningful thing about communicating, whether you are a professional writer or not, is the effort itself," Vara recalls. "Ultimately, I couldn't communicate what I wanted to, but there was something effective, something poignant about the failure. The fact that I tried so hard, that I was vulnerable, that I embarrassed myself... that's the meaningful part. What makes that chapter (hopefully) valuable to readers isn't the role played by Google Translate but the effort that went into all of it."
When readers, especially students and professors of literature, finish Searches, much of their attention will of course go to the technology underpinning the text. But what is even more impressive about the book is that its digitally produced mini-texts end up resembling very old literary structures and conventions. The profusion of voices in the last chapter can be read as a kind of Greek chorus, while 'Searches' can be seen as a demonstration of the dialectic method. The GPT-produced interludes between the chapters, which reflect on the book's structure and style, will make writers of metafiction smile in recognition.
“Part of the reason some of these technologies feel familiar is that they are well-versed in the forms that you and I are familiar with,” Vara says. “ChatGPT is part of a linguistic continuum. The thing that has been true throughout history is that we have always used language as a tool, for communion, for understanding, for liberation. And one of the most interesting and exciting things about language is that it can be used to serve the goals of the powerful as well as the goals of the powerless.”