I BOUGHT MY FIRST mobile phone, a cheap Philips model, 22 years ago, a week or so after I began a Master’s programme in Global Media and Communications at the London School of Economics.
It rewrote the contours of life as I had known it. The phone’s beeps and vibrations confirmed that people were thinking of me, in spaces where I previously wouldn’t have had the ability to know this—the toilet, the street, the classroom. Through the pressure of my thumb on its buttons, I was able to conduct a romance at the same time as writing essays in the library. It was magic!
The mobile upended hitherto extant constraints of space and time and inveigled its way into becoming part of my sense of self. Without it, I was less than with it. I went on to write an award-winning Master’s thesis titled ‘Mobile Matters: Mobiles Matter’ on how mobile phone usage had altered the structure of experience for users. Mobiles disrupted established, socially defined boundaries and regulations concerning the use of space—by going off in movie theatres, for example. They brought the public into the private—think chatting with a business associate while attending a funeral—even as they created private auditory spaces in public.
Moreover, the potential of being reached at any time and in any space led to the kind of hyperconnectivity that seems banal today but was not yet something we’d become desensitised to two decades ago. For many analysts of culture and technology, the mobile was a frightening beast, destined to whittle away at human autonomy. The fact that these fears existed long before mobiles were connected to the internet can make those anxieties seem laughable today.
In the year 2000, my phone spent the majority of its time in my bag. Today, it’s more akin to a glove on my hand. My children, because I have those now, are digital natives. My older son is accompanied by a permanent auditory soundtrack. Wherever he goes, disembodied voices waft from the phone in his hand/pocket. He is more likely to leave home without his clothes than without his phone.
Screens are no longer something apart from us. We live in them; we are constituted by them. Our smartphones, smart cars, smart watches and smart TVs are all so much smarter than us that we rely on them as extensions of our brains. Technology breaks down the dichotomies through which we have long understood the world: the lines between work and leisure, human and non-human, entertainment and study are blurred. Even the boundary between the mad, sad bag lady and the AirPod-wearing high-tech executive is unclear, as they walk down the street, each apparently talking to herself.
Like all parents, I worry about the potential zombification of my children. My boys actually watch other people playing video games on YouTube. This can feel truly egregious, in a “the-world-is-coming-to-an-end-look-at-today’s-youth” manner.
I was brought up short, however, by the explanation that the offspring gave me.
“Do you enjoy watching tennis, Mama?” they queried.
I do.
“And do you play tennis yourself?”
I don’t.
“So, what’s the difference between you watching someone else playing tennis on TV and us watching a professional Minecraft player, playing a game on the internet?”
Since we are on the subject of sport, let me extend the metaphor and state that I was clean bowled by that logic. It put me in mind of past moral panics around technology and children.
Almost a century ago, in 1935, the director of the Child Study Association of America noted of a new medium: “no locks will keep this intruder out, nor can parents shift their children away from it.” This was a reference to the radio, which was widely held at the time to be habit-forming, its effects on mental and physical health likened to alcohol addiction by leading paediatricians.
Rewind a couple of centuries and the novel, so beloved by today’s pedagogically minded parents, was associated with excessive risk-taking and immoral behaviour in readers. In the 18th century, young people were often diagnosed with reading addiction, alternatively described as reading rage or reading lust. This “epidemic” was linked to morally dissolute, promiscuous comportment, and even suicide.
The Sorrows of Young Werther, a 1774 novel by German great Goethe, for example, was considered so dangerous as an incitement to suicide (spoiler alert: the novel’s protagonist kills himself) that it was banned in Denmark, Italy and Leipzig.
While doing research for my Master’s thesis, I’d learned that panic about humans losing their free will to technology had existed around every breakthrough, from the telegraph to the typewriter. And long before these, when the alphabet itself was “new-tech”, someone as exalted as the Greek philosopher Socrates himself had been convinced that writing would “introduce forgetfulness into the soul of those who learn it,” degrading the capacity of humans to remember. Socrates believed that writing would give students the appearance of wisdom without the actuality of intellect.
In some ways, the rise of the mobile phone is a return to the primacy of orality over the written word. Books no longer have the cachet with the young that they had with my generation. My boys spell badly—because of spellcheck. And they would struggle to handwrite anything over a page; anything legible over a page is another matter altogether.
Yet they are, without doubt, more knowledgeable than I was at their age. They wield fearsome quantities of information on topics ranging from astrophysics to Roman architecture. My younger child is an authority on ancient Greek mythology. My older one basically knows everything. They have learned chess by playing against bots. Their piano playing has improved, thanks to access to YouTube videos.
Yes, their handwriting is sub-par. They don’t write letters. They find it difficult to do nothing. But perhaps we are in danger of romanticising all of these “losses”. There are certainly trade-offs, for there are pleasures in all of the analogue activities above, pleasures made keener by their difficulty when compared with the speed and ease of the digital world.
But are we truly in danger from our phones? Or is our preoccupation with screen time merely the latest point on the historic continuum of moral panics around new technologies? I would venture that, as is often the case, the truth lies somewhere between these two extremes. Technology has rarely been inherently good or evil, despite what the techno-utopians or dystopians would have us believe.
My feelings about new technologies are similar to those about the various countries I’ve lived in. It’s better to appreciate them for what they are, than what they are not. India in its place, as Japan in its. So, too, books in their place, as YouTube in its.
I will conclude by quoting Socrates in Plato’s Phaedrus, elucidating his scepticism about writing: “When it has once been written down, every discourse roams about everywhere, reaching indiscriminately those with understanding no less than those who have no business with it, and it doesn’t know to whom it should speak and to whom it should not.”
Something to chew on. But do keep in mind I wouldn’t have known any of this had it not been for Google!
About The Author
Pallavi Aiyar is an award-winning foreign correspondent who has spent the last two decades reporting from China, Europe, Indonesia and Japan. Her most recent book is Orienting: An Indian in Japan. She is a contributor to Open.