Like many fundamental assumptions about The Way The Internet Works, the idea that the things you put online stay online feels both arbitrary and inviolable. Once upon a time, the lovable nerds who built the first bulletin board systems decided that anything posted to them would persist on disk until the disk hit capacity, and that (as they say) was that.

It’s not hard to squint back in time and see why that decision was made. Until the internet came along, the vast majority of writing any of us encountered would have been formal writing: books, newspapers, magazines, and so on. Someone had to submit it, someone had to edit it, and someone had to publish it, and it made sense for something with that much effort poured into it to be rewarded with some level of permanence. When it came time to decide how digital writing would be stored, it’s not unreasonable that we opted for persistence.

Of course, informal writing—tweets, texts, emails, memes—doesn’t require nearly the same amount of effort. (I mean, it can, but it doesn’t have to.) And now that enough of us have been on the internet for a few decades, enough to be embarrassed by things we thought twenty years ago even if they were ultimately harmless, it’s high time to take a good look at our paradigms for digital memory and ask if they still make sense.


Did you know that every time we remember something, we’re actually remembering the last time we remembered it? Like a JPEG accumulating artifacts each time it’s opened and re-saved, each evocation of a memory distorts it just a little bit. I first learned about this in my late teens, and I remember (or maybe I only think I remember) being profoundly upset at the idea that my brain could betray me so easily. And yet, this gradual blurring of our mental images is what allows us to live our lives without succumbing to the overwhelming sharpness of our emotions. If I felt about my wonderful husband the way that 12-year-old me felt about the crush I’d never spoken to, I would never get anything done. Conversely, grief never quite goes away, but it does become more manageable, taking up a smaller slice of our field of view.

One of my favourite short stories is Ted Chiang’s The Truth of Fact, the Truth of Feeling. Set in a future where everyone carries around a video archive of every moment of their life, it’s told from the perspective of a journalist reporting on a new technology that renders that archive instantly searchable. In the course of reporting, the narrator discovers that a pivotal emotional event in his life did not, exactly, play out the way he remembered it.

The passage from the story that’s always stuck with me is this, near the end:

The idea that accounts of the past shouldn’t change is a product of literate cultures’ reverence for the written word. Anthropologists will tell you that oral cultures understand the past differently; for them, their histories don’t need to be accurate so much as they need to validate the community’s understanding of itself. So it wouldn’t be correct to say that their histories are unreliable; their histories do what they need to do.

Right now each of us is a private oral culture. We rewrite our pasts to suit our needs and support the story we tell about ourselves. With our memories we are all guilty of a Whig interpretation of our personal histories, seeing our former selves as steps toward our glorious present selves. [emphasis mine]

The narrator in Ted Chiang’s story declares that computer-assisted memory is here to stay, and we ought to embrace its potential for helping us craft more accurate personal narratives. I’ve always liked the idea that technology could help us become truer versions of ourselves, but these days that feels almost absurdly utopian. Few of us actively choose to frame ourselves as the heroes of our own stories; it’s just what happens. In the year of our lord 2020, our elected leaders have amply demonstrated how strong this instinctual myth-making is, how easy it can be to tell ourselves fairy tales about who we are even when faced with seemingly irrefutable evidence of our failures.


A few years ago, I went through months of internal debate trying to decide whether I should start deleting my old tweets. On one hand, the aftermath of Gamergate and the accelerating pace of online harassment had made it an entirely sensible safety precaution. On the other, it felt rude and uncouth to break threads and reply chains, removing records of meaningful conversations I had with other people from the semi-public spaces in which they took place.

The tension between an individual’s right to control their words and the protection of the communal experience is not unique to our current forms of social media. In any format of communication online, be it newspaper comments or message boards or web rings (remember web rings?), we have to deal with the question of data retention.

On a practical and harm-reductionist level, the question feels easy to answer. The internet of 1989 is very different from the internet of 2005 is very different from the internet of 2020, and there are myriad reasons why someone might want to erase their decades-old selves. As much as I would love to be able to freely trawl the web archives of my favourite thinkers, I don’t know what circumstances in their lives might lead them to want to scrub their histories: maybe they’re trying to avoid a stalker, or handle identity theft, or make it harder for overzealous law enforcement to track them down for exercising their freedom to protest. No one’s curiosity and education should supersede someone else’s safety, and no one is entitled to learn from anyone else’s experience, not even if that person had previously freely shared it.¹

On a philosophical level, it’s worth interrogating when and how maximizing the persistence of records became a first principle. The internet is built on hyperlinks, and studies have shown that the half-life of a link is about two years. That means that 50% of all links you click on today will be broken by August 2022. In a culture of abundance that says our lives are best when we have access to everything at all times, this is generally presented as a bad thing. But not all human communication is designed for permanence—we don’t expect to go to Starbucks and hear the ghostly reverberations of every conversation that happened there—and I don’t see why that should be different on the internet.

Our early creations on GeoCities and Angelfire would be entirely gone by now but for the diligent work of the Internet Archive, and I feel instinctively sad about that possibility. But did those sites not accomplish what their creators set out to do: give rise to an expression at a point in time when it was needed? Why isn’t it enough for something to have brought joy to the world for a little while, contributing to the culture at a specific point in time, before fading gracefully into the ether?


As inherently social creatures, we have always structured our behaviour, consumption, and company to some extent by what those things say about us. But I admit to being captivated by the way the internet has essentialized this performance into the personal brand, distilling it down to a handful of nouns on an Instagram profile. It was probably inevitable that the social internet would be framed around The Profile as an externalization of the identity we perform to the world. We want to be seen, and heard, and witnessed, and so we needed virtual selves to see and hear and witness.

The thing that’s pernicious about personal brands isn’t the performance, or not just the performance. It’s that our algorithmic overlords don’t make money off of our growth and evolution; they profit from consistency of engagement. A brand that changes is a brand that’s difficult to categorize and recommend. A person who changes is a person whose consumption patterns are difficult to predict and commodify. It’s not a coincidence that most recommendation engines show you more of the type of thing you’ve already asked for, which is also why Amazon keeps trying to sell you vacuum cleaners weeks after you’ve bought one. (I didn’t say they were good at this.)

When I deactivated my Facebook a few months ago, it was an incredible relief to no longer be reminded on a daily basis of people whose lives were no longer relevant to me. That sounds callous, but it’s also true. This wasn’t necessarily because I felt any kind of way about their milestones and updates, but simply because our paths had diverged. Some friendships are for a season, and that is both great and fine.

But Facebook is designed to pull you back into those lives, from annual retrospectives to “On This Day” memorials to increasingly pushy notifications about the longevity of your friendships. Anytime I saw a significant update from someone with whom I used to be close, I would feel this pang of regret for having let that connection fade, and wonder if I should attempt to rekindle something. But these fleeting concessions to nostalgia don’t change the fact that these relationships are fragments of the past, representing people whose absence from my life has little meaning other than that time passes and people change and sometimes they don’t change in the same direction.

My therapist told me once that trauma is your brain getting stuck at a fixed point in your past, continually running through the scenario to try and arrive at a different ending, because it doesn’t know yet that you’ve survived, moved on. One of the strategies for working through trauma is to visualize the rest of what happened, how you brought yourself to the point you are now, in order to convince your brain that it’s safe to let go. These ghosts in the archive—not just our personal histories on social media, but the streetview map that still shows a beloved home you had to leave, the 2 AM search engine rabbit hole—continually pull us back to stuck points in time. How do we carve out enough breathing room to let ourselves flourish and evolve in an attention economy in which our performances are never off and our programs never change?


The entire time I was working on this essay, I was listening to Taylor Swift’s surprise album folklore pretty much on repeat. Taylor Swift is the same age as I am, the same age as the world wide web, so it was striking to me how many songs on folklore are about events and people from her youth. They are frequently set against the backdrop of high school, and much of the album is spent revisiting themes she has already explored on previous albums.

One of the hallmarks of being a teenager is that everything feels permanent and unchanging. The smallest slight feels life-altering and world-ending, because we haven’t yet had the wealth of life experience that teaches us that this, too, shall pass. At her best, Taylor² has the ability to set this feeling to music, to take a polaroid of a point in time and pull out a sweeping universality that may not be true in fact, but is certainly true in feeling. But the stories we tell ourselves about who we are at thirteen are different from the stories we tell at twenty-two or thirty-one. As we worry away the sharpest edges of memory in our repeated remembering, we create space for new narratives and fresh interpretations in the light of the people we’ve become since then, letting go of heartache we were convinced we would never forget until we did.


The General Data Protection Regulation enacted by the European Union and the California Consumer Privacy Act are two of the broadest-ranging pieces of privacy legislation currently in effect. Both establish an individual’s right to request a copy of all the data they’ve ever submitted to a website or service, and to have that data erased. Broadly speaking, these rights are known as the right to be forgotten.

But the right to be forgotten is also the right to forget, to no longer be confronted by the mundanity of the past, and most importantly, to frame your own present. We are all slightly different versions of ourselves with different people, and we are all slightly different versions of ourselves at different times in our lives. The thing that made me decide, eventually, in favour of deleting tweets wasn’t just the threat of harassment. Rather, in a world in which people often meet your social media profiles before they get to know you, I wanted to be sure that they were meeting me as I am today.

Chiang is right that technology-enhanced memory is already here; we’ve been externalizing human memory since the invention of writing. The fact that technology inevitably changes our relationship to the mind is not a good or bad thing; it’s just a thing. My point is that the capacity for more memory does not intrinsically lead to more truthful stories, and when we argue for the sanctity of archives we need to make sure we are not elevating the preservation of the past above all else.

I am entirely aware that the right to be forgotten, like all rights, can be abused, that it can allow bigots to whitewash their past and hide their mistakes. I also struggle with how to reconcile it with the structural problem that our histories are written by the victors, or even more simply, by those who have time to write. But under the status quo, we are already casting individuals as the sum of all their passing public thoughts. The powerful wishing to scrub their pasts already have far more recourse to do so than those with less power, and so the question becomes not whether reinvention is possible, but who is allowed and empowered to reinvent themselves.

I am not arguing, now or ever, for less transparency, less information, or less knowledge. I am not even arguing that social networks ought to enshrine universal deletion policies, though I think the popularity of services like Snapchat and Instagram Stories has demonstrated that there is a desire and space for ephemeral connections. I am simply arguing for a little bit more intention in our curation. The pace of history is accelerating, and often it is being written even as we are still struggling with our place in it. We all need to be able to offer one another—and ourselves—a little bit of grace to muddle through our lives, a little bit of room to grow, even as history is unfolding around us.


  1. This is especially true as the internet expands beyond text to include inherently biometric data like voice and photos and video, but that’s a whole other 2,500-word essay.

  2. We’re definitely on first name terms.