the internet: like an electronic elephant

This morning Slashdot linked to a piece over at Ars Technica quoting Viktor Mayer-Schönberger of Harvard's Kennedy School (whew!) on the subject of digital forgetfulness. The problem, he says, is that anyone participating in online society will accumulate an ever-larger tail of embarrassing cruft. It'll be trivially easy for us to confront one another with beer-belt pictures and Inuyasha fanfic written at age fourteen. "Gotcha!" will move out of the realm of politics and into the office.

I can already see my friends' concern over this manifesting itself. I think their (partial) blogospheric exodus toward Facebook is motivated, to some extent, by these sorts of worries. Clients and coworkers read their sites, and they don't like the constraints that imposes. No offense to the clients and coworkers reading this, but I don't always like it either — it's disappointing to feel like there are limits on your personal writing. I'm unwilling to flee to Facebook myself, but that's just my own hangup: if your online creative output includes technology, the site feels like a straitjacket (and, of course, its founder is a thief, which makes my web-developer self loath to endorse it in any way). But I can understand why others would want to.

Still, fleeing to proprietary communities is just a stopgap measure. My completely neglected Facebook profile currently has at least one friend request from a client waiting. Of course I could decline to approve it, or grant limited access to my profile (so I'm told, anyway). But what are the social implications of doing that? Is it considered a snub? If it isn't, how long until it will be? No, it seems inevitable that your activity in any given online community will eventually become part of your publicly known personal history, limiting the sorts of ways that you can comfortably express yourself.

Admittedly, all of this is sort of peripheral to Mayer-Schönberger's point. Social circles will no doubt continue to flee across the internet as the grown-ups (so to speak) encroach on them. Staying one step ahead of your professional contacts is, by and large, a viable strategy for not poisoning your work relationships by exposing your horrible true self.

Mayer-Schönberger doesn't seem concerned with these ongoing public/private struggles. Rather, he's worried about the potential for finding embarrassing information about a person's history at a single given point in the future. It's bound to happen: there's real utility to be had by, say, exposing Facebook information to a search engine. If Google refuses to do it, someone else will. And of course there are plenty of other sources of potentially embarrassing information out there (http://www.flickr.com/photos/YOUR-NAME-HERE/tags/drunk). If anyone dug up digitized copies of the short stories I wrote for my high school literary journal, I'd fully expect to be penniless and living on the streets by nightfall (it would be well-deserved, I assure you).

M-S (if I can call him that) suggests a legally mandated technological system that would, by default, cause data to be deleted after a period of time. I'm sure his heart is in the right place, but this is dumb for all the same reasons that DRM is dumb. You really, really can't control the spread or persistence of publicly available digital information. Efforts to do so are a waste of everyone's time.

But it's a real problem nonetheless. I see two likely solutions. The first is increased adoption of darknets, invite-only communities and largely anonymous forums like Unfogged (although the protective namelessness of that community is pretty much gone). But that's not a complete solution, for the reasons listed above.

The real answer is just for us, as a society, to get over ourselves: to stop pretending that no one ever gets drunk in college, ever says things they don't mean, or has a sex drive. It's wildly optimistic, I know. But we've gotten over needing our politicians to be undivorced teetotalers who never say anything dumb (and how!). Maybe the generations that have been online their entire adult lives will have a diminished capacity for puritanical self-deception. I hope so, anyway.

Comments

If anything, I'd think this is an argument for blogs and things you have control over instead of things like Facebook. We wouldn't have been able to do a fast mass redaction when someone threatened to out FL, for example, if we didn't have direct access to our databases and hadn't explicitly set our pages not to be cached.

 

I completely agree. It's relatively harmless now, I suppose, but I think that's largely because Facebook is waiting to be acquired. At that point their new owner will want to monetize the site, and it'll be filled with spam the way that MySpace is. And at that point, who knows what'll happen to your personal info? They've already shown that they're not squeamish about making radical changes to users' level of exposure on the site (witness that whole feed business).

 

I think this is just an argument for finer-grained access permissions on our online communications. Livejournal does this pretty well. Not only can you do "friends-only" posts, but you can also restrict individual posts to subsets of your friends. It requires a bit of set-up on the front end to categorize your friends, but once you've done that, it's easy to do a post and mark it as "all friends," or "college friends," or "DC friends," or whatever.
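The friend-group model that comment describes can be sketched in a few lines. This is an illustrative toy, not LiveJournal's actual API — the class and method names here are invented for the example: each post is either public or tagged with one or more named groups, and a reader sees a post only if they belong to at least one of those groups.

```python
# Toy sketch of group-based post visibility (illustrative names,
# not any real service's API).

class Journal:
    def __init__(self):
        self.groups = {}  # group name -> set of reader names
        self.posts = []   # list of (text, allowed group names or None)

    def define_group(self, name, members):
        # The up-front categorization step the comment mentions.
        self.groups[name] = set(members)

    def post(self, text, groups=None):
        # groups=None marks the post as public.
        self.posts.append((text, groups))

    def visible_to(self, reader):
        # A reader sees public posts, plus posts restricted to
        # any group that reader belongs to.
        visible = []
        for text, groups in self.posts:
            if groups is None:
                visible.append(text)
            elif any(reader in self.groups.get(g, set()) for g in groups):
                visible.append(text)
        return visible

journal = Journal()
journal.define_group("college friends", {"alice"})
journal.define_group("DC friends", {"bob"})
journal.post("a public entry")
journal.post("reunion plans", groups=["college friends"])
```

The design point is that the filtering happens per-reader at view time, so one journal can present a different face to each audience without the author maintaining separate sites.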

 

I think this is an argument for less control. (Hi, I just read History of Sexuality. That should be transparently obvious when I say that inventing new ways to not talk about sex when we're talking about sex seems like a poor investment of technology and more than a little pointless—surely the Internet puts us over the event horizon for total transparency in society.) The kids in their infinite online folly are going to grow up to be adults with readily searchable closets, skeletons open to all. Doesn't that promise a society in which Googling a prospective employee's name is pointless? The only thing we can do is believe it when we say that it's relatively harmless, full stop.

 

You seem to suggest that my proposal is similar to IP secured by DRM. It seems that you haven't actually looked at my paper. In the paper I suggest that a DRM-like approach (like the one Lessig makes in Code 2.0) would be overkill. What I desire is not a perfect solution, just a shift in defaults that makes users think again about the choice of forgetting.

I encourage you to read the paper - it's a free download, including from my website, and I think it addresses some of the concerns you seem to have.

In contrast, your solution (cognitively accepting the fact that we are transparent and thus weighing things differently) depends on our brains' ability to adapt - not something cognitive scientists have much hope for, I am afraid, especially since biologically we are wired to forget.

Kind regards,

VMS

 
