Jun. 25 2010 - 5:19 pm | 785 views | 0 recommendations | 6 comments

Public Regions: The Fate of Solitude in the Age of Always Connect

Of course, it had to happen: Jeff Jarvis—speedtalking hyperpundit and phallusblogger around town—happened on my critique of his new, self-winding meme, the Civic Virtues of Radical Transparency (which he, with that tin ear for euphony that separates the typers from the writers, insists on calling “publicness”).

Social Network. The horror, the horror... (Official movie poster; all rights reserved.)

Now, Jarvis has answered my argument with a 1,500 (and 1!)-word rebuttal, and, true to form, has done so with admirable restraint and judiciousness. A becoming dignity is the mark of the man. That, and the urbane, understated wit we’ve come to expect from a writer who titles his book on privacy, publicness, and prostate cancer Public Parts. Not for him the shrill whine of the punctured windbag. He was even gentlemanly enough to pay me the supreme compliment of insisting my style was poles apart from his—high praise, from a writer whose prose is equal parts PowerPoint and chloroform; a quasihemidemisemi-intellectual who cites Julia Allison with approval.

But seriously: the debate about what Jarvis calls “the end of privacy and the benefits of publicness” in the Age of Always Connect is too important to be left to blogorrheics, corporate tools, and breathless cyberprophets of a great big beautiful tomorrow.

Even more thoughtful tech-culture critics such as Steven Berlin Johnson, the thinking man’s Jeff Jarvis, have left some of the most interesting questions raised by this debate unaddressed.

For example…

…there’s the question of the virtues of solitude in an age of compulsive gregariousness.

In his Chronicle of Higher Education essay “The End of Solitude,” William Deresiewicz offers a postmodern parable about our mania for connectivity and the toll it may be taking on our ability to be alone with ourselves, wandering the byways of our minds or simply experiencing our immediate surroundings:

[W]e live exclusively in relation to others, and what disappears from our lives is solitude. Technology is taking away our privacy and our concentration, but it is also taking away our ability to be alone. [...] A teenager I know had sent 3,000 text messages one recent month, I was told by one of her older relatives. That’s 100 a day, or about one every 10 waking minutes, morning, noon, and night, weekdays and weekends, class time, lunch time, homework time, and toothbrushing time. So on average, she’s never alone for more than 10 minutes at once. Which means, she’s never alone.

I once asked my students about the place that solitude has in their lives. One of them admitted that she finds the prospect of being alone so unsettling that she’ll sit with a friend even when she has a paper to write. Another said, why would anyone want to be alone?

My inner skeptical inquirer raises an eyebrow at Deresiewicz’s unsupported assertion, familiar from Nicholas Carr’s The Shallows, that technology is “taking away” our ability to concentrate. Hard facts regarding technology’s effects on the human brain are hard to come by, and too many of them look squishy under close scrutiny. Moreover, today’s calamity howling about the brain-eating horrors of hypertext sounds uncomfortably similar to the alarms raised in the late ’70s by the shoot-your-TV school of media criticism—Marie Winn’s fulminations, in The Plug-In Drug, about the boob tube’s effects on early brain development; Jerry Mander’s insistence, in Four Arguments for the Elimination of Television, that TV is pickling our prefrontal lobes.

Nonetheless, Deresiewicz’s anecdotal claim that the frenetic connectedness of today’s teens leaves little room for solitude is amply evidenced by a recent Pew Internet & American Life study, which found that half of all teens who own cell phones (75% of 12-to-17-year-olds, up from 45% in 2004) send 50 or more text messages a day and one in three teens, like the girl in Deresiewicz’s story, “send more than 100 texts a day, or more than 3,000 texts a month.”

You’re thinking: And the point is—? Teenagers are, and always have been, the most manically social subspecies of naked ape; what else is new?

What’s new is the near-constant nature of their connectedness—the colonization of every spare second by digitally mediated socializing or interaction with entertainment media—and the consequent atrophy of the desire, maybe even the ability, to be alone.

Is the girl in Deresiewicz’s anecdote who wonders why anyone would want to be alone an outlier or a poster child for our times? What does it say about us, as a society, if we’re unable to be alone and unplugged without being bored or lonely?

Is the pervasive resistance to untethering ourselves from our social worlds or disconnecting ourselves from the media drip, even for an instant, at root a fear of the emptiness in our heads?

If Deresiewicz is right, should we preserve some small space in our lives for solitude—a Walden of the mind, away from the Matrix?

Questions to be asked, at least.

Then, too, we might ask ourselves what we mean when we talk about the self, whether social or solitary. History is littered with fossil selves: the sovereign self of the Age of Reason, confident that the conscious “I” (the “I” that says “I”) is the only “I” there is; the haunted house of the Freudian self, with its unconscious gnawing its cage bars in the basement; the alienated, existentialist self begotten by modernism; the fluid or multiple selves—“liquid subjectivities,” “anti-Oedipal” egos—of critical theory; the postmodern self that rips and remixes its self-image from media fragments and, like the characters in novels by Don DeLillo or J.G. Ballard, only recognizes itself as truly three-dimensional when it sees itself in the media mirror, a living image reified by celebrity.

“What does the contemporary self want?” Deresiewicz wonders.

The camera has created a culture of celebrity; the computer is creating a culture of connectivity. As the two technologies converge…the two cultures betray a common impulse. Celebrity and connectivity are both ways of becoming known. This is what the contemporary self wants. It wants to be recognized, wants to be connected: It wants to be visible. If not to the millions, on Survivor or Oprah, then to the hundreds, on Twitter or Facebook. This is the quality that validates us, this is how we become real to ourselves — by being seen by others. The great contemporary terror is anonymity. If Lionel Trilling was right, if the property that grounded the self, in Romanticism, was sincerity, and in modernism it was authenticity, then in postmodernism it is visibility.

My thoughts exactly, as expressed in a passage written before I’d read Deresiewicz’s essay:

Isn’t [our fetishization of fame] the motivation for much of what we call oversharing, online? Ours is the age of nanocelebrity…In the age of reality TV and Paris Hilton, American Idol and YouTube (which has the power, if your video goes viral, to turn you into a global celebrity, even if you’re just some guitar geek shredding Pachelbel’s Canon), we see fame as our Warholian birthright. [...] Thus, we’re increasingly comfortable with the disappearance of privacy and the prying media eye, not only because it affords a few minutes of Warholian fame but because, like the characters in White Noise, we only feel that we truly exist when we see ourselves reflected in the media eye, because that’s where the real reality is, these days: on the other side of the screen.

A meta-level up from all these interrelated points of argument is the binary opposition, as theory jocks like to call it, between the data cloud—the online welter of free-floating information and images—and the unplugged world of immediate experience; between face-to-face, here-and-now reality and the being-in-nothingness of our virtual lives, which are time-asynchronous, unmoored from geographical coordinates, and easily uncoupled from our identities, not to mention our bodies.

An article in this April’s New York Times (“Antisocial Networking?”) dramatizes that opposition. The reporter quotes researchers who claim that “initial qualitative evidence” supports the finding that “the ease of electronic communication may be making teens less interested in face-to-face communication with their friends.” According to the Times, child psychologists are worried that today’s youth, “unlike their parents—many of whom recall having intense childhood relationships with a bosom buddy with whom they would spend all their time and tell all their secrets—may be missing out on experiences that help them develop empathy, understand emotional nuances and read social cues like facial expressions and body language. With children’s technical obsessions starting at ever-younger ages—even kindergartners will play side by side on laptops during play dates—their brains may eventually be rewired and those skills will fade further, some researchers believe.”

Another Times article, from June of this year, quoted the MIT psychologist and tech-culture researcher Sherry Turkle observing, “There’s something that’s so engrossing about the kind of interactions people do with screens that they wall out the world…I’ve talked to children who try to get their parents to stop texting while driving and they get resistance, ‘Oh, just one, just one more quick one, honey.’ It’s like ‘one more drink.’”

The article kicked off with a quote from an early-childhood researcher who observed a disturbing interaction between a mom and her kid, emblematic of the sometimes jarring disconnect between our online lives and the offline world:

While waiting for an elevator at the Fair Oaks Mall near her home in Virginia recently, Janice Im, who works in early-childhood development, witnessed a troubling incident between a young boy and his mother. The boy, who Ms. Im estimates was about 2 1/2 years old, made repeated attempts to talk to his mother, but she wouldn’t look up from her BlackBerry. “He’s like: ‘Mama? Mama? Mama?’ ” Ms. Im recalled. “And then he starts tapping her leg. And she goes: ‘Just wait a second. Just wait a second.’ ” Finally, he was so frustrated, Ms. Im said, that “he goes, ‘Ahhh!’ and tries to bite her leg.”

How does this relate to the civic virtues of publicness—and the social costs of oversharing? The point, obviously, isn’t that the Web is the Devil’s Workplace; it’s that, increasingly, we tend our online social worlds at the cost of our face-to-face lives.

Regrettably, Jarvis misses the opportunity to wrestle with any of these pressing questions, preferring the easier out: an Ultimate Cage Fight with a straw man.

In his rebuttal, he tilts with a “prudish, disapproving, controlling, Victorian, media-obsessed, retrograde, predictable, snippy, snarky, and self-righteous” Comstock “with some apparent penis and anus problems” who wants to police standards of public behavior and would “like to tell us what not to say on the Internet,” a censorious Church Lady whose campaign to abolish all talk of penises might, if successful, cause men to “continue to not get checked,” which means “more will die” because, you know, Silence Equals Death.

(Sound of gritted teeth)

For the record, I was at pains in my post to applaud “Jarvis’s decision to publicize his cancer scare as a wake-up call to men of a certain age, a sort of PSA about PSAs,” as “truly generous of spirit.” As a fellow prostate-cancer survivor, I could hardly have done otherwise.

Moreover, I readily conceded that his “desire to reach out to his online flock in his hour of need is understandable enough,” although I questioned his reflexive faith in the medical advice of the comment-thread claque.

Also for the record, my Inner Civil Libertarian will defend to the death Jarvis’s right to talk about penises large or small, diseased or hale, marching proudly with head held high by Viagra or shriveling in fear at the approach of the fearsome Foley catheter. I have no argument with the Jarvis unit; long may it wave, or whatever it does.

Jarvis wants to reframe critiques of oversharing (which might be defined as sharing premised on the uncritical notion that radical publicness is an unalloyed good) as a bluenoses’ campaign to police his speech—to deny him the right to share and share and share, already, about Little Jarvis. “The solution is for you not to read me,” he huffs. “Anyway, I’m not talking to you. I’m talking with my friends.”

But isn’t that the point?

As I’ve argued in my Boing Boing essay on friendship in the age of Facebook, social networking is deflating the currency of friendship by repurposing the term to include people we barely know—faceless names whose real purpose, on sites like Facebook and Twitter, is to burnish Brand Me by inflating my, er, social standing. (Even in cyberspace, Size Matters.) When Jarvis says he’s talking to his “friends,” does he really imagine that the legions who read his blog or his 44,304 Twitter followers are friends? What value can the term possibly have, in such a context? Loose usage blurs the distinction between followers, fans, and friends.

And while we’re kicking around the question of unrestrained “shariness” in a time of perpetual connectedness (with our online worlds, at least), shouldn’t we give some thought to the people we’re doing our sharing with? If we consider the disembodied inhabitants of these loose-knit social ecologies friends, as Jarvis seems to, we need to juxtapose that perception with the inconvenient truth that close friendships, according to some studies, are on the decline in America. A recent Time feature cited a 2006 Duke University study that found that “from 1985 to 2004, the percentage of people who said there was no one with whom they discussed important matters tripled, to 25%; the same study found that overall, Americans had one-third fewer friends and confidants than they did two decades ago.” Time is no stranger to fearmongering trend stories driven by dubious poll data and sexed up with scare quotes, so I’ll reserve judgment on the magazine’s speculation that intimate friendships, in America, have fallen victim to an empathy deficit incurred by “our increasing reliance on digital communication and other forms of new media.” Even so, the Duke study does suggest that face-to-face interaction is a key component in forming intimate social bonds, and that the wiring of our social lives may be partly to blame for the relative dearth of intimate friendships.

“Social Isolation and New Technology,” a 2009 study by the Pew Internet & American Life Project, pointedly refutes some of the Duke study’s findings while emphatically confirming that

compared to the relatively recent past, most Americans now have fewer people with whom they discuss important matters, and the diversity of people with whom they discuss these issues has declined. There is a wealth of scholarship to suggest that the implications of this trend for individuals and for American society are starkly negative. Smaller and less diverse core networks diminish personal well-being by limiting access to social support. There are simply fewer people we can rely on in a time of need – whether it is a shoulder to cry on, to borrow a cup of sugar, or to help during a crisis. Small and narrow core networks also impede trust and social tolerance; they limit exposure to the diverse opinions, issues, and ideas of others. If we increasingly rely and trust only a small inner circle of likeminded others, it becomes increasingly difficult to recognize, accept or understand opposing points of view. A great deal of research has shown that diversity within our closest relationships—even in the age of the internet—is vital for the flow of information, for informed deliberation, and to maintain the participatory ideals of a democracy.

The Pew researchers refuse to lay the blame for the decline in intimate friendships and the diversity of our social circles on technology’s doorstep; in fact, they argue that “people’s lives are likely to be enhanced by participation with new communication technologies, rather than by fearing that their use of new technology will send them into a spiral of isolation.” That said, the study neglects entirely any nuanced discussion of the widely varied types of online social relationships—Twitter followers, Facebook friends, readers who comment on your blog, e-mail correspondents, and, if you’re a public figure such as Jarvis, fans and even critics or outright enemies who actively engage you in discussion in any or all of those forums. As well, it avoids any in-depth comparison of the differences between embodied (face-to-face) and disembodied (online) social interaction.

In his blog posts and published articles about publicness, Jarvis thumps his tub not just for the quintessentially human need to share, or for the joys of sharing, but for the virtuousness of sharing. It seems never to occur to him that our nonstop connectedness borders on the compulsive, and that our compulsion to connect, always and everywhere, may have hidden costs: the inability to enjoy the companionship of our thoughts in solitude or to savor the world around us, unmediated by screens.

Let me be clear: I have no quarrel with the desire to reach out and touch someone, online; my crosshairs, in the offending post, were trained on a society “so outward-focused, so frenetically interactive, so terminally social that we get a death letter and ‘the instinctive response is, I’d better tweet this up right away.’” It’s not the decision to tweet that galls, but the insistence that it must be tweeted right away, a conclusion so foregone it isn’t so much considered but “instinctive.” (These words, by the way, are Steven Berlin Johnson’s, but he gives the impression in his essay, for which he interviewed Jarvis, that Jarvis shares the sentiment.)

My argument wasn’t with the notion of sharing, it was with the elevation of oversharing to a civic virtue. Alert every man within tweeting distance to the urgent importance of getting tested for prostate cancer, reach out and touch someone online in your Dark Night of the Soul, but do we really need a pathologist’s report on every grisly, gristly detail of what ails you? Exhibitionism can be a way of hiding in plain sight: the full monty as half-truth. For all Jarvis’s confessionalism, we know more about his anatomy than we do about his psyche—his most searching thoughts and scarifying feelings in the face of every man’s worst nightmare, this side of death: erectile dysfunction. Jarvis’s publicness is a sign of our times; in these days of reality TV and celebrity sex tapes, we tell all and reveal nothing.


3 T/S Member Comments Called Out, 6 Total Comments
  1.

    what we need is a cyber-heidegger and an updated version of sein und zeit. all of these multiple modes of being/performing and the dromological implosion of our perception of time might be factors of such boringly deformed psyches (see jarvis, jeff).

    when we tweet, do we collapse all of our followers’ time zones into one immanent greenwich line in our overextended ego? can we crowd-source our shame and fear? can our status updates be more relevant than our meatspace conditions? are we really leaving it up to the johnsons and jarvises of the internets to answer these questions?

    i don’t mind jarvis’ scat talk and public flagellation, i just have a problem with the quality. bdsm can be titillating or depressing. reading jarvis’ blog makes me feel like i’m complicit in a really bad gonzo porn shoot, where the director swears that filming himself being pegged by a dominatrix called cancer makes his film about art, not about him being turned on by people watching him humiliate himself.


      Deablero: Bonus points for namechecking Heidegger *and* working a Virilio cite into a True/Slant comment thread. If we get the readers we deserve, I. am. not. worthy.
      But seriously: Fascinating. The Greenwich riff sounds like an outtake from Ballard’s “Glossary for the 20th Century” in that volume in the Zone series on the body.
      I do think, for some users, status updates inflect their meatspace conditions; the polarities of real and virtual are inverted, since the magnetic pull of online life trumps the offline quotidian.

  2.

    I’m not quite sure where to begin with all that’s going on here, but I’m struck by: “Is the pervasive resistance to untethering ourselves from our social worlds or disconnecting ourselves from the media drip, even for an instant, at root a fear of the emptiness in our heads?”

    Contrast that “emptiness” with the orgy of quantification that defines social networking: 40,000 Twitter “followers,” an equal number of Facebook “friends.” 3,000 text messages a month. (I imagine some installation artist papering the walls with blown-up prints of his/her texts.) This kind of connectivity lends itself to quantitative e-valuation; private thoughts do not. You can’t parcel out your thoughts, count the number you’ve had that day, share that number with the world. To some, I imagine that unknowable number — the quantification of, you know, singular human consciousness — probably appears an abyssal emptiness. So back to broadcasting, tweeting, and status-updating: hitching your self momentarily to an outgoing message. But once the message goes out, what sender does it leave behind?

    Walter Kirn, in The Atlantic Monthly, declared that “Boredom is extinct,” and that “even the worst blind dates don’t bore us now; we’re never more than a click away from freedom, from an instantaneous change of conversation partners.” Perhaps true, but one might conceive of life as a long blind date with oneself, without the option of texting out for a new partner.

    Solitary thoughts enable you to build a non-focus-grouped self. Maybe in boredom you find yourself. You might create a self, discover a self, simply spend time with a self. How else can you plumb the deep reservoir of self, if it exists?

    None of that self-reflection (revealing word), I think, happens automatically — we may, indeed, be naturally gregarious creatures. But we also have a long tradition of inward exploration, as far back as the ancient Greek “Know thyself.” This has always been a challenging dictum, but it seems, as you conclude, that confession (outward broadcasting) has displaced self-reflection. Why?

    Kirn notes that previously we’ve thought of boredom as an oppressive, almost physical presence. In reality, of course, it’s a mental disposition. (And the mind, like it or not, is a solitary place.) Compare Kirn’s scenario — “escaping” boredom by finding a more distracting entertainment — with this passage from a writer whose subtlety, prescience, and compassion I much admire, David Foster Wallace:

    “Bliss — a second-by-second joy and gratitude at the gift of being alive, conscious — lies on the other side of crushing, crushing boredom. Pay close attention to the most tedious thing you can find (Tax Returns, Televised Golf) and, in waves, a boredom like you’ve never known will wash over you and just about kill you. Ride these out, and it’s like stepping from black and white into color. Like water after days in the desert. Instant bliss in every atom.”

    To me, this sounds less like trying to escape the solitary self, the apparent “emptiness” of one’s own mind, than like redoubling one’s attention in order to find a perspective other than the dichotomy of boredom/entertainment.


      Jesse: Fascinating theme-and-variations on my essay. I like your insight that the harmonic convergence of social-networking platforms and the tropism toward “lifecasting” updates (disparaged, by infidels, as braindroppings) quantifies our social transmissions. But are they really thoughts, I wonder? Boswell’s Johnson is on my night table, for whatever inscrutable reason, and I’m struck by the droll charm—and depth and breadth—of the Good Doctor’s mind. When he has a thought, at least as reported by Boswell (yes, I know: the proverbial unreliable narrator; but Johnson was by all accounts a font of witty sallies, aperçus, and choice morsels of wisdom), it’s almost invariably intellectually substantive; by comparison, is a Jarvis tweet such as “I hate a leaky burrito” really a thought? It strikes me as more of a convulsive twitch of the ganglia, reflexively broadcast to the million. “I’d better tweet this up right away!”
      As well, I think you put your finger on the nub of the question when you ask what sort of transmitter these transmissions leave behind. Answer: the hypermediated star of “Life: The Movie,” in the terminal stages of ego bloat, who imagines that a waiting world is desperate to know that his burrito leaked. (Better that than…other things, I suppose.) I can’t even *imagine* thinking anyone cares. “one might conceive of life as a long blind date with oneself, without the option of texting out for a new partner”: Excellent! Why isn’t this in my OED, under the definition for McLuhan’s Narcissus Complex, truer now than it ever was in his day, though not for the Innis-ian, somatic reasons he imagined.
      “Solitary thoughts enable you to build a non-focus-grouped self. [...] [C]onfession (outward broadcasting) has displaced self-reflection.”
      Deadeye accurate, and pithily put. I like this idea of the focus-grouped (or perhaps crowdsourced?) self, and may have to steal it (with the proper citation, of course!). Confession has displaced self-reflection because the sheer volume of information overload, the cataract of images and ideas assaulting us, has reversed the magnetic poles of interior and exterior: our dream lives and waking mental landscapes are overrun by media fictions, celebrity phantasms, advertising jingles, manufactured fads, et al., even as we project ourselves into the public arena via “We Media” such as YouTube, Twitter, Facebook, Tumblr, etc.
      The DFW quote sounds a distinctly (John) Cagean note in its Buddhist insistence that the merest shift in perception can turn boredom (say, the car-horn symphony of that traffic jam outside your window) into music. Or do I misread him?


        Mark, I imagine if every Tweet were a bon mot of (Samuel) Johnsonian caliber, none of us would worry nearly as much about this. However singular Johnson’s mind, it seems inarguable that he took the time — applied the conscious intention — necessary to create such instantly quotable gems. As a writer (and lexicographer!) he concerned himself with both the substance and style of thought; you distinguished between writers and typers early in your post, and no one would call Johnson a typer.

        “I hate a leaky burrito” might be a thought — you might actually hear those words in your head, depending on what else is in there — but I think we agree on its lack of value as a thought worth sharing. Leaky burrito = leaky ego? If “public” becomes your default mode, maybe considerations of audience cease to matter. You don’t have to think about how others might consider your under-articulated, self-interested thoughts. You simply put it all out there. That seems like the instinctual ganglia spasm you mention. (You know how children constantly narrate the world, as if discovering everything for the first time? Some Piaget probably applies to “Jarvis” the public phenomenon, especially his petulant declaration that if you don’t like him, don’t read him.)

        Working the information overload angle, we might say that “I hate a leaky burrito” is less a thought than a piece of information. It adds to a store of “raw” data, but that’s about it. It doesn’t suggest a way to interpret or assimilate similar data; it’s a fact rather than an idea, and we don’t care because we can’t DO anything with it. Except, I guess, agree? Of course, when all your followers like you…

        DFW addresses this information overload in a few places: in-depth in Infinite Jest, but more concisely in “Deciderization 2007 – A Special Report” and in the Kenyon commencement speech. I suspect he had some affinity for Buddhism (though he once explicitly declared himself NOT Buddha), though I think part of his project concerned reclaiming the mind as a sovereign space. You mention ego-bloating above, and detail the shifting polarities between interior and exterior. I might conceptualize it as an ego-leak: no longer knowing where you stop and the world begins. Without that membrane separating you from the world — which, yes, defines you as solitary, with a great world spinning around you — it becomes nearly impossible to find the quiet interior space for thinking through your own values. I think DFW wanted to gently prescribe taking control of your own mental machinery, partly out of self-preservation, partly as an attempt to be a more moral person. This is a very difficult project.

    About Me

    I'm a cultural critic. Doom Patrol is a series of drive-by essays, mostly on America in the Age of Anxiety, as the title suggests, but also on whatever wild surmise crosses my mind. I've written for publications ranging from The New York Times Magazine to Rolling Stone, Bookforum to Cabinet. My books include The Pyrotechnic Insanitarium: American Culture on the Brink and Escape Velocity: Cyberculture at the End of the Century. I'm associated with the concept of "culture jamming," the guerrilla media criticism movement I popularized through my 1993 essay "Culture Jamming," and "Afrofuturism," a term I coined in my 1994 essay "Black to the Future" (in the anthology Flame Wars: The Discourse of Cyberculture, which I edited). More: http://www.markdery.com/author.html Mail: markdery at verizon dot net.

    Contributor Since: December 2009