The Intellectual Situation – Internet as a Social Movement

We love a parade. Up and down Fifth Avenue, voices ring against the burnished buildings.

“WE WON! WE WON!”

Who won? Won what?

“Something to do with the internet,” says a kneeling mailman, reaching with his chained key to open an olive-green postbox on the curb.

“WI-FI! WI-FI!”

The marchers split into cells, pushing off in crowds of sixes and sevens. Like torn-off dough hurled back into the kneaded mix, they enact a complex choreography: blob, stretch, blob, blob; then, at a whistle, they grapevine into another blob, trailing ropy bands of demonstrators.

“SER-VER ERR-OR! WE WON! FA-TAL EX-CEP-TION!”

Confetti rains down from the cornices, as trombone and tuba blow “Happy Days Are Here Again.”

A flying wedge splits the full street, pantsless, its constituents taking pictures of one another without pants. “NO PANTS DAY!” they cheer.

Lawrence Lessig passes by on a float drawn by free-range chickens.

“THIS IS WHAT THE INTERNET LOOKS LIKE! THIS IS WHAT THE INTERNET LOOKS LIKE!”

“Looks like a Verizon commercial,” says a white-haired dog walker, his schnauzer tangled up with us on the sidewalk.

“This is a protest against the skeptics!” retorts a 30-something man with a soul patch. He hands us a leaflet. “Get out of the new road if you can’t lend a hand! This is a demonstration! Read our program!”

But the leaflet is blank.

+++

Internet as Social Movement

Alexander Blok was enchanted by the Bolshevik Revolution. The leading poet of the pre-revolutionary symbolist school, Blok and his pale handsome face had been freighted in the years before 1917 with all the hopes and dreams of the Russian intelligentsia. In early 1918, when that intelligentsia was still making fun of the crudeness, the foolishness, the presumption of the Bolsheviks—the way contemporary intellectuals once made fun of Wikipedia—Blok published an essay urging them to cut it out. “Listen to the Revolution,” he counseled, “with your bodies, your hearts, your minds.”

Three years later, Blok was dead, and Vladimir Mayakovsky, the tribune of the Revolution, wrote his obituary. “Blok approached our great Revolution honestly and with awe,” Mayakovsky wrote. But it was too much for him: “I remember, in the first days of the Revolution, I passed a thin, hunched-over figure in uniform warming itself by a fire near the Winter Palace. The figure called out to me. It was Blok. We walked together. . . . I said, ‘How do you like it?’ ‘It’s good,’ said Blok, and then added: ‘They burned down my library.’”

A group of peasants had torched Blok’s country house. Blok, however, refused to choose between the “good” he saw in the Revolution and the burned library, Mayakovsky wrote. “I heard him this past May in Moscow. In a half-empty auditorium, he read some old poems, quietly and sadly, about Gypsy songs, about love, about a beautiful woman—the road led no further. Further on was death, and it came.”

+++

Ninety years later, we are living through a different revolution. Like the Russian one, it will seem in retrospect—may already seem—like a smooth inexorable process, but was in fact a series of discrete advances: First, the creation of easy-to-use web interfaces (the first recognizable browser, Mosaic, launched in 1993) and blogging platforms (Movable Type, 2001), which enabled non-specialists to navigate and publish on the web. Second, the improvement of search technology, so that search spam could be weeded out and relevant results delivered (the most radical advance in this field was made by a Stanford graduate student named Larry Page in 1998; his PageRank algorithm would also prove the eventual financial salvation of the internet, via search-based advertising). Third, the digital integration of various media other than text (through, first, their easy digitization, and then the increase in bandwidth that allowed their continuous broadcast), including music, photos, and videos, so that more and more things could be placed online. Fourth, most recently, the spread of the internet to wireless and handheld technology, which has freed the web and its users from the shackles of the deskbound networked computer.
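
(A word, for the curious, on how PageRank works, since so much of what follows rests on it. The idea is circular but computable: a page is important if important pages link to it, and the circularity is resolved by passing scores along links, over and over, until they stop moving. The little sketch below, in Python, is only an illustration under assumed parameters; the toy graph, the damping factor, and the function are ours, not Page's.)

    # A minimal PageRank sketch, computed by power iteration.
    # Everything here (the toy graph, the damping factor of 0.85, the
    # tolerance) is an illustrative assumption, not Page's 1998 system.

    def pagerank(links, damping=0.85, max_iter=100, tol=1e-6):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}  # start with uniform scores
        for _ in range(max_iter):
            # Every page keeps a small "random surfer" share...
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if outlinks:
                    # ...and passes the rest along its outgoing links.
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += share
                else:
                    # A page with no links spreads its score evenly.
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
            delta = sum(abs(new_rank[p] - rank[p]) for p in pages)
            rank = new_rank
            if delta < tol:  # scores have stopped moving
                break
        return rank

    # Toy web: B and C each link to A; A links back to both.
    toy_web = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
    print(pagerank(toy_web))

(On this toy graph the scores settle at roughly 0.49 for A and 0.26 each for B and C: importance flows along links and pools at the page the other pages point to. That, mathematically speaking, is the entire "radical advance.")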

All of this was difficult, amazing, perplexing, astonishing—but so was the laying of the railroads and the sending of telegraph signals across the ocean. And historians of technology like to point out that great fanfare and promises have greeted all sorts of new devices, from the radio to the fax machine. But even before former Grateful Dead lyricist John Perry Barlow penned his “Declaration of the Independence of Cyberspace” (“Governments of the industrial world,” it began, “you weary giants of flesh and steel”), the internet was no mere fax machine. From the first, and in no small part because of its fervent supporters, it has felt less like a technology and more like a social movement—like communism, like feminism, like rock and roll. An ideology we could call webism. While the rest of us look up movie times, buy sweaters, and post jihadi videos, the webists proclaim the new age.

In its purest form, webism comes from a specific place: California. The computer and the internet spent their childhoods there. If the rhetoric of the webists sometimes sounds like nothing so much as a mutant futuristic strain of hippie-speak, this is why. Stewart Brand, creator of the great hippie handbook Whole Earth Catalog (1968–72, mostly), was a firm believer in technology as a pathway to a better, more liberated life; influenced by the techno-transcendentalism of geodesic dome–builder Buckminster Fuller and the oracular techno-apocalyptic pronouncements of Marshall McLuhan, Brand was the founder of the first online community, the WELL, which in turn influenced the founding editors of Wired. Almost all the great computer companies and innovations have come from a very small stretch of California known as Silicon Valley, which is essentially an extension of Stanford University. The founders of Hewlett-Packard, Yahoo!, and Google all came from Stanford—as did Stewart Brand. An hour north lies San Francisco, the historic home of the counterculture. And as Fred Turner—a communications professor at Stanford—has convincingly argued, it is a mixture of the technophilia of Stanford and the countercultural ethos of San Francisco that has created the ideology of the web as we know it. The first business venture of Steve Jobs and Steve Wozniak, the founders of Apple, both college dropouts who grew up in the Silicon Valley town of Cupertino, was to build little circuit boxes to steal long-distance service from the phone company in 1971. They sold them in the dorms of nearby universities—for $100 you could get one of these little boxes and save some money on your long-distance bill. Best of all, with this gadget you could stick it to the man. Thus the era of the great freeload began.

Computers, initially, were meant to keep track of weapons and personnel in the era of mass warfare; the internet to keep alive flirtatious intragovernmental email in the event of nuclear attack. Silicon Valley sought to effect an ideological reversal. “One person, one computer,” Apple sloganeered in the early ’80s; “the web is for everyone,” Netscape said when it launched its first browser in 1994. From the mechanism of our mass administration, the computer would be the means of our individual liberation. Another early Apple slogan: “Why 1984 won’t be like 1984.” This was pronounced at the end of the famous commercial (directed by Ridley Scott and aired during Super Bowl XVIII) in which a young American female athlete threw a sledgehammer at a gigantic screen on which Big Brother was delivering his latest motivational lecture (“We have created, for the first time in all history, a garden of pure ideology,” Big Brother was saying). Which reminds us of the other, related source of utopian webism: the collapse of a utopian dream elsewhere, in Soviet Russia. There is something uncanny about the fact that Tim Berners-Lee wrote his first proposal for the world wide web in March 1989, six months before the Berlin Wall came down.

+++

“It’s not a revolution if no one loses,” leading webist Clay Shirky has written. The first ones the internet revolution came for were the travel agents, those nice people who looked up flight times and prices for you on a computer, before you could do it yourself at home. Then Captain Kirk returned from the future to zap them all. Next to fall was the music industry. And have you been to a mall lately? (Have you been outside lately?) Ultimately, very few industries were unaffected. Google, built on Larry Page’s search algorithm, is now a giant corporation, perched with its $22 billion in annual revenue right next to Delta Airlines, Coca-Cola, and Bristol-Myers Squibb. (Apple, at $32 billion, is in even more exalted company.)

And then the internet came for the print media. This process has been longer, more intricate, and more emotionally fraught than the interaction of the internet with any other media.

At first the idea was merely to transfer some aspects of the print world online, with slight wrinkles. The early web magazines were Slate, Salon, Feed, and Suck—as with the best print magazines, you could count them on the fingers of one hand. Some used the unlimited space of the internet to run longer features; others used the limited attention span of internet readers to run short. Slate (funded by Microsoft) was a lighter-hearted New Republic, minus the great book reviews; Feed (funded by venture capitalists) was a more earnest version of the Village Voice. Things changed after the crash of the tech stocks in 2000. At this point the founding editor of Feed, Steven Johnson, announced that Feed and its sister webzine Suck were folding and being replaced by something called Plastic.com. Plastic.com was a new kind of site: a news aggregator. User-contributors would post links to interesting articles, with a summary, and then everyone would discuss them. This would be called “user-generated content.” It was the future of the internet, Johnson explained. Here was a man who’d burned through more than a million dollars of funding by paying a living wage to his writers and editors to produce a high-quality product that competed with traditional print media. The world, it turned out, was not ready for that. It’s still not ready.

At the time, the world wasn’t ready for Plastic.com, either. In the years to come its formula would be copied with some vulgarization and more success by sites like Reddit and Digg. News aggregator blogs like Boing Boing and Gawker would also find glory in curating and annotating news items (and only then inviting commenters in). But the apotheosis of user-generated content would come with the rise of the social networks, where the content being generated by users was not just links to interesting news items but entire photo albums, playlists, recipes, recommendations . . . in short, entire selves.

Web 2.0 has been revelatory in lots of ways—user-generated naked photos, for one—but the torrent of writing from ordinary folks has certainly been one of the most transfixing. Over the past five years the great American public has blogged and Tweeted and commented up a storm and fulfilled a great modernist dream: the inclusion, the reproduction, the self-representation of the masses. Walter Benjamin spoke of “modern man’s legitimate claim to being reproduced” by film, a claim denied modern man by the capitalist film industry; James Joyce’s Leopold Bloom lamented the fact that the wisdom of the street found no outlet in literature. Now, through a million open channels, the wisdom of the people is represented, and they can write back to power—or at least to posters of YouTube videos. A lot of this writing has been insightful, strange, and witty. A comparable amount has been racist, homophobic, misogynistic—and a great many people have simply posted very cute photos of their pets.

It is unfortunate (though also logical) that this desacralization of the written word should be taking place simultaneously with the economic destruction of the once proud print institutions. One can imagine a world in which a million voices declared that the Times was a piece of shit—and yet the Times marched on. In fact that world existed for a hundred years. Remember Noam Chomsky in Manufacturing Consent, demonstrating with twenty years of painstakingly collected press clippings that the Gray Lady was misrepresenting the plight of East Timor, Burma, Nicaragua? Remember Rick Perlstein explaining that David Halberstam’s reporting in the early 1960s pulled us into Vietnam? The New York so-called Times helped prop up dictatorships (as well as our two-party system), pushed the increasing technophilia of our culture, was a patsy for the Bush Administration’s Iraq War strategy—and intolerably elitist to boot. To denounce the imperialist Times was a rite of passage for young American leftists. And yet, as the Trotskyist Irving Howe once wrote, “Blessed New York Times! What would radical journalism in America do without it?” For all its defects, if you read to the end of the endless articles you got most of the facts. It was the best and most comprehensive newspaper in the world.

In the past five years, no institution has wrestled with the implications of the internet as painfully, and as publicly, as the New York Times. It has devoted tremendous resources to keeping its website updated with fresh stories, and it has assigned some of its best young talent to the various Times blogs. Cursed by its own authority, and the limitations this authority places on what it can say and do, the Times has been outhustled or outshined or simply mocked by the blogosphere, but has persevered. The Times has also devoted as much room to the “story” of the new media as anyone. One of its best critics, Virginia Heffernan, now writes almost exclusively about the internet; one of the paper’s most commented-on stories in the past two years was a Times Magazine essay about the life of a compulsive blogger. (There were so many comments, and many of them were so angry, that the Times shut the comment thread down.)

Often all this attention to the new media has been to the detriment of serious reporting: last year during the protests after the rigged presidential election in Iran, there was almost as much in the Times about Twitter, and the “Twitter Revolution,” as there was about the situation on the Iranian streets. At other times, the obsession with new media has led to strange outbursts—as when the writer of a piece on Robert Caro’s monumental 1,200-page biography of Robert Moses suddenly and entirely irrelevantly bemoaned the “age when sentence fragments on a blog pass for intellectual argument.” Even as the institution itself was struggling desperately to adapt, this sort of dig at the internet emerged from the editorial desk on a regular basis, like a cry of pain.

And then there was the Times’s media critic David Carr. It fell to Carr to describe the destruction of his way of life. In the face of collapse, Carr was stoical. He did not sing and dance, but neither did he moan and weep. He wrote a memoir of his crack addiction, and (in a move to out-tradition any traditionalist, in the age of the partly fake memoir) fact-checked it. But Carr also agreed to flatter the self-regard of the young: “For every kid that I bump into who is wandering the media industry looking for an entrance that closed some time ago, I come across another who is a bundle of ideas, energy and technological mastery. The next wave is not just knocking on doors, but seeking to knock them down.” This just a few days before various websites broke the news about the hundred Times staffers taking a “buyout” and leaving the paper for good. The door being knocked down was to Carr’s house.

On the other hand, Carr had 245,000 followers on Twitter—microblogging waited like an escape helicopter on his roof.

The webists met the Times’s schizophrenia with a schizophrenia of their own. The worst of them simply cheered the almost unbelievably rapid collapse of the old media, which turned out, for all its seeming influence and power, to be a paper tiger, held up by elderly white men. But the best of them were given pause: themselves educated by newspapers, magazines, and books, they did not wish for these things to disappear entirely. (For one thing, who would publish their books?) In fact, with the rise of web 2.0 and the agony of the print media, a profound contradiction came into view. Webism was born as a technophilic left-wing splinter movement in the late 1960s, and reborn in early ’80s entrepreneurial Silicon Valley, and finally fully realized by the generation born around 1980. Whether in its right-leaning libertarian or left-leaning communitarian mode it was against the Man, and all the minions of the Man: censorship, outside control, narrative linearity. It was against elitism; it was against inequality. But it wasn’t against culture. It wasn’t against books! An Apple computer—why, you could write a book with one of those things. (Even if they were increasingly shaped and designed mostly so you could watch a movie.) One of the mysteries of webism has always been what exactly it wanted, and one of the paradoxes that emerged during the long death of print was that the webists wanted to help. They wanted to “spread the word” about new books. They wanted to “make reading exciting again.” Over and over, in increasingly irritated and sometimes outright aggressive tones (most recently at the “New Think for Old Publishers” panel at last year’s SXSW), they urged the print companies to learn how to “use the web.” They meant this in all sincerity; their anger at the publishers for failing to “use” them properly was proof of this. But to urge the “use” of something was to think of it as merely a technology. It was to forget that the amazing and powerful thing about the web was precisely that it was not a toaster; it was not a hammer. The web could not simply be “used.”

In the end one got the sense that the Times was going to be all right—that it was taking in so much of the internet (neighborhood blogging, idea blogging, slide shows, video) that eventually, after many missteps, it would hit upon the right formula. (Even if it declined to take most readers with it—see “Addled,” below.) Much less easy to imagine was a situation in which the book publishers could be made whole again. The internet could certainly be used to sell physical books (Amazon overtook Barnes & Noble as the largest bookseller in the US in early 2007); no doubt it can also be used to sell digital copies of books. But these will be the same old books, repackaged a little to fit the file requirements of the e-platforms. And to those searching for the “new think,” that will be—already has been—disappointing.

There’s a very good reason that publishers’ moves in the direction of webism—setting up author websites, blogging boringly about new books, Tweeting obligatorily about positive reviews—have been so tepid and lame when compared with the struggle of the New York Times. Ultimately, untranscendably, the publisher needs to sell you a book for $20, or $10, whether you download it or buy it at a big-box store. And if you’re a webist, and believe in crowd-sourcing, collaboration, and above all in free, you may not be buying. The confusion surrounding the internet’s relation to the book has been created by the fact that many webists emerged from the culture of the book (rather than television, say); that they themselves genuinely liked books; and that communications online took place in the medium of text. “The internet is the largest group of people who care about reading and writing ever assembled in history,” posited the SXSW publishers’ panel in 2009. But what kind of reading, what kind of writing? The internet is the largest group of people ever assembled, period. Some join Infinite Jest discussion groups. Others can’t read to the end of a wire story. Book-length literature is the product of certain historical conditions, of a certain relationship to written language. Assimilate book-ism to webism and the book looks like nothing so much as an unreadably long, out-of-date, and non-interactive blog post.

“The Russian Revolution,” Wired founding editor Louis Rossetto once said, “was like a schoolyard game compared to the change that’s been driven by the digital revolution.” It’s an interesting comparison. In 1917, the Bolsheviks seized power in the world’s largest country, moved the capital to Moscow, terrorized their enemies, wrote poems in praise of themselves. They began tearing down the monuments they didn’t like, and building new ones. And, of course, in the 1930s, under Stalin, they started terrorizing and murdering people who weren’t their enemies at all.

Artistically, the revolution helped usher in an explosion of public creativity—in theater, architecture, and film it was the era of the triumph of the avant-garde. After about a decade, the explosion was stifled, and the history of that stifling, under Stalin, is always read as tragedy. And it was a tragedy. But socialist realism in literature and film, Stalinist architecture on the streets, “folk” paintings in the museums and metro stations—these were more popular, by far, than the products of the avant-garde. The rejection of the avant-garde did accord with the conservative tastes of Stalin and his circle, but it also accorded with the tastes of the great Soviet people. Someone had to denounce the Formalists, the Constructivists, the theater of Meyerhold, to Stalin. Someone told the police about Mandelshtam’s anti-Stalin poem. It came from above, but it came even more from below. Between high modernism and Stalinism, Stalinism definitely got more hits.

History of course does not repeat itself quite so neatly. History, as the great Bolshevik-Trotskyist writer Victor Serge once said, is a series of rooms, and one needs to keep opening the doors. So, yes, the internet these days displays Stalinoid tendencies (been “denounced” on the internet lately? Give it a minute), but that doesn’t mean the commenters will soon be lining us up against the wall. And, yes, the most successful, innovative sites on the internet are mostly devoted to celebrity gossip, but that doesn’t mean they won’t eventually be supplanted. The nobler goals of this revolution are to disseminate information to parts of the world that do not have it, to strengthen democracy, to give a voice to everybody, and to speak truth to power. At the same time, if you believe that the internet is a revolution, then you must take seriously the consequences of that revolution as it is. The mistake that many supporters of the Bolsheviks made was to think that once the old order had been abolished the new order would be fashioned in the image of the best of them, rather than the worst. But the revolution is not just something you carry inside you; the web is not your dream of the web. It is a real thing, playing out its destiny in the world of flesh and steel—and pixels, and books. At this point the best thing the web and the book could do for one another would be to admit their essential difference. This would allow the web to develop as it wishes, with a clear conscience, and for literature to do what it’s always done in periods of crisis: keep its eyes and ears open; take notes; and bide its time.

To continue reading “The Intellectual Situation,” please visit n+1 Magazine.
