Winter 2010

Symposium: E. J. Dionne, Jr.

By E. J. Dionne, Jr.
Fifteen years ago, Todd Gitlin offered a precise and devastating metaphor for what he saw then as the academic Left’s default from democratic politics. In The Twilight of Common Dreams, Gitlin noted that while the Left was “marching on the English department,” the right took the White House.

More than they ever want to admit, intellectuals of the Left are influenced by the cultural politics that dominate their time. While the political right spent the 1980s and 1990s preaching the gospel of privatization and the virtue of pursuing individual satisfactions, many in the progressive academy engaged in their own form of withdrawal. An aesthetic radicalism replaced political radicalism, and a battle over texts and canons displaced the fight over whose interests would be served by government and whose ideas would define mainstream politics.

It seemed that the Right, far more than the Left, had learned from Antonio Gramsci, the revisionist Marxist who understood the power of ideas in shaping the outcome of political contests and economic struggles—even though Gramsci was, in fact, the vogue on much of the academic Left.

Typically, it is the Left that finds itself accused of excessively politicizing its intellectual and cultural work. But beginning in the mid-1960s, it fell to the American Right to devote itself to producing politically useable ideas. Its narratives on the futility of government-sponsored efforts at social reform, the dangers posed to the economic sector by state interference, and the primacy of moral breakdown as an explanation for poverty shaped the political discussion through the Clinton years.

Something changed after 2001. The instrument of that change was not the rise of a compelling vision on the Left, but a visceral reaction against George W. Bush’s presidency, not only on the Left, but also across much of the political center. In an odd way, the response to Bush, even on the Left, was rooted in what might be seen as a conservative revulsion over the recklessness of Bush’s policies, particularly his approach to the war in Iraq.

Bush’s radicalism—there is no other word—was captured rather chillingly in a 2004 New York Times Magazine article by Ron Suskind. Here’s how Suskind recounted a conversation with a Bush lieutenant:

The aide said that guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works anymore,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality—judiciously, as you will—we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors . . . and you, all of you, will be left to just study what we do.”

It dawned on much of the intellectual Left that control of the English department was not sufficient. Self-marginalization meant being confined to the wayside to “study” what others did. This was an act of democratic irresponsibility. It was time to rejoin the ranks of “history’s actors.”

But becoming actors in the American story required accepting the disciplines—and limitations—of democratic practice. It demanded a faith in the wisdom of fellow Americans and a dedication to the task of popular persuasion. It meant tempering utopian expectations and accepting the need for near-term reform. It meant moving from the seminar room to the precincts and the neighborhoods. The excitement so many experienced during the 2008 Obama campaign was nothing more or less than a rediscovery of the joy of democratic activism.

IN A DEMOCRACY, political engagement is an act of patriotism, a declaration of faith in the judgment of one’s fellow citizens and thus, ultimately, in one’s nation. Michael Walzer is right that the truly effective social critics are embedded in their societies and operate at least as much out of love as from alienation. And love is usually dominant.

In The Company of Critics, Walzer quotes the Polish intellectual Adam Michnik: “A movement that does not honor society’s constant values is not sufficiently mature to undertake the reshaping of that society.” Walzer draws the right conclusion: “Criticism is most powerful . . . when it gives voice to the common complaints of the people or elucidates the values that underlie those complaints.”

Note the twin obligations Walzer imposes on the critic: the democratic obligation to voice “common complaints” and the intellectual obligation to elucidate values. The latter can be quite subversive of accepted understandings, exposing as it typically does the ways in which a society ignores or violates the values it claims as its bedrock.

Few leaders better embodied the patriotism inherent in embedded criticism than Abraham Lincoln and Martin Luther King, Jr. Both drew on the insistence of the nation’s founding document that all men are created equal to launch social and political movements that revolutionized the country.

Lincoln spoke of “a new birth of freedom,” paying homage to a nation that had been “conceived in liberty.” He appealed to the nation’s “bonds of affection,” to “the better angels of our nature,” and he spoke of the country’s “unfinished work.” He was confident that the nation was capable of finishing its work.

King described the “promissory note” to the nation’s African-Americans, a pledge to the “unalienable rights of life, liberty, and the pursuit of happiness.” King’s declaration that the note was “a bad check, a check which has come back marked ‘insufficient funds’” was a classic exercise in the elucidation of values. The nation’s sin originated not in the values that lay at its core, but in its failure to apply those values consistently. Therein lay King’s hope. “We refuse to believe that there are insufficient funds in the great vaults of opportunity of this nation,” King declared. “So we have come to cash this check.”

But as King knew, demanding that the check be honored is only a first step. And democratic politics is an ongoing commitment. A single election campaign, however exhilarating, is just the beginning of engagement. Moreover, embedded critics—their ranks include, but are not limited to, academics and intellectuals—have a necessarily ambiguous relationship to power.

Here, the differences between Lincoln, the politician, and King, the prophetic activist and critic, are clear. The politician focuses on the work that can get done and is called upon to have a realist’s sense of the limits of the possible. The critic is dogged in pointing to the work that remains unfinished, the reforms that are not adequate, the crooked places that have not yet been made smooth. “No, no,” King declared, “we are not satisfied, and we will not be satisfied until ‘justice rolls down like waters, and righteousness like a mighty stream.’” That is, to say the least, a standard that politicians cannot live by. But it is the standard to which they must be called.

The late Michael Harrington tried to square this circle by insisting that he was fighting for “the left wing of the possible.” It’s a powerful phrase because it asks activists and critics to keep in mind both of Max Weber’s categories for political action: the ethic of responsibility and the ethic of absolute ends.

AS A GENERAL PROPOSITION, democratic politics demands an ethic of responsibility. Persuasion is a long process, reform is always achieved in steps, compromise is inevitable, and moving forward is better than moving backward—even if the number of steps taken at any given moment can be limited by circumstances. A single election, a lone health care reform bill (even a big one), this civil rights bill, that labor law reform: all are steps down a road. They are not a destination.

But some critics will hold out and say they are not satisfied. They will call power to account even when those in power have some sympathy for their goals. They will lay out the requirements for a future better than the present even during times of progress—perhaps especially during times of progress.

Both kinds of critics are necessary. Both can, if they keep Weber’s admonitions in mind, contribute to democratic progress. Lincoln needed the abolitionists and the proddings of Frederick Douglass; Franklin Roosevelt needed the labor movement; John Kennedy and Lyndon Johnson needed the civil rights movement and Martin Luther King, Jr.

What is not an option in democratic politics is self-marginalization. Gestures are not enough. Flag burning does not cleanse a nation. The English department is not the White House. “Politics,” Weber wrote, “is a strong and slow boring of hard boards. It takes both passion and perspective.” This, at least, is something that most progressive intellectuals learned in the years between 2001 and 2009.

E. J. Dionne, Jr., is a university professor at Georgetown, a senior fellow at the Brookings Institution, and a syndicated columnist for the Washington Post. He is the author of Why Americans Hate Politics and, most recently, Souled Out: Reclaiming Faith and Politics after the Religious Right.

Symposium: Martha Nussbaum

By Martha Nussbaum
The relationship American intellectuals should have toward mass culture—television, films, mass-market books, popular music, and the Internet—will vary as much as the intellectuals themselves.

I think that it’s good if there are some intellectuals who get deeply involved with these media, because this will help intellectuals keep contact with a wider public. It’s much harder to do that now than it was formerly, given the decline of print journalism. But I hope not too many will become starry-eyed about these media and forget about the habit of slow reading, which is such a large part of good thinking. Sometimes the new media can help reading: for example, I now listen to novels on my iPod while I am running, and I “read” a lot more Trollope and Eliot than I used to. Often, though, the new media discourage people from reading books. I see this in many of my students, and it distresses me. We need to remind them that thinking is slow and rigorous, and that it does not always go well with the fast pace and the flash of popular culture.

On balance, the academy is a great help in furthering the engagement of intellectuals with American society. When we think of the political philosophers of the fairly recent European past, most of them had to struggle to make ends meet, because their radical ideas made it impossible for them to hold tenured academic positions or to be protected by the deficient standards of academic freedom that then prevailed. Rousseau’s books were banned, and he was not employable in a university. Kant held a university appointment, but he always had to fear, and sometimes encounter, the suppression of his writings. Bentham and Mill published, but they were not employable in universities because of their atheism. Think of how much more Mill could have written had he not had a day job. Even the highly respectable Sidgwick had to resign his fellowship because he found that he could not support all the Thirty-Nine Articles of the Church of England. (The rule was changed, and he resumed his fellowship, but he still had to conceal his sexual orientation, as Bart Schultz’s biography now shows us.) Closer to our own time, both Bertrand Russell and John Dewey encountered significant problems of academic freedom, though they kept their positions. The U.S. university system is not perfect, and we must always be extremely vigilant about potential denials of academic freedom. During the Vietnam War era, in particular, there were abuses. It is, however, better than most systems have been in most times and places.

Of course, these protections may lull intellectuals into ignoring issues of their time, and that is bad when it happens; but it is still better that the protections be there, in the strongest possible form.

Academic freedom is especially important because, I believe, the best way for intellectuals to engage with American society is for us to think, write, and teach. Sometimes some of us may take up actual political positions, but beware: the person who does so loses a lot of freedom. As I contemplate friends of mine who are serving in the Obama administration, I feel so lucky to have the ability to say whatever I like and to work things out the way I like to work them out. I think of what Cicero said about some of his contemporaries who refused to get involved in politics: they “claim for themselves the same privilege as kings—to obey nobody and to enjoy liberty, saying and doing whatever they please.” That is what I am doing, staying here in Chicago, and my friends are doing what Cicero thought one ought to do, serving the republic at a serious cost of freedom. He would consider my choice self-indulgent. However, I think that most of us serve the republic better by our writings than we would by going to Washington and giving up writing. Look at Cicero: his direct political action had little effect on history, but the books he wrote during his periods of exile changed the world.

I DO CONSIDER myself a world citizen, but I have also come to believe that Mazzini was correct: patriotism of the right sort is an essential source of political stability and, ultimately, of global concern. Mazzini saw that people are usually preoccupied with their own narrow affairs. So it is very unlikely that they will be motivated to serve all humanity. The idea of the nation, however, can be transmitted in a powerfully motivating symbolic form, calling the heart to the service of noble ends, and these ends, rightly formulated, can lead on to the service of all humanity. Of course, most patriotism is not like that, but it can be: look at Nehru’s “Tryst with Destiny” speech for a beautiful example of what I have in mind. One may also find this in the oratory of Lincoln and Martin Luther King, Jr.: all used a resonant and moving idea of the nation to attach people’s hearts to abstract moral values that ultimately acquire a cosmopolitan significance.

Martha Nussbaum is Ernst Freund Distinguished Service Professor of Law and Ethics at the University of Chicago, appointed in Law, Philosophy, and Divinity. Her latest book is From Disgust to Humanity: Sexual Orientation and the Constitution.

Symposium: Leon Wieseltier

By Leon Wieseltier
“I am human and I consider nothing human alien to me”: this statement has always struck me as preposterous. Of course there are human creations and activities that are alienating, or worse. (The famous sentence in Terence’s comedy is in fact spoken in bad faith, as an excuse for an obtrusive neighbor to intervene in a matter that is none of his business.) And to the inventory of alienating human productions one must add a good deal of American mass culture—for its transformation of a citizenry into an audience; for its hardening of an entire population toward the most obscene representations of violence, which we call entertainment; for its grotesque sexualization of an entire society, which has the effect not least of degrading sex, even dirty sex; for the mental passivity inculcated in millions of people who are helpless before its big and little screens, and who mistake screen-experience for experience; for the vicarious and self-estranged character of existences that are fascinated by the celebrity culture; for the surrender of people’s confidence in their own judgment as a result of its barrage of pseudo-expertise and pseudo-authority—I could go on.

But I do not want to be mistaken for a snob or a prude. All of the above debasements notwithstanding, I can think of three good reasons for intellectuals to engage with popular culture. The first reason is humanism: there are wise and deep expressions of the human spirit in popular film, popular music, and even television. High culture has always found inspiration in low culture—Romantic music would be inconceivable without folk songs and folk dances—owing to the discovery by great artists of the human truth in popular forms. And not all of those forms can be high-mindedly reduced to “mass culture.” Is jazz high or low? The question answers itself. Whether or not Monk is like Debussy, he sure as hell is not like Kanye West. The second reason is criticism: if millions of Americans are kindling to a song or a movie, anyone wishing to understand America must become acquainted with that song or that movie. This is the case also with certain (but not all!) bestselling books. The social and cultural critic must be a traveler through the realms. The third reason is hedonism. There is pleasure, because there is life, in mass culture. “I want to be/at least as alive as the vulgar,” Frank O’Hara declared in a poem called “My Heart.” For all I know, he wrote that line at his desk at the Museum of Modern Art, in the very epicenter of cultural mandarinism. And there is no use denying that lifelessness may also be found in high art.

The point, I suppose, is never to confuse the spheres. The championship of mass culture by intellectuals must be vigorously challenged when it is done as an attack upon the legitimacy of the categories and the distinctions—for a leveling end, as yet another gospel of relaxation; or to establish irony as the highest value of culture; or as the cultural program of a political ideology. I must confess that I regard intellectuals who are immune to the power of Winterreise or The Flaying of Marsyas or Modern Love or The Four Temperaments as incomplete intellectuals, insofar as they cannot grasp such refinements of structure and meaning and make of them refinements of their own souls. I think that the life of the mind should be soulful; but that is my own inclination. Otherwise, as I say, protect the differences, find truth and beauty where you can, and slum on.

ON THE QUESTION of the academy, may I take an incomplete? (My rant—excuse me, my meditation—might wound some people I admire and even adore.)

AMERICAN INTELLECTUALS should participate in American politics truthfully, and with a lasting scruple about the integrity of argument. Alone or in a gang, they should say what they really believe, and proceed to justify it. They should espouse their ideas as if their ideas really might come to power—they should neither despise power nor worship it—and they should do so in a language that ordinary Americans can understand. Stifle the aporia and leave the hybridity at home. The analysis of a bill is not the analysis of a poem. They should learn to respect policy, which is less lofty and glamorous than politics; and they should make their contribution in a manner that may be useful to the makers of policy, even if only indirectly, in the clarification of the philosophical foundations. There is no shame in partisanship, though there is often stupidity, and intellectuals in politics have a particular obligation, obviously, not to be stupid. They should deny themselves the ugly thrill of populist anti-intellectualism: derisive talk about elites and the “new class” and so on. The anti-intellectualism of intellectuals is especially awful, and none of us work in the mines. They should not condescend to Washington, as if they themselves live in Athens. Above all, they should never lose their heads. (The ecstasy about Obama was disgraceful, even though he was supportable. Ecstasy is not an intellectual accomplishment, which is precisely why it is so often sought.) They should always be prepared to be disappointed, or proved wrong. They owe their loyalty to principles, not to persons.

A PATRIOT OR a world-citizen? No, a patriot and a world-citizen. I do not see a contradiction between them, in the way that I do not see a contradiction between the particular and the universal. They seem equally real and equally reliant on each other. Who would want to be, who could be, only particular or only universal? We are too compounded and too complicated for single loyalties. Single loyalties are a human deformation. I have double, triple, quadruple loyalties, and would gladly consider more; an addiction to allegiances, which I can justify not only sentimentally but also philosophically. I am loyal to two countries; to a variety of languages and cultural traditions, though I do not belong to some of them, and am not adequately educated in some of them; and to many principles. (A principle can sometimes feel like a country.) An almost embarrassing number of things—beings and entities and ideas—rightly claim me. I am willing to sacrifice for all of them—not in the same measure, of course, but I cannot be indifferent to the predicaments of any of them. I am willing to “prioritize,” but not to shrink. The list of all that is valuable to which I am indifferent is always too long. All these allegiances I regard as obligations of self-transcendence. Some of them originate in love, some in honor, some in both; and some were acquired as a gift of experience.

I understand that such a collection of causes runs the risk of promiscuity, of a sort of consumerist approach to the most precious things in life; but my conscience in this regard is pretty clear. The difference between dilettantism and multivariousness is work. At this late date in the discussion about identity, almost everybody recognizes that identity is multiple and plural, but not everybody recognizes the burden that this bounty represents—the chores of complexity. Identities are not received, they are chosen; and even the ones that are received need to be chosen. In this regard, too, the great sin is passivity. I understand also that all these allegiances may not add up, but the ideal of adding up is a Hegelian illusion that infects individuals and communities with a totalizing tendency. The political effects of totalization in one’s picture of the self and the world are well known, but even before one laments the danger of the consequences one should lament the falsity of the concept. A monistic account of human existence is a lie. This lands us in the lap of what philosophers call the problem of the incommensurability of values, but I have never been overly tormented by this. For a start, the tension between values that do not go together is a foundation for the development of intellectual judgment. The standpoints expose each other’s limitations, and so they serve as instruments of criticism. Criticism comes naturally to a pluralist universe. There are so many aspects, so many measures.

Isaiah Berlin taught that there is a tragic dimension to the conflicts between values, because they all cannot always go together, and in politics the stakes of decision are sometimes high. I accept his teaching gratefully. I do not believe that we have escaped the rigors of zero-sum any more than I believe that we have escaped the law of the excluded middle. If human character did not change in 1910, as Virginia Woolf foolishly said it did, neither will it change in 2010. But I do not see only tragedy in “value pluralism.” I see also delight. We are not all presidents and prime ministers, and the antinomies of an ordinary life thoughtfully lived are signs also of its richness. The more attachments, the more sorrows; but the more attachments, the more joys. The opposite of patriotism is not cosmopolitanism. The opposite of patriotism is Buddhism. I do not say this facetiously. One who is immersed in the plenitude of commitments can easily understand the fantasy of shedding them. But I used to dream of escape more than I do now. Now a little stillness goes a long way. Now I have a sensation of stewardship, of responsibility for the building and the maintenance of certain institutions of meaning—the elders are almost all gone, it is our watch now, on our world. Which is to say, the world’s hooks are in me for good. So show me the flags.

Leon Wieseltier is the literary editor of the New Republic.

Symposium: Katrina vanden Heuvel

By Katrina vanden Heuvel
Some questions are really not worth asking, even as they nag. What relationship should American intellectuals have toward mass culture—television, films, mass-market books, popular music, and the Internet—may be one of them. Before answering it, let me first attack any effort to do so.

I don’t think we have a recognizable group of American intellectuals of real political weight, at least not intellectuals of the sort celebrated by and occasionally inhabiting the old Partisan Review. That is, we don’t have an identified bunch of very smart and socially interconnected people—of course, often neurotic, passionate, and sometimes delusional—who judge their life by its contribution to human science or art and who see themselves as the guardians of its standards before a debasing and resolutely meretricious mass culture.

One reason we don’t is that, whatever the searing inequalities of the American economy, talented Americans of almost every conceivable background have access to higher education. Another is that technology has erased virtually all barriers of entry to broadcasting individual opinion. A third is that America now lacks anything like a responsible business elite or a working class that might provide a natural audience. The well-heeled abandoned this country some time ago. The democratic public, without much organization or political leadership, is still not fully formed. We don’t have standards of public discussion in this country. We don’t even have political debate requiring rhetorical regard for the public interest.

I also don’t consider myself such an intellectual. I consider myself a reasonably intelligent editor and publisher, running an independent magazine of opinion, whose chief social interests are political. I’d like to make a contribution to achieving this country and making peace in the world. The most immediate way I know to do that is by getting the next issue out, making it as interesting as possible, and by disseminating its values and opinions on radio and television.

Perhaps I mistake myself. The word “intellectual” traces to the Dreyfus affair. It was the new collective name taken by those diverse writers and artists who, in sudden and articulate concert, condemned the injustice of his treatment. I’m sure I would have joined them. A generation ago, Noam Chomsky said the responsibility of intellectuals was to tell the truth and expose lies. I try to do that every day. However, I think of intellectuals as not only fearless truth-tellers but as people materially contributing to that truth by advancing science or art. I don’t do that.

I also don’t think “mass” means much these days. Most of the commercial boundaries between high-brow and low-brow culture have long since dissolved. Certainly that distinction isn’t policed by the technologies themselves. It’s not as if book readers are high-brow and blog readers are low—after all, Stephenie Meyer’s Twilight novels and Glenn Beck’s screeds dominate book sales while scholars like Juan Cole and Michael Bérubé reach their largest audiences via blogs. Cornel West has over 12,000 Twitter followers. And how would one describe the phenomenon of Oprah’s Book Club, which can instantly put works by William Faulkner and Leo Tolstoy on the middle-brow New York Times bestseller list through the magic of TV talk and paperback mass marketing? Did “high-brow” opera (Puccini’s “Nessun Dorma” from Turandot) become “low-brow” trash when cell phone salesman Paul Potts turned it into a global hit on the reality TV show Britain’s Got Talent? Or was it when Luciano Pavarotti popularized it during the 1990 soccer World Cup games? These distinctions have long since stopped making sense—if they ever did.

These caveats entered, I guess my answer to the question would be “critical embrace.” And in giving it, I’ll use “intellectual” in the broad sense of “a thinking American with interests in public affairs,” to include myself.

I take it to be manifestly crazy, even were it possible, for such intellectuals to ignore or shun mass culture. It’s too important. That’s where most Americans, especially but surely not only the young, get most of their information, opinions, and general take on politics. Their other source is their friends, who are generally watching and reading the same things. So of course we should engage. Frank Rich must have written a dozen columns using AMC’s Mad Men to frame these times. And 24 may have influenced how a generation thinks about torture. As Judge Sonia Sotomayor admitted of herself, Perry Mason and Law & Order shape popular thinking about law. Before Twyla Tharp choreographed to Billy Joel, the Joffrey Ballet danced to music from the Purple Paisley god himself—Prince. The cheapest forms of popular culture (comic books, TV, pop music, and so on) have forever shaped the imagination of current and future artists and presidents and offered the consolations of escape and control and pleasure.

It’s also true that the profit imperative and relentless consolidation of corporate media, and more than a little of human nature, means that much of mass culture is utter junk or worse—indeed degraded, inhumane, politically backward (sexist, racist, materialistic, and so on), or just stupid. But nothing’s new here except its totalizing reach, total because of the continued decline of alternative sources of authority. That’s the way of a depoliticized capitalism: no real secular community, politics and society as largely spectacle, mass privatization of civic culture. It may be that changes of degree have produced a change in kind, that people have actually become lobotomized, not just idiotically entertained. But I doubt it.

For one thing, the notion of “mass” in our culture is transforming before our eyes. At the ground level, the fact that the costs of broadcasting and information retrieval have dropped to near zero, together with the limitless possibilities of peer-production and self-organizing made available by the Internet, is the greatest social-technology fact of our time. The Internet has already changed political campaigning and social movement organization and advocacy. It is well on its way to transforming government and almost all critical economic relations: the structure of the firm, the divisibility of property rights, national and local strategies of economic development. I think this opens enormous possibilities for progressives. They should stop congratulating themselves for cottoning to the Internet just a tiny bit faster than the Right and devote themselves to collectively mastering and diffusing liberation technologies.

A LITTLE HIGHER UP, the expansion of the number of commercial broadcast channels and segmentation of audiences has of course increased the dangers of only listening to oneself. But it has also manifestly opened up a host of mid-sized audiences for good content. I can get as much opera and political satire as I want, along with home-shopping networks and reality TV. What I miss from television is my own Fox, a source of intelligent analysis and widely resourced coverage that I can rely on, in the same way its current audience relies on its lethal distortions. But even here, I think we’re a bit better off than a while ago. Would Rachel Maddow, a decidedly intellectual, openly gay, and progressive commentator, have commanded an audience of any size ten or twenty years ago? Would an academic like Charles Ferguson have gone into business, and then decided to make an Oscar-winning political documentary? I doubt it. We’re also seeing the rise of a new generation of intellectuals who freely combine high and low culture, demanding and easy analysis, in ways that find a decent-sized audience. Whatever else one thinks of him, Michael Moore exemplifies this, as does Spike Lee, whose mostly rigorous documentary When the Levees Broke was broadcast on HBO the same year his crime drama Inside Man hit theaters.

This is all to the good. They and others are producing smart, nuanced, thoughtful mass-culture products that also find an audience, even if that audience isn’t the same size as, say, the one for The Da Vinci Code or Dancing with the Stars.

So I don’t worry about intellectuals being able to penetrate mass culture. I assume the next generation of them will assume, with everyone else in our Internet-united humanity, that there are a variety of technologies available to make their arguments and art and a variety of genres and styles that can be mixed in making it. What I am worried about is that their contribution will become merely another form of niche entertainment, with no real bite. I want the public itself to have the information and capacity to act on arguments, and worry that that is diminishing. This is almost entirely a political matter, and this is where the critical part of the embrace comes in. We need to declare, to one and all, that in the mass of commercial speech we also need easily accessed, publicly driven or public-minded sources of news and information. In news, it’s past time that the United States join the rest of the world in having some public-minded alternative to the major commercial networks. And in the United States and the rest of the world, I think it essential that we stop the commercial erosion of the democratic global space opened up by the Internet. Here I think intellectuals have some intellectual work to do—to design a more public-minded communications system than the one we have, while keeping the barriers to entry for its use low, and to design an intellectual property rights regime, globally, that will not choke off humanity’s current capacity for improvement. In truth, I think most of the work is political—to make the case for why that’s important at all. As with other public goods, a democratic media and communications system will be hard to achieve without a public.

Katrina vanden Heuvel is editor and publisher of the Nation.

Symposium: Michael Tomasky

By Michael Tomasky
I’m not qualified to answer question two, so consider this a response to the other three questions.

Internet, film, television, and popular music are rather broad categories, each containing nutritious wheat and faddish chaff. By “television,” do we mean The Wire or Dancing with the Stars? By “Internet,” do we mean serious journalism or pornography? But without wasting space on a virtually endless inventory of such distinctions, I say, Embrace!

On balance, these are overwhelmingly liberating and progressive forces. At this point in history, I can’t imagine that there’s even any argument about this. In fact, I thought it was more or less settled by the 1990s, let’s say, that the antique left-intellectual disdain for popular culture had been rather embarrassing. Go back and read, as I once did, the initial grudging and snobbish assessments of the Beatles in the highbrow journals (“No Soul in Beatlesville,” ran the Nation’s headline in March 1964). How silly does that look today, as do so many of the Left’s indictments in those years of television and other expressions of popular culture. At the time, it was all dismissed as masscult drivel. But through history’s lens that view holds up about as well as 1948 predictions that come Election Day, Henry Wallace just might surprise some people.

Can it be any clearer today that these forces are our friends? The fundamental fact of the Internet is that it releases both information and knowledge to people who haven’t had access to them. Don’t count me against that. I admit I don’t listen to much new music, but from what I gather the singers and rappers are still for the most part counseling their young listeners to question authority and smash convention and engage in kindred healthy activities. Film, or at least American film, seems to me to be in a bit of a moss-gathering period in general—too many projects aimed at fifteen-year-old-male Cowper’s glands, but even some of those can have a certain subversive wit about them.

It is for television, though, that I reserve my clearest enthusiasm. If you still don’t watch even the critically acclaimed shows: do. They’re good. Much television writing is good today. Boundaries are pushed on TV that even movies shy away from. And Jean Baudrillard turns out to have had it wrong: I say television creates real communities. Friday Night Lights is no false simulacrum. It’s practically as real as real life—a show about high-school football that’s also about race and class and physical handicap and angst and sex (fraught sex between teenagers, mature sex between their parents) and why people fear things they don’t know. When I watch it, and know that millions of others are—and when I visit its Web site or read chat rooms devoted to the show—I become a part of something. (On the other hand, Baudrillard is probably right about television news, which people should watch only with deep skepticism.)

IT SEEMS TO ME the responsibility of intellectuals today to be engaged with the world and with our country. I don’t mean intellectually engaged, which is a given, but literally and physically: get out there. Go to a Home Depot, an Applebee’s, a courthouse square, a small town’s theater company production; indeed, a high school football game.

I am mindful here of the quote, which I read too long ago to have down exactly, from Philip Rahv, who left Manhattan to take a drive around America and reported back to William Phillips with horror about the “monsters out there.” That kind of thinking is now a caricature, but I’d venture that many liberal-left intellectuals still have some sympathy with the general idea. That may have been okay then, when New York was the cynosure of virtually all intellectual activity in America and could dictate tastes and mores to the rest of the country. But that is not the case now. It is also not the case any longer that New York is all that singular (there’s a used-book store in Bethesda, Maryland, as good as any I currently know of in Manhattan). Our people are everywhere. Outnumbered, often—but everywhere.

I go into all this because I think it is the first prerequisite for true political participation these days: engage, investigate, see. There’s a bit of tactical politics in my position. The right gets such vast mileage out of its jibes about liberals and the coasts and “flyover” country and our alleged contempt for regular folkways. So we should do what we can to neutralize those arguments.

But that’s just a part of it. The larger part is simply that today’s America is a very interesting place. No, I confess, I’m not moving to Alabama anytime soon. I live in Montgomery County, Maryland, which is as blue as the Upper West Side of Manhattan, and I’m staying put. But the spread of information and education; the impact of immigration, which has delivered people from all over the world even to small towns; and, yes, the liberalizing impact of the popular culture discussed above—these and other factors have transformed the country. The nation represented in the news media—blue versus red, divided, irate—is not really the nation that exists in most places. It’s worth going out there and seeing this.

It follows from the above that of course I am a patriot. America is vast, weird, anomalous, and I love it. I admire the principles to which the nation aspires in its better moments. Yes, we’ve done a lot of harm in the world. A lot. But it’s also the case that we cannot—even Barack Obama cannot, it turns out—settle ancient scores that are not in the first instance about us.

Again, I think the old and stereotypical liberal-left view of patriotism as somehow jingoistic or simple-minded could do with some revisionism. And now is an especially good time for it: the Republican Party and the Tea Party Right, accusing our side of harboring infectious, alien schemes, are paradoxically sounding crazier and crazier (and more un-American) with each passing year, month, and week. When a sitting governor (Rick Perry of Texas) idly muses about secession; when a member of Congress (Michele Bachmann) says that Americans should refuse to participate in the Census; when other Republican members of Congress refuse to say whether Obama is a rightful citizen, they are themselves taking positions that average folks recognize as alien. There’s an opening there to redefine patriotism and rebrand it, as it were, with our stamp.

I am a world citizen, too, emphatically so. And maybe the time is right to fuse the two concepts a bit more than we have. It may infuriate Glenn Beck and his followers, but again, I believe most Americans (perhaps just barely, but yes, most) understand that so many of our challenges—the environment, poverty, development, disease—are transnational and global. Patriotism, today and in the future, includes recognizing this and acting accordingly. A fight this will be, no doubt of that. It is a fight our side will win eventually.

Michael Tomasky is editor of Democracy: A Journal of Ideas and American editor at large for the Guardian.

Symposium: Katha Pollitt

By Katha Pollitt
I want to focus on the question of patriotism. If an American child and a Peruvian child were drowning, would you rush to save the American child first? If you were in charge of feeding an international crowd of travelers stranded by a disaster, would you give the Americans extra pie? Would you refuse on principle to marry a foreigner? Of course not. In our lives as individuals we would find it unfair, bigoted, even bizarre, to give automatic preference to another American. What about fairness, equality, merit, relative need, and simple human feeling?

But when one speaks not of Americans but of “The United States of America” everything changes. An enormous collective-emotion machine inculcates in us virtually from birth the notion that we have a moral obligation to put our country’s interests first, to love the United States above all other countries, and indeed above all else, not only because it is our home—because actually, people don’t always love their home, and sometimes they have good reason—but because it is the best. We understand, of course, that the inhabitants of other countries are under a similar obligation: the Japanese are supposed to love Japan the most, and Jamaicans Jamaica. That is the paradox of patriotism: everyone is supposed to think her or his own country is “the best,” but only one set of inhabitants can be right. (And we all know who they are. Hint: not the French.)

In the United States, Left and Right both make much of their love of country. True, they don’t agree about what makes the USA so superior: conservatives insist they protect the “Real America,” where family farms and family values resist the incursions of Hollywood, socialism, and sex. Progressives counter with what I call Abstract America, America as the sum of its best parts: constitutional democracy; freedom of speech; Huck Finn and Leaves of Grass; fighting Hitler; shared prosperity; a general sense of optimism and openness; and, most important, ideals of social justice extended through struggle to more and more people (blacks, workers, women, gays, the disabled). America has faults, of course, and fixing them is what the Left is all about. But the faults never challenge the plot line of continual self-improvement (as if other nations have not also become more tolerant, inclusive, and fairer over the centuries). A leftist who wants to make a systematic critique of actually existing America has to tread warily. Attack the religious Right too vigorously and you’ll find yourself accused of elitism—in the Nation, no less. Suggest that bombing villages is not the way to liberate Afghan women and to “liberal hawks” you’re a cultural relativist. The self-described reasonable Left—what Michael Walzer in these pages called “the decent Left”—regularly attacks as America-hating what it calls the “Chomskyan Left,” that tiny segment of the Left that espouses a less forgiving narrative—America as an imperialist superpower, fomenting wars, supporting autocratic regimes, and despoiling the world. America-hating—it’s the ultimate, unanswerable charge.

I know what you’re thinking: patriotism is not the same as nationalism. Vats of ink have been spent distinguishing the two, but how different are they really? If I say America should run the world because we’re Number One, and you say, Well, it shouldn’t run the world all by itself, but still, it’s unique in world history because its identity is so fluid, or it was founded on ideals of equality and freedom, or it’s a fabulous generator of cultural innovation, aren’t we both saying a version of the same thing: America is superior because I was born there? If we were Romanians or Gambians or Uzbeks, we would trumpet just as loudly our unique and special way of life, our history and customs and beautiful landscapes (for which people curiously take credit, as if they put the mountains and forests there). It may be natural to love one’s country, but it’s less a noble virtue than a habit, the way people tend to like the food they grew up with, even if it’s haggis or lutefisk or roasted rats on a stick.

Why is patriotism bad for America? It prevents us from seeing ourselves the way others see us. To us, for example, the detention without charges or trial of some six hundred prisoners in Bagram is a small item in the ongoing and mostly uplifting story of American justice. That’s not how it looks in the Muslim world. We’re constantly being surprised that the rest of the world doesn’t automatically love us. We might see why more clearly if we weren’t so in love with ourselves.

Patriotism, with its narrative of progress, also makes it hard to see ways in which we’re moving backward. Our class system is becoming more rigid, not less. We imprison more people per capita than any other country—that’s new. Public schools are rapidly resegregating. Yes, we have a black president, and that says something truly wonderful about our ability to overcome our past. But it doesn’t erase the bigger picture, which is that in important ways most people in other wealthy, industrialized countries have a better chance of flourishing than Americans do. I’m thinking of universal health care; good schools; access to social services; much lower rates of violence, especially murder; much less poverty, child poverty, and homelessness; safe jobs at decent wages; dignified support for single mothers; more leisure time; less unwanted pregnancy and childbearing; and less religious folderol.

Our commitment to the patriotic progress narrative means we end up living in the past, like Italians who would rather reminisce about Garibaldi than face up to Berlusconi.

Finally, patriotism lends itself to an Us versus Them worldview that fuels our grotesque military budget and all too easily leads to war. That’s what happened after 9/11. Those who challenged the Bush administration’s self-serving account (they hate our freedom) were demonized as unpatriotic, even for saying the most innocuous, self-evident things, such as that Mohammed Atta and the other perpetrators were not cowards. Even feminism can be shanghaied into the narrative of American exceptionalism as justification for war.

Pundits usually indifferent at best to women’s rights (or in the case of David Horowitz actively hostile to them) point endlessly to the fact that American women have freedoms women in Muslim countries do not. Don’t you wonder why women in India, China, Africa, Latin America, Russia, or for that matter Utah, never enter into the question?

My Nation column after 9/11 about not flying the flag was widely attacked as anti-American, cold-hearted, foolish, and ill-advised.

I’m sure I could have written more carefully and sensitively. The tone of that column was unnecessarily prickly, and I went too far when I identified the flag with racism and jingoism, because of course it has many meanings, including anti-racism and rejection of ignorant chauvinism. But my central point was, I believe, a good one: we need to think in a larger framework than our own country and be wary of appeals to patriotism in a crisis, because when the flags come out, people tend to turn off their brains, and the next thing you know, we’re at war. In fact, that is what happened. That is what is still happening.

WHAT IF WE took seriously the idea of one world? Today’s most serious problems are global—climate change, environmental ruin, the north-south divide, the oppression of women, famine, disease, and overpopulation. They cannot be solved if every country gazes lovingly in the mirror and refuses to give up any of its privileges. As Americans, we need to stop living in a Ken Burns documentary and take more seriously the fact that politically we are just one nation among many, and economically we are 5 percent of the world’s population using 25 percent of the world’s resources—a gigantic national potlatch of overconsumption and waste.

I realize that criticizing patriotism generally doesn’t go over very well, let alone telling people they’re not so great and even a bit greedy. But what has all our flag-waving done for us in the end?

Maybe that’s the question we need to think about.

Katha Pollitt is a columnist with the Nation. Her most recent book is The Mind-Body Problem, a collection of poems.

Symposium: Jackson Lears

By Jackson Lears
Coming of age during the Vietnam War, I cut my cultural teeth on an exalted idea of intellectuals. They were the people who challenged the official pieties, especially the easy equation of power and virtue, the American civil religion that justified imperial misadventure. Sometimes, even at my conservative southern university, they were my professors—especially the historians Paul Gaston and Bill Harbaugh. By exposing the mendacity of American policy, they fostered a critical spirit in their students. By challenging the equation of anticolonial nationalism with Soviet communism and exploring the futility of foreign attempts to crush a popular insurgency, they gave us an alternative way of seeing the U.S. role in the world.

Of course, there were prominent counter-examples of intellectuals at work. Washington was full of mandarins who defended deranged policies in the name of “pragmatism” or its cousin “realism,” who cavorted in counterinsurgency fantasies, defended doctrines of surveillance and secrecy, and constructed rationales for the nuclear arms race. These were the likes of Walt Rostow, Herman Kahn, McGeorge Bundy, Henry Kissinger, and their legions of imitators. These were the devotees of the “crackpot realism” identified by C. Wright Mills. As a naval officer and a participant in the anti-war movement, I encountered the calm insanity of the crackpots at close range—the denatured language deployed to justify mass slaughter, the “pragmatic” preoccupation with technique rather than purpose. For years I read and re-read Randolph Bourne’s critique of John Dewey’s justification for American entry into World War I, convinced that the misuse of pragmatism was one of the original sins of American intellectual life.

Still, for a while it seemed that there were vital alternatives to the technocratic discourse of national security policy. Criticism of imperial excess even penetrated the halls of Congress, energizing the Church Committee and other inquiries into executive crimes. But serious re-examination of American empire proved impossible to sustain for long; crackpot realism was too thoroughly institutionalized, too pervasively intertwined with entrenched ideological needs and economic interests.

As hopes for a New Left faded, my contemporaries and I became preoccupied with the failure of countercultural protest, the ease with which it became trivialized and reabsorbed into the mainstream of consumer culture. How did dominant groups defang dissent? How—apart from guns and money—did the ruling class rule? Graduate school was a good place to raise these questions during the 1970s. The air was full of rueful re-examinations, but also fresh possibilities. Christopher Lasch was penning spirited (if sometimes unfair) polemics against “the fake radicalism of the counterculture”; E. P. Thompson and Raymond Williams were reviving Marxism with a cultural emphasis, groping their way toward what Thompson would call a “greenish libertarian socialism”; Eugene Genovese and my own mentor, David Brion Davis, were exploring the writings of Antonio Gramsci, unraveling the concept of cultural hegemony.

What they discovered was that cultural hegemony was not to be confused with social control; it was about legitimation, not manipulation. The problem was not that ordinary people were brainwashed into accepting policies against their interests, but that certain ideas and values were simply not admissible into the charmed circle of “responsible opinion.” The ruling class ruled by keeping some ideas out of circulation, as gatekeepers in the mass media and other cultural institutions declared them “tasteless,” “irresponsible,” or simply “unrealistic.” Of course it was not simply a conspiracy: journalists and other gatekeepers were acting in accordance with their own professional standards and practices, as well as their own beliefs; policy makers serving elite interests persuaded themselves that they were acting in the service of society and indeed humanity at large. But the consequence was that challenges to established hierarchies were either trivialized beyond recognition—as countercultural protest was transformed by journalistic convention into a riot of sex, drugs, and rock ’n’ roll—or else screened out entirely from public discourse—as single-payer health care has been eliminated from contemporary debate.

THE CONCEPT OF cultural hegemony helps illuminate the disappointing intellectual history of the last thirty years—disappointing, at least, for anyone who believes that intellectuals have a responsibility to be critics rather than servants of power. Since the 1970s, the Right has deployed the lessons of Gramsci with devastating success. This was a conscious strategy, articulated in 1971 by Lewis Powell (who would soon be appointed to the Supreme Court by Richard Nixon) in a memorandum to the U.S. Chamber of Commerce. Powell was alarmed by what seemed to him to be the dominance of left-wing views in universities and the mass media. He urged the friends of capitalism to retake the field by establishing right-wing think tanks, endowed professorships, and media outlets. All of this happened, of course, with a vengeance.

The resurgent Right was quickly crowned with electoral success in 1980: the ascendance of Reagan ratified and reinforced their ideological counteroffensive. The change in the atmosphere was palpable and unmistakable, as reporters from the Washington Times began appearing on television talk shows alongside “fellows” from the Heritage Foundation and the American Enterprise Institute. Within little more than a decade, the fulcrum of debate shifted sharply to the right: “economic reform” was redefined as deregulation of business; “special interests” as welfare recipients. Views that had once been considered the province of a revanchist fringe—free market fundamentalism, bullying militarism—acquired a fresh patina of respectability. By the end of the century the Right had reframed public discourse in the United States. Amid the fearful chauvinism of the post-9/11 era, George Will began to look like a moderate, and the Washington Post began to look like the Washington Times, with an op-ed page that resembled a public relations document from the Pentagon. People who called themselves intellectuals, in this brave new era, were often little more than craven apologists for wealth and power.

And where was the Left during this ideological putsch? On talk shows, the “liberal” position was usually represented by a gray figure from the Brookings Institution or some other centrist think tank, who spoke in the soporific idioms of managerial efficiency. In mainstream politics, liberalism lost its moral fervor as it dissolved into technocratic and therapeutic jargon: the Clinton-Gore administration caught the new tone perfectly. Intellectuals, in this world, were trivialized into “policy wonks.”

What was left of the left intelligentsia retreated into the academy, where the tragedy of 1960s cultural politics was replayed as farce. Partly this involved the dominance of identity politics. Its sources were compelling and wholly understandable—the desires of women and minorities to vindicate and explore a separate sense of self, independent of the hegemonic standard established by white males. But one unintended consequence of the quest for alternative identities was that it created a new kind of fragmented, interest-group politics, unmoored from any larger vision of the good society. Cut off from engagement with actual policy debates (the province of “wonks”), the left intelligentsia retreated into academic politics—micromanaging curricular reform with ferocious intensity, debating the finer points of “cultural theory” with scholastic precision.

The regnant modes of theory shifted as well. Cultural Marxism fell into disuse. There were many new theorists on the block, but the most influential across disciplines was Michel Foucault, a subtle and challenging thinker whose work was in many ways tailor-made for understanding the new forms of coercion unleashed by the “war on drugs” and the emerging surveillance state. But another side of Foucault proved more broadly influential: for many left academics, he became less a theorist of the surveillance state than an advocate of Nietzschean individualism, whose vision of “heterotopia” celebrated myriad sites of resistance to repressive authority rather than any larger notion of commonweal. All of this comported well with the emerging cultural politics of the academy, which in many ways constituted a mirror image of free-market individualism. From the mid-1980s on, it was possible to discern a kind of left-wing Reaganism among academics in the humanities and social sciences, most visible in the postmodern tendency to celebrate consumer culture as an arena of choice, liberation, and self-creation. Fearful of seeming to be puritanical killjoys, left intellectuals backed away from environmentalist critiques of heedless consumption. No wonder the Right had such an easy time establishing its cultural hegemony.

Now, we have been told, the reign of the Right is over—though it is a little too soon to celebrate. To be sure, the election of Barack Obama signified a great triumph of insurgent democracy, and a deliverance from the unfolding coup d’état conducted by the Bush administration. But the dominant political culture created by the Right has been challenged, not fundamentally changed. Obama’s early policy decisions reveal the inertial pull of powerful institutions—investment banks too big to fail, national security bureaucracies too sensitive to reveal their secrets, Pentagon contractors too hungry to forswear their appetite for overseas bases. And Obama’s cautious pursuit of bipartisanship may be another misuse of pragmatism.

Still, this is a moment of possibility. The Right has disgraced itself by its inability to govern and, even more, by its disregard of international law and fundamental constitutional traditions. Not since the Great Depression has there been such an opportunity for the Left: a chance to make politics more than a matter of managerial technique—to take the moral high ground, to reassert the claims of commonweal. That is where the intellectuals come in, to articulate that larger vision. Or so we can hope.

Jackson Lears is editor of Raritan and author, most recently, of Rebirth of a Nation: The Making of Modern America, 1877-1920.

Symposium: Alice Kessler-Harris

By Alice Kessler-Harris
I count myself among those disappointed in Barack Obama’s presidency so far. I had not expected miracles, but I had hoped for a more dramatic turnaround in our politics: for an end to the war in Afghanistan; a rapid closing of Guantánamo; and a denunciation of torture, rendition, and the endless pursuit of an elusive and protean terrorism. On Election Day last year, I anticipated a more generous health care bill and a restoration of modest regulations on banks and financial investment firms. Obama led us to expect these things of him when, in his mellifluous and powerful voice, he advocated “change you can believe in.” I understood candidate Obama’s call to be not simply one of political style—not simply a cry to throw the scoundrels out. I wanted to believe that it was also a call to recalibrate our moral compass. As an academic, an intellectual, a student of twentieth-century American history, I resonated to the call.

I now think that we, we intellectuals on the Left, have failed our president and our country, that we share some of the responsibility for the political mire in which we are embedded. Candidate Obama raised some big issues. He called for a return to such constitutional principles as habeas corpus and for a redefinition of justice. He advocated a restoration of the rule of law in such arenas as civil rights and occupational health and safety. Above all, he promoted a spirit of fairness that would usher in a “post-racial” moment and redefine our commitment to a more egalitarian society. During the campaign, Obama, very tentatively, challenged the prevailing antigovernment perspective, suggesting that government could be a positive instrument—that it could enhance the safety of communities, promote equal opportunity, foster the greater good. But despite the fact that these images captured the imaginations of the voting population, the promise has been left dangling.

Since the campaign, these calls for a new direction have disappeared, obscured by the daily political battles in Washington as well as the failure of the Obama administration to adequately articulate principled and ethical aspirations. It is in this last area where I believe that left intellectuals could play a vital role. Over the past forty years, conservatives (intellectuals among them) shifted the popular spirit from a liberal commitment to social good (exemplified by the New Deal) to a massive mistrust of government. Using the language of freedom and individualism, and denigrating anything social, conservatives redefined perceptions of government from positive to negative. “Public” became a four-letter word connoting unnecessary taxes, poor service, regulatory constraints, and inefficiency. “Liberalism” became a political death sentence. “Democracy” lost its cachet, as conservatives convinced the electorate that getting more people to vote was merely a plot to expand the influence of the poor. Phrases like “right to life” and “family values” found their way into the popular media, embedded with meanings intended by a new Right and paving the way to new policies with astonishing speed. Without public protest, Americans now allow private companies to run their jails, employ mercenary armies to win wars, create almost insurmountable obstacles to the organization of workers, and reject funding for most forms of public welfare except for poorly paid work. They also define as tax increases any effort to modify a tax system that ensures huge income gaps between rich and poor.

Perhaps our loss of control over words and language explains why, offered a politician courageous enough to open up questions about the nature of our moral center, we have not pursued the opportunity. It isn’t just that we, on the Left, lack agreement about alternative goals and the policies that would achieve them. We agree on many things, including a restoration of balance between the public and private good, trust in government, the value of civil liberties and civil rights, and basic assumptions about fairness and justice. But we lack a language capable of capturing public attention. Because we offer no way of speaking to a larger audience, we have little access to the media and almost no capacity to shape public opinion.

I do not agree that, as Irving Howe once argued, the retreat of intellectuals behind the university walls that harbor and feed them has subdued our voices. As intellectuals, we have not shirked our responsibilities to raise important questions about what is “American.” In the last generation, historians (to name just one group of articulate social critics) have refocused scholarly debates to suggest the positive as well as the negative value of government in daily life. We have turned our accounts of movements like progressivism or the New Deal to reveal how effective government programs could and did enhance individual liberty rather than constrain it. We have produced a literature that explores the relationship of race and gender to legislation enhancing civil liberties and civil rights. We have fostered a wide-ranging debate about the economic and social circumstances that have led to military commitments over the centuries. And we have spearheaded discussions about immigration, homelessness, and the century-long efforts to achieve health care and other social policies. Together these add up to a new portrait of America, one that reveals the changing and expanding nature of the American Dream.

Our work (as well as our thought) has found its way into limited venues as the subject of research rather than as an inspiration to action. Fifty or sixty years ago, we might have argued that these outlets—magazines like Dissent, Partisan Review, Politics, and Commentary—could reach beyond the intellectual community to serve as conduits to the desks of presidents and politicians. But today, in light of powerful Internet media and a newly personal pop culture, the influence of these outlets has been diluted. Our work participates in a process effectively captured by the pundit who described the New York Review of Books as “The New York Review of Each Other.” Even outlets with higher circulation, like the Nation and the New Republic, don’t come close to reaching the millions who participate in blogs and Twitter, who read the mass circulation media, and who watch nightly television. This is not to denigrate their function, only to suggest that, in the modern world, to influence popular culture and to transform a mindset requires access to new cultural outlets. We won’t get there until we can play that game.

Look at how a multi-layered conservative movement made use of the mass media to alter public culture and construct a new public ethos around the meaning of the public good. A frequently cited case in point is Ronald Reagan’s brilliant use of the slogan “It’s morning in America” to capture the presidency. Milton Friedman’s “free to choose” comes a close second in the contest to engage the American imagination. And we are not just talking about slogans. We are talking about how words like family, tradition, and opportunity came to carry a loaded content; how others, like liberty, took on new and unrecognizable meanings; and how equality all but disappeared from the lexicon of politics. The most powerful example, however, is the notion of a “War on Terror”—an idea that one political group has invested with such vicious content that it carries the force communism once had to stir a population to action.

If the power to shape words and language is essential to capturing the media, it also poses something of a paradox. The ability of left intellectuals to speak a common language requires both nuance and subtlety, but nuance makes simple and meaningful language difficult to achieve. The solution, then, might be to do as conservatives have done: to overlook small differences in the interest of articulating our moral center. If we can’t do this, we will have abandoned the popular media, essentially withdrawn from the game, and allowed President Obama’s agenda to rest on political manipulation rather than on a shift in public opinion about what our society stands for. For my part, I am convinced that if we can reshape the language, we can reverse the moral course.

We can do this by constructing a language of shared goals rather than one of difference. Recent experience suggests the media will respond to such a language, even if it seems subversive. We could find ways of introducing this language, perhaps by developing a television program on the model of William Buckley’s Firing Line that would promote debate about serious issues. Or we might enlist programs like Law and Order and the old West Wing—both of which have attracted huge audiences—in the issues we care about. We can be confident that advertising money will follow success. It has done so in the case of Rachel Maddow, whose gutsy approach to the news we all admire but have insufficiently supported. We might follow the model of Michael Moore, who has been unafraid to debunk words like capitalism even as he uses capitalist methods to make his point. Can we take advantage of his initiatives (there are others, like Air America, that have smaller audiences) to develop the missing narratives by writing reviews in local newspapers, organizing responsive support groups, or in other ways becoming more active participants? And what about music and theater? Where are the grants and the institutes that support the Arthur Millers and Bob Dylans of the twenty-first century—both of them masters at coining a phrase? Bruce Springsteen comes to mind—and yet we have not elevated him to the ranks of those whose music inspires social change. We cannot separate ourselves from those worlds nor ignore our obligation to generate funds for a popular culture that speaks to the best in human nature. But neither can intellectuals ignore a world in which new forms of communication require us to speak a language that can be heard.

Alice Kessler-Harris is a member of the Department of History and the Institute for Research on Women and Gender at Columbia University. She is the author of In Pursuit of Equity: Women, Men, and the Quest for Economic Citizenship in Twentieth-Century America.

How to Save Journalism

By John Nichols & Robert W. McChesney

This article appeared in the January 25, 2010 edition of The Nation.

January 7, 2010

The article below is excerpted from John Nichols and Robert McChesney's new book The Death and Life of American Journalism.



The founders of the American experiment were even by their own measures imperfect democrats. But they understood something about sustaining democracy that their successors seem to have forgotten. Everyone agrees that a free society requires a free press. But a free press without the resources to compensate those who gather and analyze information, and to distribute that information widely and in an easily accessible form, is like a seed without water or sunlight. It was with this understanding that Washington, Jefferson, Hamilton and their contemporaries instituted elaborate systems of postal and printing subsidies to assure that freedom of the press would never be an empty promise; to that end they guaranteed what Madison described as "a circulation of newspapers through the entire body of the people...[that] is favorable to liberty."

Two centuries after Madison wrote those words, American news media are being steered off the cliff by investors and corporate managers who soured on their "properties" when the economic downturn dried up what was left of their advertising bonanza. They are taking journalism with them. Newsrooms are shrinking and disappearing altogether, along with statehouse, Washington and foreign bureaus. And with them goes the circulation of news and ideas that is indispensable to liberty. This is a dire moment for democracy, and it requires a renewal of one of America's oldest understandings: that a free people can govern themselves only if they have access to independent information about the issues of the day and the excesses of the powerful, and that it is the duty of government to guarantee both the promise and the reality of a free press.

When we recommended government subsidies last year in a Nation cover article ("The Death and Life of Great American Newspapers," April 6), some publishers and pundits objected, forgetting their Jeffersonian roots and arguing, with no sense of irony, that policies promoting diversity and robust debate would foster totalitarianism. Even well-intended Congressional hearings on the crisis avoided discussion of this logical response.

But as 2009 wore on and the crisis extended--with the venerable Christian Science Monitor and newspapers in Seattle and Ann Arbor ceasing print publication to exist solely online, with papers in Denver, Tucson and other cities closing altogether, and with talk of closures from San Francisco to Boston--the urgency of the moment, and the recognition that journalism would not be reborn on the Internet or saved by foundation grants, made it harder to dismiss subsidies. By year's end, the Columbia Journalism Review was highlighting a report by Leonard Downie Jr. and Michael Schudson that proposed requiring "broadcasters, Internet service providers, and telecom users to pay into a fund that would be used to support local accountability journalism in communities around the country." CJR called the idea a "radical suggestion."

If the rather modest proposal by Downie and Schudson is "radical," then it is merely a fraction of the radicalism of America's founding. And like so many founding precepts, it is a radicalism that has long since been accepted as common sense by the rest of the world. Now Americans must re-embrace that common sense if we are to have journalism worthy of the Republic's promise and sufficient to meet its needs. This is an unavoidable reality. No reasonable case can be made that journalism will rebound as the economy recovers from a recession that accelerated but certainly did not cause the crisis confronting newspapers--or that a "next big thing" will arrive as soon as news organizations develop good Internet business plans. Many of the nation's largest papers are in bankruptcy or teetering on the brink, and layoffs continue at an alarming rate. The entirety of paid journalism, even its online variant, is struggling. There are far fewer working journalists per 100,000 Americans today than existed one, two or three decades ago. At current rates of decline, 2020 will make 2010 look like a golden age. When the Federal Trade Commission held its unprecedented two-day conference on the state of journalism in December, the operative term was "collapse." Conversely, the ratio of PR flacks to working journalists has skyrocketed, as spin replaces news.

The implications are clear: if our policy-makers do nothing, if "business as usual" prevails, we face a future where there will be relatively few paid journalists working in competing newsrooms with editors, fact-checkers, travel budgets and institutional support. Vast areas of public life and government activity will take place in the dark--as is already the case in many statehouses across the country. Independent and insightful coverage of the basic workings of local, state and federal government, and of our many interventions and occupations abroad, is disappearing as rapidly as the rainforests. The political implications are dire. Just as a brown planet cannot renew itself, so an uninformed electorate cannot renew democracy. Popular rule doesn't work without an informed citizenry, and an informed citizenry cannot exist without credible journalism.

This is more than academic theory; it is how the Supreme Court has interpreted the matter. As Justice Potter Stewart explained in 1974, the framers believed the First Amendment mandated the existence of a Fourth Estate because our experiment in constitutional democracy cannot succeed without it. That is hardly a controversial position, nor one that is necessarily left wing. It should be inviting to readers of the Wall Street Journal and BusinessWeek, as markets cannot work effectively or efficiently unless investors, managers, workers and consumers have the credible information produced by serious journalism. Moreover, political decisions about economic issues will respect Main Street concerns only if citizens are kept abreast of the issues by independent news media. American officials urged Asian economies during the financial crisis of the late 1990s to develop independent media or suffer from the corruption and stagnation of "crony capitalism." We need to take a dose of our own medicine, and fast. Unfortunately, misconceptions about the crisis and the proper relationship between government and media warp the debate. Addressing these misconceptions, and getting beyond them, will be the great challenge of 2010.

The most dangerous misconception has to do with journalism itself. Journalism is a classic "public good"--something society needs and people want but market forces are now incapable of generating in sufficient quality or quantity. The institution should be understood the way we understand universal public education, military defense, public health and transportation infrastructure. The public-good nature of journalism has been largely disguised for the past century because advertising bankrolled much of the news, for better and for worse, in its efforts to reach consumers. Those days are over, as advertisers no longer need or seek to attach their appeals to journalism to connect with target audiences. Indeed, to the extent commercial media can scrap journalism standards to make the news "product" more attractive to advertisers, the cure will be worse than the disease.

This takes us to the second great misconception: that the crisis in journalism was created by the rise of the Internet and the current recession. In fact, the crisis began in earnest in the 1970s and was well under way by the 1990s. It owes far more to the phenomenon of media corporations maximizing profits by turning newsrooms into "profit centers," lowering quality and generally trivializing journalism. The hollowing out of the news and alienation of younger news consumers was largely disguised by the massive profits these firms recorded while they were stripping newsrooms for parts. But that's no longer possible. The Internet, by making news free online and steering advertisers elsewhere, merely accelerated a long-term process and made it irreversible. Unless we grasp the structural roots of the problem, we will fail to generate viable structural solutions.

By ignoring the public-good nature of journalism and the roots of the current crisis, too many contemporary observers continue to fantasize that it is just a matter of time before a new generation of entrepreneurs creates a financially viable model of journalism using digital technologies. By this reasoning, all government needs to do is clear the path with laxer regulations, perhaps some tax credits and a lot of cheerleading. Even David Carr of the New York Times, who has consistently recognized the point of retaining newsrooms and journalism, falls into the trap of assuming that the "cabals of bright young things" who are swarming New York might create a "fresh, ferocious wave" of new media that will turn the Internet from killer of media into savior. Carr's vision may work for entertainment media, but it is a nonstarter for journalism. As Matthew Hindman's new book, The Myth of Digital Democracy, convincingly demonstrates, the Internet is not some "wild west" incubator, where a new and more democratic journalism is being hatched. Internet traffic mostly gravitates to sites that aggregate and reproduce existing journalism, and the web is dominated by a handful of players, not unlike old media. Indeed, they are largely the same players.

There is no business model or combination of business models that will create a journalistic renaissance on the web. Even if the market and new technologies were to eventually solve journalism's problems, the notion that we must go without journalism for a decade or two while Wall Street figures out how to make a buck strikes us, frankly, as suicidal.

There will be commercial news media in the future, and the right of anyone to start a business that does journalism should remain inviolable. But there is no evidence that the news media democracy requires will be paid for by advertisers or subscribers. Nor will they be supported by foundations or billionaires; there simply are not enough to cover the massive need. And while it might be comforting to think we can rely on tax-deductible citizen donations to fund the news media we need, there is scant evidence enough money can be generated from this source.

House Energy and Commerce Committee chair Henry Waxman was right when he told December's FTC workshop on journalism, "This is a policy issue. Government is going to have to be involved in one way or another." Journalism, like other public goods, is going to require substantial public subsidy if it is to exist at a level necessary for self-government to succeed. The question, then, is not, Should there be subsidies? but, How do we get subsidies right?

To do that, policy-makers, journalists and citizens must take an honest look at the history of journalism subsidies here and abroad, and they cannot cling dogmatically to the Manichaean view that press subsidies inexorably lead to tyranny.

Even those sympathetic to subsidies do not grasp just how prevalent they have been in American history. From the days of Washington, Jefferson and Madison through those of Andrew Jackson to the mid-nineteenth century, enormous printing and postal subsidies were the order of the day. The need for them was rarely questioned, which is perhaps one reason they have been so easily overlooked. They were developed with the intention of expanding the quantity, quality and range of journalism--and they were astronomical by today's standards. If, for example, the United States had devoted the same percentage of its GDP to journalism subsidies in 2009 as it did in the 1840s, we calculate that the allocation would have been $30 billion. In contrast, the federal subsidy last year for all of public broadcasting, not just journalism, was around $400 million.

The experience of America's first century demonstrates that subsidies of the sort we suggest pose no threat to democratic discourse; in fact, they foster it. Postal subsidies historically applied to all newspapers, regardless of viewpoint. Printing subsidies were spread among all major parties and factions. Of course, some papers were rabidly partisan, even irresponsible. But serious historians of the era are unanimous in holding that the extraordinary and diverse print culture that resulted from these subsidies built a foundation for the growth and consolidation of American democracy. Subsidies made possible much of the abolitionist press that led the fight against slavery.

Our research suggests that press subsidies may well have been the second greatest expense of the federal budget of the early Republic, following the military. This commitment to nurturing and sustaining a free press was what was truly distinctive about America compared with European nations that had little press subsidy, fewer newspapers and magazines per capita, and far less democracy. This history was forgotten by the late nineteenth century, when commercial interests realized that newspaper publishing bankrolled by advertising was a goldmine, especially in monopolistic markets. Huge subsidies continued to the present, albeit at lower rates than during the first few generations of the Republic. But today's direct and indirect subsidies--which include postal subsidies, business tax deductions for advertising, subsidies for journalism education, legal notices in papers, free monopoly licenses to scarce and lucrative radio and TV channels, and lax enforcement of anti-trust laws--have been pocketed by commercial interests even as they and their minions have lectured us on the importance of keeping the hands of government off the press. It was the hypocrisy of the current system--with subsidies and government policies made ostensibly in the public interest but actually carved out behind closed doors to benefit powerful commercial interests--that fueled the extraordinary growth of the media reform movement over the past decade.

The argument for restoring the democracy-sustaining subsidies of old--as opposed to the corporation-sustaining ones of recent decades--need not rest on models from two centuries ago. When the United States occupied Germany and Japan after World War II, Generals Eisenhower and MacArthur instituted lavish subsidies to spawn a vibrant, independent press in both nations. The generals recognized that a docile press had been the handmaiden of fascism and that a stable democracy requires diverse and competitive news media. They encouraged news media that questioned and dissented, even at times criticized US occupation forces. They did not gamble on the "free market" magically producing the desired outcome.

In moments of crisis, our wisest leaders have always recognized the indispensable role of journalism in democracy. We are in such a crisis now. It is the character of the crisis, and the urgency of the moment, that should make Americans impatient with blanket condemnations of subsidies. State support is vital to higher education; on rare occasions professors have been harassed by governors or legislators over the content of their research or lectures. But only an extreme libertarian or a nihilist would argue for ending all public support of higher education to eliminate the threat of this kind of government abuse. Likewise, the government does not tax church property or income, which is in effect a massive subsidy of organized religion. Yet the government has not favored particular religions or required people to hold religious views.

As for the notion that public broadcasting is a more propagandistic or insidious force than commercial broadcasting because of the small measure of direct state support it receives, the evidence suggests otherwise. When the United States geared up to invade Iraq in 2002, commercial broadcast news media, with only a few brave exceptions, parroted Bush administration talking points for war that were easily identified as lies. In contrast, public and community broadcast coverage, while far from perfect, featured many more critical voices at exactly the moment a democracy requires a feisty Fourth Estate. Not surprisingly, public broadcasting is the most consistently trusted major news source, with Americans telling pollsters it deserves far greater public funding.

Perhaps the strongest contemporary case for journalism subsidies is provided by other democracies. The evidence shows that subsidies do not infringe on liberty or justice; they correlate with the indicators of a good society. In The Economist's annual Democracy Index, which evaluates nations on the basis of the functioning of government, civic participation, civil liberties, political culture and pluralism, the six top-ranked nations maintain some of the most generous journalism subsidies on the planet. If the United States, No. 18 in the index, spent the same per capita on public media and journalism subsidies as Sweden and Norway, which rank 1 and 2, we would be spending as much as $30 billion a year. Sweden and Norway are also in the top tier of the pro-business Legatum Group's Prosperity Index, which measures health, individual freedom, security, the quality of governance and transparency, in addition to material wealth. The United States ranks ninth.

The evidence is also clear that huge journalism subsidies and strong public media need not open the door to censorship or threaten private and commercial media. Consider the evaluations of Freedom House, the pro-private-media organization that annually ranks international press freedom. It has the keenest antennas for government infringement of private press freedoms and routinely places nondemocratic and communist nations in its lowest, "not free" category. (It ranks Venezuela, for example--highly regarded by some on the democratic left for its commitment to elections and an open society as well as its wide-ranging adversarial media--as having a "not free" press.) Strikingly, Freedom House ranks the heavily subsidizing nations of Northern Europe in the top six spots on its 2008 list of nations with the freest news media. The United States ties for twenty-first. Research by communications professor Daniel Hallin demonstrates that increases in subsidies in Northern Europe led not to a docile and uncritical news media but to a "more adversarial press." In short, massive press subsidies can promote democratic political cultures and systems.

But must Americans pay $30 billion a year to get the job done right? Possibly not. Digital technologies have dramatically lowered production and distribution costs. Still, the main source of great journalism is compensated human labor, and, as the saying goes, you get what you pay for. We're longtime advocates of citizen journalism and the blogosphere, but our experience tells us that volunteer labor is insufficient to meet America's journalism needs. The digital revolution has the capacity to radically democratize and improve journalism, but only if there is a foundation of newsrooms--all of which will be digital or have digital components--with adequately paid staff who interact with and provide material for the blogosphere.

The moral of the story is clear: journalism and press subsidies are the price of civilization. To deliver this public good in sufficient measure to sustain democracy, it must be treated as we treat national security. No one would dare suggest that our military defense could be adequately covered by volunteer labor, pledge drives, bake sales, silent auctions and foundation grants. The same is true for journalism. Cautious proponents of press subsidies think in terms of nickels and dimes, but we need to think in terms of billions. Columbia Journalism School professor Todd Gitlin got it right: "We're rapidly running out of alternatives to public finance.... It's time to move to the next level and entertain a grown-up debate among concrete ideas."

How can we best spawn a vibrant, independent and competitive press without ceding government control over content? There are models, historic and international, from which we can borrow. No one-size-fits-all solution will suffice, since all forms of support have biases built into them. But if citizens spend as much time considering this issue as our corporate media executives and investors do trying to privatize, wall off and commercialize the Internet, journalism and democracy will win out.

In our new book, The Death and Life of American Journalism, we offer proposals for long-term subsidies to spawn independent digital journalism. But we do not claim to have all the answers. What we claim--what we know--is that it is now imperative that emergency measures be proposed, debated and implemented. People need to see tangible examples of "public good" interventions, or the discussion about renewing journalism will amount to little more than fiddling while Rome burns. The point now is to generate popular participation in and support for a small-d democratic response.

The starting point could be a debate about "bailouts" to keep struggling commercial news media, especially newspapers and magazines, afloat. As a rule, we oppose bailing out or subsidizing commercial news media. We believe subsidies should go primarily to nonprofit and noncommercial media. We are not doctrinaire on this point and believe it should be subject to debate, especially for short-term, emergency measures. If subsidies do go to commercial interests, the public needs to get something of substance in return. But the lion's share of subsidies must go now and in the future to developing and expanding the nonprofit and noncommercial sector. Journalism needs an institutional structure that comports with its status as a public good.

What are we talking about? For starters, spending on public and community broadcasting should increase dramatically, with the money going primarily to journalism, especially on the local level. We never thought one commercial newsroom was satisfactory for an entire community; why should we regard it as acceptable to have a single noncommercial newsroom serve an entire community? Let's also have AmeriCorps put thousands of young people to work, perhaps as journalists on start-up digital "publications" covering underserved communities nationwide. This would quickly put unemployed journalists to work. Let's also craft legislation to expedite the transition of failing daily newspapers into solvent nonprofit or low-profit entities. It is healthy for communities to have general news media that cover all the relevant news and draw everyone together, in addition to specialized media. Shifting newspapers from high-profit to low-profit or nonprofit ownership allows them to keep publishing as they, and we, complete the transit from old media to new.

Americans will embrace some of these ideas. They will reject others. The point is to get a debate going, to put proposals forward, to think big and to act with a sense of urgency. Let's assume, for the sake of journalism and democracy, that there will be subsidies. Then all we must do is put them to work in the same spirit and toward the same end as did the founders.

About John Nichols

John Nichols, a pioneering political blogger, has written The Beat since 1999. His posts have been circulated internationally, quoted in numerous books and mentioned in debates on the floor of Congress. Nichols writes about politics for The Nation magazine as its Washington correspondent. He is a contributing writer for The Progressive and In These Times and the associate editor of the Capital Times, the daily newspaper in Madison, Wisconsin. His articles have appeared in the New York Times, Chicago Tribune and dozens of other newspapers. He is the co-author, with Robert W. McChesney, of The Death and Life of American Journalism, just published by Nation Books. more...

About Robert W. McChesney

Robert McChesney is Gutgsell Endowed Professor in the Department of Communication at the University of Illinois. He hosts the program Media Matters on WILL-AM every Sunday afternoon from 1-2pm central time. He and John Nichols, The Nation's Washington correspondent, are the founders of Free Press, the media reform network, and the authors of Tragedy and Farce: How the American Media Sell Wars, Spin Elections, and Destroy Democracy (New Press) and The Death and Life of American Journalism (Nation Books). He has written sixteen books and his work has been translated into fifteen languages. more...

The Resistance of Painting: On Abstraction

By Barry Schwabsky

This article appeared in the January 4, 2010 edition of The Nation.

December 16, 2009

After Joy Division (2009), by Rosy Keyser (Peter Blum Gallery, NYC)

Abstract painting is nearing its centenary. Although what exactly abstraction is, who first achieved it, and when and where, are questions open to interpretation, the best art-historical thinking dates its inception to around 1912, when Wassily Kandinsky, Kazimir Malevich, Robert Delaunay, Piet Mondrian and Arthur Dove quite separately made their breakthroughs across two continents. While it's not always true of people that young revolutionaries become old conservatives, it seems almost inevitable that in the arts as much as politics, radical ideas and movements whose glory is not preserved by quick defeat turn into shibboleths and establishments.

It would be easy to make the argument that abstraction has long since settled into its comfortable dotage--that it has become an art choking on good taste and mannered reticence. On this view, abstraction was deposed by movements of the 1960s such as Pop Art, with its rehabilitation of vernacular imagery and its immersion in demotic culture; Conceptual Art, with its emphasis on language and critical context; and even Minimalism, which (despite its inheritance from the Constructivist strain within abstraction) laid such great stress on what its foremost detractor decried as mere "objecthood" that a boundary was fatally breached between art and everyday things.

That's not how things look to me, though, and not only because a view of art focused on movements that succeed one another like waves crashing ineffectually against the shore has never answered to my experience of art, which has mostly centered on individual artists and particular works. Abstraction arguably should have even less to do with movements than any other art: a movement of abstractionists would be a contradiction in terms, like a church of atheists. Abstractionists, like atheists, are united only in what they reject. Abstraction is not a specific way of doing art--on what basis can Jackson Pollock, Lucio Fontana and Daniel Buren be considered part of a single movement? Rather, it is a considered effort not to do what Western artists have made it their job to do for hundreds of years: namely, to construct credible depictions of people, places and things. Anything else goes.

Perhaps that's why, as Bob Nickas points out in his new book Painting Abstraction: New Elements in Abstract Painting (Phaidon Press; $75), "so many contemporary artists who paint nonrepresentational pictures reject the notion that their work is in fact abstract." They realize that the name itself, as handy and unavoidable as it undoubtedly may be, conveys a false sense of unity. Other commonalities, even those that would rightly strike us as quite superficial, can be more important. Consider, for example, two painters who make very small paintings that are introspective and intimate in feeling. Though one of them paints images and the other does not, one might well feel that they have much more in common than either one does with a painter who prefers to work on a grand scale and with reference to important public issues. Yet no one would think of grouping the two artists together as part of a "small-scale art" movement--one called, say, "intimism." Why, then, is abstractionism as an idea any more relevant than intimism as an idea?

In fact, there is a good reason for it: the making of pictures is not merely a historical inheritance for painting but its default mode. The pursuit of abstraction is always to some extent a mode of resistance. There was a brief period when this fact could have been forgotten, and while that period was arguably that of abstract painting's triumph--I am referring, of course, to the fifteen years following the end of World War II--it was also when abstraction threatened to become an orthodoxy, which would have killed its spirit. You might say that all those artists who turned away from abstraction in the 1960s and '70s were honoring it in the breach rather than the observance. Then, once again, abstraction could become an art for aesthetic dissidents.

Raoul De Keyser is one of them. If there were a school of contemporary intimists, critics would be tempted to see his work as part of it, but his paintings would never really allow for it--they're too tough and too phlegmatic. His career illustrates the stealth with which the best abstract painting often proceeds today. (Not long ago Raphael Rubinstein characterized De Keyser's work, among that of other "provisional painters," as "major painting masquerading as minor painting.") De Keyser was born in 1930 in the Flemish town of Deinze, Belgium, where he still lives, and his reputation was almost entirely confined to his home country and the Netherlands until 1990, when he began exhibiting regularly abroad, first in Germany and then throughout Europe and further afield, not only in one-person shows but in big international exhibitions like Documenta 9 (Kassel, 1992) and "The Broken Mirror," an important painting survey in Vienna in 1993. Being championed by Luc Tuymans, a much younger and more famous Flemish painter, could not have hurt.

At David Zwirner Gallery in New York City, De Keyser recently showed several series of drawings and watercolors from 1979 to 1982 alongside paintings finished over the past three years (several were started as long ago as 1998). Some of the works on paper use large, simple blocky forms; in others, fields of small marks create a sort of broken, refracted visual texture that's surprisingly reminiscent of Impressionism. References to landscape are rife. Each of the "Hill Series," from 1981, contains a single large five-sided shape in black ink, its edges nearly parallel with those of the sheet on which it has been drawn, except that one of its upper corners has been replaced by a diagonal line, like the slope of a hill. There is some bare white paper around all the sides of the resulting irregular pentagon, so that despite the reference to nature that the title insists on, it always remains a closed shape, never becoming a view of something larger. The trick--this short-circuiting of reference and abstraction--is simple but effective, so much so that it could easily have been irritating, except that the execution of it is so blunt and unpretentious that the quizzical feeling evoked by this play, not only between abstraction and image but between earnest concentration and triviality, evokes an almost childlike freshness of vision.

One of De Keyser's new paintings shows him looking back at the same idea. It contains a single large red form with a sloping top and with a white surround. It's about the same size as the drawings too. The funny thing is its title, Complex (2009): it's the least complex of the seventeen paintings that were shown. All the rest are simple enough, but in odder, sometimes seemingly arbitrary ways. Mark Rothko once visited a fellow artist's studio and, after studying his works carefully, declared that he couldn't see the point of them because their forms were too numerous: "I can understand that two are man and woman, three are man, woman and child, but five are nothing." De Keyser's are often, in Rothko's sense, paintings of nothing--of very little that is somehow also too much. Often there are no more than two or three colors, but there is no drama of opposition or synthesis. A multiplicity of small, detached, nondescript shapes seems to echo the randomness that snapshots have taught us to see in everyday life, but only rarely in these paintings do everyday things come into focus, as does the red banner in Turkish 1 Mai in Belgium (2009) or the distant mountain peak in Top (2009). More often there is a rough geometry that seems to describe something or other but nothing in particular, as in Company (2008) or the teasingly titled Scene (2008). Remembering that De Keyser had been a sportswriter as a young man, I wondered whether Scene depicts some twisted goal posts, but I couldn't quite see it that way. Still, something is being seen through these paintings, but glancingly, out of the corner of one's eye, even when the shapes are outlined with graphic clarity (though De Keyser is almost as likely to show you a blur). Always, there's something rather blank and awkward about them that carries an inescapable poignancy: what one sees in them seems to be the mere vestiges of something that disappeared in the very act of being grasped. Undemonstrative, these paintings nonetheless bear a distinctive timbre or vibration of feeling. Their sensuality is in their very dryness.

About Barry Schwabsky

Barry Schwabsky is the art critic of The Nation. Schwabsky has been writing about art for the magazine since 2005, and his essays have appeared in many other publications, including Flash Art (Milan), Artforum, the London Review of Books and Art in America. His books include The Widening Circle: Consequences of Modernism in Contemporary Art, Vitamin P: New Perspectives in Painting and several volumes of poetry, the most recent being Book Left Open in the Rain (Black Square Editions/The Brooklyn Rail). Schwabsky has contributed to books and catalogs on artists such as Henri Matisse, Alighiero Boetti, Jessica Stockholder and Gillian Wearing, and has taught at the School of Visual Arts, Pratt Institute, New York University, Goldsmiths College (University of London) and Yale University.