Monday, February 29, 2016


In Defense of Small Talk

It's not shallow, boring, or offensive. It's a crucial social lubricant as valuable as wine or laughter.


Can you believe this weather? First a major snowstorm, now unseasonable warmth. The election is really a doozy. And don't get me started on the Oscars!

Ruth Graham is a regular Slate contributor. She lives in New Hampshire.

Small talk gets a bum rap as an enterprise for the shallow, the callow, and the dull. "Life is more meaningful than the weather," the modern high priests of depth tell us. Small talk is for "those who are too simple minded or lack the attention span to engage in more weighty conversations." Chatting about sports or TV is boring, we are told; to ask basic questions about family or current events risks offending with our nosiness or our assumptions.

A few weeks ago, a Boston actuary named Tim Boomer wrote an installment for the New York Times' "Modern Love" column that perfectly captured the current anti–small talk attitude. After a bad breakup, Boomer overhears a couple on a first date chatting about bus routes and the rain. "I wanted no part of this game," Boomer declares:

I decided to approach my re-entry to dating with a no-small-talk policy. Not that I would insist we talk only about heartfelt subjects; ideally, there would also be plenty of flirtatious joking and witty banter. I simply wanted to eliminate the dull droning on about facts and figures—whether it's snowing or raining, how cold it is, what we do for work, how long it takes to get to work, where we went to school—all those things that we think we have to talk about with someone new but that tell us little about who the person really is.
Why can't we replace small talk with big talk and ask each other profound questions right from the start? Replace mindless chatter about commuting times with a conversation about our weightiest beliefs and most potent fears? Questions that reveal who we are and where we want to go?

Well, hold your horses there, Tim! There's something awfully presumptuous about pressing people to share their "weightiest beliefs and most potent fears" while you're still on the appetizer course. Call me old-fashioned, but I wouldn't want to talk about my most intense past love experience on a first date. I'll share my deepests and darkests when I'm good and ready.

Small talk saves us from such forced intimacies. But not only does Boomer's approach sound exhausting—like something dreamed up by That Guy in your freshman philosophy class—it's just plain wrong. Small talk is not wasted talk. It's a social lubricant as essential as wine and laughter that allows strangers to make crucial first connections across demographic lines. And it's far from meaningless. People are rebelling against it today in a misguided dismissal of social graces that seem old-fashioned, boring, or wasteful. In fact, we've never needed such graces more.

Dismissiveness toward light conversation is nothing new. The New Testament book of 2 Timothy urges readers to avoid "irreverent babble" because it leads to ungodliness; various translations condemn "foolish talk," "vain babblings," "pointless discussions," and "empty speech." In The Canterbury Tales, Chaucer uses the Manciple's Tale to preach the dangers of "jangling," a term that encompassed most pointless chatter:

A loudmouth is to God abominable.
Read Solomon, so wise and honorable;
Read David's psalms, let Seneca be read.
Don't speak, my son, but only nod your head.

"Idle talk" in these contexts is closely connected with gossip, but the term "idleness" suggests it's a conversation's very triviality that turns it toxic. Medieval penitential manuals often warned that gossips would have to account for their every thoughtless word on Judgment Day. If silence was golden, small talk was tawdry.

Post-industrialization, people became less concerned about the moral dimensions of chit-chat. Instead, they began to fear it was conformist and shallow, a poor reflection of one's personal depth. The German philosopher Heidegger (patron saint of That Guy) devotes a long section in his 1927 masterwork Being and Time to dismissing inauthentic "idle talk," which he connects to the "dictatorship of the 'they' ": "We take pleasure and enjoy ourselves as they take pleasure; we read, see, and judge about literature and art as they see and judge ... we find 'shocking' what they find shocking." Anthropologist Bronislaw Malinowski, who formulated the first academic theory of small talk, belittled what he termed "phatic communion"—conversation whose purpose is social, not informational—as "purposeless expressions of preference or aversions, accounts of irrelevant happenings, comments on what is perfectly obvious."

The latest anxieties over small talk are even smaller in scope. First, there's whether it's bad for our health. A small psychology study a few years ago found that people who spent more time in "substantive" conversations were happier than those who wasted their time on lighter fare. When researchers recorded snippets of conversations over the course of several days, the happiest person in the study engaged in only a third as much small talk as the unhappiest. But there's other evidence that small talk is salubrious, since social interaction seems to decrease stress. As one recent paper's subtitle has it, "Minimal social interactions lead to belonging and positive affects."

Unfortunately, as Boomer's "Modern Love" essay illustrates, we are living in a low moment for the art of minimal social interactions. "The criteria by which one chooses what to say shift from 'what's true; what's most interesting' to 'what lubricates the exchange; what sets people at ease,' " a Vox writer lamented last year. "It's like trying to speak a foreign language." Small talk feels phony to some, in part thanks to its embrace by salesperson types—"My motto is 'every conversation is an opportunity for success,' " a networking expert (shudder) chirped to Fast Company. To others, in an era of ruthless efficiency, pleasantries of the past can come to seem like dead weight. With calendars programmed to five-minute increments, and podcasts filling every interstitial moment of silence with wit, shouldn't conversation be economical and nutritious? Small talk looks like a fussy hors d'oeuvre in the age of Soylent.

Ironically, it's on the Internet—that bottomless maw of unnecessary but entertaining chatter—that haters of small talk express their complaints. Take, for example, the inexhaustible well of introvert-targeted content all over the Web. Introverts have declared war on small talk. They loathe it because they "crave meaning," because of the "barrier it creates between people," and because they "don't see a reason for beating around the bush with social pleasantries." The underlying message of most introvert porn is that introverts are deeper than you are, and their aversion to small talk is yet more proof.

Then there's the whole genre of exquisitely sensitive listicles, seemingly designed to scare people off from even attempting small talk. With headlines like "87 Things Never to Say to Your Babysitter," they make it clear that even the lightest and most well-meaning blather will be read as problematic by someone. When speaking with a pregnant woman, for example: Don't remark on the baby's sex, don't joke, "You'll never sleep again," don't exclaim "Wow" or ask "When are you due?," and so on. It's hard to escape the conclusion that it would be easier just not to talk with pregnant women at all. Be careful, too, when speaking to mothers (don't ask if they're going to have more kids), to cancer patients (don't tell them they're strong), and to atheists (don't ask them about the origins of the universe). There's plenty of common sense here—seriously, don't tell a sick person about a random treatment you read about online!—but the cumulative effect is the conclusion that even the most innocuous chit-chat is a minefield.

Of course small talk has always been a tool to avoid the minefield of unintended boorishness. ("The rain in Spain falls mainly on the plain" is Eliza Doolittle's polished bit of small talk to make her fit in with high society, but she veers disastrously off course when she starts to ad-lib about gin and murder.) Even those who found small talk uninspiring once recognized its utility, like the British statesman Lord Chesterfield, who's responsible for the first known use of the phrase. "There is a sort of chit-chat, or small talk, which is the general run of conversation at courts, and in most mixed companies," Lord Chesterfield wrote to his son in 1751. "It is a sort of middling conversation, neither silly nor edifying; but, however, very necessary for you to be master of."

But I think small talk can be edifying in its silliness, and a pleasure too. Small talk is fun precisely for the reasons Boomer thinks it's boring: It requires playing within the lines. Using sports, weather, family, and other unremarkable raw material, the skilled conversationalist spins it into gold—or at least cotton candy. In a way, making small talk is like writing a sonnet. It's the restrictions of the form that make the best examples of it beautiful. Perhaps the reason so many people find it tedious is simply that they're bad at it.

Chatting about work and education, not to mention trivialities like bus routes and rain, can tell us quite a lot about "who the person really is," as Boomer puts it. Not because it's a snobby shorthand for sorting a person by her pedigree, but because it lets you evaluate how she talks about her experiences, how she tells the story of herself, and how she approaches trifles like bad weather. Is she whiny? Wry? Cheery? It's all informative, and none of it requires badgering anyone to reveal the moment of their most soul-shattering humiliation over cocktails. Ice-breakers like "Tell me about your weightiest belief" ask that your interlocutor dredge the depths of her soul on demand; small talk lets self-revelation unspool with a more civilized subtlety.

It also allows people to speak to each other across demographics. Try asking the plumber "What place most inspired you and why?" or "What's the most in love you've ever felt?," as Tim Boomer asks his dates. Maybe you'd get somewhere, which might indicate you and your plumber should explore your friendship on your own time. But more likely, you'd form a quicker bond by talking about "small" subjects like the White Sox or the wintry mix.

On this count, the networking experts are right: Excelling at small talk will make you popular, and justifiably so. Mastering it makes you a pleasure to be around. Someone who can carry on a conversation with anyone; someone who is sparkling and witty on simple topics; someone who puts everyone at ease—that's the definition of a perfect guest, perfect host, and perfect co-worker.

Big talk, weird talk, deep talk, smart talk—pick your preferred opposite-of-small talk, and there's room for plenty of it in the conversational repertoire. When it happens serendipitously, it's one of life's great joys, and certainly more memorable than how's-the-weatherisms. But small talk will always be with us, because it's the solid ground of shared culture. The more divided a people—culturally, politically, economically—the fewer conversational topics we can share. The more productivity-obsessed, the less time for old-fashioned pleasures. And that means small talk is no small thing at all.

Academic Drivel Report

Confessing my sins and exposing my academic hoax. 

I don't know if there is a statute of limitations on confessing one's sins, but it has been six years since I did the deed, and I'm now coming clean.

Six years ago I submitted a paper for a panel, "On the Absence of Absences," that was to be part of an academic conference later that year, in August 2010. Neither then nor now did I have any idea what the phrase "absence of absences" meant. The description provided by the panel organizers, printed below, did not help. The abstract of my proposed paper was pure gibberish, as you can see below. I tried, as best I could within the limits of my own vocabulary, to write something that had many big words but made no sense whatsoever. I not only wanted to see if I could fool the panel organizers and get my paper accepted; I also wanted to pull back the curtain on the absurd pretensions of some segments of academic life. To my astonishment, the two panel organizers—both American sociologists—accepted my proposal and invited me to join them at the annual international conference of the Society for Social Studies of Science to be held that year in Tokyo.

I am not the first academic to engage in this kind of hoax. In 1996, in a well-known incident, NYU physicist Alan Sokal pulled the wool over the eyes of the editors of Social Text, a postmodern cultural studies journal. He submitted an article filled with gobbledygook to see if they would, in his words, "publish an article liberally salted with nonsense if it (a) sounded good and (b) flattered the editors' ideological preconceptions." His article, "Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity" (published in the Spring/Summer 1996 issue), shorn of its intentionally outrageous jargon, essentially made the claim that gravity was in the mind of the beholder. Sokal's intent was not simply to pull a fast one on the editors but to challenge the increasingly popular "postmodern" view that there are no real facts, just points of view. His paper made the bogus case that gravity, too, was a "social construction." As soon as it was published, Sokal fessed up in another journal (Lingua Franca, May 1996), revealing that his article was a sham, describing it as "a pastiche of Left-wing cant, fawning references, grandiose quotations, and outright nonsense … structured around the silliest quotations [by postmodernist academics] he could find about mathematics and physics."

Sokal's ruse was more ambitious than mine. He wrote an entire article. I simply wrote a 368-word abstract. He submitted his for publication. I just submitted mine to a conference. Although his paper was filled with absurd statements, it actually reached a conclusion—however bogus—that gravity was still an idea open to serious debate. In doing so, Sokal actually had a serious point to make about the silliness of much "post-modern" thinking that viewed science as a version of the humanities where all views should be given equal weight. 

My paper had no point at all. It was filled entirely with non sequiturs. I didn't even bother to mention anything about "the absence of absences," because I had no idea what it meant and any attempt to discuss it would have revealed my ignorance of the panel's organizing theme.

In writing my abstract for the "Absence of Absences" panel, I violated every rule of good writing to which I usually try to adhere. Here's how it happened.

In early January 2010 an email arrived in my inbox from a colleague. He had forwarded to me an announcement of a panel, called "On the Absence of Absences," that several American sociologists were organizing for an international conference in Japan sponsored by the Society for Social Studies of Science and the Japanese Society for Science and Technology Studies. The announcement said the following:

This panel addresses absences—the gaps, silences, and remains within the construction of knowledge and ignorance—in order to contribute to an ongoing STS dialogue; one that has roots in Bloor's "sociology of error" to more recent work in agnotology (Proctor and Scheibinger) and in residues (Bowker and Star). From feminist and postcolonial theory, we have learned to be continually vigilant about the dynamics and non-dynamics in knowledge construction and application. This panel addresses these negations, unseen crevices, deletions, and leftovers from multiple perspectives. Its aims to identify and theorize some of those areas that demand our vigilance in order to broaden and provide systematic ways to understand how absences and gaps are a continual part of social interactions and our STS studies. Interested Presenters: Please send us a brief abstract and title of your talk with your name, email and affiliation. We would like contributions no later than 15 January to compile and submit the session.

I admit I had no idea what any of this meant. But I took that as a challenge. So I wrote a few hundred words of complete nonsense and in January 2010 submitted it to the panel organizers under the title, "Music, Religion, Politics, and Everyday Life: The Tensions of Utopianism and Pragmatism in Movements for Change." Here's what I sent them:

From the scribes and rabbis who wrote the original Torah, to the troubadour-activists who sang "Which Side Are You On?" and "Waste Deep in the Big Muddy," to the gangbangers and hip-hoppers who create contemporary street rap, the relationship between culture, politics, religion and everyday life has been poorly understood. As Bloor observes: "In fact sociologists have been only too eager to limit their concern with science to its institutional framework and external factors relating to its rate of growth or direction this leaves untouched the nature of the knowledge thus created." There is an obvious tension between romanticism and reality, between humanity and barbarism, between self-reflection and communal expression, which pervades both the written word and the oral tradition. Can a society promote utopianism and dystopianism simultaneously, while allowing its governing officials, whether military conquerors or democratically elected, to perform the necessary day-to-day functions of street-cleaning, sanitation, animal rescue, industrial production, hunting-and-gathering, maintaining law and order, and (what Heideger called the "organicity of intellectual work") educating children and reproducing the next generation. We might call this a kind of scientism of contradiction, or the contradictions of scientific production, or the contradictory intellectual discipline of everyday life. In other words, can the rigors of so called "pure" intellectual work (including those of the priestly class and its modern counterparts), the artistry of craftwork (or the craft of artistry), and the degradations of subsistence agriculture, mining, factory work, and retail sales co-inhabit the same society without igniting the ticking time bomb of social implosion, as we've recently seen in riots in the French suburbs and in the ghettos and barrios of Los Angeles? 
How, in other words, does the globalization of both production and knowledge work (the so-called "Walmartization" of societies) challenge our ability to think clearly about what is true in contrast to what is delusion? Self-delusion and self-discipline inhibits the reflective self, the postmodern membrane, the ecclesiastical impulse forbidden by truth-seeking and sun worship, problematizing the inchoate structures of both reason and darkness, allowing knowledge, half-knowledge, and knowledgelessness to undermine and yet simultaneously overcome the self-loathing that overwhelms the Gnostic challenge facing Biblical scribes, folksingers, and hip-hop rappers alike. Sociologists ignore these topics at their peril.

Note the two quotes that appear in my abstract. I'd never heard of David Bloor before, but his name showed up in the announcement about the panel, so I looked up one of his writings and lifted a sentence into my abstract, although the quote I selected has no relevance to anything else in the abstract. The quote from Heidegger (whose name I misspelled in the abstract) is a complete fabrication. I made it up. Name-dropping is always useful in academic circles.

A few weeks after submitting this abstract I received an email from Professor Wenda Bauchspies of the Georgia Institute of Technology, one of the organizers of the panel, informing me that my paper had been accepted and inviting me to present it at the Tokyo conference in August of that year.

Professor Bauchspies soon sent me another email listing the other papers that had been accepted for the panel. They had the following titles:

· "Agnotology and Privatives: Parsing Kinds of Ignorances and Absences in Systems of Knowledge Production"

· "Science, Ignorance, and Secrecy: Making Absences Productive"

· "Alter-Ontologies: Justice and the Living World"

· "The Motility of the Ethical in Bioscience: The Case of Care in Anti-ageing Science"

· "Mapping Environmental Knowledge Gaps in Post-Katrina New Orleans: A Study of the Social Production of Ignorance"

· "The Absence of Science and Technology Equals Development?"

Using the Internet, I researched the works of my fellow panelists. Each had an impressive track record of publications and academic appointments at respectable universities, including the University of Arizona, the Georgia Institute of Technology, Washington State University, the University of Exeter (England), and Cardiff University (Wales). (Indeed, I've learned that versions of each of the papers they presented in Tokyo have subsequently been published in scholarly journals.)

Once my abstract was accepted, I had to decide whether to turn it into a full-blown academic paper (typically about 15 to 25 pages) and travel to Tokyo to deliver it at the conference in person. I wasn't sure I had the fortitude to turn one page of gibberish into at least 15 pages of gibberish, which I would then have to summarize in 10 minutes in front of an international audience of academics. I was also reluctant to ask my own college to pay for me to travel to Tokyo to pull off this hoax.

I felt somewhat guilty when I received another email from the panel organizer that said, "I look forward to meeting you in Tokyo."    

Ultimately, I couldn't bring myself to carry out the hoax to its logical conclusion. I decided that I would be absent from the panel on the "absence of absences." I never registered for the conference, although my abstract appeared in the conference program. Given its provocative (although meaningless) title, it is even possible that some conference-goers showed up at the panel expecting to hear my presentation.

I assume the panel was successful, although I never heard from the organizer or any of the other panelists inquiring why I didn't make it. I even allowed myself to ponder the possibility that all of them were absent. Although I never wrote a paper or gave a presentation, my abstract can still be found on a website devoted to preserving the papers and proceedings of various academic conferences.

Although this episode may seem like a waste of time, I did, like Sokal, have a serious point to make in submitting the abstract. I wanted to pull back the curtain on academic pomposity.

American higher education is under attack by pundits, plutocrats, and public officials who believe that many professors don't work hard and that what they produce is of little value to society. Most of their attacks are off-base, but there is a grain of truth in their claims. Academics who believe in the mission of higher education—teaching, research, and public service—need to defend academic freedom, but some of our colleagues have to clean up their acts, because it is difficult to defend the indefensible.

There are many academics who write books, articles, and technical papers for colleagues in their own areas of expertise but who also know how to translate their work into prose accessible to the general public. They share a commitment to the idea that colleges and universities, subsidized directly and indirectly by taxpayers, have an obligation to serve society. That means climbing down from the ivory tower and sharing their knowledge with people who aren't academics. Central to the tradition of liberal arts colleges and land-grant universities alike is the notion of "enlightenment": educating, explaining, and illuminating ideas that might be practically useful or simply interesting for their own sake. Over the past decade, there has also been growing interest in getting both professors and students to participate in various forms of public service, including a big increase in student internships and engagement with the world outside the campus.

It is true that some academics write for general audiences in books, magazines, and newspaper articles. But most scholars, and a growing share of them, are content to write books, journal articles, and conference papers that will be read by only a small group of fellow specialists.

I am more than willing to admit that just because I don't understand something doesn't mean it isn't well reasoned or accurate. But a growing share of what is published in academic journals is inaccessible to anyone who isn't a specialist in the field. We live in an era of increasing academic specialization. As academia becomes more and more fragmented and balkanized into ever-narrower niches, an increasing proportion of what academics produce is unnecessarily obscure and abstruse, and, not surprisingly, poorly written. Graduate students read this drivel written by their academic elders, then seek to emulate it, perpetuating the rule of pompous prose.

In the 39 years since I finished graduate school, specialization has become more and more narrow, so that even people in different subfields of the same discipline—say, Japanese history and colonial American history, or Renaissance literature and Southern poetry—aren't expected to understand, or at least judge the quality of, others' work. I realize that all academic disciplines have their own language, concepts, history, disputes, and intellectual paradigms that require a level of specialized knowledge to fully understand. I can read, but I can't understand, most articles published in physics, zoology, or mathematics journals. 

I am a professor with a Ph.D. in sociology who now teaches in a political science department and chairs a department of urban and environmental policy. In other words, I do not have strong disciplinary loyalties and think that the boundaries between many academic fields are pretty blurry. I believe that most social scientists—sociologists, historians, economists, anthropologists, geographers, and political scientists—should be able to read and understand most of what their fellow social scientists write, if only they would write in relatively clear prose. Although I'm not formally trained in English literature, or art history, or other humanities subjects, I can grasp the basic points, if not the nuances, of most articles published by scholars in these fields, if they are written to be understood rather than to impress and intimidate.

In his 1959 book, The Sociological Imagination, the radical sociologist C. Wright Mills mocked fellow sociologist Talcott Parsons not only for his "grand theory" ideas that seemed to have no connection with social reality or serious social problems, but also for his incomprehensible and convoluted prose, such as the following sentence:

A role then is a sector of the total orientation system of an individual actor which is organized about expectations in relation to a particular interaction context, that is integrated with a particular set of value-standards which govern interaction with one or more alters in the appropriate complementary roles.

If Mills were alive today, he'd be saddened by the exponential growth of bad writing by academics, especially by those on the left. The problem of academic jargon is not confined to a single political or ideological wing, but it certainly dominates much of the writing by leftists in the social sciences and humanities. I consider myself a person of the left, and my research and writing, which focus on American politics, urban policy, social movements, and labor studies, generally explore issues of social justice and democracy. But I have little patience for much of what passes for left-wing academic writing in the social sciences and humanities, which emphasizes criticism (often called "deconstructing" or "problematizing") of conservative and liberal ideas and social institutions but makes little or no attempt to figure out how to make things better.

I also have little patience for the embarrassingly abstruse writing style preferred by many postmodern and allegedly leftist academics, which obscures more than it enlightens and is often a clever mask for being intellectually lightweight. Professor Daniel Oppenheimer of Princeton University made a similar point in an article published in the October 2005 issue of Applied Cognitive Psychology entitled "Consequences of Erudite Vernacular Utilized Irrespective of Necessity: Problems with Using Long Words Needlessly." The Atlantic in March 2006 summarized Oppenheimer's point thus: "Insecure writers tend to reach for the thesaurus."

Here are several recent examples from my own field of urban studies.

I was recently asked to review a paper submitted to the Society for Existential and Phenomenological Theory and Culture for its upcoming annual conference at the University of Calgary in Canada. The paper was entitled "Detroit: Sense of Place and Self-Overcoming," which I hoped would have something to do with the class and race struggles of the city's working class, which has suffered due to the decline of its auto industry and Michigan's increasingly right-wing and anti-union policies. Instead, here's how the author summarized his ideas:

We build, maintain, and structure cities. Cities, however, maintain and structure certain attitudes in us. Given the attitudes generated by our sense of a place, critical perspectives that only target overt structures within city systems are incomplete. Jacobs outlines several design aspects of the city that are "Anticity…" Hardt and Negri identify the task of the politics of the metropolis as "…to organize antagonisms against hierarchies and divisions of the metropolis…" To fully engage the attitudes generated by our sense of a place requires what Nietzsche describes as self-mastery. Though important factors, design and politics alone are insufficient.

A prominent urban scholar at Harvard describes his latest research project as follows:

Through a conceptual distinction between concentrated urbanization (agglomeration) and extended urbanization (operational landscapes) … we aim to investigate the historical geographies of the capitalist urban-industrial fabric in ways that supersede inherited metageographical binarisms while opening up new sociological, cartographic and political perspectives on the contemporary global-urban condition.

This same professor recently co-authored an article with two colleagues, published in a journal called City, which they summarized thus:

Theoretical, conceptual and methodological choices must be framed in relation to concrete explanatory and interpretive dilemmas, not ontological foundations. In engaging with the limits and possibilities of recent assemblage-based work in urban studies, our concern has been to help forge new analytical tools for deciphering emerging patterns of planetary urbanization, which have unsettled the coherence and viability of earlier intellectual frameworks. As urbanization is changing, so too must urban theory change, and it must do so in ways that provide critical purchase on emergent sociospatial divisions, conflicts, struggles and transformations at all spatial scales and across divergent places and territories. To this end, responding to several strands of the debate on assemblage urbanism that has unfolded in previous issues of City, here we clarify our meta-theoretical stance, address several methodological questions and reiterate our arguments regarding the importance of a reinvigorated geopolitical economy of planetary urbanization. We insist on the importance of abstraction as a necessary methodological moment in any reflexive approach to urban knowledge formation.

Detroit and other American cities face enormous problems—poverty, homelessness, suburban sprawl, decaying infrastructure, underfunded schools, pollution, racial profiling by cops, and others. Professors and researchers who study and care about cities—and whose work is subsidized directly and indirectly (through foundation grants and government-sponsored financial aid to students) by tax dollars—have an obligation to help address these problems, in part by explaining the roots of the urban crisis and what's needed to address it. It is difficult to see how this kind of abstract theorizing and impenetrable prose contributes to improving our cities.

I am sure that the other people who were on the "Absence of Absences" panel in Tokyo are serious scholars who care deeply about their research and can explain it in plain language if asked to do so. But as long as academics write primarily for tiny niches of other academics in language that obscures more than it enlightens, the general public will justifiably continue to question the value of higher education and whether their hard-earned tax dollars should be invested in the work of scholars who seem to have little interest in making their ideas accessible to the general public or useful to society.


In Praise of Jargon

It's easy to sneer at academic journals. But they make even the best general-interest magazines look like thin gruel.

The Chronicle
By Cass R. Sunstein FEBRUARY 14, 2016

When I served in the Obama administration, some of my colleagues, who had recently been academics, wondered, with something like despair, how they could ever return to academic life. "After this," they wondered, "what could possibly be the point of going back to write academic articles?" When I asked them to elaborate, one of them sent me this quotation from Theodore Roosevelt:

It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat.


Wherever they are, academics are not in the arena. Their faces are rarely marred by dust, sweat, or blood. (If so, they are in the wrong profession.) They do not really know either victory or defeat. (Having an article accepted or rejected doesn't exactly count.) Few people may read what they write, no matter how much they obsess over the title, the abstract, or the concluding paragraphs.

In one sense, academics may know the triumph of high achievement (their work might be superb), but insofar as they write scholarly articles, they cannot be said to "strive to do the deeds." Finishing footnotes is a deed — but hardly "the deeds."

But there is a countervailing argument, and it comes from John Maynard Keynes:

The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist.

Keynes's claim is the ultimate revenge of the nerds. Those "men in the arena"? They're slaves of the theorists. Those who "do the deeds"? They're marching to the beat of someone else's drum. And the drummer, it turns out, is an economist or political philosopher (or perhaps a professor of literature, history, or religion). In the end, it's the critic who counts. From the sidelines, he calls the tune.

Maybe Keynes is right. But his argument is unbecomingly self-serving; it is also very far from self-evidently correct. Do ideas rule the world? Keynes himself had some extraordinarily influential ideas, and so in his own case, that claim has a degree of truth. But are practical men — Theodore Roosevelt, Ronald Reagan, Barack Obama, Ted Cruz — really slaves of defunct economists? And what does "defunct" mean, exactly? Might today's officials, or titans of industry, be acting under the influence of outmoded ideas? Are they marching to the beat of academic drummers from the 1950s and 1960s? Are poets, as Shelley had it, the "unacknowledged legislators of the world"? Are professors of literature their chiefs of staff?

In my view, Keynes wrote far too confidently about the role of intellectual influences. Practical people often seem to operate in accordance not with high theory but with some version of what is taken as common sense, perhaps traceable to intellectuals and theorists, but also a product of the intensely pragmatic judgments of colleagues, friends, and family members, and (perhaps above all) relevant cultural influences (religious organizations, labor unions, public interest groups, political parties, and movies and television, too). Common sense emerges from the interactions of numerous people, extending over time, and it creates the relevant music. (A great critic of Keynes, Friedrich Hayek, had relevant things to say about that, emphasizing how culture emerges spontaneously, not by top-down dictation. But Hayek was also of course an academic, and he continues to influence public officials.)

Still, Keynes had a point. Common sense can be affected by academic influences, and is sometimes a product of them. It might seem self-evidently true that significant increases in the minimum wage would increase unemployment, or self-evidently true that they would not. Political correctness may or may not be politically correct. Scholarship helps determine what views count as common sense. Does it follow that academics enslave practical men? If scholars have the opportunity to serve in the White House, should they return to academe not with despair and a little shame, but with enthusiasm and energy?

My own field is law, where the debates over the usefulness of academic work are heated and, I think, exemplary. In a famous 1936 essay, "Goodbye to Law Reviews," the Yale law professor Fred Rodell wrote:

There are two things wrong with almost all legal writing. One is its style. The other is its content. That, I think, about covers the ground. And though it is in the law reviews that the most highly regarded legal literature — and I by no means except those fancy rationalizations of legal action called judicial opinions — is regularly embalmed, it is in the law reviews that a pennyworth of content is most frequently concealed beneath a pound of so-called style. The average law review writer is peculiarly able to say nothing with an air of great importance. When I used to read law reviews, I used constantly to be reminded of an elephant trying to swat a fly.

Bob Dylan once said, about Alicia Keys, "There's nothing about that girl I don't like." There's nothing I like about Rodell's article. He is smug, and he sneers, and he's full of contempt for both his students and his colleagues. Contempt is not the best thing to be full of, and in this case, it's unjustified. He is disparaging something that has real value. One reason is that truth matters, and academic work in law, as elsewhere, is frequently designed to figure out what's true. Another reason is that such work can be meticulous and it often explores first principles. That takes care and patience, and it's admirable.

Rodell wasn't opposed to the academic enterprise as such, but his article has a whiff of anti-intellectualism. Some modern critics of academe offer a bit more than a whiff. Justice Stephen Breyer was a distinguished professor of law, and he is unusually learned, but he complains: "There is evidence that law review articles have left terra firma to soar into outer space. Will the busy practitioner or judge want to read, in February's Harvard Law Review, 'The Paradox of Extra-legal Activism: Critical Legal Consciousness and Transformative Politics'?" (That's an awkward title, but the topic is not uninteresting.) Or consider the words of Chief Justice John Roberts:

Pick up a copy of any law review that you see, and the first article is likely to be, you know, the influence of Immanuel Kant on evidentiary approaches in 18th-century Bulgaria, or something, which I'm sure was of great interest to the academic that wrote it, but isn't of much help to the bar.

If you do pick up a copy of any law review you see, I doubt you'll find a discussion of that topic, but is it fair to ask whether and to what extent "help to the bar" is the appropriate criterion for legal scholarship?

Breyer and Roberts are speaking of the choice of topic, but there is also pervasive concern about academic writing itself, and it extends far beyond law. Revisiting his argument in 1962, Rodell wrote:

I now put the finger on style, not substance, as the greater evil. … Without a style that conceals all content and mangles all meaning, or lack of same, beneath impressive-sounding but unintelligible gibberish, most of the junk that reaches print in the law reviews and such scholarly journals could never get itself published anywhere — not even there. … I doubt that there are so many as a dozen professors of law in this whole country who could write an article about law, much less about anything else, and sell it, substantially as written, to a magazine of general circulation.

It's true, academic work could rarely be "sold" to such magazines. But why does that matter? The Quarterly Journal of Economics, California Law Review, Psychological Science, Political Science Quarterly, The American Historical Review, and Philosophy & Public Affairs do not have the same audience as Time, The Atlantic, or Commentary. The great essays of Daniel Kahneman and Amos Tversky on heuristics and prospect theory could not be published in general-interest magazines, and that is not a knock against Kahneman and Tversky. No such magazine would publish excerpts from Ronald Coase's papers on bargaining and transactions costs, Edna Ullmann-Margalit's book on social norms, Martin Weitzman's work on the limits of cost-benefit analysis for climate change, or Lisa Randall's technical papers on dark matter. So what?

For the record, Coase, Ullmann-Margalit, Weitzman, and Randall use technical language, but they do not mangle "all meaning, or lack of same, beneath impressive-sounding but unintelligible gibberish." They write for a specialized audience, and that's fine. All of them are interested in fundamental questions, and they approach those questions in ways that many general readers would find obscure. What might seem to be unintelligible gibberish, or jargon, often has precision, shorthand, and nuance that cannot be captured in ordinary language. From an abstract of one of Randall's recent papers:

"FDM also has interesting baryogenesis implications. One possibility is that both DM and baryon asymmetries are simultaneously diluted by a late entropy dump. Alternatively, FDM is compatible with an elegant non-thermal leptogenesis implementation in which decays of a heavy RH neutrino lead to late time reheating of the Standard Model and provide suitable conditions for creation of a lepton asymmetry."

I don't understand it, but that's my problem, not Randall's.

Essays in general-interest magazines could not be published in academic journals, and one reason is a lack of sufficient rigor. They might be clear and beautifully written, but usually they don't add much if anything to the stock of knowledge. In some ways, blogs are the modern equivalent of the magazines of Rodell's time, and even when they are written by professors, they are often glib, cheap, and superficial. That's not an objection to blogs; they have a distinctive audience of their own. (The world's best blog, by the way, is Tyler Cowen and Alex Tabarrok's Marginal Revolution.) But for academics, general-interest outlets are hardly the gold standard.

One of the most valuable recent books of social theory is Well-Being and Fair Distribution (Oxford University Press, 2012) by Matthew Adler, a professor of law at Duke University. I doubt that Chief Justice Roberts would be enthusiastic about it. Consider this sentence, chosen at random:

"More specifically, a prioritarian SWF with a fairly modest degree of inequality aversion would say that we should be indifferent between increasing the well-being of an individual at level U by small amount Δu/K, and increasing the well-being of an individual at level KU by amount Δu."

As with Randall, general-interest readers would struggle to understand that sentence, and would puzzle over its relevance after that struggle. And yet Adler's book is a major contribution. It tells us a great deal about the moral standing of cost-benefit analysis — about its foundations and its limitations. True, it would take a great deal of further work to see how Adler's arguments might be applied; but the foundational analysis is significant. (I kept a copy of Adler's book in my government office.)

It is true that academic writing can be meaningless, that academics often lose themselves in abstractions, and that they sometimes use far too much jargon. Every year or so, every scholar should read George Orwell's "Politics and the English Language," which Richard Posner recommended to me in my first year as a law professor. (Decades later: Thanks, Dick!) Orwell offers valuable guidance that is systematically violated by academics. Part of the problem lies with journal editors, who often discourage creativity with respect to style or format.

More important, academic writing can exemplify what the social theorist Jon Elster calls "hard" and "soft" obscurantism. Hard obscurantism involves formal or mathematical work that can be exceptionally impressive, but that does not illuminate anything about reality; it is a kind of display. When economists bring formal models to bear on social problems, hard obscurantism is at least a risk.

Soft obscurantism is even worse. It involves complex, sometimes impenetrable abstractions and theories, which do not adhere to the proper standards for logical argument. Consider "functional explanations," which try to explain the existence of a phenomenon by pointing to its functions. The fact that some phenomenon X has some function Y does not mean that it exists because it has that function. (Functional explanations are easy to find in Marxism, postmodernism, history, and psychoanalysis, and also in Foucault.) And whenever large abstractions (such as "democracy" and especially "legitimacy") do more work than they should, a form of soft obscurantism is at work.

Even when it avoids obscurantism, some academic work can be seen as a form of exercise — or a marathon — in which the ultimate product is like a work of art, one that might be beautiful, but that will not attract much of an audience. Many superb articles and books fall into this category. As a highly distinguished professor once told me, "We write for our friends." Is excellent work its own reward? There is no obvious answer. Maybe so. What is clear is that academic work often adds to what humanity knows, and that should not be disparaged.

Such work often has a commitment to rigor, care, discipline, and sheer quality — to avoiding both the narrowly ideological and the mischaracterization of other people's arguments. When it comes to scholarship, fairness is a coin of the realm. When they are working well, academic journals discourage arguments that are glib, sloppy, or circular. They require conclusions to be earned. They also require both development of and sympathetic engagement with competing points of view, rather than easy or rapid dismissals. Counterarguments are encouraged, even mandatory. There is a kind of internal morality here, one that is connected with and helps account for some of its rigidity. The morality involves respect for the integrity of the process of argument, which entails respect for a wide range of arguers as well.

The literary critic Wayne Booth has written about the "implied author" — the character or persona behind the text, who may or may not be similar to the actual author. For all their diversity, academic articles tend to have broadly similar implied authors. They are usually careful, formal, earnest, diligent, serious, and fair-minded. They are rarely playful, funny, silly, joyful, angry, tricky, or outraged. There are exceptions, but that is the general pattern. And while the usual implied authors of academic articles may not be people you'd like to go out drinking with, you can probably trust them.

For Rodell, general-interest magazines were the standard, but rigor is not exactly their stock-in-trade. The arguments that can be found there are often lively, engaging, attention-grabbing, result-oriented, careless, and even a little ridiculous. As for people, so too for genres: Their vices are a product of their virtues. Many academics seem strongly drawn to popular outlets, producing blog posts or online columns, where significant numbers of readers might be found, and where publication is more immediate. Indeed, some academics seem to have abandoned — to some extent or altogether — academic journals and books in favor of popular outlets. Rodell might have applauded. But I do not think there is any reason for applause.

I wish, devoutly, that some of the immensely talented academics who write for popular outlets would reallocate some of their time to their academic work, which is less like popcorn. Sure, there's room for different kinds of writing. But whatever else they do, academic journals often display great care and rigor, in a way that can make op-eds, blog posts, and essays in even excellent general-interest magazines look like pretty thin gruel — mere bumper stickers, a kind of wind, even when written by professors.

A point for Keynes: Jargon-filled or not, academic work often orients our politics, our culture, and our lives. Whether or not they are anybody's slaves, those in the arena have good reason to celebrate it.


Cass R. Sunstein is a university professor at Harvard. His latest books, The World According to Star Wars (HarperCollins) and The Ethics of Influence (Cambridge University Press), will both be published later this year. A version of this article will appear in the Michigan Law Review.

Wednesday, February 24, 2016

Should Dictionaries Do More to Confront Sexism?


After a recent controversy over dictionary entries containing examples like "rabid feminist" and "nagging wife," lexicographers must decide whether it's possible to describe the language without sanctioning its ugly side. Credit: Illustration by Roman Muradov

A few days after the Democratic Presidential debate in January, Michael Oman-Reagan, a doctoral student in anthropology at the Memorial University of Newfoundland, was composing a tweet about the problems with American voters, and he was searching for the perfect word. The idea of a "rabid sports fan" popped into his head and struck him as just the attitude he was seeking to convey. He pulled up the definition of "rabid" on his Mac's dictionary, whose content is licensed from Oxford Dictionaries; he wanted to make sure that the word was as pejorative as he intended. It was, and so was the example phrase provided: "a rabid feminist," it read.

Alarmed by what struck him as a dated and offensive construction, Oman-Reagan took a screen shot of the page and tweeted it at Oxford Dictionaries with the suggestion, "maybe change that?" When he woke up the next morning, he found that his rebuke had been retweeted and favorited hundreds of times, and his followers were sending him their own discoveries. Apple's example sentence for "shrill" referenced "women's voices," and the one for the word "psyche" read, "I will never really fathom the female psyche." Oman-Reagan found that the pronouns in entries for "doctor" and "research" were male, while a "she" could be found doing "housework." He kept up his barrage on Oxford, which finally issued a flippant response on Friday: "If only there were a word to describe how strongly you felt about feminism." It added, in a subsequent tweet, "Our example sentences come from real-world use." The online melee that ensued left no one unscathed. Oman-Reagan says that his detractors started at least two online forums devoted to harassing him, while the head of content creation at Oxford Dictionaries, Katherine Connor Martin, told me that watching men's-rights activists defend the dictionary was, for her, "not a proud moment." Oxford ultimately tweeted an apology, with a promise to review the "rabid" example sentence, but made no public mention of "shrill," "psyche," or the other problem entries.

Feminists and linguists have been talking about the sexism that lurks beneath the surface of dictionaries since at least the nineteen-sixties. The question of how to eradicate it is bound up in a broader debate about the role of lexicography: Should dictionaries be prescriptive, establishing a standard of usage, or should they be descriptive, reflecting usage as it exists in the world? In the eyes of editors, their mandate is the latter. As the University of Oxford linguist Deborah Cameron puts it, when Oxford Dictionaries says its examples "come from real-world use," it's suggesting that "the sexism is in the world, and we just describe it." This reasoning turns out not to hold up in the case of "rabid feminist," though: Oxford tweeted that when its lexicographers searched their corpus—the archive of linguistic data, drawn from books, newspapers, and other writing, from which most dictionaries select example sentences—they found that combinations like "rabid fan" and "rabid supporter" were more commonly used; therefore, linguists told me, the entry might warrant adjusting for reasons of accuracy as well as sensitivity. The solution isn't so obvious when it comes to words such as "housework" and "shrill," or in other cases where Oxford's corpus may confirm that the most representative usage is, indeed, a sexist one. To address these larger patterns, dictionary editors—and readers—must decide whether it's possible to hold up a mirror to language without sanctioning its ugly side.

In "Websters' First New Intergalactic Wickedary of the English Language," a feminist dictionary published in 1987, the radical philosopher and activist Mary Daly wrote an entry for a word of her own coinage: "Dick-tionary, n: any patriarchal dictionary: a derivative, tamed and muted lexicon compiled by dicks." Rooting out the sexism in dictionaries was a priority for feminism's second wave. The nineteen-seventies and eighties witnessed a profusion of alternative volumes like Daly's, which highlighted biases that belied mainstream dictionaries' descriptive ideals. Deborah Cameron, writing about feminist dictionaries, has cited the example of the word "lesbian," which wasn't included in the Oxford English Dictionary until 1976, roughly two centuries after it entered common usage. (The O.E.D. is Oxford's "historic" dictionary, intended to reflect the entire evolution of language, whereas the New Oxford American Dictionary, which appears on Apple products in North America, is a "synchronic" dictionary that aspires to provide a snapshot of usage at the time of publication.) When "lesbian" finally was added, the entry included as an example sentence a quote from the writer Cecil Day-Lewis: "I shall never write real poetry. Women never do, unless they're invalids, or Lesbians, or something." Alternative dictionaries—and dyketionaries, as some were called—also critiqued the very concept of a book that could reflect society with dispassionate objectivity. Cameron cites the entry for "cuckold" in "A Feminist Dictionary," published in 1985, which notes that "The wife of an unfaithful husband is just called wife." (In fact, "cuckold" does have a feminine correlate—"cuckquean," which appears in the O.E.D.—but the word is considered obsolete and does not appear in common-use dictionaries.)* Underscoring that sexual double standard is not "neutral"—but neither, feminist linguists argue, is ignoring it, as mainstream dictionaries do.

Lexicographers are more aware of these complexities than most of their readers are. When I called Merriam-Webster, the editor-at-large, Peter Sokolowski, read me a series of internal memos, written in the nineteen-eighties and nineties, which stipulate that terms with feminine endings must have their own entries and be defined on their own terms: "abbess" cannot be included as a subsidiary under "abbot"; just as a "waiter" is "a man who serves food or drinks," a "waitress" should be defined as "a woman who serves food or drinks" rather than as "a female waiter." The Linguistic Society of America's "Guidelines for Nonsexist Usage," released in 1996, urge lexicographers to "avoid peopling your examples exclusively with one sex" and to "avoid gender-stereotyped characterizations" in their illustrative sentences. Steve Kleinedler, the executive editor of the American Heritage Dictionary, told me that he also notes how often women appear as the subjects of verbs, and how often as objects. Oxford's Connor Martin told me, "We have had conversations about the vision of the world expressed through pronouns and example sentences before this time. On the other hand, whatever our personal feelings may be about particular social issues . . . we don't have a political intent."

But the choices about what to include in a dictionary, like the construction of any historical record, are, arguably, inherently political. There is a circular logic to the descriptivist ethos: lexicographers say that the words and meanings they add to the dictionary have already been validated by the public's use, but, to the public, a word's inclusion in the dictionary is the thing that legitimizes it. For this reason, feminist linguists argue that, in some instances, lexicographers should put a thumb on the scale. In a corpus, "it may be that the most common collocate feels a little sexist, or a little something else," Anne Curzan, a historian of English at the University of Michigan, told me. "As an editor, you can decide to use the second most common example." Connor Martin told me that she and her colleagues often look for example sentences without gendered pronouns, especially to illustrate socially "fraught" words. Sarah Shulist, a linguistic anthropologist at MacEwan University, suggests that if the corpus shows gendered usage for a word, like "shrill," lexicographers can choose to reflect that fact, but they should mark it as pejorative instead of presenting it without comment. Dictionaries have increasingly relied on labels to demarcate racist language, for instance, since the National Association for the Advancement of Colored People threatened to boycott Merriam-Webster over its definition of the N-word, in 1997. (The entry back then read, "a black person—usu. taken to be offensive"; today, Merriam-Webster notes that the word "ranks as perhaps the most offensive and inflammatory racial slur in English.") "It's standard practice with slurs to mark them as a slur," Shulist said. "The real question is where to draw the boundary. I think they should move it." 
For lexicographers, making this shift might mean acknowledging that words such as "overbearing" and "hysterical," or "bossy" and "nagging"—two more entries whose gendered examples Oman-Reagan and his followers flagged as sexist—have never been neutral.

After "rabid"-gate, as some linguists have taken to calling it, Connor Martin says that she's more aware than ever of Oxford's role as an arbiter of language—whether she and her colleagues like it or not. "We need to know that the dictionary, as an institution, has a cultural power beyond the sum of its parts," she told me. "And that does carry with it a responsibility to realize that we exist within that tension, and to not always hide behind the idea of descriptivist lexicography." But while it's axiomatic that language evolves, it's in the nature of dictionaries to lag at least one step behind that evolution. The sheer size of a dictionary such as the New Oxford American—not to mention a multi-volume historic dictionary such as the O.E.D.—means that the vast majority of entries will go for years or decades without being formally reëxamined. In the meantime, the responsibility to question the compendiums' contents falls on ordinary users. Curzan told me, "We tend to defer to the dictionary as authoritative. But when it's about ourselves, or people we know and love, we feel more ownership. We feel more authorized to say, 'Wait a minute. That doesn't seem right to me.'"

This post has been changed to clarify that there is a feminine correlate for the word "cuckold."


Su Bai: A Preliminary Survey of Ancient City Sites within Modern Cities

A Preliminary Survey of Ancient City Sites within Modern Cities, by Su Bai. To properly handle the relationship between cultural-heritage preservation and urban development, one must first understand the history of a city's development. And to understand that history, the most important and most practical tool is the identification of archaeological remains. Quite a few of our famous historical cities have remained in use across many dynasties, and are still being renewed and built up today. The historical cities discussed here are mainly those of the Sui dynasty and later; before the Sui, the siting...