Friday, July 27, 2018
The Politics of Installation (装置的政治)
Thursday, July 12, 2018
On Semicolons and the Rules of Writing - The Millions
On Semicolons and the Rules of Writing
1.
Kurt Vonnegut's caution against the use of semicolons is one of the most famous and canonical pieces of writing advice, an admonition that has become, so to speak, one of The Rules. More on these rules later, but first the infamous quote in question: "Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you've been to college."
To begin with the lowest-hanging fruit here—fruit that is actually scattered rotting on the ground—the "transvestite hermaphrodite" bit has not aged well. The quote also, it seems, may have been taken out of context, as it is followed by several more sentences of puzzlingly offensive facetiousness, discussed here.
That said, I also have no idea what it means. My best guess is that he means semicolons perform no function that could not be performed by other punctuation, namely commas and periods. This obviously isn't true—semicolons, like most punctuation, increase the range of tone and inflection at a writer's disposal. Inasmuch as it's strictly true that you can make do with commas, the same argument could be made of commas themselves in favor of the even unfussier ur-mark, the period. But that is a bleak thought experiment unless you are such a fan of Ray Carver that you would like everyone to write like him.
Finally, regarding the college part, two things: First, semicolon usage seems like an exceedingly low bar to set for pretentiousness. What else might have demonstrated elitism in Vonnegut's mind? Wearing slacks? Eating fish? Second, in an era of illiterate racist YouTube comments, to worry about semicolons seeming overly sophisticated would be splitting a hair that no longer exists.
But however serious Vonnegut was being, the idea that semicolons should be avoided has been fully absorbed into popular writing culture. It is an idea pervasive enough that I have had students in my writing classes ask about it: How do I feel about semicolons? They'd heard somewhere (as an aside, the paradoxical mark of any maxim's influence and reach is anonymity, the loss of the original source) that they shouldn't use them. To paraphrase Edwin Starr, semicolons—and rules about semicolons—what are they good for?
As we know, semicolons connect two independent clauses without a conjunction. I personally tend to use em dashes in many of these spots, but only when there is some degree of causality, with the clause after the em typically elaborating in some way on the clause before it, idiosyncratic wonkery I discussed in this essay. Semicolons are useful when two thoughts are related, independent yet interdependent, and more or less equally weighted. They could exist as discrete sentences, and yet something would be lost if they were, an important cognitive rhythm. Consider this example by William James:
I sit at the table after dinner and find myself from time to time taking nuts or raisins out of the dish and eating them. My dinner properly is over, and in the heat of the conversation I am hardly aware of what I do; but the perception of the fruit, and the fleeting notion that I may eat it, seem fatally to bring the act about.
The semicolon is crucial here in getting the thought across. Prose of the highest order is mimetic, emulating the narrator or main character's speech and thought patterns. The semicolon conveys James's mild bewilderment at the interconnection of act (eating the raisins) and thought (awareness he may eat the raisins) with a delicacy that would be lost with a period, and even a comma—a comma would create a deceptively smooth cognitive flow, and we would lose the arresting pause in which we can imagine James realizing he is eating, and realizing that somehow an awareness of this undergirds the act.
An em dash might be used—it would convey the right pause—but again, ems convey a bit of causality that would be almost antithetical to the sentence's meaning. The perception follows temporally, but not logically. In fact, James is saying he doesn't quite understand how these two modes of awareness coexist.
Or consider Jane Austen's lavish use of the semicolon in this, the magnificent opening sentence of Persuasion:
Sir Walter Elliot, of Kellynch Hall, in Somersetshire, was a man who, for his own amusement, never took up any book but the Baronetage; there he found occupation for an idle hour, and consolation in a distressed one; there his faculties were roused into admiration and respect, by contemplating the limited remnant of the earliest patents; there any unwelcome sensations, arising from domestic affairs changed naturally into pity and contempt as he turned over the almost endless creations of the last century; and there, if every other leaf were powerless, he could read his own history with an interest which never failed.
Periods could be ably used here, but they would not quite capture the drone of Elliot's stultifying vanity. Again, form follows function, and the function here is to characterize the arrogantly dull mental landscape of a man who finds comprehensive literary solace in the baronetage. More than that, the semicolons also suggest the comic agony of being trapped in a room with him—they model the experience of listening to a self-regarding monologue that never quite ends. We hardly need to hear him speak to imagine his pompous tone when he does.
The semicolon's high-water mark of usage, as shown here, ran from the mid-18th century to the mid-to-late 19th. This is hardly surprising, given the style of writing during this era: long, elaborately filigreed sentences in a stylistic tradition that runs from Jonathan Swift to the James brothers, a style that can feel needlessly ornate to modern readers. Among other virtues (or demerits, depending on your taste in prose), semicolons are useful for keeping a sentence going. Reflecting on the meaning of whiteness in Moby Dick, Melville keeps the balls in the air for 467 words; Proust manages 958 in Volume 4 of Remembrance of Things Past during an extended, controversial rumination on homosexuality and Judaism. There is a dual effect in these examples and others like them of obscuring meaning in the process of accreting it, simultaneously characterizing and satirizing the boundaries of human knowledge—a sensible formal tactic during an era when the boundaries of human knowledge were expanding like a child's balloon.
Stylistically, the latter half of the 20th century (and the 21st) has seen a general shift toward shorter sentences. This seems understandable on two fronts. First—and this is total conjecture—MFA writing programs came to the cultural fore in the 1970s and over the last few decades have exerted an increasing influence on literary culture. I am far from an MFA hater, but the workshop method does often tend to privilege an economy of storytelling and prose, and whether the relationship is causal or merely correlational, over the last few decades a smooth, professionalized, and unextravagant style has been elevated to a kind of unconscious ideal. This style is reflexively praised by critics: "taut, spare prose" is practically a cliche unto itself. Additionally, personal communication through the 20th century to today has been marked by increasing brevity. Emails supplant letters, texts supplant emails, and emojis supplant texts. It stands to reason that literary writing style and the grammar it favors would, to a degree, reflect modes of popular, nonliterary writing.
Beyond grammatical writing trends, though, semicolons are a tool often used, as exemplified in the Austen and James examples, to capture irony and very subtle shades of narrative meaning and intent. It might be argued that as our culture has become somewhat less interested in the deep excavations of personality found in psychological realism—and the delicate irony it requires—the semicolon has become less useful. Another interesting (though possibly meaningless) chart from Vox displays the trend via some famous authors. As fiction has moved from fine-grained realism into postmodern satire and memoir, has the need for this kind of fine-grained linguistic tool diminished in tandem?
Maybe. In any case, I have an affection for the semi, in all its slightly outmoded glory. The orthographical literalism of having a period on top of a comma is, in itself, charming. It is the penny-farthing of punctuation—a goofy antique that still works, still conveys.
2.
A larger question Vonnegut's anti-semicolonism brings up might be: Do we need rules, or Rules, at all? We seem to need grammatical rules, although what seem to be elemental grammatical rules are likely Vonnegutian in provenance and more mutable than they seem. For instance, as gender norms have become more nuanced, people—myself included—have relaxed on the subject of the indeterminately sexed "they" as a singular pronoun. Likewise, the rule I learned in elementary school about not ending sentences with prepositions. Turns out there's no special reason for this, and rigid adherence to the rule gives you a limited palette to work with (not a palette with which to work).
We know, on some level, that writing rules are there to be broken at our pleasure, to be used in the service of writing effectively, and yet writing is such a difficult task that we instinctively hew to any advice that sounds authoritative, cling to it like shipwrecked sailors on pieces of rotten driftwood. Some other famous saws that come to mind:
Henry James: "Tell a dream, lose a reader."
Elmore Leonard: "Never open a book with weather."
John Steinbeck: "If you are using dialogue—say it aloud as you write it. Only then will it have the sound of speech."
Annie Dillard: "Do not hoard what seems good for a later place."
Stephen King: "The road to hell is paved with adverbs."
And more Kurt Vonnegut: "Every character should want something, even if it is only a glass of water"; "Every sentence must do one of two things—reveal character or advance the action"; "Start as close to the end as possible."
In the end, of course, writing is a solitary pursuit, and for both good and ill no one is looking over your shoulder. As I tell my students, the only real writing rule is essentially Aleister Crowley's Godelian-paradoxical "Do what thou wilt, that shall be the whole of the law." Or alternately, abide by the words of Eudora Welty: "One can no more say, 'To write stay home,' than one can say, 'To write leave home.' It is the writing that makes its own rules and conditions for each person."
Image: Flickr/DaveBleasdale
Adam O'Fallon Price is a writer and teacher living in Carrboro, NC. The paperback edition of his first novel, The Grand Tour, is now available from Anchor Books. His short fiction has appeared in The Paris Review, VICE, The Iowa Review, Glimmer Train, and many other places. His podcast, Fan's Notes, is an ongoing discussion about books and basketball. Find him online at adamofallonprice.com and on Twitter at @AdamOPrice.
Thomas Bayes and the crisis in science – TheTLS
Thomas Bayes and the crisis in science
DAVID PAPINEAU
We are living in a new Bayesian age. Applications of Bayesian probability are taking over our lives. Doctors, lawyers, engineers and financiers use computerized Bayesian networks to aid their decision-making. Psychologists and neuroscientists explore the Bayesian workings of our brains. Statisticians increasingly rely on Bayesian logic. Even our email spam filters work on Bayesian principles.
It was not always thus. For most of the two and a half centuries since the Reverend Thomas Bayes first made his pioneering contributions to probability theory, his ideas were side-lined. The high priests of statistical thinking condemned them as dangerously subjective and Bayesian theorists were regarded as little better than cranks. It is only over the past couple of decades that the tide has turned. What tradition long dismissed as unhealthy speculation is now generally regarded as sound judgement.
We know little about Thomas Bayes's personal life. He was born in 1701 into a well-to-do dissenting family. He entered the Presbyterian ministry after studying logic and theology at Edinburgh and lived in Tunbridge Wells for most of his adult life. Much of his energy seems to have been devoted to intellectual matters. He published two papers during his lifetime, one on theology, and the other a defence of Newton's calculus against Bishop Berkeley's criticisms. The latter impressed his contemporaries enough to win him election to the Royal Society.
The work for which he is best known, however, was published posthumously. Bayes died in 1761, but for some time up to his death he had been working on a paper entitled "An Essay towards Solving a Problem in the Doctrine of Chances". His work was passed on to his friend Richard Price, who arranged for it to be presented to the Royal Society in 1763. Bayes's essay marks a breakthrough in thinking about probability.
Probability theory was in its infancy in Bayes's day. Strange as it may seem, before the seventeenth century nobody could calculate even such simple chances as that of a normal coin landing five heads in a row. It wasn't that the information wouldn't have been useful. There was plenty of gambling before modernity. But somehow no one could get their head around probabilities. As Ian Hacking put it in his groundbreaking The Emergence of Probability (1975), someone in ancient Rome "with only the most modest knowledge of probability mathematics could have won himself the whole of Gaul in a week".
By Bayes's time, the rudiments of probability had finally been forged. Books such as Abraham de Moivre's The Doctrine of Chances (1718) explained the basic principles. They showed how to calculate the probability of five heads on a normal coin (it is 1/32) and indeed more complex probabilities like five heads on a coin biased 75 per cent in favour of heads (that would be 243/1024 – about ¼). At last it was possible for gamblers to know which bets were good in which games of chance.
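For readers who want to check the arithmetic, here is a minimal sketch in Python (mine, not part of Papineau's article; the function name prob_n_heads is just an illustrative label) that reproduces both of these forward probabilities:

```python
from fractions import Fraction

def prob_n_heads(n, p_heads):
    """Probability of n heads in a row for a coin that lands heads with probability p_heads."""
    return p_heads ** n

fair = prob_n_heads(5, Fraction(1, 2))     # five heads on a fair coin: 1/32
biased = prob_n_heads(5, Fraction(3, 4))   # five heads on a 75-per-cent coin: 243/1024

print(fair, biased, float(biased))         # 1/32 243/1024 0.2373046875
```

The second figure, roughly a quarter, is the one de Moivre's readers could have looked up for the biased coin.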
Not that the Reverend Bayes was any kind of gambler. What interested him was not the probability of results given different causes (like the probability of five heads given different kinds of coin). Rather he wanted to know about the "inverse probability" of the causes given the results. When we observe some evidence, what's the likelihood of its different possible causes? Some commentators have conjectured that Bayes's interest in this issue was prompted by David Hume's sceptical argument in An Enquiry Concerning Human Understanding (1748) that reports of miracles are more likely to stem from inventive witnesses than the actions of a benign deity. Be that as it may, Bayes's article was the first serious attempt to apply mathematics to the problem of "inverse probabilities".
Bayes's paper analyses a messy problem involving billiard balls and their positions on a table. But his basic idea can be explained easily enough. Go back to the coins. If five tosses yield five heads in a row, then how likely is it that the coin is fair rather than biased? Well, how long is a piece of string? In the abstract, there's no good answer to the question. Without some idea of the prevalence of biased coins, five heads doesn't really tell us anything. Maybe we're spinning a dodgy coin, or perhaps we just got lucky with a fair one. Who knows?
What Bayes saw, however, was that in certain cases the problem is tractable. Suppose you know that your coin comes from a minting machine that randomly produces one 75 per cent heads-biased coin for every nine fair coins. Now the inverse probabilities can be pinned down. Since five heads is about eight times more likely on a biased than a fair coin, we'll get five heads from a biased coin eight times for every nine times we get it from a fair one. So, if you do see five heads in a row, you can conclude that the probability of that coin being biased is nearly a half. By the same reasoning, if you see ten heads in a row, you can be about 87 per cent sure the coin is biased. And in general, given any observed sequence of results, you can work out the probability of the coin being fair or biased.
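A short sketch of the same reasoning (again illustrative, not drawn from Bayes or Papineau) shows how the minting-machine prior of one biased coin in ten combines with the two likelihoods to give the posteriors quoted above:

```python
from fractions import Fraction

def posterior_biased(n_heads, prior_biased=Fraction(1, 10), bias=Fraction(3, 4)):
    """Probability the coin is biased, given n heads in a row, by Bayes's rule."""
    p_run_fair = Fraction(1, 2) ** n_heads   # likelihood of the run on a fair coin
    p_run_bias = bias ** n_heads             # likelihood of the run on the biased coin
    numerator = p_run_bias * prior_biased
    return numerator / (numerator + p_run_fair * (1 - prior_biased))

print(float(posterior_biased(5)))    # about 0.46 -- "nearly a half"
print(float(posterior_biased(10)))   # about 0.87 -- "about 87 per cent sure"
```

The posterior odds are simply the prior odds (1 to 9) multiplied by the likelihood ratio (about 8 to 1 for five heads), which is why the answer comes out at roughly 8 to 9, or nearly a half.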
Most people who have heard of Thomas Bayes associate him primarily with "Bayes's theorem". This states that the probability of A given B equals the probability of B given A, times the probability of A, divided by the probability of B. So, in our case, Prob(biased coin/five heads) = Prob(five heads/biased coin) x Prob(biased coin) / Prob(five heads).
As it happens, this "theorem" is a trivial bit of probability arithmetic. (It falls straight out of the definition of Prob(A/B) as Prob(A&B) / Prob(B).) Because of this, many dismiss Bayes as a minor figure who has done well to have the contemporary revolution in statistical theory named after him. But this does a disservice to Bayes. The focus of his paper is not his theorem, which appears only in passing, but the logic of learning from evidence.
What Bayes saw clearly was that, in any case where you can compute Prob(A/B), this quantity provides a recipe for adjusting your confidence in A when you learn B. We start off thinking there's a one-in-ten chance of a biased coin but, once we observe five heads, we switch to thinking it's an even chance. Bayes's "theorem" is helpful because it shows that evidence supports a theory to the extent the theory makes that evidence likely – five heads support biasedness because biasedness makes five heads more likely. But Bayes's more fundamental insight was to see how scientific methodology can be placed on a principled footing. At bottom, science is nothing if not the progressive assessment of theories by evidence. Bayes's genius was to provide a mathematical framework for such evaluations.
Bayes's reasoning works best when we can assign clear initial probabilities to the hypotheses we are interested in, as when our knowledge of the minting machine gives us initial probabilities for fair and biased coins. But such well-defined "prior probabilities" are not always available. Suppose we want to know whether or not heart attacks are more common among wine than beer drinkers, or whether or not immigration is associated with a decline in wages, or indeed whether or not the universe is governed by a benign deity. If we had initial probabilities for these hypotheses, then we could apply Bayes's methodology as the evidence came in and update our confidence accordingly. Still, where are our initial numbers to come from? Some preliminary attitudes to these hypotheses are no doubt more sensible than others, but any assignment of definite prior probabilities would seem arbitrary.
It was this "problem of the priors" that historically turned orthodox statisticians against Bayes. They couldn't stomach the idea that scientific reasoning should hinge on personal hunches. So instead they cooked up the idea of "significance tests". Don't worry about prior probabilities, they said. Just reject your hypothesis if you observe results that would be very unlikely if it were true.
This methodology was codified at the beginning of the twentieth century by the rival schools of Fisherians (after Sir Ronald Fisher) and Neyman-Pearsonians (after Jerzy Neyman and Egon Pearson). Various bells and whistles divided the two groups, but on the basic issue they presented a united front. Forget about subjective prior probabilities. Focus instead on the objective probability of the observed data given your hypothesized cause. Pick some level of improbability you won't tolerate (the normally recommended level was 5 per cent). Reject your hypothesis if it implies the observed data are less likely than that.
In truth, this is nonsense on stilts. One of the great scandals of modern intellectual life is the way generations of statistics students have been indoctrinated into the farrago of significance testing. Take coins again. In reality you won't meet a heads-biased coin in a month of Sundays. But if you keep tossing coins five times each, and apply the method of significance tests "at the 5 per cent level", you'll reject the hypothesis of fairness in favour of heads-biasedness whenever you see five heads, which will happen with about one coin in every thirty-two, simply because fairness implies that five heads has a probability of less than 5 per cent.
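A small simulation (mine, not Papineau's) makes the point concrete: even when every coin is fair, the "significant at the 5 per cent level" verdict of bias keeps firing at exactly the rate fairness itself predicts, about one run of five heads in every thirty-two:

```python
import random

random.seed(0)
trials = 100_000   # each trial: toss one fair coin five times
false_positives = sum(
    all(random.random() < 0.5 for _ in range(5))   # did we see five heads in a row?
    for _ in range(trials)
)

# Every coin here is fair, yet the significance test "detects" bias
# in roughly 1/32 of the trials (about 3 per cent).
print(false_positives / trials)
```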
This isn't just an abstract danger. An inevitable result of statistical orthodoxy has been to fill the science journals with bogus results. In reality genuine predictors of heart disease, or of wage levels, or anything else, are very thin on the ground, just like biased coins. But scientists are indefatigable assessors of unlikely possibilities. So they have been rewarded with a steady drip of "significant" findings, as every so often a lucky researcher gets five heads in a row, and ends up publishing an article reporting some non-existent discovery.
Science is currently said to be suffering a "replicability crisis". Over the last few years a worrying number of widely accepted findings in psychology, medicine and other disciplines have failed to be confirmed by repetitions of the original experiments. Well-known psychological results that have proved hard to reproduce include the claim that new-born babies imitate their mothers' facial expressions and that will power is a limited resource that becomes depleted through use. In medicine, the drug companies Bayer and Amgen, frustrated by the slow progress of drug development, discovered that more than three-quarters of the basic science studies they were relying on didn't stand up when repeated. When the journal Nature polled 1,500 scientists in 2016, 70 per cent said they had failed to reproduce another scientist's results.
This crisis of reproducibility has occasioned much wringing of hands. The finger has been pointed at badly designed experiments, not to mention occasional mutterings about rigged data. But the only real surprise is that the problem has taken so long to emerge. The statistical establishment has been reluctant to concede the point, but failures of replication are nothing but the pigeons of significance testing coming home to roost.
Away from the world of academic science and its misguided anxieties about subjectivity, practical investigators have long benefited from Bayesian methods. When actuaries set premiums for new markets, they have no alternative but to start with some initial assessments of the risks, and then adjust them in the light of experience. Similarly, when Alan Turing and the other code-breakers at Bletchley Park wanted to identify that day's German settings on the Enigma machine, they began with their initial hunches, and proceeded systematically on that basis. No doubt the actuaries' and code-breakers' initial estimates involved some elements of guesswork. But an informed guess is better than sticking your head in the sand, and in any case initial misjudgements will tend to be rectified as the data come in.
The advent of modern computers has greatly expanded the application of these techniques. Bayesian calculations can quickly become complicated when a number of factors are involved. But in the 1980s Judea Pearl and other computer scientists developed "Bayesian networks" as a graph-based system for simplifying Bayesian inferences. These networks are now used to streamline reasoning across a wide range of fields in science, medicine, finance and engineering.
The psychologists have also got in on the act. Statisticians might be ideologically resistant to Bayesian logic, but the unconscious brain processes of humans and other animals have no such scruples. If your visual system is trying to identify some object in the corner of the room, or which words you are reading right now, the obvious strategy is for it to begin with some general probabilities for the likely options, and then adjust them in the Bayesian way as it acquires more evidence. Much research within contemporary psychology and neuroscience is devoted to showing how "the Bayesian brain" manages to make the necessary inferences.
The vindication of Bayesian thinking is not yet complete. Perhaps unsurprisingly, many mainstream university statistics departments are still unready to concede that they have been preaching silliness for over a century. Even so, the replicability crisis is placing great pressure on their orthodoxy. Since the whole methodology of significance tests is based on the idea that we should tolerate a 5 per cent level of bogus findings, statistical traditionalists are not well placed to dodge responsibility when bogus results are exposed.
Some defenders of the old regime have suggested that the remedy is to "raise the significance level" from 5 per cent to, say, 0.1 per cent — to require, in effect, that research practice should only generate bogus findings one time in a thousand, rather than once in twenty. But this would only pile idiocy on stupidity. The problem doesn't lie with the significance level, but with the idea that we can bypass prior probabilities. If a researcher shows me data that would only occur one time in twenty if geography didn't matter to hospital waiting times, then I'll become a firm believer in the "postcode lottery", because the idea was reasonably plausible to start with. But if a researcher shows me data that would only occur one time in a thousand if the position of Jupiter were irrelevant to British election results, I'll respond that this leaves the idea of a Jovian influence on the British voter only slightly less crazy than it always was.
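To see why the prior, rather than the significance level, is doing the real work here, consider a toy Bayesian comparison. The numbers are invented purely for illustration and are not in the article: a prior of 0.3 for the reasonably plausible "postcode lottery", a prior of one in a million for the Jovian hypothesis, and an assumed probability of 0.8 of seeing such data if the hypothesis in question were true.

```python
def posterior(prior, p_data_if_true, p_data_if_false):
    """Posterior probability of a hypothesis after seeing the data, by Bayes's rule."""
    num = p_data_if_true * prior
    return num / (num + p_data_if_false * (1 - prior))

# Plausible hypothesis meets modest evidence (data with probability 1/20 if it were false):
print(posterior(prior=0.3, p_data_if_true=0.8, p_data_if_false=0.05))    # about 0.87

# Wildly implausible hypothesis meets stronger evidence (1/1,000 if it were false):
print(posterior(prior=1e-6, p_data_if_true=0.8, p_data_if_false=0.001))  # about 0.0008
```

Even with evidence fifty times stronger, the implausible hypothesis remains vanishingly improbable, which is precisely the point Papineau is making.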
No sane recipe can ignore prior probabilities when telling you how to respond to evidence. Yes, a theory is disconfirmed if it makes the evidence unlikely and is supported if it doesn't. But where that leaves us must also depend on how probable the theory was to start with. Thomas Bayes was the first to see this and to understand what it means for probability calculations. We should be grateful that the scientific world is finally taking his teaching to heart.
David Papineau's most recent book is Knowing the Score: How sport teaches us about philosophy (and philosophy about sport)
The Evils of Cultural Appropriation
In 1961, the anthropologist William Rowe documented a curious feature of Indian village life. When those of a lower caste—particularly if they were "untouchable"—wore clothing containing threads worn by a higher landlord caste, members of the landlord caste tore the garments from them, beat them and fined them. Rowe called this behavior "upcasteing" and observed that it was a serious social offense.
Similar practices barring access to signifiers of a caste or social grouping superior to one's own can be found throughout the historical record. In the 7th century B.C., Greek law stipulated that a free-born woman "may not wear gold jewelry or a garment with a purple border, unless she is a courtesan." In ancient Rome, only Roman senators were allowed to wear Tyrian purple on their togas—ordinary Romans could not. In feudal Japan, people of every class submitted to strict laws about what they could and could not wear, according to their social rank. In Medieval and Renaissance Europe, the nobility policed the clothing of the middle classes, making sure to keep them in their place. In any society in which there have been high levels of inequality—where monarchs and aristocrats have ruled over commoners and slaves—equality in dress has been considered, at the very least, bad manners.
While sumptuary laws (rules that govern conspicuous consumption, especially of food and clothing) fell mostly out of fashion in the West during the Enlightenment period, they appear to be back in style again, thanks to the orthodoxies of social-justice activism fueled by social media. On April 28, a tweet appeared objecting to a white American teenager's choice of a cheongsam (a traditional Chinese dress) for her prom.
The tweet was retweeted nearly 42,000 times and "liked" 178,000 times—viral in social media terms. The flare-up was reported on internationally, and dozens of op-eds both condemning and defending the tweet and the dress spilled forth. Writing in The Independent, Eliza Anyangwe officiously declared that the teenager who wore the offending dress, Keziah Daum, was "the embodiment of a system that empowers white people to take whatever they want, go wherever they want and be able to fall back on: 'Well, I didn't mean any harm.'" The title of the piece was "Cultural Appropriation Is Never Harmless." But it failed to define what cultural appropriation actually is.
For most observers, these complaints are bemusing and baffling. For many, no defense or condemnation of cultural appropriation is required, because such complaints are almost beyond the realm of comprehension in the first place. Without cultural appropriation we would not be able to eat Italian food, listen to reggae, or go to yoga. Without cultural appropriation we would not be able to drink tea or use chopsticks or speak English or apply algebra, or listen to jazz, or write novels. Almost every cultural practice we engage in is the byproduct of centuries of cross-cultural pollination. The future of our civilization depends on it continuing.
Yet the concept was not always so perplexing. Originally derived from sociologists writing in the 1990s, its usage appears to have first been adopted by indigenous peoples of nations tainted by histories of colonization, such as Canada, Australia and the United States. Understandably, indigenous communities have been protective of their sacred objects and cultural artifacts, not wishing the experience of exploitation to be repeated generation after generation. Although one might be puzzled by complaints about a girl wearing a cheongsam to her prom (the United States has never colonized China), even the most tough-minded skeptic should be able to see why indigenous peoples who have historically had their land and territories taken away from them might be unwilling to "share their culture" unconditionally. Particularly when it is applied to the co-opting of a people's sacred and religious iconography for the base purposes of profit-making, the concept of cultural appropriation seems quite reasonable.
Nevertheless, the concept quickly becomes baffling when young Westerners, such as Mr. Lam, of the cheongsam tweet, use the term as a weapon to disrupt the natural process of cultural exchange that happens in cosmopolitan societies in which culture is, thankfully, hybrid. When controversies erupt over hoop earrings or sombrero hats or sushi or braids or cannabis-themed parties, the concept of cultural appropriation appears to have departed from its formerly understood meaning—that is, to protect sacred or religious objects from desecration and exploitation. It appears that these newer, more trivial (yet vicious) complaints are the modern-day incarnation of sumptuary laws.
Elites once policed what their social inferiors could wear, in part to remind them of their inferiority, and in part to retain their own prestige and exclusivity. In Moral Time, the sociologist Donald Black explains that in feudal and medieval societies, sumptuary laws were often articulated with religious or moralizing language, but their intention and effect was simply to provide a scaffold for existing social hierarchies. Writing in the 16th century, the French philosopher Michel de Montaigne made the astute observation in his essay "Of Sumptuary Laws": "'Tis strange how suddenly and with how much ease custom in these indifferent things establishes itself and becomes authority."
***
New orthodoxies of social justice have arguably done the most obvious damage in the world of fiction writing. In the past five years, concerns about cultural appropriation in literature have exploded, with Young Adult writers being "dragged" online for the sins of appropriation while sensitivity readers are employed by publishers to ensure that minorities are represented in a sufficiently "correct" manner. Perhaps not coincidentally, literary sales have reportedly declined by 35 percent over the same time period.
In 2016, a flare-up exploded over author Lionel Shriver's speech at the Brisbane Writers Festival—where she infamously wore a sombrero hat while delivering her speech about freedom in fiction writing. This prompted Australian-Muslim activist Yassmin Abdel-Magied to publish a purple-prose version of the event in The Guardian:
The faces around me blurred. As my heels thudded against they [sic] grey plastic flooring, harmonizing with the beat of the adrenaline pumping through my veins, my mind was blank save for one question.
"How is this happening?"
Yet Shriver's core argument was so self-evident as to be banal. She proposed that if a writer could not put herself in the shoes of her characters—who might be from a different cultural background than herself—then fiction would cease to exist: "The kind of fiction we are 'allowed' to write is in danger of becoming so hedged, so circumscribed, so tippy-toe, that we'd indeed be better off not writing the anodyne novel to begin with," she observed.
Abdel-Magied did not think such an argument was self-evident. On the contrary, she wrote:
In demanding that the right to identity should be given up, Shriver epitomized the kind of attitude that led to the normalization of imperialist, colonial rule: "I want this, and therefore I shall take it."
The attitude drips of racial supremacy, and the implication is clear: "I don't care what you deem is [sic] important or sacred. I want to do with it what I will. Your experience is simply a tool for me to use, because you are less human than me. You are less than human…"
Shriver's full speech can be read here. To argue that it "drips of racial supremacy" is to project something that is simply not contained in the text. She defends fiction as a vehicle for empathy, not derision. Nowhere does she argue that the "right to identity" be "given up."
Yet Shriver's metaphorical kneecapping was complete. Her speech was officially disavowed by the organizers of the festival, who threw together a "Right of Reply" seminar session the following day. It is important to note that it was never suggested that Shriver actually committed colonial crimes that had harmed people of color. It was simply suggested that she was guilty by association, presumably because of the color of her own skin.
Such events are surely baffling to anyone over the age of 30. It was only a few years ago when writers were celebrated for their imaginative and empathic reach if they wrote about characters who were from a different background than the author's own. When Philip Roth wrote probingly and unflinchingly about a character who was black but passed as Jewish in The Human Stain, his book became a national bestseller and won numerous accolades including the New York Times "Editor's Choice" Award and the PEN/Faulkner Award. When the book was released, the critic Michiko Kakutani offered the following review:
It is a book that shows how the public zeitgeist can shape, even destroy, an individual's life, a book that takes all of Roth's favorite themes of identity and rebellion and generational strife and refracts them not through the narrow prism of the self but through a wide-angle lens that exposes the fissures and discontinuities of 20th-century life.
In 2013, GQ listed The Human Stain as one of the best books of the 21st century. Arguably, such a book could not be written today, and would almost certainly cause a firestorm if published. That's a pretty sharp turnaround in sensibility in a very short period of time.
The context surrounding the drama at the Brisbane Writers Festival is important for understanding why it happened. Abdel-Magied migrated to Australia at the age of two and, from a relatively young age, entered the public sphere as a "model minority." She was an articulate activist and an accomplished student (becoming an engineer and memoir writer) who appeared capable of promoting a modern, sophisticated image of an urban Muslim-Australian. For her activism, she was showered with awards and publicly funded appointments, and given international trips for the express purpose of promoting Australia abroad. Yet for all her accomplishments, accolades, money, and travel opportunities—or perhaps in exchange for them—the young woman was stuck with the felt identity of a victim. This apparent feeling of victimhood was so strong that she interpreted arguments for creative license in art to be "lay[ing] the foundation for genocide."
Many people—both then and now—find it hard to understand how such complaints can come from a place of good faith. Activists like Abdel-Magied seem unwilling to empathize with those who may genuinely want to show appreciation for cultures which are not their own, or writers who genuinely want to empathize with those who are different or marginalized, or simply to reach beyond a single layer or caste of the multicultural societies in which they live, an ambition for which writers and thinkers have historically been applauded.
What also seems odd is that activists like Abdel-Magied rarely appear to attempt to persuade others to engage with the foreign cultures they are purportedly defending in more sensitive or better-informed ways. Rather, their complaints have a hectoring, absolutist quality, focusing on the disrespect and lack of deference that white people have shown them. Listening to these complaints, it is difficult to come away with the view that they are about anything other than exercises in power. While she is an effective social-media activist, Abdel-Magied is not a particularly good writer, which makes identity-as-victim a valuable currency at a writers festival. If literature is not reducible to identity, and representation is not a group property, then her own claim to literary significance would be a dubious one.
It is by considering the power dynamics at play that the logic of cultural appropriation starts to become clear. In a culture that increasingly rewards victimhood with status, in the form of op-ed space, speaking events, awards, book deals, general deference, and critical approbation, identity has become a very valuable form of currency. It makes sense that people will lie, cheat, and steal in order to get some. Expressing offense over a white person wearing a sombrero hat might seem ridiculous on its face—but for those who live inside these sententiously moralistic bubbles, it may be both a felt injury and a rational strategic choice.
Complaints about cultural appropriation are not really complaints, they are demands. When Abdel-Magied walked out on Shriver, it was not because of her insensitivity, it was because of her defiance: her refusal to kowtow to the orthodoxies written up by her moral betters, from which Abdel-Magied's own claims to significance and social status are derived.
In their newly released book, The Rise of Victimhood Culture: Microaggressions, Safe Spaces, and the New Culture Wars, the moral sociologists Bradley Campbell and Jason Manning describe the three main moral cultures that exist today, which they give the shorthand labels of dignity, honor, and victimhood. A dignity culture, which has been the dominant moral culture of Western middle classes for some time, has a set of moral values that promotes the idea of moral equality and was crystallized in Martin Luther King Jr.'s vision that people ought to be judged according to the content of their character, not the color of their skin.
Victimhood culture departs from dignity culture in several important ways. Moral worth is in large part defined by the color of one's skin, or at least one's membership in a fixed identity group: i.e., women, people of color, LGBTIQ, Muslims, or indigenous peoples. Such groups are sacred, and a lack of deference to them is seen as a sign of deviance. The reverse is true for those who belong to groups that are considered historical oppressors: whites, males, straight people, Zionists. Anyone belonging to an "oppressor" group is stained by their privilege, or "whiteness," and is cast onto the moral scrapheap.
In a recent interview in the online magazine which I edit, Quillette, I asked Campbell and Manning what they thought about cultural appropriation. They explained that they found such complaints baffling, like everybody else, but that they also "illustrate victimhood culture quite well." One of the key components of victimhood culture is its projection of collective guilt: social offenses between individuals are no longer about the actual people involved; they are about "one social group harming another."
One might make the case that while complaints about cultural appropriation are annoying, they are ultimately harmless. What is the harm in showing deference to peoples who have historically been the victims of exploitation, discrimination, and unfair treatment? What is the harm in showing respect and compliance with these new rules—isn't it a way of making up for past sins?
The short answer to these questions is, no. The notion that a person can be held responsible for actions that he or she did not commit strikes at the very heart of our conception of human rights and justice.
We used to take calls for collective punishment much more seriously. In the 1949 Geneva Convention it was determined that: "No protected person may be punished for an offense he or she has not personally committed." Collective punishment was seen as a tactic designed to intimidate and subdue an entire population. The drafters of the Geneva Convention clearly had in mind the atrocities committed in WWI and WWII where entire villages and communities suffered mass retribution for the resistance activities of a few. In their commentary on the outlawing of collective punishment the International Red Cross stated: "A great step forward has been taken. Responsibility is personal and it will no longer be possible to inflict penalties on persons who have themselves not committed the acts complained of."
In times of peace, collective punishment may come in the form of social media dust-ups over sombrero hats or Chinese dresses. Gradual softening on the taboo of collective punishment does not bode well for the health of liberal democracies. Which is also why it is important for us all to remember that social-justice activists who complain about cultural appropriation only represent themselves, and not the minority groups to which they belong. When Jeremy Lam complained about Keziah Daum's cheongsam, other Asian-Americans and mainland Chinese people rebuked him for being "offended over nothing." Hundreds of Chinese and Asian-American people have since responded to Daum's original tweet, commenting on how pretty she looked in the dress and how they felt that she did nothing wrong.
As these responses make clear, victimhood culture only exists in certain insulated bubbles. It is by no means the dominant moral culture in the United States, nor any other nation—yet. Most of us still live in a dignity culture where equality, diversity, and inclusion remain paramount principles to aspire to.
As the drama of social-justice activism unfolds, one of the biggest mistakes we could make is to think that the loudest and most dogmatic voices of minority groups speak for these groups as a whole. Minority groups are not homogenous, and are full of people who are conservative, liberal, apolitical, and dissenting. Just because a progressive activist claims to speak on behalf of a marginalized group does not mean we have to indulge their delusions of grandeur.
It also would be a mistake to think that multiculturalism has been a failure. Human beings have a remarkable capacity for cooperation and solidarity with each other—including those whom they are not related to—when they feel that their neighbor is on their team. What helps people move past cultural, ethnic, or religious differences is sharing in a common goal, and feeling a common bond of humanity with their fellow citizens. When activists draw attention to differences between cultures, and attempt to draw boundaries around them, punishing those who step out of line, this sets the stage for escalating intolerance and division. We must not let them succeed.
***
Claire Lehmann is the Sydney-based Editor-in-Chief of Quillette.
Why the novel matters
Why the novel matters We read and write fiction because it asks impossible questions, and leads us boldly into the unknown. By Deborah Le...
-
Bibliographic note: In 1895, the young art historian Aby Warburg traveled to America and made a special trip to the Pueblo Indian reservations of New Mexico to observe the local religious ceremonies and dances. The experience only took written form, and became known, twenty-eight years later, as the lecture Warburg delivered upon his discharge after recovering from mental illness. Symbols, memory, and ritual from the two continents kept churning in the art historian's mind, as the ancient and alien...
-
https://mp.weixin.qq.com/s/4dqIXI2SW3WwJ23PUvekOQ Mr. Qian's notebooks and daily journals were made for himself, as material for his essays and books; he did not want them shown to others [Note 1], still less photo-reproduced and published: good conduct leaves no tracks, and a skilled craftsman does not show others his uncut jade. In the journals, Mr. Qian's own deletions and emendations are mostly marked with thin strokes of light ink, belonging to the "tech...
-
Lyotard: Freud According to Cézanne (1971). By Jean-François Lyotard; translated by Yu Hang, doctoral student in the School of Humanities, Tongji University. Introduction: The article was originally titled "Psychanalyse et peinture" and was included in 1971 in Volume 13 of the Grand Encyclopedia (En...