Friday, July 27, 2018

Politics of Installation

POLITICS OF INSTALLATION
By Boris Groys (鲍里斯·格罗伊斯)
Translated by Chen Ronggang (陈荣钢; WeChat: hakuna111)
【*】 marks a translator's note.
Today, the art field is often equated with the art market, and the artwork is regarded primarily as a commodity. No doubt artworks function within the art market, and every artwork is a commodity; yet art is also made and exhibited for people who have no wish to become art collectors.
In fact, it is these people who make up the majority of the art public. Visitors to the representative exhibitions of our time (biennials, triennials, Documentas) rarely look at the works on display as commodities, and such large-scale exhibitions are multiplying.
Despite all the money and effort invested in these art exhibitions, they exist primarily not for art buyers but for the public, for the anonymous visitor who may never purchase a work of art.
Likewise, art fairs, while ostensibly serving art buyers, are increasingly turning into public events, attracting crowds with little interest in art or without the means to buy it.
The art system is thus on its way to becoming part of the very mass culture that it has long sought to observe and analyze from a distance.
Art becomes part of mass culture not as a source of individual works to be traded on the art market, but as an exhibition practice that combines architecture, design, and fashion, just as it was envisaged by the pioneers of the avant-garde of the 1920s, the Bauhaus, VKhUTEMAS, and others.【*】
【*】VKhUTEMAS (the Higher Art and Technical Studios) was a state art and technical school founded in Moscow in September 1920, often called "the Russian Bauhaus." It was a center of three avant-garde movements in art and architecture: Constructivism, Rationalism, and Suprematism. Avant-garde artists such as Malevich, Popova, Rodchenko, and Lissitzky taught there. Under political and internal pressure the school was dissolved in 1930, and its teachers, students, and assets were distributed among six other schools. The image shows a poster designed by Rodchenko in the 1920s.
It therefore becomes ever harder to distinguish between the two main figures of the contemporary art world: the artist and the curator.
The traditional division of labor within the art system was clear: artworks were made by artists and then selected and exhibited by curators. But this division of labor has collapsed, at least since Duchamp.
Today there is no longer any "ontological" difference between making art and displaying it. In the context of contemporary art, to make art is to show things as art. The question then arises: if the production and the exhibition of art coincide, how can we still distinguish the role of the artist from that of the curator?
I would argue that this distinction remains possible, and I want to draw it by analyzing the difference between the standard exhibition and the artistic installation.
The conventional exhibition is conceived as an accumulation of art objects placed in an exhibition space to be viewed one after another. In this case the exhibition space functions as an extension of neutral, public urban space, something like a side street that passersby enter after paying an admission fee.
The movement of visitors through the exhibition space resembles that of someone walking down a street and observing the architecture of the houses to the left and right. It is no accident that Walter Benjamin built his "Arcades Project" around this analogy between the urban flâneur and the exhibition visitor.
In this scenario, the body of the viewer remains outside the art: art happens before the viewer's eyes, as an art object, a performance, or a film.
Accordingly, the exhibition space is understood here as an empty, neutral, public space, the symbolic property of the public. The sole function of such a space is to make the artworks placed inside it easily viewable by the visitor.
The curator administers this exhibition space in the name of the public, as the public's representative. The curator's role is thus to safeguard the public character of the public space while bringing individual artworks into it, making them accessible to the public, publicizing them.
Obviously, an individual artwork cannot assert its presence by itself and force the viewer to look at it. It lacks the vitality, energy, and health to do so. By origin, it seems, the artwork is sick and helpless; in order to be seen, the viewer must be brought to it, as visitors are led by hospital staff to a bedridden patient.
It is no coincidence that the word "curator" is etymologically related to "cure": to curate is to cure. Curating cures the powerlessness of the image, its inability to show itself.
Exhibition practice is thus the therapy that heals the originally ailing image, giving it presence and visibility; the image is brought before the public and becomes an object of public judgment. One could say, however, that curating functions as a supplement, like a pharmakon in the Derridean sense: it cures the image while further contributing to its illness.
The iconoclastic potential of curating was initially applied to the sacred objects of the past, which were presented as neutral in the modern museum or Kunsthalle: in the empty exhibition space they appear as mere artworks. In fact, curators, including museum curators, originally produced art in the modern sense of the word.
The first art museums, founded in the late eighteenth and early nineteenth centuries and expanded over the course of the nineteenth, owed their holdings to imperial conquest and the plunder of non-European cultures. They collected "beautiful" functional objects previously used for religious rites, interior decoration, or the display of private wealth, and exhibited them as works of art, that is, as defunctionalized objects "in themselves," there purely to be looked at.

A Mughal imperial wine cup in the collection of the Victoria and Albert Museum, London (V&A, founded 1852)
All art originates in design, whether religious design or the design of power. In modernity, too, design precedes art. Looking for modern art in today's museums, we have to be aware of what counts there as art.
It is defunctionalized fragments of design, from Duchamp's urinal to Warhol's Brillo Boxes, and utopian design, from the German Jugendstil to the Bauhaus, from the Russian avant-garde to Donald Judd, design meant to shape the "new life" of the future.

Andy Warhol's installation Brillo Boxes, completed in 1964 (Philadelphia Museum of Art, PMA)
Art is design that has been put out of function because the society that provided its basis suffered historical collapse, like the Inca Empire or Soviet Russia.
In the course of modernity, however, artists began to assert the autonomy of their art, an autonomy from public opinion and public taste. Artists claimed the right to make sovereign decisions about the content and form of their work, without owing the public any explanation or justification.
And they were granted this right only to a certain degree. The freedom to make art as one wishes does not guarantee that an artist's work will be exhibited in public space. Any artwork included in a public exhibition must be, at least potentially, publicly explained and justified.
Artists, curators, and art critics are free to argue for or against the inclusion of particular works, but every such explanation and justification undermines the autonomous, sovereign character of artistic freedom to which modernist art aspired.
Every discourse that legitimizes an artwork can be regarded as an insult to that artwork. This is why the curator is seen as someone who keeps interposing himself between artwork and viewer, leaving artist and viewer alike feeling disempowered.
The art market therefore appears more favorable than the museum or Kunsthalle to modern, autonomous art. On the art market artworks circulate singularized, decontextualized, uncurated, which apparently gives them the opportunity to demonstrate their sovereign origin without any mediation.
The art market functions according to the rules of the potlatch as described by Marcel Mauss and Georges Bataille.【*】 The artist's sovereign decision to make an artwork beyond any justification is trumped by the private buyer's sovereign decision to pay for that artwork a sum of money beyond any comprehension.
【*】In his anthropological study The Gift, Marcel Mauss developed the theory of the potlatch, a system of gift-giving with political, religious, kinship, and economic dimensions. The economies of these societies are marked by competitive gift exchange, in which givers seek to outdo their rivals in order to secure more important political, kinship, and religious roles. Besides certain Indigenous traditions of the Pacific Northwest coast of North America, this type of gift economy also appears in many tropical island cultures. Influenced by The Gift and by Nietzsche's On the Genealogy of Morality, Georges Bataille developed the theory further in The Accursed Share.
Now, the installation does not circulate. Rather, it installs everything that usually circulates in our civilization: objects, texts, films, and so on. At the same time it changes the role and function of the exhibition space in a very radical way.
The installation operates by symbolically privatizing the public space of the exhibition. It may look like a standard, curated exhibition, but its space is designed according to the sovereign will of an individual artist, who is not obliged to publicly justify the selection of objects or the organization of the installation space as a whole.
Installation is often denied the status of a specific art form because its medium is not obvious. Traditional art media are each defined by a particular material support: canvas, stone, or film. The material support of the medium of installation is space itself.
That does not mean, however, that the installation is somehow "immaterial." On the contrary, the installation is material par excellence, for it is spatial, and being in space is the most general definition of being material.
The installation transforms the empty, neutral, public space into an individual artwork, and it invites the visitor to experience this space as the holistic, totalizing space of a single work. Anything included in such a space becomes part of the artwork simply because it is placed inside it.
The distinction between art object and simple object becomes irrelevant here. What becomes crucial instead is the distinction between the marked installation space and the unmarked public space.
When Marcel Broodthaers presented his installation Musée d'Art Moderne at the Düsseldorf Kunsthalle in 1972, he placed a sign beside each exhibit that read: "This is not a work of art."
And yet his installation as a whole was considered an artwork, and not without reason. In this respect the installation resembles a curated exhibition.
But that is precisely the point. Here, selection and mode of presentation are the sovereign prerogative of the artist alone, resting entirely on personal decisions that require no further explanation or justification. The artistic installation is a way of extending the domain of the artist's sovereignty from the individual art object to the space of the exhibition itself.
This means that the artistic installation is a space in which the difference between the sovereign freedom of the artist and the institutional freedom of the curator becomes immediately visible.
The regime under which art operates in contemporary Western culture is generally understood as one that grants art freedom. But the freedom of art means different things for the curator and for the artist.
As I have mentioned, the curator, including the so-called independent curator, ultimately makes his or her choices in the name of the democratic public. Indeed, in order to be accountable to the public, a curator does not need to be part of any fixed institution: he or she is already an institution by definition.
The curator is accordingly obliged to justify his or her choices publicly. Of course, the curator is supposed to have the freedom to present his or her arguments to the public, but this freedom of public discussion has nothing to do with the freedom of art, which must be understood as a private, personal, subjective freedom beyond any argumentation, explanation, or justification.
Under the regime of artistic freedom, every artist has the sovereign right to make art according to his or her own imagination. The autonomous decision to make art in this way or that is generally accepted by Western liberal societies as sufficient reason for regarding an artist's practice as legitimate.
An artwork can, of course, also be criticized and rejected, but it can only be rejected as a whole. It makes no sense to criticize any particular choice, inclusion, or exclusion made by the artist. In the same sense, the whole space of an artistic installation can also only be rejected as a whole.
To return to Broodthaers: nobody would criticize the artist for having overlooked this or that particular image, or take him to task over one particular eagle in his installation.

Marcel Broodthaers's installation on view at the Düsseldorf Kunsthalle, 1972
One could say that in Western societies the notion of freedom is deeply ambiguous, not only in the art field but in the political field as well.
In the West, freedom is understood as permitting private, sovereign decisions in many areas of social practice, such as personal consumption, the use of private capital and property, or the choice of one's religion. In other areas, however, especially in politics, freedom is understood primarily as the legally guaranteed freedom of public discussion: a non-sovereign, conditional, institutional freedom.
Of course, in our societies private, sovereign decisions are controlled to a certain degree by public opinion and political institutions (we all know the famous slogan "the personal is political"). On the other hand, public political discussion is interrupted again and again by the private, sovereign decisions of political actors and manipulated by private interests.
Artist and curator thus embody, in a very conspicuous way, two different kinds of freedom: the sovereign, unconditional, publicly unaccountable freedom of art making, and the institutional, conditional, publicly accountable freedom of curatorship.
This also means that the artistic installation, in which the act of producing art coincides with the act of presenting it, becomes the perfect terrain for revealing and exploring the ambiguity at the core of the Western notion of freedom. Hence, over the past decades we have seen the emergence of innovative curatorial projects that seem to empower curators to act the way artists do.
We have also seen artistic practices that seek collaboration, democratic participation, communication, and empowerment. Indeed, the artistic installation is today often regarded as a form that allows the artist to democratize his or her art, to take on public responsibility, to begin to act in the name of a certain community or even of society as a whole.
In this sense, the emergence of the artistic installation seems to mark the end of the modernist quest for autonomy and sovereignty. The artist allows a multitude of visitors to enter the space of the artwork, as if opening the enclosed space of the work to democracy.
This enclosed space seems to transform itself into a platform for public discussion, democratic practice, communication, networking, education, and so forth. But this reading of installation practice tends to overlook the symbolic act of privatizing the public space of the exhibition, an act that precedes the opening of the installation space to its visitors.
As noted earlier, the space of the traditional exhibition is a symbolic public property, and the curator who administers this space acts in the name of public opinion. The visitor to a typical exhibition remains on his or her own territory, the symbolic owner of the space in which the artworks are delivered to his or her gaze and judgment.
The space of an artistic installation, by contrast, is the symbolic private property of the artist. On entering this space, the visitor leaves the public territory of democratic legitimacy and enters a space of sovereign, authoritarian control.
The visitor, one could say, is an expatriate here, a stranger in a foreign land, who must submit to a foreign law, a law given by the artist. Here the artist, as sovereign of the installation space, acts as its legislator.
One might say that the practice of installation reveals the act of unconditional, sovereign violence that initially establishes any democratic order. We know that democratic order never comes about democratically; democratic order always emerges from violent revolution.
To establish a law is to break the law. The first legislator can never act legitimately. He installs the political order but does not belong to it, even if he later decides to submit himself to the order he has established.
The author of an artistic installation is likewise such a legislator: he gives the community of visitors the space in which to constitute itself and defines the rules to which this community must submit, yet he does not belong to this community and remains outside it.
And this remains true even if the artist decides to join the community he or she has created.
Nor should one forget that after initiating a certain order, a politics, a community of visitors, the installation artist must rely on art institutions to maintain this order, to keep watch over the fluid politics of the installation's visitors.
Writing on the role of the police in a sovereign state, Jacques Derrida argued that although the police are supposed merely to supervise the functioning of certain laws, they in fact also participate in creating them. To maintain a law also means to permanently reinvent that law.
Derrida tried to show that the violent, revolutionary, sovereign act of establishing law and order can never be fully erased afterward; this initial act of violence can be, and always will be, mobilized anew.
This is especially obvious in our present era, in which democracy is being violently exported, implanted, and secured. Nor should one forget that installations can be moved.
The artistic installation is not bound to a particular site; it can be installed anywhere and at any time. And we should harbor no illusions that there could be such a thing as a completely chaotic, Dadaistic installation space free of all control.
The Marquis de Sade imagined a totally free society, one that abolished all existing laws and laid down only one: everyone must do what he or she pleases, including committing crimes of any kind.
What is especially interesting is that Sade speaks at the same time of the necessity of law enforcement, to prevent citizens of traditional convictions from attempting to return to the old repressive state in which families were secure and crime was prohibited. So we also need a police force to protect these crimes against reactionary nostalgia for the old moral order.
Yet the act of violence that constitutes a democratically organized community does not contradict the nature of democracy. Sovereign freedom is obviously non-democratic; it even seems anti-democratic. However paradoxical it may appear at first glance, sovereign freedom is a necessary precondition for the emergence of any democratic order.
Again, the practice of installation art is an excellent illustration of this rule. The standard art exhibition leaves the visitor alone, allowing him or her to individually confront and contemplate the exhibited art objects.
Moving from one object to the next, such an individual visitor necessarily overlooks the exhibition space as a whole, including his or her own position within it. The artistic installation, by contrast, builds a community of viewers precisely because of the holistic, unifying character of the installation space.
The true visitor to the art installation is not an isolated individual but a collective of visitors. Such an art space can only be perceived by a mass of visitors, and this mass becomes part of every exhibition staged there, and vice versa.
This dimension of mass culture is often overlooked, yet it manifests itself especially clearly in the context of art: a pop concert or a film screening creates a community among its participants.
The members of these transitory communities do not know one another; their structure is accidental. It is unclear where they came from or where they are going; they have little to say to each other; they lack a common identity or a shared past that could furnish them with common memories.
Nevertheless, they are communities, much like the communities of travelers on a train or an airplane.
In other words, these are the communities characteristic of contemporary society, far more so than religious, political, or work-based communities.
All traditional communities are based on the premise that their members are connected from the very beginning by something originating in the past: a common language, a common faith, a common political history, a common upbringing.
Such communities draw boundaries between their own members and the strangers with whom they share no past. The communities created by mass culture, by contrast, transcend such boundaries, and precisely herein lies their enormous modernizing potential, a potential that is frequently overlooked.
Yet mass culture itself cannot adequately reflect and make manifest this potential, because the communities it creates do not fully recognize themselves as such. The same holds for the crowds that move through the standard exhibition spaces of contemporary museums and Kunsthallen.
I have always been struck by the claim that the museum is elitist. My own experience is precisely the opposite: I am one of the crowds of visitors who constantly stream through exhibitions and museums.
Anyone who has ever looked for a parking space near a museum, tried to drop off a coat at a museum cloakroom, or needed to find the museum's toilet has good reason to doubt the elitist character of this institution, especially of those museums reputed to belong to the elite, such as the Metropolitan Museum or the Museum of Modern Art (MoMA) in New York.
Today's flows of global tourism make any notion of the museum as elitist a ridiculous assumption. Yet these crowds do not reflect upon themselves; they do not constitute any politics.
Nor can pop-music "fans" or movie audiences be expected to go that far: to adequately perceive and reflect upon the space they occupy (facing the stage or the screen) or the community they form. Provoking such reflection is what contemporary art does, in installation art as in other experimental curatorial projects.
The relative spatial seclusion the installation offers does not mean a turning away from the world, but rather the delocalization and deterritorialization of the transitory communities of mass culture; that is, it gives these communities a chance to exhibit themselves to themselves in a way that helps them reflect on their own condition.
The space of contemporary art is a space in which the multitude can view and celebrate itself, as God or kings were once viewed and celebrated in churches and palaces.【*】

【*】The museum photographs taken by Thomas Struth in the 1990s capture this dimension very well.
Above all, the installation offers the fluid, circulating multitude the "aura" of the "here and now." The installation is, in Benjamin's terms, a mass-cultural version of flânerie: a place where the aura emerges, a site of "profane illumination."
In general, the installation operates as a reversal of reproduction: it takes a copy out of the unmarked, open space of anonymous circulation and places it in a fixed, stable, closed, topologically well-defined context of the "here and now."
Our contemporary condition cannot, therefore, be described as the "loss of the aura," as the circulation of copies beyond any "here and now," in the way Benjamin described it in his famous essay "The Work of Art in the Age of Mechanical Reproduction."
Rather, the contemporary age stages a complex interplay of dislocation and relocation, of deterritorialization and reterritorialization, of de-auratization and re-auratization.
On Benjamin's premise, an artwork that loses its unique, original context loses its aura forever: it becomes its own copy.
To restore the aura of an artwork would then require sacralizing the entire profane space of mass circulation through which it travels, which is, of course, a totalitarian, fascist project.
Here lies the main problem to be probed in Benjamin's thought. He takes the space in which copies circulate to be a universal, neutral, homogeneous space, and he insists on the visual recognizability, the self-identity, of the copy as it circulates through contemporary culture.
Both of these central presuppositions are questionable. Within the framework of contemporary culture, an image is permanently circulating from one medium to another and from one closed context to another.
A film clip, for instance, can be shown on a cinema screen, converted to digital form and posted on someone's website, used as an example in a conference presentation, watched on a television set in a private living room, or placed in a museum installation.
As it passes through these different contexts and media, this short piece of film is transformed by different program languages, different software, different framings on the screen, different placements within the installation space, and so on.
Are we dealing, then, with the same film clip each time? Is it the same copy of the same original?
The topology of today's networks of communication, generation, translation, and distribution of images is extremely heterogeneous. Images are constantly transformed, rewritten, re-edited, and reprogrammed as they move through these networks, and with every step they are visually altered. Their status as copies becomes a matter of everyday cultural convention, as the status of the original once was.
Benjamin assumed that the new technology would be capable of making copies ever more faithful to the original; in fact, the opposite is the case. Modern technology thinks in generations: to transfer information from one generation of hardware and software to the next is to transform it significantly. The metaphorical notion of "generations," as now applied to technology, is especially revealing here.
We all know what it means to transmit a cultural heritage from one generation to the next. There are no eternal copies, because there are no eternal originals. Reproduction is as much infected by originality as originality is infected by reproduction.
Circulating through different contexts, a copy becomes a series of different originals.
Every change of context, every change of medium, can be interpreted as a negation of the status of the copy, as an essential rupture, a new beginning that opens a new future. In this sense a copy is never really a copy, but rather a new original, in a new context.

Cai Guo-Qiang realizing Sky Ladder for the last time, on Huiyu Island, Quanzhou
Every copy is itself a flâneur: it experiences, time and again, its own "profane illuminations" that turn it into an original; it loses the old aura and acquires a new one.
It may remain the same copy, but it becomes a different original. This was the postmodern project, which reflected the repetitive, iterative, reproductive character of the image; and this is why postmodern art often looks "new," even though it was directed against the very concept of the new.
How, then, do we recognize a particular image as "original" or as a "copy"? It depends on the context, on the scenario in which that decision is made.
And this decision is always a contemporary decision: it belongs neither to the past nor to the future, but to the present. It is also always a sovereign decision; indeed, the installation is a space of decision, a space in which the "profane illumination" of the "here and now" is accomplished.
One can therefore say that the practice of installation art demonstrates the dependence of any democratic space on the private, sovereign decisions of an artist who acts as its legislator. This was something the ancient Greek thinkers knew very well, for it lay at the origin of the early democratic revolutions; but more recently this knowledge has somehow been repressed by the dominant political discourse. Especially since Foucault, we have tended to locate the sources of power in impersonal agencies, structures, rules, and protocols.
This fixation on the impersonal mechanisms of power, however, has led us to overlook the importance of individual, sovereign decisions and actions taking place in private, heterotopic spaces (to borrow another of Foucault's terms). Modern democratic power, likewise, has "meta-social," "meta-public," heterotopic origins.
As noted, the artist who designs a certain installation space is an outsider to that space; he or she is heterotopic to it. But the outsider is not necessarily someone who has to be included in order to be empowered. There is also an empowerment by exclusion, and especially by self-exclusion.
The outsider is powerful precisely because he or she is not subject to social control, because his or her sovereign acts are not limited by any public discussion or any need for public self-justification. It would be a mistake to think that this powerful outside could be completely eliminated by modern progress and democratic revolution.
Progress is rational. But it is no accident that our culture regards the artist as mad, or at least as possessed. Foucault held that shamans, witches, and prophets no longer have any prominent place in our society: they have become outcasts, confined to psychiatric clinics. But ours is above all a culture of celebrity, and one cannot become a celebrity without being mad, or at least pretending to be.
Evidently Foucault read too many scientific books and too few society and gossip magazines; otherwise he would have known where today's madmen attain their true social standing.
As is well known, today's political elites belong to global celebrity culture, a culture external to the societies they govern: global, extra-democratic, beyond sovereign nation-states, outside any democratically organized community. Structurally speaking, these elites are mad.
Now, these reflections should not be misread as a critique of installation as an art form on account of the sovereign subjectivity it displays. After all, the goal of art is not to change things; things are changing by themselves all the time anyway. Art's function is rather to show, to make visible, realities that are generally overlooked.
By taking aesthetic responsibility, in a very explicit way, for the design of the installation space, the artist reveals the hidden sovereign dimension of the contemporary democratic order that politics, for the most part, tries to conceal. The installation space is where we are confronted with the ambiguous character of the contemporary notion of freedom, which functions in our democracies as a tension between sovereign and institutional freedom.
The artistic installation is thus a space of unconcealment (in the Heideggerian sense) of the heterotopic, sovereign power hidden behind the opacity of the democratic order.


Thursday, July 12, 2018

On Semicolons and the Rules of Writing - The Millions

On Semicolons and the Rules of Writing

Adam O'Fallon Price | July 10, 2018


1.
Kurt Vonnegut's caution against the use of semicolons is one of the most famous and canonical pieces of writing advice, an admonition that has become, so to speak, one of The Rules. More on these rules later, but first the infamous quote in question: "Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you've been to college."

To begin with the lowest-hanging fruit here—fruit that is actually scattered rotting on the ground—the "transvestite hermaphrodite" bit has not aged well. The quote also, it seems, may have been taken out of context, as it is followed by several more sentences of puzzlingly offensive facetiousness, discussed here.

That said, I also have no idea what it means. My best guess is that he means semicolons perform no function that could not be performed by other punctuation, namely commas and periods. This obviously isn't true—semicolons, like most punctuation, increase the range of tone and inflection at a writer's disposal. Inasmuch as it's strictly true that you can make do with commas, the same argument could be made of commas themselves in favor of the even unfussier ur-mark, the period. But that is a bleak thought experiment unless you are such a fan of Ray Carver that you would like everyone to write like him.

Finally, regarding the college part, two things: First, semicolon usage seems like an exceedingly low bar to set for pretentiousness. What else might have demonstrated elitism in Vonnegut's mind? Wearing slacks? Eating fish? Second, in an era of illiterate racist YouTube comments, to worry about semicolons seeming overly sophisticated would be splitting a hair that no longer exists.

But however serious Vonnegut was being, the idea that semicolons should be avoided has been fully absorbed into popular writing culture. It is an idea pervasive enough that I have had students in my writing classes ask about it: How do I feel about semicolons? They'd heard somewhere (as an aside, the paradoxical mark of any maxim's influence and reach is anonymity, the loss of the original source) that they shouldn't use them. To paraphrase Edwin Starr, semicolons—and rules about semicolons—what are they good for?

As we know, semicolons connect two independent clauses without a conjunction. I personally tend to use em dashes in many of these spots, but only when there is some degree of causality, with the clause after the em typically elaborating in some way on the clause before it, idiosyncratic wonkery I discussed in this essay. Semicolons are useful when two thoughts are related, independent yet interdependent, and more or less equally weighted. They could exist as discrete sentences, and yet something would be lost if they were, an important cognitive rhythm. Consider this example by William James:

I sit at the table after dinner and find myself from time to time taking nuts or raisins out of the dish and eating them. My dinner properly is over, and in the heat of the conversation I am hardly aware of what I do; but the perception of the fruit, and the fleeting notion that I may eat it, seem fatally to bring the act about.

The semicolon is crucial here in getting the thought across. Prose of the highest order is mimetic, emulating the narrator or main character's speech and thought patterns. The semicolon conveys James's mild bewilderment at the interconnection of act (eating the raisins) and thought (awareness he may eat the raisins) with a delicacy that would be lost with a period, and even a comma—a comma would create a deceptively smooth cognitive flow, and we would lose the arresting pause in which we can imagine James realizing he is eating, and realizing that somehow an awareness of this undergirds the act.

An em dash might be used—it would convey the right pause—but again, ems convey a bit of causality that would be almost antithetical to the sentence's meaning. The perception follows temporally, but not logically. In fact, James is saying he doesn't quite understand how these two modes of awareness coexist.

Or consider Jane Austen's lavish use of the semicolon in this, the magnificent opening sentence of Persuasion:

Sir Walter Elliot, of Kellynch Hall, in Somersetshire, was a man who, for his own amusement, never took up any book but the Baronetage; there he found occupation for an idle hour, and consolation in a distressed one; there his faculties were roused into admiration and respect, by contemplating the limited remnant of the earliest patents; there any unwelcome sensations, arising from domestic affairs changed naturally into pity and contempt as he turned over the almost endless creations of the last century; and there, if every other leaf were powerless, he could read his own history with an interest which never failed.

Periods could be ably used here, but they would not quite capture the drone of Elliot's stultifying vanity. Again, form follows function, and the function here is to characterize the arrogantly dull mental landscape of a man who finds comprehensive literary solace in the baronetage. More than that, the semicolons also suggest the comic agony of being trapped in a room with him—they model the experience of listening to a self-regarding monologue that never quite ends. We hardly need to hear him speak to imagine his pompous tone when he does.

The semicolon's high water usage mark, as shown here, was the mid-18th to mid-/late 19th centuries. This is hardly surprising, given the style of writing during this era: long, elaborately filigreed sentences in a stylistic tradition that runs from Jonathan Swift to the James brothers, a style that can feel needlessly ornate to modern readers. Among other virtues (or demerits, depending on your taste in prose), semicolons are useful for keeping a sentence going. Reflecting on the meaning of whiteness in Moby Dick, Melville keeps the balls in the air for 467 words; Proust manages 958 in Volume 4 of Remembrance of Things Past during an extended, controversial rumination on homosexuality and Judaism. There is a dual effect in these examples and others like them of obscuring meaning in the process of accreting it, simultaneously characterizing and satirizing the boundaries of human knowledge—a sensible formal tactic during an era when the boundaries of human knowledge were expanding like a child's balloon.

Stylistically, the latter half of the 20th century (and the 21st) has seen a general shift toward shorter sentences. This seems understandable on two fronts. First—and this is total conjecture—MFA writing programs came to the cultural fore in the 1970s and over the last few decades have exerted an increasing influence on literary culture. I am far from an MFA hater, but the workshop method does often tend to privilege an economy of storytelling and prose, and whether the relationship is causal or merely correlational, over the last few decades a smooth, professionalized, and unextravagant style has been elevated to a kind of unconscious ideal. This style is reflexively praised by critics: "taut, spare prose" is practically a cliche unto itself. Additionally, personal communication through the 20th century to today has been marked by increasing brevity. Emails supplant letters, texts supplant emails, and emojis supplant texts. It stands to reason that literary writing style and the grammar it favors would, to a degree, reflect modes of popular, nonliterary writing.

Beyond grammatical writing trends, though, semicolons are a tool often used, as exemplified in the Austen and James examples, to capture irony and very subtle shades of narrative meaning and intent. It might be argued that as our culture has become somewhat less interested in the deep excavations of personality found in psychological realism—and the delicate irony it requires—the semicolon has become less useful. Another interesting (though possibly meaningless) chart from Vox displays the trend via some famous authors. As fiction has moved from fine-grained realism into postmodern satire and memoir, has the need for this kind of fine-grained linguistic tool diminished in tandem?

Maybe. In any case, I have an affection for the semi, in all its slightly outmoded glory. The orthographical literalism of having a period on top of a comma is, in itself, charming. It is the penny-farthing of punctuation—a goofy antique that still works, still conveys.

2.
A larger question Vonnegut's anti-semicolonism brings up might be: Do we need rules, or Rules, at all? We seem to need grammatical rules, although what seem to be elemental grammatical rules are likely Vonnegutian in provenance and more mutable than they seem. For instance, as gender norms have become more nuanced, people—myself included—have relaxed on the subject of the indeterminately sexed "they" as a singular pronoun. Likewise, the rule I learned in elementary school about not ending sentences with prepositions. Turns out there's no special reason for this, and rigid adherence to the rule gives you a limited palette to work with (not a palette with which to work).

We know, on some level, that writing rules are there to be broken at our pleasure, to be used in the service of writing effectively, and yet writing is such a difficult task that we instinctively hew to any advice that sounds authoritative, cling to it like shipwrecked sailors on pieces of rotten driftwood. Some other famous saws that come to mind:

Henry James: "Tell a dream, lose a reader."

Elmore Leonard: "Never open a book with weather."

John Steinbeck: "If you are using dialogue—say it aloud as you write it. Only then will it have the sound of speech."

Annie Dillard: "Do not hoard what seems good for a later place."

Stephen King: "The road to hell is paved with adverbs."

And more Kurt Vonnegut: "Every character should want something, even if it is only a glass of water"; "Every sentence must do one of two things—reveal character or advance the action"; "Start as close to the end as possible."

In the end, of course, writing is a solitary pursuit, and for both good and ill no one is looking over your shoulder. As I tell my students, the only real writing rule is essentially Aleister Crowley's Godelian-paradoxical "Do what thou wilt, that shall be the whole of the law." Or alternately, abide by the words of Eudora Welty: "One can no more say, 'To write stay home,' than one can say, 'To write leave home.' It is the writing that makes its own rules and conditions for each person."

Image: Flickr/DaveBleasdale

Adam O'Fallon Price is a writer and teacher living in Carrboro, NC.  The paperback edition of his first novel, The Grand Tour, is now available on Anchor Books.  His short fiction has appeared in The Paris Review, VICE, The Iowa Review, Glimmer Train, and many other places.  His podcast, Fan's Notes, is an ongoing discussion about books and basketball.  Find him online at adamofallonprice.com and on Twitter at @AdamOPrice.


Thomas Bayes and the crisis in science – TheTLS

Thomas Bayes and the crisis in science

DAVID PAPINEAU

Footnotes to Plato is a TLS Online series appraising the works and legacies of the great thinkers and philosophers

We are living in a new Bayesian age. Applications of Bayesian probability are taking over our lives. Doctors, lawyers, engineers and financiers use computerized Bayesian networks to aid their decision-making. Psychologists and neuroscientists explore the Bayesian workings of our brains. Statisticians increasingly rely on Bayesian logic. Even our email spam filters work on Bayesian principles.

It was not always thus. For most of the two and a half centuries since the Reverend Thomas Bayes first made his pioneering contributions to probability theory, his ideas were side-lined. The high priests of statistical thinking condemned them as dangerously subjective and Bayesian theorists were regarded as little better than cranks. It is only over the past couple of decades that the tide has turned. What tradition long dismissed as unhealthy speculation is now generally regarded as sound judgement.

We know little about Thomas Bayes's personal life. He was born in 1701 into a well-to-do dissenting family. He entered the Presbyterian ministry after studying logic and theology at Edinburgh and lived in Tunbridge Wells for most of his adult life. Much of his energy seems to have been devoted to intellectual matters. He published two papers during his lifetime, one on theology, and the other a defence of Newton's calculus against Bishop Berkeley's criticisms. The latter impressed his contemporaries enough to win him election to the Royal Society.

The work for which he is best known, however, was published posthumously. Bayes died in 1761, but for some time up to his death he had been working on a paper entitled "An Essay towards Solving a Problem in the Doctrine of Chances". His work was passed on to his friend Richard Price, who arranged for it to be presented to the Royal Society in 1763. Bayes's essay marks a breakthrough in thinking about probability.

Probability theory was in its infancy in Bayes's day. Strange as it may seem, before the seventeenth century nobody could calculate even such simple chances as that of a normal coin landing five heads in a row. It wasn't that the information wouldn't have been useful. There was plenty of gambling before modernity. But somehow no one could get their head around probabilities. As Ian Hacking put it in his groundbreaking The Emergence of Probability (1975), someone in ancient Rome "with only the most modest knowledge of probability mathematics could have won himself the whole of Gaul in a week".

By Bayes's time, the rudiments of probability had finally been forged. Books such as Abraham de Moivre's The Doctrine of Chances (1718) explained the basic principles. They showed how to calculate the probability of five heads on a normal coin (it is 1/32) and indeed more complex probabilities like five heads on a coin biased 75 per cent in favour of heads (that would be 243/1024 – about ¼). At last it was possible for gamblers to know which bets are good in which games of chance.
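
The two figures quoted above follow from multiplying the per-toss probabilities of independent tosses; spelled out as a worked restatement (not part of de Moivre's original text):

```latex
P(\text{5 heads} \mid \text{fair}) = \left(\tfrac{1}{2}\right)^{5} = \tfrac{1}{32} \approx 0.031,
\qquad
P(\text{5 heads} \mid \text{75\% biased}) = \left(\tfrac{3}{4}\right)^{5} = \tfrac{243}{1024} \approx 0.237 \approx \tfrac{1}{4}.
```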

Not that the Reverend Bayes was any kind of gambler. What interested him was not the probability of results given different causes (like the probability of five heads given different kinds of coin). Rather he wanted to know about the "inverse probability" of the causes given the results. When we observe some evidence, what's the likelihood of its different possible causes? Some commentators have conjectured that Bayes's interest in this issue was prompted by David Hume's sceptical argument in An Enquiry Concerning Human Understanding (1748) that reports of miracles are more likely to stem from inventive witnesses than the actions of a benign deity. Be that as it may, Bayes's article was the first serious attempt to apply mathematics to the problem of "inverse probabilities".

Bayes's paper analyses a messy problem involving billiard balls and their positions on a table. But his basic idea can be explained easily enough. Go back to the coins. If five tosses yield five heads in a row, then how likely is it that the coin is fair rather than biased? Well, how long is a piece of string? In the abstract, there's no good answer to the question. Without some idea of the prevalence of biased coins, five heads doesn't really tell us anything. Maybe we're spinning a dodgy coin, or perhaps we just got lucky with a fair one. Who knows?

What Bayes saw, however, was that in certain cases the problem is tractable. Suppose you know that your coin comes from a minting machine that randomly produces one 75 per cent heads-biased coin for every nine fair coins. Now the inverse probabilities can be pinned down. Since five heads is about eight times more likely on a biased than a fair coin, we'll get five heads from a biased coin eight times for every nine times we get it from a fair one. So, if you do see five heads in a row, you can conclude that the probability of that coin being biased is nearly a half. By the same reasoning, if you see ten heads in a row, you can be about 87 per cent sure the coin is biased. And in general, given any observed sequence of results, you can work out the probability of the coin being fair or biased.
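
As a sanity check on the figures in this paragraph, here is a minimal sketch of the same update in Python, assuming the minting-machine prior of one biased coin for every nine fair ones; the function name and defaults are illustrative, not from the article:

```python
from fractions import Fraction

def posterior_biased(num_heads: int,
                     prior_biased: Fraction = Fraction(1, 10),   # assumed: 1 biased coin per 9 fair
                     p_heads_biased: Fraction = Fraction(3, 4),
                     p_heads_fair: Fraction = Fraction(1, 2)) -> Fraction:
    """Probability the coin is biased after seeing num_heads heads in a row."""
    like_biased = p_heads_biased ** num_heads      # P(data | biased)
    like_fair = p_heads_fair ** num_heads          # P(data | fair)
    # Bayes: posterior is proportional to likelihood times prior
    joint_biased = like_biased * prior_biased
    joint_fair = like_fair * (1 - prior_biased)
    return joint_biased / (joint_biased + joint_fair)

print(float(posterior_biased(5)))    # about 0.46 -- "nearly a half"
print(float(posterior_biased(10)))   # about 0.87 -- "about 87 per cent"
```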

Most people who have heard of Thomas Bayes associate him primarily with "Bayes's theorem". This states that the probability of A given B equals the probability of B given A, times the probability of A, divided by the probability of B. So, in our case, Prob(biased coin/five heads) = Prob(five heads/biased coin) x Prob(biased coin) / Prob(five heads).

As it happens, this "theorem" is a trivial bit of probability arithmetic. (It falls straight out of the definition of Prob(A/B) as Prob(A&B) / P(B).) Because of this, many dismiss Bayes as a minor figure who has done well to have the contemporary revolution in statistical theory named after him. But this does a disservice to Bayes. The focus of his paper is not his theorem, which appears only in passing, but the logic of learning from evidence.

What Bayes saw clearly was that, in any case where you can compute Prob(A/B), this quantity provides a recipe for adjusting your confidence in A when you learn B. We start off thinking there's a one-in-ten chance of a biased coin but, once we observe five heads, we switch to thinking it's an even chance. Bayes's "theorem" is helpful because it shows that evidence supports a theory to the extent the theory makes that evidence likely –  five heads support biasedness because biasedness makes five heads more likely. But Bayes's more fundamental insight was to see how scientific methodology can be placed on a principled footing. At bottom, science is nothing if not the progressive assessment of theories by evidence. Bayes's genius was to provide a mathematical framework for such evaluations.

Bayes's reasoning works best when we can assign clear initial probabilities to the hypotheses we are interested in, as when our knowledge of the minting machine gives us initial probabilities for fair and biased coins. But such well-defined "prior probabilities" are not always available. Suppose we want to know whether or not heart attacks are more common among wine than beer drinkers, or whether or not immigration is associated with a decline in wages, or indeed whether or not the universe is governed by a benign deity. If we had initial probabilities for these hypotheses, then we could apply Bayes's methodology as the evidence came in and update our confidence accordingly. Still, where are our initial numbers to come from? Some preliminary attitudes to these hypotheses are no doubt more sensible than others, but any assignment of definite prior probabilities would seem arbitrary.

It was this "problem of the priors" that historically turned orthodox statisticians against Bayes. They couldn't stomach the idea that scientific reasoning should hinge on personal hunches. So instead they cooked up the idea of "significance tests". Don't worry about prior probabilities, they said. Just reject your hypothesis if you observe results that would be very unlikely if it were true.

This methodology was codified at the beginning of the twentieth century by the rival schools of Fisherians (after Sir Ronald Fisher) and Neyman-Pearsonians (Jerzy Neyman and Egon Pearson). Various bells and whistles divided the two groups, but on the basic issue they presented a united front. Forget about subjective prior probabilities. Focus instead on the objective probability of the observed data given your hypothesized cause. Pick some level of improbability you won't tolerate (the normally recommended level was 5 per cent). Reject your hypothesis if it implies the observed data are less likely than that.

In truth, this is nonsense on stilts. One of the great scandals of modern intellectual life is the way generations of statistics students have been indoctrinated into the farrago of significance testing. Take coins again. In reality you won't meet a heads-biased coin in a month of Sundays. But if you keep tossing coins five times, and apply the method of significance tests "at the 5 per cent level", you'll reject the hypothesis of fairness in favour of heads-biasedness whenever you see five heads, which will be about once every thirty-second coin, simply because fairness implies that five heads is less likely than 5 per cent.
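
A quick simulation makes the point concrete. This is a sketch of the scenario described above (every coin fair, five tosses each, fairness rejected whenever all five land heads), not a reproduction of any analysis in the article:

```python
import random

random.seed(0)
TRIALS = 100_000          # number of fair coins tested
rejections = 0

for _ in range(TRIALS):
    tosses = [random.random() < 0.5 for _ in range(5)]  # five tosses of a genuinely fair coin
    if all(tosses):       # "significant" at the 5% level, since P(5 heads | fair) = 1/32 < 0.05
        rejections += 1   # fairness wrongly rejected in favour of heads-bias

print(rejections / TRIALS)  # roughly 1/32, i.e. about 0.03: a steady drip of bogus "findings"
```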

This isn't just an abstract danger. An inevitable result of statistical orthodoxy has been to fill the science journals with bogus results. In reality genuine predictors of heart disease, or of wage levels, or anything else, are very thin on the ground, just like biased coins. But scientists are indefatigable assessors of unlikely possibilities. So they have been rewarded with a steady drip of "significant" findings, as every so often a lucky researcher gets five heads in a row, and ends up publishing an article reporting some non-existent discovery.

Science is currently said to be suffering a "replicability crisis". Over the last few years a worrying number of widely accepted findings in psychology, medicine and other disciplines have failed to be confirmed by repetitions of the original experiments. Well-known psychological results that have proved hard to reproduce include the claim that new-born babies imitate their mothers' facial expressions and that will power is a limited resource that becomes depleted through use. In medicine, the drug companies Bayer and Amgen, frustrated by the slow progress of drug development, discovered that more than three-quarters of the basic science studies they were relying on didn't stand up when repeated. When the journal Nature polled 1,500 scientists in 2016, 70 per cent said they had failed to reproduce another scientist's results.

This crisis of reproducibility has occasioned much wringing of hands. The finger has been pointed at badly designed experiments, not to mention occasional mutterings about rigged data. But the only real surprise is that the problem has taken so long to emerge. The statistical establishment has been reluctant to concede the point, but failures of replication are nothing but the pigeons of significance testing coming home to roost.

Away from the world of academic science and its misguided anxieties about subjectivity, practical investigators have long benefited from Bayesian methods. When actuaries set premiums for new markets, they have no alternative but to start with some initial assessments of the risks, and then adjust them in the light of experience. Similarly, when Alan Turing and the other code-breakers at Bletchley Park wanted to identify that day's German settings on the Enigma machine, they began with their initial hunches, and proceeded systematically on that basis. No doubt the actuaries' and code-breakers' initial estimates involved some elements of guesswork. But an informed guess is better than sticking your head in the sand, and in any case initial misjudgements will tend to be rectified as the data come in.

The advent of modern computers has greatly expanded the application of these techniques. Bayesian calculations can quickly become complicated when a number of factors are involved. But in the 1980s Judea Pearl and other computer scientists developed "Bayesian networks" as a graph-based system for simplifying Bayesian inferences. These networks are now used to streamline reasoning across a wide range of fields in science, medicine, finance and engineering.

The psychologists have also got in on the act. Statisticians might be ideologically resistant to Bayesian logic, but the unconscious brain processes of humans and other animals have no such scruples. If your visual system is trying to identify some object in the corner of the room, or which words you are reading right now, the obvious strategy is for it to begin with some general probabilities for the likely options, and then adjust them in the Bayesian way as it acquires more evidence. Much research within contemporary psychology and neuroscience is devoted to showing how "the Bayesian brain" manages to make the necessary inferences.

The vindication of Bayesian thinking is not yet complete. Perhaps unsurprisingly, many mainstream university statistics departments are still unready to concede that they have been preaching silliness for over a century. Even so, the replicability crisis is placing great pressure on their orthodoxy. Since the whole methodology of significance tests is based on the idea that we should tolerate a 5 per cent level of bogus findings, statistical traditionalists are not well placed to dodge responsibility when bogus results are exposed.

Some defenders of the old regime have suggested that the remedy is to "raise the significance level" from 5 per cent to, say, 0.1 per cent — to require, in effect, that research practice should only generate bogus findings one time in a thousand, rather than once in twenty. But this would only pile idiocy on stupidity. The problem doesn't lie with the significance level, but with the idea that we can bypass prior probabilities. If a researcher shows me data that would only occur one time in twenty if geography didn't matter to hospital waiting times, then I'll become a firm believer in the "postcode lottery", because the idea was reasonably plausible to start with. But if a researcher shows me data that would only occur one time in a thousand if the position of Jupiter were irrelevant to British election results, I'll respond that this leaves the idea of a Jovian influence on the British voter only slightly less crazy than it always was.
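
The asymmetry Papineau describes can be made explicit with a toy Bayesian update in which the same strength of evidence meets very different priors; the numbers below are illustrative assumptions, not figures from the article:

```python
def posterior(prior: float, bayes_factor: float) -> float:
    """Update a prior probability by a likelihood ratio (evidence for effect vs. no effect)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * bayes_factor
    return posterior_odds / (1 + posterior_odds)

# Same strength of evidence, very different starting points (hypothetical numbers):
print(posterior(0.5, 20))     # postcode lottery: ~0.95 -- now quite convincing
print(posterior(1e-6, 20))    # Jovian influence: ~0.00002 -- still essentially crazy
```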

No sane recipe can ignore prior probabilities when telling you how to respond to evidence. Yes, a theory is disconfirmed if it makes the evidence unlikely and is supported if it doesn't. But where that leaves us must also depend on how probable the theory was to start with. Thomas Bayes was the first to see this and to understand what it means for probability calculations. We should be grateful that the scientific world is finally taking his teaching to heart.

David Papineau's most recent book is Knowing the Score: How sport teaches us about philosophy (and philosophy about sport)


The Evils of Cultural Appropriation

The Evils of Cultural Appropriation

In 1961, the anthropologist William Rowe documented a curious feature of Indian village life. When those of a lower caste—particularly if they were "untouchable"—wore clothing containing threads worn by a higher landlord caste, members of the landlord caste tore the garments from them, beat them and fined them. Rowe called this behavior "upcasteing" and observed that it was a serious social offense.

Similar practices barring access to signifiers of a caste or social grouping superior to one's own can be found throughout the historical record. In the 7th century B.C., Greek law stipulated that a free-born woman "may not wear gold jewelry or a garment with a purple border, unless she is a courtesan." In ancient Rome, only Roman senators were allowed to wear Tyrian purple on their togas—ordinary Romans could not. In feudal Japan, people of every class submitted to strict laws about what they could and could not wear, according to their social rank. In Medieval and Renaissance Europe, the nobility policed the clothing of the middle classes, making sure to keep them in their place. In any society in which there have been high levels of inequality—where monarchs and aristocrats have ruled over commoners and slaves—equality in dress has been considered, at the very least, bad manners.

While sumptuary laws (rules that govern conspicuous consumption, especially of food and clothing) fell mostly out of fashion in the West during the Enlightenment period, they appear to be back in style again, thanks to the orthodoxies of social-justice activism fueled by social media. On April 28, a tweet appeared which read:

The tweet was retweeted nearly 42,000 times and "liked" 178,000 times—viral in social media terms. The flare-up was reported on internationally, and dozens of op-eds both condemning and defending the tweet and the dress spilled forth. Writing in The Independent, Eliza Anyangwe officiously declared that the teenager who wore the offending dress, Keziah Daum, was "the embodiment of a system that empowers white people to take whatever they want, go wherever they want and be able to fall back on: 'Well, I didn't mean any harm.'" The title of the piece was "Cultural Appropriation Is Never Harmless." But it failed to define what cultural appropriation actually is.

For most observers, these complaints are bemusing and baffling. For many, no defense or condemnation of cultural appropriation is required, because such complaints are almost beyond the realm of comprehension in the first place. Without cultural appropriation we would not be able to eat Italian food, listen to reggae, or go to Yoga. Without cultural appropriation we would not be able to drink tea or use chopsticks or speak English or apply algebra, or listen to jazz, or write novels. Almost every cultural practice we engage in is the byproduct of centuries of cross-cultural pollination. The future of our civilization depends on it continuing.

Yet the concept was not always so perplexing. Originally derived from sociologists writing in the 1990s, its usage appears to have first been adopted by indigenous peoples of nations tainted by histories of colonization, such as Canada, Australia and the United States. Understandably, indigenous communities have been protective of their sacred objects and cultural artifacts, not wishing the experience of exploitation to be repeated generation after generation. Although one might be quizzical about complaints over a girl wearing a cheongsam to her prom (the United States has never colonized China), even the most tough-minded skeptic should be able to see why indigenous peoples who have historically had their land and territories taken away from them might be unwilling to "share their culture" unconditionally. Particularly when it is applied to the co-opting of a people's sacred and religious iconography for the base purposes of profit-making, the concept of cultural appropriation seems quite reasonable.

Nevertheless, the concept quickly becomes baffling when young Westerners, such as Mr. Lam, of the cheongsam tweet, use the term as a weapon to disrupt the natural process of cultural exchange that happens in cosmopolitan societies in which culture is, thankfully, hybrid. When controversies erupt over hoop earrings or sombrero hats or sushi or braids or cannabis-themed parties, the concept of cultural appropriation appears to have departed from its formerly understood meaning—that is, to protect sacred or religious objects from desecration and exploitation. It appears that these newer, more trivial (yet vicious) complaints are the modern-day incarnation of sumptuary laws.

Elites once policed what their social inferiors could wear, in part to remind them of their inferiority, and in part to retain their own prestige and exclusivity. In Moral Time, the sociologist Donald Black explains that in feudal and medieval societies, sumptuary laws were often articulated with religious or moralizing language, but their intention and effect was simply to provide a scaffold for existing social hierarchies. Writing in the 16th century, French philosopher Michel de Montaigne made the astute observation in his essay "Of Sumptuary Laws": "'Tis strange how suddenly and with how much ease custom in these indifferent things establishes itself and becomes authority."

***

New orthodoxies of social justice have arguably done the most obvious damage in the world of fiction writing. In the past five years, concerns about cultural appropriation in literature have exploded, with Young Adult writers being "dragged" online for the sins of appropriation while sensitivity readers are employed by publishers to ensure that minorities are represented in a sufficiently "correct" manner. Perhaps not coincidentally, literary sales have reportedly declined by 35 percent over the same time period.

In 2016, a flare-up exploded over author Lionel Shriver's speech at the Brisbane Writers Festival—where she infamously wore a sombrero hat while delivering her speech about freedom in fiction writing. This prompted Australian-Muslim activist Yassmin Abdel-Magied to publish a purple-prose version of the event in The Guardian:

The faces around me blurred. As my heels thudded against they [sic] grey plastic flooring, harmonizing with the beat of the adrenaline pumping through my veins, my mind was blank save for one question.

"How is this happening?"

Yet Shriver's core argument was so self-evident as to be banal. She proposed that if a writer could not put herself in the shoes of her characters—who might be from a different cultural background than herself—then fiction would cease to exist: "The kind of fiction we are 'allowed' to write is in danger of becoming so hedged, so circumscribed, so tippy-toe, that we'd indeed be better off not writing the anodyne novel to begin with," she observed.

Abdel-Magied did not think such an argument was self-evident. On the contrary, she wrote:

In demanding that the right to identity should be given up, Shriver epitomized the kind of attitude that led to the normalization of imperialist, colonial rule: "I want this, and therefore I shall take it."

The attitude drips of racial supremacy, and the implication is clear: "I don't care what you deem is [sic] important or sacred. I want to do with it what I will. Your experience is simply a tool for me to use, because you are less human than me. You are less than human…"

Shriver's full speech can be read here. To argue that it "drips of racial supremacy" is to project something that is simply not contained in the text. She defends fiction as a vehicle for empathy, not derision. Nowhere does she argue that the "right to identity" be "given up."

Yet Shriver's metaphorical kneecapping was complete. Her speech was officially disavowed by the organizers of the festival, who threw together a "Right of Reply" seminar session the following day. It is important to note that it was never suggested that Shriver actually committed colonial crimes that had harmed people of color. It was simply suggested that she was guilty by association, presumably because of the color of her own skin.

Such events are surely baffling to anyone over the age of 30. It was only a few years ago when writers were celebrated for their imaginative and empathic reach if they wrote about characters who were from a different background than the author's own. When Philip Roth wrote probingly and unflinchingly about a character who was black but passed as Jewish in The Human Stain, his book became a national bestseller and won numerous accolades including the New York Times "Editor's Choice" Award and the PEN/Faulkner Award. When the book was released, the critic Michiko Kakutani offered the following review:

It is a book that shows how the public zeitgeist can shape, even destroy, an individual's life, a book that takes all of Roth's favorite themes of identity and rebellion and generational strife and refracts them not through the narrow prism of the self but through a wide-angle lens that exposes the fissures and discontinuities of 20th-century life.

In 2013, GQ listed The Human Stain as one of the best books of the 21st century. Arguably, such a book could not be written today, and would almost certainly cause a firestorm if published. That's a pretty sharp turnaround in sensibility in a very short period of time.

The context surrounding the drama at the Brisbane Writers Festival is important for understanding why it happened. Abdel-Magied migrated to Australia at the age of two and, from a relatively young age, entered the public sphere as a "model minority." She was an articulate activist and an accomplished student (becoming an engineer and memoir writer) who appeared capable of promoting a modern, sophisticated image of an urban Muslim-Australian. For her activism, she was showered with awards and publicly funded appointments, and given international trips for the express purpose of promoting Australia abroad. Yet for all her accomplishments, accolades, money, and travel opportunities—or perhaps in exchange for them—the young woman was stuck with the felt identity of a victim. This apparent feeling of victimhood was so strong that she interpreted arguments for creative license in art to be "lay[ing] the foundation for genocide."

Many people—both then and now—find it hard to understand how such complaints can come from a place of good faith. Activists like Abdel-Magied seem unwilling to empathize with those who may genuinely want to show appreciation for cultures which are not their own, or writers who genuinely want to empathize with those who are different or marginalized, or simply to reach beyond a single layer or caste of the multicultural societies in which they live, an ambition for which writers and thinkers have historically been applauded.

Almost every cultural practice we engage in is the byproduct of centuries of cross-cultural pollination.

What also seems odd is that activists like Abdel-Magied rarely appear to attempt to persuade others to engage with the foreign cultures they are purportedly defending in more sensitive or better-informed ways. Rather, their complaints have a hectoring, absolutist quality, focusing on the disrespect and lack of deference that white people have shown them. Listening to these complaints, it is difficult to come away with the view that they are about anything other than exercises in power. While being an effective social-media activist, Abdel-Magied is not a particularly good writer, which means that identity-as-victim is valuable currency at a writers festival. If literature is not reducible to identity, and representation is not a group property, then her own claim to literary significance would be a dubious one.

It is by considering the power dynamics at play that the logic of cultural appropriation starts to become clear. In a culture that increasingly rewards victimhood with status, in the form of op-ed space, speaking events, awards, book deals, general deference, and critical approbation, identity has become a very valuable form of currency. It makes sense that people will lie, cheat, and steal in order to get some. Expressing offense over a white person wearing a sombrero hat might seem ridiculous on its face—but for those who live inside these sententiously moralistic bubbles, it may be both a felt injury and a rational strategic choice.

Complaints about cultural appropriation are not really complaints, they are demands. When Abdel-Magied walked out on Shriver, it was not because of her insensitivity, it was because of her defiance: her refusal to kowtow to the orthodoxies written up by her moral betters, from which Abdel-Magied's own claims to significance and social status are derived.

In their newly released book, The Rise of Victimhood Culture: Microaggressions, Safe Spaces, and the New Culture Wars, the moral sociologists Bradley Campbell and Jason Manning describe the three main moral cultures that exist today, which they give the shorthand labels of dignity, honor, and victimhood. A dignity culture, which has been the dominant moral culture of Western middle classes for some time, has a set of moral values that promotes the idea of moral equality and was crystallized in Martin Luther King Jr.'s vision that people ought to be judged according to the content of their character, not the color of their skin.

Victimhood culture departs from dignity culture in several important ways. Moral worth is in large part defined by the color of one's skin, or at least one's membership in a fixed identity group: i.e., women, people of color, LGBTIQ, Muslims, or indigenous peoples. Such groups are sacred, and a lack of deference to them is seen as a sign of deviance. The reverse is true for those who belong to groups that are considered historical oppressors: whites, males, straight people, Zionists. Anyone belonging to an "oppressor" group is stained by their privilege, or "whiteness," and is cast onto the moral scrapheap.

In a recent interview in the online magazine which I edit, Quillette, I asked Campbell and Manning what they thought about cultural appropriation. They explained that they found such complaints baffling, like everybody else, but that they also "illustrate victimhood culture quite well." One of the key components of victimhood culture is its projection of collective guilt: social offenses between individuals are no longer about the actual people involved; they are about "one social group harming another."

One might make the case that while complaints about cultural appropriation are annoying, they are ultimately harmless. What is the harm in showing deference to peoples who have historically been the victims of exploitation, discrimination, and unfair treatment? What is the harm in showing respect and compliance with these new rules—isn't it a way of making up for past sins?

The short answer to these questions is, no. The notion that a person can be held responsible for actions that he or she did not commit strikes at the very heart of our conception of human rights and justice.

We used to take calls for collective punishment much more seriously. In the 1949 Geneva Convention it was determined that: "No protected person may be punished for an offense he or she has not personally committed." Collective punishment was seen as a tactic designed to intimidate and subdue an entire population. The drafters of the Geneva Convention clearly had in mind the atrocities committed in WWI and WWII where entire villages and communities suffered mass retribution for the resistance activities of a few. In their commentary on the outlawing of collective punishment the International Red Cross stated: "A great step forward has been taken. Responsibility is personal and it will no longer be possible to inflict penalties on persons who have themselves not committed the acts complained of."

In times of peace, collective punishment may come in the form of social media dust-ups over sombrero hats or Chinese dresses. Gradual softening on the taboo of collective punishment does not bode well for the health of liberal democracies. Which is also why it is important for us all to remember that social-justice activists who complain about cultural appropriation only represent themselves, and not the minority groups to which they belong. When Jeremy Lam complained about Keziah Daum's cheongsam, other Asian-Americans and mainland Chinese people rebuked him for being "offended over nothing." Hundreds of Chinese and Asian-American people have now responded to Daum's original tweet commenting on how pretty she looked in the dress, and how they felt that she did nothing wrong. Replies include:

As these responses make clear, victimhood culture only exists in certain insulated bubbles. It is by no means the dominant moral culture in the United States, nor any other nation—yet. Most of us still live in a dignity culture where equality, diversity, and inclusion remain paramount principles to aspire to.

As the drama of social-justice activism unfolds, one of the biggest mistakes we could make is to think that the loudest and most dogmatic voices of minority groups speak for these groups as a whole. Minority groups are not homogenous, and are full of people who are conservative, liberal, apolitical, and dissenting. Just because a progressive activist claims to speak on behalf of a marginalized group does not mean we have to indulge their delusions of grandeur.

It also would be a mistake to think that multiculturalism has been a failure. Human beings have a remarkable capacity for cooperation and solidarity with each other—including those whom they are not related to—when they feel that their neighbor is on their team. What helps people move past cultural, ethnic, or religious differences is sharing in a common goal, and feeling a common bond of humanity with their fellow citizens. When activists draw attention to differences between cultures, and attempt to draw boundaries around them, punishing those who step out of line, this sets the stage for escalating intolerance and division. We must not let them succeed.

***

You can help support Tablet's unique brand of Jewish journalism. Click here to donate today.

Claire Lehmann is the Sydney-based Editor-in-Chief of Quillette.


Su Bai: A Preliminary Survey of Ancient City Sites within Modern Cities

  A Preliminary Survey of Ancient City Sites within Modern Cities  Text / Su Bai (宿白)  To handle the relationship between the protection of cultural heritage and urban development well, one must first understand the history of a city's development, and the most important, most practical means of understanding that history is the identification of archaeological remains. Quite a few of our famous historical cities have remained in use through many dynasties and are still being rebuilt and renewed today. The historical cities referred to here are mainly those from the Sui dynasty onward. Before the Sui, the choice of...