Plurinationalism and Community Votes on Mining

by Katherine Fultz

In my dissertation, I examine the economy of representation about mining in Guatemala, taking “media” in its broadest sense. This includes traditional media such as newspapers and advertisements; digital and social media; performative events such as protests and community votes; and attempts at knowledge creation such as research on public health and human rights. It is impossible to extricate one form of media from another in this context, as both authors and audience freely remix and reinterpret different genres, creating novel hybrid forms in the process. These communicative forms both reflect and contest dominant discursive regimes about mining development and what it means to be Guatemalan.

Page 99 is part of a discussion of the political implications of community votes, which are organized by local communities and anti-mining activists throughout Latin America. These votes are founded on activists’ interpretations of international accords mandating the Free, Prior, and Informed Consent of indigenous peoples for development projects impacting their communities. Although the votes use logistic, aesthetic, and performative elements associated with national elections, they are organized outside of—and organizers might say in opposition to—state electoral structures. The results are almost unanimously against mining development, and usually face contestation from national governments. There have been dozens of votes held throughout Latin America, usually numbering no more than a handful in each country. In Guatemala, however, there have been more than 80 votes to date, which is surprising given the comparatively low number of active mining projects in that country. Part of my goal in the chapter is to examine what it is about the Guatemalan context that makes these votes such an appealing strategy for opposing transnational development.

On page 99, I explore how the concept of plurinationalism applies to indigenous political movements in Guatemala. I argue that community votes point to a potentially transformative and plurinational political project that questions whether international accords protecting human rights are an extension of state power. The discussion of plurinationalism builds up to chapter three, which is an ethnographic account of the performance and documentation of community votes in several highland communities. Even though the votes are legally non-binding, the simultaneous performance of Guatemalan citizenship and indigenous autonomy they embody is significant in the way it disrupts dominant discourses about multiculturalism and democracy.


Fultz, Katherine. 2016. Economies of Representation: Conflict, Communications, and Mining in Guatemala. PhD diss., Department of Anthropology, University of Michigan.


Consultas comunitarias in Guatemala are exemplary of such a plurinational process: not confined to any one region, and even occasionally reflecting pan-continental aspirations, consultas go far beyond the “state within a state” model of indigenous autonomy and seek to fundamentally alter the relationship of indigenous people with the Guatemalan state.         

In Guatemala, consultas are made possible by two parallel branches of post-war social developments: neoliberal reforms seeking to decentralize state governance and strengthen local and regional autonomy (in tandem with a push toward economic privatization); and multiculturalist reforms that recognized indigenous culture and rights, part of the shift away from assimilationist policies of cultural citizenship. Consultas are some of the first concrete instances wherein indigenous groups in Guatemala have sought to reach beyond the national regulatory system and take the structures of governance into their own hands, and as such they are attempts to reformulate the relationship between indigenous rights and the oligarchical state.

Ilana Gershon on her new book, Down and Out in the New Economy

Down and Out in the New Economy

Interview by Matt Tomlinson

The topics your book takes on are complexly intertwined: how people are meant to become their own brands, how patterns of hiring and quitting are changing, and the role of new media ideologies and ecologies. One of the points that emerges in your book is that people who try to connect these strands are themselves often confused, perplexed, and frustrated by the systems and processes. So can you distil your argument into a short summary—the elevator talk or, as this case might be, the elevator blog?

Pithy summaries are indeed the goal of so many of the job-seeking performances I studied that it seems only fair I attempt to reduce my argument to a handful of sentences. My book is an attempt to make the notion of a neoliberal self as rigorous as possible through historical comparison with earlier forms of capitalism. I suggest that Fordist work structures relied on the metaphor that one owns oneself as though one were property. This means that the employment contract is a moment in which you rent yourself out to an employer for a certain period of time and get yourself back, so to speak, at the end of the day. Many union battles were fought over how long you should rent yourself out (the 40-hour work week), or over other practical conundrums created by extending this metaphor of self-as-property. But since Reagan and Thatcher, the metaphor has changed: under neoliberal capitalism, people imagine that they own their selves as though they are businesses – bundles of skills, assets, experiences, qualities, and relationships that must be consciously managed and continually enhanced. The employment contract becomes, metaphorically, a business-to-business contract in which you as a business provide temporary solutions to your employer’s market-specific problems. The book is about how the hiring ritual and various aspects of workplaces have changed in response to this shift in metaphor.


You describe how your students’ questions about how they should go about getting jobs led you to write the book. Can you say more about this, and what practical critical tools you see linguistic anthropology offering to students and job-seekers?

I am so glad that you asked, because the more I studied what hiring actually involves, the more I realized that linguistic and media anthropologists teach very helpful analytical tools for being a competent job candidate.  And I also think that we could all be much more direct when faced with the question “How will this major help me get a job?” about all the ways that an anthropology degree is truly helpful preparation for specific tasks involved in looking for a job.

For example, all the workshops I attended were openly guides to mastering a certain genre. The instructors were teaching how to present information on the page so as to anticipate a certain kind of reader – often an impatient one who wants clear signals that the applicant fits certain criteria, and who has their own style of interpretation. These are readers who also read with other people’s assessments in mind, anticipating having to show a resume to someone else in their workplace who has their own techniques for interpreting the genre. And while the workshops tend to focus on a single genre, the job seeker is supposed to be competent at a range of genres, all of which are supposed to interconnect and tell a persuasive narrative about the applicant. This is precisely what students learn in our courses. You learn how to become competent in new genres. You learn how to anticipate the different ways people might interpret your own texts, at the same time that you are learning a range of different techniques for interpreting a text. You often learn the relationships between a textual genre and a performance genre. And, as importantly, you learn how to be persuasive about your own interpretations of a text, a skill that will come in handy when our students have to discuss with their future co-workers whom they want to hire.


Your book is written in an appealingly informal tone, but there are moments when the immense anxiety and frustration of job seekers are apparent. Was the fieldwork emotionally challenging at times? Were there folks for whom you felt you needed to intervene sympathetically in some way?

Honestly, this was the most depressing fieldwork I have ever done.   And this is proven to me all over again when I give talks.  When I talk about my previous research on how people use new media to break up with each other, I often feel like a stand-up comedian.  The stories and my informants’ take on things are just so funny.   And now, when I give a talk about hiring, people in the audience keep telling me that they feel deeply depressed after I am done.

One of the reasons it was so painful is that the white collar workers I interviewed seemed to accept the neoliberal advice that they were surrounded by. At the end of an interview, I would sometimes mention that I was a bit skeptical about some aspect, say the requirement to create a personal brand.  And invariably, the person I was interviewing would defend the advice.   By contrast, last summer, I spent a month interviewing homeless people about how they looked for jobs.  It was much more enjoyable fieldwork because so many of the people I interviewed had a healthy skepticism about the systems they were trying to navigate.

It was also hard because I had no concrete way to intervene for the people I was interviewing in the moment, no matter how much I wanted to do so. And offering yet more advice didn’t seem like a satisfying way to go.  After all, part of the trap that job-seekers face is not only that they are surrounded by advice, some of it good and some of it crappy, but almost all of it must be said at a level of generality that isn’t helpful enough for getting a job in a complex and specific workplace.  In the end, I decided that maybe the best I could do was point out in my book the problems with standardized advice as clearly as possible.  This might help job-seekers realize they also might want to do thoughtful research about any workplace they want to enter, research (to continue my point in the previous question) that resembles ethnographic explorations of how decisions are made in a specific organization.


For linguistic anthropologists this book will resonate strongly with your previous book The Breakup 2.0. In fact, they would be great to assign as a pair to students. But I wanted you to think of this new book in terms of your work on Samoan migrants, No Family Is an Island. I want to go out on a limb here. In No Family Is an Island, you make it clear that government bureaucrats who see their systems as acultural put Samoans in the position of “being cultural,” and making culture something to be managed in particular ways. In this new book, you mention how companies are seen to have cultures, but individuals have some leeway—true, they need to have a cultural makeup that fits the company’s own, but they’re also free to craft selves as brands and decide what kind of individual culture they have, if you will. So to draw all this out: Samoan migrants are forced to be culture-bearers, whereas American job-seekers need to be culture-designers. Is this a fair comparison?

For me, this is a very unexpected comparison, but let me see if I can work with it. Why unexpected? In my research on hiring, I was constantly baffled by what people meant when they talked to me about company culture and making sure that those they hired were a good cultural fit. It often sounded to me like “not a cultural fit” was a politic way to reject a job candidate you didn’t like for whatever reason, one that seemed perfectly acceptable on paper. And I never came across anyone who thought they were creating a culture of one; job-seekers and employers both understood culture to involve a group of people interacting together.

That said, I think you are pointing to a fascinating distinction in the way that culture as a classification functions on the ground when people use the concept explicitly. In my earlier work on Samoan migrants, culture tended to refer only to one thin slice of what anthropologists mean when they talk about culture – ritual exchanges, kinship obligations, and politeness norms. None of these were being referred to when U.S. white collar workers were talking about company culture. Instead, they seemed, as far as I could tell, to be referring to the specific interactional practices that linguistic anthropologists study – how do you handle conflict, or manage small talk – which were then translated into Values that company employees were supposed to uphold. No one ever clearly spelled out the link between values such as Amazon’s “bias for action” and “think big” and how employees were supposed to behave in particular situations. This was the tacit cultural knowledge everyone at Amazon was supposed to have – how to link these values to everyday practice. And I suppose employees could say retroactively that the people who didn’t know how to enact this tacit link were not a good “cultural fit.” But honestly, from my analytical perspective, moving from a job at Goldman Sachs to a job at Amazon was not switching cultures in any meaningful anthropological sense. Both Samoan migrants and U.S. white collar workers were using culture as a classification to refer to some things that anthropologists would agree are part of culture, but it was only a slice of what anthropologists might refer to should they use the term. Yet the slices were different enough that I think you are right that people viewed their relationships to culture differently. Samoan migrants did not think they were actively making their own culture, while U.S. white collar workers thought that every conscious decision they made helped fashion a company culture.


Finally: who do you most hope will read your book?

I wrote this book for people looking for jobs, for people looking to hire, and for the career counselors who are giving advice. I don’t like the model of the neoliberal self, and I want to encourage people to refuse it. The question is how to do this persuasively. I turned to analyzing hiring because it is a moment of such uncertainty and anxiety that when people are told they have to become a neoliberal self in order to get a job, they will do it for pragmatic purposes. I hoped with this book to suggest that this is not the way to go, both because becoming a neoliberal self isn’t all that effective as a set of strategies and because it does not allow people to be as ethical and good to each other as I hope they want to be.


Lynnette Arnold’s Communicative Care Across Borders

My dissertation, “Communicative Care Across Borders: Language, Materiality, and Affect in Transnational Family Life,” explores the role of everyday communication in the lives of multigenerational transnational families living stretched between El Salvador and the United States, revealing how technologically mediated language both produces and contests the political-economic marginalization of geographically mobile populations. These families rely on regular cell-phone calls as a primary form of kin work in the face of long-term physical separation caused by restrictive immigration policies (Di Leonardo 1987), and the dissertation provides a close analysis of these cross-border conversations, informed by insights developed through multi-sited ethnographic engagement.

Page 99 is located in the middle of my methods chapter, and discusses the relationships that made this intimate investigation possible, describing how 15 years of connection had resulted in my adoption into several transnational families, signaling close affective ties despite the insurmountable gulf between our political-economic positions. As such, although page 99 is methodological, it draws attention to the primary theoretical contribution of the dissertation, the concept of communicative care.

Building on feminist approaches to care, I develop this term to highlight how mundane conversations attend to both material and affective concerns, nurturing the relational ties upon which cross-border families depend. The dissertation analyzes long-distance greetings, collaborative reminiscences, and negotiations of economic decisions, elucidating how each practice works to reproduce material connections between migrants and their relatives back home, while also providing forms of affective engagement that maintain kin ties.

In sustaining transnational family life, communicative care practices constitute a creative response to the failures of state care, but one that also reinforces the domestication and privatization of caring responsibilities. Thus, while communicative care is a means of pursuing well-being at the margins of neoliberalism, these strategies simultaneously produce forms of personhood and relationship that conform to neoliberal models. The analysis presented in the dissertation demonstrates the crucial importance of paying close attention to technologically mediated talk for understanding how the tensions of neoliberal mobility are both produced and managed.

Lynnette Arnold, “Communicative Care Across Borders: Language, Materiality, and Affect in Transnational Family Life,” PhD diss., University of California, Santa Barbara, 2016.


Di Leonardo, Micaela. “The Female World of Cards and Holidays: Women, Families, and the Work of Kinship.” Signs: Journal of Women in Culture and Society 12, no. 3 (1987): 440–453.


Birgit Meyer on her new book, Sensational Movies: Video, Vision, and Christianity

Interview by Yeon-Ju Bae

Given that Ghanaian video movies provide audiovisual experiences of what the Ghanaian audience might have imagined – for example, occult forces, Satan, God, modern lifestyles, and so on – I was curious, in the first part of the book, why the visual modality seems to be more emphasized in terms of “imagination, image, and imagery”, whereas audio is mentioned more briefly. In reading chapter 3, I realized that attention to the aural modality alerts us to backchannel cues that the audience produces in watching and participating in video movies. I think this interactive and co-creative process, across video technology and human viewers and across visual and aural modalities, constitutes one of the important features of the Ghanaian video experience. In this sense, I wonder how you would situate the audience’s aural participation within the processes of “imagination” and “sensation”.

Thank you. You raise a very interesting point. Some readers of my book have pointed out – rightfully so – that I pay too little attention to the sound dimension of these movies. The issue, of course, is not just to say more about sound, but to reflect on the sound-image relation. In the passage in Chapter 3 to which you refer I argue that the low quality of the sound compels audiences to co-produce their own sound track. This, as you observed very well, is a central dimension of the genesis of the typical video experience. So here poor sound facilitates a high level of interaction of audiences with the moving images. A technical deficiency allows for higher participation! But certainly more can (and should) be said about the sound-image relation in the video experience. Sound is imperfect, but not absent. In watching movies, people look and listen (and speak, sing, shout) all at the same time. There is no neat separation between visual, aural and other modalities. They intersect in various ways. In writing the book, I used the terms audience and spectators with a critical awareness that the emphasis on the aural in the former and on the visual in the latter ideally should imply each other. The lack of a single term to describe the entanglement of the aural and the visual (let alone other sense perceptions) in film reception testifies to the difficulty of developing a thoroughly multi-sensorial approach to cinema. Maybe the term “spectaudience” might be a solution? I think that in order to critique and transcend the visual bias that is still dominant in the study of cinema and film, it may be worthwhile to think further about the work of Michel Chion, who has proposed a distinction between “visualized sound” (that is, sound whose source is visible to viewers) and “acousmatic sound” (that is, sound whose source remains hidden). The latter accounts for the evocation of a sense of suspense.
It would be interesting to think through his distinction with regard to Ghanaian movies, where sound tends to be deficient and people make up for this lack. This would require audiovisual recordings of film shows. Alas, I do not have such materials; I only have audio recordings.

The same problem of an over-determination of the visual arises with regard to approaches to the imagination, imaginaries and images. In my book, which is about the interface of film and Christianity, I explore the question of how movies feed into and are fed by what people imagine, how their imagination is synchronized, and how this yields shared sensations and common sense. In the Introduction I wrote: “… the ‘stuff’ to which imaginaries refer is not limited to pictures and other visual items. It is the imagination, as a visualizing faculty that – not unlike a film – represents all this ‘stuff’ as mental images.” I agree with your observation that “the visual modality seems to be more emphasized” than the aural. I do think that film and the imagination are visual by definition. However, the point is that the visual does not stand by itself, but is coupled with sound, smell, taste and touch. Exactly for this reason I sought to embed the imagination within a broader frame of sensation. Fleshing out an approach to the imagination, imaginaries and images that is not limited to visual registers but opens up to speaking or singing images, smelling images, sounding images, and so on, is a major conceptual issue that deserves much more attention. I think that Hans Belting’s anthropology of images, which inspired my sensorial and material stance toward the imagination as outlined in the Introduction, may be a useful starting point for conceptualizing the imagination from a new, thoroughly material and sensorial angle. This is one of the theoretical projects I would like to pursue in the future. Especially for the study of religion this is an important topic. From the ways in which audiences in Southern Ghana responded to the images and often deficient sounds they witnessed on screen, I learned that going to the movies was an experience in which imagination and sensation converged.
So much so that the films are understood to reveal something real which is normally hidden from ordinary perception.


It was intriguing for me to encounter the folk notion of the public, which seems to be closely related to notions of ethics. It seems like Ghanaian people don’t regard rumors and hearsay as “public” even though these discourses are circulating. However, if those rumors and hearsay are framed in terms of a Christian ethics in which retribution adequately takes place, then stories involving the occult, violence, and sex become “publicized” via the video movie form. Within video movies as well, if there is a scene in which actors show their intimate/private body parts, their behavior is often associated with immorality in terms of the plot and the protagonists’ characteristics. I’m wondering if the Ghanaian notion of the public as morality is drawn from Christianity or is rooted in Ghanaian traditions. Put another way, how do different religions or different ethnic groups in Ghana exhibit different understandings of the relationship between the public and ethics?

What I wanted to make clear is that video movies flourished under conditions of democratization and the deregulation of mass media such as radio, television and film, which had previously been under full state control. The change occurred around 1994. The point here is that, prior to that change, stories and rumors circulated, but were not allowed a space in the mass media around which the modern public sphere evolved. Scholars studied such narratives and performances as popular culture. After 1994, videos became one of the new outlets through which popular imaginaries that had previously circulated under more clandestine conditions could become visible and audible on screen. Hitherto subdued narratives circulating via rumors could go public in the context of a new politics and aesthetics of representing culture. What I found very interesting – and here we come to the gist of your question – is the strong emphasis on ethics. So, while as far as content and message were concerned video movies diverged from state cinema, they were still embedded in a longstanding ethical attitude toward film according to which moral lessons were to be learned. This attitude is certainly not limited to Christianity, but is also emphasized in indigenous traditions, especially in traditional storytelling such as Ananse stories. And even though video movies revel in picturing all sorts of transgressions, the “good” people are morally sound (and hence do not undress, consult a “fetish” priest, and so on). In this book I showed how approaches to video movies on the part of both the producers and the “spectaudiences” are embedded in everyday or “ordinary ethics” (Michael Lambek). Since I noticed that the movies appealed to people with different ethnic backgrounds, I am sure that the expectation of the morality of entertainment is widely shared.
Over the next years I will conduct a collaborative project with colleagues in Ghana in the course of which we will investigate modalities of co-existence across religious and ethnic differences in the suburb of Madina (Accra). The issue of public ethics and the morality of circulating cultural forms will certainly be a major issue.


When I read your interview with a woman who paid attention to video movie scenes in which upper-class characters match the color of curtains and bed sheets depending on the situation, I wondered whether, had she been a man, the interviewee would have paid so much attention to such details. This interview excerpt brings up issues of how social differences map onto experiences of watching movies. Are there any patterns in audiences’ reactions or focuses depending on gender, age, class, language, region, religion, ethnicity, and so forth? I think the class (as well as language and age) differences were described in the book. What about region – are the movies circulated in rural villages as well; if so, do the ways of watching videos in villages differ from those of urbanites? And are there any reaction patterns tied to multireligious and multiethnic situations in Ghana?

The movies I studied were consciously tailored to appeal to women first, who would then make the male members of the household watch as well. This is what my filmmaker friends told me over and over again. A film that failed to do so was doomed to flop, and this would end the business of filmmaking. The woman who admired the match between curtains and bed sheets was a seamstress called Floxy, who had a big atelier. She unfortunately died in childbirth not long after our interview. As she told me, she got phone calls from her female customers when a film (often Nigerian) was on, in order to copy a particularly appealing dress. So she and her customers had a keen eye and great appreciation for the new styles displayed in movies. By contrast, I myself initially did not look at movies in this manner, but was eager to discern meanings. This is what I realized in the interview with Floxy. She alerted me to a modality of looking which I had so far overlooked. Scouting for styles is one of the ways through which movie watching is embedded in everyday life. It is indeed the case that what people find remarkable in a movie very much depends on their interests and dispositions. My research mainly took place in Ghana’s capital Accra, and to some extent among Ghanaians in the Netherlands. These are multi-ethnic settings which are predominantly Christian. Unfortunately I did not accompany screenings of videos in the rural areas. Nor did I study the Kumasi film industry, which uses Twi as its main language (rather than English). It would have been interesting to follow in detail the circulation and screening of Accra- and Kumasi-made films in villages and across the borders of neighboring countries. Alas, I did not do so. And now, with the spread of television and of the mobile phone and its increasing use for film viewing, the days of screening movies in villages to paying audiences are a matter of the past.
All the same, I do not think that the identity markers you mention are reflected in particular watching patterns. The movies are made to travel across Ghana, Africa and among people of African descent in the world. Together with Nigerian movies, they are consumed all over the continent. I would rather say that these movies actively disseminate particular images, styles and attitudes about African tradition and the modern world. They articulate visions, desires, dreams, anxieties, life styles and identities. They make people share imaginaries and sensations. They are part of performing African modernity.


You said that the “sensational forms” of Ghanaian video movies give rise to a “religious real”, and I was wondering for whom it is real. It seems like the representation of reality must be considered within the context of authority at various levels. For instance, video movie directors face criticism from censorship officers that they are not representing reality; traditional chiefs think that the directors cannot accurately visualize the spiritual forces, or juju, because these are not visualizable in the first place; viewers who know English might find protagonists’ speech artificial; directors worry that movies depicting hyper-urban lives not present in Ghana – which the lower class wants to see in the movies – might seem unrealistic to the upper class; and during filming, actors act to visualize what is invisible and inaudible in a way that they themselves do not believe to be real. Given that you vividly show how various representations of reality are contested, I’m curious about your thoughts on using the terms “real” and “revelation”.

Yes, this is an important issue. What is taken as real is not given, but subject to authorization processes. Competing politics and aesthetics of world-making co-exist. This is so in all cultures and societies. In Ghana the video-film industry was situated in a context of heavy contestation, as it diverged from established forms of representing culture and tradition under the aegis of state cinema. Video movies rather surfed along with the popularity of Pentecostalism, which propounds a specific take on reality as being enveloped in a spiritual war whose main operators are located in the invisible world of demons. In Southern Ghana, there was and is a broad consensus, running across differentiations in terms of ethnicity, age and education, that the spiritual is real. The video movies that form the major focus of my book echoed and affirmed this consensus. This does not mean that movies were taken to be credible under all conditions and by everyone. Even sympathetic viewers would find certain depictions more convincing than others, and dismiss some as artificial. Still, one main attraction lay in the fact that video movies transfigured stories about occult forces that were circulating in society into movies. They set out to reveal the invisible realm which the naked eye cannot penetrate. In chapter 5 I argue (inspired by the work of Achille Mbembe) that one could see video filmmakers as high priests of the imagination, who create doubles of the “real thing” that is hidden from view but whose features enter into the double. Images are not mere representations, but make what they represent somehow present. At the same time, any attempt to depict the invisible generates contestation, as in the case of the chief who was also a photographer and who insisted that the real thing could not be visualized.
So, interestingly, the picturing of the invisible is a paradoxical endeavor, in that the images that reveal the “religious real” (the term was coined by Adrian Hermann) on screen may well conceal it at the same time. Likewise, and you refer to this, actors can only play the role of a witch or the devil if they do not believe that in so doing they become a real witch or devil. And yet there is a sense of danger of being affected by mimesis, which is why actors resort to prayer so as to protect themselves from being intruded upon by the role. There were also anxieties that, in filming certain scenes involving occult forces, these forces could be called upon to inhabit the fake shrines and possess people staging a dance or doing incantations for the sake of shooting a movie. What I wanted to show is that video-movies are embedded in processes of world-making in which what is real is constituted through revelations that rely on authorized Christian visual regimes, but are always haunted by a sense of the ultimate impossibility of revelation. We encounter here a fundamental feature of the image: it acts as a medium of something which is not present as such. The image itself – as a medium – is real, but the question is whether what it represents is taken as real too, or simply as fake. So I use the term “real” not in an objective or positivistic sense, but as an outcome of politics and aesthetics of figuration that are tied to broader, competing imaginaries. Revelation is a way of vesting the act of representation with a sense of truth, and a confirmation of a “Christian real”. The fact that what is considered real depends on processes of producing something as real and authorizing it is, I think, a basic feature of societies, as the current insecurity about the possibility of news being “fake” also shows.


Ghanaian video movie producers emerged thanks to the development of video technology, yet many have recently gone out of business due to the alternatives now available to audiences, including films on the internet. Since Ghanaian video movies are deeply concerned not only with religious “sensational forms” and moral lessons, but also with the social life of video technologies, I wonder how the video movie directors have experimented with technological resources and limitations. I also wonder what new experiments the directors are attempting in order to compete with these alternatives in the current, changed situation. Moreover, will these new creative ways of “mediation” bring about any emergent themes or values?

My research ended in 2010 and I circumscribe it as a historical ethnography. Since 2010 I have only followed the industry from some distance. In chapter 1, I trace the implications of the shift from analog to digital video, and show that this technological transition offered new opportunities for newcomers. Most of the filmmakers I have followed since the early 1990s are not doing well. In 2010 some sought to remain in the business by opting for another kind of transgressive revelation: mild porn. This generated a moral outcry among audiences (although these films, just as porn, sold to some extent). It is difficult for them to survive, certainly as there are no film funds and no easily accessible loans. One reason for the problems filmmakers face is that older films are shown non-stop on various TV channels. Also, on YouTube a huge amount of Nigerian and Ghanaian movies is available for free. Nowadays it is difficult to launch a new film and make sufficient money via VCD and DVD sales to earn back the investment. Piracy is lurking. Still, new producers are around. My friends in Ghana told me that watching movies on television is still very much a social affair. People watch together in the family. Movies are shown in long-distance buses, too. And also in pubs. I also noted that movies are increasingly available digitally and watched on mobile phones. Lindiwe Dovey has documented this shift in African screen media very well. I have not conducted research on this new phase myself. It would be great to take this up. Maybe you want to go into this?

Watching Putin Listen

by Kate Graber

 On the eve of a U.S. presidential election in which Russia and its presidential figurehead have loomed “yuge,” it is perhaps time for some observations about that central action figure of Russian political communication, Vladimir Vladimirovich Putin.

A lot has been written about Putin in English, including biographies by journalists and scholars. They vary in their foci, many locating his rise to power in his personal background or connections, others locating it in the nature of the Russian people. There is also Putin’s own autobiography, which he insists is a “frank” (it’s in the freaking title of the book) and transparent view into his personhood—more on which later. Closer to CaMP anthropologists’ interests, Eliot Borenstein often writes provocatively and entertainingly about the intersections of Russian presidential and cultural politics on NYU’s Jordan Center blog. Nothing that I say here should be taken as evidence that Putin is bad or good, or that his very personal style of political communication is bad or good. As a linguistic anthropologist, I’m interested instead in the content, context, and form of what he says and the cultural significance of those features of talk.

Why Putin? It should (but, sadly, does not) go without saying that Russian political life is far more diverse than what is broadcast by the Kremlin or captured in media coverage of “Putin’s Russia.” Personally, I am less interested in the centers of power than in what’s going on in the rest of Russia, particularly those regions well east of the Urals. There are all sorts of fascinating daily struggles in Omsk and Bratsk and Magadan that reveal more about what it is to be human—and perhaps more about power—and have little to do with what happens in Moscow. But what are you going to do? Russia’s relationship to the U.S. and its political future has increasingly been invested in the person of the president, often in laughably tangible form (again). So here we are.

Putin’s face has popped up onto my screen on a regular basis for the past 12 years, not because I was seeking it out, but just by chance, in the course of my research on minority media in Russia. For some of those years, Medvedev was president and Putin technically played second fiddle as prime minister, but somehow Putin appeared nearly nightly anyway. Now consider for a moment, if an outside researcher like me has accidentally watched that much of Putin for that long, how much more of him a Russian citizen living within Russia has seen. Television is the main medium by which contemporary Russians get their news, over radio, newspapers, or internet sources by a large margin. Most households in Russia have more than one television set, one in the living room and a second or third in the kitchen or a multi-use bedroom. Two broadcasting networks, the Rossiia network of the All-Russia State Television and Radio Broadcasting Company and the majority-government-owned Channel One, produce most of the daily political, economic, and cultural news that Russians watch. Now, it would be easy to assume that this news is overwhelmingly positive in its portrayal of the president. It is, but Putin is not without his critics. Nor is he the only actor in Russian politics guilty of (or successful at) “media manipulation.” Let’s leave aside for a second the question of whether this is orchestrated positivity or not, and just assume that if nothing else, yes, Putin has some control over how he presents himself on television. What is it that Russian viewers then see?

On camera, Putin is unfailingly calm, cool, and collected. He is a study in controlled gestures, measured pauses, and an infamously steady (sometimes steely) gaze. Whatever you think of his positions and policies, you have to admit that the guy exudes quiet confidence.

There are some important elements of Putin’s communicative style that are so different from U.S. presidential style that you might not put them in the same framework. When Putin’s PR team circulated photographs of him riding bare-chested on horseback through Tuva, or tranquilizing and tagging an Amur tiger in Russia’s Far East, U.S. audiences were bewildered and amused. Presumably they found this brazen machismo anathema to presidential politics. That now seems ironic, given the machismo that has appeared in the “locker room talk” and hyper-sexualized male discourse of the soon-to-be-finally-over U.S. presidential election; anyone suspicious of why gender norms are being used as tools of authority-building in a U.S. presidential election should read Valerie Sperling’s book on similar issues in Russia.

Some scholars and clever pundits have observed that such performances are geared not toward an international audience as much as to a domestic one. Or rather, they are geared toward a domestic audience via an international performance. Putin is showing a Russian audience that he is taken in the West as a tough guy or the ultimate action man. And it largely works. But if you look only at the action man imagery, you miss an important element of Putin’s communicative style.

In Putin’s appearances on Russian television, he spends a lot of his airtime listening.

He sits beside or behind a heavy-looking wooden table or desk, usually flanked by the Russian flag and looking official, as he speaks with one of his ministers or advisers, or occasionally a regional political actor such as the governor of one of the vast Russian state’s many provinces. You-the-viewer watch the other person talk, sometimes at great length. Sometimes you then see Putin respond, quietly and firmly, and sometimes not. Sometimes you watch the ministers waver, nervously averting their eyes or cringing under Putin’s quiet gaze. Within these variations, however, there is a solid genre of news broadcasts about Russia’s president: you always watch Putin engage in a face-to-face conversation, staged as though it were between equals, in which he primarily listens.

Is this another iteration of machismo, in that Putin comes across as the quintessential “strong, silent type”? I would argue no. He engages in what is often called “active listening,” reacting to the speaker, following his partner’s gaze and lead, occasionally nodding slightly or otherwise providing some uptake. He is paying attention. If anything, this is the type of thoughtful, sustained listening stereotypically attributed to women.

You don’t have to take my word for it; you can watch an example of a Rossiia broadcast from Thursday of Putin meeting with the Minister of Culture, Vladimir Medinskii. Medinskii briefs Putin on plans for the current and coming year, updating him on construction projects at the Moscow Philharmonic and Malyi Theatre and the state of funding for infrastructure. You don’t have to understand anything that’s being said in Russian to appreciate the visual details of context, gesture, and comportment. Sitting opposite one another in ornate chairs in a wood-paneled office, Putin and the minister lean forward, their hands on the table. The minister provides informational sheets; Putin appears to read or study images. The minister holds Putin’s gaze; Putin meets his eyes and nods. The minister talks; Putin listens.

Of course, there’s plenty of airtime of Putin speechifying on news programming too. He holds press conferences, gives interviews, and leads ceremonies of state. Television coverage marked the occasion of Friday’s Unity Day (a holiday celebrating the unity of varied religious traditions, ethnicities, and, yes, Crimea within a single Russian state) with Putin speaking in Moscow. But the daily news is at least as likely to include an instance of this genre of Putin in face-to-face conversation, and it is a far greater share of what Russians see their president do on a regular basis than riding bare-chested through Tuva.

What does he accomplish by having his television audience watch him listen?

When Putin published an op-ed in The New York Times in 2013 and claimed sole authorship of it, American commentators saw it as a risky or outrageous move to speak directly to the people of the U.S. But Fiona Hill correctly observed that it was also a way of demonstrating his abilities to “work with” or “communicate with people,” and to “work with information”—points of personal pride that she traced to his years in the KGB. Wherever his motivations come from and whatever is in his head, the directness of address that Putin achieved in his op-ed is also on display in his routine performance of active listening. Although the listening events are staged, the content of the conversation itself always appears spontaneous. Putin is getting this information now—and in your living room, you’re watching him digest it.

In both the U.S. and Russia (and elsewhere), heads of state are often televised in face-to-face conversation, often seated in armchairs and looking relaxed. Likely we-the-viewing-audience are supposed to be reassured that our political leaders are getting along, that they have not angrily stormed out of meetings or committed a faux pas at last night’s dinner that will accidentally result in a war. Similarly, Putin’s cordial conversations with ministers telegraph that all is well with the gears of power.

Listening like this also suggests to the audience that the president is not acting carelessly or alone, but intelligently and under good advisement. I remember commenting once to a friend in Buryatia, a political activist who opposed most of Putin’s and Medvedev’s policies, that state television news seemed to feature a lot of Putin listening. I expected him to respond cynically, perhaps by saying that Putin would do whatever he wanted anyway, or that this was just an elaborate act. “Well, he’s a very smart guy,” he said instead, “and smart guys listen.”

Hmm. I think the reason I noticed how much Russian television audiences were seeing of Putin in these interactions is that American television audiences rarely watch U.S. politicians listen. In fact, we rarely watch extended face-to-face interactions between domestic leaders of any sort. It is part of what makes televised debates so communicatively peculiar: we watch leaders who are otherwise televised talking instead listen to one another, and to the moderator or ordinary citizens in town halls, for extended stretches of time. During the U.S. presidential debates between Hillary Clinton and Donald J. Trump, social media overflowed with discussion of their respective “listening faces.” Eyebrows and lips were dissected like no one had ever seen the candidates listen before (though in fact there was plenty of material on Clinton on this point). CNN reported that Clinton had carefully crafted her “listening woman” face, as though that were surprising.

Anthropology is at its best when we excavate not only the cultural assumptions informing some weird thing those foreign-Other-type people do, but also our own unexamined expectations. My friend in Buryatia had never noticed how much time he spent watching his president listen, and I had never noticed how little I spend watching mine do anything but talk.

On the other hand, I do like talk.

Kate Graber is a linguistic anthropologist and Assistant Professor of Anthropology and Central Eurasian Studies at Indiana University. When not mulling over Putin’s taste in chairs, she researches minority media, language politics, materiality, and value, especially in Siberia and Mongolia.


Anthropology and Advertising

by Susan Lepselter


I remember once going to a mall with a fellow graduate student in anthropology. It was early on, perhaps our second semester. My friend moved slowly through the department store, her eyes wide, her head turning slowly at the display of commodities. Everything had been denaturalized by our studies. The goods were an uncanny materialization of all the theory we were newly steeped in.  She picked up a lipstick and stared at it.  Use value dissolved in the air around us. Another friend made a bumper sticker for her beat up car: Reification is capitalism’s master trope. We all loved how the bumper sticker reified that sentence.

How does a thing come to life and become “transcendent,” as Marx put it, through exchange? For anthropologists, studying the social, affective and imaginative life of commodities can involve examining how people feel about the things they buy, how we value them, use them in unexpected ways, represent them, and identify with them.  Advertisers study similar public practices, feelings and ideas about the things they market. For Marx, of course, the “mystical” aspect of the commodity fetish came from how it reified the social relations of labor.  Today, it is difficult to think about a commodity without considering the life with which advertisers imbue it.

Many advertising agencies think of ethnography as an improvement over less nuanced approaches to market research. The goals, politics, ethical concerns and commitments of the two endeavors differ. But they share a common thread: the comparative interpretation of places, people, and markets, often with a colonizing echo.

Because I am interested in the affective and imaginative life of objects, I became curious about the connections between anthropology and advertising. It’s a connection that is consciously developed in the advertising industry. The industry uses tropes of ethnography, culture, difference and social context as it creates venues for commodities to enter public consciousness. You could look at the relationship between anthropology and advertising from multiple perspectives, from cultural critique to the pragmatic need to help anthropology majors envision jobs in that business. For this post, I first critically read an ad that performs an idea of cross-cultural exchange; and then I interview Jenny, an anthropologist in advertising, to understand how her industry thinks about the commodity in relation to culture.

“Cultures” and “markets” are everywhere conflated in the neoliberal world; advertising can bring that conflation to the surface of things.  Of course, anthropologists have been talking about the entangled histories of colonialism and ethnographic representation since Writing Culture rattled things up in 1986.  But we don’t always notice how discourses about culture and representation have shifted and taken up other forms in other contexts.

Take the “Whopper Virgins” campaign for Burger King, which featured a long advertising project involving cross-cultural travel to an exotic place and a subsequent story about it. Representatives from the agency (Crispin Porter and Bogusky) traveled to three geographically remote areas around the world – rural areas of Romania, Thailand and Greenland – to meet people they narrate as pre-contact, and to film the encounter. But “pre-contact” here referred specifically to the local unfamiliarity with fast food hamburgers, the Whopper and the Big Mac.

In these three places, framed as outside the modern global grid, potential consumers were considered to be pure: “virgins” who were still innocent of the taste of McDonalds and Burger King (and therefore ultimately able to pronounce the inherent superiority of the client’s product, the Whopper). The story of this ad expressed a dream that far transcended sales. Here taste could be returned to the original biological sense, the natural sensing palate, stripped of habitus. The tasters were still “virgins.”

The trope of virgin land waiting to be penetrated by civilizers has circulated since at least the 1600s; of course, the idea of virgin land ignores the presence of indigenous populations and denies the land as an already-realized place. In centuries of European imagery, virginal female land is depicted as ripe for planting, fertilizing, becoming fruitful.  In the Whopper Virgins project, the older colonization narrative of conquering land merged into the story of discovering and conquering a new market.  In the ad, these locals are depicted as friendly, reciprocal, and wearing traditional clothing; they don’t know how to hold the burger, they sniff at it suspiciously, laugh, and then hesitantly take a bite. Here, an exotic innocence renews the malaise of civilization.  And so within the conceit of this story, all of these different specific places become a single place, the place of virgins, the place where one can still find authenticity, a fountain of taste. The desire for consumable difference flows through the commercial, the longing for a cultural purity that can be both apprehended in its otherness and incorporated into sameness, desired as a virgin and then deflowered.


The Whopper Virgins campaign is unusual, for an ad, in this mission to a faraway place. Most campaigns play on deepening, expanding and elaborating the commodity-laden familiar—the thoughts, practices and desires of people at home. When ad agencies adapt anthropological methods to the goals of their research, they think about and theorize what culture might be. Their references to anthropology are explicit. One agency website quotes an anthropology major who is now an ad executive: “We all have to study people and know who they are, what they want and why they want it. The key is research. And when we research, that’s anthropology at its finest.”

What is the structure of this connection in practical terms? I begin to approach this question by speaking with another anthropologist who has made a career in the industry. (She has asked that I not use her real name; I will call her Jenny.)

After earning her PhD in anthropology and teaching for a few years as a visiting professor, Jenny applied for a position that an ad agency had posted on the American Anthropological Association careers site. She sketched a brief trajectory of how the industry has changed its structure over the years to incorporate a more nuanced, interpretive understanding of culture and consumption based in part on anthropology. Once, she says, ad agencies divided their work between “account people” and “creatives.” The account people, she explained, are oriented towards the client:

“They are Roger on Mad Men; they go golfing and drink martinis with the client – traditionally,  [like on Mad Men] that’s their image. Now it’s mountain biking and Burning Man.

“The creatives write the script and make the work. They work in teams: an art director and a copywriter. They are allowed to be unwashed, up all night, eating granola bars in the agency—making stuff.”

In the past, Jenny says, these two divisions, the “account people” and the “creatives,” comprised the agency.

“The account people would find out the assignment from the client and tell the creatives, who had to write the ad. But then –the world changed.”

Now there is a third division, people who do a specific kind of research to orient the agency’s creative work towards culture. Depending on the agency they are called planners or strategists.  The planners (as I will call them here) are not simply market researchers.  Market researchers did not interpret the latent cultural meanings of their findings; they relied on the literal, the face-value data gathered in brief focus groups and surveys. But planners don’t rely on what Jenny calls the “verbatim” messages of market data. Rather the planners research social forces and meanings that contradict or transcend the explicit answers people might supply on a form. When there’s a difference between what people say they want and what they actually buy, the planners step into that gap.  They call this dissonance “tension,” and use the contradiction to make unlikely, surprising, attention-getting and artful ads – for example, this one that does not deny, but rather intensifies, people’s anxieties about the corrosive effects of cell phones on attention and human relationships.

Jenny said, “The creatives used to hate the market researchers, who would represent the consumers. The creatives just wanted to make cool stuff. But the market researchers added this other element.” They went beyond “let’s make a cool ad” to “What do moms want?”

Sometimes, they found, what “moms want” contradicted not just the consumers’ explicit answers but also the creatives’ ideas, the artful conceit they wanted to execute. And often, too, the market researchers’ data revealed other contradictions internal to the data itself: “when things test terribly but people actually love it,” Jenny said. It wasn’t enough to just create an ad strategy based on people’s explicit answers. There were what she calls “latent meanings,” positioned by broader cultural trends, which demanded that survey answers be read and interpreted in their social context, on a larger scale. “When you talk to people and take their verbatim answers it leads to stupid work,” Jenny said.

“People  [interviewed after a movie] say things like ‘I wish there was no villain and it could all be princesses.’ But of course they don’t really want to go to a movie like that.” The wish to erase the villain was about something else. Traditional market researchers relied on literal survey answers, which told them more about what consumers thought they should want but not what they would actually consume.

Current planning and strategizing always looks for side angles and perspectives on their material. “A woman on the survey said she would never eat a burger because she’s on a diet. That’s what she says—but then of course she would eat it.” The strategists explore that tension. Tension refers to what she again calls “latent cultural feeling.”

“You look at what people say they do not like about the product. Say, a mini-van. People say they don’t like it; that [not liking it] is not just about the car but the end of your sexy life and the beginning of your being a suburban mom. Not just ‘what is the style of the car and does it have heated seats or not.’” (A campaign that tried to exploit this specific tension with irony backfired when Brooke Shields was hired to deadpan that people were having babies in order to make use of the Volkswagen’s German engineering.)

This interest in latent cultural feeling is why Jenny’s first agency tried specifically to hire an anthropologist. “The ad I answered (posted by the AAA) said they did not want anyone from the advertising world or the business world.” Concepts from anthropology get sewn into the fabric of ad strategies. She explains: “So take for example an idea of the gift, or reciprocity, from anthropology. A gift is not a gift only, it is enmeshed in a world of practices. We in anthropology think that’s obvious—but the business world thinks that’s quite a shocker.” She laughed. “You come and tell the agency: you know what, if you give someone a gift card and they can buy themselves what they want it doesn’t work as a thoughtful gift.”

“Business people,” Jenny said, “understand use value, and they understand badging. ‘Badging’ is [the process of] saying something about you. You want to save the environment and wear Tom’s shoes. If you carry an expensive [Hermes] birkin bag you are showing you are really rich – or that some man loves you a lot. Old fashioned luxury. They want us to show what things mean.

“Anthropology has always known that there is a coherent system, a link between, say your religion and what you buy. This was novel for business thinkers, because they thought of things separately, normally.” The business perspective is not necessarily attuned to the meanings of commodities, she says, but on how “they can measure the success [of the campaign]. From a cultural perspective it’s harder to measure the success except how attitudes shift.

“The way you’re taught to think in anthropology is helpful in business, because they don’t see it that [systemic] way, they are inside it, and they don’t realize how the world is changing. For example, take the secret shame people feel in sitting on the couch and watching TV. Now, instead there is a premium on getting lots done at once.

“Thinking that way is the biggest advantage that business people see in having an anthropology background,” Jenny said.

And sometimes, they focus on the global, comparative scope of anthropology. “I would get emails that said ‘tell me the anthropology of gum,’ and you can look at practices all over the world, like qat in Yemen, or the betel nut. You can look at chewing as a practice comparatively. Those are fun questions.

“If you don’t want a career only looking at one thing, this lets you. And if you write a brief a certain way to go after what, for example, people are ashamed of, you can change things.”

Our conversations have made me more constantly conscious of my own — and everyone’s — part in constructing the symbolic domain of commodities. The meaning of the ad, like the meaning of any utterance, emerges through the dense histories of both the addressor and the addressee. The commodified object, that “very queer thing” concealing human relationships in the labor that made it, comes alive on both sides of the ad.

Omar Victor Diop’s Project Diaspora: Self Portraits at Indiana University

*Image – Omar Victor Diop, Frederick Douglass, 2015, Inkjet print on Hahnemuhle paper, Courtesy of the MAGNIN-A Gallery

Consider four digital reenactments of significant portraits: a Moroccan Man, Frederick Douglass, Portrait of Citizen Jean-Baptiste Belley, and Juan de Pareja.

Webb Keane on his new book, Ethical Life

Interview by Ilana Gershon

Imagine that you happen to be in a long line at the airport, and find yourself chatting with another academic, say a media scholar who studies Cuban television before and after the revolution.  How would you describe the ways your book might be useful to her?

Of course no one standing in line at the airport is talking to the people around them because they’re all absorbed in their personal devices.  But anyway, there are two ways to approach your question: first, as being about revolution, and second, as being about television.  Let’s take revolution first, and then turn to television (leaving aside the old question of whether the revolution will be televised).

Revolutions, like religious revivals and social reform movements, exemplify the fact that ethical life isn’t just about being in the flow of things or cultivating virtuous habits and embodied sensibilities. People also have a fundamental capacity to stand apart from that flow, in highly self-conscious ways. They can take what I call the third person stance toward ethical life. Although this kind of stance is often associated with religious moralities, avowedly atheist revolutions show that one can cultivate a god’s eye view without God. This is why the book devotes a chapter to the Vietnamese revolution. Obviously all sorts of factors go into any given revolutionary movement, but Vietnamese history casts light on the distinctively ethical underpinnings of political commitment. After all, why should urban literati like Ho Chi Minh (or, I could say to your Cuban media scholar, people from privileged backgrounds like Fidel or Che) have cared about socially distant peasants enough to deviate from their own comfortable pathways in life? I argue that to understand Ho’s revolutionary project and its wide appeal in its early years, we have to grasp its sources in what are properly ethical concerns about harm and justice. People like Ho could crystallize those ethical concerns as a principled and readily communicated political critique thanks to the availability of a third person perspective on their society. From the early decades of the twentieth century, Marxist social theory and historical narratives, along with elements of Confucian and Catholic social thought, provided Vietnamese revolutionaries with a position from which to view their own world from the outside. But that “god’s eye” position alone couldn’t make a revolution. The Vietnamese revolutionaries understood the importance of what we call ordinary ethics, that is, the way that values like respect, dignity, social recognition, and equality are embedded in everyday habits and activities.
In light of the enormous economic, political, infrastructural, and military challenges the Vietnamese communists faced, it’s remarkable how much emphasis they placed on changing seemingly trivial norms of speech and other aspects of face-to-face interaction.  In this respect, they were trying (with greater or lesser success) to bring the third person stance to bear on the habitual and unself-conscious flow of first person experience and second person address.

As for television, like any medium, it is a vehicle for the circulation of objectifications—images, expressions, and narratives that retain some formal integrity beyond their original context.  These objectifications have historical consequences for ethical life.  They contribute to ethical self-consciousness of individuals, and the consolidation–and dissolution–of public norms more widely.   So one question to ask about television is how its impact differs from face-to-face interaction and other media like newspapers, radio, cell phones, the internet, and so forth.  What difference does it make that a given medium has the speed it does, or geographical reach, social scale, visual versus aural or tactile sensoria, one-way versus dialogic format, centralized control versus open access, the techniques of intimacy and alienation, and so forth?  These questions open up a huge set of empirical problems that extend well beyond the scope of my book.  But here are some of the distinctively ethical questions we might ask.  Your own work with teenagers has focused on one of them: is it okay to break up with someone by text message?   If not, why?  What ethical difference does it make whether your social actions are carried out in one medium or another?  If this is a question about the second person address of interaction, we can also move our attention outwards into more public and sociological scales.   Do certain media facilitate the third person stance or enhance first person subjectivity?  What difference does it make that a message is conveyed in verbally explicit form or implied by sonic or visual means?  Is it more ethically dubious to be swayed by the sound of someone’s voice than by the logic of their arguments or the authority of their institutional position?  Do certain media forms reinforce the monologic voice whereas others enable dialogism?  
Television and social media notoriously escape the confines of context: what weight do we give to semiotic form and producers’ intentions when a supposedly neutral image enters a context where it’s deemed pornographic, racist, or blasphemous?  These aren’t just academic questions; they also worry teachers, parents, lovers, artists, political activists, censors, lawyers, and propagandists.  We know, for instance, that the easy transmission of sermons via cassette tapes played a critical role in fostering a new kind of public space in the run-up to the Iranian revolution.  And of course all sorts of claims have been made for the transformative effects of social media on the Arab Spring—not all of which have stood up well over time.  Is the form of a medium effective independent of its content?   Muslim preachers in Indonesia seem to think so when they hire mass media consultants from American Christian televangelists.

One of the key theoretical moves you make to fashion a more interdisciplinary conversation about ethics is expanding the notion of affordances.  Psychologists and media scholars have used this concept to discuss human interactions with the material world.  In your hands, affordances can belong to “anything at all that people can experience” because they “possess an indefinite number of combinations of properties.” (30)  Yet anchoring affordances in materiality provides a significant theoretical purchase – it has typically afforded a way to conceptualize limitations and resistances.  When a cloth can be torn but not made to radiate light, this is a way that matter matters.   In your framework, what is the grounding for resistances and limitations, for determining what is possible and impossible?

I expand on the notion of affordances by including people’s experiences of such things as emotions, cognitive biases, linguistic form, patterns of interaction, and social institutions.  But ultimately these are only available to experience because they have some material manifestation.  Although this may push the concept of affordance further than its more familiar uses, I think it’s consistent with them.

I’ve been seeking to develop a realist approach to anthropology that nonetheless retains the insights of the constructivist traditions in social thought and does not succumb to determinism.  This is the attraction of the affordance concept.  It treats the components of the world as real, and as making certain things possible.  But it does not do so by claiming that the things of this world necessitate anything in particular (nor, for that matter, does the analysis depend on us claiming to have the “correct” depiction of that world).  One example I use, echoing something George Herbert Mead wrote long ago, is the chair.  A wooden chair affords sitting, but only if you’re of a certain size, shape, and flexibility.  So the affordances of the chair only exist relative to the capacities of someone who might take them up.  Moreover, the existence of the chair doesn’t mean that you will sit.  You could use that chair to block a door, hold down papers, prop up an artwork, hit someone over the head, burn to keep warm, hide behind, step on to reach something out of reach, or, for that matter, you could simply ignore it.  That is, affordances are summoned up in response to projects of some sort.  As new projects develop, hitherto unforeseen affordances will emerge into view.

Impossibilities have to be part of the story too: you could say that a chair will not enable you to fly.  But here’s a more relevant example in the book.  Humans cannot learn to speak a full-fledged language without first developing some cognitive capacity to infer other people’s intentions and otherwise work with what some psychologists call “Theory of Mind.”  You can’t even use first and second person pronouns unless you have a rudimentary grasp of the perspective on “I” that is momentarily granted by saying “you.”  This affords all sorts of things, including shame, prayer, novels, torture, games, and witchcraft.  It also casts doubt on certain strong claims about ethnographic difference—namely, that there are some societies where people really have no concept of interiority or intentions.   To make this claim is not to eliminate interesting differences among social realities.  Rather, it pushes us to examine them more closely, to ask, for instance, what is at stake for some societies that forcibly deny the intention-reading that they are, in fact, doing all the time.  I think there’s more ethnographically specific insight to be gained this way than by treating each cultural world as autonomous, the creation of its own heroic Promethean powers to create reality.  But this should not lead us back toward any of the familiar reductive forms of determinism.

In this book, you address the possibility that self-consciousness or reflexivity can be a necessary but not sufficient first step towards social change.  Sometimes self-awareness does not change social interactions, or only does so for a fleeting moment.  What do you think makes self-consciousness socially successful so that it shapes how others evaluate ethical behavior as well? 

This is a question about the role of ideas and values in the extremely complex social and political histories out of which they emerge and on which in turn they have their effects.  The extraordinary speed with which gay marriage has gone from being an easy political wedge issue to divide classes and regions in America to much wider acceptance than anyone expected is a fascinating case.  But I think it’s too soon for us to see clearly how this came about and what will follow.  We have more perspective on the abolition of North Atlantic slavery.  As historians have pointed out, in Britain the arguments against slavery were already well known in the seventeenth century and increasingly came to find acceptance over the course of the eighteenth.  But all sorts of other things had to happen for those ideas to induce the social changes that finally came about in the nineteenth century.  These include the great wave of popular evangelical Christianity, England’s political and economic competition with France and its ideological interest in distinguishing its moral superiority to a newly independent (and slave-owning) America, the emergence of working class identities that put pressure on the value of manual labor, and more.  These elements are heterogeneous and their conjunction is largely contingent.  So the history of ideas matters—they have to be available and they have to be plausible.  But ideas only become socially viable when all sorts of other factors come together.  Ethical concepts, social institutions, political organizations, laws, technologies, economies, and so forth have quite different logics and temporalities, and are enmeshed in distinct kinds of causality.  Explicit ethical concepts help crystallize people’s intuitions and allow them to circulate in new ways (which takes us back to the issue of media raised in your first question) but they can’t tell the whole story alone.

In your account, explicitness has great power to enable shared agreements about what is ethical to travel across cultural contexts.  I can’t help thinking, however, that we are currently in a stage of capitalism when the market is viewed as the ideal spontaneous order precisely because self-awareness is irrelevant to its functioning, and when algorithms are viewed as idealized ordering mechanisms, but only because, in a sense, they are seen as circumventing explicitness.  What do you think of social orders that disavow explicitness, viewing it as largely irrelevant for social interactions to function?

In this context, explicitness means being able to put an ethical stance into so many words: “the voting law is unjust” or “the Dean can be trusted to say what she means.”  You do this by drawing on the ethical vocabulary that’s available in a given social location and historical moment.  (By the way, this means that particular ways of being ethical are necessarily historical: as old ethical categories disappear and new ones come into existence, so too do ways of being, or not being, ethical, and new ways for people to affirm or deny one another’s ways of being ethical.  Try as I may, it’s simply not possible for me to be a virtuous Athenian or a Confucian sage today.  An ethical vocabulary is not just a set of labels for ideas or values that are already there, waiting to be named.)  What some philosophers have called “morality systems” try to stabilize ethics by codifying it.  But explicitness is just one moment in the ongoing dialectics of objectification and subjectification.  It involves stepping into what I call the third person stance, taking a distance from the first person of experience and the second person of address to see oneself and others through generic categories.  It is a kind of self-distancing that induces particular forms of self-consciousness.  For this reason, explicitness has also been held in suspicion in various ethical regimes.  We can see this in certain styles of romanticism and mysticism which treat self-consciousness as a form of inauthenticity, and celebrate being in the flow of things.  It’s a recurrent issue: some ancient Chinese philosophers also worried that any purposeful striving to be ethical would be nullified by that very effort.  Such regimes aim—paradoxically—to actively inculcate effortless, habitual ways of being ethical.  The goal is to live entirely in the first person, as it were.  But this can be only part of the story.
On the one hand, an ethics that wholly lacked the first person stance would be unsustainable—it would have no claim on anyone.  That’s part of my argument against utilitarianism, which insists that one look at things only from the objective position of the third person stance.  It’s only from the first person stance that one can really care about ethics in a fully embodied and inhabitable way.  But to insist that ethics is only one or the other—either objectification or being-in-the-moment—is to deny the fundamental motility of human life.  People cannot remain entirely present in the first person, nor is it possible to sustain the third person stance alone.  We are always in motion among them.  This motility isn’t a bug—it’s a feature.

So, to turn to the rest of your question, what about this period of capitalism?  We could say that neo-liberalism expresses an ideological reaction against the third person stance of the centralized nation-state, with its blueprints and planners.  Does this make it a-ethical?  Not necessarily.  After all, there is an ethics of autonomy there.  I call this an ethics because the autonomy expressed in neo-liberalism is sometimes treated as a value in itself, beyond any instrumental justification.  We may feel it’s based on false premises or has harmful consequences, but I think we should recognize that it makes ethical claims of a sort.  They’re just not necessarily ones I would accept.  However, although none of us as human beings can, or would want to, avoid ethical judgments, in our limited role as anthropologists we should not be in the business of making ethical pronouncements ex cathedra.  Having said that, neo-liberalism does deny or ignore something very basic to ethical life as I describe it in the book, the fact that people are thoroughly enmeshed with one another in very fundamental ways.  Any form of social organization that denies this and tries to treat them as wholly independent units is empirically mistaken and, let’s say, ethically compromised.

You imaginatively move a step beyond the insight that ethics is the challenging task of living alongside other people to argue that ethics at the core is about the challenging communicative task of living alongside other people when no one has telepathy.  That is, communication is profoundly at the heart of what it means in a given historical and cultural context to be ethical.  Say that you are as persuasive as I hope you will be.   What types of research projects should people explore beginning from this insight?

If people lack telepathy, then we have to take communication very seriously.  That means that every time we want to say something about experience, affect, concepts, values, intuitions, or subjectivities, we should ask how they are mediated.  But communication isn’t a simple matter of transmission, getting a self-contained message from one head to another head.  For one thing, communication takes place over time, but, as I show in my chapters on social interaction, it always loops back on itself, opening messages to revision, reframing, denial, anticipation, dissemination, and so forth.  Moreover, mediation isn’t just an empty vehicle.  It is always embodied in semiotic forms (words, images, actual bodies, spaces, places, rituals, institutional procedures, and so forth).  Semiotic forms are never entirely purpose-built—as Derrida remarked long ago, “the engineer is a myth.”  As a result, they bring with them their contingent histories, they face causal constraints and give rise to unintended consequences well beyond anyone’s communicative purposes, and they possess affordances that can point their users in unexpected new directions.

It follows that research should be very attentive to the formal and material properties of our evidence.  So much contemporary ethnography tends to be literal-minded.  And far too much of it is based on interviews.  So the first point is just to take semiotic mediation seriously.  Partly this just means paying close attention to the form and not just the content of communication.  In addition, it means attending to materiality, to both the qualities of media and the causal networks they’re involved in.  If you were researching the internet, for instance, you might ask both about the body’s relationship to movement viewed on a flat screen and about the infrastructure that makes that relationship possible (cyber-utopians never seem to talk about how we pay the monthly smart phone bills or the environmental costs of powering Google’s servers).  So rather than suggest new research topics, we might look at the research we are already embarked on from new angles, asking: what are the constraints on people’s projects, and what distinctively ethical affordances and unintended consequences can their semiotic media give rise to?

I would pay particular attention to the interplay between what gets made explicit and what remains unsaid, either because it’s too obvious to say, too ordinary to notice, or simply impossible to put into words.  In looking at social change, for instance, what’s the relationship between those who are articulate and passionate, on one hand, and those who are silent and indifferent, on the other?  Are the voices we hear most clearly always where the action’s at?  When they are, is this because of what they say, who’s saying it, or how they say it?  In my book, I look briefly at feminist consciousness-raising during its radical moment, in the early 1970s, before it became absorbed into mainstream therapeutic culture.  (As with my discussion of Vietnam, this example draws on the historical perspective that we lack when looking at current events.)  What’s interesting is how these women, some of whom had been influenced by reading Maoism and Frankfurt School Marxism and by practical experiences in the Civil Rights movement, discovered the affordances of ordinary conversation.  Out of their conversations they created a new ethical and political vocabulary for experiences that had until then seemed idiosyncratic, pathological, or simply inchoate.  The result was what I call “historical objects,” values and concepts (sexual harassment, glass ceiling, control of one’s own body) that can be pointed to, debated, circulated widely through the media, and institutionalized—or suppressed—in explicit norms and laws.  One could argue that new ways of being a person, of flourishing, and of identifying harm came into existence that simply did not exist before.  But history is full of projects that go nowhere: objectified values and concepts remain only theoretical unless they can enter into the flow of everyday life in some way.  To see how this pans out ethnographically requires careful attention to semiotic mediation.

As the Vietnamese and feminist examples suggest, the interplay between the explicit and tacit, or the said and unsaid, can be crucial to understanding how social movements pan out.  There’s a lot of ethnographic interest in these topics already, but I would suggest that we need to pay special attention to the motility among first, second, and third person stances.  To repeat, the idealized third person stance—an ethics of pure principles—remains only notional unless it offers some concrete ways of being inhabitable.  But as soon as something becomes concrete—for instance, new kinds of marriage, styles of child-rearing, acceptable means of making a living, or practices of ethical pedagogy—all sorts of unforeseen affordances are likely to become visible and unintended consequences likely to emerge, such as new kinds of semiotic transgression or performative failure.

Your cover is so striking that when I got the book I immediately flipped to see where the cover came from, only to discover it is one of your paintings.  Could you talk a bit about the story behind the cover—did you paint this piece intending it to be the cover?

Before entering academic life, I was an artist (I’ve never taken a college course in anthropology—maybe that’s why I’ve never grown tired of the subject).   That cover image is part of a series that I painted many years ago.  When I was finishing my second book, Christian Moderns, I decided I didn’t want to have a cover that would try to illustrate the book, both because that seemed too literal-minded, and because illustration covers often encourage certain readings of the book at the expense of others.   As it happens, an abstract painting that one of my old studio mates had given me was on the wall, and worked very well.  So for Ethical Life I thought I’d use another work by a friend.  However, none of the pieces I myself owned seemed to work.  But someone suggested I use my own painting.  The original is in blacks and greys, which seemed a bit too somber, so I invited the press to alter the color scheme.  Since my first books had been green and blue, I favored red, but that turned out to look a bit too much like bloody bandages.   At any rate, you’re welcome to read into the cover what you will!

Bodoh-Creed’s When Pfizer Met McDreamy

My dissertation is an examination of the role that medicine and media play in educating the American public. The research as a whole looks at four lines of media evidence: fictional and non-fictional medical television, pharmaceutical advertising, and internet health searches (also called cyberchondria). My page 99 sits squarely in the historical review of medical television, looking at the portrayal of physicians and medicine from shows in the 1960s like Ben Casey to the current spate of shows on the air now like the long-running Grey’s Anatomy and House M.D.  Page 99 discusses the role of graphic medicine and realism that, while not unique to ER, was popularized by the show, and it also demonstrates how physician writers cannibalize medical experiences of their own and those of colleagues around them.

[ER] thrived on intensity for the audience. The pacing was fast and the camera shots unique. In an Emmy Award winning episode of ER in the first season, titled “Love’s Labor Lost” a pregnant woman is featured having complications in the emergency room and an ER doc having to perform a caesarian section in haste. Of course chaos ensues and it is a very graphic, fast episode that was based on a real experience of a physician friend of one of the writers. (99)

Within my dissertation research, I want to stress the importance of the access I was able to obtain to medical television industry personnel. I spoke to actors, directors, executive producers, writers, physician writers and consultants, nurse advisors and consultants, special effects creators, and product placement coordinators who organized medical equipment for the set, and they all gave me incredible insight into their world and into the changes in medical television over the last 50 years. The information from these key informants shows the ways that physicians and nurses create the authentic medicine that is seen on screen.  They strive for accuracy as much as possible, knowing that audiences are paying attention to the jargon, the procedures, and the medical lessons of early detection, treatments, and life-saving medications.

Jessica Bodoh-Creed, “When Pfizer Met McDreamy: A Classic American Love Story Between Medicine and the Media.” PhD diss., University of California, Riverside, 2013.

Dissertation available here:

Jessica Bodoh-Creed, Adjunct Faculty, California State University, Los Angeles, Department of Anthropology.



Ben Peters on his new book, How Not to Network a Nation

Interview by Ilana Gershon

Questions for Ben Peters

If you found yourself at a picnic with linguistic anthropologists, and one of them was sensible enough to bring tasty corn and seemed very interested when you mentioned briefly that you had written about the history of thwarted Soviet attempts to develop the internet, how would you explain your book?

Hey ling anth friends, please pass the corn!

(Between messy bites) I think my interest in the Soviet internet story owes a sideways debt to the fact I’m from deep corn country, USA. Coming from a small college town in the Midwest licenses one not only to know their corn (delicious!) but to appreciate life removed from the urban centers of global action. This appreciation once struck me as a 20-year-old service volunteer living in a small city called Balakovo along the Volga river in the post-industrial rust belt of Russia. There I realized not all “middle of nowheres” are similar (Ian Frazier’s delightful and meditative masterpieces The Great Plains  and Travels in Siberia make this point). In addition to the earnest people, the agricultural base, and scenic landscape I was used to back in Iowa, Balakovo, a former secret Soviet city, had decaying military-industrial factories for the cosmonaut industries, an enormous hydroelectric dam, a line of nuclear reactors looming on the horizon, and much else. Where did this outsized industrial infrastructure come from?  I remember wondering on the edge of an enormous dammed reservoir: who first thought it a good idea to plan so much electrical power in out-of-the-way Balakovo, and why?

(Holding up half-eaten corn cob) I have since recognized that the big ag corn industry in the US—and perhaps even all advanced modernity in its debt to large institutions—participates in the basic question behind the Soviet internet book: why do some institutions organize our lives and others not?  Why, in particular, wasn’t there a Soviet internet? Given that its failure was neither inevitable nor natural, what was the story of the Soviet scientists and leaders who planned to network the nation with computers anyway? Who were these cyberneticists?  What did they want when they started to imagine networking the planned economy with computers in an ambitious project called the All-State Automated System (or OGAS for short)? And why, despite thirty years of attempts at the height of the cold war tech race, did this outsized informational network infrastructure for the Soviet people not take root?

(Hint: my answers are not censorship cultures, technological backwardness, or inefficient hierarchical states.)

Throughout the book, you explain that while many cyberneticists modeled their national network projects after the human mind, the projects were quite different, in part because how they understood this metaphor varied so much. Could you explain some of the differences?

(Sets corn cob down in order to focus.) Sure thing. There are many brain-computer metaphors—and none of them are right. I think the twentieth century pedestaled the wrong image of the ideal computing processor: the ideal computing processor is not the human brain. (Moreover this towering intellectual hubris—or what brains think about themselves—builds naturally on a troubling early modern vaulting of western individualism.) Cybernetics, after World War II, enabled strong neural-computer network analogies to be at work in Warren McCulloch’s influence on Paul Baran’s distributed network design at RAND, in Stafford Beer’s influence on Allende’s Cybersyn network in Chile, and in (as I detail in the book) Viktor Glushkov’s influence on the OGAS Project in the Soviet Union. Each had different consequences in different places: to put it in a nutshell, the ARPANET designers imagined their nation as a single distributed brain of users, while the OGAS Project designers (not entirely unlike Beer in Chile) imagined their network as a nervous system layered onto their nation as an economic industrial body of workers, with the state as the brain. To suggest that the first American computer network was modeled after an imagination of a national brain and the Soviet networks after a national body not only rehearses the emerging mid-century information and industrial cold war economic differences—it also obscures the on-the-ground story of both. I suspect linguistic anthropologists have much to teach me here in particular: I think the biggest difference lies not in the metaphors but in the distance between all those brain-computer metaphors and the embodied practices of building and institutionalizing computer networks. No national network project resembles the human mind in practice.

Economists and economic cybernetics play a much larger role in the book than I imagined when I first started reading. It left me wondering: what kind of dilemmas do economists have to solve when they don’t presume that the market is the best way to distribute goods and determine value?

They stump up against some of the hardest dilemmas I know. In all semiotic-material discourse (of which the economy is of course just one mode), every evaluation is also an executable fabrication that itself acts on other evaluations. And among the resulting chains of operations, in which there are many dilemmas, the ones that matter most in this book do not fall along the cold war economic liberal language of private markets versus public states. (Hannah Arendt, for example, nudges my conclusion to deconstruct and begin rebuilding network discourse beyond the tired cold war triumphalism around markets, liberty, and commerce.) As you note, I spend a couple chapters developing how economic relations did not work as planned in the Soviet context: continuous and partial reform among battling schools of thought, nonlinear command and control dynamics, informal power networks, vertical bargaining, and other sources of organizational dissonance. I dub all this, borrowing from McCulloch and David Stark’s language, “heterarchy.” In a heterarchy, every node is subject to competing regimes of evaluation and the resulting logics by which value is determined cannot be described or mapped onto simple two-dimensional models (markets, hierarchies, and so on). Perhaps our behavior can be mapped onto a higher order in n-dimensional spaces, suggests McCulloch, or perhaps not at all. How we determine value is a complex measure of how modern humans interact, and indeed how any actor responds to contradictory demands (do I write, prepare for class, go for a walk, or have another piece of corn?) reveals more than our negotiated compromises to that contradiction.

Back to the Soviet case: it is no surprising revelation that the on-the-ground practical relations for determining and planning values in the Soviet planned economy did not function as they promised to on paper. Still, this mundane fact had consequences in at least two directions: it frustrated the rationalizing impulses of technocratic economic reform attempts such as the OGAS Project. It also ensured that economic bureaucracies could actively resist reforms because they were free to pursue their institutional self-interest in the status quo. The Soviet network story is thus an uneasy mix of technological genius and futuristic foresight leavened with mutinous ministries and institutional infighting. Or, to restate the book hook, while the US ARPANET took shape thanks to state funding and collaborative research environments, the Soviet contemporary projects broke against the rock of unregulated competition among self-interested institutions and bureaucrats. The first global computer networks took shape thanks to cooperative capitalists, not competitive socialists.

As your story unfolds about why the Soviets never developed a national computer network, another deep irony emerges – that a system built around centralization consistently over time ended up undercutting any possibility of a centralized computer network.   You suggest that this is a story about the tensions between belief and practice, between a system that was touted as a centralized and well-regulated bureaucracy and in practice a complicated mixture of differently structured hierarchies. Could you discuss the tensions between Soviet centralization and Soviet bureaucratic fiefdoms that lie behind your history of non-events?

That’s a beautiful question central to the project—one that I’d like to tweak in two ways as a way of responding.

First, the commonplace understanding of the Soviet state as centralized and well-regulated is empirically wrong. Let’s think instead of the Soviet state as trying to consolidate a complex field of decentralized fiefdoms into a single decentralized pyramid. Even the Politburo rarely endorsed totalizing centralization, and certainly none of the Soviet network projects in the book call for fully centralized networks. With the exception of one short-lived radial network proposal, all of the proposed Soviet network projects between 1959 and 1989 interestingly resemble decentralized pyramids (just like the official economic plan at the time).

These networks still recognized a central command in Moscow while also permitting real-time remote access between any two authorized nodes on the national network. This is a key corrective (especially in light of the romance of flat organizational networks in the west): in both principle and practice, the Soviet Union was not too top-down or rigidly hierarchical. In administrative practice, rather, it was too messy and pernicious.

Second, the book is a negative history of real events, not a hypothetical history of non-events (although it does color in the vision of a futuristic electronic socialism that never was). The reason I think this re-characterization matters is that I am openly interested in helping normalize the study of failed projects among scholars of human relations, complex institutions, media studies, and adjacent fields.

Or as the book puts it,

contingent histories also help focus public debate better than do popular histories of technology that parade about hackers, geniuses, and geeks marching to the Whiggish beats of technological progress. In negative histories, failures, even epic breakdowns, are normal. Astonishing genius, imaginative foresight, and peerless technical wizardry are not enough to change the world. This is one of the lessons of the OGAS experience. Its story places the conventional concepts of technological successes and failures on the wobbly foundations of the accidents of history. The historical record is a cemetery overgrown with short-lived technological futures: stepping off its beaten paths leads us to slow down and take stock before we rush to crown the next generation of technologists as agents of change. (197)

Perhaps the hard moral of the story is this: no one sensitive to the suffering all around us can help but want to reform the world for the better. Yet in the multivariable calculus of social reform, the only thing more certain than our desire to change the world (and media and language are among the means for doing so) is the fact that there is no guarantee that any given effort ever will.

This is a beautifully written book about bureaucracy, one in which you even manage to make a bureaucratic meeting, and the fact that two people happened to be absent, suspenseful. I was wondering if you could talk a bit about your writing strategies for making compelling some topics that at first glance might seem like cures for insomnia. How do you make bureaucracy and its genres compelling?

(Setting down a now-finished corn cob.) Thanks! That means a lot. Writing is one of my favorite demons. Bureaucracy, because it (like computer programming) is made out of writing that is meant to be executed but never really read, often deadens its observers to the real fertility and force of the written word. Perhaps students and scholars of mind-numbingly dull technical systems should indulge in great stylists in English and other languages as temporary antidotes against the occupational hazard that is prose pollution. Outside of my own awe for language (which I take up more directly in my brand-new edited volume Digital Keywords: A Vocabulary of Information Society and Culture, which Princeton published last week on the 40th anniversary of Raymond Williams’ Keywords), I don’t have any sure-fire strategies, although the normal ones will probably do: I recommend reading voraciously and strategically, slavishly imitating and scrupulously doubting the masters, writing for the smart and interested eighth grader, and then rewriting with an ear tuned to the cadence of language. Of course I rarely manage to pull all that off, but it’s a good thing to try and a better thing to have sympathetic readers to share it with.

Thanks for the picnic!

Benjamin Peters is the author of How Not to Network a Nation: The Uneasy History of the Soviet Internet (MIT 2016) and editor of Digital Keywords: A Vocabulary of Information Society and Culture (Princeton 2016). He is assistant professor of Communication at the University of Tulsa and affiliated faculty at the Information Society Project at Yale Law School. Tweet at him @bjpeters.