Cult Conversations: A Series on Horror, Exploitation and the Gothic

An Introduction and a Provocation

By William Proctor

Over the past year or so, horror cinema has been discursively underpinned by what entertainment critics have described as a “new golden age,” a “renaissance” that is demonstrative of an unequivocal cultural, industrial and attitudinal shift. As Syfy’s Tres Dean claims,

The past five years or so have seen the release of such a wide array of genre-defining horror films that it may be time to go ahead and call a spade a spade: We’re experiencing a genuine horror renaissance.

Likewise, Daily Beast’s Jen Yamato argues that

The mainstream horror movie is, sadly, the last place anyone who’s ever seen a mainstream horror movie would credibly look for critical acclaim—not that horrorhounds wouldn’t love to see an impeccably crafted four-quadrant slasher sweep the Oscars.

According to Michael Rothman for Consequence of Sound, “horror isn’t just having a resurgence, it’s taking over.”

Writing for the BBC, Nicholas Barber claims that, in historic terms, horror has traditionally been the black sheep of the Hollywood genre system,

a slightly embarrassing, bargain-basement alternative to mainstream drama […] You can understand why [horror films] might not appeal to a producer with an Oscar- or BAFTA-shaped space in their trophy cabinet.

Some critics argue that this so-called “new Golden Age” is best exemplified by the rise of Blumhouse Productions, with Jason Blum’s ‘cheap and nasty’ economic model outperforming blockbuster franchises and films, at least as far as return-on-investment (ROI) goes. The Oscar nod for Jordan Peele’s Get Out is, of course, most often heralded as ‘proof’ that horror is transitioning out of the cult ghetto and into mainstream prominence. “For the first time ever,” exclaims Scott Meslow for GQ, “the most critically lauded movie released [in 2017] is a horror movie.” Blumhouse has “cornered the market on inventive horror,” argues Tracy Palmer.


Writing for The Guardian in September 2017, Anna Smith asked:

So how did these once fringe-films move into the heart of the mainstream?

Smith’s question, perhaps a rhetorical flourish more than anything substantive, suggests that the terms ‘mainstream’ and ‘horror’ are not easy bedfellows, setting up a binary between popularity and fringe (or the incredibly amorphous term ‘cult’). The second decade of the new millennium is, as many critics have pointed out, a generic high-water mark represented by ‘quality horror,’ ‘smart horror,’ ‘high concept horror,’ ‘elevated horror,’ ‘horror-adjacent,’ and ‘post-horror,’ terms that, in Nicholas Barber’s account, operate as “back-handed compliments,” bolstering the notion that the genre is much maligned.


But has horror cinema not historically been less a coherent category than, as with all genres, a system of currents, cycles and trends that have pumped valuable oxygen and blood into “the heart of the mainstream” for almost a century? For if Hollywood “has always relied on horror movies,” then the suggestion that the genre has recently been elevated from the margins and thrust into the spotlight is little more than discursive ballyhoo, I would argue. Indeed, the suggestion that contemporaneous horror media is somehow indicative of a widespread ‘renaissance’ would mean that there has been a fallow period from which the genre has risen into prominence once more.

But has horror ever really gone away? Is the genre truly a niche or cult object? And would it be at all accurate to claim that horror cinema has historically been categorically despised and maligned?


In cinematic terms, the genre has inarguably been a key organ in “the heart of the mainstream” since the turn of the twentieth century. Prior to the coming of sound, entrepreneur and pioneer Thomas Edison produced the first film adaptation of Mary Shelley’s Frankenstein in 1910, drawing on the gothic tradition at a time when the horror genre was in utero. Lon Chaney, “the man of a thousand faces,” was perhaps the first star of (proto) horror cinema, most famously playing lead roles in The Hunchback of Notre Dame (1923) and The Phantom of the Opera (1925), the former of which was Universal’s “super jewel,” the most successful mainstream picture for the studio at that point. With the inception of sound in the early 1930s, Universal’s Carl Laemmle Jr continued to green-light adaptations of gothic literature, leading into what has been termed the first Golden Age of horror film, usually illustrated by the seminal ‘Universal Monster’ cycle. In 1931, Tod Browning’s Dracula and James Whale’s Frankenstein shattered box offices across North America, and horror sequels and series swiftly became a major cash nexus of the Hollywood motion picture industry.


During the period, it was not only Universal that tapped into audiences’ appetite for blood-curdling cinema: other studios produced a welter of horror pictures as well. Alison Peirse redresses this gap in After Dracula: The 1930s Horror Film, emphasizing “the diversity of horror film production during the 1930s”:

Many of the films that appeared over the next few years [after Dracula] diverged quite significantly from the mechanics of Universal’s gothic vampire story. The Gaumont-British film The Clairvoyant (1933) is grounded in spiritualism and British popular culture; Murders in the Zoo (1933) is a story of lip-sewing sadism and murder set amongst real-life big cats and snakes; while The Black Cat (1935) is an occult shrine to modernist architecture and design.


Paramount also boarded the horror bandwagon in 1931, producing Dr Jekyll and Mr Hyde, featuring Fredric March as the eponymous split personality, a performance deemed so incredible that March walked away with the Oscar for Best Actor in a Leading Role at the fifth annual awards, as well as Most Favourite Actor at the Venice Film Festival.


And while classic horror films might well seem excessively camp or outmoded to audiences today—my own undergraduates tend to laugh at Whale’s Frankenstein, for instance—the emergence of horror pictures during the transition from silent cinema to “talkies” attracted the ire of censors and moral campaigners. As recounted by Peirse, a Production Code administrator asked Will Hays: “is this the beginning of a cycle that ought to be retarded or killed?” It was thus not the attack on The Exorcist in the 1970s, nor the so-called “video nasties” panic in the 1980s, that first put horror cinema in the dock.

Between 1931 and 1936, horror cinema remained at the epicentre of Universal’s output, so much so that the studio’s decision to cease producing horror pictures, taken to appease the conservative moral brigade in the wake of the Production Code’s implementation and the introduction of the H-rating in the United Kingdom (H standing for horror), left Universal on the cusp of bankruptcy. It was only by returning to horror in 1939 with Son of Frankenstein that Universal’s fortunes shifted once again, meaning, unequivocally, that horror saved the ailing studio. Decades before the rise of the contemporary blockbuster in the 1970s, then, Universal’s horror pictures stood out as examples of what we would now describe as “tent-pole” productions.


Adaptations and remakes of gothic literature were key in the genre’s formation, but Universal pushed the envelope further by producing films that remained branded with recognizable staples of the gothic tradition while radically manoeuvring outside the parameters of the source material, most notably with the Frankenstein films. Considered by many critics the frontispiece of classic horror cinema, James Whale’s The Bride of Frankenstein was the first horror sequel in film history, setting out numerous codes and conventions that continue to characterise the genre today, especially the central motif that the monster will rise again (and again, and again).


The second wave of horror pictures followed hot on the heels of Son of Frankenstein, although Universal moved from lofty A-picture budgets to B-movie economics. This second cycle, said to have lasted from 1939 to 1946, included the House of Frankenstein and House of Dracula “monster mash” films, a protean precursor to the “shared universe” model currently employed by Marvel Studios, and, as the Universal monsters moved further into parody, the Abbott and Costello films. Other studios developed and deployed horror pictures during this second cycle, including RKO, home of Val Lewton’s horror unit, Columbia and the Poverty Row studios. Indeed, “all the major studios contributed to this cycle,” and “commentators believed they were witnessing an unprecedented boom in horror film production,” as Mark Jancovich put it.


The Universal Monster canon would recycle once more in the late 1950s and early 1960s with the advent of television and the rise of “Monster Culture.” As ‘monster kid’ Henry Jenkins has written in a special issue of The Journal of Fandom Studies, “the Universal monster movies had been part of the large package of ‘Shock Theater’ Screen Gems sold to television stations in the 1950s and still in active use on second-tier local stations in the 1960s.” The publication of Forrest J Ackerman’s Famous Monsters of Filmland magazine signified an emergent fan culture that sat around the TV set, dressing up as their favorite monsters and consuming merchandise such as the Aurora model kits. In 1964, Universal produced the TV comedy series The Munsters to capitalize on this new audience of baby boomers as horror became part of the domestic furniture.


In the United Kingdom during the same period, Hammer Film Productions, like Universal, tapped into the gothic tradition, with director Terence Fisher’s Dracula, The Curse of Frankenstein and The Mummy offering “the loveliest-looking British films of the decade” and spawning several critically regarded film series (although such regard has been retroactively applied in many cases). Over time, the quality of Hammer’s output dipped, moving from serious, though melodramatic, films to the absurdly camp—although that did not prevent the studio from winning the Queen’s Award for Industry in 1968 for enticing $4.5 million from North America into the UK economy. This was the Golden Age of British Horror.


In the 1960s and ‘70s, literary adaptations once again became the life-blood of the genre, at least in part. Alfred Hitchcock’s translation of Robert Bloch’s Psycho earned the director an Academy Award nomination, Janet Leigh a nomination for Best Supporting Actress (a category in which she won the Golden Globe), and John L. Russell a nomination for Cinematography. And for many critics, Psycho laid the groundwork for what would later become known as the “slasher film” in the late 1970s and early ‘80s. But it was the publication of Ira Levin’s neo-Gothic thriller Rosemary’s Baby in 1967 that would (apparently) spark a new Golden Age of horror cinema. Directed by Roman Polanski and starring Mia Farrow, the film adaptation was certainly controversial, with conservative critics and moralizers criticising the film for its “perverted use of fundamental Christian beliefs,” as recounted by David J. Skal in The Horror Show. This, however, didn’t prevent the film from sweeping up the lion’s share of box office receipts in 1968, with Ruth Gordon winning the Academy Award for Best Supporting Actress and the film receiving a nomination for Best Adapted Screenplay. Gordon also won the Golden Globe in the same category, while Mia Farrow was nominated for Best Actress.


Following on from the success of Rosemary’s Baby, William Friedkin’s adaptation of William Peter Blatty’s novel, The Exorcist, was a cause célèbre, a film that was both excoriated and celebrated in turn. As Mark Kermode explains,

Written by a Catholic, directed by a Jew, and produced by the vast multinational Warner Bros., this was a movie that was championed by sometime political radicals such as Jerry Rubin, picketed by concerned pressure groups, paid for by millions of eager punters, praised by the Catholic News for its profound spirituality, and branded satanic by evangelist Billy Graham. Never before or since has a mainstream movie provoked such wildly diverging reactions.


The Exorcist demonstrated that horror cinema continued to be legitimately mainstream, becoming the second most popular film in 1974—after Paul Newman and Robert Redford vehicle, The Sting—and receiving nominations for ten Academy Awards, including Best Picture (the first horror movie to be nominated in that category) and Best Director, as well as eight Golden Globes, four of which it won (including the coveted Best Picture and Best Director). At the box office, The Exorcist became the highest-grossing horror film in history, and remains so to this day (more on that below).

In 1976, Brian De Palma adapted Stephen King’s debut novel, Carrie, starring Sissy Spacek and Piper ‘dirty pillows’ Laurie. As explained in Simon Brown’s excellent Screening Stephen King, it was De Palma’s adaptation, rather than the novel itself, that catapulted King from horror-fiction niche to household name. Carrie received Oscar nominations for Spacek and Laurie for Best Actress and Best Supporting Actress, respectively.


Adaptations were not the only dish on the horror menu, however. Richard Donner’s The Omen, also in 1976, earned over $60 million at the box office, garnering critical plaudits and an Academy Award for Best Original Score, which Jerry Goldsmith won. Further, Billie Whitelaw was nominated in the Best Supporting Actress category at the BAFTAs and won the Evening Standard British Film Award for Best Actress, while Harvey Stephens was nominated for a Golden Globe for Best Acting Debut.


While not in the same league as The Exorcist in box office terms, Tobe Hooper’s The Texas Chainsaw Massacre converted a budget of $80K into $30 million—an ROI of 37,400%—thus demonstrating that films ostensibly created for the exploitation circuit could puncture the “heart of the mainstream,” just as George A. Romero’s Night of the Living Dead managed to take $30 million in box office receipts on a budget of $114K in 1968.


The 1970s also became the petri dish for blockbuster/franchise cinema, and it was a horror film that arguably started it all: Steven Spielberg’s Jaws. Although George Lucas’ Star Wars would turn producers towards science fiction—even Bond went into space with Moonraker—Ridley Scott gave us sci-fi/horror hybrid Alien in 1979. Yet Jaws, Star Wars, and Rocky all engaged with serialization, more commonly known as ‘franchising.’ However, as discussed earlier, it was the Universal Monster canon that experimented with serialised filmmaking four decades earlier, and it should perhaps be viewed as early (or proto) franchising even though the term was not in use during the period, as Derek Johnson has emphasized. Horror cinema was not immune to the industrial turn to blockbusters and “sequelization” that started in the 1970s. George A. Romero produced the first sequel in his continuity-less zombie franchise, Dawn of the Dead, in 1979, while a year earlier, John Carpenter’s Halloween sparked what has been described as the ‘Golden Age of Slasher Movies,’ a cycle said to have run between 1978 and 1984 and comprising well-known films such as Friday the 13th, Prom Night and A Nightmare on Elm Street. Each of these films grew into powerhouse franchise properties, assisted by the rise of video in the 1980s, a medium that extended the life span of cinema from theatrical exhibition into the domestic realm, as well as introducing the capacity to re-watch films, or to record them when broadcast on television. As with the Universal Monster canon, eighties monsters such as the stalk-and-slash triumvirate of Jason Voorhees, Michael Myers, and Freddy Krueger would return from the dead again and again, the ideal recipe for the franchise blueprint. Other characters emerged in the ‘80s as well that would lead to franchise development, including possessed doll Chucky from Child’s Play and Clive Barker’s demonic Pinhead from Hellraiser.


The 1990s horror film shifted, at least partly, towards psychological horror, with Jonathan Demme’s The Silence of the Lambs, an adaptation of Thomas Harris’ best-selling novel, winning the ‘big five’ Academy Awards: Best Picture, Best Director, Best Actor, Best Actress and Best Adapted Screenplay. Mid-decade, Wes Craven’s final Freddy film, Wes Craven’s New Nightmare (1994), surprised critics with its smart meta-narrative, leading to Scream (1996) initiating a second slasher cycle, this time flavoured with postmodern commentary and self-conscious reflexivity. The success of Scream led to sequels Scream 2 (1997) and Scream 3 (2000), all of which spun box office gold—especially the first film, which made $170 million from a $15 million budget—as well as cycle films such as I Know What You Did Last Summer (1997) and Urban Legend (1998). In 1999, The Blair Witch Project popularised the found footage subgenre with a marketing campaign that has since gone down in history.


In the new millennium, the genre mutated once more (although I would insist that the genre has never quite been static). Spearheaded by the phenomenal success of James Wan’s Saw, the so-called “torture porn” cycle was born—more an invention of the press than a discrete genre, as emphasized by Steve Jones—and over the next few years, explicitly violent films became part and parcel of mainstream cinema. The Saw franchise produced seven films in seven consecutive years, with the law of diminishing returns temporarily halting production until the eighth instalment, Jigsaw, surfaced in 2017. This was the era of “the Splat Pack,” with ambassadors Rob ‘The Devil’s Rejects’ Zombie and Eli ‘Hostel’ Roth flying the flag for excessive splatter, gore and, in their own accounts, political transgression (see Mark Bernard’s brilliant monograph on the topic). Often admonished by the critical establishment, these films nevertheless became key elements of mainstream horror cinema and raided the box office.


The post-millennial landscape was also replete with remakes, reboots and re-adaptations, many of them coming from Michael Bay’s Platinum Dunes, ‘a remake house’ in all but name at that point. Described as “the deja-vu boom,” this wave convincingly shows that the genre need not be subservient to a single, univocal cycle but involves cross-breeding across and within sub-generic elements, a dialogic array of different manifestations of what we might describe as “horror” at any particular historic juncture.


In 2007, Jason Blum’s Blumhouse entered the scene, with found footage film Paranormal Activity setting box offices alight with a remarkable, record-breaking ROI. By converting a shoestring budget of $15K into box office receipts of $193 million, Paranormal Activity set the ground for what has become known as “micro-budget” horror filmmaking in the twenty-first century.


Paranormal Activity led to a further five films in the series between 2009 and 2014, but the second decade of the new millennium is also marked by a shift away from extreme representations of gore and violence and back to atmospheric ghost stories, including the Insidious franchise, also from Blumhouse, and James Wan’s The Conjuring films and spin-offs. This is not to suggest that so-called “torture porn” has disappeared, however: Pascal Laugier’s Incident in a Ghostland is certainly an intense ordeal, as is the Australian film Hounds of Love. Cycles wax and wane, but the genre is much more than this-or-that cycle at any given moment.


Are we experiencing a “new Golden Age of horror films,” then? I certainly agree that the genre is in rude health at the moment, and that the Blumhouse economic model of “micro-budget” horror cinema is giving the majors a run for their money (so much so that I am researching Blumhouse for a monograph—tentatively titled Cheap Shots). But as this potted history shows—and it is very piecemeal, I admit—I definitely do not accept that horror cinema has been unquestionably fringe, unmistakably cult, emphatically marginal and wholly disparaged.

In 2017, Jordan Peele’s Get Out and Andy Muschietti’s re-adaptation of King’s IT were most often invoked as heralding the new Golden Age in press discourse, the former primarily because of the way in which it confronted the politics of race and was nominated for an Academy Award—certainly not the first to do either—and the latter because it shattered box office records for the highest-grossing horror film in history.

Except, it wasn’t. The common trend of citing economic performance without adjusting for inflation is patently ludicrous. In adjusted dollars, The Exorcist slaughters IT in no uncertain terms, standing at number one for horror cinema, and at number nine in the all-time box office chart regardless of genre. Comparatively, IT stands at number 225. Moreover, The Exorcist outgrossed every Star Wars movie barring Lucas’ first instalment (now subtitled A New Hope).


As far as disparagement goes, it is more likely, I would argue, that it is popular cinema generally that is largely sneered at by the critical establishment, just as adaptations, sequels, remakes and franchising have been castigated as symptomatic of Hollywood’s creative inertia and decline. Many press accounts tend to deal explicitly in hyperbole of this sort, ignoring the history of cinema and the way in which adaptation and remaking practices have been with us since the very start, as pointed out by various scholars such as Constantine Verevis, Carolyn Jess-Cooke and Lucy Mazdon (to name a select few). But I wonder—and I don’t know the answer to this yet—whether horror cinema has managed to attract as many critical plaudits and establishment trophies and nominations as other so-called “genre pictures.” How many science fiction films, for example, have been nominated for an Academy Award? If establishment awards are any indication of mainstream success, it is certainly true that superhero films lag far behind horror cinema in terms of the trophy cabinet.

Of course, I am not suggesting that Oscars, BAFTAs, Golden Globes etc., are signifiers of quality, but I use the cases above as a way to illustrate that horror cinema has cut across cultural distinctions at different historically contingent moments. Thus, rather than view ‘the horror film’ in binary terms, as either operating on the cultish fringe or sneaking surreptitiously into the mainstream, the genre is much more expansive than dualisms of this kind allow. There is no such stable or concrete generic category as ‘horror,’ but a matrix of forces and factors that may account for the way in which the term has been discursively employed historically. To be sure, there are certainly aspects of horror cinema that have attracted a fair share of controversy and condemnation: from the Universal Monster cycle, The Exorcist, the so-called “video nasties,” the rape-revenge film, “torture porn,” The Human Centipede, The Bunny Game, A Serbian Film, etc. etc. But these currents and trends do not make up the genre that we understand as horror entirely. Horror cinema is neither wholly maligned nor critically celebrated, but exists in a more complex and complicated array of dialogic utterances and discourses that often cuts across cultural distinctions.


I would also add that streaming giants such as Netflix and Amazon Prime, both of which produce their own films and series, have dallied in horror, Mike Flanagan’s adaptation of Shirley Jackson’s The Haunting of Hill House on Netflix being a particularly fine example of serial horror (and the third adaptation of Jackson’s novel). But do these texts illustrate a “renaissance,” or a “new Golden Age”? Or are there simply more media platforms to populate with content? There is no way, I would argue, that the quantity—and quality—of horror series, serials and films being produced at the moment outweighs other cycles and currents in previous decades.


Over the next few weeks, Confessions of an Aca-Fan plays host to a new series of interviews with several academics centred on aspects of cult media, horror, exploitation, the gothic, and more besides. And while the focus is not entirely on fandom, interested readers will no doubt recognise that the majority of these scholars fit the definition of what Henry Jenkins would describe as ‘aca-fandom,’ even if they do not identify as such themselves in direct terms. I asked similar questions of our contributors at times, while at others I homed in on individual research endeavours, in the hope of producing a discursive debate of sorts.

We hope you enjoy the series, and if anyone would like to contribute an essay or propose a topic, please email me at bproctor@bournemouth.ac.uk.


William Proctor is Senior Lecturer in Transmedia, Culture and Communication at Bournemouth University in the UK. He has published widely on various aspects of popular culture and is currently writing his debut monograph, Reboot Culture: Comics, Film, Transmedia (Palgrave). Along with co-editor Matthew Freeman, William has recently published the edited collection, Global Convergence Cultures: Transmedia Earth for Routledge. He can be reached at bproctor@bournemouth.ac.uk.


Do We Still Believe Networked Youth Can Change the World?: A Special Issue

A special issue of the bilingual (Spanish-English) journal, Working Papers on Culture, Education, and Human Development, dropped recently, sharing three essays on the theme of “Do We Still Believe Networked Youth Can Change the World?” The exchange started with the plenary session I did with activist/entrepreneur Esra’a Al Shafei at the Digital Media and Learning Conference a year or so back. An edited transcript of that memorable exchange about social change movements in the Arab world and some reflections on it open this issue. My long-time friend James Paul Gee, a major figure in the education world and someone who writes often about games and learning, responded with some fairly strong critiques of the concept of “participatory politics” as it has been shaped by the MacArthur Foundation’s Youth and Participatory Politics research network and others in recent years. And I, in turn, responded, clarifying and defending our core concepts and also trying to speak to the similarities and differences between Gee’s concept of “affinity spaces” and my own work on “participatory culture.” The two are sometimes used interchangeably, but I see significant differences (although some overlap) between them. If you want to read this lively back and forth, you can find it here.

Here is a brief sample from the exchange.

James Paul Gee:

Why talk about affinity spaces and distributed teaching and learning systems? Why isn’t talking about participation and connection enough? Think of an empty affinity space (no one is in any of the spaces at the moment). Each of these is sitting there (really or virtually) with tools and resources to create certain sorts of learning, teaching, and appreciative systems. Each is like a restaurant resourced to cook certain sorts of food in a certain way, though food for the mind and action in the case of affinity spaces.

Such spaces are most often the historical product of mutual top-down and bottom-up design and organization. As people move through them they are guided/directed/taught by the tools and resources— and the practices they facilitate—available to them and, in the act, over time they transform them. But at no point are people innocent of directive frameworks and teaching as design whether done by a tool, resource, or person. It is the shared appreciative systems, skills, and identities that give fellow-travelers (some more than others) power in the sense of directed agency of a characteristic, focused, social, and socialized sort.

In the end, I argue that we need to focus on capacities, design, resources, values, beliefs, and norms and not media or even participation per se. The latter are constrained by and facilitate the former.

Henry Jenkins:

I am often asked about the similarities and differences between Gee’s notion of “affinity spaces” and my own conception of “participatory culture.” I have always seen potentially productive overlaps between the two. But, Gee’s suggestion here that we think about an “empty affinity space” suggests some important differences. Gee has sought to distinguish “affinity spaces” from the affiliation or sociality conjured up by a word like “community.” Starting from a focus on games, game designers, and game players, Gee is interested in shared resources and activities, rule sets and affordances, which are to some degree built into the designed environment. Starting from my personal focus on fandom, my participatory culture model emphasizes the social ties, cultural traditions, shared norms and values, and expressive practices that support informal learning. Fans cluster around existing cultural works produced by others, often commercial producers, but they read them in relation to their own lived experiences and draw from them resources that help them to better articulate their own perspectives. While commercial producers want consumers, they have not always welcomed fans and for that reason, their attempts to build platforms to facilitate fan interactions -- to set the terms of fan participation -- have largely failed. Fandoms emerge from other fandoms as diasporic knowledge spreads from one site of community engagement to the next. Mentorship is practical but not hierarchical: those who know teach those who need to learn without regard to age or authority.

Fandom is not a space; fans interact with each other across a wide range of different platforms and environments, drawn to them because they offer certain affordances that allow them to pursue their shared goals and interests.  Fandom’s power comes from its potential to persist despite top-down limits that shape the design and operation of various platforms. Shut it down here and it will spring up somewhere else. The ways commercial producers and platforms enable, limit, or seek to profit from fan engagements has become increasingly central to fandom studies research, but most of us recognize that the fan community (a word too valuable to reject) is not limited to a single platform and its affordances.  There is no such thing as an “empty” fandom; fandoms only exist when groups of people are brought together through their shared social interactions. The spread of fan knowledge and practices is better understood in terms of mentorship (including peer-to-peer mentorship), tradition, and emergence (grassroots experimentation and innovation) rather than design.

Affinity spaces and participatory cultures are not mutually exclusive; one can imagine many potential overlaps between them, but we cannot assume that every affinity space constitutes a participatory culture or vice versa. I share Gee’s sense that we should be paying more attention to how we build stronger bridges between different participatory cultures, how we find the common ground we need to rebuild the kinds of democratic culture many of us desire. But I would have said that participatory politics movements, such as March for Our Lives, demonstrate the power of such coalitions to take collective action.


There’s a lot more where this comes from — translated into Spanish as well as English.

Rediscovering 1940s American Cinema: An Interview with David Bordwell (Part Four)

Knowing you to always care deeply about the quality of academic prose, I was struck by your close attention to stylistic issues in looking at these writers. What might contemporary film studies scholars learn from a closer rhetorical engagement with the expressive practices of these writers? You write in your introduction, “They remain far more provocative and penetrating than nearly anyone writing film criticism today.”

I do think that film scholars, like most academics, could try to write more crisply. Of course I read things I’ve written over the years and cringe: Did I really have to say things so clumsily? I struggle first to achieve clarity and then, I hope, for a certain neatness, even felicity. I think I had a fairly cogent academic style, but I think writing our blog entries over the last dozen years has made me a more conversational and, I hope, user-friendly writer. I would urge people not to try to imitate any of those critics (especially Farber) but to concentrate on developing fresh, defensible ideas about cinema and putting them forward with nuance. After all, we academics have the luxury of more space than reviewers can command, so there’s no reason we can’t go into more depth.

Developing our blogsite after retirement has given me a forum for long-form para-academic essays, and the ease of putting color stills and clips into an online platform has allowed me to follow my wayward interests (even some on politics). I don’t know that any journal would have published most of what I’ve written there, but I think the informal tone of the work did help me find publishers for our books, two of which (MINDING MOVIES and THE RHAPSODES) were revised blog entries. I think every academic researcher in the humanities should find some admirable writers of haute journalism (for me, Shaw, Hitchens, Robert Hughes, Susan Sontag, Kenneth Tynan, Elizabeth Hardwick) and study how they do it.




As we read these critics, one has a sense of their passionate love for cinema as a medium. There is a familiar discourse about the loss of cinephilia within contemporary culture. Do you agree with this assessment, or is this just grumpy old people not recognizing the same faces and perspectives dominating the conversation? Do you still find things to love in contemporary cinema?

I do feel myself split. Almost every year brings several films I straightforwardly love. This year, that list includes THE GREATEST SHOWMAN, GAME NIGHT, and THE BALLAD OF BUSTER SCRUGGS. I look forward to films from directors I admire: Spielberg, Panahi, Kore-eda, Kitano, Tarantino, Paul Thomas Anderson, Nolan, Wes Anderson, the Coens, Burton (only the weird projects like BIG EYES), Damien Chazelle, Spike Lee, Agnès Varda, David Koepp, Steven Soderbergh, Lucrecia Martel, Lynch, Wong Kar-wai, Johnnie To, Jaume Collet-Serra, etc. I also enjoy genre items like entries in the PARANORMAL ACTIVITY series, and I’m a sucker for anything that strays into my research zone (A SIMPLE FAVOR, SEARCHING, etc.). But the Star Wars cult leaves me scratching my head, and the superhero films I mostly don’t get, which is odd because I read Batman and Superman (and MAD) as a kid. (Though like everybody else, I loved GUARDIANS OF THE GALAXY.) Kristin and I go to film festivals to catch up with current work; at Venice and Vancouver this year I think I saw over fifty new titles.


Still, while preparing entries in our FilmStruck/Criterion Channel series, I was reminded of how uniquely rich classic cinema is. Studying Duvivier’s LYDIA (1941), which I also write about in the 40s book, HIROSHIMA MON AMOUR, and Mizoguchi’s STREET OF SHAME brought home to me how tight, economical, and “dense” or “thick” a film could be. In less than two hours—in the Mizoguchi, less than ninety minutes—we have an intense, “saturated” experience of cinema, not to mention life. Appreciating this requires concentration, though, and I do believe that people’s devotion to “multitasking” has led to a loss of one aspect of film geekery, the ability to shut off everything else and sink yourself into a circumscribed experience. For me, the early films of Kiarostami and Hou Hsiao-hsien approach this quality.

I guess I really do believe there’s a difference of some sort between silent cinema and classic studio cinema (in many countries, including Japan and India), “modern” cinema (say 1950s-1970s), and contemporary cinema. I like all of those periods, but I do sense a difference.

On the whole, though, I think “cinephilia” has become more or less a taste marker (and a branding device for film festivals). I think the idea in its recent form emerged from CAHIERS of the 1980s, when video was starting to take hold of consumers, and film fans felt the need to justify their attraction. Let’s just admit that nearly everybody loves some kinds of cinema, as they do some kinds of music or literature.



Earlier this year, I shared some thoughts with media literacy advocate Tessa Jolls about why the media literacy movement should pay more attention to the cognitive side of your work, and I’d love to get your reactions to that exchange. To what degree do you see your work as contributing to media literacy? Clearly Film Art is widely taught at the undergraduate level, but as film becomes less central to the ways media literacy is taught in high schools, say, are there more general principles for teaching media we might extract from your work? Do you think of media comprehension as a form of “literacy”? Or are there better metaphors for thinking about what we do when we make sense of a media text?

I think I’m out of my depth here, but I’ll try. 

If media literacy means making people aware of how they interact with media, then I’m wholly in the game. Everything I’ve done takes for granted that form and style shape viewers’ experience to some degree. In terms of a cognitive perspective, you and Tessa clearly understood the interactive side of my interests. Films don’t totalistically demand a single response, but neither can they mean any old thing we want.

In POETICS OF CINEMA I floated a cognitive model suggesting that as we move up three levels, from perception through comprehension to appropriation, the filmmaker’s power wanes as the viewer’s power increases. At the level of perception, the filmmaker “structures the stimulus,” as some might say; at the level of comprehension, there’s a kind of collaboration, in which the film prompts our inference-making powers (we collaborate in making the narrative cohere); and at the level of appropriation (including interpretation, but also any use we might put a film to, including in a classroom discussion), we as viewers can build off the film in many directions completely unforeseen by its makers. (Emotional response operates at all three levels, I think.) This still seems to me a decent first approximation of how to think about the dynamic of control and freedom posed by media texts.

On the pedagogical front, I think that what we tried to lay out in FILM ART may hold good in several respects. First, all moving-image media involve the basic techniques we surveyed, from mise-en-scene to sound. Second, the idea of analyzing a media text’s form and style still seems appropriate. Just this morning I read a review of the Netflix MANIAC that suggested that the style of director Cary Fukunaga is so striking that it’s an aesthetic appeal in its own right. Writers like Jason Mittell and Jeremy Butler have shown how these ideas can invigorate analysis of television.

Concepts of narrative and stylistic strategies seem totally applicable to comics too; I tried my hand at applying them in a few blog entries (http://www.davidbordwell.net/blog/2010/07/30/tintinopolis/ , http://www.davidbordwell.net/blog/2009/09/02/archie-types-meet-archetypes/ , http://www.davidbordwell.net/blog/2016/10/22/eep-omigosh-urk-smerp-and-other-archie-epithets/) and I hope to do more. The newish book HOW TO READ NANCY does this sort of analysis at great length, and of course Scott McCloud’s books have a lot in common with FILM ART.

Third, I think FILM ART’s application of Wölfflin’s idea that “not everything is possible at all times” could nudge media scholars into considering the historical norms at play in various periods and places. Finally, in our most recent editions, we observed that most of our readers would be image-makers. This is a fairly new development in film history. Kids are growing up shooting photos and films on cellphones, editing on computer, posting them for a public. So as teachers we introduced the angle that students of cinema should try to think like a filmmaker. Our book tries to suggest that the creative response to a choice situation isn’t just what Big Filmmakers Out There are doing, but rather something that the readers would confront every time they use a camera. That’s media literacy too, I suppose: reminding readers that they too have the power to make images and tell stories. To do that effectively involves knowing what the creative options are and thinking about the alternative effects that they can generate. I think this is in harmony with what Tessa and you are up to, yes?

David Bordwell is an American film theorist and film historian. Since receiving his PhD from the University of Iowa in 1974, he has written more than fifteen volumes on the subject of cinema including Narration in the Fiction Film (1985), Ozu and the Poetics of Cinema (1988), Making Meaning (1989), and On the History of Film Style (1997). His most recent works are The Rhapsodes: How 1940s Critics Changed American Film Culture (2016) and Reinventing Hollywood: How 1940s Filmmakers Changed Movie Storytelling (2017).

With his wife Kristin Thompson, Bordwell wrote the introductory textbooks Film Art (1979) and Film History (1994). With aesthetic philosopher Noël Carroll, Bordwell edited the anthology Post-Theory: Reconstructing Film Studies (1996), a polemic on the state of contemporary film theory. His largest work to date remains The Classical Hollywood Cinema: Film Style and Mode of Production to 1960 (1985), written in collaboration with Thompson and Janet Staiger.

Bordwell spent nearly the entirety of his career as a professor of film at the University of Wisconsin–Madison, where he is currently the Jacques Ledoux Professor of Film Studies, Emeritus in the Department of Communication Arts. He and Thompson maintain the blog "Observations on film art" for their recent ruminations on cinema.




Rediscovering 1940s American Film Culture: An Interview With David Bordwell (Part Three)

It will not be a great surprise that I was especially interested in the links you draw here between film evolution and what was happening in other media during this same period -- particularly literature and radio drama, but also theater. What accounts for these parallel developments across media?  This is not simply cinema absorbing influences from the other arts but also the other arts catching up with cinematic devices and practices. What models might you offer us for thinking about the logics shaping exchanges of practices across media? How might we apply such models to think about the relations between games, film, comics and television at the current moment?



I don’t think there’s a single broad explanation for what I call the “media swap meet” that grew intense in the 1940s. There were close institutional/economic ties among film, radio, theatre, and publishing, so that properties and schemas could pass pretty quickly across platforms. Writers went to Hollywood and sold book rights as well; I discovered a real treasure trove in a weekly column in PUBLISHERS’ WEEKLY devoted to sales to studios, as well as studio competitions for new novels. Magazines, which we tend to overlook, weren’t just part of book publishing but also furnished many stories and writers to Hollywood. Many film people did moonlighting jobs in radio, which was in a way what TV became—a vast torrent of narrative material drawn from all manner of sources. LUX RADIO THEATRE featured stories drawn from films and was even hosted by DeMille. I thought of your transmedia storytelling idea when I learned that SORRY, WRONG NUMBER became an annual event (starring Wisconsin’s own Agnes Moorehead); people huddled around their radios to hear it again and again, which in turn posed problems when a feature film had to be made from it. (So it had to be padded out with a plotline involving a young actor named Burt Lancaster.) And of course Hollywood invested in Broadway plays so as to get the film rights. Interestingly, the influence went both ways: I point to novels obviously influenced by Hollywood, and plays (GLASS MENAGERIE, DEATH OF A SALESMAN) openly modeled on film techniques.

The give-and-take is not so different from the system now, I think. Conglomerates openly own various entertainment venues, but there’s still a lot of prowling and snapping-up of free-standing IP. I don’t know of general models, but I think that heuristically we need to trace out the local, fine-grained relations among media creators, so that we might be able to build models of “creative networks” among these media artists.

You use the term “middlebrow modernism” to describe some of the experimentation taking place across popular culture during this period. The word “middlebrow” originally carried some degree of disdain or distaste. Does it do so for you? How might we relate this “middlebrow modernism” to the kinds of experiments in the low or popular arts in the following decade which J. Hoberman called “vulgar modernism”? Are we watching the modernist impulse work its way down the cultural hierarchy as its influence on the culture is more fully absorbed?


I didn’t mean “middlebrow” to be taken as disdainful, and one of the luckier consequences of the reviews REINVENTING has gotten is that readers don’t seem to have taken it that way. To me, there is important and valuable art that many consider middlebrow—OUR TOWN, THE BEST YEARS OF OUR LIVES, etc. Arguably, most of Hollywood’s prestige output is middlebrow. My chief claim was a neutral one: that narrative techniques from turn-of-the-century writers like James and Conrad, amped up by High Modernists like Woolf and Faulkner, were visible on the cultural horizon of ambitious American and English writers. But those writers also realized that High Modernism was difficult, so they set about making those techniques user-friendly. My prototypes are people like Thornton Wilder, Rumer Godden, Maxwell Anderson, etc. Indeed, even Welles and Hitchcock could be considered middlebrow. I trace some of the 1940s innovations to this vein of literary culture.

At the same time, mystery fiction was changing and becoming more formally complex, and those works fed into Hollywood’s increasingly dense narrative experiments. In general, both “literary fiction” that makes High Modernism more user-friendly and “popular fiction” that mixes those elements with the inheritance of older conventions (e.g., the C19 novel) seem to me primary sources for narrative strategies we find in the 40s. Actually, I’m digging into this area more right now, and trying to compare it with the present—particularly the recent cycle of female thrillers centering on women’s culture (e.g., GONE GIRL, THE GIRL ON THE TRAIN, and corresponding novels). 

Hoberman’s concept of “vulgar modernism” seems to me very specific to certain figures (Fuller, Chester Gould, Weegee) and depends on Brecht as a prototype of modernism. I’d locate Fuller and most comics in a more purely popular tradition of eccentric storytelling.

About video games, I know nothing. I do find it interesting that films like HARDCORE HENRY and WRECK-IT RALPH (excellent movie) derive some of their technique from videogames; but then the first-person camera is an old cinematic device, so I suppose first-person video games are indebted to that.

Your book, The Rhapsodes, works in parallel to Reinventing Hollywood to describe shifts in the critical language around film during this period. You discuss an exceptional group of critics -- Otis Ferguson, James Agee, Manny Farber, and Parker Tyler. Were these critics lucky to have such a rich and innovative set of films to write about? Was Hollywood lucky to have such intelligent and innovative critics to help sort through the experiments which were taking place during this key period? In what ways did shifts in critical practice impact film culture more generally during this period?

THE RHAPSODES was a chip from the workbench. In starting my research on the 40s I read the critics you mention, and I wanted to use them as a way of registering the innovations I tried to track. But interestingly, I found that they didn’t have much to say about them. They weren’t especially attuned to the new conventions of the period, which suggests that general audiences may not have registered them much either. This might be a good example of a historian discovering novelty that the audience wasn’t particularly aware of—that is, that the novelty appears as such only in a historical perspective.

I’ve said at various points that for me ideal film criticism includes not only opinions but information and ideas, all of the above to be delivered in engaging prose. For me, my four critics accomplished this, and the book tries to make that case.

Apart from their remarkable writing skills, what struck me about my quartet was their willingness to take Hollywood seriously as an artistic endeavor, a popular form that shouldn’t be judged by the standards of high art. They seemed to me to be forging, in different ways, a perspective on Hollywood that showed its peculiar artistic value. That meant paying attention to detail, noticing technique, trying to see films as expressive vehicles (and not the reflection of a cultural zeitgeist). In short, and given the limits of their resources (no access to prints, let alone video), they were analyzing and interpreting films to a depth not previously seen in American film criticism.

I think they mostly had no influence on the industry, but they did establish a tradition of the film critic as a literary figure. Agee was the most prominent example, but by the 1960s, when US film culture was ready, they were prototypes of the “celebrity critic” (Kael, Sarris, John Simon). They never had the power of that later generation, but for me they formed the start of a powerful tradition that persists in strong, knowledgeable writers such as Imogen Sara Smith, Manohla Dargis, Michael Phillips, Geoffrey O’Brien, Matt Zoller Seitz, Peter Debruge, Todd McCarthy, and Phillip Lopate. But I wanted to introduce readers to these extraordinary writers and their ideas about the films of their period.




How Do You Like It So Far? Podcast: Wu Ming 1 and Benjamen Walker on Conspiracy Theories

This week we talked about conspiracy theories with Wu Ming 1, of the collective Wu Ming, whose books inspired one of the main conspiracy theorists on the internet, and Benjamen Walker, whose podcast often focuses on conspiracy theories. We cover: the art of blurring fact, fiction, and non-fiction; discrediting gatekeepers; whether we can ever really debunk; the role of satire; the hunger for complexity; Pizzagate; the “deep state”; QAnon; and, of course, President Trump.

Benjamen Walker tackles just these sorts of trends, many of which trace their current toxicity back to 9/11, on his podcast “Theory of Everything.” In a recent episode he delves into what happened when the truthers were gone and how truthers merged into “hoaxers.” He observes that with Sandy Hook, these hoaxes took a “darker form.” He is a bit pessimistic: “Looking for a way forward… I haven’t found it yet.”

Wu Ming is a pseudonym for a group of Italian authors formed in 2000 from a subset of the Luther Blissett community in Bologna. Before coming together, four members of the group wrote the novel Q in 1999. On 28 October 2017, references to Q emerged on the message board 4chan. In a thread called “Calm Before the Storm,” Q was transformed into a government insider with top security clearance who knew the truth about a secret struggle for power involving Donald Trump, the “deep state,” pedophile rings, Robert Mueller, and the Clintons.

The poetry of debunking

Reflecting on Q, the transformation and viral spread of something that clearly originated as a work of fiction leads us to ask: are we at a point where we cannot debunk anymore? We move from “don’t believe what you read, believe me” to “don’t believe what you see, believe me only.”

Conspiracy theories work precisely because they discredit the authority trying to debunk the theory, and authority writ large is exactly what the hoaxers are rejecting. So how do you get around this? Wu Ming suggests that a game-like way of debunking could ultimately compete with the interestingness of the actual theory.

Wu Ming 1 also shared his thoughts on the art of weaving fiction and non-fiction.

“Ordinary debunking doesn’t work. Because even if you debunk, believers keep believing them…. Conspiracists provide people with something they need. There is always a kernel of truth, hidden inside a conspiracy theory, because otherwise it wouldn’t work… when we debunk a conspiracy theory, we should be aware of that kernel of truth.”

Wu Ming proposes that one way to combat this trend is “showing the stitches” — meaning that white hats should open up about the amount of work required to create works of fiction like Q (similar to showing how a magic trick is done). What we need, he argues, is a “poetry of debunking” that makes the truth more interesting than the conspiracy theory itself.

Please join us to hear this and more in what was a very interesting episode, and check out the links below for more content.



Rediscovering 1940s American Film Culture: An Interview with David Bordwell (Part Two)

As you know, I have always been interested in the concept of the “bounds of difference” (from Classical Hollywood Cinema), which raises the question of how much elasticity there is within a system of norms and whether there are periods or genres that stretch against those bounds. For me, my original interest was in the ways Hollywood absorbs performance practices from vaudeville during the early sound era, but we could see your recent work on the 1940s as potentially representing a similar moment in American film history, where there is a high degree of experimentation and innovation (a period of “reinvention”). (It was fun to read you writing here about Hellzapoppin and Crazy House, by the way.) So, building on the quote above, what factors opened up those “new possibilities”? Do some of these experiments prove too much for the studio system? Does a new stability eventually emerge, or do we see the Hollywood system as always a bit unpredictable and uncontrollable?

Yes, this was a period of innovation not unlike the late 1920s-early 1930s, as your research shows. But there the innovation centered on technology, camera technique, performance, and genre, and these are important trends throughout the 1930s. I think filmmakers worked very hard on developing sound mixing, a fluent style (emphasizing camera movement), new genres (the gangster film, the musical), and performance styles for the sound cinema. (One of my favorite critics, Otis Ferguson, was very sensitive to some of these changes.) But the 1930s also saw a shift away from the narrative fluidity that had become canonical in silent film—the use of crosscutting, the willingness to employ subjective techniques, a freedom of time thanks to flashbacks. 

To put it too grossly, 1930s narration was “behavioral” and “theatrical” to a greater degree than earlier; we have to figure out characters' minds and hearts from externals, as in a play. (Here again, performance matters a lot.) Again to be heavy-handed, in the late 1930s and the 1940s, we could say, Hollywood became somewhat more “novelistic”—willing to probe inner states, to shift time scales, etc. As Imogen Sara Smith pointed out to me in a FILM COMMENT podcast, this goes along with a more interiorized performance style (Mitchum, Lancaster, Widmark, even Crawford and Davis). The narration is giving us the psychology, so the actor can be more impassive.


But to get to your point about the boundaries: I think the boundaries are flexible. We don’t know how far we can go until someone tries. Who would have predicted the elaborate formal contraption that Sturges gives us in UNFAITHFULLY YOURS? Or the psychological intricacies of DAISY KENYON and SWELL GUY? Today, who would have thought we could have such an elaborate time machine as DUNKIRK? I do think that genre helps keep experimentation within bounds; but then again genre encourages experiment, exactly because we know the norms.

Daniel Mainwaring claimed that he wanted OUT OF THE PAST to be narrated by the deaf-mute boy at the beginning, but that was ruled out as too farfetched. Would it be today? And the peculiarities of THE CHASE, which I talk about in both the book and a series of blog posts, seem to have been taken in stride by both critics and audiences. It’s not that anything goes, but we don’t know what doesn’t until somebody tries.




I could not help but read Reinventing Hollywood in relation to the ongoing debates about the status of film noir, which is often treated as a particular genre, style or mode, operating on the fringe of American film practice. But, your book suggests that many of the narrative innovations, such as flashbacks, experiments with subjective camera, nonlinear stories, etc., associated with film noir are actually visible across a range of different genres -- melodrama or romantic comedy, say -- during this same period. So, to put it bluntly, how have people missed this? More generously, how might insights from your book force us to reconsider some of the claims that have been made about film noir?

I start from a historicist position on film noir: that is, I see it as a category invented by later critics to illuminate a range of films that have some common features. It wasn’t a term for Hollywood filmmakers of the period, and so they categorized films quite differently. In the book, I point out that what we’d call thrillers, as well as some detective stories, were lumped in with horror films. 

We can’t enter the historical agents’ minds, but we can get a sense of the norms they seem to hold. So, yes, many of the techniques I study were quite general across a range of genres. I don’t know why researchers haven’t emphasized this enough, but maybe because the power of the idea of film noir (and the glamor of it, I admit) steered people away from noting the strategies elsewhere.


Genre becomes increasingly important as we move deeper into the book and you discuss how the various “narrative schemas” you identify operate in relation to such tendencies in 1940s cinema as the pseudodocumentary procedural, the fantasy film, the psychodrama, the self-reflexive comedy or the murder mystery. How might we think about the relations between narrative experimentation and the emergence of these genres? Do the genres motivate the formal experimentation? Do these genres emerge as filmmakers seek ways to motivate the devices you have identified?

You raise a fascinating point. Genre is crucial to both narrative norms and narrative innovations. In several cases, I tried to show how genres in other media shaped filmmaking; the most complete example is the rise of the literary and theatrical thriller. As you say, the process goes both ways: existing genres offer opportunities to try out storytelling techniques. This happens with “unreliable” narration in the thriller, for example—something that is rare in other genres. Once the family saga was established with FOUR DAUGHTERS, HOW GREEN WAS MY VALLEY, etc., the possibility of downgrading the individual protagonist was there to be exploited further in war pictures. And comedy, as you know better than anyone, offers a huge range of options for playing with structure and style.


At the same time, I do think that the emergence of certain strategies favored the development of genres that could motivate them. Whatever the cultural appeal of Freudian subjects and themes was at the time, I don’t think the “psychoanalytical” would have appeared quite so strongly without the new armory of subjective techniques. I think the dynamic you point to is especially evident today with technology. The development of analog, then digital special effects from the 1970s onward surely stimulated the development of horror, fantasy, and SF films. They motivate the use of such techniques in a way that wouldn’t be as vivid in other genres.

You title your introduction “How Hollywood Told It,” which invites comparison to your book The Way Hollywood Tells It. What parallels are you drawing, implicitly and explicitly, between contemporary Hollywood storytelling and the kinds of innovations you discuss during the 1940s? How is your interest in new narrative and narrational forms in the 1940s linked to your interest, on your blog and elsewhere, in contemporary “puzzle films”? Does taking this larger historical perspective offer us any insights into the space for innovation in contemporary films?

At the very end of REINVENTING, I floated the idea that the much-vaunted “New Hollywood” of the 1970s emerged out of conditions similar to those that nurtured the 1940s innovations I tried to chart. The industry was regaining health after a period of deprivation, some blockbusters had put money into the system, a new generation of filmmakers emerged to take advantage of opportunities, and some ambitious filmmakers tried to make formal innovations. It’s simply a parallel, but it does suggest that there were periods of intense renewal in Hollywood that we haven’t taken sufficient measure of. 

The more proximate period, and the reason I evoked THE WAY HOLLYWOOD TELLS IT, was the post-1960s era, when many narrative innovations emerged. They emerged most intensely, I think, in the 1990s-2000s, and a lot of those involved revising the schemas at work in the 40s. The network narrative, from Altman and others in the 1970s, got further elaborated, and the play with time and subjectivity we saw in, say, PETULIA in the 1960s or THE CONVERSATION in the 1970s became much more generalized during the later decades. The saying became “Form is the new content,” and films like PULP FICTION, MEMENTO, and MAGNOLIA seemed to me ambitious reworkings of the tendencies that had emerged in the 1940s. I floated that tentatively in THE WAY, but returning to the 40s—initially under the aegis of a series of lectures I gave for the Flemish Summer Film College in Belgium in 2011—allowed me to develop my hunch in detail.

David Bordwell is an American film theorist and film historian. Since receiving his PhD from the University of Iowa in 1974, he has written more than fifteen volumes on the subject of cinema including Narration in the Fiction Film (1985), Ozu and the Poetics of Cinema (1988), Making Meaning (1989), and On the History of Film Style (1997). His most recent works are The Rhapsodes: How 1940s Critics Changed American Film Culture (2016) and Reinventing Hollywood: How 1940s Filmmakers Changed Movie Storytelling (2017).

With his wife Kristin Thompson, Bordwell wrote the introductory textbooks Film Art (1979) and Film History (1994). With aesthetic philosopher Noël Carroll, Bordwell edited the anthology Post-Theory: Reconstructing Film Studies (1996), a polemic on the state of contemporary film theory. His largest work to date remains The Classical Hollywood Cinema: Film Style and Mode of Production to 1960 (1985), written in collaboration with Thompson and Janet Staiger.

Bordwell spent nearly the entirety of his career as a professor of film at the University of Wisconsin–Madison, where he is currently the Jacques Ledoux Professor of Film Studies, Emeritus in the Department of Communication Arts. He and Thompson maintain the blog "Observations on film art" for their recent ruminations on cinema.

Rediscovering 1940s American Film Culture: An Interview with David Bordwell (Part One)

David Bordwell has been a hyper-productive film scholar since his early 20s and now, more than a decade into his retirement, he is still running strong. He is blogging, updating his old books, writing new ones, and jetting off to film festivals around the world. In the past few years, he has published two new books — The Rhapsodes: How 1940s Critics Changed American Film Culture (2016) and Reinventing Hollywood: How 1940s Filmmakers Changed Movie Storytelling (2017) — which give us fresh takes on American film culture in the 1940s, a period that seems all the more innovative and transformative through his characteristically close analysis.

I was lucky enough to have Bordwell as my dissertation advisor in the late 1980s at the peak of the so-called “Wisconsin” project. He was a breathtaking presence in the classroom — we routinely stayed an hour or more after class until he felt his lecture was complete — and he was generous as a mentor, making sure each student found their own voice even if, or especially if, they disagreed with his premises. It is hard to imagine writing my first book, What Made Pistachio Nuts?: Early Sound Comedy and the Vaudeville Aesthetic, without his influence, and I draw on things he taught me regularly even if my work has taken me far from cinema studies in recent years.

I am proud to be able to share with you a bit of our lifelong conversation together. Here, he situates his new books on the 1940s in relation to concerns which run throughout his career. His responses are, as always, substantive and probing, showing the continued evolution of his thinking on some core issues.

If I look across your body of work, there are books dealing with exceptional filmmakers (Ozu, Eisenstein, Dreyer, not to mention recent writing about Wes Anderson and Christopher Nolan) as well as books which adopt a more normative approach looking at samples of typical or average films (such as The Classical Hollywood Cinema, The Way Hollywood Tells It, and Reinventing Hollywood). What do you see as the relationship between these two approaches? How do they fit together in your conception of film studies as a field?



I should say at the start that I try to proceed from questions that intrigue me and that seem to me to remain unanswered (or not satisfactorily answered). The questions tend to come within three broad areas: (1) the history and creative resources of film forms (especially narrative); (2) the history and creative resources of film techniques (i.e., style); and (3) the principles governing activities of spectators who respond to films. As you know, I approach all those within a framework I’ve called a poetics of cinema—the probing of principles that filmmakers develop and that viewers learn to apprehend.

So the filmmakers you mention are those who seem to me to occupy niches in those areas. To take the recent examples you mention: Nolan seems to me to have developed a distinct “formal project” in his handling of narrative—essentially testing how crosscutting can create different temporal zones—while Anderson works both at the level of narrative and at the level of a distinct pictorial style. But because I’m interested in principles of narrative and style, I see those as shared and spread through a community of creators, so that norms are created that more or less shape what’s possible (or discouraged, or encouraged) in different contexts. The norms, I’ve stressed from the beginning, aren’t single mandated rules but rather a range of more or less permitted options.

 In the books on Hollywood, I’ve tried to spell out the principles shaping form and style within that powerful community. The most recent book on the 1940s goes the farthest, I suppose, in trying to construct the “menu” from which filmmakers work. But the innovative filmmakers expand the menu by showing possibilities in the norms that others haven’t realized. Sometimes those possibilities themselves become normative, as, say, complex flashback construction became normative in the 40s. The same sort of process, I think, went on in Hong Kong cinema from the 1980s through the 2000s.

As for Ozu, Dreyer, and Eisenstein: In all those cases, I tried to show how the individual filmmaker worked both within and against emerging norms of form and style in their most proximate context. For Ozu, the context was Japanese studio cinema; for Eisenstein, the emerging Soviet avant-garde; for Dreyer, the “language” of international European cinema (though from my perspective today, I think I missed many chances to relate him to important trends—I just didn’t know enough!).

Auteur filmmakers such as Hitchcock, Capra, Wyler, Welles, or Sturges do make appearances in Reinventing Hollywood but often to show how their practices were in conversation with those of less well remembered films and filmmakers of the same period. You write, “To a greater extent than their contemporaries, they carved out new formal options. But their very originality created problems of competition. Once the new schema are out there, anyone could imagine telling a story through multiple flashbacks, embedding a film within a film, restricting our knowledge to a single character, or ringing changes on thriller premises. To stay prominent, Welles and Hitchcock had to outrun their imitators and themselves.” One of your very first widely read essays dealt with Citizen Kane. What does this more robust map of the cycle of innovation during the 1940s help us to see within this film that you would not have seen before?



Because of its length, REINVENTING HOLLYWOOD allowed me to deal with changes within norms to a degree I couldn’t before. Both THE CLASSICAL HOLLYWOOD CINEMA and THE WAY HOLLYWOOD TELLS IT build up a general picture of norms of storytelling and style. They do address some local changes—e.g., early sound shooting style and deep-focus cinematography in CHC, and the emergence of “network narratives” and “worldmaking” in THE WAY. But the 1940s book let me dig more into the dynamic of how narrative strategies develop in a short time span.

Once I conceived Hollywood as a community built on “cooperative competition,” I was able to sense the extent to which a film like KANE did two things: It assimilated several storytelling strategies that had emerged in films and other media; and it provided a template for further revision, by Welles and by others. Again, it came down to different questions. That very early essay on KANE, and the better analysis I wrote for the editions of FILM ART: AN INTRODUCTION, were concerned with functional explanations—providing an analysis of how the film worked. The 40s book was more concerned with causal explanation, asking what narrative schemas were available to be synthesized by Welles and his collaborators, and how those in turn became available to others. 

I came to appreciate the notion that filmmakers were making films not only for audiences but for other filmmakers, as part of a give-and-take of influence and, perhaps, rivalry. Certainly I think that the great number of “Hitchcockian” thrillers that followed Hitchcock’s emigration to the States shaped a sense of competition in him: he had to outrun his imitators. He did this with some very outré projects, like LIFEBOAT, SPELLBOUND, ROPE, and UNDER CAPRICORN, but he also managed to perfect the “Hitchcock touch” in NOTORIOUS. Filmmakers, I’m convinced, can be quite aware of the pressure to innovate, especially when they become famous.

You write: “The collective nature of the Hollywood enterprise yielded remarkable achievements, and the results were never perfectly controllable or predictable. When collective effort was blended with individual abilities and fresh opportunities, new forms -- not formulas -- could emerge, expand, and mingle. We’re confronted with two levels of artistry: tried-and-true conventions executed with more or less skill, and innovations that open up new possibilities.”  So, what are some of the “new forms” and “new possibilities” that emerge during this period?

Broadly, the 1940s sees a crystallization of several narrative options (as well as stylistic ones, which I try to deal with in other work). There’s the flashback narrative, in all its myriad forms; the multiple-protagonist film (probably seen best in the combat picture); the  “psychological” film (e.g., THE LOST WEEKEND, THE SNAKE PIT); the social-comment film; the “new realist” film (e.g., INTRUDER IN THE DUST); the film relying on subjective imagery and voice-over; and the self-consciously stylized film, which acknowledges its ties to or breaks with earlier film history (e.g., HELLZAPOPPIN, THE PERILS OF PAULINE). I also place a lot of emphasis on the emergence of the psychological thriller, either based on the man-on-the-run or the woman-in-peril; in the 1940s, the thriller became central to mainstream cinema, as it remains today. None of these options was absolutely new at the period, but in the 1940s they coalesced and developed in new variants very rapidly.

In a sense, I tried to do for narrative strategies what genre critics have long done. A critic studying the musical or the Western or whatever casts a wide net, looking for basic conventions and less-common innovations that are taken up, or not. I tried to do the same for narrative devices. For example, in 1940-1941 every studio makes at least one prestigious picture based on flashbacks. This was an uncommon option in the 1930s. By the end of the 1940s, flashback films are a mainstay of Hollywood storytelling, and some films—e.g., BACKFIRE—have flashbacks of an intricacy that no one in 1941 would have attempted. This is the sort of “expansion and mingling” that I tried to capture.



Popular Religion and Participatory Culture Conversations (Final Round): Sarah Banet-Weiser and Hannah Scheidt (Part 2)

Hannah: Thanks for sharing this introduction to your current project. You define and develop key concepts for understanding the current sociopolitical climate and its media stage. One concept or theme that intersects with my own work (and that religious studies can offer some insight into) is that of “narratives of injury.” You identify a narrative of injury at work in the contemporary white nationalist movement, and an accompanying narrative of restoration or redemption of sorts, though this side of the story seems perhaps less visible (let me know if you disagree). 

This basic story line, which involves a threat, a victim, and the violation or disruption of the status quo, is familiar to me as a narrative of persecution. Scholars of religion have explored the operation of these types of narratives in countless religious communities, from ancient Judaism and early Christianity to New Religious Movements. 

Specifically, though, because we are talking about the modern American context, I thought immediately of Christian Smith et al.’s sociological study (now two decades old) on American evangelicalism: American Evangelicalism: Embattled and Thriving. Smith argued, in essence, that evangelicals are not only “embattled” and “thriving” but are thriving because embattled. His notion of “subcultural identity theory” suggests that groups take advantage of “embattled” status in their projects of identity formation. Identity construction is a process of drawing symbolic boundaries; categories of differentiation and comparison vis-à-vis outgroups aid in the project of collective identity formation and promote solidarity.

“Subcultural identity theory” is obviously portable, and some have applied it (rightly, I think) to New Atheism, within which a narrative of opposition and persecution exists (hence the need to “come out” as atheist) (Cimino & Smith 2011, LeDrew 2015). What I find interesting in contemporary atheism is that this narrative of an embattled minority exists in tension with other formative narratives: that there are actually more atheists than polls and statistics account for, for example, or the understanding that secularization is an inevitable process and that “reason will reign.”

It is worth noting that in both of our projects, opposing movements or cultures employ similar narratives of persecution and of injury (atheism and evangelicalism, feminism and white nationalism). I wonder if we could start to talk through a more robust accounting of how similar logics and strategies operate in competing or opposing groups. I think you could find examples of how each group imagines an alliance between the opposing ideological group and “the mainstream” (institutional powers that be, the mainstream media). This is essential, as the threat must be imagined as dominant, structural, hegemonic – not as another subcultural minority.

The media studies perspective contributes an analysis of how narratives of injury circulate, and the form they take in the “economy of visibility.” I am curious about the transmutation of politics into visibility, as you describe it, in the case of coverage of white nationalism. I do wonder if a media outlet’s focus on, for example, the fashion choices of a neo-Nazi could be read as intended not to normalize but to unnerve – a version of the handsome and charming serial killer narrative. So instead of reading “Nazis, they’re just like us,” the consumer comes away with “Nazis…they could be anywhere.”

Sarah: Hmmm, I’m not sure. I don’t think the mainstream media generally try to unnerve, even if that may well be the effect of some representations. I think that the mainstream media, and journalists in particular, are in a difficult place in the contemporary moment, in the context of misinformation, disinformation, and post-truth. I see stories such as “The Nazis next door” as part of that context, in which the mainstream media normalize in the name of “covering different perspectives”; in this sense, it is related to Trump’s description of the racist Unite the Right rally in Charlottesville as having “very fine people, on both sides.”

I really like the idea of “thriving because embattled” because I see that affect increasingly gaining currency in the contemporary context. We hear about white men in the US being victims all the time—this sentiment was on glorious, devastating display in the Brett Kavanaugh hearings. He claimed victimhood through his white masculine rage, a rage that was then bolstered by Lindsey Graham’s similar explosion, in which he vowed to “not ruin a man’s life” over accusations of sexual assault. Tragically, it is no surprise that women’s lives, and how they are ruined over and over again because of sexual violence and not being believed, were not part of Graham’s rant. We can see, in the contemporary media and cultural climate, how claiming male (especially white male) victimhood actually strengthens and supports masculine hegemonic status.

I’ve been writing about what I’m calling “feminist flashpoints”: stories that get wide immediate visibility in the media but then are quickly obscured by the circulation of yet another story, another abuse of power. In terms of white nationalism, I think that a focus on “fashy fashion,” or on the fact that Nazis buy milk just like the rest of us, is part of this obfuscation. The temporality and rapid circulation that mobilize an economy of visibility often mean constant production, not reflection. In the current media economy, the need to constantly gain new followers means that there needs to be constantly new and potentially flammable material. And rage is perhaps the most flammable material—but we need to remember that white men are not only encouraged to be full of rage; that rage gives them even more power. In contrast, the rage of women and people of color is routinely dismissed as hysteria, insanity, or ignorance.

 Sarah Banet-Weiser is Professor of Media and Communications and Head of the Department of Media and Communications at LSE.  Professor Banet-Weiser earned her PhD in Communication from the University of California, San Diego.  Her research interests include gender in the media, identity, citizenship, and cultural politics, consumer culture and popular media, race and the media, and intersectional feminism.

Hannah Scheidt recently completed her PhD in religious studies from Northwestern University. Her dissertation, a cultural study of contemporary atheism, was informed by perspectives from religious studies and media studies. The dissertation shows how "atheism" is constructed through a complex relationship with "religion" - a relationship that involves critique and contrast but also imitation and resemblance. Her other research interests include religion and science, transhumanism, and (newly) American craft and maker movements.  

Popular Religion and Participatory Culture (Final Round): Sarah Banet-Weiser and Hannah Scheidt (Part 1)

Hannah: My work explores contemporary atheist culture. My goal is to determine how “atheism” operates today as a source of identity and community – how it comes to be associated with a host of meanings, messages, and values amongst self-identified atheists beyond the simple definition of a “lack of belief in god(s).” As a scholar of religion, I am particularly interested in the ways that atheism is defined and “filled out” alongside religion, in a complex negotiation that involves appropriation as much as it does conflict and critique.

If religious studies is one “pillar” of my work, media studies is the other. New media technologies have played an integral role in the development of contemporary atheism, as others have acknowledged. The challenge comes in developing methodologies and frameworks that allow us to organize and analyze the huge stores of material – the cultural “texts” in their varied media formats – that exist across multiple platforms. My work draws from diverse sources, including atheist fan art and digital comics, popular television programming, moderated debates, and observation of grassroots Internet communities. Throughout my analyses of these varied formats and media, I pay attention to the ways atheists circulate, interpret, and remake the stories, images, characters, and “facts” that they encounter. This allows me to piece together a living narrative of contemporary atheism: a narrative that involves a variety of voices with different degrees and kinds of power that alternately cooperate, compete, and conflict.

As mentioned above, atheism’s main cultural “conversation partner” is religion, and much of my work deals with exploring just how religion is imagined in contemporary atheist discourse. What kinds of authorities, institutions, and modes of social organization do atheists see as “belonging” to religion? What kinds of identities and ways of knowing are “religious”? When should atheism oppose or contradict religion, and when should atheism mimic or borrow from religion? 

Studying atheism as participatory culture – operating in spaces supported and structured by new media and defined by the creative and critical involvement of consumer/producers – reveals that contemporary atheism has a host of other cultural conversation partners as well. As one would expect, the contours of and divisions within the atheist network are shaped by affinities with countless other contemporary subcultures and movements: gaming culture, sci-fi and fantasy fandoms, popular therapeutic culture, and the LGBTQ movement, to name a few.

One movement that has made its presence felt in contemporary atheism – and this is where my work has confluence with Dr. Banet-Weiser’s – is anti-feminism (and, relatedly, men’s rights activism). I have been aware of New Atheism’s less-than-subtle “woman problem” for years, having followed the “Elevatorgate” incident of 2011 (atheism’s “Gamergate”) as well as coverage of sexist remarks made by New Atheist leaders. I knew, from work on the fan followings of New Atheist leaders, of the reluctance among some atheists to unsettle traditional structures of power, at least when it came to white male hegemony. The tension or irony (and the reason that the existence of anti-progressivism within atheism surprises some people) is that atheist movements have historically been invested in a critique of traditional authority as part and parcel of religion.

But it was my attendance at an atheist-conference-turned-men’s-rights-rally about a year ago in Milwaukee that opened my eyes to the extent of this trend: its health, its energy, its populism. The conference featured a session on identity politics in the atheist movement. The conversation, between an anti-progressive YouTuber and a mainstream liberal podcast host, took up questions of censorship and political correctness (do campaigns to stop cyberbullying threaten free speech? what are the limits of the liberal commitment to free speech?) and social justice and equality (do contemporary feminist movements and Black Lives Matter address “real” injustices and inequalities? do progressive efforts to counter the effects of inequalities and injustice actively repress white men?).

The event felt like a YouTube comment section come to life: anarchic, offensive, agonizingly circular: a real-life “flame war.” There were obviously a number present who were confused and alarmed by the direction the conference, which to them felt only tangentially related to atheism, had taken. But a good portion of the audience had obviously come for this battle. 

Why does this affinity exist? Or, as one colleague asked when I described this experience, “What’s the throughline?” The internal logic – the way atheist critics of the “regressive left” explain it – is this: feminists, progressives, moral relativists, and other “Social Justice Warriors” (SJWs) have turned liberalism into “ideology.” Ideology is understood as inherently authoritarian and antithetical to the autonomous exercise of reason (read: religion). Identity politics = ideology = enemy of reason, out of touch with reality. “Reason” is big in atheist discourse, and many (even those who don’t follow the train of logic all the way to anti-feminism) appeal to reason as a means of freeing people from the corrupt influences of institution and authority.

A full explanation of the existence of anti-feminism within atheism also involves an account of how the social norms of Internet spaces have shaped the development of the culture. Work in cultural studies and media studies shows how aggression, harassment, and antagonism became the “social norms” in early Internet spaces, and how these norms persist in some Internet communities. It is some of these same white-male-dominated spaces that hosted communities and conversations dedicated to atheism, and from which the stereotype of the “Internet atheist” (an aggressive and socially-inept white male who trolls the Internet looking for religious people to debate and ridicule) emerged. Of course, now in the age of Trump, these social norms seem to have crept off the Internet.

Sarah: Thank you for inviting me to be part of this conversation – it is really great to engage with Hannah and to get to know her work. Like many others in this series, I am not a scholar of religion—my recent work focuses on popular feminism and popular misogyny. In particular, I analyze contemporary forms of gendered power through an analytic of visibility – what I call an economy of visibility.

As I’ve argued, feminist media studies scholars, critical race theorists, and cultural studies scholars have long been invested in studying the politics of visibility. The politics of visibility has thus long been important, and continues to be, for the marginalized. To demand visibility is to demand to be seen, to matter, to recognize oneself in dominant culture. The insistence of marginalized and disenfranchised communities – women, racial minorities, non-heteronormative communities, the working class – on being seen has been crucial to an understanding and an expansion of rights for these communities.

Politics of visibility are clearly still important, and have real consequences. But alongside the politics of visibility, we are witnessing the ways economies of visibility increasingly structure not just our mediascapes, but also our cultural and economic practices and daily lives. Within this context, visibility often becomes the end, rather than a means to an end. The visibility of these categories is what matters, rather than the structural ground on and through which they are constructed. 

For me, what is really important about this shift to economies of visibility is that it reshapes what political struggle looks like. And this gets us to an interesting intersection between Hannah’s and my work, or between media scholars and religious scholars more generally. Here, I suppose I’m thinking more of a religiosity than of religion itself.

The demand for a visibility politics competes with an economization of visibility, resulting in quite different goals and consequences. This kind of visibility is in line with what Nabil Echchaibi calls hypermediation: the ubiquity and interactivity of emergent and residual media circulations. This hypermediation then authorizes and enables a visibility of religious practices, as Hannah points out in her work on atheism.  

Part of what I noticed when researching online misogyny is the increasingly normative visibility of white nationalism, where media representations of white nationalism often are framed as a hybridized kind of public secular religiosity. 

How does this work? I think that economies of visibility connect politics, religion, and visibility in specific ways. The messages that circulate easily within this economy are those that require little labor to be seen and understood; they rely on familiar narratives, ones that are easy to encapsulate in an image, a slogan, a product.

One of these familiar narratives that has found traction within an economy of visibility in the contemporary moment is that of injury and capacity. In my book, I think about this in terms of popular feminism and popular misogyny, and how these discourses and practices tap into a neoliberal notion of individual capacity (for work, for confidence, for economic success), but both also position individual injury as a key obstacle to realizing this capacity. For women, the injury is found, among other places, in centuries of sexism, misogyny, and gendered violence.

For white men, however, the injury in the contemporary moment is one of displacement; white men feel displaced by women in general and feminists in particular, and in the US, by immigrants and people of color. We’ve seen this in all realms of culture: in the technology industries (as Hannah mentioned, perhaps most visibly in GamerGate) and in online communities, in the violent responses to the apparent threat posed by women and people of color simply because they exist; and in politics, with Trump leading the pack in his relentless ranting about how white men are disadvantaged.

I see these tropes of injury and capacity as the crux of white nationalism as it circulates within this media economy. In this moment of hypermediation, white nationalists distribute their racist and nativist message with religious fervor, as a recuperative mission, a pursuit to restore whiteness and patriarchy, to repair injuries caused by women, people of color, and immigrants, and to return capacity to white men. An economy of visibility creates an environment where white rage is mobilized by using media outlets to emphasize hopelessness and fear. Indeed, hypermediation stokes this rage, and the dynamic of visibility validates a reactionary response to the perceived displacement of white men, which then manifests as structural violence.

That is, hypermediation and an economy of visibility are validated and amplified in a particular political economic context. As Wendy Brown has argued, the retraction of social services, and the return to a kind of statism and nativism that we are witnessing now in the West, mean for many that a “need” for a strong authority is produced – to secure order, boundaries, and borders, as well as to reclaim and restore a way of life for a declining white middle and working class, since contemporary life has apparently been destroyed by immigrants, people of color, feminists, terrorists, and refugees (Brown, 2018). An economy of visibility provides the platform for the circulation of this “need” through images, misinformation, lies, and obfuscation.

Here, white nationalism transmutes into a sort of distorted understanding of what Stewart Hoover calls the guiding principles of Christian masculinity: “provision, protection, and purpose.” Religious values get co-opted and distorted within this economy. As Hoover has argued, Christian men define themselves by their ability to provide for and protect their families and by identifying their purpose as head of the household and as breadwinner (to be clear, these principles are also based in patriarchy). White nationalists also claim to be guided by provision, protection, and purpose, but only because they believe these principles have been lost. They can no longer provide because immigrants and women have taken their jobs. They protect white masculinity, often through violence. And their purpose is to destroy all “others” – in particular women and people of color.

These principles become legible, then, only as a threat, one posed by people of color, feminists, and immigrants; the threat becomes the religion. Provision, protection, and purpose are re-routed and severed from spiritual grounds, becoming instead only about the supposed decline of white people, and of white masculinity in particular.

Aided by the economization of visibility, where visibility becomes an end in itself, the masculine Christian principles of provision, protection, and purpose are transmuted into violence: attacks waged within the logic of the conviction that the white race is perpetually in peril, threatened by racial integration, by “political correctness,” by multiculturalism, by all others.

This economy of visibility also works to normalize white nationalism. For example, the style politics of celebrity Nazis like Richard Spencer, or alt-right spokespeople like Milo Yiannopoulos, have been profiled alongside some of the ideologies they espouse, with their clothes and dandy style effectively both normalizing them and distracting us from their anti-woman, anti-immigrant, racist, and nativist politics. The visibility of these celebrity hatemongers is what matters, rather than their politics and the way they incite violence. For example, Mother Jones wrote about Richard Spencer:


“An articulate and well-dressed former football player with prom-king good looks and a “fashy” (as in fascism) haircut—long on top, buzzed on the sides—Spencer has managed to seize on an extraordinary presidential election to give overt racism a new veneer of radical chic.” 

GQ has commented on Milo: “And while much of the media focus was on Yiannopoulos's behavior, his clothes were worth noting, too. On a weekend when debating what it means to respect the American flag became a national pastime as big as, well, watching football, Yiannopoulos showed up on campus in an American flag-print hoodie from the streetwear brand Supreme.”

Claiming that public figures give “overt racism a new veneer of radical chic,” or that Milo’s “clothes are worth noting, too,” is part of the dynamic of an economy of visibility, where visualizing every experience, rather than interrogating the grounds of that experience, is what circulates, accumulating likes, clicks, and followers.

The consequence of this is both to distract from the structural grounds of racism and misogyny and to normalize them.

For example, as many people have noted recently, mainstream media outlets such as the New York Times have routinely profiled Nazis in a kind of Us Magazine style of “Nazis! They’re Just Like Us!”, portraying them and their lives as those of “normal” US citizens who just happen to hate women, people of color, immigrants, and non-heteronormative people. As the New York Times framed it in its profile of white nationalist Tony Hovater:

“He is the Nazi sympathizer next door, polite and low-key at a time the old boundaries of accepted political activity can seem alarmingly in flux. Most Americans would be disgusted and baffled by his casually approving remarks about Hitler, disdain for democracy and belief that the races are better off separate. But his tattoos are innocuous pop-culture references: a slice of cherry pie adorns one arm, a homage to the TV show “Twin Peaks.” He says he prefers to spread the gospel of white nationalism with satire. He is a big “Seinfeld” fan.”

Profiling white nationalists in this way transmutes the political logic of what it means to be racist, a political subjectivity invested in shoring up gender and race inequities, into what a Nazi looks like: his visual representation.

The clothes and the bodily style are the politics; the politics are contained within the visibility. This works effectively to defang the violence of these politics, to transform them into “radical chic.” The identification, and announcement, of one’s visibility is both the radical move and the end in itself (Gray, 2013). Economies of visibility do not describe a political process, but rather assume that visibility itself has been absorbed into the economy; indeed, that absorption is the political.

The economy of visibility within which these logics circulate often dresses them up in a cool outfit or an ironic tattoo as a way to distract publics, shifting attention to the “chic” rather than the racist. But we need to remember that the common sense of this moment is not consensus (achieved through normalization) but rage and violence. Within this context, white nationalists, Nazis, and Klansmen can gather in a “free speech rally” as “freedom fighters,” and fascism is understood as authenticity.


Sarah Banet-Weiser is Professor of Media and Communications and Head of the Department of Media and Communications at LSE.  Professor Banet-Weiser earned her PhD in Communication from the University of California, San Diego.  Her research interests include gender in the media, identity, citizenship, and cultural politics, consumer culture and popular media, race and the media, and intersectional feminism.

Hannah Scheidt recently completed her PhD in religious studies from Northwestern University. Her dissertation, a cultural study of contemporary atheism, was informed by perspectives from religious studies and media studies. The dissertation shows how "atheism" is constructed through a complex relationship with "religion" - a relationship that involves critique and contrast but also imitation and resemblance. Her other research interests include religion and science, transhumanism, and (newly) American craft and maker movements. 

How Do You Like It So Far Podcast: Rohan Joshi, Captain America and News Comedy in India

This week Henry talked to Rohan Joshi, from the comedy group All India Bakchod, who walks us through how to use comedy to confront social issues, particularly in the Indian context. Joshi recently spoke at the Indian Culture Lab, highlighting insights Indian audiences could learn from Captain America.

How do we still have a Captain America?... How has he avoided the potential campiness of it? He is always asking, ‘What does it mean to be a patriot?’... He is a symbol of a soldier, propaganda, etc. But he never allows himself to just be that mindless drone. He is essentially driven by one question: ‘How do I use this for public good?’... That’s what keeps him away from being a cog in the system…

…In the ’70s, during the era of Nixon, there was a Captain America storyline… you may not know this because it was not in the movies… The thing that breaks his heart is when he finds out that the head of this terrorist organization is the president of the United States himself... Captain America is so disillusioned by this breaking of the American ideal that he gives up his suit entirely and becomes the stateless figure known as “Nomad.” He asks: when is it time to stop being a good citizen, and become a good person?

In his chat, Joshi also gets into other successful civic interventions that can be mounted through comedy. One example is the massively viral 2013 video “It’s Your Fault,” which dealt with the issue of rape by focusing on the irony of victim-blaming. The group decided to create it after several high-profile cases of sexual assault in India and the clumsy political and legal responses surrounding the issue. The sarcasm-heavy video mocks ignorant statements that high-profile figures made in response to the assaults. Joshi described the process of creating the video, sharing that at one point they passed it by academics and activists to “make sure we were using the right language and tone and not somehow replicating the same mistakes we had been hearing.” They also asked actress friends to perform the script. The worldwide response was much bigger than they expected, with people reaching out to ask if they could replicate the video as a means to combat widespread patriarchal issues.

The video, “It’s Your Fault”:


When reflecting on what to call this form of civic intervention, Joshi shared that simply calling it “standup” is too small, because of its impact and the way it’s being used across different languages around the country: is it a movement of comedy that manages to better explain issues going on in India? Is it civic entertainment? Can we call it that?


Join us as we delve into these questions and more!






Popular Religion and Participatory Culture Conversation (Round 7): Nabil Echchaibi, Yomna Elsayed and Kayla R. Wheeler (Part 2)





Yomna:

 Nabil, I really enjoyed how you historicized the rise of popular religion in the Middle East and connected it with the events of 9/11. It was particularly refreshing to see you contextualize the Khaled phenomenon and recognize it as an effort to rearticulate religious traditions. This is an essential point.

While my personal experience and that of the youth in the digital spaces I study convey a general disenchantment with the figure of Amr Khaled, as a form of popular religion, this does not discount the fervor with which his phenomenon was received. To me, this betrayed a thirst for reconnection with tradition, and a quest for a genealogy of the modern Muslim self.

Khaled only happened to scratch the surface of this underlying need. His mix of religion and entrepreneurship, as you describe it, did not hold up to the depth of this need, nor to the test of the Arab Spring. Likewise, the self-help genre (Tanmiyya Bashareyya) that was coincident and intimately interwoven with Amr Khaled’s message did not withstand the developments of the Arab Spring. Disillusionment with this genre surfaced in the form of sarcastic memes and parody videos that took aim at Khaled and, not surprisingly, at the very genre within which his lessons predominantly fell. To this end, I agree with Donald Hall when he argues in Subjectivity that self-help messages tend to serve laissez-faire political interests, absolving the government of its role in social welfare.

To me, the use of sarcasm in critiques of religious figures was a tell-tale sign that a relationship was transforming; it signaled that top-down religious communication had been fragmented and replaced with a two-way exchange, whereby youth could critique the credibility and the message of the speaker, even if it was religiously framed. Perhaps the very fact that Khaled was not religiously trained provided the youth with the license to venture into religious negotiation.

Many theorists understandably refuse to describe the Arab Spring as a movement, constraining it to the bounds of a temporally bounded uprising without consequence, especially with the situation in Egypt becoming more draconian than before. But that is true only if we restrict ourselves to a political reading of events. Those uprisings launched a wave and a movement whereby many social practices and beliefs, from personal relationships to religious teachings, were being recalibrated (at least in the minds of young adults) based on the ideals of the Arab Spring and its demands for “’Eish, Horreya, ‘Adala Egtima’ya: Bread, Freedom, and Social Justice.”

However, like many researchers of this highly misrepresented and misunderstood MENA region, I always run the risk of being misquoted in an attempt to reinforce the orientalist binaries—East/West, Civil/Barbarian, Modern/Traditional—that we, as postcolonial scholars, constantly fight in our work. I worry that those critiques may be read as signs of religious reformations that contest premises rather than execution, mirroring the Christian Reformation, thereby canceling the agency and particularity of Muslim subjects and societies. This concern is further complicated by the fact that Muslim societies, like any society, struggle with their own marginalization of racial and religious minorities. This is why a critique from within, such as Kayla’s, is ever more pressing, one in which we criticize Western hegemony while acknowledging and critiquing our own.

Despite the fact that we study Muslims in different parts of the world, we make surprisingly similar observations. Young Muslim adults, in both the US and the MENA region, are challenging forms and expressions of what they consider a bygone nationalistic era that reflects neither their current aspirations nor their challenges. Our research continues to show how arts, culture, and fashion are utilized as means of challenging long-established political, cultural, and religious institutions. It is quite fascinating to witness this shift even among American Muslims descended from Arabic-speaking parents, who are gravitating towards, as well as adopting, Black Muslims’ fashion trends and expressions. On one hand, it could be their way of asserting their American identity by embracing the fashion styles of Black Muslims, whose history in the US dates back to the antebellum era. On the other hand, it could also be their way of carving out a new “cool” identity, as Kayla describes it, that is expressively different in essence and appearance from that of their parents.

With continuously maturing sensibilities flourishing in digital participatory cultures, are we witnessing the demise of the traditional religious sermon: a process of disentangling religion—I would not say from politics—but from dogmatic authority? Is the interplay of Arab Spring agency, Black Muslim women’s self-assertion, and digital technologies powerful enough to reorganize centuries-old ways of accreditation and preaching? What I see us doing in our research is analyzing this slow cultural build-up (towards a movement, an uprising, or a revolution, albeit a slow one) by studying the more mainstream, less famous, borderline-professional, borderline-activist figures who are able to translate the language of activism, and sometimes religion, into that of subject classes through arts, culture, and fashion.

 

Nabil:

Yomna, you raise a critical point about the disenchantment with the Amr Khaled phenomenon. I’d join you in your assessment and state even further that his form of popular preaching has been substantively vacuous in terms of its engagement with tradition. Khaled’s popularity was due in large measure to his success in re-socializing Islam based on a neoliberal program of consumption and individual emancipation, particularly among the middle and upper classes of Egypt and other Arab countries. I would also say that his waning popularity today is arguably good proof of this disillusionment with his substance and his inability to forge a more compelling Islamic ethos that is not overly determined by neoliberal logics and practices. Just to be clear, Amr Khaled has never been an agent of Islamic reform, nor has he been an advocate for a critical engagement with Islam. My point, however, is that various accounts and analyses of Amr Khaled and other popular preachers, not only in Egypt but also in Turkey, Malaysia, and Indonesia, have focused so exclusively on the novelty of this genre of popular religion and its striking similarities with evangelical Christianity that they miss any connection with a long-standing tradition in Islam of non-official and unsanctioned forms of preaching, which have left indelible marks on how Muslims practice and experience their faith today. This oversight can devalue Muslims’ ability to engage, contest, and rework the pull of tradition and the pressures of modernity, and how Muslims deal with various epistemologies of, and innovations in, religious mediation.

It bears repeating that Khaled’s intervention is indeed not about critical reform, but his brand of consumer Islam and visual devotional programming has recast faith as an engine of social mobility and self-improvement, one that directly competes both with state power and with religious opposition groups like the Muslim Brotherhood and mainstream religious authorities like the clerics of Al-Azhar. It was no coincidence that the Mubarak regime banned him from preaching in public. Khaled’s realignment of Islam with consumer culture, self-help, and social mobilization was a political affront, and it cannot be discounted, because it also focused the nexus of social action on the self and the need to reform the Muslim individual first in order to enact a larger program of social change.

This individualization of Islam and the privatization of faith allow for an intensification and expansion of the political beyond the habitual spaces of political activism as in political parties, voting, large scale protests, etc. There are important arguments against the risks and limitations of deploying individual solutions to structural problems, but there are also compelling theoretical reasons to focus on the complex intersections of personal and collective identity in Muslim lived experiences without reducing this phenomenon to neoliberal logics of individualism and consumerism. The work of Turkish sociologist Nilüfer Göle on this emerging manifestation of Islam through the lens of identity and difference is exceedingly instructive in this regard.  

This is why the work Kayla is doing on historicizing Islamic fashion as a fluid bodily ritual is so important now. The concept of “cool Islam” is a complex category that reflects both the specific lived reality of Muslims in various locations, but it also marks an important contestation of dominant and normative standards of modesty, agency, and liberty. Muslim fashion and other artifacts of material culture foreground an alternative public imaginary of the interaction between beauty, modesty and public space. This new contested visibility of Muslim identity in non-Muslim majority contexts complicates not only the performative nature of piety, but it also challenges the received definitions of the religious and the ethnic in secular societies. In the context of fashion and black Muslims in the anthropological work of Kayla, blackness and Islam are equally interrogated in terms of how they inspire the cultural production of black Muslim women while resisting hegemonic forms and narratives of proper piety and modesty within Sunni Islam. I also appreciate the historical sensibility of this research as it analyzes black Muslim fashion in the context of the American Muslim black experience which cannot be simply reduced to its connections with Islam in the Middle East or other Muslim-majority societies.    

It is refreshing to see how our research complicates the study of Islam in various geographical locations and based on different cultural experiences of Muslims around the world. I would agree that we are arguably witnessing the decoupling of Islam from its old moorings in dogmatic authority, causing significant disruptions in how Muslims re-imagine the symbols, values, and codes that inform their identities. But this kind of Muslim self-fashioning aided by elaborate circulation networks of digital communication cannot be studied only in terms of its difference and contestation of dominant secular arrangements. A question I come back to quite often now is: do we study Islam and Muslims simply to underscore this difference and intensify some sort of Muslim distinction, an alternative modernity of sorts? Or is our work still invested in questions of the universal, albeit a more lateral form of the universal?

 

Kayla:

 

Yomna and Nabil, I really enjoyed learning more about your research. I agree, we all have a lot in common. Identity production through consumption seems to be central to all our research, whether it is buying modest clothes, spreading memes, or watching sermons online. What I love most about the study of popular religion is that it highlights the voices of youth, who are often pushed to the margins both in scholarship and in everyday life. Yomna, this is something you introduce towards the very end of your post, when you talk about digital spaces helping youth rediscover what brings them together based on their intersecting identities. However, I wonder how, by focusing on people who have the time, money, and other resources to engage with media, we might be ignoring even more marginalized groups, like the poor and the elderly. Who is the Amr Khaled for Egyptians living outside of urban areas, who might not have the same rates of access to a steady Internet connection or might have different relationships to religious authority? In what ways does place dictate how we consume religion?

I am also interested in how a focus on popular religion can reveal transnational dialogue. Media allows people in the diaspora to connect to struggles back home, but it also introduces them to people dealing with similar struggles. I’m thinking of how Khaled M and El Général cited Tupac as a major influence for their music produced during the Arab Spring, or how Palestinians sent Ferguson activists advice on how to deal with tear gas. On the flipside, this mass circulation of art, culture, and fashion allows local practices to be appropriated and decontextualized. I agree with you, Yomna: Arab American youth are using Black Muslim culture to claim their American identity. I’ve watched non-Black Muslims monetize Black Muslim culture without making space for Black Muslims and, at times, embrace anti-Blackness. Su’ad Abdul Khabeer’s book Muslim Cool: Race, Religion, and Hip Hop in the United States does a great job of breaking down non-Black Muslims’ relationship to Blackness.

I appreciate how you both have placed Amr Khaled within the history of sermonizing in Islam, rather than solely comparing his trajectory to those of evangelical Christian ministers. Religious Studies, as a field, has not developed a common language for discussing non-Christian practices/beliefs/rituals/traditions without making comparisons to Christianity. I often wonder how different our conversations would be if Religious Studies were able to rid itself of its Protestant influence and we, as scholars, were not so concerned with the Christian gaze. Yomna, I can relate to your concerns about your work being misused and weaponized. I grapple with these tensions a lot in my work, because covered women are hypervisible in the media and in scholarly writings on Islam, and they experience harassment and violence due to anti-Muslim sentiment at disproportionate rates in their everyday lives. I worry about how my work may contribute to the fetishization of covered Muslim women, reducing them to walking hangers, while simultaneously erasing women who don’t cover. Nabil, do you think that as mainstream media in the U.S. has begun to present more varied depictions of Muslims (not just the dangerous Brown man and the oppressed Brown woman), it has gotten easier for scholars of Islam to just do their work without constantly worrying about how it might be mis/used?

Nabil writes that we must ask ourselves “Who do we focus on when we label our research as work on Islam?” I would add to that question by asking, who are we including in our definitions of “Muslim”? I hope our focus on popular religion provides us with an opportunity to center non-Sunni Muslim voices, like members of the Nation of Islam, who have been engaged in political activism and reimagining religious authority since the movement’s creation. When I was reading both of your introductions, I kept thinking about Elijah Muhammad and Louis Farrakhan and how they have both used media (radio, journals, social media) to share their message of self-improvement. How do they, and other non-Sunni Muslims outside of the MENA region, fit into your genealogies and analyses? How might a deeper engagement with critical race theory shift our work? What does Islamopolitanism look like when Black people (on the African continent or in the Diaspora) are at the center of our analyses? If you don’t already engage with their work, I think the work of Su’ad Abdul Khabeer, Sylvia Chan-Malik, and Jamillah Karim would be especially helpful. How can our explorations of popular religion help us to redefine or reimagine our citational politics?

I am excited to see how your projects continue to develop and look forward to continuing these conversations!

 

Yomna Elsayed holds a PhD in communication from the University of Southern California. In her research, she examines the interplay of popular culture, social change, and cultural resistance. Her dissertation examined how popular culture mechanisms, such as humor, music, and creative digital arts, can be utilized to sustain social movements while facilitating dialogue at times of ideological polarization and state repression.

Nabil Echchaibi is chair of the department of media studies and associate director of the Center for Media, Religion and Culture at the University of Colorado Boulder. His research and teaching interests include religion, popular culture, postcolonial and decolonial theory, and Islamic modernity. His work has appeared in various journals and book volumes. His opinion columns have been published in the Guardian, Forbes Magazine, Salon, Al Jazeera, the Huffington Post, Religion Dispatches, and Open Democracy.

Kayla Renée Wheeler is an Assistant Professor of African American Studies and Digital Studies at Grand Valley State University. Currently, she is writing a book on contemporary Black Muslim dress practices in the United States. The book explores how, for Black Muslim women, fashion acts as a site of intrareligious and intra-racial dialogue over what it means to be Black, Muslim, and woman in the United States. She is the curator of the Black Islam Syllabus, which highlights the histories and contributions of Black Muslims. She is also the author of Mapping Malcolm’s Boston: Exploring the City that Made Malcolm X, which traces Malcolm X’s life in Boston from 1940 to 1953.

Popular Religion and Participatory Culture Conversation (Round 7): Nabil Echchaibi, Yomna Elsayed, and Kayla Renee Wheeler (Part 1)

Yomna Elsayed

University of Southern California



 

Outside Al Hossary Mosque in Greater Cairo, young, mostly affluent Egyptians crowded. Their bodies blocked the entrance to the mosque, while their double-parked cars congested its street (a common sight in overcrowded Cairo). It was shortly after sunset, the time for the weekly lesson of the then-popular Muslim televangelist Amr Khaled. His lesson was about sincerity, which he interspersed with jokes, storytelling, and a teary supplication towards the end. It was the start of the millennium, when Amr Khaled seemed to be attracting a strong following of young Egyptians desperate for enchantment in what seemed like a country ruled with an iron fist, when Mubarak was still in power (apparently, it is now ruled with a “steel fist”). Khaled’s lively lessons and animated character stood in contrast with the traditional cloak-wearing, state-approved Azhar clerics on one hand, and the jilbab-wearing, arguably state-approved, fundamentalist Salafis on the other. With his shaven face, tieless suit, and wide smile, Khaled looked more like them. He enlivened with detail the same stories they had monotonically heard as children, and personified the Prophet and his companions—from whom Muslims draw their lifestyle—through affective rhetoric. He was soon named by Time as one of the 100 most influential people of 2007. This arguably put him in a perilous situation with the Egyptian government, which was historically wary of religious figures turning popular. Amr Khaled was therefore judicious in steering away from politics. Nevertheless, in the minds of his adoring fans, this left room for speculation as to a possible double meaning in some of his lessons and/or choice of stories. Soon, a wave of religiosity was sweeping Egyptian society, especially among upper-middle-class Egyptians; gradually, many young men and women started practicing religion publicly. They observed their daily prayers and frequented the mosques, especially for Ramadan’s night prayers, while many young women started wearing the headscarf.

I was one of Khaled’s young fans, albeit a late adopter (if we can liken Khaled to a new technology). Skeptical of religious rhetoric at the time, I was afraid that someone would make me feel guilty about my wavy hair, tight top, and jeans. As a Muslim woman, I am required to dress modestly and cover my hair. It was not until I immersed myself in Islamic philosophy and the writings of Al Ghazali that I decided to cover my hair, during my last year of university, against the objections of my family. I was proud to have worn it without the influence of religious figures or family. At that point, I started listening to Khaled, and was surprised at how he did not limit a woman’s expression of religiosity to the way she dressed, as some fundamentalist Salafis liked to do. In fact, very few of his lessons touched upon the physical appearance of the Muslim; most focused on piety and interactions with society. His words about sincerity, his animated storytelling, and his teary supplications are still vivid in my memory. I remember them now with bitterness. I, like many of his fans, was struck by what many would describe as Khaled’s transformation.

Following the January 25th uprisings, Egyptian media was rampant with hostile accusations of treason against young protesters, who were scrambling for voices of support. To the youths’ dismay, Amr Khaled, whom they had brought to fame, stayed silent on the subject. He was not alone in that; many popular religious figures followed suit or, worse, attacked the protests as un-Islamic or unpatriotic. Khaled did not speak until it was apparent that Mubarak would resign and eventually concede. His statements remained ostensibly impartial, urging “everyone” to exercise restraint. However, with so many victims of state brutality, staying on the sidelines was no longer acceptable to his young fan base, many of whom participated in the uprising. His popularity did not sharply sink, however, until a video of him surfaced following the 2013 military coup, in which he addressed Egyptian soldiers and provided them with religiously framed arguments for blindly following commands. To many in my generation, this was the last straw.

In my research on cultural resistance post-Arab Spring, I examine how the energies of the Arab Spring have been transformed into the participatory, ephemeral, and relatively ambiguous spaces of humor, music, and creative digital arts. Unable to publicly criticize cultural and political authorities, young Egyptians reveled in their ephemeral digital triumphs over the low-hanging fruits of authority: its cultural productions. Amr Khaled, with his watered-down rhetoric, turned from a religious heart-throb into yet another state-media talking head. The prudence that worked for him before the uprisings worked against him after the military coup, once countless victims had been lost to police brutality. Hence, it came as no surprise that the once-admired figure of Khaled on one hand, and the once-revered self-proclaimed Salafi sheikhs on the other, became objects of ridicule in the memes and parody videos of online youth. While some may see those parodies as signs of dystopic cynicism, I see them as a sign of maturing sensibilities that reject any attempt to mislead them back into previous complacency. Not surprisingly, this was also paralleled by a rejection of favorite childhood entertainment figures, such as Mohammed Sobhy, for their moralistic rhetoric and state support.

However, this rejection of religious figures should not be mistaken for a rejection of religion altogether; to say so would be to disregard a central aspect of life in the Middle East. It signaled, rather, that those young adults were now consuming religion in a much more critical fashion. As someone trained in cultural and postcolonial studies, I continually emphasize that part of the acclaim the Arab Spring uprisings received from Western media analysts and commentators was inspired not only by their promises of political reform, but also by what some viewed as a promise of subsequent social reformations of an inherently flawed ‘other’. Such discussions of religious reformations, trying to replicate the Christian Reformation, are both patronizing and counterproductive and have little to do with the societies these populations live in, as Shadi Hamid of the Brookings Institution once asserted. An uprising against an old order does not necessarily translate into an uprising against its heritage and tradition; it could simply imply a rejection of one appropriation of that tradition but not another.

While traditional religious spaces may have been viewed as part of the institutions social movements were trying to resist, this resistance may have applied only to the state-abiding aspects of these institutions; other aspects, such as religious rituals or the promotion of social justice and advocacy, may continue to be sources of inspiration for some of the activists. In my research, I have seen youth emphasizing continuity with tradition side by side with discontinuity, or resistance to some of its state-abiding aspects. Their relationship to tradition and to childhood texts is thus better described as a negotiation, a site of struggle over the role of religion in their social and political lives; this relationship still exists, however, on their own negotiated terms, ones that do not sacralize individuals while still respecting difference.

Last Ramadan marked, in my opinion, a sad yet telling ending to the phenomenon of Amr Khaled, when he appeared in an ad for army-produced chicken, emphasizing the need to consume healthy food products (that is, army-produced ones) to enable proper worship in the month of Ramadan. The criticism on social media was predominantly sarcastic. To young adults, the irony was self-evident, yet it was mixed with disappointment over what could have been a possible meeting point between tradition and change. I can now see Egyptian and Arab youth weaving this connection in their participatory spaces, breaking the sanctity of individuals on the one hand, all while rediscovering what brings them together as Egyptians, Arabs or Muslims. 

 

Nabil Echchaibi

University of Colorado Boulder

 

I began my work on religion immediately after 9/11, that fateful event which ushered in a perpetual state of emergency and fear about Islam and Muslims. I was writing my dissertation on second- and third-generation French and Germans of North African descent and how they navigated the political philosophies of assimilation and integration in their countries through media production. Up until that moment, these minorities had confronted a relentless form of cultural and institutional racism in which religion didn’t figure so prominently. French North Africans, for example, were referred to as “les Maghrébins” or “les arabes”, terms that were replaced overnight with the ominous label of “les musulmans”. The new ascendancy of militant Islam in the West precipitated a public scrutiny of Islam and exacerbated anxieties about the motives of Muslim minorities. Questions multiplied and quickly turned into paranoid interrogations of the loyalty of Muslims and the compatibility of Islam with modernity. 

Suddenly, Muslims were called to provide theological answers to questions about jihad, niqab, hijab, sharia, and suicide bombing amidst a media climate of deep semantic and cultural confusion about the meaning of these words and their relevance in a Western secular democracy. 

I became interested in the sources Muslims both in the West and in the Middle East were urgently consulting to confront the suspicious tenor of these allegations. Although these emerging questions about Islam pertained as much to politics and Western foreign policy as they did to religion, Muslims turned to various forms of popular religion to ask their own questions and challenge the narrow premise of fixed binaries and regressive traditions. Oil monarchies of the Gulf flooded satellite television with religious programming, some of which inaugurated innovative forms of preaching and religious entertainment in the form of reality television, game shows, and music videos. The political ramifications of this Islamic revivalism through networks manipulated by Saudi Arabia were hard to miss, but I was also intrigued by the novelty of this style and the large following it commanded around the world.  

Popular preachers like Amr Khaled, a former accountant, pioneered a creative breed of religious programming with an effective mix of religion and entrepreneurship. He later adapted Donald Trump’s The Apprentice to a program about Islamic charity. Moez Masood, a former advertising producer, created a slick twenty-part television series in which he toured the streets of London, Cairo, Jeddah, Al Madinah and Istanbul interviewing Muslims about spirituality, romance, homosexuality, drugs and veiling. And two British Muslims launched a record label that specialized in devotional music and Islamic entertainment. Critics of this popularized form of preaching called it “air-conditioned Islam” or “Islam light”, accusing its producers of simply mimicking or importing the religious performance genre that helped popularize American evangelical Christianity through the adoption of modern media and popular culture. I began, instead, to explore the aesthetics and rhetorical import of this televised and digitized Islam in a way that did not dissociate it from the rich history of sermonizing in the Islamic tradition. To me, this phenomenon had more to do with a historical tension within Islam over religious knowledge and its transmission, which made many Western accounts of these preachers too shallow and predictable. Television and the Internet only complicated an oral/aural/visual tension in the devotional experience of Muslims, and I wanted to capture that continuity.   

The point of my research was not to argue that there was nothing new in these emerging forms of mediated Islam. Rather, I wanted our analysis to also adopt a historical approach which contextualized the complex theological, ethical, and cultural dimensions of mediation and circulation within Islam. This part of my research has largely benefitted from the work of Talal Asad, Saba Mahmood, Mahmood Mamdani, and others who argued, persuasively, for an intimate engagement with the intellectual and political history of Islam in order to recognize the vitality of Muslim efforts to re-articulate their religious traditions and adapt them to their modern condition. This is not simply a return to a bounded notion of tradition, although it is for some, but a negotiation of traditions to make sense of the world.   

Drawing on postcolonialism and decolonial critiques, my recent work focuses on emerging material expressions of an Islamic strand of cosmopolitanism that is deeply invested in this effort of sensemaking. I call this ‘Islamopolitanism’, a combination of popular religion and an intellectual engagement with what it means to be modern and Muslim today. Specifically, I ask how our analysis of new Muslim digital spaces and aesthetic formations can reveal emergent cultures of Muslim cosmopolitanism, a cultural sensibility and a way of dwelling in the world intimately born of the complex tensions between religious universalism and particularism, cultural mixity and purity, and authentic piety and neoliberal commodification. I argue that this form of Islamopolitanism is primarily rooted in a cultural aesthetic rather than a political conviction. Its proponents call for a remix of Islamic culture that arguably resists the nativist visions in the dominant narratives of Muslim identities. 

It is precisely this epistemic disobedience against the duality problematic of modernity and tradition that is still absent in our accounts of Muslim lived experiences. Moroccan postcolonial thinker and novelist Abdelkebir Khatibi insisted on demystifying both Western and Arabo-Islamic logocentrism in favor of a double critique that springs from tradition but only to create new ideas, new questions, and new ways of knowing. I invoke Khatibi’s postcolonialism in my research because it resists narratives of melancholy, victimhood, shame, malaise, loss, or existential uprootedness. Instead, his invitation was to find a discourse of possibility, an epistemology of suspicion, an idiom of Muslim syncretism, and a path toward intellectual independence. 

Other Muslim thinkers call for a similar open engagement with the particularism of Muslim cultures, local intellectual and political histories, and religious doctrines to deliver Muslims from the long grip of the slogans and the blackmail of Western modernity. In his provocative thesis that postcolonialism has ended, Hamid Dabashi argues that we have reached a moment of epistemic exhaustion that marks the “implosion of the ‘West’ as a catalyst of knowledge and power production.” The Arab uprisings of 2011 were, according to Dabashi, only the beginning of this new defiance. Senegalese philosopher Souleymane Bachir Diagne, who advocates for decolonizing the history of philosophy, also calls on Muslims to keep Islam an open project, a doctrine in movement ready to drop all forms of identitarian chauvinism and to listen and absorb other voices inside and outside its tradition.     

Islamopolitanism is an open-ended project that shares these sensibilities and aspirations. I explore the work of performance artists, activists, devotional musicians, and authors who have developed sites, aesthetics, and cultural tastes to interrogate the mediation of Islam and the making of Muslim subjectivities beyond the limitations of traditional Islam and secular modernity. My aim here is also to expand the object of study in Islam beyond simply the visibly pious adherents of this faith. In fact, what are we studying precisely, or whom do we focus on, when we label our research as work on Islam? My own approach is concerned with unpacking the complexity as well as the elusiveness of the Muslim subject. As Achille Mbembe and Sarah Nuttall remind us in their writing about cities of the South and their dwellers as subjects ‘en fuite’ (on the run), I want to theorize Muslims as subjects en fuite, in the sense that they “always outpace the capacity of analysts to name them.”

 

Kayla Renée Wheeler

Grand Valley State University

Throughout jumaah at the Annual Muslim Convention, I awkwardly tugged at my khimar. Unlike the times I had spent doing fieldwork in predominantly Arab and South Asian mosques, I wasn’t worried about making sure my neck and flyaway hairs were covered.  Instead, I was repositioning my khimar to make my slicked-down baby hairs visible and to show off my dangly earrings. I wanted to fit in. I was surrounded by Black women in every possible head covering imaginable: berets, kufis, turbans, hoodjabs, and shayla khimars.  Their wax-print and bogolan maxi skirts made them appear to float elegantly down the rows; their layering techniques would have made Bonnie Cashin jealous.  They were performing what anthropologist Su’ad Abdul Khabeer calls Muslim Cool, a form of embodied resistance that privileges Blackness.  I had finally found home.  

My experience at the Annual Muslim Convention was one of the few times where my loosely tied khimar and 3/4-length sleeve shirt had not been met with side eyes from Muslim aunties.  None of the aunties at the convention chastised me for not dressing modestly or “Muslim” enough, something that often happens in the small college town mosques that I visit across the U.S.  These critical aunties, who are quick to call my outfits inappropriate and even haram, are invested in what I call “hegemonic Islam,”  which is Sunni-centric and privileges Arab expressions of Islam as the most authentic based on the belief that geographic or cultural proximity to Prophet Muhammad’s native land dictates one’s religiosity.  Hegemonic Islam is naturalized as “true Islam” and marginalizes those who do not fit within its framework.  It proves problematic for African-American Muslims who can only trace their natal history to the Americas.  Hegemonic Islam is inherently anti-Black because it devalues practices and beliefs created within African-American Islam.  

I developed the term, hegemonic Islam, in my dissertation, which explores how Black Muslim women use YouTube fashion and beauty tutorials to create alternative images of the ideal Muslim woman.  I traced the development of hegemonic Islam back to postcolonial movements in the Middle East and North Africa (MENA) beginning in the 1960s, during which Muslims critiqued Western political and cultural dominance across the world.  Many sought to create an alternative shared identity for Muslims that would transcend social class and geography.  One way this shared identity was expressed was through dress.  Regionally specific clothes and styles, such as the abaya and thobe, were transformed into the only authentic Muslim dress.  This new shared identity created a new social hierarchy, where Arab Muslim cultural practices are placed at the top and African-American Muslim practices are at the bottom.  Wearing clothes that had once been specific to the MENA region became a sign of one’s commitment to Islam, instead of the materialist West. Su’ad Abdul Khabeer calls this pious respectability, where it is assumed that the more “religious” a Muslim becomes, the more they will shift aesthetically towards MENA.  I am interested in exploring how Black Muslim women have used fashion to reimagine pious respectability and resist hegemonic Islam.  

In my book, I explore how Black Muslim women in the United States have historically used fashion to construct alternative femininities that disrupt Eurocentric beauty norms and create transnational networks of belonging based on a shared identity as Black Muslims.  Through my research, I explore how the Nation of Islam (NOI) and Imam W. Deen Mohammed community’s (IWDMC) emphasis on racial uplift via entrepreneurship and patronizing Black businesses have been essential to building what I call the Afro-Islamic Diaspora fashion industry.  These organizations host charity fashion shows, house bazaars at annual conventions, and build women’s only spaces where women and girls can learn how to sew and design, providing women with the opportunity to monetize their talents and promote Black self-determination.

I situate my work within Islamic fashion studies.  The field is underdeveloped because scholars have historically understood fashion to be a product of the Christian West, originating in the Renaissance during the rise of early capitalism when people moved to urban areas and sought ways to individuate themselves.  These fashion origin stories create a binary between the West as a site of modernity and the East as being stuck in the past, which replicates Orientalist tropes.  This leads to scholars viewing Muslim women’s covering practices as static and geographically bound, but that is not reflective of what is happening on the ground. What fabrics, colors, and silhouettes are considered trendy is constantly shifting.  Five years ago, Khaleeji hijabs were “in”, now it’s turbans. It has been important for me to avoid looking for motivations as to why Muslim women cover—they are often numerous and fluid.  Instead, I am interested in examining what clothes communicate to others, what bodies are produced through dress choices, how definitions of modesty are constructed, and how objects become “Islamic”.  This approach prevents me from fetishizing Muslim women and their clothing.

It has been interesting watching the rise of modest fashion within the mainstream Western fashion industry.  2015 seems to have been a major turning point. Not only did high-end brands like Tommy Hilfiger, Dolce & Gabbana, and Monsoon begin selling Ramadan collections, many of which were only available in the Gulf region, but more affordable brands like Nike, Uniqlo, and Macy’s have also created permanent lines.  In general, the fashion industry has embraced longer hemlines and higher necklines.  On a personal note, it’s been so exciting to ditch the collection of cardigans and leggings that I used to wear to make outfits more modest, because so many brands now cater to my tastes.  While the move from bodycon dresses to maxi shift dresses could be a result of the cyclical nature of fashion, I think it’s also a recognition of Muslims’ growing global buying power.  The fashion industry is finally seeing Muslims as consumers.  

From my research, I’ve learned that the mainstream Western fashion industry’s embrace of Muslims as consumers has had negative consequences.  Independent Muslim designers are being pushed out by fast fashion brands that can make their products quickly and at significantly cheaper prices.  Many of the clothes sold by fast fashion brands like H&M are produced by Brown Muslim women in Indonesia and Bangladesh who work in unsafe environments at low wages.  Mainstream fashion advertisers have slowly begun to use Muslim models who regularly cover in their marketing campaigns.  However, these models are primarily young, thin, visibly able-bodied, light-skinned, and non-Black.  I cannot deny the importance of positive representation of Islam for young Muslim children’s self-esteem, especially considering the rise of anti-Muslim sentiment, which disproportionately affects visibly Muslim women.  However, these advertisements reproduce the image of Islam as a “Brown” religion, contributing to the marginalization of Black Muslims.  They also uphold Eurocentric beauty standards, leaving many Muslim women outside the realm of fashion.   

The new focus on modesty in the mainstream Western fashion industry is mirrored by an uptick in scholarship about Muslim women’s dress that focuses on Muslim women outside of MENA.  While I have been happy to see the decline of veil historiographies, which dominated the field of Muslim dress studies in the 1980s and 1990s, I am disappointed that the scholarship still privileges women living in Muslim-majority countries, including Turkey, Indonesia, and Iran.  When Muslims living as religious minorities are discussed, race and racial difference are often ignored.  The United States provides a unique case study because there is no racial or ethnic majority among Muslims, but there is a clear racial hierarchy in terms of defining Muslim authenticity.  Despite Black Muslim women, specifically African-American women associated with the Nation of Islam and the Imam W. Deen Mohammed community, making it “cool” to cover as early as the 1920s and creating and building a fifty-year-old fashion industry, they’ve largely been ignored by scholars.  I hope to correct that.


Yomna Elsayed holds a PhD in communication from the University of Southern California. In her research, she examines the interplay of popular culture, social change and cultural resistance. Her dissertation examined how popular culture mechanisms, such as humor, music and creative digital arts, can be utilized to sustain social movements while facilitating dialogue at times of ideological polarization and state repression. 

Nabil Echchaibi is chair of the department of media studies and associate director of the Center for Media, Religion and Culture at the University of Colorado Boulder. His research and teaching interests include religion, popular culture, postcolonial and decolonial theory, and Islamic modernity. His work has appeared in various journals and book volumes. His opinion columns have been published in the Guardian, Forbes Magazine, Salon, Al Jazeera, the Huffington Post, Religion Dispatches, and Open Democracy.

Kayla Renée Wheeler is an Assistant Professor of African American Studies and Digital Studies at Grand Valley State University. Currently, she is writing a book on contemporary Black Muslim dress practices in the United States. The book explores how, for Black Muslim women, fashion acts as a site of intrareligious and intra-racial dialogue over what it means to be Black, Muslim, and woman in the United States. She is the curator of the Black Islam Syllabus, which highlights the histories and contributions of Black Muslims. She is also the author of Mapping Malcolm’s Boston: Exploring the City that Made Malcolm X, which traces Malcolm X’s life in Boston from 1940 to 1953.

Popular Religion and Participatory Culture Conversation (Round 6): Brandy Monk-Payton and Patrick Johnson (Part 2)

Brandy Monk-Payton:

I’m so glad you mentioned Aretha Franklin’s homegoing ceremony.  I grew up in a Southern Black Baptist church environment and the entire event was so visually and sonically familiar. While I was only able to catch bits and pieces of it, I’m thankful that Black Twitter was able to provide me with a rundown online. Participatory culture in a digital era is enhanced by racialized social networking practices.  

The appearance of the church within many Black media objects is such an important observation. While you focus on nineties sitcoms in your research, I can’t help but think of a figure like Tyler Perry. I wrote an essay for the edited collection From Madea to Media Mogul: Theorizing Tyler Perry (University Press of Mississippi) titled “Worship at the Altar of Perry: Spectatorship and the Aesthetics of Testimony” that attempted to account for the fandom (especially Southern Black churchgoing female fandom) around hugely successful African American media maker Tyler Perry through the framework of religion. I argued that Perry transformed the cinematic experience into a sermon with his on-screen parables. While Perry is not an actual pastor, it seems that Black religious leaders have a fan culture unto themselves - certainly the African American mega-church preacher is a mainstay in Black culture that garners much adulation. 

Your use of haunting to describe how Blackness resonates across popular media forms is intriguing. The TV programs that you mention seem to always be present in Black cultural discourse as specters. Scholars such as Alfred L. Martin, Jr. are bringing them back into focus as valued objects of study, because they are frequently obscured in traditional archives of television programming history. Additionally, Black fans have notoriously been excluded from examinations of fandom (see Rebecca Wanzo’s important essay “African American Acafandom and Other Strangers: New Genealogies of Fan Studies” in the journal Transformative Works and Cultures). 

The recent Emmys broadcast commented on these mainstream erasures with Michael Che’s bit “Reparations Emmys,” in which Black TV sitcom actors like Marla Gibbs from The Jeffersons and Kadeem Hardison from A Different World were given Emmys for their influential roles in iconic Black-cast television programs. African American communities have been exposed to these legends through the kind of televisual “inheritance” that you speak of, passed on from generation to generation.   

 

Patrick Johnson:

When Che gives Kadeem Hardison his Reparations Emmy for his role as Dwayne Wayne, he tells him, "I don't think you realize how many young brothers you actually inspired to go to college." My study's participants echoed these sentiments, citing A Different World as a key influence on their ability to see themselves as college students. In the 24 years since the show's series finale, it has remained the primary scripted televisual representation of Black college life. There is a generation of Black folks (myself included) who probably made some major life decisions informed by their A Different World fandom and who proudly identify as alums of the show's fictional Hillman College. Emmy-winning writer and producer Lena Waithe, who named her production company Hillman Grad Productions, described her affinity for A Different World and The Cosby Show in an interview with NPR's Terry Gross: "I was just lucky that I was a kid watching it, seeing not myself yet in A Different World. I was seeing who I wanted to be and I saw so much of myself and so much of what I wanted to be in those shows. What television did for me is that it taught me how to dream, it taught me what to dream about." In this sense, fandom, like religion, can be understood as aspirational, providing the guidelines for the kind of person one hopes to become. 

 

Brandy Monk-Payton:

The Cosby Show is such a difficult text to engage with now. And Bill Cosby himself represents a kind of crisis, ideologically and affectively, in Black fandom. I think in part because of the way in which Black icons make meaning, spiritually, with Black audiences. The symbolic power they can hold over culture really puts us in a tough position when/if they fall from that position of “grace.” The Boondocks episode that critiques R. Kelly (and R. Kelly fans) perfectly depicts such a crisis.

I’m interested in what you think of these Black Americans deemed exceptional across fields (television, music, sports) and examples of how we have participated in their elevation.  

 

Patrick Johnson:

As a basketball fan coming of age during the 1990s, there was for me no player more important than Michael Jeffrey Jordan. Crying Jordan meme, dad jeans, and "the ceiling is the roof" aside, Jordan remains a largely unassailable figure amongst basketball fans. While, like Beyoncé, Jordan clearly appeals to a broad spectrum of people, there is a special place within Black culture for "His Airness". Writing about Jordan in the early 1990s, public intellectual and theologian Michael Eric Dyson identified "a religious element to the near worship of Jordan as a cultural icon of invincibility" and argued that Black youth made a particular "symbolic investment in Jordan's body as a means of cultural and personal possibility, creativity, and desire". The Black youth of the 1990s have grown into adults who remain fiercely loyal to Jordan and are particularly invested in him remaining a cultural icon. Make any earnest attempt to discuss the greatest basketball player of all time and you quickly learn that there is little room for arguments that do not have Jordan firmly at number one. In the words of KRS-One, for many fans Jordan is not only number one but "number one, two, three, four, and five." In recent years, most conversations about the GOAT involve comparing Jordan's credentials against those of Los Angeles Lakers forward LeBron James (whose King James nickname is perhaps too on the nose when it comes to religiosity and sports). At some point in the debate, the pro-Jordan fan offers MJ's perfect 6-0 record in the NBA Finals (compared to James' 3-6 record) and his intangibles such as his "heart" and "killer instinct" as evidence of his superiority. The latter traits, while not quantifiable, are nonetheless felt by the fan. The anti-LeBron argument will often come back to a single moment in his career when he was viewed as "quitting" on his team, perhaps the ultimate sin that an athlete can commit. 
What on the surface seems like a conversation about who is the better ball player is really one about faith, about which player you can believe in. 

 

Brandy Monk-Payton:

I’m wondering if this all comes down to a reconceptualization of faith to account for how Blackness signifies in popular media culture...how Black folk make meaning and create symbolic worlds to believe in as we navigate discrimination and oppression. The hope, across generations, that is put in representation (especially as epitomized by our icons) becomes vital.

 

Patrick Johnson is a Ph.D. candidate in the Social and Cultural Studies program in the Graduate School of Education at the University of California, Berkeley. His research interests include critical media literacy, Black fan studies, cultural memory, and the residual circulation of past media.

 

Brandy Monk-Payton is an Assistant Professor of Communication and Media Studies at Fordham University. Her research on theories and histories of African American media representation and cultural production has been published in the journals Film Quarterly, Feminist Media Histories, The Black Scholar, and Reconstruction: Studies in Contemporary Culture. Other work has appeared in various edited collections and is forthcoming in the anthology Unwatchable (Rutgers University Press). Her first book project examines Black celebrity in late twentieth- and early twenty-first-century U.S. public and popular culture.


Popular Religion and Participatory Culture Conversation (Round 6): Brandy Monk-Payton and Patrick Johnson (Part 1)




Brandy Monk-Payton:

I’m so excited to be part of this series of discussions with Patrick! Thanks to Diane, Sarah and Henry for the invitation. My research is located at the intersection of Media Studies and Black Cultural Studies. Specifically, I am working on a manuscript emerging from my dissertation that explores the aesthetics and politics of Black celebrity across television and digital media. I’m especially interested in logics of public exposure and the construction of racial notoriety. 

A sense of religion has always undergirded our experience with stardom. The title of Richard Dyer’s seminal book Heavenly Bodies: Film Stars and Society (1986) references the godlike quality that we attribute to screen personalities. While I am not a scholar of religion, I often think about fame in relation to the act of worship. It’s a topic that dovetails with Kathryn Lofton’s useful book on Oprah Winfrey. In Oprah: The Gospel of an Icon (2011), Lofton examines how Winfrey’s image and brand are predicated on popular forms of spiritual empowerment. Participatory culture here transforms into a commercial experience of reverence towards a divine figure.

Enter Beyoncé Knowles. Now I’m not inclined to get into debates about the beloved pop star, lest Queen Bey’s very vocal Beyhive (her community of most devoted fans) begin to buzz. Yet it seems that Knowles exemplifies celebrity worship and, even more so, a particular type of Black celebrity worship. 

Rachel Kaadzi Ghansah writes about her pilgrimage to a Beyoncé concert: “I am not a Beyoncé fan but I felt like crying tears of joy all three times I saw the Mrs. Carter show. Because while other pop stars may sing about throwing some glitter on it and making it rain, only Beyoncé could literally soar over us, climb up over our heads and our real lives, climb over her kingdom, to actually throw down over us what looks like bits of pollen, golden confetti, and make it rain bits of her dream all over her fans who love her so, and who would do anything for their Queen.”

I have never been to one of her concerts, but I know many people who have, who describe it in some way, shape, or form, exactly like the above. Indeed, Beyoncé has devoted followers in her “kingdom” and on the face of it, this is no different from any other adored musical artist. Yet this icon, this Black icon, carves out a sacred space in the public imaginary with every performance. I’m reminded of her April 2018 appearance at Coachella (coined “Beychella”), which turned the site into a large-scale HBCU football halftime show. That performance included a rendition of the Black national anthem “Lift Every Voice and Sing.”  

While her appeal is widespread, I would argue that she mobilizes histories of Black expressive culture in order to generate collective memory amongst her African American audience. The interaction between Beyoncé and her fans resonates in terms of folk traditions as epitomized by the Black church.   

The discourse of racial iconicity hinges on both veneration and denigration, per Nicole Fleetwood. It is the communal and oftentimes ritualistic aspect of the veneration amongst Black fans that I find most fascinating.    





  

Patrick Johnson:

First of all, I would like to thank Henry, Sarah and Diane for putting together this series. I have really enjoyed learning about everyone's work. As a scholar whose research does not explicitly engage religious studies, these conversations have been particularly fruitful for helping to expand the terms in which I understand my scholarship. My work sits at the nexus of television studies, Black cultural studies, and education. My dissertation looks at how 1990s Black sitcoms such as Martin, The Fresh Prince of Bel-Air, Living Single, A Different World, and Moesha function as a form of heritage and inheritance for Black millennials. I am broadly interested in understanding what the relationships between Black music cultures and television can tell us about Black cultural memory. I employ the concept of haunting to think through not only the resonance between Black media forms but the ubiquitous nature of the televisual past within Black popular culture.  

One of the major points of intersection between my work and that advanced through religious studies is around issues of inheritance and collective memory. Most of my study's participants, who were between the ages of 18 and 24 at the time of the study, worked under the assumption that most Black people have some awareness of the aforementioned 1990s Black sitcoms. While participants had varying orientations to the programs (many with strong negative critiques of the shows, especially Martin's gender politics), they expressed feeling that they had to at least contend with them. Most of the study's participants described inheriting their 1990s Black sitcom fandom from their grandparents, parents, or older siblings. The ability to be conversant in the shows, knowing the characters, catchphrases, and major storylines, enabled them to participate in Black cultural conversations that connected them to previous generations of Black folks. They understood their literacy in the shows as granting them access to both imagined and actual Black communities. This brings to mind a quote from a participant in Jonathan Gray’s study on The Simpsons who stated, “Even though I don’t watch this show, I don’t like this show, uh, I have to know about it to a certain degree…otherwise I will be excluded from the conversation of my friends” (p. 71). In this sense, I wonder how religious studies might help us think through fandoms that can almost be read as compulsory.

The Black church has been a central institution in Black life. As a result, there is a certain literacy that many Black people have with the Black church regardless of their religious affiliation. Beyond the Black cast, the Black church is often employed in Black sitcoms as one of the many signifiers that connotes Blackness to the audience. The success of scripted shows like OWN's Greenleaf and Bounce's Saints and Sinners, as well as reality shows such as the Preachers franchise, proves that there is an appetite for programming that centers the Black church within its narrative structure.

However, as I think about the convergence of religion, participatory culture, and Blackness, the Queen of Soul Aretha Franklin's homegoing service is perhaps the show that most cogently brings together these elements. The service, with all its pageantry, provided space for Black people to collectively laugh at some of the ridiculousness that took place. Through social media and our offline discussions, Black people engaged in what Beretta Smith-Shomade (2016) describes as playful piety. Our running commentary on fashion choices, singing ability, and audience reactions should not be read as incongruent with religiosity but instead as reflecting a means through which we "enjoy the foibles, fallacies, contributions, and even grace of black religious ways of being" (p. 321). This infusion of humor was part of celebrating Aretha Franklin's life and achievements.

I am interested in any site where multiple eras of Blackness come together, and the homegoing service would definitely qualify as such. Haunting has been useful for thinking about the “always thereness” of not only certain figures and cultural artifacts but the seemingly antiquated ideologies that continue to resound in the present. We can look at the service and think about Bishop Charles E. Ellis III’s interactions with Ariana Grande or Rev. Jasper Williams’ rebuke of Black Lives Matter and understand them as representing regressive ideologies that linger when left unconfronted. However, the mediated and spectacular nature of the service ensured that their problematic behavior and statements would not remain within the confines of the sanctuary. Rather, their actions were subject to scrutiny by Black folks in real-time via social media.

Patrick Johnson is a Ph.D. candidate in the Social and Cultural Studies program in the Graduate School of Education at the University of California, Berkeley. His research interests include critical media literacy, Black fan studies, cultural memory, and the residual circulation of past media.

Brandy Monk-Payton is an Assistant Professor of Communication and Media Studies at Fordham University. Her research on theories and histories of African American media representation and cultural production has been published in the journals Film Quarterly, Feminist Media Histories, The Black Scholar, and Reconstruction: Studies in Contemporary Culture. Other work has appeared in various edited collections and is forthcoming in the anthology Unwatchable (Rutgers University Press). Her first book project examines Black celebrity in late twentieth and early twenty-first century U.S. public and popular culture.

Popular Religion and Participatory Culture Conversation (Round 4): Tisha Dejamanee and Deborah Whitehead (Part 2)



Tisha

Debbie, I love your take on food storage blogs! And I completely agree that these ideas about women being compelled to return home to shore up the family unit are flourishing in the wake of the culture of high anxiety that currently prevails in the face of the increasing fragility of the American Dream. I’ve mostly approached this topic from the perspective of neoliberal individual responsibility – the notion that, now that the American public is increasingly distrustful of the State, corporations, and medical authorities, women are increasingly called upon to step up and take on the labor that was once outsourced to these social institutions through homeschooling, cooking from scratch, and doing extensive individual research into alternative health practices.

However, I am also very interested in how you see the process of mainstreaming working to reshape communities around shared knowledge in ways that potentially disrupt the boundaries of traditional political and religious affiliations. For instance, I see food blogging as a practice that increasingly unites women across the political spectrum and, while the narrative is often very intimate, political discussion is tacitly forbidden. Apolitical language clearly serves the commercial purpose of not unnecessarily excluding the blogger’s potential audience, but it also suggests that the blogger’s influence is predicated upon her political silence. After the 2016 election, I noted a few examples of food bloggers who broke the code and used their blogs to share strong feelings about the election outcome, and this created a lot of agitation among readers. Are there any points at which you see similar ruptures emerge between secular and religious communities in the exchange of knowledge around prepper culture, or how does discussion of religion insert itself into these blogs?

In response to the first question that you had for me about ambivalence, the way you discuss prepper culture as explained in terms of ‘empowerment’ and ‘confidence’ is pretty accurate to my understanding of how ambivalence works in postfeminist culture. That is, the rhetoric of choice can be used to justify any act as (post)feminist. Part of this is due to the influence of individualism – there is no discussion of domesticity and food preparation as a gendered outcome of structural or broader cultural pressures, or as labor that is systematically devalued yet obligatory; the only thing that matters is whether the individual interprets such work as empowering or fun. This is not to say that food work cannot be empowering or fun or pleasurable, but that pleasure is increasingly defined as the political end goal in and of itself and it is up to the individual to work out how they will justify and manage all of these pleasures, rather than organize collectively to fight for structural change.  

In response to your second question about authenticity, I was really fascinated by your article on ‘emotional fraud’ in the Christian mommy blogosphere. I think the examples you discuss are really great ways to think about the blurry lines that arise from the ways that bloggers are expected to confess and to share highly personal details about their emotional and domestic lives (and, I do think there is a gendered distinction in the blogosphere here), and the ways these same details form the basis for monetary and emotional payoffs. While most of the fraud that I have encountered in the food blogosphere has been of a less spectacular scale – say, stolen recipes, enhanced photography, and fake cheer (the controversy surrounding Thug Kitchen is a more notable exception of identity fraud) – in my experience there is an acceptance that bloggers can both share meaningful affective content and use these personal details to support corporate partnerships. ‘Hate’ blogs – for instance the GOMI community – are an interesting example of this because they offer scathingly honest critiques of the lifestyle blogosphere (for instance, referring to bloggers’ children as ‘content generators’) while also serving as a testament to the seductiveness of blog content. However, this reading relies on a ‘buyer beware’ model of response to the blogosphere – that is, the reader should be savvy to the kinds of renegotiations of identity and truth that are prevalent across the blogosphere, and understand that most popular blogs are aspirational and curated. While I agree that shades of truth don’t necessarily negate the affective impact of content on the reader (as you conclude in your article), I wanted to hear more about how faith shapes the response of readers and the ways that content is framed within the mommy blogosphere.


Deborah:

 Thanks Tisha; I appreciate your questions about the politics of participatory culture.  Like you, I often observe a “code of silence” among women bloggers and readers when it comes to political affiliation – the old cliché that it’s not polite to talk about politics or religion applies in the blogosphere too!  Of course, there are many exceptions to that rule.  Survivalist and prepper bloggers whose anxiety levels peaked during the Obama years and in the run-up to the 2016 election tended to be fairly explicit about their support for Donald Trump.  As “Survival Mom” Lisa Bedford put it after the election, “many (not all!) in the prepper community have breathed a sigh of relief”; but she cautioned her readers that there were still good reasons to keep prepping, including natural, personal and social disasters (hurricanes, floods, job loss, riots, etc). In the post-election climate, while survivalist and conservative preppers may feel able to relax a bit, the election has had the opposite effect on those on the left end of the political spectrum.  A new community of liberal preppers is on the rise, characterized by both their political leanings and their desire to learn how to prep, and bloggers like Lefty Prepper Mom spend their free time reading the same LDS websites as do conservative preppers, learning Mormon techniques for food storage and emergency preparedness though they do not themselves identify as LDS.  
But even though conservative and liberal preppers are united in their admiration for these particular aspects of the LDS tradition, they don’t seem to be interested in engaging with Mormons or Mormonism beyond appropriating their food storage practices, nor are they interested in engaging with one another across political boundaries; the Liberal Preppers Facebook group, for example, a closed group with over 3500 members, says in its description that it does not “knowingly accept conservatives and/or Trump supporters, into this group.”  If anything, then, the 2016 election has seemed to harden political boundaries within the larger and increasingly diverse prepper community.  

It’s in the evangelical women’s blogging communities that I’ve seen some of the kinds of disruptions you mention.  Evangelical women’s blogs can be understood as a subspecies of evangelical women’s ministries more generally, sharing a desire to convert and lead other women in the faith, as well as a subspecies of the secular mommy blog, documenting family life and sharing personal reflections, recipes, crafts, etc.  The explicitly religious nature and purpose and the overt religious content of evangelical women’s blogs make them distinctive, but it is still true that like the food blogs you have analyzed, they tend to shy away from overt political discussion, perhaps out of a desire to maintain influence as you say, perhaps also out of a fear of alienating potential converts. This belies the fact that the blogs’ subject matter of women’s bodies, sexuality, marriage, and family is, of course, highly political; the rhetoric of “family values” has been used to advance a number of political positions, movements, organizations, and candidates over the past three decades, as well as to advance a particular notion of the patriarchal Christian family as the most basic unit of the modern nation state.  

A small number of evangelical Protestant mommy bloggers have built personal brands by cultivating decidedly apolitical stances, building large social media followings, authoring books, participating in Christian women’s ministry tours around the country, and even having their own HGTV shows.  The controversy that has erupted around Jen Hatmaker for speaking out in support of same sex marriage and openly criticizing Trump and his policies (leading to condemnation from other evangelicals, criticism of women’s ministries more generally, and her books being pulled from a major Christian bookstore chain) is a striking example of the power of what Hatmaker has called “the Christian machine” to manufacture a disciplined silence around political issues among women in the evangelical community.  Yet Hatmaker is not alone; evangelical bloggers Beth Moore, Sarah Bessey, Austin Channing Brown, and others have contributed to what some have called an “evangelical crisis of authority” by using their powerful platforms to give voice to those not typically represented in evangelical institutional leadership structures in the U.S., which tend to be dominated by white men.  And so, to answer your question about how faith shapes content and reader response in the evangelical blogosphere and how that might be different from the “caveat emptor” attitude expressed in food blogs or hate blogs, I think it has to do with the question of religious authority – the notion that these women possess a kind of authority to teach and lead other women in the faith, one that may be outside denominational or institutional structures, but a kind of authority nevertheless; and because of that, personal misrepresentations, lies or omissions, or departures from church doctrine or practice, are seen with a unique kind of gravitas.  
We’ve been talking a lot about politics and silence and authority, and I’m curious to hear your thoughts on the question of authority in the food blogosphere, as well as whether you’ve seen any political commentary in food blogs around topics other than the election, for example around food politics, sustainable agriculture, food deserts, GMO foods, labeling, and so on? 


Tisha: 

Thanks for sharing these interesting examples. A theme that I think is prevalent across the texts we examine is the nostalgic valuation of certain kinds of knowledge in response to a culture of high anxiety – around politics, around natural disasters, around State failure, around the changing social and demographic landscape of the U.S. In the food blogosphere, I see a direct parallel to the valorization of prepping in the ways that self-sufficiency is based around the desire to deconstruct and recreate food in the individualized domestic sphere, for instance through learning intensive production processes such as grinding one’s own flour or baking marshmallows from scratch. What I find really interesting about the example of the Liberal Preppers is that a lot of this anxiety is explicitly channeled into the kinds of community that form around such knowledge exchange; that a closed liberal community is required to make these discussions ‘safe’. 

 In contrast, I would characterize the food blogosphere as driven by the impetus for expanding one’s influence. Part of this takes place through the ways that food bloggers set themselves up as authorities on cooking, which almost inevitably involves exaggerating the performance of a normative, girlie femininity – through the intimate chatter, self-deprecating commentary, and the foregrounding of family and domestic life. As such, while the community itself is not necessarily closed, the construction of a digital femininity usually is. This is one of the most troubling aspects of the nostalgic romanticization of traditional knowledge – it often simply operates as code for gender and racial conservatism. 

To answer your question about sites where food politics becomes explicit in the food blogosphere, there are certainly several examples. Many of them revolve around the ethical and environmental impacts of veganism, which is one of the most popular subgenres of food blogs. Breeze Harper’s Sistah Vegan Project is an important example of critical social activism through blogging about food, while many other vegan blogs make explicit the social issues that have led them to veganism (interestingly, just as many vegan food blogs are likely to emphasize their apolitical, judgement-free stance). Jack Monroe’s Cooking on a Bootstrap blog became widely circulated as a British austerity blog, and Monroe has used this platform to publish explicit political commentary on U.K. austerity policies and various other social movements. Multiple other examples exist of blogs that frame food choices in terms of consumer politics, although these rarely acknowledge structural deficiencies in the mainstream food system. In general, I see blogs that seek to deal with politics as outliers that may resonate with particular audiences but are not generally rewarded within the mainstream structures of visibility of the blogosphere.

As we close this blog conversation I’m interested to hear your thoughts on how, as scholars, we assess the political potential or impact of the blogosphere. I think your example of disruptive evangelical women’s blogs offers an interesting way to think through the general contradiction of the lifestyle blogosphere, which is that they offer women a platform and a place to document, circulate and add value to their experiences while at the same time this visibility is often contingent upon their performance of a correct kind of femininity or gendered cultural authority. You’ve alluded to this in your previous posts, but I was wondering whether you could speak more explicitly on what kind of potential you think religious blogs have to shape or change gender roles within religious communities? 

Deborah:


 Tisha, I think you’re absolutely right that the nostalgic desire to recover knowledge around cooking, homekeeping, prepping, mothering, being thrifty, etc. that we see in the blogosphere and beyond is often bound up with political anxiety, gender and racial conservatism and a romanticized view of the past – one in which “traditional” white working class “culture” is unproblematically celebrated and preserved. One could also look at home design blogs and the current craze for “farmhouse style,” most famously exemplified by the long running HGTV show “Fixer Upper” and its flannel shirt, jeans and boots-wearing married hosts, Chip and Joanna Gaines, to see this nostalgia evident in contemporary home design.  Words like “rustic” and “homey” and “rural” are frequently used to describe farmhouse style, which “eschews modern sensibilities and goes back to a simpler time” with its ubiquitous weathered wood finishes, exposed beams, barn doors, shiplap, galvanized steel, mason jars, black and white color schemes, and buffalo plaids, and its values of “simplicity” and “practicality” and “warmth,” evoking a new American Gothic for mass consumption.  Joanna Gaines’ aesthetic has been copied, channeled into several home design show spinoffs, and marketed as a Target line; the look is aspirational, yet achievable, we’re told, as long as we’re willing to “peruse, meander and collect” to meaningfully curate our homes as perfect combinations of “hand-me-downs and flea market finds combined with newer pieces.”  It is not difficult to see Chip and Joanna Gaines and their many imitators and admirers of farmhouse style as literally engaged in rummaging through and reassembling the past for present consumption, all while foregrounding particular normative conceptions of gender, race, sexuality, and nation.  

I appreciate your question about religious blogs and gender roles; it’s a complicated one and returns us to the question of the political potential of participatory culture.  Back in 2005 at an inaugural gathering of women bloggers, a controversy erupted as to the political potential of mommy blogs. As Lori Kido Lopez relates the story, one participant commented that “if women ‘stopped blogging about themselves they could change the world.’”  In response, blogger Alice Bradley declared that, in the context of the male-dominated world of blogging in which mommy bloggers were not taken seriously as writers, “mommy blogging is a radical act!”  Does such a statement recover the second wave feminist rallying cry that the personal is political, or does it reflect the postfeminist rhetoric of choice and pleasure and visibility as a substitute for organized political action?  

When it comes to the religious blogosphere, the question is refracted through the lens of authority and tradition.  On the one hand, I agree with you that the price of visibility in the blogosphere is too often the performance of “proper” gender roles and, I would add when it comes to religious blogs, the performance of institutionally or communally sanctioned religious belief and practice.  The religious blogosphere is, of course, incredibly diverse, and my observations here are confined only to the evangelical and LDS blogs I’ve studied.  Jen Hatmaker and other evangelical women have spoken about the presence of a “pink ghetto” in evangelical Christianity that limits women’s opportunities to participate in ministry or leadership roles, constraining them both offline and online into a “less threatening,” Instagrammable, “hey girl” performance of femininity, one that eschews taking stands on controversial issues.  When Hatmaker spoke out in support of same sex marriage, the furor around her (which she has said included death threats) was not just about the specifics of this particular issue, but about the fact that she’d chosen to, as a female influencer, speak publicly about it.  “Being on the wrong side of the evangelical machine is terrifying and punitive,” Hatmaker has said, so it is not surprising that most evangelical women are reluctant to take it on, and therefore that despite what I earlier mentioned as a “crisis of authority” within the tradition generated by social media, the possibilities of institutional change remain similarly constrained, at least for now.  Heidi Campbell’s analysis of 100 religious blogs reached a similar conclusion, finding that religious bloggers more often affirmed than challenged traditional sources of authority in their respective traditions, because bloggers’ online practices are so deeply embedded in and connected to their offline practices and beliefs.  

On the other hand, I find the kinds of examples of youth activism and political engagement that Henry Jenkins and his colleagues have collected in By Any Media Necessary to be inspiring.  I am particularly intrigued by the notion that in an era of increasing distrust in political organizations and institutions, political change has become, through social media, something that is part of everyday lives instead of just being confined to discrete events like organized rallies or protests.  Seen from that vantage point, moments of disruption in the blogosphere where political views are surfaced in normally carefully guarded apolitical, “everyday” spaces become very significant.  For example, in October 2016, Christian women’s ministry superstar Beth Moore, who had “spent her career carefully mapping the boundaries of acceptability for female evangelical leaders,” tweeted her outrage at then-candidate Donald Trump’s 2005 Access Hollywood tape, and the way that some male Christian leaders rushed to excuse it as “locker room talk,” to her 900,000 followers, arguing that the evangelical community needed to “wake up” to its pervasive sexism and its frequent willingness to overlook sexual harassment and abuse of women.  Blogger and author of Jesus Feminist Sarah Bessey started the hashtag #ThingsOnlyChristianWomenHear on a whim in April 2017 to “amplify the voices of women who have too often been silenced” in the church; more conversations, including #ThingsOnlyBlackChristianWomenHear, have followed, highlighting experiences of misogyny, sexism and racism.  These actions by Moore and Bessey and many others are helping to generate the evangelical community’s own #MeToo movement. 
One could also look at the way in which many religious (and secular) lifestyle bloggers and Instagrammers interrupted their daily posts this past April to denounce the Trump administration’s new “zero tolerance” immigration policy that resulted in separating parents from children at the border, explaining that as mothers themselves, they could not stay silent on this issue, and urging their readers to take action by calling their congressional representatives and donating to legal aid organizations for refugees.  Such disruptions are risky and controversial even among one’s followers, and generate both praise and criticism.  But perhaps, in the context of religious traditions which do not ordain women or where women’s leadership roles are tightly controlled and constrained, they represent small steps toward a kind of participatory politics grounded in the power they have as influencers and organically connected to their identities as women, mothers, and people of faith.    

Tisha Dejmanee is an Assistant Professor in Communication at Central Michigan University. She has authored several journal articles on issues where the fields of gender studies, popular culture, politics, new media and food intersect. Published work relating to the themes discussed in these posts includes “‘Food Porn’ as Postfeminist Play” (http://journals.sagepub.com/doi/abs/10.1177/1527476415615944) and “Consumption in the City: The Turn to Interiority in Contemporary Postfeminist Television.”

Deborah Whitehead is Associate Professor in the Department of Religious Studies and Senior Resident Fellow with the Center for Media, Religion, and Culture at the University of Colorado Boulder.  Her research focuses on intersections between religion and philosophy, gender, popular culture, and media in the U.S.  She is author of William James, Pragmatism, and American Culture (Indiana University Press, 2016) and several articles on James, religion, gender, media, and popular culture.  She is currently working on a second book on U.S. evangelicals and new media, forthcoming with Routledge.   

Popular Religion and Participatory Culture Conversations (Round 5): Whitney Phillips and Jason Bivins (Part 3)

Jason on fanaticism:




Let me propose a bit of a shift here in my final section. What can we learn about religion, technology, and identity from the Indianapolis 500? Several lifetimes ago, in 1997, a moderately successful radio host named Mike Pence had a bone to pick with the “mainstream media.” Pence, who, before becoming Vice President, would go on to serve in the House of Representatives as a proud ally of the Christian Right and the Tea Party, was fond of describing his show as “Rush Limbaugh on steroids.” And on May 23, 1997, Pence was irate that the local media was reporting diminished crowds at that year’s Indy 500. Sound familiar? Pence was riding a wave that was a long way from breaking, and still has not.

As I try to measure the distance between that year and this year, I haven’t found much that’s revealing in all the oceans of pieces about Mike Pence’s “servant leadership,” or the future of the “evangelical vote.” I have been thinking instead about the links between persecution complexes, religion, and crowds. The Trump era gives me a lot of data, after all, and new insights into earlier moments, some of which have to do with films and which help us think freshly about the over-determined category “fanatic.”

Think about a Trump crowd. Not just a rally but an inauguration, or a Charlottesville march. The slogans. The defiant embrace of a singular identity. Things we regularly describe as fanaticism, filled with the kind of all-or-nothing furor of a boozy NFL game. Trump publics understand themselves to be memories, and brands, and they come into being to the extent that they can posit a negative public, one that is collectivist, radical, and anti-religion. That’s a standard right-wing move dating back to the Haymarket riots and the first Red Scare. Against the brightly colored cords of memory stretching back to Boston Harbor, a dark history roils with “hate-filled” or “divisive” leftists: communists, antifa, or the Black Panthers. Other fanatics.

It’s worth noting how contemporary anti-left discourse is shaped by a few very specific cinematic imaginings of what it means to be an American, and who counts in public life (spoiler alert: it’s white people). Think here of Forrest Gump. In that film, Tom Hanks’ simpleton, a running Zelig, is morphed into one segment of historical footage after another, from an awkward meeting with John F. Kennedy to a talk show panel seated next to John Lennon to ping-pong diplomacy in China. Many filmgoers loved the corporatist nostalgia constructed here, though less was made of the film’s reactionary historical memory, where returning Vietnam vet Gump disdains hippiedom (what hath the counterculture wrought of his sweet Jenny?) and belittles the Panthers (“Sorry I had a fight in the middle of your Black Panther party”), portraying radical politics as officious, anti-feminist, and spiteful.

Gump was temporarily the poster child for Newt Gingrich’s “Contract with America,” a cynical manipulation of whitewashed American history and a vacuous celebration of a cultural community that will not stomach any boat-rocking. But somewhere between the post-apocalyptic enthusiasms of The Road Warrior and the Opus Dei-fueled The Passion of the Christ, whose fetish for historical “accuracy” manifested in the degree to which it could granularly document the suffering Messiah’s broken body like a Rocky film’s mauled, swollen faces, Mel Gibson got political. Or rather, long before viewers learning of his anti-Semitism understood that he was not simply play-acting “crazy” in the Lethal Weapon franchise, Gibson’s two Clinton-era “historical” films – Braveheart and The Patriot – were co-opted into American politics.

A year after the “Gingrich revolution” of the 1994 midterm elections, a widely-documented and discussed upsurge of (mostly) white male masculinity emerged in (mostly) suburban America. First was the increase in war-games as recreational activity, including not only Civil War reenactment or Renaissance Fair jousts but paintball, Green Beret cosplay, and prepper training at gun ranges, in the woods, or at separatist compounds, each sharing a concern that the “tyranny” of big government was no longer able to be checked by the integrity and wholesome good will of “the people.” Some of these ideas are as old as America, and they are also bedrock for the alt-right. But Braveheart framed them for millions of Americans, grumbling about “Hillarycare” or gangsta rap or NAFTA, the fictionalized slice of Scottish history standing in for any American viewer’s felt experience of hardship, of Embattlement. In time it became perhaps the most powerful template for what, in my current work, I call Life As Action Movie (LAAM).

Briefly, the film centers on Gibson’s portrayal of the Scottish hero William Wallace. It opens with scenes of kilted, brawny men laboring contentedly, attending fairs, falling for pale maidens, and performing manly feats. This carefully manufactured idyll prepares the viewer for outrage at the weaselly Robert the Bruce, playing both sides of the fence but ultimately giving comfort to the Crown, whose armies would rob sweet Scotland of her freedom. After one too many village raids and the execution of his wife, Wallace and his merry men enter into a kind of protracted guerrilla warfare, whose resonance with the 1990s anti-statist militia movements was as unmistakable as it was seldom remarked. Wallace himself is eventually tried and executed. At the movie’s conclusion, as Gibson grimaces and contorts on the rack of tyranny, a sniveling, foppish magistrate leans into his face and tells him he need only say “Mercy” to be spared a grisly death. The magistrate announces “The prisoner wishes to say a word,” whereupon Wallace, defiant to the last, wails “FREEEEEDOOMMM!!!”

The film depicts a national identity that takes shape via navigation of the freedom/tyranny dyad. Important to the appeal of the LAAM template is the caterwauling self-evidence of the idea of “freedom” as Gibson hollers it from his torture instrument. Freedom does not require elaboration. Freedom’s absence is torture. In this we detect an affective resonance with the fervor and violence of Trump crowds, with their collective effervescence, their violent rhetoric, and their isolationism. These crowds are unabashed in their rearward glance. Their vision of America, like the Scotland of the mind, is sentimentalized, homogenous, and very pious.

But what is the lack for which these crowds compensate? Perhaps, those sympathetic to Hillbilly Elegy tell us, it is the precarity of neoliberalism that has suburban crowds all angsty. Try again. It is fury and disbelief at the presence of other American bodies, whose legitimacy they cannot accept. Bodies whose difference inspires a fanaticism, much though academics will cluck their disapproval at so clunky and disdainful a category.

Fanaticism is what happens when Americans cannot – will not – think about what public life actually demands in an actual society. Inside this condition, a condition we must think and act our way beyond, words like “freedom” and “religion” produce an enthusiasm that deflects attention from the difficult projects of rethinking what a public is. Because what has become so agonizingly obvious in the Trump era is that the long con is the fiction that politics as such doesn’t matter. If government is only out to get us, and if politicians can’t be trusted, then we’d be better off not hoping for incremental improvements of law and policy but finding the one truly magnetic candidate who can vicariously fight for us. Crowd-sourced, mediated feeling is the remainder.

So the lack isn’t so much a lack of opportunity to register one’s voice in public, to hold representatives accountable to the coalescence of citizens in the polis. It is the felt lack of certitude fading into the future: the insulation citizens crave manifesting as naked isolation, and a porous isolation at that, continually pricked by the pain and hardships and desires of others, so insistently that we cannot tune them out. We compensate for these feelings by being louder than others.

What the “economic anxiety” crowd seem not to realize about the emotional intensities of religious identity – which is always also racial and gender identity – is that Trump is not some Hail Mary designed to get coal jobs back, no matter how many articles tell us so. The sense of crisis behind collective emotional shrieking, behind these shared protestations defending the flag and the cross, is always connected to the very neoliberalism it contests. The sense of permanent crisis is fundamental to capitalism, not only because of capitalism’s own precarity, or even just because of how unlikely it is that we make it (the oldest American dream), but because the crisis is the inverse of the promise that everything will work out. This promise demands the expression of the kind of surplus feeling on display among our fanatics, because we know it is false. In selling us the promise, neoliberalism sells us the very lack that demands that promise; it sells us the boredom and flatness of the secular so we will crave the enthusiasms that are packaged for us.

As Mel Gibson is stretched eternally on the rack, and Nazis march in Jefferson’s Charlottesville, and we develop apps to measure the coalescence of emotion on Twitter, we must wonder: what kind of American public sphere are we willing to defend? Despite all of the chatter holding that the Tea Party, and exurban Trump supporters, and even the alt-right form some sort of recrudescent populism – the “deplorables” simply articulating their “economic interests” against tone-deaf East Coast elites – the bodies in America’s streets since the November 2016 election confirm that an alternate national anthem could be The Clash’s “I’m So Bored With the USA.” Certain Americans enjoy the luxury of boredom, which is the phenomenological experience of the market sine qua non, since boredom keeps us yearning for the next fix, the next distraction from ourselves. But from Portland to D.C., from Ferguson to Charlottesville, from the stock exchange to the Bundy Ranch, America has also become another Clash tune: “White Riot.”



Whitney on fanaticism:

Questions about crowd size, persecution, and boredom all factor into my experiences with fanaticism online, particularly in the context of fanatical conspiracy theorizing, or fanatical shaming, or fanatical support for certain political candidates and causes.

The pressing issue for me in these cases, however, isn’t that participants actually are fanatics. It’s that often (and this recalls my previous response) it’s not clear who really is one and who’s pretending to be, for who knows what reason: humor or manipulation or hate or some idiosyncratic combination thereof. This brings us back, of course, to Poe’s Law, which my co-author Ryan Milner and I describe as a monster skulking in the darkness: “it’s always just standing there, menacingly,” we write in our book The Ambivalent Internet. In these cases, details like how many people are participating, who is doing so out of a sense of genuine persecution, and to what extent being bored, or at least having too much time on one’s hands, factors into the discussion often remain unknown and unknowable.

This doesn’t stop many people, most conspicuously journalists, from trying. Here’s an example: earlier this summer, I got a DM from a reporter at a large national publication looking to write a story about shaming culture online, and how it’s gotten completely out of control (given the nature and focus of my work, I do a fair number of these kinds of interviews). He pointed to the recent firing of James Gunn, whom Disney had hired to direct the next Guardians of the Galaxy movie. Some concerned citizen had pored through years of Gunn’s old tweets, because it’s 2018 and that’s what it means to be a person now, and discovered that Gunn had once tweeted a handful of pedophilia jokes. They were, obviously, gross. This resulted in OUTRAGE by THE PEOPLE, which is what triggered Gunn’s firing; and wasn’t this proof of PC and shaming culture running amok? The reporter wanted to get my take, i.e., wanted me to agree with him, with a smile, nod, and soundbite.

The problem was, this wasn’t real. Not exactly, not as it was being framed. Outrage had preceded Gunn’s firing alright, but as was quickly revealed (and as I suspected would come out, given so much precedent), that outrage was precipitated by extremist groups, who have for months, for years, orchestrated similar attacks on prominent liberals. I say the outrage was “precipitated” rather than “manufactured” here, because it’s not clear how many of the responses constituted coordinated astroturf, and how many were posted by individuals who didn’t know about the smear campaigns and who were genuinely disgusted by Gunn’s statements. This mix would have become even more difficult to parse after the early media coverage started rolling in, bringing untold thousands of additional participants into the fold.

The Gunn case was multi-pronged: not only did it target a specific individual, it played into the narrative--a favorite of far right extremists and more mainstream conservatives alike--that liberals are hysterical, hypersensitive snowflakes. Look at them scream bloody murder over a couple of jokes! (For a case that pulls exactly this page out of exactly this playbook, see this example from 2014; the same things keep happening, and happening, and happening, as I chronicle in the first part of this report).   

It’s not just smear campaigns that raise similar questions of motive. From Pizzagate to QAnon to countless narratives in between, the underlying issue--reiterating the above--is that observation is not confirmation online. There is often no way to parse what is sincere from what is satirical, what is deliberate disinformation and what is someone’s good faith effort to tell the truth, at least their version of it. Efforts to point at particular unfolding controversies and declare emphatically that this is an example of X are, as a result, almost guaranteed not just to misrepresent the story, but to do someone else’s media manipulation legwork. To what end? Unfortunately, the answer is often “who knows.”

It is certainly the case that there are fanatics on the internet. What is equally certain is that people perform fanaticism on the internet, for all kinds of reasons, to all kinds of effects. The result of not knowing very much, and worse, not even knowing what exactly we don’t know, is distressing. It means that everything we say grows monsters. At least it can--with that possibility always just standing there, menacingly.   



Conclusion:

Ultimately, I’m not sure I delivered on Whitney’s promise that I would provide any “religious studies grounding” in our conversations. But what’s clear to me, and I hope clear to readers, is that I was really energized by my engagements with Whitney and her incredible work. Together I think we’ve succeeded at delivering some snapshots of our own interests, from our distinctive perspectives, and, in doing so, at sketching the kind of collage portrait that can emerge through interdisciplinary jamming.

Contrast and combination often lead to fresh thinking. And even if they don’t, maybe they’ll provoke or annoy you in new ways. Pizzagate and Forrest Gump. The alt-right and James Gunn. Trump and trolling (okay, you already knew about that one). These combos and categories not only catalyze our thinking about religion and media, they mediate our exchanges with each other, too.

Whitney Phillips is an Assistant Professor of Communication, Culture, and Digital Technologies at Syracuse University. She holds a PhD in English with a folklore structured emphasis (digital culture focus) from the University of Oregon, and an MFA in creative writing from Emerson College. She is the author of 2015's This is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture (MIT Press), which was awarded the Association of Internet Researchers' Nancy Baym best book award. In 2017 she published The Ambivalent Internet: Mischief, Oddity, and Antagonism Online (Polity Press), co-authored with Ryan Milner of the College of Charleston. She is also the author of the three-part ethnographic study "The Oxygen of Amplification: Better Practices for Reporting on Far Right Extremists, Antagonists, and Manipulators," published in 2018 by Data & Society. She is working on a third book titled You Are Here: Networked Manipulation in the Digital Age.

Jason C. Bivins is a Professor of Religious Studies at North Carolina State University. He is a specialist in religion and American culture, focusing particularly on the intersection between religions and politics since 1900. He is the author of Spirits Rejoice!: Jazz and American Religion (Oxford, 2015), a Choice Outstanding Academic Title of 2015. He has published most actively in the area of U.S. political religions, the subject of his first two books, Religion of Fear: The Politics of Horror in Conservative Evangelicalism (Oxford, 2008), a Choice Outstanding Academic Title of 2008, and The Fracture of Good Order: Christian Antiliberalism and the Challenge to American Politics (UNC, 2003). He is currently working on his next monograph in political religions: Embattled Majority, a genealogy of the rhetoric of “religious persecution” in public life. He is also writing about Jack Kirby, the “King of Comics,” for Penn State Press’ Religion Around series.




How Do You Like It So Far? Podcast: Anushka Shah on Civic Entertainment

In this episode, we get the chance to talk with Anushka Shah, a researcher at the Center for Civic Media, MIT Media Lab. More recently, she has started a project called Civic Entertainment that explores the intersection of civic engagement with television, radio, digital entertainment, and film. This project researches the media effects of fiction on thought and behavior change and explores how the methods of social change available to citizens can best be represented in entertainment media. It also investigates the representation of protest and activism in current popular culture. She also runs a production studio in Mumbai called Civic Studios that creates civic entertainment content for Indian audiences. Shah tells us about the inspiration she and other Indians have drawn from popular media, and how she brought civic participation and entertainment together. She helped to organize a conference on Civic Entertainment held recently at the Godrej India Culture Lab.

To learn more on civic entertainment in India, check out this account of the activism around Rang De Basanti and this account of the mobilization around Indian Pop Idol.

Next week, we are joined by Rohan Joshi from All India Bakchod, which performs “news comedy” in the Indian context. We will share a portion of Joshi’s remarks at the Godrej Civic Entertainment event, where he spoke about Captain America as a model for Indian civic participation, and we discuss news comedy as a form of civic entertainment.

And, by the way, how do you like it so far?



Popular Religion and Participatory Culture Conversation (Round 5): Whitney Phillips and Jason Bivins (Part 2)






Jason on fear:

Every so often you write things that you secretly hope will become outdated. We want people to read our stuff, of course. But when scholars write, as I do, about religio-political formations that they find deeply challenging and in places anti-democratic, it’s not unreasonable to hope that the culture will change in ways that steadily reduce their presence and influence. Ten years ago I published a book called Religion of Fear, in which I posed the question: how did impulses and ideas that lurked at the margins of cultural discourse in the 1960s become mainstream by the 2000s? My answer was to look at the influence of several different cycles of evangelical popular entertainments, specifically at their fearful depictions of American politics.

Clearly, the movement of the outlandish and the fearful into the mainstream isn’t finished yet. Who can even keep up? Fervid imaginings of the Deep State, sinister secularists, or radical leftists who want to abolish Christianity through mind control – all this is so commonplace now that we wonder if we’re losing our collective ability to be outraged by such claims. By talking about the strange normalization of outrage, catastrophe, and horror – not Hurricane Maria or the day’s latest mass shooting by a white male – we’re focusing on the ascendance of a particular religious response to “the fucking new.”

While we live in the thick of progress narratives – the providential variety, or the secular pluralist variety most prominently – we find that to engage religion in America is necessarily to be confronted with fantastical and fearful narratives. And these narratives – whether mass entertainments like The Walking Dead or Left Behind, or more subcultural fare – depend on symbolic violence done to innocent and guilty bodies alike. In my own work, I explore the eroticism that’s part of these imaginings, and the attractions of demonology generally. But there are also more overtly political implications.

In his Introduction to Metaphysics, Martin Heidegger described violence as an ontological breaking into and breaking down of the “communal World” as it appears to custom: “This act of violence, this decided setting out upon the way to the Being of beings, moves humanity out of the hominess of what is most directly nearby and what is usual.” We might think of this passage when reckoning with religious enthusiasm, and with how the production of disturbing fear-talk is a sign of civic disaffection in a time of political crisis. But that hardly accounts for the real violence of Pizzagate or Charlottesville, does it?

Perhaps because of that intellectual limit, I’ve found myself turning more often in my current work to fiction as a way of explaining public life. In “The Canterbury Pilgrims,” Nathaniel Hawthorne explores the American propensity wherein “a cold and passionless security be substituted for human hope and fear.” And in “Passages from a Relinquished Work,” he meditates self-referentially on Cervantes, the first novel, and thus the opening of modern self-reflection on some level, preoccupied from his authorial position in mid-19th century America with Quixote’s and Panza’s tacking between “auguries” and “anxieties.” How might these Hawthornean notions capture the strange comforts of fear in a time that is actually so decidedly scary?

They do so partly by keeping things in our head, those things which in frightening us so deeply confer on us a real vitality that boring old neoliberal democracy can’t. And they also allow for the more obvious pleasures of vicarious killing, as with all those Obama puppets in nooses. The ceaselessness of the chyron becomes the fuel for the trapping and the expulsion alike. Consider John Edgar Wideman, who wrote in his novel Fanon about how “Three new stories in the news catch my eye – faith-based prisons, cell phones with tracking chips, a man arrested for raising a tiger and an alligator in a Harlem apartment. The same story really. The Big Squeeze at both ends, so nothing left alive inside people’s heads.” American politics takes shape in the relationship between death and life, fantasy and the Other.

We can’t make sense of this condition simply by pointing to religious fear as reason’s other. What seems salient to me is not just the moral urgency of fear, which is the obvious point, but also its epistemological urgency. What is at stake is not simply identity, or moral principle, but a form of knowing, of filtering out ambiguities, of rescuing a message from the depth dimensions of language. The vibrancy of the fearful and the apocalyptic comes through its assessment of what it identifies as evidence. This doesn’t simply mean theological claims that, for example, ISIS rockets are biblical plagues of locusts or that Hillary was (is?) the Whore of Babylon. More than this, the emotional focus of fear, its necessary urgency, comes through moments of exposé that we convince ourselves confirm our righteous outrage. The exposure of hypocrisy and secrecy, the proof that the monster everyone else denied is really there, gives us authenticity in precisely those places where the self – and thus the world – begins to wobble just a bit.

If Foucault was right, over a half century ago, that the apocalyptic is “the world’s old reason engulfed,” what if we confront seriously the idea that many millions of Americans find excitement and promise in a glorious, Action Movie burning out rather than the insignificance of ordinary unhappiness?

Whitney on fear:

When I was a graduate student (I earned my PhD in 2012), I used to joke on conference panels that I was the Darth Vader of the conversation--everything I studied was always such a bummer (understatement), from Facebook memorial page trolling to “media fuckery” (as participants then giddily called it) to a range of identity-based antagonisms on and around 4chan’s /b/ board. Since then, my work has taken on an even more ominous cast, as my focus on media manipulation and online antagonists--I don’t use the word “trolling” to describe any of this behavior, for reasons I articulate regarding Donald Trump here--has brought me into the orbit of online extremists and others committed to weaponizing information, sowing discord and mistrust, and generally undermining participatory democracy.

So I’m pretty well positioned to say, and I do not think this will come as a surprise to anyone, that there are a lot of things to be afraid of on the internet, from harassment to extremism to manipulation to hoaxes to mis- and disinformation and all the ways those things can have an immediate, embodied, irreversible impact on people’s lives.

Much of this fear--or at the very least, this loathing--stems from a rejection of the lives, worldviews, and behaviors of (those who are seen as) bad others. This is an understandable, indeed I would argue natural and appropriate, impulse when considering the violence and harm enacted by white nationalists and supremacists, and by others whose sole motivation is indeed to watch the world burn. But the reaction to bad others can be just as visceral when the “badness” of those others is debatable; when it’s not fear of being harmed, traumatized, or dehumanized, but fear of being...disagreed with, or asked to take responsibility for one’s own actions and choices. (I’m looking at you, anyone who’s tried to disappear into the wallpaper with the excuse that “I was just trolling” when called out for harmful behavior.)

Wherever the fear--and/or loathing--may originate, these are always instructive moments. At least, that’s what I tell my students in the Cross-Cultural Monsters course I’m currently teaching. To do so, I refer to Mary Douglas’ exploration of dirt and taboo, and how ideals about what constitutes the clean, or pure, or normal are the logical preconditions of any declaration that something is dirty, or tainted, or aberrant. Focusing on what that bad thing is--or what that bad thing is regarded as being--provides immediate insight into what that culture or community values, believes, and normalizes. Monsters, in short, help us understand who the upstanding citizens are. (I used this framing to explore subcultural trolling from around 2007 to 2014, with the added complication that, hmm, the same people standing in mainstream quarters--with a particular focus on journalists at Fox News--lamenting the existence of trolling were often...doing the same things as trolls, certainly in the trolls’ and journalists’ mutual exploitation of sensationalist news stories and racial/racist tensions during the Obama era, raising the question of exactly what people were criticizing when they criticized trolls.)

I bring this up, first, to offer a means of taking inventory of what norms people are pointing to, privileging, or otherwise reifying by spotlighting the badness of others--however well-deserved that designation may (or may not) be.

I also bring it up as a segue to what I fear the most online. White nationalists, supremacists, abusers, manipulators--they are all high on my list, with the “loathing” quotient very well represented. But occupying its own special category is the fact that online, because of Poe’s Law, because of context collapse, because of rampant decontextualization, we often have no way of knowing what kind of monster we’re dealing with. A person spreading outrageous, harmful conspiracy theories might genuinely believe them. They might be trying to mess with reporters. They might be a Russian disinformation agent. They might be part of a computational propaganda effort, of which they might or might not even be aware. They might not be a person at all. Crafting an effective response--Do you debunk? Do you ignore? Do you call the FBI?--hinges on knowing which is which. But more often than not, we can’t know--complicated by the fact that when several, or dozens, or hundreds, or thousands, or millions of people are spreading the same information, there are several, or dozens, or hundreds, or thousands, or millions of possibilities as to why. Even the most effective intervention for some could do nothing for others--or could backfire. Could create entirely new categories of monsters.

Like all fear, mine is revealing. It’s also confusing. The fear, the aberration, is uncertainty, and the discomforting fact that observation is not confirmation online. That would make certainty and empiricism the norm--but are they? Were they ever? Has that always just been wishful, privileged thinking? Is this what the world is actually like? Whatever the answer is, I do not know what to do about it. And that scares the hell out of me.

Whitney Phillips is an Assistant Professor of Communication, Culture, and Digital Technologies at Syracuse University. She holds a PhD in English with a folklore structured emphasis (digital culture focus) from the University of Oregon, and an MFA in creative writing from Emerson College. She is the author of 2015's This is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture (MIT Press), which was awarded the Association of Internet Researchers' Nancy Baym best book award. In 2017 she published The Ambivalent Internet: Mischief, Oddity, and Antagonism Online (Polity Press), co-authored with Ryan Milner of the College of Charleston. She is also the author of the three-part ethnographic study "The Oxygen of Amplification: Better Practices for Reporting on Far Right Extremists, Antagonists, and Manipulators," published in 2018 by Data & Society. She is working on a third book titled You Are Here: Networked Manipulation in the Digital Age.

Jason C. Bivins is a Professor of Religious Studies at North Carolina State University. He is a specialist in religion and American culture, focusing particularly on the intersection between religions and politics since 1900. He is the author of Spirits Rejoice!: Jazz and American Religion (Oxford, 2015), a Choice Outstanding Academic Title of 2015. He has published most actively in the area of U.S. political religions, the subject of his first two books, Religion of Fear: The Politics of Horror in Conservative Evangelicalism (Oxford, 2008), a Choice Outstanding Academic Title of 2008, and The Fracture of Good Order: Christian Antiliberalism and the Challenge to American Politics (UNC, 2003). He is currently working on his next monograph in political religions: Embattled Majority, a genealogy of the rhetoric of “religious persecution” in public life. He is also writing about Jack Kirby, the “King of Comics,” for Penn State Press’ Religion Around series.

Popular Religion and Participatory Culture Conversation (Round 5): Whitney Phillips and Jason Bivins (Part One)

Whitney:

Because this blog series on popular religion and participatory culture could be taken in so many directions, and because Jason’s and my respective backgrounds go in several of those different but overlapping directions (his work focusing on religion, politics, and culture in the US; mine on online antagonism, digital folklore, media manipulation, and digital ethics), it took us a few exchanges to decide how we would approach the conversation. The thing is, while the subjects of my research have certainly engaged with religion and religious people, often disparagingly, sometimes violently--I’m thinking in particular of the anti-Semitic and Islamophobic elements of far right extremism--I am not a scholar of religion. So how to combine our interests? Ultimately we decided to reflect on three concepts prominent within the study and practice of religion, and just as prominent, though not always as directly acknowledged, in media and communication studies: faith, fear, and fanaticism. Jason will jump right in with faith, and lead each subsequent exchange as well, to provide the religious studies grounding needed for context.

Jason on faith:

I’m very excited to have been invited to participate in this forum. Aside from my admiration for the scholars convening the participants, my own scholarship on political religions has long benefited from cross-disciplinary exchanges. But as I think about my own interests in relation to Whitney Phillips’ fantastic thoughts on internet trolling and monsters, I wonder (as I often do) just what the disciplinary formation of Religious Studies really is. In a typical anthology or scholarly panel, such uncertainty is very nearly boilerplate. We spend an awful lot of time reminding each other that “religion” is a constructed category shaped from multiple biases and presumptions, that the field’s emergence bears the imprint of colonialism, that there is no settled method, and so forth.

As someone interested in public life and politics, I don’t find these self-inventories uninteresting. But they don’t exactly stimulate fresh thinking for me, especially in terms of my longstanding interest in making sense of religious discontent with American politics, that changing same. I’ve always tried to look beyond standard formations – party, denomination, lobby – out of a sense that those framings conceal the big story, the big shapes, as occluded as these sometimes can be. Media Studies (especially work at the intersection of affect and technology) has provided me with much more stimulation in the last decade-plus, particularly as I have focused in on religion’s emotional articulations in public discourse (in crowds, online, in entertainments).

Stodgy, placeholding terms like “faith” conceal emotional multitudes. I’ve spent an awful lot of time thinking about fear and outrage, which Whitney and I will jam on. Here, though, I want to spend a bit more time thinking about just what my subject is, and how it might relate to Whitney’s.

I was once obsessed with the HBO series Deadwood, which – in its institutional histories woven into melodrama – did for the late nineteenth-century West what The Wire did for post-industrial Baltimore. Stunned by the irresistibility of what the character Charlie Utter called “amalgamation and capital,” the residents of Deadwood strove to make sense of the new technologies – mining tools and pumps, railroads, and telegrams – that now produced and ordered their lives. In one memorable scene, the sniveling worm of a mayor, E.B. Farnum, panicked at the presence of George Hearst in Deadwood, complaining to local godfather Al Swearengen, “It’s Hearst. Hearst! Is he Caesar, to have fights to the death for diversion? Murder his workers at whim? Smash passages in the fucking wall? A man of less wealth would be in fucking restraints.” Swearengen replied soberly, “We’re in the presence of the new.” Farnum: “Fuck the fucking new! Jesus Christ, Al. Is it over for us here?”




This scene stuck with me as a way of thinking about how not just religious selves but modern selves and categories are formed in response to moments of overwhelmedness by industry and technology. I’ve found it suggestive to think about religion as a medium of experience and an object of mediation since, as I try to chase down in writing elusive articulations of “faith,” the weight of media and technology is in many ways the weight of language, its power and purpose only partly a functional one. The steam-powered train, the telegraph, constellations of data we imagine grouped together into something called a market, the tweet, all these technologies are so palpable and sensate and yet take us into realms of experience where the empirical is burdened, into realms we might think of as religious in their ineffability – that disembodied voice, those unseen gears, those swirls of paperless exchange.

The power of these collective media can shape the freestanding law of American jurisprudence, the “sincerely held belief” that makes for the generic American’s generic religion, or the crowd-sourced emotion, viral assertions, or chains of distraction that make for our online lives. My interest in faith, then, has been consistently interwoven with studies of politics and discipline and difference. Like many others, I return to Foucault regularly when considering such matters, specifically the 1976 lectures introducing ideas of biopower in a discussion of dispersed populations and industrialization. With biopolitics, Foucault says, we move beyond the disciplinary apparatus and even the surveilling apparatus of the early modern into a period where power moves through not just new bodies but new mechanisms: “forecasts, statistical estimates, and overall measures. And their purpose is not to modify any given phenomenon as such, or to modify a given individual insofar as he is an individual, but, essentially, to intervene at the level at which these general phenomena are determined.”

This very generality and abstraction is what places a greater premium on description, on language, on representation for life, literally the making lively of things, the animation, the em-powerment that is ubiquitous in an era of light and movement and circulation. One key to understanding the power that “faith” holds for Americans – aside from symbolic capital or lobbies – is its ability to make us forget our own role in its production, our own imaginative media, or our reliance on its very self-evidence as a category. That opacity, that everywhereness, is what allows it to do such powerful emotional and political work.


Whitney on faith:

The relationship between faith and technology is one I’ve reflected on often, though maybe not with an eye towards transcendence. Or maybe...sort of, as these issues don’t just rear up in response to the latest political crisis, but speak to deep existential, epistemological, and even ontological concerns. What it means when we lose a shared sense of reality; what it means when every single day is filled with so much chaos, so many conflicting truth claims; what it means when it’s just not clear what any of it means, and therefore, what any of us should do in response.


This is where faith comes in, particularly when considering responses to the spread of mis- and disinformation across social media, frequently described as “fake news” or “alternative facts” or other terms indicating that oh god, we’re really in trouble here. The call from many, particularly within journalism, the technology sector, and education, is to ramp up media literacy efforts: to check facts, verify sources, and evaluate the overall truth value of content. That approach, Alice Marwick argues, has within certain circles emerged as a kind of “magic bullet” theory for the digital age (the “magic bullet” refers to an early, frequent straw-person theory associated with interwar anti-propaganda efforts; it maintains, rather morbidly, that media messages go straight into a person’s brain, without any critical reflection or individual agency to stop them, rendering audiences perfectly and immediately under the propagandist’s control, which is not how media works--but that’s a different conversation). As Marwick shows, such efforts implicitly assume that truth is a corrective unto itself, a position that dovetails with the related assumption that the underlying problem is the public’s lack of exposure to facts. On this view, if we could just expose people to what’s true, our problems would be solved.  

Let me be clear: truth is a good thing. I like it very much (as do other scholars who critique traditional media literacy models; no one is saying that truth doesn’t matter, or that actual things in the world are somehow overrated). Critical thinking is also a good thing, as is close textual analysis, and source verification, and assessment of bias, and careful fact checking. All of it is good, in theory.


But in practice, that doesn’t mean it works, or that such efforts serve as a one-size-fits-all solution to the spread of polluted information.

The reason they don’t, not universally anyway, is that fact value--whether something is objectively, verifiably true--isn’t always why someone chooses to engage with or share something. Consider, for example, this 2016 Pew Research study, which revealed that 14% of survey respondents admitted to sharing stories they knew to be false. That’s not a problem of fact checking. That’s a problem of people having their own reasons for sharing particular kinds of content, from the desire to make their friends giggle to the desire to watch the world burn to anything and everything in between.

In the same vein, whether something is objectively, verifiably true isn’t always why someone puts their faith in something. I’m not restricting “faith” to religious experience, here; a person can have faith in institutions (or not), or faith in journalists (or not), or faith in the government (or not), or faith in each other (or not). Where one puts their faith--or not--isn’t something that cold hard facts can necessarily penetrate. The effort to counter faith with facts might even backfire. As Marwick argues--and she’s not alone; see the work of Lewandowsky et al., Tripodi, and boyd, among others--correcting an untruth isn’t just unlikely to penetrate the underlying belief structure. It may ultimately entrench the belief.

In short: my facts about something might just make your faith in the opposite thing stronger. Where do we go from there? What do we do when efforts to bridge epistemological gaps ultimately risk burning those bridges?

I don’t know. It’s a matter of faith, and faith is also how we got here.

Whitney Phillips is an Assistant Professor of Communication, Culture, and Digital Technologies at Syracuse University. She holds a PhD in English with a folklore structured emphasis (digital culture focus) from the University of Oregon, and an MFA in creative writing from Emerson College. She is the author of 2015's This is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture (MIT Press), which was awarded the Association of Internet Researchers' Nancy Baym best book award. In 2017 she published The Ambivalent Internet: Mischief, Oddity, and Antagonism Online (Polity Press), co-authored with Ryan Milner of the College of Charleston. She is also the author of the three-part ethnographic study "The Oxygen of Amplification: Better Practices for Reporting on Far Right Extremists, Antagonists, and Manipulators," published in 2018 by Data & Society. She is working on a third book titled You Are Here: Networked Manipulation in the Digital Age.

Jason C. Bivins is a Professor of Religious Studies at North Carolina State University. He is a specialist in religion and American culture, focusing particularly on the intersection between religions and politics since 1900. He is the author of Spirits Rejoice!: Jazz and American Religion (Oxford, 2015), a Choice Outstanding Academic Title of 2015. He has published most actively in the area of U.S. political religions, the subject of his first two books, Religion of Fear: The Politics of Horror in Conservative Evangelicalism (Oxford, 2008), a Choice Outstanding Academic Title of 2008, and The Fracture of Good Order: Christian Antiliberalism and the Challenge to American Politics (UNC, 2003). He is currently working on his next monograph in political religions: Embattled Majority, a genealogy of the rhetoric of “religious persecution” in public life. He is also writing about Jack Kirby, the “King of Comics,” for Penn State Press’ Religion Around series.





Popular Religion and Participatory Culture (Round 4): Tisha Dejmanee and Deborah Whitehead (Part 1)


 



Deborah:

I’m grateful to Henry Jenkins, Diane Winston, and Sarah McFarland Taylor for organizing this series of exchanges and inviting me to be part of it.  I’m grateful too, Tisha, for the opportunity to learn more about your work; we definitely have many overlapping areas of interest!  I want to focus on three in particular here:  (1) the idea of food preparation as “women’s work” and the ways that female food bloggers both are constrained by this, as you note, but also play with it; (2) the intimacy and sense of community found in blogging, both of which play into the notion of authenticity that is so valued, and commodified, in the blogosphere; and (3) how the concept of participatory culture has been helpful in our work, and how it relates to the study of religion – a thread running through the other conversations so far that I’ll also weave into my thoughts here.  In my own work, I’ve found the concept of participatory culture enormously helpful for thinking about the ways that communities are formed, and unformed, around personal religious blogs, and the values and practices that underlie these processes.   

 

I’m intrigued by your focus on the intricate relationships between gender, labor, and community in a postfeminist context – one feature of the “ambivalence” you describe. In my own work on food storage blogs, which exist at the busy intersection of food blogs and mommy/parenting blogs, I have written about the idea of food preparation for an emergency situation as “women’s work,” specifically as maternal labor.  “Survival is a mom’s job,” proclaims blogger Lisa Bedford, a natural extension of maternal care for one’s family both now and in the future.  These self-proclaimed “survival moms,” mostly U.S. working- and middle-class white women, are on the one hand constrained by a sense of pervasive anxiety about contemporary conditions over which they have no control – prepper culture has exploded in the aftermath of 9/11 and the 2008 economic recession and housing crisis.  They feel they have no choice but to respond, protecting their families and preserving their lifestyle by learning to prepare and store massive quantities of food to have on hand in the event of an emergency.  One could argue that political and economic instability has caused these women to “lean in” to traditional gender roles as a means of survival, generating an idealization of and nostalgia for “women’s work” and for the specific skills of their grandmothers and great-grandmothers, who scrimped and saved and gardened and canned to get their families through the deprivations of the Great Depression and World War II.  Many of these survival moms also turn to the practices of food storage and emergency preparedness that have been part of the Church of Jesus Christ of Latter-day Saints from its beginnings, not by converting but by consulting LDS (popularly known as “Mormon”) mommy blogs and church resources to gain this specialized knowledge.  
On the other hand, this retrieval and romanticization of the past, of the LDS pioneer ethic, and of food preparation and storage as maternal labor is often celebrated in (post)feminist terms that the survival moms’ grandmothers would not recognize – food storage provides “confidence” and “empowerment” and “expertise,” enabling women to “worry less and enjoy life more,” as Bedford puts it.  I definitely see a set of ambivalences there as well, and I am wondering if you can say more about what you see as the “ambivalence that more broadly characterizes postfeminist popular culture” and how it relates to the “political potential of food blogs” that you seek to theorize.  

 

I absolutely agree with your point about how in the enterprise of blogging about domestic spaces, “intimacy becomes a lucrative commodity that is used to connote trustworthiness and authenticity by corporate advertisers.”  In my own work on mommy bloggers, I’ve explored how authenticity is a “symbolic construct” central to self-branding (Banet-Weiser) as well as to the creation of new forms of community based on a blogger’s gradual self-revelation.  I’ve also been influenced by the work of several scholars (Heidi Campbell, Pauline Cheong, Mia Lovheim, Paul Teusner, Lori Kido Lopez, May Friedman, and others) in exploring how blogging generates new notions of religious authority, religious practice, community, gender identity, motherhood, self-branding, and so on.  But I’ve also been very interested in the dark side of this dynamic, and in what happens when bloggers are exposed as frauds or as having been less than fully forthcoming about their domestic lives. When a blogger’s credibility and readership are based on a presumption of honest self-disclosure, and that trust is called into question or shattered by new revelations, a variety of consequences ensue. In some cases, the result is very much like a religious deconversion process:  readers tell stories following a familiar narrative arc, establishing initial strong commitment and devotion followed by gradual loss of faith, that resemble evangelical deconversion narratives in striking ways.  Though practice is certainly a key part of their devotion (online and offline participation), these narratives, like evangelical deconversion narratives, tend to emphasize the centrality of belief: a reader initially “believed in” a particular blogger but now they do not, and they are left to process the shame, guilt, anger and sadness that result from this loss of faith.  
I think a similar narrative arc is evident in the “Get a Life!” SNL sketch that Sarah McFarland Taylor and Henry Jenkins discussed in an earlier post, which portrays the dramatic threat to fan devotion that occurs when the object of that devotion turns out to be all too human.  As Henry has pointed out, the sketch also repeats a familiar narrative of caricaturing fans; I saw the same kinds of critiques leveled at my disillusioned blog readers. Critics called them gullible, pathetic losers; “how could you believe so strongly in someone you didn’t even know?” they scornfully asked.  “I drank the Kool-Aid,” came the woeful response, a direct reference to the 1978 Jonestown tragedy.  Parallels between religious devotion and fan devotion abound in the stories we tell about them and the ones they tell about themselves. In such situations it becomes difficult to disentangle the moral and religious dimensions of authenticity, or even to maintain that such a distinction is a meaningful one:  which is the greater offense, lying to one’s public or being a false prophet (or deity)?  Charles Taylor (1992; 2007) has called authenticity the “moral ideal behind self-fulfillment,” giving rise to a new sense of the self as both private and public and to a new form of “expressive individualism” in which each of us inhabits spaces of “mutual display,” not only consuming but also creating to and for others.  This seems directly related to the elements of participatory culture as Jenkins has developed it:  the awareness of new publics, new forms of networking, identity creation and play, and community.  I’m curious to hear more about how you find the concept of authenticity helpful in your own work.  

 

Tisha:

 

When I first moved to the United States seven years ago, a simple online search for a recipe launched me into the world of food blogs – texts that combine personal narratives with recipes and high-quality food images. They are often created by individuals who blog from their households and narrate the mundane details of daily life to their unseen audience as though they are dear friends. In my research, I focus particularly on female bloggers – who comprise the majority of food bloggers – to analyze the ways this media genre illuminates ideas about labor, pleasure, and community. 

The creative and technical skills that are nurtured within the food blogosphere are apparent in tangible cultural goods such as cake pops, food porn and the extensive homewares line produced by The Pioneer Woman and Walmart. However, I also view the creative potential of food blogs as situated in their performance of postfeminist subjectivities that acknowledge both the pleasures of this idealized femininity as well as its internal contradictions. On the one hand, food blogs often exaggerate a girlie femininity that is sweet, infantile and down-to-earth. This takes place through a relentlessly cheerful narrative tone that Amanda Fortini (2011) describes as containing “no serious conflict, no controversy, no cynicism, no snark”; an aesthetic preference that has yielded trends such as unicorn food and gourmet sprinkles; and the centrality of being a wife and mother, as food blogs glamorize these traditional identities. It is this performance of effortless and innate girlie femininity that leads bloggers to insist that their multi-layered cakes and meticulously photographed meals are amateur creative projects. On the other hand, I see this exaggerated performance of sweet femininity as strategic. It is, after all, an appropriation of the aspirational language that lifestyle media has long directed toward women. Now women are using food blogs to re-deploy these commercialized tropes to attract advertisers, who have increasingly begun to partner with bloggers to mine the ever-growing audience of the lifestyle blogosphere. 

This dynamic raises two issues that I think are relevant to participatory culture here. The first is the question of access to participation. In my above generalizations about food blogs, and my descriptions of the determined normativity cultivated by the blogosphere, I do not wish to suggest that rich examples of subversive, radical and alternative food blogs do not exist but, rather, that such examples are rarely rewarded within the hierarchies of visibility and profit within the blogosphere. Although it is true that there is some diversity within the food blogosphere, these user-generated texts remain much more homogeneous than might be expected. The second is how to respond critically to an example of participatory culture that so faithfully reproduces, and is often seamlessly reabsorbed into, the high production values and commercial orientation of mainstream lifestyle media. 

I suggest that in addressing these issues, as well as in making sense of the political potential of food blogs, it becomes necessary to embrace the ambivalence that more broadly characterizes postfeminist popular culture. For example, food blogs draw on the cultural trend romanticizing traditionalism and DIY culture (Matchar, 2013) to showcase and circulate gendered, domestic foodwork that has long been devalued and rendered invisible. At the same time, the professionalization of the food blogosphere has led to the normalization of time-, labor-, and skill-intensive meals that are often portrayed as everyday domestic fare. As another example, these texts can be understood as an archival practice, as they are forms of women’s autobiographical life-writing that are told through the materiality and rhythms of food. Yet, this personal life-writing is often co-written through the networked participation of readers in forms that run the gamut from supportive comments to hate blogs to the whims of digital platform design. 

Ultimately, I turn to the spirit of networked community that is so important to both food and food blogs as a way of clearing a path through this ambivalence. It is true that the particular way that food blogs encourage individuals to provide personal testimony of the lived, gendered experiences of domestic life is intertwined with a “neoliberal moral framework, where each of us has a duty … to cultivate a self-brand” (Banet-Weiser, 2012, p. 56), and that within this framework intimacy becomes a lucrative commodity that is used to connote trustworthiness and authenticity by corporate advertisers. While we have long grappled with a sense that corporate messages are manipulative in the ways that they are explicitly designed to be persuasive, I am nevertheless struck by moments in the blogosphere where the depth of ties within this community is made salient. This has taken place at moments of tragedy within the lives of bloggers, as when blogger Jennie Perillo unexpectedly became a widow with two young children or blogger Lindsay Ostrom experienced the loss of her premature baby at 23 weeks of pregnancy. The community support at these moments has been material and emotional, demonstrating the ways that the virtual friend-audience is mobilized in times of crisis. On a more everyday level, I think about the value of recipes to support special needs such as hypoallergenic and specialty diets that are now made freely available, and digitally indexed and searchable thanks to the knowledge-exchange practices of the blogosphere.

It is clear that food blogs are not utopian digital spaces. Successful food bloggers are often privileged and require substantial existing resources to be influential within the increasingly competitive blogosphere. The conventions of the genre do not support explicit political discussion. However, in the same way that cultural studies scholars advocated for the ability of the audience to critically read popular culture texts, I support the notion that bloggers and blog readers can engage in pleasurable and meaningful, food-centered dialogue and identity play without losing sight of the ways that corporate logics underwrite the spaces and values of the blogosphere. 

I have to admit that I am another media scholar that has not spent a significant amount of time thinking about how religious studies intersects with and could build on my work – although I have noted that a significant proportion of bloggers share their religious affiliations, I have tended to subsume this information within the broader secular terms traditionalism and conservatism in my analysis of the blogosphere. Debbie, I’m very much looking forward to hearing more about your research interests and seeing how you approach similar themes! 

Dr. Tisha Dejmanee is an Assistant Professor in Communication at Central Michigan University. She has authored several journal articles on issues where the fields of gender studies, popular culture, politics, new media and food intersect. Published work relating to the themes discussed in these posts includes “‘Food Porn’ as Postfeminist Play” (http://journals.sagepub.com/doi/abs/10.1177/1527476415615944) and “Consumption in the City: The Turn to Interiority in Contemporary Postfeminist Television.”

Deborah Whitehead is Assistant Professor of Religious Studies at the University of Colorado in Boulder, where she teaches courses on religions in the United States, Christianity, and critical-theoretical approaches to the study of religion, gender, and culture. Her research interests are centered in Christianity and U.S. culture from the late nineteenth century to the present. She is currently preparing a book on the American pragmatist tradition as well as participating in a Ford-funded project called "Finding Religion in the Media" through CU's Center for Media, Religion, and Culture, focusing on U.S. evangelicals' uses of new media. Her current work focuses on issues of historicity, narrative, identity, the visual, and authenticity in new media.