Reflections on Media in Transition 5

This entry is a stub. My goal here is to create a space where people who attended the Media in Transition conference this weekend can share their perspectives on what worked or didn't work during the event, and also offer suggestions about what they would like to see at Media in Transition 6, two years from now. This year's focus on collaboration, creativity, and appropriation emerged from discussions among conference participants at Media in Transition 4. We were especially urged to develop themes that would allow more participation from artists, educators, lawyers, activists, and policy people, and I am happy with the ways that this year's conference did attract more non-academics into the mix. So far, at the closing session, there was a call for greater emphasis on historical perspectives, which have long been a hallmark of the Media in Transition events but which were under-represented this year. There was also a desire for more critical or skeptical perspectives on media change and, as always, for more effort to ensure the diversity of the mix of speakers at the event. And finally, we were urged to reach out to librarians and archivists, who have special roles to play in preserving the past even as they are involved in ensuring the circulation of culture. These were all great insights, but I am sure there are other ideas out there we should collect while the conference is still fresh in everyone's minds. So, fire away. But keep in mind that to some degree our ability to draw in these other groups will depend on your outreach in your local community. So talk up the conference and help us identify people you know who should be in the mix next time.

The plenary events are already available as podcasts.

Folk Cultures and Digital Cultures

Collaboration and Collective Intelligence

Copyright, Fair Use and The Cultural Commons

Learning Through Remixing

Reproduction, Mimicry, Critique and Distribution Systems in Visual Art

Summary Perspectives

We will be posting a directory of participants to our conference website as well as providing access to many of the presented papers. Indeed, there are lots of interesting papers already here.

And for those of you who would like to read some live-blogged accounts of the events, here are some we've found already:

Axel Bruns

Walter Holland

Grand Text Auto

Tarleton Gillespie

So, thanks to all of you who came. If you weren't here for the conference, check us out. And either way, do let us know what you think…

Liwen’s Digital Journey Into the Computer World

Last week, I shared Debora Lui’s essay about her relationship with the Netflix Queue as an example of the work I’ve received on an assignment I set my students in the graduate prosem I teach on media theory and methods. They were asked to write an essay which drew on personal experiences as the basis for theoretical observations about media and popular culture. Today, I wanted to share another example of the work generated in response to this assignment. This one comes from Liwen Jin, a CMS first year master’s student, who comes to us from the People’s Republic of China. So much has been written in the west about China’s embrace of digital technology that I thought you might appreciate reading her perspective on the changes new media has wrought in her country and about the process by which she became digitally literate.

Liwen’s Digital Journey into the Computer World

Liwen Jin

My first encounter with a computer was in May 1995, when I was about to graduate from primary school. My parents sent me to a professional institute for some basic training in using a computer. However, when I arrived at that summer school, I was totally surprised and even scared by the fact that all of the other students were twenty- or thirty-something; I was the only 12-year-old girl in that big class. At that time, very few Chinese people knew how to operate a computer. Computer education was limited to MS-DOS and keyboarding. In that class, though I was the youngest, I got the highest grade on the final test, which made me quite confident in using the latest technologies and left me fascinated, at that young age, with that small magic "box."

After that, I had no more experience with computers until entering high school in 1998. Every high school student in China was supposed to get some elementary computer education. However, the reality fell far short of the requirements set by the country's National Education Ministry. High school students usually sat in the computer room busy doing their own homework. Driven by the intense pressure of the College Entrance Examination, high school students usually devoted all of their time to their studies. They had no weekends, nor extra time to watch TV or use the computer. They were usually regarded as one of the most "miserable" social groups in China. Besides, the Internet was not popular at all at that time. Getting access to the Internet was very expensive and the speed was quite slow. Without the Internet, a computer is just a dead body without its soul. To me, the computer at that time was an alternative to the typewriter, with no connection to my daily life or studies at all.

The late 20th and early 21st century was a period when China was fervently riding the wave of the "information economy." The dot-com bubble in the West brought this fever to China too. The business of computers and dot-coms rose to prosperity overnight.

In late 2001, my parents bought me a $2,000 personal computer because I had been admitted to one of the most famous universities in China. It was still rare for college students to have a personal computer on campus that year; I became the first one in my department who owned one. Fully enjoying the "luxurious" convenience of the computer and the richness of information, I nonetheless slipped into one extreme: I became deeply immersed in the virtual world. I spent less and less time communicating with my classmates, and more and more time chatting with strangers on the Internet. In different chat rooms, I disguised myself with different "identities": college student, female artist, singer, etc. I enjoyed discussing art, Chinese literature, films, and entertainment news with different people using different identities. As Sherry Turkle argues in her book Life on the Screen: Identity in the Age of the Internet, the Internet has become a place where people are able to forge "cyber-identities" and even get more comfortable being who they are. The Internet possesses the magic to "decentralize" the social identities of users in the virtual world: it strips users of their identities, wealth, social status, and social relations in the real world, which makes it possible for online individuals to freely express their opinions and communicate with each other. It "shatters" the "bodies" of people, making their online identities so fragmented and multiple that it becomes really difficult to unify them. I also felt that the separation of online identities from offline identities made netizens less responsible for their online speech.

Indeed, my immersion in cyberspace gradually cut me off from "true" communication with my friends in real life for a while. Some of my friends even thought I showed symptoms of autism. In fact, during that time, except for going to school, I usually confined myself to my room and surfed the Internet.

But gradually, many of my friends developed the same symptoms. From 2003 to 2004, most of my classmates got their own computers and began to replicate my experience. Generally speaking, girls liked to indulge in chatting on the Internet, while boys preferred to play computer games. It became a common phenomenon for dorm-mates to chat on OICQ or MSN instant messenger instead of talking face to face, even though they were living next door to each other. Furthermore, some students who behaved timidly in real life might speak arrogantly in cyberspace. I was no exception. A friend once told me, "You look very gentle and quiet in real life, but so funny and naughty on MSN. It's really hard to unify those two of 'you'!" That is what I call "cyber schizophrenia": people could have two or even more personalities with the infiltration of "virtual life" into real life. I still remember one boy who looked extremely shy in real life but unexpectedly sent me a series of love letters via email and MSN instant messages. But after I turned him down, he looked completely natural and unembarrassed when he encountered me on campus. It seemed that the guy on the Internet was not "him" at all. Indeed, the Internet, in this sense, greatly challenged the Chinese tradition of Confucianism, which urged people to abide by the principle of moderation and to avoid verbal aggressiveness in any case.

One of the most interesting cyber phenomena during that period was cyber love. It became a fashion, especially among college students, since young students had more time to surf the Internet and could usually pick up new technologies much more quickly than other social groups. Besides, people do tend to be more frank and audacious in cyberspace. A popular love story entitled "First Intimate Touch," written by a Taiwanese writer, circulated on the Internet during that period. It described a tragic cyber love affair and spread widely among college students. In fact, "First Intimate Touch" also ushered in the prosperity of cyber literature in China. The Internet opened a new door for aspiring writers and connected them closely with their audience. In the past, writing had long been considered a lonely profession, but when prose and poems were put on the Internet, the instant feedback made writing not so lonely any more. That phenomenon could be regarded as an early stage of the convergence of media producers and consumers.

In 2003, another kind of online community began to fascinate me: the online Bulletin Board System (BBS). My university's BBS was one of the most popular college BBSes. It was usually deemed the virtual home of all NJU (Nanjing University) students, like Mecca to Muslims. Even though I graduated nearly two years ago, I still cannot get rid of the habit of logging into the NJU BBS every day to see the latest news and join students' discussions of hot social issues. I once thought the BBS could be a virtual form of the Habermasian public sphere serving the cause of China's democratization. However, I gradually found that online communities like the BBS only validated a theory about the popular mind of large gatherings of people, first proposed by the French social theorist Gustave Le Bon in his book The Crowd: A Study of the Popular Mind:

The masses live by, and are ruled by, subconscious and emotional thought process. The crowd has never thirsted for the truth. It turns aside from evidence that is not to its taste, preferring to glorify and to follow error, if the way of error appears attractive enough, and seduces them. Whoever can supply the crowd with attractive emotional illusions may easily become their master; and whoever attempts to destroy such firmly entrenched illusions of the crowd is almost sure to be rejected.

On Chinese BBSes, there was one recurrent issue that never failed to attract the attention of "the crowd": anti-Japan nationalism. Last year's MIT Visualizing Cultures incident was a case in point. MIT's Visualizing Cultures course, which used a 19th-century woodblock print of Japanese soldiers beheading Chinese prisoners, was spotlighted on MIT's home page. Unexpectedly, these images swiftly sparked complaints from the MIT Chinese community. Some Chinese students re-posted the images to several famous college BBSes in China, which stirred up a vehement fever of anti-Japan hatred on China's BBSes. Those "angry young people" began to throw "bricks" on the Internet. Someone even exposed the email address of Professor Shigeru Miyagawa and instigated people to condemn him via email. Vociferous comments flew around the BBS sphere. Most of them were rude, while truly rational and objective voices were submerged under the abuse. Obviously, the masses on the BBSes could easily lose their rationality and follow the "emotional thought process."

In 2004, "blog" became one of the key words of the year in China. I joined the crowd to chase that trend, establishing my first blog and writing essays and poems on it. It was really a wonderful place for me to record my meditations on various social, political, and cultural issues and share them with my friends. Compared to the BBS, the advantage of the blogosphere lies in its greater rationality. On a BBS, with their true identities veiled, and agitated by the mass of netizens, people tend to express extreme ideas, free of any responsibility for the consequences of their speech. In the blogosphere, each blog is a separate and independent unit, immune to the chaos of the crowd. Besides, after the advent of blogs I saw a trend toward the unification of online identities with offline identities in China. Some bloggers began to view their blogs as a virtual spiritual home and to uncover their real identities there. In this way, netizens would be more responsible for their online speech. Thus, blogs were supposed to become a powerful driver accelerating the democratization process in China. However, I was dismayed again. The swift development of celebrity blogs in 2005 brought a rigid hierarchy to China's blogosphere. The popularity of a blog became positively correlated with the fame of the blogger in real life. Celebrity blogs greatly overshadowed common people's voices, which discouraged ordinary people from participating in democratization in China. Besides, the features of the "eyeball economy" dictated that rationality and abstraction were usually far from the foci of our society. People in cyberspace were rarely willing to explore the profundity behind the text. The entry that gets the most clicks on my blog is actually the one where I posted my own photos.

Today, I have become used to life with the computer and the Internet, though my mom still thinks it is an addiction. But MIT is always a place full of computer/Internet "addicts." I cannot even imagine a day without computers and the Internet! However, I have to admit that working on the computer can be quite inefficient. With the Internet open, the computer becomes a kaleidoscopic world which seduces you into doing everything except your work. The affluence of information on the Internet is thus a virtue as well as a vice. For my part, I will continue my journey in this colorful digital world. And I will continue exploiting every chance brought about by new media to promote democratization in China. I believe that should be regarded as one of the most important missions for overseas Chinese students: to develop and advance our own country along the way of democracy.

Jin Liwen hails from China, where she received her undergraduate degree in media and communications from Nanjing University, followed by studies in American politics, history, and international relations at the Johns Hopkins University-Nanjing University Center for Chinese and American Studies. She interned in the news commentary division at China's largest media organization, China Central Television (CCTV), and worked as a journalist at News Probe, an investigative documentary series that addressed the problems of marginal populations such as homosexuals and AIDS patients. This experience encouraged Liwen to turn her academic work towards a critical investigation of the relationship between various media forms (traditional media, blogs, and online bulletin board systems) and the development of a democratic culture and public sphere. At CMS, she is eager to continue her research into the role of media in facilitating political democratization and international cultural understanding.

Media in Transition 5: Creativity, Ownership and Collaboration in the Digital Age

This weekend, the Comparative Media Studies Program will play host to several hundred researchers, activists, and artists from around the world who will be attending the fifth of our Media in Transition conferences. The conference centers on issues of Creativity, Ownership and Collaboration in the Digital Age, though our goal is to discuss the present moment in relation to the larger history of media change. I haven't publicized the event here because the number of participants has reached such a level that there are very few seats left for people who simply want to attend.

For those of you who are in the Boston area, it may make sense to drop by for one or another event since there is no fee to attend and since we often have some seats left.

For those of you who are not in the Boston area, have no fear. You will have two opportunities to take advantage of the event programming. First, we will be streaming the plenary events via Second Life. And second, we will, as with all of our events, be offering webcasts, which will be announced here once they are available.

How to Access MIT5 on Second Life

To view from New Media Consortium Campus:

You must first join the NMC to view from here. It’s free and simple. Go to the following address: http://sl.nmc.org/join/ and give them your SL Avatar name, your real name, a valid email address, and for affiliation, mark as ‘MIT’.

The SLURL for the NMC Campus is here:

http://tinyurl.com/nraap

We'll be at the Gonick Amphitheatre, which can be seen on the campus map here:

http://sl.nmc.org/wiki/Campus_Map and within the Welcome area in SL.

For more info about the NMC Campus in Second Life, go here:

http://sl.nmc.org/wiki

A Few Thoughts on Media Violence…

The news of last week's tragic shooting at Virginia Tech has brought the usual range of media reformers and culture warriors (never camera shy) scurrying back into the public eye to make their case that "media violence" must be contained, if not censored, if we are to prevent such bloodshed from occurring again. Almost immediately, longtime video game opponents Jack Thompson and Dr. Phil McGraw started appearing on television talk shows, predicting that the shooter would turn out to be a hardcore video game player. (The odds are certainly with them, since a study released several years ago of frosh at 20 American colleges and universities found that a hundred percent of them had played games before going off to college, and that on average college students spend more time each week playing games than reading recreationally, watching television, or going to the movies.) In fact, when the police searched the killer's dorm room, they found not a single game nor any signs of a game system.

The focus then quickly shifted, with news reports arguing first that the shooter was a heavy viewer of television, "including television wrestling," and then linking some of the photographs he sent to NBC with images from Asian cult cinema, most notably the Korean film Old Boy. An op-ed piece in the Washington Post asserted that Old Boy "must feature prominently in the discussion" of Mr. Cho's possible motivations, "even if no one has yet confirmed that Cho saw it," and later claimed that Cho "was shooting a John Woo movie in his head" as he entered the engineering building.

And then, of course, there was the damning evidence that he had constructed violent and aggressive fantasies in his creative writing classes. Time magazine even pathologized the fact that he was a college student who didn't have a Facebook page! Talk about damned if you do and damned if you don't!

None of this should surprise us given the cycle of media coverage that has surrounded previous school shootings. An initial period of shock is quickly followed by an effort to round up the usual suspects and hold them accountable; this is part of the classic psychology of a moral panic. In an era of 24-hour news, the networks already have experts on media violence on speed dial, ready to arrive on the scene and make the same old arguments. As a media scholar, I find these comments predictable but disappointing: disappointing because they block us from having a deeper conversation about the place of violence in American culture.

I want to outline here another set of perspectives on the issue of media violence, ones that are grounded not in the literature of media effects but rather in the literature of cultural studies. I have plenty of criticisms of the media effects approach, which I outlined in my recent book, Fans, Bloggers, and Gamers: Exploring Participatory Culture, but for the most part, my focus here is more on what cultural studies might tell us about media violence than on critiquing that body of "research."

So, let me start with an intentionally provocative statement. There is no such thing as media violence — at least not in the ways that we are used to talking about it — as something which can be easily identified, counted, and studied in the laboratory. Media violence is not something that exists outside of a specific cultural and social context. It is not one thing which we can simply eliminate from art and popular culture. It’s not a problem we can make go away. Our culture tells lots of different stories about violence for lots of different reasons for lots of different audiences in lots of different contexts. We need to stop talking about media violence in the abstract and start talking about it in much more particularized terms.

Otherwise, we end up looking pretty silly. So, for example, a study endorsed by the American Academy of Pediatrics reported that 100 percent of feature-length cartoons released in America between 1937 and 1999 contained images of violence. Here, we see the tendency to quantify media violence taken to its logical extreme. For this statement to be true, violence has to be defined so broadly that it would include everything from the poison apple in Snow White to the hunter who shoots Bambi's mother, from Captain Hook's hook to the cobra that threatens to crush Mowgli in The Jungle Book, and that's just to stick with the Disney canon. The definition must include not only physical violence but threats of violence, implied violence, and psychological/emotional violence. Indeed, if we start from a definition that broad, we would need to eliminate conflict from our drama altogether in order to shut down the flow of media violence into our culture. Perhaps this is reason enough not to put pediatricians in charge of our national cultural policy anytime soon. Certainly few of us would imagine our culture improved if these films were stripped of their "violent" content or barred from exhibition.

Almost no one operates on a definition of violence that broad. Most of us make value judgments about the kinds of violence that worry us, judgments based on the meanings attached to the violence in specific representations. So church groups don't think twice about sending young kids to watch Jesus get beaten in The Passion of the Christ, and game reformers go after first-person shooters but not World War II simulation games (which coat their violence in patriotism and historical authenticity), even though the latter genre now consistently outsells more anti-social titles in the video game marketplace.

Slash Me, Mash Me, Spread Me…

A while back, I mentioned that Jonathan Lethem, author of The Fortress of Solitude, Motherless Brooklyn, and Men and Cartoons, had poached a passage from Textual Poachers in an article he wrote for Harper's about copyright and creativity. Since Lethem, along with Michael Chabon (The Amazing Adventures of Kavalier & Clay), has emerged as one of the poet laureates of fanboy lit, I was delighted to discover that my work on fan culture had made it onto his radar screen. But it just keeps getting better. Annalee Newitz interviewed Lethem for Wired and asked him directly about his relationship to Textual Poachers, as she reports in her blog:

Lethem, always a fan of art that exists in a copyright gray area, is eager to encourage fanfic writers of all stripes. He admires Henry Jenkins’ seminal book about fanfic, Textual Poachers, and champions the creative appropriation of pop culture icons. “Fanfic is a beautiful allegory of appropriation,” he said. “But that doesn’t mean the exact gesture is the most aesthetically promising one.” Translation: Fanfic rules because it tweaks copyright law, but it’s not always good art. Maybe Lethem just hasn’t read some of the fantastic Harry Potter fanfic that’s out there?

Moreover, Lethem has laid down a challenge to the fan writing community, which I am happy to help publicize here:

The award-winning nerd novelist revealed that he’d love to be in a slash fiction story. Whom would he want to be paired with? “I want to be surprised! I want to see ones I wouldn’t think of!” he enthused, eyes wide with anticipation — or possibly fear. Lethem believes he’s been “slashed” only once, paired with fellow geek novelist Michael Chabon in a “sublimated homoerotic comic by Patricia Storms that was just an inch away from being Kirk and Spock.”

Lethem may well be the first celebrity in my memory who has publicly campaigned to be the subject of a slash story. I can certainly think of plenty of examples where stars and writers have demanded not to be subjected to the slash treatment. (Personally, I am rooting to see Lethem climb into bed with the Goatman, the aptly named character from one of his short stories, but then what do I know…)

I became aware of Lethem's effort to encourage people to slash him about the same time that I learned about the latest efforts of Stephen Colbert to encourage his own brand of grassroots creativity. As his website at Comedy Central explains:

For Your Editing Pleasure

It all started when House Democratic Caucus Chairman Rahm Emanuel told freshmen Democratic congressmen not to appear on the Colbert Report. The complaint? That Stephen gets final cut on interviews. So in the interest of playing fair, Stephen has decided to put it all out there for you. And by “it,” we mean footage of an interview with Stephen that you can edit any way you like.

Download the footage at www.colbertnation.com. The knife is in your hands, Americans. Wield it wisely.

So, at a time when other producers are sending out cease-and-desist notices to shut down mashups of their content, Colbert is encouraging you to re-edit and recontextualize incriminating statements from his show (and believe me, what made the sketch so funny when it first aired was the whole series of potential meanings behind seemingly innocent statements once he planted the idea in your head). Of course, none of this has stopped Viacom from trying to get Colbert Report segments removed from YouTube, in what is surely a classic example of a media company speaking out of both sides of its mouth at once.

The Wrestler in My Living Room…

My students sometimes nail me for a tendency to overuse the metaphor of "wrestling" to describe the work we do in making sense of a particular theory or cultural phenomenon in my classes. But this term, rather than wrestling with a theory, we had a chance to study theory with a wrestler. A few weeks ago, WWE superstar Mick Foley, better known to his fans as Mankind, came to MIT to interact with our students.

The primary occasion for Foley's visit was a class we have been offering this term on American professional wrestling. The class was added to our curricular lineup to take advantage of the expertise, experience, and connections of one of our graduate students, Sam Ford, a lifelong wrestling fan who has performed as a manager on a minor wrestling circuit back home in Kentucky. In his fictional role, Sam plays an arrogant young man who has left home to go off to the evil city and study at MIT. Sometimes he wears his CMS t-shirt into the ring and confounds his rivals with a mixture of fancy theory-speak and just plain bad-mouthing. Sam wrote his undergraduate thesis at Western Kentucky University on professional wrestling, but as a master's student at CMS he has been devoting his attention to the ways soap operas have responded to (or, more precisely, should be responding to) shifts in the media landscape. But we didn't want to let him off that easily, so we have put him to work helping his fellow graduate and undergraduate students make sense of the controversial and complex world of professional wrestling, which he describes as an immersive story world, a term he also uses to explain the appeal of soap operas and comic books. Sam has tapped his network of contacts and gotten the cooperation of World Wrestling Entertainment, which has sponsored talks at MIT by long-time announcer Jim Ross and by Mick Foley.

The class has also attracted a fair amount of media coverage, including an article that recently ran in The Boston Globe. Reporters have expressed astonishment that MIT now offers a class in professional wrestling (confounding expectations both about who MIT students are and about who watches televised wrestling), but they have also more or less comprehended the reasons why anyone studying contemporary media culture needs to give at least a passing glance to the squared circle.

For me, the reasons why we should care about wrestling are the following:

1. As Sam suggests, wrestling was an early experimenter in transmedia storytelling. From the get-go, it moved its entertainment between televised buildup and arena shows, and it gradually absorbed print magazines and comics, action figures and other toys, radio shows and podcasts, pay-per-view events, and so forth into its media empire. In that sense, wrestling gives us a glimpse into the future of the American entertainment industry, embodying most of the trends I discuss in Convergence Culture.

2. Wrestling also carries with it the rich legacy of late 19th and early 20th century entertainment forms, such as circus, vaudeville, and popular melodrama. When Jim Ross was on campus, he entertained us with stories of life on the road, which could have come as easily from the mouth of a traveling showman a century earlier. As I have written in my essay, “Never Trust a Snake,” (reproduced in The Wow Climax), professional wrestling borrows much of its core vocabulary from melodrama and much of its politics from American Populist traditions.

3. Wrestling gives us a glimpse into the culture of working class masculinity. I think elite Eastern institutions should be studying it for the same reasons I suggested a week or so back that we should be studying Evangelical media — because it can give us insights into other parts of American culture at a time of polarized political rhetoric and culture war discourse. Wrestling can be pure agit-prop, translating contemporary politics through the lens of its performance traditions, and as it does so, it helps us to identify the complexities and contradictions in American political thought.

Contra the Snacks Hypothesis

Last month, Wired Magazine ran a special issue defined around the theme of “snack media.” At the heart of the issue was the following proposition:

We now devour our pop culture the same way we enjoy candy and chips – in conveniently packaged bite-size nuggets made to be munched easily with increased frequency and maximum speed. This is snack culture – and boy, is it tasty (not to mention addictive).

In a sense, this is a return to a very old idea: that the television of the future will be designed for zappers, built in very small units which make sense outside of any narrative context and which can be consumed whenever we want. In Convergence Culture, I explore how a contemporary television show like American Idol is designed to balance the fragmented interests of zappers (or snackers) with the gradually deeper levels of investment represented by casuals and loyals. On a superficial level, much of popular culture looks as if it is designed for this kind of fragmented and short-term attention. So it is not hard for Wired to find film producers, say, who are skeptical about whether the feature film will continue to be the central form of cinema:

It’s not written in the Bible, “A movie shall be two hours.” Somebody made that up to sell theater tickets. With technology, the very definition of a story has changed. It used to mean an actor and a script. Now a story is a 15-second, no-dialog clip of somebody running across the street. An artist used to be the person who could get the studio to finance, manufacture, and distribute a story. Today an artist is somebody sitting in Des Moines in front of his computer – and his audience isn’t a million folks at once, but one person a million times over. I now look to GoFish and YouTube to get ideas, to see what’s going on. They show me not only what people are posting, but also what people like. It’s a much better metric than a Nielsen rating system.

We are all scrambling to construct a new model to profit from these bits and pieces, but there’s so much out there, it’s like trying to harness a tornado and getting spat out the top. I definitely don’t have the answer yet. I don’t even understand all the questions. But if people are thinking this is the end of Hollywood, they’re wrong. This is a whole new beginning.

– Peter Guber, CEO and chair of Mandalay Entertainment Group and host of AMC’s Sunday Morning Shootout

Or to find radio programmers who think people are too antsy to sit still for an entire song:

Why climb the “Stairway to Heaven” when you can take the elevator? That’s the logic behind Radio SASS (Short Attention Span System), an experimental radio protocol currently in development that takes classic tunes and whittles them down to about two minutes. “People’s patience for music – even the stuff they like – is thin,” says founder George Gimarc, a veteran programmer and former DJ from Dallas. “Twelve songs per hour won’t cut it.” Gimarc and his team of editor-musicians use what he calls “intuitive editing” to trim pop songs to their catchiest crux, pruning seconds from a guitar solo here, lopping off a chorus there.

Or television critics who think that the previews are more entertaining than the programmes:

Even if you’re a regular viewer, labyrinthine shows like Lost and Prison Break require full concentration and are best consumed in marathon viewing sessions aided by TiVo or DVD. But you can still drop in on complex dramas midseason – just make sure you catch the “previously on…” recaps before each episode. These mini montages have become a captivating subgenre for both regulars and channel surfers. Back in the early days of narrative dramas, in the ’70s and ’80s, bare-bones recaps for serials like St. Elsewhere rarely topped 30 seconds. Fast-forward to Lost or Prison Break, and recaps of a minute or more are common, with some lead-ins for season openers or finales taking nearly two minutes to bring viewers up to speed – and bear in mind that each shot in those recaps now lasts less than two seconds on average. Sometimes editors rescue scenes from the cutting-room floor, if those bits tell the story in a tidier form. It’s a new kind of TV serial, distinct from both the hour-long episode and the season-long arc.

So, what’s wrong with this picture?

[Read more...]

Applied Game Theory, R.I.P. 3: Addiction and Copyright

I am continuing my series of highlights from the Applied Game Theory column I wrote with Kurt Squire. The first is a column on the concept of games addiction (mostly Kurt) and the second is about the City of Heroes dispute with Marvel comics over copyright (mostly Henry). For the record, the City of Heroes dispute got settled out of court and the terms of the settlement have not been made public.

I am posting tonight from Cornell University. James Paul Gee and I had a public conversation today about games, participatory culture, and learning. We’ve done these off and on for the past several years — what I call the Jim and Henry show. Our host recorded it and will be making it available as a podcast, so I will let you know when it is available.

For the love of God, get that screenshot away from me!

New research suggests that people who play video games to excess exhibit traits similar to those of drug users. Or so read the headlines at MIT’s Technology Review. A recent study on neurotransmitters and gaming made big waves: researchers showed that people who report “being addicted” to games experience increased releases of dopamine (a chemical associated with pleasure) when shown game-related images.

Most gamers react with amusement before asking, “And this is a big deal because…?” If we are being honest, most of us have played a game more than we should have. Some game designers brag about producing “addictive” titles. A few highly publicized stories, particularly around massively multiplayer games, show that some people do let their gaming get the best of them, forgetting (or refusing) to sleep, shower, eat, or take care of loved ones. Of course, any activity from work to working out can have an adverse impact on our family, health, and relationships. And in fact, most of us have experienced something like what this research describes. All it takes is the login sound from WoW to put our minds back in Azeroth.

So why does this matter? It is one thing to urge people to balance game play with other important aspects of everyday life, another to equate gaming with drug addiction. Once that happens, groups like the American Medical Association and the American Psychological Association step in, claiming authority to regulate the media you consume.

Here’s how it usually works. A group of moral reformers comes to the AMA or APA with a policy brief that cites studies “proving” that games are highly addictive. These groups do little or no independent research, relying on what they know from reading the papers (where negative research is disproportionately represented), and then they vote to approve some kind of feel-good resolution or policy statement, which itself becomes fodder for more sensational news coverage and another bit of ammunition that reformers can use in pushing for games regulation. These groups want to regulate games as drugs (or cigarettes, another popular analogy) rather than art: their medical “expertise” masks an attempt to simply assert their tastes as normative.

It’s hard to translate these research findings into meaningful social policies. After all, America’s success rate in the “war on drugs” hardly suggests that we should take a similar tack with other “social problems.”

Do we ban images or words from World of Warcraft? Do we ban any activity that is pleasurable, or that produces chemical reactions?

Most pleasurable activities stimulate the release of brain chemicals. We don’t know how, say, playing a highly competitive game of basketball affects the brain because you can’t sit in an MRI while playing point guard, but we do know that working out also leads to increased dopamine. So does eating food. Basically, if we banned activities that lead to changes in brain chemistry, the species would die out from starvation or a lack of procreation. Maybe just plain boredom. And once we start asserting that some activities are simply more meaningful than others, we are back in the business of making cultural, rather than “scientific,” judgments and in that space, it is hard to justify why the AMA should have any more say than, say, professional organizations devoted to studying the cultural impact of media.

So what can we do as gamers? We must refute the idea that gaming is a drug and insist that it’s an activity, one that a large portion of the American public, although apparently not of the American Medical Association, finds meaningful. In fact, this same study found that part of the pleasure in gaming is the learning that occurs through confronting new challenges.

Second, gamers should push to understand why people find games so compelling. Researchers like Ted Castronova and Constance Steinkuehler have shown that for some people the roles and identities in games are more rewarding than the roles available in the real world. Maybe Azeroth is a more socially engaging place than Starbucks, USA for some people out there. This can’t be explained purely in terms of dopamine dependency.

Researchers like Jack Kuo and William Huang at Mt. Sinai Hospital in LA are developing more nuanced models of game “addiction” that try to let gamers decide what they want out of life, decide when gaming becomes unhealthy, and make their own decisions about what’s normal. They are careful to suggest that game playing can become an addiction but that the activity itself is not intrinsically destructive, unlike, say, shooting up heroin. These researchers are finding that the number of cases of actual games addiction is much, much smaller than the sensationalistic coverage would suggest. Gamers shouldn’t be in denial. We shouldn’t ignore the potential negative consequences of having games take over someone’s life, but this small number of cases doesn’t call for dramatic policy shifts.

Now, hand over that joystick!

[Read more...]

Applied Game Theory, R.I.P.2: Role-Play and Race

Yesterday, I took a few moments to acknowledge the passing of “Applied Game Theory,” the column which Kurt Squire and I wrote for Computer Games Magazine for the better part of five years. The column now has no home because the magazine has stopped publication. If any magazine editors out there are looking for columnists, we are all ears.

The goal of the column, not unlike the goal of this blog, was to build a bridge between academic research on games and other media and a general public trying to make sense of this emerging medium. We weren’t games reviewers in any traditional sense. We were taking what we knew as academics — Kurt as someone in the field of Education, I as a media scholar — and using it to address topical concerns impacting game design, the games industry, and games culture more generally.

Some months, the ideas in the column originated with Kurt and got tweaked by me. Some months, fewer in fact, the ideas originated with me and got some assistance from Kurt. We brought different kinds of expertise and experience to the table. As a rule, the more detailed the columns were in discussing individual game titles, the more likely they were to originate with Kurt. While I play games from time to time, he grew up with games and remains a serious gamer. I am much more of a casual games guy who has a strong intellectual interest in what’s happening in the medium. All told, it has been one of the most successful intellectual and creative collaborations of my career to date, and I am sad to see this chapter of my work coming to an end.

Yesterday, I shared a few pieces we wrote about aesthetic issues around games. Today, I wanted to push a bit deeper into the public policy debates around games. The first is a piece mostly written by me which deals with the debates about role play and its ties back to a larger history of anxiety about theatricality. The second piece reports on some research Kurt Squire and some collaborators at University of Wisconsin-Madison have been doing, examining how players of Grand Theft Auto think about race and violence.

Performance Anxiety

Is Pokemon part of a “secret Satanic war against the youth of America?” A segment of concerned conservative Christians believes so. As youth minister Phil Armes warns, “While our children play his ‘games,’ Satan and his host of hell are playing for keeps.” Role-playing games, they warn, can lead to demonic possession and promote, take your pick, secular humanism, globalization, Neo-Paganism, or New Age philosophies.

To be sure, most Christians wouldn’t consider role-playing games to be the devil’s work. There are other groups, such as the Christian Gamer’s Guild, which embraces role-playing as a form of fellowship. There has been a movement to develop alternative, spiritually uplifting, Biblically-grounded games and several mainstream ministries have developed sites that rate games so parents can choose which ones are consistent with their own values.

Yet it would be too easy to dismiss such views as wacky extremism. Strip aside the Satan talk, and the underlying logic of these arguments differs very little from the critique of role-playing offered by more mainstream reform groups. Games, the argument goes, are not simply bad because they express bad ideas; these reformers see the very act of role-playing as dangerous, because it blurs the line between fantasy and reality.

Consider some of the following claims made against Pokemon:

“Not only does this repetitive practice blur the line between reality and fantasy…the child learns to accept unthinkable behavior as normal.”

“In order to master this game you need to take on characteristics of what you are playing.”

These arguments have a long, long history.

Theater historian Jonas Barish documented the persistence of what he called “the anti-theatrical prejudice,” from its early roots in the writings of Plato through its absorption into Christianity at the hands of St. Augustine and down to the present day. Plato argued that actors were professional liars who, over time, came to believe their own lies. After decades of playing debased and amoral characters, they lost moral judgment. Actors were often associated with madness, delusion, and drunkenness. Theater was equally dangerous to spectators: it stirred up our emotions in response to imaginary events and thus dulled our sensitivities to things that really mattered. The exaggerated emotions of the stage were more memorable and seductive than the events of the mundane world. Shakespeare had to struggle against these fears (and the reform movements they inspired) in Elizabethan England just as Rockstar Games has to confront them today.

With games, the line between player and spectator blurs. The reformers warn that games are more harmful than television because kids enact anti-social behavior rather than simply witnessing it.

Then as now, defenders of the theater question whether role-playing constitutes deception, since consumers and performers develop a basic competency in distinguishing between representations and reality. The ancient Greeks did not respond emotionally to the spectacle of Oedipus gouging out his eyes the same way they would have reacted to a similar event in the agora. Through exploring these alternative realities, spectators learned to reflect more deeply on their own experiences and values. Aristotle knew that rule-breaking (in theater) was actually a powerful means of rule-enforcement, reaffirming social norms by representing their transgression.

The anti-theater argument depends on obscuring such distinctions. Earlier reformers debated whether actresses committed adultery when they kissed (or even spoke words of love) on stage. Yet the use of avatars in games represents one more line of separation between reality and play-acting. No one actually kisses (or hits); they simply press a button. Yet the question persists: do pretend actions have real consequences?

Consider the slips between fantasy and reality which occur in this statement by anti-game activist David Grossman: “When I played caps with Billy when I was a kid, I said, ‘Bang, Bang, I gotcha.’ Billy said, ‘No, you didn’t.’ So I smacked him with my cap gun. He cried. I got in big trouble….I learned that Billy is real and that when I hurt Billy I am going to get in trouble. Now, I play the video game, and I blow Billy’s stinkin’ head off thousands of times. Do I get in trouble? No, I get points for it.”

Isn’t blowing off Billy’s head in a game more like saying “Bang, Bang, I gotcha” than like clubbing him? And wouldn’t the kid get in trouble — not score points — if he actually decapitated his friend? Play, reality — no difference.

Like their ancient counterparts, these modern critics either do not grasp or intentionally misrepresent the nature of role-playing. Some things never change.

[Read more...]

Applied Game Theory, R.I.P. 1: Melodrama and Realism

For the past five years (more or less), Kurt Squire and I have written a monthly column, “Applied Game Theory,” for Computer Games Magazine. We recently learned that the publication is going out of business. Computer Games Magazine will be missed. It had a great bunch of columnists and writers and really took games seriously as an emerging form of expression, writing thoughtful reviews and well-informed opinion pieces. Unfortunately, if my experience is any indication, it didn’t necessarily reach many engaged readers. I have met only two or three people who mentioned reading the columns in the five years that we were writing them, compared to the clear evidence of reader engagement with what’s going on in the blog. Given that, I thought I might share a few of the high points of the columns off and on for the next few weeks.

Today’s selections deal with aspects of game aesthetics — specifically with the relationship of melodrama to game design and with the concept of realism as it applies to games. Enjoy!

Games and the Melodramatic Imagination

Want to design a game to make us cry? Study melodrama.

Don’t snicker, o ye hardcore gamers. Although we associate melodrama with the soap opera (that is, “girly stuff”), melodrama has appealed as much to men as to women. Sports films like The Natural or Seabiscuit are classic examples, and in fact, most action-oriented genres are rooted in traditions from 19th century melodrama.

The best contemporary directors of melodrama might include James Cameron, Peter Jackson, Steven Spielberg, and John Woo, directors who combine action elements with character moments to generate a constantly high level of emotional engagement. Consider the passage from Cameron’s The Abyss in which the male and female protagonists find themselves trapped in a rapidly flooding compartment with only one helmet and oxygen tank. Games include puzzles like this all the time, but few have achieved the emotional impact of this sequence.

Cameron deepens the emotional impact of this basic situation through a series of melodramatic devices: playing with gender roles (the woman allows herself to go into hypothermia in hopes that her ex-husband, the stronger swimmer, can pull her to safety and revive her), dramatic gestures (the look of panic on her face as she starts to drown and the slow plummet of her hand as she gasps her last breath), emotionally amplifying secondary characters (the crew back on the ship who are upset about the woman’s choice and work hard to revive her), abrupt shifts of fortune (a last-minute recovery just as we are convinced she is well and truly dead), performance cues (the rasping of the husband’s throat as he screams for help), and an overarching emotional logic (she is brought back to life not by scientific equipment, but by human passion as her ex-husband slaps her, demanding that she not accept death). When the scene ends, absorbed audiences gasp because they forgot to breathe. Classic melodrama depends upon dynamism, always sustaining the action at the moment of maximum emotional impact.

Critics might argue that these conventions are unique to film, but most melodramatic techniques are within reach of today’s game designer. The intensity and scriptedness of a scene like this couldn’t be sustained for 40 hours, but it could be a key sequence driving other events. Classic melodrama understood the need to alternate between down time and emotional crisis points, using abrupt shifts between emotional tones and tempos to further agitate the spectator. And while we often associate melodrama with impassioned and frenzied speech, it could also work purely in pantomime, relying on dramatic gestures and atmospheric design, a technique platform games do well for fun or whimsy (think Psychonauts) but few games use for melodramatic effect.

Some of the most emotionally compelling games are beginning to embrace the melodramatic. Take, for example, the now classic game Ico. The opening sequences work to build sympathy towards the central protagonists and use other elements of the mise-en-scene to amplify what they are feeling at any given moment. The designers exploit the contrast between the characters’ small physical builds and the vast expanses of the castle they travel through. The game also relies on highly iconic gestures to communicate the protagonists’ vulnerability and concern for each other’s well-being.

One lesson that game designers could take from classic melodrama is to recognize the vital roles that third-party characters play in reflecting back and amplifying the underlying emotions of a sequence. Imagine a scene from a television drama where a mother and father fight in front of their child. Some of the emotions will be carried by the active characters as they hurl words at each other which express tension and antagonism. But much more is carried by the response of the child, cowering in the corner with fear as the fight intensifies, perhaps giving a hopeful look for reconciliation. Classic melodrama contrasted the actions of the protagonists and antagonists with their impact on more passive characters, helping us to feel a greater stake in what is occurring. Games, historically, have remained so focused on the core conflict that they spend little time developing these kinds of reactive third-party characters, with most NPCs seemingly oblivious to what’s happening around them.

Finally, the term melodrama originally referred to drama with music, and we often associate melodrama with swelling orchestration. Yet melodrama also depends on the quality of performers’ voices (especially the inarticulate squeaks, grunts, and rasps which show the human body pushed beyond endurance) and on other expressive aspects of the soundscape (the howling wind, the clanking shutters, and so forth) — elements that survival horror games use to convey fear but rarely deploy for other emotions. Game designers cannot expect to achieve melodramatic impact if they continue to shortchange the audiotrack.

Want to design a game that will make players cry? Study melodrama.

[Read more...]