Once You Open Your Laptop...: Activities from My Technology and Culture Class (Part One)

Last time, I shared some of the results of a semester-long effort to integrate forms of transactive memory and collective intelligence into the teaching of an undergraduate lecture hall class on communication technology and culture. Over the next few installments, I am sharing the discussion prompts and exam questions we developed in this context. Each is designed to support the efforts of small-scale, 3-4 person teams as they seek to apply concepts from lecture to the investigation of contemporary digital phenomena. I am sharing these prompts in part because they incorporate so many resources which may be useful for other media scholars and in part because they illustrate the kinds of questions and activities that work at the scale of social interaction we are exploring. As you will notice, the activities became a bit more streamlined as the course went along, reflecting what we learned about how much material the teams could process within the designated class time and how much background they needed in order to perform the activities. Your experiences will certainly differ in terms of the abilities and backgrounds of your students.

The activities featured in today's post were ungraded but intended to give students a chance to work in groups. I will signal when we shifted to graded activities.

I was lucky to be working with three very dedicated and creative Annenberg PhD students, Meryl Alper, Andrew Schrock, and Rhea Vichot, and I've given credit where credit is due here, indicating which activities each of them developed for the class.

-------------------------------------------------------------------------------------------------------------------------------------------------------------

Week 3: Facebook and Privacy (Andrew Schrock)

Introduction: The terms of service (TOS) describe the uses that the companies maintaining platforms and other web services deem acceptable. Among other things, Facebook's terms of service describe the ways that Facebook captures, analyzes, and uses data related to our online identities and interactions. boyd and Marwick described privacy as "both a social norm and a process" – an entirely public or private life would not be feasible (or particularly enjoyable). Privacy is an extremely complex notion, reliant on culture and social context. Feelings of "privacy violations" are often sudden and leave us feeling confused or helpless, such as when our personal information is displayed in unexpected ways. To help us think through the complex negotiations that occur between individuals, platforms, and privacy, we can interrogate the TOS for possible areas of friction between platform-endorsed uses and individual practices.

 

Team activity: Your assignment is to read the terms of service for Facebook with a critical eye. In teams of 2-3, read a section of the terms of service at http://www.facebook.com/legal/terms. You will be assigned one of the following sections: 2 (sharing), 3 (safety), 4 (registration), 5 (protecting rights of others), 9 (special provisions to developers), or 11 (special provisions to advertisers). Please spend 10 minutes reviewing your section and prepare brief responses to the following questions.

 

Questions: What does Facebook consider private? How does its notion of privacy differ from yours? Do you see clauses that strike you as potential violations of privacy? If so, why?

 

Why do you think Facebook frames the terms of service this way? How do you think Facebook uses the data it collects? How does Facebook exercise power?

 

Have you altered the privacy settings on Facebook or used social strategies to deliver messages to friends ("social steganography," from the boyd and Marwick article)? Can you think of times when you or your friends have accidentally or deliberately violated the TOS? If so, why?

------------------------------------------------------------------------------------------------------------------------------------------------------------

Week 4: Wikipedia Mechanics (Rhea Vichot)

Warmup (5 Minutes): [Citation Needed]

 

http://citationneeded.tumblr.com/

 

http://citationneeded.tumblr.com/post/29905972747/whac-a-mole

http://citationneeded.tumblr.com/post/31336657830/victor-salva

http://citationneeded.tumblr.com/post/28419289190/placeholder-name

http://citationneeded.tumblr.com/post/27763947374/cultural-depictions-of-elvis-presley

 

Questions:

 

  • Why is this funny? What kinds of critiques are being made about Wikipedia?

○      The humor is in the failed attempt at creating an "authoritative voice." There are some critiques of the editorial policies of Wikipedia, as well as of the attempt to treat all subjects, no matter how trivial or transitory, with the same voice.

○      I also feel there is a subtle poke at how white and nerdy Wikipedia editors are, but that's just my take - RAV

Main Activity: How Is Wikipedia Structured? (Two Parts: 30-35 Minutes Total)

Part I (10-15 Minutes): In groups of 2-3, have students look at one of the following main and talk pages (5-10 minutes):

 

After 5 minutes, have each group provide a quick summary of the main points of their assigned page as well as an interesting discussion thread on the talk page.

 

Questions:

 

  • What ideals are being espoused on these pages?

○      Singular voice

○      Not a site for original research, but a repository for secondary research

○      Wikipedia believes in "meritocracy," whether or not that is what happens in reality

  • What kinds of concerns are these policies hedging against?

○      Trolls, Abuse

○      Misinformation

○      Infighting, Faction building

  • Does this make you more or less likely to contribute content to Wikipedia?

 

Goals:

  • Understand Wikipedia's editorial policies
  • Understand that these editorial policies are agreed upon by the community, and what assumptions go into those conventions

Part II (20-25 Minutes): In the same groups, they should visit a Wikipedia page on a topic they are familiar with (a novel, a film or TV show, a communication theory from another class, a piece of technology, or a historical figure or event). They should look at: (1) the structure and content of the main page, (2) the talk page and relevant discussion points, and (3) the history of the page and the talk page, including the first version of the page. (5-10 Minutes)
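For instructors or students who want to examine a page's revision history outside the browser, here is a small optional sketch, assuming Python with the third-party requests library, that pulls recent revisions from the public MediaWiki API. The page title, revision limit, and User-Agent string are arbitrary choices for the example.

```python
# Sketch: fetch a Wikipedia page's recent revision history via the MediaWiki API.
import requests

def recent_revisions(title, limit=10):
    """Return the most recent revisions (timestamp, user, edit summary) of a page."""
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "prop": "revisions",
            "titles": title,
            "rvprop": "timestamp|user|comment",
            "rvlimit": limit,
            "format": "json",
        },
        headers={"User-Agent": "class-activity-demo/0.1"},  # identify the script politely
    )
    pages = resp.json()["query"]["pages"]
    # The API keys results by internal page id; we only asked for one title.
    return next(iter(pages.values())).get("revisions", [])

for rev in recent_revisions("Dr Pepper"):
    print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```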

 

Examples:

 

Dr. Pepper

http://en.wikipedia.org/wiki/Dr_Pepper

 

Steve Jobs:

http://en.wikipedia.org/wiki/Steve_Jobs

 

The Assassination of John F. Kennedy:

http://en.wikipedia.org/wiki/Kennedy_Assassination

 

50 Shades of Grey:

http://en.wikipedia.org/wiki/50_Shades_of_Grey

 

Questions:

  • What aspects of the topic were on the page? What was relegated to separate pages? What was missing, if anything?
  • What were the main points of controversy in the talk page?
  • What kinds of changes were made over time? Were they updates to the topic? Were they major changes to the content and form of the article?
  • How do the editorial policies above shape the content of the page and the discussion on the talk page?

○      Calls for citations, for better sources, and for discounting personal anecdotes as original research and, thus, unsuitable.

Goals:

  • Practice skills needed for the research paper
  • Remember that Wikipedia pages are dynamic, both over time and in content
  • Understand how the editorial policies above shape the pages displayed

Pull Back: Some Recent Issues (5-10 Minutes)

Gender Gap among Wikipedia Editors:

http://gizmodo.com/5942168/the-wikipedia-gender-divide-visualized

Define Gender Gap? Look Up Wikipedia’s Contributor List:

http://www.nytimes.com/2011/01/31/business/media/31link.html?_r=3

 

Philip Roth encounters trouble editing his own Wikipedia page

http://www.csmonitor.com/Books/chapter-and-verse/2012/0913/Philip-Roth-encounters-trouble-editing-his-own-Wikipedia-page

"An Open Letter to Wikipedia" - Philip Roth

http://www.newyorker.com/online/blogs/books/2012/09/an-open-letter-to-wikipedia.html

Questions

  • In what ways do the editorial policies act as a barrier to contribution?

○      The weight of citations overwhelms even claims made by the subject of the article in question.

○      The community’s emphasis on meritocracy and “correctness” mobilizes privilege under the guise of “correct voice” and “citable sources” which shuts out marginalized voices.

  • What possible alternatives could there be to increase participation and the kinds of voices represented on Wikipedia?

------------------------------------------------------------------------------------------------------------------------------------------------------------

Week 5: Advertising a New Medium (Meryl Alper)

Warmup (10 min): “Advertising” New Media

 

Screen two YouTube clips:

1) Lots-o'-Huggin' Bear commercial (circa 1983)

2) Japanese Lots-o'-Huggin' Bear commercial

Questions:

Who do you think is the intended audience for these commercials?

What do you think these videos are trying to sell?

 

Main Activity: Advertising “New” Media (30 minutes: 20 minutes in group, 10 minute share with class)

 

Humans tend to overestimate the "newness" of new media.  Not only do many technologies build on the innovations that came before them, but the way a medium is advertised also builds, incrementally and creatively, on prior advertisements and advertising styles.

 

In the book chapter you read, Lynn Spigel talks about "popular media discourses" - ways people talk about or represent (through media) how society experiences media.  Spigel's central claim is that popular media discourses about television and the family reflected sometimes conflicting viewpoints: that TV would bring families together, that it would drive them apart, or some hybrid of the two.  She analyzes popular magazine ads as evidence for her claims.

 

This activity will be an exercise in meaningfully comparing and contrasting two print advertisements from different eras that share some common themes and styles.

 

Students will break into groups of 3 or 4.  All students will have had the PowerPoint sent to them prior to section.

 

The PowerPoint has 6 different pairs of advertisements:

 

1A - RCA VideoDiscs - “How to improve your social life” - 1980s

1B - Hohner Harmonicas - “The Hero of Amateur Hour” - 1940s

 

2A - Dumont Television - “Once upon a time...” - 1940s

2B - Atari - “‘New Frontiers’: Learn to brave new worlds.” - 1980s

3A - Sony - "Sound of a different color" - 1980s

3B - Majestic - “For sparkling, vivid colorful tone...” - 1940s

 

4A - Western Electric - “There are still some things Americans know how to do best” - 1970s

4B - Tobe Filterette - “YOU BET the war has changed us!” - 1940s

 

5A - Douglas - “How satellites can give us low cost emergency telephone service” - 1960s

5B - Panasonic - "With a new Panasonic cordless phone, you won't sound like you're calling from another planet" - 1980s

 

6A - Sharp - “The first laptop designed to be your first laptop” - 1980s

6B - Bell Telephone System - “Television” - 1940s

 

Each group will be responsible for one pair of advertisements.

 

Questions:

1. Briefly do an online search for major US & global events during the era of each ad.  How might these ads fit into larger historical trends (e.g. wars, economic upturns and downturns)?

 

2. Read the "copy" (written text) that the ads use.  A) On its own, what meaning does the copy have?  B) When taking into account the full visuals of the ad, does the copy take on additional or different meanings? (You'll want to zoom in to take a closer look at the ads with smaller text.)

 

3. What kinds of anxieties and hopes do each of these ads reflect about:

  • Family life?
  • Social life?
  • Political life (in the US and internationally)?
  • Culture/stylistic trends?
  • Gender?
  • Economic issues?

 

4. Are the people in the ads actually using the technology, or are they props arranged around the object?  What does the space around the media look like?  How does this make a difference in the ad's message?

 

5. Finally, don't just describe each ad on its own; put both ads in conversation with each other.  How might they complement and/or contradict each other?

------------------------------------------------------------------------------------------------------------------------------------------------------------

Week 6: Hacker Week Discussion Activity (Andrew Schrock)

Introduction – What is open-source? (25 mins.)

Stephen Fry explains free software: http://www.youtube.com/watch?v=YGbMbF0mdPU

What do you make of open-source? How does it relate to previous concepts we've encountered in the class? Why do hackers like open-source? How can it be contrasted with more restrictive control over source code?

 

Protei, an open-source hardware project building oil-skimming bots: http://www.youtube.com/watch?v=vmZ_uy2Ehi4

 

Who is involved with this project? How does hardware hacking differ from software hacking? What observations can you make about the progression of the project?

Second part - Software hacking hands-on activity (20 mins.)

 

One theme of this class is thinking not just about how systems exist in isolation, but how information flows across systems that can talk to one another. Hacking describes a way of viewing technology with a critical eye in order to understand its inner workings.

 

If This Then That (IFTTT) is a website that connects "triggers" to "actions" across different "channels" (the services IFTTT can talk to). A trigger fires when something happens on one channel, and an action is then carried out on another. These trigger-action combinations are called "recipes" and can be shared publicly and modified. For example, every time you are tagged in a Facebook photo (trigger), you receive an SMS text message (action).
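To make the trigger/action pairing concrete, here is a minimal, hypothetical sketch of how a recipe could be modeled as a simple data structure. It is purely illustrative and does not use IFTTT's actual service or API; the Recipe class, the notify_by_sms function, and the event fields are all invented for the example.

```python
# Sketch: a "recipe" as a pairing of a trigger condition with an action.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Recipe:
    """A recipe pairs a trigger (a condition on an incoming event)
    with an action to perform when that condition is met."""
    name: str
    trigger: Callable[[dict], bool]   # returns True when the event matches
    action: Callable[[dict], None]    # side effect to run on a match

def notify_by_sms(event: dict) -> None:
    # Stand-in for a real SMS channel; here we just print.
    print(f"SMS: you were tagged in {event.get('photo_url', 'a photo')}")

# "Every time you are tagged in a Facebook photo, send me an SMS."
tagged_photo_to_sms = Recipe(
    name="Facebook photo tag -> SMS",
    trigger=lambda e: e.get("source") == "facebook" and e.get("type") == "photo_tag",
    action=notify_by_sms,
)

# A toy event stream standing in for the services a real recipe would poll.
events = [
    {"source": "facebook", "type": "photo_tag", "photo_url": "http://example.com/123"},
    {"source": "twitter", "type": "new_follower"},
]

for event in events:
    if tagged_photo_to_sms.trigger(event):
        tagged_photo_to_sms.action(event)
```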

 

In groups of 2, think of a cool or interesting recipe. Look to see if one has been created already. Either use that or create one of your own and make it active. Test it out. Did your idea already exist in a recipe? Can you think of triggers that you want but can’t find?

-----------------------------------------------------------------------------------------------------------------------------------------------------------

Week 7: YouTube's Many Communities (Rhea Vichot)

Group Activity 1: YouTube as Site of Community and Remix Culture
In groups of 2-3, look through and choose a video from a participatory culture you are familiar with. If you can't find one, you can also browse the YouTube charts: (http://www.youtube.com/charts/) and look through the Most Discussed and Most Favorited videos for the past week or month.

Questions:

1) Is it a commercial or amateur production? How can you tell?

2) What kinds of communities are these videos a part of? Is this a convergence of multiple communities?

3) Is the video critiquing or curating commercial content? In what ways?

4) Who are the creators of the content? How might that affect what is either being expressed or what sorts of comments are being made about the video?

5) What sorts of Intellectual Property (IP) are used? Are the uses of IP in your example defensible under Fair Use? How?

Group Activity 2: Creating Remix Videos

Using YouTube Doubler (http://youtubedoubler.com/), create a mashup of video and sound. Use the Google URL shortener (goo.gl) to post a link on Blackboard.

Examples:

"Ant on a Treadmill Vs. Breakfast Machine-Danny Elfman":

http://goo.gl/iH7Or

 

"Rooster Vs. Alex Jones":

http://goo.gl/6l1kz

 

Questions:

1) What sorts of Intellectual Property (IP) are used? Are the uses of IP in your example defensible under Fair Use? How?

2) What kinds of juxtapositions does your example make? Do the juxtapositions made, either in your example or the ones provided, make a critique of the media used?

(MORE TO COME)

What Happened in My Open-Laptop Exam Class? (Part Two)

Learning About Collective Intelligence

From the start, the group activities were framed in terms of notions of collective intelligence and participatory culture, themes which had been central to the first part of the semester. By the time they got to the group activities, students had written papers exploring how Wikipedia works, had participated in lectures and discussions explaining some of the core findings from the MacArthur Foundation's Digital Media and Learning initiatives, and had looked at a range of social media and media-sharing platforms and their dynamics. We had prepared for the problem sets with earlier inquiry-based activities in the discussion sections, organized around groups at a variety of different scales but ungraded (except in terms of attendance). Students had been given a set of exam questions about a week prior to the midterm, with a subset of the questions appearing on the exam. Students could bring their notes and other materials into the exam and consult them as they filled in their blue books.  Students had the option of sharing information or pooling insights with other students on the midterm, as long as they disclosed who they worked with. Most of the students seemed to work with at least one other person on the exam.

In one case, a team of students formed and posted their collective responses to each question on the class mailing list the morning the midterm was to be given. This unanticipated situation posed a last-minute challenge to the class instructors: we decided to write to the class, warning them that not all of the information contained in the posted answers was accurate, that they should use the material at their own risk, and that they should disclose whether they had consulted these responses in preparing their answers. It turned out that one of the students had taken the liberty of posting the work of the other group members, and some of them were not happy being placed in that situation. Other students said that they were afraid to even read the posted answers, but for the most part, the class took the situation in stride; there was still a great deal of diversity in the quality and content of the midterm answers. Whatever was going on behind the scenes, students did not mindlessly copy down the information that had been posted.

 

Taking the Final

The teams' performance on the final exam was uneven, but generally, the groups succeeded in creating richer, more fully documented responses than they would have been able to produce individually. Some of the responses felt fragmented and contradictory, as if the teams had not been able to fully smooth out differences between members about the best way to approach a question; some of the responses included too much information, including much that was not pertinent to answering the question.  We had tried to break each question down into a series of steps, much like the weekly problem sets, so that students had a good way to structure their problem-solving activities. In general, students did best where the questions were concrete and pointed to specific readings or topics from the class; they had more difficulty abstracting from the information provided, speculating about its future implications, or evaluating real-world phenomena based on proposed criteria. The collective process brought forward a strong tendency towards synthesis but set clear limits on their capacity to produce shared critiques. While some of the questions explicitly called on them to bring in their own examples, they tended to still operate within the borders of the class materials rather than going outside in search of new information. These latter insights might be consistent with what we know about Wikipedia, for example: that participants are often guided by a shared understanding of what an encyclopedia entry looks like, that the community's norms value "neutrality" over critique, and that there is a ban on publishing "original research."  Success here rests, then, on correctly calibrating our expectations to value what works well in a collaborative context.

Student Criticisms

For those students who found this process frustrating, the largest single factor identified was a sense of loss of control over their own classroom performance.  One put it simply, “I have more control of my grade the first half of the semester and less control of my grade the second half of the semester.” Many of the USC students are very good at playing the traditional classroom game, calculating how many points they needed to get their desired grades, and giving the teachers what they wanted. If they grew up in a networked culture, they also grew up in a culture based on standardized exams, and so there was a certain degree of discomfort, among many of the students, with a more open-ended process which did not tell them what they needed to know and with a structure which meant that they were dependent on others for their mutual success. As one student explained:

"I preferred doing things on my own because I got stuff done much faster and more efficiently. I did not like relying on my other group members to do readings because I never knew if they had done them properly or not, and some of my team members did not even show up to a single class. That meant that they were going to receive the participation in lecture points based on my participation, and that does not seem fair to me at all.”

 

Others felt bruised by the lack of respect and trust shown them by other team members: "In order to work in a group, people have to understand the strengths and weaknesses of their group members and they have to be flexible.  When there are group members that don't trust other group members and want to constantly be in control, the group fails." One student described the final exam as a "debacle" because the group could not agree on strategies or criteria for producing a solid answer, while another complained about harsh treatment from classmates who did not value each other's contributions: "I have never felt so disrespected in my entire life. Some of the other group members made me feel like dirt, just because they thought that they were better than two of us.”

Many of the frustrations centered around unequal sets of expectations between team members, including a different sense of how well they wanted or needed to do in the class. Here, for example, was a student who compared the experience of working with an assigned group in a required class unfavorably to the processes which make collective intelligence work outside the classroom:

“I did not particularly enjoy the group portion of the class because I did not trust certain members of my group to complete the work and to do it well. Although the group portion theoretically could have stimulated more conversation about the topics and inspired people to participate in their learning, a couple of members in my group seemed very uninterested and content to skate by on the work my other group members and I did...  I fully understand the value of learning how to work in groups, especially given our shift toward participatory culture; but I assume in participatory culture, the participants actually have some glimmer of interest in the content they are creating."

 

Student Enthusiasm

Those who had a more successful experience felt supported by their teams and energized by the shared responsibility for the material:

“It was nice to have other people to help with the assignments. Our team worked very well together, and I think learning how to work in teams is an important skill to have. In the second half of the semester, I was pushed to do my assignments because I knew that the team relied on me. Compared to having to do assignments alone, it was nice knowing that if there was a reading that I didn't understand, then there was somebody in the group that could help contribute.”

*********************************** "Honestly, I was a little skeptical as to how group work would ultimately play out and whether it would be successful, but to my pleasant surprise it was a great success. Just as the class was intended, different teammates were responsible for different materials and therefore were able to master different contents of the class and teach them to their team members. While I felt that the first half of the class was also well done, I had an even better learning experience in the second half of the class. While there was some participatory activity going on in the first half of the class, I believe there was a well-working participatory culture in the second half. The professor and the TA's structured the discussions very strategically to be able to push the students to work quickly and efficiently in their teams by grouping their knowledge into a collective product. I genuinely feel that this made the team much greater than the sum of its parts.”

*********************************** “Group work is definitely more challenging. However it challenged me to practice better negotiation and communication skills. I would consider the second half a practical application for all the communication theories learned in past years”

************************************ "I really liked having the groups for the readings and in-class discussions. I felt that I was able to cover so much more material (even if only through the short-hand of my teammates) by examining the notes for ALL of the readings on our Google Doc. I felt that I was more informed coming in to lecture. The first half of the semester, it was often difficult for me to get all of the assigned readings done. But with only one reading per night, it was a lot easier. Plus, I had the weight of my team to encourage me to actually get it done on time."

In many cases, they were thrilled not to have to go it alone, to be able to turn easily to someone else on the team who understood a particular chunk of course material better than they did. And even some who did not have a perfect group experience saw the value of the process in the end:

"If anything it made me realize that we all have limitations. One person can not carry a group. I feel that it all worked out in the end. I wish we had better communication within our group though."

Some of the teams clearly acquired new techniques for coordinating and collaborating within a network: “Working on assignments together via Google Docs was very helpful because we each knew our roles and could quickly add to each other's work if needed.”

 

Assessing the Experience

For all of the frustrations expressed by some students about teammates getting equal points despite not doing equal work, a review of the grades by group suggests there was significant variation in final performance within each team, in part because of individual performance in the first half of the semester and in part because the mechanism of rewarding those who attended and participated in sections worked as it was designed to do.

Overall, students seemed to have reflected deeply about the advantages and disadvantages of the collaborative production of knowledge, a theme which recurred throughout the class, and in the process, they developed a stronger appreciation of research as a process rather than imagining knowledge as a contained body of information. There's still a lot we all have to learn about making these kinds of group processes function, especially given the degree to which they fly in the face of the ways students have been socialized throughout their formal education to think of themselves as autonomous learners. Clearly, I am troubled by the reports of some of the destructive experiences which occurred within some of the more dysfunctional groups; yet, overall, many more students expressed enthusiasm for the process than shared frustrations.

Interestingly, when I taught the subject two years ago with a much more conventional grading scheme, the average GPA for the class was 3.14, while the average GPA for the class with the collective experiment was 3.21, well within the average variation from one semester to the next.

 

 

 

NEXT TIME: THE DISCUSSION SECTION ACTIVITIES

COMING SOON: THE EXAM

What Happened With My Open-Laptop Exam Class (Part One)

Background

My plans for an open-laptop exam generated a fair amount of buzz when I announced them in the fall, so I figured you would be interested to learn more about how things played out. Annenberg PhD student Adam Kahn, who helped design this curricular intervention/innovation, is still working through a massive amount of survey data collected about the process, so any observations I share now are provisional, based primarily on what I saw from the front of the lecture room and on exit surveys students completed after turning in their final exams.  In general, I think the experiment was successful, even though, as with any design process, there are many things I would change on the next iteration. And, as we will see, the experience had some critics among the students in the class.

To remind you, the basic setup was this: Students completed a series of individual assignments throughout the first part of the term, which counted for 50 percent of their total grades. In the second part, they were put onto teams, which worked together on every assignment, including a series of weekly problem sets conducted in the discussion sections, contributions to class discussion, and the final exam. Students had to attend the discussion section in order to receive the team's points for their contributions, but otherwise, participants received their grades based on collective rather than individual performance. We introduced this process into a 200-level lecture hall class on New Media Technologies and Culture, with a population of 110 students, mostly Communication majors, taking what was a required subject for their degree.  You can see the syllabus for the class, including the assignment structure, here.
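To make the split between individual and collective grading concrete, here is a minimal, hypothetical sketch of how a final grade could be computed under the scheme described above. The specific weights, activity names, and the student_grade function are invented for illustration; the post only specifies that individual work counted for 50 percent and that a student had to attend a discussion section to receive the team's points for that week's problem set.

```python
# Hypothetical illustration of the grading scheme described above.
# Only two facts come from the post: individual work = 50% of the grade,
# and team problem-set points require attendance. All weights, activity
# names, and scores below are invented for the example.

def student_grade(individual_pct, team_scores, attended_weeks):
    """individual_pct: 0-100 score on the first-half individual assignments.
    team_scores: the team's 0-100 scores keyed by activity name.
    attended_weeks: problem-set weeks this student actually attended."""
    # Assumed split of the collective 50%: final exam, discussion, problem sets.
    weights = {"final_exam": 0.25, "class_discussion": 0.10}
    problem_sets = [k for k in team_scores if k.startswith("ps_week_")]
    for week in problem_sets:
        weights[week] = 0.15 / len(problem_sets)

    collective = 0.0
    for activity, score in team_scores.items():
        # A student who missed a section forfeits that week's team points.
        if activity.startswith("ps_week_") and activity not in attended_weeks:
            score = 0.0
        collective += weights.get(activity, 0.0) * score

    return 0.5 * individual_pct + collective

print(student_grade(
    individual_pct=88,
    team_scores={"final_exam": 90, "class_discussion": 95,
                 "ps_week_3": 85, "ps_week_4": 92},
    attended_weeks={"ps_week_3"},  # missed the week 4 section
))
```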

 

Impact on Class Discussion

My first observation was that the emotional tone of the class shifted dramatically following the midterm as we placed students on teams. The teams sat together in the lecture hall; they chose a shared name, and they used that name to identify themselves when they participated in the class discussions. From the start, there was a strong sense of team identity for most of the groups. I’ve speculated that this approach might work especially well in the context of USC where there is such a strong sports culture.

From the start, I had placed a strong emphasis on class participation during the lecture sessions, trying to move towards a more Socratic approach to teaching the content. There had been pushback early on when I relied too heavily on discussion, and so I had tried to find a balance between short lectures designed to introduce core concepts and more open-ended discussion to allow students to share their perspectives on core debates of the digital age. We struggled a bit with managing discussion in a large lecture hall context: students balked at the mechanics of passing around microphones, but without them, some of the students had trouble being heard in the large space and were thus more reluctant to speak. Over the course of the term, the process started to feel more natural for both the teacher and the students, and we had some very engaged and informed conversations.

As with any discussion class, there were a number of students who were quick to raise their hands and engage, while others were intimidated by the large size of the class. The most active participants continued to dominate discussion in the second half, but there were many others who made their first contributions during this period, either empowered by having teammates supporting them or by the sense of competitiveness that teams introduced into the mix. As one student explained, "I liked that we all sat together during lecture. This enabled us to whisper about the lecture content and, all together, come up with a question to pose or a comment to offer." More dramatically, team members were much more likely to anchor their contributions to specific statements or information contained within the readings. Indeed, it was clear that a much higher percentage of the students had done the readings, and done them closely, knowing that they were dependent upon each other for the quality of information being transmitted to the group.

A highlight of the course came when we conducted a role-playing activity in one of the lecture sessions focused around debates about digital piracy and the evolution of new business models for the music industry. Each team was assigned a specific role -- from new artists trying to break into the industry to recording studio executives, from fans to teachers and librarians, from religious performers to international musicians who are developing a following in the United States. The teams were assigned their parts in advance and encouraged to do a little homework so that they had thought through their assigned perspectives. Each group was asked to make an opening statement, and these were surprisingly well informed for the most part; then they were given time to negotiate across groups to see if they could identify common interests and propose new solutions to the issues. This was the only time in the term when we encouraged activity across groups rather than within groups, and multiple students pointed to this activity as transformative in terms of their understanding of the value of the team process. It also resulted in a spectacular discussion which got students out of familiar debating points around issues of digital piracy and allowed them to develop a more systemic understanding of the issues. I would love to find a way to create more such experiences across the class the next time I teach it.

Working Within Teams

Students were placed randomly on teams, in the hopes of ensuring greater diversity. On the one hand, we felt that if students self-selected their teams, they would be more likely to choose people with whom they already shared many common interests, i.e. people who were like themselves. On the other hand, we also wanted to avoid the common pattern of consciously combining strong and weak students on teams together, which tends to result in the stronger students being asked to carry the load by themselves.  In the exit surveys, students were sharply divided between those who felt that the random assignments ensured that they met new contacts and brought more diverse knowledge together and those who felt that some of the logistical problems they encountered would have been minimized if students had been able to work with people they already knew.  Here, for example, was a student who valued being randomly assigned: "When my group worked, we worked efficiently because we didn't know each other at all, so there were few distractions. We were friendly, but didn't have a lot in common, which was conducive to learning the subject material." Yet this student also noted that their lack of familiarity with each other could sometimes result in a lack of accountability: "I didn't make it to class the first day and realized later that no one in my group had taken any initiative to do the necessary organization for future readings, in-class work, etc. No one was really a leader. We couldn't count on each other. There were no ground rules set, etc."  Some students wanted better mechanisms for dealing with students who failed to contribute to the collective good: "I think the students should either be able to choose their own groups or somehow get rid of the weakest link." The large scale of the lecture class makes it particularly likely to attract students who are not strongly motivated by the subject matter and who are likely to exploit the good will of their classmates.

Each team consisted of 3-5 students (with the unevenness a product of the uneven number of students who had registered for the different discussion sections, which met on different days and at different times). It was clear from the start that the larger teams worked better overall, with smaller teams more vulnerable to individual students who let down their team through under-performance.

Most of the teams became effective learning communities, but not all of them did. We had taken steps to ensure shared expectations among members, asking each team to write a contract together so that they had a mutual understanding of their responsibilities to each other. We had built in one core check on group participation -- i.e. the students had to attend the discussion section and work on the problem set in order to gain credit for that assignment.  Otherwise, we relied on social mechanisms to ensure that they held each other accountable. Through these weekly problem sets, students gained practice working together, learning each other's strengths and vulnerabilities. We had felt that using the discussion sections in this way would ensure some regular face-to-face time between group members (as did having students sit in teams during lecture).

Overall, attendance in discussion sections increased with the emergence of a team structure, though there were still many students who did not attend class regularly, a manifestation of the "free-loading" problem which often crops up when working within a commons. And for those teams which were struggling with the process, there was a perception that the instructors were not doing "anything" about it. We wanted to resist the temptation to shuffle the teams once the process began, both because doing so would be likely to disrupt the coherence of those teams which were functioning well and because we wanted to encourage teams to find ways to work through their own problems, seeing learning to self-correct their process as an important learning opportunity. In many cases, teams that did not gel at first did find their footing over time, part of the value of repeated experiences working in teams, while in some cases, teams that had worked well up until that point hit real friction when they turned their attention to dealing with the high-stakes final exam. Here, for example, was a student who felt the group had gotten into the swing of things just in time for the exam: "My group members let me down on numerous occasions but our final went so well and so smoothly that I'm having a hard time deciding how I felt about the whole thing overall."  TAs did give advice to team members who were having a frustrating time; we felt that there were penalties built into the system for those members who under-performed -- again, the fact that they did not get points for sections which they missed and the likelihood that underperforming students had also underperformed during the individual portion of the class.  Next time, I want to provide much greater advice to the students about strategies for ensuring team cohesion and meaningful interaction.

We struggled with the question of whether we should have introduced some self-evaluation process where team members could assess what each contributed to the process so that we could adjust grading accordingly. We chose not to do so for several reasons: we feared that such a practice might further fracture teams which were struggling to survive, raising the tension level at the time when we wanted teams to be developing greater trust in each other, and, as importantly, we felt that it would be inappropriate to change the rules of the game mid-process.  Next time we do this, I am going to weigh this question again more closely, since the lack of such formal mechanisms was the single most frequent complaint we heard about the group activities.

 

Designing Problems

Designing the problem sets for the discussion sections proved challenging for a number of reasons. We wanted the questions to be sufficiently challenging that students were motivated to put in the extra effort and were also able to see that they could indeed do more collectively than they would have been able to do individually. We wanted the questions to be open-ended enough that students could show what they knew, bring their individual and collective knowledge beyond the class into the process, and have a chance to dig deeper into their own passions and interests. We also wanted questions which relied on as many of the readings from the week as possible, since we were encouraging students to divide up the readings between them and then deploy what they needed in response to each problem. Early on, it was clear the teams needed more guidance on the best way to find the information they needed, and the challenges of working in an hour-long discussion section (well, 50 minutes really) meant that we needed to simplify the options in order to allow students to get out of the gate more quickly. Here, for example, is how one student described their team's frustrations:

"The assignments given in discussion sections were rather long and difficult for the amount of time allotted to students to complete them. The assignments also placed a large emphasis on the skill of being able to produce quick thoughts and responses to questions that students were not fully prepared to answer. If the questions were given prior to coming to class, it would have helped to allow students to come in more prepared and produce more thoughtful and engaging responses."

We streamlined the problems week by week, but students still complained that they did not have time to fully complete the assignments during the class period. (I am going to share the assignments with you in a follow-up series of posts.)  We had been reluctant to extend the time for working on the problem sets because we were afraid the most anxious students would turn them into a much bigger project than intended and because extending them beyond the class time would increase the logistical challenges involved in working with teams.

While most of the students complained about the time constraints, some felt like we had achieved an ideal balance: “I think that the discussion section questions struck the perfect balance in that they pushed the students to produce a lot of quality work in a short amount of time, yet it was completely fair as our knowledge was collaborated from what we obtained throughout the week. I was always very satisfied and impressed with the work we were able to produce in such short periods of time.” Some students used the practice runs to rehearse strategies and refine skills in preparation for the final: “The activities done during discussion section were also beneficial because you could kind of gage what people's strengths and weaknesses in the course material were and how it can be applied to the final.”

Overall, we felt the quality of the problem set responses was strong, with most of the teams scoring in the A-B range, and with signs of general improvement over time, suggesting that, in most cases, the teams were learning to work better together each time they confronted a new problem.

(MORE TO COME)

 

Spreading Independent and Transnational Content

As we count down to the widespread release of our new book, Spreadable Media: Creating Value and Meaning in a Networked Culture, which I co-authored with Sam Ford and Joshua Green, we are rolling out this week five more essays -- in this case, dealing with core issues from the book's chapters on independent media and transnational media flows. One final crop of essays from the project will go online next week. By now, some of you may well be receiving copies of the book advance-ordered through Amazon or New York University Press. We'd love to know what you think. I was lucky enough to be able to share some thoughts about this project this past week with faculty and students at Concordia University.

 

The Long Tail of Digital Games

In the raging debate over the legitimacy and consequences of the “Long Tail” theory (Anderson 2006), few markets have received more attention than those dedicated to digitally distributed video games. Proponents of the Long Tail have argued that digital distribution will finally turn the historically hit-driven game industry on its head—that future revenues will be driven by consumer activity distributed across a huge catalog of video games developed, in large part, by independent game developers as opposed to titanic publishers; that it will prove consistently more profitable to focus on niche audiences in this new world of digital game distribution, rather than to focus on the development of broadly appealing hits; and (for those of us interested in the spreadability model) that a new generation of empowered consumers will actively seek and promote the highest-quality content, driving revenues to the most deserving game developers and leading to a healthier and more vibrant video game ecosystem overall.

There can be no doubt that encouraging signs of this development have begun to crop up everywhere. Many now-prominent independent game developers, such as The Behemoth and 2D Boy, have leveraged console-based digital distribution platforms such as Xbox LIVE, Wiiware, and the Playstation Network (PSN) to reach markets that were previously only accessible via the long arm of a traditional publisher. These developers have not only created award-winning games that have generated significant amounts of profit. They have, in many cases, retained the rights to their intellectual property (IP) and operated with near-total independence, an unthinkable situation for small console game developers only a few years ago. And, while digital distribution on the console typically generates the most buzz, independent developers have made equally great strides on mobile devices, the web, and the PC thanks to a wide variety of channels (stores such as iTunes, Android Market, and Steam; portals such as Kongregate.com; and more generalized distribution through social network sites such as Facebook, to name just a few). Savvy observers have noted that in mobile ecosystems in particular, independent developers have consistently had greater success than traditional publishers in cracking into the “top 10.”

MORE

(Sp)reading Digital Comics

Comic books—especially single issues, or “floppies”—have always been spreadable. As kids in the 1980s, my friends and I would head into our local comic shop, each emerge with an armful of floppies, then spend the afternoon first reading through our own haul and then each other’s. Usually, at least one of my friends’ floppies would be from some larger multipart story arc, and, if it was any good, I’d either go digging through my friend’s collection or thumbing through the store’s back issues to find out what was going on. Sharing, recommendation, drillability, and vast narrative complexity were all part of our everyday lives long before we could even drive.

Webcomics have emerged as an alternative form of publishing that makes such practices even easier. Many webcomics use RSS feeds to deliver new installments via email or RSS reader applications, and many webcomics offer forums where fans can chat and bicker and share their favorite comics with one another, much as my friends and I did in person so many years ago. Now, I can recommend comics to friends around the world either by emailing them a link to a webcomic’s site (and thus the latest comic) or a “permalink” to the archived page or, more commonly now, by texting, IMing, or Facebook messaging them such a link. Many webcomics, such as Emily Horne and Joey Comeau’s A Softer World, include built-in widgets for fans to recommend them on online services such as Digg, Facebook, Reddit, StumbleUpon, Del.icio.us, Technorati, and Twitter. Scott Kurtz’s PVP includes widgets to share each strip on twenty different services.

Unlike traditional print comics, for which most writers and artists labor under “work for hire” contracts for large publishers such as Marvel and DC, webcomics are typically owned and operated by their creators and rely on revenues generated by advertising, fan subscriptions/memberships, or sales of ancillary merchandise. As a result, for creators, getting individuals to purchase a single instance of their work (such as a traditional print floppy) is less important than establishing an ongoing relationship, aggregating a large recurring audience over time. The simplicity of the URL system supports this—when recommending a comic to a friend, I could copy and paste an image of the comic itself into an email, stripping out the context, ads, and links to the related merchandise, but why bother when sharing a link is so easy?

MORE

 

The Use Value of Authors

A key dilemma for both media consumers and producers in today’s media environment is discoverability: with so much media spreading, and even more desperately wanting to be spread, how do we choose what to consume? Consequently, consumers need highly effective filters to direct them to the media they are most likely to enjoy and away from that which they are unlikely to enjoy; producers, meanwhile, need to develop techniques to ensure that their content enjoys safe passage through such filters and finds the audiences most likely to enjoy their work. Herein lies the importance of, and the use for, authors.

As compared to creative figures—producers, writers, artists, designers, and a wealth of other terms in common parlance to describe those who make media—an “author” is someone to whom we attribute a heightened level of authority and autonomy over the item of media in question. Most consumers operate on the assumption that a vast amount of media isn’t worth personally consuming, either because it is corporate hackery written by committee just to make a fast buck, because it is amateurish and incompetent, or simply because it doesn’t appeal to any of their interests. An author, though, is a totem of sorts that signifies a certain level of skill and singularity of vision. To talk of authors for professionally produced content is to assert creativity and self-expression in what can too often be characterized as a faceless, paint-by-numbers industry, while to talk of authors for amateur-produced content is to attribute artistry in what can too often be characterized as a world full of everyone’s uploaded cat videos. Discussing authors can be a way to validate the product of said authors, and hence to allow ourselves to discuss art, meaning, and depth in some popular media without attributing artistry or depth to all popular media.

At the same time, precisely who the author is can be hotly contested and variable, as the content industries may pose one author, while fans may look to others, sometimes working to uncover who the “real” author is. For instance, while The Simpsons is often popularly spoken of as Matt Groening’s, many fans have nominated other individuals in the show’s production as the true source(s) of the show’s perceived brilliance, and hence as its author(s). The fact that people would bother to argue over who the author is should signify how much the title of author matters, and it offers an initial sign of the importance of authors. MORE

 

The Swedish Model

Sweden is a small country, yet it has one of the world’s biggest and best-selling music scenes. You might think ABBA, and you wouldn’t be wrong, but they’re just the best-known starting point of a very long tail, with thousands of bands spanning every genre and degree of success. Sweden is also home to The Pirate Bay, the world’s top torrenting site, which ABBA songwriter Björn Ulvaeus has decried as made by and for those who are lazy and stingy and don’t understand that, if creators can’t anticipate payment, they will never release music (“ABBA Star” 2009). Since the advent of recording in the early twentieth century, recorded music has been the central economic good of the music business. Hence, it is no wonder that the mainstream industry has been so vociferous in its efforts to demonize and sue uploaders and to support national policies that limit the ability of listeners to spread music.

Further down the tail, though, Sweden is home to many artists and labels trying to forge a new way through this thicket, one that rejects the notions that certain payment is a precondition for artistic expression or that file sharing detracts from the economics of their business. The attitudes and actions of The Swedish Model, a consortium of seven independent labels committed to a more optimistic dialogue on music’s future, and other Swedish labels and musicians put spreadability at the center of their hopes for the future of the music business. The tiny label Songs I Wish I Had Written, headed by Martin Thörnkvist, who also heads The Swedish Model, shared an office with a Pirate Bay cofounder, and Thörnkvist uploads his label’s catalog in the highest quality to Pirate Bay. Labrador, another Swedish independent label, gives away annual samplers through Pirate Bay and posts all its singles for free download on its website.

These entrepreneurs have taken to heart that if their music doesn’t spread, it may as well be dead. The logic goes like this: We are small and have minimal budgets. There are few mainstream venues that will promote our music, so few people will have the opportunity to hear it through mass media. The more people who hear it, the larger the audience will become. Even if most of that audience does not pay for CDs or mp3s, the slice that does will be bigger than the entire audience would otherwise have been. And the slice that doesn’t pay to buy music may well pay for other things. As Thörnkvist put it when addressing the music industry audience at MIDEMNet, “I’d rather have one million listeners and one hundred buyers than one hundred listeners and one hundred buyers” (2009).

MORE

 

Transnational Audiences and East Asian Television

Consider a clip from the Japanese variety show Arashi no Shukudai-kun that made its way onto YouTube in early 2009: a small group of Japanese pop singers are challenged to eat a "surprisingly large" hamburger named after a city in the Ibaraki prefecture and joke about how "Super American" the situation is. They suggest that the burger inspires them to don overalls and grow "amazing" chest hair, while Bruce Springsteen's "Born in the U.S.A." blares in the background. The clip was then subtitled in English by two fans based in Australia and circulated based on its appeal to English-speaking audiences of the "J-pop" performers in the video as an embodied spectacle of Japanese popular culture. Various versions of the clip were distributed online through fan communities on LiveJournal, a Russian-owned social blogging platform with offices headquartered in San Francisco, and other forums, and fans shared the links through their blogs, Facebook, Twitter, Delicious, and other social media channels. In the process, the Arashi no Shukudai-kun clip was recontextualized, reformatted, resubtitled, and diverted to new (and sometimes unexpected) audiences at every step along the way. Far from being exceptional, there are countless clips like this one on YouTube: in the global spreadable media environment, such crisscrossing paths back and forth across multiple national, linguistic, and cultural boundaries are becoming perfectly common.

Not only is the transnational movement of media becoming increasingly pervasive; it has also become significantly more—and more visibly—multinodal. Thus, we must go beyond the use of Bruce Springsteen in the background of a Japanese variety show as part of a parody and indigenization of Western cultural materials to consider its subsequent movement as it is taken up, translated, and circulated by grassroots intermediaries, passing through divergent and overlapping circuits, often outside the purview of established media industries and markets. In short, we must look beyond sites of production and consumption to consider the practices of transmission and the routes of circulation—the means and manner by which people spread media to one another—which are increasingly shaping the flow of transnational content.

MORE

 

 

 

A Game Level Where You Can't Pass

  This fall, I had the opportunity to teach a PhD seminar on media theory and history focused around issues of medium specificity and intermediality as part of USC's iMAP Program. Here's how the Cinema School describes that innovative degree program:

Created in 2007, the interdivisional program in Media Arts & Practice (iMAP) situates technology and creative production alongside the historical and theoretical contexts of critical media studies. This practice-oriented Ph.D. program provides students with both practical experience and theoretical knowledge as they work to define new modes of research and production in the 21st century.

Media Arts & Practice was inspired by recent developments in media and technology that have altered the landscape of media production, analysis, distribution and display. Our goal is to support a new generation of scholar-practitioners who are able to combine historical and theoretical knowledge with creative and critical design skills. Students who complete a Ph.D. in Media Arts & Practice will be uniquely prepared to shape the future of media and scholarship, and to actively engage in the emerging cultural, technological and political dynamics of a global media ecology.

Media Arts & Practice integrates the strengths of each program within the School of Cinematic Arts (production, critical studies, writing, interactive media, and animation & digital arts) by offering students the opportunity to substantially design their own course of study. The core iMAP curriculum consists of three foundational courses in design, media and theory, plus a professionalization seminar devoted to exploring emerging movements in media technology, theory and practice. Students have unprecedented freedom to define and pursue their own specialization by drawing on the course offerings and world-renowned faculty across the School of Cinematic Arts and utilizing the resources of the school's state-of-the-art digital production facilities.

You can imagine how much fun it is to introduce a comparative media studies perspective to this diverse, creative, and intellectually engaged group of students, and to help them think more deeply about how theoretical and historical perspectives might further inform their own expressive practice as media artists and designers.

What follows is one of several essays produced for the class which deal with cutting edge developments in the Independent Games world. In this case, Micha Cárdenas discusses two recent games which seek to explore transgender identities and experiences.

A Game Level Where You Can’t Pass Micha Cárdenas

When one plays a video game on a computer, does the game maker’s identity matter? Or does the player’s identity matter, in terms of game play? How does one understand the identity of a game theorist in relation to their writing? Recent independently produced games by transgender women game designers Merritt Kopas and Anna Anthropy open up a series of questions about the nature of computer gaming. How much does the metatext for a game shape our experience of it? How do players identify with characters whose gender they do not identify with, or understand at all? How can the rules of oppressive social structures like the binary gender system become part of game play? To consider these questions, I will rely on theories from game studies scholars as well as on comparative game examples.

Lim is a game created by Merritt Kopas that is playable in web browsers that support HTML5. I found this game because I am Facebook friends with Kopas. She posted this on Facebook and on Tumblr: ‘I made a game called Lim. A friend describes it as being about “the tension and violence and dread and suffocation of passing.” Play it online here.’ The kind of passing being referred to here is passing as one’s desired gender, as in the case of a transgender woman, along with the degrees of violence, from verbal to physical, which ensue when one fails to pass as male or female.

My experience of the game was certainly shaped by the metatext in this case. As it is an extremely simple graphical game in which the player’s avatar is a colored cube and other colored cubes attack that avatar if its color is different, there are many possible readings of the game. On the game creator’s website, the link to play the game is preceded by this quote from Erving Goffman: “there seems to be no agent more effective than another person in bringing a world for oneself alive, or, by a glance, a gesture, or a remark, shriveling up the reality in which one is lodged.” As such, the game is framed from the outset as being concerned with social identity and its difficulties, but it is only situated as a game about transgender experience by virtue of its creator’s identity.

This opens up a host of questions, such as how much a game, or any work of art, should be evaluated based on the identity of its creator. Further, given the knowledge that the game depicts the violence of the everyday experience of transgender people, how do cisgender (non-transgender) players experience the game? Richard Schechner’s response to Markku Eskelinen is useful here, when he asks, “What we don't know about the ‘real life’ of computer games are the social circumstances that surrounds, and to a large degree guides, their playing. That is, what ‘other’ stories are the players enacting?” While in many cases such subversive readings allow transgender players to see themselves in cisgender characters, in the case of Lim we can understand multiple readings of complex social interactions to arise from the very simplicity of the aesthetic and from players’ capacity to identify with incredibly simple objects when they can control them in a game context.

From a game studies perspective, the question of a player’s identity in relation to a character has been considered largely in terms of race and binary gender configurations. While Henry Jenkins writes in the section “Play as Performance” that “we don’t speak of controlling a cursor on the screen when we describe the experience of playing a game; we act as if we had unmediated access to the fictional space,” such claims of immediacy seem to elide the possible alienation some gamers feel based on the disjunction between their identities and the available avatars. In From Barbie to Mortal Kombat, Cassell and Jenkins write “historically, gender was an unexploited category in video game design, with male designers developing games based on their own tastes and cultural assumptions… Yet, as feminist critics note, as long as masculinity remains the invisible norm, the default set within a patriarchal culture, unselfconscious efforts are likely to simply perpetuate male dominance.” While it would be an easy step to transpose this claim to cisgender game designers perpetuating the dominance of the gender binary, as most games today still depict primarily male and female characters, perhaps more interesting possibilities arise when one goes beyond simple claims for more representation of transgender people in games. What Lim demonstrates is a set of fundamental game mechanics that emerge from a life experience that exceeds gender binaries.

One striking characteristic of Lim is its sparse set of instructions. The only text on the actual page for the game is the following: “arrow keys move, z to blend in | note: contains flashing lights and shaking effects. Made with Construct 2 — the HTML5 game creator”. Given these instructions, the player is left to determine the game mechanics on their own. In a review of Lim, Porpentine writes, “Lim’s mechanics are the message…. you have to struggle and mash the keys and slide along the walls just to scrape into the next room and when it’s over you feel like you never want to do that again so you’re going to be really careful about passing in the future, it’s just not worth—ohhhhhh.”
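
To give a concrete sense of how little code a mechanic of this kind requires, here is a minimal, purely illustrative TypeScript sketch written for a browser canvas. It is emphatically not Kopas's implementation (Lim was made in Construct 2), and every name, color, and number in it is invented for illustration; it models only what is described above: a square that moves with the arrow keys, appears to "blend in" while z is held, and is shoved by neighboring squares whenever its apparent color differs from theirs.

```typescript
// Hypothetical sketch of a "blending" mechanic, loosely inspired by the
// description of Lim above. Not the game's actual code or assets.

type Square = { x: number; y: number; color: string };

const canvas = document.createElement("canvas");
canvas.width = 480;
canvas.height = 320;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

const avatar: Square = { x: 40, y: 150, color: "orchid" };
const others: Square[] = [
  { x: 200, y: 140, color: "steelblue" },
  { x: 320, y: 160, color: "steelblue" },
];

// Track which keys are currently held down.
const keys = new Set<string>();
addEventListener("keydown", (e) => keys.add(e.key));
addEventListener("keyup", (e) => keys.delete(e.key));

function step() {
  const speed = 2;
  if (keys.has("ArrowLeft")) avatar.x -= speed;
  if (keys.has("ArrowRight")) avatar.x += speed;
  if (keys.has("ArrowUp")) avatar.y -= speed;
  if (keys.has("ArrowDown")) avatar.y += speed;

  // Holding z makes the avatar temporarily take on the others' color.
  const blending = keys.has("z");
  const apparentColor = blending ? "steelblue" : avatar.color;

  // Nearby squares shove the avatar away whenever its apparent color differs.
  for (const o of others) {
    const dx = avatar.x - o.x;
    const dy = avatar.y - o.y;
    if (apparentColor !== o.color && Math.hypot(dx, dy) < 60) {
      avatar.x += Math.sign(dx) * 4;
      avatar.y += Math.sign(dy) * 4;
    }
  }

  // Draw the scene.
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  for (const o of others) {
    ctx.fillStyle = o.color;
    ctx.fillRect(o.x, o.y, 20, 20);
  }
  ctx.fillStyle = apparentColor;
  ctx.fillRect(avatar.x, avatar.y, 20, 20);

  requestAnimationFrame(step);
}
step();
```

Even in this toy form, the point the essay makes holds: the entire social drama is carried by a single conditional on color difference, which is why the sparseness of the instructions matters so much to how the game is read.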

Some of the most effective moments in Lim are its moments of total breakdown. Players have complained that at times the game becomes unplayable: one’s avatar gets blocked from proceeding, from passing as it were, and at other times the aggressive game enemies may knock one’s avatar completely out of the bounds of the game world. To me, these are the most revealing moments, because in reality transgender people are at times not able to pass into spaces they want to enter, or are trapped in spaces they want to escape, or are murdered because of their gender expression.

Another game designed by a transgender woman is Dys4ia by Anna Anthropy, and it similarly uses game mechanics to convey parts of its message. The game can be played online, like Lim, and is an autobiographical game about the six-month period in which Anthropy decided to start hormone therapy. Again, this transgender game maker has chosen to make the game mechanics a reflection of the difficulties of her experience as a transgender person. By doing so, she creates a medium-specific experience in which players can experience some small degree of the feelings involved, instead of merely seeing or reading a representation of them.

In Dys4ia, there are four levels: “Gender Bullshit,” “Medical Bullshit,” “Hormone Bullshit,” and “It Gets Better?” Each of these levels is composed of a number of mini games. Each mini game has its own rules, controls, and directionality. In effect, the game mirrors the experience of transgender people navigating the complex world of hormone therapy, because the player has to figure out an entirely new set of rules in each of these mini games. As players read the text associated with each mini game, such as “shaving is humiliating” or “now to find a good clinic” or “my breasts are too sensitive to touch,” they are introduced to a new game mechanic and have to figure out the new rules quickly. In my own experience as a transgender woman, this is very similar to the experience of hormone therapy, where each new obstacle (psychiatric therapy, doctor visits, personal relationship issues) must be deciphered and figured out, like a game, yet each has its own unique set of rules and mechanics.

Dys4ia provides a useful example for game studies. The simplicity of Dys4ia’s mini games mirrors the statement by Eskelinen that “the main thing is that any element can be turned into a game element, and a single element is enough to constitute a game if it allows manipulation, and this fact alone allows combinations not witnessed in narratives or drama.” Many of the mini games in Dys4ia are extremely simple, such as one in which a figure moves up and down to dodge the harsh words of anti-trans “feminists” that fly horizontally across the screen, a mechanic similar to classic games like Pong or Galaga, but in this case associated with the drama of the personal struggles of a transgender woman. The online indie game format allows Anna Anthropy to create a very simple aesthetic expression of her experience, similar to Galaga in its degree of complexity, yet differentiated from Galaga by her choice of colors and iconography.

The questions raised by games made by transgender game designers can be informed by game design theories written by the artist Eddo Stern. Stern describes a process he calls phenomenological game design, which takes into consideration the embodied experience of players. In a lecture given at the cinema school at USC, Stern asked how game design might be changed by a consideration of the player’s embodied experience: for example, how role playing games might be different if players actually had to be charismatic in order to have a high charisma score for their character, how a player’s sense of direction might be incorporated into the abilities of an in-game character, or how games might be designed for deaf or blind players. With these considerations in mind, he has designed a game called Dark Game, centered on the struggle of two players to either bring light to a world or bring a world into darkness. He describes it as a sensory deprivation game: one image of the game shows a player with a hood over their head obstructing their vision, able to feel the contours of the world only through the haptic feedback of the PS3 controller. Stern has worked with differently abled people, such as blind or deaf people, in playtesting the game, and its interface clearly reflects this; the character creation system, for example, contains a menu that is both presented visually and has every word spoken aloud. Dark Game, as an example of phenomenological game design, provides a point of comparison for Lim and Dys4ia by foregrounding the differences in game play between transgender players and cisgender players.

When I, as a transgender woman who has had many experiences similar to those described by Anthropy, played Dys4ia, I was literally brought to tears by the emotional connection I felt in the game. In contrast, when I watched a queer-identified cisgender friend of mine play the game, she was simultaneously interested in the content and confused by it, which added an additional layer on top of the challenge of learning the controls. In particular, the mini games that dealt with Anthropy’s personal relations with her girlfriend were the ones that were most emotionally compelling to me and most confusing to my friend. The analysis in this paper was in fact aided by my experience of observing this friend play the game and by our subsequent conversation, in which she was able to make useful observations about the game mechanics that had escaped me in my emotional response to the game. These differing experiences point to the importance of considering the complex gender and sexual identifications of players and designers of games, as there is much possibility for enriched experiences in games aimed at specific publics rather than targeted at an assumed mass public. This short example also lends support to the importance of considering other social characteristics of players and designers in game design, including race, ability, economic class, immigration status, body size, and more. Each of these characteristics offers a rich theoretical history from which game designers and game studies scholars can draw to add meaning and affective impact to their work.

While both Lim and Dys4ia appear to be incredibly simple games with low-resolution, two-dimensional graphics and simple game mechanics, their social context allows for a deep richness of play, design, and theorization. Games produced by transgender game designers about transgender experiences open up a space for a consideration of the intersections of game studies, gender studies, phenomenology, narratology, and disability studies. Further, these fields can be combined with writing from feminist cinema scholars who look at reception and subversive readings or critical race scholars who consider the intertwined construction of race and technology to give further support to these theories. The study I have undertaken here is only a sketch that gestures toward the possibilities of studying these games and their implications.

Micha Cárdenas is an artist/theorist who works in social practice, wearable electronics, and intersectional analysis. They are a PhD student in Media Arts and Practice (iMAP) at the University of Southern California and a member of Electronic Disturbance Theater 2.0. Micha’s project Local Autonomy Networks was selected for the 2012 ZERO1 Biennial in San Jose and was the subject of their keynote performance at the 2012 Allied Media Conference. Micha’s book The Transreal: Political Aesthetics of Crossing Realities, published by Atropos Press in 2012, discusses art that uses augmented, mixed and alternate reality, and the intersection of those strategies with the politics of gender, in a transnational context. Micha holds an MFA from the University of California, San Diego, an MA in Communication from the European Graduate School, and a BS in Computer Science from Florida International University. They blog at transreal.org and tweet at @michacardenas.

Transmedia Synergies: Remediating Films and Video Games

Earlier this fall, I received an email from UCLA Cinema and Media Studies graduate student Matthias Stork, sharing with me a video he had produced for Janet Bergstrom's "DVD Essay for Film Analysis" class and Stephen Mamber's video games class. Here's how Stork describes his essay:

I researched, planned, and produced it within a ten-week period. It has not been altered since I submitted it for class review. As a result, it is not an overly polished work. In the interest of time, I had to make several concessions in terms of style and argument (for instance, I would have preferred to use a video game font and elaborate upon the narrative dimensions of media convergence in the digital era). Nevertheless, the work, as it is, effectively reflects, I believe, an increasingly important topic in film and media studies. The goal of the video essay is to sketch out the culture of synergy situated at the intersection of cinema and video games, taking account of journalistic, industrial, and, predominantly, aesthetic correspondences between the two media. In my opinion, it represents a 'work-in-progress', designed to stimulate interest and future research.

The video essay was published in the Winter 2013 issue of Mediascape, available here.

I immediately knew I wanted to pass this video essay along to my blog readers -- for many reasons. First, because it represented an innovative form of scholarship. I am hoping we will see more examples of these kinds of analytic video essays in the future, and there are several others featured in this issue of Mediascape. Second, because the issues it discusses -- having to do with the interplay between video games and cinema, notions of remediation, and transmedia storytelling -- are ones which we regularly discuss through this blog and which I know many of my readers are finding ways to teach. Stork's video essay reviews a broad range of theorists and approaches we might take to think comparatively about old and new forms of entertainment and illustrates them with a compelling selection of clips from contemporary films and games. I know this is a video I will be using in my own teaching in the months ahead, and if you are teaching new media or transmedia, you may also find it a welcome resource.

Transmedia Synergies - Remediating Films and Video Games from Matthias Stork on Vimeo.

Matthias Stork is a Masters student in the Cinema and Media Studies program at the University of California – Los Angeles, USA. He researches the intersections of film and digital media, especially the synergies between films and video games, as well as the aesthetics of digital marketing, fandom, and the forms of digital film studies. His work has appeared in Frames, Cinema Journal, Press Play, and Film Studies for Free. He served as META section editor for the Winter 2013 issue of Mediascape and is currently the co-Editor-in-Chief of the upcoming issue. He is also the co-editor of Superhero Synergies: Genre in the Age of Digital Convergence (Scarecrow Press, 2013).

Spreadable Media Spreads New Joy For 2013

So, we are now roaring into 2013 with the next installment of essays associated with the launch of Spreadable Media: Creating Value and Meaning in a Networked Culture, which I co-authored with Sam Ford and Joshua Green. The book is due out from New York University Press later this month. Each week, we are releasing a series of commissioned essays associated with the book, written by various friends, colleagues, and former students, most of whom have at one time or another been affiliated with the Futures of Entertainment Consortium. The Consortium, among other things, runs two conferences per year -- one on the East Coast (Futures of Entertainment, hosted by MIT) and one on the West Coast (Transmedia Hollywood, which is jointly hosted by UCLA and USC). These essays are tightly integrated into the book's argument, but they are also intended to stand alone as spreadable content, and we hope that you will feel free to pass them along through your various social networks.

I have been writing about the core concept of Spreadable Media via this blog for several years now, and it has already inspired rich discussion. I thought I would share with you an outstanding video that uses Spreadable Media concepts to explain the Caine's Arcade phenomenon. If you do not know the original Caine's Arcade video, check it out below.

Now, here's the video explaining what happened, produced by Stephanie Linka, a student in a class taught last spring at George Washington University by USC Annenberg School for Communication and Journalism alum Nikki Usher.

How Caine Won the Internet from Stephanie Linka on Vimeo.

And now onto our regularly scheduled series of essays. Today's crop are focused around forms of participation within a networked culture.

The Moral Economy of Soap Opera Fandom C. Lee Harrington

Soaps accompanied my real life as a stay at home mother, chronicled my years as a working adult, kept me company when I was alone, gave me something to bond with my mother, sisters, daughters, and daughter-in-laws over.

—52-year-old soap opera viewer who has been watching General Hospital for 46 years, One Life to Live for 41 years, and All My Children for 39 years; quoted in Harrington and Bielby 2010

I have long been fascinated with daytime soap operas, both as a source of pleasure in my own life and as the central anchor of my research on media industries, texts, and audiences. Soaps are distinct from other media forms due to their longevity in the U.S. television landscape (the average age of soaps airing in 2011 was 40 years), the daily installments of “primary” text (260 new episodes per year, per soap), their celebration and magnification of emotional expression, and the possibility of lifelong relationships forming between loyal viewers, soap characters, and the communities in which those characters live and work (see the epigraph). No other form of media fiction offers comparable dailiness, intimacy, and familiarity over the long haul.

Soaps’ longevity poses challenges to researchers, who struggle with the sheer volume of textual material produced, as well as to the soap industry, which struggles with staying true to shows’ long narrative histories and developing characters in “real time” while aligning those narratives with contemporary tastes of both newbies and lifers. Balancing these potentially competing demands generates a particular moral economy within soap opera fandom. The research on soap fans that Denise Bielby and I conducted in the early 1990s (Harrington and Bielby 1995) captured the beginning of fandom’s migration to the Internet, with viewers experimenting with electronic bulletin board discussions as a supplement to their investment in other aspects of “public” fandom (attending industry-sponsored fan events, buying fan magazines, joining fan clubs, etc.). In our book, we made a distinction between legal ownership over soap narratives and what we called “moral” ownership over them—fans’ sense that soap opera communities and characters are “theirs,” rather than belonging to the writers, actors, directors, or producers.

This sense of ownership is rooted in at least three factors. First, “soaps’ very success at creating and sustaining a seamless fictional world [. . .] creates a space for viewers to assert their claims when they perceive continuity is broken” (Bielby, Harrington, and Bielby 1999, 36). Second, viewers regularly outlast soaps’ revolving writing and production teams. Many long-term fans have been invested in their show(s) longer than the people creating them (as, often, have several of the actors playing the characters, leading to interesting ownership struggles within the industry [Harrington and Brothers 2010]), and they often do know their show’s history better. (The same point can be made of long-term sports fans or movie-franchise fans, contexts in which transgenerational fandoms outlast coaches, players, actors, directors, etc.) Third, soap production schedules allow the industry to respond relatively quickly to fan complaints and concerns, giving fans a sense that their opinions can make a real difference. MORE

How Spreadability Changes How We Think about Advertising Ilya Vedrashko

You can’t spell “spreadability” without “ad.”

The vision of unpaid people cheerfully passing around ads they love has been a guiding light for marketers for more than a decade now. And what’s not to like? An ad that gets passed along receives extra attention. The Good Housekeeping stamp of consumers’ approval that such transmission suggests is assumed to add trustworthiness to the message. An ad that “goes viral” scores extra eyeballs.

But while the demand and the budgets for “viral” have been growing, it’s been surprisingly difficult to find a permanent box for spreadable media on the modern agency’s org chart. While many different disciplines—creative, media, public relations, social—are claiming ownership, a systemic problem has prevented spreadability from gaining a true acceptance.

Ad agencies, like factories of the industrial era, are a particular arrangement of the means of production, a highly specialized labor force, and scarce resources, optimized around the efficient mass manufacturing of a particular type of output. For agencies, this output consists of ad units placed in print, television, online, radio, outdoor, theaters, events, and so on. An average agency produces and places thousands of such units on behalf of its clients each year.

These ads—paid announcements that appear in media—come in a finite variety of formats and sizes, and their production is scalable to the point where much of it can be, and has been, automated and outsourced. Ads are designed to elicit responses along the vector “see, like, remember, buy.” The agencies are structured around maximizing the number of these responses. Media departments craft media plans that try to ensure the highest number of the right people see the ad at the lowest cost. Creative departments are judged by the number of people who like and remember the ad. Ultimately, the agency’s output is evaluated against the number of people who buy the advertised product. The more people see, like, remember, and buy, the more successful the agency is in the long run. MORE

Soulja Boy and Dance Crazes Kevin Driscoll

During the summer of 2007, U.S. pop media seemed saturated with talk show hosts and pro athletes dancing along to “Crank Dat (Soulja Boy).” By the time an official music video was shot in late July, the dance craze was already approaching an apex, with new videos appearing daily on MySpace and YouTube. Close inspection of the phenomenon reveals a diverse array of overlapping audiences exploiting “Crank Dat” as a producerly framework for the expression of personal, social, and political messages. Steeped in southern hip-hop’s independent tradition, teenage rapper Soulja Boy Tell ’Em championed the songs, dances, and videos produced by these audiences in pursuit of his own commercial success. “Crank Dat,” for all its confusion, contradiction, and welcoming incompleteness, is a valuable demonstration of spreadability in practice.

In the dominant narrative of the 1990s, hip-hop was driven to pop dominance by a rivalry between Los Angeles and New York City. Excluded from mainstream media channels, artists living in the southern U.S. were forced to develop an alternative hip-hop industry supported primarily by locally grown “indie” record labels with connections to regional radio personalities, nightclub DJs, and mom-and-pop record-shop owners (Grem 2006). This independence enabled the southern artists to develop innovative sounds and styles quite distinct from their coastal peers. In 2003, with CD sales flagging, major record labels turned to these indies in search of new talent to revitalize the industry. Among the many southern styles attracting attention, snap music deviated the most from the conventional hip-hop template. Snap’s minimal drum programming and repetitive lyrics destabilized unquestioned hip-hop norms such as the value of complex wordplay and the use of funk and soul samples. MORE

Television’s Invitation to Participate Sharon Marie Ross

In Beyond the Box: TV and the Internet (Ross 2008), I argued that television shows starting in the late 1990s increasingly seemed to be “inviting” television viewers to become actively engaged with the TV text, often through the Internet. I saw three forms of invitation emerging: overt invitations, where a TV show obviously invites a viewer to become involved (e.g., American Idol’s calls to phone in a vote); organic invitations, where a TV show assumes that viewers are already actively engaged and incorporates evidence of this within the narrative of the show—or, in some cases, television network (e.g., Degrassi: The Next Generation’s attention to the role of new communications media in teens’ lives, and The N network’s use during Degrassi episodes of interstitials that feature teen viewers texting and IM chatting via The N’s website); and obscured invitations, where a TV show’s narrative complexity demands viewer unraveling that drives fans to online applications (e.g., Lost’s dense referencing of philosophers and artists as clues to the “hidden” meaning of the island and its inhabitants).

In discussions with Henry Jenkins since, I have suggested that organic invitations are likely to become the dominant form of TV invitations to participation. Today’s texting, IMing, web-surfing teens will become tomorrow’s multimedia-tasking adults, who will likely only be followed by a new wave of teen TV watchers who will be engaging in yet-to-be-imagined forms of new media communication.

Such developments are reverberating throughout all of media, from increasing demands on print journalism to be more present online to the use of branding in the spread of media franchises across TV, film, and music in such a way that demands more widespread knowledge of marketing from all media professionals. And such changes tend to spread throughout the TV landscape—even CSI has popular online applications, after all. MORE

What Old Media Can Teach New Media Amanda D. Lotz

While it may be the case that you can’t teach an old dog new tricks, the question remains whether that old dog can teach a new dog anything useful from its existing repertoire. Or, in terms of spreadable media, can the “old”—or, as I prefer, “established”—processes of media industries for creating entertainment content teach those who are endeavoring to create spreadable media anything of value? In the overinflated rhetoric of new media, media revolutions, and change, too often we lose track of basics and fail to consider that most of what seems new and different isn’t really, either. In this essay, I identify some of the established characteristics of entertainment-based media industries that remain relevant in an era of spreadable media and explore how some of the strategies these industries have developed to deal with their particularities do or do not apply to the spreadable media context.

A key starting point for understanding entertainment-based media industries is acknowledging that they are different from most other business sectors—often in particularly frustrating ways for their practitioners. This “difference” of media industries means that the rules and practices that hold for and prove productive to commercialization practices elsewhere simply don’t work, or at least don’t work as effectively, for these media companies. One of these key differences is captured in the maxim “nobody knows,” also expressed sometimes as the acknowledgment that such media industries are “risky businesses.” This sense that nobody knows results from the fickleness of audiences when it comes to creative and entertainment goods. Conventional focus-group testing or the combination of known “successful” features tend not to be particularly predictive of success in the design of a new media good. In other words, you can’t test or engineer your way to a hit with any certainty.

Considering the spreadable media successes of the past few years, I suspect the “nobody knows” maxim is likely to be true of the circulation of spreadable media to the same degree it is for the distribution of established entertainment media. Try as we might to identify common features or characteristics, we fool ourselves if we think we can anticipate a formula for producing creative content likely to catch the cultural fancy of any particular audience at any given moment. But all is not lost; these media companies have developed a number of strategies designed to counter some of the uncertainty of their established platforms, and some of these strategies might prove productive for making spreadable media as well. MORE

For those of you who were at the Modern Language Association conference this past weekend, you might have had a chance to buy an advance copy of the book. If you did, we'd love to hear what you think, so feel free to drop a note here or, even better, on the Spreadable Media website.

The More We Know: Academic Games Research and Industry Collaboration (Part Three)

In many ways, iCue was also designed to respond to some of the challenges confronting contemporary journalism. What insights did you take from this project about the difficulties of engaging young news consumers and the challenges of reforming current journalism practices?

This challenge was part of the original vision, but NBC was quite wary of what students might do with their media if left to their own devices, or what they might report on if they were the ones doing the reporting. The remix ideas were quite limited, surfacing only through the games. And the participatory journalism component was a successful small-scale experiment that was cut from the larger rollout.

You frame this book as an account of a "failure," yet you end with some hope that the lessons learned through iCue have informed subsequent initiatives by NBC News. In what ways?

NBC has learned a lot about what it takes to make something for the education market in terms of design, marketing, and messaging. Many of the same staffers remain in the NBC Learn department, and they can now use that knowledge to do some interesting things. They are certainly taking an incremental approach to making such change, though, starting from the place they know teachers are interested in and then slowly pushing those boundaries. They have told us they want to bring back games and social media in their project. The market is certainly more ready now than it was six years ago; we hope that they take that risk.

To its credit, NBC has also elevated the public conversation around education through the annual Education Nation summit and its associated workshops and presentations around the country. To see a major network devote its “A Team” and multiple channels to shining a spotlight on important issues is perhaps one of the greatest outcomes of the “failures” that their project team encountered early on. As we said, many of the core team, including the senior producers who believed in the initial project enough to leave the safety of their traditional roles, are still fully engaged in NBC Learn. Their commitment to improving education is laudable and should be recognized. They are warriors for the cause.

Many academic projects proceed with the assumption that "if we build it, they will come." What might be a better approach for academic researchers wanting to establish a community around their educational interventions?

Marketing. Academic projects don’t think enough about this, and often funders don’t provide for this portion of the project. But academic projects need marketing, too, in order to get out there. Yes, there are viral successes that have forgone this step, but those are few and far between. We have seen marketing work in our project Vanished, which got thousands of kids playing an alternate reality game about science over the course of six weeks, and we have also seen it with our recent Lure of the Labyrinth challenge, which attracted tens of thousands of players.

How did the iCue project contribute to the development of the Learning Games Network? What new model have you adopted for promoting innovation in education around games-based learning?

The challenges we confronted in getting the NBC team to understand the research and then apply it in design inspired us to start a non-profit that would help bridge the gap between research and practice. We realized we could be better advocates for change as partners with a wide variety of stakeholders, supporting their efforts through the entire game-based learning pipeline, from design and production to implementation and student assessment. Coming to understand the myriad challenges, both shared and unique, facing textbook publishers, national broadcasters, and international technology companies as they strive to innovate in the education market has helped us explore, we think, better strategies to support their business goals. We want to enable market leaders to succeed because those victories, small and large, ultimately raise awareness of the power and potential of game-based learning products and services. In turn, this enables our colleagues in academia to raise the level of scholarship they pursue.

What do you see as the biggest successes so far to come out of the work of the Learning Games Network team? How do you define success in this space? What factors do you feel contributed to these successes?

Our biggest success is a somewhat personal one. Having worked together for the better part of 12 years, first as colleagues at MIT and now as a group with our hands (and feet) in different organizations, our core team is still intact. The fact that the four founders of Learning Games Network bring such different perspectives in scholarship, creative design, and business makes us uniquely strong and effective. We each trust what the others bring to the table in solving challenges, which is really unique and especially necessary since game-based learning is such an interdisciplinary enterprise.

That trust manifests in the culture that’s emerged in our Cambridge and Madison studios.  We are developing professionals who are strengthening skills that are a hybrid of academic, technical, and commercial backgrounds, as well as encouraging that kind of cultivation with our partners.  Over the past few years, our efforts have been rewarded by grants from major foundations and contracts with market leaders.  Our most recent success came at this year’s Meaningful Play conference, where Quandary, a game we produced in our Cambridge studio to support ethical thinking among young people, and Fair Play, a game produced in our Madison studio that sensitizes players to the challenges of race and equity in science, both won awards among a very competitive field of submissions.

Eric Klopfer is Professor and Director of the Scheller Teacher Education Program and The Education Arcade at MIT.  Klopfer's research focuses on the development and use of computer games and simulations for building understanding of science and complex systems. He is the co-author of the book, Adventures in Modeling: Exploring Complex, Dynamic Systems with StarLogo, and author of Augmented Learning: Research and Design of Mobile Educational Games from MIT Press.  Klopfer is also the co-founder and President of the non-profit Learning Games Network.

Jason Haas is Graduate Research Assistant in the Media Lab and in The Education Arcade at MIT. His research focuses on the design and efficacy of learning games. Recent research and design has been for The Radix Endeavor, a Gates Foundation-funded MMORPG for science and math learning. Previous research has involved the role of narrative in learning in the casual physics games Woosh, Waker, and Poikilia and in large-scale collective intelligence gaming  in Vanished.

Alex Chisholm is Co-Founder and Executive Director of Learning Games Network, a non-profit organization bridging the gap between research and practice in game-based learning.  He has collaborated on product and program development with Microsoft, LeapFrog, NBC Universal, BrainPOP, Federal Reserve Bank-New York, and the Hewlett and Gates Foundations, among others.

The More We Know: Academic Games Research and Industry Collaboration (Part Two)

The last time I reported about iCue on this blog, it was part of an overview of the work of The Education Arcade. In what ways were the choices made on iCue informed by the Education Arcade's previous experiences developing prototypes for "serious games"? What are some of the factors which have made it hard to get university-based games research beyond the prototype stage and into the world where it might have greater impact?

There is a lot of pushback in the system wherever change is required. If a change is required in the way teaching and learning are perceived, then it is much harder to get adoption. As such, the teachers never really came for the games, but rather for the other parts that they could adopt or adapt and plug into existing structures. In turn, NBC didn’t take the games as seriously; they didn’t grow the more innovative or risky ideas, and, due to the financial crisis in 2008, they couldn’t really even update them.

Thinking about how we moved from previous work into this project, we were working in a much more constrained space than we were used to. Rather than having the flexibility to build something rich and multi-faceted, as we had with Revolution, we were working in the narrower starting space of media archives and integration with the AP curricula. That restricted the game space, but it provided perhaps more realistic constraints than we were used to working within.

What do you see as some of the major hurdles which academic researchers face in terms of working with industry partners?

There are certainly competing interests. In academia, we can take a longer view, learning and refining over time. These learnings are valued in and of themselves. Of course, we also need a successful product, but we can take the time to get there carefully and be thorough. We can be risk-prone in the short term. In industry, pressure to return revenue quickly creates risk aversion. Even though NBC News’ then-CFO, Adam Jones, protected iCue against those pressures more than the average project, it still had to make compromises that we had to stomach. For instance, there was early hope that the site would feature remix tools for young people to author their own content, but NBC’s Standards and Practices department shut down that talk almost immediately.

What factors make the education marketplace a particularly challenging one to navigate?

There are big issues around who pays for products, and who makes the decision to buy.  Are schools paying? Can a teacher make the decision, or must they appeal up the food chain to their principal or district? Are parents going to pay? Would any of these stakeholders accept a free-to-play model with sponsored advertising?  Then, depending on these factors, how do you design and market the product? There are also issues of metrics and measurement—how do you show that your product is working?  Does it leverage existing metrics (which may be poor), or new metrics (which aren’t yet implemented or validated)?

Further, are the schools and teachers even ready for the product, both pedagogically and technologically?  Do they have the preparation they need to use the tool effectively?

Finally, if you can settle all of those questions but have a new product approaching learning in a new way, how do you communicate that to your audience?  It can be difficult to transmit that kind of messaging through the standard, narrow channels to schools and teachers.

If you could go back in time and leave a message for yourself at the beginning of the process, describing what you now know, what would it say?

Instead of moving our research team to an evaluation position on the project, stay on the design side. Convince NBC News that the need to sell something quickly shouldn’t obscure the original vision of what this product might do in the hands of students (which it never really reached).

We would also push back on timelines and growth models. We might have seen more success if we had started in a more targeted area and grown from there. That would almost certainly have been a more effective model than jumping all in right away, which diluted much of the opportunity for participatory learning and deeper learning experiences.

What challenges did you face working with the educational establishment? Were teachers ready for what iCue sought to do? Were students?

Teachers might have been ready, but ultimately the site lacked the depth and frequency of updates it needed to really achieve its goals.

Students might also have been ready, but iCue was a space populated with teachers when they arrived, perhaps sending the signal that it wasn’t a space for them.

The jury is still out on whether students can and will come to an academic social space like the one iCue was envisioned to be. That is an interesting question that we continue to explore in our work.

Eric Klopfer is Professor and Director of the Scheller Teacher Education Program and The Education Arcade at MIT.  Klopfer's research focuses on the development and use of computer games and simulations for building understanding of science and complex systems. He is the co-author of the book, Adventures in Modeling: Exploring Complex, Dynamic Systems with StarLogo, and author of Augmented Learning: Research and Design of Mobile Educational Games from MIT Press.  Klopfer is also the co-founder and President of the non-profit Learning Games Network.

Jason Haas is Graduate Research Assistant in the Media Lab and in The Education Arcade at MIT. His research focuses on the design and efficacy of learning games. Recent research and design has been for The Radix Endeavor, a Gates Foundation-funded MMORPG for science and math learning. Previous research has involved the role of narrative in learning in the casual physics games Woosh, Waker, and Poikilia and in large-scale collective intelligence gaming  in Vanished.

Alex Chisholm is Co-Founder and Executive Director of Learning Games Network, a non-profit organization bridging the gap between research and practice in game-based learning.  He has collaborated on product and program development with Microsoft, LeapFrog, NBC Universal, BrainPOP, Federal Reserve Bank-New York, and the Hewlett and Gates Foundations, among others.

The More We Know: Academic Games Research and Industry Collaboration (Part One)

The following is an excerpt from the foreword I wrote for a new MIT Press book, The More We Know: NBC News, Educational Innovation, and Learning from Failure, which was authored by two of my former MIT colleagues, Eric Klopfer and Jason Haas. Klopfer and Haas are part of the Learning Games Network, a joint initiative between games-based learning researchers at MIT and the University of Wisconsin-Madison. The book describes the iCue project, which began while I was still back in Cambridge. First, a bit from my foreword, and then, over the next few installments, an interview with Klopfer, Haas, and Alex Chisholm about the book, which recounts some of the potentials and pitfalls in collaborations between industry and academia:

Three immovable objects walked into a bar. The first was the current world of corporate media (and especially what remains of traditional network news), the second was the current world of higher education (as it lurches towards new funding models and institutional practices), and the third, perhaps the most immovable and intractable of them all, was the current policy and institutional mess we call public education (which is shaped by a profound mismatch between what we know of how students learn and policies setting standards that in no way reflect those insights). Each wanted to buy the others a drink, give them something that might ease their stress, soothe their tempers, or at least let them forget their problems. But they couldn’t agree on what the ingredients of this beverage should be, or how it should be paid for, or how they should decide what it should contain, or what kind of relationship would be implied by the buying and selling of drinks, or in what order they should be drinking or....

[Imagine there’s a punchline somewhere around here.]

This is the story of the book you hold in your hand reduced to the level of a farce, as in you’d best keep laughing in order to keep from crying. But of course, the iCue saga is more than a farce. It might also be called a tragedy, in which the best of intentions are waylaid, malformed, and brought low through a series of fatal flaws which prevent each of these institutions from fully embracing change, which block them from seeing the future that the others see so clearly, or which require them to sell out what they value the most if they are going to make any progress forward.  Yet, calling the story you are about to read a tragedy is to imply that it was a perfect failure from start to finish.

And we all know nothing’s perfect.

In fact, as The More We Know makes clear, there were many localized successes along the way and as a consequence of the efforts described herein, other good things have happened. It is rather a story about imperfect failures and imperfect successes, about unintended consequences, unreached goals, and unanticipated results.

It is also an epic, involving a constantly changing cast of characters, each embodying, as any good epic does, the contradictions of their times, and featuring multiple heroes who push great boulders up to the tops of high hills, only to watch them roll back down again.

The More We Know is also an adventure story set on the bleeding edge of innovation and reform, one which will offer some guideposts for those of you who would follow in the protagonists’ footsteps. There are relatively few postmortems on how great ideas and good intentions do not always turn out the way we expect. I would probably put this on my bookshelf next to Brenda Laurel’s Utopian Entrepreneur, which describes the rise and fall of Purple Moon and the girls’ games movement, or perhaps Sandy Stone’s account of working at Atari in its early days. It certainly, as the authors suggest, provides a personal and extended example to illustrate some of what Mimi Ito has told us about the creation of educational software or what Collins and Halverson have suggested about the resistance of educational institutions to new technologies and practices.

Whatever its genre, The More We Know is the story of the people in the trenches on the front lines of media change, and the authors, themselves key participants, tell it very well here....

In our classrooms, we were teaching our students that media change takes place through evolution rather than revolution, but in our labs, we still wanted to change the world, we wanted to blow down the walls and reshape core institutions, and we were painfully, awkwardly, sweetly naive. The path forward turned out to be harder than idealists predicted but not nearly as impossible as skeptics and cynics might insist.

The book you hold in your hands describes some of the walls we hit and the ways our faculty, research staff, and students worked around and through them. My hope is that readers will take from this the right set of lessons. We succeeded sometimes, failed sometimes, and always learned a great deal about what it takes to make change in the imperfect world around us. The More We Know is not a warning to “avoid this path - there be monsters here”; it is a challenge to “follow us if you dare.”

The More We Know is, in some senses, what game designers would call a "postmortem." What do you see as the value of this genre of writing and why do you think we see so few postmortems coming out of academic research projects compared to their prominence within the games industry?

Much of this boils down to how differently industry and academia perceive “failure.”  There is a perception within academia that funding follows success, and that small, successful projects attract bigger funding.  In industry, there is (at least sometimes) a feeling that failure can lead to learning for teams, which, in turn, become more fundable based on that learning.  This means that in academia we only want to talk about successes.

There is another issue, though. In academia, we perceive failure to be a failure of our product—the thing we made. But in industry one can perceive failure any place in a system: failure of marketing, timing, audience, etc. Industry teams can think about the whole ecology surrounding the product. Academics aren’t as prone to thinking about these things. As such, we feel the failure to be much more personal, even as the failure of academic products can be attributed to many parts of that ecosystem as well.

Describe to us the iCue project. What were its initial goals? What problems was it intended to address? What partners did it try to bring together?

Stated simply, the iCue project was originally conceived to bring younger people to the NBC News brand while supporting important learning goals through the repurposing of old media assets and the creation of a new digital experience.  More pragmatically, NBC News needed a cost-effective strategy to digitize its vast archives without breaking the bank.  Education and the perceived abundance of technology funding in schools provided the roadmap for what this project could possibly be.

The original pitch for iCue was that it was one part media archive, one part social learning network, and one part learning games and activities. iCue was imagined to provide young people with media and tools for learning in a more engaging way, creating a bridge between the curricula and traditional media their teachers were comfortable with on the one hand and the interactive world in which they’ve grown up on the other. It was intended to be supplemental, enabling teachers and students to engage with it in support of Advanced Placement curricula in English Composition, U.S. History, and U.S. Government. Since NBC News is a broadcast company with radio and television assets extending back to the very earliest days of broadcasting, project leaders sought to bring together a diverse set of education, archive, and print partners, including the College Board, the Washington Post, and the New York Times.

Eric Klopfer is Professor and Director of the Scheller Teacher Education Program and The Education Arcade at MIT.  Klopfer's research focuses on the development and use of computer games and simulations for building understanding of science and complex systems. He is the co-author of the book, Adventures in Modeling: Exploring Complex, Dynamic Systems with StarLogo, and author of Augmented Learning: Research and Design of Mobile Educational Games from MIT Press.  Klopfer is also the co-founder and President of the non-profit Learning Games Network.

Jason Haas is Graduate Research Assistant in the Media Lab and in The Education Arcade at MIT. His research focuses on the design and efficacy of learning games. Recent research and design has been for The Radix Endeavor, a Gates Foundation-funded MMORPG for science and math learning. Previous research has involved the role of narrative in learning in the casual physics games Woosh, Waker, and Poikilia and in large-scale collective intelligence gaming  in Vanished.

Alex Chisholm is Co-Founder and Executive Director of Learning Games Network, a non-profit organization bridging the gap between research and practice in game-based learning.  He has collaborated on product and program development with Microsoft, LeapFrog, NBC Universal, BrainPOP, Federal Reserve Bank-New York, and the Hewlett and Gates Foundations, among others.

More Spreadable Media: Rethinking Transmedia Engagement

Let it spread, let it spread, let it spread. By now, you know: Spreadable Media: Creating Value and Meaning in a Networked Culture is a new book, being released by New York University Press at the end of January 2013, written by myself, Sam Ford, and Joshua Green. Around the book will live thirty or so online essays written by colleagues, former students, and others who have been associated with the Futures of Entertainment Consortium through the years, which both engage with the content of the book, and are, in turn, taken up as part of the book's core argument.  We are hoping you will do your part to help spread these essays throughout your own social networks, and let the conversation start before the book even gets released to the world.

Today's crop, the last before the new year, offers new perspectives on transmedia entertainment and, more generally, on the issue of audience engagement, both central themes in the book, as those of you who regularly read this blog might imagine. For more information, check out the book's home page.

Forensic Fandom and the Drillable Text

 

While the rise of spreadable media is a major trend of the contemporary era, another development within media seems to pull in an opposite direction: narrative complexity of media storytelling, especially on television. Since the late 1990s, dozens of television series have broadened the possibilities available to small-screen storytellers to embrace increased seriality, hyperconscious narrative techniques such as voice-over narration and playful chronology, and deliberate ambiguity and confusion. These trends, which I’ve explored at length elsewhere (Mittell 2006), are tied into transformations within the television industry and technologies of distribution that have enabled programs to be viewed more consistently by smaller audiences and to still be considered successful.

Such long-form complex narratives as Lost, The Wire, 24, and The Sopranos seem to run counter to many of the practices and examples of spreadable media found elsewhere in this book. These shows are not the ephemeral “video of attractions” common to YouTube that are shared and commented on during downtime at work. They are the DVD box sets to be shelved next to literary and cinematic collections, long-term commitments to be savored and dissected in both online and offline fora. They spread less through exponential linking and emailing for quick hits than via proselytizing by die-hard fans eager to hook friends into their shared narrative obsessions. Even when they are enabled by the spreadable technologies of online distribution, both licit and illicit, the consumption patterns of complex serials are typically more focused on engaging with the core narrative text than the proliferating paratexts and fan creativity that typify spreadable media.

Perhaps we need a different metaphor to describe viewer engagement with narrative complexity. We might think of such programs as drillable rather than spreadable. They encourage a mode of forensic fandom that invites viewers to dig deeper, probing beneath the surface to understand the complexity of a story and its telling (Mittell 2009a). Such programs create magnets for engagement, drawing viewers into story worlds and urging them to drill down to discover more. READ MORE

 

A History of Transmedia Entertainment

As embraced by industry professionals and media consumers alike, transmedia storytelling promises to bring greater institutional coordination, added narrative integrality, and deeper engagement to the various pieces of contemporary media franchises. Comic books, video games, and other markets once considered ancillary now play increasingly significant and recentered roles in the production and consumption of everyday film and television properties such as Heroes, Transformers, and the reenvisioned Star Trek in ways that only very few innovators (such as George Lucas and his carefully elaborated and expanded Star Wars empire) had previously conceived in the twentieth century. Yet, while contemporary convergence culture has set the stage for a greater embrace of transmedia entertainment, the processes by which stories have been spread across institutions, production cultures, and audiences from different media have a much longer history. Although we might recognize transmedia storytelling as something newly emergent, we also cannot deny its relationship to long-established models of media franchising whereby the creative and economic resources owned by monolithic corporate entities were nevertheless widely used and shared across production communities and industry sectors. The franchise models that multiplied one Law & Order into several sister series and turned X-Men comic books into action figures worked by spreading resources among a network of stakeholders brought into social relations by virtue of their parallel (though often imperfectly aligned) interests. Thus, neither transmedia entertainment nor convergence points to the end of industrial models of cultural production in favor of some new social media; instead, the transmedia storytelling of convergence offers an opportunity to see how spreadable media extend, reorient, and reimagine existing historical trajectories in the industrial production and consumption of culture.

Understanding transmedia in terms of cultural exchange across and transformation through different media experiences means recognizing traditional processes of adaptation and translation of content as a foundation for the social exchange of spreadable media today. READ MORE.

 

 

Performing with Glee

Some producers developing cross-platform media franchises are experimenting with distribution models that engage consumers on a quotidian level, capitalizing on personal audience networks and not-quite-official distribution routes to help content spread. For FOX’s television franchise Glee, the network integrates traditional, legal distribution practices with experimental tactics that engage loyal fans, in addition to harnessing unofficial distribution channels that fall into legal gray areas.

The production team has embraced the show’s fans—known as gleeks, a fusion of “Glee” and “geek”—fashioning a popular (brand) identity and catering specifically to them. In addition to conventional broadcast, Hulu and FOX.com allow viewers to catch previous episodes, and FOX offers additional content such as cast interviews and behind-the-scenes clips. Glee’s thematic fusion of high school comedy and Broadway musical provides opportunities for musical guests from both Broadway (such as Kristin Chenoweth) and the popular music circuit (such as Britney Spears and Josh Groban), bringing new viewers into the Glee fan club while keeping current fans engaged.

To retain fan interest after season one ended, FOX partnered with CoincidentTV to create the “Glee Superfan Player.” The online platform integrates social network sites such as Facebook and Twitter with other fan-enticing elements—such as links to buy music on iTunes and to create “photobooth” pictures with the cast—in a unified space that plays episodes while viewers multitask. While the player only provides access to material on Hulu and FOX.com, rendering the experimental platform useless once episodes eventually expire, it at least represents an attempt to create a consolidated cross-platform fan experience. Other recent experiments include a MySpace karaoke contest, in which fans record themselves singing hits from Glee, and live concert tours that sold out in four American cities—so successful that the cast plans to tour the UK in mid-2011. READ MORE

Valuing Fans

Why work toward a model for valuing fans?

The U.S. media industry has run into some significant economic problems in recent years. Study after study suggests that Americans are watching more television and consuming more movies, music, and information than ever before, but, at the same time, the audience is neither as captive nor as concentrated as before. New ways to discover emerging artists and projects, as well as increasing choice in media platforms and content, are challenging how ad-supported media is bought and sold and rendering direct funding for some media content much harder to come by.

It was this situation that gave rise to the popularity of “engagement” a few years ago, a tactic to sell advertisers audiences whose enthusiasm is believed to translate to more awareness of and receptivity to product placement and commercials. How much more “engaged” and receptive this new audience is than the older, bigger one was considered crucial in setting a price for the advertising that supports media production. Conspicuously absent from these discussions was the role that fan communities (groups whose various interests in a media property may range widely) play in contributing economic value beyond paying attention to commercials. READ MORE

 

The Online Prime Time of Workspace Media

Ask a producer of digital content about website usage patterns, as I have, and they will tell you how important the audience accessing their content from work is to daily website traffic. According to NBC’s vice president of digital content and development, Carole Angelo, NBC.com designs its daily production schedule to service its workweek “lunch hour” audience. Fox Sports Digital (2009) also adopts this production strategy, as summed up in its 2009 slogan “lunchtime is the new prime time.” Reporting on this trend, the New York Times observed that American cubicle dwellers were increasingly choosing to spend their break time watching online videos, playing Flash games, and engaging in social network sites instead of heading to the water cooler (Stelter 2008). The entertainment industries are creating digital content for the work space because they see this audience as a dependable online consumer demographic.

Programming for the workspace media audience is crucial to entertainment industry efforts in the online space. It allows producers to adapt familiar television programming strategies for the Internet. In television, producers have long programmed according to “day parts”—segments of the broadcast day designed for particular audiences and viewing contexts. Nick Browne has argued that the scheduling of day parts enabled television companies to reflect and reinforce a “socially mediated order of the workday and workweek” to “mediate between the worlds of work and entertainment” (1994, 71). Each day part carries with it certain assumptions about the needs and desires of audience segments, as well as expectations of modern labor. The scheduling of a workday day part demonstrates the influence that technology has had on the blending of work and entertainment. READ MORE

The Cost of Engagement: Politics and Participatory Practices in the U.S. Liberty Movement

From time to time, I am sharing through this blog some of the research being generated by the MacArthur Foundation-supported Youth and Participatory Politics Research Network. This team, headed by Joseph Kahne from Mills College, is seeking to map the ways that the practices associated with participatory culture and the technologies of networked computing are impacting the political lives of youth, primarily in the United States but also in other parts of the world. See, for example, earlier posts about the YPP survey and about our case study of DREAM activists. Today, I am proud to share a new report, a case study of the political and cultural experiences of young Libertarians as they seek to find their own voice and forge their own community in a space defined both by the participatory dimensions of their own informal networks and by the influence of powerful conservative think tanks and funding organizations. This report was prepared by Liana Gamber Thompson, a Post-Doc who has been working as part of my USC-based research team, Media Activism and Participatory Politics (MAPP), as we develop ethnographic case studies of innovative organizations and networks that have been successful at increasing civic engagement and political participation amongst youth.

 

 

 

PLAY (Participatory Learning and YOU!)

Last time, I shared Shall We Play?, a report funded by the Gates Foundation and distributed by the Annenberg Innovation Lab. Today, we are releasing its companion report, PLAY (Participatory Learning and YOU!), which is authored by Erin Reilly, Vanessa Vartabedian, Laurel Felt, and Henry Jenkins. It continues our exploration of insights gained from our year-long work with elementary and secondary teachers from the Los Angeles Unified School District as they sought to develop a more participatory environment in their classrooms. Through this research, our team has identified five core principles for participatory learning:

1.     Participants have many chances to exercise creativity through diverse media, tools, and practices;

2.     Participants adopt an ethos of co-learning, respecting each person’s skills and knowledge;

3.     Participants experience heightened motivation and engagement through meaningful play;

4.     Activities feel relevant to the learners’ identities and interests;

5.     An integrated learning system - or learning ecosystem - honors rich connections between home, school, community and world.

In this report, we will discuss each of these principles, describing specific examples of how they were applied through the workshop process, what impact they had on the teachers and students involved, and some of the challenges we face in bringing about this kind of change within the current public school system.

 

 

 

Shall We Play?

Earlier this term, I shared through this blog Designing with Teachers: Participatory Approaches to Professional Development in Education, a white paper funded as part of a grant from the MacArthur Foundation and released by the Annenberg Innovation Lab. The report, edited by Erin Reilly and Ioana Literat, featured case studies of innovative professional development initiatives (Vital Signs, PLAY, Scratch, Ask Ansai, the Participatory Assessment Project) alongside a larger exploration of what it might mean to adopt a more participatory model for working with teachers. Today, we want to expand upon that report with the first of two reports that emerged from our own PLAY (Participatory Learning and YOU!) project, discussing core insights we derived from a year-long program working with teachers in the Los Angeles Unified School District to develop more participatory approaches in their classrooms. The teachers spanned both grade levels and curricular categories, allowing us to develop new approaches together that work in a variety of contexts.

The first of these reports, Shall We Play?, was written by Erin Reilly, Henry Jenkins, Laurel Felt and Vanessa Vartabedian. It represents a revisiting of my original MacArthur white paper, Confronting the Challenges of a Participatory Culture, and lays out what we see as core principles for participatory learning.  It includes some core reflections on what has happened in the Digital Media and Learning movement over the past six years as we have sought to bring a more participatory spirit to those institutions and practices that most directly touch young people's lives.

Spreadable Media Goes Retro: Pass It Along!

We continue this week with the process of rolling out the essays commissioned to accompany Spreadable Media: Creating Value and Meaning in a Networked Culture, the book I wrote with Sam Ford and Joshua Green, which is being released to the world at the end of January, 2013. You can start to get a sense of the shape of the book's argument by reading these essays, week by week, as they get unleashed upon the world. This week, for example, we are sharing essays which are designed to accompany the book's second chapter -- Reappraising the Residual -- which explores competing regimes of value, competing processes of appraisal, and especially the ways that old media content might regain value from the ways it moves within and across social networks online.

For those who would like a bit more of a road map of Spreadable Media, below is the breakdown of the chapters:

Introduction: Why Media Spreads                                                                                                               

Chapter One: Where Web 2.0 Went Wrong

Chapter Two: Reappraising the Residual

Chapter Three: The Value of Media Engagement

Chapter Four: What Constitutes Meaningful Participation?

Chapter Five: Designing for Spreadability

Chapter Six: Courting Supporters for Independent Media

Chapter Seven: Thinking Transnationally

Conclusion

 

To learn more about the book, check out our main website. You can go there to read the essays in full (or follow the links below).

We strongly encourage you to spread these essays through your own social networks and repost them on your blogs -- all we ask is that you acknowledge the authors and the fact that they are associated with our book. Thanks to all of you who have recirculated previous essays we've released.

RETROBRANDS AND RETROMARKETING

Today’s big brands are all rooted in the past. Tide, Coca-Cola, BMW, and even Apple are all connected to bygone decades. When these brands extend and use their existing brand name to introduce a new product or service, the past meanings and images that the name invokes become an important element to be managed, understood, wielded, and shaped by managers. This short essay discusses and analyzes a form of brand extension strategy that has gained prominence, in which tired or even abandoned brands have been reanimated and successfully relaunched. Management will deliberately reach into the past and consciously seek to gain new value from old brands and the meaningful relationships they convey. Stephen Brown (2001) terms this a “retro revolution” in which the revival of old brands and their images has become an increasingly attractive option for marketing managers. Over the past decade, I have been involved either independently or with coauthors in a growing body of research that looks at how the past is consumed, valued, revalued, and managed, beginning with a study of the values and images of the Wal-Mart retail chain (Arnold, Kozinets, and Handelman 2001). Stephen Brown, John Sherry, and I define retrobranding as “the revival or relaunch of a product or service brand from a prior historical period, which is usually but not always updated to contemporary standards of performance, functioning, or taste,” seeing retro goods as “brand-new, old-fashioned offerings” (2003b, 20). Old brands retain value simply by being old: the value of nostalgia, the so-called retro appeal. There is also value in the communal or cultural relationships that the brand has built over its lifetime. Finally, there are values on an individual level that relate to these two former values.

In a set of studies cutting across three different retro, “cult brand” products—the Volkswagen Beetle, Star Wars, and Quisp breakfast cereal—Brown, Sherry, and I have sought to explain the underlying principles of retrobranding and the way consumers responded to it (2003a, 2003b). The VW Beetle was a popular car associated with the 1960s era and hippies and also immortalized in Disney’s Herbie films, a series of four films originating with 1968’s hit The Love Bug (the series itself later updated and retrobranded into Herbie: Fully Loaded, a 2005 motion picture starring Lindsay Lohan). Star Wars is one of the most successful media franchises of all time. And Quisp cereal is an American breakfast cereal released in the 1960s using cartoon advertising created by Jay Ward, the creator of cult animation hit Rocky and Bullwinkle, and employing some of the same voice talents.

In each case, the entertainment connections of the brand have helped spur a type of residual and actual “brand fandom” that led to the possibility of a revival. In the case of the VW Beetle, this was the 1998 launch of the VW New Beetle. For Star Wars, it was the much-maligned 1999 prequel The Phantom Menace. For Quisp cereal, it was the quiet and limited redistribution of the cereal into select markets in the 1980s, after it had languished without support since the late 1970s. As well, Quisp’s fan-spurred and eBay-supported emergence in the mid-1990s marked it as the first so-called Internet cereal.

READ MORE

THE VALUE OF RETROGAMES

Existing in dialectical tension with contemporary games which trumpet their photorealistic graphics, sprawling storyworlds, and intricate, extended, networked play, retrogames preserve and celebrate a prior era of gaming often referred to as a “golden age” of arcade standards (such as Asteroids, Tempest, and Donkey Kong) from the late 1970s and early 1980s. Increasingly, the category also covers the decade that followed the industry crash of 1983, when the locus of gaming shifted to home consoles such as the Nintendo and Super Nintendo Entertainment Systems (NES and SNES), the Sega Genesis and Dreamcast, and home microcomputers such as the Commodore 64 and Amiga, as well as the first generation of PCs and Macintoshes. Compared with games for contemporary consoles such as the Xbox 360 and PlayStation 3 that occupy gigabytes of memory, resurrections of 8-bit, 16-bit, and 32-bit video and computer games look like the mathematically downscaled primitives they are: their blocky resolutions, limited color palettes, and blip-bleep-bloop sound reproduction are matched by equally simple and repetitive gameplay. However, retrogames are not hopelessly antiquated museum pieces lacking the good sense to stay buried in gaming history. Their continued presence complicates easy (and industry-friendly) conceptions of technological and aesthetic progress, in which the newest equals the best equals the most expensive.

Older games thrive alongside their more sophisticated descendants, gaining popularity and influence with each passing year. Retrogames continue to be played in both authorized and unauthorized forms. Their minuscule memory footprint, easily grasped rules, and convenient fit within the interstices of daily routine make them ideal content for mobile devices. For instance, the App stores for iTunes and Google Android phones devote sections to retrogames. The Xbox Live Arcade markets “updated retro classics” alongside its “newest hits,” while the Wii Virtual Console sells downloads from “the greatest video game archive in history”—actually licenses owned by Nintendo. These monetized properties coexist uneasily with the thriving emulator scene, where every conceivable old game has its software simulacrum and renegade read-only memories (ROMs)—files containing data images copied from memory chips, computer firmware, or the circuit boards of arcade machines—circulate beyond the bounds of copyright. For both legal and illegal purposes, the Internet functions as both archive and distribution network, supporting the sharing, spreading, and mutation of content.

READ MORE

 

A GLOBAL HISTORY OF SECONDHAND CLOTHING

Clothing, almost by definition, is a medium of transmission within a spreadable media ecology. It is both the means and the site for the storage and spread of information. Clothes are made to be carried by the human body (as in the French porter and the Haitian Creole pote). Textile skins were, from their origins, portable artifacts and temporary prostheses, shaped by the demands of a mobile body and inscribed with markers of that body’s history. The demands on clothing have always been high—armor (protection against shame, enemies, and the elements) and aesthetics, comfort and durability. Clothing is portable, proximate to the human body, and eminently changeable. Clothes remain artifacts in continual flux. They convey messages to the world, and they also provide the raw material for subversion of precisely these messages.

Before the industrial era, vestments were few and far between. Their production took a great amount of human and material resources. Into their tailored forms much was literally and culturally invested. In the Western tradition, throughout the Middle Ages and Renaissance, clothing—once shaped to a given body—might be worn for years, sometimes carried for a lifetime. The clothing wore its owner as much as the owner wore the clothing, bearing comparable markers of a personal narrative. Through the movements of a body in time, its clothes would acquire increasingly personal and human characteristics—worn knees and elbows, a stretched waist. Stains, patches, tears, and color changes accompanied a life journey, or at least several decades thereof.

Sometimes an article’s function was portable. This was especially true when even the simplest clothing was scarce: its production costly, time consuming, and labor intensive. A coat might be cut down into a vest, or a dress into a scarf. As a garment’s function evolved, so too might the identity of its wearer. A dress might be handed from mother to daughter through a gift economy. In such instances, it carried with it signs and markers of generational passing. A master might give his worn-out shirt to his servant, for whom it could serve as either bodily cover or portable currency. In the Renaissance, it was common for servants to sell their masters’ old clothing to peasants in neighboring villages. Itinerant rag and old clothes dealing grew into a veritable calling within a commodity-based economy. This was a profession of portability. The dealer became an intermediary between wearers, marking a transitional phase in an article’s mobile life history.

Attention Transmedia Producers, Attention Transmedia Scholars...

Today, I am using my blog space to share announcements of two upcoming events which may be of interest to some of my readers.

Transmedia Lab Competition at RioContentMarket 2013

The Transmedia Lab is one of the activities of the RioContentMarket, an international event on multiplatform content production open to the audiovisual and digital media industry. The Transmedia Lab aims to promote professional training and project improvement.

In the last edition of RioContentMarket, in 2012, more than 100 projects from all of Latin America were submitted and 12 transmedia projects were selected to participate in the Transmedia Lab, which lasted 4 days. The projects’ authors and representatives consulted with market experts; pitched their projects to buyers, co-producers, and television channels; and participated in meetings with domestic and international market players.

Besides creating opportunities for all participating projects, the Lab gave three awards, each chosen by a different group of judges:

(i)            Reed MIDEM Award (participation and pitching at MIPCube): Buenaventura Mon Amour project (Colombia);

(ii)           PETROBRAS/The Alchemists Award (participation in Transmedia Hollywood): Contatos project (Brazil), and

(iii)          Turner Broadcasting Award (USD 10,000 for project development): Contatos project (Brazil).

 

In the 2013 edition of RioContentMarket, the Transmedia Lab will focus on transmedia projects for TV series, and 30 projects will be selected: 10 international and 20 Brazilian. The Transmedia Lab - Series will be held from February 17 through 22, 2013 in two steps: (I) Capacitating, February 17th to 19th, and (II) Pitching and Panels, February 20th to 22nd. The Capacitating step will include training the projects’ authors for pitching and scheduling meetings between consultants and the creative producers of the selected projects. The Pitching and Panels step will take place during RioContentMarket 2013, with keynotes and panels on transmedia topics and project pitches for industry and market professionals.

The Transmedia Lab objectives for 2013 are:

  • enhance television series narratives through specialized consulting with market experts;
  • improve the transmedia projects for television series to qualify them for the audiovisual market nationally and internationally;
  • bring players together, encouraging dialogue between independent producers and channel executives;
  • create business opportunities for the development of high-quality TV series; and
  • give visibility to selected projects.

For more information, visit this site. 

------------------------------------------------------------------------------------------------------------------------------

Media in Transition 8: Public Media, Private Media

Conference dates: May 3-5 (Fri.-Sun.), 2013 at the Massachusetts Institute of Technology, Cambridge, MA.

An archive of previous Media in Transition conferences, 1999-2011.

CALL FOR PAPERS

Submissions accepted on a rolling basis until Friday, March 1, 2013 (evaluations begin in November). Please see the end of this call for papers for submission instructions.

The distinction between public and private – where the line is drawn and how it is sometimes inverted, the ways that it is embraced or contested – says much about a culture. Media have been used to enable, define and police the shifting line between the two, so it is not surprising that the history of media change to some extent maps the history of these domains. Media in Transition 8 takes up the question of the shifting nature of the public and private at a moment of unparalleled connectivity, enabling new notions of the socially mediated public and unequalled levels of data extraction thanks to the quiet demands of our Kindles, iPhones, televisions and computers.  While this forces us to think in new ways about these long established categories, in fact the underlying concerns are rooted in deep historical practice.  MiT8 considers the ways in which specific media challenge or reinforce certain notions of the public or the private and especially the ways in which specific “texts” dramatize or imagine the public, the private and the boundary between them.  It takes as its foci three broad domains: personal identity, the civic (the public sphere) and intellectual property.

Reality television and confessional journalism have done much to invert the relations between private and public. But the borders have long been malleable. Historically, we know that camera-armed Kodakers and telephone party lines threatened the status quo of the private; that the media were complicit in keeping from the public FDR’s disability and the foibles of the ruling elite; and that paparazzi and celebrities are strategically intertwined in the game of publicity. How have the various media played these roles (and represented them), and how is the issue changing at a moment when most of our mediated transactions leave data traces that not only redefine the borders of the private, but that serve as commodities in their own right?

The public, too, is a contested space. Edmund Burke’s late 18th century invocation of the fourth estate linked information flow and political order, anticipating aspects of Habermas’s public sphere. From this perspective, trends such as a siege on public service broadcasting, a press in decline, and media fragmentation on the rise, all ring alarm bells. Yet WikiLeaks and innovative civic uses of media suggest a sharp countertrend. What are the fault lines in this struggle? How have they been represented in media texts, enacted through participants and given form in media policy? And what are we to make of the fate of a public culture in a world whose media representations are increasingly on-demand, personalized and algorithmically-designed to please?

Finally, MiT8 is also concerned with the private-public rift that appears most frequently in struggles over intellectual property (IP). Ever-longer terms of IP protection combined with a shift from media artifacts (like paper books) to services (like e-journals) threaten long-standing practices such as book lending (libraries) and raise thorny questions about cultural access. Social media sites, powered by users, often remain the private property of corporations, akin to the public square’s replacement by the mall, and once-public media texts, like certain photographic and film collections, have been re-privatized by an array of institutions. These undulations in the private and public have implications for our texts (remix culture), our access to them, and our activities as audiences; but they also have a rich history of contestation, evidenced in the copybook and scrapbook, compilation film, popular song and the open source and creative commons movement.

MiT8 encourages a broad approach to these issues, with specific attention to textual practice, users, policy and cultural implications. As usual, we encourage work from across media forms and across historical periods and cultural regions.

Possible topics include:

  • Media traces: cookies, GPS data, TiVo and Kindle tracking
  • The paradoxes of celebrity and the public persona
  • Representing the anxieties of the private in film, tv, literature
  • MMORPGs / identities / virtual publics
  • The spatial turn in media: private consumption in public places
  • Historical media panics regarding the private-public divide
  • When cookies shape content, what happens to the public?
  • Creative commons and the new public sphere
  • Big data and privacy
  • Party lines and two-way radio: amplifying the private
  • The fate of public libraries in the era of digital services
  • Methodologies of internet and privacy studies
  • Creative commons, free software, and the new public sphere
  • Public and civic WiFi access to the internet
  • Surveillance, monitoring and their (dis)contents

Submit an Abstract and Short Bio

Short abstracts for papers should be about 250 words in a PDF or Word format and should be sent as email attachments to mit8@mit.edu no later than Friday, March 1, 2013. Please include a short (75 words or fewer) biographical statement.

We will be evaluating submissions on a rolling basis beginning in November and will respond to every proposal.

Include a Short Bibliography

For this year’s conference, we recommend that you include a brief bibliography of no more than one page in length with your abstract and bio.

Proposals for Full Panels

Proposals for full panels of three or four speakers should include a panel title and separate abstracts and bios for each speaker. Anyone proposing a full panel should recruit a moderator.

Submit a Full Paper

In order to be considered for inclusion in a conference anthology, you must submit a full version of your paper prior to the beginning of the conference.

If you have any questions about the eighth Media in Transition conference, please contact Brad Seawell at seawell@mit.edu.

The Affordances of Technology for Media History Research (Part Two)

The Media History Digital Library seeks to bring together communities of scholars and fans. How do you see the relationship between scholarly research and fandom in your own work?  

Eric Hoyt:

 

The title of Henry’s blog where we are having this discussion—“Confessions of an Aca-Fan”—speaks to the way that personal passion and scholarly inquiry shape one another.

I am certainly an aca-fan of both historic and contemporary Hollywood. I tend to pursue research questions related to law, culture, and industry, rather than film style or aesthetics. But the whole reason I focus on the film and media industries—rather than, say, the corrugated box industry—comes from a deep love and fascination with films and television programs.

As a Film & Media Studies academic, I also feel grateful to study an area of culture that holds such broad popular interest. I think it’s a shame if we don’t connect with that broader public. We miss an opportunity to share our research. We also miss out on a chance to learn.

Something that many scholars already know but bears repeating is that many of the materials on the MHDL only exist because of fans. From the 1910s forward, fans purchased magazines, such as Motion Picture Story and Photoplay, to extend and deepen the movie-going experience. Most libraries in the early-20th century considered these magazines to be mere ephemera and did not keep them. So many of these magazines only exist today because fans bothered to keep them. I am grateful to fans and collectors for keeping these documents of film history and supporting the MHDL in its endeavor to make them freely available online.

 

Andy Myers:

As Eric mentioned, many of these publications are, in multiple ways, inextricable from the context of fan culture. Fans have not only collected and preserved these publications — their interest and investment in film culture actually provided the necessary market demand for these magazines to exist in the first place. Fan magazines like Photoplay were in constant dialogue with their audience and thus can provide scholars key texts in the history of fan discourse. For me, the eclectic fan letters reprinted in these magazines are one of their most fascinating and entertaining features because they offer so many surprising insights into the breadth of film fan culture.

Kathy Fuller-Seeley:

Various aspects of fandom have always been central to my own work, as the questions that sparked my dissertation research were how Americans in small towns and rural hinterlands came into contact with motion pictures in everyday life, and how the growing movie fan culture engaged them. I second Eric’s gratitude to fans back in the day who saved fan magazines and ephemera and who compiled scrapbooks and kept diaries. Libraries long turned up their noses at saving such disposable popular culture. We are fortunate that archives like the Smithsonian’s National Museum of American History and Northeast Historic Film have amassed robust collections. Today, individual collectors still hold the most fascinating moviegoing ephemera – real photo postcards of nickelodeon theaters, posters, illustrated song slides, pressbooks, trade journals, theater accounting ledgers, etc. I’m very appreciative of the generosity of many who have shared their archival treasures with me.

 

Now I am wondering how we can collaborate with collectors to make more of these materials available to the public for research purposes, and I am investigating ways of digitizing my own collections online. (You can see my collection of images from the “world premiere” of the 1937 United Artists film Blockade in, of all places, Elkhart, Indiana at http://www2.gsu.edu/~jougms/blocimg.htm, along with a terrific essay by my colleague Greg Smith.)

Where do we go from here?

 

Kathy Fuller-Seeley:

 

My Digitized Dream Wishlist includes

  • A full run of Motion Picture Herald
  • A full run of Moving Picture World (which the MHDL is rapidly accruing, hooray!!!)
  • Exhibitors Trade Review
  • Hollywood Reporter
  • Variety (in more user-friendly search software! It is difficult to read an issue page by page).
  • New York Clipper and New York Dramatic Mirror (these two are available in part through the Fulton County website, but the database is clunky and it's somewhat difficult to search)
  • orphaned New York City newspapers like the Herald and World and Telegraph; I still have no source for John Crosby’s radio and television criticism or Alton Cook’s radio columns. I wish Proquest would make subscriptions to multiple historic newspapers available and reasonably priced for individual researchers! One year, a membership in the Society for American Baseball Research provided access, and that was terrific.

Even better is access that is free to the public, and I am so grateful for the work of the MHDL in making all these fascinating documents available to everyone!

 

Eric Hoyt:

We’ve now digitized over 500,000 pages of media periodicals. By the end of this year, we may surpass one million pages. A question that I’ve been asking myself is—once you’ve aggregated all of that data, what do you do with it? One thing you clearly need to be able to do is swiftly search through the data. I have been collaborating with a great team—which includes Carl Hagenmaier, Wendy Hagenmaier, Joseph Pomp, Andy Myers, Pete Sengstock, Jason Quist, and new collaborators at the University of Wisconsin-Madison—on building Lantern, a search engine for the MHDL. Search will be an important tool for researchers and historians. It will also provide a much easier entry point into our collection for users who are passionate about classic movies and television but don’t know where to start looking.

 

In addition to search, though, what else can you do with all that aggregated data? It would take me years to read through every page of text in the MHDL. A computer, on the other hand, reads the data in seconds. This is the basis for full-text search, but it also opens up new possibilities that Humanities scholars interested in quantitative methods and “big data” are only beginning to explore. Google Ngram Viewer, for example, allows you to graph the frequency of words and phrases across a corpus of five million books. Here is a graph I quickly compiled of the terms 16mm, 8mm, and 28mm. This graph immediately suggests a story about the cultural, industrial, and technological importance of these “sub-standard” film stocks across a hundred-year span. Now, it would tell you a better story if you could refine the searchable corpus—using the collections of the MHDL, rather than Google Books. And it would tell you the richest story of all if you combined the insights of the graph with specific articles and books about non-theatrical film from the MHDL’s collections.
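For readers who want to experiment with this kind of frequency counting on their own materials, here is a minimal, hypothetical Python sketch. It is not the MHDL's or Google's actual tooling; it simply assumes a local folder of plain-text OCR files whose names begin with a four-digit year (for example, 1923.txt) and charts how often a few gauge terms appear per thousand words.

    # Hypothetical sketch: Ngram-style term frequencies over local OCR text files.
    # Assumes a folder "ocr_texts" of files named like "1923.txt"; not MHDL tooling.
    import os
    import re
    from collections import defaultdict

    import matplotlib.pyplot as plt

    CORPUS_DIR = "ocr_texts"          # placeholder directory of plain-text OCR output
    TERMS = ["16mm", "8mm", "28mm"]   # the film gauges graphed in the example above

    def term_frequencies(corpus_dir, terms):
        """Return {term: {year: occurrences per 1,000 words}}."""
        freqs = defaultdict(dict)
        for filename in sorted(os.listdir(corpus_dir)):
            if not (filename.endswith(".txt") and filename[:4].isdigit()):
                continue
            year = int(filename[:4])  # assumes one file per year, named by year
            with open(os.path.join(corpus_dir, filename), encoding="utf-8") as f:
                words = re.findall(r"\S+", f.read().lower())
            total = len(words) or 1
            for term in terms:
                count = sum(1 for w in words if term in w)
                freqs[term][year] = 1000.0 * count / total
        return freqs

    if __name__ == "__main__":
        freqs = term_frequencies(CORPUS_DIR, TERMS)
        for term, by_year in freqs.items():
            years = sorted(by_year)
            plt.plot(years, [by_year[y] for y in years], label=term)
        plt.xlabel("Year")
        plt.ylabel("Occurrences per 1,000 words")
        plt.legend()
        plt.show()

Even a rough count like this is only a starting point; as Eric suggests, the payoff comes from pairing the resulting graph with close reading of the underlying articles.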

I see this as the direction where the Humanities and MHDL eventually need to head—combining the familiar practice of “close reading” with strategies of “distant reading” (to use the term coined by Franco Moretti). It’s not about abandoning the established critical tools. But we do need to learn from the data-intensive research that is happening in the sciences. I recently attended a “Humanities Hackathon” workshop hosted by UW-Madison’s Center for the Humanities and the Wisconsin Institute for Discovery. I was encouraged by the enthusiasm for using computational methods in Humanities research, as well as by the legitimate concern that we perform such analyses in a thoughtful way. I am excited to pursue these new techniques in my own scholarship, as well as to help build the infrastructure and tools that will enable other scholars to join the experiment.

 

 Andy Myers:

 

In the short term, as Eric mentions, our obvious goals are to add much more material, and to make system and interface improvements — such as full text search — which make it easier for users to find relevant material.

As far as long-term goals go, we want to make as much material as possible available to as many people as possible through as many avenues as possible, and we’ve been building lots of momentum. We’re not exactly declaring war on aggregators of public domain material like ProQuest — after all, we recognize that they do add value for many institutions and that these aggregators license more recent, copyrighted content too. However, with our boom in content and the upcoming launch of Lantern, we think that we are reaching a point where we can offer institutions a viable alternative to commercial providers and their high access fees. We firmly believe that our open-access model can provide better-quality material, freely available to everyone, with superior usability, at a fraction of the cost. So I really feel that in terms of growth, to paraphrase Walter White, we’re now in the empire business.

We’d also like to develop good cross-integration with other databases and resources across the web. Our digital assets are starting to be listed in the catalogs of academic libraries as electronic resources, which is a huge step toward aiding discovery by researchers. I hope professors and graduate students reading this blog post right now will tell their librarians about the MHDL, and ask them to input MHDL resources into their library catalogs. Additionally, we hope to eventually add features that will facilitate discovery of material made available by other great projects around the web. Wouldn’t it be great if a full text search on MHDL would not only search our collections but also point users toward results in sites like AmericanRadioHistory.com (which hosts decades of digitized broadcasting periodicals) or the Margaret Herrick Library’s digital collections? We have yet to explore the technical details of such an implementation, but I think that kind of integration is on our distant horizon.

 

Bios

 

Eric Hoyt is Assistant Professor of Communication Arts at the University of Wisconsin-Madison. He co-directs the Media History Digital Library in collaboration with the project’s founder, David Pierce. He is also leading the development of the MHDL’s new search platform, Lantern, which is a co-production with the University of Wisconsin-Madison Department of Communication Arts. His articles on media, law, and culture have appeared in Cinema Journal, Film History, Jump Cut, World Policy Journal, and The International Journal of Learning and Media.

 

Kathy Fuller-Seeley is Professor in the Department of Communication at Georgia State University. She specializes in the history of film, radio, TV and media audiences. Kathy's books include Hollywood in the Neighborhood: Historical Case Studies of Local Moviegoing (California, 2008, edited), At the Picture Show: Small Town Audiences and the Creation of Movie Fan Culture (Smithsonian 1997), and Children and the Movies: Media Influence and the Payne Fund Controversy (Cambridge 1996, with G. Jowett and I. Jarvie).  She has a book forthcoming on the history of nickelodeons and is writing a book project about Jack Benny’s radio program and American culture.

 

Andrew Myers is a doctoral student in Critical Studies at the University of Southern California's School of Cinematic Arts. He serves as the post-processing editor for the Media History Digital Library, which generally entails writing scripts to process images, text, and metadata. He recently received his M.A. in Cinema and Media Studies from UCLA and is also the outgoing co-editor-in-chief of Mediascape, UCLA's Journal of Film, Television, and Digital Media. His diverse research interests include media industries and production culture, archival film and television history, new media (especially video games), and documentary.

 

 

The Affordances of Digital Technology for Media History Research (Part One)

The Media History Digital Library (MHDL) digitizes out-of-copyright periodicals relating to the histories of film, broadcasting, and recorded sound and makes them widely available for public use. The project promotes media history scholarship, provides educational tools for classroom use, and advocates for greater engagement with the public domain. A non-profit initiative, the MHDL is supported by owners of materials who loan them for scanning, and donors who contribute funds to cover the cost of scanning.

In this discussion, MHDL co-director Eric Hoyt talks with the project’s post-processing editor Andy Myers and film historian Kathy Fuller-Seeley, who, in the course of her own research, digitized reels of microfilm that she donated to the MHDL. Eric, Kathy, and Andy talk about the value of digitizing these historical sources, the challenges involved in the process, and bringing together the communities of fans and scholars.

 

What is the value of digitizing historic trade papers and fan magazines?

 

Kathy Fuller-Seeley: I’m an enthusiastic fan of digitized trade journals, fan magazines and newspapers. Their availability has had a transformative impact on my research, allowing me to dig more deeply into published coverage of all my topics, and also creating new research projects. I live far from the libraries and archives that hold the original publications, and my relatively young state university can’t/won’t invest in an extensive microfilm library. Far beyond the convenience of not having to squint at scratched microfilm and having rare journals available day or night, I’m enthralled that digitization reveals much more complex views of my research topics. I’m not just cherry-picking one or two articles that I might have stumbled across in print or in someone’s scrapbook; I am encountering masses of coverage that I can analyze and weigh as a whole.

Quick examples from current research projects – digitized Photoplay enabled me to uncover how the mainstream Hollywood publicity machinery constructed star personae for unusual performers such as Shirley Temple, Rin Tin Tin and Marie Dressler, and marketed them to working- and middle-class female fans. Searchable Variety uncovered for me the innovative ways in which Jack Benny and his agents intertwined his performances and star image across radio, film, live appearances and consumer product advertising to achieve incredible career synergy. Newspaper databases like the Library of Congress’s allowed me to trace how the fame of the earliest “picture personalities” (Florence Turner, the Vitagraph Girl, and Florence Lawrence, the “Imp Girl”) diffused far beyond the original promotional material published in national trade journals and the New York papers to the creation of local fan cultures in rural New England, Utah, Arizona and the Yukon.

In researching film history, digitized trade journals and newspapers enable us to learn more about exhibition practices and the circulation of fan culture outside major metropolitan areas (and to move beyond scholars’ over-reliance on the New York Times). Access to the fan magazines enables close readings of how Hollywood structured knowledge about stars and films for their target audiences. Having access to searchable trade papers not only shows us what they covered but also the topics they purposely avoided (such as competing non-film promotions like Dish Night giveaways during the Depression). They make new research topics possible, such as the recent work of historians Paul Moore and Richard Abel on intersections of newspaper discourse and moviegoing culture. The NEH is currently sponsoring a series of grants for “digging into the data” to expand our thinking about these research possibilities.

Andy Myers:

Our role in digitizing these publications, as I see it, is about ensuring three things: accessibility, discoverability, and browse-ability.

Enabling access is the first and most obvious benefit of digitizing these resources. For example, two years ago as a student at UCLA I was working on a project on the American anti-Bolshevik films from 1919 and 1920. The secondary sources I was using kept referencing articles in The Moving Picture World (MPW), and I had dozens of such references to track down. Even our fantastic library at UCLA didn’t have anything from MPW from those years, but after some searching I eventually located a microfilm copy at the Margaret Herrick Library in Los Angeles and made a special trip. Now, I’m lucky enough to live and study in Los Angeles, where the archival resources for media history are unparalleled, so I only had to drive a few miles and spend an afternoon. But scholars in other parts of the world would have a much more difficult time accessing MPW or other publications, many of which are not even available on microfilm. Now that the MHDL is making long runs of classic publications like The Moving Picture World available digitally, the kinds of barriers to access that scholars have faced even in the recent past are now evaporating.

Simply providing access to information is useless, however, if nobody can find it – that new information has to be easily discoverable. Online catalogs and databases are very good at directing the user to an appropriate resource when the user already knows with some specificity what he or she is looking for. For example, I can read an article that mentions the film The World Aflame (1919), see a footnote to a May 1919 issue of Moving Picture World, and then very easily and quickly navigate to the online resource.

But how do I find that article without a secondary reference, if I only have the film title and the year? Full text search is, of course, essential to ensuring discoverability — and our upcoming Lantern search tool should facilitate that. In addition to search, there are also a couple other considerations for making resources discoverable. One priority is that we curate a user experience that empowers users to intuitively cut through the database, narrowing their research net according to criteria such as year, format (e.g. book or periodical), the type of publication (e.g. trade journal or fan magazine), or specific publication title.
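To make that concrete, here is a small, hypothetical Python sketch of the kind of facet-by-facet narrowing described above; the records and field names are placeholders of my own invention, not Lantern's actual data model.

    # Hypothetical sketch of faceted narrowing over catalog records.
    # The records and field names below are illustrative placeholders only.
    from dataclasses import dataclass

    @dataclass
    class Record:
        title: str       # publication title, e.g. "Moving Picture World"
        year: int
        fmt: str         # "book" or "periodical"
        pub_type: str    # "trade journal" or "fan magazine"

    CATALOG = [
        Record("Moving Picture World", 1919, "periodical", "trade journal"),
        Record("Photoplay", 1923, "periodical", "fan magazine"),
        Record("Motion Picture Herald", 1934, "periodical", "trade journal"),
    ]

    def narrow(records, **facets):
        """Keep only records whose attributes match every requested facet."""
        return [r for r in records
                if all(getattr(r, key) == value for key, value in facets.items())]

    # e.g. every trade journal volume from 1919
    print(narrow(CATALOG, pub_type="trade journal", year=1919))

A real interface would expose these facets as clickable filters rather than keyword arguments, but the underlying idea is the same: each criterion simply shrinks the set of candidate volumes.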

Academics are, of course, well accustomed to advanced catalog and database searches, but most of these repositories are so immense and diverse that browsing is impractical or impossible. So we aim to supplement that random-access database searching paradigm with a sort of microfilm paradigm.

Mediahistoryproject.org’s curated collections on various topics are manageable in size and allow users to easily browse the resources we have available for their research without being overwhelmed with irrelevant matches. And once users open a publication, in addition to being able to use standard search tools, they can also flip through the entire book page-by-page incredibly quickly, the same way that microfilm researchers can.

As I suspect nearly every veteran microfilm researcher would attest, skimming through entire months and years of a publication — rather than simply cherry-picking search hits — will yield incredibly valuable insights into the historical and social contexts of the topic that the researcher is studying, as well as uncover unexpected but closely related contemporary issues. The browse-ability features of the MHDL allow researchers to immerse themselves into the historical moment and buttress their argument with a full array of related evidence. OCR is often imperfect, particularly in cases of creatively typeset headlines and advertisements, so it’s often the case that relevant material can only be discovered through good, old-fashioned skimming. By offering this hybridization of research paradigms, we hope users will be able to pursue whatever approach of searching and/or browsing works best for them and their project.

 

Eric Hoyt:

In their responses to this question, Kathy and Andy both nicely highlight some of the ways that digitization improves the research process for film historians. What I want to emphasize here is that I’m proud not simply about the fact that the MHDL has been digitizing important trade journals and magazines but about how we’ve gone about doing this. First, through coordinating with the Internet Archive, we’re primarily scanning original print editions rather than microfilm. This means color, better images, better OCR data, better everything.

I’ve come to appreciate, though, that scanning microfilm also has its place and benefits. For some oversized and particularly brittle newspapers, microfilm scanning is the only option. We were fortunate that Kathy and Q. David Bowers supplied us with over a dozen DVD-Rs worth of microfilm scans for some of these newspapers, including The Clipper and two years worth of Variety. Andy handled post-processing work on the microfilm scans, and now anyone with an open Internet port around the world can see them.

It’s this collaborative and open access model that I think defines the project. The collections of the MHDL only exist because of collaboration. I’ve collaborated closely with the project’s founder and director David Pierce, digital archivist Wendy Hagenmaier, and others in coordinating the scanning and improving our website. However, collaboration also underlies the entire acquisition and funding structure. The MHDL is supported by collectors and institutions who loan materials for scanning, and donors who cover the costs of digitization. We’ve also begun collaborating with academic groups that want to see more publications pertaining to a certain area go online. Domitor, the International Society for the Study of Early Cinema, raised over $6,000 among its member-base to contribute to the digitization of Moving Picture World and other early cinema publications.

The MHDL is also built upon open access. Most people in the world with an Internet connection have the ability to go to our site and freely read or download as many publications as they want. For users who want access to the underlying source, you can click through to a volume’s Internet Archive page (IA page) and access the uncompressed JPEGs, OCR text, and XML metadata.
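For readers who want to script that kind of access, here is a short, hypothetical Python sketch that lists the files attached to a volume through the Internet Archive's public metadata endpoint. The identifier below is a placeholder (each volume's real identifier appears in its IA page URL), and this is simply one way a reader might automate bulk downloads, not an official MHDL tool.

    # Hypothetical sketch: list the files attached to an Internet Archive item.
    # Replace the placeholder identifier with the one from the volume's IA page URL.
    import requests

    IDENTIFIER = "example-mhdl-volume"  # placeholder, not a real item identifier

    def list_files(identifier):
        """Return (name, format) pairs for every file attached to the IA item."""
        resp = requests.get(f"https://archive.org/metadata/{identifier}", timeout=30)
        resp.raise_for_status()
        return [(f.get("name"), f.get("format")) for f in resp.json().get("files", [])]

    def download_url(identifier, filename):
        return f"https://archive.org/download/{identifier}/{filename}"

    if __name__ == "__main__":
        for name, fmt in list_files(IDENTIFIER):
            print(fmt, "->", download_url(IDENTIFIER, name))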

We work with materials that belong in the public domain, and this provides the legal foundation for open access. But keeping things open is also a decision on our part. Subscription services, such as ProQuest and the Variety Archives, are walled gardens. Although they offer access to licensed copyrighted material, they also store countless pages’ worth of digitized public domain periodicals. As I have argued in the International Journal of Learning and Media, I think we need to encourage public access and engagement with the public domain and call attention to this shared resource. Through digital technology and collaborative loaning and funding structures, we have the opportunity to offer broad access to public domain texts and enable their reuse across a variety of forms. To extend the earlier metaphor, we can build public parks, rather than walled gardens.

 

 

Bios

 

Eric Hoyt is Assistant Professor of Communication Arts at the University of Wisconsin-Madison. He co-directs the Media History Digital Library in collaboration with the project’s founder, David Pierce. He is also leading the development of the MHDL’s new search platform, Lantern, which is a co-production with the University of Wisconsin-Madison Department of Communication Arts. His articles on media, law, and culture have appeared in Cinema Journal, Film History, Jump Cut, World Policy Journal, and The International Journal of Learning and Media.

Kathy Fuller-Seeley is Professor in the Department of Communication at Georgia State University. She specializes in the history of film, radio, TV, and media audiences. Kathy's books include Hollywood in the Neighborhood: Historical Case Studies of Local Moviegoing (California, 2008, edited), At the Picture Show: Small Town Audiences and the Creation of Movie Fan Culture (Smithsonian, 1997), and Children and the Movies: Media Influence and the Payne Fund Controversy (Cambridge, 1996, with G. Jowett and I. Jarvie). She has a book forthcoming on the history of nickelodeons and is writing a book about Jack Benny’s radio program and American culture.

Andrew Myers is a doctoral student in Critical Studies at the University of Southern California's School of Cinematic Arts. He serves as the post-processing editor for the Media History Digital Library, which generally entails writing scripts to process images, text, and metadata. He recently received his M.A. in Cinema and Media Studies from UCLA and is also the outgoing co-editor-in-chief of Mediascape, UCLA's Journal of Film, Television, and Digital Media. His diverse research interests include media industries and production culture, archival film and television history, new media (especially video games), and documentary.

Spread That!: Further Essays from the Spreadable Media Project

Spreadable Media: Creating Meaning and Value in a Networked Culture, my new book with Sam Ford and Joshua Green, is launching at the end of January. Each week, we are releasing new essays written by friends and affiliates of the Futures of Entertainment Consortium which expand upon core ideas in the book. You will see that these essays are an integral part of the book, even though they are being distributed digitally. We also see these essays as a means of sparking key conversations in anticipation of the book's release. So, in the spirit of this project's mantra, "if it doesn't spread, it's dead," we are asking readers to help circulate these essays far and wide, to as many different networks and communities as seem relevant to the ongoing conversation.

Readers are already responding, including through the creation of "memes." Over the weekend, we received this "Slap Robin" announcement via Twitter from @amclay09.

Share your own creations with us and I will showcase them here as I post upcoming essays.

This week, we are releasing essays which are tied to the Introduction and first Chapter of the book. Before I do so, let me share some of the early responses to the book (i.e. the solicited blurbs):

“Something new is emerging from the collision of traditional entertainment media, Internet-empowered fan cultures, and the norms of sharing that are encouraged and amplified by social media. Spreadable Media is a compelling guide, both entertaining and rigorous, to the new norms, cultures, enterprises, and social phenomena that networked culture is making possible. Read it to understand what your kids are doing, where Hollywood is going, and how online social networks spread cultural productions as a new form of sociality.”—Howard Rheingold, author of Net Smart

“By critically interrogating the ways in which media artifacts circulate, Spreadable Media challenges the popular notion that digital content magically goes ‘viral.’ This book brilliantly describes the dynamics that underpin people’s engagement with social media in ways that are both theoretically rich and publicly meaningful.”—danah boyd, Microsoft Research

“The best analysis to date of the radically new nature of digital social media as a communication channel. Its insights, based on a deep knowledge of the technology and culture embedded in the digital networks of communication, will reshape our understanding of cultural change for years to come.”—Manuel Castells, Wallis Annenberg Chair of Communication Technology and Society, University of Southern California

“Finally, a way of framing modern media creation and consumption that actually reflects reality and allows us to talk about it in a way that makes sense. It’s a spreadable world and we are ALL part of it. Useful for anyone who makes media, analyzes it, consumes it, markets it or breathes.”—Jane Espenson, writer-producer of Battlestar Galactica, Once Upon a Time, and Husbands

“It’s about time a group of thinkers put the marketing evangelists of the day out to pasture with a thorough look at what makes content move from consumer to consumer, marketer to consumer and consumer to marketer. Instead of latching on to the notion that you can create viral content, Jenkins, Ford, and Green question the assumptions, test theories and call us all to task. Spreadable Media pushes our thinking. As a result, we’ll become smarter marketers. Why wouldn’t you read this book?”—Jason Falls, CEO of Social Media Explorer and co-author of No Bullshit Social Media

This week's selections include discussions of historical predecessors, memes and 4chan, the debates about free labor, co-creation in games culture, and the power of consumer recommendations. Read the sample. Follow the links (....) back to the main site. Read. Enjoy. Spread. Repeat next week.

The History of Spreadable Media

Media have been evolving and spreading for as long as our species has been around to develop and transport them. If we understand media broadly enough to include the platforms and protocols—to use Lisa Gitelman’s (2006) terms—that carry our stories, bear our messages, and give tangible expression to our feelings, they seem intrinsic to the human experience. Some people might even argue that the developments of vocal communication systems (language) and visualization strategies (paintings and carvings) represent defining moments in human evolution, demonstrations of man’s social nature. Human mastery of media was every bit as important as the mastery of tools. Stories of the spread and appropriation of media run across our history, each shaped by the logics of social organization and production characteristic of any given era.

Early traces of the spread and reach of media abound, even if some historical forms of media fall outside our familiar categories. For example, our contemporary understanding of the reach and influence exercised by ancient empires owes much to discoveries of coins—a medium of abstract exchange if we follow Karl Marx’s argument in Capital ([1867] 1999) and elsewhere but also a system of representation and meaning (from the value of the gold or silver to the inscribed monetary value, to the messages or portraits etched on its surface) with precise culturally defined borders. The coin, as a medium, spread with the state’s citizens, enabling their interactions with one another and at the same time attesting to the state’s reign. Ceramic dishes and tiles offer an example of a medium that was seized on for reasons of cultural exchange. The rich intermingling of styles and techniques characteristic of early-seventeenth-century Dutch, Chinese, and Ottoman ceramics speaks to the period’s trade routes and export markets and the creative appropriations of these various cultural models by its artisans. But these ceramics were also platforms, complete with highly nuanced systems of signification, hierarchies of value, and attendant associations of taste. They were carried, traded, collected, and displayed by a surprisingly large cross-section of the northern European population. As the ceramics circulated within different social groups, as the vogue for ceramics rose and fell, and were handed down to our present as family heirloom or antique shop curio, the journeys they undertook, and the meanings accorded them as media, attest to the energies and interests of those who helped to spread them....

In Defense of Memes

Although I agree that the terms “viral” and “meme” often connote passive transmission by mindless consumers, I take issue with the claim that “meme” always precludes active engagement—or that the term has a universal, static meaning. As understood by trolls, memes are not passive and do not follow the model of biological infection. Instead, trolls see (though perhaps “experience” is more accurate) memes as microcosmic nests of evolving content. Contrary to the assumption that memes hop arbitrarily from self-contained monad to self-contained monad, memes as they operate within trolldom exist in synecdochical relationship to the culture in which they inhere. In other words, memes spread—that is, they are actively engaged and/or remixed into existence—because something about a given image or phrase or video or whatever lines up with an already-established set of linguistic and cultural norms. In recognizing this connection, a troll is able to assert his or her cultural literacy and to bolster the scaffolding on which trolling as a whole is based, framing every act of reception as an act of cultural production. Consider the following example.

Founded in the early nineties by rappers Violent J and Shaggy 2 Dope, the Insane Clown Posse (ICP) is a Detroit-based hip-hop group infamous for its violent lyrics, rabid followers, and, as it was recently revealed, secret evangelical Christianity. ICP, which performs in full-face clown makeup, has always been a target for trolling humor. The 2010 release of the group’s single “Miracles,” however, opened the floodgates—in the video, Violent J and Shaggy earnestly extol the virtues of giraffes, rainbows, cats, and dogs, not to mention music (“you can’t even hold it!”) and the miracles of childbirth and the cosmos. The song itself, which is regarded as the group’s evangelical “outing,” is peppered with expletives and features the line “Fuckin’ magnets—how do they work?” a question which inspired immediate and seemingly endless repurposing.

Within a few days of the video’s release, dozens of remixed images and .gifs were posted to 4chan’s infamous /b/ board, many of which merged with existing memetic content. A well-known image of a cross-eyed, bespectacled man captioned with the phrase “are you a wizard,” for example, inspired a series of related macros, including one featuring a close-up shot of Violent J in full makeup. “are you a magnet,” the caption reads, referring not just to the cluster of memes related to the “Miracles” video but also to all the permutations of the “are you a wizard” family of macros.

In short, trolls pounced on the phrase “fuckin’ magnets” not just because it was memorable and amusing on its own (although that played a large part in its popularity, as did the thrill of a gratuitous f-bomb) but because it was easily integrated into an existing meme set. Once the protomeme had been integrated, its resulting permutations—“are you a magnet” being a prime example—became memes unto themselves, establishing further scaffolding onto which new content could be overlaid. By choosing to repost “are you a magnet” on 4chan or off-site, the contributing troll was able to assert his own cultural fluency and, in the process, ensure the proverbial (and, in some ways, the literal) survival of his species. In this sense, the creation and transmission of memes can be likened to the process of human reproduction—specifically the decision to have a child in order to protect one’s legacy. The sexual act is decidedly active, but the resulting zygote is a passive (that is to say, unwitting) vessel for genetic information....

Interrogating “Free” Fan Labor

Over the past two decades, large swaths of the U.S. population have been engaged in copyright wars. On one side, copyright holders struggle to defend their property against what they perceive to be unlawful appropriation by millions of would-be consumers via digital technologies. On the other, millions of Internet users fear or fight expensive lawsuits, filed by entities far wealthier and more powerful than they, that seek to punish them for sharing media online. In this combative climate, fans who produce their own versions of mass-media texts—fan films and videos, fan fiction, fan art and icons, music remixes and mash-ups, and game mods, for example—take comfort and refuge in one rule of thumb: as long as they do not sell their works, they will be safe from legal persecution. Conventional wisdom holds that companies and individuals that own the copyrights to mass-media texts will not sue fan producers, as long as the fans do not make money from their works (for instance, Scalzi 2007 and Taylor 2007).

“Free” fan labor (fan works distributed for no payment) means “free” fan labor (fans may revise, rework, remake, and otherwise remix mass-culture texts without dreading legal action or other interference from copyright holders). Many, perhaps even most, fans who engage in this type of production look upon this deal very favorably. After all, movie studios, game makers, and record labels do not have to turn a blind eye to fan works; U.S. law is (as of this writing) undecided on the matter of whether appropriative art constitutes fair use or copyright infringement, so companies could sue or otherwise harass fan appropriators if they chose. But, even if both sides of the copyright wars consider the issue of fan labor settled, one aspect of the issue has not been sufficiently explored: can, or should, fan labor be paid labor?....

Co-creative Expertise in Gaming Cultures

Gamers increasingly participate in the process of making and circulating game content. Games such as Maxis’s The Sims franchise, for example, are routinely cited as exemplary sites of user-created content. Games scholar T. L. Taylor comments that players are co-creative “productive agents” and asserts that we need “more progressive models” for understanding and integrating players’ creative contribution to the making of these game products and cultures (2006b, 159–160; see also 2006a). Significant economic and cultural value is generated through these spreadable media activities. The usual phrases such as “user-created content” and “user-led innovation” can overlook the professional work of designers, programmers, and graphic artists as they make the tools, platforms, and interfaces that gamers use for creating and sharing content. Attention should also be paid to the work of producers, marketing managers, and community relations managers as they grapple with how best to manage and coordinate these co-creative relations.

The Maxis-developed and Electronic Arts–published Spore thrives on user-created content. Players use 3-D editors to design creatures and other in-game content, to guide their creatures through stages of evolution, and then to share their creations with other players. Since Spore’s release in September 2008, more than 155 million player-created creatures have been uploaded to the online Sporepedia repository. Players can also upload videos of their creatures to the Spore YouTube channel directly from within the game. Spreading content is a core feature of Spore; the game is perhaps best understood as a social network generated from player creativity. This spreadability is not just about content, as the players are also sharing ideas, skills, and media literacies....

The Value of Customer Recommendations

With new channels of communication and old, marketers can deliver a dizzying number of advertising messages to consumers—by many accounts, the average American sees between 3,000 and 5,000 ads a day. Yet, perhaps in response to this fusillade, consumers have learned to better armor themselves against the marketing messages they encounter. The Persuasion Knowledge Model (PKM) describes the extent to which consumers develop a radarlike ability to discern content whose aim is to persuade and, further, how they develop a set of skills to deal with such messages (Friestad and Wright 1994). Some of my own recent research (with colleagues Adam Craig, Yuliya Komarova, and Jennifer Vendemia) uses fMRI technology to explore brain activity as consumers are exposed to potentially deceptive product claims. Our findings show that consumers’ deception-detection processes involve surprisingly rapid attention allocation. Potential advertising lies seem to jump out of the marketing environment and rivet our attention like a snake on a woodland trail.

Advertisements are often informative as well as persuasive; consumers know this and don’t dismiss ads out of hand. But they do assess the extent to which they trust or are willing to use such information. First, and most critically, consumers seek to evaluate the credibility of a marketing message’s source. Source credibility is the bedrock of trust that precedes persuasion. People judge a source to be credible if the source shows evidence of being authentic, reliable, and believable. In the old days of marketing, firms sought to increase the source credibility of their ads by featuring the endorsements of doctors, scientists, and other authoritative experts. Once consumers became more aware that these experts were being paid handsomely for their testimony, the practice became less effective. Celebrity endorsers, who often were not product experts, provided warm affective responses but little in the way of believable, persuasive arguments.

Consumers themselves are particularly important endorsers via word-of-mouth (WOM) messages. Our past understanding of WOM (when one consumer recommends a product to another) was that consumers perceive other consumers as highly authentic but of dubious reliability. As when one’s Uncle Joe touts the superior performance of the Brand X computer, the recommender is clearly a real person but may or may not be knowledgeable enough about the product category to make credible claims. Now, with WOM increasingly occurring through spreadable media, it is more difficult for a consumer to assess both the authenticity and reliability of unknown recommenders. The practice of rating consumers’ online opinions and recommendations (e.g., Yahoo! Answers) is a direct attempt to resolve the audience’s uncertainty about who really knows something worth knowing....

HOT.SPOT: The Dark Side(s) of DIY

From time to time, I have written here about the work of the Civic Paths research team in the Annenberg School for Communication and Journalism at the University of Southern California. I helped to start this research group when I arrived in Los Angeles three and a half years ago; it has been the seedbed for our Media Activism and Participatory Politics project, which has generated a series of case studies of innovative activist groups (and will be the basis of an upcoming book). But the group has become something more than that -- a space where students and faculty gather to discuss the participatory turn in contemporary culture and politics. Such discussions thrive on our internal discussion list, and we've been experimenting with various ways to get these ideas out to the world, both formally through op-ed pieces and informally through blogging. The team recently launched a new project -- HOT.SPOT -- to encourage as many of the members as possible to write short blog posts around a related theme; think of it as a mini-anthology. Led by my journalism colleague Kjerstin Thorson and our post-doc Liana Thompson, the first of our "HOT.SPOT" blogs deals with the "Dark Side(s) of DIY." Our work has been so focused on the values and practices of participatory politics that it seems inevitable that reservations and concerns would rise to the surface. If only Nixon could go to China, then perhaps a group like ours has a particular obligation to call out the abuses, misuses, and failures of DIY culture and politics.

So, let me pass the microphone over to Kjerstin Thorson who will set up this special issue, and you can follow the links out to the individual posts.

Hotspot Philosophy

Welcome to the first of what we hope will be a series of Civic Paths “hotspots.” These collections of mini-blog posts are organized around themes that cut across the diverse interests of participants in our research group. They’re about the things we love to talk about. And, like our in-person conversations, they play with ideas at the intersection of participatory culture, civic engagement, and new media. Our rules for the hotspot are these: No one gets to spend a million hours wordsmithing -- these are idea starters, not finishers -- and posts shouldn’t be a whole lot longer than five hundred words.

Kicking it off: The Dark Side(s) of DIY

Don’t get me wrong: I love DIY. I muddled through the acquisition of basic sewing skills (thanks, Internet) to make a much-loved, crooked crib skirt for my daughter. My now-husband and I navigated the complexities of his immigration to the U.S. without hiring a lawyer, relying entirely on a discussion board about fiancée visas. Last year, we even put a fountain in our backyard (it was crooked, too).

In fact, I venture to say we all love DIY—and are genuinely excited about the role of new media technologies in amplifying the possibilities to make stuff, share stuff, spread stuff, and generally participate in public life in a million different ways. But we also believe that DIY (or at least the mythology of DIY) has some dark sides.

Liana [1] and Sam [2] remind us that just because you do it yourself doesn’t mean that what you make will find an audience, or even that what you make will be any good. Kevin [3] considers the often-fraught relationship some DIY practitioners have to potentially dubious funding streams, and Lana [4] points out that the business of DIY can often be the selling of awful. Andrew [5] looks at what happens when crowdfunding goes awry and DIY communities try to mete out justice online. Rhea [6] also examines online communities taking matters into their own hands, highlighting the misunderstandings and mishaps that get created in the process.

Neta [7] and I [8] share an interest in the ways that beliefs about DIY political knowledge—everyone should be a fact checker! Figure out everything for yourself!—may shut down possibilities for political engagement. Mike [9] takes on the contradictions behind the idea of DIY news, and Raffi [10] wonders whether the race to make and spread the pithiest, funniest political nuggets is taking away from other forms of online political talk.

With these posts, we hope to collectively shed light on some of the difficulties that arise from an otherwise celebrated mode of creation and engagement. And while we all love DIY and its range of possibilities for civic life, we think pulling back the curtain to show when it goes wrong is an important step in figuring out how DIY can take us even further in the future.

-- Kjerstin Thorson (Assistant Professor of Journalism)

[1] On Finding an Audience, or Why I'm Not a Rock Star, by Liana Gamber Thompson

[2] Producing Poop, by Sam Close

[3] Makerspaces and the Long, Weird History of DIY Hobbyists & Military Funding, by Kevin Driscoll

[4] Blogging and Boycotting in the "Schadenfreude Economy", by Lana Swartz

[5] Gatekeepers of DIY?, by Andrew Schrock

[6] The Role of Japanese & English-language Online Communities in the Mitsuhiro Ichiki Incident, by Rhea Vichot

[7] DIY Citizenship & Kony 2012 Memes, by Neta Kligler-Vilenchik

[8] Figure It Out for Yourself, by Kjerstin Thorson

[9] Why “DIY News” Could Be a Contradiction in Terms, by Mike Ananny

[10] Memed, Tumbled, & Tweeted, by Raffi Sarkissian