In our study of IMDb, we follow Jeacle and Carter (2011) by employing a “netnography”, which is a “…qualitative research methodology that adapts ethnographic research techniques to study the cultures and communities that are emerging through computer-mediated communications” (Kozinets, 2002: 62). This choice reflects the fact that the tools through which users of popular culture consume, deliberate on and hold conversations regarding particular products and services have changed radically with developments in online digital and social media, often to the extent that the line between ‘real’ and virtual interactions has become blurred (Kozinets, 2002; Mann and Stewart, 2000). As a result, online spaces have become crucial to understanding the nature of the discourse surrounding popular culture items (Beer and Burrows, 2007).
Our netnography involves an in-depth study of the reviews posted on IMDb forums with respect to particular films. There are three main reasons why these forums were useful in the context of our study. Firstly, we consider these forums to be a prime example of an online space that mediates and facilitates wide-ranging discussions on particular aspects of popular culture, in this case films, and allows users to create, share, collaborate and communicate. Secondly, we find that they represent “a series of acts representing the presentation of [the] self by those who have contributed to them” (Miley and Read, 2012: 707). Finally, the IMDb forums also appear to have performative qualities in terms of shaping and influencing the manner in which their users construct and evaluate their own preferences (Hine, 2000).
While the netnographic analysis of online activity on the IMDb website allowed valuable observations of users’ interaction with the site, it was relatively limited with regard to the collection of in-depth information on why reviewers found certain performance tools helpful or unhelpful in evaluating their preferences. Although there are opportunities on IMDb forums to respond to and comment on the reviews of specific users through follow-on reviews, such opportunities were largely confined to commenting on previous reviewers’ opinions of the film, not on how those reviewers evaluated or chose to view the film in the first instance. Therefore, we found that many of the IMDb reviews of relevance to our study represented one-way communication, with little interactive dialogue. As a result, our data collection comprised two main components: first, an observational netnographic analysis of online content, specifically film reviews on the IMDb website, and second, offline face-to-face interviews with filmgoers to add depth to our understanding of how evaluation tools such as IMDb were used by them (Bly et al., 2015; Kozinets et al., 2011).
3.1 Data collection

The netnographic data collection was conducted over a two-month period using a passive or ‘lurker’ approach in which the researchers did not reveal their research activity to the online participants on IMDb and did not participate in online exchanges on the site (Mkono, 2012). This enabled an unobtrusive data collection technique (Jeacle and Carter, 2011), whereby the online reviewers on IMDb remained unaware of the researchers’ activities,8 allowing for uninhibited observation of their interactions with the site (Langer and Beckman, 2005). Theoretical sampling was used to identify reviews of potential relevance to our study (Glaser and Strauss, 1967). Accordingly, reviews that explicitly stated how an IMDb user chose a specific film to watch, or that provided details about their decision-making process with regard to choosing a film, were identified and stored electronically for further analysis. We supplemented our netnography with a series of interviews conducted after the netnographic data collection and a preliminary analysis of it had been completed. In order to recruit prospective interviewees, we advertised our study on a selected number of online business course pages at a large metropolitan Australian university. Additionally, the first author attended postgraduate business classes at the same university, provided a brief presentation about the project, and asked for volunteers to contact him via email. We screened prospective interviewees to ensure that they enjoyed films, watched them regularly, and were prepared to give consent to be interviewed. As a result, twelve business students agreed to be interviewed. Interviewees were all aged between 18 and 30, and all indicated that they watched films either at the cinema or at home at least once a month. Each interviewee received a cinema ticket to compensate them for their time.
In order to enhance the reliability of the interview process, each interview was attended by at least two of the researchers (Pettigrew, 1988). Interviews were semi-structured and were guided by an interview protocol informed by preliminary analysis of the netnographic data. In contrast to the often static nature of the IMDb reviews collected as part of our netnographic study, the interviews were interactive and allowed the researchers to probe interviewees in depth with regard to their general experiences and use of online evaluation tools, as well as how such tools shape their film preferences and choices. In addition, interviewees themselves were afforded the opportunity to explore other issues that they considered relevant in the context of the researchers’ general line of enquiry.
Eleven of the twelve interviews were audio-recorded and later transcribed. In the case of the twelfth interview, the researchers took detailed notes which were written up the same day (cf. Chenhall et al., 2010). When the data from these interviews were considered in conjunction with the vast amount of data collected through the netnography, we were comfortable that theoretical saturation (cf. Glaser and Strauss, 1967) had been achieved by the end of the data collection process.
3.2 Data analysis

Netnographic and interview data were analysed with respect to how performance evaluation tools were used to shape and guide how users evaluated films. Analysis of the data took place iteratively over the entire data collection period. During this time, categories of relevance began to emerge. On completion of the netnography and interviews, interview transcripts and netnographic data were carefully analysed and reorganized around issues and categories of significance (Ahrens and Chapman, 2004; Creswell, 2007). Patterns were identified and considered in light of disconfirming evidence and interpretations (Creswell, 2007; Miles and Huberman, 1994). This enabled the development of three broad research themes, which were then developed into narrative form, and subsequently re-drafted and refined by the authors (Llewellyn, 1996; O’Leary and Smith, 2015; O’Sullivan and O’Dwyer, 2009).