original scientific article
UDC 316.775.3:179.8:004.738.5(497.4)
received: 2014-07-14

KEEPING HATE SPEECH AT THE GATES: MODERATING PRACTICES AT THREE SLOVENIAN NEWS WEBSITES

Igor VOBIČ
University of Ljubljana, Faculty of Social Sciences, Kardeljeva pl. 5, 1000 Ljubljana, Slovenia
e-mail: igor.vobic@fdv.uni-lj.si

Melita POLER KOVAČIČ
University of Ljubljana, Faculty of Social Sciences, Kardeljeva pl. 5, 1000 Ljubljana, Slovenia
e-mail: melita.poler-kovacic@fdv.uni-lj.si

ABSTRACT

The study explores the decision-making rationale of news websites' moderators who keep hate speech at the gates by reviewing and selecting users' comments for publication under news items. Using document analysis, newsroom observations and interviews, the study indicates a combination of traditional and network gatekeeping at three leading Slovenian news websites. The minimal measures adopted for regulating hate speech at 24ur.com, Siol.net and Rtvslo.si, together with their various gatekeeping mechanisms, call for a reconsideration of some central issues in contemporary social communication: the gatekeeping model and technological innovation, as well as the multivalent roles of news media in public life.

Keywords: online users' comments, hate speech, gatekeeping, self-regulation, journalism, Slovenia.

MANTENERE DISCORSO INCITANTE ALL'ODIO ALLE PORTE: PRATICHE DI MODERAZIONE DI TRE SITI DI NEWS SLOVENI

SINTESI

La ricerca esplora la logica e motivazioni dietro le decisioni di moderatori dei siti di news che tengono discorso incitante all'odio alle porte tramite il selezionamento di commenti da pubblicare nella sezione notizie. Usando analisi di documenti, osservazioni nella redazione, e interviste, la ricerca identifica la combinazione di gatekeeping tradizionale e quello di rete di tre principali siti di news sloveni. Le minime misure adottate per regolare il discorso incitante all'odio di siti 24ur.com, Siol.net, e Rtvslo.si e i loro meccanismi di gatekeeping invitano alla riconsiderazione di alcuni temi centrali alla comunicazione sociale contemporanea; cioè il modello di gatekeeping e innovazioni tecnologiche, così come i vari ruoli di news media nella vita pubblica.

Parole chiave: commenti degli utenti online, discorso incitante all'odio, gatekeeping, auto-regolamentazione, giornalismo, Slovenia.

INTRODUCTION

Comments under news items on news websites are the most popular as well as the most controversial form of audience participation (Ruiz et al., 2011). On the one hand, they provide an opportunity for citizens to engage in public debate about relevant issues; on the other, they represent an arena where hatred and offence can easily be expressed and disseminated. Previous studies of hate speech on the Internet mostly focused on monitoring, tracking and regulating hate speech (e.g., Nemes, 2002; Harris et al., 2009; Henry, 2009; Reed, 2009). Several studies analysed discourse on hate group websites (e.g., Duffy, 2003; Brown, 2009; Cammaerts, 2009; Meddaugh & Kay, 2009; McNamee, 2010), including people's perceptions of hate sites (Leets, 2001), while the problem of hate speech in news comments has mostly been neglected, as have the moderating practices. Investigating the dynamics between regulatory structure and gatekeeping agency in the context of hate speech in users' comments is relevant because it helps to identify the institutionalised boundaries of meaningful interaction online, the character of journalist-audience relations and the possibilities for deliberation on the websites of traditional media.
Some authors have researched comments' effects on readers (e.g., Lee, 2012), the problem of commenters' anonymity (e.g., Hlavach & Freivogel, 2011; Rosenberry, 2011; Shepard, 2011; Reader, 2012) and journalists' views on news comments (e.g., Singer et al., 2011; Santana, 2011; Nielsen, 2012; Loke, 2012), while others were concerned with other forms of inappropriate speech in news comments, such as offensive speech (Erjavec & Poler Kovačič, 2013) or impolite reader responses (Neurauter-Kessels, 2011). Only a few analysed the characteristics of hate speech in news comments (e.g., Erjavec & Poler Kovačič, 2012) or explored it as an ethical issue in journalism (Singer et al., 2011). Media rules about hate speech comments (as part of general media guidelines on audience participation) have also received only scant attention (e.g., Ruiz et al., 2011; Singer et al., 2011), as have media strategies of online content moderation (e.g., Reich, 2011; Hughey & Daniels, 2013). However, the rationale behind winnowing, reshaping or prodding user-generated content, including with regard to hate speech, has not yet been researched, as previous studies on transformations of gatekeeping in journalism (e.g., Lowery, 2009; Singer & Ashman, 2009; Barlow, 2010; Reich, 2011) predominantly focused on the power struggle in journalist-audience relations. Therefore, the goal of this study is to explore the decision-making rationale of news websites' moderators who review and select users' comments for publication under news items.

The study is placed at the intersection of the classical debates on freedom of the press and freedom of expression (e.g., Splichal, 2003) and discussions on the challenges to journalism's gatekeeping role in contemporary online communication contexts (e.g., Shoemaker & Vos, 2009). In this context, moderating users' comments and keeping hate speech at the gates is relevant for discussions on what Carpentier (2011, 30) calls "socio-communicative relationships" within the social, technological and institutional predispositions enabling joint content production and mutual reception online, as well as for the character of interactions in the public sphere. Thus, the study theoretically and empirically investigates the online media (self-)regulation framework with respect to users' comments and the corresponding practices of moderating hate speech in order to better understand the mechanisms behind gatekeeping in contemporary online communication. The research combines semi-structured interviews with comment moderators and editors at the three most visited Slovenian news websites (24ur.com, Siol.net and Rtvslo.si), observation in the newsrooms, and analysis of strategic documents on moderating online users' comments.

THEORETICAL BACKGROUND: MODERATING ONLINE USERS' HATE SPEECH COMMENTS

Online Media (Self-)Regulation Framework

Freedom of the press and freedom of expression are considered cornerstones of modern democracy (e.g., Splichal, 2003). In democratic societies, which are embedded in the tradition of social responsibility (Cammaerts, 2005), freedom is never boundless; it is treated as inseparable from responsibility. The goal of moderating comments, that is, "deleting or blocking those deemed offensive or unsuitable" (Goodman, 2013, 8), is to restore an appropriate balance between freedom and responsibility and thus to ensure a high quality of discussion, since comments can affect the way a news item is interpreted by readers. According to Anderson et al.
(2014, 383), online incivility may impede the goal of enriching public deliberation; impolite and incensed comments can polarise users based on value predispositions. Users' comments significantly alter readers' perceptions of an issue, independently or in conjunction with other factors (Lee, 2012, 32). Therefore, hate speech comments can cause damage. The purpose of regulating hate speech is to prevent interference with human rights and values, such as dignity, non-discrimination, equality, (effective) participation in public life, and freedom of expression, association or religion, and to prevent the occasioning of certain harms, such as psychological harm, damage to self-esteem, inhibited self-fulfilment or fear (McGonagle, 2013, 6). However, this purpose cannot be achieved merely through legal regulation, as laws cannot guarantee responsibility and quality in the media (see Bertrand, 1997, 12). Media self-regulation is essential because it helps to preserve the independence of the media, protects them from state interference, and drives up professional standards by requiring organisations to think about and develop their own standards of behaviour (Puddephatt, 2011, 12). Among self-regulatory mechanisms, professional codes of conduct are the most common, yet they are difficult to uphold. An important element of self-regulation is the professional guidelines adopted by media organisations as a matter of editorial policy (Puddephatt, 2011, 14), which can cover various issues in more detail. Adopting editorial guidelines on hate speech in comments is useful because they advise readers as well as guide and defend the moderation process; while some guidelines are rules about what readers cannot do, others offer more constructive advice for writing appropriate comments and articulating arguments (Goodman, 2013, 29).

In Slovenia, any incitement to national, racial, religious or other discrimination, the inflaming of national, racial, religious or other hatred and intolerance, and any incitement to violence and war are prohibited by the constitution (DZ RS, 1991, Article 63). The Criminal Law (DZ RS, 2012, Article 297) prescribes imprisonment for whoever publicly incites or stirs up hatred, violence or intolerance based on nationality, race, religion, ethnicity, gender, etc., when the act is committed in a way that threatens or disturbs public order and peace, or by means of threats or insults. The editor-in-chief, or a person acting as his/her deputy, can also be punished if such a criminal offence has been committed through the mass media. The Mass Media Act contains a provision which prohibits the dissemination of programme content that incites national, racial, religious, sexual or any other inequality, or violence and war, or incites national, racial, religious, sexual or any other hatred and intolerance (DZ RS, 2006, Article 8). Hate speech is considered unacceptable by media self-regulatory guidelines too. According to the Code of Slovenian Journalists, "inciting violence, spreading hatred and intolerance and other forms of hate speech are inadmissible. A journalist should not allow them; if this is not possible, he/she should immediately react and condemn them" (DNS & SNS, 2010, Article 21). The code also states that the editor-in-chief is responsible for the content of comments and other contributions from media users and should prepare rules for publishing comments; any comment which is not in compliance with the published rules must be deleted as soon as possible (DNS & SNS, 2010, Article 16).
In 2010, eight Slovenian online media organisations (Delo.si, Dnevnik.si, MMC, Siol.net, Vecer.com, Zurnal24.si, 24ur.com and Slovenskenovice.si) signed the Code for Regulation of Hate Speech in Slovenian Web Portals (SAFE, 2010/11). The code, which was prepared by the Centre for Safer Internet and its anti-hate speech internet point, the Web Eye, obliges the signatories to introduce registration of commenters as well as a system of content moderation. Web portals should include a warning that hate speech is against the Criminal Law and include a button to report hate speech comments. The Web Eye has also published a manual for moderators and editors of websites (Spletno oko, 2013). Signatories of the Code for Regulation of Hate Speech in Slovenian Web Portals have morally bound themselves to respect the legal provisions and ethical norms which prohibit hate speech. However, it has not yet been researched whether (and how) the code has been implemented in their media practices. If signing the code can be understood as the first step towards regulating hate speech on their sites, the second step should be adopting internal guidelines in line with the code. Therefore, our first research question is: How is the regulation of online users' hate speech comments under news items defined in the strategic documents of Slovenian online media?

Online Media Moderating Practices and the Concept of Gatekeeping

In the traditional journalistic culture, the term gatekeeper indicates editors' and journalists' claim to be the ones who decide what makes news. The gatekeeper role is maintained and enforced by professional routines and conventions which are meant to guarantee the quality of institutional journalism (Domingo et al., 2008, 326). However, new possibilities of audience participation through the media challenge the traditional gatekeeping of media and journalists. Namely, moderating users' comments is not a unilateral process, as these threads are more inclusive communication spaces than traditional participatory channels such as letters to the editor (e.g., Thurman, 2008; Reich, 2011). The latter were editorially governed by "journalistic logic", while users' comments are governed by "broader social standards" such as considerations of decency, civility, taste and legality (Reich, 2011, 97). In this context, the notion of gatekeeping calls for precise conceptual work if it is to be used as a tool for analysing and understanding the practices of moderating online users' comments. Specifically, it appears that moderating users' comments rests at the intersection of two debates on the transformations of gatekeeping in the media.

One group of scholars (e.g., Hermida, 2010; Bruns, 2009, 2011; Broersma & Graham, 2012; Graham et al., 2013) discusses gatekeeping in the context of larger alterations in communication where journalists are disappearing as "traditional gatekeepers of political discourse" (Graham et al., 2013, 85). As "people formerly known as the audience" assume more active roles in creating news and facilitating public debate, they are able to bypass traditional media when trying to link to political life (cf. Rosen, 2012). For instance, concepts such as "audience gatekeeping" (Shoemaker & Vos, 2009), "gatewatching" (Bruns, 2011) and "gatekeeping Twitter" (Bastos et al., 2013) indicate that journalists are losing gatekeeping privileges and that power is being dispersed among various actors in contemporary communication.
The other group of studies (e.g., Thurman, 2008; Thurman & Hermida, 2010; Reich, 2011; Lasorsa et al., 2012; Thurman & Newman, 2014) puts the contemporary gatekeeping transformations in the context of the newsroom. Although Boczkowski (2004) acknowledged the phenomenon of "gate-opening" a decade ago, these studies show that journalists are still not fully inclined to relinquish their gatekeeping role by sharing the stage with the heterogeneous network of news gatherers and commenters. In this context, journalists have started to rethink the services they provide to the public. As a result, they are increasingly adopting the "curator role" (e.g., Bruns, 2011; Pöttker, 2012) in order to overcome the monolithic character of traditional news provision, to adapt to the multi-perspectivity of the contemporary news environment, and to distinguish themselves from other actors while the gates are half-open.

As the phenomenon of users' comments merges the solid logic of traditional media with the always-on presence of online communication threads, the "network gatekeeping model" introduced by Barzilai-Nahon (2008) suggests itself as a useful conceptual framework. By considering the "ambient awareness" of contemporary communication (Kuwabara et al., 2002), where journalism - through its interactive websites and online social networks - is constantly connected with audiences, Barzilai-Nahon (2008, 1496-1497) adapts the gatekeeping framework by adding new terms and redefining traditional ones: (1) gate, i.e. an entrance to or exit from a network or its threads; (2) gatekeeping, i.e. the process of controlling information (e.g., selection, addition, withholding, display, channelling, shaping, integration, disregard and deletion) as it moves through a gate; (3) the gated, i.e. the entity subjected to gatekeeping; (4) gatekeeping mechanism, i.e. a tool, technology or method used to carry out the process of gatekeeping that defines the interactions between the gated and gatekeepers, bounding them to a particular structure of discourse; (5) gatekeeper, i.e. an entity that has the discretion to exercise gatekeeping through a gatekeeping mechanism and can choose the extent to which to exercise it contingent upon the gated.

Traditional media have adopted different strategies to deal with news comments which affect human dignity - from turning them off or not archiving them to requiring registration and moderating them in different ways (e.g., Hermida & Thurman, 2007; Reich, 2011; Hughey & Daniels, 2013; Goodman, 2013). Research on news websites in the UK found that media are increasingly shifting towards moderating user-generated content; more than two-thirds of the sites moderate comments, while those that do not moderate require registration (Hermida & Thurman, 2007, 9). According to a survey of media from 63 countries, only seven organisations do not allow comments, while 38 moderate pre-publication, 42 moderate post-publication, and 16 use a mixed approach (Goodman, 2013, 7). Similarly, an international comparative study (Reich, 2011, 113) reveals that news websites have developed two main strategies of moderation: an "interventionist strategy" of pre-moderating every comment despite heavy financial tolls, and an "autonomous strategy" of post-moderation as a response to the flood of comments. In Slovenia, Motl's (2009) interview-based research of editorial policies at six online media organisations revealed diverse approaches to hate speech regulation.
However, the study did not consider the practice of moderation with respect to its mechanisms (i.e., tools, technology and methods), which imply particular moderator-user interactions and the negotiation of users' comments as a particular communication space. To gain such a comprehensive insight into Slovenian news websites' moderation of users' comments, particularly regarding hate speech, our second research question is: How is the moderation of users' comments manifested at Slovenian news websites with regard to the main gatekeeping mechanisms?

METHODS

To explore the decision-making rationale of Slovenian news websites' moderators, who moderate comments published under news items, three methods were combined. The subjects of the research were the three most visited Slovenian news websites (24ur.com, Siol.net and Rtvslo.si) (MOSS, 2013), which are also signatories to the Code for Regulation of Hate Speech in Slovenian Web Portals (SAFE, 2010/11). According to the code (ibid.), the signatories are obliged to require registration for commenters and to moderate their contents. Submitting comments should be carried out through a form which contains a clear provision that the Criminal Law (Article 297) prohibits public incitement of hatred, violence or intolerance based on nationality, race, religion, ethnicity, gender, etc. In the comments section, a "report hate speech" button should be included for anonymous reporting of hate speech. The guidelines in this code provide the minimal level of measures for regulating hate speech on web portals, while the signatories can also adopt additional measures.

To answer the first research question and thus establish what regulations of hate speech comments have been adopted by the code's signatories, an analysis of the main documents that formalise the rules for publishing comments at the three news websites was performed. Document analysis can be defined as a systematic procedure for reviewing or evaluating documents which, like other analytical methods in qualitative research, requires data to be examined and interpreted in order to elicit meaning, gain understanding and develop empirical knowledge (Bowen, 2009, 27).

In order to explore the second research question, from March to May 2014 we conducted observations at 24ur.com, Rtvslo.si and Siol.net. Focusing on gaining an insider's view of which gatekeeping mechanisms are used by the moderators and how moderating decisions are reached and negotiated, we entered the small-scale institutional setting for 10 work shifts (four each at Rtvslo.si and 24ur.com; two at Siol.net) and took the role of "observers-as-participants" (Gold, 1958). Thus, in the field we were known and recognised, and we related to the subjects solely as researchers. Due to the rather brief observation periods, we had to be systematic in gathering, assembling and analysing directly witnessed data (cf. Neuman, 2006). The first step of the process was to set down what was experienced, based on full field notes containing memos and notes taken in the newsroom. The second step was detached from the field and done after each observation, when we compared what was observed that day to what had been previously observed and arranged the data within an observational scheme organised according to the second research question. The third stage took place after the observations were completed, when we started to conceptually analyse the collected data and to synthesise the data from the field within the study's framework.
Additionally, in May 2014 we conducted qualitative interviews with a total of seven moderators from 24ur.com, Rtvslo.si and Siol.net and their online executive editors in order to gain their perspectives on moderating through explanations, stories and accounts, as well as to acquire comments on the observational data (Lindlof & Taylor, 2002). The semi-structured conversations were based on an interview guide, but remained open enough to allow new ideas to be brought up during the interviews (Morse, 2012). We used three types of questions for these particular research purposes (Legard et al., 2003; Flick, 2006; Wang & Yan, 2012). Content-mapping questions were used to start the conversation on the topic rather loosely, i.e. questions on the interviewees' moderating experiences and working routines. Then we asked theory-driven questions based on the study's conceptual framework, i.e. questions through which the mechanisms of gatekeeping hate speech were reconsidered. Finally, content-mining questions responded to the notions the interviewee had presented up to that point in order to critically re-examine them, i.e. questions on discrepancies and connections between formalised rules and moderating mechanisms. After all interviews were conducted, we applied McCracken's (1988) five-step process of qualitative interview analysis. Through careful reading, preliminary descriptive and interpretative categories were developed. Later, through thorough examination of these codes, connections and patterns in the narratives were identified. Further, by examining clusters of comments, basic themes were determined. Lastly, we examined themes from all interviews across such groupings to delineate the predominant ones in relation to the second research question.

RESULTS

Regulation of Hate Speech Comments in Online Media Strategic Documents

Analysis of the strategic documents which regulate websites' content at 24ur.com, Siol.net and Rtvslo.si shows that all three media have adopted the minimal measures for regulating hate speech, as defined by the Code for Regulation of Hate Speech in Slovenian Web Portals (SAFE, 2010/11): (1) requiring registration for commenters; (2) moderating comments; (3) submitting comments through a form which contains a clear provision that the Criminal Law (Article 297) prohibits public incitement of hatred, violence or intolerance based on nationality, race, religion, ethnicity, gender, etc.; and (4) a "report hate speech" button in the comments section.

24ur.com. The news website of the private company Pro Plus has introduced registration for commenters and the moderation of comments. When registering, a user has to agree with the General Terms of Using Web Portals of the Company Pro Plus d. o. o. (Pro Plus, b), which explain that comments are moderated and that decisions regarding enabling comments under particular news items are within the competence of the newsroom. A link to the Rules for Publishing Comments (Pro Plus, a) is part of the general terms, and the link can also be found between each news item and its comments section. There is also a provision stating that an individual is, according to Article 297 of the Criminal Law, responsible for public incitement of hatred, violence or intolerance. A "report hate speech" button is placed next to it. By pressing the button, a user anonymously reports hate speech to the Web Eye, which checks the comment and reports it to the police if it contains elements of a criminal offence.
The Rules for Publishing Comments (Pro Plus, a) have been adopted with the intention of making a positive contribution to public discussion. In most paragraphs, they are a Slovenian translation of the BBC's House Rules. The word "rules" in the text links directly to the BBC rules, yet they are not explicitly cited as the source. Even though the expression "hate speech" is not used, Pro Plus reserves the right to reject comments which could "severely disturb, provoke, attack or offend other users" or "are racist, sexist, homophobic" (Pro Plus, a). Comments are moderated in two ways: (1) post-moderation (all comments appear on the web immediately and are checked afterwards), and (2) reactive moderation (a comment is checked reactively if a complaint has been received about it). If a user notices a comment that may break one of the house rules, he/she can alert the moderators at moderator@pop-tv.si. The time needed to review comments depends on the number and length of comments and in most cases takes a few minutes. Sometimes a comment is sent for further review to an editor or to members of the newsroom who are in charge of moderation. If a comment is removed, the commenter is notified by email. A user's account can also be blocked for a period of one month. In extreme cases of racist, sexist, homophobic, offensive or otherwise objectionable content, an account can be immediately and permanently closed. According to the rules, moderation is performed by a team of trained moderators, and a comment is never removed without being read and reviewed by a moderator or an editor. However, filters are also used to prevent the publication of certain offensive words or to detect comments which may breach the rules. In such a case, a user cannot post his/her comment because it contains problematic keywords.

Siol.net. At the news website of Telekom Slovenije, commenters also need to register to post comments. By registering they bind themselves to an agreement that they will not publish hate speech comments (TSmedia). They can report inappropriate comments by pressing a flag on the right side of a comment or by sending an e-mail to moderator@tsmedia.si. A text titled Commenting and Online Manners at Planet Siol.net is published between each news item and its comments section, and it contains a link to their "rules of commenting", i.e. a document entitled Tolerant and Safe Environment for Discussions Based on Arguments (Planet Siol.net). A provision that the Criminal Law (Article 297) prohibits public incitement of hatred, violence or intolerance is also stated there, as well as a link to the editor's column about commenting, addressed to anonymous commenters at Siol.net (Urbas, 2014). The rules of Planet Siol.net aim to provide readers with a tolerant and cultured environment for discussing topics related to news items. They explain the system of deleting inappropriate comments and restricting access to posting comments. Moderation is performed both automatically, using different algorithms, and manually. Comments can be placed in a waiting queue and remain there for different periods of time, depending on the number of comments. The rules also provide recommendations for tolerant communication, including a statement that hate speech, both direct and covert, has no place on their forum, since critiques of an organisation, an individual or a group can be expressed without attacks and hatred. According to the document, hate speech in any form will not be allowed.
Those who repeatedly breach the rules lose access to commenting temporarily or permanently; in the case of "extreme hate speech", this measure is carried out without prior e-mail notification.

Rtvslo.si. The news website of the public broadcaster, Radio-Television Slovenia, has also established a system of registration and moderation. A "report hate speech" button is placed between a news item and its comments section. Pressing the button opens a document which contains a provision that an individual is, according to Article 297 of the Criminal Law, responsible for public incitement of hatred, violence or intolerance. The report is anonymous and is sent to the Web Eye. There is also a link to the Standards and Rules of Communication on the Website Rtvslo.si (Rtvslo.si, 2014), accompanied by an explicit request to respect these rules and not to use hate speech. When registering, a user has to confirm that he/she agrees with these rules and is aware that hate speech is forbidden by the constitution and legislation in Slovenia. According to the document (Rtvslo.si, 2014), users' comments are published directly and are not pre-moderated. Users who seriously or frequently violate the rules or intentionally ignore them can be warned by administrators, put under supervision, or have their username blocked. An administrator has the right to remove a comment which violates the rules.

Moderating Hate Speech Comments on News Websites

Observations of moderating practices and interviews with moderators and online executive editors reveal gatekeeping mechanisms (i.e., tools, technology and methods) that define live interactions between moderators (the gatekeepers) and users (the gated), bounding them to a particular structure of online discourse. According to the interviewees, moderating hate speech involves the primary mechanisms of moderators' gatekeeping of users' comments. 24ur.com and Rtvslo.si have teams dedicated solely to moderating comments, while at Siol.net moderation is part of the multitasking of the online executive editor's deputy and one online journalist. All three websites perform automated moderation, post-moderation and reactive moderation. Only Rtvslo.si, in the case of users "under control", pre-moderates all their comments. The following dissects seven mechanisms of keeping hate speech at the gates of the respective online media with regard to their slightly distinct technologies and methods, the particular negotiated gatekeeper-gated relations, and different understandings of users' comments as a particular communication space.

Disabling comments. According to the gathered data, the three media disable users' comments in order to limit the communication space for anticipated hate speech or to "stop the floods of hatred" (moderator Rtvslo.si A). During observations and interviews at 24ur.com, moderators stressed that they "anticipate hate speech under certain content" (moderator 24ur.com A). "When there is news on Roma or, recently, on sexual assaults in India, I go to the editor and ask to close comments for such an item. Comments otherwise lose their prime purpose - discussing and expressing opinion." (ibid.) Siol.net disables comments not before a certain news item is published but two days afterwards, when its "lifespan" supposedly ends (online executive editor Siol.net). "That is also because we have a small team that is not completely dedicated to moderating. I have other tasks as an editor and it happens that I can overlook something."
(moderator Siol.net A) Yet, at Rtvslo.si moderators and editors agree that "conflicting topics" (moderator Rtvslo.si B) are good opportunities to present their hate speech moderating practices and educate moderators. However, during one of the observations the editor decided to disable commenting under an item on setting up a memorial statue for the Slovenian Home Guard, who were Nazi collaborators during the Second World War. "Only in rare cases do we decide to do that. When there is nothing else but a spitting war full of hatred." (moderator Rtvslo.si B)

Forbidding specific words and phrases. 24ur.com uses a system that prevents comments containing "forbidden words" (online executive editor 24ur.com) from being published, while Siol.net uses a semantic system that, on the basis of algorithms, "sets the tone for discussing in the community" (moderator Siol.net A). Yet, the interviewees see these automated gatekeeping mechanisms similarly: as "a minor help" (moderator 24ur.com B) that is "easily bypassed" by the users (moderator Siol.net A). 24ur.com moderators have stopped supplementing the list of forbidden words with new examples, as it appears to be a "Sisyphean task" (online executive editor 24ur.com). As examples from the observations indicate, users are inventive and they "use spaces, punctuation and numbers to camouflage offensive or hate speech" (moderator 24ur.com B). Furthermore, the interviewees agree that mere words do not build meanings: "A certain word or phrase means different things in different contexts. For instance, 'go home' can be an example of hate speech if it refers to a certain national minority, or a completely normal phrase." (moderator 24ur.com C) Siol.net uses semantics to help moderators by sorting comments with "forbidden words and phrases" into a "pending folder" for pre-moderation (moderator Siol.net A): "This additional sieve learns over time on the basis of moderators' decisions. However, it can be bypassed - some users discuss which words are identified as unsuitable by the system. /.../ Yet, our system is produced by a global provider, therefore it is not adjusted to the Slovenian language, making it a bit clumsy." Rtvslo.si has recently "started to consider the options" (online executive editor Rtvslo.si) of semantic technology.

Winnowing, removing and reshaping comments. When registered users write comments in the management system, they go through the gates by publishing them and are only then subjected to moderation. The moderators mostly agree that pre-moderation would be a better way to keep hate speech at the gates, but only in principle. "In practice", says the online executive editor of 24ur.com, "this would kill interactivity and also demand a larger moderating workforce which we cannot afford". In rare instances moderators at 24ur.com also reshape comments in line with the rules. At Siol.net, moderators gatekeep comments while performing journalistic or editorial tasks: "I winnow comments on the website on the basis of my feeling - there are themes that I know will spur a lot of problematic comments. These comments are then erased." (moderator Siol.net A) At 24ur.com, they continuously refresh a joint list of newly published comments and by winnowing they decide whether to "accept" or "hide" them. "There are differences among us - others tell me that I am not strict enough. Particularly when it comes to Roma - I have a lot of experience with them. /.../ We try to overcome these differences at our occasional meetings."
(moderator 24ur.com C) At Rtvslo.si, moderators combine both practices - they skim through the online news items and simultaneously follow the list of published comments via the management system. "Moderating happens post festum. Time pressure is something we are used to. /.../ Sometimes, if I overlook a hateful comment, others follow immediately. It's a Sisyphean task." (moderator Rtvslo.si A) Unlike the others, at 24ur.com they also reshape comments by replacing signifiers of offensive or hate speech with an asterisk. "When doing that, the meaning should not change. /.../ And also, I do not upset the user as much as I would if I hid the comment - he would write emails or even call and demand an explanation." (moderator 24ur.com B)

(In)direct connecting with users. Observations and interviews reveal indirect and direct gatekeeper-gated connections, which have long-term implications for moderators' immediate decisions and for gatekeeping hate speech as a cultural practice at the three websites. Indirect connections are initiated by moderators as well as users. First, at 24ur.com, each hidden comment results in an automated e-mail to the user citing the rules. "We do not send personal e-mails or other messages. They would understand that as provocation and counter-attack. They often respond with aggression already. We do not respond to those e-mails. /.../ Our role is not to educate them." (moderator 24ur.com B) Then, at Siol.net, users connect with moderators through "flagging". "When a certain number of flags is ticked, a comment goes back to pending - to be moderated again. It is when the community reconnects and excludes a hostile and intolerant user, which is positive." (moderator Siol.net A) Finally, users of the three websites send anonymous reports to the Web Eye, which then redirects them back to the moderators. "When we started with the Web Eye, there were many reports. Now they are rare. And most of the reports do not make any sense." (moderator Rtvslo.si C) Direct connections are also initiated by both groups of actors. For instance, moderators at Rtvslo.si send "personal messages" through the management system to users whose comments have been deleted. "I see this as an opportunity to advise users and improve the culture of commenting - this is important for us as a public service. Most of them understand that. There are others, however, who continue with the hatred." (moderator Rtvslo.si A) Similar connections are initiated by Siol.net moderators, but through e-mails. Further, observations at the three media organisations show that users also try to connect with the moderators by signalling hate speech in other users' comments within their own comments, as well as through a "report improper content" tool (at Rtvslo.si).

Supervising users. According to the observational and interview data, moderators of the three websites follow some commenters more closely than others, implying that the gatekeepers have developed particular relations with the gated. Rtvslo.si places users "under control" formally, while 24ur.com and Siol.net "pay more attention to some commenters" informally (moderator Siol.net A). Commenters who continue publishing hate speech are systematically pre-moderated by the gatekeepers, who take "full responsibility" (moderator Rtvslo.si B) for the comments. "This system is great, because some just continue to try publishing unacceptable comments, while some take it seriously and become polite. After a while some even ask us to stop pre-moderating them. And we do that."
(moderator Rtvslo.si C) On the other hand, 24ur.com and Siol.net moderators only "place some users under the magnifying glass" (moderator Siol.net A) and "follow those with whom you have history" (moderator 24ur.com A). At 24ur.com, moderators even stress that they are stricter: "Users comment in a particular fashion. You learn whose comments you should hide. I mean, hide all their comments." (moderator 24ur.com A)

Blocking users. Observations and interviews indicate that moderators of the three websites disable commenting rights for users who continuously use hate speech or otherwise breach the rules. They more or less agree that closing the gates for such users is an effective mechanism, but only to a degree, because blocked users register once again as a "clone" (moderator Rtvslo.si C), with a different username, e-mail address and dynamic IP. The interviewed moderators acknowledge that blocking a user is a follow-up mechanism of formal or informal user control - some call it a "red card, like in football" (moderator Siol.net A). At Rtvslo.si and Siol.net the moderating system alerts the moderator if a user with the same name or IP as a blocked one tries to register. "Well, this is not completely reliable. A lot of internet users have dynamic IPs. When there is an IP similarity with a user blocked a couple of years ago, we do not make trouble." (moderator Rtvslo.si B) However, some moderators see blocking users as "completely useless" (moderator 24ur.com A). For instance: "I do it rarely. I used to block users more. But now I know that they register once again with different credentials and IP. There is no point." (ibid.)

Erasing all comments. While at 24ur.com and Rtvslo.si comments under news items are archived together with journalistic online content, Siol.net erases all comments seven days after publication of the news item. Although this is the website's policy, the interviewed online executive editor and a moderator, who is also his deputy, understand this mechanism differently. The former says that "it has nothing to do with the moderating practices" and only with comments "not being historically worthwhile" - "maybe only to researchers" (online executive editor Siol.net). The latter, however, stresses that the erasure of all comments reflects "the moderating dilemmas that cannot be overcome" (moderator Siol.net A). "When I go back to check the comments again, there are some that I would remove. There is so much news and so many comments that it is impossible to clean everything. Comments get misjudged and overlooked."

DISCUSSION AND CONCLUSION

By investigating the online media (self-)regulation framework and the practices of moderating hate speech in online users' comments, the study indicates a combination of "traditional" (cf. Shoemaker & Vos, 2009) and "network" gatekeeping (cf. Barzilai-Nahon, 2008) at three leading Slovenian news websites. These practices can be identified as initial automated moderation, prevalent post-moderation, occasional reactive moderation and limited pre-moderation, which construct an enduring online communication space through the dynamic relationship between moderators (gatekeepers) and online users (the gated). Moreover, the minimal measures adopted for regulating hate speech at 24ur.com, Siol.net and Rtvslo.si and their various mechanisms of keeping hate speech at the gates signal the study's multivalent contribution to the existing body of literature.
Namely, new conceptual perspectives on gatekeeping and technological innovation and on the roles of news media in public life have been developed, while an innovative methodological framework allowed us to gain fresh empirical insights into online users' comments and their moderation.

In terms of the conceptual work, the investigation of moderating users' comments on news websites and keeping hate speech at their gates indicates that the interactional features of the digital communication environment open the potential for disruption of the one-way and linear journalism-audience communication relations characteristic of the mass media world (e.g., Bruns, 2009; Allan & Thorsen, 2009; Rosenberry & St. John III, 2010; Singer et al., 2011; Jones & Salter, 2012). More specifically, the study of gatekeeping users' comments shows that moderators are "guarding open gates" (Hermida, 2011) in an attempt to ensure responsible behaviour and enhance opportunities for meaningful interaction. This is, in many ways, a Sisyphean task, as some interviewees also characterised it, and indicates what can be conceptualised as a four-way gatekeeping of hate speech that is articulated in nuanced relations between structures, such as time, financial resources and work organisation, and human agency. First, news websites' use of technology, such as disabling comments and/or users and automatically rejecting certain expressions, helps moderators keep anticipated hate speech away from the gates. Second, news websites control information and interpretation behind the gates through post-moderation, where the gated are welcomed into or pushed out of online communication threads. Third, the blocking of particular users and the rare examples of pre-moderation at one of the websites do not imply only traditional gatekeeping relations, but rather particular gatekeeper-gated connections based on institutional (self-)regulation and moderators' individual experience. Fourth, when a moderator overlooks hate speech or misjudges a decision, commenting users alert the gatekeeper to reconsider pushing a certain comment or user back through the gate.

Additionally, the four-way gatekeeping of users' comments, i.e. converging automated, pre-, post- and reactive moderation, also reflects journalism's trouble in (re)engaging with the people to whom it is primarily responsible. This calls for a conceptual reconsideration of news media's roles in contemporary social contexts where customisation, multiplication and reinterpretation of news appear as salient trends in communication (cf. Jones & Salter, 2012). Accordingly, in the sense of what Dahlgren (2014) calls a "multi-epistemic world", it appears that the classical paradigmatic framework, within which journalism informs and interprets social reality for the people to make judgments about the issues of the day, needs to be reconsidered at the very least. Namely, one can identify an "ambient" character in online communication where "broad, asynchronous, lightweight and always-on" (Hermida, 2010) systems, such as users' comments on news websites, create various kinds of interactions around and within the news and enable citizens to re-develop a complex mental model of the news and commentary. Half-open gates, in the case of hate speech moderation, reflect the scrambling of the traditional boundaries between journalism and non-journalism, where facts and opinions, debates, gossip, nonsense, misinformation, hatred and insult, the insightful, the deceptive and the poetic are all mixed together.
In this context, journalism needs to ensure high-quality discussion by restoring an appropriate balance between freedom and responsibility - only then might journalism overcome the contemporary "crisis of authority" (Gitlin, 2009) and restore its political and cultural relevance in societal life.

From the methodological perspective, the study shows the usefulness of a combination of methods which has not been used in previous research on moderating online users' comments and hate speech. This combination enabled us to gain comprehensive insights into the decision-making rationale of news websites' moderators. With document analysis, we identified the formal regulatory measures and the social rules embodied in them, but not necessarily the reasoning behind them. In this context, newsroom observations allowed us to directly witness a work environment where moderators struggle between structural conditions and human agency, enabling us to identify the practical implementation of these measures and also to reveal additional gatekeeping mechanisms. Additionally, interviews were used to verify the data collected with the previous two methods and, by gathering the actors' interpretations of moderating practices, they also proved useful in this Slovenian ethnographic study, especially with regard to how the abstraction of hate speech shaped moderators' decision-making.

At the empirical level, on the basis of this study one could argue that pre-moderation would eliminate all the problems of keeping hate speech at the gates, although there is no clear evidence of that in previous international comparative research (e.g., Hermida & Thurman, 2007; Reich, 2011; Hughey & Daniels, 2013; Goodman, 2013). While moderating all users' comments before publication would give 24ur.com, Siol.net and Rtvslo.si the privileges of traditional gatekeepers, such measures might also deepen other journalistic and business issues of online media that appear across national contexts, including in Slovenia. Namely, narrowing down the possibilities for hate speech normalisation gives space and recognition to more meaningful exchanges, but also raises classical questions of selection criteria and the nature of user incorporation, placed at the intersection between the "conservatism of the journalistic profession" (Waisbord, 2014, 212) and journalism's attempt to serve "as a common forum for debate" (Dahlgren, 2010, 5). In Slovenia, online journalism has been struggling to provide meaningful participatory spaces while retaining the role of a central provider of information and interpretation (cf. Vobič, 2013), and tightening moderation would thus only deepen the dilemmas between professional control and openness. Simultaneously, tightening online control over the boundaries of discussions demands additional expenses for greater moderating activity and workforce, which would probably result in a decline in the intensity of interactive exchange between media and audiences and a simultaneous fall in the frequency of users' online engagement, one of the primary business signifiers of the success of online journalism (Singer et al., 2011). In addition, in the Slovenian context, the approach where every click counts has made the market motive a crucial element in deciding not to impose more restrictions on inappropriate speech online (cf. Erjavec & Poler Kovačič, 2012; 2013).
Nevertheless, although more gatekeeping control would deepen the dilemmas of (online) journalism in the short term, pre-moderation does not per se exclude positive political and cultural implications for public online reasoning in the long run. Despite this study's limited scope, the investigation of the (self-)regulation framework and practice of moderating users' comments, with a particular focus on hate speech, indicates journalism's struggles to cope with inherently transgressive, boundary-breaking and all-eroding social communication, and calls for further scholarly attention. Future explorations of journalism's connections with the "people formerly known as the audience" (Rosen, 2012) would benefit from a combination of different standpoints - from theories of the public, critique of the political economy of communication and critical discourse analysis, to identity formation. As such, integrative research attempts in journalism research would need to, first, break down the long-standing boundaries between the journalistic production processes, news as text and discourse, and people's engagement with/through journalism, and, second, perform a methodological makeover by borrowing from qualitative and quantitative methodological traditions to gain cross-contextual insights.

ZADRŽEVANJE SOVRAŽNEGA GOVORA NA VRATIH: PRAKSE MODERIRANJA NA TREH SLOVENSKIH NOVIČARSKIH SPLETNIH MESTIH

Igor VOBIČ
Univerza v Ljubljani, Fakulteta za družbene vede, Kardeljeva pl. 5, 1000 Ljubljana, Slovenija
e-mail: igor.vobic@fdv.uni-lj.si

Melita POLER KOVAČIČ
Univerza v Ljubljani, Fakulteta za družbene vede, Kardeljeva pl. 5, 1000 Ljubljana, Slovenija
e-mail: melita.poler-kovacic@fdv.uni-lj.si

POVZETEK

Študija je utemeljena na prepletu klasičnih razprav o svobodi tiska in svobodi izražanja ter diskusij o izzivih odbirateljske vloge novinarstva v sodobnih kontekstih internetnega komuniciranja. Avtorja proučujeta utemeljevanje odločanja moderatorjev na novičarskih spletnih mestih, ki sovražni govor zadržujejo na vratih tako, da komentarje uporabnikov pregledujejo in jih izbirajo za objavo pod spletnimi novicami. Z uporabo analize dokumentov, opazovanj v uredništvih in intervjujev študija prepoznava kombinacijo tradicionalnega in omrežnega odbirateljstva na treh v Sloveniji vodilnih novičarskih spletnih mestih. Trajno internetno komunikacijsko okolje se namreč konstruira skozi izhodiščno avtomatizirano moderacijo, prevladujočo pomoderacijo, občasno odzivno moderacijo in omejeno predmoderacijo, ki nakazuje dinamično naravo odnosov med moderatorji (odbiratelji) in internetnimi uporabniki (odbranimi). To v številnih pogledih Sizifovo delo, kot ga označujejo tudi nekateri intervjuvanci, razkriva štiri načine odbiranja sovražnega govora, ki se artikulirajo v raznolikih odnosih med strukturami, kot so čas, finančna sredstva in organizacija dela, ter človekovo dejavnostjo. Sprejeti minimalni ukrepi za reguliranje sovražnega govora na 24ur.com, Siol.net in Rtvslo.si ter različni odbirateljski mehanizmi zahtevajo vnovičen razmislek o nekaterih osrednjih vprašanjih družbenega komuniciranja, tj. odbirateljskega modela in tehnoloških inovacij ter mnogotere vloge novičarskih medijev v javnem življenju.

Ključne besede: komentarji internetnih uporabnikov, sovražni govor, odbirateljstvo, samoregulacija, novinarstvo, Slovenija.

REFERENCES

Allan, S., Thorsen, E. (Eds.) (2009): Citizen Journalism. New York: Peter Lang.
Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., Ladwig, P. (2014): The Nasty Effect. Journal of Computer-Mediated Communication, 19, 3, 373-387.
Barlow, A. (2010): The Citizen Journalist as Gatekeeper. In: Rosenberry, J., St. John III, B. (Eds.): Public Journalism 2.0. London, Routledge, 45-55.
Barzilai-Nahon, K. (2008): Toward a Theory of Network Gatekeeping. Journal of the American Society for Information Science and Technology, 59, 9, 1493-1512.
Bastos, M. T., Raimundo, R. L. G., Travitzki, R. (2013): Gatekeeping Twitter. Media, Culture & Society, 35, 2, 260-270.
BBC: House Rules. Http://www.bbc.co.uk/blogs/legacy/moderation.shtml (25. 5. 2014).
Bertrand, C.-J. (1997): Quality Control: Media Ethics and Accountability Systems. Paris: Presses Universitaires de France. Http://www.machineru.polpred.ru/free/qc/book.pdf (8. 7. 2014).
Boczkowski, P. J. (2004): Digitizing the News. Cambridge: MIT Press.
Bowen, G. A. (2009): Document Analysis as a Qualitative Research Method. Qualitative Research Journal, 9, 2, 27-40.
Broersma, M., Graham, T. (2012): Social Media as Beat. Journalism Practice, 6, 3, 403-419.
Brown, C. (2009): WWW.HATE.COM. Howard Journal of Communications, 20, 2, 189-208.
Bruns, A. (2009): Gatewatching. New York: Peter Lang.
Bruns, A. (2011): Gatekeeping, Gatewatching, Real-Time Feedback. Brazilian Journalism Research, 7, 11, 117-136.
Cammaerts, B. (2009): Radical Pluralism and Free Speech in Online Public Spaces. International Journal of Cultural Studies, 12, 6, 555-575.
Carpentier, N. (2011): The Concept of Participation. CM, 21, 13-36.
Dahlgren, P. (2010): Charting the Evolution of Journalism. Media Studies, 1, 1-2, 3-17.
Dahlgren, P. (2014): Social Media and Political Participation. In: Fuchs, C., Sandoval, M. (Eds.): Critique, Social Media and the Information Society. London, Routledge, 191-202.
DNS & SNS (2010): Kodeks novinarjev Slovenije. Http://www.razsodisce.org/razsodisce/kodeks_ns_txt.php (21. 2. 2014).
Domingo, D., Quandt, T., Heinonen, A., Paulussen, S., Singer, J. B., Vujnovic, M. (2008): Participatory Journalism Practices in the Media and Beyond. Journalism Practice, 2, 3, 326-342.
Duffy, M. E. (2003): Web of Hate. Journal of Communication Inquiry, 27, 3, 291-312.
DZ RS (1991): Ustava RS. Http://www.dz-rs.si/wps/portal/Home/PoliticniSistem/URS/besedilo (21. 2. 2014).
DZ RS (2006): Zakon o medijih. Http://www.uradni-list.si/1/objava.jsp?urlid=2006110&stevilka=4666 (21. 2. 2014).
DZ RS (2012): Kazenski zakonik. Http://www.uradni-list.si/1/content?id=109161 (21. 5. 2014).
Erjavec, K., Poler Kovačič, M. (2012): "You Don't Understand, This is a New War!" Mass Communication & Society, 15, 6, 899-920.
Erjavec, K., Poler Kovačič, M. (2013): Abuse of Online Participatory Journalism in Slovenia. Medijska istraživanja, 19, 2, 55-73.
Flick, U. (2006): An Introduction to Qualitative Research. London: Sage.
Gitlin, T. (2009): Journalism's Many Crises. Journalism in Crisis Conference, 25 May. London, Westminster University.
Gold, R. (1958): Roles in Sociological Field Observations. Social Forces, 36, 3, 217-223.
Goodman, E. (2013): Online Comment Moderation. Darmstadt: World Association of Newspapers.
Graham, T., Broersma, M., Hazelhoff, K. (2013): Closing the Gap? In: Scullion, R., Gerodimos, R., Jackson, D., Lilleker, D. (Eds.): The Media, Political Participation and Empowerment. London, Routledge, 71-88.
Harris, C., Rowbotham, J., Stevenson, K. (2009): Truth, Law and Hate in the Virtual Marketplace of Ideas. Information & Communications Technology Law, 18, 2, 155-184.
Henry, J. S. (2009): Beyond Free Speech. Information & Communications Technology Law, 18, 2, 235-251.
Hermida, A. (2010): Twittering the News. Journalism Practice, 4, 3, 297-308.
Hermida, A. (2011): Fluid Spaces, Fluid Journalism. In: Singer, J. B., Hermida, A., Domingo, D., Heinonen, A., Paulussen, S., Quandt, T., Reich, Z., Vujnovic, M. (Eds.): Participatory Journalism. West Sussex, Wiley-Blackwell, 177-191.
Hermida, A., Thurman, N. (2007): Comments Please. Https://online.journalism.utexas.edu/2007/papers/Hermida.pdf (26. 1. 2014).
Hlavach, L., Freivogel, W. H. (2011): Ethical Implications of Anonymous Comments Posted to Online News Stories. Journal of Mass Media Ethics, 26, 1, 21-37.
Hughey, M. W., Daniels, J. (2013): Racist Comments at Online News Sites. Media, Culture & Society, 35, 3, 332-347.
Jones, J., Salter, L. (2012): Digital Journalism. London: Sage.
Kuwabara, K., Ohguro, T., Watanabe, T., Itoh, Y., Maeda, Y. (2002): Connectedness Oriented Communication. Symposium on Applications and the Internet, 28 January-1 February. Washington: IEEE Computer Society.
Lasorsa, D. L., Lewis, S. C., Holton, A. E. (2012): Normalizing Twitter. Journalism Studies, 13, 1, 19-36.
Lee, E.-J. (2012): That's Not the Way It Is. Journal of Computer-Mediated Communication, 18, 1, 32-45.
Leets, L. (2001): Responses to Internet Hate Sites? Communication Law & Policy, 6, 2, 287-317.
Legard, R., Keegan, J., Ward, K. (2003): In-depth Interviews. In: Ritchie, J., Lewis, J. (Eds.): Qualitative Research Practice. London, Sage, 138-169.
Lindlof, T., Taylor, B. (2002): Qualitative Communication Research Methods. London: Sage.
Loke, J. (2012): Old Turf, New Neighbours. Journalism Practice, 6, 2, 233-249.
Lowery, W. (2009): Institutional Roadblocks: Assessing Journalism's Response to Changing Audiences. In: Papacharissi, Z. (Ed.): Journalism and Citizenship. London, Routledge, 44-68.
McCracken, G. (1988): The Long Interview. London: Sage.
McGonagle, T. (2013): The Council of Europe against Online Hate Speech: Conundrums and Challenges. Expert Paper. Http://www.ivir.nl/publications/mcgonagle/Expert_paper_hate_speech.pdf (21. 2. 2014).
McNamee, L. G., Peterson, B. L., Pena, J. (2010): A Call to Educate, Participate, Invoke and Indict: Understanding the Communication of Online Hate Groups. Communication Monographs, 77, 2, 257-280.
Meddaugh, P. M., Kay, J. (2009): Hate Speech or "Reasonable Racism"? The Other in Stormfront. Journal of Mass Media Ethics, 24, 4, 251-268.
Morse, J. (2012): The Implications of Interview Type and Structure in Mixed-Method Designs. In: Gubrium, J., Holstein, J., Marvasti, A., McKinney, K. (Eds.): The SAGE Handbook of Interview Research. London, Sage, 193-204.
MOSS (2013): Valutni podatki za december 2013. Http://www.moss-soz.si/si/novice/11737/detail.html (9. 2. 2014).
Motl, A. (2009): Sovražni govor v slovenskih medijih na spletu. Ljubljana: FDV. Http://dk.fdv.uni-lj.si/diplomska/pdfs/motl-andrej.pdf (21. 2. 2014).
Nemes, I. (2002): Regulating Hate Speech in Cyberspace. Information & Communications Technology Law, 11, 3, 193-220.
Neuman, L. (2006): Social Research Methods: Qualitative and Quantitative Approaches - Sixth Edition. Boston: Pearson Education.
Neurauter-Kessels, M. (2011): Im/polite Reader Responses on British Online News Sites. Journal of Politeness Research: Language, Behavior, Culture, 7, 2, 187-214.
Nielsen, C. (2012): Newspaper Journalists Support Online Comments. Newspaper Research Journal, 33, 1, 86-100.
Nip, J. (2010): Routinization of Charisma. In: Rosenberry, J., St. John III, B. (Eds.): Public Journalism 2.0. New York, Routledge, 135-148.
Planet Siol.net: Strpno in varno okolje za argumentirane razprave. Http://www.siol.net/subsites/pravila_komentiranja.aspx (10. 5. 2014).
Pöttker, H. (2012): Quo Vadis Journalism? Citizens, Communication, and Democracy in the New Digital World, 16-17 November. Piran: EURICOM.
Pro Plus (a): Pravila za objavo komentarjev na portalih POP TV in Kanala A. Http://image.24ur.com/media/document/61374531.pdf (20. 5. 2014).
Pro Plus (b): Splošni pogoji uporabe spletnih portalov podjetja Pro Plus, d. o. o. Http://images.24ur.com/media/document/61376778.pdf (20. 5. 2014).
Puddephatt, A. (2011): The Importance of Self-Regulation of the Media in Upholding Freedom of Expression. Brasilia: UNESCO. Http://unesdoc.unesco.org/images/0019/001916/191624e.pdf (8. 7. 2014).
Reader, B. (2012): Free Press vs. Free Speech? Journalism & Mass Communication Quarterly, 89, 3, 495-513.
Reed, C. (2009): The Challenge of Hate Speech Online. Information & Communications Technology Law, 18, 2, 79-82.
Reich, Z. (2011): User Comments. In: Singer, J. B., Hermida, A., Domingo, D., Heinonen, A., Paulussen, S., Quandt, T., Reich, Z., Vujnovic, M. (Eds.): Participatory Journalism. West Sussex, Wiley-Blackwell, 96-118.
Rosen, J. (2012): People Formerly Known as the Audience. In: Mandiberg, M. (Ed.): The Social Media Reader. New York, NYU Press, 13-17.
Rosenberry, J. (2011): Users Support Online Anonymity despite Increasing Negativity. Newspaper Research Journal, 32, 2, 6-19.
Rosenberry, J., St. John III, B. (Eds.) (2010): Public Journalism 2.0. New York, London: Routledge.
Rtvslo.si (2014): Standardi in pravila komuniciranja na spletnem mestu rtvslo.si (zadnjič posodobljeno 19. 5. 2014). Http://www.rtvslo.si/strani/moj-splet/669 (25. 5. 2014).
Ruiz, C., Domingo, D., Mico, J. L., Díaz-Noci, J., Meso, K., Masip, P. (2011): Public Sphere 2.0? The International Journal of Press/Politics, 16, 4, 463-487.
SAFE (2010/11): Kodeks regulacije sovražnega govora na spletnih portalih. Http://safe.si/sites/safe.si/files/kodeks_oblikovan.pdf (24. 3. 2014).
Santana, A. D. (2011): Online Readers' Comments Represent New Opinion Pipeline. Newspaper Research Journal, 32, 3, 66-81.
Shepard, A. C. (2011): Online Comments. Nieman Reports, 65, 2, 52-53.
Shoemaker, P. J., Vos, T. P. (2009): Gatekeeping Theory. New York: Routledge.
Singer, J. B., Ashman, I. (2009): User-Generated Content and Journalistic Values. In: Allan, S., Thorsen, E. (Eds.): Citizen Journalism. New York, Peter Lang, 233-242.
Singer, J. B., Hermida, A., Domingo, D., Heinonen, A., Paulussen, S., Quandt, T., Reich, Z., Vujnovic, M. (Eds.) (2011): Participatory Journalism. West Sussex: Wiley-Blackwell.
Spletno oko (2013): Modri Moderator Moderira. Http://mi.ris.org/uploadi/editor/1386254339Moder_Moderator_final.pdf (21. 2. 2014).
Splichal, S. (2003): Bentham, Kant, and the Right to Communicate. Critical Review, 15, 3-4, 285-305.
Thurman, N. (2008): Forums for Citizen Journalists? New Media & Society, 10, 1, 138-157.
Thurman, N., Hermida, A. (2010): Gotcha. In: Monaghan, G., Tunney, S. (Eds.): Web Journalism. Eastbourne, Sussex Academic Press, 46-62.
Thurman, N., Newman, N. (2014): The Future of Breaking News Online? Journalism Studies.
TSmedia: Pogoji uporabe medija Planet Siol.net. Http://itm.siol.net/doc/PU_SiOLNet.pdf?q=pogoji (20. 5. 2014).
Urbas, U. (2014): Ukradli so nam državo. Ukrasti nam hočejo še internet. Http://www.siol.net/Priloge/Kolumne/Uros_Urbas/2014/05/Ukradli_so_nam_drzavo.aspx (20. 5. 2014).
Vobič, I. (2013): Journalism and the Web. Ljubljana: FDV.
Waisbord, S. (2014): Reinventing Professionalism. Cambridge: Polity.
Wang, J., Yan, Y. (2012): The Interview Question. In: Gubrium, J., Holstein, J., Marvasti, A., McKinney, K. (Eds.): The SAGE Handbook of Interview Research. London, Sage, 231-242.