Communication Monographs. Publication details, including instructions for authors and subscription information: http://www.informaworld.com/smpp/title~content=t713695619

To cite this article: Chua, Cecil Eng Huang (2009). 'Why Do Virtual Communities Regulate Speech?', Communication Monographs, 76(2), 234-261. DOI: 10.1080/03637750902828420. URL: http://dx.doi.org/10.1080/03637750902828420
Why Do Virtual Communities Regulate Speech?

Cecil Eng Huang Chua
Virtual community research argues that regulations restricting the kinds of speech in a
virtual community decrease the utility to members. However, many virtual communities
enact regulations on speech within the virtual community. This research explores the
contradiction through a cross-case analysis of virtual communities. It explains the
contradiction between research and practice using the theory of collective identity.
Communication is important for creating collective identity in virtual communities.
However, multiple collective identities can arise. When one collective identity within a
virtual community defines itself as adversarial to another, silencing speech emerges: the
adversarial collective identity creates enduring noise and flames directed at its target. When the target
collective identity creates formal regulations suppressing the adversarial collective
identity, communication that fosters the target collective identity emerges.
Keywords: Virtual Communities; Moderation; Censorship; Hate Speech; Regulation
Research generally argues that virtual communities should allow members to speak
freely (Axelrod, 1984; Godwin, 1994; Kollock, 1996; Ostrom, 1990), as this helps the
virtual community form a cohesive collective identity (Polletta, 1998; Weiss, 2003),
which leads to the ‘‘success’’ of the virtual community. Curiously, many virtual
communities intentionally regulate speech within community boundaries. Such
virtual communities create a board of moderators that screen, reject, and/or remove
problematic messages.
Cecil Eng Huang Chua is an Assistant Professor at Nanyang Technological University. He received a PhD in
Information Systems from Georgia State University, a Masters of Business by Research from Nanyang
Technological University, and both a BBA in Computer Information Systems and Economics and a Masters
Certificate in Telecommunications Management from the University of Miami. His research has been published
in such journals and magazines as Communications of the AIS, Data and Knowledge Engineering, Decision Support
Systems, IEEE Computer, Journal of the AIS, the Journal of Database Management, the MIS Quarterly, and the
VLDB Journal. He was the runner-up for the 2004 ICIS best doctoral dissertation competition. Correspondence
to: Cecil Eng Huang Chua, Information Technology & Operations Management Department, Nanyang Business
School, Nanyang Technological University, Nanyang Avenue, Singapore 639798. E-mail: [email protected]
ISSN 0363-7751 (print)/ISSN 1479-5787 (online) © 2009 National Communication Association
DOI: 10.1080/03637750902828420
Communication Monographs, Vol. 76, No. 2, June 2009, pp. 234–261
This study explores why many virtual communities act in a way that research
suggests they should not. The study examines situations where virtual communities
regulate speech and when such regulations encourage the success of the virtual
community, i.e., encourage members to speak about topics relevant to the community.
To investigate this issue, an in-depth, exploratory cross-case analysis of two moderated
virtual communities and unmoderated counterparts was conducted followed by a
third case study to enhance generalizability. The theory of collective identity is applied
to explain findings, which suggest three things:
1. Virtual community identity may map to more than one collective identity. It is possible for two separate groups in a virtual community to adopt two distinct, mutually exclusive collective identities; for example, one group identifies with Judaism, and the other comprises Christian missionaries. The virtual community's identity is a function of the interaction between the collective identities of members.

2. The existence of silencing speech. Communication is critical to the manifestation of a collective identity (Hardy, Lawrence, & Grant, 2005; Polletta, 1998; Weiss, 2003). Silencing speech emerges when one collective identity in the virtual community becomes defined as the adversarial target collective identity. The adversarial collective identity utters silencing speech, instantiated as enduring noise and flames focused on the target collective identity. Noise inhibits the target collective identity's ability to search and read messages, and flames inhibit member willingness to post. As communication in many virtual communities comprises posting and reading, this reduces communication, hindering the manifestation of the target collective identity. Silencing speech creates a virtual community identity of conflict. As one collective identity defines itself as an adversary, and the target community does not, conflict reinforces the adversary's collective identity while diminishing the collective identity of the target.

3. Regulating speech is the only way to control silencing speech. The only way members of the target group can speak, and manifest collective identity, is to bar members of the adversarial collective identity from uttering silencing speech. Thus, when adversarial collective identities exist, the virtual community can only manifest the collective identity of the adversary or its target. Ironically, diversity in virtual community identity is realized when two communities emerge: one where the target group regulates speech, thereby manifesting its collective identity, and one where both groups can speak, thereby allowing the manifestation of the adversarial group's collective identity.
This research proceeds as follows. Research on community governance is first
presented, after which research methodology is discussed. Case site backgrounds are
then presented. Case sites are subjected to both qualitative and quantitative analysis.
Theory on collective identity is then introduced to explain and interpret the findings.
Related Research
One fundamental question of virtual community research concerns how virtual
communities should be governed. Some form of governance is clearly necessary for
a virtual community to succeed (Ardichvili, Page, & Wentling, 2003; Castelfranchi &
Tan, 2002; Goodman & Darr, 1998; Rothaermel & Sugiyama, 2001; Stanoevska-
Slabeva, 2002). Research has demonstrated that online speech can be disruptive
(Sternberg, 2000). Incidents of cyber-harassment (Dibbell, 1993; Herring, 1999),
deception and fraud (Castelfranchi & Tan, 2002; Chua & Wareham, 2004), flaming
(Aiken & Waller, 2000; Alonzo & Aiken, 2004; Reinig, Briggs, & Nunamaker, 1997/
1998), hazing (Honeycutt, 2005), impersonation (Donath, 1998), noise (Hiltz &
Johnson, 1990; Hiltz & Turoff, 1985; Wasko & Faraj, 2000), trolling (Campbell, 2004;
Herring, Job-Sluder, Scheckler, & Barab, 2002), and others (Greenhill, Campbell, &
Fletcher, 2002; Kahai & Cooper, 1999; Phillips, 1996; Suler & Phillips, 1998) have been
documented.
Such actions can be potentially harmful not only to individuals in the virtual
community, but to the community itself. Cyber-harassment, for example, can cause
individuals to leave a community (Herring, 1999; Herring et al., 2002). The impact
on the community can be especially devastating when opinion leaders leave (Dutton,
1996). Thus, some level of governance is required for virtual communities.
In online communities, moderation is the standard method of formal governance.
In moderation, individuals called moderators screen postings. Most research argues
that moderation is not a desirable method of virtual community governance (Dutton,
1996). Disputes and concerns should be resolved directly by members of the
community rather than a governing body (Axelrod, 1984; Godwin, 1994; Kollock,
1996; Ostrom, 1990). Owners of the virtual community infrastructure should let
members devise rules, and self-monitor (Baker, 2001; Poor, 2005). Some research
opposes the use of any authority on virtual communities (e.g., moderation) (Cothrel
& Williams, 1999; Rheingold, 1993).
Research generally argues that where moderators must exist, they should lead by
example rather than impose their will (Armstrong & Hagel, 1996; Koh & Kim, 2001).
They should help guide new members in proper community behavior (Andrews,
Preece, & Turoff, 2002), thereby encouraging continuous renewal in the community
(Williams & Cothrel, 2000). Governors should also seed discussions and foster
interest in community activities instead of forcing community members to conform
to expectations (Armstrong & Hagel, 1996; Koh & Kim, 2001). Thus, the literature
generally argues that socially constructed governance mechanisms rather than rules
imposed by the moderator should be developed to manage the topics people discuss
(Blanchard, 2004).
Curiously, many virtual communities prefer formal governance mechanisms that
regulate speech. Why? This research attempts to: (a) determine why virtual
communities regulate speech, and (b) ascertain whether such regulations encourage
community success.
Methodology
A combined qualitative/quantitative cross-case methodology with influences from
multiple streams of research was employed to explore the research question. Research
proceeded in five phases, which are presented and justified in Table 1.
Virtual community is operationalized as all conversations and actions occurring
within a well-defined posting forum. Soc.culture.singapore is a virtual community, as
is soc.culture.jewish. Conversations that occur within the forum are treated as within
the virtual community. However, conversations that occur outside the forum, for
example, in talk.politics.mideast, that are not cross-posted to the community are
outside the community.
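In Usenet terms, this operationalization can be sketched as a simple membership check on a post's Newsgroups header; the helper below is illustrative only, not the study's actual tooling, though the group names come from the text.

```python
def in_community(newsgroups_header, community):
    """A post is within the community only if the forum appears in its Newsgroups header."""
    groups = {g.strip() for g in newsgroups_header.split(",")}
    return community in groups

# A message posted only to talk.politics.mideast is outside soc.culture.jewish...
print(in_community("talk.politics.mideast", "soc.culture.jewish"))
# ...but the same message cross-posted to the forum counts as within the community.
print(in_community("talk.politics.mideast,soc.culture.jewish", "soc.culture.jewish"))
```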
Case Sites
Variational sampling (Strauss & Corbin, 1990; Yin, 1984) was employed to maximize
differences between case sites (see Table 2). Two case sites comprising two virtual
communities each were selected. Each case site comprised both a moderated and
unmoderated virtual community. The first case site (Jewish culture newsgroup)
contained a successful moderated community. The second case site (Singapore
culture newsgroup) contained a moderated community that failed. These four virtual
communities enabled a simultaneous comparison of moderation with no moderation, and moderation success with failure. These four virtual communities are
soc.culture.jewish (SCJ), soc.culture.jewish.moderated (SCJM), soc.culture.singapore
(SCS), and soc.culture.singapore.moderated (SCSM).
The ‘‘success’’ of a virtual community is difficult to define. Success was measured
in three ways. First, a virtual community no longer in existence (i.e., SCSM) was a
candidate for ‘‘failure.’’ Second, member perceptions of the virtual community as a success or failure were considered. Such perceptions were assessed through: (a)
community member comments about the virtual community and (b) number of
posts and replies to posts in various virtual communities. Finally, a statistical content
analysis of posts over a 10-day period on the newsgroup was employed to assess the
Table 1 Data Collection/Analysis Phases

Phase/methodology                                        | Reasoning
Case-site selection / variational sampling               | Select case sites to maximize differences in constructs of interest (i.e., speech regulation and virtual community success).
Immersive-exploratory / ethnographic observation         | Obtain a sense for why virtual communities preferred/did not prefer regulating speech.
Theoretical refinement / open sampling                   | Affirm and refine existing conceptualization.
Internal validity / statistical testing                  | Test conceptualizations made and refined during immersive and theoretical refinement phases.
External validity / confirmation and criterion sampling  | Identify a case similar to those in case-site selection where culture was not an issue.
diversity of posts. A newsgroup with few posts was taken as an indication that the newsgroup had failed.
After comparing and contrasting the Singapore and Jewish case sites, it was
important to ascertain that findings existed outside of these two communities.
Multiple social/religious/cultural newsgroups had the same situation as the Jewish
community including soc.culture.palestine (moderated version on Yahoo groups),
soc.culture.islam, soc.culture.african-american, and soc.culture.japan.
However, it was also important to ensure that these phenomena were generalizable
beyond cultural and religious virtual communities. To some degree, science and
culture are opposites. Cultures can be different, but science aims to achieve objective
truths. Thus, stronger generalizability could be established if phenomena discovered
in moderated culture virtual communities were found in moderated science virtual
communities. Moderated newsgroups in the sci.* (i.e., science) hierarchy were
investigated and the newsgroup sci.psychology.psychotherapy.moderated was
uncovered.
Immersion Phase
The immersion phase was employed to identify issues and concerns that encouraged
the regulation of speech. Thus, methodologies associated with ethnographic research
were employed. Several years’ worth of postings were reviewed to obtain a cultural
awareness of the various virtual communities (Myers, 1999).
A neutral observer role was adopted principally because the author is not Jewish,
and did not want to accidentally insult subjects. However, the author was not
invisible to the communities. Postings were sent to case sites disclosing the fact that
they were under observation. E-mail interviews were also requested. Drafts of this
document were prepared and circulated to community members for review and
commentary. All postings disclosed the author’s e-mail address and identity.
Table 2 Case Sites Visited

Case site                                      | Speech is                                  | Success or failure
soc.culture.jewish (SCJ)                       | Not regulated                              | Jewish members of the newsgroup felt it did not enable them to say what they wanted
soc.culture.jewish.moderated (SCJM)            | Regulated                                  | Community members regard as ‘‘successful’’
soc.culture.singapore (SCS)                    | Not regulated                              | Frequent posts on a wide variety of topics relevant to Singaporeans
soc.culture.singapore.moderated (SCSM)         | Regulated                                  | Site had low posting volumes and ultimately died
sci.psychology.psychotherapy (SPP)             | Not regulated; used to generalize findings | Site principally about the legitimacy of psychotherapy rather than about specific issues within psychotherapy
sci.psychology.psychotherapy.moderated (SPPM)  | Regulated; used to generalize findings     | Site principally used for discussion of specific issues within psychotherapy
Open Sampling Phase
In the open sampling phase (Strauss & Corbin, 1990), data was gathered to test and
refine ideas generated in the immersion phase. To strengthen construct validity, data
was collected from three separate sources (Klein & Myers, 1999; Yin, 1984):
1. Postings on formation of newsgroups. Data on the creation of the moderated newsgroups was collected by searching Google groups for the keywords ‘‘CFP,’’ ‘‘RFP,’’ ‘‘CFV,’’ and the name of the moderated group. The three acronyms (call for proposal, request for proposal, and call for votes, respectively) refer to formal procedures necessary for creating new Usenet newsgroups. This allowed an examination of justifications and reasons behind the creation of the moderated newsgroups.

2. Interviews. E-mail interviews and discussions with moderators and newsgroup participants from both the Jewish and Singapore newsgroups were solicited. Over a dozen Jewish newsgroup participants volunteered input. Interviews with four members of the original Singapore moderation team were obtained.

3. Postings on governance of newsgroups. The Google group archive of these newsgroups was searched for ‘‘moderated’’ and ‘‘moderator.’’ Snowball sampling (Kuzel, 1992) was employed to extend the search with additional keywords. For example, e-mail addresses of advocates for moderation were employed as search keys. It was possible to use these e-mail addresses to search the Google group archive for events and experiences that shaped advocates' positions.
Techniques from Straussian grounded theory (Strauss & Corbin, 1990) were
employed both for the immersion and open sampling phases. Specifically, data was
coded using open coding techniques that were later collapsed into subgroups (i.e.,
axial coding). Codes were assigned based on posting topic (i.e., this was a content
analysis).
Internal Validity Phase
To affirm that findings were inherent in the case sites, and not the author’s subjective
interpretation, three independent raters were recruited to systematically code all posts
in the newsgroups for two 10-day periods. The first period encompassed a time just
before the formation of the moderated newsgroup. The second period encompassed a
time when both the moderated and unmoderated newsgroup coexisted. This enabled
a cross-comparison of speech patterns in the various communities. Specifically, all
postings on SCJ/M for February 19–28, 2003 and February 19–28, 1999, and all
postings on SCS/M for February 19–28, 1998 and February 19–28, 1996 were
obtained. The Jewish newsgroups were sampled in 2003, because data collection for
this phase began in 2003. As SCSM expired in 2001, a year midway between SCSM’s
creation and expiry was selected (i.e., 1998). It was assumed that the midpoint would
reflect a time when SCSM was most active. Each rater independently coded every
posting. Table 3 summarizes data collected for the systematic coding phase.
To make the analysis tractable, the original summarized codes were further
collapsed into six speech categories comparable across the two case sites. Table 4
presents the categories and example postings, which were:
1. Anticulture posts denigrated the culture within the virtual community or were a
response to such denigration.
2. On-topic posts discussed culture within the virtual community. In the Jewish
newsgroups, the post concerned Jewish faith, practices, experiences (e.g., the
Holocaust), or life. In the Singapore newsgroups, Singapore life, politics, or shows
in Singapore were considered relevant. Politics was not relevant for Jewish
newsgroups, because the FAQ and moderation charter stated that such were
irrelevant.
3. Off-topic politics posts were political discussions that did not center on virtual
community culture.
4. Newsgroup governance posts were about how to make the newsgroup a better
place, or how newsgroup governance worked.
5. Denigration of other culture posts specifically attacked another culture.
6. Other off-topic posts could not fit in another category.
An inter-rater reliability analysis was performed before rater codings were
analyzed. Table 5 presents this analysis and demonstrates that reliability was above
accepted thresholds. Most scores were above .6, and Raters 1 and 3 did not have an
agreement level below .67. Typical recommendations are that a kappa above .6/.7 is
acceptable for exploratory/confirmatory research respectively (Landis & Koch, 1977;
Miles & Huberman, 1994).
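The pairwise agreement scores reported in Table 5 are kappa statistics; for pairs of raters this is conventionally Cohen's kappa, which corrects raw agreement for agreement expected by chance. As an illustration only (the rater codings below are invented, not the study's data), the computation can be sketched as:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal codes to the same posts."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of posts both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of eight posts into the study's speech categories.
a = ["on-topic", "anticulture", "on-topic", "governance",
     "other", "on-topic", "politics", "anticulture"]
b = ["on-topic", "anticulture", "politics", "governance",
     "other", "on-topic", "politics", "on-topic"]
print(round(cohens_kappa(a, b), 2))
```

Here six of eight posts agree (raw agreement .75), but kappa discounts the overlap expected by chance, yielding a lower value, which is why kappa rather than raw agreement is compared against the .6/.7 thresholds.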
After codes were reconciled, nonparametric tests of differences were performed
across the six community periods (e.g., SCJ 1999, SCJ 2003, SCJM 2003).
Nonparametric tests were necessary, given that codes were on a nominal scale
(Lehman, 1988).
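The paper does not name its exact nonparametric test, but the standard test of differences for nominal code counts across groups is the Pearson chi-square test of independence. A minimal sketch, using invented counts rather than the study's frequencies:

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for an r x c contingency table of code counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts of two code categories (on-topic, anticulture)
# in two community periods -- not the study's actual frequencies.
table = [[120, 30],    # e.g., a moderated period
         [300, 400]]   # e.g., an unmoderated period
stat = chi_square_statistic(table)
# With (2-1)*(2-1) = 1 degree of freedom, the .05 critical value is 3.84;
# a larger statistic indicates the code distributions differ across periods.
print(stat > 3.84)
```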
Table 3 Data Collected for Systematic Coding

Virtual community                                     | Time period     | Reason for collection
soc.culture.jewish (SCJ1999), 526 posts               | Feb 19–28, 1999 | Before formation of moderated group
soc.culture.jewish (SCJ2003), 1715 posts              | Feb 19–28, 2003 | Without regulated speech after formation of moderated group
soc.culture.jewish.moderated (SCJM2003), 152 posts    | Feb 19–28, 2003 | With regulated speech after formation of moderated group
soc.culture.singapore (SCS1996), 1462 posts           | Feb 19–28, 1996 | Before formation of moderated group
soc.culture.singapore (SCS1998), 1422 posts           | Feb 19–28, 1998 | Without regulated speech after formation of moderated group
soc.culture.singapore.moderated (SCSM1998), 46 posts  | Feb 19–28, 1998 | With regulated speech after formation of moderated group
As the research underwent refinement, summarized drafts were circulated in the
virtual communities studied (Mason, 1996). Data can be validated as much of it
originates from publicly available sources, and can be retrieved through Internet
searches employing fragments of the quoted text as a search string (Miles &
Huberman, 1994).
Case Sites
The Jewish and Singapore case groupings had different reasons for moderation.
In the Jewish case grouping, moderation was intended to block discussions
perceived as harmful to Jews, such as those on Middle-East Politics, advocacy
for non-Jewish religions, and anti-Semitic attacks. Users supporting moderation
felt it generates honest and healthy discussions on Jewish topics. In the Singapore
case grouping, advocates for moderation wanted to lower the signal-to-noise
ratio. Few, if any, individuals felt that Singapore or Singaporeans were under
threat.
Table 4 Examples for Each Type of Code

Anticulture
  Jewish Usenet: ‘‘Crude for us; solar panels for Jews. 54 years of these parasites, and now they want us to die for them again’’ (Feb. 28, 2003)
  Singapore Usenet: ‘‘Why is Singapore called the lion city? Because the wimps that replaced the lions didn't want the rest of the world to know what kind of city this really is!’’ (Feb. 28, 1996)

On-topic
  Jewish Usenet: ‘‘Look into the biographies of some of them, and start with, say, Maimonides. I think you'll find they ‘got out,’ as you put it, quite a bit more than you seem to think.’’ (Feb. 25, 2003)
  Singapore Usenet: ‘‘They mentioned about being able to identify with the characters from that local sitcom Under One Roof which ended its season just today.’’ (Feb. 28, 1996)

Off-topic politics
  Jewish Usenet: ‘‘Syria is one of the states where the US at some point lost its position. But it is hard to say that it got it back.’’ (Feb. 23, 1999)
  Singapore Usenet: ‘‘If it's any consolation, a lot of the stuff stolen by my great-great-great-grandfather is now in the British Museum’’ (Feb. 28, 1996)

Newsgroup governance
  Jewish Usenet: ‘‘soc.culture.jewish FAQ: Introduction to the FAQ and s.c.j Newsgroups (1/12)’’ (Feb. 28, 2003)
  Singapore Usenet: ‘‘YPAP is on the lookout so that they can make recommendations to SBA so that guidelines and actions can be formulated for this newsgroup.’’ (Feb. 28, 1996)

Denigration of other culture
  Jewish Usenet: ‘‘When everyone realizes that we are all in this together, and that Muslim extremists are the new Nazis . . .’’ (Feb. 28, 2003)
  Singapore Usenet: ‘‘Cantonese was started by southern barbarians watching pigs squeel [sic] and vietnamese by sows screaming.’’ (Feb. 25, 1996)

Other off-topic
  Jewish Usenet: ‘‘Played by a Canadian (he [is] Canadian, isn't he?) and played to make the English hero look like a complete ditz!’’ (Feb. 28, 2003)
  Singapore Usenet: ‘‘. . . I got to Wishsong and discovered the ugly truth that Brooks was running out of ideas and strangling a dead golden goose . . .’’ (Feb. 26, 1996)
An initial review of the case sites revealed their particular characteristics.
Specifically, the unmoderated Jewish virtual community (SCJ) was characterized by
attacks from other sources, e.g., neo-Nazis, Christian missionaries, and anti-Jewish
Muslims. In contrast, negative statements in the Singapore community were generally
not targeted at average Singaporeans, but instead focused primarily on (often
Singaporean) political leaders. The moderated Jewish virtual community (SCJM) was
characterized by discussions on Jewish topics. The moderated Singapore community
had relatively light discussion with few replies. Figures 1–4 present exemplars of
activity in the four virtual communities.
Jewish Case Site
Soc.culture.jewish (SCJ) was formed in February 1984 for the discussion of Jewish
topics, including the various recognized movements within Judaism and debates
over Halacha and Torah interpretations. By 1987, SCJ had begun to attract irrelevant
discussions, such as those on Middle-East politics, advocacy for non-Jewish religions,
and anti-Semitic attacks on other newsgroup users. The FAQ explicitly identified
these topics as inappropriate.
What topics are *not* appropriate for S.C.J? Middle East politics, especially international issues concerning Israel, belong in talk.politics.mideast, not S.C.J. . . . Readers of S.C.J are committed to their religion; it is inappropriate to ‘‘witness’’ or preach . . . Lastly, . . . Don't write ‘‘Lashon Hara’’, derogatory information about people or groups. (May 10, 1993)
By 1990, SCJ users began advocating moderation to control objectionable posts.
After reading all the garbage from [name withheld] and [name withheld], I propose that s.c.j becomes a moderated group. This is not an attempt to block critisism [sic] of Israel or Jews, but to generate an honest and healthy discussions on topics of interest, devoid of hate and slime. (September 24, 1990)
Soc.culture.jewish.moderated was finally created on July 9, 2000. The new
newsgroup was put to a public vote, which passed 212 to 34.
Singapore Case Site
The first Singapore Usenet newsgroup, soc.culture.singapore (SCS) was created in
January 1993 to discuss topics relevant to Singaporeans and to provide a resource for
Table 5 Interrater Scores

             Kappa
Case sites | Raters 1 & 2 | Raters 1 & 3 | Raters 2 & 3
SCJ1999    |     .59      |     .75      |     .59
SCJ2003    |     .70      |     .80      |     .65
SCJM2003   |     .72      |     .83      |     .65
SCS1996    |     .72      |     .83      |     .63
SCS1998    |     .80      |     .68      |     .73
SCSM1998   |     .80      |     .72      |     .86
general information about Singapore. SCS was formed from soc.culture.asean, as
many contributors in the latter were Singaporeans studying overseas.
SCS quickly took on its own special character. Local Singaporeans, Singaporeans
overseas, and foreigners (often Malaysian and Indonesian) shared conversations
about everything from current news events, to the characteristics of local politicians,
to local television shows. The virtual community was marked by its diversity, which
many people enjoyed.
Figure 1 Postings to SCJ: February 22, 1999.
Figure 2 Postings to SCJM: February 22, 2003.
Would not this scsm be like eating ‘‘Rojak’’ without the ‘‘Prawn paste’’? I find, apart
from those occasional junks, scs as itself is very informative and dynamic. It reflects
the Singapore that we are in, multi-EVERYTHING. (July 16, 1996)
(Rojak is a south-east Asian salad. The Malay version uses prawns in the salad
dressing.)
In 1995, SCS members advocated the creation of soc.culture.singapore.moderated
(SCSM) as an additional newsgroup for Singapore because of (1) congestion in SCS,
(2) the low signal-to-noise ratio caused by irrelevant commercial and test posts, and
(3) rude and obscene posts.
Figure 3 Postings to SCS: February 22, 1996.
Figure 4 Postings to SCSM: February 20–24, 1998.
. . . I totally agree that moderation would be a good idea. It’s so difficult to get
around to reading the follow-ups of serious posts when you have to strain out all
the junk along the way. (October 4, 1996)
SCSM was created on October 8, 1996 through a public vote, which passed 192
to 48.
Analysis
This section presents the results of the open sampling and internal validity phases. It
is demonstrated that both soc.culture.jewish.moderated and soc.culture.singapore.moderated
blocked problematic postings. However, soc.culture.jewish.moderated became
popular as a venue for ‘‘relevant’’ discussions, whereas soc.culture.singapore.moderated
did not. Qualitative evidence for why these events occurred is presented
and then triangulated with statistical analysis (Klein & Myers, 1999). All quotes below
originate in the forums.
Open Sampling
Jewish virtual communities. The creation of soc.culture.jewish.moderated (SCJM)
partitioned Jewish Usenet into two virtual communities. SCJM became a venue for
discussions on Jewish culture and SCJ became a venue for discussions on anti-Jewish
culture. Although some Jews remained in SCJ to rebut anti-Semitic posts, most
elected to migrate to SCJM:
I haven’t been in that sewer since SCJM came on-line. (June 25, 2003)
SCJ attracted substantial discussions on Middle East politics, advocacy for non-
Jewish religions, and anti-Semitic attacks on newsgroup members. SCJ members
initially tried various methods to remove these forms of speech: (a) They developed
alternate unregulated newsgroups such as talk.politics.mideast to address non-Jewish
cultural issues. (b) A committee of 25 newsgroup members developed a FAQ
detailing acceptable and unacceptable newsgroup behavior. (c) Alternate, unregulated
newsgroups were created to address issues of interest to Jews. For example,
alt.personals.jewish was intended to help Jews find marriage partners. (d) Finally,
filtering technology (i.e., Killfiles) was used to screen undesirable postings.
All methods were ineffective because they did not address the ‘‘sticky’’ nature of the
problematic postings. Undesirable elements wanted to remain close to community
members to harass and intimidate them. Problematic posts were frequently cross-posted
(i.e., sent to multiple newsgroups), and their senders ignored the guidelines in the FAQ.
Technical filters were especially ineffective. Senders of problematic postings had
strong incentives to ensure that posts were read and employed multiple tactics to
overcome technical filtering. For example, some senders would periodically switch
identities.
When I put him in my kill file, he re-emerged with a last name added to the
[identity withheld]. (August 3, 2000)
Alternately, senders hijacked posts, subverting the original intent of a message. In
the example below, a holocaust events discussion was subverted to accuse Jews of
creating communism.
The jew Jucov Kurovsky on the orders of the new Trotsky, shot and then bayoneted
to death The Tsar, his wife, 5 children, their doctors, servanats [sic] and even their
little pet spaniel in the cellar where they had been held since their arrest by the
bolsheviks. (Re: Holocaust Calendar: February 20, February 19, 2003)
As a result, users of technical filters and senders of hurtful speech engaged in an
arms race. Technical filters would screen for a particular tactic, whereupon senders of
problematic postings would innovate new deceptive strategies. Thus, Killfiles became
highly complex, which effectively prevented new community users from using them.
Every newsgroup needs new posters, and new posters don’t know what to killfile.
(April 20, 1999)
Furthermore, it was costly for every virtual community member to block these
posts individually, and members risked missing important discussions that Killfiles
accidentally filtered out. These problems made moderation attractive. Moderators
blocked postings that contravened the rules of SCJM, thus absorbing the cost of
screening on behalf of every virtual community member.
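The filtering arms race described above can be sketched in a short example. This is an illustrative Python sketch, not actual Usenet killfile syntax; the post data and sender names are invented for illustration.

```python
# Illustrative sketch of killfile-style filtering (hypothetical data; real
# killfiles used newsreader-specific pattern syntax).

def killfile_filter(posts, blocked_senders):
    """Hide posts whose sender exactly matches an entry in the killfile."""
    return [p for p in posts if p["sender"] not in blocked_senders]

posts = [
    {"sender": "hostile_poster", "subject": "anti-Semitic attack"},
    {"sender": "regular_member", "subject": "Holocaust Calendar"},
]
killfile = {"hostile_poster"}

visible = killfile_filter(posts, killfile)  # the attack is screened out

# The arms race: the sender re-emerges with a last name added, and the
# exact-match killfile no longer catches the post.
posts.append({"sender": "hostile_poster smith", "subject": "anti-Semitic attack"})
visible = killfile_filter(posts, killfile)  # the varied identity slips through
```

Each such evasion forces every individual reader to extend his or her own filter, which is precisely the per-member cost that moderation centralizes.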
Singapore virtual communities. Various problems inhibited the growth of
soc.culture.singapore.moderated (SCSM). First, few people knew or cared about Singapore.
As a result, SCS required few of the controls necessary to ensure survival of the Jewish
Usenet community. Second, speech in Singapore is tightly regulated and many in
Singapore wanted an outlet free of speech regulations. The anonymous, unregulated
SCS provided such a venue. Finally, the initial set of moderators of SCSM was
perceived as a ‘‘biased’’ governing body. This initial panel belonged to ‘‘Sintercom,’’ a
semiformal organization devoted to encouraging Singaporean participation in virtual
communities. Sintercom furthermore provided the infrastructure for SCSM includ-
ing the server used for moderation. As a result, many members of SCS felt that SCSM
favored certain kinds of Singaporean postings over others:
She has also told me about the moderator’s favouritism. (April 29, 1997)
No actual evidence of bias emerged. SCSM moderators took pains to ensure
transparency, for example, placing all rejected postings on a separate website, with a
public statement as to why each post was rejected. That all moderators were initially
from Sintercom could be explained by their prior familiarity with Usenet. At the time
of soc.culture.singapore.moderated’s inception, most Singaporeans outside Sinter-
com were unfamiliar with the technical and bureaucratic issues surrounding creating
a moderated newsgroup. Also, few outside Sintercom volunteered to be moderators.
SCSM officially closed in 2001 when Sintercom was declared a political
organization by the Singapore government. To avoid censure under Singapore laws,
Sintercom moved its servers overseas. However, SCSM did not move with the
Sintercom servers, and no one moderates the forum today.
Internal Validity Phase
Given that interrater agreement was sufficiently strong, raters reconciled their codes,
and reconciled codes were analyzed for distributional differences. Table 6 presents
codes for the Jewish newsgroups and shows that SCJM was able to filter out off-topic
posts. Only 6 anti-Jewish, 12 Mid-East politics, and 24 other off-topic posts appeared
in SCJM during the sampling period. In contrast, 544, 753, and 339 such posts,
respectively, appeared in SCJ.
The data suggests that SCJM’s ability to filter such posts encouraged Jewish
participation. SCJM has a higher percentage of Jewish posts (67.8%) than both its
predecessor (SCJ 1999, 22.8%) and its current nonmoderated competitor
(SCJ 2003, 2.4%). Furthermore, SCJM attracts more Jewish posts than SCJ 2003
(103 vs. 41). Although SCJ 1999 carried a greater quantity of Jewish posts (120) than
SCJM, members of the Jewish community had a harder time accessing them: only
22.8% of SCJ 1999's posts were on-topic, whereas roughly three in five posts in SCJM
were on-topic.
Furthermore, the data suggests that the "flavor" of SCJM and SCJ differed.
Both a chi-square and a Kolmogorov-Smirnov test with a Bonferroni adjustment
were employed to test for differences in posting distribution across groups
(Toothaker, 1993). All tests on the Jewish sites were significant (p < .05), suggesting that
SCJM and SCJ had separate "characters." Table 7 presents relevant statistics.
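The chi-square statistic reported for the SCJM 2003 vs. SCJ 2003 comparison can be reproduced directly from the Table 6 counts. The following is a plain-Python sketch (category order and counts taken from Table 6), not the original analysis script:

```python
# Pearson chi-square for the SCJM 2003 vs. SCJ 2003 comparison, using the
# raw counts from Table 6.

def chi_square(table):
    """Pearson chi-square statistic and df for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = sum(
        (obs - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Rows: SCJM 2003, SCJ 2003. Columns: anticulture, on-topic, off-topic
# politics, newsgroup governance, denigration of other culture, other off-topic.
table = [
    [6, 103, 12, 7, 0, 24],      # SCJM 2003, n = 152
    [544, 41, 753, 1, 37, 339],  # SCJ 2003, n = 1715
]
chi2, df = chi_square(table)  # roughly 926 with df = 5, matching Table 7
```

A Bonferroni adjustment then multiplies each resulting p-value by the number of pairwise comparisons made (three among the Jewish newsgroups).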
SCSM likewise filtered non-Singaporean postings (see Table 8). Only 14 off-topic
posts appeared for the sampling period (2 Asian politics, 12 other off-topic). In
contrast, SCS had 986 such posts (35 anti-Singaporean, 100 political, and 851 other
off-topic).
However, SCSM attracted only 30 Singapore culture posts, whereas fully 418
appeared in SCS. Thus, although the percentage of relevant posts in SCSM was higher
(65.2% vs. 29.4%), the difference is discounted by the disparity in total posting
volume.
Distributional differences in the Singapore newsgroups were weaker than those in
the Jewish newsgroups. When SCSM was compared against SCS for the identical time
period, the results were significant. In contrast, when SCSM was compared with SCS
1996, the results were not significant (adjusted p = .17/.11). These results contradict
each other. One comparison (SCSM 1998 vs. SCS 1998) suggests that posts to SCSM were of a different
Table 6 Content Distribution of SCJ and SCJM: Feb. 19-28, 2003/Feb. 19-28, 1999

Category                        SCJM 2003   SCJ 2003   SCJ 1999   % SCJM 2003   % SCJ 2003   % SCJ 1999
Anticulture                          6         544        215         3.9%         31.7%        40.9%
On-topic                           103          41        120        67.8%          2.4%        22.8%
Off-topic politics                  12         753        124         7.9%         43.9%        23.6%
Newsgroup governance                 7           1          1         4.6%          0.0%         0.2%
Denigration of other culture         0          37          8         0.0%          2.2%         1.5%
Other off-topic                     24         339         58        15.8%         19.8%        11.0%
Total                              152        1715        526
tenor than posts to SCS. The other (SCSM 1998 vs. SCS 1996) does not. Results
suggested that SCSM filtered off-topic posts, but such posts were so infrequent that
filtering was unnecessary. Table 9 presents relevant statistics.
To evaluate this interpretation, the effect sizes of the distributional differences
were measured. As Table 10 demonstrates, the effect sizes of posting distribution
differences in the Jewish newsgroups ranged from .36 to .70, whereas those for
the Singapore newsgroups ranged from .09 to .15. Cohen (1988) suggests that a W
of .1, .3, and .5 is weak, moderate, and strong, respectively. Thus, posts in SCJM were
strongly different from those in SCJ, and therefore attracted a different community.
In contrast, posts in SCSM were weakly different from SCS, and hence the SCS/SCSM
communities overlapped to a great extent.
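The effect sizes in Table 10 follow directly from the chi-square statistics: Cohen's W is the square root of chi-square over N, and the Pearson contingency coefficient is the square root of chi-square over chi-square plus N. A minimal sketch, using the reported SCJ 2003 vs. SCJM statistic (χ² = 926.17 over 152 + 1715 = 1867 posts), which recovers that row of Table 10:

```python
import math

def effect_sizes(chi2, n):
    """Cohen's W and the Pearson contingency coefficient for a chi-square test."""
    w = math.sqrt(chi2 / n)           # Cohen's W = sqrt(chi2 / N)
    c = math.sqrt(chi2 / (chi2 + n))  # contingency coefficient = sqrt(chi2 / (chi2 + N))
    return w, c

# SCJ 2003 vs. SCJM: chi-square of 926.17 over 1867 total posts
w, c = effect_sizes(926.17, 1867)  # w is about .70 and c about .58, per Table 10
```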
Generalizability
The results of the first study suggested that moderation appeared necessary in a
situation where distinct factions within a virtual community were opposed to other
factions. The Jewish community was beset by missionaries, neo-Nazis, and
representatives from countries with antagonistic relationships with Israel. However,
one open question was whether this was only true for cultural virtual communities,
or whether this finding was more general.
Table 7 Tests of Distributional Differences between SCJ and SCJM

Test                      Chi-square             p      Bonferroni p   Kolmogorov-Smirnov Z   p      Bonferroni p
SCJM 2003 vs. SCJ 2003    χ² = 926.17, df = 5    <.01   <.01           4.44                   <.01   <.01
SCJM 2003 vs. SCJ 1999    χ² = 16.23, df = 5     <.01   <.01           4.01                   <.01   <.01
SCJ 1999 vs. SCJ 2003     χ² = 305.14, df = 5    <.01   <.01           5.93                   <.01   <.01
Table 8 Content Distribution of SCS and SCSM: Feb. 19-28, 1998/Feb. 19-28, 1996

Category                        SCSM 1998   SCS 1998   SCS 1996   % SCSM 1998   % SCS 1998   % SCS 1996
Anticulture                          0          35         32         0.0%          2.5%         2.2%
On-topic                            30         418        612        65.2%         29.4%        41.9%
Off-topic politics                   2         100         71         4.3%          7.0%         4.9%
Newsgroup governance                 2          17         32         4.3%          1.2%         2.2%
Denigration of other culture         0           1          9         0.0%          0.0%         0.6%
Other off-topic                     12         851        706        26.1%         59.8%        48.3%
Total                               46        1422       1462
The sci.psychology.psychotherapy/moderated newsgroups were studied to answer
this question. The charter (i.e., intent) of sci.psychology.psychotherapy was
the discussion of all different modalities of psychotherapy, their efficacy, and their acceptance within the scientific psychology community. (March 15, 1995)
Sci.psychology.psychotherapy suffered a number of problems. First, various groups
and individuals had antipsychology agendas and employed the unmoderated
newsgroup as a platform to air their views.
The name? Psychiatry. In the name of helping to clear engrams, it has brutalized many millions of individuals, hacking at their brains, searing them with electricity, numbing them with drugs or using other sick and twisted mind-control techniques. (January 28, 1996)
(The speaker in the quote is a Scientologist. "Engrams" refers to a Scientologist
belief; the term should not be confused with the homonym used by
psychologists.)
Others employed the newsgroup to raise particularly contentious topics such as
pedophilia (sexual desire for children). In 1994, the American Psychiatric
Association dropped pedophilia as a psychiatric disorder from its Diagnostic and
Statistical Manual (DSM-IV). Since then, advocates for pedophilia used the
unmoderated psychotherapy newsgroup as a platform to argue that pedophilia was
not harmful to children.
Table 9 Tests of Distributional Differences between SCS and SCSM

Test                      Chi-square            p      Bonferroni p   Kolmogorov-Smirnov Z   p      Bonferroni p
SCSM 1998 vs. SCS 1998    χ² = 32.42, df = 5    <.01   <.01           2.26                   <.01   <.01
SCSM 1998 vs. SCS 1996    χ² = 12.57, df = 5    .03    .17            1.52                   .02    .11
SCS 1996 vs. SCS 1998     χ² = 65.55, df = 5    <.01   <.01           3.27                   <.01   <.01
Table 10 Effect Sizes

Newsgroups              Contingency coefficient     W
SCJ2003 vs. SCJM                 .58               .70
SCJ2003 vs. SCJ1999              .35               .37
SCJ1999 vs. SCJM                 .44               .49
SCS1998 vs. SCSM                 .15               .15
SCS1998 vs. SCS1996              .15               .15
SCS1996 vs. SCSM                 .09               .09
Many felt that these problems could only be addressed by moderation, and
discussion began on the creation of a moderated newsgroup.
In the last few weeks whe [sic] have been raided by the self-indulgent, scientologist,
pedophiles, the mentally ill, and the ritual-satanic-abuse-repressed-memory-move-
ment crowd. Is it not time to begin talking about a moderated forum? (April 22,
1996)
Several individuals voiced that the current atmosphere in sci.psychology.psy-
chotherapy was inhibiting their willingness to speak and read the messages of others:
I lurk, mostly because I am reluctant to expose my comments to insensitive
criticism by [name withheld]. I would never send messages in the sort of tones he
uses, even if I disagree with someone. I often feel relieved when reading one of his
tasteless, insensitive (sometimes incoherent) responses, that it wasn’t me who
originally posted. (June 3, 1996)
A petition to transform the newsgroup into a moderated one failed on July 13, 1996,
with 125 yes votes and 48 no votes. The petition failed because a change to a
Usenet newsgroup must obtain at least 100 more yes votes than no votes.
Most who abstained or voted no agreed there were substantial problems with the
existing newsgroup. However, they wanted a way to speak without fear of censorship.
Proponents of moderation proposed an alternative moderated group to sci.psycho-
logy.psychotherapy. Those who felt that posting rules in the moderated group were
harsh or unfair could post to the unmoderated newsgroup. Sci.psychology.psychotherapy.moderated
was put to a public vote on August 13, 1997, and passed with
171 yes votes and 34 no votes.
Figures 5-7 present typical examples of discussions on sci.psychology.psychotherapy/moderated
for the period before its creation and in 2004. A comparison of
Figures 5 and 6 against Figure 7 demonstrates that although more posts appear on the
unmoderated newsgroup, posts about psychology and psychotherapy (e.g.,
psychopharmacology) go unanswered. Ad hominem attacks (e.g., "Shut up, child
molester") and provocative posts (e.g., "aliens are stealing my thoughts") dominate. On
the moderated forum, questions about psychotherapy obtain replies.
Discussion
The previous analysis presented three cases. One case (SCS/SCSM) reflects the
expected outcome: speech was successfully regulated in the moderated community,
and the moderated community failed. Two cases (SCJ/SCJM and SPP/SPPM) reflect
an unexpected outcome: these virtual communities thrive while being
moderated.
The data suggests that the successful regulation of virtual community speech
requires two factors: (1) the presence of an adversarial collective identity, i.e., one
opposed to the existence of another collective identity, and (2) the presence of a target
collective identity that does not define itself adversarially. This combination creates
silencing speech, which inhibits the expression and thus formation of the target
collective identity. To survive, the target collective identity must create a virtual
community where speech is regulated. Figure 8 summarizes the argument.
Collective identity refers to the cognitive, moral, and emotional connections
members of a group share with one another (Polletta & Jasper, 2001). Collective identity is similar to social
identity in that both are concerned with the self-concept of individuals and their
Figure 5 Posts on sci.psychology.psychotherapy: Nov. 27-Dec. 2, 1995.
Figure 6 Posts on sci.psychology.psychotherapy: Nov. 29-Dec. 2, 2004.
feelings of belongingness to a group (Tajfel, 1981; Turner, Hogg, Oakes, Reicher, &
Wetherell, 1987). However, whereas social identity is concerned with determining
how individuals form this self-concept, collective identity is concerned with choice
among potential social identities (Ashmore, Deaux, & McLaughlin-Volpe, 2004). For
example, a homogeneous group could define itself as women (Ray & Korteweg,
1999), mothers (Ray & Korteweg, 1999), bisexuals (Slagle, 1995), Indians (Bacon,
1999), South Asians (Bacon, 1999), or Asians (Bacon, 1999). Furthermore, collective
identity focuses on the group. A collective identity endures beyond the life of any
individual member (Ashmore et al., 2004; Triandafyllidou & Wodak, 2003). Most
research focuses on its role in social movements, be they worker (Brown &
Humphreys, 2002), homosexual (Slagle, 1995), racial (Bacon, 1999; Ogden & Hilt,
2003; Suleiman, 2002), feminist (Ray & Korteweg, 1999), or political (Bernstein,
2005; van Aalst & Walgrave, 2002).
Figure 7 Posts on sci.psychology.psychotherapy.moderated: Nov. 6-Dec. 14, 2004.
Figure 8 Factors predicting successful moderated virtual communities: an adversarial collective identity, together with a nonadversarial target collective identity, produces silencing speech, which leads to the formation of a successful moderated virtual community.
The particular collective identity an individual affiliates with depends on the
company kept, and more importantly, what is communicated in that social circle
(Hardy et al., 2005). Communication shapes not only identification with the group,
but also an understanding of what it means to be a member of the group (Polletta,
1998; Weiss, 2003). For example, the self-concept of first and second generation Asian
Indian-Americans is markedly distinct. First-generation Indian-Americans separate
themselves along religious and caste lines partly because language in the social groups
they inhabit reinforces those distinctions. In contrast, the social groups of second-
generation Indian-Americans emphasize and reinforce that all Indian-Americans are
together, regardless of religion or caste (Bacon, 1999).
A disruption of communication processes caused by formal regulation leads to a
dysfunctional collective identity. For example, the fall of the Berlin Wall has been
attributed to an East German realization of a collective identity. Prior to 1989, formal
regulations against public speech meant East Germans did not realize that the
majority shared their dissatisfaction with the communist government (Fulbrook,
2002). The mass migration of young East German workers to West Germany in 1989
communicated this collective identity, precipitating the Leipzig demonstrations that
resulted in German reunification (Pfaff, 1996).
Arguments against the use of formal governance in virtual communities follow this
logic. As virtual communities are experienced only on screen, the sole method of
communication is via posters’ speech in the form of text and images. The formal
suppression of speech inhibits the formation of collective identity, because concepts
inherent to the collective identity cannot be communicated. However, this chain of
reasoning is problematic for two reasons. First, such reasoning assumes a virtual
community identity is equivalent to the collective identity of the members of the
virtual community. Virtual communities, like any other human-defined institution,
can have multiple, distinct, collective identities (Simon & Klandermans, 2001). An
American virtual community, like soc.culture.usa, could have members who vote
Democrat or Republican.
Second, that a policy has negative implications does not necessarily mean the
policy should not be deployed. For example, doctors amputate limbs to save patients
from gangrene. In the same way, a policy with negative implications should be
deployed if a failure to deploy the policy results in a worse outcome.
One danger of collective identity is that it is by nature dualistic. Collective identity
requires an ingroup and an outgroup. Those who share the collective identity are the
ingroup, and those who do not are the outgroup (Lamont & Molnar, 2002). In most
cases, this property of collective identity is inconsequential. There are Indians and
non-Indians, just as there are mothers and nonmothers. However, in some cases, the
"us" vs. "them" nature of collective identity manifests so that "they" are adversaries
(Bernstein, 2005; Kozinets & Handelman, 2004; Simon & Klandermans, 2001). White
supremacists, for example, define "them" with antipathy (Adams & Roscigno, 2005).
It is therefore possible for a virtual community to have members belonging to two
separate, mutually exclusive collective identities, one adversarial to the other. In such
a situation, the virtual community identity formed from the interaction between the
collective identities may be undesirable to one of the collective identities.
The research findings suggest that this dualistic conflict between collective
identities explains when moderation and formal governance help virtual commu-
nities succeed. For SCJM, restricting Mid-East political, non-Jewish religious, and
anti-Semitic speech suppressed the anti-Jewish collective identity and allowed Jewish
members to more freely discuss Jewish ideas. However, regulating speech does not
work when dualistic collective identities are absent. Although SCSM filtered "noise,"
most Singapore Usenet participants felt that was insufficient justification for its
existence. Three major themes on why virtual communities regulate speech emerged
from the analysis: (a) community identity versus collective identity, (b) how silencing
speech works, and (c) addressing silencing speech.
Community Identity and Collective Identity
The three virtual communities studied here vividly illustrate that a virtual
community can have multiple collective identities. SCJ was frequented not only by
Jews, but also neo-Nazis, Christian missionaries, and anti-Jewish Muslims. SCS
similarly was frequented not only by local Singaporeans, but overseas Singaporeans,
and non-Singaporeans. There was, for example, noticeable Malaysian and Indonesian
participation. Finally, SPP was inhabited by psychotherapists, the mentally unstable,
pedophiles, and Scientologists. The identity of these unmoderated communities was
characterized by the interactions of these separate collective identities.
The virtual communities that enacted moderation successfully (SCJM/SPPM) were
distinguished from the one that did not mainly by the identity of their unmoderated
equivalents. Both SCJ and SPP were marked by substantial conflict between collective
identities. Statistically, SCJM and SCJ had their own posting distributions, suggesting
they had separate identities. In SCJ, the identity was marked by conflict. In SCJM, by
Jewish discussion. In SCS, there would be sporadic conflict, but generally, a wide
variety of separate topics would be discussed. Furthermore, the statistical analysis
suggests the nature of conflict in SCS and SCSM were not materially different.
One key factor differentiating virtual communities that enact moderation
successfully is conflict between collective identities. In SCJ and SPP, some collective
identities defined themselves as adversaries of other (target) collective identities in the
virtual community. Neo-Nazis, and anti-Jewish Muslims were opposed to Jewish
existence. Christian missionaries wanted to convert Jews, i.e., transform the Jewish
collective identity into a Christian one. Similarly, Scientologists were opposed to
psychotherapy. Pedophiles, although not opposed to psychotherapy, opposed a
psychotherapy tenet. Specifically, they deliberately misrepresented psychotherapists as
recognizing pedophilia as a legitimate life choice. Psychotherapists recognized that
pedophilia was immoral and illegal, but felt it was not a disease.
Interestingly, the target collective identities did not define themselves as opposed to
their adversaries. The Jewish identity exists separately from the neo-Nazi, Christian,
and anti-Jewish Muslim identity. Similarly, psychotherapists do not define themselves
in relation to Scientologists or pedophiles.
Collective identity requires that the beliefs and concepts of the collective be
articulated. Furthermore, a demonstration of adversarialness requires a collective
identity to articulate its beliefs in the presence of target collective identities. The
target collective identity does not view itself in relation to the adversary, so this
articulated belief is viewed as noise (i.e., irrelevant) or a flame (i.e., offensive).
Somehow, this speech of the adversarial collective identity is silencing. It disrupts the
conversations of the target group thereby inhibiting the manifestation of the target
collective identity. As seen with SCJM and SPPM, when no adversarial relationship
exists, target collective identities manifest.
Adversarial collective identities need not hate their targets. Christian missionaries,
for example, do not hate Jews. Instead, the adversarial collective identity is opposed
to the existence of the target collective identity. A Christian missionary’s mission is to
transform the Jewish collective identity into a Christian one.
In the Singapore community, the various collective identities did not define other
collective identities as permanent adversaries. Malaysians and Singaporeans may
occasionally be in opposition, typically because of current political tensions.
However, once tensions ease, the conflict ceases. Similarly, some members of the
community oppose some collective identities. One quote in Table 4 demonstrates
anti-Cantonese and anti-Vietnamese sentiment. However, in no case did a collective
identity define itself as adversarial to another one; the anti-Cantonese/anti-
Vietnamese member never identifies with a group.
The presence of the adversarial collective identity and nonadversarial target causes
virtual communities to reject research advice and regulate speech. A virtual
community’s identity is normally defined by members. Over time, the virtual
community evolves a "sense of community," as participants' beliefs shape and are
shaped by community participation (Koh & Kim, 2001, 2003). If there is insufficient
overlap between the identity of the virtual community and a particular collective
identity, that collective identity emigrates and forms a new virtual community. For
example, when the Singapore Usenet community grew sufficiently large, it broke
from soc.culture.asean.
However, there can be no "sense of community" if the community space is
characterized by eternally opposing collective identities. Furthermore, emigration
fails to be a viable strategy when one collective identity defines itself by its adversaries.
To assert its collective identity, the adversarial group must pursue the emigre to its
new cyber-settlement. Thus, without soc.culture.jewish.moderated, Jewish Usenet
would largely be characterized by endless debate between Jews and non-Jews. The
creation of groups like talk.politics.mideast and alt.personals.jewish provide vivid
examples. The former group was created to remove topics of tangential interest to
Jews; the latter was created to provide a new private space for Jews. Neither helped
Jews talk about Jewish topics. In contrast, soc.culture.jewish.moderated enabled
cultural discussion on Jewish matters to continue, and various Jewish vs. non-Jewish
debates continue on soc.culture.jewish. Thus, moderation is necessary for the target
collective identity to assert itself.
How Silencing Speech Works
One unanswered question in the above discussion is how an adversarial collective
identity can inhibit speech by the target collective identity. Individuals participate in
virtual communities because virtual communities provide them with some kind of
utility, be it knowledge, social relationships, recreation, or transactional opportunity
(Armstrong & Hagel, 1996). To achieve such utility in a virtual community,
individuals must participate in conversations. In online forums, there are only two
ways to participate: (1) reading conversations, or (2) posting to them.
Silencing speech disrupts speech and therefore collective identity by making both
these activities unpleasant. Reading is made unpleasant because the reader must
filter out threads containing the silencing speech, which constitute the majority of
topics and are difficult to screen automatically. Once such topics are eliminated, the
reader must still deal with topics hijacked by the adversary collective identity.
Posting is made unpleasant because it requires the poster to relinquish some
anonymity. Even when a post is made under an assumed name, the poster
reveals his or her existence. Furthermore, one generally posts with the expectation
of receiving replies. Members of the adversary collective identity reply to the
post with unpleasant speech. As demonstrated in the psychotherapy case, these replies
are sufficiently unpleasant that they deter not only existing posters but also potential
posters from contributing to the collective identity.
The problem is aggravated because silencing speech is enduring and targeted.
Virtual communities without silencing speech encounter noise, which makes reading
difficult, and flames, which make posting difficult. However, because the noise is not
targeted at a particular collective identity, it has a degree of randomness, and thus can
be filtered. Similarly, flames are relatively short-lived and have a narrow focus. When
interest in the issue that caused the flame dies, the flame dies as well. Similarly, over
the life of the virtual community, flames will target a range of groups and individuals.
Thus, a representative of a random group in the virtual community will take offense
at some flame, but will read others and feel amusement. Harm, if any, done to a
collective identity is minimal. Occasionally, animosity may develop between
individuals so that one individual leaves. However, although the loss of small
numbers of individuals may influence the virtual community identity, the magnitude
of the loss in no way equals that associated with an assault on a collective identity.
With silencing speech, noise is targeted at disrupting the speech of a collective
identity. It cannot be filtered, because the silencing speech changes to overcome the
filter. Similarly, flames created by the adversarial collective identity are targeted at a
collective identity. Any member of the target group is negatively affected when he/she
reads the flame.
Furthermore, adversarial collective identities desire that the virtual community
identity be characterized by conflict. Conflict reinforces the adversarial aspect of the
collective identity. Thus, unlike the transient flames that occur in most virtual
communities, the flames of silencing speech are enduring. In short, the eternal
disruption of reading (via noise) and posting (via flames) by the adversarial collective
identity destroys communication by the target collective identity, preventing the
formation of the target collective identity.
Addressing Silencing Speech
It is therefore not possible for the target collective identity to embrace silencing
speech, ignore silencing speech or establish technical barriers against it. The target
collective identity cannot embrace silencing speech, because it does not define itself as
an adversary. Conflict diminishes the collective identity. Likewise, it cannot ignore the
silencing speech, because such speech silences the speech of group members, thereby
reducing collective identity. Finally, senders of silencing speech are driven to overcome
technical barriers and to harass virtual community members until they cannot be ignored. In the
Jewish case, disseminators of hate speech would hijack discussions or change
usernames. Human intervention is necessary.
Furthermore, individual-level protections against silencing speech are needlessly
costly: every member of a virtual community must update his or her filters each time
silencing speech innovates. Human-driven filters like moderation are thus more
attractive, because only one individual or group must respond to each
innovation.
Ironically, the best way for both an adversarial and target collective identity to
emerge is to form two virtual communities. One virtual community is moderated to
prohibit the silencing speech. The other is unmoderated to allow the adversarial
collective identity to utter its adversarial speech. A few members of the target
collective identity and supporters of the target collective identity will participate in
the unmoderated community, allowing the conflict to continue.
Note that the above discussion should not be construed as advocacy for regulation
of virtual communities by country governments. On the contrary, this research has
demonstrated that the virtual communities themselves have a mechanism (modera-
tion) sufficient for protecting themselves against silencing speech.
Conclusion
The research literature advocates that virtual communities impose minimal
authority over speech. However, many virtual communities enact and
enforce rules specifically to regulate speech in the community (i.e., moderation). This
paper explains moderation via a cross-case analysis of virtual communities. A
successfully moderated and an unsuccessfully moderated virtual community are
contrasted, and both are compared against their unmoderated equivalents. A third
case demonstrates that the findings are generalizable.
The analysis revealed that moderation is necessary to preserve collective identity in
virtual communities. It is argued that a virtual community’s identity comprises
multiple collective identities. Ordinarily, the virtual community’s identity is a function
of the interaction between these collective identities. In virtual communities, collective
identity is created and reinforced through conversation. In some cases, collective
identities define themselves as adversaries of target collective identities. When these
adversarial collective identities create and reinforce their identities online, they create
conversations that target collective identities view as flames and noise. Because the
flames and noise are enduring, they silence the speech of the target group, thereby
disrupting its collective identity. It was shown that alternative methods of screening
silencing speech, such as technological filters and simply ignoring it, do not
work. Formal governance is the only mechanism effective against silencing speech.
This research has identified silencing speech as one reason for moderation, and
suggests that distinct governance mechanisms must be deployed for different kinds of
virtual communities. Future research should classify virtual communities and
identify appropriate governance mechanisms for each class. For example, should hate
speech be governed in a manner distinct from polarized political speech or
missionary speech?
Acknowledgements
This paper received assistance from numerous sources. I would like to thank Pok
Hongling, Tay Yi Pei, and Tin Pay Yng for coding. I would also like to thank Andrew
Burton-Jones, Boh Wai Fong, Suay Bah Chua, Goh Kim Huat, Lim Wee Kiat, Mark L.
Gillenson, Cindy Levey, Mark Keil, Ron Rice, and Christina Soh for comments and
insights on earlier drafts of this paper. I benefited from comments from various
individuals at OASIS 2004, especially Lynette Kvasny and Noriko Hara. I am also
grateful to members of the Jewish Usenet community for their reading and commentary,
including Jonathan Baker, Ken Bloom, Henry Goodman, Chanoch Kesselman, Dan
Kimmel, David Roth, Moshe Schorr, and those who wish to remain anonymous.
Finally, I am grateful for interviews with members of the Singapore Usenet
community including Wynthia Goh, Tan Chong Kee, William Anthony Timmins,
and Xiao Jinhong. Any mistakes or omissions are the sole responsibility of the author.