Vol. 2, No. 1, January-July 2009
ISSN : 0974-0600
Homogenization, Gregg A. Payne 199-208
21 Surveillance, Control, and Privacy on the Internet: Challenges to Democratic Communication, Lauren B. Movius 209-224
22 The transformation of political communication in Mexico (1994-2006), German Espino 225-247
23 No More Bowling Alone: When social capital goes digital, Anders Svensson 248-263
24 I am P/Ninoy: Filipino Diaspora, Technology and Network Nationalism, Reggy Capacio Figer 264-278
25 Manipulation, Informative Control and Iraq War, Aurora Labio Bernal 279-288
26 President Michelle Bachelet and the Chilean Media: A Complicated Affair, Claudia Bucciferro 289-312
27 The Function of Blogs in Democratic Discourse, Ming Kuok LIM 313-326
Professor Naren Chitty AM, with Her Excellency Professor Marie Bashir, Governor of New South Wales, after the investiture ceremony for the Order of Australia at Government House, Sydney, on May 6, 2009.

The International Team of GCRA congratulates Professor Naren Chitty, Founder President of GCRA, on winning the Order of Australia award for "services to education, particularly in the field of international communication as a researcher and academic, and to a range of professional associations".
(January to June 2009)
Mohammad Sahid Ullah, Associate Professor
Department of Communication and Journalism, Chittagong University, Chittagong-4331, Bangladesh
Phone: 88-01554-352573, Mobile: 88-01819-333539, Fax: 88-031-726310, Email: [email protected]
Special issue on Media and Democratization
and News Content Homogenization
Gregg A. Payne, Ph.D.
Department of Communication Studies, Chapman University
1 University Drive, Orange, CA 92866, USA
01.714.997.6815, [email protected]
Abstract
This paper is, in part, a response to Gandy's (1982) recognition of a need to go beyond the conventional borders of agenda-setting theory to examine who sets the media agenda, for what purposes, and with what consequences. Conventional conceptual explications of gatekeeping and agenda setting are revisited, and substantive modifications proposed. Theoretical linkages between the two are examined, together with the consequences for homogenization of mass media news content. A general content homogenization model is proposed that provides an explanatory and predictive framework for news analysis, irrespective of dominant social, political, and economic ideology. It is argued that gatekeeping controls over the agenda-setting process produce a homogenized news product that curtails opportunities for robust public discourse. McCombs's (2004) contention that agenda setting is an inadvertent by-product of the mass communication process is problematized.
Keywords: gatekeeping, agenda setting, media, news, homogenization, information
Introduction
This paper argues that mass media news content is a product of a gatekeeping hierarchy whose dictates determine news frames (Ghanem, 1997; Takeshita, 1997), specify exemplars associated with priming, and imbue media and public agendas with the issues and attributes that generate first- and second-level agenda-setting effects. The architecture of the gatekeeping process and its agenda-setting outcomes, it is argued, results in homogenization of mass media content, the marginalization of minorities, curtailment of expression of dissident viewpoints, naturalization of a distorted reality, and restriction of the dialectical possibilities available for public discourse. The discussion here is, in part, a response to Gandy's recognition of the need to look beyond agenda setting to reveal the forces that set the media agenda, the purposes for which it is set, and the resulting influence on the distribution of social power and values (Gandy, 1982, p. 7). In general, discussions of the media agenda treat the phenomenon as a spontaneous event, lacking any antecedent generative force. In fact, the antecedents are several, a number of which are identified here. It is also suggested that there is little evidence to support McCombs's contention (2004, pp. 11, 19) that agenda setting is an inadvertent by-product of the mass communication process.
Journal of Global Communication, Vol. 2, No. 1, Jan.-July 2009, pp. 199-208
A General Process Model of Content Homogenization (figure 1) is advanced. The model reflects a theoretical synthesis of gatekeeping and agenda-setting processes in the production of largely undifferentiated, conservatively biased mass communication content. It is argued that the model is subject to validation through hypothesis testing of the relationships it postulates. In addition to authenticating the proposed model, the expectation is that empirically based research will have substantive implications for the advancement of agenda-setting theory and its relation to news content homogenization.
For those whose primary interests lie elsewhere, a brief lexicon of relevant terms may be helpful. Framing is the ideological lens through which environmental phenomena are viewed in the construction of news. Priming is a psychological construct suggesting that media content makes salient for news consumers issues and their attributes, which are subsequently reflexively consulted as typifications (Zillmann, 2002; Willnat, 1997). Issues are events or activities featured in news coverage; attributes are specific qualities of the issues. The notion of first-level agenda setting asserts that news content determines what people think about; second-level effects are located in how people think about what they think about.
While the theoretical work and derivative research by US scholars have produced a voluminous literature related largely to US news production, the global application to a variety of social, political, and economic circumstances has gone largely unexamined. The present paper demonstrates their relevance to news generation under a variety of ideological conditions, and addresses the oppressive consequences of what Schiller (1996, p. 87) has called the tyranny of gatekeepers. The global relevance of extant theoretical and empirical work is suggested by the influence of mass media in cultivating values, beliefs, attitudes, norms, and behaviors already present in a society, stabilizing and reinforcing conventional beliefs and behaviors, and, in the process, homogenizing public opinion, irrespective of the cultural context (Gerbner, Gross, and Signorielli, 1986).
While the analysis presented here argues for the global applicability of a content homogenization model, the illustrative exemplar employed involves the news product of US media. The contention is that mass media embedded in capitalist economic systems are profit-driven enterprises that subordinate the political welfare of nations to revenue generation (Bagdikian, 2004, 2000; Schiller, 1996; Herman and Chomsky, 1988; Jencks, 1987). More broadly, it is asserted that dominant ideology under any set of political, social, and economic conditions will dictate news content advantaging elites, without reference to the needs of the citizenry.
Gatekeeping
Ruminations on gatekeeping typically conjure mental images of White's (1950) wire service editor, a conscientious employee located somewhere in the lower strata of a management hierarchy, diligently engaged in vetting wire service copy, selecting some for publication and rejecting the rest, based upon a subjective, idiosyncratic assessment of news value. Like White's Mr. Gates, gatekeepers, historically, have been cast as relatively low-level, well-intentioned functionaries in the news production process.
202 Information Control and Imperiled Public Discourse: A General Process....
Such a perspective, however, tends to both sanitize and trivialize a set of relationships that are considerably more complex and insidious. Unaccounted for is a multi-layered process driven by elite priorities that produces a homogenized, information-deficient public agenda hospitable to dominant ideology, delimiting the topics and perspectives available for debate.
Primary-level Gatekeeping
Gatekeeping is a tripartite process. Primary-level gatekeeping involves an external locus of control residing with social, political, and economic power centers and their individual and institutional spokespersons, who control the information available to the media. Among the consequences is the establishment of a media agenda and a macro-level frame stipulating an acceptable ideological context for news presentation. The governing aristocracy includes what McChesney has referred to as homogenized ownership (2004, p. 47). There is a vested interest in making available to the media and, ultimately, the public only information that is supportive of the status quo and, in both topical and ideological content, not inimical to corporate well-being.
Secondary-level Gatekeeping
Secondary-level gatekeeping is an internal function involving publishers and senior editors, whose content decisions reflect the priorities of primary-level gatekeepers and the dominant cultural viewpoint (Paul and Elder, 2006, p. 10; Gans, 2003, pp. 24, 198; Gitlin, 2003, pp. 5, 40, 80, 95, 274; Bagdikian, 2000, pp. 17-18). Decisions made at this level translate into events and activities selected for coverage, and treatments that constitute micro-level framing, achieved, in part, by careful selection of sources expounding elite perspectives (Herman and Chomsky, 1988). The priming process that prioritizes for audiences the content of the media-produced public agenda is initiated here. Involved is a two-dimensional decision-making process. In the first step, choices are made that produce selective coverage of environmental phenomena extracted from a universe of possibilities (Kim and McCombs, 2007; Kim, Scheufele, and Shanahan, 2002; Nelson and Kinder, 1996; Scheufele, 2000; Scheufele and Tewksbury, 2007). The second step involves decisions about how the news is played. The choices made dictate the prominence of issues within the larger context of the news product, and are central to establishing salience in the public agenda. For both print and electronic media, the consequences of these decisions are observable in the placement of stories, coupled with the space or time afforded them.
Tertiary-level Gatekeeping
Priming and framing find their final, micro-level configuration in the product of tertiary-level gatekeepers, whose function is to generate content. Their contribution to homogenization is a consequence of top-down pressure, and of their social and economic location among the middle and upper-middle classes (McChesney, 2004, pp. 98-137). Newsroom socialization (Breed, 1955), personal predilections, and a survival instinct compel compliance with the dictates of superiors. The framing and priming effects of primary- and secondary-level gatekeeping influences are realized in the rendering of news. The homogenizing consequences of this top-down coercion are realized in the finished news product.
Agenda Setting
Content admitted by gatekeepers for publication or broadcast is taken to have multiple agenda-setting impacts. It sets the public agenda that establishes both issue and attribute salience, the first a generalized account of some event or activity, the second specifying selected qualities of the issue (Takeshita, 1997). It also has first-level agenda-setting effects, dictating, at some level, what people think about as a consequence of mass media engagement, and second-level influences affecting how people think as a product of news framing by the gatekeeping establishment (McCombs, 2004, pp. 86-97; McCombs, Llamas, Lopez-Escobar, and Rey, 1997; McLeod, Becker, and Byrnes, 1974; McCombs and Shaw, 1972; Rosenberry and Vicker, 2009, pp. 150-153; Nelson and Kinder, 1996; Pan and Kosicki, 1993; Scheufele, 1999; Scheufele and Tewksbury, 2007).
From this perspective, agenda setting is a product of gatekeeping, and an intervening variable between gatekeeping and content homogenization. The seminal theoretical and empirical work in agenda setting originated with the Chapel Hill study executed by McCombs and Shaw (1972). The study, conducted during the 1968 US presidential election, investigated links between news content and voter perceptions of the most compellingly important issues of the time. It revealed significant correlations between media and public agendas, and produced evidence, contrary to the view then prevailing (Klapper, 1960; Berelson, Lazarsfeld, and McPhee, 1954; Lazarsfeld, Berelson, and Gaudet, 1948), that media content had profound effects. In the intervening decades, there has been considerable theoretical extension and intension growing out of research examining relationships between media and public agendas in a variety of social and other contexts. Effects in relation to differing audience characteristics have been studied, as have conditions, events, and people influencing the agenda (McCombs, Shaw, and Weaver, 1997).
While the media agenda is defined here in terms of information made available by elites to media practitioners, operationalization of the public agenda has traditionally been preoccupied with counting the frequency with which certain matters are reported, the ways in which they are framed or contextualized, and the presumptive influence on priming, or the salience of those matters in the public mind (McCombs, 2004, p. 87). Moreover, the conventional focus has been on first-level agenda setting, which is assumed to make salient an attitude object (Griffin, 2006, p. 401). The relatively recent conceptual extension of agenda setting to accommodate second-level effects has not adequately accounted for a multidimensional gatekeeping hierarchy. Second-level agenda setting suggests a covert transmission of ideology, conceding the long-denied possibility that media content may not only determine what is thought about, but also how salient issues are thought about (McCombs, Llamas, Lopez-Escobar, and Rey, 1997). The concession, coupled with the gatekeeping hierarchy described here, suggests that such effects are anything but inadvertent.
Content Homogenization
Content homogenization is a product of the relationships between gatekeeping and agenda setting. Conceptually, content homogenization, as the term is used here, suggests that the events and topics selected for news coverage, and the ideological perspectives with which they are infused, provide little in the way of diversity, contribute little to a free marketplace of ideas, and are inherently hegemonic (Gitlin, 2003, pp. 211, 271). They reflect the priorities of the relative few who dominate news production operations, including the very rich, chief executives, the corporate rich, senior members of the military, and the political directorate, all representing a relatively monolithic presence in their acceptance of a common set of values, beliefs, attitudes, perspectives, norms, rules, and behaviors (Mills, 1956; Gans, 1980, pp. 206-213). While the structural relationships and relative power vested in those social categories may vary from country to country and culture to culture, occupants of the categories inhabit the same castes that comprise media and other elites everywhere. Their machinations as superordinate gatekeepers dictate first the media agenda and, ultimately, the public agenda, with their ideological sensibilities reflected in the news product (Schudson, 2005; Gitlin, 2003). Among the consequences are control over the construction of social reality, and the capacity to eliminate conflicting perspectives (Berger and Luckmann, 1966, pp. 123, 128). Homogenization of news themes, topics, and treatments, whether in the US or globally, cannot be explained exclusively by reference to media time or space constraints, or to the predispositions of news workers occupying subordinate roles in a hierarchical gatekeeping structure.
Theoretical Links
The conceptual explications proffered here show gatekeeping to consist of three interrelated tiers, and agenda setting to be a multi-tiered process derivative of gatekeeping structures and processes. Additionally, framing is shown to be a hierarchical, dichotomous phenomenon consisting of macro- and micro-level framing predictive of cognitive structures associated with priming. All contribute to a calculus of power and control. Macro framing is conceptualized as the editorial process of accommodating news to elite ideology (Gitlin, 1980). Micro framing, circumscribed by the ideological parameters of the macro frame, accommodates transient exigencies in the evolution of news. Priming suggests a heuristic psychological process in which media references prompt recall of previously acquired information (Ghanem, 1997; Willnat, 1997; Zillmann, 2002; Roskos-Ewoldsen, Roskos-Ewoldsen, and Carpentier, 2002). Zillmann (2002, p. 27) advances the notion of a representativeness heuristic, arguing that media exposure results in an inductive process that generalizes from samples of events to populations of events. The relationship between content homogenization and priming is located in issue and attribute agendas. The first comprises the events and activities selected for coverage; the second, the qualities attributed to them.
Clarifying relationships involving the various levels of gatekeepers, levels of agenda setting, and agenda types is critical to an improved understanding of homogenizing influences both on and of media content. It is clear that primary-level gatekeepers have a vested interest in framing a media agenda that produces second-level effects consistent with the dominant ideology. The news production process involves micro-level framing at the secondary gatekeeping level, and priming at all levels. The public agenda is realized in the content choices of secondary-level gatekeepers and the treatments of tertiary-level gatekeepers, resulting in first-level and second-level effects consistent with the objectives of primary-level gatekeepers and the media agenda.
Ultimately, both public and media agendas reflect a hegemonic confluence of external and internal interests, driven by the prerogatives of power and the perquisites of the powerful. The consequences are typified by the conservative positions of those occupying senior status in the gatekeeping hierarchy, subscribed to as a matter of both organizational efficacy and self-preservation by subordinates. The resulting insular and parochial news product, characterized by a mendacious topical, thematic, and ideological sterility, imposes on consumers a restricted set of perceptual and cognitive filters. The outcome of the consequent information deprivation suggests media-imposed social control (Noelle-Neumann, 1984).
Consequences
The empirical implications of the reconceptualizations suggested here are several. They provide the space required to move away from the assumptions associated with traditional constructions of gatekeeping and agenda setting. Operationalization of the various levels of gatekeeping makes it possible to assess their relative impacts on news gathering and presentation, and their relationships with priming and framing. In particular, it becomes possible to examine the influences on agenda setting of gatekeeping structures under a variety of social, economic, and political conditions, to examine the relative influences of a hierarchically constituted gatekeeping establishment on subordinate gatekeeping roles, and to assess how the consequences comport with a range of normative mass media theories.
From the relationships modeled, several theoretically useful propositions emerge:
1) Media content is a product of economic, social, and political power exerted through primary-level gatekeeping.
2) Primary-level gatekeeping is committed to maintenance of the status quo.
3) Protecting the status quo is linked to news content reflecting the dominant ideology.
4) The application of power produces among secondary and tertiary-level gatekeepers a consensual definition of news consistent with that of primary-level gatekeepers.
5) The influence of the dominant ideological perspective is primarily attributable to second-level agenda setting.
6) Secondary and tertiary-level gatekeepers produce a public agenda supportive of the media agenda.
7) The public agenda is formulated as a homogenized news product consistent with dominant political, economic, and social ideologies.
Conclusion
The model and these propositions suggest a range of empirically testable hypotheses germane to the examination of news production structures and their impacts on news content under a global assortment of political, economic, and social conditions. The General Content Homogenization Model provides a foundation for empirical analysis of media as cultural artifacts in relation to structure, function, and bias arising out of dominant ideological commitments. Additionally, it enables examination of the relative influences of multiple, sometimes conflicting, ideological positions. An example can be located in US media attempts to reconcile the schizoid tension between satisfying the expectations of a social contract in which a free press is expected to contribute to the development, maintenance, and repair of democracy, and the competing, and more compelling, profit production mandates of a capitalist economy.
One of the consequences is that the variegated subtleties and complexities inherent in a free marketplace of ideas, and indispensable to robust public discourse, are collapsed into a pedestrian, monochromatic narrative where democracy and capitalism become isomorphic, with democratic ends attainable only through capitalistic means. Potential alternative realities are left unexamined, and consequently absent from civil discourse. What emerge are narrowly circumscribed media and public agendas that, at both primary and secondary levels, are antithetical to democratic process, but may produce precisely the outcomes coveted in authoritarian political circumstances.
References
Bagdikian, B. (2000). The media monopoly (6th ed.). Boston: Beacon Press.

Bagdikian, B. (2004). The new media monopoly. Boston: Beacon Press.

Berelson, B., Lazarsfeld, P., and McPhee, W. (1954). Voting. Chicago: University of Chicago Press.

Berger, P., and Luckmann, T. (1966). The social construction of reality: A treatise in the sociology of knowledge. New York: Anchor Books.
Biagi, S. (2007). Media/Impact: An introduction to mass media. Belmont, CA: Thomson/ Wadsworth.
Breed, W. (1955). Social control in the newsroom: A functional analysis. Social Forces, 33, pp. 326-335.
Crouse, T. (1972). The boys on the bus: Riding with the campaign press corps. New York: Random House.
Gandy, O. (1982). Beyond agenda setting: Information subsidies and public policy. Norwood, NJ: Ablex.
Gans, H. (1980). Deciding what’s news: A study of CBS Evening News, NBC Nightly News, Newsweek, and Time. New York: Vintage Books.
Gans, H. (2003). Democracy and the news. Oxford: Oxford University Press.
Gerbner, G., Gross, L., and Signorielli, N. (1986). Living with television: The dynamics of the cultivation process. In J. Bryant and D. Zillmann (Eds.), Perspectives on media effects. Hillsdale, NJ: Lawrence Erlbaum Associates.
Gitlin, T. (1980, 2003). The whole world is watching: Mass media in the making and unmaking of the new left. Berkeley: University of California Press.
Ghanem, S. (1997). Filling in the tapestry: The second level of agenda setting. In M. McCombs, D. Shaw, and D. Weaver (Eds.), Communication and democracy: Exploring the intellectual frontiers in agenda-setting theory (pp. 3-14). Mahwah, NJ: Lawrence Erlbaum Associates.
Griffin, E. (2006). A first look at communication theory. Boston: McGraw-Hill.
Herman, E., and Chomsky, N. (1988). Manufacturing consent: The political economy of the mass media. New York: Pantheon.
Jencks, C. (1987). Should news be sold for profit? In D. Lazere (Ed.), American media and mass culture: Left perspectives (pp. 565-567). Berkeley, CA: University of California Press.
Kim, K., and McCombs, M. (2007). News story descriptions and the public's opinion of candidates. Journalism and Mass Communication Quarterly, 84(2), pp. 299-314.
Klapper, J. (1960). The effects of mass communication. New York: Free Press.
Lazarsfeld, P., Berelson, B., and Gaudet, H. (1948). The people’s choice. New York: Columbia University Press.
McChesney, R. (2004). The problem of the media: U.S. communication politics in the 21st century. New York: Monthly Review Press.
McCombs, M. (2004). Setting the agenda: The mass media and public opinion. Cambridge, UK: Polity Press.
McCombs, M., Llamas, J., Lopez-Escobar, E., and Rey, F. (1997). Candidate images in Spanish elections: Second-level agenda-setting effects. Journalism and Mass Communication Quarterly, 74, pp. 703-717.
McCombs, M., and Shaw, D. (1972). The agenda setting function of mass media. Public Opinion Quarterly, 36, 176-187.
McCombs, M., Shaw, D., and Weaver, D. (1997). Communication and democracy: Exploring the intellectual frontiers in agenda-setting theory. Mahwah, NJ: Lawrence Erlbaum Associates.
McLeod, J., Becker, L., and Byrnes, J. (1974). Another look at the agenda-setting function of the press. Communication Research, 1, pp. 131-166.
McClellan, S. (2008). What happened: Inside the Bush White House and Washington’s culture of deception. New York: Public Affairs.
Mills, C. (1956). The power elite. New York: Oxford University Press.
Nelson, T., and Kinder, D. (1996). Issue framing and group-centrism in American public opinion. Journal of Politics, 58, pp. 1055-1078.
Noelle-Neumann, E. (1984). The spiral of silence: Public opinion — our social skin. Chicago: University of Chicago Press.
Pan, Z., and Kosicki, G. (1993). Framing analysis: An approach to news discourse. Political Communication, 10, pp. 55-75.
Paul, R., and Elder, L. (2006). How to detect media bias and propaganda. Tomales, CA: Foundation for Critical Thinking.
Rosenberry, J., and Vicker, L. (2009). Applied mass communication theory: A guide for media practitioners. Boston: Pearson.
Roskos-Ewoldsen, D., Roskos-Ewoldsen, B., and Carpentier, F. (2002). Media priming: A synthesis. In J. Bryant and D. Zillmann (Eds.), Media effects: Advances in theory and research (pp. 97-120). Mahwah, NJ: Lawrence Erlbaum Associates.
Schudson, M. (2005). Four approaches to the sociology of news. In J. Curran, & M. Gurevitch (Eds.), Mass media and society (pp. 198-214). London: Hodder Arnold.
Scheufele, D. (1999). Framing as a theory of media effects. Journal of Communication, 49, pp. 101-120.
Scheufele, D. (2000). Agenda-setting, priming, and framing revisited: Another look at cognitive effects of political communication. Mass Communication and Society, 3(2/3), pp. 297-317.
Scheufele, D., and Tewksbury, D. (2007). Framing, agenda setting, and priming: The evolution of three media-effects models. Journal of Communication, 57, pp. 9-20.
Kim, S., Scheufele, D., and Shanahan, J. (2002). Think about it this way: Attribute agenda-setting function and the public's evaluation of a local issue. Journalism and Mass Communication Quarterly, 79(1), pp. 7-25.
Schiller, H. (1996). Information inequality: The deepening social crisis in America. New York: Routledge.
Takeshita, T. (1997). Exploring the media’s roles in defining reality: From issue-agenda setting to attribute-agenda setting. In M. McCombs, D. Shaw, and D. Weaver (Eds.), Communication and democracy: Exploring the intellectual frontiers in agenda-setting theory (pp. 3-14). Mahwah, NJ: Lawrence Erlbaum Associates.
White, D. (1950). The gatekeeper: A case study in the selection of news. Journalism Quarterly, 27(3), pp. 383-390.
Willnat, L. (1997). Agenda setting and priming: Conceptual links and differences. In M. McCombs, D. Shaw, and D. Weaver (Eds.), Communication and democracy: Exploring the intellectual frontiers in agenda-setting theory (pp. 51-66). Mahwah, NJ: Lawrence Erlbaum Associates.
Zillmann, D. (2002). Exemplification theory of media influence. In J. Bryant and D. Zillmann (Eds.), Media effects: Advances in theory and research (pp. 19-42). Mahwah, NJ: Lawrence Erlbaum Associates.
Lauren B. Movius
Annenberg School of Communication, USC 3502 Watt Way, Los Angeles 90089
Cell: 714.362.1328, Email: [email protected]
Abstract
The Internet represents a medium for both liberty and control. It is often assumed to have an inherently democratic nature and to be a force for democracy. However, undemocratic uses of the Internet exist as well, even by democratic regimes of the West. The Internet can be used as a tool of control and dominance; it can increase government power, enhancing governments' ability to monitor their citizens and potentially control individuals. This article examines national security and individuals' privacy from U.S. government surveillance, in the context of the Internet after September 11, 2001. Government control of the Internet, however, is not simply a response to or a result of the terrorist attacks of 9/11; the U.S. government has always sought to increase its control over information and technology. The article documents several examples of attempts to control the Internet and communications prior to 9/11 and argues that the events of 9/11 provided the justification necessary to enact legislation broadening surveillance powers. The article then discusses how surveillance technologies work and examines the key actors involved in surveillance.
Keywords: Internet, privacy, communication, democracy, dominance, surveillance
Introduction
The Internet represents a medium for both liberty and control. When the Internet first emerged, many celebrated its potential for autonomy and freedom, since governments could seemingly do little to control the borderless network (Barlow: 1996). The Internet is seen by many as a means of freedom of expression and a "kind of democratization of access for all" (Mathiason: 2009: xiv). The Internet is assumed to have an inherently democratic nature and to be a force for democracy. Indeed, a link between technological advance and democratization remains a strong assumption in popular thinking (Kalathil & Boas: 2003). However, undemocratic uses of the Internet exist as well, even by democratic regimes of the West (Vegh: 2006). The Internet can be used as a tool of control and dominance; it can increase government power, enhancing governments' ability to monitor their citizens and potentially control individuals. Therefore, loss of privacy and anonymity on the Internet is an area of concern in need of investigation and analysis.
The development of the Internet as a networked global communications medium and the extent to which people use it have produced a qualitative change in the nature of communications, and in the nature and amount of information which is exposed to interception and surveillance.
As a result of the digital revolution, many aspects of life are now captured and stored in digital form. Indeed, it is rare for a person in the modern world to avoid being listed in numerous databases (Diffie & Landau: 1998). Much of the resulting scholarship addresses individuals' privacy from companies, framed in terms of privacy and economic efficiency (Movius & Krup: 2009; Varian: 1996). Focusing on the U.S. context, however, this article examines national security and individuals' privacy from government surveillance, in the context of the Internet after September 11, 2001 (9/11).
Surveillance is not new, and the focus on the post-9/11 period is not meant to suggest that it represents a new type of surveillance. As will be discussed below, there is a long history of government surveillance. New technology, however, has led to new forms of surveillance, and has also contributed to the rise of surveillance studies over the last two decades (Lyon: 2006).
The period since 9/11 is of particular concern, since this legitimated the expansion of surveillance (Ball & Webster: 2003). Indeed, 9/11 “encouraged an alignment of actors, organizations, debates and viewpoints, from different policy and academic spheres, all of which featured surveillance as a germane issue. Accordingly, national security was constructed as relevant to public and private sector positions on… Internet security…with privacy issues temporarily taking a back seat” (Ball & Webster: 2003:9).
In order to understand the significance of 9/11 to Internet surveillance, we must consider the situation prior to 9/11. Thus, the article first discusses U.S. government attempts to conduct surveillance of, or to control, the Internet before 9/11, in order to support the argument that surveillance after 9/11 was expanded not simply to increase security, but to increase government control. Second, I discuss why the Internet has historically been difficult to control, owing to technological and institutional factors. Third, I analyze how the U.S. government sought to overcome these technological and institutional challenges by enacting new legislation, notably the Patriot Act, and through the use of new technology. I argue that 9/11 provided a window of opportunity to enact these changes, and that the U.S. government used social alarm to pass legislation. Fourth, I discuss some limits of technology. Fifth, I consider the effects of increased surveillance and question whether it helps to reduce terrorism and increase security.
Attempts to Control Communication Prior to 9/11
Governments have always sought to control communication and information. Indeed, “control of information has been the essence of state power throughout history, and the U.S. is no exception” (Castells: 2001:169). The U.S. government, like all governments, seeks to maximize its control of communications within the limits of institutional constraints. Government control of the Internet is not simply a response to the terrorist attacks of 9/11; several government surveillance programs existed well before 9/11.
U.S. government surveillance has been well documented through the release of documents under Freedom of Information Act requests. Ward Churchill and Jim Vander Wall (1990) document the scope of the FBI Counter Intelligence Program’s surveillance of various domestic social movements between 1957 and 1974, and Sasha Costanza-Chock (2004) provides a historical overview of state surveillance of social movements in the US. The Senate investigation
known as the Church Committee investigated the FBI Counter Intelligence Program and other surveillance programs during the 1970s. This investigation revealed details of “domestic intelligence activities [that] threaten to undermine our democratic society” (Church Committee: 1976:1). The Committee recommended that the “CIA, NSA, the Defense Intelligence Agency, and the armed services be precluded, with narrow and specific exceptions, from conducting intelligence activities within the United States, and that their activities abroad be so controlled as to minimize the impact on the rights of Americans” (Diffie & Landau: 1998:121). Since these recommendations, limits on surveillance have been eroded, and this process began well before 9/11.
Turning to electronic communication, federal agencies had legal powers to monitor e-mail and computers well before the Patriot Act. Two sources of authority for wiretapping exist in the US and set the framework for U.S. electronic-surveillance law: the Federal Wiretap Act, also referred to as Title III, and the Foreign Intelligence Surveillance Act (FISA) of 1978. The Federal Wiretap Act was adopted in 1968 and expanded in 1986. The Electronic Communications Privacy Act of 1986 updated Title III and FISA to apply to electronic communications and allowed for “roving wiretaps”: wiretaps with unspecified locations, under which the government can tap any phone or Internet account that a suspect uses. Under the Electronic Communications Privacy Act, law enforcement needs only a search warrant, rather than a more stringent wiretap warrant, to access stored communications. This roving wiretap authority was granted in 1986 and broadened in 1999. The Patriot Act added roving tap authority to FISA and made a number of significant changes that have led to an increase in FISA investigations (Jaeger, Bertot, & McClure: 2003).
In 2007, wiretap applications under Title III increased 20 percent over 2006. The Administrative Office of the United States Courts publishes annual Wiretap Reports on the wiretap activity of federal, state, and local police, and there was a nearly steady increase in the use of wiretaps from 1994 to 2007, with applications rarely being denied. It is important to note that the Wiretap Reports only include data for Title III electronic surveillance and do not include intercepts regulated by FISA. Since the early 2000s, FISA wiretap orders have constituted a majority of federal wiretaps (Rule: 2007).
In addition to the above examples of government surveillance, the U.S. Congress and the U.S. Justice Department attempted to gain legal control of the Internet through the 1996 Communications Decency Act. The rationale for the Act was that it was necessary to protect children from sexual indecency on the Internet. Here, too, social alarm - the need to protect children from sexual predators and indecent material online - was used in proposing legislation. Many people saw this as the “first great attack on cyberspace” (Goldsmith & Wu: 2006: 19). The law was seen as an attack because it threatened the Internet’s openness to both children and adults without discrimination, and this openness was seen as the Internet’s strength. The Act was ruled unconstitutional by the Supreme Court in 1997, in a 7-2 vote, because it was overly broad and could chill speech unrelated to protecting minors.
Title III wiretap applications, 1994-2007 (Source: Administrative Office of the United States Courts, Wiretap Reports)

Year    Wiretap Applications    Authorized    Denied
1994    1154                    1154          0
1995    1058                    1058          0
1996    1150                    1149          1
1997    1186                    1186          0
1998    1329                    1327          2
1999    1350                    1350          0
2000    1190                    1190          0
2001    1491                    1491          0
2002    1359                    1358          1
2003    1443                    1442          0
2004    1710                    1710          0
2005    1774                    1773          1
2006    1839                    1839          0
2007    2208                    2208          0
Thus, it is not correct to view 9/11 as generating surveillance measures and a completely new surveillance landscape. Instead, there is a history of surveillance, with surveillance systems being broadened after 9/11. Lyon contends that surveillance societies already existed in many “democratic” countries, and that 9/11 produced “socially negative consequences that hitherto were the stuff of repressive regimes and dystopian novels or movies” (2003: 15). The events of 9/11 and the “war on terror” justified these measures. Before examining attempts by the U.S. government to control the Internet after 9/11, an understanding of the characteristics and obstacles of controlling the Internet may be useful.
Control of the Internet
When the Internet first appeared, it was widely believed that the Internet could challenge the authority of the nation state and that, because of its borderless nature, it lay beyond government control (Barlow: 1996; Johnson & Post: 1996). Nicholas Negroponte, co-founder of MIT’s Media Lab, argued, “The Internet cannot be regulated. It’s not that laws aren’t relevant, it’s that the nation-state is not relevant” (Higgins & Azhar: 1996). Unlike other networks, such as the telephone network, the Internet does not depend on a central server. Instead, the Internet is a network of networks, with no single central authority. Its decentralized routing system was designed to carry messages from point to point even if intermediate exchanges are blocked. As John Gilmore famously stated, “The net interprets censorship as damage, and routes around it.”
The lack of centralized control on the Internet is due to historical factors (Abbate: 1999), as well as the Internet’s technical architecture. Let us now consider why network architecture matters and how it made the Internet initially difficult to control. Design features do not necessarily
come about because they represent the best technical option. Instead, the architectural design of an information system is a choice, and political and economic forces shape these choices. The founders of the Internet self-consciously built a network with open architecture through the “end-to-end” principle (Abbate: 1999). Thus, the Internet founders “embraced a design that distrusted centralized control. In effect, they built strains of American libertarianism…into the universal language of the Internet” (Goldsmith & Wu: 2006: 23).
The end-to-end principle has evolved from the original notion of where to put and not to put functions in a communication system (Saltzer, Reed, & Clark: 1984), and it has come to address issues such as maintaining openness, maintaining user choice, and increasing reliability (Kempf & Austein: 2004). The end-to-end principle “grew over time to become a (if not the) foundation principle of the Internet architecture” (Kempf & Austein: 2004). Because of the end-to-end nature of the Internet, intelligence, as well as control, is decentralized. Messages on the Internet are broken up into packets of data, and the network routes them via the most efficient path, regardless of the packets’ content or origin. Thus, the network is said to be “dumb”, and intelligence lies on the edges of the network.
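The routing behavior described above can be sketched in a few lines. This is a toy illustration, not a real protocol implementation: the point is only that a “dumb” router forwards packets by destination address, never by content, so intelligence (and any content-aware control) must live at the end points. All names and addresses here are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src: str
    dst: str
    payload: str  # the router never reads this field

class DumbRouter:
    def __init__(self, routing_table):
        # routing_table maps destination prefixes to next hops
        self.routing_table = routing_table

    def next_hop(self, packet):
        # Route purely on the destination address; what the payload
        # means is known only to the end points.
        for prefix, hop in self.routing_table.items():
            if packet.dst.startswith(prefix):
                return hop
        return "default-gateway"

router = DumbRouter({"10.0.": "link-A", "192.168.": "link-B"})
p = Packet(src="10.0.0.5", dst="192.168.1.9", payload="any content at all")
print(router.next_hop(p))  # the payload plays no role in the decision
```

Inserting surveillance or censorship into such a network means breaking this content-blindness somewhere in the middle, which is exactly the tension discussed below.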
In addition to its technological architecture, the Internet developed in the United States and is thus under the constitutional protection of the First Amendment. The First Amendment limits the government’s ability to regulate speech, and nearly everything on the Internet is potentially “speech”. As discussed above, in Reno v. ACLU the U.S. Supreme Court found the Communications Decency Act to violate the First Amendment, affording Internet-related matters the strongest First Amendment protection.
While the Internet represented a site of freedom and liberty on these technological and institutional grounds, new technologies and regulations challenged those obstacles to control. The U.S. government attempted to circumvent such limits through, first, changes in legislation driven by social alarm, and second, the development of new technologies.
New Legislation: U.S. Patriot Act
After 9/11, the U.S. government called for increased surveillance in order to protect against future terrorist attacks. The key initiative that emerged to augment government access to information was the U.S. Patriot Act. The Patriot Act was introduced less than a week after 9/11 and signed into law by President Bush on October 26, 2001. The Act qualitatively extended the government’s electronic-surveillance capabilities. It allows the U.S. government to investigate both citizens and non-citizens and to engage in surveillance, with many elements affecting communication on the Internet, leading to increased surveillance and control of the Internet.
The Act threatens civil liberties and potentially gives government the power to suppress the free exchange of knowledge. With civil rights and communication rights in the digital environment being eroded, dissemination of electronic material may be inhibited through censorship, and interaction in the public sphere may be limited (Ó Siochrú: 2005). There are considerable implications for online privacy. For example, the Patriot Act increases the ability of law enforcement to authorize the installation of pen registers and trap and trace devices.
Use of Social Alarm
Increased surveillance and the use of high-technology surveillance have grown in the context of the new global politics of terrorism. Events such as 9/11 are referred to by scholars as “trigger crimes” (Innes: 2001), which allow for the introduction of new technologies with less public debate than usual, since such technologies are perceived to be a necessary response. Thus, the events of 9/11 led to a “surveillance surge” (Ball & Webster: 2003), and they legitimated existing trends. For example, many of the provisions of the Patriot Act relating to electronic surveillance were not new; they had been proposed before 9/11, when they were subject to much criticism and debate (EPIC: 2005a). The Justice Department had been lobbying for the power to conduct “secret searches” long before the 9/11 attacks took place and terrorism became a justification for enacting such legislation. Furthermore, the Act was proposed only a week after 9/11, and it has been argued that “details of this complex and far-reaching expansion of investigative powers were prepared and ready to be put forward before the events of September 11 — as surveillance interests awaited an auspicious moment” (Rule: 2007: 55).
The social alarm generated by the events of 9/11 allows government to do things it would otherwise not be allowed to do, as the enactment of legislation and the introduction of new technology are met with less public debate. In 2000, when the FBI’s Carnivore program, which scanned and recorded network traffic and “wiretapped” the Internet (King & Bridis: 2000), became public, critics charged that the system could capture more information than the government was entitled to under the limited subpoenas used for pen registers and trap-and-trace devices. Carnivore was extremely controversial, and a great deal of concern was expressed by members of Congress, who stated their intent to examine the issues and draft appropriate legislation (EPIC: 2005b). Former Attorney General Reno announced that issues surrounding Carnivore would be considered by a Justice Department review panel and that its recommendations would be made public. That review, however, was not completed prior to 9/11. As a result, Congress had no findings or recommendations before it when it enacted the Patriot Act. Thus, the 9/11 terrorist attacks provided social alarm and a way for government to sidestep the concerns about Carnivore that were prominent before 9/11.
The use of social alarm is exemplified by the administration’s argument for the necessity of the Patriot Act. Without appropriate policy responses to the terrorist attacks, the argument went, there would be serious consequences for national security and public safety. Attorney General Ashcroft warned that further terrorist acts were imminent and that Congress could be blamed for such attacks if it failed to pass the bill immediately. Congressional leaders warned that the legislation was the only way to protect against terrorism and that the Act had to be passed within days, which it was.
Technologies of Surveillance
The above discussion of the end-to-end principle focused on how this design feature made it difficult to exert control over the network. The end-to-end principle is one of the reasons the Internet has been so successful (Auerbach: 2004). However, it also enables harmful activities, and there is a constant tension between the need to control or limit such activities and the need not to compromise the Internet’s core architecture.
As the Internet grew in scale and importance, several factors challenged the end-to-end principle (Clark & Blumenthal: 2000). The Internet was developed by a community of users. Trust between end users, and authentication of end nodes, was not a concern; it was assumed that end nodes would cooperate to achieve mutually beneficial action, as the end-to-end principle implies. However, with the growth in the number of Internet users and the network’s commercialization, some end users’ motivations are not ethical and may include attacks on the network or on individual end points. Another challenge to the end-to-end principle arises when governments or corporate network administrators seek to interpose themselves between two parties in an end-to-end conversation. For example, a government may claim the right to wiretap, thereby inserting itself in a communication between two end nodes. Censorship or surveillance on the Internet violates the end-to-end principle, since the principle holds that intelligence should reside at the end points, not in the middle of the network. Control in the middle of the network, such as China’s “Great Firewall”, rather than control at the user level, is argued to limit the growth of the network, since the end-to-end principle has been credited with the rapid growth of the Internet throughout the world (Deibert, Palfrey, Rohozinski & Zittrain: 2008).
Thus, as the Internet developed, some of the original design features that had made control difficult began to be challenged. Additionally, surveillance devices and systems were introduced after 9/11. Four main ways to improve technological surveillance have been proposed since 9/11: biometrics, such as iris scans or fingerprints; identification cards with embedded programmable chips; closed-circuit television (CCTV) enhanced with facial recognition software; and communication measures such as wiretaps and Web-based surveillance (Lyon: 2003). Most of these technologies were not new. For example, retinal scans had been tested for years in the context of bank machines and were deployed at airports for security after 9/11 (Lyon: 2003). Other measures, such as increased wiretaps, had to await legal change, which the Patriot Act provided, before they could be implemented.
In order to differentiate between types of electronic surveillance, it is useful to categorize the spectrum of intelligence gathering activity. While distinctions are not always clear, I will briefly outline the major categories so that we may better understand the vast landscape of surveillance technologies. The Director of National Intelligence lists six main categories of intelligence: human intelligence (HUMINT), signals intelligence (SIGINT), imagery intelligence (IMINT), measurement and signature intelligence (MASINT), open source intelligence (OSINT), and geospatial intelligence (GEOINT).
Human intelligence is the most common form of intelligence gathering; information is collected by tracking or interviewing a subject of investigation. The organizations primarily responsible for the collection of human intelligence in the US are the Central Intelligence Agency and the Defense Intelligence Agency. Signals intelligence is gathered by intercepting electronic signals, such as radio and broadcast signals and forms of telecommunication such as e-mails or encrypted messages. Echelon is an example of signals intelligence, and the National Security Agency is the authority corresponding to this category. Because information may be encrypted, signals intelligence often involves cryptanalysis. Imagery intelligence includes representations of objects reproduced electronically or optically on film. Open source intelligence is publicly available information; the Director of National Intelligence and the National Air and Space Intelligence Center are the main collectors of open source intelligence. Geospatial intelligence is the analysis of security-related activities on the earth.
Major Actors
I will now discuss the major actors involved in surveillance and how surveillance processes work. U.S. government agencies involved in surveillance include the Federal Bureau of Investigation (FBI), the National Security Agency (NSA), and the Central Intelligence Agency (CIA). Also central is the Department of Homeland Security, created by the Homeland Security Act, which President Bush signed on November 25, 2002 and which consolidated 22 agencies into one department. One of the department’s main roles is to access, receive, and analyze information collected from intelligence agencies, law enforcement, and the private sector in order to assess terrorist threats. The Homeland Security Act included the Cyber Security Enhancement Act, which expands ISPs’ ability to disclose information, such as the content of e-mail or instant messages, to government officials.
The FBI is one of the major players in surveillance. As of June 2002, the FBI’s official top priority is counterterrorism. In the fiscal year 2003, the FBI received a total of $4.298 billion, including $540.281 million in net program increases to enhance Counterterrorism, Counterintelligence, Cyber crime, Information Technology, Security, Forensics, Training, and Criminal Programs. The Patriot Act granted the FBI increased powers, especially in wiretapping and monitoring of Internet activity.
Carnivore is a system implemented by the FBI that is analogous to wiretapping, except in this case it is e-mail that is being tapped. The technology uses a standard packet sniffer and filtering. When an e-mail passes through that matches the filtering criteria mandated by the
warrant, the message is logged along with information on the date, time, origin and destination of the message and then relayed in real time to the FBI. It has been reported, as of January 2005, that the FBI has abandoned the use of Carnivore in favor of commercially available software (Associated Press: 2005).
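The filter-and-log logic described for Carnivore can be sketched at a very high level. This is an illustrative sketch only: the field names, criteria, and addresses below are all hypothetical, and a real system like Carnivore operated at the network-capture layer rather than on parsed message dictionaries. The sketch shows only the stated behavior: messages matching warrant criteria are logged with date, time, origin, and destination.

```python
from datetime import datetime, timezone

def matches_warrant(message, criteria):
    # A warrant might authorize collection only for specific accounts.
    return message["from"] in criteria["targets"] or message["to"] in criteria["targets"]

def log_intercept(message):
    # Per the description above: record date/time, origin, and destination.
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "origin": message["from"],
        "destination": message["to"],
    }

criteria = {"targets": {"suspect@example.com"}}  # hypothetical warrant scope
traffic = [
    {"from": "alice@example.com", "to": "bob@example.com", "body": "lunch?"},
    {"from": "suspect@example.com", "to": "carol@example.com", "body": "..."},
]
intercepts = [log_intercept(m) for m in traffic if matches_warrant(m, criteria)]
print(len(intercepts))  # only the message matching the criteria is logged
```

The civil-liberties concern raised by critics, of course, was that the filter sees all traffic in order to decide what to keep.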
The NSA is another U.S. government agency responsible for both the collection and the analysis of communication messages. For years, not much was known about the NSA, despite its having been described as the world’s largest single employer of Ph.D. mathematicians, the owner of the single largest group of supercomputers, and the holder of a budget much larger than the CIA’s (Bamford: 2001). The NSA has been heavily involved in cryptanalytic research, continuing the work of its predecessor agencies, which were responsible for breaking many World War II codes and ciphers. The NSA, together with the equivalent agencies in the United Kingdom, Canada, Australia, and New Zealand, is believed to be responsible for, among other things, the operation of the Echelon system.
Echelon is thought to be the largest signals intelligence and analysis network for intercepting electronic communications in history, with estimates that it intercepts up to 3 billion communications every day. The intercepted signals are processed through a series of supercomputers programmed to search each communication for targeted addresses, words, phrases, or individual voices. However, the limits of a system as large as Echelon are defined by its very size, as discussed below.
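The kind of dictionary-based scanning described above can be reduced to a minimal sketch. The watch list and messages are invented for illustration, and Echelon’s actual matching (including voice recognition) is vastly more sophisticated; the sketch only shows the basic operation of flagging, from a stream of traffic, the small subset containing targeted terms for human review.

```python
# Hypothetical watch list of targeted words or phrases.
TARGET_TERMS = {"target phrase", "codeword"}

def flag_for_review(text):
    # Flag any communication containing a targeted term.
    lowered = text.lower()
    return any(term in lowered for term in TARGET_TERMS)

stream = [
    "routine business email",
    "message mentioning a codeword in passing",
    "holiday photos attached",
]
flagged = [msg for msg in stream if flag_for_review(msg)]
print(len(flagged))  # most traffic passes through unflagged
```

Note that even this toy version flags an innocent mention “in passing”; at billions of messages a day, that over-inclusiveness becomes the analysis bottleneck discussed later in the article.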
The proposed “Total Information Awareness” (TIA) program relied on technology similar to Echelon’s, and it would have integrated the extensive sources the government is legally permitted to survey domestically with the “taps” already compiled by Echelon. TIA was part of the Information Awareness Office, a mass-surveillance development branch of the Defense Advanced Research Projects Agency (DARPA). The TIA project would develop data mining tools capable of sorting through huge amounts of information to find patterns. The system would use information search and retrieval tools that automatically translate recorded messages in order to sift patterns and associations out of massive amounts of information, most of which is held in private sector databases. The TIA initiative could combine individuals’ bank records, tax filings, medical data, e-mail records, and other information into one centralized database, which could be searched for evidence of any suspicious activity.
The assumption behind these technologies is that terrorists exhibit patterns of behavior that can be identified by data mining the many pieces of data and activities that are subject to surveillance. By mining a range of databases, it is presumed, officials can identify terrorists before they strike, thus preempting any terrorist activity. The underlying premise is that if everything can be seen, then all threats can be stopped. Perhaps more importantly, there is the conviction that if everything can be seen, then everything can be controlled; this motivation drives TIA, but it is not unprecedented, and is a familiar goal of government.
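The pattern-detection premise can be illustrated with a toy statistical example. Real data mining systems are far more complex; this sketch (with invented transaction amounts) only shows the underlying logic of flagging records that deviate sharply from the statistical norm, and, implicitly, why “anomalous” is not the same as “guilty”.

```python
import statistics

def anomalies(values, threshold=2.5):
    # Flag values lying more than `threshold` standard deviations
    # from the mean; the threshold choice is arbitrary here.
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > threshold]

transactions = [42, 37, 51, 44, 39, 48, 41, 5000]  # one extreme outlier
print(anomalies(transactions))  # only the outlier is flagged
```

Everything hinges on the threshold and the choice of features: too tight, and real threats slip through; too loose, and innocuous behavior floods the analysts, a limit the article returns to below.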
DARPA did acknowledge concerns about accessing information not normally accessible to government. In May 2003, the Information Awareness Office changed the name Total Information Awareness to Terrorist Information Awareness (still TIA) and emphasized in its report to Congress that the program was not designed to compile dossiers on U.S. citizens, but rather to
gather information on terrorist networks. Despite the name change, the description of the program’s activities remained essentially the same in the report. In February 2003, Congress passed legislation halting the activities of the Information Awareness program, on the grounds that it could greatly infringe on individuals’ privacy rights.
While Congress passed a provision shutting down the Pentagon’s TIA, some of the same ideas appeared in a new program called the Multistate Anti-Terrorism Information Exchange (Matrix). The Matrix is operated by a private company on behalf of a cooperative network of state governments and represents a recent example of a “data surveillance” program. It ties together government and commercial databases and, according to Congressional testimony and news reports, makes the resulting dossiers available for search by government officials. In Congressional testimony, for example, Paula Dockery described how the Matrix combines government records with information from public search businesses into a data warehouse, where dossiers are combed by specialized software to identify anomalies using mathematical analysis (Dockery: 2003). If the details of one’s life happen to contain “anomalies”, they are then scrutinized by analytical personnel and investigators looking for evidence of terrorism or other crimes. Company officials have refused to disclose details of the program, but according to news sources, the kinds of information searched include credit histories, driver’s license photographs, marriage and divorce records, Social Security numbers, dates of birth, and the names and addresses of family members, neighbors, and business associates (Stanford & Ledford: 2003). The Matrix program was terminated in April 2005, although components continued to be made available to police in individual states, according to the ACLU, which filed Freedom of Information Act requests concerning the Matrix.
Limits of Technology
A main goal of surveillance practice appears to be the development of superior technologies. The paradox is that the 9/11 terrorists relied primarily on older technologies, such as jet aircraft and sharp knives, yet the solution to terrorism is assumed to lie in high-technology fixes. Technology is seen as a savior, and “technological fixes are the common currency of crisis in late modern societies” (Lyon: 2003: 65). There are, however, limits to technology, which will now be explored.
Can technology predict terrorist activity? Commenting on whether this is feasible, Steven Aftergood, head of the Federation of American Scientists’ projects on government secrecy and intelligence, doubts that “technology can be precise enough to distinguish a few suspicious transactions in a sea of activity” (Harris: 2002). Furthermore, a policy analyst said that “it’s statistically unlikely that the system could predict and pre-empt attacks and also avoid targeting innocent people as suspected terrorists” (Harris: 2002).
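The “statistically unlikely” point can be made concrete with a back-of-the-envelope base-rate calculation. All numbers below are hypothetical, chosen only for illustration: even a detector that is right 99 percent of the time, applied to a population in which actual terrorists are vanishingly rare, produces overwhelmingly false positives.

```python
population = 300_000_000      # assumed population under surveillance
true_targets = 1_000          # assumed number of actual bad actors
true_positive_rate = 0.99     # detector catches 99% of real targets
false_positive_rate = 0.01    # and wrongly flags 1% of innocents

caught = true_targets * true_positive_rate
wrongly_flagged = (population - true_targets) * false_positive_rate

# Probability that a flagged person is actually a target:
precision = caught / (caught + wrongly_flagged)
print(round(wrongly_flagged))  # about 3 million innocent people flagged
print(round(precision, 6))     # a tiny fraction of flags are correct
```

Under these assumptions, nearly three million innocent people are flagged for every thousand real targets, which is the analysts’ “sea of activity” in numerical form.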
Most of the surveillance devices and systems introduced after 9/11 rely on searchable databases. Technologies such as face recognition, iris scans, and other biometrics all rely on searchable databases, which are used to anticipate and preempt terrorist activity (Lyon: 2003). Data is sorted by an automated system according to certain categories in order to isolate an abnormality, which may indicate a risk. An algorithm is used to code for indicators of characteristics or behavior
patterns that are related to the occurrence of certain behavior. Structural elements of the surveillance technologies raise a host of complex questions; the Electronic Privacy Information Center (2008) asks the following: “What is the basis for developing the algorithm? What are acceptable false positive and false negative rates? What indicators are relevant? Who will collect and store the relevant indicators? How are the indicators related to particular kinds of behavior? Is that relationship reliable? Who determines what behavior should be targeted? What types of specific behavior will the system try to catch?”
Additionally, there are technical issues, such as the reliability of the data used to make decisions, and questions of who will have access to the data and for what purposes. Finally, there are policy issues to be addressed, including individuals’ rights to control their personally identifiable information and the recourse available to someone wrongly identified or denied a service.
A key aspect of contemporary surveillance is “social sorting”. Lyon (2003) argues that this type of automated discriminatory mechanism for social categorizing reproduces social and economic divisions in societies. Information is stored in large databases, and data mining is used to discover patterns in the data. Data mining facilitates the classification of individuals into segments or groups. However, the means by which these groups are created is problematic. While the technologies employed are highly sophisticated, the categories with which they are coded are much simpler. Lyon (2003) notes how database marketers in the US use crude behavioral categories to describe neighborhoods, such as “pools and patios” or “bohemian mix”, and how CCTV operators in the UK target the “young, black, male” group. The social implications of data mining are discrimination and exclusion. Gandy (2003) cites a commentator who referred to data mining as “Weblining”, drawing a parallel between “redlining”, which discriminates on spatial or geo-demographic grounds, and data mining, which can discriminate against groups using conceptual rather than spatial categories.
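The crudeness of such coding can be shown with a deliberately simplistic sketch of social sorting. The segment names echo Lyon’s examples, but the assignment rules and attributes are entirely invented for illustration; the point is how few, and how coarse, the recorded attributes behind a life-shaping category can be.

```python
def assign_segment(record):
    # A handful of recorded attributes decides the whole category.
    if record.get("has_pool") and record.get("owns_home"):
        return "pools and patios"
    if record.get("urban") and record.get("renter"):
        return "bohemian mix"
    return "unclassified"

people = [
    {"has_pool": True, "owns_home": True},
    {"urban": True, "renter": True},
    {"owns_home": True},
]
print([assign_segment(p) for p in people])
```

Once assigned, such labels travel with the record and can silently shape the offers, scrutiny, or exclusion a person encounters, which is precisely the discriminatory mechanism Lyon describes.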
Social sorting raises a critical question about how we understand and theorize surveillance developments: does surveillance entail intrusion or exclusion? (Lyon: 2003). This paper has discussed how increased surveillance after 9/11 has infringed on individuals’ privacy; surveillance in these terms is viewed individualistically, as an intrusion on privacy. This individualistic view of intrusion contrasts with how “social sorting” excludes people by categorizing them into cultural or social divisions. Thus, we can ask whether intrusion or exclusion is the better conceptualization or motif of surveillance after 9/11. Arab and Muslim minorities have been disproportionately targeted by surveillance measures in several countries (Lyon: 2003), suggesting that categorical exclusion is just as important to consider as intrusion on individual privacy.
Analysis of Intelligence
The ability to conduct surveillance yields information, but a key problem is that sufficient resources do not exist to analyze this information properly. The ultimate goal of intelligence is accurate analysis. A Congressional Research Service report states that “analysis is not an exact science and there have been, and undoubtedly will continue to be, failures by analysts to prepare accurate and timely assessments and estimates” (Best: 2006). The overall quality of
Lauren B. Movius 222
analysis has not been high, looking to the failure to provide advance warning of the 9/11 attacks and a flawed estimate of Iraqi weapons of mass destruction as evidence of systemic problems (Best: 2006).
Returning to Echelon, the limits of a system of such scale are defined by its very size. Though the system intercepts some 3 billion communications daily, analysts must know which intercepted communications to monitor before they can realize an intelligence advantage. For example, in the months prior to the 9/11 attacks, analysts found snippets of dialogue suggesting that some sort of attack was imminent. However, they were unable to pin down the details of the attack because the operatives planning it relied largely on non-electronic communications (Bamford: 2001).
The strategic analysis of information, not just its accumulation, is the key to the success of increased surveillance. Some efforts have been made to resolve these analytical shortcomings since 9/11. For example, Congress has increased funding for analytical offices, and the Intelligence Reform Act of 2004 contains a number of provisions designed to improve analysis, including the designation of an entity to ensure that intelligence products are timely, objective, and independent of political considerations, and the designation of an official to whom analysts can turn with problems of analytical politicization or lack of objectivity. However, several impediments to much-needed comprehensive analysis remain. First, there are long lead-times to prepare and train analysts, especially in the fields of counterterrorism and counterproliferation (Best: 2006). Second, there has been a shortage of trained linguists, especially in languages of current interest. While the National Security Education Program is designed to meet this need, most observers believe the need for linguists will remain a pressing concern for some years (Best: 2006).
A report by the Information and Privacy Commissioner/Ontario (IPC), which reviewed national security measures introduced since the 9/11 attacks, found that security experts do not think a huge electronic infrastructure is needed to solve national security problems (Cavoukian: 2003). In fact, a lack of information was not what prevented the FBI from discovering the terrorists’ plans for 9/11. On the contrary, it was “an excess of badly organized and poorly shared data” (Cavoukian: 2003: 26). A 2001 Congressional report on the NSA states that the agency is ‘faced with profound “needle-in-the-haystack” challenges’ because of the volume of information collected (Bamford: 2002). This finding was echoed by a congressional investigation into the 2001 terrorist attacks, which found that the government’s failure to prevent the attacks was not caused by a lack of surveillance technology; rather, it resulted from fundamental organizational breakdowns in the intelligence community.
Instead of assisting in the war on terror, more realistically, perhaps, surveillance results in a chilling effect. The Information and Privacy Commissioner states that programs of surveillance may ‘impact the behavior of both terrorists and law-abiding individuals alike. Terrorists are likely to go to great lengths to make certain that their behavior is statistically “normal,” while ordinary people are likely to avoid unusual but lawful behavior out of fear of being labeled “un-American.”’ (Cavoukian: 2003: 16).
Conclusion
This article has focused on the U.S., but a major shift has taken place in the political and legal landscape of many countries around the world that introduced legislation to aid their ability to fight terrorism following 9/11. For example, France passed 13 anti-terrorism measures on October 31, 2001; the United Kingdom passed the Anti-Terrorism, Crime and Security Act on December 15, 2001; Canada passed the Anti-Terrorism Act on December 18, 2001; and Australia introduced five anti-terrorism bills in 2002. Moreover, many countries have adapted their political discourse and broadened definitions of “terrorism” and “terrorists” in order to pass laws which had previously failed, illustrating the pattern of using social alarm over terrorism to enact legislation that would otherwise likely encounter resistance.
National security is often used as the rationale to enact legislation and heighten surveillance powers, and the debate is centered on security versus privacy. Indeed, following 9/11, the debate between security and privacy gained new momentum (Neocleous: 2007). However, the dichotomy between security and privacy is a false one (Cavoukian: 2008). Security and privacy are not a zero-sum game; giving up privacy does not necessarily lead to greater security, and increased security need not result in a loss of privacy.
There are two logics operating in the context of the Internet. First, there is the logic of security vis-à-vis the Internet that creates conditions for surveillance. Second, there is the logic of control. I suggest that the real debate is about liberty versus control, and that the U.S. government has operated under the logic of control. Liberty means having both privacy and security. As the remark commonly attributed to Benjamin Franklin has it, those who would give up essential liberty to purchase a little temporary safety deserve neither. Surely there is a better way to balance our civil liberties and the nation’s security without abandoning either. Indeed, in response to the erosion of civil liberties, social movements have formed to advocate for increased privacy protection and more checks and balances on government surveillance. Further research may investigate this crucial area: the impact of social movements on Internet policy and privacy protection.
The surveillance initiatives analyzed in this article are only symptoms of deeper shifts in political culture, governance, and social control (Lyon: 2003). Diffie and Landau (2007: 313) remark, “Control of society is, in large part, control of communication.” As society and technology evolve, the government’s power to control communication changes. If increased surveillance continues, it does not bode well for democracy and personal liberties. The struggle over control of communication and individuals’ rights will surely continue for years to come. Research on these battles for power and control will help us understand the role of technology in society, as well as how to better balance the needs of both the government and the public.
References
Administrative Office of the United States Courts. (2007): “Wiretap report” http://www.uscourts.gov/wiretap04/contents.html
Associated Press. (2005): “FBI ditches Carnivore surveillance system” on January 18, 2005.
Bamford, J. (2001) “Body of secrets: anatomy of the ultra-secret national security agency” New York, Random House.
Bamford, J. (2002): “War of secrets; Eyes in the sky, ears to the wall, and still wanting” The New York Times, on September 8, 2002.
Ball, K. & Webster, F. (2003) (Eds) “The intensification of surveillance: Crime, terrorism and warfare in the information age” London, Pluto Press.
Barlow, J. P. (1996): “A declaration of the independence of cyberspace” Available at http://www.eff.org/~barlow/Declaration-Final.html.
Best, R. A. (2006): “Intelligence issues for Congress” CRS Issue Brief for Congress.
Castells, M. (2001) “The internet galaxy: Reflections on the internet, business, and society” New York, Oxford University Press.
Cavoukian, A. (2003): “National security in a post-9/11 world: The rise of surveillance ... the demise of privacy?” Information and Privacy Commissioner, Ontario.
Church Committee (1976): “Final report of the Senate Select Committee to study governmental operations with respect to intelligence activities: Book II: Intelligence activities and the rights of Americans” USS 94d, 1976.
Churchill, Ward and Wall, Jim Vander. (1990) “The Cointelpro papers: documents from the FBI’s secret wars against domestic dissent” Boston, MA, South End Press.
Clark, D.D. & Blumenthal, M.S. (2000). “Rethinking the design of the Internet: The end-to- end arguments vs. the brave new world”. Version for TPRC Submission.
Costanza-Chock, Sasha (2004): “The whole world is watching: Online surveillance of social movement organizations,” in Pradip Thomas and Zaharom Nain (Eds) “Revisiting Media Ownership”, London, WACC & Southbound. 2004. 271-292.
DARPA. (2002) “BAA 02-08 Information Awareness Proposer Information Pamphlet” Defense Advanced Research Projects Agency 2002. http://www.eps.gov/EPSData/ODA/Synopses/4965/BAA02-08/IAPIP.doc
Deibert, Ronald, Palfrey, John, Rohozinski, Rafal, & Zittrain, Jonathan, (2008). “Access denied: The practice and policy of global Internet filtering” Cambridge, MIT Press.
Diffie, Whitfield and Landau, Susan. (2007). “Privacy on the line: The Politics of wiretapping and encryption” Updated and expanded edition, Cambridge, MIT Press.
Diffie, Whitfield and Landau, Susan. (1998) “Privacy on the line: The Politics of wiretapping and encryption” Cambridge, MIT Press.
Director of National Intelligence. Accessed April 10, 2009 at http://www.dni.gov/what_collection.htm
Dockery, P. (2003): “Data Mining: Current Applications and Future Possibilities” House
Electronic Communications Privacy Act. 18 U.S.C. §§ 2701 et seq.
Electronic Privacy Information Center. (2005a): “The USA Patriot Act” http://epic.org/privacy/terrorism/usapatriot/. Accessed on October 2, 2008.

Electronic Privacy Information Center. (2005b): “The Carnivore FOIA Litigation” http://www.epic.org/privacy/carnivore. Accessed on September 20, 2008.

Electronic Privacy Information Center. (2008): “Air travel privacy” http://epic.org/privacy/airtravel/. Accessed on October 2, 2008.
Gandy, O. (2003): “Data mining and surveillance in the post-9/11 environment” in K. Ball & F. Webster (Eds) “The intensification of surveillance: Crime, terrorism and warfare in the information age” London, Pluto Press. 2003. 26 – 41
Goldsmith, Jack and Wu, Tim (2006) “Who controls the Internet: Illusions of a borderless world” Oxford, Oxford University Press.
Harris, Shane, (2002): “Counterterrorism project assailed by lawmakers, privacy advocates,” Government Executive Magazine on November 25, 2002.
Higgins, A. and Azhar, A. (1996): “China begins to erect second great wall in cyberspace” The Guardian on February 05, 1996.
Innes, J. (2001): “Control creep” Sociological Research Online, 6(3). Available at: www.socresonline.org.uk
Jaeger, Paul, Bertot, John Carlo and McClure, Charles R. (2003): “The impact of the USA Patriot Act on collection and analysis of personal information under the Foreign Intelligence Surveillance Act” Government Information Quarterly, 20(2003) 295-314.
Johnson, David & Post, David. (1996): “Law and Borders: The Rise of Law in Cyberspace”, 48 Stan. L. Rev. 1367
Kalathil, Shanthi and Boas, Taylor (2003) “Open networks, closed regimes: The impact of the Internet on authoritarian rule” Carnegie Endowment.
Kempf, J. & Austein, R. (2004). The rise of the middle and the future of end-to-end: Reflections on the evolution of the Internet architecture. The Internet Society. Network Working Group, Request for Comments: 3724.
King, N. & Bridis, T. (2000): “FBI wiretaps to scan e-mail spark concern” Wall Street Journal on July 11, 2000, p.A3.
Lyon, D. (2006): “Theorizing surveillance : the panopticon and beyond” Portland, OR: Willan Publishing.
Lyon, D. (2003): “Surveillance after September 11, 2001” in K. Ball & F. Webster (Eds) “The intensification of surveillance: Crime, terrorism and warfare in the information age”, London: Pluto Press. 2003. 16 – 25.
Mathiason, John. (2009) “Internet governance: The new frontier of global institutions” Routledge, London.
Movius, L. and Krup, N. (2009): “U.S. and EU privacy policy: Comparison of regulatory approaches” International Journal of Communication, 3(2009) 1-19.
Neocleous, M. (2007): “Security, liberty and the myth of balance: Towards a critique of security politics” Contemporary Political Theory, 6, 131–149.
Ó Siochrú, Sean. (2005): “Assessing communication rights: A handbook” CRIS Campaign. Retrieved January 2, 2009, from http://www.crisinfo.org/
Rule, J. (2007). “Privacy in peril” Oxford: Oxford University Press.
Saltzer, J. H., Reed, D.P., & Clark, D.D. (1984): “End-to-end arguments in system design” ACM TOCS, 2(4), 277 – 288.
Schwartz, J. (2001): “Seeking privacy online, even as security tightens” New York Times on November 11, 2001.
Stanford, Duane and Ledford, Joey (2003): “Perdue rethinks Matrix: Driver records stay private” “The Atlanta (GA) Journal and Constitution”, on October 22, 2003: 1A
Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act of 2001. Pub. L. No. 107-56, 115 Stat. 272 (2001).
USA Today. (2005): “USA TODAY/CNN/Gallup Poll results” on May 20, 2005.
Varian, H. (1996): “Economic aspects of personal privacy” in “Privacy and Self-Regulation in the Information Age” National Telecommunications and Information Administration Report.
Vegh, S. (2006): “Profit over principles: The commercialization of the democratic potentials of the Internet” in Sarikakis, K. & Thussu, D. “Ideologies of the Internet” New Jersey, Hampton Press. 2006. 63-78.
German Espino
Professor, Faculty of Political and Social Sciences Autonomous University of Queretaro, Mexico.C.P. 76000.
E-mail: [email protected]
Abstract
With the downfall of the authoritarian regime that ruled Mexico for more than seven decades, the relationship between politicians, the mass media and public opinion was transformed dramatically. This paper addresses this transformation of political communication by looking at the 1994, 2000 and 2006 presidential campaigns in comparative perspective. The analysis observes specific changes in the area, such as the displacement of the traditional centers of power that determined political relations under the authoritarian regime, as well as the breakdown of the corporatist compliances that characterized media-government relations. Further traits that illustrate contemporary changes in political communication in Mexico are the establishment of a plural spectrum of powerful media whose barons act as powerful pressure groups, and the instability of the Mexican electorate, which in turn fuels the prevalence of both the independent and the ‘soft vote’ over the ‘hard vote’. Finally, the paper concludes that candidates’ media and communication strategies during the 2006 presidential campaign were the most influential factors in the election.
Key words: Elections/campaigns, agenda setting, media effects, Quantitative/survey, Quantitative/content analysis.
Introduction
This paper presents the results of a study analyzing the 2006 presidential election in Mexico. The central hypothesis holds that the overthrow of the authoritarian regime which ruled the country for over seventy years has sparked an ongoing transformation of the roles played by three key political communication actors: politicians, the mass media and public opinion. Wolton (1998) defines political communication as a space for the exchange of contradictory speeches among politicians, the mass media and public opinion, three actors that have the legitimate right to express their views. In order to explain the new process of political communication, this article describes the new roles of these three players.
A new political communication scenario has been established in Mexican presidential elections with the aforementioned change of regime, and the principal transformations lie in: 1) the disappearance of an authoritarian regime that ordered the rotation of power in Mexico, owing to a new plurality of actors and rules of the game, taking the president out of a central power position and putting in place a variety of different sectors that formed groups embattled in the process of
Journal of Global Communication, Vol. 2, No. 1, Jan.-July 2009, pp. 225-247
Basic concepts to situate the political communication in Mexico
Some authors have argued for the media’s attested ability to influence voters, producing such theories as the Spiral of Silence and Agenda Setting. The latter is a key concept in this study, as one of the theories that takes up the powerful role of the media in modern life. It helps us understand how actors such as the media and the different sectors of civil society participate in influencing political campaigns. According to this theory, the media may not tell people what to think, but they have the capacity to direct public attention towards particular issues and thereby guide what people think about. The fact that the media can choose these issues is the core of agenda setting: we all have the need to know what is happening around us, and it is the media’s job to satisfy this need. As such, the media no longer persuade; instead, they choose issues and guide the public towards them.
Two contemporary political communication tendencies are crucial to a critical approach to political campaigns: 1) the media as newly arrived protagonists of public space, a trend commonly denominated “video-politics”; and 2) the Americanization of political campaigns. The first is called video-politics because of the hegemonic nature of television and the tendency of journalists and communicators to place themselves as the protagonists of public space, which displaces politicians’ influence towards that of media personalities. The second phenomenon, the Americanization of political campaigns, refers to the way politicians, the media and diverse political systems adopt political-electoral technology designed in the United States. These trends are often intermingled, and this analysis sometimes distinguishes between the two tendencies while remaining attentive to how they work together in the new scenario of political communication.
Furthermore, the analytical lenses of Americanization and video-politics help us approach other political processes, such as voter volatility, which is translated into poll results in which independent voters constitute a decisive majority. They also help us look at political party configurations such as “catch-all parties” and how these displace the candidates’ parties. As a result, these lenses reveal how political campaigns have become more personalized, with the candidate turned into a media personality. In the same manner, they help us see the processes taking place in the current landscape of political communication, in which key campaign strategies are centered on political marketing and the three main political parties institute primaries to select their candidates.
However relevant, Americanization and video-politics theories fall short in terms of being able to describe the totality of these processes. ‘Reception studies’ are one of the more comprehensive means through which to tackle the macro-processes of the new configuration of political communication that is taking place in Mexico (Morley 1996, 37; Orozco 2004, Jensen 1995). This theoretical corpus offers the following Mexican political communication assumptions:
I. Within the plural spectrum of media outlets in a democracy, the bulk of the media develop an “internal plurality” consisting of a diversity of ideological voices and affiliations. There are also media networks that represent the principal existing ideologies, which pertain to the “external plurality” of the public space. These two aspects of the spectrum compensate for each other in a process that tends to reduce the potential bias inherent in the media.
II. News media are the product of different voices and cultural traditions, which often results in the polysemic character of their messages.
III. The theory highlights the importance of active receptors who are constantly reinterpreting messages. The meaning the audience gives to the messages is not inextricably linked to the source or to the means of communication. Reception is a complex process of influences that interact with the agents and with the circumstances of the communicative process. Within this complexity, it is the receptors who establish the final meaning given to any message.
IV. This process can be thought of in terms of a complex contextual resignification of media events. It is not the only factor that determines the meaning process since meaning is a result of a permanent battle between different agents and circumstances of the communicative process. There are a number of factors that come into play such as the diversity of media outlets, group influences, political ideologies, etc.
V. The process of message resignification is like a power struggle. Texts are not a blank page that succumbs to the whims of the receptor. Instead, the cultural studies tradition insists that there are closures of meaning inscribed in texts despite their polysemic character. Given the range of possible meanings, a receptor’s reading may be: dominated (reproducing the outlet’s intended meaning); negotiated (partially reinterpreting the intended meaning); or critical (deconstructing the power mechanisms inscribed in the text, which allows the receptor to develop a personal interpretation of it).
German Espino 230
Methodology
The electoral-campaigns literature typically focuses on research methods that help evaluate the main variables affecting electoral processes and their impact or degree of variance. However, this paper’s aim is not to assess such variables separately; rather, it goes further, examining the configuration of a new political communication landscape by focusing on the interplay and roles of the three key actors, as the following diagram shows (Graph I).
Graph I: Relationships between the political communication agents
Consequently, media campaigns are the chosen strategy that allows candidates to intervene in media outlets. The principal elemen