
J. Eng. Technol. Manage. 15 (1998) 309–338

Case study

Software variety and hardware value: a case study of complementary network externalities in the microcomputer software industry

Tom Cottrell a,*, Ken Koput b

a University of Calgary, Calgary, Canada
b University of Arizona, AZ, USA

Accepted 23 April 1998

Abstract

We estimate the effects of software provision on the valuation of hardware in the early microcomputer industry, covering the period 1981–1986. Since hardware and software technologies 'co-evolve,' a discussion of the economic and technological relationships is provided to clarify the nature of the relationship between software variety and hardware price. © 1998 Elsevier Science B.V. All rights reserved.

Keywords: Microcomputer technology; Network externalities; Computer platforms

1. Introduction

In the early computer industry, computer software was sold together (or bundled) with hardware. The first estimates of the quality-adjusted price of computers were made during a time when this bundling was unquestioned; Chow's classic work estimates the quality-adjusted decline in hardware price over the years 1955–1965. But by 1969, the business of bundling computer hardware and software was fundamentally changing, due to threatened U.S. Department of Justice (DOJ) antitrust action against International Business Machines (IBM).

* Corresponding author. Faculty of Management, University of Calgary, 2500 University Drive NW, Calgary, AB T2N 1N4, Canada. Tel.: +1-403-220-4477; fax: +1-403-282-0095; e-mail: [email protected].

0923-4748/98/$ - see front matter © 1998 Elsevier Science B.V. All rights reserved. PII: S0923-4748(98)00021-6

The fall in price of computer hardware over the past 3 decades is a well-known and often-cited example of the economic benefit of technological innovation. But none of the models used to estimate the declining price of hardware account for the critical complementary input, computer software, which can comprise a significant portion of the cost of computing. By estimating the quality-adjusted price of hardware without accounting for software variety and availability, such models may overstate the benefit consumers actually receive from the fall in hardware prices.

Both Triplett (1989) and Stoneman (1976) acknowledge the importance of the software component in estimating the value of hardware.

The early microcomputer industry is an ideal setting to study the effect of software availability on hardware valuation. Mainframe and mini-computers require a level of personnel support that is not similarly required in microcomputers. In larger computer systems, one might argue, users more commonly provide their own software applications. Microcomputer software applications, in contrast, are typically supplied by third-party developers. But separate supply, or unbundling, of hardware and software sales need not impact consumer valuation of hardware. If the variety of applications software is similar for all hardware platforms, the quality-adjusted (or hedonic) price differences between platforms would reflect only differences in raw computing performance. If both hardware and software are supplied in perfectly competitive markets, the quality-adjusted price of variety should not differ by platform. In such an environment, computing power is completely generic. If software were generic, a similar argument could be made.
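The hedonic approach invoked here can be sketched with a small, self-contained regression. Everything below is illustrative: the data are simulated, and the coefficients (an intercept, a clock-speed effect, and a log-variety effect) are invented for the sketch, not taken from this paper's estimates.

```python
# Minimal hedonic-price sketch: regress log(price) on a hardware
# attribute (CPU speed) and log software variety (number of titles).
# All data are simulated for illustration; nothing here reproduces
# the paper's estimates.
import math
import random

random.seed(0)

def simulate(n=200):
    rows = []
    for _ in range(n):
        speed = random.uniform(1.0, 8.0)     # MHz, hypothetical
        titles = random.uniform(50, 5000)    # software titles, hypothetical
        # "True" hedonic surface: variety raises willingness to pay.
        log_price = (6.0 + 0.10 * speed + 0.15 * math.log(titles)
                     + random.gauss(0, 0.05))
        rows.append(([1.0, speed, math.log(titles)], log_price))
    return rows

def ols(rows):
    """Solve the 3x3 normal equations (X'X) b = X'y by Gauss-Jordan."""
    k = 3
    xtx = [[0.0] * k for _ in range(k)]
    xty = [0.0] * k
    for x, y in rows:
        for i in range(k):
            xty[i] += x[i] * y
            for j in range(k):
                xtx[i][j] += x[i] * x[j]
    a = [xtx[i] + [xty[i]] for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(k):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [a[r][c] - f * a[col][c] for c in range(k + 1)]
    return [a[i][k] / a[i][i] for i in range(k)]

b0, b_speed, b_variety = ols(simulate())
print(b0, b_speed, b_variety)  # recovers roughly (6.0, 0.10, 0.15)
```

The coefficient on log(titles) is the quantity of interest in a study like this one: a positive value is what an indirect network externality predicts, while the perfectly competitive benchmark just described predicts a value near zero.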

But if there are market imperfections, most notably if these products are characterized by network externalities, the quality-adjusted price of a hardware platform would differ depending on the size of the installed base and the variety of software available. The emergence of the IBM PC as a dominant standard is a frequently cited example of the effect of network externalities in the microcomputer industry. Shurmer (1993) surveys users to determine the nature of the externality. As Shurmer demonstrates, there is a benefit to a larger installed base because users value the availability of product complements for software applications, such as training, manuals and other support products. Gandal (1994a) shows that there are network externalities in spreadsheet use.

With a direct network effect in software, developer strategies should account for an installed-base effect. If the user of a certain software application benefits from the size of the installed base of other users of the same product, software developers have an interest in a larger installed base. But since the size of the software installed base is limited by the installed base of the hardware platforms on which it runs, an application that runs on only one platform can always be made more valuable if it is ported to other platforms. That is, software developers might benefit from a larger installed base by making the application available for different platforms, a possibility noted in Farrell and Saloner (1992).

There are several examples of multi-platform development in the early industry, but they serve mainly to illustrate the difficulties in such a strategy. For example, VisiCalc was developed for the Apple II in 1978, where it was an extremely successful product. With the 1981 introduction of the IBM PC, VisiCalc was ported to the new environment using a tool called a cross-assembler. The cross-assembler translates commands for the MOS Technology 6502, the Apple II's central processing unit (CPU), into the language of the IBM PC's CPU, the Intel 8088. But VisiCalc's success on the Apple II was not repeated in the IBM PC environment, for reasons we shall discuss later. Because the hardware environment improves with time, successful application products must eventually be ported to new hardware designs. 1 While providing a compatible user interface, file exchange and product functionality across different hardware platforms is not trivial, software design engineers did eventually find a way to provide it. But the porting process occurred only slowly in the infant PC industry, primarily because of the inefficiency of cross-platform development.

If porting were quick and inexpensive, platforms would not differ by the variety of software applications available, and we would expect to find little effect of software variety on hardware value. We argue that two links in a chain lead to an indirect network effect where software variety benefits hardware value. The first link is that a developer's sunk cost in specializing products for a certain platform makes it expensive to supply the same product to a number of platforms. The second link, to be discussed later, is that a user's sunk cost in learning a software application imposes a switching cost on new applications. These two links combine to produce the effect of software variety on hardware value. While there is some empirical work on the effect of complementary product externalities on related markets, this paper provides a more detailed level of analysis across a wider spectrum of products.
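The two-link chain can be made concrete with a toy simulation. All the numbers below are invented for illustration: developers sink a porting cost and supply the platform whose installed base best covers it, and users, facing their own sunk learning costs, buy the platform with more software, so a small early lead compounds.

```python
# Toy simulation of the indirect network effect described above.
# Assumptions (all hypothetical): a fixed porting cost per platform,
# developers supply the platform whose installed base best covers that
# cost, and each period's new buyers choose the platform with more
# software titles available.
PORT_COST = 10         # sunk cost of supporting one platform
SALES_PER_USER = 0.01  # expected unit sales per installed user
NEW_APPS = 25          # new applications written each period

def simulate(periods=20, buyers_per_period=500):
    base = {"A": 1200, "B": 1000}   # platform A starts slightly ahead
    titles = {"A": 0, "B": 0}
    for _ in range(periods):
        # Link 1: sunk porting costs limit which platform gets software;
        # developers target the larger installed base if it covers cost.
        target = max(base, key=base.get)
        if base[target] * SALES_PER_USER >= PORT_COST:
            titles[target] += NEW_APPS
        # Link 2: users sink a learning cost, so new buyers choose the
        # platform where more applications are available.
        chosen = max(titles, key=titles.get)
        base[chosen] += buyers_per_period
    return base, titles

base, titles = simulate()
print(base, titles)
```

Even with a modest initial lead, platform B never attracts software and therefore never attracts buyers; this runaway dynamic is the 'tipping' behavior the network-externality literature predicts.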

In this paper, we estimate the effects of software provision on the valuation of hardware in the early microcomputer industry, covering the period 1981–1986. The hardware and software technologies 'co-evolve,' and a discussion of the economic and technological dependencies establishes the rationale for the relationship between software variety and hardware price. Because microcomputer technology is subject to network externalities, we begin Section 2 with an overview of the literature. We conclude from this overview that a single computer platform would likely emerge to dominate the industry (Langlois and Robertson, 1992; Langlois, 1992; Arthur, 1988). More importantly, software development decisions influence which hardware design eventually dominates, so that the process is endogenously determined. We further argue that without sunk investments in the requisite knowledge to develop software applications for specific platforms, externalities in hardware would be far less significant. This insight is important for understanding not only how software development affected the hardware market, but also the future evolution of computing technologies.

To understand assumptions about firm strategy in this industry co-evolution, we should understand the competitive world as it was seen by software manufacturers. An important decision for early-1980s software manufacturers was the selection of operating systems and hardware for development. Section 3 describes the hardware environment and its implications for software development. It is tempting to project our knowledge of the eventual dominance of the IBM PC onto developers in the 1980s and assume a single dominant standard was anticipated, but it was not. 2 Section 4 presents the statistical methods and data to evaluate the software/hardware relationship, and Section 5 provides a discussion of the results. A brief summary and implications for future research conclude the paper.

1 Ashton-Tate's dBase II, one of the first database applications for microcomputers, was first developed as a minicomputer product called 'Vulcan' before it was ported to MS-DOS. Eventually, a successor to dBase II (dBase IV) was available for the Macintosh and Unix operating systems. MicroRim's relational database product, R:Base, followed a similar path. Finally, Lotus 1-2-3, at first written specifically for the Intel 8088/IBM PC, was eventually available under Windows, Macintosh, and the DEC VAX.

2. Overview of literature on network externalities

The literature on network externalities emphasizes demand-side characteristics of product competition. 3 Classical examples derive from competition in an environment where product compatibility plays an important role, such as the VCR or computer industry (see Farrell (1989)). To illustrate this in the computer software industry, consider the decision to purchase microcomputer hardware circa 1985. In choosing between an IBM PC vs. a Macintosh or an Apple II, a potential buyer is concerned not only about raw computing power and ease of use, but also the availability of product complements such as applications software or peripherals. 4

2.1. Defining and using network externalities

There are many benefits associated with the number of other consumers that purchase compatible products. The first benefit from a large installed base is termed a 'direct network externality,' where standardized products provide access to a larger physical network. A common example is the telephone system; the larger the phone network, the more likely it is that the person one wishes to call is also accessible on the network. 5 Direct network effects for software are evident whenever users exchange files (see Church and Gandal (1993)). This direct effect pertains to applications software such as word-processing or spreadsheets whenever the use of software (e.g., written text or financial analysis) is the result of a collaborative effort where users wish to share files.
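A back-of-the-envelope way to see why a direct externality rewards installed base: with n compatible users, the number of distinct pairs who could exchange files (or place calls) grows as n(n-1)/2, so each new user raises the value of membership for everyone already on the network. This quadratic count is the textbook 'Metcalfe' approximation, not something the paper estimates.

```python
# Pairwise-connection count behind the direct network externality:
# each of n compatible users can exchange files with n - 1 others.
def potential_links(n):
    """Number of distinct user pairs in a network of n compatible users."""
    return n * (n - 1) // 2

# The marginal value of one more user grows with the installed base:
for n in (10, 100, 1000):
    print(n, potential_links(n), potential_links(n + 1) - potential_links(n))
```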

The second benefit to a larger installed base is a 'market-mediated effect.' Here the provision of a complementary good becomes cheaper and more readily available the greater the extent of the compatible market. Examples include parts and servicing networks, or the allocation of a large fixed-cost good with small marginal costs, such as computer software. It is often assumed that a larger installed base in hardware provides incentives for stronger competition in the complementary product market for software. That is, standardization may increase the scale of production as a result of a decrease in the extent of variety supplied, and if there are scale economies, this results in decreased average costs. 6 Finally, the increased size of the installed base leads to more variety or more abundant supply of complementary goods. Church and Gandal (1993) argue that increased software variety has a feedback effect on the hardware industry, and that this externality drives the market for standardization in hardware. This effect is shown for microprocessors in Swann (1987). Finally, Cottrell (1994) argues that greater standards fragmentation in the installed base of Japanese microcomputers constrains Japan's microcomputer software industry.

2 Even IBM did not expect that the PC would do so well (Chposky and Leonsis, 1988).
3 Katz and Shapiro (1994); Katz and Shapiro (1992); Katz and Shapiro (1986); Katz and Shapiro (1985); Farrell and Shapiro (1992); Farrell and Saloner (1992); Farrell and Saloner (1986); Farrell and Saloner (1985) are representative works. The literature is ably reviewed by David and Greenstein (1990), and we review only the salient issues here.
4 Gandal (1994b) argues that this distinction is superfluous.
5 This is only a statement about the expected benefit from a wide range of choices, not from realized exchanges. As a result, a large installed base is not essential to network externalities if the network architecture is well-understood. For example, the development of the Integrated Services Digital Network (ISDN) was so protracted that some companies (e.g., Tenneco) determined to establish their own non-standardized networks (Cargill, 1989). This investment in intra-organizational communications was justified because network benefits were easily identified. In microcomputer software, network benefits and the network architecture are measured less precisely than, for example, intra-organizational communications, and this uncertainty may result in the requirement for larger networks.

Third, standards decrease a user's switching costs. Katz and Shapiro (1986) argue that "Any technology requiring specific training is subject to network externalities: the training also is more valuable if the associated technology is more widely adopted." In computer software, training can be a significant expense. Users of certain word processors report a significant amount of time spent in learning the interface. As new technologies and techniques evolve, such as the recently witnessed transition to Windows 95, upgrade decisions will depend at least in part on the amount of time required to learn the new command interface. If all word processors used the same keystroke or mouse commands, this switching cost would be smaller, a result seen in the evolution of operating systems software with the success of the graphical user interface. Eliminating the switching cost itself is not necessarily Pareto-improving, since an interface that is more difficult to learn may also be more efficient to use. The problem with switching costs arises only over time, after the initial costs of learning an application are incurred.
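The switching-cost logic above reduces to a simple break-even comparison. The figures below are invented for illustration: a user abandons an already-learned application only if the productivity gain over some horizon repays the retraining time.

```python
# Stylized switching decision (all figures hypothetical): a user abandons
# a learned application only when the gain repays the retraining cost.
def worth_switching(retrain_hours, hours_saved_per_week, horizon_weeks):
    """True if productivity gains over the horizon exceed retraining time."""
    return hours_saved_per_week * horizon_weeks > retrain_hours

# 40 hours to learn a rival word processor, saving 0.5 hours per week:
print(worth_switching(40, 0.5, 52))   # one-year horizon
print(worth_switching(40, 0.5, 104))  # two-year horizon
```

The longer the user expects to keep the current platform and its applications, the harder it is for a rival to induce switching, which is exactly the sunk-cost channel the paper emphasizes.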

2.1.1. Empirical work on network externalities

While there is little empirical work on direct network externalities, four works are worth noting. Greenstein (1993) finds that, in addition to an incumbency effect, government agencies were more likely to choose backward-compatible hardware technologies in subsequent acquisitions. This effect is likely related to the investment in software applications. Saloner and Shepard (1991) find that the size of the installed base of customers correlates with an increasing probability that the branch bank will provide automated teller machines. Gandal (1994a) employs a hedonic price model to show a positive value associated with spreadsheet applications that provide Lotus file compatibility.

These direct network effects contrast with the study of indirect network externalities, where the externalities in product complements have received less attention. In a theoretical model, Church and Gandal (1992b) argue that hardware platform desirability increases with the variety of software available. Cusumano et al. (1992) use similar reasoning to assert that the availability of complementary software in the form of pre-recorded tapes set off a second 'bandwagon effect' during the early 1980s, which led to the eventual dominance of VHS over Beta. In this paper, we examine the network effect more directly in its influence on the evolution of the hardware industry.

6 We are careful to point out, however, that a decrease in variety in order to yield lower production costs may have either a positive or negative effect on consumption externalities.

2.2. Sources of network externalities

In this section we present a model of application software from the user's perspective. The point of this argument is to substantiate the claim that operating systems and hardware need not be the basis for network externalities in computer software. We first examine the extent of user knowledge necessary for applications software. An application user's acquisition of this knowledge is a sunk cost and, we assume, a cost imposed on potential entrants who wish to induce a rival's installed base of users to switch. If users require specialized knowledge at the applications software level, then the evolution of the hardware industry depends on the ease with which applications can be adapted to other operating systems, a process known as 'porting.' When users sink a cost to learn software, and producers incur a cost to port it to another platform, a strong rationale for complementary product externalities in hardware emerges.

2.2.1. A model of user incentives for information

Users have differing levels of knowledge about their computer systems. The incentives for greater understanding of the workings of the computer hardware and applications software have not been objectively studied, and the discussion below is speculative. We believe, nonetheless, that the assumptions and implications in the model are reasonable; they are based on an understanding of software applications from both our personal experience in applications and systems software and our conversations with users and software development firms. Fig. 1 shows a graphical depiction of human/computer interaction in the accomplishment of some task (e.g., word-processing). We will spend some time in the detail of this model because it is important to understand the functions available at each layer of the 'onion,' and the implications for network effects.

Fig. 1. Model of user incentives for knowledge of processing.

2.2.2. Knowledge of CPU

At the center of the diagram is the Central Processing Unit (CPU), a single chip in a microcomputer, and a cluster of semiconductors, however arranged, in older machines. The user has little incentive for an extensive knowledge of the CPU, even though all of the productive use of the computer flows through these electronics. In the early 1980s, many users knew the make, model and clock speed of their CPU (e.g., Intel 8088, 4.77 MHz), but few understood, or needed to understand, performance characteristics of the chip, such as segmented architecture or register constraints. Perhaps even more importantly, few would have had an intimate enough knowledge to justify the additional amount, often several hundred dollars, paid for this particular chip when a close substitute chip (e.g., the NEC 8088) could offer the same quality in running an application. The length of the ray of the CPU circle is relatively small to reflect the insignificance of user knowledge of this layer.

2.2.3. Knowledge of BIOS

Surrounding the CPU is the Basic Input/Output System (BIOS). The BIOS offers a variety of services for CPU access to memory and peripherals such as keyboards, cassettes, disks and monitors. In the IBM Technical Reference Manual:

The goal of the BIOS [programming] is to provide an operational interface to the system and relieve the programmer of the concern about the characteristics of hardware devices. The BIOS interface insulates the user from the hardware, thus allowing new devices to be added to the system, yet retaining the BIOS level interface to the device. In this manner, user programs become transparent to hardware modifications and enhancements. International Business Machines (1984): p. 5-3.

Table 1 is an abbreviated list of ROM-BIOS services for the IBM PC. Many computers in the early PC years contained an Intel 8088 or equivalent chip, but only the IBM PC contained the PC BIOS. IBM's technical reference provided a program listing of the BIOS instruction code to subsidize software developers with a complete knowledge of the workings of the computer. IBM understood the importance of better performance of the complementary products, and especially applications software, for hardware sales. But the BIOS was only protected by copyright law; stronger intellectual property protection was not available. Publishing this detailed view of the computer without more rigorous protection of the BIOS resulted in the 'clone' copy-cat strategy that played such an important role in the emergence of the IBM PC as a 'dominant design.' Users interested in Lotus 1-2-3 were concerned that hardware be 100% 'IBM compatible' because that application worked directly with the BIOS, and not through the DOS layer, for improved speed and performance. While users were concerned about this compatibility, their knowledge of how it was supplied was minimal. AMI and Phoenix were known to produce very good BIOSs, so that brand labels became important as a substitute for user knowledge of the details. Thus, while important for compatibility, user knowledge was minimal, which we reflect by a shortened ray in the diagram above.

Table 1
IBM-BIOS

Subject          Interrupt  Service  Description
Print Screen     5          n/a      Send screen contents to printer
Video            10         8/9      Read/Write character and attribute
Video            10         C/D      Read/Write pixel dot
Video            10         1        Set cursor size
Equipment        11         n/a      Get list of peripheral equipment
Memory           12         n/a      Get usable memory size (in K-bytes)
Diskette         13         2/3      Read/Write diskette sectors
Diskette         13         C        Seek to cylinder
Diskette         13         5        Format diskette track
Serial port      14         1/2      Send/Receive a character
Cassette         15         2/3      Read/Write data blocks
Devices          15         80/81    Device open/close
Joystick         15         84       Joystick support
System Request   15         85       Sys Req key press
Memory           15         88       Get extended memory size
Keyboard         16         3        Control Typematic keyboard features
Printer          17         2        Get printer status
BASIC            18         n/a      Switch control to BASIC
Bootstrap        19         n/a      Reboot
Time             1A         4/5      Read/Set date in real-time clock

2.2.4. Knowledge of operating system

The next layer in the diagram is the disk operating system; the exemplar here is the Microsoft Disk Operating System (MS-DOS), but several other operating systems were available (e.g., CP/M, Apple DOS, etc.). There are two parts to this system: the low-level DOS interrupts, which are very similar to the BIOS, and the kernel, known to IBM PC users as 'COMMAND.COM.' Users typically do not, and have very little need to, know the implications of DOS interrupts for their particular software application. User understanding of kernel operations, however, is important for file management and program launch. This knowledge is important periodically, such as at the time an application program is installed, but of less concern afterward. Many software applications provide the majority of the services needed for the file management listed below as 'kernel' commands. In the early 1980s, then, the length of this ray should be larger than the prior two layers (Table 2).

Table 2
Microsoft DOS

Subject             Interrupt  Service  Description
DOS Interrupt       33         30       Get DOS version number
DOS Interrupt       33         31       Terminate-but-stay-resident
DOS Interrupt       33         36       Get disk free space
DOS Interrupt       33         39       Make directory
DOS Interrupt       33         3A       Remove directory
DOS Interrupt       33         3B       Change current directory
DOS Interrupt       33         3C       Create file
DOS Interrupt       33         3D       Open file
DOS Interrupt       33         41       Delete file
DOS Command kernel  MKDIR               Make directory
DOS Command kernel  DEL                 Delete file
DOS Command kernel  COPY                Copy file
DOS Command kernel  RENAME              Rename file
DOS Command kernel  FORMAT              Prepare disk for use

2.2.5. Knowledge of operating environment

The next layer in the 'onion' is the Operating Environment; commonly understood examples are the Macintosh or Microsoft Windows Graphical User Interface (GUI). The functions provided in this level are generally designed to make the computer easier to use, and to facilitate an 'intuitive' model of the computer. The major user-interface innovation in the Macintosh was the use of icons in conjunction with a pointer device. At the operational level, however, multitasking and multiprocessing were also important innovations. There were several character-based multi-tasking DOS-layer operating environments in the early PC era: QEMM (Quarterdeck), GEM (DRI), TOPS (IBM), and Mondrian (Microsoft). The operating environment facilitates launching programs, manages the ability to launch several programs simultaneously, and is an improvement over a strictly command-line oriented environment. The operating environment is closely related to the command kernel discussed above, but can provide services to the application level as well.

2.2.6. Knowledge of the application

The final layer in the 'onion' is the application (the outer 'user' circle in the diagram is only for completeness). We argue that understanding at this layer is rewarded with the greatest payoffs. While a typical user will rely on several application programs, effort spent in acquiring the specialized application knowledge at this layer receives a higher payoff. As with prior layers, the application relies on layers closer to the CPU for completing tasks. For example, a word processing program offers file management, screen updating, printer management, control of the serial or parallel port, and a variety of other functions related to word processing. To the user, access to many lower-level functions is supplied transparently. Only occasionally is knowledge of a hierarchical file structure (from DOS) needed, or of the kernel commands to erase or rename a file. Many of the DOS-level functions available within applications by the end of the 1980s were not similarly accessible in the early 1980s. Since most early users of personal computers were required to be reasonably well-informed across layers of the onion, application developers specialized in the capabilities and enhancements that more directly related to topical functionality, such as word processing features. Over time, however, as less sophisticated users adopted microcomputer technology, more of these capabilities were provided within applications.
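The layering just described can be sketched as a chain of delegating calls. The sketch below is purely illustrative: the function names are invented, and it models only the delegation pattern, not any real DOS or BIOS interface.

```python
# Illustrative sketch of the 'onion' layering discussed above: the user
# invokes an application-level command, which delegates transparently
# to lower layers. Names are invented for the sketch; this models the
# delegation pattern, not any real DOS or BIOS interface.
def bios_service(call):
    # Innermost software layer: talks to the hardware.
    return f"BIOS handles {call}"

def dos_interrupt(service):
    # DOS translates a kernel request into lower-level service calls.
    return bios_service(f"int-level '{service}'")

def kernel_command(command):
    # The command kernel (cf. COMMAND.COM) exposes file management.
    return dos_interrupt(command)

def application_delete(filename):
    # The application layer: the user only ever sees this call.
    return kernel_command(f"delete {filename}")

print(application_delete("REPORT.DOC"))
```

The user issues one application-level command; every inner layer is invisible, which is why the sunk cost of user knowledge concentrates at the application layer.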

Three pieces of corroborative evidence support the importance of the application layer in users' sunk costs. First, a perusal of training manuals for specific applications shows that they devote remarkably little space to specific MS-DOS functions. In Neibauer (1990), not more than 47 of 342 pages relate to knowledge of MS-DOS. Indeed, the book claims in its introductory chapter that the operating system is relatively unimportant:

Of course, WordPerfect comes in versions for other computers that use other operating systems. While the differences are too numerous to be covered adequately in an introductory book, once you've learned the elements of WordPerfect presented here, it will be easy to apply what you've learned to other versions if the need arises.

The second piece of corroborative evidence comes from a study reported in Hromadko and Mutert (1986). The research found an annual investment of US$12,500 of user time in learning new applications, compared with a US$7500 corporate investment in hardware, software, support, supplies and maintenance. The claim is an important one: users sink significantly more resources in learning to use applications software than in purchasing and maintaining it. Third, Shurmer's (1993) interview of users of packaged software emphasizes the availability of training and resources in learning software applications. Once a user spends this time on learning to use a certain piece of software, there is a switching cost to learning a similar application from another developer. We argue that the sunk cost of learning application use drives the network externalities in hardware when software cannot cheaply be ported to another machine.

2.3. Implications

Since users sink the major cost of learning at the application level, understanding of the underlying hardware is much less important. If WordPerfect works exactly the same under Unix, MS-DOS, Macintosh, Windows and OS/2, users should be indifferent over platforms. Applications can be written for several environments, and tools available today enable application developers to simultaneously release a product on a wide variety of platforms. A well-known technical journal, Dr. Dobb’s, annually produces an issue devoted to cross-platform development. While tools for cross-platform development were available in the early 1980s, the speed of the microprocessor and the efficiency of the cross-assemblers, for example, did not provide sufficient performance for generic platform strategies to work effectively (for an example, see the discussion below regarding the use of a cross-assembler to port Visicalc to the IBM PC).

The issue, then, is that if users care about application software, their choice of operating systems software is secondary, and the network externality effects in hardware are secondary. But during the early days of the IBM PC, porting was an expensive and difficult task. If, however, this cost drops as computing speeds continue to improve for future platforms, the implication is that complementary externalities in computer hardware will not matter in the long run. Instead, network externalities will attach to the software, and the effect on hardware will be to make the platform generic.


T. Cottrell, K. Koput / J. Eng. Technol. Manage. 15 (1998) 309–338

Fig. 2. Old (integrated) computer industry structure.

We turn next to a discussion of an important piece of the argument: providing software for a variety of platforms is difficult and expensive. This point is critical to the argument that software variety influences hardware valuation. If software were easily ported to new computing environments, we would expect to see similar software variety across platforms.

3. Industry structure and network effects

The importance of network externalities does not in itself prescribe an applications development strategy. One frequently cited strategy for establishing a standard advises broad and rapid diffusion through liberal licensing (see Khazam and Mowery (1994) for the RISC computing example, and Rosenbloom and Cusumano (1987) for VHS vs. Beta VCRs). But because in software there are many levels upon which to base a standard, it is not clear which one to choose to broadly disseminate. In Manasian (1993), Intel provides two models to characterize a revolution in the computer hardware industry that illustrate the nature of the problem. Fig. 2 stylizes the ‘before’ picture to emphasize that channels of product development and distribution were historically vertically integrated.

Fig. 3. New (less integrated) computer industry structure.


Fig. 3 provides the ‘after’ picture, and it is dramatically different from the historical presentation. Unbundling has led to a variety of integration levels for providers of both software and hardware. For example, Lotus 1-2-3 provides applications software as well as file management services at the level of the operating system. Microsoft produces both server and client operating systems software, applications, and in some cases, distribution of software products. That is, single applications may be found at several ‘layers’ in the graphic, as well as across several platforms.

The economic literature on network externalities has for simplicity emphasized the former structure of the computer hardware and software industries: a world where vertically integrated firms consider providing products that are compatible for a few or several systems, or alternatively the development of completely incompatible systems. 7

3.1. Design considerations

Products must be developed for use on a computer ‘platform,’ where each platform may be classified according to Layers 1–3 in the figures above. While our use of the term platform below differs slightly from Intel’s graphic, we retain the moniker for ease of communication. These levels include: the microprocessor; the Basic Input Output System (BIOS) or, more generally, the ‘architecture’; and the operating system. Intel’s description above has a natural counterpart in the microcomputer hardware industry circa 1980, as we describe below.

3.1.1. Defining the platform

At the microprocessor level, a number of producers of chips competed over the dominant design. By 1980, there were a number of chips and chip families that might conceivably have emerged as ‘the standard’ (see Table 3, based on Elsevier Publishing (1980)). Successful software development strategies had to account for this variety in microprocessors (Freiberger and Swaine, 1984). Each chip had characteristics that made it more appropriate for some tasks over others.

The chip design and architecture also laid the ground rules for the development of the computer that would be built around the chip. The choice of the microprocessor, then, has implications for the design of the rest of the hardware and, by extension, software development. When the IBM-PC was announced in August of 1981, there were several well-established architectures competing for market share: the S-100 bus (based on the MITS Altair), the Apple II bus (which was an ‘open architecture’ in the sense that hardware not manufactured by Apple could be added to the motherboard), the Commodore PET, the Radio Shack TRS-80, and a host of others. A wide variety of software was available for these platforms, and a significant installed base of users was

7 Recently, however, Church and Gandal (1992a) and Church and Gandal (1992b) have provided a model to distinguish software providers from hardware firms. In their model, the variety of software provided depends on the expected installed base of hardware. Hardware sales are dependent on the expected variety of software provided.


Table 3
Microprocessor/BIOS/Operating System combinations

Microprocessor   Basic Input Output System   Operating System
Intel 8088       IBM-BIOS                    PC-DOS
Intel 8088       Compaq IBM compatible       MS-DOS
Intel 8088       Tandy                       MS-DOS
Intel 8086       IBM-BIOS                    Unix
Motorola 6509    Apple II                    CP/M
Motorola 6502    Commodore                   CP/M
Motorola 68000   Macintosh                   Apple Proprietary

developing around the Apple II and CP/M. 8 It was difficult to evaluate the various strengths and weaknesses of the hardware market to pick an eventual winner. Instead, software developers in the infant industry anticipated a diverse and complex variety of platforms. This uncertainty spurred software developers to employ strategies to account for the anticipated variety.

In Table 3, each line is a separate ‘standard’ operating environment for applications software. Each of these platforms was important in the nascent industry. The intersection of the three columns for chips, BIOS, and operating system makes up a platform, so that each line in the table is a unique ‘standard.’ The table illustrates both the variety of possible standards and the similarity among the component parts. The apparent similarity among components is misleading, however. The difference between the first two lines (IBM and Compaq) is arguably negligible: Compaq was one of the first 100% IBM compatibles. The difference between these and the third is significant enough to require completely separate software product packaging. The difference from the fourth line is so great that entirely new products must be designed.

3.1.2. Anticipating a dominant design

By 1986, sales of the IBM-PC far exceeded other architectures, and historians identify it as the ‘dominant design’ (cf. Utterback (1994)). And clearly it has come to dominate the industry, with an installed base of IBM-PCs reaching 75 million by the end of the decade; the next closest was the Apple Macintosh with around 20 million machines. Ex post analysis of this industry claims to have foreseen the ‘destiny’ of the de facto IBM PC standard (Young (1985)). 9 Some companies did bet on the emergence of a dominant platform. Lotus, as we have described, was optimized for an IBM PC running MS-DOS with at least 256K RAM. If the IBM PC architecture had failed to gain significant market acceptance, Lotus could have been seriously hurt. However,

8 Katz and Shapiro (1985) state: ‘‘In the personal computer market, the CP/M operating system has been designed to allow several brands of computers to use common programs.’’ MS-DOS was also available for a number of computer systems at the time Katz and Shapiro were writing, but it was not as visible.

9 Young (1985):110. ‘‘‘IBM compatibility’ has been the industry buzzword since 1981 when it was recognized, as software authors and publishers rushed to adapt or write programs for the new arrival, that the IBM PC was destined to set de facto standards. The ability to run the mass of software accumulated for the IBM PC is undoubtedly a huge advantage.’’


Lotus’ strategy could have paid off if the PC were even a minor success; the strategy does not imply that Lotus anticipated IBM’s dominance. 10

Other major participants in the emerging industry clearly did not anticipate the emergence of the PC in this dramatic fashion. Microsoft and Digital Research, two pre-IBM PC firms dominant in operating systems and languages, did not foresee it. In fact, Microsoft guessed that there would be a large number of platforms, and tailored its applications software strategy to account for it; Microsoft’s character-based spreadsheet Multiplan, while no match for 1-2-3 on the IBM-PC, was ported to over 100 platforms and became the dominant spreadsheet in Europe. 11 Digital Research charged such a high price for its CP/M operating system that it could not have been anticipating large volume sales on any single platform. 12 In fact, Digital Research’s stated strategy was to provide software under a wide variety of platforms to meet a wide variety of customer needs. 13 This suggests that Digital believed that the network effect lies at the operating systems layer. Digital clearly assumed that application developers would rely on operating systems to give their products portability, rather than porting the software themselves.

Even the designers of the IBM PC did not expect the rapid diffusion and eventual dominance of the PC architecture. 14 The ‘open architecture’ strategy, which turns out to

10 Wallace and Erickson (1993):232. ‘‘But no one, including Lotus, knew in 1982 (when 1-2-3 was developed) that the IBM PC would become the standard. According to Multiplan marketing manager Jeff Raikes, Kapor just happened to guess right. I don’t even think Lotus understood the key elements of their success.’’

11 Wallace and Erickson (1993):222. ‘‘Though Microsoft had developed the operating system for the IBM personal computer, not even Gates believed the machine or his DOS would become as successful as they did, eventually dominating the market. So a key Microsoft goal for Multiplan was portability. Multiplan was to be an application that could run on different machines and different operating systems; it was eventually tailored to run on more than 80 different computer platforms. ‘Everybody was guessing about how the personal computer market would develop,’ said Jeff Raikes. ‘To be honest we guessed wrong. We thought there would be dozens and dozens of platforms, maybe even hundreds.’’’

12 Wallace and Erickson (1993):211. ‘‘When CP/M was finally released for the PC in the spring of 1982, it was priced at $240, or four times as much as DOS. Eventually Digital slashed its price to be more competitive with Microsoft. Gates wanted to eliminate Digital Research before CP/M was available for the IBM PC and could compete directly with MS-DOS. Soon after IBM’s PC made its debut, Gates suggested to his friend Eddie Curry of LifeBoat Associates that perhaps Microsoft should put DOS in the public domain as a way of getting rid of CP/M once and for all. Gates may have been only half serious, said Curry, but the remark showed how badly Gates wanted to eliminate what he thought could be a serious competitor for the PC operating system.’’ This latter comment suggests that Gates did not anticipate the enormous revenues associated with MS-DOS.

13 Sherman (1984):76. Gary Kildall states: ‘‘[Unix] will be another standard. It’s not going to be anything that replaces existing standards. I don’t think it’s unreasonable to have a couple of different standards, because there are different purposes for the software.’’

14 Wallace and Erickson (1993):214. Don Estridge, who was in charge of the development of the IBM PC: ‘‘We reached that conclusion to offer open architecture because we thought personal computer usage would grow far beyond any bounds anybody could see back in 1980. Our judgment was that no single software supplier or single hardware add-on manufacturer could provide the totality of function that customers would want. We didn’t think we were introducing standards. We were trying to discover what was there and then build a machine, a marketing strategy, and distribution plan that fit what had been pioneered and established by others in machines, software and marketing channels.’’


have been highly diffusion oriented, was ideal for a market structure with ‘external economies’ (Langlois (1992)), a lesson IBM undoubtedly learned from the rapid success of the Apple II (Freiberger and Swaine, 1984). IBM’s strategy was to produce a box that was unexceptional, but functional. 15 And, while it is helpful to talk about installed base and the importance of network externalities, it is useful to remember that there was already a significant installed base of microcomputer users, with three firms capturing 75% of the market. If the IBM PC had failed to become a dominant platform, as its DEC, HP, TI, and other counterparts did, that failure would have been attributed to the large installed base of Apple or Commodore or Radio Shack.

3.2. Implications of variety for development strategies

The discussion above illuminates the fierce variety of development platforms that software developers were obliged to consider in offering new software or in porting software to new platforms. We next describe the approaches taken in response to this variety.

3.2.1. Device dependent strategies

The first strategy we consider is to design and develop software for a single platform, usually the one with the largest, or expected largest, installed base. After successful launch there, port the software to the platform with the next largest installed base, and iterate. Alternatively, one could specialize development to a smaller, perhaps less competitive platform, learn from the users there, and port to platforms with larger installed bases. However, the lag time between product release and porting to the next platform would be a significant disadvantage if incumbents could copy winning elements of design or functionality. In the absence of this lag time, such a strategy would be a useful device for learning customer tastes. But such a strategy would not be recommended unless that knowledge could be applied quickly to more profitable markets. Thus, according to Sigel and Giglio (1984), firms usually developed products for the IBM PC first.

Either a single- or multi-platform strategy would most likely be implemented in a highly platform dependent way. That is, each software product would have to be carefully designed for its hardware environment. It is helpful to note that microcomputers in the late 1970s were almost toys in their computing power. In order to provide reasonable performance, developers used a specialized language called ‘assembler’ to overcome the inabilities of the microprocessor and peripheral devices. As we shall see,

15 Ž .Wallace and Erickson 1993 :214 ‘‘Before the PC announcement in August of 1981, Commodore, Apple,and Tandy’s Radio Shack had been the Big Three of the personal computer industry, with a 75% market share.None of them seemed to take IBM’s PC seriously because there was nothing innovative about it. The computerused existing technology and software. But that was just what the Boca Raton team had intended. ‘‘When wefirst conceived the idea for the personal computer in 1980, we talked about IBM being in a special position toestablish standards, but we decided we didn’t want to introduce standards,’’ explained Project Chess leaderDon Estridge in an interview with Byte magazine two years after the PC was announced. ‘‘We firmly believedthat being different was the most incorrect thing we could do.’’


this strategy improved performance on one platform, but increased costs when porting from one platform to the next.

3.2.2. Device independent strategies

A second strategy would be to design software in a relatively device independent environment, and then generate executable code for a particular platform. The product could then be released onto a number of different hardware platforms rapidly or simultaneously. One of the eventual motivations for a graphical user interface later in the decade was to provide this ‘device independence’. As the nascent microcomputer software industry emerged, BASIC was useful for this kind of ‘platform independent’ development, since there were versions of BASIC for each platform. While BASIC allowed for greater portability, the code was usually too slow for commercial development. Microsoft, for example, turned first to ‘p-code’ 16 and then to ‘C’ for portability across several platforms. More sophisticated development tools could have been used, but the CASE (Computer Aided Software Engineering) tools prevalent in today’s computing environment were not well developed in 1980.

3.2.3. Trade-offs between strategies

Tables 4 and 5 illustrate the implications of these two strategies. First is an example of a piece of ‘C’ code that prints a new line and ‘Hello!’ on the screen. The 8086 assembler code required to do the same thing follows.

Pressman (1992) provides the following estimates of the relative programming time demands of various languages for an equivalent piece of product development. It is misleading to consider only the cost without the expected benefits, and these are only rough estimates. Table 6 shows the extremely high cost of developing assembly language programs in relation to more general approaches such as FORTRAN or C.

3.2.4. Development in portable vs. machine-specific languages

While the assembler program above requires more care in development, it runs about 20% faster than the comparable C code. 17 The C program, however, is easier to update or to develop further. Its English-like structure allows a programmer to more easily identify the purpose and execution of the code. The assembler code makes use of esoteric commands like ‘int 21h’, which are specific to the platform (in this case, MS-DOS sets an interrupt on the Intel chip). Even the layout of the program, which identifies a code, data, and stack segment, is specific to the Intel chip or those based on its design. Motorola chips, for example, do not require separate segments. Every hexadecimal number in the code segment of the program refers to a location or function that is specific to the Intel 80x86/IBM BIOS/MS-DOS environment. Thus, to port

16 Packed-code. See Ichbiah and Knepper (1991).

17 We could improve on this performance by generating code from the assembler for a COMMAND file (Hello.COM) instead of an EXECUTABLE file (Hello.EXE). The COM file has fewer bytes and certain layout restrictions so that DOS can load and begin execution faster.


Table 4
Hello.c

main()
{
    puts("\nHello!");
}

the assembler program to another platform, one would change the layout of the program, the sequence of steps involved in preparing an interrupt, and the address of the interrupt calls. The program example here is extremely simple and only displays text. If the program were to display graphic information, porting would be an even thornier issue. In contrast, it would be a simple matter to port the C code by recompiling it on any machine for which a C compiler was available.

Figs. 4 and 5 provide an important context for this discussion. First, Fig. 4 shows the power, measured in MIPS (Million Instructions Per Second), available on the family of Intel microprocessors. After understanding this technological trajectory in semiconductor functionality, Bill Gates of Microsoft concluded in the late 1970s that ‘hardware is free’ and determined that software developers would play a significant role in the microcomputer industry. His insight was that the dramatic increase in hardware capability over the decade would lead to a software bottleneck; that software firms, who provide the interface between applications and CPUs, would profit from this increase in processor power. Fig. 5 shows the number of transistors on a single chip. This growth in

Table 5
Hello.asm

codeseg  segment para public 'CODE'
         assume cs:codeseg, ds:dataseg, ss:stackseg
print    proc far
         mov ax,dataseg
         mov ds,ax
         mov dx,offset message
         mov ah,9
         int 21h
         mov ax,4C00h
         int 21h
print    endp
codeseg  ends

dataseg  segment para 'DATA'
message  db 0Dh,0Ah,'Hello!',0Dh,0Ah,'$'
dataseg  ends

stackseg segment para stack 'STACKSEG'
         dw 64 dup (?)
stackseg ends

         end print


Table 6
Relative cost of various software development languages

Assembly language          20.00
COBOL                       6.67
FORTRAN                     6.67
Pascal                      6.00
Ada                         4.67
Object-oriented languages   2.00
4GL                         1.33
Code generators             1.00

Source: Pressman (1992).

functionality also meant that more and more processing power would be placed on the desktop over the 1980s (also called ‘Moore’s Law’ after Intel co-founder Gordon Moore). The slow processor power of the early 1980s, by contrast, demanded careful product design and optimization.

A brief real-world example illustrates the importance of this argument. When Dan Bricklin developed Visicalc for the Apple II, it was written in Motorola 6502 assembler to provide peak performance. When the IBM PC was announced, Visicalc was ported to the Intel 8088 environment by means of a cross-assembler, an automated tool for taking assembler instructions for one microprocessor and porting them to another

Fig. 4. Semilog plot of number of MIPS.


Fig. 5. Semilog plot of number of transistors.

environment. 18 While using a cross-assembler demanded less programmer time than completely redesigning the product under the PC standard, it resulted in executable code that was slower. Lotus 1-2-3 was written in assembler for the IBM-PC to offer a performance difference that was dramatic enough to mitigate the network effects of the large installed base of Visicalc users on the Apple II. In order to accomplish this remarkable performance, Lotus had to have a comprehensive understanding of both MS-DOS and the Intel 8086 microcode. In fact, Lotus 1-2-3 v1 was so closely wedded to the way MS-DOS worked that it was claimed that Lotus was inoperable without significant adaptation in subsequent MS-DOS versions (see Wallace and Erickson (1993)). 19

3.2.5. Implications

By writing in high-level code, firms can quickly port to many platforms, but at the cost of reduced performance. It is critical to understand that software cannot be ported to

18 Per Robert Merges, a Software Arts employee at the time.

19 Wallace and Erickson (1993) state the following: ‘‘According to one Microsoft programmer, the problems encountered by Lotus were not unexpected. A few of the key people working on DOS 2.0, he claimed, had a saying at the time that DOS isn’t done until Lotus won’t run. They managed to code a few hidden bugs into DOS 2.0 that caused Lotus 1-2-3 to break down when it was loaded.’’ Whether this is indeed the case is not well documented. However, one of the early tests for compatibility was whether the computer ran all modules (graphics, spreadsheet and database) of Lotus 1-2-3.


several platforms without a significant investment of time and skill. If a strategy of development in assembler language is chosen, performance can be optimized for a particular platform, but at the cost of ease of portability. Of course, the choice is really made along a continuum of degrees of portability; creative design can improve both portability and performance, but there are diminishing returns. We should also note that, eventually, the need to port software cannot be avoided. As next generations of hardware and operating environments emerge, most software manufacturers will choose to develop for the new environment, as with the need to adapt Lotus 1-2-3 for MS-DOS 2, 3, 4, etc. The transitions from the Apple II to the IBM-PC, MS-DOS to Windows, and Windows 3.1 to Windows 95 are all examples of this dynamic. While some of these changes seem minor, others are so significant that they require fundamentally rewriting the application. As firms anticipate the emergence of new platforms, they should evaluate the marginal benefit of porting. Potential entrants to the new platform must also estimate the likely level of competitiveness in the new niche, since success on one platform does not guarantee success on another. In making this decision, strategists are more likely to port products that have already proven themselves in one environment, or ones for which there are strong expectations of success. Thus, while porting is a difficult task, it is likely to be embarked upon only for more promising software products.

While possible, providing software across a variety of platforms was difficult and expensive. Software variety could not be cheaply provided to multiple platforms, at least not on the relatively slow microprocessors available circa 1980. This establishes two useful results: first, that it is expensive to supply software for a variety of platforms; and second, that due to the network effect, customers care about competitive supply of complementary product variety, and this affects their willingness to pay for hardware.

4. Methods and data

We now turn to the statistical analysis of the effect of software variety on the value of hardware to evaluate the network effect.

4.1. Hedonic prices for software variety

Rosen (1974) provides the economic theory for hedonic prices: implicit prices of product attributes where separate markets for bundles of these attributes do not exist. In the computer hardware industry, the valuation of software is available separately from the hardware, but the effect of the variety of the complementary good is not separated. Thus, when consumers purchase a piece of hardware, they are also buying an option on the software that is expected to be available for that machine. Hedonic pricing techniques have received some refinement since Rosen’s pioneering work, which was criticized because it did not account for the fact that consumers simultaneously choose price and product attributes. While several methodologies have been advanced to address this simultaneity (see Epple (1987); Bartik (1987); Brown and Rosen (1982)),


we employ one of the more common approaches: time series cross-sectional methods. Kanemoto and Nakamura (1986) recommend observing a market over several time periods to circumvent problems with simultaneity, and the panel data methods described below accomplish this.

Triplett (1989) argues that the form of the hedonic equations may derive either from supply of or demand for the good. In empirical work, we often do not have enough information to determine the form of the hedonic function. Industry characteristics and market demand will favor either a ‘resource-cost’ or a ‘user-values’ form of the hedonic equation, where the hedonic function bows either towards or away from the origin. 20 Hedonic methods thus require some flexibility over the choice of functional form, although none of the studies of computer hardware mentioned earlier consider this issue. To employ an appropriately flexible form, Triplett advises the use of the Box–Cox transformation; Halvorsen and Pollakowski (1981) discuss the implication and implementation issues.

We can explore the effect of software variety on hardware valuation by using the hedonic function approach. As Church and Gandal (1996) argue, software variety imposes an indirect network effect on hardware valuation. Understanding and estimating the strength of this relationship is important for a model of the co-evolution of the software and hardware industries and has significant implications for both the hardware and software industries.

4.2. Statistical estimation: panel data methods

The following discussion relies on Judge et al. (1985) and Hsiao (1986). We apply panel data methods to study the effect of software variety on the decline in hardware prices as follows. Suppose that there are I individual platforms with K product attributes observed over T time periods, indexed by i, k, and t respectively. The price for platform i at time t, p_it, is modeled as a function of platform factors and product attributes. The most general model for time series and cross-sectional data would be: p_it = b_1it + Σ_{k=2..K} b_kit x_kit + e_it. This specification allows all coefficients to vary over time and for all platforms. We employ a more restricted model, called the analysis of covariance model: p_it = a*_i + b x_it + u_it, where a*_i is the individual platform effect, b are the effect coefficients, and the u_it are assumed i.i.d. with mean 0 and variance σ²_u. This specification holds a platform effect constant and allows other characteristics to vary over the observation period. Since the platforms remain relatively stable in performance attributes and composition over time, this model assumes a fixed effect of the platform that is increased or decreased by the covariates. We also assume that the unobserved covariates, such as the hardware package of memory, disk, etc., remain constant in relation to other platforms over time. This assumption seems innocuous, as users’ demand for peripherals and increased processing power would appear to be consistent across computers. Hsiao (1986) recommends this fixed effects model over the

Ž .consistent across computers. Hsiao 1986 recommends this fixed effects model over the

20 Ž . Ž .Dulberger 1989 and Triplett 1989 discuss the issues.


random effects model when the identity of the individual subject matters. Since we know that individual platforms vary over quality and peripherals, it is important to use the fixed effects approach. 21
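The fixed effects specification above can be illustrated with a minimal sketch of the within estimator: demeaning each platform's series sweeps out the platform effect a*_i, and OLS on the demeaned data identifies b. All numbers here are synthetic and hypothetical; this is not the paper's estimation code, only an illustration of the mechanics under assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic panel: I hypothetical platforms observed over T periods.
# None of these values come from the paper's data.
I, T = 5, 6
alpha = rng.normal(8.0, 1.0, size=I)      # platform fixed effects a*_i
beta = -0.15                              # assumed true covariate effect b
x = rng.uniform(3.0, 8.0, size=(I, T))    # covariate x_it (e.g., log software variety)
u = rng.normal(0.0, 0.05, size=(I, T))    # i.i.d. disturbances u_it
logp = alpha[:, None] + beta * x + u      # p_it = a*_i + b x_it + u_it (log price)

# Within transformation: demeaning each platform's series sweeps out a*_i,
# so OLS on the demeaned data identifies b without estimating the a*_i.
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = logp - logp.mean(axis=1, keepdims=True)
beta_hat = (x_dm * y_dm).sum() / (x_dm**2).sum()

# The platform effects are then recovered from the platform means.
alpha_hat = logp.mean(axis=1) - beta_hat * x.mean(axis=1)
print(round(beta_hat, 3))
```

The within estimator is numerically identical to including a dummy variable for each platform, but avoids estimating the a*_i directly, which matters when I is large relative to T.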

Because the hedonic function may take on various forms, we test Box–Cox transformations of the data as discussed above. That is, we maximize the Box–Cox log-likelihood with respect to the parameters of the regression θ and the transformation variable λ:

L_max(θ, λ) = −(n/2) ln σ̂²(θ, λ) + (λ − 1) Σ_{k=1}^{n} ln P_k.

For this particular data, the Box–Cox function is maximized when λ is near zero, and this provides justification to transform the dependent variable to the log of price.
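The λ selection above can be sketched as a grid search over the concentrated log-likelihood. This minimal version uses an intercept-only regression (so σ̂² is just the variance of the transformed series) and hypothetical prices, not the study's data:

```python
import numpy as np

def boxcox_loglik(y, lam):
    """Concentrated Box-Cox log-likelihood:
    L(lam) = -(n/2) ln sigma^2(lam) + (lam - 1) sum(ln y_k),
    with sigma^2 the MLE variance of the transformed series."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # lam -> 0 gives the log transform as the limiting case
    yt = np.log(y) if abs(lam) < 1e-8 else (y ** lam - 1.0) / lam
    sigma2 = yt.var()
    return -0.5 * n * np.log(sigma2) + (lam - 1.0) * np.log(y).sum()

# Hypothetical platform prices (US$); scan a grid of lambdas
prices = np.array([1295.0, 2495.0, 999.0, 4995.0, 1795.0, 3295.0])
grid = np.linspace(-1.0, 1.0, 201)
best_lam = max(grid, key=lambda lam: boxcox_loglik(prices, lam))
```

A λ near zero at the maximum is the usual justification for taking logs of the dependent variable, as the text describes.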

4.3. Data

Two sets of data are required for this research. One piece gathers information on software availability, the other on microcomputer hardware. The software data were assembled from The Software Catalog: Microcomputers, published by Elsevier. 22 The catalog lists thousands of software packages available from hundreds of firms for the U.S. market. The data cover the years 1981 through 1986, a critical period in the microcomputer software industry since it spans the pre-IBMPC era through to the introduction of the Macintosh. During this period, the number of products available grew from 3,700 to 13,000, and the number of companies from 500 to 6,000. International Data Corporation (IDC) provides information on the microcomputer hardware for the related period. Since the differing sets of data do not code hardware platforms identically, we created a field for platforms to establish the relationship between the hardware and software data.

5. Analysis and results

To summarize, we described the situations under which consumer willingness to pay for hardware should be greater: increasing variety and increasing competition in the supply of software. We then offered a rationale for the use of time-series cross-sectional analysis (Table 7).

21 The unique problems of time-series cross-sectional data call for some extensions to a linear model in order to appropriately account for the distribution of the error terms. This specification is subject to all of the concerns in a typical linear regression, especially autocorrelation. However, the diagnostics are more difficult to perform because of the complex structure of the data. Of commercial software packages for estimating time-series cross-sectional series, most applications are inadequate to the task for this particular data set. For example, SAS's TSCSREG procedure does not allow missing data, which pares the number of observations down to an unacceptably small number. GAUSS's tscs procedure does not account for autocorrelation, which is certain to be an important factor in this study. Our approach is to use software written specifically for TSCS work by Koput (1995) under Matlab. The software implements methods advocated by Hsiao (1986). This tool is more efficient in that more of the observations can be included in the analysis, and explicit modeling of the autocorrelation is allowed.
22 The 1981/82 data were published by Imprint Software.

Page 23: Software variety and hardware value: a case study of complementary network externalities in the microcomputer software industry

( )T. Cottrell, K. KoputrJ. Eng. Technol. Manage. 15 1998 309–338 331

Table 7
Selected platforms for time-series cross-sectional analysis

Computer make and model

80 MICROCOMPUTER: 80 MICRO 286-8+
APPLE COMPUTER: APPLE II
APPLE COMPUTER: MACINTOSH
ARCHIVES: ABS 1, 2, 3
ATARI: 130 XE, 65XE
ATARI: 400, 800, 600XL, 800XL, 1200XL
COLECO INDUSTRIES: ADAM
COMMODORE INTERNATIONAL: 128
CROMEMCO: SYSTEM ZERO/D, ZERO, ONE, THREE
DIGITAL EQUIPMENT: RAINBOW SERIES
EAGLE COMPUTERS: EAGLE IV
EPSON AMERICA: GENEVA PX-8
EXIDY SYSTEMS: 80/1, 2, SORCERER
GROUPE BULL: ZENITH Z89, Z90
KAYPRO: 1, 2X, 10
LOBO SYSTEMS: MAX-80
LOGICAL BUSINESS MACHINES: DAVID
NORTH STAR COMPUTERS: HORIZON SERIES
OSBORNE COMPUTER: EXECUTIVE
SINCLAIR: ZX/80, 81, 1000
SOUTHWEST TECHNICAL PRODUCTS: 09+
TANDY/RADIO SHACK: TRS-80/III
TELEVIDEO: TS 800
TEXAS INSTRUMENTS: 99/4, 4A
VECTOR GRAPHIC: VECTOR SX SERIES
VISUAL TECHNOLOGY: VISUAL 1050

Selection criteria. In order to be included in the database for this analysis, products had first to be classified by IDC as 'microcomputers.' Some of the platforms initially included were clearly multi-user systems, and these were excluded. Of the remaining platforms, those with prices over US$7000 were excluded. Of the remainder, platforms were selected based on information on software availability and length of the time series for each platform. This reduced the number of data points to 188 period/platform observations.

5.1. Hedonic prices

The quality-adjusted value of hardware platforms is a classic application of hedonic pricing. Chow (1967) finds, in the absence of detail on software availability, only three parameters statistically significant in the hedonic equation for computer hardware price: memory size, multiply time, and access time. Stoneman (1976) chooses coefficients for cycle time, floor space and maximum memory to explain 1963 computer prices. However, both of these studies cover a time when hardware and software were bundled. Further, software variety and availability continues to be ignored in more recent work (e.g., Hartman and Teece (1990)). Stoneman (1983) regrets the lack of good information on the cost of software supply in estimating computer technology diffusion models. Although it was common in the 1960s for hardware consumers and producers to develop their own software, by the time the microcomputer emerged in the late 1970s, the supply

Page 24: Software variety and hardware value: a case study of complementary network externalities in the microcomputer software industry

( )T. Cottrell, K. KoputrJ. Eng. Technol. Manage. 15 1998 309–338332

of third-party software was well established. With this third-party product information, we have a more objective measure of software supply and its effect on hardware valuation.

5.2. Covariates in the study

We employ a 'best subsets' approach to choose the addition of covariates to the model, where the criterion for inclusion of a covariate is better model fit as measured by R-squared. The covariates selected under this method are: platform i's market share in year t (current year's percent of total sales), platform i sales in year t, year t variety for the platform (measured for each platform over c = 15 categories as entropy = Σ_{i=1}^{c} (p_i / Σ_j p_j) log(Σ_j p_j / p_i), where p_i is the number of products in category i), hardware installed base in year t−1, and total number of software products available for the hardware platform in year t.

Estimating the hedonic function requires transformation of the variables in order to produce unbiased estimates (see Kanemoto and Nakamura (1986)). The variables in the hedonic estimations of hardware value based on software availability are transformed to maximize the Box–Cox function as described in Section 4 above. With the exception of Model 1, the optimal transformation employed a λ near 0; Table 8 presents the results of the time-series cross-sectional estimation. Model 1 uses the Box–Cox λ which was derived as the optimization of the Box–Cox function.
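The entropy measure of variety can be computed directly from category counts. A minimal sketch, with hypothetical product counts rather than figures from the catalog data:

```python
import math

def software_entropy(counts):
    """Entropy measure of software variety across application categories.

    counts: number of software products in each of the c (here up to 15)
    categories for one platform in one year. Returns
    sum over categories of s_i * log(1/s_i), where s_i is the share of
    the platform's products in category i. Zero-count categories
    contribute nothing; all products in one category gives entropy 0.
    """
    total = sum(counts)
    if total == 0:
        return 0.0
    shares = [n / total for n in counts if n > 0]
    return sum(s * math.log(1.0 / s) for s in shares)

# Hypothetical platform: products spread over 4 of 15 categories
spread = software_entropy([40, 30, 20, 10])   # higher = more variety
narrow = software_entropy([100, 0, 0, 0])     # all in one category -> 0.0
```

Entropy rewards an even spread across categories rather than raw product counts, which is exactly the distinction the best-subsets results below turn on.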

Of all single-variable regressions, the best R-squared is found with entropy as the covariate added to the time-series regression. This is good support for the complementary externalities argument that buyers value software variety. Model 2 is the best 2-variable regression, and shows covariates for market share and current year's sales.

Table 8
Hedonic pricing with time-series cross-sectional analysis (dependent variable: log average price)

Model  Covariate                                 Coeff    StdErr    t        P-value  Sig   R²
1      Software Variety (entropy) (see note)      0.340    0.125     2.715   0.0073   ***   0.040
2      Log(Market Share)                          0.498    0.070     7.075   0.0000   ***   0.215
       Log(Sales)                                -0.575    0.084    -6.818   0.0000   ***
3      Log(Market Share)                          0.498    0.069     7.245   0.0000   ***   0.255
       Log(Sales)                                -0.562    0.083    -6.809   0.0000   ***
       Software Variety (entropy)                 0.307    0.097     3.156   0.0019   ***
4      Log(Market Share)                          0.547    0.073     7.469   0.0000   ***   0.269
       Log(Sales)                                -0.619    0.088    -7.063   0.0000   ***
       Software Variety (entropy)                 0.260    0.100     2.599   0.0101   **
       Lag(Log(Installed base))                   0.027    0.015     1.848   0.0662   *
5      Log(Market Share)                          0.590    0.077     7.625   0.0000   ***   0.280
       Log(Sales)                                -0.655    0.090    -7.288   0.0000   ***
       Software Variety (entropy)                 0.339    0.110     3.073   0.0024   ***
       Lag(Log(Installed base))                   0.036    0.016     2.331   0.0208   **
       Log(Total number of S/W products)         -0.081    0.049    -1.661   0.0983   *

Note: Model 1 lambda is 0.2558.
*** Significant at 1% level. ** Significant at 5% level. * Significant at 10% level.


Sales should evidence an inverse relationship with price, and the coefficient enters the model with the correct sign. Market share is calculated as platform sales over sales of all platforms in the database for a given year. The regression shows that greater market share in the current year correlates with higher prices. This effect suggests support for the network effect if current year market share reflects expectations about future installed base. That is, if customers value a larger network of users, producers may be able to capture part of this increased valuation in the price of the good. This result supports the complementary externalities model.
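The market share calculation described above is a simple within-year normalization. A sketch with hypothetical 1983 figures (not the IDC data):

```python
from collections import defaultdict

def market_shares(sales):
    """Platform i's market share in year t: its sales divided by total
    sales of all platforms in the database in year t.

    sales: {(platform, year): units sold}
    returns: {(platform, year): share in [0, 1]}
    """
    totals = defaultdict(float)
    for (platform, year), units in sales.items():
        totals[year] += units
    return {(p, y): units / totals[y] for (p, y), units in sales.items()}

# Hypothetical figures for illustration only
sales = {("Apple II", 1983): 600, ("TRS-80", 1983): 300, ("Adam", 1983): 100}
shares = market_shares(sales)   # Apple II's 1983 share is 600/1000 = 0.6
```

Because shares are computed within year, the measure is relative to the platforms actually in the database that year, as the text notes.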

The entropy covariate enters the equation for the best 3-variable regression. Entropy enters earlier than simply the total number of products available for the platform. Since the measure of variety is only a crude approximation of what any individual consumer is likely to be concerned with in a hardware purchase, we consider this to be strong support for the effect of complementary product network externalities.

Since we cannot infer causation in the relationships, it is helpful to consider alternative rationales for the size and strength of this coefficient. Hardware producers aggressively seek successful developers to create software for a new hardware platform. For example, when the IBMPCjr was announced, a number of programmers of popular IBMPC packages were asked by IBM to provide scaled-down applications for it (see Freiberger and Swaine (1984)). Developers are more likely to invest time and money in producing software if they believe that the hardware platform will be successful. Similarly, hardware marketers expect a higher price for products for which consumers find interesting applications. In general, hardware promoters will have to offer discounts to induce customers of current hardware products to switch to a new platform. This discount is expected to be greater if sufficient variety is not available.

The size of the installed base of hardware users affects potential hardware buyers. The coefficient for this covariate is not significant if taken in the current year, but this is likely due to the higher correlation with current year market share over the periods observed. Network effects beyond mere software availability may derive from any number of sources. Network externalities in computer hardware derive, at least in part, from hardware consumer user groups. Most large cities have user groups for a number of popular computer platforms, including Amiga, Atari, PC, Mac, and Sun computers. Users benefit from the information these groups provide on maintenance or application techniques. A larger network of users often results in greater competition over parts and service expertise, and consumers will value this externality. Through similar reasoning, it is possible that a larger installed base will result in a higher price. However, given uncertainty over adoption patterns, consumers may demand better information on installed base than simply current estimates, and there is a lag until size of the installed

Table 9
Correlation of platform age with number of software products, entropy and installed base

                  Age
Total products    0.2542
Entropy           0.2134
Installed base    0.4411


Table 10
Mean and standard deviation of covariates by platform age

                   Entropy           Number of products   Installed base
Age                Mean     StdDev   Mean      StdDev     Mean       StdDev
All ages           0.733    0.283      503       1284      337,212   1,033,724
Less than 1 year   0.563    0.325      179        336      107,633     202,960
1 year             0.669    0.315      327        722      162,264     333,429
2 years            0.758    0.270      775       1891      632,098   1,801,595
3 or more years    0.796    0.237      502       1161      298,415     589,791

base is known with any assurance; consumers may seek more objective evidence of the size of the platform's installed base. Thus, we lag the size of the installed base to account for the fact that users will look for stronger support for the size of the installed base since they do not know it contemporaneously.
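Constructing the lagged covariate amounts to pairing each period/platform observation with the prior year's value and dropping each platform's first year, for which no lag exists. A minimal sketch with hypothetical installed-base figures:

```python
def lag_by_one_year(series):
    """Return {(platform, year): prior year's value}, dropping each
    platform's first observed year (no lagged value is available)."""
    return {(p, y): series[(p, y - 1)]
            for (p, y) in series if (p, y - 1) in series}

# Hypothetical installed-base figures, not the study's data
base = {("Macintosh", 1984): 250_000,
        ("Macintosh", 1985): 500_000,
        ("Macintosh", 1986): 900_000}
lagged = lag_by_one_year(base)
# 1985 is paired with the 1984 base, 1986 with the 1985 base; 1984 drops out
```

Dropping the first year for each platform is one reason lagging a covariate shrinks the usable sample, which matters when the panel is already short.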

The final variable to enter the model is the total number of software products available for the platform. This coefficient has the opposite of the sign predicted by the theory of complementary product externalities, but the coefficient is not statistically significant. Even so, there are two ways to explain the negative coefficient:

First, the number of software products for a platform increases over time (see Table 9). While an intuitively appealing measure of competition, the number of products does not appear to affect consumers' willingness-to-pay as postulated in the models by Church and Gandal (1993). The number of products is only a proxy for competition in the niche and may miss the effect of expected greater competition. The measure of competitiveness presented here examines only the total number of software products and does not account for application type; mixing types in the analysis may swamp the effect of the few important categories. For example, game software comprises a significant number of products but may not represent any particular level of competition in the applications that customers are most concerned about. Thus, the number of products may not be a good measure of either the preference for variety or the expected competitiveness of software suppliers. For this reason, we performed a separate analysis, both with and without game software, but the results are not qualitatively different.

To explain a second rationale for the negative coefficient, it is useful to consider three effects as platforms age: the installed base increases, newer platforms are introduced, and prices decline. Software producers may be attracted to the market for older machines because the installed base continues to increase even though the price for the same computing technology declines. While platforms with more software may decline more slowly in price, there is no guarantee that this effect will be captured in the statistical analysis of a range of platforms. 23 Further, as the installed base increases, the supply of used computers increases and constrains the price of new hardware.

Tables 9 and 10 show how the covariates important to complementary externalities develop with age of the platform. The first table demonstrates that number of products,

23 In separate work not reported here, a regression of number of products against the price of platforms over 2 years of age shows this coefficient is positive, but not statistically significant.


variety and size of the installed base all increase with age of the platform. Platforms less than a year old generally have about a standard deviation less of each metric: entropy, number of products, and installed base.

6. Discussion

In this analysis, we examine microcomputer platforms over a longer period of time than is typical of similar studies of computer hardware. Historically, hedonic regressions are estimated only on the 'technological frontier' in order to capture the improvement in the quality-adjusted price that results from new technology. Both Chow (1967) and Stoneman (1976) estimate hedonic prices only for new platforms under this rationale. However, if new platforms are characterized by lower software variety, and we show the justification for this in an earlier table, the cost of adopting new technologies is understated. The lower price offered by platform producers could be seen as an inducement to compensate early adopters for the switching costs of porting software to the new platform, or as a risk premium for the possibility that the platform won't gain software developer support. While clearly the 'quality-adjusted' price is still declining, the rate of change on the 'frontier' overstates the benefits users realize. In this analysis, we see that product variety and the number of products available are both increasing as the installed base of users increases. As software variety increases over time, the effect may be to compensate late adopters for use of otherwise obsolete technology. The effect of software in the adoption decision may lead a hardware-only analysis to either overstate or understate the consumer benefit of the diffusion of computer technology. In the hedonic models for computer hardware by Stoneman and Chow, failure to account for this software effect suggests that the overall benefit of hardware is overstated. The fact that federal procurement of computer hardware

evidences a platform inertia (Greenstein, 1993) supports the significance of software.

With respect to the complementary product externalities literature, the real-world

implications of software development in a multiple-platform environment provide a useful complement to that envisioned in game-theoretic models of compatibility and network effects. Indeed, if firms are able to cheaply port software to multiple platforms by using general software development techniques, more products on any particular platform may not represent any future commitment to that platform and may not induce externalities. The risk of stranding a particular platform is greater if there are fewer software vendors who have made a credible commitment to it in the form of difficult-to-port software. A richer theoretical model where producers can select alternative porting strategies is needed.

We conclude that there is a positive relationship between software variety and price. One interpretation is that variety serves as a signal of platform quality. Since consumers cannot easily observe the quality of hardware, software variety serves as a quality signal to uninformed buyers. Software variety may be a signal of quality for new computer platforms, but once the platform is established, that role as a signal may be less important. However, the important link between software variety and platform value is supported.


7. Summary and conclusion

The variety of software available for a computer platform is important to the adoption and diffusion of new computer technologies. Rather than a static or unresponsive environment, computer software and hardware markets are subject to significant interaction effects. Computer hardware manufacturers are acutely aware of the need to provide software for new platforms. The analysis above supports the importance of such a strategy, but identifies the significant need for variety over merely numbers of products. The network effect from a larger installed base appears to increase consumer valuation of hardware. However, with changes in the current computing environment, the nature of these effects is most likely evolving. More empirical work is needed to determine exactly which aspects of network externalities are most important. Further, the extent to which these effects are important will vary over the evolutionary stage of the platform. Early adopters will likely look for signals of platform quality from the variety of products available. As installed base increases, however, variety may play a smaller role, but number of products may be important. New computer languages that facilitate distributed applications, Java for example, may make platform-specific applications obsolete. Whatever the pattern, more research on consumption behavior is needed to develop the strategic implications of this research.

References

Arthur, W.B., 1988. Competing technologies: an overview. In: Dosi, G., Freeman, C., Nelson, R.R., Silverberg, G., Soete, L. (Eds.), Technical Change and Economic Theory. Pinter Publishers, London, England.
Bartik, T.J., 1987. The estimation of demand parameters in hedonic price models. J. Pol. Econ. 95 (1), 81–88.
Brown, J.N., Rosen, H.S., 1982. On the estimation of structural hedonic price models. Econometrica 50 (3), 765–768.
Cargill, C.F., 1989. Information Technology Standardization: Theory, Process and Organizations. Digital Press, Bedford, MA.
Chow, G.C., 1967. Technological change and the demand for computers. Am. Econ. Rev. 57, 1117–1130.
Chposky, J., Leonsis, T., 1988. Blue Magic: The People, Power, and Politics Behind the IBM Personal Computer. Facts on File, New York, NY.
Church, J., Gandal, N., 1992a. Integration, complementary products, and variety. J. Econ. Manage. Strategy 1 (4), 651–675.
Church, J., Gandal, N., 1992b. Network effects, software provision, and standardization. J. Ind. Econ. 40 (1), 85–103.
Church, J., Gandal, N., 1993. Complementary network externalities and technological adoption. Int. J. Ind. Organization 11, 239–260.
Church, J., Gandal, N., 1996. Strategic entry deterrence: complementary products as installed base. Eur. J. Pol. Econ. 12 (2, Special Issue), 331–354.
Cottrell, T.J., 1994. Fragmented standards and the development of Japan's microcomputer software industry. Res. Policy 23 (2), 143–174.
Cusumano, M.A., Mylonadis, Y., Rosenblum, R., 1992. Strategic maneuvering and mass-market dynamics: the triumph of VHS over Beta. Business History Rev. 66 (1), 51–94.
David, P.A., Greenstein, S., 1990. The economics of compatibility standards: an introduction to recent research. Economics of Innovation and New Technology 1 (1).


Dulberger, E.R., 1989. The application of a hedonic model to a quality-adjusted price index for computer processors. In: Technology and Capital Formation. The MIT Press, Cambridge, MA.
Elsevier Publishing, 1980. The Software Catalog: Microcomputers. Elsevier, New York.
Epple, D., 1987. Hedonic prices and implicit markets: estimating demand and supply functions for differentiated products. J. Pol. Econ. 95 (1), 59–80.
Farrell, J., 1989. Standardization and intellectual property. Jurimetrics J. 50 (Fall), 35–50.
Farrell, J., Saloner, G., 1985. Standardization, compatibility, and innovation. Rand J. Econ. 16 (1), 70–83.
Farrell, J., Saloner, G., 1986. Installed base and compatibility: innovation, product preannouncements, and predation. Am. Econ. Rev. 76, 940–955.
Farrell, J., Saloner, G., 1992. Converters, compatibility, and the control of interfaces. J. Ind. Econ. 40 (1), 9–35.
Farrell, J., Shapiro, C., 1992. Standard setting in high-definition television. Brookings Papers on Economic Activity: Microeconomics, 1–77.
Freiberger, P., Swaine, M., 1984. Fire in the Valley: The Making of the Personal Computer. Osborne/McGraw-Hill, Berkeley, CA.
Gandal, N., 1994a. Refining the concept of network externalities. Discussion Paper 94, 1–11.
Gandal, N., 1994b. Hedonic price indexes for spreadsheets and an empirical test for network externalities. Rand J. Econ. 25 (1), 160–170.
Greenstein, S.M., 1993. Did installed base give an incumbent any (measurable) advantages in federal procurement? Rand J. Econ. 24 (1), 19–39.
Halvorsen, R., Pollakowski, H.O., 1981. Choice of functional form for hedonic price equations. J. Urban Econ. 10, 37–49.
Hartman, R.S., Teece, D.J., 1990. Product emulation strategies in the presence of reputation effects and network externalities: some evidence from the minicomputer industry. Economics of Innovation and New Technology 1 (1).
Hromadko, G.F., Mutert, B.L., 1986. Strategic View of Software Market Segments. Robertson, Colman and Stephens, San Francisco, CA.
Hsiao, C., 1986. Analysis of Panel Data. Cambridge Univ. Press, Cambridge.
Ichbiah, D., Knepper, S.L., 1991. The Making of Microsoft: How Bill Gates and His Team Created the World's Most Successful Software Company. Prima Pub., Rocklin, CA.
International Business Machines, 1984. Personal Computer Hardware Reference Library: Technical Reference. IBM, Boca Raton, FL.
Judge, G.G., Griffiths, W.E., Hill, R.C., Lutkepohl, H., Lee, T., 1985. The Theory and Practice of Econometrics, 2nd edn. Wiley, New York, NY.
Kanemoto, Y., Nakamura, R., 1986. A new approach to the estimation of structural equations in hedonic models. J. Urban Econ. 19, 218–233.
Katz, M., Shapiro, C., 1985. Network externalities, competition and compatibility. Am. Econ. Rev. 75, 424–440.
Katz, M., Shapiro, C., 1986. Technology adoption in the presence of network externalities. J. Pol. Econ. 94, 822–841.
Katz, M.L., Shapiro, C., 1992. Product introduction with network externalities. J. Ind. Econ. 40 (1), 55–83.
Katz, M.L., Shapiro, C., 1994. Systems competition and network effects. J. Econ. Perspect. 8, 93–115.
Khazam, J., Mowery, D.C., 1994. The commercialization of RISC: strategies for the creation of dominant designs. Res. Policy 23 (1), 89–102.
Koput, K.W., 1995. PCTSCS: Time Series Cross-sectional Statistical Tool. Self-published, Tucson, AZ.
Langlois, R.N., 1992. External economies and economic progress: the case of the microcomputer industry. Business History Rev. 66 (Spring), 1–50.
Langlois, R.N., Robertson, P.L., 1992. Networks and innovation in a modular system: lessons from the microcomputer and stereo component industries. Res. Policy 21, 297–313.
Manasian, D., 1993. The Computer Industry: Survey. The Economist, 3–18.
Neibauer, A.R., 1990. The ABC's of WordPerfect 5.1. Sybex, Alameda, CA.
Pressman, R.S., 1992. Software Engineering: A Practitioner's Approach. McGraw-Hill, New York, NY.
Rosen, S., 1974. Hedonic prices and implicit markets: product differentiation in pure competition. J. Pol. Econ. 82 (1), 34–55.


Rosenbloom, R.S., Cusumano, M.A., 1987. Technological pioneering and competitive advantage: the birth of the VCR industry. California Manage. Rev. 29 (4), 51–76.
Saloner, G., Shepard, A., 1991. Adoption of technologies with network effects: an empirical examination of the adoption of automated teller machines. Res. Pap. No. 1146, 1–29.
Sherman, C.E., 1984. Up and Running: Adventures of Software Entrepreneurs. Ashton-Tate, Culver City, CA.
Shurmer, M., 1993. An investigation into sources of network externalities in the packaged PC software market. Information Econ. Policy 5, 231–251.
Sigel, E., Giglio, L., 1984. Guide to Software Publishing: An Industry Emerges. Knowledge Industry Publications, White Plains, NY.
Stoneman, P., 1976. Technological Diffusion and the Computer Revolution. Cambridge Univ. Press, Cambridge.
Stoneman, P., 1983. The Economic Analysis of Technological Change. Oxford Univ. Press, Oxford.
Swann, G.M.P., 1987. Industry standard microprocessors and the strategy of second-source production. In: Naert, P.A., Bensoussan, N. (Eds.), Product Standardization and Competitive Strategy. North-Holland, New York, NY.
Triplett, J.E., 1989. Price and technological change in a capital good: a survey of research on computers. In: Technology and Capital Formation. The MIT Press, Cambridge, MA.
Utterback, J.M., 1994. Mastering the Dynamics of Innovation: How Companies Can Seize Opportunities in the Face of Technological Change. Harvard Business School Press, Boston, MA.
Wallace, J., Erickson, J., 1993. Hard Drive: Bill Gates and the Making of the Microsoft Empire. HarperBusiness, New York, NY.
Young, G., 1985. Venture Capital in High-tech Companies: The Electronics Industry in Perspective. Quorum Books, Westport, CT.

