  • Where Is Computing Headed?
    Ted G. Lewis, Naval Postgraduate School

    To know where we are going, we need a road map. As we venture farther into the computing terrain, technology assessment can help us draw the specialized road map we need in this highly dynamic and volatile landscape.

    Hardware and software trends interact with economic forces to favor one approach over another, and all this activity plays out against a socio-political backdrop of both commercial and military enterprises. For example, technology is influenced by major battles over who will control the airwaves: communication companies such as Time-Warner or computer companies such as Microsoft? Who will achieve operating system supremacy: Novell (Unix), Microsoft, Taligent, or IBM? Who will dominate the network software environment: cable TV companies like Tele-Communications Inc., computer companies like Novell, or telephone companies like AT&T? And, closer to home, who will control our television sets? Engaging in these sometimes corporate-death-defying feats promises decades of prosperity for victors and ruin for losers.

    Computer science in particular feels the impact of shifts in technology and the fortunes of industrial warfare. For example, IBM's tardiness in recognizing the shift from mainframes to networks has decimated the company. Where the entire computer industry once followed IBM's lead, it now must chart its own course. Other less dramatic examples are easy to find. Many small companies have suffered setbacks by recognizing too late the importance of the shift from character-oriented user interfaces to graphical user interfaces. Only a few years ago, graphical user interfaces were never discussed in a computer science course; now they constitute an entire field of study within computing.

    Technological change is putting entire industries on the betting line. For computer technologists, these shifts can mean opportunity or disappointment as one technology is replaced by another. Therefore, it is important that we consider economic and technical forces when we plan for the future.

    How to predict the future

    Sometimes significant shifts are predictable, but many times they are not. For example, hardware packaging and performance are among the most accurately predictable variables in the industry. Hardware vendors will continue to leapfrog one another throughout the foreseeable future.


  • With predictable technology, we can extrapolate along trend lines with great accuracy, but where technological surprises lurk, we can only ask "what if" questions. What if power supply technology doubles energy density while dropping prices by 80 percent? This alone could speed the development of mobile computing and perhaps radically alter portable, hand-held, and subnotebook computers. The hand-held Newton's hardware technology results largely from advances in power supplies. The use of mobile command-and-control networks in future cyberwars could depend as much on power supplies as on processor speed.

    Using trend lines where we are confident of relative stability, and reverting to the more speculative "what if" approach where there is doubt, let us begin. The trends clearly support certain shifts.

    Faster, cheaper, smaller

    Computational power densities will continue to increase geometrically. Intel, for example, has consistently delivered processors with a fourfold increase in capabilities every three years throughout the past decade. In 1992, a typical workstation ran at about 50 SPECmarks (a benchmark established by the SPEC consortium of vendors that more or less measures Unix performance under simulated conditions). Projecting along this line, we can expect the 200-SPECmark PowerPC 620 processor to be available in 1995 and processor performance in general to surpass the astounding figure of 800 SPECmarks near the year 2000 (see Figure 1) - a 10-fold increase in less than a decade. (Intel and Hewlett-Packard recently announced a joint project to develop a 2,000-SPECmark PA-RISC chip by the end of the decade.)
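    The arithmetic behind this projection is simple compound growth; the short sketch below is mine rather than the article's, and it only applies the 50-SPECmark 1992 baseline and the fourfold-every-three-years rate quoted above.

```python
# Rough sketch of trend-line extrapolation using the figures quoted in the text:
# ~50 SPECmarks in 1992 growing fourfold every three years. Values are illustrative.

def projected_specmarks(base_year: int, base_perf: float, target_year: int,
                        growth_factor: float = 4.0, period_years: float = 3.0) -> float:
    """Compound growth: base_perf * growth_factor ** (elapsed / period)."""
    elapsed = target_year - base_year
    return base_perf * growth_factor ** (elapsed / period_years)

if __name__ == "__main__":
    for year in (1995, 1998, 2000):
        print(year, round(projected_specmarks(1992, 50, year)))
    # 1995 -> ~200, 1998 -> ~800, 2000 -> ~2000, roughly matching the projections above
```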

    Database performance of 100 to 200 transactions per second was considered fast in the late 1980s. Current midrange systems have exceeded the 1,000-tps mark, and more impressively, they have done so while reducing the cost per tps. This is a consequence of faster processors and the use of multiprocessing.


  • A 10,000-tps database machine is on the drawing boards now.

    These raw performance trends have major implications for all of computing. They mean that we can pack more processors in a single product, run audio and video on the desktop, recognize speech, filter image data, and reduce the cost of processing transactions in banks, stock markets, insurance companies, universities, and most other businesses. E-mail systems will change dramatically, and telephone systems will be service-programmed by consumers able to order up their own options. TV sets and home appliances will have graphical user interfaces much like today's Macintosh. In the year 2000, soldiers will wear heads-up helmets that will let them see over the horizon with the help of radio networks.

    The expectation of faster, cheaper, and smaller processing hardware is basic to the technology of computing. It's like the laws of physics to a physicist or the periodic table to a chemist. But unlike the physical laws of science, the principles underlying computer technology change constantly. This single fact of constant change, while disconcerting to many, is the most important challenge facing computer science.

    Multiple processors

    One of the direct consequences of faster, cheaper, and smaller processing devices is the long-promised arrival of multiprocessing and parallel processing. (Multiprocessing describes any system that uses multiple processors; typically, each processor runs a separate job and only infrequently communicates with its peers. Parallel processing is a subset of multiprocessing whereby a single problem is solved using many processors.)
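    To make the distinction concrete, here is a small and admittedly modern sketch of my own (Python's multiprocessing module is used only for illustration, not because the article mentions it) in which a single problem is divided among several worker processes, which is parallel processing in the sense just defined.

```python
# Illustrative sketch: one problem (a single summation) split into chunks that
# separate worker processes compute cooperatively.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    chunks = [(k * step, (k + 1) * step) for k in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as a single-processor sum, computed in parallel
```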

    Multiprocessing, in its simplest form, is used to increase the responsiveness of online transaction-processing systems such as bank ATMs, stock quotation systems, or telephone switching systems. Multiprocessing workstation systems have also been used to improve the speed of engineering design and desktop computer-aided design. But these are only modest applications of multiprocessor computers, yielding at most a five- to 10-fold increase in performance.

    Figure 1. Over the next few years, we can expect underlying computer technology of commodity-priced components to continue its advance: Program address space will increase from 32 bits to 42 bits, RAM from 8 megabytes to 64 megabytes, processor speed from 80 SPECmarks to 800 SPECmarks, and disk storage from 340 megabytes to 7 gigabytes.

    Parallel processing is currently expressed in two forms: MIMD (multiple instruction, multiple data) and SIMD (single instruction, multiple data). MIMD systems based on the transputer and on PowerPC, Sparc, Intel, and HP PA-RISC processors have been able to achieve high, but not extremely high, processing rates. These systems are called scalable because performance roughly doubles when the number of processors is doubled. Typical speedups range from 10x to 100x for systems of 32 to 512 processors.
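    One way to see why 32 to 512 processors might yield roughly 10- to 100-fold speedups rather than linear gains is the textbook Amdahl's-law model; the article does not invoke it, and the serial fractions below are illustrative assumptions, not measured values.

```python
# Sketch of Amdahl's-law speedup, a standard model (not named in the article):
# speedup(p) = 1 / (serial_fraction + (1 - serial_fraction) / p)
def speedup(processors: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

if __name__ == "__main__":
    # Serial fractions of a few percent (assumed here) bracket the 10x-100x range
    # quoted above for 32- to 512-processor systems.
    for s in (0.07, 0.01):
        print(s, [round(speedup(p, s), 1) for p in (32, 128, 512)])
    # 0.07 -> about [10.1, 12.9, 13.9]; 0.01 -> about [24.4, 56.4, 83.8]
```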

    SIMD systems such as Thinking Machines' Connection Machine and MasPar's MP-2 have achieved extremely high performance on a restricted class of scientific problems. These problems must have a regular structure, as found in matrix calculations, certain graphics processing algorithms, signal processing, and data reduction applications.

    Because of SIMD's speed and MIMD's generality, these two paradigms of high performance are merging. However, they both suffer from a lack of software, which will continue to keep them outside the mainstream of computing. The lack of good parallel processing software languages, tools, and environments makes for fertile areas of research that will continue to attract the research community throughout this decade.

    We can expect multiprocessing to become widely accepted in the practical world of everyday computing. Entering at the high end of the workstation market, multiprocessor systems consisting of four, eight, and 16 processors will be integrated into desktop PCs over the next five to 10 years. This added power will open new avenues to those prepared to rewrite their graphics, voice, and transaction/data processing applications. This too can become fertile ground for, in this case, applications-oriented research.

    Paradigm shift to object orientation

    In the software arena, the action is in object-oriented languages, object-oriented databases, object-oriented software engineering, object-oriented systems management, and object-oriented thinking in general.

    More than a buzzword, this is a paradigm shift of major proportions. To illustrate how serious it is, Taligent, the joint venture funded by Apple, IBM, and Hewlett-Packard, is betting $100 million on object-oriented software environments. Microsoft, not to be left with less than 90 percent of any market, is spending a considerable amount of money and effort on Cairo, the object-oriented next generation of Windows NT. Other players, such as Borland International, Symantec, and ParcPlace, are following suit.

    Object-oriented technology is not new. It began in Scandinavia in 1967 and matured at Xerox PARC in the 1970s, but it has been held back by its need for powerful processors. However, in a world where typical software/hardware budgets allot 80 percent for software development and only 20 percent for hardware purchases, the capability of object technology to reduce software development costs easily justifies its added consumption of processing power. Moreover, faster, cheaper, and smaller devices will continue to drive this shift.

    There are compelling reasons for object technology's current vogue. It solves three major problems facing software developers:

    (1) the need for rapid, incremental, iterative development of new systems,

    (2) the need to capitalize software and thus encourage reuse of proven components, and

    (3) the need to reduce postdelivery maintenance.

    Object technology achieves rapid development through a mechanism called inheritance. New pieces of code inherit their behaviors from existing (reusable) designs and codes. Notice that design, not simply code, is reused. This is key. Once an object is designed, written, debugged, and used for a while, its utility increases because it is deemed more reliable and proven. It is trusted.
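    A minimal sketch of inheritance (my own example, not the article's) shows how a new class reuses a trusted design and overrides only what differs.

```python
# Minimal sketch: a new class inherits the behavior and design of an existing,
# already-debugged class, adding only what must change.
class Account:
    def __init__(self, balance: float = 0.0):
        self.balance = balance

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

class InterestBearingAccount(Account):
    """Inherits the proven deposit logic; adds only the new behavior."""
    def __init__(self, balance: float = 0.0, rate: float = 0.03):
        super().__init__(balance)
        self.rate = rate

    def accrue_interest(self) -> None:
        self.balance += self.balance * self.rate

acct = InterestBearingAccount(100.0)
acct.deposit(50.0)             # reused, trusted behavior
acct.accrue_interest()         # new behavior layered on the old design
print(round(acct.balance, 2))  # 154.5
```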

    The second cornerstone, capitalization, is important as a basis for a reusable component industry. Many attempts have been made to reuse software. They have failed largely because of the misunderstanding surrounding capitalization of reusable libraries, which for the most part are simply collections of code with some form of retrieval. Unlike the company carpool, these libraries have not been treated as a capital asset because they are not as (re)usable as the carpool. Nobody accesses, uses, or copies these components, because they do not incorporate design. Design is the key. (Megabytes of Ada components are freely available to anyone on the Internet, but little reuse occurs because code is not design.)

    How does object technology solve the problem of design encapsulation? The power of object-oriented software lies in the construction of application-specific frameworks. A framework is a design for a generic application; it incorporates not only code but also the interactions between code modules. Advocates of object technology are betting that design and coding of frameworks will revolutionize software.

    By way of illustration, the Taligent strategy is to layer the Mach (an alternative to Unix) microkernel operating system with frameworks. A file framework does 99 percent of an application's file processing, for example, and a graphical-user-interface framework handles 99 percent of multimedia human-computer interaction. Taligent plans to sell tools for rapid application development and predicts that these tools will increase programmer productivity 10-fold to 100-fold! (Programmer productivity has barely doubled over the past 20 years.)
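    The following toy sketch suggests what a framework means in this sense; it is purely illustrative (the class and method names are invented) and is not Taligent's actual design. The framework owns the overall flow and the interactions between modules, and an application overrides only a few hooks.

```python
# Toy framework sketch: the framework fixes the design (flow and module
# interactions); an application customizes it by overriding hook methods.
from abc import ABC, abstractmethod

class FileFramework(ABC):
    """Generic 'open, process each record, finish' design reused by every app."""
    def run(self, path: str) -> None:
        with open(path, encoding="utf-8") as f:
            for line in f:
                self.process_record(line.rstrip("\n"))
        self.finish()

    @abstractmethod
    def process_record(self, record: str) -> None: ...

    def finish(self) -> None:   # optional hook with a sensible default
        pass

class WordCounter(FileFramework):
    def __init__(self):
        self.words = 0
    def process_record(self, record: str) -> None:
        self.words += len(record.split())
    def finish(self) -> None:
        print("words:", self.words)

# WordCounter().run("example.txt")  # the framework supplies everything else
```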

    Maintenance is reduced because object-oriented programs are incrementally implemented; once released into the field, they continue to be incrementally improved (maintained) with only localized effects on the total program. This is due to the encapsulation features of object-oriented programming. Interestingly, object-oriented programs rarely change their overall structure, but rather their objects continue to evolve. Some have called this "evolutionary" or "organic" programming.

    For most of the world, object technology lessens the need to design and code. Systems such as Prograph CPX have shown that object-oriented programming leads to high yields in programmer productivity. This is the nature of paradigm shifts, and this paradigm shift is affecting all facets of computer software. When applied to database technology, the result is object-oriented databases. When applied to the enterprise, the result is object-oriented thinking. Pervasive object technology means that all fields of computer science will change. Moreover, the door for research opens farther because the effects of this change are not clearly understood.

    The network is the computer

    One of the most difficult lessons that university computer centers and companies like Digital Equipment, IBM, Apple, and Microsoft have learned is that for large organizations, the network has literally replaced the mainframe. This means that computing activity has shifted from management information systems (MIS) departments to all of the other departments; that is, computing has become departmentalized.

    Even though communications technology has rapidly advanced since the invention of the telephone, computer technology has advanced even faster. Thus, the price-performance ratio for communicating has declined much more slowly than the price-performance ratio for computing (wireless or not). This is one of the driving factors underlying decentralization. Had the price-performance ratios been inverted, we might still be using large-scale, centralized mainframes with millions of data-entry terminals connected by telephone lines.

    The movement toward decentralization is not only continuing but also accelerating. By the end of the decade, computing will become personalized rather than departmentalized. High-powered computers will fit in a person's hand, and all this power will make them extremely easy and intuitive to use. The MessagePad hand-held device introduced by Apple Computer in August 1993 has a 15-million-instruction-per-second RISC processor, stores 48 pages of text and/or graphics, communicates by infrared/radio waves, and sells for under $1,000. This is more power than a mainframe had in the 1980s - all for a single user! In addition, the MessagePad incorporates handwriting-recognition algorithms and can adapt to individual handwriting - no keyboard is needed. Therefore, it can be used by a great many people.

    Once everyone has their own personalized computer, why will we need networking?


  • Table 1. Increases in available bandwidth accompanied by full LAN/WAN connection and Internet access will affect a variety of digital media.

    Technology (Bits/Sec.): Applications Enabled
    Modem telephony (10 Kbps): Textual e-mail, postage-stamp video, limited ftp, Apple LocalTalk
    Compressed video, ISDN (100 Kbps-1 Mbps): Point-to-point video teleconferencing, wireless telephony
    MBone, MPEG, Ethernet (1 Mbps-10 Mbps): Multipoint video, distributed multimedia, electronic publishing, fast Ethernet
    Asynchronous transfer mode, ATM (10 Mbps-1 Gbps): NII superhighway, campus backbone, Sonet, X.25, etc.
    After all, isn't networking designed for connecting a terminal to a big, central computer? Two factors argue for networks: (1) There has been a fundamental change in measures of utility, and (2) network speeds have reached usable levels.

    Price-performance ratios are no longer used to justify the utility of networks. Rather, time to market, competitive advantage, real-time warfare strategy, and quality are the factors that necessitate networking. Competitive forces are so strong that cable TV, telephone, and computer companies are all vying for dominance in new networking ventures. They see networks as the supermarkets of the future, a way to more competitively distribute their products.

    Perhaps even more dramatic is the recent and rapid rise in transmission speeds (see Table 1). Even at 10 megabits per second, digitized newspapers, magazines, books, and other textual information require too much time and expense to move over a network. But recently announced speeds of 100 megabits per second, plus compression technology, enable a whole new class of content to be moved. Such networks can handle movies, TV, game graphics, and text. Thus, an entirely new industry is building up around this expanded capability.
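    A back-of-the-envelope calculation (with illustrative payload sizes of my choosing, not figures from the article) shows why the jump from 10 to 100 megabits per second or more changes what can practically be moved.

```python
# Back-of-the-envelope transfer times; the payload sizes are illustrative guesses.
def transfer_seconds(size_megabytes: float, link_mbps: float) -> float:
    return size_megabytes * 8 / link_mbps   # 8 bits per byte

payloads = {"digitized newspaper (50 MB)": 50, "compressed movie (1,000 MB)": 1000}
for name, mb in payloads.items():
    for mbps in (10, 100, 1000):
        print(f"{name} over {mbps} Mbps: {transfer_seconds(mb, mbps):.0f} s")
# The 1,000 MB payload drops from ~800 s at 10 Mbps to ~8 s at 1 Gbps.
```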

    Gigabit networks are already on the drawing boards and will soon permit computer-computer communication that feels instantaneous to human users. With the necessary software and physical infrastructure, thousands of hand-held computers will be able to work together on problems that today require a supercomputer. Moreover, this capability will give the average person access to the libraries of the world; playback of educational and entertainment content will also be provided.

    Aside from the social, political, and economic issues, a fully connected world of computer networks, cellular telephones, and interactive cable TV systems raises significant research issues. First, what about network security? For local-haul data, radio signals will probably be used. But radio can be intercepted, altered, and rebroadcast without the knowledge of the sender or the receiver. Authentication is needed to protect bank accounts, stock markets, and corporate data.
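    A standard way to detect that kind of tampering is a keyed message authentication code; the sketch below uses HMAC with a shared secret, a technique chosen here purely for illustration rather than one the article prescribes.

```python
# Sketch of message authentication with a shared secret key (HMAC-SHA-256).
# A generic illustration, not a protocol described in the article.
import hmac, hashlib

SECRET_KEY = b"shared-secret"   # placeholder key for the example

def tag(message: bytes) -> bytes:
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    return hmac.compare_digest(tag(message), received_tag)

msg = b"transfer $100 to account 42"
t = tag(msg)
print(verify(msg, t))                       # True: message is authentic
print(verify(b"transfer $9,999 to 42", t))  # False: altered in transit
```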

    Encryption technology has become controversial because of the US government's proposed public key encryption technique. While encrypted communication would be protected against most unauthorized eavesdropping, the government would nevertheless have the means to listen in. Many computer scientists as well as civil libertarians are concerned about this potential invasion of privacy.

    Network interoperability is another issue. Millions of local area networks already exist. What happens when they are tied together?

    Distributed database issues are ubiquitous. Issues of access control have largely been solved in the research literature, but other issues persist. For example, when networks contain millions of users, deadlock avoidance algorithms may have to be modeled after the telephone system rather than operating systems.

    Economic, technical, and competitive factors have all conspired to make the network the computer. Once considered a subversive activity engaged in by hackers and computer nerds, networking is now as mainstream as mountain biking.

    The next 10 years promise to be even more exciting than the previous decade, which seems difficult to believe, given the rapid changes since 1984. Telephones, televisions, fax machines, and the Internet will all be radically different in 2004, affecting factories, schools, and offices. And what about computers? They may become so pervasive and specialized that they will cease to exist as separate, general-purpose devices.

    Ted G. Lewis is professor and chair of computer science at the Naval Postgraduate School, Monterey, California. He has made technical contributions to cryptography, distributed computing, parallel programming, and object-oriented framework designs. Currently editor-in-chief of Computer, he also served in that capacity for IEEE Software.

    Lewis received a BS in mathematics from Oregon State University in 1966 and MS and PhD degrees from Washington State University in 1970 and 1971, respectively.

    The author can be contacted at the Naval Postgraduate School, Code CS, Monterey, CA 93943-5118; e-mail lewis@cs.nps.navy.mil.


