
IEEE TRANSACTIONS ON EDUCATION, VOL. 43, NO. 2, MAY 2000 83

An Assessment of On-Line Engineering Design Problem Presentation Strategies

Anthony A. Renshaw, Joshua H. Reibel, Charles A. Zukowski, Katie Penn, Robert O. McClintock, and Morton B. Friedman

Abstract—This paper describes the assessment of three on-line learning modules in engineering design for first-year students developed at Columbia University. The assessment includes results from more than 200 students who used test and control versions of each module during the 1996–1997 academic year. The goal of the assessment was to identify presentation formats and strategies for on-line engineering design problems that improved student performance on the design problem or on a short paper and pencil follow-up quiz taken immediately after module use. Students nearly unanimously preferred modules that incorporated animation and interactive design tools over those with static snapshots of the same material. Interactive design tools also improved performance on the design problems. However, performance on the follow-up quizzes did not vary among student cohorts regardless of presentation format. Similarly, although students generally enjoyed and valued group work activities, and although these experiences frequently increased students' confidence in their answers, follow-up quiz performance was not enhanced by group work activities. In an unanticipated result, students were twice as likely to sketch their answers when the module itself contained animated illustrations rather than static graphic material. This result suggests that computer-based learning tools can significantly affect the character and texture of students' representation of their own ideas in manners that do not emerge from traditional performance measures.

Index Terms—Engineering design, freshman engineering design, multimedia assessment, on-line learning, web-based curriculum.

I. INTRODUCTION

INTEREST in engineering design education as a tool for motivating and retaining engineering students of all backgrounds and as a method for fostering skills essential to engineering practice, such as communication, persuasion, and teamwork, has surged over the past two decades [1]–[3]. In response, numerous colleges and universities have created hands-on, project-based courses in engineering design for first-year engineering students (for example, [4]–[6]). Such courses offer opportunities for creative design, active learning, group work, and student/faculty interaction, and they are often received enthusiastically by students and faculty. These courses have been developed primarily at institutions that have committed substantial resources to curriculum reform and that regard curriculum innovation as central to their research and teaching missions. It remains unclear how well these courses will transfer to colleges and universities with fewer resources and less institutional support.

Manuscript received April 8, 1999; revised October 29, 1999. This work was supported in part by the National Science Foundation under Course and Curriculum Development Award DUE-95-55032.

A. A. Renshaw is with the Department of Mechanical Engineering, Columbia University, New York, NY 10027 USA.

J. H. Reibel, K. Penn, and R. O. McClintock are with the Teachers College, Columbia University, New York, NY 10027 USA.

C. A. Zukowski is with the Department of Electrical Engineering, Columbia University, New York, NY 10027 USA.

M. B. Friedman is with the Department of Civil Engineering, Columbia University, New York, NY 10027 USA.

Publisher Item Identifier S 0018-9359(00)04312-0.

On-line technologies such as the World Wide Web (WWW) have the potential to support creative design and cooperative learning opportunities that readily transfer to other courses and institutions. The ready image handling, instant access, and platform independence inherent in the WWW provide a rich environment for exploring and learning about communication and teamwork, and feature many characteristics that are increasingly relevant to engineering practice.

Unfortunately, the time and effort required to produce on-line learning tools can be substantial. This difficulty is particularly acute in engineering design because content and assessment techniques usually cannot be taken directly from existing textbooks and course materials. For example, Hsi et al. [7] spent four years developing hands-on activities, computer courseware, and assessment techniques for teaching and improving the spatial reasoning and visualization skills of first-year engineering students. Wallace and Mutooni [8] spent more than 500 man-hours developing WWW materials to replace a single 1.5-hour lecture in a course on product design.

As part of an institutional effort to introduce substantive design experiences into the early stages of undergraduate engineering education, Columbia's School of Engineering and Applied Science and Teachers College have developed and evaluated several Web-based modules in engineering design. During the 1996–1997 academic year, three prototype modules in electrical engineering, physics/math, and mechanical engineering were used by more than 200 students, and evaluation data were collected to measure student learning and to assess the strengths and weaknesses of each module.

While this evaluation effort addressed many questions, a primary goal was assessing on-line presentation formats and strategies to identify those that improve or otherwise alter student performance and work. By identifying these features systematically, we seek to render future development and modification of on-line tools more efficient and effective. This paper describes the assessment methodologies and findings for three Web-based modules prototyped at Columbia University and indicates changes in student work produced by the different presentation strategies.

0018–9359/00$10.00 © 2000 IEEE


TABLE I: NUMBER OF STUDENTS PARTICIPATING IN THE STUDY

II. IMPLEMENTATION STRATEGY

A course entitled "Computers in Engineering" was used as a testbed for the Web-based design modules. All first-year engineering students at Columbia are required to take this course during either the Fall or Spring semester. The course aim is to acquaint students with different engineering disciplines and orient them in the use of digital technologies that may be used throughout their undergraduate and professional careers. Three weeks during the Fall 1996 and four weeks during the Spring 1997 semester were dedicated to the prototype modules. Nine separate sections were conducted—four in the fall, five in the spring, each with approximately 40 students. Table I shows the number of students participating in each semester utilizing each module.

During each two-hour class, various types of assessment information were collected. During the first 1.5 hours of the class, students worked with a specific module and recorded their solutions to the open-ended design problems on either a paper or an on-line worksheet. This worksheet also contained questions on the clarity of the module, its best and worst features, the relevance of each section of the module, and the students' opinions about the module as a whole. Once students completed the design problem, or, at the latest, during the final half-hour of each class, they took a short, follow-up, paper and pencil quiz based on the module material. Students took the quiz only after logging out of the module. Thus, for each class, information was collected on the students' abilities to solve the design problem, their immediate retention of the module material, and their impressions of the module and its features.

The immediate objective of the study was to increase understanding of how different on-line presentation formats and student interaction scenarios influence student performance, attitudes, and perceptions. Several versions of each module were prepared and used by different sections of the course. The principal differences among the versions were as follows.

• Static Versus Animated Modules. Some versions of each module incorporated interactive Java applets that provided real-time, animated illustrations and simulations of the design problem. Other versions replaced each applet with a sequence of static snapshot images of the applet at different stages of its animation.

• Paper Versus On-Line Response. In some versions of each module, students were asked to document answers to the design problems on a separate paper worksheet that was handed out at the beginning of class. In other versions, students were asked to fill out an on-line worksheet distributed and submitted electronically. The questions on the paper and on-line versions were the same; however, electronic submissions were text only. For those modules where sketches were relevant to the design problem, additional paper worksheets were handed out to be filled out in conjunction with the on-line worksheet.

• Individual Versus Group Work. In some versions of each module, students were asked to work in groups of three or four students to develop answers to the design problem. The group submitted one answer to the design problem, but each student took the quiz independently. Groups were assigned at random. In other versions of each module, students completed the module independently.

III. THE MODULES

Three modules representative of electrical engineering, physics/mathematics, and mechanical engineering were tested. Each module consists of background tutorials that lead up to an open-ended design problem. The tutorials include text, illustrations, equations, and dynamic, interactive animations or static images of these animations. This information describes concepts and principles related to the design problems as well as some historical background to contextualize the problem and provide orientation with respect to key features. The worksheets that students filled out as they worked through the modules included items to assess understanding, space in which to document solutions to the design problems, and items to solicit opinions about the modules' strengths and weaknesses.

Diverse problems were selected in order to test different presentation formats on a broad range of design problems. In some cases, student answers are either right or wrong; in others, there are degrees of correctness. Some problems ask for specific kinds of answers—mathematical, electrical—while others allow students to determine their own answer format.

A. The Logic Gate Module (Electrical Engineering)

The design problem of the logic gate module requires students to construct a circuit of AND, OR, and NOT gates that will determine the majority of three input signals. If two or three inputs are true, the circuit outputs true; it outputs false in all other cases.

After introducing students to logical decisions used in everyday life, such as a three-way light switch (the XOR function), the module introduces a real-time logic gate simulator. The simulator permits students to wire circuits and test the outcome of any combination of inputs. Fig. 1 shows an XOR function in the simulator. In the on-line version, true values are represented by green elements and connections, false by red. By clicking on the inputs, students can change them from true to false, see the updated circuit, and thereby trace the flow of truth values through the circuit as inputs change. Students in the static version viewed pictures of the simulator with various input scenarios.
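The majority-of-three circuit the design problem asks for, and the XOR construction behind the three-way light switch example, can both be sketched directly in code. The gate-level decomposition below is one common solution, not necessarily the one the module's simulator presents:

```python
# Majority-of-three built only from AND, OR, and NOT primitives,
# mirroring the gates available in the module's simulator.
# This decomposition is one common answer, not the module's official one.

def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

def majority(a, b, c):
    # True when at least two inputs are true:
    # (a AND b) OR (a AND c) OR (b AND c)
    return OR(OR(AND(a, b), AND(a, c)), AND(b, c))

def xor(a, b):
    # The three-way light switch example: XOR from AND/OR/NOT.
    return AND(OR(a, b), NOT(AND(a, b)))

# Exhaustive truth-table check of the majority circuit.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            assert majority(a, b, c) == (sum((a, b, c)) >= 2)
```

The exhaustive loop plays the role the simulator plays for the students: it tests the outcome of every combination of inputs.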

Fig. 1. The XOR function depicted in the simulator.

More than either of the other two modules, this module is an on-line workbook in that every task the student is asked to perform is explicitly described and illustrated in the background material. By working through this material, students should be well prepared to address the design problem. In this respect, the module clearly instructs about logic and logic gates but is not so directly instructive vis-à-vis circuit design. In particular, the module does not directly address modular design of complex circuits wherein a complex problem is broken down into simpler subproblems. Addressing this element of circuit design is a future goal.

B. The Bungee Omelet Module (Physics/Mathematics)

In the Bungee Omelet module, students perform an on-line experiment to determine the maximum, noncontact length for an elastic cord attached to an egg that is dropped toward the ground. For a fixed cord length, the extension of the cord varies with the mass of the egg (randomly selected by the computer) and with computer-generated experimental "noise." For the design problem, students are asked to extrapolate from their data to determine the proper cord length for a drop from a height beyond the experimental data range in order to maximize the egg's proximity to the ground without its touching. The students are also asked to estimate a margin of safety for their predictions. In the dynamic version of this module, an animated cartoon illustrates the experimental egg drop, and the data are immediately tabulated and graphed as soon as each drop is completed. Fig. 2 shows the module in the middle of an experiment with five data points collected, plotted, and tabulated. In other versions of the module, the illustration and graphing features are eliminated so that the students merely generated, or, in some cases, were provided, a set of experimental data.

In this module, the background material is only marginally germane to the design problem since it describes the mathematics and physics underlying the experimental simulation. Formally, the data extrapolation is several semesters beyond the preparation of the students. Nevertheless, a number of the students requested access to regression tools, such as those found in MS Excel, indicating that some were familiar with problems of this kind. When told that such tools were not available for this problem, these students frequently expressed frustration or dismay at being asked to solve the problem without these tools.
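The extrapolation the students performed by eye amounts to an ordinary least-squares line fit, the computation the requested MS Excel regression tools would have automated. The sketch below uses hypothetical (drop height, cord length) data and a made-up safety margin; neither comes from the module:

```python
# Least-squares line fit of the kind students asked for, applied to
# hypothetical (drop_height, cord_length) data. The data and the
# safety-margin figure are illustrative, not the module's.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - m * mx
    return m, b

# Hypothetical experimental data: best cord length at several drop heights.
heights = [2.0, 3.0, 4.0, 5.0]
cords   = [1.1, 1.6, 2.1, 2.6]

m, b = fit_line(heights, cords)

# Extrapolate beyond the experimental range, then back off by a
# safety margin so the egg does not touch the ground.
target_height = 8.0
margin = 0.1
recommended = m * target_height + b - margin
```

Estimating `margin` is the part the module leaves to the students; with noisy data it would be driven by the scatter of the points about the fitted line.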

C. The Steam Engine Module (Mechanical Engineering)

In the Steam Engine module, students are asked to develop improvements to a Newcomen steam engine based on insights and experimental information used by Watt to create a steam engine with a separate condenser.

Like the others, this module includes a series of background tutorials that lead to the design problem. The background material begins by describing the properties of condensing steam and several historical steam-powered inventions. The module then provides a detailed description of the Newcomen steam engine. Fig. 3 shows a snapshot of one of the animations describing the Newcomen engine. The background tutorials then offer several suggestive descriptions of the heat capacity of steam and of how this thermal energy can be lost. The design problem is to redesign the Newcomen engine to significantly improve its efficiency. As a final and crucial hint, a quotation from Watt is provided in which he recounts the train of his thinking leading up to the "eureka" moment at which he envisioned his design:

It was in the Green of Glasgow. I had gone to take a walk on a fine Sabbath afternoon. I had entered the green by a gate at the foot of Charlotte Street—had passed the old washing-house. I was thinking upon the engine at the time and had gone as far as the Herd's house when the idea came into my mind, that as steam was an elastic body it would rush into a vacuum, and if a communication was made between cylinder and an exhausted vessel, it would rush into it, and might be there condensed without cooling the cylinder. . . . I had not walked further than the Golf-house when the whole thing was arranged in my mind.

Because the quotation is somewhat oblique and in eighteenth-century English, students must interpret his comments based on the background material and decide for themselves how to incorporate this information into their redesign effort.

The historical dimension of this module presented a problem for our research study. Because the students were working on the Web, many simply searched the Web for information on Watt's steam engine rather than try to solve the problem with the limited information provided. The students were asked not to use such external resources, and most students complied, but it was impossible to monitor students at all times.

IV. KEY FINDINGS

A. Interactive Design Tools Versus Animated Illustrations

Fig. 2. The Bungee Omelet experiment with five data points shown.

Fig. 3. A snapshot of an animation describing a Newcomen engine.

Table II shows student results on the design problem and the quiz for all three modules. In order to identify the correlations between the presentation format and student performance, the results have been broken down into distinct categories for each module. For the Logic Gate and Steam Engine modules, the animated and static versions of each module are distinguished. For the Bungee Omelet module, those versions that automatically graphed experimental data are distinguished from those that did not (Graph/No Graph). Included in the "No Graph" category are those versions of the module with dynamic cartoon illustrations of the experiment, those that allow students to generate their own data interactively, and those that simply supply students with a fixed set of data. Hence, the Graph/No Graph designation does not distinguish static and dynamic presentation formats, only the presence and absence of a computer-generated graph.

In addition, the Logic/Dynamic and Bungee/Graph results have been further labeled "Design Tool," while the Steam Engine/Dynamic results have been labeled "Animated Illustration." "Design Tool" indicates here that the module assists students in evaluating potential solutions to design problems by either explicitly simulating a candidate solution's performance (Logic) or providing strong clues for solving the design problem (Bungee). An "Animated Illustration," on the other hand, illustrates material without providing any interactive feedback or solution evaluation criteria.

More quizzes than design problems are registered for the Logic Gate and Steam Engine modules because some classes worked in groups on the design problems of these modules. Other differences arise from some students completing only one part of the assignment.

The average score on the design problem for students using interactive design tools was higher than that of students using the static versions, with statistical significance of 8.5% (Logic) and 1% (Bungee). Student performance with animated and static illustrations was essentially the same (Steam Engine). For all versions of the modules, quiz performance was the same regardless of the presentation format. Hence, although presentation formats that assist the student in evaluating potential design solutions enable students to produce statistically better work, use of such tools does not appear to be correlated with better test performance.
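The paper does not name the test behind the 8.5% and 1% significance levels. A comparison of two cohorts' mean design-problem scores would commonly use something like Welch's two-sample t-test, sketched here in plain Python with purely illustrative scores (not the study's data):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and its degrees of freedom.

    Unlike the pooled-variance t-test, Welch's form does not assume
    the two cohorts have equal variances or equal sizes.
    """
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb                         # squared std. error
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom.
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Illustrative cohort scores on the design problem.
tool_scores   = [74, 80, 68, 85, 79]
static_scores = [62, 71, 58, 77, 69]
t, df = welch_t(tool_scores, static_scores)
# Compare t against the t-distribution with df degrees of freedom
# to obtain a significance level.
```

A t value well above roughly 2 at these degrees of freedom would correspond to significance levels in the range the paper reports.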

TABLE II: STUDENT PERFORMANCE RESULTS

TABLE III: MODULE FEATURES DEEMED MOST HELPFUL BY STUDENTS

These results occurred in spite of strong student preference for the animated or graphing versions in all cases. For example, students using the Steam Engine/Static module complained bitterly about excessive textual explanations and about difficulty interpreting several of the diagrams. Twenty percent of the Static cohort reported that the Newcomen engine illustration was unclear to them and named improvement of this element (without prompting) as "what would have made [the module] better"; none of the Dynamic cohort reported any confusion in this area. During the Static cohort sessions, we observed many students studying the Newcomen engine illustration at length, clearly struggling to make sense of it. The Dynamic cohort did not labor over the animated Newcomen engine illustration or complain as much about the module format. Yet despite these clear disparities, the two student cohorts performed equally well on the design problem and the quiz.

B. Helpful Features of the Module

One of the questions on all worksheets asked students to identify those features of the module that were most helpful for solving the design problem. Table III presents a summary of the student responses broken down into the same categories used in Section IV-A.

In all but one case—Bungee/No Graph—the simulators and diagrams were the most frequently cited features. Of students using the Steam Engine module, over 80% cited these features regardless of whether or not they were animated, whereas only 13–18% cited the background material and text as the most helpful. Of students using the Logic Gate module, approximately 50% cited the simulator and diagrams (again, independent of presentation format), whereas just 22–23% cited the background material and text. For the Bungee/Graph module, students found the animated applet/graphing feature more helpful than the text by a 77% to 22% margin. However, for the Bungee/No Graph module, 49% cited the background material as helpful, whereas only 36% cited the data and illustrations.

The relative importance assigned to the background material by the Bungee/No Graph cohort is particularly striking because the background material in this module is not necessary to solve this design problem. The problem can be solved by extrapolating from experimental data without any knowledge of the physics and mathematics underlying that data, which are described in the background material. Nevertheless, students who did not have the computer-generated graph reported that the background description was more helpful than the data itself. Along similar lines, 84% of the Bungee/No Graph cohort reported that they could not have solved the problem without the background material, while just 41% of the Bungee/Graph cohort reported such dependence. Automated graphing of the data dramatically altered students' conceptions of the problem at hand, focusing their attention on the essential underlying trend and allowing them to distinguish important from irrelevant information. While such confusion also suggests that the clarity and structure of the module need improvement, it is nevertheless striking that the presence or absence of a computer-generated graph can alter students' perception of the module so dramatically.

TABLE IV: STUDENTS' USE OF DIAGRAMS AND SKETCHES

C. Students’ Own Use of Diagrams and Sketches

Despite the students' general preference for diagrams over text in the modules, student responses to the design problems and quizzes made little use of diagrams. Table IV presents a breakdown of the percentages of responses that contained diagrammed material.

On the Logic Gate worksheet, one question presented students with a circuit diagram and asked them to describe the circuit's function. Only 26% of the Static responses and 23% of the Dynamic responses used any kind of diagram or mathematical symbols (such as a truth table) to describe the circuit's performance. The other students described the different cases entirely with text. For the Bungee Omelet module, only 15% of the No Graph responses and 19% of the Graph responses sketched a graph, no matter how crude, on their quizzes when asked to extrapolate from two numbers. While both of these problems can be fully and correctly answered without using diagrams of any kind, the extent to which students resorted to text-only responses in these modules was noteworthy.

In contrast to those two modules, there is an advantage to sketching responses in the Steam Engine design problem. Students often consider combinations of pistons, steam sources, and other elements that are not explicitly proposed in the module, and their ability to describe these conceptions and their functions is essential. In responding to this design problem, a picture may indeed be worth about a thousand words, and answers that included a sketch were generally clearer and more succinct than those that did not. Text-only responses often went on for 50 lines or more, whereas responses that included sketches averaged only ten lines of text. Fig. 4 shows four typical student sketches from the Steam Engine design problem.

For this module, presentation strategy strongly influenced the character of students' responses. The results in Table IV show the percentage of students using a sketch broken down into those who solved the design problem on a paper worksheet and those who submitted on-line solutions. Although the on-line worksheets only accepted text responses, these students were provided blank paper worksheets at the beginning of class and told they could submit additional information about their answer on the paper worksheet if they felt they could not express their answer correctly with the text-only, on-line format. It should be noted that many students using on-line worksheets used the blank paper worksheet as scratch, sketching potential answers while they worked on the design problem. Many students did not want to submit these scratch efforts as part of their formal answer, clearly attaching little value to that work. As much as possible, these scratch efforts were inconspicuously collected and included in the statistics presented. Both the paper and on-line groups are divided into Static and Dynamic cohorts. Of students using the paper and pencil worksheet, 50% of the Dynamic cohort included sketches, whereas only 25% of the Static cohort did. Of students using on-line submission, 25% of the Dynamic cohort used sketches, whereas only 7% of the Static cohort did.

This result is not indicative of the students' ability to sketch; on the subsequent quiz, students were asked to sketch a steam-powered invention, and the sketching on virtually all answers was adequate.

This result may be closely related to those of Section IV-B, where the presence of a computer-generated graph greatly altered students' assessment of the significance of the background material of the module. In the Steam Engine module, the motion of the sketches appears to have been an important cue to the students about whether or not to employ sketching in their own responses. As indicated in Section IV-A, the Steam Engine animated illustrations did not improve performance on the design problem or quiz; they merely increased the likelihood that students would use sketches in their answers.

D. Group Work

Efforts to investigate the effect of grouping students in teams were frustrated somewhat by the practical circumstances of the course. In the original experimental design, some class sections were scheduled to complete the modules individually, some were to work individually for the first part of the class and then form groups to determine a group response in addition to their individual responses, and other sections were to form groups right from the start of the class and submit only group responses. Although this schedule was followed, in practice it was impossible to prevent students from spontaneously talking with their friends and neighbors, explaining the modules, and comparing answers. The computer lab in which all sections worked has 40 SGI workstations tightly clustered together in four rows, so students were in close proximity to each other. Furthermore, the self-paced structure of the modules, and the frequent arrival of students not in the class looking for free workstations, encouraged a casual, workshop-like atmosphere.

To the extent that we were able to control the group versus individual activity, we found students valued the group activities. For the Logic cohort that worked in teams, despite the fact


Fig. 4. Typical student sketches from the Steam Engine problem.

that 86% of the students solved the design problem individually before working with their peers, 83% reported that they preferred having the group experience to just working individually. Ninety-four percent of the Steam Engine module cohort answered "yes" to the worksheet question, "Are you glad you had a chance to work with other students on this problem (as opposed to just working on it individually)?" and 60% agreed that the tutorials were easier to understand after working with others on them. Nearly all (89%) of the students who solved (or believed they solved) the steam engine design problem alone before joining with the group still reported that the group work increased their understanding. And while only approximately 35% of the group cohort believed they solved the problem successfully alone, 80% reported understanding the problem after the group work.

E. On-Line Versus Paper Response

Apart from the propensity of students to draw more when the entire worksheet was on paper than when the paper was an optional addition to an on-line response, no significant differences in quality or character were found between the responses received on-line and those submitted on paper.

V. DISCUSSION

In retrospect, few, if any, of the results presented here are surprising, and it is relatively easy to develop speculative explanations for each result. However, in contrast to most studies of on-line learning tools, this study has analyzed presentation strategies and techniques across a range of different problems. This study makes clear that on-line learning strategies are in fact problem dependent, and that students' understanding of a problem can be significantly affected by relatively subtle cues in presentation format. The presence or absence of a graph or the use of animation can, in some circumstances, substantially alter the character of the students' interaction with a module. While the results presented here offer hints about how students' behavior may change, predicting such changes is still a major stumbling block in the systematic development of on-line learning tools.

The major limitation in this study is the use of the follow-up quiz as a measure of student learning. The authors recognize that the material absorbed by the students in a single 1.5-hour class cannot be very substantial and that a follow-up quiz on this material is by necessity simple and not very probing. The quiz results presented here may be poorly correlated with long-term retention or new understanding by the students. Nevertheless, despite these limitations, the quiz results do offer some insight into whether or not the students understood the material presented in the modules and whether or not this understanding can be applied by the students.

In this context, the inability of the design tools to improve quiz results may indicate only that the problems posed on the quiz were too simple. Regardless of whether or not design tools lead to improved student learning, such tools may be crucial for motivating students, particularly over long periods of time. In addition, providing students with experience using tools similar to those used in professional practice, which increasingly incorporate interfaces similar to those found in the animated illustrations and design tools, may have inherent value that this study was not designed to address.

The on-line environment appears to be rich for fostering three specific design-related skills: 1) the ability to distinguish important from irrelevant information; 2) the ability to communicate nontextually; and 3) the opportunity to selectively and profitably interact with fellow students in order to better understand and solve a design problem. Each of these skills is essential to engineering practice, and yet teaching them effectively is difficult. While the ability to distinguish relevant from marginally relevant information is an important problem-solving skill, efforts to develop this skill necessarily "squander" scarce learning time and energy on putatively irrelevant work. Similarly, while it may be valuable to know that specific cues within a presentation format can trigger students to represent their ideas visually, it is also important for students to recognize for themselves when a visual representation may be effective. A key issue for further study is whether a student who has been coaxed into applying these skills by specific on-line cues is able to apply them in novel situations.

It is clear from the Bungee problem that students are not adept at recognizing and applying mathematical skills on problems outside of math class. This conclusion is not particularly surprising, as many educators and researchers have long registered similar conclusions and complaints for classroom and textbook learning. The question is: in what ways are these problems different in the domain of on-line learning?

Perhaps the biggest obstacles to on-line learning are technological limitations. For example, the form and styles of student drawings and sketches shown in Fig. 4 far exceeded the capabilities of most interactive, on-line drawing software known to the authors. It is unlikely that any on-line responses would have the same qualities as these paper sketches. Furthermore, the wide range of possible responses points to a fundamental problem with putting design problems on-line: how to handle unexpected answers. For problems in which the range of answers is well defined, such as the Logic and Bungee problems, capturing student responses on-line is straightforward. But for more open-ended design problems such as the Steam Engine, the range of possible answers is ill defined and, except for copious textual answers in e-mail and on-line bulletin board formats, probably cannot be prefigured into the module. In hands-on, project-based courses, the most successful elements of the course are often those that are least predictable, as when students generate novel ideas that, while flawed or problematic, raise important and interesting issues. If on-line learning tools are to be more than electronic workbooks and reference resources, they need to incorporate this important element of project-based learning.
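The contrast between well-defined and ill-defined answer ranges might be sketched as follows. This is a hypothetical illustration, not code from the modules; the answer set and record fields are invented for the example:

```python
# Hypothetical sketch of on-line response capture. A problem with a
# well-defined answer range can be validated and scored automatically;
# an open-ended design problem can only be stored verbatim for review.
# The answer set and record fields are invented for this illustration.

VALID_ANSWERS = {"AND", "OR", "NAND", "NOR", "XOR"}  # well-defined range

def capture_constrained(answer: str) -> dict:
    """Logic-style problem: every acceptable answer is known in advance,
    so the module can validate and score the response immediately."""
    answer = answer.strip().upper()
    if answer not in VALID_ANSWERS:
        raise ValueError(f"unexpected answer: {answer!r}")
    return {"kind": "constrained", "answer": answer, "needs_review": False}

def capture_open_ended(text: str) -> dict:
    """Steam-Engine-style problem: the answer range is ill defined, so the
    form can only store the text verbatim and flag it for human review;
    sketches cannot be captured at all in a text-only interface."""
    return {"kind": "open", "answer": text.strip(), "needs_review": True}
```

The asymmetry in `needs_review` is the point: open-ended answers resist automatic handling, which is why the unexpected (and often most interesting) responses cannot be prefigured into a module.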

VI. CONCLUSIONS

Although it is easy to list skills and behaviors that undergraduate engineering design courses are meant to engender—teamwork, communication skills, creativity—it is more difficult to identify specific pedagogical strategies that actually engender these skills. This problem is compounded by the fact that design problems and projects frequently generate student excitement and enthusiasm that is uncorrelated with student learning. This pilot study has compared student performance and attitudes on three short design problems using a variety of on-line pedagogical formats in order to identify useful correlations between these formats and student behavior and learning. Among the key findings are the following.

1) On-line resources with interactive design tools that enable students to construct and evaluate potential solutions appear to improve student performance only on the design problems themselves and do not appear to significantly affect student performance on a follow-up quiz when compared to static presentations of the same material. However, animated illustrations and student/computer interaction are often preferred by students even when they do not improve student performance. Such preferences may be important motivational tools.

2) For some problems, presentation cues can significantly alter the character of student response and the students' understanding of the problem. For example, in the Steam Engine problem, students using animated modules were at least twice as likely to sketch their answers as those using static versions. In the Bungee module, students presented with a computer-generated graph drew markedly different conclusions about the value of the background material of the module.

3) On-line design tools offer a rich environment for student interaction and teamwork. Students in this study spontaneously sought help from their friends and neighbors to explain and discuss the tutorials. Most students who were forced to work in groups found the experience to be a positive one, even if the students had already solved the design problem on their own.

4) Except for changing the character of student response, on-line responses were essentially the same as those done on paper. But this result is clearly influenced by the lack of available software that can capture the full range of unexpected answers that students generate from certain kinds of problems. This situation may well change in the near future.

ACKNOWLEDGMENT

The authors would like to thank Z. Tan, C. Wong, and the staff members of the Botwinick/Gateway Laboratory for their invaluable assistance running the class sections. The authors also thank D. Zukowski of IBM's T. J. Watson Research Center for her help with the modules.


Anthony A. Renshaw received the Ph.D. degree in mechanical engineering from the University of California, Berkeley, in 1992.

He then worked at the General Electric Corporate Research and Development Division. His areas of expertise are mechanical dynamics and design. He has been at Columbia University, New York, since 1994 and is an Assistant Professor in the Department of Mechanical Engineering.

Joshua H. Reibel received the A.B. degree in philosophy from Harvard University, Cambridge, MA, and the M.A. and Ed.M. degrees in communications and education from Columbia University, New York.

He is currently Vice President for Business Development of eSCORE.com, an Internet e-commerce and educational services subsidiary of Kaplan Educational Centers and its parent company, the Washington Post Company. Previously, he was Associate Director of the Institute for Learning Technologies at Teachers College, Columbia University, a research and development group dedicated to exploring educational applications of new media technologies in K–12 and higher education. Prior to his study and work at Columbia, he was a Project Manager at the New Lab for Teaching and Learning, an educational multimedia development facility at the Dalton School, where he also taught high school literature and philosophy.

Charles A. Zukowski (S'84–M'85–SM'95) received the Ph.D. degree in electrical engineering from the Massachusetts Institute of Technology, Cambridge, in 1985.

Since then he has been on the faculty of Columbia University, New York. In 1998–1999, he was Acting Chairman of the Electrical Engineering Department. His primary research interests include VLSI circuit design, CAD, and communications circuits. He is the author of The Bounding Approach to VLSI Circuit Simulation. He has received a U.S. patent on a time-division multiplexer circuit.

Prof. Zukowski received an NSF Presidential Young Investigator Award in 1987. He has also served on a number of program committees for IEEE conferences.

Katie Penn is a Research Associate at the Institute for Learning Technologies at Teachers College, Columbia University, New York.

Robert O. McClintock received the Ph.D. degree from Teachers College, Columbia University, New York.

He is a Professor of history and education, Department of Scientific Foundations, at Teachers College.

Morton B. Friedman received the D.Sc. degree from New York University, New York.

He is Vice Dean of Education for the Fu Foundation School of Engineering.
