
MTU/ECE/RMK 1 Approved by ECE Faculty – 10/5/2004

Annual Outcomes Assessment Report

Covering Academic Year 2003-2004

Final Release

Abstract

In accordance with the ECE Department Standard Operating Procedure (SOP) on Undergraduate Program Assessment and Control, this report presents the results of the Outcomes Assessment Process for academic year 2003-2004. The Alpha Draft version was submitted by the Assessment Coordinator (AC) to the Undergraduate Program Committee (UPC). The UPC reviewed and revised the Alpha Draft version to produce the Beta Draft version for submission to the ECE faculty. Subsequently, the faculty reviewed and revised the Beta Draft, producing the Final Release of the document.

This process has produced 12 “Problem Items”, resulting in 12 “Action Items” to be completed during AY 2004-2005. Detailed statistical analyses of individual assessment instruments are confined to the appendices. The casual reader may wish to begin this report with Section 5 Compilation of Problem Items, or Section 6 Action Items.


Table of Contents

1 Introduction
2 Previous Action Items
  2.1 Status of Individual Action Items
  2.2 Carryover Items
3 Evaluation of Outcomes
  3.1 General Results
  3.2 EE and CpE Common Outcomes
  3.3 CpE-Specific Outcomes
  3.4 EE-Specific Outcomes
4 Evaluation of Outcomes Assessment Process
  4.1 Evaluation Relative to External Criteria
    4.1.1 Relative to ABET Criteria
    4.1.2 Relative to Other Criteria
    4.1.3 Relative to Department-Defined Target Attributes
  4.2 Summary of Assessment Process
5 Compilation of Problem Items
6 Action Items
  6.1 Definition of Action Items
  6.2 Execution of Action Items
7 Assessment Summary
  7.1 Coverage of Assessment Process
    7.1.1 Computer Engineering
    7.1.2 Electrical Engineering
  7.2 Outcome and Process Rankings
8 References


1 Introduction

In accordance with the ECE Department Standard Operating Procedure on Undergraduate Program Assessment and Control (UPAC SOP) [1, Sec 7], this report presents the results of the Outcomes Assessment Process for academic year 2003-2004. According to the SOP, the Alpha Draft version is submitted by the Assessment Coordinator (AC) to the Undergraduate Program Committee (UPC). The UPC reviews and revises the Alpha Draft version to produce the Beta Draft version for submission to the ECE faculty. Subsequently, the faculty reviews and revises the Beta Draft, producing the Final Release of the document.

Section 2 reports the status of actions recommended in the previous year’s annual report. Section 3 summarizes the results of assessment instruments executed during AY 2003-2004 and evaluates outcome execution. Section 4 summarizes our assessment of the assessment process itself, relative to various criteria. Section 5 presents a list of problem items in need of action, compiled from the previous sections. Each problem is assigned a Relative Priority Number (RPN), using the methods described in reference [1, Sec 7.2]. Section 6 lists action items mandated by the ECE faculty to deal with the problems listed in Section 5. Finally, Section 7 summarizes our performance relative to the program outcomes and the assessment process itself. Detailed analyses of individual instruments are contained in the Appendices.

2 Previous Action Items

This section lists the action items mandated by last year’s Annual Outcomes Assessment Report [2, Sec 6] and presents their completion status. Those items whose RPNs were low enough to be labeled as “Optional” are individually identified.

Incomplete action items are not automatically carried over to the next year. Rather, incomplete actions deemed worthy of carry-over must be individually specified, and are documented in Subsection 2.2. A summary of completion status is listed in Table 2.1.

Table 2.1: Summary of Action Item Status

Total Number of Action Items            21
Number of Mandatory Action Items        21
Number of Mandatory Items Completed     19
Percent of Mandatory Items Completed    90%
Highest RPN for Incomplete Items        729

2.1 Status of Individual Action Items

1. [For Problem 1] The UPC shall consult with all relevant faculty members to:

a. Identify the causes of and implement solutions for PRE-reported dissatisfaction with applied mathematics topics, including differential equations and frequency domain analysis,


Status: Complete. This was found to be a misunderstanding of precisely what the students were taught regarding differential equations as applied to R-L-C circuits in EE-2110. The solution was a revision of the EE-2110 course spec to clarify the topics actually covered. See item 5 below for further details.

b. Modify course specifications as needed to offer opportunities to demonstrate outcome EE(n), knowledge of advanced mathematics, and make those offerings visible to the COVer instrument. Status: Partially Complete. From examination of the course binders, it is clear that knowledge of advanced mathematics is demonstrated in nearly all of the courses in the department, but is simply not specifically listed as an outcome. The UPC and lead instructors need to identify these topics and list EE(n) in the appropriate course specs.

c. Modify course specifications to ensure that EE majors actually have the opportunity to demonstrate outcome EE(l), knowledge of probability and statistics, including applications appropriate for electrical engineers, and make those offerings visible to the COVer instrument. Status: Partially Complete. There are courses in the department that make use of probability and statistics and present problems, including applications. A lecture period in EE4900 is devoted to the proper use of statistical analysis in engineering design, and students are told to make use of these tools as appropriate in EE4901/EE4910, and to include statistical data and arguments in their design reports. This will be included in senior design starting in the fall of 2004. The department requires that EE students take either MA3720 (Probability) or MA3710 (Engineering Statistics). In reviewing the textbooks and syllabi of these two courses, it is clear that both courses have some probability and some statistics. The Associate Chair for Electrical Engineering met with Professor Allan Struthers of the math department and reviewed both the text and the syllabi. Professor Struthers noted that both courses cover material and contain examples that are relevant to electrical engineers. The courses, especially Engineering Statistics, are taught from an engineering perspective, rather than a mathematical perspective.

2. [For Problems 2, 5, 10] The UPC shall collaborate with the laboratory coordinator to update the laboratory course specifications to formalize and expose to the assessment process:

a. Experiments (such as the “circuit X” experiment) that specifically address outcome (b), the ability to design and conduct experiments, as well as to analyze and interpret data,

b. Increased opportunities for open-ended design in the labs.

c. Requirements for formal technical writing in the lab courses.

Status: Completed. Lab course specs have been updated to expose the content specified.


3. [For Problem 3] The Associate Chair for CpE shall consult with relevant faculty to investigate the discrepancies in the coverage assigned to specific courses for EE and CpE outcomes (b), (c), and (e), as revealed by the COVer instrument, and draft

a. revisions to the affected course specs,

b. revisions to the CpE Outcomes statements so that they are stated in a less restrictive manner.

Status: Completed. Course specs updated 05/01/04. Outcome statements revised 05/20/04.

4. [For Problem 4] The UPC shall modify the Senior Surveys, Alumni Surveys, and Senior Design Sponsors’ assessments to enhance assessment of outcome (a), ability to apply knowledge of mathematics, science and engineering.

Status: Completed. Instruments revised 10/24/03.

5. [For Problems 5, 16] The UPC shall consult with all relevant faculty members and the lab coordinator to identify the causes of and implement solutions for:

a. The widespread dissatisfaction with student knowledge in the area of Circuits, Electronics, and Circuit Design identified by multiple instruments,

b. Dissatisfaction with student knowledge in the area of Signals and Systems, which may be related to the weakness in Circuits and Electronics,

c. Dissatisfaction with student knowledge of Boolean Algebra and Combinational Logic.

Status: Completed. The Associate Chairs convened meetings of the relevant faculty members to discuss these topics. Two problems were uncovered: (1) ambiguities in the course specs, and (2) differing interpretations among the faculty of those ambiguities. Revisions to the course specs were drafted for EE-2110 and EE-2150 to eliminate these ambiguities. The Associate Chair for Electrical Engineering then circulated a memo to all concerned instructors [3] and solicited their approval of the proposed changes. Finally, the course spec revisions were submitted to the entire faculty for approval.

6. [For Problem 6] The Associate Chair for CpE shall consult with the CpE faculty to make the following revisions to the CpE PreCAP Courses and their Assessment Instruments:

a. Revise EE-3173 labs to give more opportunities for open-ended design in the context of integrated hardware/software systems.

b. Revise the EE-3173, 3175, and/or 3970 course specs to improve depth and breadth of coverage for topics identified in Appendix H.

c. Add evaluation of outcome (k) ability to use tools and techniques… to the EE-3173 assessment rubric.

d. Devise a standard algorithm for selecting samples for PreCAP assessment and publish it in the UPAC SOP [1, Sec 6.8].

Status: Completed. The Associate Chair for CpE convened meetings of the CpE program faculty to discuss the issues and revise the course specs as needed (completed by 06/04/04). The EE-3173 rubric was revised 05/20/04. The sample selection algorithm was entered into the UPAC SOP on 05/20/04.

7. [For Problems 7, 8] With regard to outcome (d) ability to function on a multi-disciplinary team, the UPC shall add assessment of outcome (d) to the Senior Design assessment tools, including assessment of the newly-defined skill set contributing to outcome EE(d).

Status: Completed. Changes entered 10/24/03.

8. [For Problem 9] The UPC shall devise a delivery method and assessment instrument to assess outcome (f) understanding of professional and ethical responsibility for EE majors.

Status: Completed. It was found that ethics are covered in the EE program but that this coverage was not well documented. In addition, the senior surveys asked if students could “respond to a professional ethical dilemma in accordance with the IEEE code of conduct”. This outcome ranked very low for EE majors, because only CpEs are taught that specific code. The following actions were taken [4]:

• Delivery of an ethics module in ENG1101 was documented in Table 2 of the UPAC SOP,

• The emphasis on ethics in EE4900 will be increased starting in the fall of 2004,

• The senior exit survey has been edited to drop the reference to the IEEE Code of Conduct, and simply ask about the ability to “respond to a professional ethical dilemma in accordance with current standards”.

9. [For Problem 10] The UPC shall formalize the prototype instruments for assessing technical writing of EE majors in EE-3120. These methods shall be added to the course specification and, if necessary, the UPAC SOP [1].

Status: Completed. The EE Mid-Program Writing Assessment instrument and rubric have been defined and entered into the draft of the UPAC SOP forwarded for approval with this report.

10. [For Problem 11] Regarding outcome (h), broad education necessary to understand the impact of engineering solutions in a global and societal context, the UPC shall:

a. Inform the MTU assessment coordinator and Dean of Engineering of the nature and severity of this problem,

b. Collaborate with Gen Ed personnel and experts within the College of Engineering to improve assessment and demonstrate delivery of this outcome,

c. Identify and implement actions that the department can take independent of the Gen Ed program to improve assessment and demonstrate delivery of this outcome.

Status: Mostly Completed. Following consultations with the Dean of Engineering, the Assessment Coordinator (AC), on 03/05/04, sent an email to both the dean and the Director of General Education informing both of the problems with outcomes (h) and (j) identified in last year’s annual report. Subsequently, the AC worked with the Director of General Education and the university assessment committee to correct these problems. The Dean of Engineering was kept informed of all major steps. Details of the actions taken are recorded in Appendix I.2. To summarize, the major accomplishments were:

• At the suggestion of the AC, the Director of General Education adapted the ECE Course Outcome Verification (COVer) form to assess how heavily students are required to actually demonstrate each desired core concept in both UN-1002 World Cultures and UN-2002 Institutions. This required first defining and articulating these concepts, and then formatting them into the instrument.

• The Director of General Education has instituted new instruments that indicate that outcomes (h) and (j) are being accomplished prior to graduation. This is the first direct contradiction of the uniformly negative results returned by ECE department instruments. Additional instruments are in the planning stages, to be implemented in AY 2004-2005.

• The AC has joined with the Director of General Education as Co-PI on a grant proposal entitled Globalization, Engineering, and Public International Law (GEPIL). Its purpose is to develop a support infrastructure and course materials that can be imported into engineering courses to cover social, economic, and legal aspects of various engineering topics.

• The UPC identified several Social Sciences (SS) courses that appear to address precisely the requirements of outcome (h). The AC contacted the Chair of the Social Sciences department to inquire whether requiring EE and CpE students to take one of a menu of courses would be practical within the resource constraints of the SS department. Subsequently, two UPC members met with the chair and two faculty members of the SS department. The current status of this action item is:

o Blind addition of SS courses to the CpE and EE cores could easily overload SS resources, resulting in sections being closed and unavailable to the students.

o Based on their intimate knowledge of the courses’ contents, SS faculty recommended revising the draft list of courses suggested by the ECE department.

o The SS Department is committed in principle to solving this problem and identifying a set of courses that is both supportable and relevant, in time for the fall 2004 curriculum change deadline.

o Specific implementation decisions were tabled until the fall semester, when all faculty have returned.

11. [For Problem 12] Regarding outcome (j) knowledge of contemporary issues, the UPC shall:

a. Inform the MTU assessment coordinator and Dean of Engineering of the nature and severity of this problem,

b. Collaborate with Gen Ed personnel and experts within the College of Engineering to improve assessment and demonstrate delivery of this outcome,

c. Identify and implement actions that the department can take independent of the Gen Ed program to improve assessment and demonstrate delivery of this outcome.


Status: Completed. Actions were identical to those for item 10 above, with the exception that the proposed SS course requirement does not directly apply to outcome (j).

12. [For Problem 13] The Associate Chair for CpE shall consult with the relevant lead instructor and repair the typographical errors in the EE-3170 course specification for outcomes (h), (i), and (j).

Status: Completed.

13. [For Problem 14] The UPC shall collaborate with the Computer Science Department to:

a. Develop a second semester programming course for EE majors that teaches C++,

b. Develop courses designed to help “Computing Undecided” freshmen select a major from the five computing majors in the CS and ECE departments.

c. Develop and propose an alternative to the CoE first year curriculum that better serves the needs of students interested in computing careers, and eliminates the chronic irrelevance problem of the first year ENG courses.

Status: Completed. The CS department is initiating CS-1129, a C++ version of CS-1122 specifically for EE students.

The ECE and CS departments have jointly developed a common first year program that allows “computing undecided” students to switch among the two departments’ five majors at the end of the freshman year. This has been implemented for first-year students enrolled in the Computer Science Department, and advertised as a “Transfer Option” into CpE or EE at the beginning of the second year.

This program was proposed to and rejected by the College of Engineering. While, technically speaking, that proposal completes this action item, it is recommended that the proposal be presented again this year.

14. [For Problem 15] The UPC shall collaborate with the Physics department to correct the chronic problem of perceived irrelevance and non-productivity of PH-1100.

Status: Completed. A member of the UPC met with several Physics department faculty and discovered that Physics department assessments had arrived at the same conclusion as this department. The following corrective actions have already been taken, but some will not show up on the AY-2004/2005 Sophomore surveys [5].

• Changed the Physics Labs from being prerequisites to the lecture to being prerequisites OR co-requisites. This was to allow more flexibility in student schedules and to allow students to more closely link the lecture to the lab if they so desired. This was changed for the 2003-2004 academic year.

• Rewrote the labs to include introductory material and to give more guidance on the open-ended questions. The students will be given a single-page written introduction to each lab. In addition, the students watch a ~5 min PowerPoint presentation on the computer at each lab station. This was started in the spring of 2004 and tested in the summer of 2004. It will be fully implemented by Fall 2004.


• Rewrote the open-ended questions at the end of each lab to give more direction but at the same time make them more challenging. This was started in the spring of 2004 and tested in the summer of 2004. It will be fully implemented by Fall 2004.

15. [For Problems 18, 25] The UPC shall make the following changes to the UPAC SOP [1, Sec 6]:

a. Eliminate the Specification Validation (SpecVal) instrument and upgrade other instruments, if deemed necessary,

b. Eliminate the Laboratory Assessment Summary Report, and upgrade lab course specifications to make the relevant topics visible to the COVer instrument,

c. Replace the CoE alumni survey with a department outcomes-specific survey to be administered shortly after graduation (e.g. 1 year),

d. Collaborate with the department’s undergraduate advisor to institute an instrument to give the UPC feedback on waivers granted in Degree Audits (without generating an unsustainable workload for the advisor),

e. Implement and execute a department-level instrument for assessing undergraduate advising.

Status: Completed. All actions have been implemented. The results for items c, d, and e are reported in Appendices C and E.

16. [For Problem 19] The UPC must take positive actions to dramatically reduce, redistribute, and/or automate the assessment workload currently borne by the AC. This problem is potentially the single biggest threat to the sustainability of the entire assessment process.

Status: Completed. The workload of the AC has been reduced to sustainable levels. The particular actions taken to implement this item are:

• The Course Specification Validation and Laboratory Assessment Summary Report instruments have been eliminated (as per item 15 above).

• Administration, execution, analysis, and evaluation of the Sophomore, Senior, and Alumni Surveys have been delegated to the department Undergraduate Advisor.

• Analysis and evaluation of other instruments was delegated to other members of the UPC.

• Spreadsheets and report templates for most assessment instruments have now stabilized.

17. [For Problem 20] The AC shall bring items 1, 2, and 3 of Table 4.1 to the attention of the UPC (in writing) for use in evaluation of program Educational Objectives.

Status: Completed.

18. [For Problem 21] The UPC shall revise the UPAC SOP [1] to eliminate Table 4 of the SOP and mandate the use of tables analogous to Tables 3.1 and 3.2 of this report.

Status: Completed.

19. [For Problem 22] The UPC shall review Tables 1 and 2 of the UPAC SOP [1] (which map outcomes to courses), and update them to be consistent with the outcomes listed in each course specification. A final review shall be conducted after all course-spec changes specified in this report are complete.

Status: Completed.

20. [For Problem 23] The UPC shall establish a readily accessible “Web Presence” on the department web site, including links from the department homepage to Program Educational Objectives and other assessment data.

Status: Completed.

21. [For Problem 24] The UPC shall define metrics for measuring the quality of outcome achievement that are not “arbitrary, capricious or easily manipulable”. To achieve some focus, Table 7.3 shall serve as a starting point.

Status: Complete. The following criteria are proposed as target RPNs, based on the FMEA ranking process defined in the UPAC SOP [1, Sec 7.2]. These criteria have been added as Subsection 7.3 in the draft revision of the UPAC SOP forwarded with this report.

1. Any outcome that had an RPN ≤ 340 last year shall not exceed an RPN of 340 this year.

2. For any outcome that had an RPN of >340 last year, its RPN shall have been reduced this year.

The rationale for using 340 as the cut-off RPN is that 340 < 343 = 7³. Thus, if the RPN ≤ 340, then at least one of the RPN components (Severity, Credibility, or Recurrence) must be ≤ 6. A review of the verbal descriptions in Subsection 7.2 of the SOP shows that below level 7, the problems become more benign.
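The cut-off logic can be sketched in a few lines. This is an illustrative sketch only: the UPAC SOP [1, Sec 7.2] defines the actual ranking scales, and the 1-10 component range and the function names used here are assumptions, not part of the SOP.

```python
# Illustrative sketch of the RPN target criteria described above.
# Assumptions: component ranks run 1-10 (typical of FMEA-style rankings);
# the names rpn() and meets_cutoff() are hypothetical, not from the SOP.

def rpn(severity: int, credibility: int, recurrence: int) -> int:
    """Relative Priority Number: the product of the three component ranks."""
    for rank in (severity, credibility, recurrence):
        if not 1 <= rank <= 10:
            raise ValueError("component ranks must be in 1..10")
    return severity * credibility * recurrence

def meets_cutoff(severity: int, credibility: int, recurrence: int,
                 cutoff: int = 340) -> bool:
    """True if the outcome satisfies criterion 1 (RPN at or below the cutoff)."""
    return rpn(severity, credibility, recurrence) <= cutoff

# Because 340 < 343 = 7**3, any RPN <= 340 forces at least one
# component rank to be <= 6; an exhaustive check confirms this:
assert all(
    min(s, c, r) <= 6
    for s in range(1, 11) for c in range(1, 11) for r in range(1, 11)
    if s * c * r <= 340
)
```

For example, ranks of (6, 8, 7) give an RPN of 336, which meets the cutoff, while (7, 7, 7) gives 343, which does not.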

2.2 Carryover Items

Based on the status of the items in Subsection 2.1, the following carryover items are recommended.

1. From Action Items 1b and 1c: Regarding outcomes EE(n) Advanced Mathematics and EE(l) Probability and Statistics, the UPC has determined that these outcomes are covered in a variety of courses. This reduces the severity of the action item from one of questionable delivery to one of correcting its documentation. Carryover Item: The UPC shall coordinate as all instructors examine their course content and document delivery of outcomes EE(l) and EE(n) in their course specs; the UPC shall then update Tables 1 and 2 of the UPAC SOP [1] as needed.

2. From Action Item 10: The UPC has initiated discussions with the Social Sciences Department regarding possible required courses to serve outcome (h). Carryover Item: The UPC shall complete the currently ongoing discussions with the Social Sciences department, specify a set of required courses to serve outcome (h), and incorporate those courses into the new course proposals in the fall 2004 revision cycle.


3. From Action Item 13: The College of Engineering rejected the proposed new First Year curriculum for “Computing Undecided” students. Carryover Item: The UPC shall resubmit the new curriculum in the fall 2004 revision cycle.

3 Evaluation of Outcomes

The results gathered from individual assessment instruments, originally reported in the appendices to this report, are summarized below. The assessment results are mapped to specific outcomes.

3.1 General Results

App E.2: In the closed-ended questions on the Senior Exit Surveys, CpEs felt more confident than EEs in all outcomes. In fact, CpEs indicated that they are “fairly” confident to “very” confident in their capabilities for all outcomes.

App F: The FE Exam EE Subject Test showed that the overall EE test grades were relatively constant at just below 2.0, and further shows a tight clustering of topic grades right around the national average (2.0), with grades confined to the range: [1.62 ... 2.22]. Thus there were no outstandingly good or bad topics shown in the subject test.

3.2 EE and CpE Common Outcomes

(a) apply knowledge of mathematics, science and engineering.

App A: In Prerequisite Exams, the single biggest area of instructor dissatisfaction covers the following Electrical Engineering topics affecting outcome (a):

1. Communication theory and electromagnetics, due solely to low scores reported by EE-4255

2. Instructor dissatisfaction with electronics was addressed this year, but not in time to show up on the PRE results (see Sec 2, item 5.a).

3. Photonics, due solely to low scores reported by EE-3190.

App B: Senior Design sponsors rated student skills and abilities in this outcome at well above 3.0 on a [0...4] scale.

App D: The lead instructors for the photonics courses (e.g. EE 3190 and 3291) should review and redistribute the material between the courses, and revise the relevant course specs.

App D: The lead instructors for EE 4221 and 4222 should review and redistribute their material and revise the relevant course specs.

App D: The UPC shall address the time-domain and z-domain topics mentioned by the EE-3160 instructor.

App D: The UPC shall consider whether EE-3120 or EE-3160 shall be added as a prerequisite to EE-3221.

App E.2: Senior Exit Surveys showed that EEs felt much less confident than CpEs regarding this outcome.


App E.2: In Senior Exit Surveys, the open-ended responses of seniors were critical of the laboratory experience: the labs were considered to be poorly organized, poorly equipped, and lacking in design-related hands-on experiences. Other comments received indicated that the students would like to have more hands-on courses and learn more about the practical applications of what they learn in class. Since this class of seniors took the core laboratory courses, significant improvements have been and are being made that address the type of complaints voiced in this instrument.

App E.3: In Alumni Surveys, ability (a3) apply knowledge of engineering topics to real-world problems was ranked as highly important by both majors, but received the second most negative residuals. By contrast, the two associated abilities (a1) apply knowledge of mathematics and (a2) apply knowledge of physical sciences fared considerably closer to average in both importance and preparation.

App F: In the FE Exam EE Subject Test, the three weakest areas were Solid State Electronics & Devices, Digital Systems, and Analog Electronic Circuits, indicating minor weaknesses in Outcome (a).

Conclusions: Due to its generality, it is difficult to apply a direct reporting measure across this entire outcome. However, the following conclusions can be drawn.

1. Senior Design sponsors seem well satisfied with student capabilities for this outcome, although the alumni feel under prepared to actually apply their knowledge of engineering topics.

2. Several of the items mentioned above are matters of inter-course coordination, which can be solved by bringing together the relevant instructors

3. Seniors highlighted a lack of practical hands-on or design-oriented lab experiences, which corroborates the alumni opinion that they are underprepared to apply their knowledge. However, since these two classes took the core laboratory courses, significant improvements have been and are still being made that address the type of complaints voiced.

(b) design and conduct experiments, as well as to analyze and interpret data. App E.2: Senior Surveys indicate confidence in this outcome at or above average for both majors.

App E.3: Alumni Surveys show that for both majors, the importance of this outcome is about average, while the residuals are near zero.

CpE Only: App G: PreCAP assessment of EE-3970 written assignments showed significant decreases in the demonstration of this outcome. Specifically, students were sloppier in the execution, analysis, and exposition of their team projects than in the previous year.

Conclusions:

Senior and Alumni surveys concur that the level of preparation for this outcome is adequate for its importance.

The level of demonstration of this outcome in EE-3970 was considerably lower than in the previous year. Examination of the EE-3970 Course Specification reveals that the level of rigor required by the PreCAP assessment rubrics is not clearly stated in the specification. This ambiguity should be corrected to provide future instructors with adequate guidance on the depth of knowledge required to be demonstrated.

(c) design a system, component, or process to meet desired needs. App B: Senior Design sponsors rated student skills and abilities in this outcome at well above 3.0 on a [0...4] scale. The only question to receive less than a 3.0 mean score was “How well did the project deliverables achieve your overall goals?” This question received a mean score of 2.75.

App E.2: In Senior Exit Survey closed-ended questions, this outcome received the lowest ranking for EEs and the second lowest ranking for CpEs. Since this class of seniors took the core laboratory courses, significant improvements have been and are being made that address this complaint.

App E.2: In Senior Exit Surveys, the open-ended responses of seniors were critical of the laboratory experience. The laboratories were considered poorly organized, poorly equipped, and lacking in design-related hands-on experiences. Other comments indicated that students would like more hands-on courses and more exposure to the practical applications of what they learn in class. Since this class of seniors took the core laboratory courses, significant improvements have been and are being made that address the type of complaints voiced in this instrument.

App E.3: In Alumni Surveys, this outcome received the most negative residual values from both majors, despite ranking as the third most important ability for CpEs.

App I.3: Enterprise is an effective substitute for senior design in the delivery of this outcome.

App I.4: In Engineering Fundamentals, simple design projects were assigned and evaluated according to a rubric. The mean assessment score reported for Spring 2004 was 55.4 out of a possible 63 points. No data has been reported for previous years.

CpE Only:

App G: PreCAP assessment of EE-3173 and EE-3175 assignments showed significant improvements in the demonstration of this outcome. Both the flexibility of the assignments, and depth of knowledge demonstrated increased from last year.

Conclusions: Senior design sponsors seem well satisfied with the students’ abilities in this outcome.

Both seniors and alumni have relatively low confidence in their abilities. However, since these two classes took the core laboratory courses, as well as EE-3173 and 3175, significant improvements have been and are still being made that address the type of complaints voiced.

(d) ability to function on a multi-disciplinary team. App B: Senior Design sponsors rated student skills and abilities in this outcome at well above 3.0 on a [0...4] scale.

App E.2: In Senior Exit Surveys, seniors in both majors ranked this outcome as their highest in confidence level.

App E.3: In Alumni Surveys, all three components of the ability to function on a multidisciplinary team received mildly non-positive residuals, despite their relatively high importance. In particular, for ability (d3) Execute a project to completion & produce the required deliverables, EEs felt significantly less prepared than CpEs.

App I.3: Enterprise is an effective substitute for senior design in the delivery of this outcome.

App I.4: In Engineering Fundamentals, teaming was evaluated by self and peer evaluation, in which over 80% of students ranked the “functionality” of their teams highly.

CpE Only: App G: PreCAP assessment of EE-3970 written assignments showed significant decreases in the demonstration of this outcome. Specifically, the final project assignment did not require demonstration of all teamwork skills required in the assessment rubric.

Conclusions: Both seniors and Senior Design sponsors ranked students’ abilities in this outcome as very high, while alumni gave this outcome mildly negative residuals. Thus, there does not appear to be a problem with the current delivery of this outcome.

PreCAP assessment of EE-3970 showed that the final project assignments did not require demonstration of all teamwork skills required by the rubric. Examination of the EE-3970 Course Specification reveals that the level of rigor required by the PreCAP assessment rubrics is not clearly stated in the specification. This ambiguity should be corrected to provide future instructors with adequate guidance on these topics.

(e) identify, formulate, and solve engineering problems. App B: Senior Design sponsors rated student skills and abilities in this outcome at well above 3.0 on a [0...4] scale.

App E.2: In Senior Exit Surveys, this outcome received the third lowest ranking for CpEs, and ranked about average for EEs.

App E.3: Alumni ranked this outcome above average, and gave it mildly negative residuals.

App I.3: Enterprise is an effective substitute for senior design in the delivery of this outcome.

Conclusions:

Senior Design sponsors rated the students’ abilities in this outcome very highly. However, the seniors and alumni themselves felt less confident. Since these two classes took the core laboratory courses, significant improvements have been and are being made that address this complaint.

(f) understanding of professional and ethical responsibility. App E.2: Senior Exit Surveys showed that EEs felt much less confident than CpEs regarding this outcome.

App E.3: Alumni Surveys showed CpEs to be considerably less confident than EEs in this outcome. However, the CpE class of 2003 took the 1-credit EE-3900, rather than the 2-credit EE-3970.

App F: The FE Exam Ethics Section shows significant improvement in the ethics grades of EE students, from a marginal grade of just under 2.0 last year to a respectable grade of just under 3.0 this year.

App I.2: There is now evidence that all sections of UN-1001 (Perspectives) make minor, but documentable contributions to this outcome.

App I.4: In Engineering Fundamentals, students executed role playing assignments and performed oral presentations of case studies to the class. Initial assessment using final exam questions presented ethical dilemmas to the students; results were good, with a mean score of over 80% correct. Subsequently, assessment was changed to use rubric-based assessment of the ethics presentations, evaluated on a 0-2 scale (where 2 is best). For Spring 2004, the mean scores were well above 1.0, although some data appears to be missing from the tables. No data was reported for any other semesters.

Conclusions: Among Seniors, CpEs were much more confident than EEs, while among alumni, the opposite is true. This may or may not be related to the switch from the 1-credit EE-3900, to the 2-credit EE-3970 for CpEs. Nonetheless, the FE Exam results showed good performance in this outcome for both majors.

It may now be credible to list UN-1001 and/or ENG-1102 as making a minor contribution to this outcome.

(g) communicate effectively.

App B: Senior Design sponsors reported that all aspects of both the visual and spoken portions of the presentations averaged well above 3.0 on a [0...4] scale.

App B: Senior Design Sponsors reported mean scores for all aspects of the written reports well above 3.0 on a [0...4] scale.

App E.2: In Senior Exit Surveys, both majors ranked their written and oral communications skills at or above a high 4.0 on a [1...5] scale.

App E.3: In Alumni Surveys written and oral communication yielded residuals near zero. From a historical standpoint, this is a significant improvement.

App I.2: There is now evidence that all sections of UN-1001 (Perspectives) make minor, but documentable contributions to this outcome.

App I.2: There is no evidence that UN-2001 (Revisions) actually delivers toward outcome (g). However, new assessment instruments are being developed for implementation in AY-2004/2005.

App I.3: Enterprise assesses outcome (g) by sampling simple memoranda. Although it was alluded to, no direct results regarding the written final design reports were presented. The single most significant problem reported was “grammar and mechanics”. Oral communication skills were deemed to be “adequate”.

App I.4: Written Communication assessment in Engineering Fundamentals was limited to rubric-based assessment of simple Memoranda using a 0-2 scale (where 2 is best). Although no tabular data was presented in the report, the main text stated that mean scores were in the range of 1.05 ... 1.33, indicating general satisfaction with the results.

CpE Only: App G: PreCAP assessment of EE-3970 written assignments showed significant decreases in the demonstration of this outcome. Specifically: (1) written assignments were too short and simplistic to demonstrate all desired attributes; (2) grammar and writing style have not improved since last year; and (3) the ability to properly use references and citations showed some degradation.

EE Only:

App I.5: EE-3120 writing assessment indicates that EE students meet our expectations in their ability to write a technical report. The only area in which they show a minor weakness is in Grammar, Punctuation, and Spelling.

Conclusions: Oral presentation skills for both majors were assessed as excellent by the Senior Design sponsors and by outgoing seniors, and as adequate by alumni.

Technical writing skills of the best writers on each team were assessed as excellent by Senior Design sponsors. In addition, seniors in both majors felt very confident in their writing skills, while alumni felt adequately prepared relative to importance.

The EE-3120 written assignments indicate that EE students generally meet our expectations in their ability to write a technical report.

The EE-3970 written assignments were too short and simplistic to demonstrate all desired attributes stated in the assessment rubric. Examination of the EE-3970 Course Specification reveals that the level of rigor required by the PreCAP assessment rubrics is not clearly stated in the specification. This ambiguity should be corrected to provide future instructors with adequate guidance on these topics.

It may now be credible to list UN-1001 (Perspectives) as making a minor contribution to this outcome.

To summarize, the improvement in writing skills observed last year seems to have been sustained. These results are in sharp contrast to the poor results identified in previous years’ assessments.

(h) broad education necessary to understand the impact of engineering solutions in a global and societal context.

App E.1: In Sophomore Surveys, UN-1001 and 1002 received the lowest overall rankings (2 and 1, respectively) from EEs, and more modest rankings of 5 and 5 from CpEs. Both majors gave UN-1002 their lowest ranking in Productivity, while EEs also gave it their lowest ranking in Efficiency. Of the 7 written comments directed at these courses, most labeled them as worthless and a “waste of time”.

App E.2: In Senior Exit Surveys, this outcome received the lowest confidence ranking for CpEs and the second-lowest for EEs.

App E.3: Alumni surveys showed that EEs ranked this outcome as low in importance, while CpEs ranked it near average. Both majors gave it residuals near zero.

App I.2: There is now evidence that students do, in fact, receive a sound general education, including demonstration of this outcome. However, engineering seniors perceive that they have not received these skills. Additional instruments planned for AY-2004/2005 will test this conclusion in further depth.

Conclusions: Overall performance in this outcome has improved since last year. While seniors had very little confidence in their abilities, alumni ranked their preparation as adequate to the level of importance. In addition, new Gen Ed assessment instruments indicate that the outcome is being delivered even though seniors may not realize it.

While there have been improvements in the assessment of this outcome, Gen Ed assessment is still in its early stages. In addition, there is evidence that UN-1002 (World Cultures) does not contribute to this outcome. Continued collaboration with the Director of Gen Ed should keep this outcome moving in the right direction.

(i) recognition of the need for, and an ability to engage in life-long learning. App E.2: In Senior Exit Surveys, over 79% of seniors in both majors either strongly or mildly expect to take continuing education courses throughout their careers. In addition, 33% of EEs, but only 9% of CpEs either strongly or mildly plan to seek an advanced degree.

App E.3: In Alumni Surveys, both majors ranked abilities (i1) succeed in graduate-level course work and (i2) conduct independent graduate-level research as the two skills of lowest importance. Accordingly, these abilities received the two highest residuals, indicating alumni felt over-prepared for graduate studies. This result makes sense for recent graduates getting started in industry.

Conclusions:

Although the number of seniors planning to immediately enter graduate school has declined since last year, a high percentage of seniors stated an intent to take continuing education courses. In addition, alumni gave this outcome their highest residuals. Combined, these facts indicate a very good accomplishment of this outcome.

(j) knowledge of contemporary issues. App E.2: Senior Exit Surveys show a high level of confidence in Outcome (j1), ability to work with people of different cultural backgrounds, but a low confidence in outcome (j2), knowledge of the nature and role of institutions in shaping today’s world.

App E.3: Alumni Surveys showed that both majors ranked outcome (j) as slightly below average in importance. CpEs gave (j1) ability to work with people of different cultural backgrounds a high residual. Otherwise, all residuals were close to zero.

App I.2: There is now evidence that students do, in fact, receive a sound general education, including demonstration of this outcome. However, engineering seniors perceive that they have not received these skills. Additional instruments planned for AY-2004/2005 will test this conclusion in further depth. However:

• Pre/post tests in UN-1002 (World Cultures) indicate that the course may not be contributing to its assigned outcomes.

• Course-specific assessment for UN-2002 (Institutions) consists of only the one indirect instrument imposed by the Director of General Education. The report states that there is considerable variability in the depth to which different instructors require demonstration of different outcomes, indicating that some sections may not be contributing to the outcomes of this course.

Conclusions: Overall performance in this outcome has improved since last year. For the first time, the results of the Senior Exit Surveys and Alumni Surveys are not completely negative, and show some perception that this outcome is being achieved. This perception is backed up by other Gen Ed assessment instruments that corroborate the conclusion.

While there have been improvements in the assessment of this outcome, Gen Ed assessment is still in its early stages. In addition, there is evidence that UN-1002 (World Cultures) and UN-2002 (Institutions) may not be contributing to this outcome. Continued collaboration with the Director of Gen Ed should keep this outcome moving in the right direction.

(k) ability to use the techniques, skills, and modern engineering tools… App B: Senior Design sponsors rated student skills and abilities in this outcome at well above 3.0 on a [0...4] scale.

App B: 71% of senior design teams were required to use at least one industry or government standard, and 94% needed to use commercially available hardware or software tools.

App E.2: In Senior Exit Surveys, CpEs ranked their mean confidence level regarding this outcome as a high 4.42 on a [1...5] scale. By contrast, EEs ranked it at a somewhat lower 3.71.

App E.1: In Sophomore Surveys, ENG-1101 and 1102 received the lowest overall rankings (2 and 1, respectively) from CpEs and more modest rankings of 5 and 4 from EEs. CpEs also gave them the two lowest scores for Efficiency, indicating that they provided “the least gain for the most pain”, and these courses received more written comments than any other courses, all of which were negative. The overwhelming consensus of the written responses is the same as it has been for the previous two years: these courses serve no purpose, have no relevancy to EE or CpE, and the work associated with them is “busy work” more applicable to mechanical engineering students.

Based on the previous two years’ outcomes assessments, this department has chosen not to rely on these courses to contribute to any outcomes for either the CpE or EE program. As discussed in last year’s annual report, their chronic failure to deliver meaningful education can no longer be considered a minor inconvenience. In the bigger picture, the credit hours wasted in these courses must be considered as contributing to the detriment of outcomes that could be better served if they were replaced with relevant courses. In particular, there is no room for expansion or improvement of coverage in the CpE curriculum because all 128 credits are spoken for.

App E.2: In Senior Exit Surveys, some seniors complained that the freshman engineering courses ENG-1101 and ENG-1102 were “worthless”, “a big waste of time”, and should be dropped.

App I.4: While ENG-1101 and ENG-1102 do provide instruction in this outcome, the evidence in the Engineering Fundamentals Assessment Report [6] that the outcome is actually accomplished is incomplete. While raw data tables are provided, no evaluation of the statistics was presented. Nonetheless, the report claimed “students have a solid understanding of the use and selection of the computational tools they are using.” This missing analysis leaves open questions. In addition, many of the “tools and techniques” taught in ENG-1101 and 1102 are not relevant to EEs or CpEs.

App H: The UAC concurred with development of the computing undecided freshman option, stating that it “should help alleviate some of the problems associated with the discontent of the students of the department with regards to the engineering fundamentals program.”

App E.3: In Alumni Surveys, EEs ranked ability (k2) utilize modern software tools & techniques of {electrical or computer} engineering as the single most important skill, while ranking ability (k1) utilize modern hardware tools & instruments of {electrical or computer} engineering as considerably less important. By contrast, CpEs ranked both of these skills closer to average

App I.3: Enterprise does not assess outcome (EE k), as is done in senior design. It may therefore be necessary to reduce the level of dependency on Senior Design for this outcome, and further distribute it throughout the EE curriculum.

CpE Only: App G: PreCAP assessment of EE-3173 and EE-3175 assignments showed significant improvements in the demonstration of this outcome. Both the flexibility of the assignments, and depth of knowledge demonstrated increased from last year. In particular, EE-3175 doubled the number of simulation methods employed. Further improvements can be achieved by increasing the complexity of the modeling assignments.

EE Only: App E.3: In Alumni Surveys, EEs gave abilities (k1) utilize modern hardware tools & instruments of {electrical or computer} engineering and (k2) utilize modern software tools & techniques of {electrical or computer} engineering the third most negative residuals

Conclusions: The overall results from Senior Design sponsors and advisors are very favorable.

CpE seniors and alumni feel adequately prepared for this outcome, a conclusion that is corroborated by the PreCAP Assessments of EE-3173 and EE-3175.

By contrast, EE seniors and alumni are considerably less confident about their preparation for this outcome. However, since these two classes took the core laboratory courses, significant improvements have been and are being made that address this complaint.

Sophomore Surveys of CpEs and comments on Senior Exit Surveys concur that ENG-1101 and 1102 may not be relevant to our majors. Furthermore, the Engineering Fundamentals Department's own assessment shows that those courses spend considerable time on several "tools and techniques" that are irrelevant to either major.

3.3 CpE-Specific Outcomes

(l) ability to function in a major design experience incorporating most of: economic, environmental, sustainability, manufacturability, ethical, health and safety, social, and political considerations.

App C: All students are required to fulfill a major design experience. This is done via senior design or Enterprise, and no exceptions are made to this requirement.

App B: Senior Design sponsors were generally satisfied with the results and expressed interest in participating again. However, the only question to receive less than a 3.0 mean score was “How well did the project deliverables achieve your overall goals?” This question received a mean score of 2.75.

App B: Of the nine considerations listed, the majority were of non-trivial importance to the majority of teams. As shown in Table B.3, economic benefit to the sponsor and project budget limits were important to 100% of the teams, while sustainability, manufacturability, ethical issues, health or safety issues, and societal impacts were of major concern to the majority of teams.

App I.3: Enterprise is an effective substitute for senior design in the delivery of this outcome.

Conclusions: In general, the senior design program does very well at delivering this outcome. Sponsors are generally very pleased with the results, and students have opportunity to consider most of the nine considerations listed, even though their importance to individual projects varies.

(m) completion of 1 year of mathematics and basic sciences, including experimental experience.

The CpE curriculum contains 33 credits (1.03 years) of math and sciences courses. Completion is documented by the degree audit forms executed for each student.

App C: There is no evidence that any students have completed less than 32 credits of mathematics and basic sciences.

App E.1: In Sophomore Surveys, numerical scores for CH-1100 were about average. The comments cited lack of relevance.

App E.1: In Sophomore Surveys, PH-1100 and 2100 scored 4 and 7 out of 10, respectively, in overall rankings. The few written comments on these courses indicate that the courses are too hard and that PH-1100 is useless. Actions have already been taken to improve these courses (see Sec 2, item 14).

Conclusions: This outcome is being satisfactorily accomplished.

(n) completion of 1.5 years of engineering topics.

The CpE curriculum contains 52 credits (1.63 years) of engineering topics courses. Completion is documented by the degree audit forms executed for each student.
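
The credit-to-year figures quoted for outcomes (m) and (n) can be verified with simple arithmetic. A minimal sketch follows; the 32-credit academic year used as the divisor is an inference from the report's own numbers, not something stated explicitly in the text:

```python
# Credit-hour to academic-year conversion for outcomes (m) and (n).
# ASSUMPTION: a 32-credit academic year, inferred from the report's figures.
CREDITS_PER_YEAR = 32

math_science_years = 33 / CREDITS_PER_YEAR  # outcome (m): 1.03125, reported as 1.03
engineering_years = 52 / CREDITS_PER_YEAR   # outcome (n): 1.625, reported as 1.63

print(math_science_years, engineering_years)  # 1.03125 1.625
```

Both quotients are consistent with the 1.03 and 1.63 years cited in the body of the report.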

App C: There is no evidence that any student has completed less than 48 credits of engineering topics.

App D: Add this outcome to the Course Specs for EE-2303, 2304, 3130, 3160, and 3305.

Conclusions: This outcome is being satisfactorily accomplished, although for documentation purposes, it needs to be added to the Course Specs for EE-2303, 2304, 3130, 3160, and 3305.

(o) completion of a broad, non-technical general education component. All MTU students are required to take 28 credits of “general education” courses. Completion is documented by the degree audit forms executed for each student.

App C: In the past, a few students inadvertently took too many lower division Gen Ed courses instead of the required number of upper division courses. However, the general Gen Ed requirements were still met. This problem has been corrected.

Conclusions:

This outcome is being accomplished, although there is still room for improvement (see Sec 3.2, outcomes (h) and (j)).

(p) knowledge of discrete math, probability & statistics, calculus, diff. equations, linear algebra, basic sciences, computer science, and engineering sciences necessary to analyze and design complex electrical and electronic devices, software, and systems containing both hardware and software.

App A: In Prerequisite Exams, there is general dissatisfaction directed toward a variety of applied mathematics topics, such as probability and statistics, differential equations, and discrete system analysis. These topics have a negative impact on CpE outcome (p) and EE outcomes (l) and (n). Two patterns have emerged:

1. There is wide disagreement among instructors regarding preparation for some topics.

2. To date, problems in this area have been addressed by adjusting the applied mathematics portions of EE courses. It would now appear necessary to open a dialogue with the mathematics department concerning this issue.

App E.1: In Sophomore Surveys, MA-1160 and MA-2160 were highly ranked (8 and 9 out of 10) for Productivity, Efficiency, and in the overall ranking. We did not receive any written comments on either of these courses.

App F: The Fundamentals of Engineering Exam Math Section shows a 24% drop in this grade for EE students, from an acceptable 2.32 last year to an unacceptable 1.76 this year. Although the CpE sample size was too small to be statistically significant, their results closely mirrored the EE results.
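
As a sanity check, the 24% figure follows directly from the two grades reported above; this minimal arithmetic sketch uses nothing beyond the numbers quoted from App F:

```python
# FE Exam Math Section grades for EE students, as reported in App F.
last_year, this_year = 2.32, 1.76

# Year-over-year percentage decline.
percent_drop = (last_year - this_year) / last_year * 100
print(f"{percent_drop:.1f}% drop")  # 24.1% drop, consistent with the reported 24%
```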

Conclusions: While mild dissatisfaction with various mathematical topics has been a minor problem for years, there appears to be a sudden drop in the quality of delivery for this outcome. A serious department-wide investigation, in collaboration with the Mathematics Department, would seem to be in order.

(q) proficiency in computer programming, data structures, algorithms, and numerical methods, including familiarity with at least 2 high-level languages, 2 assembly languages, and 1 hardware description language.

The curriculum contains courses spanning all of these topics. Students learn 3 high-level languages (Java, C++, and FORTRAN), two assembly languages (MIPS-R3000 and Altera NIOS), and one hardware description language (Verilog).

App A: In Prerequisite Exams, the single topic of “Computer Programming” negatively affected outcome CpE (q), as reported in EE-3173 and 3970.

App E.1: In Sophomore Surveys, CS-1121 achieved the highest ranking of 10 in Productivity, Efficiency and in the overall rankings. The comments indicate that Java is not a useful language for CpE and EE majors; C would be much more useful.

App E.1: In Sophomore Surveys, ENG-1101 and 1102 received the lowest overall rankings (2 and 1 out of 10, respectively). They also had the two lowest scores for Efficiency, indicating that they provided “the least gain for the most pain”, and received more written comments than any other courses, all of which were negative. The overwhelming consensus was that these courses serve no purpose, have no relevancy to EE or CpE, and the work associated with them is busy work more applicable to mechanical engineering students.

As listed in Tables 1 and 2 of reference [1], these courses do not contribute to any outcomes for either the CpE or EE program (a conclusion reinforced by the previous two years’ outcomes assessments). As discussed in last year’s annual report, their chronic failure to deliver meaningful education can no longer be considered a minor inconvenience. In the bigger picture, the credit hours wasted in these courses must be considered as contributing to the detriment of outcomes that could be served if they were replaced with relevant courses. In particular, there is no room for expansion or improvement of coverage in the CpE curriculum because all 128 credits are spoken for.

App H: The UAC concurred with development of the computing undecided freshman option, stating that it “should help alleviate some of the problems associated with the discontent of the students of the department with regards to the engineering fundamentals program.”

App E.3: In Alumni Surveys, both majors ranked ability (q2) write and debug assembler language programs as the third-lowest in importance. However, CpEs gave (q1) write & debug high-level language programs the third most negative residual, despite ranking it as the fourth most important item.

Conclusions: Prerequisite exams and alumni surveys both indicate that, despite its importance, high-level language programming skills are not being learned to the level desired. The UPC needs to investigate this problem further. Directing the additional electives available to students transferring from the CS department could be used to address this problem; the CS department has appropriate courses available. In addition, elimination of ENG-1101 and 1102 would also free up credits that could be used to address this problem.

(r) depth of knowledge indicated by completion of courses whose prerequisite precedence graph has a depth of at least 3 sequential courses, and by at least 6 credits in one technical elective focus track.

The following required courses are preceded by a prerequisite chain at least three courses deep:

CS-2141, CS-3421, CS-4411, EE-3130, EE-3303, EE-3160, EE-3173, EE-3175, EE-3970, EE-4900, EE-4901, EE-4910.

Conclusions: This outcome is being accomplished. Completion is documented by the degree audit forms executed for each student.

(s) breadth of knowledge spanning the areas of electronics, signal processing, logic design, computer architecture, operating systems, fundamentals of computer science, and technical electives spanning more than one focus track.

The curriculum contains at least one course for each of the topics listed.

App A: In Prerequisite Exams, the topic of Boolean Algebra and Combinational Logic adversely affected this outcome. This topic was addressed this year, but not in time to show up in the PRE results (see Sec. 2, item 5.c).

Conclusions: This outcome is being accomplished. Completion is documented by the degree audit forms executed for each student.

3.4 EE-Specific Outcomes

(l) knowledge of probability and statistics, including applications appropriate for electrical engineers.

App A: In the Prerequisite Exams, there is some dissatisfaction directed toward a variety of applied mathematics topics such as probability and statistics, differential equations, and discrete system analysis. These topics negatively affect CpE outcome (p) and EE outcomes (l) and (n). Three observations have emerged:

1. There is a wide disagreement between instructors regarding preparation for some topics.

2. The scores from sophomores are lower than those for seniors, indicating that students get better with experience.

3. To date, problems in this area have been addressed by adjusting the applied mathematics portions of EE courses. It would now appear necessary to open a dialogue with the mathematics department concerning this issue.

App F: The Fundamentals of Engineering Exam math section shows a 24% drop in this grade, from an acceptable 2.32 last year to an unacceptable 1.76 this year.

Conclusions: While mild dissatisfaction with various mathematical topics has been a minor problem for years, there now appears to be a drop in the quality of delivery for this outcome, especially as noted on the FE exam. A department-wide investigation, in collaboration with the Mathematics department, would seem to be in order.

(m) knowledge of mathematics through differential and integral calculus, basic sciences, and engineering sciences necessary to analyze and design complex electrical and electronic devices and systems containing electrical and electronic components.

App D: Add this outcome to the course spec for EE-3130.


App E.1: In the Sophomore Surveys, MA-1160 and MA-2160 were highly ranked (9 and 10 out of 10) for Productivity and Efficiency, and in the overall ranking. We did not receive any written comments on either of these courses.

App E.1: In Sophomore Surveys, numerical scores for CH-1100 were below average. The comments cited lack of relevance.

App E.1: In the Sophomore Surveys, PH-1100 and 2100 scored 6 and 8 out of 10, respectively, in the overall rankings. The few written comments on these courses indicate that the courses are too hard and that PH-1100 is useless.

App F: The Fundamentals of Engineering Exam math section shows a 24% drop in this grade, from an acceptable 2.32 last year to an unacceptable 1.76 this year.

Conclusions: While mild dissatisfaction with various mathematical topics has been a minor problem for years, there now appears to be a drop in the quality of delivery for this outcome, especially as noted on the FE exam. A department-wide investigation, in collaboration with the Mathematics department, would seem to be in order.

This outcome may be added to the course spec for EE-3130.

(n) knowledge of advanced mathematics, typically including differential equations, linear algebra, complex variables, and discrete mathematics.

App A: In the Prerequisite Exams, there is general dissatisfaction directed toward a variety of applied mathematics topics such as probability and statistics, differential equations, and discrete system analysis. Two patterns have emerged:

1. There is a wide disagreement between instructors regarding preparation for some topics.

2. To date, problems in this area have been addressed by adjusting the applied mathematics portions of EE courses. It would now appear necessary to open a dialogue with the mathematics department concerning this issue.

App F: The Fundamentals of Engineering Exam math section shows a 24% drop in this grade, from an acceptable 2.32 last year to an unacceptable 1.76 this year.

Conclusions: While mild dissatisfaction with various mathematical topics has been a minor problem for years, there now appears to be a drop in the quality of delivery for this outcome, especially as noted on the FE exam. A department-wide investigation, in collaboration with the Mathematics department, would seem to be in order.


4 Evaluation of Outcomes Assessment Process

4.1 Evaluation Relative to External Criteria
In this section, the assessment process is evaluated relative to the three criteria specified in the UPAC SOP [1, Sec 7, item 1.c].

4.1.1 Relative to ABET Criteria
The ABET Matrix for Implementation of Assessment [7, Figure A-1] is a rubric listing six categories for evaluating assessment processes. Each category is ranked in the range [1…5], with 5 being the best. The results of our program evaluation are shown in Table 4.1.

Table 4.1: Performance relative to ABET Levels of Implementation

# | Category               | Score | Actions needed to advance to next higher level
1 | Educational Objectives | 4.0   | Execute the current Objectives Assessment procedures continuously, especially with regard to alumni input.
2 | Constituents           | 3.0   | Execute the current Objectives and Outcomes Assessment procedures continuously, especially with regard to alumni input.
3 | Processes              | 4.0   | Publish our assessment procedures and evaluate whether they are good enough to be considered a "benchmark by other institutions".
4 | Outcomes Assessment    | 5.0   | N/A
5 | Results                | 4.0   | Sustain current procedures and determine the definition of "World Class".
6 | System                 | 4.0   | Sustain current procedures and assist extra-departmental entities in solving their problems by a more "systematic approach".
  | Mean and Summary       | 4.0   | The most critical action is to continue working with CoE and Gen Ed to systematically address their deficiencies.
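The "Mean and Summary" row in Table 4.1 is simply the arithmetic mean of the six category scores; a minimal sketch of that check, using the scores from the table:

```python
# Mean of the six ABET implementation-level scores from Table 4.1.
scores = {
    "Educational Objectives": 4.0,
    "Constituents": 3.0,
    "Processes": 4.0,
    "Outcomes Assessment": 5.0,
    "Results": 4.0,
    "System": 4.0,
}

# (4 + 3 + 4 + 5 + 4 + 4) / 6 = 4.0, the summary-row value.
mean_score = sum(scores.values()) / len(scores)
print(mean_score)
```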

4.1.2 Relative to Other Criteria
In response to widespread criticism of the Draft criteria for levels of implementation evaluated last year, NCA has discontinued use of those criteria.

4.1.3 Relative to Department-Defined Target Attributes
The target attributes of [1, Subsec 5.2] and the status of each are listed in Table 4.2.


Table 4.2: Performance Relative to Department-Defined Criteria

# | Description    | Status                 | Actions needed to improve compliance
1 | Completeness   | Nearly Satisfactory    | Continuous improvement in Gen Ed assessment with regard to (h) and (j)
2 | Accountability | Satisfactory           |
3 | Consistency    | Satisfactory           |
4 | Sustainability | Apparently Satisfactory| The elimination of unproductive instruments and the delegation of some assessment duties appear to have relieved the overburden on the Assessment Coordinator. However, time will tell for sure.
5 | Efficiency     | Satisfactory           |

4.2 Summary of Assessment Process
This section presents comments about the ECE department assessment procedures themselves, gleaned from the assessment instruments. Specifically:

1. App C: It is recommended that in the future the Advisor not be given the responsibility of evaluating the advising survey results. The Advisor should be given the evaluated results to use as a tool to improve or enhance his/her performance and advising services.

2. App C: It appears that freshman students have difficulty scheduling courses around their cohort schedules; in particular UN-1001, UN-1002, and CS-1121, all of which are required to be taken in the first year by both EEs and CpEs. At this time, no hard statistics on this problem are available.

3. App C: Students do not find the department website useful for advising purposes.

4. App C: Since August 2003, students are generally very satisfied with the services of the department’s academic advisor.

5. App D: In the UPAC SOP, Table 2, for EE-3120, the “1” under outcome “j” should be moved to outcome “k”.

6. App I.1: Based on observations made during the April 2004 external review, a policy should be formally specified in the UPAC SOP stating that authority for each degree program shall be vested in the program faculty, rather than the entire department faculty.


5 Compilation of Problem Items
This section presents a compiled list of problem items, based on the assessment results presented above. Each listed item is labeled with the subsection numbers from Sections 3 and 4 identifying its origins.

In addition, each listed problem was subjected to a standard Failure Modes and Effects Analysis (FMEA) process that produces three numerical metrics for each problem item: Severity (S), Credibility of the Indicators (C), and Recurrence likelihood (R). The values of S, C, and R are integers in the range [1…10], with 10 being the most severe. These values are extracted from verbal descriptions listed in three rubrics defined in the UPAC SOP [1, Sec 7.2].

The final output of an FMEA process is a Relative Priority Number (RPN), which is the product of the three factors (i.e., RPN = S * C * R). Thus, each RPN is in the range [1…1000], with 1000 being the most severe (highest) priority.
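The FMEA arithmetic above can be sketched as follows; the problem names are placeholders, and the S/C/R values are illustrative samples on the [1…10] rubric scales, not values taken from this report.

```python
# Sketch of the FMEA prioritization described above: RPN = S * C * R,
# where Severity, Credibility, and Recurrence are integers in [1, 10].
# The problem names and sample values below are illustrative only.
problems = {
    "example problem A": {"S": 4, "C": 8, "R": 9},
    "example problem B": {"S": 5, "C": 8, "R": 6},
}

def rpn(metrics):
    """Relative Priority Number: the product of the three FMEA factors."""
    return metrics["S"] * metrics["C"] * metrics["R"]

# Rank problems with the highest RPN (highest priority) first.
ranked = sorted(problems.items(), key=lambda kv: rpn(kv[1]), reverse=True)
for name, m in ranked:
    print(name, rpn(m))
```

Because each factor is at most 10, the resulting RPN falls in [1…1000] as stated above.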

1. [3.2 (a), 3.3 (n), 3.4 (m)] In the referenced sections, several inter-course coordination and routine course maintenance issues have been identified, primarily through Prerequisite Exams and Course Outcome Verification forms. These issues should be attended to.

2. [3.2(a,c,e,k)] Over the past few years, the laboratory coordinator has made significant upgrades to the core laboratory courses with respect to the referenced outcomes. He should be provided whatever support is needed to continue and document this process.

3. [3.2(b,d,g)] This year, demonstration of the referenced outcomes in EE-3970 was not as rigorous as required by the PreCAP assessment rubrics. Examination of the EE-3970 Course Specification reveals that the required level of rigor is not clearly stated in the specification. This ambiguity should be corrected to provide future instructors with adequate guidance on these topics.

4. [3.2 (f,g)] In light of recent assessment data collected by the Director of General Education, it may now be credible to list UN-1001 (Perspectives) as making minor contributions to outcomes (f) and (g).

FMEA metrics for the problems above (matched by RPN in Table 6.1):
Problem 1: S=4, C=8, R=9, RPN = 288
Problem 2: S=5, C=8, R=6, RPN = 240
Problem 3: S=5, C=7, R=9, RPN = 315
Problem 4: S=5, C=7, R=9, RPN = 315


5. [2.2(2), 3.2(h)] While there have been significant improvements in assessment and demonstration of outcome (h), the UPC should continue its collaboration with the Director of General Education and the Social Sciences department in order to help maintain this trend.

6. [3.2(j)] While there have been significant improvements in assessment and demonstration of outcome (j), the UPC should continue its collaboration with the Director of General Education in order to help maintain this trend.

7. [2.2(3), 3.2 (k), 3.3(q), 4.2(2)] There is increasing evidence that the Engineering Common First Year is detrimental to our programs (especially for Computer Engineering students). Specifically, ENG-1101 and 1102 expend six credits on marginally relevant topics; those credits could be better spent on relevant topics.

8. [3.3 (p), 3.4(m)] The Mathematics department should be informed of the outstanding ranking achieved by courses MA-1160 and 2160 in the sophomore surveys.

9. [3.3 (q)] The Computer Science department should be informed of the outstanding ranking achieved by the course CS-1121 in the sophomore surveys.

10. [2(1), 3.3(p), 3.4 (l,m,n)] Mild dissatisfaction with various mathematical topics has been a minor problem for years, but was primarily considered an in-house problem. However, there now appears to be a drop in the quality of delivery for this outcome. A department-wide investigation, in collaboration with the Mathematics and Computer Science Departments would seem to be in order.

FMEA metrics for the problems above (matched by RPN in Table 6.1):
Problem 5: S=8, C=8, R=7, RPN = 448
Problem 6: S=8, C=8, R=7, RPN = 448
Problem 7: S=4, C=10, R=9, RPN = 360
Problem 8: S=1, C=7, R=9, RPN = 63
Problem 9: S=1, C=7, R=9, RPN = 63
Problem 10: S=8, C=8, R=9, RPN = 576


11. [4.2(1,5,6)] The UPAC SOP [1] should be edited

a. so that the academic advisor does not evaluate the advising surveys,

b. to correct the typo in Table 2 referenced in Sec 4.2(5),

c. to vest authority for programmatic decisions in the program faculty, rather than department faculty.

12. [2.2(1)] Regarding outcomes EE(n) Advanced Mathematics and EE(l) Probability and Statistics, the UPC has determined that these outcomes are covered in a variety of courses, but not accurately documented in the course specs.

6 Action Items

6.1 Definition of Action Items
This subsection defines action items to address each of the problem items listed in Section 5. Each action item is labeled with the problem item(s) it is intended to address.

1. [For problem 1] The UPC shall coordinate with the relevant lead/actual instructors to address the following inter-course coordination and routine course maintenance issues, and document their resolutions:

a. Low PRE scores in communication theory and electromagnetics, due solely to low scores reported by EE-4255,

b. Low PRE scores in Photonics, due solely to low scores reported by EE-3190,

c. The lead instructors for the photonics courses (e.g. EE 3190 and 3291) shall review and redistribute the material between the courses, and revise the relevant course specs,

d. The lead instructors for EE 4221 and 4222 shall review and redistribute their material and revise the relevant course specs,

e. The time-domain and z-domain topics mentioned by the EE-3160 instructor,

f. Whether EE-3120 or EE-3160 shall be added as a prerequisite to EE-3221,

g. Add outcome CpE(n) to the Course Specs for EE-2303, 2304, 3130, 3160, and 3305,

h. Consider adding outcome EE(m) to the course spec for EE-3130.

2. [For problem 2] The UPC shall provide the laboratory coordinator whatever assistance and support he may need to continue the laboratory upgrade process currently in progress, with respect to the abilities to (a) apply engineering topics, (c) design a system or component, (e) identify and formulate engineering

FMEA metrics (matched by RPN in Table 6.1):
Problem 11: S=6, C=7, R=9, RPN = 378
Problem 12: S=6, C=6, R=9, RPN = 324


problems, and (k) use appropriate tools and techniques of the profession. The UPC shall also assist in documenting these results.

3. [For problem 3] The lead instructor for EE-3970 shall edit the course specification such that it clearly articulates the level of rigor required by the PreCAP assessment rubrics regarding (b) experimental design and analysis, (d) teaming skills, and (g) technical writing skills.

4. [For problem 4] The UPC shall determine whether to list UN-1001 (Perspectives) as making minor contributions to outcomes (f) and/or (g), and update the UPAC SOP tables 1 and 2 accordingly.

5. [For problem 5] Regarding outcome (h), the UPC shall

a. Congratulate the Director of General Education on the significant improvements in its assessment and demonstration,

b. Continue to offer whatever support and assistance the Director of General Education may require in order to help maintain this trend,

c. Address with the General Education Council the apparent inability of UN-1002 (World Cultures) to contribute to this outcome, and raise the question of whether continued existence of this course is justified,

d. Continue its collaboration with the Social Sciences department in order to help improve delivery of this outcome.

6. [For problem 6] Regarding outcome (j), the UPC shall

a. Congratulate the Director of General Education on the significant improvements in its assessment and demonstration,

b. Continue to offer whatever support and assistance the Director of General Education may require in order to help maintain this trend,

c. Address with the General Education Council the apparent inability of UN-1002 (World Cultures) and UN-2002 (Institutions) to contribute to this outcome, and raise the question of whether continued existence of these courses is justified.

7. [For problem 7] There is increasing evidence that the Engineering Common First Year is detrimental to our programs (especially for Computer Engineering students). Therefore, the UPC shall update and re-propose the alternative first year program rejected by the College of Engineering last year. In the meantime, the department shall continue to advise prospective freshmen to enroll in Computer Science for the first year, whenever that option seems more appropriate to the individual student.


8. [For problem 8] The UPC shall inform the Mathematics department of the outstanding ranking achieved by courses MA-1160 and 2160 in the sophomore surveys.

9. [For problem 9] The UPC shall inform the Computer Science department of the outstanding ranking achieved by the course CS-1121 in the sophomore surveys.

10. [For problem 10] Mild dissatisfaction with various mathematical topics has been a minor problem for years, but was primarily considered an in-house problem. However, there now appears to be a drop in the quality of delivery for this outcome. Therefore, the UPC shall:

a. In collaboration with the Mathematics and Computer Science Departments, conduct a major study of the entire spectrum of mathematical topics presented across the EE and CpE curricula by all three departments,

b. Propose, implement, and document the changes necessary to improve delivery of outcomes CpE(p) and EE(l, m, and n).

11. [For problem 11] The UPC shall edit the UPAC SOP [1]:

a. so that the academic advisor does not evaluate the advising surveys,

b. to correct the typo in Table 2 referenced in Sec 4.2(5),

c. to vest authority for programmatic decisions in the program faculty, rather than department faculty.

12. [For Problem 12] Carryover Item: The UPC shall coordinate as all instructors examine their course content and document delivery of outcomes EE(l) and EE(n) in their course specs; the UPC shall then update Tables 1 and 2 of the UPAC SOP[1] as needed.

6.2 Execution of Action Items
Table 6.1 lists all action items, the problems they address, and their Relative Priority Numbers (RPNs). The value of an action item's RPN is defined as the largest value of the RPNs for all problem items that it addresses.

It is hereby mandated that all action items with an RPN > 215 are required actions that shall be executed in accordance with the deadlines listed in Table 6.1. Remaining action items are optional and may be executed at the AC’s discretion if time and resources permit; thus these items have no assigned deadlines.

All deliverables listed in action items shall be delivered to the AC.
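The selection rule above (an action item's RPN is the maximum over the RPNs of the problems it addresses; items with RPN > 215 are required, the rest optional) can be sketched as follows, using a small illustrative subset of the Section 5 RPN values:

```python
# Sketch of the Section 6.2 selection rule. An action item's RPN is the
# maximum RPN over the problem items it addresses; items with RPN > 215
# are required, the rest optional. RPN values from Section 5; the
# action-to-problem mapping here is an illustrative one-to-one subset.
problem_rpn = {1: 288, 8: 63, 9: 63, 10: 576}
actions = {1: [1], 8: [8], 9: [9], 10: [10]}  # action -> problems addressed

def action_rpn(problem_items):
    return max(problem_rpn[p] for p in problem_items)

# Required items, sorted with the highest RPN first (as in Table 6.1).
required = sorted((a for a, ps in actions.items() if action_rpn(ps) > 215),
                  key=lambda a: action_rpn(actions[a]), reverse=True)
optional = [a for a, ps in actions.items() if action_rpn(ps) <= 215]
print(required, optional)
```

With these values, actions 10 and 1 come out required and actions 8 and 9 optional, matching the "optional" rows at the bottom of Table 6.1.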


Table 6.1: Summary of Action Items, RPNs, and Deadlines (sorted by action item RPN)

Action Item (Sec 6.1) | Problem Item(s) (Sec 5) | Action RPN (max problem RPN) | Execution Responsibility | Monitoring Responsibility | Deadline (Sem / Week)
10 | 10 | 576 | UPC             | DOW | spr / 14
 5 |  5 | 448 | RMK             | RMK | fall / 09
 6 |  6 | 448 | RMK             | RMK | fall / 14
11 | 11 | 378 | RMK             | RMK | fall / 09
 7 |  7 | 360 | UPC             | LJB | fall / 09
12 | 12 | 324 | UPC & lead inst | DOW | spr / 14
 3 |  3 | 315 | RMK             | RMK | fall / 14
 4 |  4 | 315 | UPC             | RMK | spr / 14
 1 |  1 | 288 | UPC & lead inst | RMK | fall / 09
 2 |  2 | 240 | UPC & GEA       | LJB | spr / 14
 8 |  8 |  63 | RMK             | RMK | optional
 9 |  9 |  63 | RMK             | RMK | optional


7 Assessment Summary

7.1 Coverage of Assessment Process
The relevance of each instrument to each program outcome is listed in Tables 7.1 and 7.2 for CpE and EE, respectively. These tables can be used to evaluate how well each outcome is covered by the assessment instruments (right-hand column) and the utilization or productivity of each instrument for outcomes assessment (bottom row).

7.1.1 Computer Engineering
Table 7.1 shows good multiple-instrument coverage of all outcomes except (r) depth of knowledge..., which is verified by the senior Degree Audit forms. The table also shows that the permanent instruments all cover more than one outcome.

7.1.2 Electrical Engineering
Table 7.2 shows good multiple-instrument coverage of all outcomes. The table also shows that most of the permanent instruments cover more than one outcome.


Table 7.1: Instrument-to-Outcome Coverage – CpE

Instruments (columns), with the appendix documenting each in parentheses: Prereq Review Exams (A), Sr Des Sponsor Assess (B.1-2), Sr Des Advisor Assess (B.3), Undergrad Advisor's Report (C), Course Outcome Verification (D), Sophomore Surveys (E.1), Senior Exit Surveys (E.2), Alumni Surveys (E.3), Fund of Engineering Exam (F), CpE PreCAP (G), UAC Outcomes Assess (H), Ad Hoc: Extern. Review Report (I.1), Ad Hoc: Gen Ed Report (I.2), Ad Hoc: Enterprise Report (I.3), Ad Hoc: Eng Fund Crses Report (I.4).

Coverage total per CpE outcome (right-hand column): a 8, b 4, c 7, d 6, e 5, f 4, g 7, h 4, i 2, j 3, k 10, l 7, m 3, n 3, o 2, p 4, q 4, r 0, s 1.

Instrument utilization (bottom row, in the instrument order above): 5, 14, 3, 8, 2, 5, 12, 14, 3, 6, 2, 0, 4, 5, 1.


Table 7.2: Instrument-to-Outcome Mapping – EE

Instruments (columns), with the appendix documenting each in parentheses: Prereq Review Exams (A), Sr Des Sponsor Assess (B.1-2), Sr Des Advisor Assess (B.3), Undergrad Advisor's Report (C), Course Outcome Verification (D), Sophomore Surveys (E.1), Senior Exit Surveys (E.2), Alumni Surveys (E.3), Fund of Engineering Exam (F), CpE PreCAP (G), UAC Outcomes Assess (H), Ad Hoc: Extern. Review Report (I.1), Ad Hoc: Gen Ed Report (I.2), Ad Hoc: Enterprise Report (I.3), Ad Hoc: Eng Fund Crses Report (I.4).

Coverage total per EE outcome (right-hand column): a 8, b 2, c 7, d 5, e 5, f 4, g 6, h 4, i 2, j 3, k 8, l 4, m 4, n 3.

Instrument utilization (bottom row, in the instrument order above): 5, 12, 1, 0, 3, 4, 12, 13, 5, 0, 1, 0, 4, 4, 1.

7.2 Outcome and Process Rankings
Table 7.3 lists all outcomes for both majors, including the RPN of the most severe problem item relevant to each outcome. Overall, the table shows significant improvement in the worst-case RPNs from last year. Some new problems have arisen, but they are not nearly as severe as the problems from previous years. The table also reveals the most critical outcomes for this year:

1. RPN = 576 for outcomes CpE(p) and EE(l,m,n). These are the general outcomes for the abilities to apply various categories of mathematical knowledge. This problem is being addressed by a more formal and wide-ranging investigation than has been done in the past.

2. RPN = 448 for outcomes (h) broad general education... and (j) contemporary issues.... While there has been significant progress in assessing these outcomes and documenting their delivery, this year's action items require continued focus on this area.


3. RPN = 360 for outcomes (k) tools and techniques... and CpE(q) computer programming, etc. The primary problem is the chronic waste of six credits on first-year classes that contribute little or no relevant material, usurping credits that could be spent on relevant material.

Table 7.3: Outcome and Process RPN History

(a) Computer Eng'g

Outcome | Problem Item(s) | AY 01-02 Max RPN | AY 02-03 Max RPN | AY 03-04 Max RPN
a | 1,2  | 500 | 378 | 288
b | 3    | 630 | 378 | 315
c | 2    | 500 | 378 | 240
d | 3    | 900 | 441 | 315
e | 2    | 378 | 378 | 240
f | 4    |   0 |   0 | 315
g | 3,4  | 900 |   0 | 315
h | 5    | 567 | 648 | 448
i |      |   0 |   0 |   0
j | 6    | 900 | 810 | 448
k | 2,7  | 500 | 360 | 360
l |      | 378 |   0 |   0
m |      | 315 |   0 |   0
n | 1    |   0 |   0 | 288
o |      |   0 |   0 |   0
p | 8,10 | 315 | 378 | 576
q | 7,9  |   0 | 360 | 360
r |      |   0 |   0 |   0
s |      | 500 |   0 |   0
Max   |   | 900 | 810 | 576
Mean  |   | 404 | 260 | 252
StDev |   | 340 | 286 | 198

(b) Electrical Eng'g

Outcome | Problem Item(s) | AY 01-02 Max RPN | AY 02-03 Max RPN | AY 03-04 Max RPN
a | 1,2    | 500 | 378 | 288
b | 3      | 630 | 378 | 315
c | 2      | 500 | 378 | 240
d | 3      | 900 | 441 | 315
e | 2      | 378 | 378 | 240
f | 4      |   0 | 486 | 315
g | 3,4    | 900 | 378 | 315
h | 5      | 567 | 648 | 448
i |        |   0 | 378 |   0
j | 6      | 900 | 810 | 448
k | 2,7    | 500 | 360 | 360
l | 10,12  |   0 | 729 | 576
m | 1,8,10 | 500 | 378 | 576
n | 10,12  | 500 | 378 | 576
Max   |   | 900 | 810 | 576
Mean  |   | 513 | 495 | 379
StDev |   | 331 | 174 | 167

(c) Assessment Procs

Problem Item(s) 7,11 | AY 01-02: 900 | AY 02-03: 810 | AY 03-04: 378
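Each per-outcome entry in Table 7.3 is the worst-case (maximum) RPN over the Section 5 problem items mapped to that outcome. A minimal sketch of that lookup, using three CpE outcomes and the Section 5 RPN values:

```python
# AY 03-04 worst-case RPN per outcome: the maximum RPN over the
# Section 5 problem items mapped to that outcome. RPN values and
# outcome-to-problem mappings are taken from Section 5 / Table 7.3.
problem_rpn = {2: 240, 7: 360, 8: 63, 9: 63, 10: 576}
outcome_problems = {"k": [2, 7], "p": [8, 10], "q": [7, 9]}

worst = {outcome: max(problem_rpn[p] for p in problems)
         for outcome, problems in outcome_problems.items()}
print(worst)
```

For example, CpE(p) maps to problems 8 and 10 (RPNs 63 and 576), so its worst-case RPN is 576, as shown in the table.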


8 References

1. Undergraduate Program Assessment and Control (UPAC), Standard Operating Procedure, Dept of Electrical and Computer Engineering, Michigan Tech University, (latest revision), available through: http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

2. Annual Outcomes Assessment Report Covering Academic Year 2002-2003, Dept of Electrical and Computer Engineering, Michigan Tech University, Feb 4, 2002, available through: http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

3. D.O. Wiitanen, Problems 5, 16 in 02/20/04 Assessment Report, Dept of Electrical and Computer Engineering, Michigan Tech University, May 26, 2004 available through: http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

4. L.J. Bohmann, Delivery of Ethics in the Electrical Engineering Curriculum, Dept of Electrical and Computer Engineering, Michigan Tech University, Jun 30, 2004, available through: http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

5. L.J. Bohmann, Delivery Introductory Physics Labs, Dept of Electrical and Computer Engineering, Michigan Tech University, Jun 20, 2004, available through: http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

6. Summary of Assessment Activities for ENG1101/2, Fall 2000 through Spring 2004, Dept. of Engineering Fundamentals, Michigan Tech University, (undated), available through http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

7. Self Study Questionnaire, Engineering Accreditation Commission, Accreditation Board for Engineering and Technology, Rev Aug 7, 2002, available at: http://www.abet.org/info_prgs.html .

8. Mary Durfee, General Education Assessment Report for 2003/2004, Michigan Tech University, (undated), available through http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

9. Elizabeth Flynn, David Munson, and Karan Watson, Electrical and Computer Engineering Department Review March 31-April 2, 2004, (undated), available through http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

10. Enterprise Assessment Report, 2002-03 Academic Year, College of Engineering, Michigan Tech University, (undated), available through http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

11. Enterprise Assessment Report, 2003-04 Academic Year, College of Engineering, Michigan Tech University, (undated), available through http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

12. L.J. Bohmann, Report on mid-program writing assessment within EE3120 – Fall 2003, Dept of Electrical and Computer Engineering, Michigan Tech University, Jun 28, 2004, available through http://www.ece.mtu.edu/faculty/rmkieckh/abet/WWW/Assess1.html .

10/5/04 AppA-PRE.doc

A-1

A Prerequisite Review Exam (PRE) Report Forms
PRE Report Forms provided both numerical scores and open-ended written comments.

A.1 Numerical Scores
For the numerical scores, instructors rated their satisfaction with individual topics on a (0…4) scale (with zero being the worst). In particular, a score of 2 indicated "almost satisfactory" results.

A.1.1 Results by Topic
A spreadsheet of the PRE numerical scores for each topic is shown in Table A.3. The topics have been sorted into order of increasing mean score. Numerical results indicating the most serious need for attention are those scoring below 2.5. These can be categorized as follows.

1. One of the greatest concentrations of low scores is in specific course topics within the ECE Department: Communications Theory (1.7), Electromagnetics (2.0), BJTs, JFETs, MOSFETs, and their circuits (2.3), and Photonics theory (2.4).

2. A second area of concern can be categorized as Applied Mathematics topics: Probability and Statistics (2.0), Differential Equations (2.0), Signals and Systems (2.3), and Discrete Systems Analysis (2.4).

3. Also scoring below 2.5 were the topics of Computer Languages and Programming (2.3) and Combinational and Sequential Logic (2.4).
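The screening step described above (sort topic means in increasing order and flag those below the 2.5 attention threshold) can be sketched as follows, using the mean scores quoted in items 1-3:

```python
# Flag PRE topics whose mean satisfaction score (0..4 scale) falls
# below the 2.5 attention threshold, sorted by increasing mean as in
# Table A.3. Mean scores are those quoted in items 1-3 above.
topic_means = {
    "Communications Theory": 1.7,
    "Electromagnetics": 2.0,
    "Probability and Statistics": 2.0,
    "Differential Equations": 2.0,
    "BJT/JFET/MOSFET circuits": 2.3,
    "Signals and Systems": 2.3,
    "Computer Languages and Programming": 2.3,
    "Photonics Theory": 2.4,
    "Discrete Systems Analysis": 2.4,
    "Combinational and Sequential Logic": 2.4,
    "ac/dc Circuit Analysis": 2.6,  # above threshold; not flagged
}

THRESHOLD = 2.5
needs_attention = sorted(
    (topic for topic, mean in topic_means.items() if mean < THRESHOLD),
    key=topic_means.get,
)
print(needs_attention)
```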

A.1.2 Results by Outcome
PRE results were grouped by the outcomes affected by each reported topic. Table A.1 and Figure A.1 show these results for CpE outcomes, and Table A.2 and Figure A.2 do the same for EE outcomes. In some cases, judgment was used to assign a topic to a particular outcome. The important results are:

1. Outcome EE(l), “knowledge of probability and statistics”, scored 2.0. The lowest score came from the CpE core course EE3970.

2. Outcome (k), “Ability to use relevant techniques, skills, …” scored 2.3 for both majors due to the topic Signals and Systems; this was caused by a single low score of “1” reported in EE-3130, while both EE-3160 and EE-4261 reported a “3” for this topic.

3. Outcome CpE(q), “proficiency in computer programming” scored a 2.3, as reported in EE-3173 and EE-3970.

4. Outcome (a), “apply knowledge of math, science, and engineering…” scored 2.4 for both CpE and EE. The low average can be attributed primarily to the low scores on Electromagnetics and Communication Theory, which were due solely to low scores reported by EE-4255.

5. Outcome EE(n), "knowledge of advanced mathematics, ..." scored a 2.5, pulled down by a low score in "Differential Equations". While EE-2110 and 3160 reported very low scores on this topic, EE-4221 and 4261 reported very high scores.

6. Outcome CpE(p), "knowledge of discrete math, ..." scored a 2.5, pulled down by low scores in "Differential Equations" and "Probability and Statistics".


7. Outcome CpE(s), “breadth of knowledge spanning…” scored an acceptable 2.7, with the "Combinational and Sequential Logic" being the lowest component.

8. Outcome EE(m), "knowledge of mathematics through differential and integral calculus, ..." scored an acceptable 2.8.

A.2 Written Comments
The most notable of the instructors’ written comments are paraphrased below.

1. 2110 – 1/3 of the class had trouble with simple 1st order differential equations.

2. 2150 – Calculus is hardly required in this course and was not checked in the test topics.

3. 3130 – I am satisfied with the student's performance on this exam.

4. 3160 – Had problem finding phase from time delay.

5. 3173 – I see improvement over last semester, but sequential logic and pipeline operations are still weak.

6. 3175 – (1) Majority of students expressed confidence in computer architecture. (2) Majority of students had significant insecurity in Probability and Statistics, although most were able to handle it with self-directed review.

7. 3175 – (1) Several students commented that the terminology and notation used for Probability and Statistics were not the same as used in the Math courses, (2) Some students commented that they were familiar with MIPS architecture but not RISC, (3) Significant insecurity with Probability and Statistics yet they scored acceptable on the exam. The problem seems to be the context and notation from MATH to ECE.

8. 3221 – There is a mixture of Juniors and Seniors in this class. The Juniors seemed to do significantly worse (on ac circuits topics). I think this is because the Seniors have more experience.

9. 3970 – We should do something about the statistics prerequisite. (the students do poorly on the prereq exam but perform satisfactorily in class).

10. 4231 – I go over MOSFET physics again. On the whole, PRE was satisfactory.

11. 4232 – I have devoted a couple of lectures to addressing the weaknesses observed in the prerequisite test.

12. 4261 – Very good scores this year.

13. 4271 – I do review MOSFETs in this course. On the whole, a satisfactory performance!

14. 4441 – Most students were able to articulate the essentials relevant to the topics tested.

A.3 Summary
Based on both numerical and written results, there is an increase in satisfaction with students’ performance as compared to last year. The written comments seem to indicate a higher degree of satisfaction than the numerical results. In short, the following can be concluded:

1. The single biggest area of instructor dissatisfaction covers the following Electrical Engineering topics affecting outcome (a):


a. Communication theory and electromagnetics, due solely to low scores reported by EE-4255

b. Electronics was addressed this year, but not in time to show up on the PRE results (see Sec 2, item 5.a).

c. Photonics, due solely to low scores reported by EE-3190.

2. The single topic of “Computer Programming” negatively affected outcome CpE(q), as reported in EE-3173 and 3970.

3. The single topic of Boolean Algebra and Comb. Logic affected outcome CpE(s). This topic was addressed this year, but not in time to show up on the PRE results (see Sec. 2, item 5.c).

4. There is general dissatisfaction directed toward a variety of applied mathematics topics such as probability and statistics, differential equations, and discrete system analysis. These topics have a negative impact on CpE outcome (p) and EE outcomes (l) and (n). Two patterns have emerged:

a. There is a wide disagreement between instructors regarding preparation for some topics.

b. To date, problems in this area have been addressed by adjusting the applied mathematics portions of EE courses. It would now appear necessary to open a dialogue with the mathematics department concerning this issue.

Relevant Outcomes: CpE a, k, p, q, s EE a, k, l, m, n


Table A.1: Satisfaction Sorted by CpE Outcomes

Topic                       CpE Outcome   Number   Mean   Weighted Mean
Communication Theory             a           3      1.7        5.1
BJT, JFET, MOSFET & ckts         a           7      2.3       16.1
Photonics Theory                 a           9      2.4       21.6
ac/dc Circuit Analysis           a          16      2.6       41.6
Frequency response               a           2      3.0        6.0
Signals and Systems              k           3      2.3        6.9
Diff. Equations                  p           6      2.0       12.0
Probability & statistics         p           5      2.0       10.0
Discrete system analysis         p           8      2.4       19.2
Calculus                         p           3      2.7        8.1
Complex number/arithmetic        p           8      2.9       23.2
Linear Algebra & Equ'ns          p           3      3.0        9.0
Binary & Hex Arithmetic          p           2      3.0        6.0
Comptr lang & prog'ming          q           3      2.3        6.9
Comb. & Seq. Logic               s           5      2.4       12.0
Computer & proc. Arch.           s           5      3.0       15.0

CpE Outcome   CpE Net Mean
     a            2.4
     k            2.3
     p            2.5
     q            2.3
     s            2.7
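The Net Mean column appears to be the response-count-weighted average of the per-topic mean scores, i.e. Weighted Mean = Number × Mean and Net Mean = (sum of Weighted Means) / (sum of Numbers). A minimal sketch reproducing the outcome (a) value, with data literals copied from Table A.1:

```python
# Each entry: (topic, outcome, number of responses, mean satisfaction score),
# copied from the outcome (a) rows of Table A.1.
rows = [
    ("Communication Theory",     "a",  3, 1.7),
    ("BJT, JFET, MOSFET & ckts", "a",  7, 2.3),
    ("Photonics Theory",         "a",  9, 2.4),
    ("ac/dc Circuit Analysis",   "a", 16, 2.6),
    ("Frequency response",       "a",  2, 3.0),
]

# Weighted Mean column: number of responses times the mean score.
weighted = [n * m for _, _, n, m in rows]

# Net Mean for the outcome: total weighted score over total responses.
net_mean = sum(weighted) / sum(n for _, _, n, _ in rows)
print(round(net_mean, 1))  # 2.4, matching the CpE Net Mean for outcome (a)
```

The same computation reproduces the other Net Mean entries in Tables A.1 and A.2, which is why the weighting formula is inferred here rather than quoted from the report.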

Table A.2: Satisfaction Sorted by EE Outcomes

Topic                       EE Outcome   Number   Mean   Weighted Mean
Frequency response               a           2      3.0        6.0
Communication Theory             a           3      1.7        5.1
BJT, JFET, MOSFET & ckts         a           7      2.3       16.1
ac/dc Circuit Analysis           a          16      2.6       41.6
Photonics Theory                 a           9      2.4       21.6
Electromagnetics                 a           2      2.0        4.0
Power Xmission & Xform           a           6      2.5       15.0
Signals and Systems              k           3      2.3        6.9
Probability & statistics         l           5      2.0       10.0
Calculus                         m           3      2.7        8.1
Binary & Hex Arithmetic          m           2      3.0        6.0
Complex number/arithmetic        n           8      2.9       23.2
Diff. Equations                  n           6      2.0       12.0
Linear Algebra & Equ'ns          n           3      3.0        9.0
Discrete system analysis         n           8      2.4       19.2

EE Outcome   EE Net Mean
     a           2.4
     k           2.3
     l           2.0
     m           2.8
     n           2.5


Figure A.1: Satisfaction Sorted by CpE Outcomes (bar chart of net mean satisfaction scores for outcomes a, k, p, q, and s; vertical scale 0.0 to 3.0)

Figure A.2: Satisfaction Sorted by EE Outcomes (bar chart of net mean satisfaction scores for outcomes a, k, l, m, and n; vertical scale 0.0 to 3.0)

8/3/04 AppB-SrDes.doc

B-1

B Senior Design Assessment Report This appendix reports the results of the Senior Design Assessment Instruments, including the Faculty Advisors’ Assessments, Sponsors’ Written Report Evaluations, and Sponsors’ Oral Report Evaluations.

B.1 Sponsors’ Oral Report Evaluations Senior Design project sponsors were asked to rank several aspects of the teams’ final oral reports on a (1…4) scale (with 4 being the best). If a sponsor’s representative was unable to attend the presentations, the evaluation task was assigned to a member of the department’s External Advisory Committee (EAC). As shown in Table B.1, all aspects of both the visual and spoken portions of the presentations averaged well above 3.0.

Relevant Outcomes: CpE g EE g

B.2 Sponsors’ Written Report Evaluations Senior Design project sponsors were asked to rank several aspects of the teams’ final written reports on a (1…4) scale (with 4 being the best). The evaluation form covered three main areas: (1) written report style and quality, (2) design quality and deliverables, and (3) student skills and abilities displayed during the project.

B.2.1 Written Report Style and Quality As shown in Table B.2, the mean scores for all aspects of the written reports ranked well above 3.0.

Relevant Outcomes: CpE g EE g

B.2.2 Design Quality and Deliverables As shown in Table B.2, sponsors were generally satisfied with the results and expressed interest in participating again. However, the only question to receive less than a 3.0 mean score was “How well did the project deliverables achieve your overall goals?” This question received a mean score of 2.75.

Relevant Outcomes: CpE c, l EE c

B.2.3 Student Skills and Abilities As shown in Table B.2, sponsors rated student skills and abilities at well above 3.0 for outcomes (a), (c), (d), (e), and (k).

Relevant Outcomes: CpE a, c, d, e, k EE a, c, d, e, k


B.3 Faculty Advisors’ Assessments Senior design faculty advisors, in collaboration with the student team leaders, evaluated the importance of real-world considerations and the ability to use the techniques, tools, and standards of modern engineering practice.

Of the nine considerations listed, the majority were of non-trivial importance to the majority of teams. As shown in Table B.3, economic benefit to the sponsor and project budget limits were important to 100% of the teams, while sustainability, manufacturability, ethical issues, health or safety issues, and societal impacts, were of major concern to the majority of teams.

In addition, 71% of the teams were required to use at least one industry or government standard, and 94% needed to use commercially available hardware or software tools. Table B.4 shows the number of teams that listed having used a given number of standards or a given numbers of engineering tools.

Relevant Outcomes: CpE k, l EE k


Table B.1: Sponsors' Oral Report Evaluation Summary

Technical Content                                         Mean   Median
Introduction Section of the Presentation                  3.82     4
Main Body of the Presentation                             3.71     4
Conclusions Section of the Presentation                   3.25     3
Summary Opinion of Technical Content                      3.39     3

Visual and Oral Presentation                              Mean   Median
Overall Evaluation of the Visual Aids                     3.71     4
Overall Evaluation of the Speakers' Verbal Performances   3.36     3
Overall Evaluation of Team's Answers to Questions         3.31     3
Variability in Style & Quality Between Speakers           3.14     3
Summary Opinion of Oral and Visual Presentation           3.43     3.25

Table B.2: Sponsors' Written Report Evaluation Summary

Written Report Style and Quality                          Mean   Median
Executive Summary                                         3.50     3.5
Main Body of the Report                                   3.75     4
Conclusions and Recommendations                           3.25     3
Grammar, Spelling, and Punctuation                        3.75     4
Summary Opinion of Written Report Style and Quality       4.00     4

Design Quality and Deliverables                           Mean   Median
How well did the project deliverables achieve your original goals?                 2.75   2.5
Overall, how satisfied are you with the results of this senior design project?    3.50   3.5
Given another topic of mutual interest, would you like to participate again?      4.00   4

Student Skills and Abilities                              Mean   Median
Ability to partition a team project into tasks and lay out a project plan         3.25   3
Ability to execute a team project and produce required deliverables               3.25   3
Ability to apply knowledge of math, science, & engineering to the problem at hand 3.75   4
Ability to design a system, device, or program to meet desired needs              3.75   4
Ability to identify, formulate, and solve engineering problems                    3.25   3
Ability to use the techniques, skills, and modern engineering tools necessary for engineering practice   3.75   4


Table B.3: Importance of Real-World Considerations

Question Mean Score

% of Teams

1. How strongly were your team’s design decisions influenced by: 2.20 68

(a) environmental issues or concerns? 1.35 29

(b) the economic benefit of your project to your sponsor? 3.00 100

(c) the limitations of your project budget? 3.18 100

(d) the sustainability of the technology behind your project? 2.65 82

(e) the manufacturability of your product or results? 2.59 88

(f) ethical issues or concerns regarding your product or results? 2.00 71

(g) public health and/or safety issues regarding your product or results? 2.18 65

(h) potential social or societal impacts of your product or results? 1.53 53

(i) potential political impacts of your product or results? 1.29 24

2. How important was it to adhere to one or more industry or government standards (such as IEEE, SAE, or Mil Standards) in the execution of your project? (please list any such standards used) 2.41 65

3. How important was it to employ one or more commercially available hardware or software tools (such as Matlab, Simulink, or D-Space) in the execution of your project? (please list any such tools used) 3.59 94

Table B.4: Standards and Tools Employed

# of Standards Listed / Num of teams; # of Commercial Tools Listed / Num of teams
0 5 0 1 1 7 1 3 2 2 2 2 3 3 5 4 1 4 5 5 1 5 1 6 6 7 8 1

% of teams that listed standards: 70.6
% of teams that listed tools: 94.1

9/24/04 AppC-Advise.doc

C-1

C Undergraduate Advisor’s Assessment Report This appendix reports the undergraduate advisor’s assessment, and the results of the biennial Undergraduate Advising Survey. All tables and figures are collected at the end of this appendix.

C.1 Advisor’s Observations

C.1.1 Ease of Scheduling Courses: It appears that freshman students have difficulty scheduling courses around their cohort schedules. They have problems when trying to schedule their General Education courses (UN1001 and UN1002). They also have problems when trying to schedule CS1121 which is a required course for EEs and CpEs. They experience a number of time conflicts that are difficult for us to resolve since none of the courses are in the ECE Department. Data should be collected in the future to evaluate the number of problems that occur and how many students are affected.

CpE students have had a couple of problems scheduling due to a required CS course and required EE course being offered at conflicting times. The course schedulers for the ECE and CS Department are communicating so that this does not happen again.

C.1.2 Frequency of waivers and exceptions: Waivers and exceptions to degree requirements have been rare and do not significantly affect any outcomes. Specifically:

1. In the past we had a small number of EE and CpE students who had a problem meeting the General Education Distribution Course requirements. Students are required to take 15 credits of Distribution Courses of which at least 9 of these credits have to be upper division courses. Some students did not understand the requirement clearly and some exceptions had to be made in order for the students to graduate as planned. This problem has been addressed and the students are reminded of these requirements throughout their education. The only other exception involving General Education is one that was approved by the Director of General Education. She made an exception for a graduating senior and allowed them to work with her on an independent basis to meet their last Distribution Course requirement.

2. We give a minimal number of waivers for any EE courses. Over the past year we allowed one student to take MA3521 Differential Equations at the same time as EE2110 Circuits based on the approval of the instructor who felt the student could handle the courses as a coreq. This was a situation that arose because the student was given incorrect information on the course prerequisites. The other exception that was made was allowing a student to take PH2000 University Physics 2 as a coreq to EE3140 Electromagnetics when it is actually a prerequisite. The exception was made based on the approval of the EE3140 instructor so that the student’s graduation was not delayed because of one course.

3. We regularly waive ENG1101 and ENG1102 for transfer students and allow them to substitute engineering courses in place of them. This exception is made because the students will get more value out of a higher level engineering course as opposed to taking a freshman level course while they are upperclassmen. Prior to August 2003 transfer students were being allowed to substitute free electives for ENG1101 and ENG1102. After August 2003 transfer students were being allowed to substitute technical electives for these courses. After a decision by the ECE Undergraduate Program Committee in Spring 2004 we are now only allowing engineering courses to be substituted for ENG1101 and ENG1102.


Since these courses do not contribute to any program outcomes other than “Engineering Topics”, these waivers do not affect any outcomes.

4. For CpEs, EE-3173, 3175, and 3970 are prereqs. to the Senior Design sequence. In the past, some students were allowed to take one or more of these as coreqs. to Senior Design. These exceptions were largely due to after effects of the transition from quarters to semesters. No such exceptions were made during this academic year.

5. For EEs EE3305 is a prereq. to the Senior Design sequence. In a few instances, students were allowed to take this course as a coreq., but only with the concurrence of the Senior Design advisor.

C.1.3 Completion of ABET Criterion 4 1. All students are required to fulfill a major design experience. This is done via senior design or enterprise and we do not make any exceptions to this requirement.

2. There is no evidence that any students have completed less than 32 credits of mathematics and basic sciences.

3. There is no evidence that any student has completed less than 48 credits of engineering topics.

4. In the past we had a small number of EE and CpE students who had a problem meeting the General Education Distribution Course requirements. Students are required to take 15 credits of Distribution Courses, of which at least 9 credits have to be upper division courses. Some students who did not understand the requirement clearly took lower division courses where upper division courses were required. Since this confusion was largely due to the transition to semesters, some exceptions were made in order for the students to graduate as planned. This problem has been addressed and the students are reminded of these requirements throughout their education. The only other exception involving General Education is one that was approved by the Director of General Education. She made an exception for a graduating senior and allowed them to work with her on an independent basis to meet their last Distribution Course requirement.

C.2 Advising Surveys

C.2.1 Closed-Ended Questions A total of 315 advising surveys were completed by the students. Students were asked to rate the academic advising information available on the ECE Department Website on a 4-point scale of Strongly Disagree (1) to Strongly Agree (4) with respect to 5 statements:

1. Overall, I find the ECE Website useful for choosing my courses.

2. Degree requirements info on the ECE Website is easy to find.

3. Degree requirements info on the ECE Website is accurate and up-to-date.

4. The ECE Website has enough info that I feel confident choosing my own Tech Electives.

5. The ECE Website has enough info that I feel confident scheduling my own courses.


Students were also asked to rate the academic advice received from the Advisor and their experience when interacting with the Advisor on a 4-point scale of Strongly Disagree (1) to Strongly Agree (4) with respect to 8 statements:

1. Overall, I am pleased with this advisor’s performance on my behalf.

2. This advisor is readily available and accessible.

3. This advisor shows respect for my needs.

4. This advisor is helpful in dealing with any special needs I may have.

5. This advisor is helpful in scheduling my courses.

6. This advisor does a good job of explaining the ECE Department degree requirements.

7. This advisor does a good job of explaining the General Education degree requirements.

8. This advisor does a good job of explaining my options (e.g. Minors or Enterprise).

C.2.2 Open-ended Questions In addition, respondents were asked one open-ended question: “Please write down your comments or recommendations to help us improve the ECE Department’s academic advising practices.”

The written comments indicate that some students feel they cannot easily find information about their academic program requirements, and cannot always find up-to-date information, on the website.

C.2.3 Analysis Mean student responses to the questions about the web site are listed in Table C.1. Mean student responses to questions about the advisor are listed in Table C.2 and graphed in Figure C.1. Both open-ended and closed-ended questions revealed that:

1. Students do not find the department website useful for advising purposes,

2. Prior to August 2003, students were largely dissatisfied with the services of the academic advisor, as shown by a mean score of 2.0; however, starting in August 2003, students were generally very satisfied with the services of the advisor, as shown by a mean score of 3.41.

C.3 Summary and Recommendations Based on the above data, the following conclusions and recommendations can be drawn.

1. It is recommended that in the future the Advisor not be given the responsibility of evaluating the advising survey results. The Advisor should be given the evaluated results to use as a tool to improve or enhance his/her performance and advising services.

2. It appears that freshman students have difficulty scheduling courses around their cohort schedules; in particular UN1001, UN1002, and CS1121, all of which are required to be taken in the first year by both EEs and CpEs. At this time, no hard statistics on this problem are available.

3. Waivers and exceptions to degree requirements have been rare, and have been managed so as not to significantly affect any outcomes.

4. All students have completed the requirements of ABET criterion 4. In the past, a few students inadvertently took too many lower division Gen Ed courses instead of the required number of upper division courses. However, the general Gen Ed requirements were still met. This problem has been corrected.

5. Students do not find the department website useful for advising purposes.

6. Since August 2003, students are generally very satisfied with the services of the department’s academic advisor.

Relevant Outcomes: CpE l, m, n, o EE N/A

Undergraduate Advising Survey Data 2003-04

Table C.1: Website Evaluation

Statements                                                                                   Mean   Median
1a. Overall, I find the ECE website useful for choosing my courses                           2.36   2.00
1b. Degree requirements info on the ECE website is easy to find                              2.50   3.00
1c. Degree requirements info on the ECE website is accurate and up-to-date                   2.38   2.00
1d. The ECE website has enough info that I feel confident in choosing my own Tech Electives  2.06   2.00
1e. The ECE website has enough info that I feel confident scheduling my own courses          2.55   3.00
avg                                                                                          2.37   2.40
std                                                                                          0.19   0.55

Table C.2: Undergraduate Advising Survey Results

Statements                                                                               Prior to Aug. 2003   Current
a. Overall, I am pleased with the Advisor's performance on my behalf                           1.88            3.50
b. The Advisor is readily available and accessible                                             2.27            3.33
c. The Advisor shows respect for my needs                                                      2.00            3.51
d. The Advisor is helpful in dealing with any special needs I may have                         1.89            3.48
e. The Advisor is helpful in scheduling my courses                                             1.95            3.41
f. The Advisor does a good job of explaining the ECE Department degree requirements            2.17            3.43
g. The Advisor does a good job of explaining the General Education degree requirements         2.07            3.38
h. The Advisor does a good job of explaining my options (e.g. Minors and Enterprise)           1.80            3.21
avg                                                                                            2.00            3.41
std                                                                                            0.16            0.10

Figure C.1: Undergraduate Advisor Survey Results (bar chart of mean scores, 0.0 to 4.0, for statements a through h, comparing the advisor prior to Aug. 2003 with the advisor after Aug. 2003)

9/20/04 AppD-COVer.doc

D-1

D Course Outcome Verification Forms This appendix reports the results of the Course Outcome Verification (COVer) Instruments, submitted by each instructor at the end of each semester.

D.1 Numerical Results Numerical Results from the front of the COVer forms are summarized in Table D.1. These results indicate how strongly and how often students are required to demonstrate each outcome during regular lecture and laboratory EE courses. The results in Table D.1 cover only required core courses given by the ECE department. For courses taught in both semesters, only the lowest of the two scores is listed.

Table D.1 indicates weak coverage for several outcomes in both majors; however, several outcomes are covered by non-EE courses. Examination of the relevant Course Specifications indicates that a few of the specs are in need of updating because they do not list all outcomes served by the course.

Relevant Outcomes: CpE a, b, c, d, e, f, g, i, k, n, p, q, r, s EE a, b, c, e, g, k, l, m, n

D.2 Instructors’ Written Comments The following list summarizes instructors’ comments and the AC’s observations on COVer forms. Only those courses and comments indicating a need for UPC attention are listed.

1. EE-2110: (a) I believe power in ac circuits will be dropped; therefore there should be more room for Laplace and Bode. (b) With the decoupling of EE2110 from EE2160, the circuits course will no longer rely on Laplace from EE2160 and therefore will require paring down other topics.

2. EE-2150: Course Spec seems inconsistent with Table 1 and Table 2.

3. EE-2171: (a) We now use Verilog, not VHDL. (b) The book has changed to one showing Verilog design, with a later copyright.

4. EE-2303: The UPC should consider upgrading this to a 2 credit course.

5. EE-3120: Delete the topic "Overview of Power Systems"

6. EE-3160: Omission of discrete systems (time domain and z-domain) e.g. solution of difference equations, and z-transforms is a serious problem that needs to be addressed. No other course addresses these topics and they are, in my opinion, central to a modern EE curriculum.

7. EE-3190: Deemphasize the material on 4) Optics of the Eye and 6) Various Types of Interferometers

8. EE-3221: Think about making EE3120 or EE3160 a prerequisite. Pros: eases the ambitious schedule. Cons: cuts down on the students who could take the course (CpE and Photonics students do not have the prerequisites as required courses).

9. EE-3291: Having now taught all but the optical comm class in the photonics curriculum, I suggest that we sit down and re-assess the division of topics between classes and the choice of textbooks to see if we are delivering the material as well as we can.

10. EE-4221: implement the following catalog description: Complex power flow in circuits and the effects of real and reactive flow on a system: transformer and load representations in power systems; power transmission line parameters and steady-state operation of transmission lines; the per unit system; development of the bus admittance matrix; power flow. Credits: 3.0, Lec-Rec-Lab: (0-3-0), Semesters Offered: Fall, Prerequisites: EE3120.

11. EE-4222: implement the following catalog description: Topics covered include symmetrical components; symmetrical faults; unbalanced faults; generating the bus impedance matrix and using it in fault studies; power system protection; power system operation; power system stability. Credits: 3.0, Lec-Rec-Lab: (0-3-0) Semesters Offered: Spring, Prerequisites: EE4221.

12. EE-4231: Focus on topics 1-6 with brief introduction to topics 7 & 8.

13. EE-4250: Add "L" to outcomes.

14. EE-4252: Change the name to "Digital Signal Processing" to more closely reflect the true nature of the course and the topics covered.

15. EE-4272: "L" for CpE should not be listed.

D.3 Evaluator's Observations While compiling this data the following errors were found when comparing Table 1 and Table 2 of the SOP and the Course Specs. These should be fixed by Nov. 19, 2004.

1. The CpE outcome "n" is missing from and should be added to Course Specs for EE2303, EE2304, EE3130, EE3160, and EE3305.

2. The CpE outcome "L" should not be listed in the Course Spec for EE-4250.

3. The EE outcome "m" is missing and should be added to the Course Spec for EE3130

4. In Table 2 of the SOP for course EE3120, the "1" should be deleted from column "j" and should be added to column "k".

D.4 Summary Based on the above data, several recommendations and observations can be made, all of which involve routine course-spec maintenance.

1. The lead instructors for the photonics courses (e.g. EE 3190 and 3291) should review and redistribute the material between the courses, and revise the relevant course specs.

2. The lead instructors for EE 4221 and 4222 should review and redistribute their material and revise the relevant course specs.

3. The UPC shall address the time-domain and z-domain topics mentioned by the EE-3160 instructor.

4. The UPC shall consider whether EE-3120 or EE-3160 shall be added as a prerequisite to EE-3221.

5. In the UPAC SOP [1], Table 2, the “1” under outcome “j” should be moved to outcome “k”.

6. The following editorial revisions shall be made to the relevant course specs:

a. EE-2303, 2304, 3130, 3160, and 3305 – add CpE (n),

b. EE-3130 – add EE(m),

c. EE-4250 – delete CpE (l).


Table D.1: COVer Data - AY 2003-2004 CpE Core Courses
(Core courses reporting: EE 2110, 2150, 2171, 2303, 2304, 3130, 3160, 3173, 3175, 3305, 3970)

CpE Outcome   NUM   SUM   Scores reported
a              7     21   3 3 3 3 3 3 3
b              4     13   3 3 3 4
c              4     12   2 3 4 3
d              1      4   4
e              1      3   3
f              1      2   2
g              4     12   3 3 2 4
i              1      0   0
k              6     18   3 3 3 3 3 3
n              5     12   3 3 3 0 3
p              3      9   3 3 3
q              1      2   2
r              2      5   2 3
s              5     15   3 3 3 3 3

Table D.1: COVer Data - AY 2003-2004 EE Core Courses
(Core courses reporting: EE 2110, 2150, 2171, 2303, 2304, 3120, 3130, 3140, 3160, 3170, 3305, 3306)

EE Outcome   NUM   SUM   Scores reported
a             9     28   3 3 3 3 4 3 3 3 3
b             4     12   3 3 3 3
c             8     24   3 2 4 3 3 3 3 3
e             2      6   3 3
g             5     14   3 3 4 2 2
k             5     15   3 3 3 3 3
l             1      3   3
m             6     19   3 3 3 3 3 4
n             3      9   3 3 3
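The NUM and SUM columns of Table D.1 aggregate the per-course COVer scores: NUM counts the core courses reporting an outcome, and SUM totals the strength scores they reported. A minimal sketch; the assignment of scores to specific courses below is hypothetical (the flattened source table does not preserve it), but the totals match the EE (e) and (g) rows:

```python
# Hypothetical mapping of outcome -> {course: reported strength score}.
# The (g) scores 3 3 4 2 2 and (e) scores 3 3 come from Table D.1;
# which course reported which score is an assumption for illustration.
cover = {
    "e": {"EE3160": 3, "EE3305": 3},
    "g": {"EE2303": 3, "EE2304": 3, "EE3130": 4, "EE3305": 2, "EE3306": 2},
}

# NUM = number of courses reporting the outcome; SUM = total of their scores.
summary = {out: (len(scores), sum(scores.values()))
           for out, scores in cover.items()}
print(summary["g"])  # (5, 14), matching the NUM and SUM columns for EE (g)
```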

10/2/04 AppE-Surveys.doc

E-1

E Progressive Survey Reports This appendix reports the results of the Sophomore, Senior, and Alumni Surveys conducted during the subject academic year. All tables and figures are collected at the end of this appendix.

E.1 Sophomore Surveys

E.1.1 Closed-Ended Questions Students were asked to rate all required First Year courses on a 0-5 scale of Far Below Average (0) to Far Above Average (5) with respect to 3 questions:

1. Rank the relevance (R) of the material in each course below, for a student in your major.

2. Rank the amount of knowledge (K) you acquired in each course below, relative to its number of credits.

3. Rank the amount of work (W) required by each course below, relative to its number of credits.

The mean results of these questions are shown in Table E.1-0. These results were applied to obtain three metrics for each course:

• Productivity = R*(K/25): a measure of the amount of relevant knowledge gained, normalized to a maximum score of 1.0. Productivity results are shown in Table E.1-1 and Figure E.1-1.

• Efficiency = K/(W*5): a measure of the knowledge gained relative to the work invested, normalized to a maximum score of 1.0. Efficiency results are shown in Table E.1-2 and Figure E.1-2.

• Mean Rank: a measure of the relative value of the course. It is computed by sorting the Productivity and Efficiency scores in increasing order, assigning a rank to each ordering (1 = worst, 10 = best), and taking the mean of the two rankings. The CpE and EE Productivity and Efficiency scores were close, so we randomly chose to use the CpE scores to determine the Mean Rank. Mean Rank results are shown in Table E.1-3a and Figure E.1-3a for CpEs and in Table E.1-3b and Figure E.1-3b for EEs.
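The three metrics can be sketched as follows. The course names and R/K/W values here are hypothetical (not taken from the survey data), and with only three example courses the ranks run 1 to 3 rather than the report's 1 to 10:

```python
# Hypothetical per-course survey means: R (relevance), K (knowledge gained),
# and W (work required), each on the survey's 0-5 scale.
courses = {
    "MA1160":  {"R": 4.0, "K": 4.0, "W": 3.0},
    "ENG1101": {"R": 2.0, "K": 2.0, "W": 4.0},
    "CS1121":  {"R": 3.5, "K": 3.0, "W": 3.0},
}

def productivity(s):
    # Relevant knowledge gained, normalized so R = K = 5 gives 1.0.
    return s["R"] * s["K"] / 25.0

def efficiency(s):
    # Knowledge gained per unit of work, normalized to a maximum of 1.0.
    return s["K"] / (s["W"] * 5.0)

def mean_rank(scores):
    # Sort courses by each metric in increasing order (rank 1 = worst),
    # then average each course's position in the two orderings.
    by_prod = sorted(scores, key=lambda c: productivity(scores[c]))
    by_eff  = sorted(scores, key=lambda c: efficiency(scores[c]))
    return {c: (by_prod.index(c) + by_eff.index(c) + 2) / 2.0 for c in scores}

ranks = mean_rank(courses)
print(ranks)  # ENG1101 ranks lowest on both metrics in this example
```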

E.1.2 Open-ended Questions In addition, respondents were asked one open-ended question: “Please write down your comments or recommendations to help us improve the BS degree program in your major. Note: a few short, well-focused, highest-priority items are much more useful to us than a long laundry list of gripes, or a long rambling ‘flame’.”

In response to this question, students returned a total of 28 written comments (after deletion of illegible responses and personally offensive comments regarding individuals).

E.1.2.1 Engineering Fundamentals 1. ENG courses are worthless.

2. ENG courses should be improved, workload decreased.

3. ENG courses are focused too much away from EE majors.

4. ENG courses are completely worthless and do not apply to anything relevant in my major. Very poor!


5. Engineering II focused a lot of time on IDEAS; a ME program that I won’t use as an EE. I found that frustrating.

6. ENG1101/2 isn’t necessary.

7. Less busy work in the Eng classes.

8. Engineering fundamentals is designed for (and probably by) stupid people. Nothing in there is new to anyone with intelligence, and is just a lot of busy work.

9. Please, please, please do something about engineering fundamentals!

E.1.2.2 General Education 1. Perspectives should be eliminated.

2. UN classes are completely worthless.

3. World Cultures is the worst class MTU offers.

4. Less emphasis on UN courses.

5. Less world cultures, it appeared to be a waste of time.

6. The UN classes didn’t offer much for my major, but taking them has changed the way I look at the world. I am better off having taken them.

7. The UN courses are a waste of resources, they should be replaced with other ‘liberal arts’ classes.

E.1.2.3 Chemistry 1. Chemistry and Physics should be easier, too many people fail or get bad grades.

2. Chemistry wasn’t relevant for the most part.

E.1.2.4 Physics 1. Physics lab is useless – it’s stuff most people already know.

2. Physics is too hard.

E.1.2.5 Computer Science 1. Java is useless. The new EE3710 book assumes we know C.

2. CS should use Animator, confuses you having to stop using it.

3. They really need to change from Java as the intro to CS course. By learning something like C, we could have a useful tool. In addition the focus is on applets, which are utterly useless, instead of applications, which aren’t too bad.

E.1.2.6 Mathematics 1. No comments received on math courses.

E.1.2.7 EE laboratories 1. Re-write lab reports for EE2303/4.


E.1.2.8 General Comments 1. I like the organization of the ENG department, maybe reflect their style? I think weekly homework should be due and count toward your grade in every class.

2. Please restructure all of the first year courses as they were all fairly useless.

3. Having an EE course or lab the first year would help.

4. Specialize more based on majors! Seemed to learn a lot of unnecessary information.

E.1.3 Analysis

E.1.3.1 General Comments A few general comments were submitted. The main topics were requests for more major-specific courses in the first year.

Relevant Outcomes: none

E.1.3.2 EE Lab Courses While they were not a particular topic of the survey, 1 written comment was submitted on EE lab courses, requesting that the labs be rewritten.

Relevant Outcomes: CpE a EE a, k

E.1.3.3 MA-1160 and 2160 MA-1160 and MA2160 were ranked very highly in Productivity, Efficiency, and the overall ranking (8 and 9, respectively, from CpEs and 9 and 10, respectively, from EEs). We did not receive any written comments on either of these courses.

Relevant Outcomes: CpE p EE m

E.1.3.4 CS-1121 CS-1121 achieved a ranking of 10 from CpEs and 7 from EEs. The comments indicate that Java is not a useful language for CpE and EE majors; C would be much more useful.

Relevant Outcomes: CpE k, q EE k

E.1.3.5 ENG 1101 and 1102 ENG 1101 and 1102 received the lowest overall rankings (2 and 1, respectively) from CpEs and a more modest 5 and 4 from EEs. In addition, CpEs gave them the two lowest scores for efficiency, indicating that they provided “the least gain for the most pain”. In addition, these courses received more written comments than any other courses, all of which were negative. The overwhelming consensus of the written responses is the same as it has been for the previous two years: that these courses serve no purpose, have no relevancy to EE or CpE, and the work associated with these courses involves “busy work” and is more applicable to mechanical engineering students.

10/2/04 AppE-Surveys.doc

E-4

As listed in Tables 1 and 2 of reference [1], these courses do not contribute to any outcomes for either the CpE or EE program (a conclusion reinforced by the previous two years’ outcomes assessments). As discussed in last year’s annual report, their chronic failure to deliver meaningful education can no longer be considered a minor inconvenience. In the bigger picture, the credit hours consumed by these courses work to the detriment of outcomes that could be better served if the courses were replaced with relevant ones. In particular, there is no room for expansion or improved coverage in the CpE curriculum because all 128 credits are spoken for.

Relevant Outcomes: CpE: q, k; EE: k

E.1.3.6 UN 1001 and 1002 UN 1001 and 1002 received the two lowest rankings (2 and 1, respectively) from EEs, and more modest rankings of 5 and 5 from CpEs. Both majors gave UN-1002 their lowest ranking in Productivity, while EEs also gave it their lowest ranking in Efficiency. Of the 7 written comments directed at these courses, most labeled them as worthless and a “waste of time”. It must be concluded that at least UN-1002 is not serving its assigned purpose in the curriculum. This has been a chronic problem with this course for the last two years.

Relevant Outcomes: CpE: h, j; EE: h, j

E.1.3.7 CH 1100 Numerical scores and rankings for CH-1100 were slightly below average. The comments cited lack of relevance to the majors.

Relevant Outcomes: CpE: m; EE: m

E.1.3.8 PH-1100 and 2100 PH-1100 and 2100 were ranked 3.5 and 7, respectively, by CpEs and 5 and 8, respectively, by EEs. This indicates that students found the lab to be much less useful than the lectures. The few written comments on these courses indicate that the courses are too hard and that PH-1100 is useless.

Relevant Outcomes: CpE: m; EE: m

E.2 Senior Exit Surveys Senior Surveys were administered to outgoing CpE and EE seniors during the 15th week of the Spring semester. The Survey asked respondents to rate their confidence in their abilities, as expressed by degree-specific statements relevant to the program outcomes. Each ability was rated on a 1-5 scale (with 5 being best). Some outcomes were examined by more than one question.

E.2.1 Numerical Answers Numerical results are shown in Table E.2-1 and Figure E.2-1 for CpEs and in Table E.2-2 and Figure E.2-2 for EEs.


Table E.2-3 and Figure E.2-3 show the differences in mean scores between EE and CpE responses. On this scale, a negative value indicates that CpEs were more confident, while a positive value indicates that EEs were more confident.

Tables and Figures E.2-4 through E.2-5 indicate continuing education plans.

E.2.2 Open-Ended Questions In response to the open-ended request for comments, comments focused on the laboratories formed the single largest category. In addition, the balance of “Theory vs. Practice” and the first-year ENG courses received focused comments. The remaining comments were distributed over several topics with no clearly discernible pattern.

E.2.2.1 Laboratories 1. They should improve the labs and make better lab teaching courses.

2. Totally reconstruct the labs. Have labs that are actually used to construct a common product and show how it applies to industry.

3. Invest more money and time to develop the labs and their equipment.

4. Improve the labs with better written lab assignments and better trained TAs.

5. I think more hands-on lab type learning would be helpful in preparing students for their careers.

6. EE labs are a lot of work for only 1 credit.

7. Lab hardware needs more work.

8. Not enough hands on experience in labs. Building circuits, understanding of theory in a practical sense.

9. Labs should give the students more exposure into using measuring equipment such as oscilloscopes, function generators, spectrum analyzers, as well as basic soldering.

E.2.2.2 Theory vs. Practice 1. Improve practical application of EE through more hands-on courses.

2. More design exposure throughout EE classes. Once I got an internship I realized I didn’t know how to apply anything I learned here.

3. It would be useful to learn more about the applications of what we learn in each class.

4. Have classes that are program based and how they interact with the software, instead of just concepts of everything.

5. Real-world problems as examples in class. In theory, theory should suffice. In practice, theory fails miserably.

6. More work with practical electronics, like building different circuits, component choices, etc.

7. Get back to a practical approach to engineering that is supplemental and driven by theory.

E.2.2.3 First Year ENG Courses 1. ENG1101/1102 focused too much on mechanical engineering. I knew I wanted to be an EE, so I felt this class to be a relatively big waste of time.


2. Drop General Engineering or add relevant topics to the general engineering curriculum (ENG1101 & ENG1102).

3. Gen. Eng. was worthless for EE and CpE.

E.2.2.4 Other Comments 1. Senior needs more tools.

2. Teach C++ to EEs. It is almost a necessary skill to learn this programming language.

3. Make students more aware of the concentrations available within electrical engineering.

4. Require CpE to take EMag, power, statics, & thermo.

5. Make sure everyone takes CpE prof development courses.

6. Higher quality equipment… especially for Senior Design Teams.

7. Teach more computer engineering classes, instead of shoving us off to the CS department who doesn’t care about CpEs. Offer more tech electives.

8. Teachers who have a firm grasp of English tend to be easier to learn from.

9. Go back to quarters!! Your curriculum has been squashed, you spend too much time on banal, unrelated topics, and students can’t get as deep as they’d like to in their major concentration.

10. Could use projects that have enough financial support to avoid financial crunch in the middle of a project.

11. Get better libraries/compilers for C++ and Java so CpE can develop code in the labs, or all CpE’s should have degree long accounts in CS Dept. servers.

E.2.3 Analysis Numerical results show that the majority of the outcomes received a mean score above 4.0 on a [1...5] scale (where 5 is best). In fact, all outcomes received a median score of 4.0 or higher, with the single exception of outcome (j). Thus, overall, the majority of students feel “fairly confident” or “very confident” in their abilities in all outcomes.

E.2.3.1 Significant Numerical Results:

Figures E.2-1 and E.2-2 show that the responses falling below, or near, one standard deviation (-1σ) under the mean involved:

(h) “understand the impact of my engineering solutions in a global or societal context” received the lowest ranking for CpEs and the second-lowest ranking for EEs.

(c) “design an (electrical or computer) system, component, or process to meet desired needs” received the lowest ranking for EEs and the second lowest ranking for CpEs.

(e) “identify, formulate, & solve (electrical or computer) engineering problems” received the third lowest ranking for CpEs, and ranked about average for EEs.
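The flagging rule used here (mean scores falling more than one standard deviation below the overall mean) can be sketched as follows. The scores are the CpE means from Table E.2-1; the helper function is an illustrative reconstruction, not the actual analysis spreadsheet. Note that Table E.2-1's Mn-Sig value (4.23) matches the sample standard deviation.

```python
# Flag outcomes whose mean confidence score falls below (mean - 1 sigma).
# Scores are the CpE means from Table E.2-1; the flagging helper is an
# illustrative reconstruction, not the department's actual spreadsheet.
from statistics import mean, stdev

cpe_means = {
    "h": 4.08, "c": 4.17, "e": 4.33, "j2": 4.33, "g1": 4.33, "g2": 4.33,
    "k": 4.42, "b": 4.42, "f2": 4.50, "j1": 4.58, "f1": 4.58, "a": 4.58,
    "d": 4.83,
}

def flag_low(scores: dict) -> list:
    """Return the outcome IDs scoring below (mean - one std dev)."""
    mu = mean(scores.values())
    sigma = stdev(scores.values())  # sample std dev, matching Mn-Sig = 4.23
    return [k for k, v in scores.items() if v < mu - sigma]

print(flag_low(cpe_means))  # ['h', 'c']: the two lowest-ranked CpE outcomes
```

Outcome (e), at 4.33, sits just above the mean − 1σ threshold, which is why the text describes it as "near" rather than below the cutoff.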

Relevant Outcomes: CpE: c, e, h; EE: c, e, h


E.2.3.2 Significant Numerical Differences:

Table E.2-3 shows that EEs felt much less confident (a difference at least 1σ beyond the mean difference) regarding:

(a) “apply knowledge of math, science, and engineering to real-world problems”.

(f2) “respond to a professional ethical dilemma in accordance with the IEEE Code of Conduct”.

CpEs felt more confident than EEs in every category, indicating that they are fairly to very confident across the board.

Relevant Outcomes: CpE: none; EE: a, f2

E.2.3.3 Continuing Education As shown in Tables E.2-4 and E.2-5, over 79% of seniors in both majors either strongly or mildly expect to take continuing education courses throughout their careers. In addition, 33% of EEs, but only 9% of CpEs either strongly or mildly plan to seek an advanced degree.

Although the number of seniors planning to immediately enter graduate school has declined since last year, these results still show a strong appreciation of the need for life-long education.

Relevant Outcomes: CpE: i; EE: i

E.2.3.4 Significant Open-Ended Responses Two topics received the bulk of the open-ended comments:

1. The open-ended responses of seniors were critical of the laboratory experience. The laboratories were considered to be poorly organized, poorly equipped, and lacking in design-related hands-on experiences. Other comments indicated that the students would like to have more hands-on courses and learn more about the practical applications of what they learn in class. Since this class of seniors took the core laboratory courses, significant improvements have been made, and are still being made, that address the types of complaints voiced in this instrument.

2. Some seniors complained that the freshman engineering courses ENG-1101 and ENG-1102 were “worthless”, “a big waste of time”, and should be dropped.

Relevant Outcomes: CpE: a, c, k; EE: a, c, k

E.3 Alumni Surveys The AY 2002-2003 annual report mandated that the department drop the College of Engineering alumni survey in favor of a more focused and more timely department Alumni Outcomes Survey targeted at alumni about one year after graduation. This survey was mailed out to both CpE and EE alumni from the class of 2003. The survey administration details for the two majors are:

• EE: 12 returned of 73 mailed out for a return rate of 16%.


• CpE: 4 returned of 15 mailed out for a 27% return rate. In addition, all 2003 graduates were transfer students who entered the CpE program after its establishment in fall of 2000. Therefore, CpE survey results must be used with caution.

E.3.1 Numerical Answers For a list of skills mapping directly to program outcomes, alumni were asked to rank both the importance of that skill to their current job, and the quality of preparation in that skill provided by their Michigan Tech education. Both metrics were ranked on a [0...4] scale, with 4 meaning best prepared or most important.

A correlation analysis of importance vs. preparation was run for each major, with the result that neither major showed a strong correlation between the two metrics. Therefore, a residual score for each skill was computed simply as the difference (Preparation – Importance). The resulting residuals, sorted by increasing importance, are shown for EE and CpE in Tables E.3-1 and E.3-2, and graphed in Figures E.3-1 and E.3-2, respectively.
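The correlation-then-residual procedure described above can be sketched as follows. The five data points are EE rows taken from Table E.3-1; the `pearson` helper and variable names are illustrative, not the department's actual analysis tool, and the table's own residuals were computed from unrounded data.

```python
# Alumni survey residual sketch: correlate importance vs. preparation,
# then compute per-skill residuals (Preparation - Importance).
# Data: a subset of EE rows from Table E.3-1; names are illustrative.
from math import sqrt

skills = {
    # skill: (preparation, importance), both on a [0...4] scale
    "i1": (3.00, 1.60),
    "g1": (2.75, 2.04),
    "a1": (3.08, 2.83),
    "c1": (2.08, 3.17),
    "k2": (2.83, 3.58),
}

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

prep = [p for p, _ in skills.values()]
imp = [i for _, i in skills.values()]
r = pearson(prep, imp)  # weak on the full EE data set (0.48)

# Since the correlation is weak, fall back to simple residuals:
residuals = {s: round(p - i, 2) for s, (p, i) in skills.items()}
print(residuals["c1"])  # -1.09 (Table E.3-1 lists -1.083 from unrounded data)
```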

E.3.1.1 Importance Rankings Figures E.3-1 and E.3-2 are sorted in order of increasing importance to their respective majors. The most noteworthy results regarding importance are:

1. Both majors ranked abilities (i1) succeed in graduate-level course work and (i2) conduct independent graduate-level research as the two skills of lowest importance. Accordingly, these abilities received the two highest residuals, indicating alumni felt over-prepared for graduate studies. This result makes sense for recent graduates getting started in industry.

2. Both majors ranked ability CpE (q2) write and debug assembler language programs as third lowest in importance. However, CpEs ranked (q1) write & debug High-level Language programs as the fourth most important item.

3. Both majors ranked the three elements of outcome (d), regarding team productivity, as high in importance. Specifically: (d1) Function as an effective member of a multi-disciplinary team, (d2) Partition a project into tasks & lay out a project plan, and (d3) Execute a project to completion & produce the required deliverables.

4. Both majors ranked ability (a3) apply knowledge of engineering topics to real-world problems as high in importance, although they ranked the two associated skills (a1) apply knowledge of mathematics and (a2) apply knowledge of physical sciences as considerably less important.

5. EEs ranked ability (k2) utilize modern software tools & techniques of {electrical or computer} engineering as the single most important skill, while ranking ability (k1) utilize modern hardware tools & instruments of {electrical or computer} engineering as considerably less important. By contrast, CpEs ranked both of these skills closer to average.

6. CpEs ranked ability (c1) design an {electrical or computer} system, component, or process to meet desired needs as the third most important item. EEs ranked this skill as much closer to average in importance.

E.3.1.2 Residual Analysis The residuals (Preparation – Importance) are graphed in Figures E.3-1 and E.3-2. In this context, a perfectly tuned curriculum would yield residuals near zero for all outcomes. A positive residual indicates over-preparation relative to importance, while a highly negative number indicates under-preparation, and is a cause for concern. The most significant results are:

1. For both majors, ability (c1) design an {electrical or computer} system, component, or process to meet desired needs received the most negative residual value.

2. For both majors, ability (a3) apply knowledge of engineering topics to real-world problems received the second most negative residuals, while the two associated skills (a1) apply knowledge of mathematics and (a2) apply knowledge of physical sciences fared considerably better.

3. For both majors, none of the three elements of outcome (d) regarding team productivity received positive residuals.

4. For EEs, abilities (k1) utilize modern hardware tools & instruments of {electrical or computer} engineering and (k2) utilize modern software tools & techniques of {electrical or computer} engineering received the third most negative residuals.

5. For CpEs, ability (q1) write & debug High-level Language programs received the third most negative residual.

E.3.1.3 Comparison Between Majors Table E.3-3 and Figure E.3-3 show the differences in the absolute preparation scores for both majors (EE preparation – CpE Preparation). In this context a positive number indicates that EEs felt better prepared than CpEs on a particular ability. The figure shows that for the majority of outcomes, CpEs felt better prepared than EEs. Most significantly:

1. EEs felt significantly less prepared for ability (d3) Execute a project to completion & produce the required deliverables than CpEs.

2. CpEs felt less prepared for both components of outcome (f) professional ethics.

E.3.2 Written Comments Only one written comment was received. Since it came from a CpE student, it has been referred to the Associate Chair for Computer Engineering.

E.3.3 Analysis The following are the most significant conclusions drawn from the Alumni Outcomes Surveys.

1. Outcome (a): ability (a3) apply knowledge of engineering topics to real-world problems was ranked as highly important by both majors, but received the second most negative residuals. By contrast, the two associated abilities (a1) apply knowledge of mathematics and (a2) apply knowledge of physical sciences fared considerably better.

2. Outcome (c): design an {electrical or computer} system, component, or process to meet desired needs received the most negative residual values from both majors, despite ranking as the third most important ability for CpEs.

3. Outcome (d): all three components for the ability to function in a multidisciplinary team received mildly non-positive residuals, despite their relatively high importance. In particular, for ability (d3) Execute a project to completion & produce the required deliverables EEs felt significantly less prepared than CpEs.

4. Outcome (f): professional ethics showed CpEs considerably less confident than EEs. However, the CpE class of 2003 took the one-credit EE-3900, rather than the 2-credit EE-3970.


5. Outcome (g): written and oral communication, yielded residuals near zero. From a historical standpoint, this is a significant improvement.

6. Outcome CpE(q): both majors ranked ability (q2) write and debug assembler language programs as third lowest in importance. However, CpEs gave (q1) write & debug High-level Language programs the third most negative residual, despite ranking it as the fourth most important item.

Relevant Outcomes: CpE: a, c, d, f, g, q; EE: a, c, d, f, g

Sophomore Survey Data 2003-04

Table E.1-0: Mean Quantitative Scores (R = Relevance to Major, K = Knowledge Gained, W = Work Expended)

                R (Relevance)        K (Knowledge)        W (Work)
Course       CpE    EE   Total    CpE    EE   Total    CpE    EE   Total
Respond.      23    75    98       23    75    98       23    75    98
CH-1100     2.25  2.19  2.20     2.62  2.91  2.84     3.50  3.86  3.78
CS-1121     4.33  3.44  3.65     3.78  3.52  3.58     3.33  3.62  3.55
EN-1101     2.50  3.04  2.91     2.30  3.04  2.87     4.20  3.62  3.76
EN-1102     2.30  3.46  3.19     2.40  3.00  2.86     4.30  3.79  3.91
MA-1160     3.56  4.15  4.01     3.44  3.92  3.81     3.22  3.25  3.24
MA-2160     3.62  4.12  4.00     3.44  4.04  3.90     3.33  3.42  3.40
PH-1100     2.40  2.92  2.80     2.40  2.79  2.70     3.00  2.92  2.94
PH-2100     2.80  3.37  3.24     3.20  3.70  3.58     3.40  3.57  3.53
UN-1001     2.20  2.07  2.10     2.80  2.31  2.43     3.40  3.16  3.22
UN-1002     2.00  1.88  1.91     2.70  2.16  2.29     3.20  3.27  3.25
Mean        2.80  3.06  3.00     2.91  3.14  3.08     3.49  3.45  3.46
Median      2.45  3.21  3.05     2.75  3.02  2.86     3.37  3.50  3.46

Table E.1-1: Productivity = R*(K/25)

Course      CpE   EE    Mean
CH-1100     0.24  0.23  0.36
CS-1121     0.65  0.52  0.36
EN-1101     0.23  0.28  0.36
EN-1102     0.22  0.33  0.36
MA-1160     0.49  0.57  0.36
MA-2160     0.50  0.57  0.36
PH-1100     0.23  0.28  0.36
PH-2100     0.36  0.43  0.36
UN-1001     0.25  0.23  0.36
UN-1002     0.22  0.20  0.36
Mean        0.34  0.36
Median      0.25  0.31

Table E.1-2: Efficiency = K/(W*5)

Course      CpE   EE    Mean
CH-1100     0.15  0.15  0.18
CS-1121     0.23  0.19  0.18
EN-1101     0.11  0.17  0.18
EN-1102     0.11  0.16  0.18
MA-1160     0.21  0.24  0.18
MA-2160     0.21  0.24  0.18
PH-1100     0.16  0.19  0.18
PH-2100     0.19  0.21  0.18
UN-1001     0.16  0.15  0.18
UN-1002     0.17  0.13  0.18
Mean        0.17  0.18
Median      0.17  0.18
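The two derived metrics named in the table captions, Productivity = R*(K/25) and Efficiency = K/(W*5), can be sketched as follows. The sample inputs are the CpE means for CH-1100 from Table E.1-0; the function names are illustrative.

```python
# Derived sophomore-survey metrics, as defined in the table captions:
#   Productivity = R * (K / 25)   (relevance-weighted knowledge gained)
#   Efficiency   = K / (W * 5)    (knowledge gained per unit work expended)
# R, K, W are mean survey scores on a 1-5 scale; function names are
# illustrative, not taken from the original spreadsheet.

def productivity(r: float, k: float) -> float:
    return r * (k / 25)

def efficiency(k: float, w: float) -> float:
    return k / (w * 5)

# CpE means for CH-1100 from Table E.1-0: R=2.25, K=2.62, W=3.50
print(round(productivity(2.25, 2.62), 2))  # 0.24, matching Table E.1-1
print(round(efficiency(2.62, 3.50), 2))    # 0.15, matching Table E.1-2
```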

Figure E.1-1: Course Productivity. [Bar chart of Productivity (0.00-0.70) by course, series: CpE, EE, Mean.]

Figure E.1-2: Course Efficiency. [Bar chart of Efficiency (0.00-0.30) by course, series: CpE, EE, Mean.]

10/2/2004 E-11 Data-Surveys-Soph.xls

Table E.1-3a: CpE Mean Course Ranks

Course     Prod. Rank  Effic. Rank  Mean  NetMean
EN-1102         1           1        1     5.25
EN-1101         3           1        2     5.25
PH-1100         3           4        3.5   5.25
UN-1002         1           6        3.5   5.25
CH-1100         5           3        4     5.25
UN-1001         6           4        5     5.25
PH-2100         7           7        7     5.25
MA-1160         8           8        8     5.25
MA-2160         9           8        8.5   5.25
CS-1121        10          10       10     5.25
Mean          5.30        5.20      5.25
Median        5.50        5.00      4.50

Table E.1-3b: EE Mean Course Ranks

Course     Prod. Rank  Effic. Rank  Mean  NetMean
UN-1002         1           1        1     5.35
UN-1001         2           2        2     5.35
CH-1100         3           2        2.5   5.35
EN-1101         5           5        5     5.35
EN-1102         6           4        5     5.35
PH-1100         4           6        5     5.35
CS-1121         7           6        6.5   5.35
PH-2100         8           8        8     5.35
MA-1160         9           9        9     5.35
MA-2160        10           9        9.5   5.35
Mean          5.50        5.20      5.35
Median        5.50        5.50      5.00
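The combined ranking used in Tables E.1-3a/b, the mean of each course's Productivity rank and Efficiency rank (1 = least productive/efficient), can be sketched as follows; the ranks shown are a subset of the CpE values, and the dict/loop structure is illustrative.

```python
# Combined course rank = mean of the Productivity rank and the Efficiency
# rank (1 = least productive/efficient). Ranks below are a subset of the
# CpE values from Table E.1-3a; the structure is an illustrative sketch.

ranks = {
    # course: (productivity rank, efficiency rank)
    "EN-1102": (1, 1),
    "EN-1101": (3, 1),
    "PH-1100": (3, 4),
    "UN-1002": (1, 6),
    "CS-1121": (10, 10),
}

mean_rank = {c: (p + e) / 2 for c, (p, e) in ranks.items()}
for course, mr in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{course}: {mr}")  # lowest mean rank (worst course) first
```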

Figure E.1-3a: CpE Mean Course Ranks. [Bar chart of Mean Rank (0-10) by course, with the NetMean reference line.]

Figure E.1-3b: EE Mean Course Ranks. [Same format as Figure E.1-3a.]


Table E.2-1: CpE Outcome Confidence

Ques  Statement of Outcome                                                                   outcm  mean  med
1.h   understand the impact of my engineering solutions in a global & societal context         h    4.08  4.00
1.c   design an {electrical or computer} system, component, or process to meet desired needs   c    4.17  4.00
1.e   identify, formulate, & solve {electrical or computer} engineering problems               e    4.33  4.00
1.j2  understand the impact of institutions and organizations in shaping today's world         j2   4.33  4.50
1.g1  write an effective, formal, written report or paper in an IEEE-compatible style          g1   4.33  4.50
1.g2  prepare and present an effective, formal, oral presentation on a technical topic         g2   4.33  5.00
1.k   utilize modern tools and techniques of {electrical or computer} engineering              k    4.42  4.00
1.b   design & conduct experiments, as well as analyze & interpret data                        b    4.42  4.50
1.f2  respond to a professional ethical dilemma in accordance with the IEEE Code of Conduct    f2   4.50  5.00
1.j1  work and interact effectively with people of different cultural backgrounds              j1   4.58  5.00
1.f1  recognize a professional ethical dilemma, should the need arise                          f1   4.58  5.00
1.a   apply knowledge of math, sci, & engg to real-world EE problems                           a    4.58  5.00
1.d   function as an effective member of a multidisciplinary team                              d    4.83  5.00
avg                                                                                                 4.42  4.58
std                                                                                                 0.20  0.45

(Each row also carried the constant reference values Mean/Mn-Sig/Mn+Sig = 4.42/4.23/4.62, i.e. the overall mean and mean ∓ one standard deviation plotted in Figure E.2-1.)

Table E.2-2: EE Outcome Confidence

Ques  Statement of Outcome                                                                   outcm  mean  med
1.c   design an {electrical or computer} system, component, or process to meet desired needs   c    3.52  4.00
1.h   understand the impact of my engineering solutions in a global & societal context         h    3.57  4.00
1.j2  understand the impact of institutions and organizations in shaping today's world         j2   3.62  4.00
1.f2  respond to a professional ethical dilemma in accordance with the IEEE Code of Conduct    f2   3.71  4.00
1.k   utilize modern tools and techniques of {electrical or computer} engineering              k    3.71  4.00
1.a   apply knowledge of math, sci, & engg to real-world EE problems                           a    3.76  4.00
1.f1  recognize a professional ethical dilemma, should the need arise                          f1   3.81  4.00
1.e   identify, formulate, & solve {electrical or computer} engineering problems               e    3.86  4.00
1.b   design & conduct experiments, as well as analyze & interpret data                        b    4.00  4.00
1.g2  prepare and present an effective, formal, oral presentation on a technical topic         g2   4.00  4.00
1.j1  work and interact effectively with people of different cultural backgrounds              j1   4.10  4.00
1.g1  write an effective, formal, written report or paper in an IEEE-compatible style          g1   4.29  4.00
1.d   function as an effective member of a multidisciplinary team                              d    4.38  5.00
avg                                                                                                 3.87  4.08
std                                                                                                 0.27  0.28

(Each row also carried the constant reference values Mean/Mn-Sig/Mn+Sig = 3.87/3.60/4.14, i.e. the overall mean and mean ∓ one standard deviation plotted in Figure E.2-2.)

Senior Survey Data 2003-04


10/2/2004 E-13 Data-Surveys-Senior.xls

Figure E.2-1: CpE Outcome Confidence. [Bar chart of Confidence (1...5) by Outcome ID (h, c, e, j2, g1, g2, k, b, f2, j1, f1, a, d), with Mean and Mean ± St Dev reference lines.]

Figure E.2-2: EE Outcome Confidence. [Bar chart of Confidence (1...5) by Outcome ID (c, h, j2, f2, k, a, f1, e, b, g2, j1, g1, d), with Mean and Mean ± St Dev reference lines.]


Table E.2-3: Score Differences (EE - CpE)

Ques  Statement of Outcome                                                                   outcm  EE    CpE   EE-CpE
1.a   apply knowledge of math, sci, & engg to real-world EE problems                           a    3.76  4.58  -0.82
1.f2  respond to a professional ethical dilemma in accordance with the IEEE Code of Conduct    f2   3.71  4.50  -0.79
1.f1  recognize a professional ethical dilemma, should the need arise                          f1   3.81  4.58  -0.77
1.k   utilize modern tools and techniques of {electrical or computer} engineering              k    3.71  4.42  -0.71
1.j2  understand the impact of institutions and organizations in shaping today's world         j2   3.62  4.33  -0.71
1.c   design an {electrical or computer} system, component, or process to meet desired needs   c    3.52  4.17  -0.65
1.h   understand the impact of my engineering solutions in a global & societal context         h    3.57  4.08  -0.51
1.j1  work and interact effectively with people of different cultural backgrounds              j1   4.10  4.58  -0.48
1.e   identify, formulate, & solve {electrical or computer} engineering problems               e    3.86  4.33  -0.47
1.d   function as an effective member of a multidisciplinary team                              d    4.38  4.83  -0.45
1.b   design & conduct experiments, as well as analyze & interpret data                        b    4.00  4.42  -0.42
1.g2  prepare and present an effective, formal, oral presentation on a technical topic         g2   4.00  4.33  -0.33
1.g1  write an effective, formal, written report or paper in an IEEE-compatible style          g1   4.29  4.33  -0.04
avg                                                                                                 3.87  4.42  -0.55
std                                                                                                 0.27  0.20   0.22

(Each row also carried the constant reference values Mn-Sig/Mn+Sig = -0.77/-0.33, i.e. the mean difference ∓ one standard deviation plotted in Figure E.2-3.)

Figure E.2-3: Score Differences (EE - CpE). [Bar chart of EE-CpE differences (-0.80 to 0.60) by outcome, with Mn-Sig and Mn+Sig reference lines; negative values indicate CpE stronger, positive values EE stronger.]


Table E.2-4: Grad School Plans (pcnt)

Answer   SA     MA     A      MD     SD     SA+MA
CpE      9.1    0.0    0.0    36.4   54.5   9.1
EE       22.2   11.1   11.1   27.8   27.8   33.3

Table E.2-5: Continuing Ed Plans (pcnt)

Answer   SA     MA     A      MD     SD     SA+MA
CpE      60.0   20.0   10.0   0.0    10.0   80.0
EE       48.7   30.8   15.4   0.0    5.1    79.5

Figure E.2-4: Grad School Plans (pcnt). [Bar chart of SA/MA/A/MD/SD percentages for CpE and EE.]

Figure E.2-5: Continuing Ed Plans (pcnt). [Same format as Figure E.2-4.]


Alumni Outcomes Survey Data - Class of 2003

Table E.3-1: EE Outcomes - 2003 (sorted by increasing importance)

Outcome  prep   import  Resid
i2       2.11   1.30     0.811
i1       3.00   1.60     1.400
q2       1.92   1.67     0.250
h1       2.08   1.92     0.167
q1       1.92   2.00    -0.083
j2       2.17   2.04     0.125
g1       2.75   2.04     0.708
b2       2.33   2.17     0.167
g2       2.83   2.38     0.458
j1       2.50   2.42     0.083
b1       2.42   2.67    -0.250
f2       2.75   2.75     0.000
f1       2.83   2.75     0.083
a1       3.08   2.83     0.250
a2       2.67   2.92    -0.250
d2       2.42   3.00    -0.583
c1       2.08   3.17    -1.083
k1       2.50   3.17    -0.667
e2       2.75   3.17    -0.417
e1       2.82   3.27    -0.455
d1       2.92   3.33    -0.417
a3       2.58   3.42    -0.833
d3       2.83   3.42    -0.583
k2       2.83   3.58    -0.750
Average  2.55   2.62    -0.08
Correl (prep vs. import): 0.48

Table E.3-2: CpE Outcomes - 2003 (sorted by increasing importance)

Outcome  prep   import  Resid
i2       2.67   0.67     2.000
i1       3.00   0.67     2.333
q2       2.25   1.00     1.250
b2       2.75   1.50     1.250
g2       2.75   1.75     1.000
f1       2.50   2.00     0.500
a1       3.25   2.00     1.250
a2       3.25   2.00     1.250
j1       3.25   2.00     1.250
k1       2.75   2.25     0.500
j2       2.00   2.50    -0.500
f2       2.25   2.50    -0.250
b1       2.50   2.50     0.000
g1       2.75   2.50     0.250
h1       2.50   2.75    -0.250
d2       2.75   3.00    -0.250
k2       2.75   3.00    -0.250
e1       2.75   3.25    -0.500
e2       3.25   3.25     0.000
a3       2.67   3.50    -0.833
q1       2.75   3.50    -0.750
c1       2.50   3.75    -1.250
d1       3.25   3.75    -0.500
d3       3.75   3.75     0.000
Average  2.78   2.47     0.31
Correl (prep vs. import): 0.19

Table E.3-3: Preparation Differences (EE prep - CpE prep)

Outcome  EE prep  CpE prep  Diff
d3       2.83     3.75     -0.917
q1       1.92     2.75     -0.833
j1       2.50     3.25     -0.750
a2       2.67     3.25     -0.583
i2       2.11     2.67     -0.556
e2       2.75     3.25     -0.500
b2       2.33     2.75     -0.417
c1       2.08     2.50     -0.417
h1       2.08     2.50     -0.417
d1       2.92     3.25     -0.333
d2       2.42     2.75     -0.333
q2       1.92     2.25     -0.333
k1       2.50     2.75     -0.250
a1       3.08     3.25     -0.167
b1       2.42     2.50     -0.083
a3       2.58     2.67     -0.083
g1       2.75     2.75      0.000
i1       3.00     3.00      0.000
e1       2.82     2.75      0.068
g2       2.83     2.75      0.083
k2       2.83     2.75      0.083
j2       2.17     2.00      0.167
f1       2.83     2.50      0.333
f2       2.75     2.25      0.500
Average  2.55     2.78     -0.24

10/2/2004 E-17 Data-Surveys-Alum.xls

Figure E.3-1: EE Alumni Relative Residuals (Preparation - Importance). [Bar chart of residuals (-1.5 to 2.5) by outcome, sorted by increasing importance.]

Figure E.3-2: CpE Alumni Relative Residuals (Preparation - Importance). [Same format as Figure E.3-1.]


Figure E.3-3: Preparation Differences Between Majors (EE Prep - CpE Prep). [Bar chart of differences (-1.0 to 1.0) by outcome, sorted by increasing difference.]


9/24/04 AppF-FE-Exam.doc

F-1

F Fundamentals of Engineering (FE) Exam Evaluation This appendix reports the results of the Fundamentals of Engineering (FE) General (morning) exams and the EE Subject (afternoon) exams taken by EE and CpE students in October 2003 and April 2004. As required by the UPAC SOP [1, Sec 6.6], a “grade” was computed for selected sections of the exam on a scale of (0…4) where 2.0 = the national average, and >2.0 is considered acceptable.
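The SOP's exact formula for converting raw section percentages into a (0…4) grade is not reproduced here. One simple mapping consistent with "2.0 = the national average" is a linear ratio of the MTU mean to the national mean, sketched below purely as an assumption; the published grades in Tables F.3 and F.4 indicate the SOP applies a somewhat different normalization, so this sketch is illustrative only.

```python
# Hypothetical linear scaling for FE topic grades, where 2.0 means
# performing at the national average. This is an ASSUMED formula for
# illustration only; the UPAC SOP's actual normalization evidently
# differs (compare the MTU Grade column in Tables F.3/F.4).

def fe_grade(mtu_avg: float, natl_avg: float) -> float:
    """Map a topic's MTU mean score to a (0...4) grade, 2.0 = national avg."""
    return min(4.0, 2.0 * mtu_avg / natl_avg)

# Ethics section, EE students (Table F.3): MTU 79.3 vs. national 65.0
print(round(fe_grade(79.3, 65.0), 2))  # 2.44 under this assumed formula
```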

F.1 Test Results Table F.1 and Figure F.1 summarize the test results and trends for EE students, while Table F.2 and Figure F.2 summarize the test results and trends for CpE students. The detailed statistics and their resulting grades are listed in Tables F.3 and F.4, and graphed in Figures F.3 and F.4 for EE and CpE students, respectively.

F.2 Analysis

F.2.1 Electrical Engineering The results for EE students show the following:

1. EE Subject Test: Figure F.1 shows that the overall EE test grades are relatively constant at just below 2.0. Figure F.3 shows a tight clustering of topic grades around the national average (2.0), with grades confined to the range [1.62 ... 2.22]. Thus the subject test revealed no outstandingly strong or weak topics. The three weakest areas on the test were Solid State Electronics & Devices, Digital Systems, and Analog Electronic Circuits, indicating minor weaknesses on Outcome (a).

2. General Test – Ethics Section: Figure F.1 shows significant improvement in the ethics scores of EE students, from just under 2.0 to just under 3.0, reflecting well on Outcome (f).

3. General Test – Math Section: Figure F.1 shows a 24% drop in this grade from an acceptable 2.32 to an unacceptable 1.76, reflecting poorly on Outcomes EE(l), (m), and/or (n). Although the CpE sample size was too small to be statistically significant, their results closely mirrored the EE results.

F.2.2 Computer Engineering The sample size for CpE is too small to be significant. The results are reported here only for completeness.

Relevant Outcomes: CpE: q; EE: a, f, l, m, n

Table F.1: Summary - ELECTRICAL ENGINEERING

GRADES BY TOPIC          AY-02/03  AY-03/04
Number of Students           19        43
EE Subject Test             1.98      1.92
General Test - Ethics       1.83      2.82
General Test - Math         2.32      1.76

Table F.2: Summary - COMPUTER ENGINEERING

GRADES BY TOPIC          AY-02/03  AY-03/04
Number of Students            1         2
EE Subject Test             1.40      2.04
General Test - Ethics       4.00      3.38
General Test - Math         1.87      1.78

Figure F.1: Trends - ELECTRICAL ENGINEERING
Figure F.2: Trends - COMPUTER ENGINEERING

[Line charts of grades (0.0-4.0) across AY-02/03 and AY-03/04 for the EE Subject Test, General Test - Ethics, and General Test - Math; one chart per major.]

7/22/2004 F-2 Data-FE-Exam-sorted.xls

Electrical EngineeringStudents

Fall MT

U

Fall Nat'l

Spring MT

U

Spring Nat'l

MT

U A

ve.

Nat'l A

ve

Num

. Quest.

MT

U G

rade

Number of Students Taking Exam 6 628 37 1258 43 1886SOLID ST ELEC & DEV 39 55 44 51 43.3 52.3 6 1.62DIGITAL SYSTEMS 61 60 49 57 50.7 58.0 6 1.65ANALOG ELEC CIRCUITS 25 34 27 39 26.7 37.3 6 1.66INSTRUMENTATION 17 33 36 44 33.3 40.3 3 1.77CONT SYS THEORY ANAL 44 44 50 56 49.2 52.0 6 1.88ELECTRO THEORY & APP 72 66 47 44 50.5 51.3 6 1.97POWER SYSTEMS 39 31 20 19 22.7 23.0 3 1.99SIGNAL PROCESSING 33 36 41 42 39.9 40.0 3 2.00COMM THEORY 33 38 51 53 48.5 48.0 6 2.02COMP & NUM METHODS 39 33 44 47 43.3 42.3 3 2.03COMP HARDWR ENGINRNG 33 39 46 44 44.2 42.3 3 2.06NETWORK ANALYSIS 33 34 55 55 51.9 48.0 6 2.15COMP SOFTWR ENGINRNG 44 51 68 65 64.7 60.3 3 2.22EE SUBJECT TEST - AVERAGE 39.4 42.6 44.5 47.4 43.8 45.8 1.92GENERAL TEST - ETHICS 63.0 59.0 82.0 68.0 79.3 65.0 2.82GENERAL TEST - MATH 73.0 74.0 61.0 63.0 62.7 66.7 1.76

Electrical EngineeringStudents

Fall MT

U

Fall Nat'l

Spring MT

U

Spring Nat'l

MT

U A

ve.

Nat'l A

ve

Num

. Quest.

MT

U G

rade

Number of Students Taking Exam 0 0 2 208 2 208ELECTRO THEORY & APP 0 0 8 36 8.0 36.0 6 1.13COMP HARDWR ENGINRNG 0 0 50 63 50.0 63.0 3 1.30COMP & NUM METHODS 0 0 33 48 33.0 48.0 3 1.42ANALOG ELEC CIRCUITS 0 0 17 33 17.0 33.0 6 1.52CONT SYS THEORY ANAL 0 0 42 46 42.0 46.0 6 1.85POWER SYSTEMS 0 0 17 19 17.0 19.0 3 1.95COMP SOFTWR ENGINRNG 0 0 83 82 83.0 82.0 3 2.11SOLID ST ELEC & DEV 0 0 58 50 58.0 50.0 6 2.32NETWORK ANALYSIS 0 0 58 49 58.0 49.0 6 2.35DIGITAL SYSTEMS 0 0 75 69 75.0 69.0 6 2.39SIGNAL PROCESSING 0 0 50 35 50.0 35.0 3 2.46INSTRUMENTATION 0 0 67 46 67.0 46.0 3 2.78COMM THEORY 0 0 83 50 83.0 50.0 6 3.32EE SUBJECT TEST - AVERAGE 0.0 0.0 49.3 48.2 49.3 48.2 2.04GENERAL TEST - ETHICS 0.0 0.0 90.0 68.0 90.0 68.0 3.38GENERAL TEST - MATH 0.0 0.0 60.0 64.0 60.0 64.0 1.78

Table F.3: Detailed Results - ELECTRICAL Engineering

Table F.4: Detailed Results - COMPUTER Engineering
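The "MTU Ave." and "Nat'l Ave." columns in Tables F.3 and F.4 appear to be the fall and spring percentages weighted by the number of students taking each exam administration. A minimal sketch of that computation (the function name is illustrative, not from the report):

```python
def weighted_avg(scores, counts):
    """Mean of per-term scores, weighted by per-term student counts."""
    total = sum(counts)
    return sum(s * c for s, c in zip(scores, counts)) / total

# SOLID ST ELEC & DEV, Electrical Engineering (Table F.3):
# fall MTU score 39 (6 students), spring MTU score 44 (37 students)
mtu_ave = weighted_avg([39, 44], [6, 37])        # ~43.3, matching the tabulated MTU Ave.
natl_ave = weighted_avg([55, 51], [628, 1258])   # ~52.3, matching the tabulated Nat'l Ave.
```

The same weighting reproduces the subject-test average rows, so the yearly averages are dominated by the much larger spring cohort.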


Figure F.3: Category Results - ELECTRICAL Engineering

Figure F.4: Category Results - COMPUTER Engineering

[Bar charts: MTU grade by exam category on a 0.0-4.0 scale, one bar per category as listed in Tables F.3 and F.4.]


8/5/04 AppG-PreCAP.doc


G CpE Pre-Capstone Assessment Package (PreCAP)

As specified in the UPAC SOP [1, Sec 6.7], this appendix reports the results of the CpE PreCAP procedure to assess representative samples of student work from EE-3173, EE-3175, and EE-3970.

G.1 Implementation and Results

A team of ECE faculty was assembled to review and evaluate selected student work. Three representative samples of student work were chosen for each evaluation. Two evaluators were assigned to each sample, chosen so that neither had been an instructor for any of the student work they evaluated.

Each sample was evaluated according to a detailed rubric for that category of work. All line-items were ranked on a (1…4) scale, with 4 being the best score. The desired score for each category examined is 2.5 for the mean, and 3.0 for the “top end” sample.

Evaluation of the numerical results relative to the previous year should be tempered by the fact that different evaluators scored the work in the two years, introducing rater-to-rater variation. Thus, minor year-to-year differences in scores should not be taken as significant.
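As a concrete illustration of how the row and sample means in the PreCAP tables are aggregated (a sketch only; the evaluation itself was performed on paper rubrics, and the variable names here are illustrative), the six scores recorded for the first line item of Table G.1(b) work out as follows:

```python
# Scores for one rubric line item: three samples (SA, SB, SC),
# two evaluators (E1, E2) each, on the 1..4 scale.
scores = {"SA": [4.0, 4.0], "SB": [4.0, 3.0], "SC": [3.5, 3.0]}

# Row mean: mean over all six scores for this line item (goal: 2.5).
all_scores = [s for pair in scores.values() for s in pair]
row_mean = sum(all_scores) / len(all_scores)   # 3.58 after rounding, as tabulated

# Per-sample means identify the "top end" sample (goal: 3.0).
sample_means = {k: sum(v) / len(v) for k, v in scores.items()}
top_end = max(sample_means.values())           # 4.0 for this line item
```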

G.1.1 EE-3173 Lab Reports

A representative sample of selected laboratory reports from EE-3173 was evaluated for outcomes (c) and (k). The samples were chosen from three lab assignments that required the students to implement sensing and control functions using an FPGA-based microprocessor, and to add special-purpose hardware to a design in order to evaluate hardware/software performance tradeoffs. The results of this evaluation are shown in Table G.1.

Table G.1 (a) shows that the flexibility afforded to the students by the problem statements scored a mean of 3.5. This was a significant improvement over the 2.5 mean score of last year.

Table G.1 (b) shows the results of the evaluation. The mean score over all assignments was 3.53, up from last year’s mean of 2.66. In addition, all line items in the rubric showed improvement from last year, although some were not statistically significant. Thus, both the mean and high-end samples met their goals.

Relevant Outcomes: CpE c, k EE N/A

G.1.2 EE-3970 Technical Writing

A representative sample set of EE-3970 individual technical writing assignments from late in the term was evaluated for the quality of technical writing and adherence to the specified IEEE-CS Style Guide.

Table G.2 shows the results of this evaluation. The overall mean score was a disappointing 1.90, down significantly from last year’s mean of 3.19. All line items in the rubric showed a decrease from last year, although some were not statistically significant. The papers assessed were quite short and simple. The most obvious problems are:

1. There were no abstracts, figures, tables, or section headers in any of the papers, and the boundaries of Introduction and Conclusion sections tended to be fuzzy and indistinct.


2. The mean score for “References and Citations” was 2.25, down significantly from last year’s mean of 3.35. However, the top-end sample scored 3.0 on this item.

3. The mean score for “Writing Style and Grammar” was 2.73, roughly the same as last year’s mean of 2.83. However, the top-end sample scored over 3.0 in this item.

Clearly, the level of writing skill actually demonstrated is lower than the previous year. The single biggest problem was that the assignment was not of sufficient length or complexity to exercise all elements of the rubric.

Relevant Outcomes: CpE g EE N/A

G.1.3 EE-3970 Team Projects

A representative sample of project plans and written reports from the final team projects in EE-3970 was evaluated for the abilities to: (1) design and conduct an experiment, (2) analyze data in a statistically valid manner, and (3) effectively organize a team to execute a project and publish the results. The results are shown in Table G.3.

Table G.3 (a) summarizes the evaluation of the assignment statement for clarity, flexibility, and complexity. The overall evaluation was a respectable 3.0, indicating that students had sufficient opportunity to demonstrate the desired outcomes.

Table G.3 (b) summarizes the students’ teamwork skills as demonstrated in the project plan. The overall mean score was a disappointing 1.83, down dramatically from last year’s mean of 3.67. Even the top-end sample scored a mean of only 2.0. The biggest problem was that the instructor had students fill out Human Resource Inventory forms, but did not ask the students to analyze the forms or to use them in assigning personnel to tasks.

Table G.3(c) summarizes the students’ abilities to design an experiment, analyze the data in a statistically valid manner, and actually deliver the specified deliverables. The overall mean score was a disappointing 2.13, down significantly from last year’s mean of 3.27. However, the top-end sample scored a mean of 3.0. All line items in the rubric showed a decrease from last year, although some were not statistically significant. The most significant problem items were:

1. Choice of Experimental Approach. Some teams did not run enough benchmark samples to obtain meaningful statistics.

2. Execution of Statistical Analysis Methods. Although most teams selected appropriate statistical methods, the execution and exposition of the results left much to be desired.

3. How Well Deliverables Were Delivered. There was across-the-board disappointment as to whether and how well the teams presented the deliverables requested in the assignment.

Relevant Outcomes: CpE b, d EE N/A


G.1.4 EE-3175 Modeling and Simulation

A representative sample set of modeling and simulation assignments performed in EE-3175 was evaluated for the abilities to use: (1) methods and tools for modeling and simulation of digital system performance, and (2) methods and tools for modeling and simulation of digital system dependability.

Sample assignments were assessed for two properties: the ability to derive a model from a system description, and the ability to apply an appropriate software tool to evaluate the derived model. The following conclusions can be drawn from Table G.4:

1. The breadth of modeling methods employed and the number of tools used has increased from last year.

2. The overall mean score for “derive” was 2.89, roughly equivalent to last year’s mean of 2.70.

3. The overall mean score for “apply” was 3.06, up slightly from last year’s mean of 2.58.

4. The top-end samples scored at or above 3.5 for all line items.

These results show significant improvement in the breadth of methods covered, and in the ability to “apply” software modeling tools. However, there is still room for improvement in the ability to “derive” a model. This can be improved by assigning more complex and sophisticated modeling problems.

Relevant Outcomes: CpE k EE N/A

G.2 Summary

Assessment of EE-3173 and EE-3175 assignments showed significant improvements in the demonstration of outcomes (c) and (k), as assigned to these courses. To further increase the depth of delivery of (k), the complexity of modeling problems assigned in EE-3175 should be increased. Otherwise, the instructors for these two courses should keep doing what they are doing.

Assessment of EE-3970 assignments showed significant decreases in the demonstration of outcomes (b), (d), and (g), as assigned to this course. Specifically:

1. Written assignments were too short and simplistic to demonstrate all desired attributes,

2. Grammar and Writing Style have not improved since last year,

3. The ability to properly use references and citations showed some degradation,

4. The final project assignment did not require demonstration of all required teamwork skills,

5. Students were sloppier in the execution, analysis, and exposition of their team projects than in the previous year. It appears that the big picture has been lost.

To summarize, the level of performance demanded of the students in EE-3970 appears to have dropped significantly relative to the previous year. Examination of the EE-3970 Course Specification reveals that the level of rigor required by the PreCAP assessment instruments is not clearly stated in the specification. This ambiguity in the specification should be corrected to provide future instructors with adequate guidance on these topics.

Table G.1: Hardware-Software Integration - EE-3173 Lab Reports

(a) Assignment Statement

Evaluator                                 E1    E2   Row Mean   Prev Year   Change
Flexibility afforded by assignments        4     3        3.5         2.5     1.00

(b) Completed Laboratory Reports

Sample ID                                                                    SA    SA    SB    SB    SC    SC    Row   Prev
Evaluator                                                                    E1    E2    E1    E2    E1    E2   Mean   Year  Change
Ability to design a system, component, or process to meet a specific need   4.0   4.0   4.0   3.0   3.5   3.0   3.58   2.50    1.08
Ability to design and implement a software solution to a problem            4.0   4.0   3.5   3.5   3.0   3.0   3.50   2.78    0.72
Ability to design and implement a hardware solution to a problem            3.5   3.5   3.5   3.5   3.5   3.0   3.42   2.75    0.67
Ability to integrate hardware & software into a unified system              4.0   4.0   3.5   3.5   3.0   3.0   3.50   2.75    0.75
Ability to apply modern tools & techniques of Computer Engineering          4.0   4.0   4.0   4.0   3.0   3.0   3.67    N/A     N/A
Column Means                                                                3.9   3.9   3.7   3.5   3.2   3.0   3.53   2.66    0.87
Mean scores for each sample                                                    3.9         3.6         3.1      3.53   2.66    0.87

Table G.2: Technical Writing Skills - EE-3970 Individual Written Reports

Sample ID                       SA    SA    SB    SB    SC    SC    Row   Prev
Evaluator                       E1    E2    E1    E2    E1    E2   Mean   Year  Change
Abstract or Executive Summary  1.0   1.0   1.0   1.0   1.0   1.0   1.00   3.30   -2.30
Introduction                   2.0   3.7   3.0   2.3   1.0   1.0   2.17   3.12   -0.95
Conclusions                    2.0   3.0   3.0   3.0   2.0   2.0   2.50   3.07   -0.57
Organization of Main Body      1.0   1.0   1.0   1.0   1.0   1.0   1.00   3.65   -2.65
Writing Style & Grammar        3.0   3.2   3.0   2.7   2.0   2.5   2.73   2.83   -0.09
Content of Main Body           3.0   3.0   3.0   3.0   2.0   1.5   2.58   3.52   -0.93
Tables & Figures               1.0   1.0   1.0   1.0   1.0   1.0   1.00   1.50   -0.50
References & Citations         3.0   2.5   3.0   3.0   1.0   1.0   2.25   3.35   -1.10
Column Means                  2.00  2.30  2.25  2.13  1.38  1.38   1.90   3.04   -1.14
Mean scores for each sample      2.15        2.19        1.38      1.90   3.04   -1.14

8/5/2004 G-4 Data-PreCAP.XLS

Table G.3: Experimental Design & Analysis - EE-3970 Project Reports

(a) Assignment Statement

Evaluator                             E1    E2   Row Mean   Prev Year   Change
Clarity of Goals & Deliverables      4.0   3.0       3.50        4.00    -0.50
Flexibility Afforded by Assignment   3.0   2.0       2.50        3.50    -1.00
Complexity of Assignment             3.0   3.0       3.00        4.00    -1.00
Column Means                        3.33  2.67       3.00        3.83    -0.83

(b) Project Plan

Sample ID                             SA    SA    SB    SB    SC    SC    Row   Prev
Evaluator                             E1    E2    E1    E2    E1    E2   Mean   Year  Change
Human Resources Inventory            1.0   0.0   1.0   0.0   1.0   0.0   0.50   3.58   -3.08
Partitioning of Projects into Tasks  4.0   2.0   4.0   2.0   3.0   3.0   3.00   3.67   -0.67
Mapping of Team Members to Tasks     2.0   3.0   2.0   1.0   2.0   2.0   2.00   3.75   -1.75
Column Means                        2.33  1.67  2.33  1.00  2.00  1.67   1.83   3.67   -1.83
Mean scores for each sample            2.00        1.67        1.83      1.83   3.67   -1.84

(c) Execution of Project as Described in Project Report

Sample ID                                   SA    SA    SB    SB    SC    SC    Row   Prev
Evaluator                                   E1    E2    E1    E2    E1    E2   Mean   Year  Change
Choice of Experimental Approach            3.0   3.0   1.0   2.0   1.0   1.0   1.83   3.50   -1.67
Execution of Experimental Method           2.0   4.0   2.0   3.0   1.0   1.0   2.17   3.08   -0.91
Choice of Statistical Analysis Methods     4.0   4.0   3.0   3.0   1.0   1.0   2.67   3.50   -0.83
Execution of Statistical Analysis Methods  2.0   3.0   2.0   2.0   1.0   1.0   1.83   3.00   -1.17
How Well Were Deliverables Delivered       2.0   3.0   2.0   2.0   2.0   2.0   2.17   3.25   -1.08
Column Means                              2.60  3.40  2.00  2.40  1.20  1.20   2.13   3.27   -1.14
Mean scores for each sample                  3.00        2.20        1.20      2.13   3.27   -1.14


Table G.4: Modeling and Simulation Skills - EE-3175 Assignments

Sample ID                                                                      SA    SA    SB    SB    SC    SC    Row   Prev
Evaluator                                                                      E1    E2    E1    E2    E1    E2   Mean   Year  Change
Ability to derive model from system description (Cache Architecture)          3.0   3.5   2.0   2.0   3.0   2.0   2.58    N/A     N/A
Ability to apply an appropriate software tool (Dinero)                        4.0   3.5   3.0   2.0   3.0   2.0   2.92    N/A     N/A
Ability to derive model from system description (TTPN)                        4.0   4.0   3.0   2.5   1.5   1.0   2.67    N/A     N/A
Ability to apply an appropriate software tool (SPNP DES Model)                4.0   4.0   3.0   2.0   1.5   2.0   2.75    N/A     N/A
Ability to derive model from system description (Proc/Cache/Mem architecture) 3.5   4.0   3.5   3.5   3.0   2.5   3.33   2.85    0.48
Ability to apply an appropriate software tool (SimpleScalar)                  4.0   4.0   3.5   3.5   3.0   3.0   3.50   2.83    0.67
Ability to derive model from system description (Fault-Tree)                  4.0   4.0   3.0   3.0   3.0   2.5   3.25    N/A     N/A
Ability to apply an appropriate software tool (Itemsoft FTA module)           4.0   4.0   3.5   3.0   3.0   2.5   3.33    N/A     N/A
Ability to derive model from system description (GSPN)                        4.0   4.0   3.0   3.5   3.5   2.5   3.42   2.80    0.62
Ability to apply an appropriate software tool (SPNP Analytical Model)         4.0   4.0   3.5   3.5   3.5   2.5   3.50   2.42    1.08
Mean for each Sample (Ability to Derive)                                        3.80        2.90        2.45      2.89   2.70    0.19
Mean for each Sample (Ability to Apply)                                         3.95        3.05        2.60      3.06   2.58    0.48


9/23/04 AppH-UAC.doc


H Undergraduate Advising Committee Outcomes Assessment

The ECE Department Undergraduate Advisory Committee was asked its viewpoint regarding: a) the validity and relevance of the Program Outcomes, b) how well the curriculum achieves the Program Outcomes, c) the most prominent strengths and weaknesses identified in the curriculum or the assessment process, and d) any recommendations for improving the Program Outcomes and assessment processes. The committee reported its results in a written memorandum.

H.1 Verbatim Copy of Memo

It is the opinion of the committee that the department should take action on the following items.

The "computing undecided" curriculum program for freshman that was proposed to us by the chair of the UPC should help alleviate some of the problems associated with the discontent of the students of the department with regards to the engineering fundamentals program. This option will be beneficial for all students who choose to participate in it. In addition to this program, we feel that the ECE Department should work with the CS department in other ways, particularly those regarding cross-curricular courses. Courses such as CS3421 Computer Architecture contain a lot of overlap with the EE Department courses like EE2171 Digital Logic, and EE3170 Microcontroller Applications. Corroboration between the departments could eliminate some of the redundant content of these courses freeing up time to explore additional areas of the topic.

The appointment of faculty for 4000-leval courses needs to be reexamined. Some courses such as EE4253 Real Time Signal Processing are currently being taught by faculty that would be better suited to teaching other courses. We feel that the 4000 level faculty selection should be made more carefully so as to allow for the greatest amount of learning by the senior students who choose to take the course.

H.2 Analysis and Recommendations

The UPC evaluated the recommendations in the UAC report and concluded the following.

1. The UAC concurred with development of the computing undecided freshman option, stating that it “should help alleviate some of the problems associated with the discontent of the students of the department with regards to the engineering fundamentals program.” (CpE: k, q and EE: k)

2. The UPC reviewed the content of EE2171 and CS3421. There is about one week of material overlap between the two courses. The overlap is intentional; it is required so that CS3421 will fit into the Computer Science curriculum as well as the Computer Engineering curriculum. No further action is required.

3. The overlap between CS3421 and EE3170 is also intentional. CS3421 is a required course for Computer Engineering students and EE3170 is a required course for Electrical Engineering students. It is not intended that students take both courses. No further action is required.

4. The observation about assigning instructors to 4000 level courses was brought to the attention of the Associate Chair for Electrical Engineering, who schedules the instructors for EE courses. He will contact the UAC if he needs more details. No further action is required by the UPC.

Relevant Outcomes: CpE k, q EE k

10/5/04 AppI-AdHoc.doc


I Ad Hoc Assessment Instruments

Four ad hoc assessment instruments were evaluated this year, comprising: (1) a report by external reviewers conducted in April 2004, (2) the General Education assessment report for AY-2003/2004, (3) two assessment reports for the Engineering Enterprise program, and (4) an Engineering Fundamentals assessment report covering Fall 2000 through Spring 2004. Historically, these reports have not been executed with a fixed periodicity. However, some seem to be establishing a reliable period.

I.1 External Department Reviewers’ Report

In April 2004, a team of outside reviewers was brought into the ECE department by the College of Engineering to conduct a formal review of all departmental programs and operations.

I.1.1 Reported Assessment Results

Two items of interest arose as a result of the external review process.

1. In their final report, the review team had only one comment relevant to the UPAC process [9]: “The process of assessment for ABET is outstanding, but we have a concern about its sustainability. We wonder whether expenditure of less time and effort may be prudent, especially given that this is not an area that the department has targeted for national leadership.” Sustainability is one of the five "Target Attributes" that the UPAC SOP requires the UPC to assess annually [1, Sec 5.2]. The AY-2001/2002 Annual Outcomes Assessment Report identified a problem with the workload of the Department Assessment Coordinator. That report mandated corrective actions (primarily reduced reporting requirements). The AY-2002/2003 Annual Outcomes Assessment Report documented the improvement in the Assessment Coordinator's workload, but mandated additional corrective actions (see Sec. 2, item 16 of this report). At this time, the assessment workload appears to have reached a sustainable steady-state condition.

2. During his review of the report, the Dean of Engineering noted that the UPAC SOP gives the authority for curriculum revisions to the entire department faculty [1, Sec. 2]. By contrast, authority for each degree program should be vested in the program faculty, rather than the entire department faculty. To date, this has been the de facto policy of the department, but only because of the congeniality of the faculty in yielding programmatic decisions to those closest to an issue. However, this policy is not formally specified in the UPAC SOP.

I.1.2 Conclusions

Two conclusions are drawn from this assessment:

1. The UPC has already taken corrective actions on sustainability, and continuously monitors and reports on the issue in accordance with the UPAC SOP [1, Sec 7.1, item 1.c.i].

2. The policy that authority for each degree program shall be vested in the program faculty, rather than the entire department faculty, should be formally specified in the UPAC SOP.

Relevant Outcomes: CpE N/A (affects assessment processes)

EE N/A (affects assessment processes)


I.2 General Education Assessment Report

In the summer of 2004, MTU published an assessment of the General Education Program [8]. This report shows noticeable improvements in the assessment methods employed for General Education courses, achieved as a direct result of a proactive attack on the problem by the Director of General Education and the General Education Council (on which the ECE Department has a representative).

In particular, there has been a concerted effort to define and articulate the outcomes to be achieved by each of the four UN courses. There has also been significant movement away from reliance on anecdotal evidence toward a mixture of direct and indirect formal assessment instruments keyed to the specified outcomes.

I.2.1 Reported Assessment Results

For the first time, the General Education assessment report provides hard evidence for the level of achievement of several UN course outcomes, and maps these generic outcomes into ABET outcomes (a)...(k). Some of the conclusions document across-the-board delivery of specific outcomes. Others are more speculative, or show that some sections seem to be achieving certain outcomes. Since the ECE department does not control section assignments in UN courses, the speculative or section-specific conclusions must be ignored herein in favor of the across-the-board conclusions. Specific results of particular interest to this department are:

I.2.1.1 Perspectives (UN-1001)

Perspectives topics vary widely between sections, to the point where they are de facto independent courses. This variability makes it difficult to achieve the across-the-board consistency needed for this department to rely on the course for outcome delivery. However, the General Education assessment report documents several areas in which consistency has been achieved. A review of the course syllabi revealed the following [8, Sec. 1]:

1. All sections require at least 20 pages of graded writing, and one oral presentation (outcome g),

2. All sections review the university’s academic integrity policies, and 75% discuss additional ethical issues (outcome f),

3. “Virtually all” sections include materials on current events and on global and societal aspects of the section’s topic (outcome h).

In addition, at the end of the semester, students are asked to rank their agreement with ten specialized statements of their learning outcomes [8]. Results of particular interest are:

1. Question 4 showed that 66% of students either “agreed” or “strongly agreed” that their writing skills had improved as a result of taking the course (outcome g),

2. Questions 3, 6, and 10 asked about skills related to lifelong learning (outcome i); between 61 and 75% of students either “agreed” or “strongly agreed” that these skills had improved as a result of taking the course.

It thus appears that Perspectives makes an across-the-board contribution to outcomes g, f, h, and potentially i, even though some of these contributions may be rudimentary or indirect.


I.2.1.2 World Cultures (UN-1002)

This course is of major importance to outcome (j.1) exposure to the diversity of world cultures and an ability to work with people of different cultural backgrounds, and to a lesser extent, outcome (h) receipt of a broad education necessary to understand the impact of engineering solutions in a global and societal context.

New assessment instruments, instituted this year for this course, appear to indicate that the desired outcomes are being delivered by the end of the senior year. This is in sharp contrast to last year, when a few rudimentary assessments indicated that the course was not effective in improving students’ knowledge of the core topic. A variety of assessment instruments are being tried out, but the most compelling results from this year’s assessment involve the use of familiar instruments.

1. Senior Exit Surveys and interviews of ECE students have chronically indicated that the students do not believe they are attaining outcomes (j) or (h). However, cross-departmental analysis by the Director of General Education provides some indications to the contrary [8, Sec 2.1.1]. Specifically:

a. MEEM senior exit surveys concurred with the ECE conclusion that students do not think they have achieved any of the relevant learning outcomes.

b. However, MEEM also used an essay question directed at outcome (h), with the result that students “do very well on the question”.

c. Biology and Physics senior exit interviews indicated that students believe the course is quite substantive—if students are inclined to take advantage of it.

2. For both UN-1002 and 2002, the Director of General Education has adapted the ECE Course Outcome Verification (COVer) form to assess how heavily students are required to actually demonstrate each desired core concept. This required first defining and articulating the concepts, and then formatting them into the instrument. Results indicate that 100% of faculty required students to demonstrate all of the core concepts. In many cases, students were asked to demonstrate competency in some concepts 3 or more times [8, Sec 2.1.3].

3. One instructor gave students the National Geographic Society’s Geography Quiz. The scores achieved indicated that by the end of the course “Tech students are much stronger in geography than are Americans at large” and that they rank “at the very top in an international comparison” [8, Sec 2.1.2].

Based on the disagreements between the direct and indirect instruments identified above, the report concluded: “We believe that students do, in fact, get a very sound general education, but perceive that they do not. There is some sort of gap between substance and perception.” Additional instruments are planned for AY-2004/2005 to further test this conclusion.

The evidence listed above indicates that the desired outcomes of UN-1002 are taught in the course, and are most likely achieved by the senior year. However, there is other evidence to indicate that UN-1002 itself may not be contributing to the achievement of these outcomes. In particular, two instructors used a 20-question pre/post quiz to assess the “value added” by the course. The median score on the pre-quiz was 10 correct, while on the post-quiz it was 11 correct [8, Sec 2.1.1]. This would appear to be a statistically insignificant change, indicating that the course did not contribute to the knowledge tested by these 20 questions. However, insufficient data was provided to execute a hypothesis test on this conclusion.
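Had the per-student paired scores been published, a simple exact sign test could check whether the one-point median gain is distinguishable from chance. The sketch below is purely illustrative: the paired scores are invented (chosen only so that the medians match the reported pre-quiz median of 10 and post-quiz median of 11), and the function name is not from the report.

```python
from math import comb

def sign_test_p(pre, post):
    """Two-sided exact sign test on paired scores; ties are dropped."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    wins = sum(1 for d in diffs if d > 0)
    k = max(wins, n - wins)
    # P(X >= k) for X ~ Binomial(n, 1/2), doubled for a two-sided test
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Invented paired quiz scores (out of 20); medians are 10 (pre) and 11 (post),
# matching the reported values, but the individual scores are hypothetical.
pre  = [10, 9, 11, 13, 12, 8, 10]
post = [11, 10, 10, 12, 12, 9, 11]
p = sign_test_p(pre, post)  # 0.6875: no evidence of a real gain in this sketch
```

With plausible data of this size, the test cannot reject the null hypothesis of no change, which is consistent with the report's caution that the one-point median shift may be statistically insignificant.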


I.2.1.3 Revisions (UN-2001)

Revisions is a course on writing and communications, and should thus contribute strongly to outcome (g). Students write frequently and do multiple drafts of projects. Writing assignments range from very short to the final long project that requires research. The report claims that “This course contributes strongly to outcomes (g, i).” However, previous assessment instruments for the course did not capture important dimensions of the course. Therefore, in Fall 2003, the decision was made to start over. A new instrument and new guidelines for its application have been drafted and will be tested during AY-2004/2005.

As a result, there is at this time no credible assessment data to indicate whether or not the course successfully delivers its outcomes. Fortunately, outcome (g) is well covered in several other courses in the EE and CpE curricula.

I.2.1.4 Institutions (UN-2002)

This course aims to show how the larger constraints from government, markets, family, religion, and schooling shape human behavior. As such, it should make a significant contribution to outcome (j.2) a knowledge of the nature and role of institutions in shaping today’s world, and to a lesser extent, outcome (h) receipt of a broad education necessary to understand the impact of engineering solutions in a global and societal context.

Last year, no assessment results of any kind were reported. This year, two instruments were employed in the same manner as was done for UN-1002.

1. Senior Exit Surveys and interviews of students have shown the same contradictory results as they did for UN-1002. Specifically, there is reason to believe that the outcomes are being delivered, but students perceive that they are not. Thus, there is a gap between substance and perception.

2. As was done for UN-1002, the Director of General Education has adapted the ECE Course Outcome Verification (COVer) form to assess how heavily UN-2002 students are required to actually demonstrate each desired core concept. This required first defining and articulating the concepts, and then formatting them into the instrument. Results indicate that 100% of faculty required students to demonstrate all of the core concepts. However, there still appears to be significant variation in the level of depth required [8, Sec 4.1].

I.2.1.5 Distribution Courses

Students take an additional 15 hours of distribution courses from across campus. There are a total of 270 courses listed. The report states “We believe the distribution contributes in varying degrees to outcomes (d,f,g,h,i,j)”, although no direct supporting evidence is provided. At the suggestion of the ECE Assessment Coordinator, General Education has begun surveying faculty about the cognitive goals and the methods used to assess learning in distribution courses. The intent is to identify areas of commonality and gaps in the outcomes covered by the distribution courses. This is a particularly daunting task, given the number and diversity of these elective courses.

I.2.2 Conclusions

While assessment of General Education is still in its formative stages, there have been significant improvements in both the methods and results obtained, relative to last year. The Director of General Education has aggressively instituted a set of new instruments that are beginning to bear fruit. In addition, new instruments are being developed and tested, in collaboration with the MTU Assessment Council. The most significant results from this year are:


1. It appears that all sections of UN-1001 (Perspectives) make minor but documentable contributions to outcomes g and f, and potentially to h and i,

2. There is evidence that students do, in fact, receive a sound general education, including demonstration of outcomes (h) and (j). However, engineering seniors perceive that they have not received these skills. Additional instruments planned for AY-2004/2005 will test this conclusion in further depth.

3. Pre/post tests in UN-1002 (World Cultures) indicate that the course may not be contributing to its assigned outcomes. While other indirect measures contradict this result, this instrument is the most direct measure of “value added” by the course.

4. There is no evidence that UN-2001 (Revisions) actually delivers toward outcome (g). However, new assessment instruments are being developed for implementation in AY-2004/2005.

5. Course-specific assessment for UN-2002 (Institutions) consists of only the one indirect instrument imposed by the Director of General Education. The report states that there is considerable variability in the depth to which different instructors require demonstration of different outcomes. This result indicates that some sections may not be contributing to the outcomes of this course.

Relevant Outcomes: CpE g, f, h, i, j EE g, f, h, i, j

I.3 Enterprise Program Assessment Reports

A significant fraction of students take the Enterprise design sequence (ENG-3950, 3960, 4950, and 4960) in place of the ECE Senior Design sequence (EE-4900, 4901, and 4910). It is therefore necessary to ensure that the Enterprise design sequence covers those outcomes for which the Senior Design courses are major contributors. These are (c, d, e, g, l) for CpE, and (c, d, e, g, k) for EE.

I.3.1 Reported Assessment Results

Assessment reports for the Enterprise Program show that Enterprise is assessed for outcomes (c, d, e, g), and the “major design experience” (CpE l). The Enterprise assessment reports for AY-2002/2003 and AY-2003/2004 were both received in the spring of 2004 [10, 11]. The AY-2002/2003 report was somewhat rudimentary, but evaluation of the AY-2003/2004 report showed significant improvement in both assessment and delivery of the required outcomes.

1. For outcome (c) ability to design a system, component, or process to meet a desired need: External project sponsors completed a rubric form that evaluated this outcome. For non-funded enterprises, surrogate evaluators were used. All reports received rated the design itself as either “satisfactory” or “outstanding”.

2. For outcome (d) ability to function on multi-disciplinary teams: A random sample of videotapes showing student team meetings was viewed by the Associate Dean. The tapes were evaluated according to a rubric for students’ abilities to problem-solve effectively in team-based environments while working together on “real-world” engineering problems. In addition, each advisor, in consultation with student leaders, rated the contribution each student made toward the enterprise design.

10/5/04 AppI-AdHoc.doc

I-6

The report concluded that Enterprise students have “met expectations with regards to teaming”.

3. For outcome (e) ability to identify, formulate, and solve engineering problems: External project sponsors filled in a rubric that evaluated this outcome. For non-funded enterprises, surrogate external evaluators were used. The report concluded that the Enterprise students are meeting or exceeding expectations with regard to design and problem-solving, as assessed by external evaluators, most of whom are currently working in industry.

4. For outcome (g) ability to communicate effectively: Oral communication skills were assessed at the Undergraduate Expo, where external judges viewed team presentations. A random sample of team oral presentations was also evaluated by a panel of external reviewers. In addition, student representatives from fifteen enterprise teams each gave fifteen-minute presentations highlighting their Enterprise’s activities during the academic year. Each presentation was evaluated by three judges on the basis of content, delivery, and the ability to respond effectively to audience questions. The report concluded that the oral communication skills of the Enterprise students are “adequate”; however, there is room for improvement. In the comments regarding communication on the external reviewers’ rubrics, most concerns in this area stemmed from a lack of communication rather than poor communication, i.e., external sponsors were expecting more interim progress reports or communication of this nature.

Written communication skills were assessed by an interdisciplinary team of faculty members, using randomly selected memoranda generated in ENG-2962 “Communication Contexts”. The report concluded that, in general, performance on memo writing improved compared to the previous year. However, based on the results from this assessment, future offerings of the course will include a stronger focus on grammar and mechanics.

5. For outcome (EE k) ability to use tools and techniques ... : Outcome (k) is not one of the outcomes defined for the Enterprise program. Furthermore, since major-specific results cannot be obtained in the Enterprise context, no assessment of this particular outcome can be expected in the future.

6. For criterion 4 (outcome CpE l) significant design experience ... : Each enterprise team was required to complete a rubric regarding consideration of the design constraints for the professional component. Data gathered through this instrument showed that most of the teams met the ABET requirement for consideration of the realistic design constraints. In addition, external evaluators were asked to comment on the extent to which ABET criterion 4 was met, and whether the “Team took into consideration most of these other factors (as appropriate) when choosing an optimal solution: ethical, environmental, sustainability, political, manufacturability, health & safety, and social.”

The report concluded that the external reviewers believe that the Enterprise design teams are meeting the professional component criterion.

I.3.2 Conclusions

Results of the Enterprise assessment indicate that:

1. Enterprise is an effective substitute for senior design in the delivery of outcomes (c), (d), (e), and (CpE l).

2. As reported, Enterprise assesses outcome (g) by sampling simple memoranda. Although the written final design reports were alluded to, no direct assessment results for them were presented. The single most significant problem reported was “grammar and mechanics”. Oral communication skills were deemed to be “adequate”.

3. Enterprise does not assess outcome (EE k), as is done in senior design. It may therefore be necessary to reduce the level of dependency on Senior Design for this outcome, and further distribute it throughout the EE curriculum.

Relevant Outcomes: CpE c, d, e, g, l EE c, d, e, g, k

I.4 Engineering Fundamentals Assessment Report

In spring 2004, an assessment report covering the engineering fundamentals courses ENG-1101 and ENG-1102 was received from the College of Engineering, spanning the eight-semester period from Fall 2000 through Spring 2004 [6].

I.4.1 Reported Assessment Results

The status and methods of delivery for several outcomes were presented in the report. The statistics were not partitioned by major. However, the following information can be extracted from the report.

1. Outcome (c) design a system... : Simple design projects were assigned to teams with multiple majors and evaluated according to a rubric. The mean assessment score reported for Spring 2004 was 55.4 out of a possible 63 points. No data was reported for previous years.

2. Outcome (d) multidisciplinary teaming... : Teamwork was accomplished through the design projects mentioned above. In Fall 2001, students completed a survey regarding several aspects of their teaming experience, and overall reported it to be a positive experience. In subsequent years, teaming was evaluated by self- and peer-evaluation. Over 80% of students ranked the “functionality” of their teams highly. In addition, for each team, the maximum difference in peer review scores for that team was plotted against the functionality scores. This analysis showed that teams with smaller differences in peer review scores tended to rank their functionality as high. The significance of this trend is not entirely clear.
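The spread-versus-functionality analysis described above can be sketched as follows. This is a hypothetical illustration only: the team names, peer-review scores, and functionality ratings are invented, since the raw survey data is not part of this report.

```python
# Hypothetical sketch of the peer-review spread analysis described above.
# All team names and scores below are illustrative, not actual data.

def max_peer_spread(peer_scores):
    """Return the largest difference among one team's peer-review scores."""
    return max(peer_scores) - min(peer_scores)

# Illustrative teams: each has a list of peer-review scores and a
# self-reported functionality rating (higher = more functional).
teams = {
    "Team A": {"peer_scores": [9, 8, 9, 9], "functionality": 5},   # small spread
    "Team B": {"peer_scores": [10, 4, 7, 9], "functionality": 2},  # large spread
}

# The report plotted max spread against functionality; here we just tabulate.
for name, data in teams.items():
    spread = max_peer_spread(data["peer_scores"])
    print(f"{name}: max peer-score spread = {spread}, "
          f"self-rated functionality = {data['functionality']}")
```

The reported trend (smaller spread, higher self-rated functionality) would appear in such a tabulation as an inverse relationship between the two columns.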

3. Outcome (f) professional ethics... : Students executed role-playing assignments and performed oral presentations of case studies to the class. Initial assessment used final exam questions that presented ethical dilemmas to the students; results were good, with a mean score of over 80% correct. Subsequently, assessment was changed to rubric-based assessment of the ethics presentations, using a 0-2 scale (where 2 is best). For Spring 2004, the mean scores were well above 1.0, although some data appears to be missing from the tables. No data was reported for any other semesters.

4. Outcome (g) technical communication... :

a. Written communication assessment was limited to rubric-based assessment of memoranda, using a 0-2 scale (where 2 is best). Although no tabular data was presented in the report, the main text stated that mean scores were in the range of 1.05 to 1.33, indicating general satisfaction with the results. The weakest area was determined to be “writing quality”. There is no evidence that assessment was performed on written work more complex than a simple memorandum.

b. Oral communication was supposedly assessed through short presentations on ethics case studies. No data or conclusions were presented regarding oral communication.

5. Outcome (k) tools and techniques... : A set of software packages and tools was used in the classes, including a hand calculator, a spreadsheet, MathCAD, MATLAB, I-DEAS, and Mathematica. Not all of these tools are relevant to Electrical or Computer Engineers; those that are relevant are taught elsewhere in the curriculum. Assessment was accomplished via a quiz that asked which tool would be most appropriate for a given problem or task. The “correct” answer was taken to be the consensus answer of the faculty. While raw data tables are provided, no evaluation of the statistics was presented. Nonetheless, the report claimed “students have a solid understanding of the use and selection of the computational tools they are using.”
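The consensus-key scoring described in item 5 could be implemented along the following lines. This is a hedged sketch: the task names, tool choices, and answer key are illustrative inventions, not the actual quiz content.

```python
# Hypothetical sketch of the tool-selection quiz scoring described above.
# The "answer key" is the faculty consensus choice for each task.

consensus = {
    "plot a measured data set": "spreadsheet",
    "evaluate a symbolic integral": "Mathematica",
    "simulate a matrix system": "MATLAB",
}

def score_quiz(student_answers, answer_key):
    """Return the fraction of tasks where the student chose the consensus tool."""
    correct = sum(1 for task, tool in student_answers.items()
                  if answer_key.get(task) == tool)
    return correct / len(answer_key)

# One illustrative student response, differing from consensus on one task.
answers = {
    "plot a measured data set": "spreadsheet",
    "evaluate a symbolic integral": "MATLAB",
    "simulate a matrix system": "MATLAB",
}
print(f"score = {score_quiz(answers, consensus):.2f}")
```

Note that this scheme measures agreement with faculty opinion rather than objective correctness, which is one reason the report's claim would benefit from the statistical evaluation it omitted.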

I.4.2 Conclusions

The Fundamentals of Engineering Assessment Report varied considerably in the depth and quality of data presented for different outcomes.

1. Regarding multidisciplinary teaming (d), 80% of students self-assessed their team’s functionality as high.

2. Regarding design a system (c) and professional ethics (f), only one or two semesters’ data was presented covering the eight-semester span of the report. However, the limited data available points toward acceptable delivery of the outcomes.

3. Regarding written communication (g), only relatively simple memoranda were assessed. This is significantly less than the level of writing expected in UN-1001 “Perspectives” and UN-2001 “Revisions”. Regarding oral communication (g), students were tasked to perform oral reports; however, neither assessment data nor conclusions were presented in the report.

4. Regarding tools and techniques (k), raw data tables were provided, but no analysis of the data was presented. Nonetheless, the report concluded that students have a “solid understanding of the use and selection” of software tools.

To summarize, while there is evidence that ENG-1101 and ENG-1102 do provide instruction in the outcomes claimed, the evidence that the outcomes are actually accomplished is less than compelling. Missing data and missing analyses in the report leave several open questions.

For CpE students, all outcomes claimed for these courses are delivered in greater depth and breadth elsewhere in the CpE curriculum. Thus, the current consensus of the ECE Department faculty that these courses serve no purpose in the CpE curriculum appears to still be valid. The same can be said for the EE curriculum with the exception of outcome (f) professional ethics, which receives only minimal coverage in EE courses.

Relevant Outcomes: CpE none EE f

I.5 EE-3120 EE Writing Assessment

A pilot assessment of the writing skills of EE students was implemented and tested in EE-3120 in Fall 2003. The results of this assessment were submitted to the UPC [12].

I.5.1 Administration

The assignment was to write a 2-3 page report describing the design of a stand-alone power system for an off-the-grid home using a wind turbine as the energy source. Appendices (not counted in the text length) were to include the design calculations and detailed information on the equipment needed for the system.

The report was graded using a rubric covering 4 major areas: Written Report, Required Content, Calculations, and Appendices. For assessment of writing skills, only the portion of the rubric addressing the Written Report section was used. Each topic was graded on a 0-5 scale (with 5 being the best). The results are summarized in Table I.5-1.

I.5.2 Conclusions

The data in Table I.5-1 indicates that our students meet our expectations in their ability to write a technical report. The only area in which they show a minor weakness is Grammar, Punctuation, and Spelling, in which 13 of 71 students scored “Below Expectations” or “Unacceptable”. Since the average in Grammar, Punctuation, and Spelling is much closer to “Meets Expectations” (4) than to “Below Expectations” (2), this assessment indicates that no action needs to be taken at this time.

Table I.5-1 EE-3120 Writing Results

Organization:                        4.27
Format:                              4.25
Grammar, Punctuation, and Spelling:  3.89
Length:                              4.38
Average:                             4.25
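The per-category statistics reported above can be computed with a short script like the following. This is a minimal sketch under stated assumptions: the individual student scores are illustrative, since the raw grading data for the 71 reports is not included in this report.

```python
# Minimal sketch of the rubric summary statistics behind Table I.5-1.
# The score lists below are illustrative, not the actual grading data.

RUBRIC_MIN, RUBRIC_MAX = 0, 5  # 0-5 scale, 5 best; 4 = "Meets Expectations"

def category_mean(scores):
    """Mean score for one rubric category across all graded reports."""
    return sum(scores) / len(scores)

def count_below_expectations(scores, threshold=2):
    """Count reports scoring 'Below Expectations' (2) or 'Unacceptable' (<2)."""
    return sum(1 for s in scores if s <= threshold)

# One illustrative category (e.g. Grammar, Punctuation, and Spelling).
grammar_scores = [5, 4, 4, 2, 1, 4, 5, 3]

print(f"mean: {category_mean(grammar_scores):.2f}")
print(f"below expectations: {count_below_expectations(grammar_scores)} "
      f"of {len(grammar_scores)}")
```

Applied to the real data, this is the computation that yields the 3.89 category mean and the 13-of-71 below-expectations count cited in the conclusions.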


Recommended