G.1 – Direct Measure Results and Assessment Analysis 2013-2014

Department of Computer Science, Lamar University Summer 2014

This document presents the analysis of our assessment findings, the actions taken, and recommendations for improvement, based on the feedback from the indirect measures specified in Appendix E.1 and the results from our direct measures. Note that the selected questions used on final examinations for each performance criterion are submitted by the faculty and approved by the departmental Assessment Committee to ensure appropriate depth and consistency of content across time.

Assessment and Evaluation

Student Outcome 1: Software Fundamentals

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Apply UML interaction diagrams and class diagrams to illustrate object models.

COSC 1336, COSC 1337, COSC 2336, CPSC 4360

Selected Questions on Final Exam

CPSC 4360 Spring and Fall of each year

Dr. Peggy Doerschuk or Dr. Stefan Andrei

Size = 13 Percentage = 83.92 The target of 80% was Met

Apply important design patterns to OOD.

COSC 3308, CPSC 4360

Selected Questions on Final Exam

CPSC 4360 Spring and Fall of each year

Dr. Peggy Doerschuk or Dr. Stefan Andrei

Size = 13 Percentage = 68.53 The target of 80% was Not Met

Create useful software architecture documentation.

COSC 2336, COSC 3304, CPSC 3320, CPSC 4302, CPSC 4340, CPSC 4360

Rubric on software architecture documentation on final project

CPSC 4340 Fall of each year

Dr. Kami Makki Size = 14 Percentage = 100 The target of 80% was Met

Develop correct and efficient programs.

COSC 1336, COSC 1337, COSC 2336, COSC 3304, CPSC 3320, *CPSC 4302, *CPSC 4340, *CPSC 4360

Selected Questions on Assignments

COSC 3304 Spring of each year

Dr. L. Osborne Size = 16 Percentage = 94 The target of 80% was Met

Debug implemented software in a proficient manner.

COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3304

Selected Questions on Assignments

COSC 3304 Spring of each year

Dr. L. Osborne Size = 16 Percentage = 62.5 The target of 80% was Not Met

Design user interfaces appropriate to a large software system

COSC 1336 COSC 1337 CPSC 3320 CPSC 4360

Rubric CPSC 4360 Fall and Spring of Each year

Dr. Stefan Andrei and Dr. Peggy Doerschuk

Size = 13 Percentage = 83.92 The target of 80% was Met


Develop user-level documentation for software

All courses with programming assignments

Rubric CPSC 4360 and COSC 2336

Fall and Spring each year

Dr. Doerschuk or Dr. Stefan Andrei, Dr. Makki

Size = 13 Percentage = 83.92 The target of 80% was Met

* Courses contain material relevant to the performance criteria, but are not used in the assessment strategy at this time.

Date: August 18, 2014.

Results: The sample size was between 13 and 16 for all performance criteria. Two of the direct measures did not meet targets: criterion 2 was not met (68.5%), and criterion 5 was not met (62.5%). Overall, this is not as good as the previous year, when the direct measures were met for all 7 criteria. Indirect results were mixed, similar to last year; for the indirect measures not meeting targets, most were very close to the target of 3.75/5.0. Regarding criterion 2, two courses contributed data to the direct measure: CPSC 4360 in the fall (sample size 2) did not meet the performance criterion at 0%, while CPSC 4360 in the spring (sample size 11) did meet it at 81%. Discounting the fall course, which had a low sample size, the criterion would have been met (see the worked check below). Regarding criterion 5, COSC 3304 in spring was the only course that contributed data to the direct measure (sample size 16, 62.5%).

Actions, by criterion: 1.2 – Instructors will be made aware that this needs to be given more attention.

1.5 – We will remove COSC 3304, since debugging is not a significant topic in that course, and add COSC 2336 and COSC 2372, introducing debugging techniques in those two courses, since they are a good place for this topic.

Second Cycle Results: Last year we were concerned about the sample size of indirect measures for this outcome and agreed to bring this to the attention of instructors. That worked well: the sample size for indirect measures is up considerably. For example, COSC 1336 went from a sample size of 14 in 2012-2013 to 71 in 2013-2014, and CPSC 4340 went from a sample size of 6 in 2012-2013 to 11 in 2013-2014.
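
A worked check of the criterion 1.2 figure discussed above (the report does not state how the fall and spring offerings were combined, so a sample-size-weighted average of the two reported percentages is assumed here):

(2 × 0% + 11 × 81%) / (2 + 11) ≈ 8.9 / 13 ≈ 68.5%

Dropping the two-student fall section would leave only the spring offering at 81%, above the 80% target, which is consistent with the observation that the criterion would have been met without the fall course.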


Student Outcome 2.1: Computer Science Technology Skills – Discrete Mathematics and Structures

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Be able to develop software to support specific operations on frequently used discrete structures such as lists, trees, and graphs.

COSC 2336, COSC 4302, CPSC 3320

Code development on final exams

COSC 2336 Fall and Spring of each year

Dr. Kami Makki Size = 29 Percentage = 82 The target of 80% was Met

Be able to use elementary concepts of combinatorics, probability, and statistics to analyze and evaluate the efficiency of algorithms.

COSC 3304 Selected Questions on Midterm Exam in COSC 3304

COSC 3304 Spring of each year

Dr. L. Osborne Size = 16 Percentage = 8 The target of 80% was Not Met

Be able to use concepts of discrete mathematics, automata, and finite state machines to explain the design of computer hardware.

COSC 2336, COSC 2372, ELEN 3431, COSC 3302

Selected Questions on Final Exam in COSC 3302

COSC 3302 Spring of each year

Dr. Hikyoo Koh Size = 23 Percentage = 91.3 The target of 80% was Met


Date: August 18, 2014.

Results: Last year all direct measures were met. This year, the only performance criterion not meeting its direct measure was criterion 2 (8%), recorded in the spring COSC 3304 (sample size 16). Indirect results on student evaluations were generally lower than last year: last year all targets were met on the student evaluations, while this year some measures were not met, although most of those were very close to the target of 3.75/5.0.

Actions: Regarding criterion 2.1.2, which was not met, we will make COSC 2375 (Discrete Structures) a prerequisite for COSC 3304 and not allow students to take those courses in the same semester. This should better prepare students for these topics in COSC 3304.

Second Cycle Results: None.


Student Outcome 2.2: Computer Science Technology Skills – Analysis and Design of Algorithms

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Demonstrate basic understanding of asymptotic notations and time complexity.

COSC 2336 COSC 3304

Questions from Midterm Exam

COSC 3304 Spring each year

Dr. L. Osborne Size = 16 Percentage = 81 The target of 80% was Met

Design efficient algorithms and compare competing designs.

COSC 2336, COSC 3304, COSC 4360

Questions from Midterm Exam

COSC 3304 Spring each year

Dr. L. Osborne Size = 16 Percentage = 94 The target of 80% was Met

Demonstrate basic understanding of some design approaches such as greedy algorithms, dynamic programming and divide-and-conquer.

COSC 2336, COSC 3304

Questions from Midterm Exam

COSC 3304 Spring each year

Dr. L. Osborne Size = 16 Percentage = 75 The target of 80% was Not Met

Demonstrate familiarity with standard searching and sorting algorithms and linear and non-linear structures.

COSC 2336 COSC 3304

Questions from Midterm Exam

COSC 3304 Spring each year

Dr. L. Osborne Size = 16 Percentage = 68 The target of 80% was Not Met


Date: August 18, 2014.

Results: Last year all direct measures were met. This year, the only performance criteria not meeting direct measures were criterion 3 (75%) and criterion 4 (68%), both recorded in the spring COSC 3304 (sample size 16). We note that criterion 2 improved from last year (94% versus 83%). Four out of five questions on the student course evaluations did not meet targets, compared with one out of five last year; however, this year all measures were 3.5/5.0 or above, so these four were very close to the target of 3.75/5.0. On the ETS exam, the average algorithms score for last year was better than 7 of the past 17 years, which indicates that last year's ETS scores on the algorithms portion of the test were about normal for our institution.

Actions: Regarding criteria 2.2.3 and 2.2.4, which were not met, we will make COSC 2375 (Discrete Structures) a prerequisite for COSC 3304 and not allow students to take those courses in the same semester. This should better prepare students for these topics in COSC 3304.

Second Cycle Results: Last year we agreed to discuss successful methods for teaching COSC 3304 (the algorithms course) with prospective instructors, with the goal of seeing better results on the student evaluation question about whether students feel they have a firm theoretical understanding of algorithms. This year we picked a new instructor for the course, Dr. Osborne. We note that results are similar for both years.


Student Outcome 2.3: Computer Science Technology Skills – Formal Languages and Computability Theory

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Demonstrate basic knowledge of equivalences between various types of languages and corresponding accepting devices including Turing Machines.

COSC 3302 Exam questions

COSC 3302 Spring Semester Dr. Hikyoo Koh Size = 23 Percentage = 95.6 The target of 80% was Met

Demonstrate basic knowledge of practical applicability of various types of grammar and of some standard representation forms.

COSC 3302 Exam questions

COSC 3302 Spring Semester Dr. Hikyoo Koh Size = 23 Percentage = 91.3 The target of 80% was Met

Demonstrate knowledge of limitations of computational capability of computer grammars.

COSC 3308 COSC 3302

Exam questions

COSC 3302 Spring Semester Dr. Hikyoo Koh Size = 23 Percentage = 82.6 The target of 80% was Met

Demonstrate basic knowledge of equivalences and normal forms of logical formulas in propositional logic.

COSC 3308 COSC 3302

Exam questions

COSC 3302 Spring Semester Dr. Hikyoo Koh Size = 23 Percentage = 95.6 The target of 80% was Met

Demonstrate basic understanding and appreciation of the various essential programming language constructs, paradigms, evaluation criteria, and language implementation issues.

COSC 3308 Exam questions

COSC 3308 Fall Semester Dr. Andrei Size = 16 Percentage = 81.3 The target of 80% was Met

Demonstrate basic knowledge and skills in programming techniques with the focus on concepts and not on a particular language.

COSC 3308 Exam questions

COSC 3308 Fall Semester Dr. Andrei Size = 16 Percentage = 93.8 The target of 80% was Met

Date: August 18, 2014.

Results: The performance targets for the 6 criteria were met for the direct measures, with a sample size of 16; we had a similar result last year, when the sample size was 7. For the indirect measures, the scores for this outcome (3.58, 3.63, 3.63, with a sample size of 19) did not meet the targets; note that last year the same targets were met with a sample size of 7.

Actions: Since the targets for student evaluations were met last year but not this year, we will inform the instructor that the student evaluations were lower this year, to be kept in mind next year in COSC 3302.

Second Cycle Results: None.


Student Outcome 2.4: Computer Science Technology Skills – Operating Systems

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Knows the main components of an operating system and their purposes and modes of interaction.

COSC 4302

Exam Questions

COSC 4302 Fall and Spring Semesters

Dr. Bo Sun

Size = 9 Percentage = 77.7 The target of 80% was Not Met

Knows the structure of device drivers and the interaction between device drivers and operating systems.

COSC 4302

Exam Questions

COSC 4302 Fall and Spring Semesters

Dr. Bo Sun Size = 9 Percentage = 77.7 The target of 80% was Not Met

Outlines the basic issues in memory management design and virtual memory

COSC 4302

Exam Questions

COSC 4302 Fall and Spring Semesters

Dr. Bo Sun Size = 9 Percentage = 77.7 The target of 80% was Not Met

Can develop basic system applications based on operating system APIs.

COSC 4302 CPSC 3320

Exam Questions

COSC 4302 Fall and Spring Semesters

Dr. Bo Sun Size = 9 Percentage = 77.7 The target of 80% was Not Met


Date: August 18, 2014.

Results: The performance targets for all four criteria were not met for the direct measures. However, the performance targets of this outcome were met for the indirect measures, with a sample size of 54. As a comparison, last year (2012-2013) all direct and indirect measure targets were met.

Actions: Since this is the first time we have not met the targets for this outcome, since the indirect measure targets were met, and since we were only about 2% away from the direct measure targets this year, we will monitor this to see whether we need to take any actions next year.

Second Cycle Results: None.


Student Outcome 2.5: Computer Science Technology Skills – Database Design

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Demonstrate the application of Entity-Relational diagrams to model real world problems.

CPSC 4340 Exam Questions CPSC 4340 Fall Semester Dr. Kami Makki Size = 14 Percentage = 86 The target of 80% was Met

Design relations for real world problems including implementation of normal forms, keys, and semantics constraints for each relation.

CPSC 4340 CPSC 4360

Exam Questions CPSC 4340 Fall Semester Dr. Kami Makki Size = 14 Percentage = 86 The target of 80% was Met

Demonstrate competence in implementations of database applications.

CPSC 4340 Rubric for final project

CPSC 4340 Fall Semester Dr. Kami Makki Size = 14 Percentage = 100 The target of 80% was Met

Date: August 18, 2014. Results: Both the direct and indirect measure targets were met for this outcome. This is an improvement from last year when indirect measure targets for this outcome were not met.


Actions: Since direct measure targets were met for all performance criteria and the sample size was small, no actions will be taken other than notifying the instructor of the student evaluation results (the evaluations did not meet the targets for indirect measures).

Second Cycle Results: Last year we notified instructors that indirect measures (student evaluations) were not met for this outcome. Since we see an improvement in indirect measures this year, we conclude that informing instructors of the problem was the correct course of action.


Student Outcome 2.6: Computer Science Technology Skills – Computer Networks

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Employ the socket API to program applications among independent hosts.

CPSC 3320 Exam Questions

CPSC 3320 Fall Semester Dr. Bo Sun Size = 11 Percentage = 82 The target of 80% was Met

Explain common network architectures, the services provided by each layer, and the protocols required for connecting peer layers.

CPSC 3320 Exam Questions

CPSC 3320 Fall Semester Dr. Bo Sun Size = 11 Percentage = 82 The target of 80% was Met

Evaluate network models through simulation and the use of common performance metrics for networks.

CPSC 3320 Project CPSC 3320 Fall Semester Dr. Bo Sun Size = 11 Percentage = 82 The target of 80% was Met

Date: August 18, 2014.

Results: The results are all satisfactory in 2013-2014. The direct measure results met our target for all of the performance criteria of Student Outcome 2.6 (82%, 82%, and 82% with a sample size of 11). The indirect measure results based on the Student Course Evaluation for CPSC 3320, questions 28, 20, 28, 29, and 40, also met our target in all criteria (4.36, 4.09, 3.91, 3.91, and 4.09 with a sample size of 11).


Actions: None. Second Cycle Results: None.


Student Outcome 2.7: Computer Science Technology Skills – Computer Organization and Architecture

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Understands modern ISA design principles and employs them to evaluate systems

COSC 2372, ELEN 3431, COSC 4310

Local Exam Question

COSC 4310 Spring and Fall semesters

Dr. Jiangjiang Liu

Size = 6 Percentage = 100 The target of 80% was Met

Know how to measure performance for different computer architectures

COSC 4310 Local Exam Question

COSC 4310 Spring and Fall semesters

Dr. Jiangjiang Liu

Size = 6 Percentage = 83 The target of 80% was Met

Demonstrate knowledge of hardware implementation of numbers and arithmetic operations

COSC 2372, COSC 4310

Local Exam Question

COSC 4310 Spring and Fall semesters

Dr. Jiangjiang Liu

Size = 6 Percentage = 100 The target of 80% was Met

Date: August 18, 2014.

Results: The results are all satisfactory in 2013-2014, except for the indirect measure results based on the Student Course Evaluation for COSC 2372.


The direct measure results met our target for all of the performance criteria of Student Outcome 2.7 (100%, 83%, and 100% with a sample size of 6). The indirect measure results based on the Student Course Evaluation met our target for COSC 4310 questions 35, 38, and 40 (4.20, 4.40, and 4.20 with a sample size of 6), while COSC 2372 questions 27, 31, 35, and 40 were very close to our target of 3.75 (3.35, 3.60, 3.60, and 3.58 with sample sizes of 26, 26, 25, and 24).

Actions: The department head will bring the low student evaluations to the attention of the COSC 2372 instructors, since the indirect measure (student evaluation) targets for that course were not met.

Second Cycle Results: None.


Student Outcome 3: Scientific Method**

**Graduates will be able to gather requirements, analyze, design and conduct simulations or other computer experiments in order to evaluate and interpret the data.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Be able to justify why selected research methods were chosen and state the intended outcomes of the study.

COSC 2336, CPSC 3320, COSC 4310

Rubric and Project

CPSC 3320 and COSC 4310

Spring and Fall of every year

Dr. Jiangjiang Liu and Dr. Bo Sun

Size = 17 Percentage = 88.35 The target of 80% was Met

Identify steps used in a particular study.

COSC 2336, CPSC 3320, COSC 4310

Rubric and Project

CPSC 3320 and COSC 4310

Spring and Fall of every year

Dr. Jiangjiang Liu and Dr. Bo Sun

Size = 17 Percentage = 88.35 The target of 80% was Met

Be able to outline and explain the key features of the adopted method.

COSC 2336, CPSC 3320, COSC 4310

Rubric and Project

CPSC 3320 and COSC 4310

Spring and Fall of every year

Dr. Jiangjiang Liu and Dr. Bo Sun

Size = 17 Percentage = 88.35 The target of 80% was Met


Analyze and interpret collected data based on the adopted method and draw appropriate conclusions.

COSC 2336, CPSC 3320, COSC 4310

Rubric and Project

CPSC 3320 and COSC 4310

Spring and Fall of every year

Dr. Jiangjiang Liu and Dr. Bo Sun

Size = 17 Percentage = 88.35 The target of 80% was Met

Date: August 18, 2014.

Results: The results are all satisfactory in 2013-2014 except for Exit Interview questions 3 and 7 and some ETS assessment indicators in fall 2013 and spring 2014. The direct measure results met our target for all of the performance criteria of Student Outcome 3 (88.35%, 88.35%, 88.35%, and 88.35% with a sample size of 17). The indirect measure results based on the Student Course Evaluation met our target in all criteria: COSC 2336 questions 37, 38, and 40 (4.04, 3.86, and 4.08 with a sample size of 22), COSC 3320 questions 37, 38, and 40 (4.62, 4.69, and 4.62 with a sample size of 13), and COSC 4310 questions 35, 38, and 40 (4.20, 4.40, and 4.20 with a sample size of 5). The indirect measure results based on Exit Interview questions 4 and 6 met our target (4.00 and 4.45 with a sample size of 11), while questions 3 and 7 did not meet our target (3.63 and 3.45 with a sample size of 11).

Exit Interview Question 3: Your education provided a firm theoretical foundation so that you were prepared for future scientific advances. Exit Interview Question 7: Your education offered the preparation necessary to design and conduct simulations or other experiments and analyze and interpret data.

The indirect measure results based on Alumni Survey questions 3, 4, 6, and 7 met our target in all criteria (4.00, 4.67, 4.33, and 4.33 with sample size of 3).


The indirect measure results based on ETS scores met our target, with an overall average of 152 and a sample size of 16. The results for Programming, Computer Organization, and Algorithms and Theory are very close to our target of 50 (46.5, 46.5, and 44.5 with a sample size of 16). The department made an improvement in COSC 4310: the target for Student Evaluation question 35 was not met in 2012-2013 (average 3.71, sample size 7) but was met in 2013-2014 (average 4.20, sample size 5).

Actions: None. Second Cycle Results: None.


Student Outcome 4: Societal Awareness**

**Graduates will be aware of and understand the impact of computer technology on society at large, on the workplace environment, and on individuals.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Demonstrate understanding of evolving computer technology applications.

COSC 1172, COSC 3325

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 10 Percentage = 91% The target of 80% was Met.

Demonstrate knowledge of positive social impacts including information globalization, E-Commerce, E-learning and new job creation.

COSC 1172, COSC 3325, CPSC 4340, CPSC 3320

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 10 Percentage = 91% The target of 80% was Met.

Demonstrate knowledge of negative social impacts including internet pornography, privacy violation, health hazards, computer crimes and dehumanization.

COSC 1172, COSC 3325, CPSC 4340, CPSC 3320, ELEN 3431

Exam Questions

COSC 3325, CPSC 3320

Fall and Spring each year

Dr. Stefan Andrei, Dr. Bo Sun

Size = 21 Percentage = 95.71% The target of 80% was Met.

Demonstrate basic understanding of intellectual property protection via copyright and patent law and fair use exception for copyrighted software.

COSC 1172, COSC 3325, CPSC 4340, CPSC 4360

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 21 Percentage = 86.28% The target of 80% was Met.


Date: August 18, 2014.

Results: The results met our targets in all of the performance criteria; we reached our goal of 80%. We believe it was a good idea to implement these performance criteria in courses besides COSC 3325, such as COSC 1172, CPSC 4340, and CPSC 4360, and we intend to continue doing that in 2014-2015. As for the indirect measures, the only course where the target of 3.75 was not met is COSC 1172. We believe that large classes tend to get lower scores, and this class had 78 students; it is also an online class, and, being a freshman class, its students are not yet familiar with the topics. We are pleased to see the improvement from the last cycle (3.43) to the current score (3.67); no actions are needed, as the score of 3.67 was very close to the target of 3.75. As for the exit interviews, the scores for Questions 5 and 9 (3.54 and 3.72) were close to the target of 3.75. We will monitor COSC 3325 to see why there was such a large difference between 2.75 and 4.00; there were only 4 students in Fall 2013 and 7 students in Spring 2014.

Actions: We will monitor COSC 3325 to ensure that instruction in that class is consistent, since students expressed two different views on the exit surveys.

Second Cycle Results: None.


Student Outcome 5: Ethical Standards**

**Graduates will be able to recognize and understand the importance of ethical standards as well as their own responsibilities with respect to the computer profession.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Know the differences of various philosophical views on ethics such as deontology, utilitarianism, egoism, and relativism.

COSC 3325

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 10 Percentage = 91% The target of 80% was Met.

Understand the ACM or a similar professional body’s code of ethics and principles underlying those ethics.

COSC 3325, CPSC 4360

Exam Questions

CPSC 4360 Fall and Spring of each year

Dr. Stefan Andrei, Dr. Peggy Doerschuk

Size = 13 Percentage = 84.76% The target of 80% was Met.

Honor the property rights of others including copyrights and patents.

COSC 1172, COSC 3325, CPSC 4360

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 10 Percentage = 91% The target of 80% was Met.

Demonstrate ability for ethical decision making within the computer profession.

COSC 1172, COSC 3325, CPSC 3320, CPSC 4360

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 10 Percentage = 91% The target of 80% was Met.

Demonstrate knowledge of factors affecting fair resolution of conflicts of interests.

COSC 1172, COSC 3325, CPSC 4360

Exam Questions

COSC 3325 Spring each year

Dr. Stefan Andrei

Size = 10 Percentage = 91% The target of 80% was Met.

Date: August 18, 2014.

Results: The results met our targets in all of the performance criteria; we reached our goal of 80%. We believe it was a good idea to implement these performance criteria in courses besides COSC 3325, such as COSC 1172, CPSC 3320, and CPSC 4360, and we intend to continue doing that in 2014-2015. As for the indirect measures, the student evaluations concur with the direct measures; that is, the target was met. As for the exit interviews, the score for Question 9 (3.72) was very close to the target of 3.75, as was the score of 3.67.

Actions: We will monitor COSC 3325 to ensure that instruction in that class is consistent, since students expressed two different views on the exit surveys.

Second Cycle Results: None.


Student Outcome 6: Collaborative Work Skills**

**Graduates will demonstrate the ability to work effectively in teams to conduct technical work through the exercise of interpersonal communication skills.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Demonstrate the ability to work in heterogeneous environments which are diverse in gender, ethnicity, and academic accomplishment.

COSC 1172, CPSC 4360, CPSC 4340, COSC 4302

Rubrics CPSC 4340, CPSC 4360

Fall and Spring Semesters

Dr. Andrei, Dr. Makki, Dr. Doerschuk

Size = 27 Percentage = 91.85 The target of 80% was Met

Attend team meetings and contribute towards solution of technical problems during the meetings.

COSC 1172, CPSC 4360, CPSC 4340, COSC 4302

Rubrics CPSC 4340, CPSC 4360

Fall and Spring Semesters

Dr. Andrei, Dr. Makki, Dr. Doerschuk

Size = 27 Percentage = 91.85 The target of 80% was Met

Make appropriate contributions within their skill set to the completion of the project.

COSC 1172, CPSC 4360, CPSC 4340, COSC 4302

Rubrics CPSC 4340, CPSC 4360

Fall and Spring Semesters

Dr. Andrei, Dr. Makki, Dr. Doerschuk

Size = 27 Percentage = 91.85 The target of 80% was Met

Demonstrate a sense of interdependence with other team members.

COSC 1172, CPSC 4360, CPSC 4340, COSC 4302

Rubrics CPSC 4340, CPSC 4360

Fall and Spring Semesters

Dr. Andrei, Dr. Makki, Dr. Doerschuk

Size = 27 Percentage = 91.85 The target of 80% was Met

Date: August 18, 2014.

Results: All targets for the direct measures were met in 2013-2014. In COSC 4302, CPSC 4340, and CPSC 4360, the scores on the student evaluation questions related to teamwork were very high. As usual, in the freshman online course "Thinking, Speaking and Writing" the scores were below our targets; however, teamwork is hard for freshmen, who are often not adept at online collaboration and are unaccustomed to working in groups. On the Exit Interview the scores were above the target average of 3.75. This was an improvement over last year, when the score on question 8, which is relevant to this outcome, was 3.63; this year the score on the same question was 3.99 with a sample size of 11. The alumni survey scores matched expectations for questions 4, 7, 8, 11, 13, and 14, although the sample size was only 3.

Actions: Remove questions 25 and 26 from COSC 1172. Remove COSC 1172 from the strategies for outcome 6, because freshmen have enough to do already and we do not want to overload them in their first semester.

Second Cycle Results: None.


Student Outcome 7: Oral Communications**

**Graduates will demonstrate their ability to verbally communicate clearly.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Demonstrate the ability to communicate in a given situation.

COSC 3325, COSC 4172, COSC 1172

Rubrics COSC 3325, COSC 4172

Fall and Spring Semesters

Dr. Stefan Andrei, Dr. Lawrence Osborne

Size = 20 Percentage = 97.1 The target of 80% was Met

Demonstrate the ability to comprehend what is said and to show an appreciation of the importance of listening.

COSC 3325, COSC 4172, COSC 1172

Rubrics COSC 3325, COSC 4172

Fall and Spring Semesters

Dr. Stefan Andrei, Dr. Lawrence Osborne

Size = 25 Percentage = 91.88 The target of 80% was Met

Communicate clearly at the level of the audience the technical material intrinsic to the discipline of computer science.

COSC 3325, COSC 4172, COSC 1172

Rubrics COSC 3325, COSC 4172

Fall and Spring Semesters

Dr. Stefan Andrei, Dr. Lawrence Osborne

Size = 25 Percentage = 91.88 The target of 80% was Met

Demonstrate knowledge of the communication process.

COSC 3325, COSC 4172, COSC 1172

Rubrics COSC 3325, COSC 4172, CPSC 4360

Fall and Spring Semesters

Dr. Stefan Andrei, Dr. Lawrence Osborne

Size = 25 Percentage = 91.88 The target of 80% was Met

Date: August 18, 2014.

Results: All the targets for direct measures were met easily; the scores were over 90%. This is a significant improvement from last year, when criterion 4 was not met: the score for that criterion was 75.7% last year and 91.88% this year. The sample size for all criteria was at least 20. The student evaluation scores showed that student responses to question 34 were worse than last year. Question 34 on the online student evaluations, which is supposed to read "Students had the opportunity to communicate design and implementation concepts to professionals and non-professionals," produced a score of 3.44 last year (sample size 9) and a score of only 2.83 this year (sample size 6). This question was written incorrectly on the evaluation, so students probably did not understand it; the chair will contact someone to correct this. Graduating seniors, however, gave scores on the Exit Interview questions dealing with oral communications (8, 13) that met our target of 3.75: the scores on questions 8 and 15 were 3.99 and 3.81, respectively. The three alumni surveys showed satisfaction with the computer science department's training in oral communications.

Actions: None.

Second Cycle Results: We agreed last year to conduct a review of methods for giving an effective presentation in COSC 4172. The instructor did give two lectures on "How to Present Lectures on Technical Subjects."


Student Outcome 8: Written Communication Skills**

**Graduates will demonstrate their ability to write effectively both technical and non-technical materials with appropriate multimedia aids.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Provide an introduction that grabs the attention of readers.

COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302

Rubrics CPSC 4360, COSC 4302

Fall and Spring Semesters

Dr. Sun, Dr. Andrei, Dr. Doerschuk

Size = 22 Percentage = 95.45 The target of 80% was Met

Organize documents in terms of a few main points or themes.

COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302

Rubrics CPSC 4360, COSC 4302

Fall and Spring Semesters

Dr. Sun, Dr. Andrei, Dr. Doerschuk

Size = 22 Percentage = 95.45 The target of 80% was Met

Choose appropriate illustrations, examples, or evidence to support the written documents.

COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302

Rubrics CPSC 4360, COSC 4302

Fall and Spring Semesters

Dr. Sun, Dr. Andrei, Dr. Doerschuk

Size = 22 Percentage = 95.45 The target of 80% was Met

Write appropriately for specified readers in terms of technical content.

COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302

Rubrics

CPSC 4360, COSC 4302

Fall and Spring Semesters

Dr. Sun, Dr. Andrei, Dr. Doerschuk

Size =22 Percentage = 95.45 The target of 80% was Met

Write organized, grammatically correct reports.

COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302

Rubrics CPSC 4360, COSC 4302

Fall and Spring Semesters

Dr. Sun, Dr. Andrei, Dr. Doerschuk

Size = 25 Percentage = 95.45 The target of 80% was Met


Date: August 18, 2014.

Results: All direct measure targets were met, and indirect measure targets were met except for COSC 1172. This was the seventh consecutive year.

Actions: None.

Second Cycle Results: None.


Student Outcome 9: Continuing Education and Lifelong Learning**

**Graduates will demonstrate that they can independently acquire new computing-related skills and knowledge in order to pursue either further formal or informal learning after graduation.

Indirect Assessment Methods: Student Evaluation, Exit Interview, Alumni Survey, ETS Scores

Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Direct Results

Be able to search scholarly publications to assist in resolving problems.

COSC 3325, COSC 4172, COSC 4302, CPSC 4360

Rubrics COSC 3325 and COSC 4172

Fall and Spring Dr. Osborne and Dr. Andrei

Size = 26 Percentage = 84.12 The target of 80% was Met

Intend to engage in additional formal education or participate in employer-related training or research projects.

COSC 4172 Rubrics COSC 4172 Fall and Spring Dr. Osborne Size = 7 Percentage = 100 The target of 80% was Met.

Independent study. Participate in the Honors program or in undergraduate research at Lamar. This could be done through the STAIRSTEP Program, presentations or posters at professional conferences, or co-op or internship position reports. Students could own a software design and development company.

COSC 4172 Rubrics COSC 4172 Fall and Spring Dr. Osborne Size = 8 Percentage = 88 The target of 80% was Met.

Date: August 18, 2014.

Results: The results met our targets in all of the performance criteria; we reached our goal of 80%. We believe it was a good idea to introduce the first performance criterion, "Be able to search scholarly publications to assist in resolving problems," in courses besides COSC 3325 and COSC 4172, such as COSC 4375, COSC 4302, and CPSC 4360, and we intend to continue doing that in 2014-2015. As for the indirect measures, the student evaluations concur with the direct measures; that is, the target was met. In addition, the Assessment Committee approved removing Questions 37 and 40 of the Student Evaluations from the assessment of Student Outcome 9; the rationale is that they no longer refer to this outcome. Also, question 34 of the Student Evaluations should read "Communicate design and implementation concepts to diverse audiences." instead of the current "Communicate design and implementation concepts to profs. and non-profs." As for the exit interviews, the scores for Questions 1, 10, and 11 (4.26, 4.08, and 3.81) were all above the target of 3.75, so the target was met. As for the exit surveys, question 11 met the target with 4.4, well above the target of 3.75. However, question 9 ("Independent study or research opportunities are satisfactory") received a score of 3.68, slightly below the target of 3.75. We believe that the newly hired Assistant Professor, Dr. Jing Zhang, will be able to bring fresh topics helpful for student research opportunities. In addition, Dr. Bo Sun received a new NSF grant in collaboration with a faculty member from the Department of Civil Engineering, and Dr. Doerschuk's STAIRSTEP was approved for a two-year renewal, so we hope these developments will bring new student research opportunities.

Actions: The Assessment Committee noticed that question 34 on the student evaluations was incorrectly worded, probably due to a clerical error after the department submitted it; we will make the necessary corrections. Also, we will remove question 37 from COSC 4172/4272, as it is not relevant for those courses.

Second Cycle Results: No changes were made to the rubric used to assess outcome 9.
