

Validation of the Instrument

REPORT ON THE VALIDATION OF:

THE STUDENT EVALUATION OF INSTRUCTION INSTRUMENT

Submitted to

The Institutional Affairs Team

By

The Validity Task Force

Members:

C. Hewitt-Gervais, College of Education

T. Bevins, College of Health Professions

Volety, College of Arts and Sciences

G. Mayfield, School of Social and Public Policy

April 17, 2000


Validity is defined as the usefulness of inferences drawn from test scores for a given purpose under a prescribed set of conditions. Assessing validity answers the question: to what extent will interpretations of the scores be appropriate, meaningful, and useful for the intended application of the results? Evidence is sought to demonstrate whether the scores reflect what the instrument is intended to measure. In this case, do scores from the instrument reflect students' perceptions of instructional effectiveness? This is of primary concern to the test developer, the test taker, and the person(s) who will make decisions based on results from the instrument.

Validation is the process by which a psychometrician or test user collects evidence to support the types of inferences that are to be drawn from test scores. The process of validation involves: (1) identifying the purpose for which the test scores will be used, (2) identifying behaviors that define the domain, (3) preparing a set of test specifications, (4) constructing an initial pool of items, (5) having the items reviewed, (6) holding preliminary item tryouts, and (7) field-testing items on a large sample representative of the proposed examinee population. Steps three through six are not relevant in this case, as Florida Gulf Coast University has decided to use an intact survey. Therefore, only steps one, two, and seven are addressed in this report.

Evaluation of instructors' ability to teach effectively was identified as the primary purpose for which scores from this instrument will be used. It is hoped that the results will be used formatively and will inform staff development. A secondary purpose might be to evaluate the course independent of the instructor.

A content analysis, a review of prior research, a selection of critical incidents, direct observation, expert judgment, and/or written instructional objectives can assist in identifying the behaviors that represent the construct or define the domain. In this case, an intact survey was proposed for use. Behaviors and indicators were identified by the Validity Task Force, which then mapped the survey items onto them. The following is a brief outline of the areas identified.

Areas to be covered by student evaluations of instruction:

DEMOGRAPHICS

  1. Semester or class rank
  2. Preparedness for course

TEACHER EFFECTIVENESS

  1. Technical Skills (effective use of time, pace, materials - includes technology, meets objectives, appropriate level of instruction, stays on topic, objectives are clearly stated, syllabus is available, syllabus is clear, uses a variety of instructional techniques)
  2. Interpersonal Skills (accessibility, keeps office hours, answers e-mail, answers phone & phone messages, shows respect for students, assessment is fair, appropriate form of assessment, establishes rapport)
  3. Communication Skills (clarity of presentation, effectiveness of feedback to students)

STUDENT EXPECTATIONS / BEHAVIOR

  1. Tried hard
  2. Came to class prepared
  3. Learned a lot
  4. Felt content and skills learned in the course were useful
  5. Felt course objectives fit into program goals (relevancy)

OTHER

  1. Suggestions for improving the course
  2. Recommend the course to others
  3. Take another course from this same instructor

It was found that there were a total of three demographic questions, 17 teacher effectiveness questions, five student expectations/behaviors questions, and one question regarding recommending the course to others. In addition, there were five questions that appeared to be specific to the course and independent of the instructor. The judgment of the task force was that all areas identified were adequately represented by the intact survey.

Field-testing the items on a large sample representative of the examinee population for whom the test is intended is the seventh step in a validation study. A minimum sample size of 200, or 5 to 10 subjects per item, is required. Once the items have been administered, they are examined for their contribution to the instrument as a whole. The reliability of the instrument is estimated using Cronbach's alpha.
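For illustration, Cronbach's alpha compares the sum of the individual item variances with the variance of the total score: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score), where k is the number of items. The short Python sketch below shows the calculation; the response matrix is hypothetical stand-in data, not the FGCU survey records.

    import numpy as np

    def cronbach_alpha(responses):
        """Cronbach's alpha for a (respondents x items) response matrix."""
        k = responses.shape[1]                         # number of items
        item_vars = responses.var(axis=0, ddof=1)      # variance of each item
        total_var = responses.sum(axis=1).var(ddof=1)  # variance of the summed score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical data: 200 respondents (the minimum sample size noted
    # above) answering 20 five-point items. Independent random answers
    # yield an alpha near zero; correlated items drive it toward 1.
    rng = np.random.default_rng(0)
    responses = rng.integers(1, 6, size=(200, 20)).astype(float)
    print(round(cronbach_alpha(responses), 2))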

The results of the Fall 1999 administration were used to examine the items and to estimate the reliability. There were over 5,000 records, a ratio of roughly 121 respondents per item for the 41 items overall and 185 respondents per item for the 27 items analyzed. Results from the factor analysis indicated a two-factor solution for the 27-item survey, explaining 50% of the variance. The factors identified were (1) teacher effectiveness and (2) student expectations/behaviors. The items initially identified as course related loaded on the teacher effectiveness factor, an indication that students were not separating the course from the instructor in their evaluations. The factors were correlated (.38). The initial reliability estimate was .95 for the teacher effectiveness scale and .38 for the student expectations/behaviors scale. After the removal of one item, "This course was very easy for me," the estimated reliability of the student expectations/behaviors scale increased to .59. Reliability estimates above .80 are generally considered acceptable for a survey of this kind.
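A two-factor solution with an oblique rotation (one that allows the factors to correlate, as they did here) can be obtained along the lines of the sketch below. It assumes the third-party Python package factor_analyzer, and item_scores is a hypothetical placeholder for the Fall 1999 response matrix.

    import numpy as np
    from factor_analyzer import FactorAnalyzer  # assumed: pip install factor-analyzer

    # Placeholder for the (respondents x 27) matrix of survey responses.
    rng = np.random.default_rng(1)
    item_scores = rng.integers(1, 6, size=(5000, 27)).astype(float)

    # An oblimin rotation is oblique, so the extracted factors are free to
    # correlate, consistent with the .38 factor correlation reported above.
    fa = FactorAnalyzer(n_factors=2, rotation="oblimin")
    fa.fit(item_scores)

    loadings = fa.loadings_  # 27 x 2 pattern matrix: which items load on which factor
    _, prop_var, cum_var = fa.get_factor_variance()
    print("cumulative variance explained:", cum_var)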

The examination of the point-biserial correlations and of each item's contribution to the overall reliability of its factor did not indicate any weak items. In an effort to reduce the overall number of items on the survey, the Validity Task Force reviewed the items for redundancy, and 10 of the 27 were targeted for removal on that basis. In addition, it was recommended that all five items on the student expectations/behaviors factor be removed due to unacceptable reliability estimates. This left 12 of the 27 items; with the eight SUSSAI questions added, the revised survey contains 20 questions with an estimated reliability of .96 (see Appendix A).
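This screening step can be approximated with corrected item-total correlations (each item correlated with the sum of the remaining items, a common survey-data analogue of the point-biserial index) together with an alpha-if-item-deleted check, as in the sketch below. It reuses the cronbach_alpha function from the earlier example; an item is flagged as weak when its item-total correlation is low or when deleting it raises alpha.

    import numpy as np

    def item_analysis(responses):
        """Corrected item-total correlation and alpha-if-deleted per item."""
        for i in range(responses.shape[1]):
            rest = np.delete(responses, i, axis=1)  # all items except item i
            rest_total = rest.sum(axis=1)           # scale score minus item i
            r_it = np.corrcoef(responses[:, i], rest_total)[0, 1]
            alpha_wo = cronbach_alpha(rest)         # defined in the earlier sketch
            print(f"item {i + 1:2d}: item-total r = {r_it:+.2f}, "
                  f"alpha if deleted = {alpha_wo:.2f}")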

The completion of these steps is intended to create a greater level of confidence in those who complete the instrument and in those who will use the results to make decisions. This process helps ensure a measure of teaching effectiveness that is both valid and reliable.

Recommendations

It is the recommendation of the IAT that the revised, twenty-item student evaluation form be adopted for use at FGCU. In addition, the IAT recommends that the evaluation document and process be reviewed every three years for continued relevance and usability. This review is expected to include re-evaluation of the validity and reliability of the instrument and a search for additional high-quality items to assist in assessing the following areas: (1) student attitudes, behaviors, and expectations; (2) distance learning; (3) team teaching and other innovative instructional practices; and (4) course content as separate from the instructor.


APPENDIX A

Student Evaluation of Instruction

Item numbers refer to the original survey; the number in brackets after a retained item gives its position on the revised 20-item survey. Items marked "(omit)" are recommended for removal; items marked "(SUSSAI)" are the eight SUSSAI questions.

STUDENT: (Cronbach alpha = .33; .60 without the first item)

- This course was very easy for me. (omit)
- This course is challenging. (omit)
- 21. This course is stimulating. (omit)
- 33. Stimulation of interest in the course. (SUSSAI) [1]
- I studied hard and put great effort into this course. (omit)
- 17. I was always fully prepared for each class. (omit)

COURSE:

- Tests/assignments require problem solving and/or creative thought. [2]
- 12. The assignments helped me understand the subject. [3]
- 19. Evaluation of assignments/exams is returned in a reasonable period of time. [4]
- I have learned a great deal about the subject. [5]
- The content of tests/assessments is representative of assigned material. [6]

INSTRUCTOR:

- The instructor explains ideas clearly. (omit)
- 6. The instructor presents material in a manner that is easy to understand. (omit)
- 7. The instructor speaks audibly/clearly. (omit)
- 29. Communication of ideas and information. (SUSSAI) [7]
- 5. The instructor uses a variety of instructional materials/methods in the course. [8]
- 9. The instructor conducts class in an organized way. [9]
- 10. The instructor treats students fairly. (omit)
- 18. The instructor demonstrates an attitude of respect for students as persons. (omit)
- 32. Respect and concern for students. (SUSSAI) [10]
- 11. The instructor is careful/precise in answering questions. [11]
- 13. The instructor displays self-confidence in his/her presentation. [12]
- 14. The instructor is well prepared. [13]
- 15. The instructor clearly states what is expected of students. (omit)
- 24. The instructor explains his/her grading system. (omit)
- 28. Description of course objectives and assignments. (SUSSAI) [14]
- 16. I will recommend this instructor to others. [15]
- The instructor encourages student participation/involvement. [16]
- 22. The instructor is willing to give individual assistance. (omit)
- 31. Availability to assist students in or out of class. (SUSSAI) [17]
- 26. The instructor seems to enjoy teaching. (omit)
- 27. In general, the instructor is an effective teacher. (omit)
- 35. Overall assessment of instructor. (SUSSAI) [18]
- 30. Expression of expectations for performance in this class. (SUSSAI) [19]
- 34. Facilitation of learning. (SUSSAI) [20]

Estimates of internal consistency (Cronbach alpha)

35 Items = .97

27 Items = .95

8 SUSSAI Items = .95

Suggested set of 20 items = .96