Formal assessments sometimes also include constructed response items, in which students must recall information and construct an answer rather than merely recognize whether an answer is correct. Constructed response items can be used to assess a wide variety of kinds of knowledge, but here we discuss just two major kinds: completion (also called short answer or short response) and extended response.
Completion and Short Answer
Completion and short answer items are those that can be answered in a word, phrase, number, or symbol. These items essentially vary only in whether the problem is presented as a statement or a question. For example:
- Completion: The first traffic light in the US was invented by…………….
- Short Answer: Who invented the first traffic light in the U.S.?
Completion or short answer items are often used in mathematics tests. For example:
- 3 + 10 = …………..?
- If x = 6, what does x(x-1) =……….
- Draw the line of symmetry on the following shape...
A major advantage of such items is that they are easy to construct. However, apart from their use in mathematics, they are unsuitable for measuring complex learning outcomes and are often difficult to score. Completion and short answer tests are sometimes called objective because they are intended to have only one correct answer, thereby reducing variability in scoring. Unless the question is phrased very carefully, however, there are often several correct answers. For example, consider this item:
- Where was President Lincoln born?....................
The teacher may expect the answer “In a log cabin,” but “on Sinking Spring Farm,” “in Hardin County,” and “in Kentucky” are also correct. Common errors in these items are summarized in Table 10-4.
Extended Response

Extended response items are used in many content areas, and answers may vary in length from a paragraph to several pages. Questions that require longer responses are often called essay questions. Extended response items have several advantages, the most important of which is their adaptability for measuring complex learning outcomes. Because these items require students to write, they also give teachers a way to assess writing skills. A commonly cited advantage of these items is their ease of construction; however, carefully worded items that are tied to learning outcomes and assess complex learning are hard to devise. Well-constructed items phrase the question so that the student's task is clear. Often this involves providing hints or planning notes. In the first example below the actual question is clear not only because of the wording but because of the format (i.e., it is placed in a box). In the second and third examples planning notes are provided:
- Example 1: Grade 3 Mathematics
- The owner of a bookstore gave 14 books to the school. The principal will give an equal number of books to each of three classrooms and the remaining books to the school library. How many books are left?
- Show all your work on the space below and on the next page. Explain in words how you found the answer. Tell why you took the steps you did to solve the problem.
- Example 2: 5th grade science: The Grass is Always Greener
- Jose and Maria noticed three different types of soil, black soil, sand, and clay, were found in their neighborhood. They decided to investigate the question, “How does the type of soil (black soil, sand, and clay) under grass sod affect the height of grass?”
- Plan an investigation that could answer their new question. In your plan, be sure to include:
- Prediction of the outcome of the investigation
- Materials needed to do the investigation
- Procedure that includes:
- logical steps to do the investigation
- one variable kept the same (controlled)
- one variable changed (manipulated)
- any variables being measured and recorded
- how often measurements are taken and recorded
- Example 3: Grades 9-11 English: Writing Prompt
- Some people think that schools should teach students how to cook. Other people think that cooking is something that ought to be taught in the home. What do you think? Explain why you think as you do.
- Planning notes: Choose one:
- I think schools should teach students how to cook
- I think cooking should be taught in the home
- I think cooking should be taught in ……………………………..because………
A major disadvantage of extended response items is the difficulty of scoring them reliably. Not only do different teachers score the same response differently, but the same teacher may score an identical response differently on different occasions. A variety of steps can be taken to improve the reliability and validity of scoring:
- Teachers should begin by writing an outline of a model answer. This helps make it clear what students are expected to include.
- A sample of the answers should be read. This assists in determining what the students can do and if there are any common misconceptions arising from the question.
- Teachers have to decide what to do about irrelevant information that is included (e.g., is it ignored or are students penalized) and how to evaluate mechanical errors such as grammar and spelling.
- A point scoring guide or a scoring rubric should be used. In point scoring, components of the answer are assigned points. For example, if students were asked:
- What are the nature, symptoms, and risk factors of hyperthermia?
- Point Scoring Guide:
- Definition (nature) 2 pts
- Symptoms (1 pt for each) 5 pts
- Risk Factors (1 pt for each) 5 pts
- Writing 3 pts
This system provides some guidance for evaluation and helps consistency, but point scoring often leads the teacher to focus on facts (e.g., naming risk factors) rather than higher level thinking, which may undermine the validity of the assessment if the teacher's purposes include higher level thinking. A better approach is to use a scoring rubric that describes the quality of the answer or performance at each level.
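To make the mechanics of point scoring concrete, here is a minimal, hypothetical sketch (the component names, function, and sample scores are illustrative, not from the original text): each component carries a maximum number of points, and a student's total is the sum of points awarded, capped at each component's maximum.

```python
# Hypothetical sketch of the point scoring guide above.
# Component maxima follow the guide; everything else is illustrative.

POINT_GUIDE = {
    "definition": 2,     # Definition (nature): 2 pts
    "symptoms": 5,       # 1 pt per symptom, up to 5
    "risk_factors": 5,   # 1 pt per risk factor, up to 5
    "writing": 3,        # Writing: 3 pts
}

def score(awarded: dict) -> int:
    """Sum awarded points, capping each component at its maximum."""
    return sum(min(awarded.get(part, 0), maximum)
               for part, maximum in POINT_GUIDE.items())

# A student who defines the term, lists 4 symptoms and 2 risk factors,
# and writes reasonably well:
total = score({"definition": 2, "symptoms": 4, "risk_factors": 2, "writing": 2})
print(total)  # 10, out of a possible 15
```

Note that the cap in `score` mirrors how a teacher applies such a guide by hand: listing seven symptoms still earns at most the five points allotted to that component.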
Scoring rubrics can be holistic or analytical. In holistic scoring rubrics, general descriptions of performance are made and a single overall score is obtained. An example from grade 2 Language Arts in Los Angeles Unified School District classifies responses into four levels: not proficient, partially proficient, proficient, and advanced (see Table 10-5).
Analytical rubrics provide descriptions of levels of student performance on a variety of characteristics. For example, six characteristics used for assessing writing developed by the NorthWest Regional Education Laboratory (NWREL) are:
- Ideas and Content
- Organization
- Voice
- Word Choice
- Sentence Fluency
- Conventions
Holistic rubrics have the advantage that they can be developed more quickly than analytical rubrics. They are also faster to use, as there is only one dimension to examine. However, they do not give students feedback about which aspects of a response are strong and which need improvement, which makes them less useful for assessment for learning. An important use of rubrics is as teaching tools: providing them to students before the assessment lets them know what knowledge and skills are expected.
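The holistic/analytical distinction can be sketched in code (a hypothetical illustration, not from the original text; the level labels follow the four levels mentioned above, and the traits are a few of the NWREL writing traits): a holistic rubric yields one overall judgment, while an analytical rubric yields a judgment per trait.

```python
# Hypothetical sketch contrasting holistic and analytical scoring.
LEVELS = ["not proficient", "partially proficient", "proficient", "advanced"]
TRAITS = ["Ideas and Content", "Word Choice", "Sentence Fluency"]

def holistic_score(level: str) -> int:
    """One overall judgment: a single level for the whole response."""
    return LEVELS.index(level)  # 0..3

def analytic_score(levels_by_trait: dict) -> dict:
    """A level per trait, so students see strengths and weaknesses."""
    return {trait: LEVELS.index(levels_by_trait[trait]) for trait in TRAITS}

feedback = analytic_score({
    "Ideas and Content": "proficient",
    "Word Choice": "partially proficient",
    "Sentence Fluency": "advanced",
})
print(feedback)
# {'Ideas and Content': 2, 'Word Choice': 1, 'Sentence Fluency': 3}
```

The per-trait dictionary is what makes the analytical form useful for assessment for learning: the student above can see that word choice, not fluency, needs work, which a single holistic score would hide.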
Teachers can use scoring rubrics as part of instruction by giving students the rubric during instruction, providing several responses, and analyzing these responses in terms of the rubric. For example, use of accurate terminology is one dimension of the science rubric in Table 10-6. An elementary science teacher could discuss why it is important for scientists to use accurate terminology, give examples of inaccurate and accurate terminology, provide that component of the scoring rubric to students, distribute some examples of student responses (perhaps from former students), and then discuss how these responses would be classified according to the rubric. This strategy of assessment for learning should be more effective if the teacher:
- emphasizes to students why using accurate terminology is important when learning science rather than how to get a good grade on the test (we provide more details about this in the section on motivation later in this chapter);
- provides an exemplary response so students can see a model; and
- emphasizes that the goal is student improvement on this skill not ranking students.
- Illinois Standards Achievement Test, 2006; http://www.isbe.state.il.us/assessment/isat.htm
- Washington State 2004 Assessment of Student Learning; http://www.k12.wa.us/assessment/WASL/default.aspx
- Illinois Measure of Annual Growth in English; http://www.isbe.state.il.us/assessment/image.htm