
DOING INVESTIGATIONS: A RESOURCE BOOK FOR GET & FET MATHEMATICS & SCIENCE EDUCATORS


Managing and assessing investigations

This chapter is based on work by Maryna de Lange, a primary school science educator from Saldanha Bay and the 2002 Maths and Science Educator of the Year. She is passionate about investigations and was the prime mover behind making "investigations" the theme of this year's MSTotY workshop.

A suggested procedure for managing a scientific or mathematical investigation for school or "Expo" purposes

In order to get the most out of learner investigations, the process must be managed carefully. It is important to ensure that learners (and sometimes their parents too) are well informed about the process and exactly what is expected of them. The following points will be useful when planning your investigations.

1. Clearly state: "This investigation will be for ... e.g. final year assessment / regional competition etc. and will count 50% towards the year mark in science / may, thereafter, be submitted for the national competition etc. ... ."

2. Clearly state and publicise timelines and deadlines.

3. Within the first week of the announcement, the CRITERIA (see the assessment pro forma below) must be fully discussed with participants / learners by organizers / educators.

4. The learner's choice of topic is the most important part of the process. Topics must be finalized within a week of the announcement.

5. The educator reviews possible topics with each learner and tests each topic against the criteria provided for the investigation. Clarify and agree on the "focus question".

6. Learners must start work on the literature review / research immediately and get as much outside help as possible on this preliminary stage of the investigation.

7. Within one month of the announcement (or less, depending on overall timeline and deadlines), the educator must review each learner's topic, focus question(s), hypotheses, plans and "research design" in a one-to-one interview with the learner.

8. The investigation report must be handed in within the agreed period. On the deadline for the investigation, all reports must be handed in regardless of their state of readiness. (No extensions can be allowed without seriously disrupting the year programme, so careful monitoring of progress by the educator is essential.)

Doing independent assignments, projects and investigations develops extremely important attitudes, life skills, and scientific and mathematical skills, e.g. time management, self-discipline, perseverance, careful, thoughtful and insightful work, reflection, accuracy, etc. These skills and attitudes will all be assessed at the end of the senior phase.

In order that learners realize the importance of meeting deadlines etc., they can be made to sign a "contract":

Declaration: I understand and agree to abide by all conditions and

deadlines as explained to me by: __________________________________

Learner's signature: __________________________________

Dated: ____/____/200____

Parent's signature: __________________________________

Dated: ____/____/200____

Publication of notice and criteria: Dated: ____/____/200____

Educator's signature: __________________________________

Dated: ____/____/200____

Obviously, investigations take a lot of time to set up. The management of the process must therefore be meticulous; starting on time and keeping to your schedules are critical. Each learner must be allocated time for discussion and consultation if investigations are to serve their educational purpose. It is possible to manage up to 35 learners in this way. For classes larger than this it may be advisable to have learners work in pairs. The problem is that the experience we want learners to have is really a private one, and it is most powerful when it is a personal learning experience. When learners work in pairs there is no guarantee that each will get the full benefit of the experience. It becomes important to choose well-balanced pairs in which both individuals will contribute equally and communicate well, so that they learn from each other.

ASSESSMENT CRITERIA FOR A SCIENTIFIC INVESTIGATION / PROJECT (INCLUDING REPORT AND PRESENTATION OF FINDINGS)

NAME: ___________________________________

TOPIC:___________________________________

Criteria (maximum marks in brackets; "Ass." = mark awarded by the assessor)

1. Choice of topic and aim (5; Ass.: ____)
   - Where did you get the idea / what is the origin of your idea? (2)
   - Is the idea for your investigation relevant and useful? Why? (1)
   - Describe the aim of your investigation. (2)

2. Focus question (5; Ass.: ____)
   - Describe the focus question. Is your focus question clear? (1)
   - Can it be fairly tested? Discuss this. (3)
   - Identify all possible variables present in the focus question. (1)

3. Variables (10; Ass.: ____)
   - Independent variables: what will you deliberately change? (3)
   - Dependent variables: what will you measure? (3)
   - Variables to be kept constant: variables that will be fixed / kept the same / not allowed to change. (4)

4. Hypotheses, suspicions, hunches and educated guesses (5; Ass.: ____)
   - Explain and describe your hypotheses before starting your investigation: what outcomes do you expect? (5)

5. Literature review (at least two sources) (5; Ass.: ____)
   - Information on the topic gathered from other sources. (1)
   - Careful referencing of each source, i.e. website, title, author, etc. (2 × 2)

6. Planning the investigation (10; Ass.: ____)
   - Record what you needed to do the investigation. (3)
   - Record if and how your focus question changed. (2)
   - When did you start? When and where was each stage completed? (5)

7. Collecting data (10; Ass.: ____)
   - Describe the design of the investigation, especially data collection. (5)
   - Describe your practical work step by step, any experiments you did, etc. (5)

8. Data presentation and reasons for choosing the methods used (15; Ass.: ____)
   - Any 3 ways, e.g. narrative account, model, tally chart for a questionnaire, table of numerical data, video, tape recording, photographs, etc. (3 methods × 5 marks maximum each, i.e. 5 max for an appropriate method, 3 max for a less appropriate method, 1 max for an inappropriate method)

9. Processing and treatment of the data (15; Ass.: ____)
   - Methods for treatment of data to extract information, e.g. graphs. (3)
   - Conclusions drawn from processed and treated data. (5)
   - Compare hypotheses / predictions with the actual results of the investigation. (3)
   - Discuss whether your hypotheses are confirmed or rejected. (4)

10. Verbal / visual presentation of the investigation / project (5; Ass.: ____)
   - Layout of presentation: sequence, effective use of 3-5 colours.
   - Manner of presentation: knowledge of subject, enthusiasm, confidence.
   - Adjudicator allocates a mark out of 10.

11. Learner's own evaluation of the project / investigation (15; Ass.: ____)
   - What would you have done differently? (Provide a personal, handwritten account after the assessment and a review of others' investigations.)
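The marks in the pro forma combine into a single overall mark. As a quick check of the arithmetic, here is a minimal Python sketch (an illustration only; the criterion names are abbreviated, and the hypotheses criterion is taken as 5, the value of its single sub-item):

```python
# Maximum marks per criterion, taken from the "Marks" column of the pro forma above.
max_marks = {
    "Choice of topic and aim": 5,
    "Focus question": 5,
    "Variables": 10,
    "Hypotheses": 5,
    "Literature review": 5,
    "Planning the investigation": 10,
    "Collecting data": 10,
    "Data presentation": 15,
    "Processing and treatment of the data": 15,
    "Verbal / visual presentation": 5,
    "Learner's own evaluation": 15,
}

total = sum(max_marks.values())
print(f"Maximum total for the investigation: {total}")  # 100
```

With the values as listed, the criteria add up to a round total of 100 marks.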


Tips on assessing investigations

When all is said and done, the amount of effort put into an investigation by educators and learners is considerable and therefore deserves meticulously thought out and planned assessment. Investigations provide a wonderful platform to assess scientific and mathematical thinking in action. But it also means that the assessment must be continuous in order to extract the most value possible from the process. Assessment can take on different forms, depending on the situation at a particular time. Here are some ideas.

  • Self-assessment. Learners assess themselves. This method works well when you discuss class work or homework, especially when moving to the next topic. It helps you to recap on work done, provides a basis for moving ahead, saves time and provides immediate feedback. Its drawbacks include not getting to see individual learners' problems. Learners tend to mark themselves correct even when they are not. Sometimes they mark an answer correct when the steps leading to the answer are not.

  • Peer assessment. Like self-assessment, but here learners assess each other, possibly with members of a group assessing other members. The method poses the problem of learners giving in to peer pressure and awarding marks for incorrect answers. Control is essential when peer assessing, and learners' books must be checked for accuracy and honesty. It becomes easier when a class gets used to the method and sees the value of accurate assessment.

  • Rubrics. Rubrics work well and, if well designed, they are one of the best tools for assessing learners. Rubrics are particularly useful when learners present the findings from their investigations. Key items to be assessed are listed clearly with their rubrics alongside them, and learners' strengths and weaknesses can be "pegged" at a glance.

  • Educator assessment. This is the most common form because it is the most important. In order to assess individual strengths and weaknesses there is no substitute for the educator reviewing and marking each learner's work. The purpose must be not only to allocate a mark but to find the basis for remedial work where it is needed. The other forms of assessment add value, but only if they add to the educator's own review of learners' written work. It may be time consuming, but it gets you to know your learners. At least half of a learner's homework and class work must be assessed by the educator. Formal tests must be marked by the educator, and ideally even informal tests should at least be reviewed.

Assessing presentations and products

There are certain standard types of assessment instrument, but apart from the general type, instruments are seldom "one size fits all", so we usually construct instruments for specific activities. An instrument like the one below might be used when individual learners or groups are presenting their "research" findings on a task assigned to them. This example would be used when they are explaining the idea of "systems" in the context of the human body. Assigning a "maximum possible" mark for each item provides the educator with a benchmark or standard by which to judge performance on that item. An important principle of assessment is to ensure that learners know, well beforehand, exactly what the criteria are against which they are being judged. When learners know this they begin to develop their own "internal" standards.

(Possible mark / mark obtained)

1. Knows the meaning of systems, mentions at least three in any context, explains links between at least two separate systems. (Possible: 3; obtained: 1)

2. Uses biological terms correctly and appropriately, e.g. red blood corpuscles, white blood cells. (Possible: 2; obtained: 1)

3. Mentions at least two systems in the body and understands the role of each, e.g. excretory: kidneys remove waste, lungs remove CO2, etc. (Possible: 2; obtained: 1)

4. Presentation: clear, logical, concise, focused, etc. (Possible: 3; obtained: 2)

TOTAL: possible 10; obtained 5.

A similar instrument might be used to assess a model that has been constructed. In this case the model is of the lungs inside the thorax. It shows how, by using the diaphragm to increase the volume of the thorax, the lungs can be inflated and deflated. This model could be made from a two-litre cold drink bottle, a balloon, a plastic drinking straw and a cork with a hole bored through it. In this activity, the learners would have been given clear instructions to follow.

(Answer YES or NO for each item)

1. Is the model made according to the instructions? YES

2. Is the model labeled? NO

3. Can the learner match the parts of the model with the parts of the respiratory system, e.g. balloon = lungs? YES

4. Does the learner explain how the movement of the plastic affects the movement of the balloon? NO

5. Does the balloon inflate or deflate well (i.e. no leaks)? YES

TOTAL (YES answers): 3

The score on an instrument like this can be related to an overall 5-point scale, e.g. excellent (5/5), good (4/5), satisfactory (3/5), unsatisfactory (2/5), poor (0-1/5).

The words used to describe the different levels are a matter of choice. If, for instance, it is a 5-point scale, the five words or phrases used must describe a logical sequence of performance levels, communicate to learners how their educator will judge their performance and must indicate to learners where they have performed well or poorly on the task. With the assessment instrument in front of them the educator and learner can decide how to remedy shortcomings and build on strengths.
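By way of illustration, a checklist score can be mapped onto such a scale mechanically. The Python sketch below is a minimal example, assuming a five-item YES/NO checklist like the lung-model instrument above; it counts the YES answers and looks up the matching descriptor:

```python
# Descriptor words for the 5-point scale suggested above.
DESCRIPTORS = {5: "excellent", 4: "good", 3: "satisfactory",
               2: "unsatisfactory", 1: "poor", 0: "poor"}

def rate_checklist(answers):
    """Count the YES answers on a five-item checklist and return (score, descriptor)."""
    score = sum(1 for answer in answers if answer.strip().upper() == "YES")
    return score, DESCRIPTORS[score]

# The lung-model checklist above (YES, NO, YES, NO, YES) scores 3/5.
print(rate_checklist(["YES", "NO", "YES", "NO", "YES"]))  # (3, 'satisfactory')
```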

Rubrics

Rubrics use words to describe successive levels of performance. They are useful assessment tools, but they are of no value unless they are carefully constructed and then used with fairness, insight and care. No matter how good a rubric is, if the educator does not use it consistently and appropriately, learners will have no confidence in the method or the instrument. When learners lack confidence in an educator's judgments about their performance, they miss the learning opportunity that good assessment practice can provide. The assessment of an investigation can teach almost as much about the process of investigation as the activity itself.

Here are some rubrics. The first example is a rubric to assess designs in a technology class. The second rubric is for assessing actual design drawings. The third rubric is related to RNCS Learning Outcomes (LOs) and Assessment Standards (ASs) for the Natural Sciences.

A rubric for assessing designs in technology.

Feature and learner's performance (exceeds / satisfies / partially satisfies / does not satisfy requirements):

Simplicity and materials used
- Exceeds requirements: clear and simple; uses appropriate materials well.
- Satisfies requirements: relatively simple; does not use the most appropriate materials.
- Partially satisfies requirements: too elaborate; materials inappropriate.
- Does not satisfy requirements: over-complicated and unrealistic; no thought to suitable materials.

Ingenuity
- Exceeds requirements: clever, creative design; evidence of good insight into the problem.
- Satisfies requirements: satisfactory design; shows insight into the problem.
- Partially satisfies requirements: poor design; shows little insight into the problem.
- Does not satisfy requirements: no insight into the problem apparent.

Effectiveness
- Exceeds requirements: article will work despite simplicity and use of minimal materials.
- Satisfies requirements: article should work.
- Partially satisfies requirements: article unlikely to work as currently designed.
- Does not satisfy requirements: article will not work.

Visual appeal
- Exceeds requirements: neat, appealing to the eye, stylish.
- Satisfies requirements: neat and eye-catching.
- Partially satisfies requirements: unsatisfactory.
- Does not satisfy requirements: ugly; no visual appeal.

Easy to make
- Exceeds requirements: simple; nothing too intricate or technically demanding.
- Satisfies requirements: simple; will be relatively easy to make.
- Partially satisfies requirements: over-elaborate and will be difficult to make.
- Does not satisfy requirements: cannot be made as presently designed.

A similar rubric could be used for assessing learners' actual design drawings.

Feature and learner's performance (exceeds / satisfies / partially satisfies / does not satisfy requirements):

Simplicity
- Exceeds requirements: very clear and simple; key features obvious.
- Satisfies requirements: clear; mostly understandable.
- Partially satisfies requirements: too elaborate; difficult to understand.
- Does not satisfy requirements: no "flow" in the drawing; unusable as a working plan.

Clarity and neatness
- Exceeds requirements: clear; very neat layout makes it easy to follow.
- Satisfies requirements: neat and mostly clear.
- Partially satisfies requirements: poor design and untidily presented.
- Does not satisfy requirements: untidy design; confusing; difficult to understand.

Care taken
- Exceeds requirements: great care taken; shows real pride in the work.
- Satisfies requirements: evidence of some care taken.
- Partially satisfies requirements: work hurried and "sloppy"; numerous mistakes.
- Does not satisfy requirements: no care taken; no thought given.

Labels
- Exceeds requirements: clear labels, logically organized and carefully arranged.
- Satisfies requirements: clear and organised.
- Partially satisfies requirements: labelling random and incomplete; no logical arrangement.
- Does not satisfy requirements: little or no labelling; what exists is unhelpful.

Notes to maker
- Exceeds requirements: simple and clear.
- Satisfies requirements: mostly clear but incomplete.
- Partially satisfies requirements: some notes but none of any consequence.
- Does not satisfy requirements: none made.
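Rubrics like the two above are easier to reuse if they are kept in a structured form. The sketch below is only an illustration (in Python; the feature names and descriptors are abridged from the technology-design rubric above, and the lookup function is a hypothetical helper) of how the four level descriptors for each feature might be stored together and looked up when a level is awarded:

```python
# The four performance levels used in the rubrics above, from best to worst.
LEVELS = ["exceeds requirements", "satisfies requirements",
          "partially satisfies requirements", "does not satisfy requirements"]

# A rubric as a mapping from feature to one descriptor per level
# (abridged from the technology-design rubric above; two features only).
design_rubric = {
    "Ingenuity": [
        "Clever, creative design; evidence of good insight into the problem",
        "Satisfactory design; shows insight into the problem",
        "Poor design; shows little insight into the problem",
        "No insight into the problem apparent",
    ],
    "Effectiveness": [
        "Article will work despite simplicity and use of minimal materials",
        "Article should work",
        "Article unlikely to work as currently designed",
        "Article will not work",
    ],
}

def descriptor(rubric, feature, level):
    """Look up the descriptor for a feature at a given performance level."""
    return rubric[feature][LEVELS.index(level)]

# Example: a learner whose ingenuity satisfies the requirements.
print(descriptor(design_rubric, "Ingenuity", "satisfies requirements"))
```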

Any good rubric can be adapted or used as a template to produce others. The revised national curriculum statements (RNCS), scheduled for implementation in 2007, call for the assessment of four levels of performance in the GET band. Particular tasks and activities can therefore be assessed by designing rubrics from the learning outcomes and assessment standards appropriate to the task. In these examples (adapted from All Aboard Science 7 by Heinemann Publishers), various learning tasks are covered. The first example involves an investigation into the causes of corrosion in metals.

LEARNING OUTCOME 1: Scientific Investigations. The learner will be able to act confidently on curiosity about natural phenomena, and to investigate relationships and solve problems in scientific, technological and environmental contexts.


Levels of performance: Level 4 = learner's performance exceeds the ASs; Level 3 = satisfies the ASs; Level 2 = partially satisfies the ASs; Level 1 = does not satisfy the ASs.

AS1: Identifies factors to be considered in an investigation.
- Level 4: Identifies all possible factors that could cause corrosion.
- Level 3: Identifies some factors that could cause corrosion.
- Level 2: Identifies factors that could cause corrosion, but needs assistance.
- Level 1: Cannot identify factors that could cause corrosion.

Plans collection of data for the investigation.
- Level 4: Plans the collection of data from an investigation of rusting.
- Level 3: Plans the collection of data from the investigation.
- Level 2: Cannot independently plan the collection of data from an investigation.
- Level 1: Cannot plan the collection of data from an investigation.

AS2: Conducts investigations, collects data and records data accurately.
- Level 4: Perseveres; works with concentration and care to collect and record data systematically.
- Level 3: Works carefully to collect data systematically during an investigation.
- Level 2: Collects data carelessly during the investigation; process not logical or systematic.
- Level 1: No concentration or perseverance; is careless when collecting data and plays the fool.

AS3: Evaluates data and communicates findings.
- Level 4: Lists all evidence that supports a finding; deals with counter-evidence and states clear conclusions.
- Level 3: Lists some items of evidence to support a finding and states satisfactory conclusions.
- Level 2: Lists few items of evidence to support a finding; states some conclusions but needs help.
- Level 1: Cannot list any sensible items of evidence; cannot state conclusions even when given points of evidence.

The investigation of sensitive issues can lead to debate that involves prejudice and emotion. In science it is important to recognize these distractions and keep them in check. That is why scientists try, not always successfully, to take account of how we accumulate knowledge and the peculiar role that personal feelings and prevailing ideas play in attempting to understand the world we live in. The third learning outcome (LO3) in the Natural Sciences deals with this issue and is notoriously difficult to assess. Rubrics can help when we discuss issues that bring out emotion and contradiction. But we should not allow difficulties like this to put us off, because dealing with topics that call for the interpretation of historical evidence and differing points of view is important to our development as scientifically literate people. The rubric for the Natural Sciences LO3 AS1 can be adapted, for instance, to issues such as gold mining and the processing of gold ore, the development of dynamite, and the effect that refrigeration has had on the international trade in agricultural products.

LEARNING OUTCOME 3: Understands inter-relationships between science and technology, society and environment.


Learner's performance: exceeds / satisfies / partially satisfies / does not satisfy requirements.

AS1: Identifies how people build confidence in their knowledge (example: these statements could be applied to the production of steel).
- Exceeds requirements: Makes critical and considered comments on the historical development of manufacturing processes in the service of humankind and sees connections between these and other historical events or trends.
- Satisfies requirements: Makes random, unconnected comments on the historical development of manufacturing processes in the service of humankind, but is getting a grasp of "the big picture".
- Partially satisfies requirements: Makes some cogent comments on the historical development of manufacturing processes in the service of humankind, but cannot see or make links between these and other events.
- Does not satisfy requirements: Cannot make comment on the historical development of manufacturing processes.

AS2: Understands the sustainable use of the Earth's resources; identifies information required to make judgements about the use of resources.
- Exceeds requirements: Makes a clear, balanced, reasoned argument on the issue of dune mining in the St Lucia wetlands area, having considered all available social, economic and environmental factors.
- Satisfies requirements: Makes good points in an argument on the issue of dune mining in the St Lucia wetlands, having considered most available social, economic and environmental factors.
- Partially satisfies requirements: Makes some unconnected points on the issue of dune mining in the St Lucia wetlands area, but does not consider all available social, economic and environmental factors.
- Does not satisfy requirements: Cannot make any reasoned points on the issue of dune mining in the St Lucia wetlands area and seems unable to understand more than a few social, economic and environmental factors.

Strategies: your own assessment policy

Maryna de Lange has set her assessment policy so that several assessable aspects of her learners' work count towards the final mark for the year. She writes: "I balance five forms of assessment so that every learner's preference can be accommodated. I use the assessments to build up learners' portfolios, which I keep in my Master file. I regularly record the outcomes of my observations on a formative record sheet. Learners are allowed to choose the work they want to keep in their portfolios for promotion purposes. Assessments consist of educator-, peer- and self-assessments. The forms of assessment I use are:

(a) Four tests and two examinations. The tests count 30 marks each and the examinations are set for between 100 and 150 marks. The totals for the tests and the examinations are then converted to 60 marks each. [Total 60 + 60]

(b) Eight class- or homework activities count 15 marks each; the 120 marks are converted to 45. [Total 45]

(c) Two projects of 40 and 35 marks respectively. [Total 40 + 35]

(d) Two assignments of 30 marks each. [Total 60]

(e) Two investigations of 30 marks each. [Total 60]

The total mark for Science for the year is 360. The pie-chart below gives us a picture of this policy. Remember that there is also a portfolio and a formative record that can be used when the learner's performance for the year is assessed. OBE calls for continuous assessment to ascertain a learner's development. Does this policy meet these requirements?

You will agree that this assessment policy gives a very comprehensive picture indeed. The pie chart also shows that investigations and projects (both involve learners working independently on problems of their own choice) make up more than a third of the total mark.

[Pie chart: proportions of the total year mark by form of assessment]
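The numbers behind the pie chart can be checked directly. Here is a minimal Python sketch (the converted totals are taken from items (a)-(e) above) that sums the allocation and works out each form of assessment as a share of the 360-mark year total:

```python
# Mark allocation per form of assessment, after conversion, as listed in (a)-(e) above.
allocation = {
    "Tests (4 x 30, converted)": 60,
    "Examinations (2, converted)": 60,
    "Class/homework (8 x 15, converted)": 45,
    "Projects (40 + 35)": 75,
    "Assignments (2 x 30)": 60,
    "Investigations (2 x 30)": 60,
}

total = sum(allocation.values())
print(f"Total year mark: {total}")  # 360

# Share of the year mark contributed by each form of assessment (the pie chart).
for form, marks in allocation.items():
    print(f"{form}: {marks} ({marks / total:.1%})")

# Projects and investigations: the independent work chosen by the learners themselves.
independent = allocation["Projects (40 + 35)"] + allocation["Investigations (2 x 30)"]
print(f"Projects + investigations: {independent}/{total} = {independent / total:.1%}")  # 37.5%, more than a third
```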