
Reflective Practitioner

PAPER #3

THE ROLE OF ASSESSMENT
IN EDUCATING THE REFLECTIVE PRACTITIONER

Robin Burgess-Limerick (robin@hms.uq.edu.au)
Doune Macdonald (doune@hms.uq.edu.au)
Teresa Carlson (terryc@hms.uq.edu.au)
Trish Gorely (tgorely@hms.uq.edu.au)
Stephanie Hanrahan (steph@hms.uq.edu.au)

Department of Human Movement Studies, The University of Queensland, 4072, Australia


Practitioners in the field of Human Movement Studies are employed in a range of professional activities including exercise programming, school teaching, physical activity management, and research and development. Despite this diversity, a common theme evident from a survey of employers was that they valued in graduates the skills of reflective practitioners, such as leadership, decision-making, communication, and counselling, together with "a professional attitude" (Macdonald & Abernethy, 1994). As Schön (1987) suggested, curriculum reform is necessary to educate such reflective professional practitioners.

Assessment is integral to curriculum reform. We start from an assumption (after Australian Curriculum Studies Association, 1994) that assessment should:

  • relate to the goals of the curriculum
  • be relevant to the learner
  • be formative and educative, and
  • be criterion-based rather than norm-based.

During 1997 we were involved in an action research project aimed at improving our assessment practices. Specifically, we were concerned with how our assessment practices could encourage reflection-in-action. Whereas rule-governed behaviour ("knowing-in-action") is relatively easy to assess using structured evaluation tools, it is less clear how reflective skills (e.g. original thinking, evaluation) and processes (e.g. negotiation, self-/peer assessment) can be recognised and encouraged within assessment schemes.

THE PROJECT

The Department of Human Movement Studies encompasses a wide range of subjects and disciplines (science, arts, humanities and education), and offers theoretical, applied, clinical, and field-based subjects, with enrolments ranging from directed studies of a single student to classes of over 400. Consequently, the assessment practices developed during the action learning project were diverse. (Examples can be found at http://www.uq.edu.au/hms/alp97appa/alp97appa.htm). The action learning project incorporated interviews with individual staff members and groups of students, and a whole-day workshop with departmental staff. An independent consultant with expertise in assessment also provided comment on the assessment methods developed.

CHOICE OF ASSESSMENT TASKS

The choice of assessment tasks is fundamental. To encourage reflection-in-action, assessment tasks must reflect practitioners' problems. Students of pedagogy, for example, are assessed while planning and teaching a unit of work that reflects the principles of contemporary curriculum design and expectations for teacher practice. Assessment tasks should also require students to appraise and evaluate practice and to propose alternatives. For example, students of health promotion appraise health-related practices in a work site and create an alternative program.

DEVELOPING A SHARED UNDERSTANDING OF ASSESSMENT CRITERIA

Students' responses to the changes in assessment were overwhelmingly positive. Students spoke passionately about the perceived unfairness of norm-referenced assessment and believed that the use of criteria and the accompanying standards gave their efforts a sense of focus and direction. They spoke repeatedly of being "guided" in the preparation of their responses and being able to take some responsibility for the attainment of desired grades. A small number of students, however, admitted to not using the stated criteria at all in the preparation of their responses to tasks. These students spoke of the "vague" language in use. They felt that due to the ambiguity of the terminology there was no benefit to be gained by closely reading, or being guided by, the criteria and standards. For example:

"I find I don't even look at criteria because they're just so vague... words like "evaluate", "identify" and "analyse". What do they mean really? I just usually write the assignment I think they (staff) want written and hand it in. I don't bother trying to meet the criteria."

This attitude emphasised the importance of developing a clear and shared understanding of the criteria and standards by which performance is to be judged. This is the single greatest hurdle to assessment reform. Staff agonised over the terminology chosen, trying to ensure both clarity and conciseness while also capturing the range of meanings intended. The concern was magnified when stakeholders outside the University (such as practicum sites) used the criteria and standards to judge student performance.

Staff repeatedly addressed the problem of how to develop this shared understanding, and recognised the need for methods beyond the standard descriptor statements to clarify the meaning of the criteria and standards. The provision of response models is one means of communicating expectations. Students predominantly supported this practice, especially when the models were annotated. Some students, however, identified a problem with modelling assessment responses: moving away from a "good" exemplar proved very difficult, and some found it almost impossible to resist the urge to copy the model. They felt at times that models became a hindrance to "original" thought. One student explained, "I found it difficult to deviate from the sample. It was so good and I found myself writing the same sentences." Consequently, if models are provided, they should ideally include multiple exemplars covering a range of response styles/formats and a range of response standards.

One staff member proposed that the key to developing a shared understanding of the criteria lay in providing discussion and feedback while students were engaged with the assessment task.

"Clarification (of criteria and standards) can only be done through interaction.... I could do more in terms of discussing them with the group, (and) could do more in terms of examples and modeling, but... it's not until the students start to do the task that they realise it doesn't make sense to them - the criteria, the expectations. That's when you need to give feedback."

OTHER ISSUES

Progressively increasing the weighting of assessment items (i.e., early items contributing less to the final grade than later ones) was viewed as a useful way of encouraging students to use feedback. This is most useful when common criteria are used across assessment tasks. Allowing resubmission of assessment tasks also increases the use of feedback.

Reflective skills may be explicitly promoted by including self- and peer-evaluation as a criterion. Writing partnerships are another way of developing these skills.

Negotiating aspects of the assessment with students, such as its weighting, nature, and criteria, is a way of promoting ownership.

An unresolved question that arose is how to encourage risk-taking in the form of original thought.

THE PROCESS OF CURRICULUM REFORM

Sparkes (1990) outlined how curriculum change, such as that associated with criterion-based assessment, can be manifested at three levels: level one, at which new and revised materials and activities are introduced (surface change); level two, at which changes in teaching practices are observed; and level three, at which changes are apparent in "beliefs, values and understanding with regard to pedagogical assumptions and themes" (real change). Movement toward real change in the way staff were thinking about and implementing assessment occurred during the action learning project. While the depth of change within individual staff members varied, all members of the project had moved beyond seeing criterion-based assessment as merely a new way of presenting assessment tasks. Changes in teaching practices included an increase in the modelling of assessment tasks, encouragement of peer-assessed feedback, and the bringing of learning and assessment closer together. There was, however, some reluctance among the project team, along with other departmental staff, to think differently about the role of assessment in the learning process. For example, while some staff were comfortable viewing assessment as a means to maximise the achievement of all, others remained committed to assessment serving to sift and sort student achievement.

Six interlinked factors contributed to the change process: the creation of a supportive environment in which staff could experiment with criterion-based assessment; the sharing of ideas, experiences and problems; peer accountability; the encouragement of all participating staff to reflect on their practice through semi-structured interviews; responsiveness to student feedback; and the creation of time both for the design of criterion-based assessment and for the reflection that followed.

REFERENCES

Australian Curriculum Studies Association (1994). Principles of student assessment: Policy statement. Canberra: ACSA.

Macdonald, D. & Abernethy, P. (1994). Bringing professional development into focus: A case study of human movement studies. Education Research and Perspectives, 21(2), 69-79.

Schön, D.A. (1987). Educating the reflective practitioner. San Francisco: Jossey-Bass.

Sparkes, A. (1990). Curriculum change and physical education. Geelong: Deakin University Press.


Robin Burgess-Limerick
Department of Human Movement Studies
The University of Queensland, 4072, Australia
ph: +61 7 3365 4718
fax: +61 7 3365 6877
email: robin@hms.uq.edu.au

Personal pages - http://www.uq.edu.au/~hmrburge/
HMS Department pages - http://www.uq.edu.au/hms/
Ergonomics Australia On-Line - http://www.uq.edu.au/eaol/

