EDST5126  Week 5 – Educational Evaluation

 

"When you're finished changing, you're finished." – Benjamin Franklin

Evaluation – what is it?

The etymology of evaluation is from the French évaluer – in essence, to draw out the value of something.

Fox (2017) summarises three key elements of evaluation as:

  1. Assessing the merit / worth
  2. In a systematic manner
  3. With the purpose of making a judgement about the worth of something – in other words, its significance.

I find Borton's (1970) simple three-step process of evaluation – What? So what? Now what? – very useful in exploring and conceptualising new concepts and new information.

Prof Fox's three-point summary covers the first two elements ("What?" and "So what?") well but does not capture the final step of "Now what?" quite so well. For me, evaluation must include that final phase of further action, which may mean change, or perhaps reinforcing the current state. Evaluation without action is interesting, and perhaps academically rewarding, but somewhat purposeless.

I have placed the phases and elements of evaluation outlined in this week's presentation and readings within Borton's model.

[Image: phases and elements of evaluation mapped to Borton's What? / So what? / Now what? model]

What can be evaluated in HE?

Every aspect of higher education – people, processes and everything in between – can be evaluated, which I have tried to capture in my image below.

[Image: what can be evaluated in higher education]

Evaluating the impact of change is very important too; it can be measured using Tilly Hinton's IMPEL model, which we came across last year in EDST5124.

[Image: the IMPEL model]

What is in a tool?

A point highlighted in this evening's class discussion was the importance of evaluating the evaluation tool itself. This echoed considerations in evaluating assessment tools in education, which I focused on in my assessments in EDST5122 last year. I have a particular interest in this area of formal assessment in education, being responsible for a high-stakes examination for the medical speciality I am involved with.

The key elements of an assessment tool are summarised below. These are equally applicable when considering tools for evaluation in general.

  1. Reliable
  2. Valid
  3. Fair
  4. Feasible
  5. Cost effective
  6. Defensible
  7. Well-perceived
  8. Consistent
  9. Set at the appropriate level

(Crosby, 2002, cited in Feather & Fry, 2009; QAA, 2012)

These domains were expanded on in this evening's discussion, which highlighted the importance of also considering:

  1. Who designed the tool
  2. Who selected the tool
  3. The granularity of the tool
  4. The lens through which the results are evaluated: is it past-focused or future-focused?

Many of these aspects are highly relevant to the discussion underway about the tool used to evaluate courses in HE, i.e. the move from CATEI to My Experience. These issues were further highlighted by Uther (2017) in the class discussion forum, where medicine courses across Australia were compared using various QILT metrics from the perspectives of both undergraduate and postgraduate students. These numbers could be used to support many arguments ("there are lies, damned lies and statistics"), and there are many confounding factors (such as the reputation of the university, student numbers, culture, learning methods, teaching approaches and real-world experiences).

[Image: "There are lies, damned lies and statistics" – Mark Twain]

Relationship between evaluation and research

Features | Research | Evaluation
Purpose | Produces generalisable new knowledge | Judges the merit or worth of something specific
Why | Conclusion oriented (to prove) | Decision oriented (to improve)
How (methods) | Research tools | Evaluation tools
Setting | Often prescribed / controlled | "Real world" environment
Directed by | Researcher | Party with a vested interest / stakeholder
Conclusions | Make new research recommendations | Recommendations based on the question that was asked
Outputs | Published results / articles | Report for stakeholders
Evaluation of outputs | Peer review | Not usually peer reviewed

 

Features | Journalist | Academic
Depth | More superficial | Deeper
Independence from material | Independent | Vested interest
Relationship to truth | Communicates the truth | Discovers the truth
Activity | More active | Less active
Product | Writing | Knowledge
Payment | Usually independent of output | Output determines future money (new grants)
Effect / impact / next step | Public opinion and funding | Further research, which may then be reported by a journalist

The discussion comparing and contrasting research and evaluation, and the approaches of journalists versus academics, was interesting (I have a distinct tendency to like putting ideas and concepts in boxes… it satisfies my mind's need for order and control!). I struggled at first to understand why this exercise was undertaken and felt that I had missed some nuanced point (perhaps a parallel between the two comparisons, with evaluation being more akin to journalism and research more aligned with academia… but this didn't match wholly). I concluded that the point of the exercise was, by dissecting and breaking down the elements, to more deeply understand, appreciate and connect with the concept of evaluation, as it does tend to have a rather nebulous, ill-defined feel to it (much like the concept of governance!).

Why Evaluate?

[Image: "Why evaluate?" slide]

If evaluation results in change and improvement, it most certainly adds value to an organisation. Evaluation must be planned for, systematic, and occur at every level – from the big picture right down to self-evaluation – using a range of tools relevant to the things being evaluated. An organisation that does not evaluate will potentially either remain unchanged or change in ways that are not beneficial.

Trainee term evaluation in a medical speciality: a case study

As an educational leader, if a reporter interviewed you about why your evaluation is necessary, how would you respond?

If the same reporter interviewed you about why evaluation can’t be done, how would you respond?

In responding to these questions, I have chosen to focus on the trainee term evaluation that vocational trainees in Rehabilitation Medicine complete about each of their six-month terms. For the purposes of this scenario, as a leader within the training program, I am being interviewed by a trainee who is writing an article for the quarterly newsletter for fellows (graduates of the specialist medical program) and trainees.

[Image: trainee term evaluation form]

 TRAINEE: Why do trainees have to complete the term evaluation every six months? It seems pointless because nothing ever changes as a result of it being done. It is a waste of time and effort.  

 SHARI PARKER (SP): The training program is evaluated in a number of different ways and from a number of different perspectives. These processes are multi-directional and have both formal and informal elements.

At the individual level, trainees have continuous formative assessment by their supervisors throughout the term. Summatively, there are the various training elements you are very familiar with, such as exams, external training module essays and formal long cases, as well as your six-monthly term assessment. Overall, these are used to determine your progress through the training program.

 Trainees get to evaluate the term placement, their work conditions, teaching, facilities and so forth using the trainee term evaluation which is what you are asking me about. I will talk a bit more about that soon.

 Supervisors undergo evaluation regularly too when they seek re-accreditation as a supervisor. This requires a combination of having attended mandatory training, as well as a process of self-evaluation.

The training sites and terms are evaluated using a structured process on a three-yearly basis by the accreditation committee.

From an organisational perspective, the assessment processes and training elements are also evaluated: both the processes themselves and the results, which reflect on the training program as a whole. If pass rates are low, for example, there may be an issue with the program overall.

A couple of years ago, I was actually employed by the college to review and evaluate our training program. As a result of this process, a number of recommendations were made, many of which are currently being actioned. You can see my report here. This was the first time such a review had been undertaken; it took into account the perspectives of trainees and fellows, reviewed objective data, and incorporated local and international trends and perspectives. While this was a very important process, in my opinion there must be a plan in place to undertake such an activity perhaps every five years, rather than such a project being initiated by one particularly insightful and visionary president. This holistic evaluation process must be independent of the incumbent leadership for there to be true growth and development.

Through an even wider lens, the training program is evaluated regularly by the Australian Medical Council, which is responsible for accrediting specialist medical training programs in Australia. They look at a range of elements and key indicators in the training program and make recommendations. You can see a bit more about their processes here.

 The trainee term evaluation is a really important cog within this overall evaluation process, each element of which is vital.

Just so you can be familiar with what actually happens after you complete the form: the data is entered into a spreadsheet by college staff. The "new fellow" representative on the trainee committee is responsible for reviewing this data and addressing any significant "red flag" issues such as inappropriate behaviour or bullying. In such instances, the new fellow directly contacts the trainee concerned and together they come up with a plan moving forward.

The new fellow provides a report to the Education committee annually, and feedback is provided to individual training sites as part of their three-yearly re-accreditation site visit so that any systemic issues or evident patterns can be addressed.

All feedback is de-identified, which is really important because there are few trainees and, particularly in small areas, trainees could be identified. The aim is both to act on the information that you provide and to protect your confidentiality.

So, to summarise, the term evaluation is a vital part of the evaluation process for the rehab medicine training program as a whole and is also important for the individual sites. Armed with this information, I hope the return rate for these evaluations will increase so that meaningful information can be provided and acted on. I know there is a degree of cynicism amongst trainees (and I have been in that boat too!) that the term evaluation is just a box-ticking activity that is shoved in a dusty drawer, never to see the light of day, but I can personally attest to the role this information plays in improving both the trainees' experience and, ultimately, the quality of graduating specialists.

If you, or any other trainees, have any specific questions about the process or any other aspects of evaluation within the faculty, I would be very happy to address them. Equally, if trainees have any suggestions about how we can improve the process of evaluation, we would warmly welcome them! This feedback is a key part of evaluating the evaluations… a kind of meta-evaluation!

Decisions, decisions, decisions… what to write about for the second assessment!

Evaluation – peer observation of teaching (POT)

A topic that grabbed my attention tonight, and one I may take further in the second assessment task for this course, is peer observation of teaching in clinical medicine. I first encountered this idea in EDST5122, where we were required to evaluate some examples of teaching. That was the first time I became aware of the process of POT, and I recognised that, while it is an evolving area of HE with the potential to improve teaching and learning, it is largely absent from clinical teaching. That being said, POT is not without its risks, including potential retrenchment or denial of promotion, changes to the culture of an organisation, fostering cynicism among staff towards leadership, and the question of whether it will actually achieve what it sets out to.

One potential outcome I superficially explored in my essay was how POT could be introduced in my own clinical context, potentially developing a template for it to be rolled out in my department, the clinical school at which I teach and possibly further afield. This evening’s tutorial has cemented this interest and it is an area I may delve further into in this subject in the second assessment.

Reflection… a self-evaluation!

What: In writing blog posts for this course, I tend to try to do too much, resulting in long, meandering posts that far exceed the word limit and take up a lot of my time. I try to cover too many areas in too much detail; I lose the forest for the trees. As a result, I am falling behind with the class work for this course.

So what? The impact can be seen on a number of different levels:

  • Academic: there are rules (albeit flexible) with regard to word limits. If I don't edit my work substantially, it will be well outside them.
  • A short, punchy, focused blog post will have far more impact: firstly because it is more likely to be read, and secondly because any valuable ideas will be clearer and not lost within the noise of my logorrhoea.
  • Personally, spending too much time on posts is contributing to a sense of increasing overwhelm and stress. If I don’t get on top of this, there are potentially negative implications for me personally and academically.

Now What?

  • As I catch up with the blog posts I am behind with, I need to learn to drill down and distill the key elements of an issue, to be able to describe them succinctly and communicate them efficiently.
  • I will, with discipline, set a time limit for writing my blogs and stick to it, no matter what!

(Note to self… this reflection is 239 words, which is not too far off how long I should aim to make my blog posts!)


References

Borton, T. (1970). Reach, touch and teach. London: McGraw-Hill.

Carper, B. (1978). Fundamental patterns of knowing in nursing. Advances in Nursing Science, 1(1), 13-23.

Fox, B. (2017). Session 5: Evaluation (PowerPoint presentation). Retrieved 2017_03_28 from https://moodle.telt.unsw.edu.au/pluginfile.php/2506495/mod_resource/content/1/EDST5126_S5_2017.pptx

Feather, A., & Fry, H. (2009). Key aspects of teaching and learning in medicine and dentistry. A handbook for Teaching and Learning in Higher Education, 424.

The Quality Assurance Agency for Higher Education (QAA). (2012). Understanding assessment: its role in safeguarding academic standards and quality in higher education. Retrieved 2017_03_28 from http://www.qaa.ac.uk/en/Publications/Documents/understanding-assessment.pdf

Uther, P. (2017). EDST5126 Week 5 class discussion forum. Retrieved 2017_04_04 from https://moodle.telt.unsw.edu.au/mod/forum/discuss.php?d=514124

 

 


