Evaluation Forms for training events
- Loughborough University
- Date first submitted:
- 3 Nov 2009
- Date last modified:
- 3 Nov 2010
- Postgraduate researchers
- Doctoral researchers
- Research staff
- Research masters
Rationale, aims and outcomes
What is the rationale for doing this?
How does it fit with institutional strategy?
What are the main features of the provision?
What are the aims and expected outcomes?
Since 2006, Loughborough University Staff Development has used a standard optical-mark-read feedback form, which is generated automatically from each workshop's intended learning outcomes (ILOs) held in the staff development database/booking system. The form asks participants whether the ILOs have been addressed and whether they have increased their learning with respect to those ILOs.
The feedback forms are designed to provide a broader, more useful and in-depth evaluation than a standard ‘happy sheet’.
The electronic database and booking system automatically generates records of workshop attendance and feedback. The presenter and course manager can access the participant list in advance of the course, and the feedback data and scanned-in forms afterwards. Statistics are generated from the feedback on each workshop, and the system can build archives of course attendance and feedback. Questions such as ‘Have you increased your learning with respect to the ILOs?’, ‘Will you be able to apply this learning?’, ‘Did you involve yourself in the session?’ and ‘Were other participants’ contributions useful?’ help to give an indication of the learning and behaviour of the participants.
Are there any pre-requisites for engagement, e.g. levels of skill, years of experience, essential pre-activities?
How many participate in each 'activity'?
Presenters should provide intended learning outcomes for each workshop in advance of the course; these appear in the publicity and course information on the booking system.
Evaluation: benefits, challenges and next steps
How do you monitor effectiveness?
Who do you seek feedback from?
Do you have benchmarks?
Standard feedback is obtained for each workshop. This provides a deeper evaluation than a standard happy sheet and starts to give an indication of the learning that has taken place. The forms run across the whole staff development programme, not just provision for researchers. They measure the performance of the presenter and the administration, as well as the perceived learning and engagement of participants. The system is simple to use once set up, and a course can be tracked over a period of time to check how it is going.
We have an archive of 3 years’ worth of feedback from all staff development courses.