
Results of RQ2: Can the current ELIXIR evaluation questions be criticized?

What are the ELIXIR evaluation questions?

You can find the ELIXIR evaluation questions here.

This paper deals only with the mandatory questions 5 up to and including 9.

With the goal of the SFT ('to improve the course and its materials') in mind, we go through the mandatory questions that resulted from the process described in the results of Research Question 1. The relevant questions are found in Section 3 - Quality Metrics of the NBIS short-term evaluation, and each of them is discussed in detail below.

Question 5

5. Have you used the tools/resource(s) covered in the course before?

- Never - Unaware of them
- Never - Used other service
- Occasionally
- Frequently

Question 5 is an interesting way to evaluate the quality of a course, because it is about something learners have done before the course took place.

What are the metrics for this question?

These are the metrics collected at 2025-01-24 7:04 Stockholm time (https://training-metrics-dev.elixir-europe.org/feedback-report):

| Response | n | Frequency (%) |
| --- | --- | --- |
| Never - Unaware of them | 4350 | 23.5 |
| Never - aware of them | 3838 | 20.8 |
| Never - Used other service | 1803 | 9.7 |
| Occasionally | 6974 | 37.7 |
| Frequently | 1528 | 8.3 |

Does this indicate good or bad courses? Are the right people reached? It would be interesting to know how these values are used to determine the quality of a course.
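For reference, the Frequency column appears to be nothing more than each response's share of all recorded answers to the question. A minimal Python sketch, using the counts from the table above (and assuming the total is simply the sum over all answer options, which the portal does not state explicitly), reproduces the reported percentages:

```python
# Counts for Question 5 as reported by the ELIXIR training-metrics portal
# (collected 2025-01-24 07:04 Stockholm time).
counts = {
    "Never - Unaware of them": 4350,
    "Never - aware of them": 3838,
    "Never - Used other service": 1803,
    "Occasionally": 6974,
    "Frequently": 1528,
}

total = sum(counts.values())  # 18493 responses in total

for response, n in counts.items():
    # Frequency (%) = this response's share of all responses to the question
    print(f"{response}: {100 * n / total:.1f}%")
```

The same calculation reproduces the percentages shown for Questions 6 to 9 below.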

Question 6

6. Will you use the tools/resource(s) covered in the course again?

- Yes
- No
- Maybe

Question 6 is another interesting way to evaluate the quality of a course, because it is about the usefulness of the topic being taught, combined with predicting the future.

What are the metrics for this question?

These are the metrics collected at 2025-01-24 7:16 Stockholm time (https://training-metrics-dev.elixir-europe.org/feedback-report):

| Response | n | Frequency (%) |
| --- | --- | --- |
| Maybe | 2822 | 15.1 |
| No | 105 | 0.6 |
| Yes | 15792 | 84.4 |

Here too, does this indicate good or bad courses? It would be interesting to know how these values are used to determine the quality of a course.

Question 7

7. Would you recommend the course?

- Yes
- No
- Maybe

Question 7 attempts to measure course quality by asking learners whether they would recommend the course. It can already be found in one of the two evaluations that ELIXIR based this questionnaire on, namely [Jordan et al., 2018]. The (little) research on this practice suggests that this may indeed be a meaningful measure [Ang et al., 2018].

What are the metrics for this question?

These are the metrics collected at 2025-01-24 8:27 Stockholm time (https://training-metrics-dev.elixir-europe.org/feedback-report):

| Response | n | Frequency (%) |
| --- | --- | --- |
| Maybe | 19597 | 89.5 |
| No | 1790 | 8.2 |
| Yes | 519 | 2.4 |

Question 8

8. What is your overall rating for the course?

- Poor (1)
- Satisfactory (2)
- Good (3)
- Very Good (4)
- Excellent (5)

Question 8 too attempts to measure course quality by asking the learner to rate it. This question is absent from the two questionnaires (i.e. those described in [Brazas & Ouellette, 2016] and [Jordan et al., 2018]) this questionnaire is based on.

There seems to be overlap between this and the previous question, hinging on the assumption that a course that would be recommended is also likely to be rated positively.

Asking learners for their course satisfaction, however, is sketchy. This is already acknowledged by ELIXIR:

We acknowledge that training quality is more complex than solely participant satisfaction and that the community would benefit from future work to obtain a fuller picture on training quality [Gurwitz et al., 2020]

There is, however, according to a meta-analysis, no relation between training quality and participant satisfaction [Uttl et al., 2017], and this meta-analysis gives some examples of how problematic this metric is.

What are the metrics for this question?

These are the metrics collected at 2025-01-24 8:28 Stockholm time (https://training-metrics-dev.elixir-europe.org/feedback-report):

| Response | n | Frequency (%) |
| --- | --- | --- |
| Excellent | 7736 | 37.0 |
| Very good | 8437 | 40.4 |
| Good | 3543 | 17.0 |
| Satisfactory | 993 | 4.8 |
| Poor | 192 | 0.9 |
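To illustrate what such ratings are commonly reduced to, here is a minimal sketch that computes a single mean rating on the 1-5 scale given in the question itself. This is purely illustrative and not how ELIXIR states it summarises the metric:

```python
# Counts for Question 8 as reported by the ELIXIR training-metrics portal
# (collected 2025-01-24 08:28 Stockholm time), keyed by the 1-5 scale
# defined in the question itself (Poor = 1 ... Excellent = 5).
counts = {1: 192, 2: 993, 3: 3543, 4: 8437, 5: 7736}

total = sum(counts.values())                                   # 20901 ratings
mean = sum(score * n for score, n in counts.items()) / total   # about 4.08

print(f"Mean overall rating: {mean:.2f} out of 5")
```

Per the critique above, a high mean like this says little about whether learning actually took place.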

Question 9

9. A. May we contact you by email in the future for more feedback?

- Yes
- No

Question 9 is an interesting way to measure course quality, as it is based on the learner being willing to answer questions in the future. It seems that this question would be better placed outside of Section 3 - Quality Metrics.

What are the metrics for this question?

These are the metrics collected at 2025-01-24 8:32 Stockholm time (https://training-metrics-dev.elixir-europe.org/feedback-report):

| Response | n | Frequency (%) |
| --- | --- | --- |
| No | 8756 | 49.7 |
| Yes | 8860 | 50.3 |

References

  • [Ang et al., 2018] Ang, Lawrence, Yvonne Alexandra Breyer, and Joseph Pitt. "Course recommendation as a construct in student evaluations: will students recommend your course?." Studies in Higher Education 43.6 (2018): 944-959.
  • [Brazas & Ouellette, 2016] Brazas, Michelle D., and BF Francis Ouellette. "Continuing education workshops in bioinformatics positively impact research and careers." PLoS computational biology 12.6 (2016): e1004916.
  • [Jordan et al., 2018] Jordan, Kari, François Michonneau, and Belinda Weaver. "Analysis of Software and Data Carpentry's pre- and post-workshop surveys." Software Carpentry (2018). Retrieved April 13, 2023. PDF
  • [Uttl et al., 2017] Uttl, Bob, Carmela A. White, and Daniela Wong Gonzalez. "Meta-analysis of faculty's teaching effectiveness: Student evaluation of teaching ratings and student learning are not related." Studies in Educational Evaluation 54 (2017): 22-42.