2025-12-15

  • Date: 2025-12-15
  • Lead discussion: Stephan Nylinder
  • Paper: Bragg, Leicha A., Chris Walsh, and Marion Heyeres. "Successful design and delivery of online professional development for teachers: A systematic review of the literature." Computers & Education 166 (2021): 104158. Paper

Notes

Questions

  • Q: What grade on a scale from 1 (worst) to 10 (best) would you give this paper?
My answer

5: it does try to do what it states ('This systematic review’s findings report on design elements that lead to effective OPD learning experiences for teachers'), but in a hard-to-read fashion, with poor tables and a sloppy/flexible assignment of papers to those tables.

  • Q: How would you praise the paper?
My answer

It is a fine attempt at a meta-analysis, done with too little time.

  • Q: How would you criticise the paper?
My answer
  • I found it hard to understand what this paper was about
  • No public data
  • Q: How would you summarize the paper in one line?
My answer

How best to set up an online course to improve teachers?

  • Q: Should we do what is in the paper?
My answer

I would only follow an online course to improve myself as a teacher for fun: according to the appendix below, it has no effect on the learners.

  • Q: How does this paper make us a better teacher?
My answer

I can make a better-informed decision on whether to follow an online course to improve myself as a teacher.

My notes

Term       Description
OPD        Online professional development
PCK        Pedagogical content knowledge
CASP       Critical Appraisal Skills Programme
Andragogy  Adult education
Pedagogy   Education in general, or of children only
Heutagogy  Self-study
DBD        Data-based decisions
ELA        English language arts
SWD        Student with disability
  • 11 studies
  • Linking to course satisfaction is useless: we know it correlates with nothing [Clayson, 2009].
  • Figure 2: no negative correlations?

I have a hard time connecting to OPD. Let's read one of the reviewed papers, say [Brown and Woods, 2012], which, according to Table 3, had:

  • High participant satisfaction: this I believe to be useless. Finding: their course had a participant satisfaction of around 4.3 out of 5 (see the sketch after this list). Here are the components of this satisfaction:
Satisfaction component       Mean (max = 5)  Standard deviation
personal learning            4.5             0.48
course technology            3.99            0.63
course content: process      4.4             0.35
course content: activities   4.42            0.35
course content: assignments  4.41            0.48
  • Encourages continuous teacher reflection on the content and the learning experience: I wonder how useful this is.
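
To make the 'around 4.3 out of 5' above concrete, here is a minimal sketch that averages the component means from the table. Note the assumption: I take an unweighted mean of the component means; [Brown and Woods, 2012] may weight or aggregate the components differently.

```python
# Satisfaction component means from Brown and Woods (2012),
# transcribed from the table above (scale: 1 to 5).
components = {
    "personal learning": 4.5,
    "course technology": 3.99,
    "course content: process": 4.4,
    "course content: activities": 4.42,
    "course content: assignments": 4.41,
}

# ASSUMPTION: an unweighted mean of the component means;
# the paper may aggregate the components differently.
overall = sum(components.values()) / len(components)
print(f"Overall satisfaction: {overall:.2f} out of 5")  # 4.34 out of 5
```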

From [Brown and Woods, 2012] I find these two instructional models:

  • R.O.P.E. (Read, Observe, Practice, Exhibit)
  • R2D2 (Read, Reflect, Display, Do)

Let's read a paper that has an outcome I am interested in. I am mostly interested in something practical, hence 'Improved instructional practices'. [Brown and Woods, 2012] is mentioned there as well. Let's take a closer look.

Something positive           Before  After
General expressive examples  97      100
General receptive examples   83      91
True words                   50      85
Communication form           54      92
Communication content        33      81
Communication use            60      92
Intervention targets         47      81
Intervention strategies      54      94
Total score                  59      89
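
To see the size of these gains at a glance, a minimal sketch that computes the absolute gain per measure (scores transcribed from the table above; I assume they are percentage scores, as the table does not state the unit):

```python
# Before/after scores from Brown and Woods (2012), transcribed from
# the table above. ASSUMPTION: these are percentage scores; the
# source table does not state the unit explicitly.
scores = {
    "General expressive examples": (97, 100),
    "General receptive examples": (83, 91),
    "True words": (50, 85),
    "Communication form": (54, 92),
    "Communication content": (33, 81),
    "Communication use": (60, 92),
    "Intervention targets": (47, 81),
    "Intervention strategies": (54, 94),
    "Total score": (59, 89),
}

# Print the absolute gain per measure, largest gain first.
for measure, (before, after) in sorted(scores.items(), key=lambda kv: kv[1][0] - kv[1][1]):
    print(f"{measure}: {before} -> {after} (gain {after - before:+d})")
```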

Let's try a second paper. I am most interested in [Bragg et al., 2021]'s Figure 2, outcome 'Instructional practice'. The strongest effect is from the design element 'Practical learning activities'. Let's find a study that could be in that cell, as the size of the blue dot indicates either that the effect is clear or that there are multiple studies.

Per first author, the effect found and the quality scores (a '.' in the original table means the CASP score was not reported):

  • Erickson: Increased level of competency to apply and implement research-based practices of secondary transition in the classroom (EPHPP: Weak; CASP: Strong)
  • Magidin de Kramer: Significant positive effects on vocabulary, ELA knowledge, and instructional writing practices (EPHPP: Weak; CASP: not reported)
  • Masters: Improved instructional practices (EPHPP: Weak; CASP: not reported)
  • Orleans: Improved teaching practice (EPHPP: Weak; CASP: Moderate)

OK, let's try [Erickson et al., 2012], as it seems to be the best-done research of the four papers that had an effect on actual teaching.

[Erickson et al., 2012] seems to be mostly theory ... with Table 3 titled 'Gains in knowledge on curriculum-referenced assessments for rural participants' and Table 4 'Percent correct on curriculum-referenced assessments for rural participants': these seem to be just tests! I give up; I feel that [Bragg et al., 2021]'s 'Increased level of competency to apply and implement research-based practices [...]' is incorrect.

Appendix

See Table 2 of the appendix:

Effect of a teacher course on the grades of learners (mostly none):

Study              Effect on learners
Dash               None
Goldenberg         None
Griffin            None
Magidin de Kramer  Reading comprehension practices
O'Dwyer            Larger changes in student content knowledge scores (only in an 8th-grade Mathematics trial)
Orleans            Increased scores in achievement (unsure if this is about learners or teachers)

Table 1 of [Bragg et al., 2021] shows the outcomes.

Back to the original paper, [Bragg et al., 2021], which states that [Brown and Woods, 2012] has the outcome 'High participant satisfaction' (Table 1). I would ask: (1) what counts as 'high'? (2) do the measured components correlate with anything at all?

What is new?

My initial idea was that this paper would help me improve my professional development. However, it seems that the paper answers the question 'How to set up a course to improve teachers?'. I see little difference from the question 'How to set up a course?', because:

  • I assume teachers are regular adult learners (however, this paper may change my ideas on that)
  • I assume that useful course outcomes for teachers are regular learning outcomes, like those of any other course: they need to change behavior

What is andragogy?

It is pedagogy for adults. However, it is undecided whether this is a special branch of pedagogy (i.e., whether it follows principles different from regular pedagogy).

These are its principles, from https://www.instructionaldesign.org/theories/andragogy/:

  • Adults need to be involved in the planning and evaluation of their instruction.
  • Experience (including mistakes) provides the basis for learning activities.
  • Adults are most interested in learning subjects that have immediate relevance to their job or personal life.
  • Adult learning is problem-centered rather than content-oriented.

Where they refer to:

  • Knowles, M. (1975). Self-Directed Learning. Chicago: Follet.
  • Knowles, M. (1984). The Adult Learner: A Neglected Species (3rd Ed.). Houston: Gulf Publishing.
  • Knowles, M. (1984). Andragogy in Action. San Francisco: Jossey-Bass.

What is CASP?

CASP is a method to assess the quality of a piece of research, e.g. the CASP Checklist for Qualitative Research:

Item  Question
1     Was there a clear statement of the aims of the research?
2     Is a qualitative methodology appropriate?
3     Was the research design appropriate to address the aims of the research?
4     Was the recruitment strategy appropriate to the aims of the research?
5     Was the data collected in a way that addressed the research issue?
6     Has the relationship between researcher and participants been adequately considered?
7     Have ethical issues been taken into consideration?
8     Was the data analysis sufficiently rigorous?
9     Is there a clear statement of findings?
10    How valuable is the research?

References

  • [Brown and Woods, 2012] Brown, J. A., & Woods, J. J. (2012). Evaluation of a multicomponent online communication professional development program for early interventionists. Journal of Early Intervention, 34(4), 222–242. https://doi.org/10.1177/1053815113483316

  • [Bragg et al., 2021] Bragg, L. A., Walsh, C., & Heyeres, M. (2021). Successful design and delivery of online professional development for teachers: A systematic review of the literature. Computers & Education, 166, 104158. Paper

  • [Clayson, 2009] Clayson, D. E. (2009). Student evaluations of teaching: Are they related to what students learn? A meta-analysis and review of the literature. Journal of Marketing Education, 31(1), 16–30. This meta-analysis observes that many papers report a link between learner ratings and some outcome metric, but that the effect vanishes in bigger studies and/or studies with rigorous metrics; it concludes that there is no relation between ratings given by learners and any outcome metric.

  • [Erickson et al., 2012] Erickson, A. S. G., Noonan, P. M., & McCall, Z. (2012). Effectiveness of online professional development for rural special educators. Rural Special Education Quarterly, 31(1), 22–32. https://doi.org/10.1177/875687051203100104