2025-08-18¶
- Date: 2025-08-18
- Lead discussion: Nima Rafati
- Paper: McClellan, D., Chastain, R.J. & DeCaro, M.S. Enhancing learning from online video lectures: the impact of embedded learning prompts in an undergraduate physics lesson. J Comput High Educ 36, 852–874 (2024). https://link.springer.com/article/10.1007/s12528-023-09379-w
Notes¶
Questions¶
- Q: What grade on a scale from 1 (worst) to 10 (best) would you give this paper?
My grade would be a 7.
- Q: How would you praise the paper?
- It is quite a straightforward paper
- I learned the term 'surface level information processing'
- The paper made me do a literature search, as some of its references are too old
- I learned about Edpuzzle, which seems a fine website for me to try this out
- I now believe that cognitive prompts improve grades for disorganized students by 9%, whereas these grades decrease for organized students by 17%
- Q: How would you criticise the paper?
- There is no mention of how to access the data. The first author's homepage offers no help either
- The demographics of the students (especially the distribution of disorganized versus organized learners) are missing
- Old (2012 and 2015) literature references for 'Research supports the use of video lecture as an effective method of instruction'; I could easily find reviews from 2019, 2020, 2021 and 2022. The newest paper cited (in this 2023 paper) is from 2021.
- The conclusion is not completely true: cognitive prompts do improve grades for disorganized students, but they decrease grades for organized students. This seems to be tucked away
- It does not mention other ways to increase grades
- Q: How would you summarize the paper in one line?
Cognitive prompts improve grades for disorganized students by 9%, whereas these grades decrease for organized students by 17%.
- Q: Should we add cognitive prompts to our video material?
We could argue that we should aim for the weaker students: cognitive prompts improve the grades of these disorganized students by 9%.
However, for asynchronous teaching (as in this setting), other interventions are more effective.
For example, [Brecht, 2012] found that videos 'with a strong presentation of relief and change-of-pace elements (design 2's use of graphics and sounds)' are 13% more effective.
However, for NBIS, we typically teach live, giving us more parameters to work with. On Hattie's list of effect sizes, the closest I could find to 'cognitive prompts on video' was:
Rank | Influence | Effect size |
---|---|---|
67 | Interactive video methods | 0.54 |
This means that there are 66 interventions that are more effective; here are some:
Rank | Influence | Effect size |
---|---|---|
3 | Teacher estimates of achievement | 1.62 |
7 | Jigsaw method | 1.2 |
12 | Teacher credibility | 0.9 |
13 | Micro-teaching/video review of lessons | 0.88 |
15 | Classroom discussion | 0.82 |
17 | Deliberate practice | 0.82 |
26 | Evaluation and reflection | 0.75 |
- Q: How would this paper make us better teachers?
(putting it in a bigger context, as recommended by [Deenadayalan et al., 2008])
I will be able to resist adding cognitive prompts: I feel it is not worth my time.
My questions¶
I received this paper as a tip from Kristen Schröder on June 9th. On July 27th, Nima suggested his picks. How did he become interested in this paper? Was it via Kristen?
Abstract¶
students who received cognitive prompts exhibited higher quiz scores than students in the control condition
How much?
From Figure 1: from 0.70 to 0.80, an increase of ((0.8 − 0.7)/0.7 ≈ 0.14), i.e. 14%.
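That relative increase is easy to sanity-check; a minimal Python sketch, where the 0.70 and 0.80 values are my readings of Figure 1 (as above), not numbers reported in the paper's text:

```python
def relative_change(before: float, after: float) -> float:
    """Relative change from before to after, e.g. 0.14 means a 14% increase."""
    return (after - before) / before

# Quiz scores read off Figure 1: control 0.70, cognitive prompts 0.80
print(f"{relative_change(0.70, 0.80):.0%}")  # prints 14%
```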
Students who reported having more disorganized study approaches benefited the most from cognitive prompts.
How much more?
From Figure 2, using graphreader:
- The disorganized students improved from 0.79 to 0.86, which is a ((0.86 − 0.79)/0.79 ≈ 0.09) 9% increase
- The organized students decreased from 0.86 to 0.71, which is a ((0.71 − 0.86)/0.86 ≈ −0.17) 17% decrease
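The same sanity check for the Figure 2 readings; again, the input values are my graphreader estimates, not numbers reported in the paper:

```python
def relative_change(before: float, after: float) -> float:
    """Relative change from before to after; negative means a decrease."""
    return (after - before) / before

# Figure 2 readings (control score, cognitive-prompt score) per group
print(f"disorganized: {relative_change(0.79, 0.86):+.0%}")  # prints +9%
print(f"organized:    {relative_change(0.86, 0.71):+.0%}")  # prints -17%
```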
Introduction 1¶
Research supports the use of video lecture as an effective method of instruction, both as supplemental to other learning methods (Brecht, 2012; Stockwell et al., 2015), and in place of traditional lecture methods (Fireman et al., 2021).
Checking [Brecht, 2012], this does indeed seem to be a paper demonstrating that adding videos with change-of-pace elements increases the course grade by 27% compared to having no videos. This is, however, based on a back-of-the-envelope calculation.
Videos with a strong presentation of relief and change-of-pace elements (design 2's use of graphics and sounds) are the most learning-effective.
Adapting figure 5 (tables.ods), these are the average grades:
Parameter | Lecture only | Video 1 | Video 3 | Video 2 |
---|---|---|---|---|
Average grade | 66.015 | 71.06 | 73.925 | 83.675 |
Increase | . | 8% | 12% | 27% |
Increase from video 3 to video 2: ((83.675 − 73.925)/73.925 ≈ 0.13) 13%
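The 'Increase' row above, and the 13% jump from video 3 to video 2, can be reproduced from the average grades (these are my readings of Brecht's figure 5 via tables.ods):

```python
# Average grades read off Brecht (2012), figure 5
grades = {"Lecture only": 66.015, "Video 1": 71.06, "Video 3": 73.925, "Video 2": 83.675}
baseline = grades["Lecture only"]

# Increase of each design over lecture-only teaching
for design, grade in grades.items():
    print(f"{design}: {(grade - baseline) / baseline:.0%} above lecture only")

# Increase from video 3 to video 2
print(f"{(grades['Video 2'] - grades['Video 3']) / grades['Video 3']:.0%}")  # prints 13%
```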
Checking [Stockwell et al., 2015], this is indeed a paper showing that video lectures improve the course grade over lectures alone (from 63 to 74, which is an increase of ((74 − 63)/63 ≈ 0.17) 17%).
Why not add a proper and newer review paper, such as [Noetel et al., 2021]:
Although results may be subject to some experimental and publication biases, they suggest that videos are unlikely to be detrimental and usually improve student learning.
or [Belt & Lowenthal, 2021].
[Nothing too useful]
or [Fyfield et al., 2022]:
- Instructional videos that are shorter, segmented, coherent and paired with learning activities are more likely to lead to improved learning gains in students.
- Researchers reporting on the use of videos should provide comprehensive descriptions of media, including links to the media where possible.
- Designers of instructional videos should critically evaluate design principles established for non-video media
or [Robertson & Flowers, 2020]:
we surmise that the creation of video lectures is meaningful and worth the time, but only when provided with other, traditional materials as well.
or [Fyfield et al., 2019]:
experimental research [...] has established that videos should be short, uncluttered, and restricted to one clearly identified learning goal. There is also robust evidence to suggest videos should be accompanied by learning activities, rather than watched passively.
Introduction 2¶
Here is another case of not reading the literature:
We examine whether the type of prompt (cognitive or metacognitive) differentially impacts learning.
This is similar to [Lin & Chen, 2019], which used brainwaves to detect when to give a prompt:
Analytical results indicate that students in the experimental group exhibited significantly better review effectiveness than did the control group, and this difference was especially marked for students who had a low attention level, were field-dependent, or were female.
Study approaches: disorganization¶
Disorganized study strategies are related to tendencies towards mastery-avoidance goal orientation
I like this!
Current study¶
However, because the metacognitive prompt condition has not always elicited benefits compared to cognitive prompts in prior research (Berthold et al., 2007), an alternative possibility is that cognitive prompts will lead to greater learning benefits in our study as well.
Seems like HARKing to me.
Learning outcomes¶
Setting | Grade |
---|---|
No prompts | 0.71 |
Metacognitive prompts | 0.74 |
Cognitive prompts | 0.8 |
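As a side note, the relative gains over the no-prompts baseline in this table work out as follows (my own arithmetic on the table values, not percentages reported by the paper):

```python
# Grades per condition, from the learning-outcomes table above
grades = {"No prompts": 0.71, "Metacognitive prompts": 0.74, "Cognitive prompts": 0.80}
baseline = grades["No prompts"]

for condition, grade in grades.items():
    print(f"{condition}: {(grade - baseline) / baseline:+.0%}")
# Metacognitive prompts: +4%, cognitive prompts: +13%
```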
A main effect of course was also found [...] Despite this main effect, no interaction was found between condition and course, [...] indicating that the effect of condition did not differ across the three semesters
I would have loved to see the data to see for myself if this is true.
Table 3¶
The intercept for Quiz score is 2.97, i.e. the predicted score when there are no prompts.
There are, however, 3 questions:
Due to a coding error, this quiz consisted of three items
How is this 2.97 calculated?
Discussion¶
Students with more organized approaches to studying scored equally well across conditions.
Is this what the data in figure 2 states? I would say:
- The disorganized students improved from 0.79 to 0.86, which is a ((0.86 − 0.79)/0.79 ≈ 0.09) 9% increase
- The organized students decreased from 0.86 to 0.71, which is a ((0.71 − 0.86)/0.86 ≈ −0.17) 17% decrease
Conclusion¶
Based on our findings, instructors should consider adding cognitive learning prompts to their asynchronous video lectures.
Disagree. For example, [Brecht, 2012] found that videos 'with a strong presentation of relief and change-of-pace elements (design 2's use of graphics and sounds)' are 13% more effective (compared to the 9% improvement for disorganized learners only, as shown in this paper).
References¶
- [Brecht, 2012] Brecht, H. David. "Learning from online video lectures." Journal of Information Technology Education: Innovations in Practice 11 (2012): 227.
- [Stockwell et al., 2015] Stockwell, Brent R., Melissa S. Stockwell, Michael Cennamo, and Elise Jiang. "Blended learning improves science education." Cell 162.5 (2015): 933-936.
- [Noetel et al., 2021] Noetel, Michael, et al. "Video improves learning in higher education: A systematic review." Review of Educational Research 91.2 (2021): 204-236.
- [Belt & Lowenthal, 2021] Belt, Eric S., and Patrick R. Lowenthal. "Video use in online and blended courses: A qualitative synthesis." Distance Education 42.3 (2021): 410-440.
- [Fyfield et al., 2022] Fyfield, Matthew, Michael Henderson, and Michael Phillips. "Improving instructional video design: A systematic review." Australasian Journal of Educational Technology 38.3 (2022): 155-183.
- [Robertson & Flowers, 2020] Robertson, Barbara, and Mark J. Flowers. "Determining the impact of lecture videos on student outcomes." Learning and Teaching 13.2 (2020): 25-40.
- [Fyfield et al., 2019] Fyfield, Matthew, et al. "Videos in higher education: Making the most of a good thing." Australasian Journal of Educational Technology 35.5 (2019): 1-7.
- [Lin & Chen, 2019] Lin, Yong-Teng, and Chih-Ming Chen. "Improving effectiveness of learners' review of video lectures by using an attention-based video lecture review mechanism based on brainwave signals." Interactive Learning Environments 27.1 (2019): 86-102.