Student-authored questions as a learning activity: utilising the generation, testing and self-explanation effects.
Students use PeerWise to answer, create and share multiple-choice questions with their peers. Core concepts are reinforced through practice as they answer questions relating to their subject. Then they apply their knowledge by creating questions for others and critically engaging with peer-generated content.
Background
In 2007, Associate Professor Paul Denny developed PeerWise, a system of student-authored assessment questions. He recognised the popularity of practice multiple-choice questions (MCQs) among his students, but creating large banks of questions is time-consuming. His innovative solution was to involve his students in question authoring. PeerWise allows students to create, answer, rate, and discuss questions. By taking part in the construction and evaluation of questions, students develop higher-order thinking skills and deepen their learning. Writing questions helps students actively and critically engage with concepts.
A growing body of literature recognises that learning strategies requiring further cognitive effort (e.g., self-explanation, retrieval practice, or drawing from memory) encourage students to engage with and process material on a deeper level. Providing banks of test questions allows students to drill and practice, yet this relies predominantly on lower-order thinking skills, namely remembering, understanding, and applying. The testing effect¹ (being tested on previously studied material) is a powerful way to reinforce information, but Paul recognised there was more to be gained through further cognitive challenges. Encouraging students to generate question stems and alternatives, indicating which alternative is correct and why, shifts them into using higher-order thinking skills. That students remember information better when they take an active role in producing that information is known as the generation effect². Students are also required to interact critically with their peers' questions through feedback and discussion. Encouraging self-explanation supports the development of conceptual understanding.
Paul Denny talks about PeerWise student questions.
Self and peer assessment
Students self-assess by answering MCQs in a drill-and-practice fashion, while peer assessment occurs in question feedback forums. Students engage in cognitively demanding tasks that support consolidation and reinforce core concepts. Requiring students to review, clarify, and provide feedback on peer contributions can help diagnose misconceptions and identify knowledge gaps. Students also gain experience in presenting ideas and concepts clearly while being exposed to varied perspectives, ideas, and approaches to course content.
Professor David Nicol talks about the importance of activating learners’ judgment of their work by providing opportunities to compare their work with exemplars or peers’ work. His research claims that analogical comparisons (comparing your work against similar work) are often better than analytical comparisons (comparing your work against criteria or feedback).
How does it work?
Typically, at the beginning of a term, a course using PeerWise begins with an empty repository. This grows gradually as the course progresses and students author and contribute relevant questions. All activity remains anonymous to students; however, instructors can view the identity of question and comment authors and have the ability to delete inappropriate questions. In practice, instructor moderation is rarely necessary, and PeerWise is often used with little staff involvement.
Benefits to students
Designing questions
Generating a question requires students to think carefully about the course topics and how they relate to the learning outcomes. Writing questions focuses attention on those outcomes and makes teaching and learning goals more apparent to students.
Choosing distractors
Creating plausible distractors (incorrect multiple-choice alternatives) requires students to consider misconceptions, ambiguity, and possible interpretations of concepts.
Writing explanations
Explanations require students to express their understanding of a topic with as much clarity as possible. This acts to develop their written communication skills and deepen their understanding.
Answering questions
Answering questions in a drill-and-practice fashion reinforces learning and incorporates elements of self-assessment. Students can see how others have answered the same questions, allowing them to gauge how well they are coping in the course.
Evaluating quality
Evaluating existing questions incorporates higher-order cognitive skills, requiring a student to consider the content and what makes a particular question more effective than other questions.
Benefits to teachers
Early feedback
Instructors can see how students are answering individual questions in real time and can identify and address common misunderstandings in a timely fashion. Analysing student comments can reveal further insight into student perception of topics within the course.
Large test banks
The development of MCQ test banks is a very time-consuming activity. Placing this in the hands of the students is a fast, low-cost way for instructors to access a large body of MCQ test items designed specifically to test the course content.
Student confidence
By evaluating the topic areas for which students have created questions, instructors can get a sense of which topics students are more confident with and which topics students are not engaged with.
Large classes
PeerWise performs well in large classes: with more contributors, more high-quality questions are produced, giving students access to a larger pool of effective practice material.
Further resources
- Read more about Paul Denny’s research on PeerWise.
- Watch Paul Denny talk about crowdsourcing, gamification, and producing personalised feedback from a repository of student-generated learning resources.
- Hilton, C. B., Goldwater, M. B., Hancock, D., Clemson, M., Huang, A., & Denyer, G. (2022). Scalable Science Education via Online Cooperative Questioning. CBE—Life Sciences Education, 21(1), ar4.
- Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331(6018), 772–775.
- Slamecka, N. J., & Graf, P. (1978). The generation effect: Delineation of a phenomenon. Journal of Experimental Psychology: Human Learning and Memory, 4(6), 592–604.