What can we learn about mathematics multiple choice problems from attached supporting examples?

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Multiple choice (MC) items are a natural choice for automated online assessment. Ideally, making a choice should be based on knowledge and reasoning. Nevertheless, studies demonstrate that various other techniques (e.g. guessing) are common practice. In the last decade, technology has been recruited to support real-time feedback as formative assessment for teaching and learning. One of the affordances of the STEP platform is for students to use an interactive diagram to explore an example space and submit examples that respond to a prompt given in a task. This study examines whether and how learner-generated examples, when required as support for the choice made in an MC task, can be automatically identified to give insight into the learners' understanding. Results show discrepancies between the correct statements chosen and their supporting examples. Other automatically assessed characteristics relate to learners' approaches and strategies.
Original language: English
Title of host publication: Proceedings of the Tenth Congress of the European Society for Research in Mathematics Education
Editors: T. Dooley, G. Gueudet
Place of Publication: Dublin, Ireland
Publisher: DCU Institute of Education & ERME
Pages: 2437-2445
Number of pages: 9
State: Published - 1 Feb 2017
