
Question types: Closed (Students select from options presented to them)


Closed question types such as multiple-choice, ranking, and scales questions allow students to select responses from a list. There are options for how results are presented, including showing the results of a question organised by responses to a previous question (segmentation) and showing changes in responses over time (trends).

Multiple-choice questions

Multiple-choice questions provide an option for students to select one or more options from a list. If you wish, you can pre-select a correct answer to be revealed at a time of your choice. The results can be displayed in the form of a bar chart as follows:

MCQ responses in bar chart format. Question: In which of the following contexts have you used Mentimeter / polling? Responses: Lectures/presentations (15); Smaller workshop/seminar sessions (18); Online synchronous sessions, e.g. using Zoom (21); Asynchronous use between sessions, e.g. in the VLE or emailed to participants before a session (2); Other(s) (8).

Alternatively, the results can be shown in the form of a pie chart, donut or dots as follows:

MCQ results (same question as above) displayed in the form of a pie chart, donut (ringed pie chart) and dots (one small dot for each response, shown in a group next to each option).


Feedback from students at the University shows a preference, for knowledge-checking MCQs, for answers to be hidden as they are received so that all students have the chance to respond without being influenced by the answers of others. See the following Mentimeter guide to learn how to show or hide responses.

For multiple-choice questions with a single correct answer you can also apply 'segmentation', which displays the results of one question organised according to responses to a previous question. This allows you to make connections between the results of different questions and to highlight patterns, for example whether those who answer question 1 in a particular way are more likely to answer question 2 in a particular way. Asking the same question(s) at different stages of a session can also be a useful way of exploring and highlighting changes after teaching or discussion. The following video example shows how segmentation was used by Sally Quinn (Department of Psychology) to show a breakdown of how responses to a question asked at the beginning of the session changed when the same question was asked at the end:

Sally Quinn: Using segmentation to track changes in learning (Panopto video player, 2 mins 3 secs, UoY Panopto log in required)
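Mentimeter produces the segmented view for you, but conceptually segmentation is a cross-tabulation: each participant's answer to the later question is grouped under their answer to the earlier one. The following minimal sketch illustrates the idea with entirely hypothetical paired responses (the option labels and counts are invented for illustration):

```python
from collections import Counter, defaultdict

# Hypothetical paired responses: (answer at start of session, answer at end),
# one tuple per participant. These are illustrative values only.
responses = [
    ("Start: Not confident", "End: Confident"),
    ("Start: Not confident", "End: Confident"),
    ("Start: Not confident", "End: Not confident"),
    ("Start: Confident", "End: Confident"),
    ("Start: Confident", "End: Confident"),
]

# Segment the end-of-session answers by each participant's starting answer.
segments = defaultdict(Counter)
for first_answer, second_answer in responses:
    segments[first_answer][second_answer] += 1

for first_answer, counts in segments.items():
    print(first_answer, dict(counts))
```

Each printed row corresponds to one segment in Mentimeter's display: the breakdown of second-question answers among participants who gave a particular first answer.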


Ranking questions

Ranking questions allow students to put options in order and display the responses by average ranking:

Question: What aspects of Mentimeter did students think were the most important? Rankings from 37 responses are shown as follows: Rank 1: Anonymity & Active participation in class time; 2: Checking understanding & Interaction with teaching staff; 3: Variety & Fun; 4: Change of pace & Comparison with others; 5: Receiving feedback & Stimulating reflection/deeper learning.
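The 'average ranking' Mentimeter displays is the mean of the positions each option received across all respondents, with options ordered from lowest (best) average to highest. A minimal sketch of that calculation, using invented ballots and a shortened option list for illustration:

```python
# Hypothetical ranking ballots: each list orders the options from most to
# least important for one respondent. Labels and data are illustrative only.
ballots = [
    ["Anonymity", "Fun", "Variety"],
    ["Anonymity", "Variety", "Fun"],
    ["Fun", "Anonymity", "Variety"],
]

options = ballots[0]

# Average rank per option (rank 1 = placed first); lower is better.
average_rank = {
    option: sum(ballot.index(option) + 1 for ballot in ballots) / len(ballots)
    for option in options
}

# Sort by average rank to produce the displayed order.
for option, rank in sorted(average_rank.items(), key=lambda item: item[1]):
    print(f"{option}: {rank:.2f}")
```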


Scales questions

Scales questions allow students to select a response on a sliding scale. Answers are presented with the mean highlighted and the distribution shown. Hovering over a question shows how many students selected each option on the scale:

Question: How well do these methods support your learning? Average responses on a scale of 1 to 10 (bad to good) are: Active learning: 8.9; Lectures: 7.1; Peer-to-peer problem solving: 5.3; Reading: 5.6; Writing: 5.9. The breakdown for active learning shows that 1 person chose 10, 4 chose 9, and 2 chose 8.

The following example shows how Gareth Evans (Department of Biology) used scales questions to encourage self-assessment at the beginning of workshops, asking students to rate how well they felt they had met each of the learning outcomes for the week.

Question: Do you feel you have met this week's learning outcomes? Responses shown for each of three learning outcomes as the average on a scale from 1 (not confident) to 5 (very confident). Each has a curve showing the distribution of responses around the mean.

This was the first part of a standard pattern of activities in workshops to support a flipped learning approach. The review and self-assessment activity was followed by a Q&A, a knowledge check, and a group practice activity with discussion and feedback. This is described in the following example video:

Gareth Evans: Mentimeter for flipped workshops (Panopto video player, 3 mins 41 secs, UoY Panopto log in required)

When using scales questions it is possible to collect and compare historical data to identify trends in the way that students respond. When reusing a presentation, you can opt to 'reset results' and use the same questions a second time. If you do this, historical responses are stored and you can use the 'Show trends' option. In the example shown below, students are asked to rate their confidence about meeting the module learning outcomes at the end of each weekly session. The 'Show trends' option reveals how confidence levels for each outcome rise and fall as the module progresses, tracking self-assessment of progress and achievement over time.

Graph showing 9 sessions on the x axis and a scale from 0 - not confident to 5 - very confident on the y-axis.  For each of 5 learning outcomes, a line shows changes in confidence ratings across the sessions.

100 points questions

100 points questions allow students to prioritise a fixed set of options by allocating a total of 100 points across them. This is similar to ranking but lets each respondent indicate the strength of their preference. The average points allocation for each option is shown in the results.
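As a sketch of the underlying calculation (with invented option labels and allocations): each respondent's allocation sums to exactly 100, and the displayed result is the mean points per option across respondents.

```python
# Hypothetical 100-points responses: each respondent splits 100 points
# across the same fixed options. Labels and values are illustrative only.
allocations = [
    {"Option A": 50, "Option B": 30, "Option C": 20},
    {"Option A": 70, "Option B": 20, "Option C": 10},
    {"Option A": 30, "Option B": 40, "Option C": 30},
]

# Every allocation must total exactly 100 points.
assert all(sum(a.values()) == 100 for a in allocations)

# The displayed result is the mean points per option across respondents.
averages = {
    option: sum(a[option] for a in allocations) / len(allocations)
    for option in allocations[0]
}
print(averages)
```

Unlike ranking, two respondents who order the options identically can still produce different results here, because a 60/30/10 split expresses a stronger first preference than a 40/35/25 split.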