Copying from Word and pasting directly into the Moodle editor takes too much Microsoft formatting information with it. Authors who wish to paste in text from Word are asked to ensure they use the 'paste from Word' button. Going via this route will strip much of the unwanted formatting information.
Please be aware that several authors have not followed this advice with the consequence that their questions failed to operate correctly. They, of course, blamed Moodle but investigations revealed that the problems were of their own making.
The Question bank can be accessed from the site Administration block.
There are four question bank actions: creating and editing questions, creating categories in which to store the questions, importing questions (from a Moodle XML format file) and exporting questions (to a Moodle XML format file).
Understanding these tools will enable you to store and retrieve your questions more efficiently.
Questions: Where the questions are held in the VLE. Your questions will normally be held within your module where they will be visible to all your colleagues who are 'Website updaters' on this module. When the module is rolled forwards for the next year the questions will be included. It is also possible to share questions across modules by storing the questions in categories at the Faculty or School level - please read on.
Categories: By creating categories for different topics, books or blocks of the module you are creating a logical storage structure so that you and all your colleagues will be able to locate all the questions on that topic in one area. It is also possible to create categories that span Faculties or Schools.
Import: It is possible to import files of questions in Moodle XML format.
Export: It is possible to export questions for import into another module. Please ensure that you use ‘Moodle XML format’.
Click on 'Categories' and you will be taken to the following screen where you can create your own categories.
Possible structures can be based on Blocks, Books, Chapters or Units. We anticipate that most questions will be used within individual modules and consequently we anticipate that most question categories will be created within the module. In the example above there are three main categories of questions for iCMA01, iCMA02 and iCMA22.
Please note that it is also possible to reach the question bank from the Quiz administration block and if you do this you will also find a category for this particular quiz.
Please be aware that entering the question bank from the site Administration block (Figure 1.2) does not show the categories associated with individual quizzes. Please compare and contrast Figures 1.2 and 1.3.
Should you wish to export questions from one module and import them into another module you will be offered a range of possible formats. Our experience, however, is that only one of these is reliable and we recommend that you use 'Moodle XML format'.
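For orientation, here is a minimal, illustrative sketch of the kind of content an exported Moodle XML file contains (the exact elements vary with question type and Moodle version, so treat this as an assumption rather than a template):

    <?xml version="1.0" encoding="UTF-8"?>
    <quiz>
      <!-- one question element per exported question -->
      <question type="shortanswer">
        <name><text>Capital city</text></name>
        <questiontext format="html">
          <text>What is the capital city of France?</text>
        </questiontext>
        <answer fraction="100">
          <text>Paris</text>
        </answer>
      </question>
    </quiz>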
In Moodle 2 images and any audio-visual content are included within questions when exporting and importing. If there are multiple large videos in your questions then it is possible that you will exceed the memory allocated to the export system. If you encounter this problem and your intention is to move questions within OU VLE modules please talk to LTS staff who are able to move questions within the VLE via higher level faculty categories in the question bank (and thereby circumvent the export).
Having created your categories click on the Questions link in the administration block to show:
In the Question bank select the category that you wish to store your question in and then choose one of the question types from 'Create a new question…'. This will bring up the ‘Choose a question type to add’ dialogue box.
These Moodle question types can be categorised by the type of interaction that they support.
There is also a seventh category, Description, which enables you to include blocks of information within an assessment.
There are a variety of question types that accept numeric responses.
The simplest to use but also the most limited.
All Variable numeric types:
One or more numeric response fields can be combined with any of Pattern match, OU multiple response and Select missing words.
The STACK question type has the Maxima computer algebra system behind it and as such offers a complete mathematical sub-system for handling a wide variety of problems that result in numeric answers.
When you include a number in a Pattern match response string it is treated as a number, that is, match(5) will match 5, 5.0, 5e0 etc.
These two question types are described in detail elsewhere on this website.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
You may also include formatted mathematics using TeX or MathML. The text surrounded by two $ symbols in this example will be interpreted by the TeX filter and converted to nicely formatted mathematics. Please see the Including TeX and MathML section below.
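For example, a hypothetical fragment of question text using the TeX notation (the content is purely illustrative):

    Calculate the area of a circle of radius $$r = 2\,\mathrm{m}$$ using $$A = \pi r^2$$.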
The response box may be placed within the question rubric by including a sequence of underscores e.g. _____. At runtime the response box will replace the underscores and the length will be determined by the number of underscores. The minimum number of underscores required to trigger this action is 5. If no underscores are present the response box will be placed after the question and will be a full length box.
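For instance, a hypothetical question text with an embedded response box might read:

    The boiling point of pure water at standard atmospheric pressure is __________ °C.

At runtime the ten underscores are replaced by a response box whose length corresponds to the number of underscores.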
Default mark: Decide on how to score your questions and be consistent.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. We do not recommend that authors rely on using the machine generated 'Right answer' (from the iCMA definition form) as this is:
Answer: an integer or decimal number. Use 'E' format for very large or very small values. '*' matches all other answers.
Error: Only rely on exact values for integers; for all other values specify a suitable range.
Grade: The percentage of the mark to be allocated for this answer. The correct answer must be graded at 100%.
Feedback: Feedback that is specific to the answer.
Responses are matched in the order that they are entered into the form.
If you are asking your user to enter an integer then it is OK to leave 'Error' empty. This means you are looking for the exact value.
But if you are expecting a non-integer value, or if any of your 'units' calculations will result in a non-integer answer, then you should be aware that the computer will treat the student's answer as a 'real', i.e. decimal, number. Because of the digital nature of computers and the finite number of 'bits' used to store 'reals', it is often the case that a real number cannot be held exactly; try writing 2/3 in decimal format when you only have 6 digits available and you'll see the problem. So if you are testing a 'real' value you should always use a small 'Accepted error'. In the example of 2/3, you might look for a value of 0.665 with an 'Accepted error' of 0.0051, which will allow for both 0.66 and 0.67 and everything in between, such as 0.6667.
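To make the 2/3 example concrete, the settings and the resulting accepted range would be (values as suggested above):

    Answer: 0.665    Error: 0.0051
    Accepted range: 0.665 - 0.0051 = 0.6599  to  0.665 + 0.0051 = 0.6701

so 0.66, 0.6667 and 0.67 are all marked correct.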
The Numerical question type does not allow students to input powers and the Unit handling and Units sections provide ways around this. An alternative approach is provided in the Variable numeric with units question type.
Unit handling: a choice of three options:
Unit penalty:
Units are input using: A choice of three options:
Units go: left or right.
The multiplier is the factor by which the correct numerical response will be multiplied.
The first unit (Unit 1) has a default multiplier of 1. Thus if the correct numerical response is 5500 and you set W as the unit for Unit 1, which has the default multiplier of 1, the correct response is 5500 W.
If you add the unit kW with a multiplier of 0.001, this adds a correct response of 5.5 kW. This means that the answers 5500 W and 5.5 kW would both be marked correct.
Note that the accepted error is also multiplied, so an allowed error of 100 W would become an error of 0.1 kW.
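A hedged sketch of how these fields might be completed for the example above (the values are illustrative):

    Answer: 5500    Error: 100
    Unit 1: W     Multiplier: 1       accepts 5400 W to 5600 W
    Unit 2: kW    Multiplier: 0.001   accepts 5.4 kW to 5.6 kW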
If you are configuring your question to run in 'Interactive with multiple tries' mode such that students can answer questions one by one you can also provide hints and second (and third) tries by entering appropriate feedback into the boxes provided in the 'Multiple tries' section of the question definition form.
To use this section of the question form please ensure that on the iCMA definition form you have chosen 'How questions behave' = 'Interactive with multiple tries'.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
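As a worked illustration, assuming a penalty of 0.3333333 (which matches the figures quoted above):

    1st try correct: full available mark   = 1.0000000
    2nd try correct: 1 - 1 x 0.3333333     = 0.6666667
    3rd try correct: 1 - 2 x 0.3333333     = 0.3333334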
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
N.b. If you create a hint and then decide to delete it such that there is one less hint to the question you must ensure that all HTML formatting is also removed. To do this click on the 'HTML' icon in the editor toolbar which will display the HTML source. If any HTML formatting (in '< >') is present delete it and click on the HTML icon again to return to normal view.
Variable numeric questions use numeric values and calculations that are performed at runtime following rules laid down in the question. As such it is straightforward to produce numerous variations on a question. However this places an onus on the author to be sure that all possible variations will function properly and not inadvertently run into ‘division by zero’ or ‘square root of a minus number’ errors. Please see Variable numeric sets questions for a ‘safer’ alternative.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
Variables that are calculated by the computer are included in text fields by placing the names of the variables within double square brackets e.g. [[b]]. The formatting of the numbers may be controlled using sprintf() style controls (http://php.net/manual/en/function.sprintf.php). For example [[b,.3e]] will display the value of b in scientific notation with 3 decimal places. The most usual formatting choices are ‘d’ for integers, ‘f’ for floating point or ‘e’ for scientific notation.
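As an illustration, assuming a hypothetical variable b whose calculated value is 1234.5678, the output (as PHP's sprintf would format it) would be something like:

    [[b,.3e]]   displays 1.235e+3   (scientific notation, 3 decimal places)
    [[b,.2f]]   displays 1234.57    (floating point, 2 decimal places)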
You may also include formatted mathematics using TeX or MathML. The text surrounded by two $ symbols in this example will be interpreted by the TeX filter and converted to nicely formatted mathematics. Please see the Including TeX and MathML section below.
The response box may be placed within the question rubric by including a sequence of underscores e.g. _____. At runtime the response box will replace the underscores and the length will be determined by the number of underscores. The minimum number of underscores required to trigger this action is 5. If no underscores are present the response box will be placed after the question and will be a full length box.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. We do not recommend that authors rely on using the machine generated 'Right answer' (from the iCMA definition form).
String to act as seed for randomisation: Sometimes it is desirable to base a sequence of questions on the same set of numbers. To do this use the same string e.g. ‘mystring’ as the ‘seed for randomisation’ in each question in the sequence.
Please note that this then generates the same set of numbers across a sequence of questions for a student but different students get different sets of numbers.
As shown on this page variable names should use lower case letters.
The formula for all calculations should start with a variable name followed by an equals (=) sign and may use common mathematical operators +, -, *, /, brackets, (, ) and other variables defined within the question.
The function rand_int(n,m) provides a random integer within the range n to m inclusive. rand_int(n,m) uses the seed described above.
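A hedged sketch of a possible set of variable definitions (the names and ranges are illustrative):

    a = rand_int(2, 9)
    b = rand_int(10, 50)
    c = (a * b) / 2

Each line starts with a variable name and an equals sign, and later variables may refer to earlier ones, as described above.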
The mathematical processing that is available is similar to that provided within the Moodle Gradebook and includes
Recalculate now: With all variables specified the ‘Recalculate now’ button will calculate a set of values within the greyed out boxes.
Require scientific notation: It is possible to require the student to enter their response using scientific notation. The input box is provided with a superscript facility and all of the following formats are accepted: 1.234e5, 1.234x10⁵, 1.234X10⁵ and 1.234*10⁵.
For keyboard users the superscript may be accessed with the up-arrow key. Down-arrow returns entry to normal. You may wish to include this information within your question. If you do please note that the up-arrow and down-arrow provided in the HTML editor’s ‘insert custom characters’ list are not spoken by a screen reader and you should also include the words ‘up-arrow’ and ‘down-arrow’.
Answer: This may be a numeric value, a variable as above or a mathematical expression.
Accepted error: Computers store numbers to a finite accuracy e.g. 1/3 is stored as a finite number of recurring 3s in 0.33333… When used in calculations these minor infelicities can soon propagate and as such we strongly recommend that answers are matched within a suitable numerical range as shown here.
Significant figures: It is possible to request that responses be given to a specified number of significant figures.
Typically these are responses that have a variation on the correct numerical value e.g.
In all of these four cases specific feedback can be supplied using these settings.
If you are configuring your question to run in 'Interactive with multiple tries' mode such that students can answer questions one by one you can also provide hints and second (and third) tries by entering appropriate feedback into the boxes provided in the 'Multiple tries' section of the question definition form.
To use this section of the question form please ensure that on the iCMA definition form you have chosen 'How questions behave' = 'Interactive with multiple tries'.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
N.b. If you create a hint and then decide to delete it such that there is one less hint to the question you must ensure that all HTML formatting is also removed. To do this click on the 'HTML' icon in the editor toolbar which will display the HTML source. If any HTML formatting (in '< >') is present delete it and click on the HTML icon again to return to normal view.
Variable numeric sets questions are similar to Variable numeric questions but are limited to ‘sets’ of pre-specified values. As such Variable numeric sets questions are safer as all possible variations can be tested before the question is released to students.
Variable numeric sets questions use numeric values and calculations that are performed at runtime following rules laid down in the question.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
Variables that are calculated by the computer are included in text fields by placing the names of the variables within double square brackets e.g. [[a]]. The formatting of the numbers may be controlled using sprintf() style controls (http://php.net/manual/en/function.sprintf.php). For example [[a,.3e]] will display the value of a in scientific notation with 3 decimal places. The most usual formatting choices are ‘d’ for integers, ‘f’ for floating point or ‘e’ for scientific notation.
You may also include formatted mathematics using TeX or MathML. The text surrounded by two $ symbols in this example will be interpreted by the TeX filter and converted to nicely formatted mathematics. Please see the Including TeX and MathML section below.
The response box may be placed within the question rubric by including a sequence of underscores e.g. _____. At runtime the response box will replace the underscores and the length will be determined by the number of underscores. The minimum number of underscores required to trigger this action is 5. If no underscores are present the response box will be placed after the question and will be a full length box.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. We do not recommend that authors rely on using the machine generated 'Right answer' (from the iCMA definition form).
String to act as seed for randomisation: Sometimes it is desirable to base a sequence of questions on the same set of numbers. To do this use the same sets of numbers and the same string e.g. ‘mystring’ as the ‘seed for randomisation’ in each question in the sequence.
Please note that this then selects the same set of numbers across a sequence of questions for a student but different students get different sets of numbers.
As shown on this page variable names should use lower case letters.
The formula for all calculations should start with a variable name followed by an equals (=) sign and may use common mathematical operators +, -, *, /, brackets, (, ) and other variables defined within the question.
The mathematical processing that is available is similar to that provided within the Moodle Gradebook and includes
Recalculate now: With all variables specified the ‘recalculate now’ button will calculate a set of values within the greyed out boxes.
Require scientific notation: It is possible to require the student to enter their response using scientific notation. The input box is provided with a superscript facility and all of the following formats are accepted: 1.234e5, 1.234x10⁵, 1.234X10⁵ and 1.234*10⁵.
For keyboard users the superscript may be accessed with the up-arrow key. Down-arrow returns entry to normal. You may wish to include this information within your question. If you do please note that the up-arrow and down-arrow provided in the HTML editor’s ‘insert custom characters’ list are not spoken by a screen reader and you should also include the words ‘up-arrow’ and ‘down-arrow’.
Answer: This may be a numeric value, a variable as above or a mathematical expression. For the example shown here it would have been quite acceptable to place ‘a/(-b)’ in the Answer box.
Accepted error: Computers store numbers to a finite accuracy e.g. 1/3 is stored as a finite number of recurring 3s in 0.33333… When used in calculations these minor infelicities can soon propagate and as such we strongly recommend that answers are matched within a suitable numerical range as shown here.
Significant figures: It is possible to request that responses be given to a specified number of significant figures.
Typically these are responses that have a variation on the correct numerical value e.g.
In all of these four cases specific feedback can be supplied using these settings.
If you are configuring your question to run in 'Interactive with multiple tries' mode such that students can answer questions one by one you can also provide hints and second (and third) tries by entering appropriate feedback into the boxes provided in the 'Multiple tries' section of the question definition form.
To use this section of the question form please ensure that on the iCMA definition form you have chosen 'How questions behave' = 'Interactive with multiple tries'.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
N.b. If you create a hint and then decide to delete it such that there is one less hint to the question you must ensure that all HTML formatting is also removed. To do this click on the 'HTML' icon in the editor toolbar which will display the HTML source. If any HTML formatting (in '< >') is present delete it and click on the HTML icon again to return to normal view.
This is an extension of Variable numeric sets by the addition of a Pattern match response matching field to match the units.
The description below deals only with the differences from Variable numeric sets questions.
The grade for the numeric value and the unit can have different weights.
The question type splits the numeric value from the units before matching the units using the Pattern match algorithm.
The full capabilities of the Pattern match question type are described in section 2.4.1.
The Combined question enables multiple numeric responses to be matched.
The full capabilities of the Combined question are described in section 2.4.1.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The response fields have the form
After adding new input fields, or to remove empty unwanted input fields, click the ‘Verify the question text and update the form’ button. At this point your question text will be validated.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. There is no system generated 'Right answer' (from the iCMA definition form) for Combined questions.
The Combined question uses a restricted form of the Variable numeric response matching.
Weighting: Different responses can have different percentages of the total mark. The weightings must add up to 100%.
Answer: an integer or decimal number. Use 'E' format for very large or very small values.
Accepted error: Only rely on exact values for integers; for all other values specify a suitable range.
Scientific notation: It is possible to enable the student to enter their response using scientific notation. The input box is provided with a superscript facility and all of the following formats are accepted: 1.234e5, 1.234x10⁵, 1.234X10⁵ and 1.234*10⁵.
Feedback for any incorrect response: Where some fields are answered correctly and others incorrectly the intention is that the feedback associated with the field is used to say what is wrong. Correct answers can be counted up using the option to 'Show the number of correct responses' in the Combined feedback and Hints fields.
If you are configuring your question to run in 'Interactive with multiple tries' mode such that students can answer questions one by one you can also provide hints and second (and third) tries by entering appropriate feedback into the boxes provided in the 'Multiple tries' section of the question definition form.
To use this section of the question form please ensure that on the iCMA definition form you have chosen 'How questions behave' = 'Interactive with multiple tries'.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Four question types will accept typed textual responses.
Pattern match has a range of features:
This is the original Moodle text matching question type but it is very limited.
One or more Pattern match response fields can be combined with any of numeric, OU multiple response and Select missing words.
The Essay question has to be marked by hand.
The Pattern match question type is used to test if a short free-text student response matches a specified response pattern.
Pattern match is a more sophisticated alternative to the Short answer question type and offers:
For certain types of response it has been shown to provide an accuracy of marking that is on a par with, or better than, that provided by a cohort of human markers.
Pattern match works on the basis that you have a student response which you wish to match against any number of response matching patterns. Each pattern is compared in turn until a match is found and feedback and marks are assigned.
The key to using Pattern match is in asking questions that you have a reasonable hope of marking accurately. Hence writing the question stem is the most important part of writing these questions.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The response box may be placed within the question rubric by including a sequence of underscores e.g. _____. At runtime the response box will replace the underscores. The minimum number of underscores required to trigger this action is 5. The size of the input box may also be specified by __XxY__ e.g. __20x1__ will produce a box 20 columns wide and 1 row high. If no underscores are present the response box will be placed after the question.
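For example, a hypothetical question text requesting a 20-column, single-row response box:

    The SI unit of electric current is the __20x1__.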
Default mark: Decide on how to score your questions and be consistent.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. For the Pattern match question type you cannot rely on using the machine generated 'Right answer' (from the iCMA definition form).
Case sensitivity: No or yes.
Allow use of subscript/superscript: No or Yes. Any subscripts or superscripts entered by the student are contained in their response between the standard tags <sub> and </sub> or <sup> and </sup>. For example:
At run time keyboard users may move between normal, subscript and superscript by using the up-arrow and down-arrow keys. You may wish to include this information within your question. If you do please note that the up-arrow and down-arrow provided in the HTML editor’s ‘insert custom characters’ list are not spoken by a screen reader and you should also include the words ‘up-arrow’ and ‘down-arrow’.
If answer is more than 20 words: We strongly recommend that you limit responses to 20 words. Allowing unconstrained responses often results in responses that are both right and wrong – which are difficult to mark consistently one way or the other.
Check spelling of student: How many ways do you know of to spell ‘temperature’? We’ve seen 14! You will improve the marking accuracy by insisting on words that are in the Moodle system dictionary.
Add these words to dictionary: When dealing with specialised scientific, technical and medical terms that are not in a standard dictionary it is most likely that you will have to add them by using this field. Enter your words leaving a space between them.
Convert the following characters to space: In Pattern match words are defined as sequences of characters between spaces. The exclamation mark and question mark are also taken to mark the end of a word. The period is a special case; as a full stop it is also a word delimiter but as the decimal point it is not. All other punctuation is considered to be part of the response but this option lets you remove it.
Words and synonyms: All words and synonyms are specified as they are to be applied by the response matching. They do not have to be full words but can be stems with a wildcard.
Synonyms may only be single words, i.e. you cannot specify alternative phrases in synonym lists.
From the example above any occurrence of the word oil in the response match will be replaced by oil|glycer*|paraf* before the match is carried out.
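Continuing this example, the relevant form fields might be completed as follows (illustrative values only):

    Word: oil        Synonyms: glycer* paraf*
    Answer: match_w(oil)

With the synonym in place, responses containing 'oil', 'glycerol' or 'paraffin' would all be matched.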
Answer: This example shows that Pattern match will support complex response matching.
Take the first answer field: match_o(ampere candela kelvin mole) is an exact match for the four words, with the additional feature that the matching option 'o' allows the words to be given in any order.
The answer field match_ow(ampere candela kelvin mole) requires the same four words, again in any order, but also allows other words.
The third answer field match_mow(ampere candela kelvin mole) allows for misspellings which are still in the dictionary e.g. mule instead of mole.
Please see the section on Pattern match syntax for a full description.
Grade: Between ‘none’ and 100%. At least one response must have a mark of 100%.
Feedback: Specific feedback that is provided to anyone whose response is matched by the response matching rule in Answer.
The feedback for all non-matched responses should go into the ‘Any other answer’ field.
The basic unit of the student response that Pattern match operates on is the word, where a word is defined as a sequence of characters between spaces. The full stop (but not the decimal point), exclamation mark and question mark are also treated as ending a word.
Numbers are special instances of words and are matched by value and not by the form in which they are given. match_w(25 ms-1) will match the following correct responses: 25 ms-1, 2.5e1 ms-1, 2.5x10¹ ms-1.
With the exception of numbers and the word terminators (
The response is treated as a whole with the exception that words that are required to be in proximity must also be in the same sentence.
The match syntax can be considered in three parts.
Matching option | Symbol | Description |
allowExtraChars | c | Extra characters can be anywhere within the word. Authors are expected to omit allowExtraChars when using the misspelling options below. |
allowAnyWordOrder | o | Where multiple words are to be matched they can be in any order. |
allowExtraWords | w | Extra words beyond those being searched for are accepted. |
misspelling: allowReplaceChar | mr | Will match a word where one character is different to those specified in the pattern. The pattern word must be 4 characters or greater, excluding wildcards, for replacement to kick in. Authors are expected to omit allowExtraChars when using this option. |
misspelling: allowTransposeTwoChars | mt | Will match a word where two characters are transposed. The pattern word must be 4 characters or greater, excluding wildcards, for transposition to kick in. Authors are expected to omit allowExtraChars when using this option. |
misspelling: allowExtraChar | mx | Will match a word where one character is extra to those specified in the pattern. The pattern word must be 3 characters or greater, excluding wildcards, for extra to kick in. Authors are expected to omit allowExtraChars when using this option. |
misspelling: allowFewerChar | mf | Will match a word where one character is missing from those specified in the pattern. The pattern word must be 4 characters or greater, excluding wildcards, for fewer to kick in. Without this 'no' would be reduced to just matching 'n' or 'o'. Authors are expected to omit allowExtraChars when using this option. |
misspelling | m | This combines the four ways of misspelling a word described above i.e. m is equivalent to mxfrt. Authors are expected to omit allowExtraChars when using this option. |
misspellings | m2 | Allows two misspellings, as defined by option 'm', in pattern words of 8 characters or more, excluding wildcards. Authors are expected to omit allowExtraChars when using this option. N.b. use this option sparingly. It introduces a huge number of possible acceptable spellings, all of which have to be checked, and the time taken for the match to complete increases accordingly. |
allowProximityOf0 | p0 | No words, or full stops, are allowed in between any words specified in the proximity sequence. |
allowProximityOf1 | p1 | One word is allowed in between any two words specified in the proximity sequence. The words must not span sentences. |
allowProximityOf2 | p2 | (Default value) Two words are allowed in between any two words specified in the proximity sequence. The words must not span sentences. |
allowProximityOf3 | p3 | Three words are allowed in between any two words specified in the proximity sequence. The words must not span sentences. |
allowProximityOf4 | p4 | Four words are allowed in between any two words specified in the proximity sequence. The words must not span sentences. |
The matching options are appended to the word match with an intervening underscore and may be combined. A typical match combines the options ‘mow’ to allow for misspellings, any word order and extra words, and is written match_mow(words to be matched).
Within a word 'special characters' provide more localised control of the patterns.
Special character | Symbol | Description |
Word AND | space | 'space' delimits words and acts as the logical AND. |
Word OR | | | A '|' between words indicates that either word will be matched; '|' delimits words and acts as the logical OR. |
Proximity control | _ | Words must be in the order given and with no more than n (where n is 0 - 4) intervening words. All words under the proximity control must be in the same sentence. _ delimits words and also acts as logical 'AND'. Other words included in the match that are not under proximity control must be outside the words under the proximity control in the response for them to be matched. For example: 'match(abcd_ffff ccc)' does match the response 'abcd ffff ccc' but does not match the response 'abcd ccc ffff' |
Word groups | [ ] | [ ] enables multiple words to be accepted as an alternative to other single words in OR lists. [ ] may not be nested. Single words may be OR'd inside [ ]. Where a word group is preceded or followed by the proximity control the word group is governed by the proximity control rule that the words must be in the order given. |
Single character wildcard | ? | Matches any single character. |
Multiple character wildcard | * | Matches any sequence of characters including none. |
It is possible to match some of the special characters by ‘escaping’ them with the ‘\’ character. So match(\|) will match ‘|’. Ditto for _, [, ], and *. And if you wish to match round brackets then match(\(\)) will match exactly ‘()’.
match_all() | All matches contained within the brackets must be true. | e.g. match_all( … ) |
match_any() | Just one of the matches contained within the brackets must be true. | e.g. match_any( … ) |
not() | The match within the brackets must be false. | e.g. not( … ) |
match_all(), match_any() and not() may all be nested.
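A hedged sketch of what a nested expression might look like, assuming the sub-expressions are simply listed, one per line, inside the brackets:

    match_all (
        match_any (
            match_w(tom)
            match_w(thomas)
        )
        not (
            match_w(dick)
        )
    )

This would match any response that mentions tom or thomas but does not mention dick.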
Student response | Matching options | Pattern match | match method return |
tom dick harry | empty | tom dick harry | True. This is the exact match. |
thomas | c | tom | True. Extra characters are allowed anywhere within the word. |
tom, dick and harry | w | dick | True. Extra words are allowed anywhere within the sentence. |
harry dick tom | o | tom dick harry | True. Any order of words is allowed. |
rick | m | dick | True. One character in the word can differ. |
rick and harry and tom | mow | tom dick harry | True. |
dick and harry and thomas | cow | tom dick harry | True. |
arthur, harry and sid | mow | tom|dick|harry | True. Any of tom or dick or harry will be matched. |
tom, harry and sid | mow | tom|dick harry|sid | True. The pattern requires either tom or dick AND harry or sid. Note that 'tom,' is only allowed because m allows the extra character, the comma, in 'tom,'. |
tom was mesmerised by maud | mow | [tom maud]|[sid jane] | True. The pattern requires either (tom and maud) or (sid and jane). |
rick | empty | ?ick | True. The first character can be anything. |
harold | empty | har* | True. Any sequence of characters can follow 'har'. |
tom married maud, sid married jane. | mow | tom_maud | True. Only one word is between tom and maud. |
maud married tom, sid married jane. | mow | tom_maud | False. The proximity control also specifies word order and over-rides the allowAnyWordOrder matching option. |
tom married maud, sid married jane. | mow | tom_jane | False. Only two words are allowed between tom and jane. |
tom married maud | mow | tom|thomas marr* maud | True. |
maud marries thomas | mow | tom|thomas marr* maud | True. |
tom is to marry maud | mow | tom|thomas marr* maud | True. |
tempratur | m2ow | temperature | True. Two characters are missing. |
temporatur | m2ow | temperature | True. Two characters are incorrect; one has been replaced and one is missing. |
How can you possibly guess the multiplicity of phrases that your varied student cohort will use to answer a question? Of course you can’t, but you can record everything and over time you will build a bank of student responses on which you can base your response matching. And gradually you might be surprised at how well your response matching copes.
Before we describe the response matching it’s worth stressing:
The underlying structure of the response matching described here was developed in the Computer Based Learning Unit of Leeds University in the 1970s and was incorporated into Leeds Author Language. The basic unit of the word, the matching options of allowAnyChars, allowAnyWords, allowAnyOrder and the word OR feature all date back to Leeds Author Language.
In 1976 the CALCHEM project which was hosted by the Computer Based Learning Unit, the Chemistry Department at Leeds University and the Computer Centre of Sheffield Polytechnic (now Sheffield Hallam University) produced a portable version of Leeds Author Language.
A portable version for microcomputers was developed in 1982 by the Open University, the Midland Bank (as it then was; now Midland is part of HSBC) and Imperial College. The single and multiple character wildcards were added at this time.
The misspelling, proximity and Word groups in 'or' lists additions were added as part of the Open University COLMSCT projects looking at free text response matching during 2006 - 2009.
Philip G. Butcher and Sally E. Jordan, A comparison of human and computer marking of short free-text student responses, Computers & Education 55 (2010) 489-499
The response matching in the Short answer question type is very limited. Consider using the Pattern match question type instead.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The response box may be placed within the question rubric by including a sequence of underscores e.g. _____. At runtime the response box will replace the underscores and the length will be determined by the number of underscores. The minimum number of underscores required to trigger this action is 5. If no underscores are present the response box will be placed after the question and will be a full length box.
Default mark: Decide on how to score your questions and be consistent.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. We do not recommend that authors rely on using the machine generated 'Right answer' (from the iCMA definition form).
Case sensitivity: The author can specify whether or not the case of the response is important.
Answer: The Short answer question type will search for a sequence of consecutive characters within a response. The wildcard character '*' allows other words and characters, including space, to be present.
Beyond this wildcard there are no other response handling facilities. The Pattern match question type offers more sophisticated features. For example if you wished to look for ‘Tom’ and ‘Dick’ and ‘Harry’ in any order then Pattern match allows you to do this easily but Short answer requires you to specify all possible sequences.
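For instance, to accept 'tom', 'dick' and 'harry' in any order, a Short answer question would need an answer pattern for every ordering (an illustrative sketch; Pattern match needs only match_ow(tom dick harry)):

    *tom*dick*harry*
    *tom*harry*dick*
    *dick*tom*harry*
    *dick*harry*tom*
    *harry*tom*dick*
    *harry*dick*tom*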
The wildcard '*' on its own in the 'answer' field will match 'any other answer'.
Feedback: Use this to give feedback to the responses that are matched by the current response match.
Do remember that Moodle keeps all the responses that students enter and as such it is straightforward to gauge the accuracy of your response matching. If you are writing this type of question you should allow time to modify your response matching in the light of actual student responses.
The Combined question enables multiple text responses to be matched.
The full capabilities of the Combined question are described in section 2.4.1.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The response fields have the form
After adding new input fields, or to remove empty unwanted input fields, click the ‘Verify the question text and update the form’ button. At this point your question text will be validated.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. There is no system generated 'Right answer' (from the iCMA definition form) for Combined questions.
The Combined question uses a restricted form of the Pattern match response matching.
Weighting. Different responses can have different percentages of the total mark. The weightings must add to 100%.
Allow use of subscript/superscript: for students to include powers or chemical formulae.
Case sensitivity: Yes or no.
Check spelling of student: Yes or no.
Answer: Use Pattern match syntax.
Feedback for any incorrect response: Where some fields are answered correctly and others incorrectly the intention is that the feedback associated with the field is used to say what is wrong. Correct answers can be counted up using the option to 'Show the number of correct responses' in the Combined feedback and Hints fields.
If you are configuring your question to run in 'Interactive with multiple tries' mode such that students can answer questions one by one you can also provide hints and second (and third) tries by entering appropriate feedback into the boxes provided in the 'Multiple tries' section of the question definition form.
To use this section of the question form please ensure that on the iCMA definition form you have chosen 'How questions behave' = 'Interactive with multiple tries'.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Unlike other questions in an iCMA, Essay questions are not marked automatically. If an Essay question is to be marked this has to be undertaken by an appointed marker who is given appropriate permission by Exams and Assessments; the Tutor role in OU Moodle does not have permission to mark questions.
We recommend that essay questions are shown one per page.
Question text: You may use the full functionality of the editor to state the question.
General feedback: Perhaps use this for a model answer.
Response format: With or without the normal on-screen editor.
Input box size
Allow attachments: Whether the student may attach files to their response and, if so, how many.
Response template: Any text entered here will be displayed in the response input box when a new attempt at the question starts.
Information for graders: The mark scheme can be placed here.
Essay questions must be marked manually and as such cannot exhibit 'interactive with multiple tries' behaviour. If an essay question is included in an iCMA that is running with 'interactive with multiple tries' behaviour the essay question will be shown without a 'Check' button and will function with 'deferred feedback' behaviour.
Essay questions are useful for collecting student feedback during an iCMA and when used for this purpose module teams do not wish the essay to be marked or to interfere with the marking process. Please indicate this to the system by giving such questions a 'Default mark' of 0. Essay questions with a 'Default mark' of 0 are set to 'Complete' when the iCMA is submitted, c.f. essay questions with a non-zero 'Default mark' which require human marking before the full results for the iCMA appear.
Questions in this category provide the correct answers together with distractors and ask the student to choose or place their choices.
Drag and drop questions for completing texts are a popular form of 'selection' question consisting of words which can be dragged from a list and dropped into pre-defined gaps in the text.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The Question text is written with two sets of square brackets '[[n]]' indicating the positioning of gaps and a number 'n' inside the brackets indicating the correct choice from a list which follows the question.
Drop zones (gaps) may be placed in lists and tables.
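For example, a hypothetical question text and its choice list:

    Question text: A water molecule consists of two atoms of [[1]] and one atom of [[2]].
    Choice 1: hydrogen
    Choice 2: oxygen
    Choice 3: nitrogen   (a distractor)

At runtime the choices are offered in a list and the student drags them into the gaps.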
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. We do not recommend that authors rely on using the machine generated 'Right answer' (from the iCMA definition form).
Limited formatting of the text using a restricted set of HTML tags is allowed.
Choices that are within the same Group are colour coded and may only be dropped in a gap with the corresponding colour.
Choices that are marked as 'infinite' may be used in multiple locations.
It is not possible to have drag boxes containing multiple lines. If you want to drag long sentences - don't. Give each a label and drag the label.
The resulting question when run in 'interactive with multiple tries' style looks as follows:
Drag and drop questions are keyboard accessible. Use the
All gaps are weighted identically; in the above example each gap is worth 25% of the marks. Only gaps that are filled correctly gain marks. There is no negative marking of gaps that are filled incorrectly.
The display of Combined feedback is governed by the Specific feedback setting on the iCMA definition form.
In interactive with multiple tries mode Combined feedback is shown after every try as well as when the question completes.
The option ‘Show the number of correct responses’ is over-ridden in interactive with multiple tries mode by the same settings in the ‘Settings for multiple tries’ section of the editing form.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
If the question is used in 'interactive with multiple tries' behaviour the marking is modified as follows:
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Clear incorrect responses: When ‘Try again’ is clicked incorrect choices are cleared.
Show the number of correct responses: Include in the feedback a statement of how many choices are correct.
Drag and drop questions for labelling images are a popular form of 'selection' question consisting of images or words which can be dragged from a list and dropped into pre-defined gaps on the base image.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
General feedback: We recommend the correctly labelled image be included in this box. Students who did not answer completely correctly can then compare and contrast to see where they made an error. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect.
Start by uploading a background image. There is a maximum size of 600 x 400 pixels for this image. Please note that we recommend a maximum image width of 570 pixels so that all parts of an iCMA fit into a 1024-pixel-wide window. When you have done this it will appear in the preview area.
Next create the drag items. These may be text items as shown here, or images. There is a maximum import size for images of 150 x 100 pixels and images larger than this will be scaled on import. If you choose to use draggable images please ensure that you add a short text descriptor. For each item, having entered the text, please press Enter to complete the addition of the draggable item and register it in the drop-down lists of Drop zones.
Now place the drag items onto the background image by completing the Drop zones section above. Once you have established the draggable item for a drop zone the item will appear beneath the background image. At this point you may position the item either by dragging it or by completing the Left and Top boxes for the item.
The drag item will be placed at the top and left coordinates as measured in pixels from the top left of the background image.
Within draggable text items limited formatting of the text using a restricted set of HTML tags is allowed.
Choices that are within the same Group are colour coded and may only be dropped on a drop zone with the corresponding colour. Choices that are marked as 'infinite' may be used in multiple locations.
The resulting question when run in 'interactive with multiple tries' style looks as follows:
Drag and drop questions are keyboard accessible. Use the
All gaps are weighted identically and in the above example each drop zone is worth 12.5% of the marks. Only drop zones that are filled correctly gain marks. There is no negative marking of drop zones that are filled incorrectly.
Whether or not Combined feedback is shown to students is governed by the Specific feedback setting on the iCMA definition form.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
If the question is used in 'interactive with multiple tries' style the marking is modified as follows:
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Clear incorrect responses: When ‘Try again’ is clicked incorrect choices are cleared.
Show the number of correct responses: Include in the feedback a statement of how many choices are correct.
Matching questions are created by the author supplying several questions and their correct answers. The computer lays out the questions and shuffles the answers in a drop-down list. Matching questions raise the same issues as multiple choice in that all answers are supplied and students are 'matching' answers rather than providing them. If the same number of matching options is provided as there are questions then students do not have to know all the correct answers to obtain full marks; if there is one answer they do not know, it is clearly the one left over after all the other questions have been answered. However it is possible to add extra answers as distractors to make the question more challenging.
Please note that once a Matching question has been used by students you should not edit it. This is because of the way that the components of the question are held in the database.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
General feedback: We recommend the correct answer be included in this box. Students who did not answer completely correctly can then compare and contrast to see where they made an error. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect.
The answer fields in Matching questions can only contain plain text.
Whether or not Combined feedback is shown to students is governed by the Specific feedback setting on the iCMA definition form.
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Clear incorrect responses: When ‘Try again’ is clicked incorrect choices are cleared.
Show the number of correct responses: Include in the feedback a statement of how many choices are correct.
The Multiple Choice question type is the most common form of question used in the OU's paper-based CMA system which was used by large numbers of modules in the 1970s, 1980s and 1990s and remains in use today.
Typically the student is provided with a question and a range of possible answers and asked to choose one. The skill in writing multiple choice questions is in providing questions that expose misunderstanding of the materials under test such that suitable distractors can be written. The response to the distractors can then be used to provide remedial feedback to help students overcome their misunderstanding.
A problem with all forms of multiple choice questions is that students can guess the answer from the range of options provided. To discourage this various penalty mechanisms have been suggested to try and prevent students from simply guessing. In its current form Moodle allows the author to attach negative marks to options though most users at the OU do not use this facility and score the correct choice at 100% and incorrect choices at 0%.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
General feedback: We recommend the correct answer be included in this box. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect.
One or multiple answers? It is possible to use the Multiple choice question to create a Multiple response question that requires the student to choose multiple options. However marking these questions can become complex. If for example a correct response requires that options (a), (b) and (c) be chosen then each could be allocated 33.3% of the marks, but what marks are allocated to options (d), (e) and (f)? Should it be zero or a negative value; if the latter at what level, -33.3%, -100%? This can become difficult and because of this we strongly recommend that authors wishing to create Multiple Response questions use the OU Multiple Response question type which incorporates the mark schemes used in the original paper-based CMA system.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
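As an illustration of the arithmetic (a sketch, assuming the penalty of 0.3333333 used in the example above):
fraction of the available mark scored at try t = 1 − penalty × (t − 1)
try 1: 1 − 0.3333333 × 0 = 1.0000000 of the available marks
try 2: 1 − 0.3333333 × 1 = 0.6666667 of the available marks
try 3: 1 − 0.3333333 × 2 = 0.3333334 of the available marks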
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Clear incorrect responses: When ‘Try again’ is clicked incorrect choices are cleared.
Show the number of correct responses: Include in the feedback a statement of how many choices are correct. Clearly this should not be used for questions where only a single choice is allowed.
Along with the Multiple Choice question type, the Multiple Response question type is the most common form of question used in the OU's paper-based CMA system. The Multiple Response question type requires the student to choose multiple options.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
General feedback: We recommend the correct answer be included in this box. Students who did not answer completely correctly can then compare and contrast to see where they made an error. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect.
The OU Multiple Response question type simplifies the authoring process by incorporating the marking scheme developed for the OU paper-based CMA system. In this marking scheme:
Example: A question has five choices of which only 'a' and 'b' are correct. How student responses are marked:
Whether or not Combined feedback is shown to students is governed by the Specific feedback setting on the iCMA definition form.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
If the question is used in 'interactive with multiple tries' mode the marking is modified as follows:
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Clear incorrect responses: When ‘Try again’ is clicked incorrect choices are cleared.
Show the number of correct responses: Include in the feedback a statement of how many choices are correct.
Show the feedback for the selected responses: This setting controls the feedback for individual choices; however, if too many choices are selected this feedback is suppressed.
The Select missing words question type is very similar to the Drag and drop words into text question type. It is useful for questions where the on-screen space is insufficient to provide space for both the question and the drag items – if the drag items are off-screen it’s rather hard to drag them anywhere.
This question type will therefore most likely be used when the question author has a large text that they wish the student to label or complete by selecting missing words from drop-down lists. Some authors may, of course, simply prefer it to the drag and drop alternative.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The Question text is written with two sets of square brackets '[[n]]' indicating the positioning of gaps and a number 'n' inside the brackets indicating the correct choice from a list which follows the question.
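For example (a hypothetical question text; the numbers refer to the list of choices entered below the question):
The quick brown [[2]] jumps over the lazy [[4]].
Choice 1: cow
Choice 2: fox
Choice 3: cat
Choice 4: dog
Here gap [[2]] is correct when 'fox' is selected and gap [[4]] is correct when 'dog' is selected; choices placed in the same Group (see below) appear together in each drop-down list.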
General feedback: We recommend the correct answer be included in this box. Students who did not answer completely correctly can then compare and contrast to see where they made an error. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect.
Answer: Unlike the Drag and drop words into text question type the Select missing words question allows no formatting of the words to be selected.
Group: Choices that are within the same Group appear in the same drop-down lists.
The resulting question when run in 'interactive with multiple tries' style looks as follows:
Missing words questions are keyboard accessible: the gaps can be selected and completed using the keyboard alone.
All gaps are weighted identically. Only gaps that are filled correctly gain marks. There is no negative marking of gaps that are filled incorrectly.
Whether or not Combined feedback is shown to students is governed by the Specific feedback setting on the iCMA definition form.
Penalty for each incorrect try: The available mark is reduced by the penalty for second and subsequent tries. In the example above a correct answer at the second try will score 0.6666667 of the available marks and a correct answer at the third try will score 0.3333334 of the available marks.
If the question is used in 'interactive with multiple tries' style the marking is modified as follows:
Hint: You can complete as many of these boxes as you wish. If you wish to give the student three tries at a question you will need to provide two hints. At runtime when the hints are exhausted the question will finish and the student will be given the general feedback and the question score will be calculated.
Clear incorrect responses: When ‘Try again’ is clicked incorrect choices are cleared.
Show the number of correct responses: Include in the feedback a statement of how many choices are correct.
The True/false question is the simplest question type. There is one right answer, worth 100% and one wrong answer worth 'none'.
The Combined question enables multiple OU multiple response and Select missing words responses to be matched.
The full capabilities of the Combined question are described in section 2.4.1.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The response fields take the form '[[n:type:…]]', where 'n' numbers the field and 'type' identifies the kind of response, for example [[4:selectmenu:2]]. In this restricted form 'type' is either multiresponse or selectmenu.
Multiresponse fields can be displayed vertically or horizontally.
Selectmenu fields must be followed by the number of the correct choice.
Note that it is possible to have multiple choices that use the same menu choices e.g. The quick brown [[4:selectmenu:2]] jumps over the lazy [[4:selectmenu:4]].
Where the choices are, cow, fox, cat, dog.
After adding new input fields, or to remove empty unwanted input fields, click the ‘Verify the question text and update the form’ button. At this point your question text will be validated.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. There is no system generated 'Right answer' (from the iCMA definition form) for Combined questions.
These are restricted forms of the response matching offered by the Selectmenu and OU multiple response question types.
Where some fields are answered correctly and others incorrectly, the intention is that the feedback associated with each field is used to say what is wrong. Correct answers can be counted up using the option to 'Show the number of correct responses' in the Combined feedback and Hints fields.
There are two question types that accept responses to multiple inputs.
The Combined question enables multiple Numeric, Pattern match, OU multiple response and Select missing words responses to be matched.
The STACK question supports complex matching on multiple response fields.
The STACK question type is described in section 2.6.
The Combined question incorporates features of four existing question types into one composite question. The four question types that are included are:
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
The response fields take the form '[[n:type:…]]', where 'n' numbers the field and 'type' is one of numeric, pmatch, multiresponse or selectmenu, for example [[4:selectmenu:2]].
Numeric and pmatch have the option to limit the size of the input box.
If no ____ is present the input box follows the question stem and is a ‘full single line’.
Multiresponse fields can be displayed vertically or horizontally.
Selectmenu fields must be followed by the number of the correct choice.
Note that it is possible to have multiple choices that use the same menu choices e.g. The quick brown [[4:selectmenu:2]] jumps over the lazy [[4:selectmenu:4]].
Where the choices are, cow, fox, cat, dog.
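Putting these together, a complete question text might look like the following (a sketch only: the field numbers and wording are invented for illustration, and the widths of the numeric and pmatch boxes can be limited as described above):
Hydrogen has [[1:numeric]] proton(s). The quick brown [[2:selectmenu:2]] jumps over the lazy [[2:selectmenu:4]]. Select every mammal in the list: [[3:multiresponse]]
where the selectmenu choices are cow, fox, cat, dog.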
After adding new input fields, or to remove empty unwanted input fields, click the ‘Verify the question text and update the form’ button. At this point your question text will be validated.
General feedback: We recommend that all questions should have this box completed with the correct answer and a fully worked explanation. The contents of this box will be shown to all students irrespective of whether their response was correct or incorrect. There is no system generated 'Right answer' (from the iCMA definition form) for Combined questions.
These are restricted forms of the response matching offered by the Variable numeric, Pattern match, OU multiple response and Selectmenu question types.
Weighting: Different responses can have different percentages of the total mark. The weightings must add to 100%.
Where some fields are answered correctly and others incorrectly, the intention is that the feedback associated with each field is used to say what is wrong. Correct answers can be counted up using the option to 'Show the number of correct responses' in the Combined feedback and Hints fields.
The resulting question when run in interactive with multiple tries behaviour looks as follows:
Drag and drop marker questions allow for multiple markers to be placed onto an image and the positioning of each marker to be assessed.
Drag and drop marker questions differ from Drag and drop onto image questions in that there are no predefined areas on the underlying image that are visible to the student.
Question name: A descriptive name is sensible. This name will not be shown to students.
Question text: You may use the full functionality of the editor to state the question.
Highlight drop zones which have not had the correct marker dropped on them: When the General feedback is shown, any drop zone that has not received its correct marker is highlighted.
There is a short video on this on YouTube if that's your thing: https://www.youtube.com/watch?v=Tvl4GuGvtxc
Start by uploading a background image. There is a maximum size of 600 x 400 pixels for this image. Please note that we recommend a maximum image width of 570 pixels so that all parts of an iCMA fit within a window 1024 pixels wide. When you have done this the image will appear in the preview area.
Next, create markers by assigning a text label to each, and for each label set the allowed drop zone on the background image. Zones may have one of three basic shapes:
This online image map tool provides an easy interface for plotting the points of complex polygons though you will have to amend the syntax of the coordinates produced for inclusion in the question.
All x dimensions are in pixels from the left of the image and all y dimensions are in pixels from the top of the image.
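By way of illustration, zone coordinates typically take forms such as the following (a sketch only; please check the exact syntax expected by the editing form and its on-screen help):
Circle: 80,100;25 (centre x, centre y; radius)
Rectangle: 20,30;150,80 (top-left x, top-left y; width, height)
Polygon: 10,10;60,10;60,60;10,60 (a semicolon-separated list of x,y vertices)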
Within marker text, limited formatting using a small set of inline HTML tags is allowed.
Markers that are marked as 'infinite' may be associated with multiple drop zones.
The centre of each marker is defined as the centre of the small circle to the top left of each label.
It is possible to provide more markers than are needed to complete the question thereby providing an extra level of distraction.
Marker questions are keyboard accessible: markers can be selected and placed using the keyboard alone.
All drop zones are weighted identically; in the above example each drop zone is worth 20% of the marks. Only drop zones that are correctly labelled gain marks. There is no negative marking of drop zones that are incorrectly labelled.
If more markers are placed on the image than there are drop zones, a penalty is applied for the extra markers: if there are n drop zones and n+m markers are placed, of which p are placed correctly, the score is calculated as p/(m+n).
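For example (a worked illustration of the rule above):
n = 5 drop zones, 7 markers placed, so m = 2 extra markers
p = 4 markers placed correctly
score = p/(m+n) = 4/(2+5) = 4/7 ≈ 0.57 of the available marks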
If the question is used in 'interactive with multiple tries' style the marking is modified as follows:
Show the number of correct responses: Include in the feedback a statement of how many choices are correct.
State which markers are incorrectly placed: Include in the feedback a list of markers which are incorrectly placed.
Move incorrectly placed markers back to default start position below image: When ‘Try again’ is clicked incorrectly positioned labels are removed.
The resulting question when run in interactive with multiple tries behaviour looks as follows:
Pattern match with JME combines the Pattern match algorithm with Peter Ertl's JavaScript Molecular Editor (http://peter-ertl.com/jsme/) to provide questions that allow students to draw their answer.
The JavaScript Molecular Editor is able to convert 2D (really 3D) molecular structures into SMILES strings (http://www.daylight.com/dayhtml/doc/theory/theory.smiles.html) which may then be readily matched using Pattern match. A variety of specialised feedback options are provided.
The Question stem, Default mark and General feedback sections are as usual.
1,2-Difluoroethene has the formula CHF=CHF. Stereochemically there are two possibilities E and Z.
Z-difluoroethene and E-difluoroethene
With stereochemistry turned on the JME will return F/C=C\F for the Z isomer and F/C=C/F for the E isomer.
Please be aware that the ‘\’ character is the ‘escape’ character in the pattern match algorithm that enables ‘special characters’ to be matched. For example match(\|) will match ‘|’. Consequently in the response match for Z-difluoroethene the ‘\’ character is used to ‘escape’ itself and the response match must be written as match(F/C=C\\F).
Answer: The first answer must be the SMILES string for the correct answer as this is used as the basis for the Atom count feedback. In this case the correct answer for acetic acid is ‘CC(=O)O’. Please note that brackets are special characters that have to be ‘escaped’ as shown in the example.
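For illustration, the escaped response match for acetic acid might therefore be written as follows (a sketch of the escaping only; adjust it to the actual answer required by your question):
match(CC\(=O\)O)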
Grade: Between ‘none’ and 100%. At least one response must have a mark of 100%.
Feedback: Specific feedback that is provided to anyone whose response is matched by the response matching rule in Answer.
Atom count feedback: If this is ticked the system generates feedback based on a comparison of the student’s response and the correct answer given in Answer 1.
The feedback for all non-matched responses should go into the ‘Any other answer’ field.
Atom count feedback: The student’s response is compared against the correct answer given in ‘Answer 1’ and feedback is generated on the atoms that are in both the student’s response and in the correct answer.
The resulting question when run in 'interactive with multiple tries' style looks as follows:
The student draws their response using the tools provided by the JME.
Pattern match with JME questions are not accessible to totally blind students or to students who are unable to use the pointer.
STACK (System for Teaching and Assessment using a Computer algebra Kernel) was developed by Christopher Sangwin (formerly of Birmingham University, now at Loughborough University) and has been integrated as a Moodle question type by Tim Hunt at the OU. It uses the features of the Maxima computer algebra system to support the setting of questions with a mathematical basis and the analysis of responses provided in the form of numbers and functions.
A large set of example STACK questions is available on the main STACK site at http://stack.bham.ac.uk.
When writing mathematics in a STACK question please note the following:
Following this convention enables us to share STACK questions with non-OU sites.
To learn more please use the standard STACK documentation which can be found here http://stack.bham.ac.uk.
STACK uses a specially configured Maxima environment and the STACK documentation details how to configure your desktop version of Maxima to work in this environment. You will need a copy of the configuration file maxima.mac.
OpenMark questions are made available to Moodle iCMAs through the Opaque question type.
OpenMark questions may be included in a Moodle iCMA that is operating with 'interactive with multiple tries' behaviour; 'An interactive tour of the Moodle Quiz' and iCMA02 on this site both demonstrate this. Questions for use with 'interactive with multiple tries' behaviour are typically written by extending the SimpleQuestion1() class.
OpenMark questions for inclusion in Deferred feedback iCMAs are typically written by extending the DeferredFeedbackQuestion1() class.
We have tested the Opaque interface with Firefox, IE7, IE8, IE9 and Chrome on Windows and Safari on a Macintosh. Unfortunately we have been unable to support the following OpenMark question features in IE7 when they are run through Opaque:
Question name: A descriptive name is sensible. This name will not be shown to students.
Question engine: The OpenMark server that is handling the question. Developments should be undertaken on the development server. When all testing is complete the question should be moved to the 'live servers' ready for use by students and this entry should be changed to match.
Question id: The name of the question on the OpenMark server e.g. mu120.module5.question05
Question version: OpenMark supports version numbers of questions usually starting at 1.0. Provide the version number here. Please note that should you alter the version number in 'mid-test', Moodle will start to use the new version for all students immediately. Never alter the logic of a question in 'mid-test'.
The number of tries that a student is allowed at a question and the marking of the question is carried out entirely within OpenMark. The question author must ensure that they are 'extending' a suitably configured OpenMark class when writing their questions, for example SimpleQuestion1. OpenMark will share appropriate information with Moodle.
OpenMark questions that are to be used in Deferred feedback iCMAs must extend the OpenMark class DeferredFeedbackQuestion().
The following OpenMark functions and XML components control the number of tries* and the marking and supply this information to Moodle
For examples of the partial scoring please consult the SDK125 interactive media developers who have developed SimpleQuestion3 that allows two tries with partial scoring after the second try.
Please note that in Moodle a ‘try’ is a try at a question and in interactive mode ‘Try again’ leads to a second or third try. An ‘attempt’ is an attempt at the whole iCMA which finishes when the ‘Submit’ button is pressed.
Student data associated with OpenMark questions that are included in a Moodle iCMA are stored in the Moodle database. As is usual with OpenMark it is the responsibility of the interactive media developer to decide what information should be stored.
The OpenMark question developer may use setQuestionLine() to store information about the variant of the question in the Moodle database. Using setQuestionLine() in this way enables Moodle to calculate the usual statistics for each variant of a question as well as the overall statistics for all variants of the question. This information can then be extracted by the author using the Statistics report. A second call may then be used
to store the student's responseString in the Moodle database. This information can then be extracted by the author using the Responses report.
Not every item of results information recorded by OpenMark is used by Moodle.
Marks and responses stored in the Moodle database can be queried with the Grades and Responses features from the Results page for the iCMA. The Statistics feature also includes OpenMark questions from the December 2014 release.
Descriptions enable you to include blocks of information within an iCMA. They carry no score and are labelled in the navigation panel by the letter 'i'. (Think of the i for tourist information centres.)
Images may be included by clicking on the icon. This icon allows you to upload images from your hard disk where they will be stored with the question. However we recommend that images that appear within questions should not exceed 570 pixels wide. At this width the navigation panel, the status panel and the image will all fit in a screen that is 1024 pixels wide.
On occasion students are asked to read information off an image. If an image that is 570 pixels wide is not large enough to show the information in sufficient detail we recommend the following:
Students may then be instructed to open the enlarged image in a separate tab and inspect it closely before answering the question.
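One way to achieve this (a sketch only; the file names are hypothetical and you may prefer the editor's own image and link tools to raw HTML) is to display a 570 pixel wide version of the image in the question and link it to the full-size file:
<!-- the file names below are hypothetical examples -->
<a href="circuit_full.png" target="_blank"><img src="circuit_570.png" alt="Circuit diagram (a larger version opens in a new tab)" width="570" /></a>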
You can include the following within your question:
Once inserted, the transcript is stored with the question. On display the student sees just a ‘display transcript’ link; clicking it reveals the transcript in situ.
Example:
The Maths On-Line Project's preferred way of displaying mathematical equations in the VLE is to use MathML. However writing MathML is not so easy. The alternative is to use TeX.
Within the VLE both TeX and MathML are displayed through a Mathematics service which will also convert between the two, i.e. authors can enter TeX and the mathematics service will create the corresponding MathML. Students may choose to receive their mathematics as images from TeX or as MathML that is interpreted by their browser.
To see a rendition of your maths as you type it, use the 'insert equation' tool. Choose the appropriate tab and enter your expression. Here are a couple of examples to try.
The following TeX will render as one half:
\frac{1}{2}
The equivalent MathML, which also renders as one half, is rather more verbose.
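As a sketch in standard MathML (the exact markup produced by the VLE's mathematics service may differ), one half together with its TeX annotation can be written:
<math xmlns="http://www.w3.org/1998/Math/MathML">
  <semantics>
    <mfrac><mn>1</mn><mn>2</mn></mfrac>
    <!-- the annotation carries the original TeX -->
    <annotation encoding="application/x-tex">\frac{1}{2}</annotation>
  </semantics>
</math>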
Note the annotation field; this is used for building the TeX-based alt text.
Should you wish to include MathML, here is a paper on generating MathML with MathType: http://www.dessci.com/en/reference/white_papers/TUG_paper.pdf
Having written your question you will want to know how it looks to the student. Alongside the question name in the question bank, under each category, is a series of icons.
The preview icon will allow you to try the question. The system tries to apply 'intelligent' initial settings for you to test your question but you are given full control of how you preview the question using the 'Change options' settings.
The Question variant field is also available for the Variable numeric trio of questions.
If you wish to amend your question you may go back to the question editing form by clicking on the edit icon.
And if your question isn't at all what you intended the delete icon will remove it.
Please note:
Steps 2 and 3 of the summative iCMA workflow (elsewhere on this website) require authors and reviewers to test all questions thoroughly using this preview facility.
The HTML <code> tag is an inline tag that can be used to indicate snippets of computer code within sentences. However, to use this tag authors must switch to HTML mode to enter it.
The ‘preformatted’ option in the format menu of the editor is an alternative for showing blocks of code in a different font.
However, the font used for inline <code> text and the font used for 'preformatted' blocks are different. If you wish to mix inline and block code snippets and keep the font consistent you will probably have to amend the preformatted blocks, again by putting the editor into HTML mode.
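A minimal sketch of the HTML involved (assuming that the 'preformatted' option produces a <pre> block; the comment shows one possible amendment):
<p>The function <code>sqrt()</code> returns the square root.</p>
<pre><code>x = sqrt(2)
print(x)</code></pre>
<!-- wrapping the preformatted content in <code> is one way to keep the font consistent with inline snippets -->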
Table cells may be individually controlled. For example, a single table row might contain the cells [[4]], 2 and 3, with each cell styled separately.
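A sketch of such a row in HTML (the style attributes are illustrative only):
<table>
  <tr>
    <td style="text-align: center;">[[4]]</td>
    <td style="text-align: right;">2</td>
    <td>3</td>
  </tr>
</table>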
In a combined question it is possible to arrange two input boxes such that they look like the numerator and denominator in a fraction using the following HTML.
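One possible arrangement (a sketch only; the field numbers are illustrative and the inline styles can be adjusted):
<!-- the bottom border of the top cell acts as the fraction bar -->
<table style="display: inline-table; vertical-align: middle; text-align: center;">
  <tr><td style="border-bottom: 1px solid black;">[[1:numeric]]</td></tr>
  <tr><td>[[2:numeric]]</td></tr>
</table>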