Best Practices
Link pages
To link pages, e.g. with the "Page Jumper" component, you need the page ID of the target page the link should point to. For example, suppose a link to chapter 2 is to be inserted at the end of chapter 1: the "Page Jumper" component has already been placed at the end of chapter 1 and now needs to be linked to the corresponding page.
Copy page ID of target page to clipboard
In the module overview, open the properties of the target page (gear icon in the upper left corner of the element) and select "Copy ID to clipboard" from the context menu.
Now open the "Page Jumper" component in chapter 1. In the Properties block, the name and the link can be entered. The name corresponds to the displayed text of the button, and the link corresponds to the ID of the position to jump to.
Link to the overview page
Since the default overview page at the beginning of the module is not visible in the course content editor, its ID cannot be copied. In this case, simply enter the character "#" in the link field of the "Page Jumper" component instead of a page ID.
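If the module JSON is edited directly rather than through the editor, a "Page Jumper" component entry might look roughly like the sketch below. This is a hypothetical example: the actual property names (`_component`, `_linkId`, etc.) depend on the installed plugin and its version.

```json
{
  "_id": "c-105",
  "_parentId": "b-55",
  "_type": "component",
  "_component": "pageJumper",
  "displayTitle": "Go to chapter 2",
  "_linkId": "co-10"
}
```

For a jump to the overview page, the link value would simply be "#" instead of a page ID, as described above.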
Assessment
Create assessment
First, make sure that the "Assessment" extension is activated in the module (under "Manage Extensions").
Then the basic settings must be defined on the article that is to contain the assessment:
| # | Setting |
|---|---------|
| 1 | Switches the assessment extension on or off. |
| 2 | Name of the assessment. This is only required if several assessments are inserted in one module. If the content consists of only one test chapter, the name can be left empty. |
| 3 | Number of attempts allowed (i.e. work-throughs, until the required minimum pass score has been reached). -1 = unlimited number of attempts; a value > 0 = defined maximum number of attempts (e.g. 3 = max. 3 attempts). |
| 4 | Score that must be achieved to pass the assessment. |
| 5 | If activated, the question elements (blocks) are displayed in random order. |
| 6 | "Question Banks" divide questions into different (administrative) groups. This is required, for example, to compile test sets from a question pool with several chapters. The setting is made in the properties of the blocks. |
| 7 | The Split function defines how many questions are randomly drawn from each chapter (question bank) for a test. Example: the test chapter consists of 4 question banks with 10 questions each; 4 questions from "chapter 1", 3 from "chapter 2", 5 from "chapter 3", and 2 from "chapter 4" should be randomly compiled into a set. The entry in the Split field is then: 4, 3, 5, 2 |
| 8 | If activated, the authoring tool randomizes the order of the individual chapters (question banks) when the test chapter is started. |
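For orientation: in the open-source Adapt framework, on which tools of this kind are typically based, the settings above correspond roughly to an `_assessment` block in the article's JSON. The mapping is: `_isEnabled` (1), `_id` (2), `_attempts` (3), `_scoreToPass` (4), the inner `_randomisation` (5), `_banks` (6) with `_split` (7), and `_banks._randomisation` (8). Property names follow the adapt-contrib-assessment extension and may differ in your tool's version.

```json
"_assessment": {
  "_isEnabled": true,
  "_id": "assessment-chapter-1",
  "_attempts": 3,
  "_isPercentageBased": true,
  "_scoreToPass": 60,
  "_banks": {
    "_isEnabled": true,
    "_split": "4,3,5,2",
    "_randomisation": true
  },
  "_randomisation": {
    "_isEnabled": true
  }
}
```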
Evaluate assessment
The evaluation of an assessment is done with the help of the "Assessment Results" component.
The "Assessment Results" component must be located **in a separate article** outside the questions.
Afterwards, the component can be opened and edited by double-clicking on it.
Feedback after evaluation of the assessment
The feedback that should appear after the assessment is evaluated is entered in the "Feedback Text" field:
The following variables are available for the feedback text:
{{{attemptsSpent}}} -> number of attempts already used
{{{attempts}}} -> total number of attempts
{{{attemptsLeft}}} -> number of remaining attempts
{{{score}}} -> score achieved by the learner
{{{scoreAsPercent}}} -> percentage achieved by the learner (achieved score / max. score)
{{{maxScore}}} -> maximum possible score
{{{feedback}}} -> placeholder for the feedback text that can be defined individually per achieved score or success rate (see "Bands" below)
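A feedback text that combines these variables could look like this (the wording is only an example):

```text
You scored {{{score}}} of {{{maxScore}}} points ({{{scoreAsPercent}}}%).

{{{feedback}}}

Attempts used: {{{attemptsSpent}}} of {{{attempts}}} ({{{attemptsLeft}}} remaining).
```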
Under the entry "Bands" individual feedbacks have to be defined depending on the test result:
| # | Setting |
|---|---------|
| 1 | Under the Score entry, enter the threshold value for the feedback. When participants reach this score (or percentage), the corresponding feedback is displayed. |
| 2 | In the feedback text, enter the feedback that is displayed when participants reach the recorded score (or percentage). |
| 3 | If the Allow Retry option is activated, participants can work through the test chapter again. The precondition, of course, is that they still have attempts left. |
In principle, any number of "Bands" can be defined; additional "Bands" can be added at any time with the "Add" button:
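In the underlying Adapt JSON, the "Bands" of the "Assessment Results" component correspond roughly to a `_bands` array. Property names follow the open-source adapt-contrib-assessmentResults component and may differ in your tool's version; each band applies from its `_score` value upwards until the next band.

```json
"_bands": [
  {
    "_score": 0,
    "feedback": "Unfortunately you did not pass. You have {{{attemptsLeft}}} attempt(s) left.",
    "_allowRetry": true
  },
  {
    "_score": 60,
    "feedback": "Congratulations, you passed the assessment!",
    "_allowRetry": false
  }
]
```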
In the finished content, the above feedback then appears as follows, for example:
Create SCORM package
In order for test results to be saved on the LMS and reported as a percentage, the following settings must be made.
The "Spoor" extension must be activated. The extension can then be configured in the Configuration settings.
The following checkmarks must be set for this:
The following settings must be made in the article settings of the article in which the assessment is located:
The LMS then displays the test result as a percentage. The module is only marked as "completed" once all chapters have been worked through.
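As a reference point: in the open-source Adapt framework, the Spoor settings live in the course's config.json, and the checkmarks described above correspond roughly to the following entry. Property names are taken from the adapt-contrib-spoor extension and may differ in your tool's version.

```json
"_spoor": {
  "_isEnabled": true,
  "_tracking": {
    "_shouldStoreResponses": true,
    "_shouldRecordInteractions": true
  },
  "_reporting": {
    "_onTrackingCriteriaMet": "completed",
    "_onAssessmentFailure": "incomplete"
  }
}
```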
In order to continue at the same point where you left the module on your next visit, the following settings must be made:
The extension "Bookmarking" must be activated. Afterwards the extension can be set in the Project settings:
Under "Level", select whether the bookmark returns the learner to a page, a block, or a component.
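In the underlying Adapt JSON, this setting corresponds roughly to a `_bookmarking` entry with `_level` set to "page", "block", or "component". Property names follow the open-source adapt-contrib-bookmarking extension and may differ in your tool's version.

```json
"_bookmarking": {
  "_isEnabled": true,
  "_level": "block"
}
```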
When re-entering, the following question will appear: