Results page configuration

A look at what each option on the results page configuration means and how it appears to users
Customise the results page configuration to curate what users see after an exam. From here you can decide how results are released and which metrics to show to users.
Results configuration

🛫 Release results

The way you release results will depend on the following:
  • Presence of free-text questions (which need to be manually marked)
  • Moderating the questions (analysing results and omitting 'bad' questions)
  • Results date (releasing results at the same time for everyone)
Once you know what you want to do with your results, you can select a release method.
  • Automatically when possible: Results will be released to each candidate as soon as marks are finalised. For MCQ-only exams this is as soon as the candidate finishes. When there are questions that require manual marking (free text), results will be released once they have been marked.
  • Always automatically: Results will always be released immediately after completion, even if some questions could not be automatically marked. If selected, you cannot go back and mark long-form questions on platform, so choose this only if you plan to mark those questions and store the marks off platform.
  • Always manually: Results will not be released to students until an admin or educator manually releases them. You can include a custom message to show to students after their exam.
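To make the behaviour of the three methods concrete, here is a minimal sketch of how the release decision could be modelled. It is illustrative only: the names and fields (`ReleaseMethod`, `AttemptState`, and so on) are assumptions for this example, not Synap's API.

```typescript
// Hypothetical model of the three release methods described above.
type ReleaseMethod = 'automaticallyWhenPossible' | 'alwaysAutomatically' | 'alwaysManually';

interface AttemptState {
  allQuestionsAutoMarked: boolean; // false while free-text answers await manual marking
  manuallyReleased: boolean;       // set when an admin or educator releases results
}

function resultsVisible(method: ReleaseMethod, attempt: AttemptState): boolean {
  if (method === 'alwaysAutomatically') {
    // Released straight after completion, even with unmarked free-text questions.
    return true;
  }
  if (method === 'alwaysManually') {
    // Held back until an admin or educator releases the results.
    return attempt.manuallyReleased;
  }
  // 'automaticallyWhenPossible': released once every mark is finalised
  // (immediately for MCQ-only exams, after manual marking otherwise).
  return attempt.allQuestionsAutoMarked;
}
```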

⚙️ Results page configuration

Select the configuration you'd like users to see once their results are released.
To use a results page similar to the one above, your portal will need to have questions tagged and faceted.

🧑‍🎓 Individual metrics

Individual metrics on the results page
Score = Number of credits awarded on an attempt, as a percentage of the total number of credits available.
Time spent = Total time spent on the attempt (including time on instructions and section instructions).
Grade = Custom grade awarded, as configured at the exam level.
Question breakdown = Detailed feedback for each question, such as which were answered incorrectly and what the correct answers were.
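As a worked illustration of the score metric above, here is a minimal sketch. The function name and values are hypothetical, not Synap's data model.

```typescript
// Illustrative only: the score shown to users as a percentage of available credits.
function scorePercentage(creditsAwarded: number, creditsAvailable: number): number {
  if (creditsAvailable === 0) return 0;
  return (creditsAwarded / creditsAvailable) * 100;
}

// e.g. 34 credits awarded out of 40 available -> 85%
console.log(scorePercentage(34, 40)); // 85
```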

👨‍👩‍👧‍👦 Cohort metrics

🧠: Cohorts on Synap are all users on the portal, using data from the last 24 months. All cohort-related data requires a minimum of 10 attempts.
Cohort average = The average score of the cohort's attempts on the same test. On dynamically generated exams this is all attempts on this exam; on static exams it is all attempts on the same test, anywhere on the platform.
Percentile = Score performance on the test in comparison to everyone else who has taken it. Both the percentile and the cohort average use all attempts made in the last year.
Score distribution = The cohort's score distribution chart, showing where the attempt-taker placed on this exam.
Score distribution bell curve
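As a rough sketch of how the cohort average and percentile relate to the pool of attempt scores: this assumes a percentile is the share of cohort attempts scoring at or below the candidate's score, which may differ from Synap's exact calculation, and all names here are illustrative.

```typescript
// Illustrative only: cohort statistics over a pool of attempt scores (percentages).
function cohortAverage(scores: number[]): number {
  return scores.reduce((sum, s) => sum + s, 0) / scores.length;
}

// Assumed definition: percentage of cohort attempts scoring at or below this score.
function percentile(scores: number[], myScore: number): number {
  const atOrBelow = scores.filter((s) => s <= myScore).length;
  return (atOrBelow / scores.length) * 100;
}

const cohortScores = [55, 62, 70, 74, 81, 88, 90, 93, 95, 97]; // at least 10 attempts
console.log(cohortAverage(cohortScores));  // 80.5
console.log(percentile(cohortScores, 88)); // 60
```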

Tag metrics

Show tag breakdown = Shows users a breakdown by tags and facets on their attempt, compared to the cohort's. Breakdowns focus primarily on content facets: Subject > Topic > Subtopic (see the sketch at the end of this section).
To enable the breakdown page for students, tag breakdown must be set to all available tags.
Enabling tag breakdown page
Breakdown by content facet
Show progress over time = Shows users' progress over time on the content facets Subject > Topic > Subtopic. A minimum of 2 attempts is required.
Student progress charts
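As referenced above, here is a hypothetical sketch of how a per-tag breakdown could be aggregated from question-level results. The `QuestionResult` shape and the single `subject` facet are assumptions for illustration, not Synap's data model.

```typescript
// Illustrative only: groups question results by their Subject tag and
// reports the user's percentage of credits earned per tag.
interface QuestionResult {
  subject: string;        // content facet, e.g. 'Anatomy'
  creditsAwarded: number;
  creditsAvailable: number;
}

function breakdownBySubject(results: QuestionResult[]): Map<string, number> {
  const awarded = new Map<string, number>();
  const available = new Map<string, number>();
  for (const r of results) {
    awarded.set(r.subject, (awarded.get(r.subject) ?? 0) + r.creditsAwarded);
    available.set(r.subject, (available.get(r.subject) ?? 0) + r.creditsAvailable);
  }
  const breakdown = new Map<string, number>();
  for (const [subject, avail] of available) {
    breakdown.set(subject, ((awarded.get(subject) ?? 0) / avail) * 100);
  }
  return breakdown; // e.g. { 'Anatomy' -> 75, 'Physiology' -> 90 }
}
```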