Test & Question Analytics
Learn how to configure Synap's Segment integration to capture detailed insights into how your users are performing and engaging with their tests.
Synap integrates with Segment, a market-leading data integration platform, to provide a simple, unified way to connect your data.
To learn how to connect Synap to Segment, please view this page. This article focuses on the details of a specific but commonly requested use case for Segment: gathering data on the tests and questions your users are answering.
General Information & Set-up
Segment primarily works by tracking Page, Identify and Track calls:
Page: An event that is automatically sent when a user loads a certain page
Track: An event that is sent when a user takes a specific action on the platform, e.g. Logged In, Question Answered
Identify: An event that is sent when a user successfully logs in, or is otherwise authenticated onto the platform
For the purposes of this article, we will assume that you have performed the basic Segment set-up process, in which case your Page, Track and Identify calls should be coming through automatically.
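For reference, a Track call arrives at a downstream destination in roughly the following shape (a simplified sketch based on Segment's common envelope fields; all values are hypothetical):

```typescript
// Simplified sketch of a Track call as seen by a downstream destination.
// The envelope fields (type, event, userId, properties, timestamp) follow
// Segment's common spec; every value below is hypothetical.
const exampleTrackCall = {
  type: 'track',
  event: 'Question Answered',
  userId: 'user_123',
  properties: {
    // event-specific fields - documented later in this article
  },
  timestamp: '2024-01-01T10:30:00.000Z',
};
```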
By default, we send test-related events when a user submits an attempt. However, if you turn Detailed Event Tracking on from your Settings/Integrations page, you will also receive the Question Answered event, a much more granular event sent for each individual question a user answers on the platform. Enabling this option significantly increases the amount of data sent into Segment: for a 100-question test, it results in 100 Question Answered events plus a Completed Test event, whereas with Detailed Event Tracking turned OFF you will just receive a Completed Test event.
If you are interested in in-depth tracking and analysis of your users' performance, we would recommend turning Detailed Event Tracking ON. Otherwise, you should leave it OFF.
Track Events Overview
Synap sends a number of different Track events to Segment - we will not list all of them here, but instead will focus on the ones most relevant to Test & Question analytics.
Event Name | Sent When | Description |
---|---|---|
Began Test | A user starts a test | Contains basic information about the test (e.g test ID, attempt ID) |
Submitted Test | A user submits a test | Contains basic information about the test |
Completed Test | A user submits a test and it has been marked | Contains detailed information about the test. This event is sent immediately upon completion if the test can be automatically marked. If it contains questions that require manual marking, then it is sent once the results have been marked and released |
Question Answered | A user submits a test and it has been marked. (1x event for each question answered) | Contains detailed information about the question that was answered. Sent alongside Completed Test. Only sent if Detailed Event Tracking is ON |
If you are looking to build an analytics pipeline, we recommend focusing on the Completed Test and Question Answered events, as these are by far the most detailed. The other events essentially act as notifications and carry comparatively limited information.
They may still be useful, however, for basic usage analytics such as counting how many users have performed a certain action.
The remainder of this article will focus on the Completed Test and Question Answered events.
The events above are also available via Webhooks if you do not want to use Segment to manage your data. Please speak to your Account Manager if you would like to use Webhooks for these events.
Connecting Destinations for your Data
Segment is an incredibly powerful tool that lets you send your data to thousands of other tools, as well as custom destinations. As a few examples, Segment integrates with Mailchimp, Intercom, Stripe, Mixpanel and much, much more.
For this article, we will focus on just a few examples which are most relevant to building an analytics pipeline.
Popular Analytics Services
You may want to connect your data to an Analytics tool. For event-based data, we recommend looking at a tool which is specifically designed for event-based analytics, as opposed to a generalist 'website analytics' tool such as Google Analytics.
Some of the most popular event-based analytics tools supported by Segment are:
These tools let you get started very quickly and have powerful data visualisation built in. Many of our customers use them to get great analytics 'out of the box' from Synap's event data.
However, if you want even more control over your data, such as the ability to build custom reports and perform your own analysis, you may want to send your data to a Data Warehouse. Fortunately, Segment supports this too; you can easily send your data to:
Segment also supports Functions, a Lambda/Micro-service style coding environment you can use to react and respond to incoming data in real-time.
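If you go down the Functions route, a destination Function gives you a hook for each incoming call. The sketch below shows how such a Function might forward only Synap's two most detailed events to your own API. Functions are authored in JavaScript; the onTrack(event, settings) handler shape follows Segment's documented pattern, and settings.endpointUrl is a hypothetical setting you would configure on the Function:

```typescript
// Sketch of a Segment destination Function that forwards only Synap's two
// detailed events to a hypothetical internal API endpoint.
async function onTrack(
  event: { event: string; userId?: string; properties?: Record<string, unknown> },
  settings: { endpointUrl: string },
) {
  const detailedEvents = ['Completed Test', 'Question Answered'];
  if (!detailedEvents.includes(event.event)) {
    return; // ignore notification-style events such as Began Test / Submitted Test
  }
  // settings.endpointUrl is a hypothetical Function setting, not a Synap value
  await fetch(settings.endpointUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      name: event.event,
      userId: event.userId,
      properties: event.properties,
    }),
  });
}
```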
The Completed Test Event
As stated above, this event is triggered when a user submits a test and it has been marked. For automatically-marked tests this will be immediately upon completion. For manually-marked tests it will be once the marking process has been completed.
The JSON snippet below shows an example payload from this event, and the table below it explains the key fields in more detail
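Since real payloads vary with how the test is set up, the snippet below is an abridged, illustrative sketch written as a TypeScript object literal: all values are hypothetical and the field names follow the Properties table.

```typescript
// Abridged, illustrative Completed Test payload. All values are hypothetical;
// the real event contains many more fields (see the Properties table).
const exampleCompletedTest = {
  event: 'Completed Test',
  userId: 'user_123', // standard Segment field
  properties: {
    className: 'Attempt',
    id: 'attempt_abc',
    test: { id: 'test_xyz', title: 'Cardiology Mock Exam' },
    quizMode: 'timedExam',
    accessedBy: { contentDeliveryType: 'exam', userGroupId: 'group_1' },
    score: 72,        // integral percentage (0-100)
    scoreFrac: 0.72,  // fractional percentage (0-1)
    totalQuestions: 100,
    totalAnswered: 97,
    totalAnsweredCorrectly: 72,
    totalQuestionsSkipped: 3,
    timeSpentInMs: 5400000,
    tags: [
      {
        facet: 'topic',
        id: 'tag_1',
        label: 'Cardiovascular Anatomy',
        score: 80,
        scoreFrac: 0.8,
        totalAnsweredCorrectly: 8,
        totalQuestions: 10,
      },
    ],
    timeStarted: '2024-01-01T09:00:00.000Z',
    timeCompleted: '2024-01-01T10:30:00.000Z',
  },
};
```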
Properties
(note: userId and event are standard fields required by Segment and are not detailed in the Properties table below)
In the table below, property names have been flattened, with nested properties indicated by a dot (.) and arrays indicated by a $. Bold fields indicate a 'parent object' rather than a value itself, but have been included to provide a helpful reference to what the overall object is used for, with their child fields explained below.
Field | Value | Description |
---|---|---|
accessedBy | Object | Various meta-information about how the user accessed this test |
accessedBy.contentDeliveryType | collection \| exam \| assignment \| spacedRepetition \| selfPractice \| mockExam | Enumerated string denoting how the user accessed this test, e.g. was it part of an Exam, or via their Spaced Learning, or an Assignment |
accessedBy.userGroupId | string / id | String representing the Synap UserGroup ID through which this user was granted access to the test |
className | Attempt | All individual test attempts are referred to as an Attempt on Synap |
collection.id | string / id | If contentDeliveryType is 'collection', this field will show the ID of the collection the user accessed the test from |
collection.name | string | If contentDeliveryType is 'collection', this field will show the name of the collection the user accessed the test from |
createdAt | Date | The date at which this attempt was first created |
id | string / id | The Synap ID of this particular Attempt |
collectionItem.id | string / id | If contentDeliveryType is 'collection', this field will show the Synap ID of the Collection Item that the test was taken from |
collectionItem.title | string | If contentDeliveryType is 'collection' this field will show the title (name) of the Collection Item that the test was taken from |
isExam | boolean | Denotes whether or not this Attempt was taken as a part of a Synap Exam |
isFullAttempt | boolean | Denotes whether or not this Attempt was created using all of the available Questions in the Test. If False then it indicates that the user chose to take a subset of the questions available |
isGenerated | boolean | Denotes whether this test was 'generated', in which case the user created this attempt based on specific criteria, e.g. 'Show me all of the Hard questions only', or created the test via Spaced Learning |
isSpacedRepetition | boolean | Denotes whether this was a Spaced Learning test or not |
percentageOfQuestionsSkipped | number | A fractional percentage (0-1) indicating the percentage of questions that the user did not answer during the test |
portal.hostname | string | The primary hostname of the portal the test was taken on. This can be useful if you have multiple Synap portals attached to one Segment account and/or Data Warehouse and need to distinguish between them |
portal.id | string / id | The Synap Portal ID of the portal the test was taken on. (see portal.hostname above for more info) |
portal.name | string | The name of the portal the test was taken on (see portal.hostname above for more info) |
quizContext.metadata.view | card \| player | If the quiz was taken as part of a Collection, this field indicates the type of View the collection was in, e.g. 'Card' or 'Player' |
quizContext.subTargetId | string / id | If taken as part of a collection, this will be the ID of the Collection Item |
quizContext.targetId | string / id | This is the 'main' ID of the item that the test was accessed through. If taken through a Collection, it will be the Collection ID. If taken through an Exam, it will be the Exam ID |
quizContext.type | collection \| assignment | Denotes whether the test was taken via a Collection or an Assignment |
quizMode | test \| practice \| timedExam | Denotes which 'test mode' the test was taken in. Practice means the student received feedback after each question. Test and timedExam mean they received feedback only at the end, with timedExam also meaning the attempt was taken under timed conditions |
score | number (0-100) | Integral percentage denoting the user's percentage score in the test |
scoreFrac | number (0-1) | Fractional percentage denoting the user's percentage score in the test |
grade | Object | If grades were used, this field contains information about the grade the user achieved for this Attempt |
grade.isPass | boolean | Whether or not the grade achieved constitutes a passing grade |
grade.label | string | The label of the grade achieved, e.g. 'Strong Pass' |
grade.minScore | number (0-100) | Integral percentage representing the minimum score a user needed to achieve the grade they did |
sections | Object array | If the test contained Sections, this field will contain summaries of the user's performance in each section |
sections.$.id | string / id | The ID of the section |
sections.$.score | number (0-100) | Integral percentage denoting the user's percentage score in that section |
sections.$.scoreFrac | number (0-1) | Fractional percentage denoting the user's percentage score in that section |
sections.$.timeSpentInMs | number | The amount of time, in milliseconds, that the user spent on that section |
sections.$.title | string | The title of the Section |
sections.$.totalAnsweredCorrectly | number | The number of questions the user answered correctly in that section |
sections.$.totalQuestions | number | The total number of questions in that section |
tags | Object array | If the test used Tags, this field will show a summary of the user's performance in each Tag - this can be very helpful for analysis |
tags.$.facet | subject \| topic \| subtopic \| skill \| difficulty | Enumerated string indicating the Facet of the tag |
tags.$.id | string / id | The Synap ID of the Tag |
tags.$.label | string | The label of the tag, e.g. 'Cardiovascular Anatomy' |
tags.$.score | number (0-100) | Integral percentage indicating the user's score across all questions with this tag |
tags.$.scoreFrac | number (0-1) | Fractional percentage indicating the user's score across all questions with this tag |
tags.$.totalAnsweredCorrectly | number | The number of questions answered correctly with this tag |
tags.$.totalQuestions | number | The total number of questions with this tag |
test.id | string / id | The Synap ID of the underlying Test object |
test.title | string | The title (name) of the underlying Test object |
timeCompleted | Date | The date at which the user completed this Attempt |
timeStarted | Date | The date at which the user started this attempt |
timeSpentInMs | Number | The amount of time, in milliseconds, that the user spent on this Attempt |
timestamp | Date | (use timeCompleted instead) |
totalAnswered | Number | The total number of questions that were answered (e.g. not skipped) by the user in this attempt |
totalAnsweredCorrectly | Number | The total number of questions that were answered correctly by the user in this attempt |
totalQuestions | Number | The total number of available questions in this attempt |
totalQuestionsSkipped | Number | The total number of questions that were skipped (not-answered) in this attempt |
totalSections | Number | The total number of sections in this attempt (0 if Sections were not used) |
totalSectionsCompleted | Number | The total number of sections that the user completed in this attempt (0 if sections were not used) |
totalSectionsSkipped | Number | The total number of sections that were not completed (skipped) in this attempt (0 if sections were not used) |
user.id | string / id | The Synap ID of the user who completed this attempt |
user.name | string | The name of the user who completed this attempt |
userGroupIds | string array | A list of the UserGroup IDs that this user belongs to |
userGroups | Object Array | A list of the UserGroup IDs, and their names, that this user belongs to |
userGroups.$.id | string / id | The Synap ID of this UserGroup |
userGroups.$.name | string | The name of this UserGroup |
usersNthAttempt | number | Indicates which attempt this is for the user at this test. If usersNthAttempt is 2, this attempt is the user's second attempt at this test |
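As a sketch of what downstream analysis of these summaries might look like, the snippet below averages the per-tag fractional scores across a batch of Completed Test events. The TagSummary and CompletedTestProps types are hypothetical, partial descriptions of the properties documented above:

```typescript
// Hypothetical, partial types for the properties documented in the table above.
interface TagSummary {
  facet: string;
  id: string;
  label: string;
  scoreFrac: number; // fractional percentage (0-1)
  totalAnsweredCorrectly: number;
  totalQuestions: number;
}

interface CompletedTestProps {
  tags?: TagSummary[];
}

// Average fractional score per tag label across a batch of Completed Test events.
function averageScoreByTag(events: CompletedTestProps[]): Map<string, number> {
  const totals = new Map<string, { sum: number; count: number }>();
  for (const event of events) {
    for (const tag of event.tags ?? []) {
      const entry = totals.get(tag.label) ?? { sum: 0, count: 0 };
      entry.sum += tag.scoreFrac;
      entry.count += 1;
      totals.set(tag.label, entry);
    }
  }
  const averages = new Map<string, number>();
  totals.forEach(({ sum, count }, label) => averages.set(label, sum / count));
  return averages;
}
```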
The Question Answered Event
As you can see above, the Completed Test event contains a lot of information about the Attempt, including scores by each Tag and Section where relevant.
However, if you need even more detail, the Question Answered event gives you insight into each individual question that was answered during a user's Attempt. Example payload:
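As with Completed Test, the snippet below is an abridged, illustrative sketch written as a TypeScript object literal: all values are hypothetical and the field names follow the Properties table.

```typescript
// Abridged, illustrative Question Answered payload. All values are hypothetical;
// the real event contains more fields (see the Properties table).
const exampleQuestionAnswered = {
  event: 'Question Answered',
  userId: 'user_123', // standard Segment field
  properties: {
    className: 'Response',
    id: 'response_001',
    attempt: { attemptId: 'attempt_abc', questionNumber: 0, totalQuestions: 100 },
    optionType: 'singleCorrect',
    chosenOptionIndexes: [1], // the user chose Option B
    correctOptionIndexes: [1],
    isCorrect: true,
    marks: { credit: 1, penalties: 0, points: 1, maxPoints: 1, outcome: 'correct' },
    question: {
      id: 'question_789',
      facets: { subject: 'Cardiovascular', topic: 'Arteries', difficulty: 'Medium' },
    },
    score: 100,
    scoreFrac: 1,
    test: { id: 'test_xyz', title: 'Cardiology Mock Exam' },
    timeCompleted: '2024-01-01T10:30:00.000Z',
  },
};
```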
Properties
(note: userId and event are standard fields required by Segment and are not detailed in the Properties table below)
In the table below, property names have been flattened, with nested properties indicated by a dot (.) and arrays indicated by a $. Bold fields indicate a 'parent object' rather than a value itself, but have been included to provide a helpful reference to what the overall object is used for, with their child fields explained below.
Field | Value | Description |
---|---|---|
attempt | Object | Contextual information about the Attempt this Response belongs to |
attempt.attemptId | string / id | The Attempt ID that this response belongs to. Can be used to associate a set of Responses / Question Answered events to an Attempt / Completed Test event |
attempt.questionNumber | number | The position at which this question appeared within the Attempt (note: this number starts from 0, so 0 would be the first question, 1 the second, etc.) |
attempt.totalQuestions | number | The total number of questions in this attempt |
chosenOptionIndexes | number array | An array of numbers indicating the options that the user selected for this question. This starts from 0, so Option A would be 0, Option B would be 1 etc. If chosenOptionIndexes is [1, 3] this would indicate that the user selected options B and D. |
correctOptionIndexes | number array | Similar to chosenOptionIndexes above, this denotes the correct set of options for this question |
className | Response | (This will always be 'Response', as Question Answered always relates to a Synap Response object - you can ignore it) |
createdAt | Date | (Roughly corresponds to the time at which the user answered this question, however it is better to use timeCompleted instead) |
id | string / id | The Synap ID of this Response |
isCompleted | boolean | Whether or not the response was completed - this should always be set to true even if the user skipped this question |
isCorrect | boolean | Whether or not the user answered this question correctly |
marks | Object | Detailed information about the marks the user received for this question |
marks.credit | number | The number of credits the user received for this Response |
marks.maxPoints | number | The maximum number of points that could have been awarded for this Response |
marks.outcome | correct \| incorrect \| partial | Whether the response to this question was correct, incorrect or 'partially correct' |
marks.points | number | The overall number of points awarded for this Response (credits minus penalties) |
marks.penalties | number | If negative marking was used in this question, this field denotes the number of negative marks that were applied for it |
question | Object | This object holds detailed information about the underlying Question that this Response relates to |
question.doi | string \| null | The external ID ('doi') of this question, if specified |
question.emqGroup | string \| null | The Synap ID of the EMQ Group that this question belongs to, if any |
question.facets | Object | A summary of the Faceted Tags attached to this question |
question.facets.subject | string | The label of the Subject tag of this question (if specified), e.g. Cardiovascular |
question.facets.topic | string | The label of the Topic tag of this question (if specified), e.g. Arteries |
question.facets.subtopic | string | The label of the Subtopic tag of this question (if specified), e.g. Arterial Disease |
question.facets.skill | string | The label of the Skill tag of this question (if specified), e.g. Pathophysiology |
question.facets.difficulty | string | The label of the Difficulty tag of this question (if specified), e.g. Medium |
question.id | string | The Synap ID of this question |
optionAnsweringType | string | (deprecated, use optionType instead) |
optionContentType | string | (deprecated, use optionType instead) |
optionType | freeText \| singleCorrect \| multipleCorrect \| ranked \| fileUpload \| readOnly \| audioRecording | The type of question being answered |
positionInTest | number | The position in which this question appeared in the Attempt (0-indexed) |
sourceId | string / id | The ID of the 'Source Question' this Response was derived from. This will always relate to the original question as it appears in your Library, so it can be helpful when comparing instances of a question across multiple different exams |
tagList | Object Array | Contains a list of all of the tags associated with this question, including unfaceted tags |
tagList.$.facet | subject \| topic \| subtopic \| skill \| difficulty | Enumerated string indicating the Facet of the tag |
tagList.$.id | string / id | The Synap ID of the Tag |
tagList.$.label | string | The label of the Tag |
tags | String Array | An array of strings denoting the tag labels associated with this question (deprecated, use tagList or facets instead) |
score | number (0-100) | Integral percentage indicating the user's overall score for this question |
scoreFrac | number (0-1) | Fractional percentage indicating the user's overall score for this question |
test | Object | Contains information about the underlying Test that this Attempt was based on |
test.id | string / id | The Synap ID of the test |
test.title | string | The title (name) of the test |
timeCompleted | Date | The Date at which the user completed this Attempt |
timestamp | Date | (use timeCompleted instead) |
totalCorrectlyChosenOptions | number | The number of options that were chosen correctly by the User |
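Because each Question Answered event carries attempt.attemptId, a set of responses can be re-associated with their parent Completed Test event downstream. Below is a minimal sketch, using hypothetical, partial types for the documented properties:

```typescript
// Hypothetical, partial type for the Question Answered properties documented above.
interface QuestionAnsweredProps {
  attempt: { attemptId: string; questionNumber: number };
  isCorrect: boolean;
}

// Group a batch of Question Answered events by their parent Attempt ID so they
// can be matched up with the corresponding Completed Test event.
function groupResponsesByAttempt(
  responses: QuestionAnsweredProps[],
): Map<string, QuestionAnsweredProps[]> {
  const byAttempt = new Map<string, QuestionAnsweredProps[]>();
  for (const response of responses) {
    const key = response.attempt.attemptId;
    const existing = byAttempt.get(key) ?? [];
    existing.push(response);
    byAttempt.set(key, existing);
  }
  return byAttempt;
}
```

From the grouped responses you can, for example, recompute per-question accuracy or list which questions were answered incorrectly within a given attempt.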