Questionmark Analytics provides the tools and reporting capabilities you need to uncover the meaning hidden within your assessment results -- and to share data in a meaningful way with stakeholders. This next-generation reporting system pulls data from your Perception repository, stores it in a specially constructed Results Warehouse, and includes built-in reports that you can use to drill down to the results you need. Questionmark Analytics is now available to all customers of the Questionmark On Demand service.
The Item Analysis Report provides an in-depth Classical Test Theory (CTT) psychometric analysis of item performance. It enables you to drill down into specific item statistics and performance data. The report includes key item statistics: item difficulty p-value, high-low discrimination, item-total correlation discrimination, item-rest correlation discrimination and item reliability. It also provides assessment statistics relating to the amount of time taken and the scores achieved.
The report includes an assessment-level overview graph and summary table. The overview graph plots a single point for each item in the summary table. Each question is plotted by its item difficulty p-value (X-axis) and its item-total correlation discrimination (Y-axis). The report also calculates item-rest correlation discrimination (the scatter plot provides the option to plot either "item-total" or "item-rest" along the Y-axis).
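To make these statistics concrete, here is a minimal sketch (not Questionmark's implementation) of how the item difficulty p-value, item-total correlation and item-rest correlation can be computed from a matrix of dichotomously scored responses, where each row is a participant and each column an item (1 = correct, 0 = incorrect):

```python
# Illustrative sketch of the CTT item statistics named above; the function
# names are our own, not part of any Questionmark API.
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    sx, sy = pstdev(xs), pstdev(ys)
    return cov / (sx * sy) if sx and sy else 0.0

def item_statistics(responses):
    """Return (p_value, item_total_r, item_rest_r) for each item."""
    totals = [sum(row) for row in responses]
    n_items = len(responses[0])
    stats = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        p = mean(item)                                # item difficulty p-value
        r_total = pearson(item, totals)               # item-total correlation
        rest = [t - x for t, x in zip(totals, item)]  # score excluding this item
        r_rest = pearson(item, rest)                  # item-rest correlation
        stats.append((p, r_total, r_rest))
    return stats
```

The item-rest variant correlates each item with the total score minus that item, which removes the item's contribution to its own correlation.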
The summary table beneath the scatter plot contains a line item for every question on the assessment. The table provides information on the question order, question wording and description, and summary information on the item difficulty p-value and item-total correlation discrimination for each question. You can select an item in the table to navigate to the details of that item, and you can sort on each column to get different views of question performance. For example, you can sort the questions by difficulty to see the hardest questions at the top of the table.
This interactive report provides a high-level view of your items' performance and enables you to drill down into specific item statistics and performance data. The report includes user-adjustable visual cues to indicate results that may need attention, a graphical representation of how items compare with each other in terms of p-value and correlation, and several publish-to-PDF and publish-to-CSV options.
Assessment results over time report: The Assessment results over time report provides information on assessment results over a period of time specified by the user. It provides the mean, minimum and maximum values for an assessment, as well as a 95% confidence interval for assessment scores.
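As an illustration of the summary values this report presents, a common way to form a 95% confidence interval around a mean score is the normal approximation, mean ± 1.96 × standard error. This is a sketch under that assumption, not Questionmark's documented formula:

```python
# Illustrative only: mean, min, max and a normal-approximation 95% CI
# for a list of assessment scores.
from statistics import mean, stdev
from math import sqrt

def score_summary(scores):
    """Return (mean, min, max, (ci_low, ci_high)) for a list of scores."""
    m = mean(scores)
    se = stdev(scores) / sqrt(len(scores))  # standard error of the mean
    return m, min(scores), max(scores), (m - 1.96 * se, m + 1.96 * se)
```

With small samples, a t-distribution multiplier would be more appropriate than 1.96; the constant is used here only to keep the sketch simple.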
Survey frequency analysis report: The Survey frequency analysis report provides detailed frequency analysis information for each question on a survey assessment in terms of how many participants selected each answer. The report consists of a listing of each question on the survey and a graph showing the corresponding number of participants responding to each answer option.
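The raw counts behind such a frequency analysis can be sketched as a simple tally of selected options per question (the data shape below is our own illustration, not Questionmark's export format):

```python
# Illustrative: tally, per survey question, how many participants selected
# each answer option.
from collections import Counter

def frequency_analysis(responses):
    """responses: list of dicts mapping question id -> selected option.
    Returns a dict mapping question id -> Counter of option counts."""
    counts = {}
    for answer_set in responses:
        for question, option in answer_set.items():
            counts.setdefault(question, Counter())[option] += 1
    return counts
```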
This report covers participants who have been scheduled to take assessments at test centers, so that users can review performance at various test centers and evaluate whether there are potential test security issues. Use it to:
- Identify potential cheating
- Flag potential content theft
- Determine if allotted time is sufficient for completing the assessment
Demographic report: Review a breakdown of assessment results by a demographic variable
Question frequency analysis report: Get a quick, basic view of response frequency for all questions in a given assessment
The Analytics tab in the user interface provides direct access to several reports with a variety of filtering (e.g. by group, date range, demographics) and distribution options (e.g., browser, CSV, PDF).
Job task analysis (JTA) surveys provide valuable data to validate competency assessment content, along with crucial audit trails and documentation in the event test or exam validity is ever questioned or challenged. Once respondents have completed JTA survey assessments, administrators can generate reports to analyze results and share them with stakeholders.
Questionmark provides two dedicated reports specific to JTA assessments:
- The JTA Summary Report provides the total percentage and number of participants that have selected each task, dimension, and choice in a JTA question.
- The JTA Dimension by Demographic Report groups results by task, dimension and demographic data.
In addition to these reports, three JTA-related OData feeds in Questionmark's Results API enable you to create custom dashboards for your results and/or analyze your results using business intelligence applications.
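OData feeds are queried over HTTP using standard OData query options such as `$filter`, `$select` and `$top`. As a hedged illustration of composing such a query, the sketch below builds a feed URL; the base URL and feed name are placeholders, not Questionmark's actual endpoints, so consult the Results API documentation for the real feed addresses:

```python
# Illustrative: compose an OData GET URL using standard query options.
# "https://example.com/odata" and the feed name are hypothetical.
from urllib.parse import urlencode

def odata_query(base_url, feed, filter_expr=None, select=None, top=None):
    """Build an OData URL with optional $filter/$select/$top options."""
    params = {}
    if filter_expr:
        params["$filter"] = filter_expr
    if select:
        params["$select"] = ",".join(select)
    if top:
        params["$top"] = str(top)
    query = urlencode(params, safe="$")  # keep "$" readable in option names
    return f"{base_url}/{feed}" + (f"?{query}" if query else "")
```

The resulting URL can then be fetched with any HTTP client, and most business intelligence tools can consume such a feed directly.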
Question type report: This report is an item bank report that provides information on how many questions you have in your repository by question type (e.g., Multiple Choice, Yes/No, etc.). You can see the information for your entire repository or just for specific topics that you specify. The CSV version of this report gives detailed information about each question's wording, description, type, and location in topics.
Question status report: This report is an item bank report that provides information on how many questions you have in your repository by question status. You can see the information for your entire repository or just for specific topics that you specify. The CSV output options provide detailed information about each question's wording, description, status, and location in topics.
Course summary report: The Course summary report compares course evaluation information across courses. This report is useful for managers and supervisors to determine how different courses within an organization compare on course evaluation measures. The Course summary report has a dynamically generated table representing the summary survey scores for all topics in each course. The averages are ranked and color coded from lowest to highest score.
Instructor summary report: The Instructor summary report compares all instructors within a course in terms of Course evaluation feedback. The Instructor summary report has a dynamically generated table representing summary survey scores for all topics for each instructor. The averages are ranked and color coded from lowest to highest score.
Class summary report: The Class summary report compares course evaluation feedback for a single instructor for a single course. This report is useful for managers and supervisors to determine how different instructors within an organization compare on course evaluation measures. The report may also be useful to instructors to obtain summary information on their course evaluation results.
Class detail report: The Class detail report provides detailed question level course evaluation information for a single instructor for a single course. This report is useful for managers and supervisors to determine how different instructors within an organization compare on course evaluation measures. The report may also be useful to instructors to obtain summary information on their course evaluation results.