Viewing and interpreting reports
For frequently asked questions about interpreting reports, please click on the links below.
- How do I know when there is a meaningful difference between two scores?
- I want to know the range of scores for individual students, and can't see how to get this from the Learning Pathways reports.
- Where do I get the individual attitude scores?
- What does the red circle on the Individual Learning Pathways report mean?
- Why is the red circle on the Individual Learning Pathways larger than in Version 4?
- How do I get a Console Report on a subset of students in my class?
- One of my students sat a test, but when I bring up the ILP or the Tabular Report, the student’s scores are missing.
- Why do the Learning Pathways Reports repeat items and achievement objectives in different quadrants?
- If two students get the same number of questions correct, why are the scores different?
- How come my high scoring students have no strengths, while the low scoring students have lots of strengths listed in their ILPs?
- Are the data in the Tabular Report sortable?
- What do the labels aMs, aRs, aWs on reports mean?
- How can all students in a year level within a school be compared to the national mean?
- What do the boxes on the Console Report mean?
- Students cannot open the objective links on their Individual Learning Pathway Reports.
- How do I view a Tabular Report? When I click on the Tabular Report icon, the report does not display.
- Why do I have to select a school, then a group in order to view reports?
- Why are some of my Year 6 students reported as Year 7?
- Why is the attitude bar sometimes blank in the Multitest Console Report?
- My Numeracy Project tests do not appear in the View Reports list. How can I access my Numeracy Project information?
- How does the Learning Pathways report work for writing?
How do I know when there is a meaningful difference between two scores?
The standard error of measurement is about 15 asTTle scale points in each subject. Any difference of more than 22 points between two students, two classes, or two performances over time for the same student is statistically significant, and teachers can interpret it as a real change in that individual's or group's performance. Differences of less than 22 points are close to random, so little weight should be placed on them.
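As a rule of thumb, this check can be expressed in a few lines of code. The sketch below is illustrative only: the 22-point cut-off is roughly √2 × 15, the standard error of a difference between two scores that each carry a 15-point standard error of measurement.

```python
import math

SEM = 15  # approximate standard error of measurement, in asTTle scale points

def meaningful_difference(score_a, score_b, sem=SEM):
    """Return True when the gap between two asTTle scores exceeds the
    standard error of a difference (sqrt(2) * sem, about 21.2, i.e. ~22)."""
    threshold = math.sqrt(2) * sem
    return abs(score_a - score_b) > threshold

print(meaningful_difference(1500, 1540))  # 40-point gap: True
print(meaningful_difference(1500, 1515))  # 15-point gap: False
```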
I want to know the range of scores for individual students, and can't see how to get this from the Learning Pathways reports.
You can get this information from the Tabular reports.
Where do I get the individual attitude scores?
The overall attitude score is shown on the Tabular Report. It is the mean of the six attitude questions, where 1 indicates a low attitude and 4 a high attitude. To see more detail, go into the marking screen and view the responses that were entered, or look at the students' individual test forms.
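The overall attitude score is simply the mean of the six responses; a minimal sketch (the function name is illustrative, not part of e-asTTle):

```python
def overall_attitude(responses):
    """Mean of the six attitude questions; each response is 1 (low) to 4 (high)."""
    if len(responses) != 6:
        raise ValueError("expected responses to all six attitude questions")
    return sum(responses) / len(responses)

print(overall_attitude([3, 4, 2, 3, 3, 4]))  # about 3.17
```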
What does the red circle on the Individual Learning Pathways report mean?
The circle represents one standard error of measurement around the student's score. This is like the 'margin of error' reported in political polls: two out of three times, the student's true score will lie somewhere between the top and the bottom of the red circle.
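In other words, the circle marks an interval of one standard error either side of the observed score. A small sketch, assuming the roughly 15-point standard error of measurement on the asTTle scale:

```python
SEM = 15  # approximate standard error of measurement on the asTTle scale

def score_band(observed_score, sem=SEM):
    """One-SEM band around an observed score: roughly two times out of
    three, the student's true score lies inside this interval."""
    return observed_score - sem, observed_score + sem

print(score_band(1500))  # (1485, 1515)
```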
Why is the red circle on the Individual Learning Pathways larger than in Version 4?
The circle represents one standard error of measurement around the student's score. The number of points represented by the circle has not changed between Version 4 and e-asTTle. However, because the scale on the reports has changed to reflect the new e-asTTle scoring, the circle appears larger.
How do I get a Console Report on a subset of students in my class?
You will need to create a separate group for those students. In e-asTTle, go to Manage Student in the left-hand menu and create a new group (class) containing the students of interest. Students can belong to multiple groups in e-asTTle.
One of my students sat a test, but when I bring up the ILP or the Tabular Report, the student’s scores are missing.
Good practice means that no single score should be relied upon for high-stakes decisions. In e-asTTle, a student needs to get at least three items correct before a score is estimated. This reduces the risk of a student getting one or two items correct by chance and thus skewing their scores.
There are several places where this may be evident on the e-asTTle reports. The first is when the student gets fewer than three questions correct in the entire test. The ILP will then display only the curriculum objectives (all the incorrect questions in 'To Be Achieved', and any correct ones in 'Strengths'), and the table at the bottom will display a “-” in each of the score and level rows.
The second case is when a student gets three or more questions correct in the test but does not get three or more correct in each strand. The report shows a similar display: while there will be an overall score, a “-” will appear in the score table for those strands. The same applies to the Surface and Deep scores. For shorter tests in particular, students may receive only an overall score and strand scores.
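The display rule can be sketched as follows (a hedged illustration; the function and values are hypothetical, not e-asTTle's actual code):

```python
def displayed_score(num_correct, estimated_score):
    """Return the score to display, or '-' when fewer than three items
    were answered correctly, since no score is estimated in that case."""
    if num_correct < 3:
        return "-"
    return estimated_score

print(displayed_score(2, 1480))  # a strand with only 2 correct shows "-"
print(displayed_score(5, 1480))  # 5 correct: the estimated score is shown
```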
Why do the Learning Pathways Reports repeat items and achievement objectives in different quadrants?
Items can measure more than one objective, and each objective is measured by a number of items of varying difficulty, so the same item or objective can appear in different quadrants. Placement is determined by comparing the difficulty of each item with the ability of the student, and by whether the question was answered correctly.
If two students get the same number of questions correct, why are the scores different?
It is possible to get different scores. Two students with the same number of correct answers have the same raw score, but a student's asTTle score is determined by the relative difficulty of the questions answered correctly. A student who gets the harder questions right will therefore receive a higher overall score than a student who gets the same number of easier questions right.
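The effect of item difficulty can be illustrated with a toy calculation. This is not the actual e-asTTle scoring model (which is based on item response theory); it simply shows how two identical raw scores can lead to different scaled scores when harder items carry more credit:

```python
def toy_scaled_score(answers):
    """answers: list of (difficulty, correct) pairs for one student.
    Toy model only: credit is the summed difficulty of correct items."""
    return sum(difficulty for difficulty, correct in answers if correct)

# Both students answer 2 of 4 questions correctly (the same raw score)
student_a = [(1400, True), (1450, True), (1550, False), (1600, False)]
student_b = [(1400, False), (1450, False), (1550, True), (1600, True)]

print(toy_scaled_score(student_a))  # 2850: credit for the easier items
print(toy_scaled_score(student_b))  # 3150: credit for the harder items
```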
How come my high scoring students have no strengths, while the low scoring students have lots of strengths listed in their ILPs?
The placement of objectives in the four quadrants is based on each item's difficulty relative to the student's overall ability, and on whether the item was answered correctly. It is therefore not possible to have strengths if all the items answered correctly were easier than the student's overall ability. To reveal a student's strengths, the test must contain items harder than the student's overall ability – consider giving such a student a harder test next time. On the other hand, a weaker student who gets some hard items correct will show strengths. Remember, the ILP report is a diagnostic analysis of an individual; it is not intended for comparisons between students. Use the Console Report for that purpose.
Are the data in the Tabular Report sortable?
Yes. The Tabular Report is created as a .CSV file, which can be opened with a spreadsheet application such as MS Excel and sorted using the application's standard sorting capability.
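If you prefer to sort programmatically rather than in a spreadsheet, the CSV can also be processed directly. The column names below are illustrative, not the exact report headings:

```python
import csv
import io

# A hypothetical Tabular Report export (real exports come from e-asTTle)
report = "student,score\nAroha,1520\nBen,1480\nMere,1610\n"

rows = list(csv.DictReader(io.StringIO(report)))
rows.sort(key=lambda r: int(r["score"]), reverse=True)  # CSV values are text
print([r["student"] for r in rows])  # ['Mere', 'Aroha', 'Ben']
```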
What do the labels aMs, aRs, aWs on reports mean?
These labels are abbreviations for the e-asTTle scales: aMs is the mathematics scale, aRs the reading scale, and aWs the writing scale.
Each is an asTTle scale that measures students' performance. Scores range from 100 to 3,000 points, with the national mean for year 5-7 students set at around 1,500. A weighted score for each item, which takes into account the varying difficulty of items, is converted to the asTTle subject scale. This allows for more dependable comparisons between students' scores, and between a student's past and present performance.
How can all students in a year level within a school be compared to the national mean?
Because all items are calibrated to a common scale, performances in a subject can be compared regardless of which tests were completed. The overall asTTle scale score for each subject compares any student to the national sample of year 4-12 students on which the materials were standardised. So all you need to do is collect the students' asTTle scale scores and compare them.
What do the boxes on the Console Report mean?
The boxes are commonly known as box-and-whisker plots. They give you a richer picture of how your students are doing compared to the national group by taking into account how scores are spread, not just the middle. The bottom of the box is the 25th percentile and the top is the 75th percentile, so half of the students fall within the box. The line in the middle of the box is the median (50% of your students are below this point, 50% above). The ends of the whiskers represent the minimum and maximum scores. Remember, the blue box is based on the national sample tested by e-asTTle – it is possible for your students to score higher or lower than the students in e-asTTle's sample.
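The five values behind each box can be computed from a set of scores; a minimal sketch with made-up scores:

```python
import statistics

scores = [1380, 1420, 1450, 1470, 1500, 1510, 1540, 1580, 1620]

# statistics.quantiles with n=4 gives the 25th, 50th and 75th percentiles
q1, median, q3 = statistics.quantiles(scores, n=4)
low, high = min(scores), max(scores)  # the whisker ends

print(f"box: {q1}-{q3}, median: {median}, whiskers: {low}-{high}")
```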
Students cannot open the objective links on their Individual Learning Pathway Reports.
If you are using a Mac, check which program is being used to open PDF files. Preview does not support the links; they need to be opened in Adobe Reader for Mac.
How do I view a Tabular Report? When I click on the Tabular Report icon, the report does not display.
This can be due to settings on your computer or Internet browser.
Ensure that pop-ups are enabled. You will need to enable pop-ups on the e-asTTle site to be able to view tests and reports. The instructions for enabling pop-ups vary for different Internet browsers. For step-by-step instructions on how to enable pop-ups for your particular browser, please contact the Ministry of Education Contact Centre at email@example.com.
Ensure that the correct application is associated with the file download. Tabular Reports are downloaded as CSV files (Comma Separated Values File). These files need to be associated with an application to open the files. The most common applications are either Microsoft Excel (or some other spreadsheet) or Notepad (or some other file editor). Check the following file type settings to ensure that your preferred application is the default application for opening CSV files:
- Microsoft Windows 2000 / Windows XP
- Open Control Panel
- Open Folder Options
- Click File Types
- Locate “CSV Microsoft Office Excel Comma Separated Values File”
- Microsoft Windows 98 / Windows ME
- Open My Computer
- Select the View drop down menu and click Folder Options
- Click File Types
- In the Registered file types, check the setting for CSV
If the file type settings for opening CSV files are either blank or not your preferred application, please contact the Ministry of Education Contact Centre at firstname.lastname@example.org for assistance with modifying the file type associations.
Check your File Download settings.
- Open Internet Options (in Internet Explorer 7, select Tools, then Internet Options, and select the Security tab)
- Select the appropriate zone, for example, Internet (it should default to whatever zone you are on; for example, if you are on a local intranet, when you select Security you will see the Local Intranet zone highlighted automatically)
- Select the Custom Level button
- Scroll down to the "Downloads" settings (below the ActiveX Controls and Plug-ins)
- There are 3 settings (Automatic prompting for file downloads, File download, and Font download). Ensure all 3 settings are set to "Enable"
Why do I have to select a school, then a group in order to view reports?
The school drop-down menu is present because some e-asTTle users, such as external coordinators, work across multiple schools.
Why are some of my Year 6 students reported as Year 7?
e-asTTle reports are based on the funding year level of the student, rather than the instructional year level. Please ensure that the year level you want your students reported against is entered in e-asTTle as their year level. Instructional year level is used in e-asTTle for informational purposes only.
Why is the attitude bar sometimes blank in the Multitest Console Report?
Average attitude scores are not reported across tests when the tests have different attitude domains, because it is not psychometrically sound to average two distinct attitude domains (for example, motivation and interest).
My Numeracy Project tests do not appear in the View Reports list. How can I access my Numeracy Project information?
To view and report on existing Numeracy tests, click the Create New Test button in the left-hand menu and select Numeracy Project. Once you have selected your Numeracy test, you can click either Tabular Report or Skyline Report.
How does the Learning Pathways report work for writing?
There are only seven objectives in writing. Each one is placed in one of three quadrants (there is no To Be Achieved quadrant in the top right of the report) according to whether it was performed much better or much worse than the student's average writing score. If the student's average overall writing score is 3B, then all objectives performed at 2A, 3B, or 3P will be in Achieved. Anything worse or better than that will go in Gaps or Strengths respectively.
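The placement rule described above can be sketched in code. The sub-level ordering and the one-sub-level Achieved band are assumptions made for illustration:

```python
# Curriculum sub-levels in ascending order (assumed for this sketch)
SUBLEVELS = ["2B", "2P", "2A", "3B", "3P", "3A", "4B", "4P", "4A"]

def quadrant(objective_level, average_level="3B", band=1):
    """Place a writing objective relative to the student's average:
    within `band` sub-levels of the average is Achieved; anything
    better is a Strength and anything worse is a Gap."""
    gap = SUBLEVELS.index(objective_level) - SUBLEVELS.index(average_level)
    if gap > band:
        return "Strengths"
    if gap < -band:
        return "Gaps"
    return "Achieved"

print(quadrant("2A"))  # Achieved (within one sub-level of 3B)
print(quadrant("3A"))  # Strengths
print(quadrant("2P"))  # Gaps
```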