e-asTTle Ministry of Education
e-asTTle writing tool (revised)

The following questions and answers are specifically related to the e-asTTle writing tool (revised). This page will be updated regularly as new questions arise.

If you have a question you would like added, please email us.

See also the Frequently Asked Questions on the recalibration of the writing tool, April 2013.

When will the new writing tool be available?

The tool will be available for use from the beginning of Term 2, 2012.

Will I be able to access test data from the previous tool once the new tool is implemented?

Yes. All test data from the previous version of e-asTTle writing will be available. However, due to the change in rubric, measurement scale and prompts, you will not be able to directly compare scores on the new version of e-asTTle with scores on the previous versions. The Ministry will provide information to assist in score comparisons.

Can you please let me know how the new rubric fits with the previous rubrics - that is, does it now replace them?

Yes, when the revised e-asTTle writing assessment becomes available in Term 2, the new single rubric replaces the old ones.

For which year levels can I use the new e-asTTle writing tool?

e-asTTle writing is now available for years 1-10. However, teachers should test younger students using e-asTTle writing only if they can communicate at least one or two simple ideas in writing without teacher assistance. For students unable to do this, another assessment tool should be used. It would be invalid for teachers to make a transcription of students’ writing and then use the rubric to assess that transcription.

Is there a site I can go to, to read up about these changes? 

All information will be posted on e-asttle.tki.org.nz and assessment.tki.org.nz.

Will there still be a B, P, A reporting system at each level?

Students’ overall score on e-asTTle writing and their scores on each element of writing assessed against the rubric will still be reported using curriculum levels broken down into basic, proficient and advanced sub-levels. For instance, a student’s proficiency in writing might be classed as 4B overall.

How long will we be able to enter scores from the previous rubric into the tool?

Scores from the previous rubric will not be able to be entered when the new application goes live. No scores should be entered after Wednesday April 18, 2012.

What does the ‘R’ mean in R1, R2, R3, R4...?

The ‘R’ stands for ‘Rubric’. The ‘R’ is written in front of the score category to make it clear that the scores relate to the rubric, and not to curriculum levels. Rubric scores are converted to scale scores and curriculum levels within the assessment tool.

How do I mark a script that seems off topic?

The topic outlined on the prompt (for example, ‘being a good friend’) is intended as a springboard for writing, rather than a tightly defined focus. Take this into account when scoring the ‘ideas’ element: ideas can be loosely related to the topic and still be considered relevant.

How do I mark a script that seems to be off purpose?

The purpose for the writing (to explain, persuade, narrate, describe or recount) is the focus of the ‘structure and language’ element. It is not the focus of any other element. If the student has been asked to describe a photograph of two dogs playing on the beach, but actually narrates a story about a dog, this will be reflected in the category score for ‘structure and language’.

What are the annotated exemplars?

The annotated exemplars are samples of student writing produced in response to each e-asTTle writing prompt. They are representative examples of writing for each prompt. Each annotated exemplar has been scored using the marking rubric. The annotations explain the thinking behind each scoring decision. Together, the rubric, the exemplars and the annotations enable consistent marking decisions to be made.

What are the generic exemplars?

The generic exemplars can be used to check interpretation of individual categories (for example, category R2 in spelling, or category R4 in ideas). They are provided from across a range of prompts.

How do I mark when factual information is wrong?

e-asTTle writing does not assess curriculum area content knowledge, so there is no element for scoring the correctness of information. Writing sometimes contains incorrect facts, but this does not mean a student will necessarily have a low score for the ‘ideas’ element.

How long does a piece of writing have to be before it can be marked?

There is no particular length or number of ideas needed before a piece of writing can be marked, but e-asTTle writing is only suitable for students who are able to communicate at least one or two simple ideas in writing. In order for the e-asTTle application to calculate a scale score for a student, each element must be scored against the rubric. Students who score in the lowest category for every element assessed by e-asTTle writing (all R1s) are not well targeted by the assessment.

Can I read the prompt aloud to my students?

Yes, teachers should make sure students understand the prompt fully. Teachers should also lead a short discussion prior to writing where the students begin to develop their own ideas. In each prompt package, teachers are guided through administering the test/assessment. It is important that teachers read and follow the administration guidelines carefully. They provide very specific instructions on what to do and what to say to students.

Can my students use dictionaries or alphabet cards when they write?

No, students are not allowed to use alphabet cards, word cards, dictionaries, thesauruses or other spelling aids, because spelling and vocabulary are two of the elements students are assessed on.

Can I develop my own prompts?

Teachers are encouraged to use the rubric to assess writing other than that generated by the e-asTTle writing prompts. Teachers may wish to write their own prompts that relate explicitly to classroom topics of study. If they do so, consideration of the following will facilitate accurate use:

  • It is recommended that results from teacher-developed prompts are not entered into the e-asTTle application. The e-asTTle application links results to particular e-asTTle prompts. It then takes into account the difficulty of the prompt when transforming rubric scores to scale scores. The difficulty level of a teacher-developed prompt is unknown.
  • Teachers and students will be able to use the rubric to determine ‘next steps’ in teaching and learning, although scale scores and curriculum levels cannot be generated outside the application.
  • The rubric has been developed from students’ writing of continuous text. It is recommended that teacher-developed writing prompts maintain this feature.
  • The rubric was developed from students’ writing for five communicative purposes: to describe, explain, persuade, narrate, and recount. The rubric may also be used with other single or multiple communicative purposes, although not every element will always be relevant.
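The first bullet point above can be illustrated with a toy sketch of why prompt difficulty matters. Everything here is invented for illustration: the real e-asTTle application uses a statistically calibrated model, not this linear formula, and the constants below are hypothetical.

```python
# Hypothetical illustration only: the real e-asTTle application uses a
# statistically calibrated model to turn rubric scores into scale scores.
# The constants below are invented to show the role of prompt difficulty.
BASE = 1200          # made-up scale origin
POINTS_PER_R = 30    # made-up weight per rubric point

def to_scale_score(rubric_total, prompt_difficulty):
    """Toy conversion: the same rubric total maps to different scale
    scores depending on how hard the calibrated prompt was."""
    return BASE + POINTS_PER_R * rubric_total + prompt_difficulty

# The same marking on two calibrated prompts of different difficulty:
print(to_scale_score(14, -40))  # easier prompt -> lower adjusted score
print(to_scale_score(14, +40))  # harder prompt -> higher adjusted score
```

Because a teacher-developed prompt has no calibrated difficulty value, there is no defensible way to make this adjustment, which is why such results should not be entered into the application.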

How can I use the results for OTJs?

Assessment results from e-asTTle writing can contribute to the wide range of evidence supporting an overall teacher judgment about each student’s performance in relation to the National Standards.

Are some prompts more suitable for older or younger students?

Some prompts will suit older students because they cover topics relating to the wider world. Others will be better suited to younger students. The recounting prompts (Whānau and family time, Time with friends, and What I did well) and three of the describing prompts (Girl, Adult and child, and Dogs at the beach) are written in slightly simplified language because of the likelihood that they will be used by teachers of younger students.

What do I do when I can’t decide which rubric score to give?

A range of exemplars, showing how category scores have been assigned to examples of student writing, is available to provide further guidance on the interpretation of category descriptors. A small set of exemplars is included in each test package, and a larger set of generic exemplars is available from the “Enter Scores” page under “Mark Test”, and on the e-asTTle web site. These exemplars can be used to check interpretation of individual categories (for example, category R2 in spelling, or category R4 in ideas). If writing does not fall clearly into one category, or has features of two different categories, you will need to make an ‘on balance’ judgment.

We’ve checked our e-asTTle curriculum levels against the National Standards, and the e-asTTle results are showing much higher levels of achievement. How can this be?

The curriculum levels reported for e-asTTle Writing are based on a standard-setting exercise undertaken to link performance on an e-asTTle assessment with the descriptions of writing competence provided in the Literacy Learning Progressions. The exercise defined an appropriate score range on an e-asTTle assessment for each level of writing competence described by the progressions. A curriculum level of 4A, for example, means that, given 40 minutes to write to a particular prompt under test conditions, the student has been able to produce a text of sufficient quality to indicate they have the writing skills and competencies described as appropriate for students working at an advanced stage of Level 4 of the curriculum.

The important point here is that the e-asTTle curriculum level attempts to identify what the student’s performance in the context of an e-asTTle Writing assessment indicates in terms of achievement against curriculum expectations. This means that a student who has been assessed by e-asTTle Writing to be working at a particular curriculum level will not necessarily have produced a piece of writing that looks exactly like a National Standards exemplar. National Standards exemplars illustrate performance where students have been given the opportunity to engage with a writing task in a classroom situation “largely by themselves”. An e-asTTle assessment is completed in 40 minutes under test conditions without any teacher or peer feedback, or access to writing aids such as dictionaries.

Figure 1 below shows the distribution of e-asTTle Writing scale scores from all the scripts marked and moderated at each level during the trial of e-asTTle writing. The range of scores is within curriculum level expectations, although progressively fewer students in Years 7 to 10 performed at or above the curriculum levels expected for their year levels.

The tool developers went through a careful standard-setting exercise during the trial, and the trial data, illustrated in Figure 1, support the standards set.

Taking imprecision into account

No educational assessment is perfectly precise. e-asTTle Writing provides an indication of the imprecision (margin of error) involved in the assessment by presenting scale scores within a “plus or minus” range, for instance 1250 ± 40. This means that if the student were to repeat the assessment then about 70% of the time we could expect them to score somewhere between 1210 and 1290 scale units. Imprecision also needs to be taken into account when considering the curriculum level descriptor. A student who scores 4A is probably somewhere in the range 4P to 5B.
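The arithmetic behind the ± range can be sketched as follows. This assumes, as the text suggests, that the published margin covers about 70% of repeat assessments and that repeat scores are roughly normally distributed; neither assumption is an official e-asTTle specification, so treat this as an interpretive aid only.

```python
from statistics import NormalDist

def score_interval(score, margin, coverage=0.70):
    """Interval around a reported scale score such as '1250 +/- 40'.

    Treats the published margin as covering about 70% of repeat
    assessments (an assumption, not an official formula), backs out
    the implied standard error, then rescales to any coverage level.
    """
    se = margin / NormalDist().inv_cdf(0.5 + 0.70 / 2)   # implied standard error
    half = NormalDist().inv_cdf(0.5 + coverage / 2) * se  # half-width at requested coverage
    return (round(score - half), round(score + half))

print(score_interval(1250, 40))        # the reported ~70% range: (1210, 1290)
print(score_interval(1250, 40, 0.95))  # a wider, more conservative range
```

The same logic applies to the curriculum level descriptor: a reported 4A is best read as "probably somewhere between 4P and 5B", not as a pinpoint measurement.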

Using e-asTTle Writing scores as part of an overall teacher judgment

An e-asTTle Writing assessment provides a snapshot of what a student can achieve by themselves under standardised conditions. Teachers can consider this as one piece of evidence when making an overall teacher judgment. No single source of assessment information, however, can accurately summarise a student’s achievement or progress. A more comprehensive picture will involve collecting multiple pieces of assessment data from a range of contexts and experiences across the curriculum. This includes teachers’ general observations of students’ writing and careful consideration of the expectations described by the National Standards and supporting documentation.

Figure 1: The distribution of e-asTTle Writing scale scores by year level compared to curriculum level expectations (Term 4 data).

 


We have some concerns over the levelling of individual elements within the e-asTTle writing assessment. For example, a piece of writing can be assessed for ideas at R3 - which gives a curriculum level of 3B - while a piece assessed at R4 gives a curriculum level of 4A. There seems to be a huge leap with few discernible differences.

In the revised e-asTTle writing, each element has only six or seven scoring categories, so moving from, say, an R2 to an R3 involves a large shift in the associated curriculum level. It is unwise to treat the element curriculum scores as precise measures, and comparisons with the old element scores will be very tenuous. When discussing a report with students, it is best for teachers to point to the rubric scores given for each element.

With well-moderated scoring and proper administration, e-asTTle writing has the capacity to place students into about six or seven statistically distinct bands of achievement. This is very strong performance for a writing assessment. The reporting system in the e-asTTle tool, however, uses many more bands than this, linked to curriculum levels. These curriculum bands can give an impression of precision that isn't there, so the margin of error in an e-asTTle score must be kept in mind when interpreting or comparing scores.

The scale score and its margin of error should be emphasised as the e-asTTle measure. The curriculum scores are a reference point, linking the test performance with the performance level that might reasonably be expected from someone working at that curriculum level. This is a big change from the previous version of e-asTTle Writing, which tied the rubric categories directly to curriculum levels.
