Wednesday, April 15, 2015

Minnesota Online State Testing - Not Fair to Students

So, Minnesota's high-stakes online student testing (known as the MCAs) kicked off on March 9th, but because of spring breaks and the desire to have as many instructional days as possible before testing, most districts wait until later in the 8-week testing window (which is ridiculously long from a statistical point of view) to administer the tests. This week was the first "heavy" week of testing, though some started earlier. And, as many predicted, the online systems received a failing grade, but it's the students who really lose.

A talented technology coordinator from Milaca, Steve Bistrup, wrote a message to his administration about their experiences with testing thus far. Here is a sample of what he wrote - warning, it is a bit long, but I think it is well worth the read and worth sharing with others!!


Long Wait #1
During the mornings in particular, but not exclusively, students have to wait an unacceptably long time for Pearson's main "Self Registration" page to load. Staff are now having to come to work extra early to "pre-load" Pearson's main server page so that students don't have to wait 5-20 minutes for the login page. This is extremely difficult when we will potentially be testing six sessions simultaneously. (This slowness is NOT a result of our network infrastructure or speeds... these have been ruled out conclusively by technicians and Pearson support.)

Long Wait #2
Once students enter their credentials and test code, there is another "randomly" long waiting period. We have seen students wait as long as 45 minutes for "the blue bar" to finally load their test questions. (Again, this is NOT a result of our caching server or internal network.)

Two to four adults needed to start each session
Because we are always having to restart the browser, try a different computer, empty the Java cache, reset the browser, etc. until we can finally get a student logged in to the test, an unacceptable number of adults are needed to reduce the amount of time students sit in front of a spinning wheel while the person next to them is 20 minutes into their test. I can't imagine the amount of anxiety this creates in the students. It certainly creates a high degree of anxiety and frustration in those who desperately want the students to have a fair shot and do their best work. I recall one student who anxiously waited for his test to load, worried that if he had to wait too long he would miss outside recess. The teacher and I both had to reassure him to take his time once he was logged in.

Constant browser plug-in updates and Pearson's sporadic warnings
On 4/14/15 Oracle released a new version of Java. Because Apple computers and TestNAV require you to be on the latest version of Java, staff needed to stay long into the evening that night in order to update Java on every computer that students would be testing on the next morning. For Milaca this is 120 laptops and 90 lab computers. There was also a new version of Flash released on this date. Pearson warned us about the Java update but not the Flash update. Luckily we caught it. I can't begin to count how many times Java and Flash have been updated since school started. Some of them Pearson informs us of... more often than not they don't.

Math Test Calculator Issues
Pearson acknowledged this as a known issue during the OLPA session. We have now had the same issue during the MCA math sessions. When a student uses the built-in calculator during a math test, it randomly locks up their test session. Pearson's only answer has been to "force quit the test and have the student log back into the test." See points one and two for why this is more than an inconvenience. Worse, a student has to shut down and restart their testing mindset and momentum.

Server outage and support
On 4/14/15 the Pearson Access server crashed or "became degraded" across the entire State of Minnesota, resulting in testing administrators being unable to manage test sessions. When I called I was put on hold and twice was disconnected with a busy signal. When Pearson eventually sent out a formal message, the verbiage was disingenuous. The message made a point to emphasize that "TestNAV and testing were not affected." Yes, those kids who were already in their test were able to continue. However, no other session could be started or resumed, resulting in entire testing sessions being missed or postponed.

Not a fair assessment for kids
I have been involved with online testing since the very beginning, and I can say that no testing period has been approached with more trepidation by those managing the testing. Unfortunately, we have now seen those fears played out. These past two weeks have been frustrating and unfair to kids. If the testing mechanism is supposed to be as transparent as a paper booklet and as technologically easy as working a #2 Dixon Ticonderoga, then this has been a colossal failure. I've watched kids put their heads down in frustration as they wait. I've seen students lose their focus as a result of the glitches. I've seen the worry on an instructor's face as they watch all the preparation and build-up deflate in a matter of minutes.


Hi - I'm back! I have examples from many other districts too, but Steve says it well, so I wanted to share his experiences. (Congratulations, by the way, if you've reached this part of the post!) I can say that not all schools have experienced all of the same issues, but that just reinforces the idea that this is not a fair assessment for kids. The MN Department of Education Assessment Update - released today, 4/15/15 - included the following message:

To reduce testing disruptions for students, take appropriate actions to reduce noise, such as limiting the use of alarms (not fire alarms), bells, and announcements. Districts should avoid scheduling test sessions during known interruptions, such as during lunch breaks or pre-planned fire or tornado drills.

What about the disruptions caused by all the issues Steve points out? If a district encounters those kinds of issues (NOT a result of anything they did or didn't do), can we really validate that data? Is it a fair comparison to previous years' results if the same issues were not present then? Can anyone feel good about evaluating the success of teachers, students and schools based on such inconsistent testing environments? Heaven forbid we compare districts to each other based on these results. Yet, that is exactly what happens.
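
To put a rough, illustrative number on that disruption, here is a minimal back-of-envelope sketch. Every input is an assumption on my part (lab size, number of simultaneous sessions) or a loose reading of the wait-time ranges Steve reports - it is not actual data from Milaca, Pearson, or anyone else.

```python
# Back-of-envelope estimate of student time lost to login and load waits.
# Every number here is a hypothetical assumption, loosely based on the
# ranges Steve describes (5-20 minute login waits, up to 45 minutes for
# questions to load); none of it is real Milaca or Pearson data.

students_per_session = 30    # assumed lab size
sessions_per_day = 6         # Steve mentions up to six simultaneous sessions
login_wait_minutes = 12      # roughly the midpoint of the reported 5-20 minute range
load_wait_minutes = 10       # conservative guess; waits as long as 45 minutes were seen

lost_per_student = login_wait_minutes + load_wait_minutes
lost_per_day = lost_per_student * students_per_session * sessions_per_day

print(f"Lost time per student: {lost_per_student} minutes")
print(f"Lost student-minutes per testing day: {lost_per_day}")
print(f"That is roughly {lost_per_day / 60:.0f} student-hours of waiting per day")
```

Even with those conservative guesses, that works out to dozens of student-hours per testing day spent staring at a spinning wheel before a single question is answered.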

The MN Department of Education publishes its "Minnesota School Report Card" on its website each year. It is designed for the public to look at progress and demographic information about schools, and it is built to provide side-by-side comparisons. Front and center on each school's "report card" is proficiency data for the MCA tests for the past 5 years. Nowhere does it indicate that you may be comparing a school where online testing worked flawlessly with one where students were subjected to the kinds of distractions and frustrations that Steve noted. Yes, I firmly believe there would be statistically significant differences. Not to mention you are looking at results from year to year when entirely different groups of students were tested - I'm sure there are no variables there that could influence validity! Plus (though I'll admit this is noted if you look carefully), standards have changed in all three tested areas in the past 5 years, which, again, invalidates comparisons from year to year.
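
And for what it's worth, checking whether a gap like that is statistically significant is not hard. The sketch below runs a standard two-proportion z-test on two hypothetical proficiency rates; the enrollment counts and rates are made up for illustration and are not taken from any actual report card.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is the difference in proficiency rates between
    two schools (or two years) bigger than chance alone would explain?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

# Hypothetical example: School A tested without glitches, School B lost
# testing time to the kinds of problems described above.
z, p = two_proportion_z(successes_a=180, n_a=300,   # 60% proficient
                        successes_b=150, n_b=300)   # 50% proficient
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

The particular numbers don't matter; the point is that a gap of that size gets flagged as "real," and the report card gives the reader no way to tell whether it reflects learning or a morning lost to a frozen login page.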

I'll stop now - you get the point about how unfair this situation is. Kudos to you if you made it this far! I'll probably break this up, condense it and repackage it in some messages to folks at MDE, legislators, etc. Feel free to share it with anyone you think has the attention span to read it!! And... stay tuned for more updates and even some thoughts about how to improve it.
