Md. school test coordinator raises PARCC issues

Wendy Boyer, a testing coordinator in Baltimore City, presumably at an unnamed charter school, has written an op-ed piece in the Baltimore Sun about testing and what she sees as the evils of the tests developed by the Partnership for Assessment of Readiness for College and Careers, or PARCC. The paper’s editors have titled it “The problem with PARCC: Children should not be leaving standardized tests in tears.”

I could easily write a response entitled “The problem with charter schools: In Maryland, many of them do not follow best practices when it comes to educating our children.” For starters, Ms Boyer writes:

There is also a programming issue with using JAVA [sic]. The test can only run using a certain version of JAVA and if it is updated, the test won’t run. Why the programmers haven’t fixed this is beyond me. If JAVA does get updated, that testing computer is no longer available for testing until tech support can be called.

Her misunderstanding of how the personal computers used for testing are configured doesn’t inspire confidence that she’s qualified to serve as a test coordinator for a school where our youngest citizens are educated.

First, the facts. Java is a programming language, and browsers like Internet Explorer, Firefox, Chrome, or Safari can run small Java programs, known as applets, through a browser plug-in. (Despite the similar name, Java is not JavaScript; those small pop-up windows with help or instructions that appear when you move your mouse over a link are almost always JavaScript, an unrelated language built into the browser itself.)

Pearson, the large education company that developed the PARCC online test-delivery software, known as TestNav, relies on Java for a variety of functions that are not included by default in a web browser. For example, Pearson uses Java programs to ensure that TestNav is the only program running and that the student taking a test isn’t surfing the Web for other sites that might help him or her cheat.

Java isn’t developed by Pearson; it’s developed and maintained by Oracle, which releases updates on a regular, published schedule. Pearson can’t simply “fix” Java, as Ms Boyer’s letter demands, because Pearson doesn’t control it. The updates are required for security purposes: Security holes are discovered in Java from time to time, hackers can exploit those holes through the browser, and Oracle needs to patch them to keep hackers on the run, essentially.

In fact, Java checks for updates automatically, and Oracle publishes its patch-update schedule as much as a year in advance. School testing coordinators like Ms Boyer need to stay on top of this work and keep the software their testing computers rely on up to date.

If the most recent supported version of Java is installed on the computers, TestNav will run without any Java-related issues, and automatic update checks can be disabled during test-taking windows. Schools are also strongly encouraged to use a feature of TestNav called proctor caching, which downloads test content to a local server in advance so students can keep taking the tests even if the school’s Internet connection fails. From the sound of Ms Boyer’s letter, it seems her school isn’t using this best practice, either.
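To make the Java problem concrete: a school’s lab technician could script a quick pre-test check that each machine is running a supported Java version. The sketch below is purely illustrative — the allowed version list is hypothetical, not Pearson’s actual requirement — but it shows how easy such a check is to automate.

```python
import re

def parse_java_major(version_output):
    """Extract the major Java version from `java -version` output.

    Handles old-style strings ('1.8.0_231' -> 8) and new-style
    strings ('11.0.2' -> 11). Returns None if nothing matches.
    Note: `java -version` prints to stderr, not stdout.
    """
    match = re.search(r'version "([^"]+)"', version_output)
    if not match:
        return None
    parts = match.group(1).split(".")
    # Pre-Java-9 releases report as 1.x; later ones report x directly.
    return int(parts[1]) if parts[0] == "1" else int(parts[0])

def is_supported(version_output, supported_majors):
    """True if the machine's Java major version is on the allowed list."""
    major = parse_java_major(version_output)
    return major is not None and major in supported_majors

# Demo with sample `java -version` output strings:
old_style = 'java version "1.8.0_231"'
new_style = 'openjdk version "11.0.2" 2019-01-15'
print(is_supported(old_style, {8}))   # True: Java 8 is on the allowed list
print(is_supported(new_style, {8}))   # False: flag this machine for tech support
```

A script like this, run across a computer lab before testing day, would catch the “that computer is no longer available” problem before any student sits down.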

Pearson, the Maryland State Department of Education, and probably Baltimore City Schools have published this information and disseminated it to schools. Ms Boyer seems not to have listened.

She goes on: “Much to our surprise there is a video portion of the test for some of the students. My favorite part is that we couldn’t just turn the sound on; we had to log the student out of the test, turn the sound back on, and then resume the test.”

The video portion was not a surprise—or at least it shouldn’t have been. Pearson has published practice tests, tutorials, and sample items that have included plenty of video passages for students to analyze and answer questions about.

Look, this is the 21st century, and kids don’t get all their information by reading anymore. Even newspapers have switched a couple of pages on their websites over to video. Kids need to be able to understand and analyze information as it’s presented to them in the real world, which includes a few modes of delivery besides printed text. That’s what the video passages are all about.

But as to Ms Boyer’s accusation of a surprise, this again underscores the need for teachers and others in our public schools to stay current with developments that affect our schools. If she had followed the release of PARCC online tests, which began more than a year ago, she would have known all about the video passages and teachers at her school would have been able to prepare kids for them.

I do agree that schools have more important things to do than prepare students for a specific test, but taking a half hour to page through a practice test would not have detracted from the education of students at Ms Boyer’s school. It’s simply the best way to prepare students for the online test-taking experience, and it doesn’t take away much instructional time.

She continues: “This got me thinking about how this is equitable. How can you assess a child who has watched a video versus a child that has read a passage? (I should now point out that the text the students read are generally 2 grade levels above where they should be.)”

First of all, we shouldn’t compare one student to another when we analyze these tests. (I know they compare themselves to each other, but that’s really not the point.) Second, all students have the option of having a computer-simulated voice read to them on the math portion of the tests, which may also include videos. But in English language arts, the use of text-to-speech requires students to have an individualized education program, or IEP.

So, is it fair to all students or do students who take the test online and see and hear video passages have an advantage?

This is something that needs to be worked out for each passage. Statisticians analyze student performance carefully and make adjustments to scaled scores based on the statistical properties of how kids performed on each individual item. They do this before they combine those individual scores into a final scaled score for the tests or for any group of students, such as a classroom, school, or district.
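To see how such an adjustment can work in principle, here is a toy sketch of one classical technique, mean-sigma linear equating, which maps raw scores from one test form (say, the online form with video passages) onto the scale of a reference form (the paper form). This is an illustration with made-up numbers only; the actual PARCC equating procedures are far more sophisticated and are not shown here.

```python
from statistics import mean, stdev

def mean_sigma_equate(new_form_scores, ref_form_scores):
    """Build a function mapping raw scores on a new test form onto the
    reference form's scale, using the mean-sigma method:

        y = (sd_ref / sd_new) * (x - mean_new) + mean_ref

    Scores from the two (randomly equivalent) groups are used to
    estimate the linear transformation.
    """
    m_new, s_new = mean(new_form_scores), stdev(new_form_scores)
    m_ref, s_ref = mean(ref_form_scores), stdev(ref_form_scores)
    return lambda x: (s_ref / s_new) * (x - m_new) + m_ref

# Hypothetical raw scores from two randomly equivalent groups:
online = [12, 15, 18, 20, 25]  # took the form with video passages
paper = [14, 17, 20, 22, 27]   # took the traditional paper form

equate = mean_sigma_equate(online, paper)
print(round(equate(18), 1))  # a raw 18 online maps to 20.0 on the paper scale
```

The point of the sketch is simply that a score’s meaning is defined relative to how a group actually performed on that form, which is how two differently delivered forms can be placed on one comparable scale.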

This is one reason standardization is important when it comes to testing kids in our diverse schools. It includes statistical adjustments that allow us to compare scores of student groups who took the test online with video passages to those who took it on paper and encountered only traditional reading passages.

In other words, in a raw form, I understand Ms Boyer’s concern. But after the test scores are scaled, after statistical adjustments have been made based on how actual kids perform on video-based passages compared to text-based passages, it’s perfectly equitable. If it weren’t, we wouldn’t be able to use video passages, and that would mean we were excluding a whole bunch of “documents” in the real world from our testing.

I have sat in many, many English classes where teachers presented a video, a TED Talk perhaps, and had students analyze it alongside a magazine article. This is the real world today, and our schools and tests need to keep pace. For a teacher, being surprised by any of it is unacceptable.

In addition, the grade level of the passages isn’t two years above where students are; it follows the Common Core, which Maryland schools adopted in 2010, though I realize the standards weren’t fully implemented right away.

Reading levels have indeed become higher, but that’s a different question. It gets more at the standards in the Common Core themselves, a debate worth having. The reading level of questions and passages on the PARCC tests, though, has certainly been checked against the Common Core definitions.

Finally, I agree with Ms Boyer that PARCC is a longer and harder test than what students took previously. Something should be done about that. It would be nice to cut down the hours spent on standardized tests mandated by law. Talk to your representatives in Washington. A few plans for rewriting No Child Left Behind include provisions that chip away at the amount of testing required.

About the Author

Paul Katula
Paul Katula is the executive editor of the Voxitatis Research Foundation, which publishes this blog. For more information, see the About page.