Voxitatis Blog

Re: #8) Info from the MSA is valuable

The Maryland State Department of Education issued a document today advising parents in the state what they should know about testing in Maryland. This post is my personal, unsolicited, and non-endorsed response to the eighth “thing” parents should know: “Although new academic standards have been implemented this year, the MSA continues to provide teachers, parents, and students with valuable information. Annual assessments show where students are academically and the progress they are making each year. Teachers can also continue using MSA results to help inform their teaching practices and tailor instruction to the needs of their students.”

Using very general language, MSDE here asserts that the Maryland School Assessment provides valuable information to “teachers, parents, and students.” It’s hard to argue with that general point; it’s exactly why we test kids: to find out where they are academically and how much progress they’re making. But the devil’s in the details.

The MSA tests Maryland’s old state curriculum, which was known as the Voluntary State Curriculum before the state board dropped the “voluntary” part in 2009. Teachers aren’t even teaching from that set of standards anymore, as MSDE acknowledges, so why would information about where their students stand on outdated standards be valuable? In truth, this information isn’t valuable to teachers at all.

Is it valuable to students? No. Last year, scores on the MSA math test dropped a little, not because teachers weren’t teaching kids what they were supposed to be teaching but because they were. Many MSA questions test content that the Common Core standards don’t cover at, or even before, the grade level where the old state curriculum placed it, and that misalignment is why the scores went down last year. I doubt students across the state will do any better this year, again because the classroom lessons teachers are (supposed to be) using come from a completely different set of standards.

For example, if you don’t teach a kid how to construct a perpendicular bisector of a line segment in seventh grade, he won’t be able to do it on the seventh-grade MSA in math. If there are too many questions like this on the test, the kid’s score will be low not because of where he is academically or the progress he has made in seventh grade, at the hands of his seventh-grade teacher, but because of a misaligned test.

The problem here is that the kid’s teacher could very well have been doing exactly what she was supposed to do: teach kids lessons making sure they master the skills and knowledge in the Common Core. Or she might have been completely slacking off, lazily sitting at her desk while kids socialized all day, every day. And we’ll never know based on the MSA which is which. A bad teacher will see exactly the same percentage of kids being able to construct a perpendicular bisector on the MSA as a good teacher, because perpendicular bisectors aren’t required in seventh grade by the Common Core but are potentially tested on the MSA based on the old state curriculum.

So, school districts don’t learn valuable information, either. If a teacher’s students all do poorly on the seventh-grade math MSA, the district won’t know whether she’s a bad teacher who should be rated poorly in her evaluation or whether she’s doing exactly what she’s supposed to be doing.

It’s just insane. We need to test kids, but the tests need to provide valuable information about what teachers are teaching. I don’t necessarily agree that student tests can produce valuable information about teacher effectiveness anyway, given the many variables outside the classroom, but if the test is misaligned as the MSA is, we don’t even have a chance of identifying bad teachers at the state level.

The good news is, principals and superintendents have access to a lot more data about teachers, and the MSA isn’t really needed to identify a bad teacher. And that would be true even if the test were valid, reliable, and fair. The bottom line is we should find a way to rely on some of that data about teachers in our analysis. A simple score is easier to analyze, I realize, but if the score is meaningless in terms of what teachers are teaching, kids and analysts alike have better and more valuable ways to spend their time.
