INDIANAPOLIS (Nov. 14, 2009)—The marching band from West Johnston High School in Benson, N.C., begins performing the moment it crosses the line at the Bands of America Grand National Championships here at Lucas Oil Stadium, a trumpeter laying out the famous subject of J.S. Bach’s Little Fugue in G Minor for a saxophone duet to answer, then a flute, and so on.
The extensive pre-show also draws on music from the “Lacrimosa” of Mozart’s Requiem Mass in D Minor (K. 626), taking some of the last musical utterances of the 18th-century composer and using them to create an eerie, dark feel. Color guard dancers are also in black, one carrying a dark umbrella as a metaphor for darkness and oppression, the “weight of the world that today’s students tend to carry with them,” director David S. Duffy said.
The title of the show is “Modern Gothic: A Darkness That Needs Light,” which developed from the modern “Goth” kid, “who wears all black to set themselves apart as an individual,” Mr. Duffy continued. But the set, featuring representations of stained glass windows from some of the world’s great cathedrals, depicts the true Gothic movement: architecture designed to let light flood into a church, permeating all who enter to seek peace and salvation.
In order to represent this peaceful state, not only of those who visit churches but also of the “possibility of what the future will bring for students who are truly enlightened,” color guard dancers change their costumes to a silvery white. It makes for quite a contrast.
The music follows, first with the driving rhythmic force of music from the latest Star Trek movie, then with the ballad “My Immortal” by Evanescence, and finally the closer, “Kingfishers Catch Fire” by John Mackey.
For effect, the program includes some of the most dramatic silences heard today. After West Johnston completes their “different take” on the pre-show, the music falls silent, perfectly timed with the voice of BOA announcing the band on the field.
The formation at that exact moment is concentric rings around the center, flanked by arcs of flags. This explodes into a single circle with straight arms radiating outward, like the rays of light that come from the sun and into the cathedrals.
The second silence comes during a spectacular rifle toss, in which several guard members, in turn, throw their rifles high, down the line. It wasn’t only silent on the field: as each member made the catch, more of the crowd held its breath as well. The silence was amazing.
When the last one caught it, the music kicked in, and the crowd erupted in a huge cheer—a moment of marching history made on the field here at Lucas Oil Stadium, the place where many more will enter to seek greatness.
West Johnston High School, established in 2002, brings to Indianapolis a marching band directed by Mr. Duffy and Garrett Griffin. Drum majors are Michael Simon, Brandon Allen, and Sean McBride. The band represented the state of North Carolina at the inaugural parade for President Obama in January, and its tradition of excellence continues to grow here, as the members remind us: “True beauty is revealed only by a light from within.”
As we have noted here in earlier posts, numerical scores for artistic events are unreliable at best and misleading at worst. Efforts are made to eliminate any type of identifiable bias in the scoring, but it doesn’t always work out, because judges are humans who don’t have eyes and ears in the back of their heads, and because we’re talking about art, which is subject to personal opinion in a way that the correct answer on a math test is not.
It is not my place to tell band directors how to do their jobs, and Mr. Duffy, along with all the directors here, I’m sure, looked at the recap sheets and got what they needed from them.
But when you start assessing high school students with scores, well, now you’re talking about my job. Although I have said the scores themselves are insignificant, I hope the Bands of America organization will take this constructive criticism from a scoring professional as a way to improve the many details of its process. Schools that compete here have strong attitudes about improving every detail of their performance, and I can only hope the organization that sets the standard in marching band scoring will do the same.
For the graphs below, we have plotted each band’s percentile on the specified trait (general effect, individual visual, and visual ensemble here) in the semi-finals against its percentile on that trait in the prelims.
For example, a point at 12% on the horizontal axis and 45% on the vertical axis (West Johnston in general effect total) would indicate that the band placed at the 45th percentile (near the middle) among the 34 bands that competed in the semi-finals and at the 12th percentile (bottom quintile) among those same 34 bands in the prelims.
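To make the percentile placement concrete, here is a minimal sketch in Python. The scores below are hypothetical, not real BOA results; the point is only that a band’s percentile is the share of the 34-band field that scored below it.

```python
def percentile_rank(score, all_scores):
    """Percent of bands in the field that scored strictly below this score."""
    below = sum(1 for s in all_scores if s < score)
    return 100.0 * below / len(all_scores)

# 34 hypothetical caption scores, one per semi-final band: 60.0, 60.5, ..., 76.5
field = [60.0 + 0.5 * i for i in range(34)]

print(percentile_rank(62.0, field))   # 4 bands below -> roughly the 12th percentile
print(percentile_rank(67.5, field))   # 15 bands below -> roughly the 44th percentile
```

Plotting each band’s semi-final percentile against its prelims percentile, as the graphs do, then shows at a glance whether the two sittings of judges ordered the field the same way.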
General effect takes all artistic aspects of a band’s program, including music and visuals, into account. It is a measure of how “effective” the show is, from an artistic standpoint.
The general effect score for West Johnston increased dramatically—some would say, unbelievably—between prelims and semi-finals, as did their scores for individual music and music ensemble, in terms of their placement within the 34 semi-final bands. Their visual scores stayed constant.
To explain why West Johnston’s scores in music (and consequently general effect) were so much higher in the semi-finals, we have to talk about central tendency bias, a rater’s tendency to pull scores toward the middle of the scale rather than use its extremes. In the prelims, West Johnston performed after Marcus, which went on to win the caption award for best music in the finals. They were a tough act to follow, so we suspect that in the prelims West Johnston’s scores were compressed toward the middle of the scale, possibly the result of judge fatigue during their late-night performance.
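A quick sketch of what central tendency bias does to a score sheet (the scale midpoint and the pull factor below are illustrative assumptions, not BOA parameters): a biased judge drags every raw impression part of the way toward the middle of the scale, so strong performances are undersold and weak ones oversold.

```python
def central_tendency(raw_score, midpoint=75.0, pull=0.3):
    """A biased judge drags a raw impression `pull` of the way toward the scale midpoint."""
    return raw_score + pull * (midpoint - raw_score)

print(central_tendency(95.0))  # an excellent band is scored 89.0, not 95.0
print(central_tendency(55.0))  # a weak band is scored 61.0, not 55.0
```

Under this bias the ordering of bands is preserved, but the gaps between them shrink, which is exactly how a standout act right before you can cost you points.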
Training to eliminate rater bias
Organizations that score standardized tests institute training programs not only in the scoring rubrics for the specific content but also in being good scorers in general. They train their scorers to recognize situations that might cause bias, such as fatigue or a scorer’s own background, and to nip them in the bud.
For marching band, influences off the field, such as the performance level of another band that finished right before the current band, can and do influence the score of the current band: It’s human nature to be biased when it comes to art. Other forms of bias that we have observed in the scoring include
- Halo Effects. This is when a band’s history—good or bad—affects the score given. If a band is known to be a grand national champion, a judge might overlook faults; if a band has never competed, a judge might overemphasize mistakes, penalizing the score too severely.
- Leniency Effects. Scorers tend to be more lenient with bands that have a long history with the organization and less forgiving of glitches in the program for newcomers.
- Logic Effects. Bands of America uses multi-trait scoring, and there is a tendency to let one trait, such as individual music, bleed into another, such as the ensemble music score. This is difficult to avoid with marching bands, since no person can evaluate individual music or any of the traits “in a vacuum,” to the exclusion of other qualities of the performance.
- Rater Reliability Effects. Not every judge can see every aspect of a performance, especially if the judge is on the field. States and the ACT and SAT exams use at least two scorers for every essay question on standardized tests (and bring in more if the original scorers disagree by too much), and Bands of America sometimes averages scores from two different judges before tabulating a total score. In fact, the general effect score is taken from three different judges.
An example of the rater reliability effect can be seen in the general effect graph above, compared to the same graph for visual ensemble, presented below.
It is plain from the graphs that the general effect score, made by three different judges each time, is more reliable than the visual ensemble score, which comes from only a single judge, because the points are closer to the theoretical red line.
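The claim that averaging judges improves reliability can be checked with a small simulation; every number below is an assumption for illustration, not an actual BOA score. Give 34 bands a fixed underlying quality, add independent judge noise, and compare how well two independent scorings of the same field agree when one judge rates versus when three judges are averaged.

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation between two lists of scores."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

random.seed(2009)
N_BANDS = 34
true_quality = [random.uniform(60, 95) for _ in range(N_BANDS)]  # hypothetical

def rate(quality, n_judges, noise_sd=5.0):
    """Average of n_judges independent noisy readings of one performance."""
    return statistics.mean(quality + random.gauss(0, noise_sd) for _ in range(n_judges))

def mean_test_retest_r(n_judges, trials=200):
    """Average agreement between two independent scorings of the same field."""
    rs = []
    for _ in range(trials):
        first = [rate(q, n_judges) for q in true_quality]
        second = [rate(q, n_judges) for q in true_quality]
        rs.append(pearson(first, second))
    return statistics.mean(rs)

print(f"1 judge:  r = {mean_test_retest_r(1):.2f}")
print(f"3 judges: r = {mean_test_retest_r(3):.2f}")
```

Averaging three judges shrinks the noise component of each score, so the two scorings agree more closely, which mirrors the tighter scatter of the three-judge general effect points around the red line compared to the single-judge visual ensemble points.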
General effect has also proven more reliable than the individual visual score, shown in the scatterplot below.
In addition to the forms of bias identified above, some analysts report what they call “similarity effects.” Bias occurs, they say, when a performance is similar or dissimilar to the judge’s own band in a particular trait. We have seen no evidence of similarity errors in marching band judging, but we wanted to make you aware that this form of bias still needs to be addressed in any bias training.