The amount of time students spend doing physical activity in school appears to be linked to higher standardized test scores in math in public schools in Washington, D.C., according to a new American University study, the Washington Post reports.
This study, parts of which have been published in the academic journal Appetite, hypothesizes a positive correlation between the number of minutes students at a District of Columbia school spend in physical education classes each week and the percentage of students at that school who test as proficient in math.
In May 2010, the DC Council passed the Healthy Schools Act, a landmark law designed to improve the health and wellness of students attending public and public charter schools in the District of Columbia. The law requires schools to provide healthy meals and maintain high participation levels in PE.
Researchers divided 120 elementary schools in the city into four groups based on the amount of time students in each school spent in PE classes. Schools that provided students the most time in PE—the top 26 schools in the study, with an average of 151 minutes of PE per week—had an average proficiency rate of 56.66 percent, compared to a proficiency rate of 47.53 percent for the 22 schools in the lowest quartile for PE time each week.
| Average PE time per week | Proficiency rate (standard deviation) |
|---|---|
| 28.74 minutes | 47.53% (19.42) |
| 45.78 minutes | 44.83% (24.33) |
| 70.15 minutes | 56.06% (20.23) |
| 151.0 minutes | 56.66% (20.65) |
This correlation is weak and not statistically significant: at the 0.05 significance level, a two-sample t-test finds no significant difference between the proficiency rates of the highest and lowest PE-time groups.
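That comparison can be checked from the summary statistics alone. The sketch below uses Welch's two-sample t-test (the article does not say which variant the analysis used; Welch's is assumed here because the group sizes and standard deviations differ), with the means, standard deviations, and quartile counts reported above:

```python
import math

# Summary statistics for the top and bottom PE-time quartiles, as
# reported in the study: means, standard deviations, and group sizes.
# This is a back-of-the-envelope check, not the study's own analysis.
n_top, mean_top, sd_top = 26, 56.66, 20.65  # ~151 min of PE per week
n_bot, mean_bot, sd_bot = 22, 47.53, 19.42  # ~29 min of PE per week

# Welch's t-statistic (does not assume equal variances)
se = math.sqrt(sd_top**2 / n_top + sd_bot**2 / n_bot)
t = (mean_top - mean_bot) / se

# Welch-Satterthwaite approximation for the degrees of freedom
num = (sd_top**2 / n_top + sd_bot**2 / n_bot) ** 2
den = ((sd_top**2 / n_top) ** 2 / (n_top - 1)
       + (sd_bot**2 / n_bot) ** 2 / (n_bot - 1))
df = num / den

print(f"t = {t:.2f}, df = {df:.1f}")
# With roughly 45 degrees of freedom, the two-tailed critical value at
# alpha = 0.05 is about 2.01; the t-statistic here comes out near 1.6,
# well short of significance.
```

In other words, a 9-point gap with standard deviations around 20 and only a couple dozen schools per group is simply too noisy to call a real difference.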
We therefore dismiss this finding as inconclusive but wish to add that even if there were a statistically significant difference in proficiency rates between the highest and lowest groups, it wouldn't matter. It would demonstrate correlation, not causation. Statements like, "PE improves kids' scores in math," are academically dishonest and misleading. Of course, it's just that kind of conclusion people are likely to draw when they read stories like this in the Washington Post.
“This finding demonstrates that students’ academic performance improves when there’s a balance between time spent on physical education and time spent on learning,” the Washington Post, in a feat of irresponsible journalism, quoted Stacey Snelling, dean of American University’s School of Education, as saying.
To be blunt, no, it doesn't demonstrate that at all. The researchers neither found nor concluded any causal relationship that would justify verbs like "improves" in an evaluation of the present study. On top of that, what they actually found is no statistically significant difference in math proficiency rates between schools where kids got an average of 29 minutes of PE per week and those where kids got an average of 151 minutes per week.
I want to promote physical activity as much as anyone, but I am growing concerned about self-appointed scientists and purported science reporters drawing and publishing conclusions from studies that the original researchers neither intended nor expressed.
Furthermore, we hold that PE has inherent value for kids, independent of any difference it may or may not cause in math performance. To try to sell people on the idea of providing more PE for students using a non-existent difference in math performance completely misses the point and serves neither physical education nor mathematics education.